TSM - Minting my own business

Mihai Șerban - Senior Software Developer @ Accesa

In 1977, humanity sent the Voyager 1 probe on an interstellar journey carrying with it some of our memories, wishes and aspirations, carefully encoded on a golden record. Chances are that the information on that record will outlast any other record on the surface of the Earth: our closest approach to immortality. It also signals something else: it is ours, Earth-made, carrying our signature. Immortality and ownership, the two powerful forces that have shaped the destiny of our society since forever, have now merged with technology.

Fourteen years ago, Satoshi Nakamoto published his famous paper on how Merkle trees could be leveraged to create a peer-to-peer electronic payment system, which became the first blockchain. Even before that, programmers worldwide (I was too young to be a programmer at the time) talked about immutability as the Holy Grail of design: don't let anyone modify your data, not even yourself, if you want to trust it.

Final fields everywhere, the war on setters, deep and expensive clones, builder patterns: you know the story. You can imagine our enthusiasm when someone proposed using an immutable database that is confidential and as secure and unbreakable as the SHA-2 hashing algorithm. For software architects, it is the perfect implementation of the event sourcing design pattern, one of the most elegant ways to keep the state of a system and the same pattern that gave us Git: you can recreate the state of your application from the initial value (block 0) and the events that follow, be they commits or transactions.

The loss of consistency, the trade-off of any distributed system, was elegantly solved by Byzantine fault tolerance algorithms, proving that eventual, statistical consistency can hold even in the financial domain. The advantages gained were too great to pass unnoticed. First, as any developer knows (and QA probably disagrees), code is much easier to trust than humans, let alone institutions, especially after the 2007-2008 financial crisis. Secondly, given a large enough decentralized network and a proper design, data has a good chance of lasting much longer than any institution or company that might at any time shut down its servers or stop paying its cloud bill.

With the launch of the Ethereum blockchain in 2015 and the introduction of Smart Contracts, a new layer of abstraction over the blockchain emerged: the possibility to run custom code on the blockchain in response to user requests. This opened the opportunity to store on a blockchain any asset that can be represented by data: the ownership proof of your songs, art, memes, real estate assets and, who would have guessed, links to bored apes. Since, by design, the network makes no distinction between a smart contract address and a simple wallet address, integration is smooth.

I like to think of Smart Contracts as immutable microservices sitting on the blockchain that execute transactions on behalf of users or other contracts. The two kinds of addresses look the same, but when a transaction is issued to a Smart Contract, programmable operations can happen instead of just passing along a cryptocurrency amount. These operations are typically the issuance of digital tokens, staking, participating in voting polls, minting NFTs and pretty much everything else the underlying blockchain infrastructure supports. With great power comes great responsibility: Smart Contracts have also introduced security risks and vulnerabilities, as their source code is usually public to provide trust through transparency.

Making a contract upgradable requires a lot of trust from the users, as an upgrade could be detrimental to some of them. Immutability helps with the trust aspect, as each user knows the same code will run for every request or transaction. But it also makes it difficult to fix a bug discovered after deployment, or a security flaw an attacker might profit from, probably the most famous example being the DAO attack of 2016.

This raised the need for the standardization of Smart Contracts. The first such proposal was ERC-20 (Ethereum Request for Comments 20), put forward by Fabian Vogelsteller in November 2015, which defines the API that a fungible token Smart Contract should implement. You can read the ERC specs here, where you will find they are surprisingly simple.

Along with smart contracts came the concept of "tokens": digital data managed by smart contracts, with properties such as liquidity, total supply or price made configurable (programmable). The costs of running the transactions are, of course, bound to the underlying cryptocurrency of the blockchain, cleverly decoupled from the tokens by the smart contracts. One analogy is viewing the blockchain as the physical layer and Smart Contracts as the network layer, while the tokens they carry are the user data.

The "non-fungible" in the name makes a clear distinction between NFTs and digital currencies. Until 2017, tokens managed by smart contracts were mostly fungible, which, for end users, made them indistinguishable from the blockchain's own cryptocurrency. In 2017, the first standard for unique tokens (NFTs), ERC-721, was proposed on Ethereum, defining the API of a contract made to manage (references to) unique assets on the blockchain. This allowed developers and creators to release and sell NFTs in a market with an estimated value of ~25 billion USD in 2021.

Nothing prevented the users from wanting to own these assets in their digital wallets, not even the fact that, quoting Wikipedia:

"NFT ledgers claim to provide a public certificate of authenticity or proof of ownership, but the legal rights conveyed by an NFT can be uncertain. NFTs do not restrict the sharing or copying of the underlying digital files, do not necessarily convey the copyright of the digital files, and do not prevent the creation of NFTs with identical associated files."

Of course, players in the NFT market are much more optimistic, as they should be:

"This triggered an avalanche of unprecedented use cases for digital assets where true ownership, unstoppable transfers and immutable provenance are essential. While there is a vast amount of domains and business verticals where such properties are important, art and entertainment got a massive head start." — https://elrond.com/blog/elrond-nft-space/

In other words, having a link to the asset in a digital wallet is recognized by online communities as proof of ownership and is viewed as a property right over the asset. It is still too soon to tell whether any legal value will be attached to this form of ownership. For now, buyers certainly trust it, because the transactions and the ownership are stored (presumably forever) on the blockchain. But where is the actual asset?

It is not, in fact, stored on the blockchain, as it would have to be replicated on thousands of nodes and kept forever, which would increase the blockchain's size beyond the limits of feasibility. To offer an example, on Ethereum, which still operates with a proof-of-work consensus algorithm, storing 1MB on the chain costs at least 4 ETH, which, depending on when you read this article, is worth anywhere between zero and infinity in USD. Not an option. Of course, giving up decentralization and storing the data in GCS or S3 buckets means simply shooting yourself in the foot: you have the proof of owning the asset, but you've lost the asset (or it's in the control of a company you can't trust, let's say Facebook).

Here comes the salvation: IPFS, an acronym for InterPlanetary File System, a homage to the dream born in the mind of J. C. R. Licklider in 1962 while he was leading the IPTO (Information Processing Techniques Office) of ARPA (Advanced Research Projects Agency): the dream of a peer-to-peer "Intergalactic Network", which led to the internet we know today. But the truth is, the internet is not what it used to be. It used to be peer to peer, decentralized. Every computer in the network used to run an HTTP server, serving its content to the other peers requesting it. There was no panic over FAANG servers failing, because nobody was big enough to have an impact.

These days, we don't communicate with each other anymore; we only exchange information with the giants that trade our information for money. You might feel that you are sending a WhatsApp message to your friend because the network is fast. You are not. You are sending a WhatsApp message to Facebook's servers, which run sentiment analysis on your words, organize a public auction for your attention on the advertising market and sell that attention to the highest bidder. Only after that do they send a notification to your friend on your behalf, and the process starts all over again. IPFS aims to bring the internet back to just that: working at an interplanetary scale, decentralized and peer to peer, fixing the internet, or, better said, the web.

How does IPFS work?

IPFS is an application layer (hypermedia) protocol, much like HTTP. However, while the current web relies on URLs, which essentially point to a location on a server, IPFS addresses resources by fingerprints of the data, so-called CIDs (Content Identifiers). A CID is a multihash-based identifier which, in addition to the hash of the asset, encodes the hashing algorithm (usually SHA-256), the encoding and the CID version (currently 0 or 1).

Each CID is unique to a piece of data, which is extremely useful if you want to ensure that no one has tampered with your asset. Changing a single bit in your data completely changes its address, so you can immediately see the enormous advantage over URLs: the same URL might take you to a picture of a cat today and serve you a Keanu Reeves meme the following day, depending on the mood of the server admin. The main difference between the URL and the CID addressing models is that a URL does not identify the resource but a place on a server, and the resource can change without notice, while a CID identifies the resource itself.
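
To make this concrete, here is a minimal sketch (assuming the go-ipfs command-line tool, which we will run in Docker a bit later): computing the CID of a file before and after a one-character change yields two completely different identifiers.

>echo "my first post, 2012" > memory.txt
>ipfs add --only-hash -q memory.txt
# prints one CID, e.g. Qm...
>echo "my first post, 2013" > memory.txt
>ipfs add --only-hash -q memory.txt
# prints a completely different CID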

Consequently, HTTP URLs are not suitable for addressing your highly valued assets, like the above-mentioned bored apes, but IPFS addresses are, built, you guessed it, on the same Merkle tree data structure. When you request an asset over IPFS, you need its CID. The node you are querying searches for the data in its own repository. If it has it, there you go: instant access to the file. If it doesn't, it asks the neighboring peers for the file and, when it receives it, caches the data and serves it back to you. The data is then stored on at least two nodes and remains accessible for as long as at least one node holds it. If the node holding the asset shuts down, or its garbage collector erases the asset to free up space, the data is lost forever. Or almost forever: if you have a copy of the file and no one altered its contents, it will have the same hash and consequently the same CID (given that you use the same hashing algorithm, encoding and version).

A new market for distributed data storage is emerging on top of IPFS. It offers guarantees that your data will remain available on a certain number of nodes for a certain amount of time, given that you have a certain amount of crypto to spare, or even for free, with some limitations. To name a few, there are Filecoin, Pinata, Eternum, Infura and many others that will pin your data, so you can be sure your NFT is still there until you sell it. And there is always the option of maintaining an IPFS node or cluster on your own server (or your company's cloud partner, if you are lucky enough).
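
Pinning is what keeps a file out of reach of a node's garbage collector. On your own node it is a one-liner; a quick sketch, with <CID> standing in for one of your content identifiers:

>ipfs pin add <CID>
>ipfs pin ls --type=recursive
# unpinned blocks may disappear on the next garbage collection run
>ipfs repo gc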

Most browsers ignore the IPFS protocol, with the notable exception of Opera, which supports it natively: you can type ipfs://<CID> into the address bar and the asset will be displayed. For other browsers, several public gateways exist that translate an IPFS CID into a standard HTTP URL. Unfortunately, the response time is incomparably slower than that of the "traditional" HTTP-powered world wide web.
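
If you don't use Opera, any public gateway will serve the same content over plain HTTPS; for example (dweb.link is one of the gateways used later in this article, and <CID> is a placeholder):

>curl -L https://dweb.link/ipfs/<CID> -o asset.png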

To what purpose?

We must ask several questions whenever we develop a new technology: Can humanity benefit from it? Can it at least bring value to us? Does it create more problems than it solves? Of the entire 25-billion-dollar market, how much brings value to artists, creators, businesses and institutions? And how much of it is just a speculative bubble of generated jpeg cartoons, powered by smart contracts that are essentially candy slot machines: a bit of dopamine from storing the IPFS CID of a jpeg in a blockchain wallet when the user calls the "mint" method on the Smart Contract, with the sole purpose of selling it later at a higher price once there are no NFTs left to be "minted" in the contract, or just to prove that they can afford it, boosting their social status at a virtual yacht party in the Metaverse? I don't have the answers to such questions, but I believe some things deserve to be stored on the blockchain, and the "gods" of Silicon Valley have blessed us with a perfect solution for that: the NFT.

One such thing that is important to me is my digital self: all the thoughts, ideas and photos that remind me of who I was years ago (usually involving lots of shame) and of how much I've changed over time, and that accidentally came into the possession of social media giants, namely Facebook. I am still responsible for that data, but it does not belong to me. Join me in the process of taking back ownership of my digital self. I believe it's a good starting point for an NFT project: owning yourself. I will keep my ideas, photos and idiotic comments in my digital wallet, stored presumably forever on IPFS and the blockchain. It is one way to achieve immortality, or at least to leave a trace for as long as humanity keeps using information processing devices and finds it profitable to keep them on.

Talk is cheap. Here is the code.

If you want to own your digital self, the first step is to get the data. Luckily, Facebook provides the functionality to download all your data; I hope you have a large enough hard disk for it. They are even nice enough to give you the data in high quality. You can find this option under Settings and Privacy > Settings > Your Facebook Information, where you can request a download of all the data you are interested in moving to the blockchain, in JSON or HTML format. I extracted the data in both formats out of curiosity. Funnily enough, Facebook's stock plummeted by 25% the next day. And no, correlation does not mean causation. The next step is NFT-izing™ the data. I have a lot of text on my Facebook wall and, from time to time, pictures with text. I don't want to lose anything, so I need to generate images from all this data.

If you are a JDK ecosystem software developer like me, you are probably thinking: hmm, I need an HTML parser and manipulator, then some rendering solution to create custom image sizes based on text length; but there are also links to images, so I need to take the URLs, replace them with base64-encoded versions and then capture the rendered content... and when this train of thought goes too deep, experience kicks in and stops you right there, because efficiency is the key to success and what you are trying to do does not look like an unsolved problem in 2022. At which point you remember watching in awe the excellent life of your QA colleague, sipping from her or his cup of coffee and browsing Reddit on one screen while, on the other, the automation engine tears up an application on three different browsers and shows her or him screenshots of the flow from time to time. What an excellent job they have, while you are so deep into coding that your coffee has been cold for half an hour and you forgot to drink it. You get the point: let Selenium do the work for us. (I apologize to my fellow testers if this is already an obsolete framework. It works for our purposes and, funnily enough, I've used it several times in my career, but never for testing.)

In 20 minutes I wrote 40 non-refactored lines of Java code that opened the downloaded HTML version of the data in Chrome and used Selenium to scroll through all of my 2343 Facebook posts, taking a screenshot, cropping the image and saving it into a folder. You can reuse the code from my github. Instead of making it configurable and reusable, as I should have, I watched 11 years of digital life scroll in front of me, screenshot by screenshot, while carelessly sipping my coffee and being ashamed. At that point, I noticed that more than 100 posts did not fit the screen, so instead of combining images with the JDK graphics package, I decided it was much easier to set the monitor to portrait mode and run the script all over again. Half an hour later, I had all the pictures, soon to become NFTs.

And so we reached step 2: moving the data to IPFS. I wanted to play with it a bit, so I installed an IPFS node on my PC. Multiple options are available: a desktop application, a slimmed-down command-line tool or, my preferred way, a Docker container, to keep my environment cleaner than my code.

I was good to go with a simple docker pull of the official latest image followed by a docker run. Remember to mount two volumes for staging and data to keep your node data after stopping the container.

>docker pull ipfs/go-ipfs:latest

>docker run -d --name ipfs_host \
    -v "d:/ipfs/staging":/export \
    -v "d:/ipfs/data":/data/ipfs \
    -p 4001:4001 -p 4001:4001/udp \
    -p 127.0.0.1:8080:8080 -p 127.0.0.1:5001:5001 \
    ipfs/go-ipfs:latest

After that, check the logs to see if the node started properly and connected to other peers, to the "swarm" as they call it:

>docker logs -f ipfs_host
>docker exec ipfs_host ipfs swarm peers

You should see your node connecting to peers, which means you are part of a distributed peer-to-peer network running the IPFS protocol. Congrats!

IPFS connecting to peers

Once this finishes, you can add your data to the IPFS node. I stored the outputs of the add command in the add_output.txt file to have access to the CIDs of the assets.

>docker exec -it ipfs_host sh
>ipfs add -r /export/*.png > /export/add_output.txt
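
Before leaving the container, you can sanity-check the result: the pin list should contain your CIDs, and any of them should resolve back to the original bytes. A quick check, with <CID> taken from add_output.txt:

>ipfs pin ls --type=recursive | head
>ipfs cat <CID> > /export/check.png
# check.png should be byte-identical to the screenshot you added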

Let's test it! Take one CID from the list and check it out in the Opera browser.

Data added to IPFS

Lovely memory loaded in Opera through IPFS protocol.

Notice that the CID changed automatically: the add command generates v0 CIDs by default, while the Opera browser and most gateways automatically convert them to v1 CIDs. That's a remarkable memory, but it's not mine yet. It's unique through its CID, it's online, it's immutable, it's inerasable, it's not efficiently compressed (less critical for our use case), but it's not mine. What good is an eternal memory not owned by anyone? If only I could prove to the world that this memory belongs to me, carrying the proof in my wallet, possessing it, that would undoubtedly be a good use case for the technology. This is what we're going to do.
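
If you want the v1 form yourself, the go-ipfs CLI can do the conversion; a small sketch, with <your-v0-CID> standing for one of the Qm... identifiers from add_output.txt:

>ipfs cid base32 <your-v0-CID>
# prints the equivalent v1 CID, starting with bafy...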

In addition to maintaining your own IPFS node or cluster, you can create an account on Pinata, which lets you upload and pin files on IPFS for free, up to 1 GB. You can deploy a cluster in the cloud, or use more robust paid services like Filecoin if you really care about your data. You can replicate the data across providers, or request it on your own nodes and pin it there; all of this improves the availability and the lifespan of your files. Because I want the files to stay available even when my PC is shut down, I used the Pinata service to pin my assets. Notice that I have merely stored my files twice. Someone else can still download them and re-upload them under different CIDs (for example by using a different block size), which happens often in the world of NFTs: someone steals an NFT collection and launches it on a different platform, monetizing someone else's talent. There are ways to discourage this, one of which is "submarining" the NFT: in essence, everyone except the owner only sees a low-resolution version, or the asset is visible only on specific platforms. In the end, it is the blockchain that must prove the ownership of each asset: the timestamp at which the NFT was launched or minted is recorded in the transactions. The data is easy to steal, modify and duplicate, but it is the ownership that makes people buy NFTs.
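
If you prefer to keep driving everything from your own node, recent go-ipfs versions can also talk to Pinata through the standard remote pinning API. A sketch, assuming go-ipfs 0.8+ and a Pinata access token; the service endpoint, the <JWT> and the <CID> are values you need to take from your own Pinata account and data:

>ipfs pin remote service add pinata https://api.pinata.cloud/psa <JWT>
>ipfs pin remote add --service=pinata --name=memory-0001 <CID>
# the --name value is just a label of your choosing
>ipfs pin remote ls --service=pinata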

Here we are at step 3: storing the IPFS addresses of our memories on the blockchain, so that the proof of ownership sits in our digital wallet. We need a blockchain that supports NFTs. Ethereum defined the first NFT standard in 2017, so it would be the first pick, as it is the most decentralized and secure blockchain to date (among those that support smart contracts). However, transactions on the Ethereum blockchain are costly, as it still uses proof of work as its consensus algorithm. If you think I'm cheap, check the ETH transaction below, recorded seven days ago, in which an address minted an NFT for 3.42 ETH, which at the time of writing approaches 14,000 USD, and you will get the point.

10 000 USD NFT minting transaction

We will use one of the more affordable but still feature-complete solutions for our purpose: why not the solution of our neighbors from Sibiu, the Elrond blockchain, which supports NFTs natively, handling them much like any other token. They have decent documentation on how to interact with the blockchain. They use proof of stake, and a transaction typically costs less than $0.1, depending on the price of the underlying EGLD (eGold) cryptocurrency. Suppose we wanted to store our NFTs on Ethereum anyway. In that case, we could opt for a layer-two technology on top of the blockchain, like Immutable, or go for another layer-one project like Solana, which also offers cheap transactions, with other trade-offs (like less decentralization). On Elrond, issuing an NFT collection costs as little as 0.05 EGLD, plus the NFT creation and transfer transactions. We should fit the entire project into a $30-40 budget, because we want the real deal, the mainnet: we want to show our friends the NFTs in our mobile app and send them over via the Maiar app or the web wallet. If you just want to play with the technology, you can use the Elrond devnet and testnet for free.

If you want to go deeper into all the specifics of what we are about to do next, I suggest you go through their documentation first. Let's start!

We will interact with the blockchain using their Python SDK. They offer SDKs for multiple languages on their github, but, as a Java developer, why not write some good old bash scripts to achieve our goal? You can find the documentation for the Python SDK here. To run our bash scripts, we will use an Ubuntu Docker image. You can find most of the scripts we are about to use on my github.

Let's install everything we need in a docker container with the following Dockerfile:

FROM ubuntu:latest
ENV DEBIAN_FRONTEND=noninteractive
RUN apt update
RUN apt -qy install xxd
RUN apt -qy install python3.9
RUN apt -qy install python3.9-venv
RUN apt -qy install bc
RUN apt -qy install vim
COPY /erdpy-up/erdpy-up.py erdpy-up/erdpy-up.py
RUN useradd --create-home -u 1111 main
USER main
RUN python3.9 erdpy-up/erdpy-up.py
COPY walletKey.pem /home/main/elrondsdk/walletKey.pem
COPY issue_collection.sh /home/main/elrondsdk/issue_collection.sh
COPY set_nft_special_role.sh /home/main/elrondsdk/set_nft_special_role.sh
COPY generate_nft.sh /home/main/elrondsdk/generate_nft.sh
WORKDIR /home/main/elrondsdk/

We start from a clean Ubuntu image and use apt to install the tools we need, most importantly Python. We will need hexadecimal-encoded parameters when working with erdpy, so we install xxd, bc in case we need some math and, of course, vim, for bug fixing in production. The remaining lines install erdpy and copy the bash scripts we need to issue our collection of NFTs, generate them and send them to our wallet. I had previously generated my wallet key, the walletKey.pem file, from the secret phrase created along with the wallet, using the erdpy tool in a different Docker image. We will need the key to sign the transactions on my behalf. Please keep it safe, far away from repos.
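
For reference, this is roughly how such a PEM file can be derived from the secret phrase with erdpy; a sketch, and the exact subcommand may differ between erdpy versions, so check the Elrond docs for yours:

>erdpy wallet derive ./walletKey.pem --mnemonic
# erdpy prompts for the secret phrase and writes the PEM file; never commit it anywhere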

The next step is to issue a collection of NFTs. To launch an NFT collection on Elrond, we will need to:

  1. Call a smart contract that creates the collection and assigns a token id to it

  2. Assign the rights to manage the collection to one or more wallet addresses. These can be smart contract addresses or simple wallet addresses.

  3. Generate NFTs using the addresses configured.

For phase 1, I prepared the issue_collection.sh bash script:

#!/usr/bin/bash
WALLET_KEY=walletKey.pem
# erdpy arguments must be hex-encoded and 0x-prefixed
COLLECTION_ID_HEX="0x$(echo -n SerbyDId | xxd -p -u | tr -d '\n')"
TICKER_NAME_HEX="0x$(echo -n SERBDID | xxd -p -u | tr -d '\n')"
CAN_FREEZE="0x$(echo -n canFreeze | xxd -p -u | tr -d '\n')"
CAN_WIPE="0x$(echo -n canWipe | xxd -p -u | tr -d '\n')"
CAN_PAUSE="0x$(echo -n canPause | xxd -p -u | tr -d '\n')"
CAN_TRANSFER_CREATE_ROLE="0x$(echo -n canTransferNFTCreateRole | xxd -p -u | tr -d '\n')"
CAN_CHANGE_OWNER="0x$(echo -n canChangeOwner | xxd -p -u | tr -d '\n')"
CAN_UPGRADE="0x$(echo -n canUpgrade | xxd -p -u | tr -d '\n')"
CAN_ADD_SPECIAL_ROLES="0x$(echo -n canAddSpecialRoles | xxd -p -u | tr -d '\n')"
TRUE="0x$(echo -n true | xxd -p -u | tr -d '\n')"
FALSE="0x$(echo -n false | xxd -p -u | tr -d '\n')"
# the system smart contract that issues token collections
CONTRACT=erd1qqqqqqqqqqqqqqqpqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqzllls8a5w6u
GAS_LIMIT=100000000
ELROND_PROXY=https://gateway.elrond.com
ELROND_CHAIN=1

# the 0.05 EGLD issue fee is attached as the transaction value (denominated in 10^-18 EGLD)
./erdpy --verbose contract call $CONTRACT --recall-nonce \
    --pem=${WALLET_KEY} \
    --gas-limit=${GAS_LIMIT} \
    --value=50000000000000000 \
    --proxy=${ELROND_PROXY} \
    --chain=${ELROND_CHAIN} \
    --function="issueNonFungible" \
    --arguments $COLLECTION_ID_HEX $TICKER_NAME_HEX \
        $CAN_FREEZE $FALSE \
        $CAN_WIPE $TRUE \
        $CAN_PAUSE $FALSE \
        $CAN_TRANSFER_CREATE_ROLE $TRUE \
        $CAN_CHANGE_OWNER $TRUE \
        $CAN_UPGRADE $TRUE \
        $CAN_ADD_SPECIAL_ROLES $TRUE \
        --send

The contract we are calling is a dedicated system contract deployed for this purpose on the Elrond blockchain. We have to attach a value of 0.05 EGLD to the transaction, as this is the fee for creating an NFT collection, and we can specify several flags: whether the collection is upgradeable, whether special roles can be added to it, whether the collection's owner can be changed and so on. And that's it: we have a collection.

We can verify the result of the transaction on the Elrond blockchain explorer. Here is a direct link.

Issue Non-Fungible Collection

This is the beauty of blockchain technology: everything is transparent yet confidential. I could have created a new wallet without providing any personal information to the blockchain and used it to generate NFTs anonymously. However, those 0.05 EGLD must come from somewhere and can be traced to the last decimal from wallet to wallet, through swap contracts and liquidity pools. The blue markings I added on the above transaction are pointless: anyone who wants to can check all the NFT collections issued recently, and it's not even hard, as the exact timestamp is right there. The transaction result gives us the identifier of our token collection, which we will use when issuing the NFTs.
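
You don't even need the explorer UI for this step: the public gateway exposes a REST endpoint for transactions, and the freshly assigned collection identifier (something like SERBDID-xxxxxx) shows up in its results. A sketch, with <TX_HASH> standing for the hash of the issue transaction:

>curl -s "https://gateway.elrond.com/transaction/<TX_HASH>?withResults=true"
# look for the token identifier in the smart contract results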

NFTs are usually launched in collections, because deploying a smart contract that creates and manages a set of tokens is expensive. Typically, thousands of NFTs are created in a single collection and are managed by a smart contract that receives "mint" transaction requests carrying a preconfigured amount of crypto, which gets stored in the contract; the owner can collect it at any time. Minting is just a fancy word for pulling the lever of the candy slot machine, which randomly gives you one of the NFTs in the contract via a transaction from the contract to the caller's wallet. If you are interested in a Smart Contract managing NFTs with a minting function, check the smart contract created by Julian Ćwirko on github. Elrond itself is built in Rust, and so are its smart contracts, even though anything that compiles to WASM should be supported. We are not interested in minting, though; we just want our digital identity to be ours. For phase 2, we will give ourselves (our wallet) the role to create NFTs in this collection. For this purpose, we will call the set_nft_special_role.sh script with the collection identifier extracted from the above transaction.
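
I won't reproduce the whole script here, but based on the Elrond ESDT documentation, set_nft_special_role.sh can look roughly like this: a sketch that calls setSpecialRole on the same system smart contract, granting the ESDTRoleNFTCreate role to our own wallet (identified by its hex-encoded public key); the gas limit and argument order follow the docs and may need adjusting.

#!/usr/bin/bash
WALLET_KEY=walletKey.pem
# collection identifier returned by the issue transaction, e.g. SERBDID-xxxxxx
TOKEN_IDENTIFIER=$1
TOKEN_IDENTIFIER_HEX="0x$(echo -n ${TOKEN_IDENTIFIER} | xxd -p -u | tr -d '\n')"
# hex-encoded public key of the wallet that will be allowed to create NFTs (ours)
WALLET_ADDRESS_HEX=0x001863b862703277521ed0cd5f22d5932921905fd6926aac157d08f3618b9c63
ROLE_HEX="0x$(echo -n ESDTRoleNFTCreate | xxd -p -u | tr -d '\n')"
CONTRACT=erd1qqqqqqqqqqqqqqqpqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqzllls8a5w6u
GAS_LIMIT=60000000
ELROND_PROXY=https://gateway.elrond.com
ELROND_CHAIN=1

./erdpy --verbose contract call $CONTRACT --recall-nonce \
    --pem=${WALLET_KEY} \
    --gas-limit=${GAS_LIMIT} \
    --value=0 \
    --proxy=${ELROND_PROXY} \
    --chain=${ELROND_CHAIN} \
    --function="setSpecialRole" \
    --arguments $TOKEN_IDENTIFIER_HEX $WALLET_ADDRESS_HEX $ROLE_HEX \
    --send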

We can now generate NFTs with our wallet address, using the generate_nft.sh script below:

#!/usr/bin/bash
WALLET_KEY=walletKey.pem
IPFS_CID=$1
TOKEN_IDENTIFIER=$2
TOKEN_IDENTIFIER_HEX="0x$(echo -n ${TOKEN_IDENTIFIER} | xxd -p -u | tr -d '\n')"

function generate_nft(){
  NFT_NAME=$1
  NFT_NAME_HEX="0x$(echo -n ${NFT_NAME} | xxd -p -u | tr -d '\n')"
  # the same asset, reachable through several public IPFS gateways
  GATEWAY_1_URL=https://dweb.link/ipfs/${IPFS_CID}/${NFT_NAME}.png
  GATEWAY_2_URL=https://ipfs.cf-ipfs.com/ipfs/${IPFS_CID}/${NFT_NAME}.png
  GATEWAY_3_URL=https://cloudflare-ipfs.com/ipfs/${IPFS_CID}/${NFT_NAME}.png
  GATEWAY_1_URL_HEX="0x$(echo -n ${GATEWAY_1_URL} | xxd -p -u | tr -d '\n')"
  GATEWAY_2_URL_HEX="0x$(echo -n ${GATEWAY_2_URL} | xxd -p -u | tr -d '\n')"
  GATEWAY_3_URL_HEX="0x$(echo -n ${GATEWAY_3_URL} | xxd -p -u | tr -d '\n')"
  # the call goes to our own wallet address (hex-encoded public key), which holds the NFT create role
  WALLET_ADDRESS=001863b862703277521ed0cd5f22d5932921905fd6926aac157d08f3618b9c63
  GAS_LIMIT=2805501
  ELROND_PROXY=https://gateway.elrond.com
  ELROND_CHAIN=1

  # arguments: collection id, quantity (01), NFT name, royalties (00), hash (00), attributes (00), URIs
  ./erdpy --verbose contract call $WALLET_ADDRESS --recall-nonce \
    --pem=${WALLET_KEY} \
    --gas-limit=${GAS_LIMIT} \
    --value=0 \
    --proxy=${ELROND_PROXY} \
    --chain=${ELROND_CHAIN} \
    --function="ESDTNFTCreate" \
    --arguments $TOKEN_IDENTIFIER_HEX \
      01 \
      $NFT_NAME_HEX \
      00 \
      00 \
      00 \
      $GATEWAY_2_URL_HEX \
      $GATEWAY_3_URL_HEX \
      --send
}

FILES="/files/*.png"
for f in $FILES
do
  echo "Processing $f file..."
  NFT_NAME_1="$(echo $f | sed 's/\/files\///' | sed 's/.png//')"
  echo $NFT_NAME_1
  generate_nft $NFT_NAME_1
  sleep 20
done

This script reads the png files one by one and generates an NFT for each, assigning it to my wallet. You can set multiple URLs for each NFT, making it accessible via multiple IPFS gateways, but every extra byte of data costs more per transaction. In the end, I decided to generate the NFTs one by one, with only the memories I wanted to keep (some things are better forgotten), so I made slight modifications to the above script.
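
For reference, a hypothetical invocation inside the erdpy container passes the CID of the IPFS folder that holds the png files (for example, the one Pinata reports after uploading the folder) and the collection identifier; both values below are placeholders:

>./generate_nft.sh <FOLDER_CID> SERBDID-xxxxxx
# one ESDTNFTCreate transaction per png file, 20 seconds apart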

We did it! My memories are now in my wallet, finally mine. If you managed to read this far and have an Elrond wallet, ask for one and I shall send it to you as a gift, up to a maximum of 50 NFTs.

NFTs on Web Elrond wallet

Now imagine all the other assets you might want to carry with you in your digital wallet: the proof that you own your house (with, in the scary version, a smart contract that executes automatically and transfers your home to the bank if you don't pay your mortgage), all the words you've ever written, all the art you've ever created, or, in a pandemic, the proof that you are vaccinated. The statements of our politicians, immutable on the blockchain. OK, maybe this is a bit too Orwellian. I would still argue that our digital identities should belong to us, not to some over-sunscreened tech CEOs, and I can now erase my Facebook account whenever I want. However, I might wait for an excellent D-App social media website owned by smart contracts before shutting it down. But that would be a web3 discussion, for which we need another article.

You should be both terrified and excited about the possibilities lying ahead of us, and the pressure is on our shoulders to make sure the technological future we are building doesn't turn against us. Whether NFTs will be used to prove the ownership of art, real estate and digital identity, or whether they will just be candies coming out of slot-machine smart contracts for a boost of dopamine and the hope of getting rich fast, is still up to us. Owning my digital self brings me a bit of joy and makes me feel responsible for everything I do and say, because even if the human mind forgets, current-era technology won't. Everything will last forever, especially on blockchains, and IPFS does not have an "erase my profile" button.