DeFi and traditional finance could converge thanks to tokenization

Nonfungible tokens could soon become a bridge connecting the legacy financial system to the emerging fintech world. In a recent interview with Cointelegraph China, Adrian Lai, CEO of Liquefy — an investment firm and incubator for decentralized finance platforms — said that synthetic assets, NFTs and digital securities are redefining the way capital markets operate.

Lai believes, in particular, that synthetic assets could give anyone in decentralized finance access to essentially any asset, as long as there is a reliable data feed. In his view, this convergence between traditional finance and DeFi is inevitable.

Lai also pointed out that as the convergence between security tokens and digital currencies deepens, we will see increased activity between traditional finance and cryptocurrencies. He added:

“We are seeing a merger of security tokens, utility tokens and NFTs. NFTs can also now represent real assets, which was not considered several years ago. The convergence of traditional finance and the crypto space is increasing more and more.”

Lai cited centralized exchanges as an example, noting that some of them have moved beyond being simply trading venues. Platforms like BlockFi and Coinbase offer retail-focused services such as savings accounts and crypto payment options — services that make these platforms function, at least in part, like traditional financial institutions.

Lai explained that synthetic assets are designed to imitate other investment products. They can combine various derivatives, such as futures, options or swaps, to simulate an underlying asset. The underlying can be a stock, bond, index, commodity, currency or interest rate.
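
To make the mechanism concrete, here is a minimal sketch of a collateral-backed synthetic position that is valued entirely off an external price feed, in the spirit of what Lai describes. All names in it (the PriceOracle class, the "TSLA" feed, the 150% collateral ratio) are hypothetical illustrations rather than any specific protocol's API:

```python
# Sketch of a synthetic asset: exposure to an underlying instrument is
# minted against collateral and valued off an oracle price feed.
# PriceOracle, the "TSLA" feed and the 150% ratio are all hypothetical.

class PriceOracle:
    """Stand-in for an on-chain oracle; returns the latest reported price."""
    def __init__(self, feeds):
        self.feeds = feeds  # symbol -> latest price in USD

    def get_price(self, symbol):
        if symbol not in self.feeds:
            raise KeyError(f"no reliable data feed for {symbol}")
        return self.feeds[symbol]

class SyntheticPosition:
    """Synthetic exposure minted against collateral, marked via the oracle."""
    MIN_COLLATERAL_RATIO = 1.5  # 150% over-collateralization, illustrative

    def __init__(self, oracle, symbol, units, collateral_usd):
        self.oracle = oracle
        self.symbol = symbol
        self.units = units
        self.collateral_usd = collateral_usd

    def mark_to_market(self):
        # The position's value is defined entirely by the oracle price;
        # no share of the underlying stock or commodity is ever held.
        return self.units * self.oracle.get_price(self.symbol)

    def is_safe(self):
        return self.collateral_usd >= self.mark_to_market() * self.MIN_COLLATERAL_RATIO

oracle = PriceOracle({"TSLA": 700.0})
position = SyntheticPosition(oracle, "TSLA", units=2.0, collateral_usd=2500.0)
print(position.mark_to_market())  # 1400.0
print(position.is_safe())         # True: 2500 >= 1400 * 1.5 = 2100
```

The point of the sketch is that the position never holds the underlying asset: its value, and the system's solvency, depend entirely on the reliability of the data feed, which is precisely the oracle problem Lai raises below.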

Challenges ahead

Although the convergence of traditional finance and the crypto industry is inevitable, Lai believes the crypto industry still faces challenges such as liquidity constraints and the lack of reliable data oracles: “There is simply not enough information in the crypto space. When someone in crypto wants to trade illiquid assets, in many cases, there’s no adequate pricing data and other supportive information on the blockchain to facilitate the trade.”

Lai also pointed out that even though there is a lot of hype around NFTs, the current NFT market is only a digital collectible market, which does not require much liquidity. While Lai believes this collectible market is likely here to stay in the long run, several changes have to be made to help the broader NFT market grow further.

He thinks that breaking down an NFT into several parts for investment purposes could become a new trend for the digital collectible market:

“NFTs could also represent real assets, and creating a fraction of an NFT out of a real asset is a good way to offer traditional finance exposure to crypto. In this case, liquidity is important because you want to trade a fraction of the real asset.”
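
As a rough illustration of the fractionalization Lai describes, the sketch below locks a unique token in a vault and mints fungible fractions against it, so many holders can trade exposure to one underlying asset. The class names and the fractionalize helper are hypothetical; real designs typically pair a nonfungible token with a fungible one, but no particular standard is assumed here:

```python
# Illustrative NFT fractionalization: a unique token is locked in a vault
# and fungible fraction tokens are minted against it. All names here are
# hypothetical and do not refer to any specific token standard.
from dataclasses import dataclass, field

@dataclass
class NFT:
    token_id: int
    description: str  # e.g., a tokenized claim on a real-world asset

@dataclass
class FractionVault:
    nft: NFT
    total_fractions: int
    balances: dict = field(default_factory=dict)
    locked: bool = False

    def fractionalize(self, owner):
        """Lock the NFT and credit the owner with every fraction."""
        self.locked = True
        self.balances[owner] = self.total_fractions

    def transfer(self, sender, recipient, amount):
        """Fractions trade like any fungible token, enabling partial stakes."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient fraction balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

vault = FractionVault(NFT(1, "deed to a commercial property"), total_fractions=1_000_000)
vault.fractionalize("alice")
vault.transfer("alice", "bob", 25_000)  # Bob now holds a 2.5% stake
print(vault.balances)  # {'alice': 975000, 'bob': 25000}
```

Once fractions exist, liquidity matters for exactly the reason Lai gives: holders want to trade a slice of the real asset, not the indivisible whole.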

Tokenizing DeFi

According to Lai, tokenization has so far been carried out primarily through security token offerings. However, he believes DeFi will change this, as tokenizing assets through DeFi protocols could make tokenization accessible to everyone:

“While security tokens are backed by real-world assets and their ownership is legally recognized, the liquidity of security tokens can vary, and we’ve seen in many cases that when security token owners want to sell their holdings, they may not be able to execute the trade at the best price.”

Lai believes that the maturation of DeFi and tokenization of real-world assets via DeFi protocols will have more potential than using the traditional security token offering model: “Tokenizing assets in a decentralized fashion opens up much greater liquidity for asset owners. At the same time, it gives real-world assets exposure to all of DeFi’s users.”
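
One common way DeFi supplies that liquidity is an automated market maker rather than an order book. Below is a minimal sketch of a constant-product pool (the x * y = k design popularized by Uniswap) holding hypothetical asset fractions against a stablecoin; the token names, reserves and fee are illustrative assumptions, not a reference to the article's subjects:

```python
# Minimal constant-product market maker (x * y = k) sketch, showing how
# tokenized asset fractions can gain continuous liquidity without an
# order book. Reserves, fee and token names are illustrative.

class ConstantProductPool:
    def __init__(self, reserve_fraction, reserve_stable):
        self.reserve_fraction = reserve_fraction  # fractions of a tokenized asset
        self.reserve_stable = reserve_stable      # e.g., a stablecoin

    def quote_sell_fractions(self, amount_in, fee=0.003):
        """Stablecoins received for selling fractions, preserving x * y = k."""
        amount_in_after_fee = amount_in * (1 - fee)
        k = self.reserve_fraction * self.reserve_stable
        new_fraction_reserve = self.reserve_fraction + amount_in_after_fee
        new_stable_reserve = k / new_fraction_reserve
        return self.reserve_stable - new_stable_reserve

    def sell_fractions(self, amount_in):
        amount_out = self.quote_sell_fractions(amount_in)
        # The full input (fee included) stays in the pool, accruing to LPs.
        self.reserve_fraction += amount_in
        self.reserve_stable -= amount_out
        return amount_out

pool = ConstantProductPool(reserve_fraction=100_000, reserve_stable=50_000)
print(round(pool.sell_fractions(1_000), 2))  # ~493.58; the curve sets the price
```

Because the curve always quotes a price, even a thinly traded tokenized asset can be sold immediately, with slippage rather than a failed trade as the cost.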

As Cointelegraph previously reported, 2021 will likely be a pivotal year for DeFi that will transform the way financial services are used. So, could tokenization also play a part in this?


