Despite some volatility in the cryptocurrency market, the open innovation space of public blockchains seems to move forward undisturbed with the tokenization of the world. Beyond the more general forms of tokenization geared towards stores of value, means of exchange, and utility, we also see illiquid and implicit forms of value being turned into tokens, including the tokenization of property, likes, data-sharing, volunteer work, and charity. But what are the potential downsides of these forms of tokenization?
The motivation to tokenize seems to arise from the need to make economic value explicit. Take, for example, the use of personal data by online service platforms, where consumers feel wronged because the use of their data is not appropriately valued. In response, Datum aims to make this value exchange explicit through its DAT token. Tokenization also allows illiquid assets (e.g. real estate, business inventory, collectibles) to become more liquid, by having a tradeable token that digitally represents the respective asset on a public blockchain, thereby removing much of the friction currently caused by intermediaries.

However, tokenization is not without its potential pitfalls. Firstly, the value of these tokens is determined by markets of supply and demand, which are currently subject to wild speculation and consequently high volatility. As a result, the compensation received for a past value exchange (e.g. volunteer work) may suddenly no longer match its perceived value due to price swings. One way of dealing with volatility is the creation of closed systems (e.g. circular money), in which the tokens cannot be easily exchanged and participants are carefully selected and audited. However, these solutions usually result in bureaucratic systems that are difficult to govern and scale.

Another risk of making economic value explicit is that, in some instances, detrimental economic dynamics are introduced. Within the context of data-sharing, one could imagine that privacy becomes a matter of wealth: poor people will probably sell their data to get by, whereas rich people can buy their privacy.

A further cluster of issues stems from the requirement that the underlying asset be quantified, which introduces validity and reliability risks: does the token really represent the stated value or asset? Do likes, and the tokens that correlate with that metric, really reflect the quality of the content? Is someone with a lot of ‘volunteer tokens’ genuinely a do-gooder? For the tokenization of physical assets, some platforms propose the use of sensors and/or other external data feeds (i.e. oracles). However, these data feeds can potentially be sabotaged, creating a disconnect between the token and the underlying asset. Lastly, privacy could be further eroded, as tokenization also requires our daily lives to be quantified.

Although some of these issues are perhaps not unique to tokenization, since they already exist in some form (e.g. pay-for-data), cryptocurrencies could certainly push them to the extreme. While these issues are not insurmountable, they pose new challenges to our governance, technology, and economy.