McKinsey Report on Tokenization – Reply
Posted on August 20, 2023 | By Victor Rollman
McKinsey’s Perspective on Tokenization: Embracing the Positive and Identifying Opportunities for Improvement
The McKinsey report presents an insightful perspective on the realm of tokenization. It offers clear and concise explanations of the tokenization process and examines the obstacles it faces, particularly cost considerations and adoption hurdles.
A notable highlight is that the report wisely avoids the misleading and unnecessary term “Real World Asset.” This term often creates needless discord between proponents of blockchain solutions and key decision-makers such as CIOs, CTOs, and mainstream venture capital investors. McKinsey’s decision to omit it is commendable.
The process of asset tokenization benefits greatly from aligning with established financial terminology and practices. McKinsey astutely categorizes assets into three fundamental groups: financial, tangible, and intangible assets. This contextual approach to describing tokenization makes considerable sense.
McKinsey succeeds in offering a succinct overview of the tokenization process. However, it falls short of elaborating on the distinctions in process and implementation dictated by the nature of the underlying assets. Specifically, the report doesn’t thoroughly explore the contrasts between natively digital assets and those that need to be digitized prior to tokenization.
For instance, servicing natively digital assets differs significantly from servicing tokenized tangible assets. Handling a tokenized patent, trade secret, or digitally stored music track is a very different matter from handling a fractional ownership claim on a physical painting stored in a secured warehouse.
Consequently, the focus shifts to the challenge of digitizing the underlying asset, or representing a natively digital asset on-chain. To realize the efficiency, transparency, smart-contract automation, and auditability that blockchain technology offers, machines must carry out more of the work than humans do. In finance, the ability of computers to resolve reconciliation discrepancies becomes paramount.
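To make this concrete, here is a minimal Python sketch of what machine-driven reconciliation reduces to once both parties hold a deterministic representation of the same contract: diffing two independently computed cash-flow ledgers. All names and figures here are illustrative, not any particular system’s API.

```python
from datetime import date

# Each party independently records expected cash flows as {date: amount}.
# With a shared, deterministic contract representation, both ledgers
# should match; reconciliation reduces to a mechanical diff.
ledger_a = {date(2024, 6, 30): 25_000.00, date(2024, 12, 31): 1_025_000.00}
ledger_b = {date(2024, 6, 30): 25_000.00, date(2024, 12, 31): 1_024_999.00}

def reconcile(ours: dict, theirs: dict) -> list[str]:
    """Return a human-readable list of discrepancies between two ledgers."""
    issues = []
    for d in sorted(set(ours) | set(theirs)):
        a, b = ours.get(d), theirs.get(d)
        if a is None or b is None:
            issues.append(f"{d}: entry missing on {'their' if b is None else 'our'} side")
        elif abs(a - b) > 0.005:  # tolerance for rounding differences
            issues.append(f"{d}: amount mismatch {a} vs {b}")
    return issues

print(reconcile(ledger_a, ledger_b))
# ['2024-12-31: amount mismatch 1025000.0 vs 1024999.0']
```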
Tokenization of financial assets emerges as a distinctive scenario, and the McKinsey report, while comprehensive, doesn’t fully explain why this case is special. Financial assets are, at their core, digital, algorithmic contracts that encapsulate cash flows over time. Because these contracts are defined by calculations, they lend themselves to standardized, deterministic tokenization through blockchain or DLT technology.
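To illustrate what “defined by calculations” means, the following sketch expands a handful of contract terms into a complete cash-flow schedule, loosely in the spirit of an ACTUS principal-at-maturity (PAM) contract. The term names and the simplified annual schedule are stand-ins, not the actual ACTUS specification.

```python
from datetime import date

# Simplified contract terms for a fixed-rate bullet bond,
# loosely inspired by an ACTUS PAM (principal-at-maturity) contract.
terms = {
    "notional": 1_000_000.00,
    "rate": 0.05,            # annual fixed rate
    "start": date(2024, 1, 1),
    "maturity": date(2027, 1, 1),
}

def cash_flow_schedule(t: dict) -> list[tuple[date, str, float]]:
    """Deterministically expand contract terms into dated cash flows."""
    flows = []
    for year in range(t["start"].year + 1, t["maturity"].year + 1):
        flows.append((date(year, 1, 1), "interest", t["notional"] * t["rate"]))
    flows.append((t["maturity"], "principal", t["notional"]))
    return flows

for d, kind, amount in cash_flow_schedule(terms):
    print(d, kind, f"{amount:,.2f}")
# 2025-01-01 interest 50,000.00
# 2026-01-01 interest 50,000.00
# 2027-01-01 interest 50,000.00
# 2027-01-01 principal 1,000,000.00
```

Because the schedule is a pure function of the terms, any party (or smart contract) computing it arrives at the same result.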
This fusion of well-defined financial contracts with blockchain/DLT technology holds tremendous potential for revamping traditional finance and fostering innovation. The report acknowledges the success of tokenizing cash and cash equivalents. However, looking beyond these, there is a broader realm of financial contracts on which tokenization can have a greater impact than the report’s assessment suggests.
The benefits of financial asset tokenization, grounded in algorithmic standards, extend beyond what McKinsey acknowledges. This includes enhanced capital efficiency, democratized access to markets, operational cost savings, improved compliance, auditability, and transparency, as well as the potential for more agile infrastructure.
In terms of the challenges McKinsey addresses, the report does well to underscore the successes of tokenizing cash and cash equivalents. The next logical step is the tokenization of fund shares, which showcases blockchain/DLT’s technical prowess in facilitating efficient value transfer, settlement, and collateralization.
However, there are nuanced disagreements or modifications to some of the challenges discussed in the report:
- Technology and infrastructure preparedness: The report recognizes that managing private keys isn’t new; the real barrier to implementation is the scarcity of enterprise-grade blockchains and the associated tooling and security protocols. The tech solutions driven by certain major players don’t necessarily meet the broader industry’s standards.
- Limited short-term business case and high implementation cost: The report spotlights standardized and algorithmic financial contract definitions as pivotal to mid- and back-office efficiency. Such a representation is crucial irrespective of whether blockchain/DLT systems are employed.
- Market immaturity: The report hints at the inflection point for tokenizing cash equivalents and fund shares. Yet, the momentum extends further to encompass debt and structured instruments, with ACTUS-enabled systems demonstrating their potential.
- Regulatory uncertainty: Global players grasp blockchain/DLT’s potential, aiming to construct capital market infrastructures without relying on a single country’s dominance. Regulatory delays can cede an advantage to competitors in shaping future markets.
- Standard setting: McKinsey prompts participation in standard-setting initiatives, but innovation truly stems from recognizing that financial assets’ cash flows are algorithmic and can be standardized using open-source solutions like ACTUS, as the sketch after this list illustrates.
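To close out the standard-setting point, here is a toy illustration of why open, algorithmic standards matter: when both counterparties parse the same machine-readable term sheet and apply the same deterministic expansion logic, their books agree by construction. The field names are hypothetical, not the actual ACTUS vocabulary.

```python
import json
from datetime import date

# A standardized, machine-readable term sheet. The point of open standards
# like ACTUS is that every party parses the same document and applies the
# same deterministic cash-flow logic. Field names here are illustrative.
term_sheet = json.dumps({
    "contractType": "fixed_rate_bullet",
    "notional": 1_000_000.00,
    "rate": 0.05,
    "start": "2024-01-01",
    "maturity": "2027-01-01",
})

def schedule(doc: str) -> list[tuple[str, str, float]]:
    """Expand a term sheet into cash flows: same input, same output, anywhere."""
    t = json.loads(doc)
    first = date.fromisoformat(t["start"]).year + 1
    last = date.fromisoformat(t["maturity"]).year
    flows = [(f"{y}-01-01", "interest", t["notional"] * t["rate"])
             for y in range(first, last + 1)]
    flows.append((t["maturity"], "principal", t["notional"]))
    return flows

party_a_books = schedule(term_sheet)   # e.g., the issuer's back office
party_b_books = schedule(term_sheet)   # e.g., the investor's custodian
assert party_a_books == party_b_books  # nothing left to reconcile
```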
The report concludes by advocating a shift from tokenizing cash equivalents to cash flow-based instruments. However, genuine, intrinsic digitization of financial assets, rooted in the ACTUS standard, remains a prerequisite for effective tokenization. This holistic approach holds the promise of revolutionizing the financial landscape and catalyzing further advances in the field.