Ocean Protocol: Unleashing the Potential of the New Data Economy
Ocean Protocol is a decentralized data exchange protocol that provides a platform for people to own and monetize their data. Launched in 2018 on Ethereum, it is one of the first Web3 projects focused on the Data Economy. The aim of Ocean Protocol is to give individuals and organizations the ability to share, sell, and purchase data in a secure and privacy-preserving manner. The project has gone through a few iterations and is currently at stage V4. The current structure of Ocean focuses on three main features: the Ocean Data Market, veOCEAN staking, and Data Farming. By using blockchain technology, Ocean offers an alternative way of tracking data ownership: the acts of publishing data, purchasing data, and consuming data are all recorded on the blockchain to create a tamper-proof audit trail.
Authors: Valentin Kalinov, Christian Viehof
What problem is Ocean solving?
How do we ensure that the playing field around data stays level? If you are a data scientist, how can you get access to data without having to join a big corporation like Google?
The rise of big data and AI has brought about a new era of technological advancement, allowing companies to extract valuable insights from vast amounts of data. However, this has created a divide between companies with access to vast amounts of data and startups that lack such access. This inequality has significant implications for both society and businesses. It is therefore crucial that data is not locked away in silos, accessible only to big corporations.
The concentration of data in the hands of a few large corporations can also have significant implications for society. For example, data collected by social media companies can be used to manipulate public opinion, shape political outcomes, and perpetuate inequality. The Cambridge Analytica scandal demonstrated how the data collected by Facebook was used to influence the outcome of the 2016 US presidential election.
As the world becomes increasingly digital, data has become an essential asset for many businesses and organizations. However, traditional data-sharing methods are often fraught with issues related to privacy, security, and transparency. Ocean Protocol provides a solution to these issues by leveraging the power of blockchain technology to create a decentralized and secure platform for data sharing.
The Importance of Data Sharing
Breaking down data silos and ensuring data accessibility is crucial to promote innovation and democratize the benefits of AI. Here are a few examples of why it is important to share data:
- Encouraging Innovation and Competition — Data silos restrict the flow of information and hinder the potential for innovation.
- Enhancing AI Model Accuracy and Robustness — By breaking down data silos and enabling data sharing, AI developers can access more diverse and representative datasets.
- Fostering Collaboration and Knowledge Exchange — Encouraging data sharing allows for the cross-pollination of ideas and fosters collaboration between organizations, accelerating AI advancements and contributing to the development of better solutions for a wide range of problems.
- Promoting Transparency and Accountability — When data is confined to a few organizations, it becomes challenging to hold these entities accountable, as the Cambridge Analytica example above illustrates.
Ocean tries to establish policies and frameworks that encourage data sharing (while preserving data privacy), collaboration, and open innovation, ensuring that AI’s transformative potential is realized for the benefit of all.
One of the most significant features of Ocean Protocol is the privacy it provides to data owners. The protocol ensures that the data shared on its platform is encrypted and remains private. This means that individuals and organizations can share their data without worrying about it being misused or stolen. Moreover, Ocean Protocol provides individuals with the option to share their data anonymously or selectively. This allows data owners to control who has access to their data and how it is used.
Data Monetization (the Marketplace)
The Ocean Market plays a crucial role in the structure of the protocol by allowing individuals and organizations to monetize their data. The marketplace is powered by a native token called Ocean (OCEAN), which is used to facilitate transactions on the platform. Data on the marketplace is managed by Data NFTs (ERC721 tokens) and Datatokens (ERC20 tokens). Both of these standards serve as interfaces to connect and manage data assets. A Data NFT represents the unique asset and the Intellectual Property (IP) rights associated with it. In essence, the owner of the Data NFT controls the asset associated with it. By default, revenue is sent to the owner of the Data NFT, and this updates automatically when the Data NFT is transferred to a new owner. Datatokens, on the other hand, act as access control: the owner of the Data NFT grants access to the data by issuing ERC20 tokens. Each token represents a type of license. For example, one could grant access to a dataset for three days or indefinitely; a license could also include the rights to use and resell the data. By holding the token, one can prove the rights they have to that asset. If a Datatoken is transferred to a different address, the original holder no longer has the rights to use the data. Of course, all of these rules not only need to be embedded into smart contract logic but must also be part of legal contracts to create a fully functional data economy.
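The division of labor between the two token standards can be sketched in a few lines of Python. This is a simplified, illustrative model only, not Ocean's actual smart contract code; all class and method names here are invented for explanation:

```python
# Simplified model of the Data NFT / Datatoken pattern described above.
# NOT Ocean's actual contracts; names and fields are hypothetical.

class DataNFT:
    """Represents a unique data asset and its IP rights (ERC721-like)."""
    def __init__(self, asset_id: str, owner: str):
        self.asset_id = asset_id
        self.owner = owner  # revenue recipient follows the owner by default

    def transfer(self, new_owner: str) -> None:
        self.owner = new_owner  # revenue now flows to the new owner

    def issue_datatoken(self, license_terms: str, supply: int) -> "Datatoken":
        return Datatoken(self, license_terms, supply)


class Datatoken:
    """Fungible access tokens (ERC20-like); holding one proves a license."""
    def __init__(self, data_nft: DataNFT, license_terms: str, supply: int):
        self.data_nft = data_nft
        self.license_terms = license_terms
        self.balances = {data_nft.owner: supply}

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        assert self.balances.get(sender, 0) >= amount, "insufficient balance"
        self.balances[sender] = self.balances.get(sender, 0) - amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

    def can_access(self, who: str) -> bool:
        # Access right = holding at least one datatoken
        return self.balances.get(who, 0) >= 1


# Usage: a publisher mints a Data NFT, issues datatokens, sells one license
nft = DataNFT("dataset-001", owner="alice")
dt = nft.issue_datatoken("3-day access", supply=100)
dt.transfer("alice", "bob", 1)
print(dt.can_access("bob"))    # True: Bob now holds a license token
print(dt.can_access("carol"))  # False: Carol holds none
```

Note how transferring the Datatoken moves the access right with it, mirroring the rule that the original holder loses the right to use the data once the token changes hands.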
Enabling data privacy with Compute-to-Data
The Ocean Market offers Compute-to-Data services, where data is shared in a privacy-preserving manner. Instead of revealing the data by selling the dataset, the seller sells the right to run an algorithm on top of the data. This way, no data gets revealed during computation. It is not necessary to move the data; the algorithm is sent to the data. This service opens new opportunities for collaboration in a decentralized way. For example, a hospital could publish a Compute-to-Data dataset, where users can run algorithms on top of the data without revealing patients' sensitive information.
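The core idea — the algorithm travels to the data, and only the result leaves — can be illustrated with a minimal Python sketch. This is a conceptual model, not Ocean's actual Compute-to-Data stack; the function and dataset here are hypothetical:

```python
# Conceptual sketch of Compute-to-Data: the buyer's algorithm runs inside
# the data owner's environment, and only the result is returned.
# Illustrative only; not Ocean's actual implementation.

from statistics import mean

def compute_to_data(private_dataset, algorithm):
    """Run a buyer-supplied algorithm where the data lives.

    The raw records never leave this function; only the result does.
    In a real system the result would also be checked (e.g. for its
    aggregation level) so it cannot leak individual records.
    """
    return algorithm(private_dataset)

# The hospital keeps patient records private...
patient_ages = [34, 57, 61, 45, 29]  # hypothetical sensitive data

# ...while a researcher submits only an algorithm:
average_age = compute_to_data(patient_ages, lambda data: mean(data))
print(average_age)  # 45.2 — an aggregate statistic, not the raw records
```

The researcher learns the average age, but at no point gains access to any individual patient record.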
veOCEAN and Data Farming
Holders of OCEAN tokens can lock OCEAN to earn yield and curate data. Users can lock their OCEAN tokens for up to 4 years and automatically receive passive staking rewards. veOCEAN is non-transferable. The passive rewards are generated from the fees collected by the Ocean Market: 50% of the community fees go to veOCEAN holders. In addition to passive rewards, veOCEAN holders can participate in active staking, which is associated with the concept of Data Farming. The Ocean protocol uses the voting power of veOCEAN token holders to create a curated market for datasets. By allocating their tokens to high-quality datasets, token holders can increase their rewards. The Data Farming program aims to incentivize a supply of relevant and high-quality data assets.
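veOCEAN follows the vote-escrow pattern, in which voting power scales with both the amount locked and the remaining lock duration. The sketch below shows the general shape of such a scheme under the assumption of linear scaling up to the 4-year maximum; the exact parameters and decay schedule of Ocean's implementation may differ:

```python
# Sketch of the vote-escrow idea behind veOCEAN: balance scales with the
# amount locked and the remaining lock time (capped at 4 years), decaying
# linearly toward zero as the unlock date approaches.
# Illustrative assumption of linear scaling; not Ocean's exact formula.

MAX_LOCK_YEARS = 4.0

def ve_balance(amount_locked: float, years_remaining: float) -> float:
    """veOCEAN balance for a given lock; non-transferable by design."""
    years_remaining = min(max(years_remaining, 0.0), MAX_LOCK_YEARS)
    return amount_locked * (years_remaining / MAX_LOCK_YEARS)

# Locking 1000 OCEAN for the full 4 years yields 1000 veOCEAN...
print(ve_balance(1000, 4.0))  # 1000.0
# ...decaying linearly: with 1 year left, 250 veOCEAN remain.
print(ve_balance(1000, 1.0))  # 250.0
```

This design rewards long-term commitment: the same amount of OCEAN yields four times the voting power (and reward weight) when locked for 4 years rather than 1.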
The classification of OCEAN according to the ITC
Economic Purpose (EEP): OCEAN is listed as a Settlement and Governance Token (EEP22TU03) due to its design as a means of payment combined with governance functionality through veOCEAN.
Industry Type (EIN): The issuer of OCEAN is active in the field of Other Software, Data Storage and Processing (EIN05DA05).
Technological Setup (TTS): OCEAN is an Ethereum ERC-20 Standard Token (TTS42ET01).
Legal Claim (LLC): OCEAN does not entitle its holder to any legal claim or rights against the issuing organization; therefore, it is listed as a No-Claim Token (LLC31).
Issuer Type (LIT): The dimension “Issuer Type” provides information on the nature of the issuer of the token. OCEAN’s platform is built by a private company. Its Issuer Type is a Private Sector Legal Entity (LIT61PV).
Regulatory Framework (EU) (REU): The dimension “Regulatory Status EU” provides information of the potential classification of a token according to the European Commission’s proposal for a Regulation on Markets in Crypto Assets (MiCA, Regulation Proposal COM/2020/593 final). OCEAN qualifies as a Utility Token (REU52) according to the definition provided in Article 3 (5) of Regulation Proposal COM/2020/593 final.
Consensus Mechanism (TCM): The dimension describes the mechanism that is deployed to achieve consensus on the token’s distributed ledger. OCEAN tokens are issued on Ethereum; therefore, they are listed as Proof-of-Stake (TCM71PS).
Type of Maximum Supply (EMS): The dimension describes the token’s type of maximum supply. The supply of OCEAN is capped at 1.41 billion tokens. Therefore, OCEAN supply is listed as Fixed (EMS81).
Primary Mode of Origination (EMO): The OCEAN token sale ended in May 2019, raising around $30,000,000. Therefore, the token is listed as Sale (EMO92).
Taxes (RTA): One common distinction can be drawn between crypto-assets: those that resemble ‘conventional’ assets, like securities, and which are merely recorded on DLT systems (Conventional Asset Tokens, RTA71), and those assets and activities that raise new regulatory challenges, such as virtual currencies (New Asset Tokens, RTA72; OECD 2020). OCEAN is listed in the Tokenbase as a New Asset Token (RTA72).
The International Token Standardization Association (ITSA) e.V.
The International Token Standardization Association (ITSA) e.V. is a not-for-profit association under German law that aims at promoting the development and implementation of comprehensive market standards for the identification, classification, and analysis of DLT- and blockchain-based cryptographic tokens. As an independent industry membership body, ITSA unites over 100 international associated founding members from various interest groups. In order to increase transparency and safety on global token markets, ITSA currently develops and implements the International Token Identification Number (ITIN) as a market standard for the identification of cryptographic tokens, and the International Token Classification (ITC) as a standard framework for the classification of cryptographic tokens according to their inherent characteristics. ITSA then adds each identified and classified token to the world’s largest register for tokens, our Tokenbase.
- The International Token Identification Number (ITIN) is a 9-digit alphanumeric technical identifier for both fungible and non-fungible DLT-based tokens. Thanks to its underlying Uniform Token Locator (UTL), ITIN presents a unique and fork-resilient identification of tokens. The ITIN also allows for the connecting and matching of other media and data to the token, such as legal contracts or price data, and increases safety and operational transparency when handling these tokens.
- The International Token Classification (ITC) is a multi-dimensional, expandable framework for the classification of tokens. Current dimensions include technological, economic, legal, and regulatory dimensions with multiple sub-dimensions. By mid-2021, there will be at least two new dimensions added, including a tax dimension. So far, our classification framework has been applied to 99% of the token market according to market capitalization of classified tokens.
- ITSA’s Tokenbase currently holds data on over 4000 tokens. Tokenbase is a holistic database for the analysis of tokens and combines our identification and classification data with market and blockchain data from external providers. Third-party data of several partners is already integrated, and API access is also in development.
If you like this article, we would be happy if you forward it to your colleagues or share it on social networks. More information about the International Token Standardization Association can be found on the Internet, on Twitter, or on LinkedIn.
Valentin Kalinov is an Executive Director at the International Token Standardization Association (ITSA) e.V., working to create the world’s largest token database, including a classification framework and unique token identifiers and locators. He has over five years of experience working at BlockchainHub Berlin in content creation and token analysis, as a project manager at the Research Institute for Cryptoeconomics at the Vienna University of Economics, and as a token analyst at Token Kitchen. You can contact Valentin via firstname.lastname@example.org and connect on LinkedIn if you would like to further discuss ITSA e.V. or have any other open questions.
Christian Viehof is an Executive Director at the International Token Standardization Association (ITSA) e.V., working to create the world’s largest token database including a classification framework and unique token identifiers and locators. He completed his Bachelor in Economics at the University of Bonn, the Hong Kong University and the London School of Economics and Political Science with a focus on Behavioral Economics and Finance. Currently pursuing his Master of Finance at the Frankfurt School of Finance and Management, you can contact him via email@example.com and connect with him on Linkedin, if you would like to further discuss ITSA e.V. or have any open questions.