Owndata

Friday, November 23, 2018

Unification’s UApps: the biggest fish in the sea called the data economy.


The above statement is so true that in making it, I have no fear of being contradicted. The Unification UApp is a quantum leap in the evolution of the data economy, so much so that, once it is fully deployed, the data economy is sure to move to a new level. I shall soon make clear why this is so.

Permit me here to further explain what the data economy is, as this concept is itself relatively new.

Although we all generate data, at either the individual or corporate level, the standard until recently has been to place so little value on the data so generated that we often freely sign away our rights to it in exchange for the right to use certain services, even when we are paying for those services, as in hospitals, banks, etc.

As the computer and the internet matured, they made it possible to collate very large amounts of such data, and with the introduction of machine learning and artificial intelligence, the collation, analysis and transmission of these data moved a few steps up. This gave birth to the concept of Big Data.

It was soon discovered, however, that the collated data was not adequately detached from the persons it was generated from. Successful incursions into the centralized data silos could, and in many instances did, put the persons linked to the data in grave danger. The data could also act as a powerful tool in the hands of ill-intentioned data thieves.

This, in brief, explains why the General Data Protection Regulation (GDPR) was ultimately established.

In spite of the danger posed by centralized data silos, and generally by the old order of data gathering and acquisition, the GDPR could not go so far as prohibiting data gathering. This was because the emergence of Big Data had led man to admirable new heights in the use of data in crucial areas like health, the environment, etc.

The GDPR did, however, set crucial new parameters for data gathering and for the data generator's continued claim to the data linked to them. Two of the crucial parameters are (1) the right of the person generating data to give or withhold permission for the use of data linked to him (his data), and (2) the right to have such data erased.

The terms the data generator is allowed to set include the right to obtain a value exchange (monetary or in kind) for the release of his personal data, or to expressly grant it for free.

This proposition for the incentivisation of the data owner essentially led to what is now known as the data economy.

The data ecosystem has a number of actors that play different roles along the value chain it represents. These are:

1.  Data providers: these collect user data by providing engaging applications or services. They maintain silos in which the data they obtain are stored.

2.  Data consumers: these are entities willing to obtain data in exchange for some remuneration. They may also be data gatherers who simply need to extend their store of data by purchasing from other data gatherers (providers).

3.  End users: these are the sources of the generated data. They offer their data either directly or in the course of consuming an application or service.

This chain suffered from the absence of certain elements. Most significant was the need to keep the data from falling into the wrong hands, and to ascertain the source of any leakage if one did happen.

Man was fortunate in that this new realization coincided with the appearance of a new technology – the Blockchain technology – which holds great promise for solving issues relating to data handling – capture, storage and retrieval especially.

Even so, some problems persisted. One was that there was no unified method for the standardization and transfer of data. Thus, although individual data silos had moved up in terms of the size of data they could handle, their capacity for analysis, etc., the overall data industry remained fragmented, and enterprises could not transfer data to one another in a seamless fashion.

Herein lies the huge significance of the introduction of Unification's data liquidity protocol, which deploys a series of software development kits (SDKs), as CAPSULE, on enterprise servers to standardize and encrypt the enterprise's existing data, allowing the parties involved to transfer data peer-to-peer in a harmonized format.
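As an illustration only (the post does not show Unification's actual SDK interfaces, so the helper and schema below are hypothetical), the standardization step can be sketched as mapping each enterprise's internal record format onto one canonical schema, so that identical data from different silos becomes byte-identical:

```python
import json
import hashlib

# Hypothetical canonical schema: every enterprise record is mapped
# to the same field names before leaving the silo.
CANONICAL_FIELDS = ("user_id", "timestamp", "payload")

def standardize(record: dict, field_map: dict) -> str:
    """Rename enterprise-specific fields to the canonical schema and
    serialize deterministically (sorted keys, no extra whitespace)."""
    canonical = {target: record[source] for target, source in field_map.items()}
    if set(canonical) != set(CANONICAL_FIELDS):
        raise ValueError("record does not cover the canonical schema")
    return json.dumps(canonical, sort_keys=True, separators=(",", ":"))

# Two providers with different internal field names...
a = standardize({"uid": 7, "ts": 1543000000, "data": "x"},
                {"user_id": "uid", "timestamp": "ts", "payload": "data"})
b = standardize({"member": 7, "when": 1543000000, "body": "x"},
                {"user_id": "member", "timestamp": "when", "payload": "body"})

# ...produce byte-identical canonical records, so their hashes match too.
assert a == b
assert hashlib.sha256(a.encode()).hexdigest() == hashlib.sha256(b.encode()).hexdigest()
```

The point of the sketch is that a shared format is what makes peer-to-peer exchange "harmonized": once records are canonical, any two parties can compare, hash, or transfer them without bespoke translation layers.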

The developers of the Unification project also observed another hurdle: the potentially high cost of storing data on the blockchain. Blockchains are arguably not the most conducive medium for storing large quantities of data. They are also not sufficiently private, since anyone can query a blockchain all the way back to its genesis block, such that a compromised algorithm could enable access hurdles to be jumped. Finally, it was recognized that the immutable nature of a blockchain represents a potential violation of the user's right to have his or her data erased.

In reaction to the above, data is not stored on-chain in the Unification project. On-chain transactions are limited to hashes of the data, which are used to validate it. Actual data transfer is done outside the blockchain via state channels.
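A minimal sketch of this hash-on-chain pattern (the function names are illustrative, not Unification's API): the chain stores only a fixed-size digest, the payload travels off-chain, and the recipient validates what arrived against the committed hash.

```python
import hashlib

def commit_hash(payload: bytes) -> str:
    """What would be written on-chain: a fixed-size digest, not the data itself."""
    return hashlib.sha256(payload).hexdigest()

def validate(payload: bytes, onchain_hash: str) -> bool:
    """Recipient re-hashes the off-chain payload and compares digests."""
    return hashlib.sha256(payload).hexdigest() == onchain_hash

data = b"user-consented dataset, transferred via a state channel"
committed = commit_hash(data)  # on-chain cost: 64 hex chars, regardless of data size

assert validate(data, committed)                     # intact transfer passes
assert not validate(data + b" tampered", committed)  # any alteration is detected
```

This also shows why the design sidesteps the erasure problem noted above: deleting the off-chain data leaves only an opaque digest on the immutable chain.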

A final significant element of the Unification project is its Unified Verifiable Credential ID (UVCID) infrastructure which allows actors to securely issue, hold, distribute and cryptographically verify personal information.
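As a simplified sketch of such credential verification (a real system like UVCID would use public-key signatures; the HMAC below is a stand-in so the example stays self-contained, and all names are hypothetical):

```python
import hmac
import hashlib
import json

ISSUER_KEY = b"issuer-secret-key"  # hypothetical; a real issuer signs with a private key

def issue_credential(claims: dict) -> dict:
    """Issuer attaches a cryptographic tag to the holder's claims."""
    body = json.dumps(claims, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": tag}

def verify_credential(cred: dict) -> bool:
    """Verifier recomputes the tag; any edit to the claims breaks it."""
    body = json.dumps(cred["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["proof"])

cred = issue_credential({"holder": "alice", "over_18": True})
assert verify_credential(cred)

cred["claims"]["over_18"] = False   # tampering by the holder...
assert not verify_credential(cred)  # ...is caught by the verifier
```

The design point is that the holder can carry and present the credential anywhere, while the verifier needs only the issuer's verification material, never a call back to a central database.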

Unification is, in sum, an ecosystem of data providers, data consumers and end users that utilizes blockchain technology, machine learning and auxiliary software components to allow the monetized, secure transfer of data (data liquidity).

It is built around a web of smart contracts which allow the execution and flow of data transfer agreements between data providers and data consumers while putting control of the data flow in the hands of the users themselves. This is in recognition of the fact that data suppliers ultimately own, and should be remunerated for, their data.

While Unification is not the only blockchain project in this sphere, it distinguishes itself by introducing:
a.  A data standardization protocol, thus enabling data liquidity.
b.  Avoidance of the potentially high cost of on-chain data storage, through the use of state channels and the replacement of on-chain data with hashes.
c.  A verifiable credential ID infrastructure (UVCID).

This game-changing project deserves the patronage of all cryptocurrency investors and adoption by all actors in the data economy.
