The interoperability of internet platforms is getting belated but worthwhile policy attention
Even though email has been around since the dawn of the internet, it remains, to this day, one of its most widely used applications. Last year, an estimated 4 billion people with active email accounts exchanged over 300 billion email messages every day. Even though we often complain about having to deal with its ever-increasing volumes, email remains the single most widely used means of communication on the planet.
The ability to exchange messages between different computer systems was made possible shortly after the Massachusetts Institute of Technology developed the Compatible Time-Sharing System, which allowed multiple users to log into a central computer. In those days, to send a message, you used the file transfer protocol (FTP) to append text to the end of a private ‘mailbox’ file. This file could only be read by the person for whom the messages were intended, so to check the messages others had sent them, users had to log on to the server and open this file.
A messaging system like this could only be used to exchange messages between computers within the same closed environment, either on the same corporate network, or, in some cases, one using the same operating system. As one can well imagine, this was a significant constraint, and had this remained the only way that messages were exchanged between computers, email might never have proliferated to the extent it has.
In 1971, Ray Tomlinson developed a messaging protocol that identified the recipient of electronic messages with reference to a given user at a specified computer location. He adopted the now ubiquitous ‘@’ sign as a means to distinguish between the user and the computer system on which that user’s mailbox file resided. Precisely because the messaging system he had built used an open, interoperable protocol that was independent of the underlying operating technology, it was widely adopted, allowing electronic messages to be freely exchanged over the internet. This is how the Simple Mail Transfer Protocol, or SMTP, became the standard for transporting messages over the internet, regardless of the operating system or email client you used. There is probably no better example of how the decision to use open, interoperable protocols resulted in the widespread adoption of a foundational technology.
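Tomlinson’s convention survives unchanged: everything before the ‘@’ names the user, everything after it names the machine that holds that user’s mailbox. A minimal sketch in Python illustrates the split (the address used is a made-up example):

```python
# Split an email address into its two parts, following Tomlinson's
# convention: the text before the '@' is the user (local part), the
# text after it is the host where that user's mailbox resides.
def split_address(address: str) -> tuple[str, str]:
    user, sep, host = address.rpartition("@")
    if not sep or not user or not host:
        raise ValueError(f"not a user@host address: {address!r}")
    return user, host

user, host = split_address("alice@example.com")
print(user, host)  # alice example.com
```

Every mail system that speaks SMTP performs essentially this separation: the host part tells it where to deliver the message, and the user part tells the receiving system whose mailbox to append it to.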
Much of the core infrastructure of the internet has been designed in this fashion, using open, interoperable protocols that have, in turn, been used to build the various services that make up our current internet experience. However, despite the fact that many of these services are built on open protocols, a majority, if not all of them, are only accessible within tightly controlled, closed environments. This includes not just the purely digital spaces through which we receive information and entertainment, but also the many digital interfaces through which we access offline services, such as the applications we use for shopping, transportation, banking and healthcare. Since all these digital on-ramps to the offline world were built by corporate entities, their singular focus has been the customer. As a result, they have for the most part overlooked the need to build open digital systems for the entire industry. A whole generation of online interactions has consequently taken place in silos, and the lack of interoperability is starting to adversely affect the next stage of our digital evolution.
To better understand this, look no further than the financial services industry.
Banks were early adopters of digital technologies, offering online banking facilities long before other sectors even started to come to grips with all that digital tools had to offer. However, they focused chiefly on providing digital services to their customers, and as a consequence completely overlooked the need to build digital bridges within the financial services ecosystem. The upshot is that, despite all the remarkable improvements we have witnessed in digital transactions between banks and their customers, transactions that involve multiple banks remain painfully analogue to this day.
Take loans, for example. Notwithstanding all our advances in online banking and digital payments, the process of availing of a loan—particularly from institutions with which you don’t already have an account—remains unbearably cumbersome. All the data necessary to comply with Know-Your-Customer obligations or to evaluate a customer’s eligibility for a loan already resides with some entity or other within the ecosystem. Had we put in place data-sharing protocols, it would have been trivial to share that data with any other entity within the ecosystem that was permitted to access it. This would have greatly simplified many of the processes for which we still rely on paper-based mechanisms for data transfers.
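The kind of data-sharing protocol described here amounts to a consent check: the entity holding a customer’s verified data releases it only to requesters the customer has authorised. The sketch below is a purely hypothetical illustration of that idea; none of the names correspond to any real standard or system:

```python
# Hypothetical sketch of consent-gated data sharing: a data holder
# releases a customer's KYC record only to entities that customer has
# explicitly authorised. Illustrative only; not any real protocol.
from dataclasses import dataclass, field


@dataclass
class DataHolder:
    records: dict[str, dict]  # customer id -> verified KYC record
    consents: dict[str, set[str]] = field(default_factory=dict)

    def grant_consent(self, customer: str, requester: str) -> None:
        # The customer authorises a named requester to see their record.
        self.consents.setdefault(customer, set()).add(requester)

    def fetch_kyc(self, customer: str, requester: str) -> dict:
        # Release the record only to an authorised requester.
        if requester not in self.consents.get(customer, set()):
            raise PermissionError(f"{requester} not authorised for {customer}")
        return self.records[customer]


bank_a = DataHolder(records={"cust-1": {"name": "A. Sharma", "kyc_verified": True}})
bank_a.grant_consent("cust-1", "bank-b")
print(bank_a.fetch_kyc("cust-1", "bank-b"))
```

With a shared protocol of this shape, a lender you have never dealt with could, with your permission, pull the KYC record your existing bank has already verified, instead of asking you to submit the same documents on paper all over again.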
Countries around the world are waking up to the urgent need to enable open, interoperable protocols. The Consumer Data Right initiative in Australia and the Data Governance Act in Europe are but two examples of regulatory measures being put in place to enable data-sharing protocols. Since there are no pre-existing protocols, we are being forced to build them from scratch, so that the various bespoke systems that banks have built can operate with one another.
As we start filling in the missing pieces of our digital infrastructure, I can’t help reflecting on how much easier all this would have been had we used open protocols in the first place.
Rahul Matthan is a partner at Trilegal and also has a podcast by the name Ex Machina. His Twitter handle is @matthan