Long ago, I worked as an analyst on Wall Street. The first company that I analysed was Federal Express, which at the time had not yet shipped its first package.
The idea behind FedEx was simple and compelling: the cost of complexity was higher than the cost of air transport, so the company would ship all of its packages overnight to Memphis, Tennessee. By radically simplifying the myriad combinations of start and end points (the only routes were to and from Memphis), the company could reliably deliver every package the next day.
All of that has changed, thanks to developments in information technology. With the ability to make reliable predictions, we can put people and things into categories: market segments, disease risks, likely loan defaulters, potential purchasers, and so on. That is big data.
But now we can also do “small data”. We can treat many things, even packages, like individuals. The exceptions—whether individual genotypes, individual privacy preferences, or digital rights to use content in specific contexts—have become the rule. We don’t need to guess at everyone’s preferences or settle for one-size-fits-all policies.
Over time, we will be able to figure out which people, based on their genotype, will be helped or harmed by a particular drug, or how children learn best with personalized feedback, or how to produce furniture and clothing in a world of 3D printers and real-time modelling. And so on. Would you like your customized car seat in leather or cloth, sir? The market will rise to the challenge. So the design challenge of the future will be to create good defaults, with easy editing and customization tools for those who care.
But this change will raise challenging social and political questions as well, particularly concerning privacy preferences and health care—both already controversial issues.
Of course, no one can define or guarantee privacy. But individuals could get the opportunity to control the use of their data, and entities that want to use it could negotiate with them. After all, data collectors somehow manage to record individuals’ purchase histories, their airline seat preferences, and so on. There is no reason why they could not also record how and by whom each piece of such information can be used.
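To make that concrete, here is a minimal sketch in Python of what attaching usage permissions to each piece of data might look like. Every name here (DataRecord, grant, may_use, the sample parties and purposes) is hypothetical, an illustration of the idea rather than any real company’s system:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each piece of personal data carries, alongside its
# value, a record of who may use it and for what purposes.

@dataclass
class DataRecord:
    owner: str    # the individual the data describes
    kind: str     # e.g. "purchase_history" or "seat_preference"
    value: object # the datum itself
    allowed_uses: dict = field(default_factory=dict)  # party -> permitted purposes

    def grant(self, party: str, purpose: str) -> None:
        # Record the owner's permission for `party` to use this datum
        # for `purpose`.
        self.allowed_uses.setdefault(party, set()).add(purpose)

    def may_use(self, party: str, purpose: str) -> bool:
        # Check a proposed use against the owner's recorded preferences.
        return purpose in self.allowed_uses.get(party, set())

# Example: an airline may use a seat preference for booking, but not for ads.
record = DataRecord(owner="alice", kind="seat_preference", value="aisle")
record.grant("united", "booking")
print(record.may_use("united", "booking"))      # True
print(record.may_use("united", "advertising"))  # False
```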
Indeed, millions of people now do set specific privacy preferences within Facebook, opt out of being tracked, and the like. At the same time, they gladly share data with vendors and even track their own data—whether airline mileage or steps walked, check-ins at their favourite venues (especially if they can earn discounts or special offers), or their movie, music, or book purchases.
Now suppose that you could tell people to whom you had sold their data. Most people would not care, but those who did would appreciate the transparency, and some might want a small share of the proceeds. Suppose you started a business that managed data on behalf of its users.
That is not such a crazy idea: the airlines, among other companies, are already doing it to some extent. United, American, and British Airways all know my travel patterns on their airlines, and they help me manage both my past trips (and related rewards) and my future reservations. WellnessFX does the same for my blood biomarkers. A new start-up called Moven plans to track small payments so that you can see in real time how you are sticking to, or deviating from, a budget.
All of this works well in markets for goods and services, where people who want choice can pay for it. Businesses can treat customers as individuals, and give them the amount of special consideration that they are willing to pay for. Companies can also decide not to serve certain customers, focusing on the most profitable segments.
But this approach does not work for things that the government pays for. In the public sector, the one-size-fits-all approach still prevails. In democracies, each citizen gets one vote. So shouldn’t everyone get the same benefits?
Yes, we tax rich people more and give poor people more benefits, and that is contentious enough.
But consider all the qualitative services and conditions for which individuals have different preferences, needs, and outcomes that are now more predictable. If we can predict individual outcomes, what is an individual’s responsibility, and what remains a collective task?
These questions will become especially acute in areas such as education and health care. For example, we treat children differently in school according to their potential—as we understand it. But, if we help some children “to realize their potential”, are we thereby limiting the potential of others?
Likewise, how do we allocate health care resources? What responsibility do individuals have to modify their behaviour in response to their individual vulnerabilities and predispositions? And, most important, who, if anyone, should impose that responsibility?

©2013/PROJECT SYNDICATE
Esther Dyson, CEO of EDventure Holdings, is an active investor in a variety of start-ups around the world. Her interests include information technology, health care, private aviation, and space travel.