In the information technology world, we have seen an explosion of Artificial Intelligence, automation, blockchains and data science tools in the last few years. All of these technologies, while distinct in their approach, actually cross-pollinate each other. The world’s denizens—and their governments—have rightly started to worry about issues such as data privacy, control over fiat currencies, job losses and widening income disparities. I had written in this column earlier that governments started getting significantly more active in the regulation of this space in 2018.

Sure enough, this debate has continued into 2019. These questions are not simply for governments and philosophers; the world needs leadership and self-governance by technology giants in the formation of ethics to help us navigate turbulent times ahead. In my opinion, most attempts at self-governance came after the negative effects of new technologies were felt. The profit motive did not allow for Big Tech to pause before it rushed headlong into grey areas. Now that the world has begun to call them out, they claim to be in ethical introspection mode.

There is another area of technology that is beginning to raise concerns. Technology and bioengineering have begun to transform the life sciences industry. We have already seen the ethical questions raised by a Chinese experiment that edited the DNA of human embryos. Our capacity to re-engineer the biological world is truly mind-boggling in its scope now, and many technological offshoots attempt to fuse man and machine into a cohesive “super-human".

I was recently in Cambridge, Massachusetts, visiting my sister Seeta Pai and brother-in-law James Gray, who both hold doctoral degrees from Harvard in human development. Cambridge has become a hotbed for bio-tech innovation and startups. Gray works at the MIT Media Lab and gave me a quick tour.

Thankfully, the MIT Media Lab, along with other bodies, has established a Community Biotechnology Initiative both to share tools and research and to create an ethical framework for the world’s researchers to follow. It aims to do this by fostering discussions among researchers and hosting various events to facilitate such debates.

One such event is the Global Community Bio Summit. During its recent 2019 conference, it came out with a list of ethical questions that bio-tech researchers should ask themselves on a regular basis. The fact that these questions have been identified before the damage is done is an important achievement that sharply contrasts with the approach of Big Tech.

In the rest of this column, I will paraphrase the Bio Summit’s list of areas where ethical conundrums lie, while attempting to remain faithful to the original text with some of the questions suggested for further reflection. This list is enough to engender thought along several vectors. The summit said that these could be applied in any “community bio" context, but made clear that it was not a central fiat. It is just a list that would allow for discussions in both local and global contexts.

One, respect: How can we prioritize the rights of humans, animals and ecosystems? How can we attune our ways so that our practices do not harm other living beings?

Two, credit: How do we make sure our work serves as a resource for the community and the broader public while still valuing those who do the work?

Three, community: How do we make decisions collectively? How do we identify and engage non-bio stakeholders?

Four, autonomy: How do we decide what forms of self-determination we value? How do we identify relationships of power that impact autonomy?

Five, education: How can we create space to learn and the confidence to teach?

Six, open science: How can we encourage replicability and collaboratively share results? How do we stay open while being mindful of the risks posed by openness?

Seven, transparency: How do we stay open about our failures? How can we acknowledge ethical conflicts? How do we decide what acceptable funding sources are?

Eight, data privacy: How do we respect the sovereignty of data, treat stakeholders as peers, and agree on terms of use through informed consent?

Nine, safety: How do we embrace safe practices in unconventional contexts? How do we protect each other and create resources for communities to experiment safely?

Ten, justice and fairness: How do we engender justice and fairness in our practices? How can we avoid perpetuating systems of winners and losers? How do we account for the varied impact of our work?

Eleven, diversity and inclusion: How do we make sure our organizations respect vulnerabilities and acknowledge privilege? How do we make our spaces valuable and accessible to communities whose interests are historically under-represented in the sciences?

And twelve, accountability: Is there an active commitment to consider these questions? How will we be accountable to these ethics? How do we hold each other accountable and make ourselves accountable to those outside this community?

It is heartening to note that the scientific community, unlike its technology counterpart, is beginning to consider these questions at the outset, long before we find ourselves overrun by droids and Frankensteins.

Siddharth Pai is founder of Siana Capital, a venture fund management company focused on tech
