Where’s the #humanITy?
We have seen an explosion of new technology over the past decades. This unprecedented advancement is impacting our everyday lives and will continue to do so in ways we can’t imagine in the years to come. For this reason, it is vital to consider the potential implications of new technologies.
Last week in Stockholm, a panel consisting of Tove Andersson, Pieter-Jan van Eggermont and Mathias Antonsson spoke at a Global Bar talk, moderated by David Isaksson. The panel discussed the latest trends in tech and human rights, focusing on artificial intelligence (AI), blockchain, and the automation of labour. The talk attracted a diverse crowd of academics, innovation leaders, government actors and civil society.
The panel agreed that tech is not “good” or “bad” in and of itself. Human beings have always pushed the boundaries of our capabilities. Can we make batteries and food last longer, planes fly faster, computers run more efficiently? This expansion of technological capability is vital to accommodate a growing human population, and it is also necessary to improve our quality of life. Pieter-Jan, who works as Humanitarian Advisor for the Swedish section of Doctors Without Borders/MSF, spoke about MSF field innovation using 3D printing to create prosthetic limbs. Aside from 3D printing, MSF uses a wide range of new technologies to expand its capacity to deliver care to people in need.
But better tech is not all we need to live better. Technology is only a tool, and people design and use tools in different ways, to different ends. A person can use binoculars to birdwatch or to spy on their neighbours. People can use information to connect or to control. The distinguishing factor is what people want the tech to achieve.
Some visions for the future of technology are beneficial for humanity and others less so. It is for this reason that we must consider the possible range of uses for each new piece of tech that emerges. For instance, in the wrong hands, 3D printing could produce armaments, not medical equipment.
Mathias, who leads the Innovation Initiative at Civil Rights Defenders, brought up a few examples of digital human rights issues emerging from the East. China has recently launched a new social credit system that awards citizens points depending on their behaviour, in effect digitising part of citizenship. Facial recognition software is also being used to track people’s movements. The purpose? “To help cities run more efficiently.”
On the surface, this seems harmless enough. Yet, let’s imagine this technology when driven by desires that contradict human rights. It could offer authoritarian governments an unprecedented tool for controlling their citizens. If one centralised power is responsible for determining what is “good” behaviour and what is “bad” behaviour, then there will no longer be any room for that essentially human quality of free will. It will be reduced to a series of false choices between previously vetted and approved alternatives.
To add to this, AI carries an authoritarian bias, albeit an unintended one. The competitive advantage of democracies has been the delegation of authority: ideas compete in a market, the best ideas are rewarded, and the result has been stable development and economic growth. Authoritarian regimes, on the other hand, have historically been crippled by a data-processing problem: people in supreme positions of power can only make so many effective, evidence-based decisions. But AI’s tremendous data-processing power makes this handicap a thing of the past. Dictatorships will now be able to automate decision-making on a grand scale, drawing on enormous data streams about their citizenry and making it possible to control more people with less effort and risk than ever before.
Panelist Tove and her company Superblocks are developing advancements in blockchain technology. Originally famous for facilitating cryptocurrency transactions, blockchain is a secure way of storing data, and it could also be used to counter trends of centralised power. Normally, a single copy of data is stored locally; if that local computer is hacked, that copy can be altered and the original data lost. With blockchain, the data is replicated across many networked computers, which makes tampering easy to detect. If one local computer is hacked, comparing its data to the rest of the network makes it easy to verify the original data set and carry on as normal. Today, blockchain is being used to decentralise everything from banking to marriage, and the applications are far-reaching.
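For readers curious about the mechanics, here is a minimal, illustrative sketch in Python. It is not Superblocks’ technology, just a toy hash-linked ledger showing why a tampered copy stands out when the network’s copies are compared.

```python
import hashlib
import json
from copy import deepcopy

def block_hash(contents):
    """Hash a block's data together with the previous block's hash."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a new block whose hash commits to all earlier blocks."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash({"data": data, "prev_hash": prev_hash})
    chain.append(block)

def is_valid(chain):
    """Recompute every hash; any edit to earlier data breaks the links."""
    for i, block in enumerate(chain):
        expected = block_hash({"data": block["data"], "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Three "nodes" hold identical copies of the ledger.
ledger = []
add_block(ledger, "Alice sends Bob 5 credits")
add_block(ledger, "Bob sends Carol 2 credits")
nodes = [deepcopy(ledger) for _ in range(3)]

# An attacker alters the record on one node only.
nodes[0][0]["data"] = "Alice sends Mallory 5 credits"

# The tampered copy fails validation while the others still agree,
# so the network can recover the original data set.
print([is_valid(copy) for copy in nodes])  # [False, True, True]
```

Real blockchains add consensus protocols and cryptographic signatures on top of this basic idea, but the tamper evidence comes from the same hash-linking.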
Recently, the Innovation Initiative was invited to attend the 150 Think Tank by Techfestival in Copenhagen. The purpose was to bring together experts on technology and ask them to outline principles for a better digital future. Together, we produced the Copenhagen Catalog. It’s one of many signs that there is momentum for change within the tech community right now, a push to make it more human, to serve humanity. Today’s digital landscape much resembles the wild west. Companies and governments are able to operate with relatively little oversight on the internet. The Copenhagen Catalog challenges this lawless anarchic narrative.
This growing awareness of the importance of digital responsibility is spreading further than Copenhagen. Tech giants such as Facebook and Twitter are realising that their platforms are struggling to achieve what they were designed to do, and are instead being used for “fake news”, hate speech, and the promotion of filter bubbles. These companies profit from selling their users’ data. Their services are “free” to the public because they are designed to entice users to hand over personal data, which is then syphoned off and sold to advertisers and other interest groups. In short, Google is a data-mining platform for third parties, posing as a research platform for its users. Facebook is a data-mining platform for third parties, posing as a friendship platform for its users. And so on for most of the platforms you engage with today. To people in the tech industry this is old news, but the idea that the purpose of these tech giants is to profit by selling users’ valuable personal data often comes as a surprise to the average user.
Despite this lack of public awareness, tech giants are becoming infamous. While Google is bowing to censorship in China, Facebook is being used by Russian troll factories to influence democratic elections, and Twitter is plagued by increasing levels of hate speech. These bad reputations risk hurting their profit margins. People who don’t trust them won’t hand over their personal data. So these actors have an explicit financial incentive to set up more ethical systems. The Innovation Initiative has a role to play in this process, and we’ll have some big news about this in the near future…
Ethical redesigns with users in mind are not just necessary for the tech giants, but also for governments. It is paramount that democracies drive the conversation around digital regulation. That conversation has already produced progress such as the General Data Protection Regulation within the EU, which helps give individuals control over their personal data. This is a good step, and we hope it’s the start of more to come.
We need to find ways for governance to protect against the misuse of digital personal information by authoritarian governments, now and in the future. Without this governance, there is almost no limit to the depth of censorship and control that a digital dictatorship would be able to exert. If governments and civil society actors do not act now, many people may find themselves living under an authority that knows everything about them and is able to control every part of their lives.
Humanity will not stop creating and developing; it is our heritage and legacy. The very existence of technology does not necessarily pose a threat. But the potential that it could come to harm or command us does. For this reason, we must be vigilant and install preventative governance structures to protect our digital rights wherever possible.
Stay tuned for more on the future of tech and human rights.