Are algorithms making us W.E.I.R.D.?
From curating our internet search results to managing our investments, travel routes and love lives, algorithms have become a ubiquitous part of our society. Nor are they just an online phenomenon: they are having an ever-increasing impact on the real world. Children are being born to couples who were matched by dating-site algorithms, whilst the navigation systems for driverless cars are poised to transform our roads.
Last month, Germany announced ethical guidelines for determining how driverless cars should react in unavoidable collisions. According to these guidelines, a car that cannot avoid a collision will be expected to protect people rather than property or animals.
The problem is that we live in an incredibly diverse world, with different nations having their own morals and culture, so algorithms that are acceptable to one group of people may not be acceptable to another. “We do not really have a consensus on morals,” says Dr Sandra Wachter, a researcher in data ethics at the Oxford Internet Institute and a research fellow at the Alan Turing Institute. “Even regionally in Europe we greatly differ in what is important to us and what values we need to protect. Privacy is one of the major issues in Europe, but if you look how Germany perceives privacy – what they think of privacy and why it is important – that is not necessarily reflected in the UK.”
Ethics are influenced by personal, cultural and religious factors. These unique perspectives stem from the societal experiences of each country. “Even though we talk about the same thing, they are flavoured in a different colour, because we have different experiences,” says Wachter.
The ethical guidelines for driverless cars may be broadly suitable in Germany, but they would not be universally acceptable. For example, in areas of the world where cows are widely considered to be sacred, drivers would naturally avoid hitting a cow at all costs. Likewise, there are some religions, such as Jainism, that place great importance on the lives of animals.
“We often talk about US norms, but really what we are talking about are WEIRD norms that are being imposed through these things,” says researcher Matthew Blakstad, author of Lucky Ghost. WEIRD, in this case, stands for western, educated, industrialised, rich and democratic.
This highlights a key issue: although algorithms can be deployed around the world, they invariably carry the cultural background and perspective of their programmers. Wachter says problems can occur if these technologies are developed by people from a single demographic. If that happens, the results will lack the personal, cultural and religious nuances needed to reflect a global population. “It will be very one-sided, when it needs to be culturally diverse,” she warns.
The cultural perceptions of coders can subconsciously influence the decisions they make during the development of algorithmic applications. Choices about the rule sets and the data sets used to train an algorithm’s machine-learning element naturally affect its resulting behaviour. “The data they collect and the way they test is combined to be inward looking, so it works for them and their friends,” says mathematician Cathy O’Neil, author of Weapons of Math Destruction.
A prime example is the games developer who once inadvertently created a racist video game. Its motion-tracking algorithm was unable to detect dark skin tones, because the camera relied on reflected light to detect motion. That this went unnoticed during development and testing points to an unfortunate bias.
Meanwhile, Snapchat courted controversy last year when it deployed its Coachella festival filter. Designed to create a ‘summer festival vibe’ with a garland of flowers, the filter also featured a skin-whitening function.
It was the first ‘beautification’ filter to automatically lighten skin tones. Unlike other photo-sharing platforms, Snapchat applied predefined filters without the user’s specific consent, meaning that somebody, somewhere, assumed that lightening an individual’s skin tone would make them more appealing.
These unfortunate examples of algorithmic bias may be isolated lapses of foresight, but when such applications are deployed worldwide, they have the potential to inadvertently influence the cultural development of other countries. We could very well be heading towards a homogenised future, where algorithms shape values through a form of cultural colonisation. “They will change the cultures of the countries to suit their product,” says O’Neil.
However, this isn’t necessarily a one-way trickle of values from wealthy westerners across the globe. China, for example, has also become a powerhouse in artificial intelligence. As we start using tools incorporating logic coded in India, China and other fast-developing economies, the West could find itself on the receiving end of cultural colonisation.
As dramatic as it may sound, these scenarios don’t need to lead to mass homogenisation. Indeed, with more cultural perspectives involved in the creation of programming tools, the hope is that a multiplicity of values will offer a greater level of choice when considering what algorithms should power the technologies of the future. “It is like chopsticks and the fork,” says Professor Luciano Floridi of the University of Oxford. “For someone like me, the fork is much easier to handle, and yet it has not replaced the chopsticks. Depending on the specific application and the business, cultural and societal needs, we will find that a variety of tools get adopted.”
By that argument, the coming decades could see a fusion of different societal perspectives fuel the development of sensors, artificial intelligence and technologies such as autonomous cars; a future where we choose which tool to use and which cultural approach is best suited to the situation. All of this hinges on there being a diverse set of developers laying the groundwork. Non-WEIRD code is needed, as well as an awareness that algorithms are not neutral; that they come with a great deal of cultural baggage.