Written by Başak Ozan Özparlak

Photo by Johanna Buguet

The meaning of empathy has multiplied over the centuries. Today the term is used to describe putting oneself in another's shoes, trying to understand someone else before judging them by their feelings or way of life. Whether we are aware of it or not, empathy plays a central role in disciplines ranging from historiography to politics, from law to psychology, and its absence shapes the outcomes of these disciplines as much as its presence. A researcher who descends underground to study drawings on cave walls, proceeding in darkness and covered in mud, will arrive at something priceless only if they interpret what they see not merely in terms of form, but through the empathy they develop by putting themselves in the position of predecessors who, thousands of years ago, endured similar hardship in the very same cave. This is because art cannot be comprehended outside of its social context [1].

Today, tens of thousands of years after the cave paintings, we can produce robots that draw. And our need for a legal order that embodies ethical values is greater than ever if justice is to be secured. We need not fear robots themselves, but the consequences of power being concentrated, more than ever, in a limited number of places. This, in fact, is how an era that began with globalisation reaches its peak with technology. That is why in 2019 many international organisations published ethical principles for the age of big data and artificial intelligence. The shared values of these texts are transparency, respect for the rule of law and fundamental rights, accountability and security, and respect for privacy. The relationship between law and ethics can be compared with that between software and hardware; one completes the other. Just as hardware without software is no different from a simple object, software (ethics) that lacks hardware (binding rules of law) cannot always reach us in the physical world. Even though empathy has not been explicitly recognised as an ethical value alongside transparency or respect for fundamental rights, it is a natural reflex that springs to mind whenever ethics is concerned.

Although we generally compare the age of big data and artificial intelligence with the invention of the steam engine, the "exploration" of the American continent and the rounding of the Cape of Good Hope also invite comparison. Historiography itself contains a lack of empathy, embodied in the very term "exploration" of America: there were civilisations in America before 1493 as well [2]. These voyages were, of course, the result of technological developments that enabled progress in seafaring, and the ignitors of new ones. They helped humanity create new commercial, economic, and political structures. Yet this did not create equal opportunities for all the societies of the world [3].

After Europeans set foot on the continent in question, what begins is exploitation rather than exploration. Such a one-sided understanding of history lacks empathy, and it is both an inadequate and a dangerous approach to educating the children of tomorrow. This is very much in line with Amin Maalouf's observation that interpreting Christianity as a religion of tolerance and democracy, and Islam as one devoted to despotism, would lead the world towards darkness [4].

Empathy is also vital to design. For instance, once it was realised that seatbelt and vehicle-safety tests had historically been conducted with only male drivers in mind, with deadly consequences, the tests were improved to include women and children. Just as a more hospitable city can be created by designing physical spaces around the needs of everyone, including the disabled, the elderly, and children, the effects of an effective justice mechanism within society can be made more accessible if the legal system and its services are designed with empathy in mind.

In a future where artificial intelligence systems gradually become more common, the presence or absence of empathy in the applications we will use in place of our smartphones once 5G and 6G communication technologies are deployed, and in augmented-reality, virtual-reality, and even mixed-reality applications, may define the paths these technologies lead us down. This is why the Ethics Guidelines for Trustworthy AI, prepared by the High-Level Expert Group on Artificial Intelligence set up by the European Commission and issued in April 2019, include a question under the societal impact section about evaluating whether an artificial intelligence system forms a connection and establishes empathy with humans [5].

As stated in the report "Artificial Intelligence and Life in 2030" by Stanford University, virtual-reality-based entertainment systems will also become dialogue-based as automatic speech recognition technology advances, possibly growing more and more anthropomorphic in voice. The report suggests this may result in interactive systems acquiring new qualities such as affection and empathy [6]. Perhaps it is more helpful to work towards equipping robots with empathy than to fear them acquiring human-like qualities. One can also suggest that losing our own capacity for empathy in the face of robots would be riskier for humanity than losing jobs or vocational knowledge and skills. We need to invest in empathy and ethical values as much as in the jobs of the future. That is why, while teaching kindergartners the basics of coding, or even before that, we should teach them to behave ethically.

Is our capacity for empathy, as humanity, at stake? I'm afraid it is. Here's why:

1) Echo chambers

2) Personalised product offers. Let me elaborate on this:

• On social media we prefer to befriend people who share our opinions. Even when content contains no insult at all, we are frustrated by seeing the posts of those who do not share our views. We may not be fond of celery, yet it would hardly be healthy to feed only on pasta. Adding a certain amount of vegetables to the menu only benefits the meal.

• We are drowning in suggestions similar to the things we already buy. We once had the luxury of coming across a film by pure coincidence and putting up with it, even when bored, out of sheer curiosity. Even if we did not watch it to the end, we witnessed something quite different from our own taste. Nowadays an almighty algorithm, said to know us better than we know ourselves (which may well be true), suggests which books, songs, films, TV series, clothes, or any other commodity we should choose.

Our brains are fed unilaterally. We are surrounded by ourselves and those like us. Does this diminish our capacity for empathy? This is undoubtedly a question to be answered from the perspective of philosophy, psychology, sociology, or neurology rather than by a legal expert. Still, I feel that we, as humanity, are afflicted by such a regression.

The title of the 5th Istanbul Design Biennial was announced as Empathy Revisited: Designs for More than One. Even though the content of the biennial exhibition has not yet been announced, one question in the curatorial text is essential at this moment when art and new technologies intersect: "How does empathy form across platforms that mix up virtual and real worlds?" In 2018 the 4th Istanbul Design Biennial delved into education and the dilemmas schools currently face, manifested in a philosophically satisfying book full of thought-provoking pieces. Witnessing and interpreting both biennials is a golden opportunity for anyone reflecting on new ways of communicating in the big-data age.

This piece was originally published in Turkish on Medium.

[1] Lewis-Williams, James David. Mağaradaki Zihin (trans. Tolga Esmer), YKY, 1st ed., İstanbul, 2019, 44.

[2] Tekeli, İlhan. Tarih Yazımı Üzerine Düşünmek, Dost Kitabevi, 1st ed., Ankara, 1998, 86–92.

[3] Acemoğlu, Daron & Robinson, James A. Ulusların Düşüşü, Doğan Kitap, 1st ed., İstanbul, 2013, 25.

[4] Maalouf, Amin. Ölümcül Kimlikler (trans. Aysel Bora), YKY, 1st ed., İstanbul, 2000, 49.

[5] Independent High-Level Expert Group on Artificial Intelligence set up by the European Commission. Ethics Guidelines for Trustworthy AI, 2019, 30–31.

[6] Stanford University. Artificial Intelligence and Life in 2030, 41.
