With some 3.6 billion monthly users as of 2022, social media platforms such as Facebook, Instagram and WhatsApp (all owned by Meta), Twitter, TikTok, YouTube, WeChat and Snapchat continue to grow. In the last 15 years, they have revolutionised how people interact, communicate, and share information and news.
However, social media platforms soon revealed their dark side as conduits for misinformation, disinformation and hate speech. According to the UN, Facebook did little to prevent the spread of hatred against the Rohingya Muslim minority in Myanmar in 2018, which led to what some have called a genocide.
Human rights groups have accused the platform of enabling the spread of violent extremism across Eastern Africa, including Kenya, and of fuelling the Ethiopian conflict, which has already claimed an estimated 600,000 lives, making it one of the bloodiest conflicts since World War II. Whistleblower Frances Haugen testified that Facebook prioritised profits over lives, especially in countries outside the United States, such as Ethiopia.
Meta was sued last year by Daniel Motaung, a South African former employee of Sama, the Kenyan company to which Facebook outsources content moderation for most of Africa. According to him, Facebook and Sama failed to adequately protect workers moderating content that included sexual violence, child abuse, suicide, bestiality and murder. Time magazine reported that these content moderators earn as little as Sh200 per hour, far less than their counterparts in the Global North.
Meta, however, filed a preliminary objection, claiming that the Kenyan courts had no jurisdiction over it since it has no physical presence in the country. The High Court ruled this week that Facebook was a proper party to the lawsuit and would not be struck out. Meta may appeal the decision, but it may now be held responsible for its failures in Kenya, Africa and the Global South.
This case has profound implications for other suits brought against tech giants in developing nations. Notably, the Katiba Institute and two Ethiopian nationals are pursuing a separate constitutional lawsuit against Meta in the Kenyan courts, and Meta is widely expected to challenge jurisdiction there in a similar manner. Facebook's content moderation operations for Africa are headquartered in Kenya, and although the service is outsourced, Facebook staff regularly visit the country to supervise operations.
In that case, Meta and its platform Facebook are alleged to have failed to moderate content adequately during Ethiopia's bloody civil war, inflaming the conflict by amplifying posts inciting hatred. Under international human rights law, states are the primary duty bearers of human rights and must protect against rights abuses committed by third parties. Under the UN Guiding Principles on Business and Human Rights, companies have a responsibility to respect all human rights independent of a state's willingness or ability to fulfil its own obligations, and over and above compliance with national laws and regulations. To fulfil this responsibility, tech companies and social media giants must make a policy commitment to respect human rights.
They need to take ongoing, proactive and reactive steps to ensure that they do not cause or contribute to human rights abuses, such as discrimination and hate speech, that severely harm societies. To do this, these companies must invest in the requisite technology and human resources and pay appropriate wages. In Africa, for example, they must employ enough content moderators to cover the continent's hundreds or thousands of languages and dialects, and give those moderators the support they need, including mental health services, to cope with the trauma the work involves.
First published in The Standard on 10 February 2023. Reproduced here with kind permission from The Standard.
Demas Kiprono is a human rights lawyer and a Campaign Manager at Amnesty International Kenya. He writes in his personal capacity. Email: [email protected]