Canadian actress August Ames, real name Mercedes Grabowski, took her own life in 2017. It was widely believed that the backlash from her post on Twitter had driven the 23-year-old to hang herself.
Her husband Kevin Moore said his wife had committed suicide one day after enduring an avalanche of online abuse and bullying. “If you fire a gun into the air and that bullet randomly hits someone that you never intended to kill, you still killed them,” said Mr Moore.
Social media is an essential and perhaps inevitable reality of our time. It can be used as a force for good or evil. Given its potential for abuse, it is legitimate to ask, should social media be regulated?
In the world of social media, anyone can instantly create a fake account and use it for ulterior motives, including malicious purposes such as sending spam, child pornography, terrorism, hate speech, incitement to violence, offensive communication, cyberbullying, trolling and defamation.
With no straightforward way of establishing the hidden identity of people online, what’s to stop criminally minded individuals from engaging in online criminal activity with the sense of security that anonymity brings? These concerns, as well as privacy, data protection and copyright infringement, call for regulation.
However, owing to online anonymity, the cross-border nature of the Internet and evolving technological innovations, traditional media policy and regulatory regimes are often of limited effect. New measures are therefore required in the convergent digital online environment.
Regulation v. Freedom of Expression
Whenever regulation is mentioned in the communications context, questions are bound to be asked about freedom of expression. Article 19 of the Universal Declaration of Human Rights (UDHR) recognises the right to freedom of expression. Specifically, it provides for freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media.
Article 29 (1) (a) of the Constitution of the Republic of Uganda provides that “Every person shall have the right to freedom of speech and expression, which shall include freedom of the press and other media.”
However, both the United Nations, through the UDHR, and the Constitution of Uganda, recognise that the right to freedom of expression is not absolute.
Under Article 29(2) of the UDHR, the exercise of freedom of expression is subject “to such limitations as are determined by law solely for the purpose of securing due recognition and respect for the rights and freedoms of others and of meeting the just requirements of morality, public order, and the general welfare in a democratic society.”
On its part, the Constitution of Uganda, in Article 43, which sets out limitations on rights and freedoms, including freedom of speech and expression, provides in clause (1): “In the enjoyment of the rights and freedoms prescribed in this chapter, no person shall prejudice the fundamental or other human rights and freedoms of others or public interest.”
There is, therefore, a need for specific guidance on the regulation of online activities and the protection thresholds of online communication. However, any form of social media regulation must, as the Constitution of Uganda demands, balance the individual right to freedom of expression against the rights and freedoms of others and the public interest.
In the landmark freedom-of-expression case of Charles Onyango Obbo and Another v Attorney General (2004), Justice Joseph Berko held, among other things:
“I do agree that Article 29(1) of the Constitution guarantees free speech and expression and also secures press freedom. These are fundamental rights. It can be said that tolerating offensive conduct and speech is one of the prices to be paid for a reasonably free and open society. Therefore in my view, the functions of the law, and particularly criminal law should (be to) exclude from the range of individual choice those acts that are incompatible with the maintenance of public peace and safety and rights of individuals. Freedom of speech and expression cannot be invoked to protect a person ‘who falsely shouts fire, fire, in a theatre and causing panic’….. A citizen is entitled to express himself freely except where the expression would prejudice the fundamental or other human rights and freedoms of others or the public interest.”
Communication services and the space they occupy are regulated under the Uganda Communications Act of 2013. Section 5 of this law mandates the Uganda Communications Commission (UCC) to monitor, inspect, license, supervise, control and regulate all communications services. The Commission is also mandated to set standards and enforce compliance relating to content.
According to Black’s Law Dictionary, social media refers to any cell phone or Internet-based tools and applications that are used to share and distribute information. These include applications such as Facebook, Twitter, WhatsApp and YouTube. Therefore, as long as these platforms are used for communication, rules and guidelines apply to them.
Section 2 of the Uganda Communications Act defines ‘communications services’ to mean services consisting of the dissemination or interchange of audio, visual or data content using postal, radio or telecommunications media, data communication, and includes broadcasting.
On the other hand, ‘content’ is defined to include “any sound, text, still picture, moving picture or other audiovisual representation, tactile representation or any combination of the preceding which is capable of being created, manipulated, stored, retrieved or communicated electronically.”
‘Data’ is defined to mean the electronic representation of information in any form.
Hence, under the law, the provision of any services that involve communication to the public, whether by way of audio, video, sound, still or moving pictures or a combination thereof, is a communication service that is subject to the regulatory control of the Commission.
International Approaches to Social Media Regulation
Just as in Uganda, many countries are grappling with social media regulation. Under section 127 of the Communications Act 2003 of the United Kingdom, which governs the internet, email, mobile phone calls and text messaging, it is an offence to send messages that are “grossly offensive or of an indecent, obscene or menacing character.” The offence occurs whether those targeted receive the message or not.
In December 2013, the Crown Prosecution Service of the United Kingdom issued ‘Guidelines on prosecuting cases involving communications sent via Social Media’, designed to ensure consistency in the approach to prosecuting offences committed by sending communications over social media.
Where social media is used as a medium to facilitate another substantive offence, the Guidelines proffer that prosecution should proceed under the substantive offence in question.
The Guidelines categorise the offences into four broad areas:
- Credible threats (to a person’s life or safety or property)
- Communications targeting specific individuals (including persistent harassment and ongoing abuse)
- Breach of Court Orders (e.g. identifying people protected by law)
- Communications which are grossly offensive, indecent, obscene or false (Section 127, Communications Act 2003)
Notably, the threshold for prosecution in the fourth category is significantly higher than for the first three. This is an attempt to strike a balance between the right to freedom of expression and the abuse of that right.
In addition, the Terrorism Act 2006 provides for the removal of terrorist material hosted online in the UK if it glorifies or praises terrorism, could be useful in conducting terrorism, or urges people to commit or support terrorism.
Further, in line with a voluntary code of practice, leading Internet Service Providers employ filtering mechanisms to block criminal content, including sites that promote online child sexual abuse, terrorism and copyright infringement.
The French government enacted a decree in 2015 that allows the blocking, without a court order, of websites identified as promoting terrorism or publishing child pornography. These rules oblige Internet Service Providers (ISPs) to take down offending websites within 24 hours of receiving a government order.
In Australia, under the Telecommunications Act 1997, the government requires ISPs to block specified websites, particularly those identified in child abuse claims. There is also a push to establish a voluntary or self-regulatory regime in which ISPs would employ filtering mechanisms to block illegal content.
In India, the Information Technology (Intermediary Guidelines) Rules 2011, issued under Section 79 of the Information Technology Act 2000, oblige internet intermediaries to observe due diligence in the execution of their duties and to inform end-users of computer resources about responsible use of the internet, that is, not to host, display, upload, modify, publish, transmit, update or share any information which is harmful, objectionable, contrary to law or capable of affecting minors.
Social Media Regulation in Uganda
Under Section 5(1) of the Uganda Communications Act 2013, the Uganda Communications Commission (UCC) is tasked to monitor, inspect, license, supervise, control and regulate communications services, and to set standards, monitor and enforce compliance relating to content.
Section 31 of the Uganda Communications Act 2013, together with the Fourth Schedule to the Act on minimum standards for broadcast content, requires that:
(a) all content:
- is not contrary to public morality;
- does not promote a culture of violence or ethnic prejudice among the public, especially children and the youth;
- in the case of a news broadcast, is free from distortion of facts;
- is not likely to create public insecurity or violence;
- complies with the existing law;
(b) programmes that are broadcast are balanced to ensure harmony in such programmes;
(c) adult-oriented programmes are appropriately scheduled;
(d) where a broadcast programme concerns a contender for public office, each contender is given equal opportunity on such a programme;
(e) content relating to national security is verified.
However, given the limitation of traditional media policy and regulation with regard to social media, our focus is more on the content that is transmitted over these platforms than the actual platforms.
Where the broadcaster or video operator plays the role of gatekeeper, enforcement is possible as they can easily be held accountable for the content carried over their platform. The closest analogy to a gatekeeper for Internet/social media is the local Internet Service Provider (ISP).
Under Section 11 of the Regulation of Interception of Communications Act (RICA) 2010, telecommunications service providers, a definition that includes Internet Service Providers (ISPs), are required, under the guidance of and in a manner prescribed by the Minister responsible for Information and Communications Technology, to provide services capable of being intercepted.
A further regulatory measure could be to require ISPs to filter, block or take down websites carrying specified content over their networks, for example child pornography, terrorism, hate speech, incitement of violence, content in breach of any law, or content falling short of the minimum broadcasting standards.