MCI - Ministry of Communications and Information of the Republic of Singapore

25 July 2024 | Press Release

MDDI Survey: Two-Thirds of Respondents Encountered Harmful Online Content

SURVEY BY MDDI FINDS TWO-THIRDS OF RESPONDENTS ENCOUNTERED HARMFUL ONLINE CONTENT ON DESIGNATED SOCIAL MEDIA SERVICES; 6 IN 10 TOOK NO ACTION AGAINST SUCH CONTENT

More can be done by designated social media services to reduce harmful content and encourage users to act against such content

  1. A recent survey by the Ministry of Digital Development and Information (MDDI), formerly the Ministry of Communications and Information, found that two-thirds (66%) of respondents encountered harmful content on social media services designated by the Infocomm Media Development Authority (IMDA) under the Code of Practice for Online Safety [1]. Six in ten (61%) of those who encountered such content ignored it. About one-third (35%) blocked the offending account or user, and only about one-quarter (27%) reported the content to the platform.

  2. The annual Online Safety Poll was conducted in April 2024 with 2,098 Singapore respondents aged 15 and above. It aimed to understand Singapore users' experiences with harmful online content and the actions they took to address such content [2].

Prevalence of Harmful Online Content on Social Media Services

  1. Overall, about three-quarters (74%) of respondents encountered harmful online content in 2024, up from 65% in 2023. By platform, two-thirds (66%) of respondents encountered harmful content on designated social media services, up from 57% in 2023. In comparison, 28% of respondents encountered harmful content on other platforms, similar to last year's level [3].

  2. Among respondents who encountered harmful content on designated social media services, close to 60% cited encounters on Facebook, and 45% cited encounters on Instagram. While the prevalence of harmful content on these platforms may be partly explained by their larger user bases compared to other platforms, it also serves as a reminder of the greater responsibility these platforms bear [4].

Table 1: Encounters with harmful online content across social media services

S/No. | Platform     | Among respondents who encountered harmful content on social media services (%)
------|--------------|------
1     | Facebook     | 57
2     | Instagram    | 45
3     | TikTok       | 40
4     | YouTube      | 37
5     | X            | 16
6     | HardwareZone | 6

Types of Harmful Online Content on Social Media Services

  1. Cyberbullying (45%) and sexual content (45%) remained the most common types of harmful content encountered on designated social media services. However, there were notable increases in encounters with content inciting racial or religious tensions (up 13 percentage points) and violent content (up 19 percentage points) compared to last year.

Table 2: Most Common Types of Harmful Online Content on Social Media Services

S/No. | Category                            | % encountered
------|-------------------------------------|------
1     | Cyberbullying                       | 45
2     | Sexual content                      | 45
3     | Inciting racial/religious tensions  | 42
4     | Violent content                     | 39

Reporting of Harmful Online Content to Social Media Services

  1. Among those who reported harmful content to the platforms, about 8 in 10 (78% to 86%) experienced issues with the reporting process [5]. The top issues cited were that the platform:

    a) Did not take down the harmful content or disable the account responsible;

    b) Did not provide an update on the outcome; and

    c) Allowed the removed content to be reposted.

  2. For respondents who did not report harmful content to the platforms, the most commonly cited reasons were that they:

    a) Did not see the need to take action (28% - 51% across the designated social media services);

    b) Were unconcerned about the issue (29% - 45% across the designated social media services); and

    c) Believed that making a report would not make a difference (26% - 37% across the designated social media services).

Whole-of-society Efforts to Tackle Online Harms

  1. Given the complex, dynamic and multi-faceted nature of online harms, the government, industry, and people must work together to build a safer online environment.

  2. As part of this holistic approach, the Government has taken several legislative steps to protect people from online harms. For example:

    a) In February 2023, amendments to the Broadcasting Act took effect, enabling the Government to swiftly disable access to egregious content on the designated social media services.

    b) In July 2023, the Code of Practice for Online Safety came into effect. Among other things, it requires designated social media services to have in place systems and processes to minimise children's exposure to inappropriate content and provide tools for children and their parents to manage their safety.

    c) Earlier this month, Minister for Digital Development and Information Mrs Josephine Teo announced that a new Code of Practice for App Distribution Services (commonly referred to as app stores) will be introduced, requiring designated app stores to implement age assurance measures. This is to minimise Singapore users' exposure to harmful content and protect children from inappropriate content. More details will be shared in due course.

  3. Beyond the Government's legislative moves, the survey findings showed that there is room for all stakeholders, especially designated social media services, to do more to reduce harmful online content and to make the reporting process easier and more effective.

  4. As required under the Code of Practice for Online Safety, designated social media services are due to submit their first online safety compliance reports by end-July 2024. These reports will provide greater transparency, helping users understand how effectively each platform addresses online harms. IMDA will evaluate the platforms' compliance and assess whether any requirements need to be tightened.

  5. Users also need to do their part by proactively acting against harmful online content, such as reporting it to the platform. In support of IMDA's Digital for Life movement, workshops, webinars, and family activities are organised to equip users with the knowledge and tools to keep themselves and their children safe online. These resources and the schedule of activities are available at https://www.digitalforlife.gov.sg.

  6. Together, we can provide a safe online environment for all.

[1] Six social media services with significant reach or impact have been designated under the Code of Practice for Online Safety, which came into effect on 18 July 2023. These platforms are Facebook, HardwareZone, Instagram, TikTok, X (formerly known as Twitter) and YouTube. More details can be found on IMDA's website.

[2] Where the sample was not representative of the resident population by gender, age, or race, it was weighted accordingly to ensure representativeness.

[3] "Other platforms" refer to messaging websites/apps, search engines, email, news websites/apps, gaming platforms/apps and app stores.

[4] The top three most used social media services in Singapore are Facebook, Instagram and TikTok, according to the "Digital 2024 Singapore" report by We Are Social and Meltwater, retrieved from https://wearesocial.com/sg/blog/2024/02/digital-2024-top-digital-and-social-media-trends-in-singapore/ on 12 July 2024.

[5] 78% to 86% for Facebook, Instagram, TikTok, X (formerly known as Twitter) and YouTube. Insufficient sample size for HardwareZone.