What are social networks doing for mental health?

Thais Monteiro
July 19, 2022 – 6:00 am

In June, TikTok launched a time-management tool on the platform, one of the ByteDance app's efforts to implement features aimed at user well-being. Like TikTok, other social platforms organize these initiatives around four main pillars: restricting harmful content, promoting quality content, giving users control tools, and providing direct contact with prevention centers.

(Credit: SurfsUp/Shutterstock)

TikTok users can now set daily screen-time limits in the app and schedule prompts to take a break after a certain uninterrupted period on the platform. Screen-time control already existed on TikTok, but it stopped at a daily cap. Now the user also has access to a dashboard with information about access and time spent in the app, and receives weekly notifications to revisit that dashboard. The youngest users, ages 13 to 17, receive digital well-being warnings when they spend more than 100 minutes a day on the app.


Along with the news, the platform published a guide in its Safety Center on how to discuss digital well-being with family and friends, encouraging its community to reflect on time spent online, whether on TikTok or any other platform. The ByteDance app also says it partners with organizations specializing in mental health to maintain an ongoing dialogue about platform improvements. Beyond the newly launched tools, TikTok has since 2020 run campaigns recommending mental health help channels and disseminating care recommendations and content from centers and organizations present in the app, such as the Centro de Valorização da Vida (CVV), Instituto Vita Alere, Safernet, and Unicef.

Control and free will

A social network that has invested more in user-control features in recent years is Twitter. Known as a stage for heated discussions, especially around socio-political issues, the platform moderates content based on its community guidelines, but it also lets users customize Twitter to some extent: which subjects to follow, whom to interact with, and so on. The most recent feature, released in May, is Twitter's own version of what Instagram had previously released as "Close Friends", dubbed the "Twitter Circle" on the bluebird network. It allows users with public accounts to post comments, photos, or videos that can be viewed and interacted with only by a select group of up to 150 people chosen by the user. Still on the subject of conversations, users can also choose who may reply to a given post (anyone, only people they follow, or only people mentioned).

Recently, users have also gained the ability to leave conversations they are involved in by preventing others from mentioning them, discouraging abusive mentions, and they receive a warning when Twitter detects harmful or offensive language as they write a post, so they can reconsider before publishing. Other features include muting words, phrases, usernames, emojis, or hashtags so they do not appear among content; removing specific accounts from the feed without blocking them; blocking accounts outright; and choosing who can follow you.


Bearing in mind that user controls are not always enough to keep people from seeking unadvisable solutions to their personal struggles, another strategy Twitter has adopted is to create direct channels for communication and information provided by organizations recognized for dealing with mental health issues. In 2020, the platform and the International Association for Suicide Prevention announced the expansion of #SuicidePrevention support resources with more than 20 nonprofit partners globally. When a user searches for terms associated with the topic, the first result is a notification encouraging them to seek support, with contact information for CVV. On this front, the platform prohibits the publication of content that promotes or encourages suicide or self-harm. Users are encouraged to identify cases of self-harm and suicide risk and report them through the platform's Help Center. #ExisteAjuda ("There Is Help") is a project with local authorities and nonprofit organizations that provides information and support when users search for terms related to mental health.

Wide front

Meta says it has been working on mental health across its platforms for ten years, giving users greater control and developing education and awareness programs with experts. On the user-control front, the company developed Snooze Mode, so users are not interrupted by notifications at times they set; options to unfollow or hide people, pages, or groups on Facebook; and, on Instagram, the ability to hide direct messages or comments containing offensive or harmful words, or to restrict comments on posts. Facebook and Instagram also have dashboards showing time spent in the apps, and in 2021 Instagram launched "Take a Break", which suggests a pause when users exceed a time limit they set for themselves.

Meta also supports content created specifically on these topics, such as A Internet Que a Gente Faz, a project launched in February by Contente, the Instituto Vita Alere for suicide prevention and postvention, and Safernet, which brings together mental health professionals, specialists, parents, educators, and influencers to discuss how to build a healthier and more diverse online space across different content formats.


The company also has an Emotional Health Center to connect people to organizations and support groups, in addition to disseminating content from Instituto Vita Alere and Centro Mente Aberta Mindfulness Brasil on dealing with stress, anxiety, grief, loss, emotional crises, Covid-19, and tips on healthy habits. The Facebook Safety Center also includes a suicide-prevention hub, with discussion materials developed by SaferNet Brasil, Instituto Vita Alere, and other global partners.

Quality information

YouTube's approach to the topic involves content control, to ensure that users do not come across harmful publications and misinformation, but also the promotion of videos that can build identification between creators and their audience, demystify the taboo of talking about mental health, and provide a welcoming space.

In content moderation, the platform removes harmful content and comments, reduces the reach of borderline content that skirts its policies, recommends trusted content, and rewards creators who meet established quality standards. Prohibited content includes material that threatens or harasses people with insults based on individual characteristics, aims to shock users, promotes suicide or self-harm, or poses a risk to viewers.

At the same time, YouTube supports creators who discuss their experiences with depression, self-harm, or other mental health issues. One effort in this direction was a September 2020 edition of the Conversas Que Importam project, in which Rita Von Hunty, Rennan da Penha, Cintia Aleixo, and Lucca Najar discussed empathy, self-care, and the importance of mental health support networks.


In 2022, in partnership with Instituto Vita Alere, YouTube created a playlist to bring more quality information to the topic. The platform also partners with institutions to encourage the production of quality content and surfaces videos from reliable sources, such as Dr. Drauzio Varella. In searches on related topics, YouTube displays an information box on symptoms and treatment created by Hospital Albert Einstein, and shows the number of the Centro de Valorização da Vida (CVV) in searches for terms such as "suicide" and "how to commit suicide".