Introduction
The social media landscape is now ubiquitous in people’s everyday lives. Culture, politics, economics and sociological and public health discourses occur in this space [1]. In 2021, more than four billion people worldwide used social media, spending an average of 144 min each day on platforms such as TikTok, Twitter, Instagram, Facebook and YouTube [2]. These platforms provide users with opportunities to interact with a broad range of global content, exposing them to social change and marketing decisions, including harmful products such as e-cigarettes, an issue of global concern [3]. Globally, 82 million people were estimated to use e-cigarettes in 2021 [4], with the global e-cigarette market in 2023 estimated to be worth USD 24.6 billion and predicted to grow by 3.4% over the next five years [5].
E-cigarette products are known to be harmful to health [6]. Australia, where this research was centered, has historically taken a precautionary public health approach to e-cigarettes: regulations have made it illegal to source liquid nicotine without a prescription from a medical doctor [7]. Yet recent figures show that more than one-quarter (26.1%) of Australians aged 18–24 have tried e-cigarettes, with ‘ever-use’ (daily, weekly, monthly and less than monthly use) especially high among those who currently smoke (63.9%) [8]. Almost three-quarters (71.9%) of young people reported using e-cigarettes “out of curiosity,” and one in five (21.7%) used them because they believed that “vaping is less harmful than regular cigarette smoking” [8]. In response to increased e-cigarette uptake, Australian legislation introduced in January 2024 banned the importation of single-use vapes, with refillable vapes banned from March 2024 [9].
While traditional forms of e-cigarette advertising and promotion (print, radio and television) are regulated in Australia, tobacco and other independent vaping companies have increasingly turned their attention to social media platforms [10]. E-cigarette products are promoted and advertised on social media [11] through user-generated content, advertisements and social media influencers [12-15]. Social media use is particularly popular among young people [11]. There is substantial concern about young people’s exposure to e-cigarette advertising and user-generated content on social media, which is associated with lower perceptions of e-cigarette harm and more positive attitudes towards e-cigarettes, leading to the normalization of e-cigarettes and increased use [16, 17].
The World Health Organization’s (WHO) Framework Convention on Tobacco Control (FCTC) [18] was developed in response to the globalisation of the tobacco epidemic. The FCTC is a legally binding treaty with 183 Parties worldwide that aims to reduce tobacco use and exposure to tobacco smoke. According to Article 13 of the FCTC, “a comprehensive ban on all tobacco advertising, promotion and sponsorship applies to all forms of commercial communication, recommendation or action and all forms of contribution to any event, activity or individual with the aim, effect, or likely effect of promoting a tobacco product or tobacco use either directly or indirectly.” This ban covers traditional media (print, radio and television) as well as social media [18]. However, social media companies are not bound by the FCTC, and the United States (US), home to many social media companies, is a non-party to the FCTC [19].
Currently, social media platforms self-regulate, guided by their own content policies, which specify prohibited content, including advertising and promotion of tobacco and e-cigarette products. However, these platforms appear to disregard their own e-cigarette content policies by permitting non-compliant content to be posted, thereby exposing users to content from which they should be protected [20, 21]. For example, there is evidence that social media account holders are not prevented from using various means to positively promote products on platforms (e.g., competitions encouraging users to share images of vape products, featuring vaping experiences, cross-promotion on alternate social networking platforms, and links to blogs to increase positive, searchable e-cigarette content) [22].
The self-regulation approach of social media platforms requires review, as it has proven insufficient to control content promoting e-cigarette products [14, 20]. Greater protection of social media users could be achieved by moving from self-regulation to legally binding public regulation (enforcement by an independent public regulator), ensuring greater accountability of social media platforms, content moderation, transparency, compliance and enforcement of content policy, and sanctions for non-compliance [1]. Our research aimed to understand industry professionals’ perceptions of social media harms and potential management strategies, using vaping as a case study. Although the research focused on Instagram and TikTok, the findings could be extended to other social media platforms and other harmful products.
Discussion
Social media has changed the way people communicate, interact and access information, goods and services, providing a forum for exposure to a range of imagery and opportunities that often do not occur in the ‘real world’ [29]. E-cigarettes have been aggressively promoted via social media, specifically targeting adolescents and young adults [30, 31], supporting the normalization of e-cigarettes and increased use [16, 17]. How to deal with this new borderless digital media environment is a public health challenge.
By speaking with people who identified as working in public health, digital media, law, governance, tobacco control and advocacy, and using e-cigarette content on social media as a case study, we identified the complexity of the social media environment and potential opportunities for controlling ready exposure to harmful content. These opportunities fall under three themes: strengthening institutions, defanging industry and raising awareness.
In considering how to deal with social media content, we need to acknowledge the structural power of tobacco, vaping and media companies and highlight their global capacity to influence individual and community behaviour, as well as policy and public health outcomes [32]. Responding to this dynamic and financially lucrative environment presents a range of challenges, particularly given the power of social media companies, the limited experience of regulators, and the relative sluggishness of regulatory action in catching up with technology [33].
Social media platforms provide a powerful, inexpensive and pervasive marketing stage for products, generating revenue primarily by collecting user data and capturing users’ attention, which is then monetised through advertising services [29, 34]. There are currently 4.9 billion social media users worldwide [2], spending an average of 144 min each day online [35]. These exposures and interactions translate into significant dollars for digital platforms, with TikTok generating $350 million in revenue in 2022, and Facebook, Instagram, Twitter and Snapchat together generating $205 million [35]. The global e-cigarette market is expected to grow to USD 28.17 billion by 2023, with online distribution channels expected to register the fastest growth [36].
Considering the global operations of social media companies, our study participants recognized that coordination across borders is critical to the management of the platforms and their content [29]. Some participants highlighted the relevance and leadership opportunity of the WHO FCTC, specifically Article 13 [18], while others were less sure about this approach. FCTC Parties have recognized the challenges of monitoring and enforcing bans on cross-border advertising and have called for processes that more effectively facilitate global cooperation to ban cross-border advertising and sponsorship [37]. However, other participants, particularly those from the social media sector, suggested that content could be controlled at a country level.
Participants in our study provided examples of regulations that have already been implemented to manage social media companies and their content at a country level. These included the EU’s General Data Protection Regulation, considered one of the strictest privacy and security laws globally [38], and the Australian News Media and Digital Platforms Mandatory Bargaining Code, which enables Australian news businesses to bargain with digital platforms over payment for news. Other examples include the UK Online Safety Act [39], which holds social media platforms responsible for the content they host [40], and Australia’s recently introduced Public Health (Tobacco and Other Products) Act 2023 [9], which aims to address the proliferation of e-cigarette advertising and promotional activities on social media. How this Australian legislation, proposed to commence in April 2024, is enacted, and how effective it proves, remain to be seen. It will be crucial that the legislation is regularly reviewed to limit the development of loopholes and to ensure it remains effective in the dynamic online environment [37].
These examples of current and proposed regulations, introduced at a country level by governments to manage social media companies and the content published on their platforms, demonstrate that these companies are governable. However, any government-led regulation needs to be accompanied by oversight organisations with the resources to monitor, enforce and appropriately penalise these companies for non-compliance [41]. Identifying these agencies is another step in the process of governance and will require continuing government leadership to ensure that society and individuals are protected from harm in the online environment [29].
More specifically, study participants stated that self-regulation had largely failed, as the growing digital marketplace provided the ideal environment for those with vested interests to promote and distribute e-cigarettes and other harmful products globally via organic and commercial content [42]. However, no matter what legal regulations are imposed on platforms, their internal commitment to self-regulation will affect compliance [43]. These companies still need clear policies and community guidelines that set the rules for conduct, as at present these are often vague and unclear [43, 44].
Currently, social media company policies are enforced via a mix of moderation processes: outsourced workers reviewing content, machine learning tools that detect and remove content, and internal policy teams that set standards and oversee the processes [45]. These moderation processes can struggle to interpret language, context and community standards, making it difficult to distinguish between problematic and permissible posts [46]. Nonetheless, ongoing monitoring and evaluation of moderation processes are important to assess the viability of any of these actions.
Drawing on our findings, we call for open community dialogue about social media companies’ operations to increase awareness of their processes and impact. This dialogue needs to include regulators so that informed debate can lead to appropriate cultural change in expectations of company behaviour and, in turn, the content companies host. Open dialogue will enable increased awareness of e-cigarettes and, potentially, other harmful products such as gambling, alcohol and ultra-processed foods. Alongside this, community online media literacy education and resources could be introduced to increase awareness and knowledge of what content is permissible and the implications of interacting in the online environment [29]. Equipping people with the skills to critically evaluate online information will reposition the media user as an active participant [47]. Providing an independent, easily accessible community complaints system will further enhance the role of individuals and the community in managing social media content.
Limitations
Our study’s sample size of 13 may be considered a limitation; however, the interviews provide a range of insights into the challenges and opportunities for managing social media content. In addition, social media is a dynamic environment, and recommended responses for better managing its content may therefore change over time.
Conclusion
Through qualitative insights, participants working in public health, digital media, law, governance, tobacco control and advocacy identified a range of levers that could be enacted to decrease exposure to e-cigarettes and, theoretically, to other harmful content on social media.
The management of social media content was seen as a global issue requiring a global response, with the narrative highlighting the importance of cross-border cooperation. At a country level, however, government oversight and action are the priority. This should comprise the development of national-level regulatory frameworks with government leadership and appropriate legislation and, in the Australian context, the identification of organisation/s with suitable regulatory power and resources to monitor, enforce and penalise non-compliant social media companies. This activity should be further facilitated by an effective independent complaints panel and an internal commitment from social media companies to protect their users from exposure to harmful content.
In parallel, participants identified the need to raise community awareness of social media platform operations. In particular, strategies are needed to increase digital literacy regarding harmful social media content, in conjunction with framing messages that increase pressure on social media companies to improve the management of unwanted and harmful content. Social media companies need to take responsibility for the content published on their platforms: these platforms should be safe environments that do not expose users to harmful products that can increase adverse health outcomes.