Should social media be regulated like tobacco?
Debate between a user and Bo Burnham on DebateAI.org
Welcome to the debate arena! Today's topic: "Should social media be regulated like tobacco?". Your opponent's style: Bo Burnham
No, I believe social media should not be regulated like tobacco. Firstly, if we look at the loopholes in tobacco legislation, production and sales have statistically plummeted, but this decline has also been a major driver of illicit trade. Since that door stands open to many, if social media were subjected to sudden restrictions while the mass population is still addicted to and dependent on it, people would turn to many illicit and unfair means of access. And looking at the current state of social media, it is already a channel for online abuse and bullying, which drives a rise in problems, especially for adolescents. So while this creates a case for regulating social media, it would also open an opportunity for it to be conducted through illicit means, bringing higher dangers and higher vulnerabilities.
You're flipping the problem on its head. Tobacco regulation *works* because nicotine is a consumable—you can't secretly smoke less. Social media isn't the same beast. The real issue isn't regulation creating black markets, it's that you're treating addiction like it's inevitable. We don't say "don't regulate alcohol because people will drink moonshine." We regulate because harm matters more than convenience. Your argument actually proves the case: social media is already causing documented psychological damage to kids. Unregulated, that stays. Illicit forums would be *smaller* and harder to access than TikTok is right now, reaching billions. You're choosing visible, widespread harm over potential underground problems that affect fewer people. That's not safety, that's surrender.
Okay, let's take drugs as an example. They have been banned and societally rejected worldwide; despite that, 18- to 25-year-olds are the group most affected by drugs. Despite being illegal, they are handed out to children in schools like free candy, still reaching a mass of people. This is because of one thing that drugs and social media have in common: addiction. Addiction affects the prefrontal cortex, which is responsible for decision-making, impulse control, and planning. Note the phrase impulse control—it means there is a built-in risk of getting addicted. Social media exploits this just as drugs do, revealing a likely outcome in which children (who are the most addicted to social media) resort to any method that gives them access to it.
You just proved regulation works. Drug bans reduce *overall* use despite black markets existing. Teen drug use has actually dropped significantly since the 90s despite availability. Social media regulation wouldn't mean banning it entirely—it means age limits, algorithm controls, screen time caps. That's not creating moonshine scenarios, that's protecting developing brains from intentional addiction mechanics. Your own example backfires: we didn't stop regulating drugs because some kids still use them. We regulate *more*. The prefrontal cortex argument cuts both ways—immature decision-making is exactly why kids need protection from algorithmic manipulation designed to hijack their dopamine. Letting it run wild because some will find workarounds isn't policy, it's negligence.
Social media is also a rising contributor to GDP; in fact, it accounts for 18 percent of America's GDP, and there is a positive relationship between social media growth and GDP growth, suggesting it is a positive factor for economic growth. Social media is also known for personality development, rapid access to information, and connecting millions of people worldwide—for advertisements, orders, helping each other, or any basic tool. You mentioned age limits, algorithm controls, and screen time caps. Algorithms are trained on past data and are often subject to personal biases and human prejudices; this can lead to seeing only one side of everything, producing polarisation of views, disregard for others' perspectives, and even a loss of empathy, which eventually hampers human connection.
You're conflating economic value with safety. Tobacco contributes billions to GDP too—doesn't make it harmless. Regulation doesn't kill the industry, it constrains harm. Your algorithm critique actually *supports* regulation, not opposes it. If algorithms embed bias and polarize views, that's the argument for oversight, not against it. You can't have it both ways: "algorithms are biased so we can't regulate them" is just saying "the problem is too big to fix." Regulation means *forcing* transparency, diversity in training data, limiting engagement-maximizing feeds. The real polarization? Letting Meta optimize for outrage because it drives revenue. Connection without conscience isn't connection—it's exploitation dressed as community.