Another form of technology, another societal scare
Throughout history, each new form of technology brought into society has sparked fear rooted in a lack of understanding. Whether it was the first radio in the United States in 1902 or the first television system in 1927, sociologists say fear has surrounded each new technology.
Kyle Green, associate professor of sociology at SUNY Brockport, said social media is difficult to control and has caused worry. “I think those [past scares] if we look back historically, are kind of overblown, but I think we can see the negative impact of this almost immediately,” Green said.
Devices in the hands of younger users have been linked to more negative effects. “There’s a lot of research that shows very clearly that after the introduction of smartphones and people having access [to smartphones] at a younger age, we have seen a rise in depression [and] we’ve seen a rise in body dysmorphia and eating disorders. We’ve seen a decrease in people socializing outside of social media [and] a decrease in number of friends that people have. So we’re seeing some pretty dramatic impacts of this,” Green said.
When looking through social media platforms like Facebook, Instagram, or TikTok, there’s a variety of photos and videos to view, both from accounts you follow and from public posts anyone can see. On Instagram, for example, the ‘Explore’ page shows content the viewer may be interested in based on what they have already viewed. This is what is commonly known as the algorithm.
Jennifer Billinson, assistant professor of media and communication at Nazareth University, said the algorithm feeds users content based on what they interact with and watch, meaning it continues to serve content with similar aspects to what has already been viewed.
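To make the feedback loop Billinson describes concrete, here is a deliberately simplified sketch, in Python, of how engagement-based ranking keeps serving similar content. The post data, topic tags, and `recommend` function are invented for illustration; real platform recommendation systems are proprietary and far more sophisticated.

```python
# Hypothetical, minimal sketch of engagement-based recommendation.
# Real platforms infer interests from far richer signals; everything here is invented.

from collections import Counter

# Each post carries topic tags; a real system would infer these from the content itself.
POSTS = [
    {"id": 1, "topics": ["fitness", "humor"]},
    {"id": 2, "topics": ["fitness", "nutrition"]},
    {"id": 3, "topics": ["news", "politics"]},
    {"id": 4, "topics": ["humor", "pets"]},
]

def recommend(interaction_history, posts, n=2):
    """Rank unseen posts by how much they overlap with topics the user already engaged with."""
    seen_ids = {p["id"] for p in interaction_history}
    interest = Counter(t for p in interaction_history for t in p["topics"])

    def score(post):
        return sum(interest[t] for t in post["topics"])

    candidates = [p for p in posts if p["id"] not in seen_ids]
    return sorted(candidates, key=score, reverse=True)[:n]

# A user who watched a fitness post is shown more fitness-adjacent posts next.
history = [POSTS[0]]
print(recommend(history, POSTS))  # posts 2 and 4 outrank post 3
```

Because each new interaction feeds back into the interest counts, the more a user engages with a type of content, the more of it they are shown, which is how a feed can narrow over time.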
Billinson said, “In terms of men or young boys using apps and social media platforms, we have research [and] studies have been done about how these pipelines send them down specifically like right-wing, neo-Nazi kind of platform spaces, especially YouTube is [a platform] that has struggled with that a lot. So it’s definitely an issue.”
Green said that seeing harmful content online is dangerous for a viewer regardless of age and can cause a negative view of the world around them.
“If this is what’s in your feed, this is how you see the world,” Green said.
Bek Orr, associate professor of women and gender studies and sociology, said that when people are exposed to negative, violent, and disturbing images over time, those images stop affecting them the way they did at first; people become numb to violent content. Green said that disturbing content mixed in among positive posts can flatten the effect of the disturbing image. If a user sees a disturbing post a couple of scrolls after a positive one, Green said, “What [that is] telling you is each of these are an equal reflection of the world, and that leads you to a pretty dark place.”
How disturbing content affects mental health
Going to college may be the first time a young adult is away from home, and it can mean moving away from the people they once could turn to, like parents or friends from high school. Green said that being isolated and insecure can make these disturbing images more impactful.
“I have had some students in past semesters talk specifically about this, and every time it’s been a guy saying, ‘Oh, recently I’ve been seeing all these disturbing videos of animals being like, being hurt or killed,’” Green said. He added that many of those in the class who identify as women say they don’t see those images.
Whether from research or from hearing it directly from students, both Green and Billinson say young men are being fed negative content. Billinson said students in her classroom have talked about unintentionally falling into disturbing places online or seeing unsettling content.
“It’s hard to get those things out of your mind. It’s like, once you’ve seen it, the damage is done, you know, and it’s difficult. So, even if it’s not making you go out and be violent, or making you go out and do something, it can still have that negative impact on mental health and general well-being,” Billinson said.
Seeing these images online can be startling and can contribute to mental health effects like depression and isolation. Green said that men often do not use the term ‘depression’ even when they have similar symptoms to women, are less likely to talk about their feelings, and are more likely to feel like they don’t have close friends.
Orr questions whether boys are given the same permission to express that they are depressed or have those feelings. Orr gave the example of a young man and a young woman seeing a scary movie at the theater: the expectation is that the young man is there to hold the girl if she’s scared or upset, and that he needs to laugh it off.
“I think holding those feelings in and then eventually becoming numb to them means that there’s, like, more room for eventually, like, acting out in violent ways that just aligns with what you’re surrounded by— the ideas that you’re surrounded by,” Orr said.
Orr said that women and girls have “more permission to look away,” or to tell their parents and express their feelings, whereas men and boys seem to bond over the disturbing images they see and share them in their social circles.
“When that becomes a part of your self development, you’re showing that you’re cool, you’re one of the group, you’re a man, then you shut down [the] internal voice in your head or in your gut that’s telling you, ‘This is bad. This is gross. This is not a good thing,’” Orr said.
Even with athletes like DeMar DeRozan talking openly about therapy and mental health, the internet offers opposing perspectives from figures like Andrew Tate. Green said that Tate would describe sharing those experiences as a weakness.
Is regulating the content on social media censorship?
The content seen on social media is largely unregulated. Billinson said the government has not stepped in to regulate because of First Amendment rights, but added that more could be done to regulate these apps. Responsibility falls on individual users rather than the corporations, and when users are young, it can fall on parents to monitor.
“A lot of people want to say, well, parents should be monitoring. How can a parent possibly monitor what their kids are doing on screens when they’re on screens for like huge parts of the day? Even if [the parents] weren’t working, it is just impossible, right? It’s just an impossible task,” Billinson said.
Orr said that harsher content will keep surfacing on social media feeds if it is not kept in check and will continue to have negative effects on people.
“I think corporations have responsibility. TikTok has responsibility. We cannot rely on them to do the right thing. They’re a corporation. They’re out there for money, right? And as long as they’re earning and they’re able to get away with it, they’re gonna get away with it,” Orr said.
What can be done to stop negative content?
Billinson said this year’s students are the first post-Facebook generation, having grown up with social media for most of their lives. Because of that, conversations about what is viewed on social media can help a person’s mental health and well-being.
“I think that should be done on campuses. It should be done in classrooms. It certainly should be done in like media classes and things like that, which is what I try to do a lot, and have open conversation,” Billinson said.
Having an open dialogue about disturbing content seen on social media can reduce isolation and build understanding that other users are seeing this type of content too. Orr recommends reporting the content as soon as you see it, while acknowledging that reporting is easier said than done when you’re startled in the moment.
“Spreading the word that reporting things is actually way more effective to help shape your algorithm than just quickly swiping and not like watching a video all the way through,” Orr said.
Green said that technology can be really important, especially for those who need a community of people they can relate to. With so much information on the internet, though, it can be tough to know what to trust.
“I don’t want to just dismiss new media. I think there’s actually a lot of good things buried in there. The problem is that we don’t have the critical skills to think about what we should trust and what we should not trust— and even when you do, it’s really hard to know,” Green said.
Social media is a form of entertainment that is not regulated. Talking about what’s on our social media feeds can aid mental health, especially when content is disturbing or unsettling. Reporting unwanted content is a recommended step toward removing it and could keep others from seeing it.