Popular social media apps now all have parental controls in place, but do they really make the service any safer?
Children are online more than ever, and screen time in general is way up. A recent study found that teenagers are on some form of social media almost constantly. Given the dangers these services can pose, parents are more concerned than ever. Social media apps keep adding parental controls, but the big question remains: are they really making a difference?
Mounting calls, not only from parents but now from lawmakers as well, are urging companies to make changes. Meta (which owns Instagram and Facebook), TikTok, and Snapchat have all rolled out new parental controls in the last year or so to give moms and dads more supervision over how their children use social media. But the biggest issue these revisions fail to address may be the most harmful thing about these apps for young adolescent minds: dangerous algorithms.
Social media apps use algorithms that gather each user’s data and, in turn, serve them specifically targeted content. The key here is to remember that while companies like Snap, which recently championed its new parental controls, tend to convey the message that they are helping parents, in the end these companies profit more the longer their users stay engaged. Last fall, a former Facebook employee became a whistleblower, speaking out against the media behemoth’s use of dangerous algorithms on kids. By hooking them with endless content, she argued, these algorithms have contributed to mental health and emotional problems in youth.
The flaws in Snapchat’s newly implemented parental controls are another example of why these measures likely won’t make social media apps any safer. Earlier this week, the company launched Family Center, a new tool that lets parents see who their children message, but not the conversations themselves. The company praised the tool as mirroring real-life parenting, noting that parents ask their children every day about who they interacted with outside the house, but don’t necessarily get the opportunity to monitor those conversations.
While that analogy holds, and parents do need to have conversations with their children about how they use social media, these parental controls break down in most real-world scenarios. To start, children must send their parents a linking code before the controls can even be activated. Moreover, these measures completely ignore another major issue on the popular messaging platform: cyberbullying.
In the past few years, cyberbullying has increased by 70%, a rise that has been directly linked to social media use. Apps like Snapchat have made it easier for teens to become victims, in large part because the app’s signature feature erases all messages after a short time. On the surface, companies like Snap praise their parental controls as effective, but they tend to ignore the bigger picture.
So while these parental control measures are unlikely to make a dent in the real dangers of social media, there is growing urgency for legislators and Congress to address the situation. Last March, a group of state attorneys general opened a nationwide investigation into TikTok, which is still underway. With these growing concerns, the hope is that stricter regulations will be imposed on these companies in the coming years.