Platforms Are Taking New Steps To Protect Users
By Isadora Teich
Social media is a complicated beast.
It can be a wonderful tool for connecting with others and growing your business.
However, it can also be addictive, negatively impact the mental health of users, and leave people of all ages vulnerable to new dangers. Not to mention its unintended impact on news and current events.
And while many romantic relationships now begin online, online dating platforms are still plagued by serious issues of their own.
When it comes to predators looking to exploit minors, this is especially disturbing. Whole unsuppressed pedophile communities have been found on Twitter, and lately, even some of the internet’s biggest stars have come under fire for using their fame to interact inappropriately with minors they found on platforms like TikTok and Instagram.
Many people have long been angry with online companies for not doing enough to curb the misuse of their platforms, but these problems are not simple to solve.
If you run an open platform that anyone can join, and massive global communities find ways to circumvent your rules and create fake profiles, how do you manage everything at once?
Even the strictest policies don’t account for all user-generated activity.
People of all ages lie about their ages and create fake profiles for a wide variety of reasons. Some even create fake “troll” accounts just to harass others and send them violent threats. When sites are trying to monitor millions of interactions a day, it’s easy for things to slip through the cracks.
Here are some things that Instagram and Tinder are trying to do to protect users.
Instagram’s New Policies
Instagram is going to make it a little bit harder for random adults to connect with teens, and even introduce safety prompts. Adults will no longer be able to message minor users who don’t follow them.
Teens will also automatically be shown safety prompts when they message adults who have shown suspicious behavior.
Safety prompts will make it easy for users to block or report anyone making them feel uncomfortable, and will also remind kids about online safety. Minors will be reminded to be careful when connecting with strangers online and not to share photos, videos, or personal information.
What Do They Mean By Suspicious Behavior?
Often, users complain that Instagram’s lack of transparency around its policies makes the site difficult to navigate.
This is largely in regard to censorship, as many adult users feel that POC creators who show their bodies are consistently flagged for nudity while white creators are not. It can get even more complicated, however.
Since the algorithms on social media sites were created by people with inherent biases, many claim that even the very way the sites themselves function reflects those biases.
In this case, however, a lack of transparency may actually be necessary. Online predators are often manipulative and find creative ways to achieve their ends, and it is important that they not learn how to game Instagram’s system to get around these roadblocks.
The only publicly known example of what “suspicious behavior” entails in this case is adults who mass-message minors they are not already connected with.
The Limitations of Policies Like These
Instagram requires all users to be at least 13, but it is very easy for kids to lie, and many do.
Instagram says it plans to use machine learning to stop kids from doing this, but it is unclear exactly what that would entail. Instagram will also encourage minor users to make their profiles private, but again, many will not.
The fundamental problem is that many predators are incredibly manipulative and many kids really do not understand the gravity of the situation or the danger they are in. Some kids even pretend to be older than they are online, not understanding the risks at all.
That is why it is so important for platforms to step in and try to protect their young users.
While these steps are important, it is unclear how effective they will be.
Swipe Right And See Your Match’s Criminal History
Now, this is an incredibly interesting one. Most of us can agree that platforms need to step up to protect kids from harm. However, do you think that anyone you match with on Tinder or Hinge should be able to access your criminal history?
Match Group, the owner of a number of staple dating platforms, seems to think so.
They recently made a large investment in Garbo, a female-founded non-profit background check platform. Garbo’s background checks surface convictions, restraining orders, and arrests related to violence.
Tracey Breeden, Head of Safety and Social Advocacy for Match Group, had this to say about Garbo:
“For far too long women and marginalized groups in all corners of the world have faced many barriers to resources and safety. We recognize corporations can play a key role in helping remove those barriers with technology and true collaboration rooted in action. In partnership with Match Group, Garbo’s thoughtful and ground-breaking consumer background check will enable and empower users with information, helping create equitable pathways to safer connections and online communities across tech.”
Privacy Issues And More
Is this a huge breach of privacy?
Should people who have only seen pictures of each other be able to learn such deeply personal things, without context, before even meeting?
Some people would say yes. A very common saying in situations like this is “If you have nothing to hide, it’s not a problem.” I am sure that many people feel this way.
However, considering the failures inherent in the American legal system, this is likely not the cure-all Match Group presents it as. When it comes to violence against women in particular, most of it goes completely unreported.
This may be because the majority of people accused of these crimes are never even arrested, let alone convicted or jailed. Many victims likely have little faith in the legal system as a result.
Men who report abuse face a different situation, and one that is often ignored. Because of widespread beliefs about gender roles, many people do not take their reports seriously. Women are also generally treated more leniently by the US legal system: one study found that when a man and a woman commit the same crime, the man is twice as likely to go to jail if convicted.
It is also important to note that people who can afford it often avoid serious legal repercussions and may even have criminal pasts sealed or expunged, meaning they won’t appear on background checks. It is dangerous to assume that someone is safe and trustworthy simply because they don’t have a violent criminal history on paper.
While there are definite benefits to the potential integration of Garbo into Match Group’s products, there are also important practical limitations and ethical concerns to discuss.
Again, these are complicated problems, and there is likely not one right or wrong way to address them. Any solution will likely also have limited effectiveness due to factors outside of the corporations’ control.
As of now, Instagram can’t control who opens an account, how honest they are, or who they message or respond to.
If it could, the app would likely not be fun for anyone to use. And Match Group has no control over how the US legal system functions.
What do you think? Do you think these are steps in the right direction?
Talk to me.
About ChopDawg.com: Since 2009, we have helped create 350+ next-generation apps for startups, Fortune 500s, growing businesses, and non-profits from around the globe. Think Partner, Not Agency.
Follow us on Twitter
Like us on Facebook
Double-tap us at Instagram
Connect with us on LinkedIn
Find us on social at #MakeItApp’n®