
Now That We Know What We Know, What If We Could Give Social Media A Do-Over?

A strongly held internet-related belief of mine is: if there are jerks or misinformation on a social media platform, it is completely the fault of the platform’s owner. The owner makes the rules. The owner enforces the rules. The owner can kick anyone off for breaking the rules. The owner can moderate the content. The owner can decide which content is prioritized. The owner decides what the platform will be.

I believe this partly because I have my own (relatively) small platforms. This blog. My Instagram account. My Facebook page. I can make commenting and community rules. They can be rules or suggestions or guidelines that are public and known. Or they can just be in my head. I can enforce those rules in different ways — on the blog I can moderate or delete comments; on Instagram and Facebook I can delete comments or even block users.

If I want my platforms to be smart and interesting and welcoming, then I have to actively enforce those commenting guidelines and community standards (even if they’re just in my head). If you come to one of my platforms, I want you to know that though someone may disagree with you, you won’t be viciously attacked, doxed, or called names. And if any of those things do happen, I need to be watching, so that I can delete immediately.

This month marks 13 years of the Design Mom blog. And in that time, I’ve had over a million comments across the blog and other platforms. That’s a lot of comments for one person to moderate, but happily, the Design Mom community is really good at commenting without being rude, so it’s generally not a difficult job. When I do need to delete a comment, it’s often (but not always) from a new-to-Design-Mom person — someone who doesn’t know how it works around here, and doesn’t know they’re wasting their time when they write a cuss-filled rant, because it will be deleted right away.

Around 2009 (a decade ago!), many personal blogs removed their comment sections, or turned on full-time moderation, so all comments had to be approved one by one. I didn’t do that and it’s mostly worked out fine. But once in a while, when I’ve written a particularly provocative post, or I’m caught off guard by some drama, I wish I had turned on full-time moderation.

Sometimes I’ll hit publish on what I think is a non-divisive post, and walk away from my computer to run some errands. When I come back to check on the comments, to my surprise an argument will have broken out and feelings are hurt. Urgh. If I’d stayed at my computer and seen the first drama-comment come in, I could have deleted it immediately and prevented the trouble. I always feel awful when that happens; I hate that some readers have experienced attacks on this website.

Instead of turning on full-time moderation, I’ve opted to keep comments open, but turned on moderation for a short list of commenters who have given me the impression they like to start trouble. It’s not a perfect system.

The idea of social media companies needing to take more responsibility for what happens on their platforms has been on my mind. When I see the stats on the amount of misinformation being spread on Facebook, it makes me pretty angry, because I know they could stop it if they wanted to.

Case in point: Earlier this year, in an effort to stop misinformation from spreading on their platform, Pinterest blocked all vaccine-related searches. Five stars to Pinterest! Facebook isn’t willing to go that far, but promised to “diminish the reach of anti-vaccine information.” That sounds pretty weak to me.

A new article in Bloomberg describes how Facebook tracks misinformation about Facebook itself: when someone posts it, the company shows that person content debunking the claim. Which makes it quite clear they have real options for preventing the spread of misinformation, but choose not to use those options unless they see a benefit for their own company. Urgh. Urgh. See this tweet for more details.

Knowing what we know now about some of the negatives of social media — like the often horrific comment sections on YouTube and Reddit, and how algorithms tend to recommend articles and videos that feed paranoia and extreme views — if you were starting from scratch and relaunching your favorite platform, how would you change things? How would you prevent spam? Awful comments? Misinformation? And is there a way to prevent the harmful comparisons and loneliness that Instagram can drive?

The first platform I would tackle would be Pinterest, probably because I think it’s the easiest to fix. If I were in charge, I would want to launch a similar site — a place where you can find images, save images, and organize images — but change the rules quite a bit:

- There would be no handles or public usernames (like @designmom).
- Users would not be able to follow other accounts.
- There would be no public counts of how many times an image has been liked or saved or shared.
- There would be no way to know who pinned what.
- There would be strict rules on objectionable content, with zero tolerance for violent content, and low or zero tolerance for provocative content. (I love me some provocative content, but there are other websites for that.)
- There would be no incentive to promote your own work or your own products or brands.
- There would be no ads, no recommendations.
- There would be no type-over-photos allowed (meaning no “10 Spookiest Halloween Treats” over a photo). Users could make notes or add keywords in a text area below the pin.
- A core share of pins, maybe 40%, would come from paid pinners (meaning pinners Pinterest has hired): really stylish and creative people from across the globe. Not celebrities or famous names, just people with a really good eye who pin anonymously.
- Users could save/repin content and pin new content as well, but other users would never see that new content; new pins would live only on the user’s personal boards.
- Users could search using keywords or subjects, but some topics would be banned.
- It would not be social media (again, no following; no commenting). Instead it would be a tool you subscribe to for $1 per month — and you can pause or restart your account as needed. So you might just use it for a few months for a particular project. Or you might subscribe every month because you use it as a trend watcher and inspiration source. (As a reminder, there would be no ads.)
- Users would need to show I.D. to establish an account. This would help reduce spam accounts, and discourage people from posting illegal content, since it could be traced back to their account and reported.
- It would be a privilege to use this tool. If a user tries to bypass the rules, they only get one strike before getting booted from the site, and blocked forevermore. Which I imagine would be a good incentive for staying aware of the platform guidelines and sticking to them.

What do you think? Would you like a tool like that? : )

As for Facebook, I’d like to see a better-safe-than-sorry approach. As soon as the algorithms notice a post is starting to trend, the system should pause it until a real human takes a look. If the trending post is an easy yes, then it’s approved and continues to trend. If it’s an easy no, then all instances of the post are removed and blocked. If it’s questionable — not quite yes, and not quite no — then err on the side of no. And maybe any account that has 3 posts removed would be automatically banned. And no spam accounts — one account per person, proven with I.D.; if you’re caught creating a spam or bot account, you’re banned.

Ultimately, I want Facebook to make simpler, stricter content rules and then enforce them. At first people will get kicked off because the rules are new, and they won’t believe the rules will be enforced. But after a short while, it will be clear users need to keep to the guidelines in order to use the site. Spam and bots and misinformation will disappear from the site. And the site will be a better, safer, happier place. Yes, there may be some really good articles that don’t make the cut, but that’s okay. Not everything needs to be Facebook appropriate; there are other platforms and news sources.

Your turn. What’s your favorite platform? If you could reinvent it, what sorts of changes would you want? What kinds of things could the owner of your favorite platform do right now to improve the experience?

P.S. — A related article: “Numerous studies have shown that corrections [of disinformation] can actually strengthen a person’s belief in misinformation.”
