Jeff Perry

Hey there, I'm Jeff Perry and I write this blog, make YouTube videos, and other things I think you'll enjoy!

Moderator Mayhem

Today Techdirt released their second game, Moderator Mayhem. It is a "game that lets you see how good a job you would do as a front line content moderator," Mike Masnick writes, "for a growing technology company that hosts user-generated content."

Not only does the game have you moderate content, but it also gives you feedback from your manager and the public.

Are you supportive of free speech, or too oppressive in your moderation? Are you allowing too much harassment, making the platform feel unsafe? One thing about the public is that they're not shy about letting you know how they feel.

I did my first run on my lunch break at work, and I absolutely plan to play more later.

Techdirt has managed to, once again, provide some fantastic context to what is going on in big tech and the platforms we all know and love.

It's time for journalists to leave Twitter

I was listening to a recent episode of the podcast Hard Fork, and a specific part of it really resonated with me. It was where Casey Newton was talking to his co-host, Kevin Roose, about NPR's decision to leave Twitter after getting a "state-affiliated media" badge.

You know what I’ve been thinking about lately, Kevin? Do you remember during the Trump campaigns, when there would be these rallies. And in the center of the rally there would be a pit for the media. And a signature moment of every rally would be Trump pointing to the people in the media so that everyone at the rally could say, boo, we don’t like the press.

That is what Twitter has become. It is the press pit, where a bunch of people are standing around you in a circle, jeering. Adding this state-funded media badge was one of those steps. But I’m barely joking when I say that I think eventually every reporter who is still on the service will have a clown badge next to their username. And you just have to decide if you still want to be there when it happens.

I truly think this is just a matter of time, and if Elon listens to Hard Fork then he almost certainly has at least talked about it with his overworked developer team.

To add to this, I also read in a recent post by Pew Research that journalists on Twitter might not even be getting the views they deserve. Nearly 70% of journalists name Twitter as one of their top social media platforms. Meanwhile, only about 13% of users rely on Twitter as a means of getting news.

The amount journalists use Twitter is wildly disproportionate to their actual reach there.

Huge Changes for New Mastodon Users

People signing up for Mastodon will no longer have to worry about what server to go to. Instead, Mastodon will now be defaulting to a server they operate. Eugen Rochko, Mastodon’s Founder and CEO, explained his reasoning for this saying that “[m]aking the onboarding process as easy as possible helps new users get past the sign-up process and more quickly engage with others.”

ZOOM OUT: This is the balancing act between usability and the open web. Instead of emphasizing Mastodon's decentralization, they are opting for something more centralized.

  • This mirrors more centralized platforms like Facebook and Twitter, though users can still move to a different server later.
  • Bluesky, a Mastodon competitor, does something similar for new users.


Substack co-founder denounces bigotry, but has no plan

Shortly after Substack announced their Twitter competitor, Substack Notes, Nilay Patel interviewed Substack's CEO, Chris Best, to talk about it.

In it, Patel asked Best if the statement "all brown people are animals and they shouldn't be allowed in America" would be censored on Substack Notes. Best refused to say that it would, and when pressed further by Patel, the CEO responded that "I'm not going to get into gotcha content moderation" because he didn't think it was "a useful way to talk about this stuff."

On April 21st, a week after the Decoder interview went live, Substack co-founder Hamish McKenzie wrote a company statement via Substack Notes:

Last week, we caught some heat after Chris didn’t accept the terms of a question from a podcast interviewer about how Substack will handle bigoted speech on Notes. It came across poorly and some people sternly criticized us for our naivety while others wondered how we’d discourage bad behaviors and content on Notes. We wish that interview had gone better and that Chris had more clearly represented our position in that moment, and we regret causing any alarm for people who care about Substack and how the platform is evolving. We messed that up. And just in case anyone is ever in any doubt: we don’t like or condone bigotry in any form.

Spoiler alert: McKenzie doesn’t have any actions or policies laid out to explain how Substack will combat bigotry. “Caught some heat” is about as bad as it gets from a company statement. It might as well have said “got caught being shitty.”

The "heat" in question came from that episode of Decoder, where Best refused to say that "all brown people are animals and they shouldn't be allowed in America" would violate Substack's content guidelines.

It gets worse: in classic whataboutism, McKenzie argues that the other social media companies aren't doing much to fight bigotry despite their huge content moderation teams.

Facebook, Instagram, Twitter and others have tens of thousands of engineers, lawyers, and trust & safety employees working on content moderation, and they have poured hundreds of millions of dollars into their efforts. The content moderation policies at some of those companies run to something like 150 pages. But how is it all working out? Is there less concern about misinformation? Has polarization decreased? Has fake news gone away? Is there less bigotry? It doesn’t seem so to us, despite the best efforts and good intentions of the most powerful media technology companies the world has ever known.

Now, this doesn’t mean there should be no moderation at all, and we do of course have content guidelines with narrowly defined restrictions that we will continue to enforce. But, generally speaking, we suspect that the issue is that you can’t solve a problem (social media business models) with a problem (a content moderation apparatus that doesn’t work and burns trust). So we are taking a different path.

That "different path," McKenzie explains, is "changing the business model." How will they change it? Essentially by making creators do their own content moderation. Substack looked at its writers, the whole reason Substack makes money, and told them to figure it out themselves.

Truthfully, this is a bad company statement trying to walk back Chris Best's blunder on Decoder. In fact, it made things even rockier for Substack.

Substack is a place where writers can write what they want to write, readers can read what they want to read, and everyone can choose who they hang out with. It’s a different game from social media as we know it.

No, it isn't. This "game" is the same one played on Facebook, Twitter, and elsewhere. A platform can have simple, no-nonsense content moderation policies in place and still host people who disagree with one another.

Removing and disallowing statements like "all brown people are animals and they shouldn't be allowed in America" doesn't mean everyone will suddenly be ideologically identical. You can have rules in place to prevent violence while still having healthy discourse.

I wrote before about Substack's poorly written content guidelines and said, "this isn't an endorsement to spread hate but it certainly doesn't thwart any of that kind of behavior either." While I still believe that, the more Substack digs in on its content moderation stance, the more pause it gives me. It makes me believe that Substack is less making a critical error and more deliberately dog-whistling.

Others feel similarly, including Mike Masnick at Techdirt. He recounted the Nazi bar story from Reddit and explained how, with comments like Best's on Decoder, Substack is letting more Nazis into its metaphorical bar.

But Substack is a centralized system. And a centralized system that doesn’t do trust & safety… is the Nazi bar. And if you have some other system that you think allows for “moderation but not censorship” then be fucking explicit about what it is. There are all sorts of interventions short of removing content that have been shown to work well (though, with other social media, they still get accused of “censorship” for literally expressing more speech). But the details matter. A lot.

If Substack truly wants to be a place for everyone to come and discuss what matters to them, it cannot continue with this hands-off approach. Content moderation is messy, and it isn't easy to handle. That said, Substack needs to roll up its sleeves and embrace the mess. Otherwise it will drive more and more people off the platform.
