***with the exception of racist content, the use of slurs (racial or otherwise), targeted harassment, and incitement of violence, ***
Did everyone just skip right past reading this part? That’s a lot of exceptions, covering a large gamut of activity that will continue to be disallowed. That’s not exactly “free speech” by definition, but it also doesn’t permit content that most other platforms already prohibit.
I’ll put it this way: there have been dozens of reddit alternatives over the years. Of those, pretty much every single one that advertised free speech has gone under after being overrun by right-wingers, pseudo-Nazis, etc.
The fact is, the biggest subset of people deplatformed from reddit or any platform are truly just awful[1], regardless of what they claim about unfair moderation. If you don’t make it expressly clear that you will not tolerate them, they will flock to your platform. Any claim of “free speech”, even qualified with “oh, but nothing too awful please”, is basically a dog whistle to them.
If someone says something like this, they’re either naïve about how this works or they’re just saying it to maintain appearances. Either way, the platform is doomed.
[1] Well, maybe not recently due to API issues, but they’re still a huge subset and will be the majority again eventually.
I would imagine a place shouldn’t even need rules for that in the first place, but I understand people aren’t always the kindest they can be online.
I also think a lot of what is called “bigotry” is identified subjectively (one person thinks a thing is bigoted while another doesn’t, and one cannot and should not default to agreeing that every interaction is bigoted, or no interaction would be allowed anywhere). Still, I would imagine the vast majority of “bigotry” falls under the very broad “slurs (racial or otherwise)” or “targeted harassment” exceptions.
I don’t know all the details, but it’s possible these admins were overly strict in removing content they considered bigoted, to the point of being disruptive. I used to operate a forum back in the early 2000s (for reverse engineering video game software), and there was one moderator I had to remove because they were too strict in deleting content for a similar reason. Entire threads would be left as graveyards, with no way to discern the context.
I am only presenting my own speculation, of course. What you’re saying is also possible; the only way to know is to wait and see what happens. I think a big problem for these platforms is how quickly people bandwagon into leaving when a small group decries a potential problem. It’s like when people try a new game with a low player population and call the game dead. Those people leave and tell everyone else the game is dead, so nobody really joins, except the bottom-feeders nobody else wants.
I am not exactly sure what I am missing?
There are a lot of types of bigotry and other general nastiness that are not covered by that.
Normally I would not be so nitpicky with language but if multiple admins were removed / quit over it, that’s pretty suspect.