Cake
    • William

      I have been thinking about this for a while, and I have finally decided to voice my disagreements (as a user) with Cake’s current moderation tool that allows Original Posters, or OPs, to delete other users’ posts at their discretion. My disagreement stems from several reasons, which I will elaborate on below. Before I begin, I would like to say that I do think Cake is a wonderful thing that is honestly a pleasure to use, and this is not meant as an attack, but rather a reasoned critique of an aspect I disagree with and think the program as a whole would be better without.

      Another thing I would like to say before I begin is that I understand the reasoning behind why the system exists in the first place, i.e. to protect women and empower them to remove harassing posts without needing to involve a moderator. I also understand that this would (in ideal conditions) reduce moderator workloads, which is a great boon to a small company like Cake, as employing moderators is a costly endeavor. However, I personally believe that the downsides of this tool are too great for it to be implemented.

      My first issue with the moderation tool is that it disincentivizes users from continuing to use Cake. What I mean is that if an average user gets a post deleted due to a misunderstanding, that user isn't going to be happy. Sure, it might have been a misunderstanding, and sure, the post might get reinstated, but who would want to go through that for a post on a message board? I feel it is reasonable to assume that people whose posts are deleted would be less likely to continue using Cake, as no one wants their voice to be censored. Now, please note, I'm not making a freedom-of-speech case here, but rather a pure user-retention argument.

      As an aside to the first issue, you could say that the people whose posts are being deleted are likely bad apples or trolls, and that it would be a net boon to Cake to disincentivize them. If it only got rid of trolls, I might agree with that argument, except there is no way to ensure that it is only getting rid of trolls. Instead, the tool relies on the judgement of other users, which could lead to very negative outcomes. Furthermore, there is demonstrated evidence of the tool being misused on well-meaning individuals right now on Cake.

      My second issue with the tool is that it doesn't fit the site. From my understanding, the tool was adapted from a similar one on Facebook. However, I don't think it belongs on Cake, because Cake isn't Facebook. It is reasonable to have a tool like this on Facebook, as it's completely acceptable to give this level of control (deletion powers) over your own personal page. But Cake isn't a personal website, it's a message board, and Cake threads aren't status updates on a personal page, but rather lines of discussion about specific topics on a public forum. The main point I'm trying to get at is that some things that are completely reasonable in a private space are not very reasonable when applied to public places.

      My third issue with this tool is that it is a godsend to trolls and other ne’er-do-wells. As someone who has followed a variety of internet mischief for a while now, I've never seen a tool with such potential for misuse given to users so freely. What do I mean by this? Let's consider some examples:

      Example 1 (Mild): I made a post on December 16 asking Cake users if they preferred cats or dogs. I got a few replies, all saying dogs. Now, if there had been a user who said they personally preferred cats, I could have called them a troll and deleted their post, or just deleted it without saying anything. This is a very mild form of trolling, so let's look at some stronger forms.

      Example 2 (Political): Let's say someone from one of Reddit’s political echo chambers (r/LateStageCapitalism, r/The_Donald) comes over and starts a thread that appears unassuming, talking about politics. They can then delete literally any post in their topic they disagree with, with nothing stopping them. This will lead to quite toxic, malignant echo chambers developing on Cake that are unappealing to the vast majority of users. Even if you can filter these threads out, what's stopping these users from deleting other users' posts simply because they hold a different political view? For example, say someone from The_Donald started a thread about motorcycles, noticed a comment (about motorcycles) from someone they disagreed with politically, and deleted that post for entirely unrelated reasons.

      Example 3 (Worst Case): This example is, IMO, the worst possible outcome of abuse of this tool, as it uses the tool in a way that is directly antithetical to its stated purpose. Say someone is a secret troll. They've built up a reasonable account with some basic posts, one that looks like a regular user's. But in reality, they're preparing for one big act of trolling. On International Women’s Day (March 8), they start a thread that looks something like this:

      “On International Women’s Day, we are all discussing and celebrating the great accomplishments of women around the world, from getting the vote to getting our rights. But we cannot ignore the blights upon our gender. Rape, abuse, domestic violence, FGM, the list goes on and on. Let's start a discussion here about these difficult issues, and show solidarity with our fellow women! Please feel free to share stories and give support, this is a safe place.”

      The thread blows up: hundreds of posts; maybe it even gets stickied to the front page, who knows. After a bit, the troll strikes. With rapid clicks, he starts hiding every single post. Stories and voices are all fodder to the threshing strikes of the troll as he turns the once-lively discussion into a wasteland. Having salted the earth, he places a cherry on top by posting a final message stating that women are worthless filth who deserve to be raped, or some other vacuous screed. He remains on the thread, deleting any replies to his manifesto, until an actual mod comes around and resolves the situation.

      Now, you may think I am being hyperbolic here, but I assure you this is all well within reason. There are plenty of trolls who will plan attacks like these, and even do the groundwork beforehand, building a good cover to fly under the radar. As no one has protection from the tool, it's completely plausible that a single troll could wipe out hundreds of comments in minutes.

      As I said earlier, this abuse of the tool is the worst because it goes against everything the tool stands for. It takes something designed to empower women and uses it to silence them. This is why I think this tool is so dangerous. It puts too much power into too many people's hands with absolutely zero vetting.

      My final issue with the tool is that it creates extra work for the moderators. In my previous example, if a moderator wanted to resolve the issue by restoring everyone's posts, they would have to restore each post to the thread individually. That's an incredible amount of work for a moderator, and it turns their job from moderation into damage control. A possible fix would be giving moderators the ability to mass un-hide posts, but that solution just reinforces the idea that people would be abusing the tool, and it could also allow some genuinely abusive posts to return among the false positives.

      A counterargument would be that if users didn't have this tool, the moderators would have to respond to reports anyway and delete these posts individually. That's true, they would. However, under the current system, they instead have to respond to reports from the people whose posts were deleted, so no work is saved; there is instead the extra work of correcting for trolls abusing the system on top of existing moderation work.

      These issues are my main concerns with the system as it currently exists, and I feel that if nothing is done to change it, it has extreme potential for abuse and mischief. I would like to restate that I really do enjoy Cake, and I only want the best for the system. I would appreciate feedback on this, and I hope everyone has a wonderful day!

    • PJ
      Pj Kinsley

      There is probably a set of rules you could put in place that would keep the tool useful to its user but stem any kind of systematic abuse.

      Something as simple as a cooldown timer between hides, or a limit on hidden posts per day or hour, could keep the tool useful while limiting how much it gets abused.

      If someone is really posting so much that you would need to hide 5-10 posts in a short span, I would assume a moderator should be involved anyway.
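      The limit PJ describes could be a simple sliding-window counter. A minimal Python sketch, with invented thresholds (the post only suggests "per day/hour"):

```python
import time
from collections import defaultdict, deque

# Hypothetical limits -- the post only suggests a per-hour/day cap.
MAX_HIDES = 5          # hides allowed...
WINDOW_SECONDS = 3600  # ...per rolling hour

_hide_log = defaultdict(deque)  # user_id -> timestamps of recent hides

def try_hide(user_id, now=None):
    """Return True if the hide is allowed, False if the user hit the cap."""
    now = time.time() if now is None else now
    log = _hide_log[user_id]
    # Drop timestamps that have fallen out of the rolling window.
    while log and now - log[0] > WINDOW_SECONDS:
        log.popleft()
    if len(log) >= MAX_HIDES:
        return False  # over the cap: could prompt "contact a moderator" instead
    log.append(now)
    return True
```

      Once a user hits the cap, the UI could route them to PJ's other suggestion below, a priority report button, rather than silently failing.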

      Now perhaps some of these rules already exist or they are in the works.

      There is always the option of giving those same users some sort of priority reporting button that would get a post proper attention in a hurry.

      Worst case, you could always track use of the tool and remove it from folks who are deemed to abuse it. That could get pricey as well.

    • Chris
      Chris MacAskill

      Hi Will,

      What an awesome post. Many thanks. I can tell you've been thinking hard about this for a while, and you went to a lot of work to put it in writing. Imagine if you did this in a conversation I started and I deleted it because I disagreed with it. 😰

      The timing of your post is perfect for us because during the 2 years we've been building the service, I haven't felt we had thoroughly solved how moderation would work at scale. It's one reason we have an invite system in place and only let users trickle in, a few per week. Just yesterday I told the team at our weekly coffee that I think I finally see how it should work, and that I'd write it up.

      My thinking about the issue really started with my motorcycle forum, which gets 2 million unique visitors/month. I own the site and have admin powers, so I can move posts in a thread or remove them, as all mods can do. Some years ago, I started doing it in my own threads in the politics subforum (only visible to logged-in members). Those threads became very popular because I sometimes zap the soul-destroying petty squabbles that don't convey real information, and our members have come to appreciate that at least in my threads, it's not censorship, it's just cleaning up noise and keeping it on topic.

      We have tools there, however, that we haven't built here yet. There, you have the option to notify someone of what you're doing and why, and give them an opportunity to edit and re-submit. I know of at least one case you're referring to on Cake, and I spoke to both sides about it. What they both wanted more than anything was to be able to talk to each other so they could understand each other's point of view, and resolve it. We haven't built that yet.

      Over the years I've spoken to various community leaders for Facebook Groups, Medium, blogs, Instagram, forums, and so on, and one theme seems to come up everywhere: the asymmetry of the problem. If you post something in a long thread that took a while to compose, it's a bummer to have the thread-starter nuke it. But if you're the thread-starter and there are 100 posts on a great topic from 50 participants, it only takes one jerk to ruin it for everyone. That's pretty asymmetric.

      Or, as trolls have learned on Twitter and forums where the thread starter cannot delete your post, one vile or threatening post is all it takes to shut the thread starter down even if they are someone famous and popular, like Jane Goodall. And then thousands think it's a shame that one person can spoil it for everyone.

      I think that's why the sites that have remained fairly happy places at scale, like Instagram, Pinterest, and personal blogs, make it so easy to swipe left and delete a comment from your post. I've interviewed various people about the feature and the stories we hear are like Richard's (not his real name): "Oh, my mother-in-law has a drinking problem and can say some crazy stuff on Instagram she regrets later. It humiliates my wife. No mod would catch it because it's personal and they don't know the context. We can't block her, she's my wife's mother. But neither of us would use Instagram if we couldn't delete some of her posts when she gets crazy."

      My dream is for the best conversation starters in the world, like Oprah, to be able to start Panel discussions or regular conversations that millions of people would want to join. But I think Oprah's team would require us to enable them to vet the comments so the conversation is truly great. Otherwise it becomes a broadcast network like Twitter, where she'll tweet but never engage with the torrent of awful replies, no?

    • Shay

      Another group you have to consider is the teenagers who will use Cake.

      Cyberbullying is a serious social issue, unfortunately leading to problems from low self-esteem to suicide.

      Parents have to be able to control the posts their children put up, and to be able to censor and delete any that are hateful.

      Being able to block bullies and delete their posts is essential, imho.

    • cvdavis

      Very well argued, William! The research points to most of us creating our own echo chambers. We read news sites and watch news programs that contain perspectives we support, we follow YouTube channels of users we generally agree with, and our Facebook feeds are geared towards things that make us feel good about our beliefs. Facebook, I believe, has made some efforts to provide counter-opinion pieces, but I'd agree with William that we need to get used to seeing opinions or views we currently disagree with. We also make decisions that we might later not see in such a harsh light. Some people like myself love to keep arguing the different sides of a debate, and that can offend people who would much prefer that everyone just agree with them. We must get used to hearing other opinions and to defending or arguing for the views we do support. This makes us a little less biased and more focused on the facts, rather than getting heated over someone who may disagree with us. I am in strong agreement with William so far, but I'll see what others have to say.

    • cvdavis

      Newspapers are now often requiring people to use their real names to reduce the amount of abuse and trolling.

    • cvdavis

      Instagram and Pinterest tend not to be places where political views and controversial subjects are discussed, though younger people can sometimes make strong attacks on people for various reasons.

      I'd say look at political newspaper comments to see how to deal with truly controversial conversations.

      I have a younger friend on YouTube who has many followers, and people on there can be extremely cruel. The comments often have nothing to do with the video and are simply direct attacks on the person for no apparent reason other than jealousy and/or mental issues. The ability to delete those posts is pretty important for my friend.

      People like Oprah have been extremely influential, but some people like myself dislike that she promotes pseudoscience, alternative medicine, and people like Dr. Oz. Now, I wouldn't troll her conversation, but if she said something I disagreed with, I'd certainly not want my post deleted if I was respectful and focused in my comment.

    • cvdavis

      Shay, you are absolutely right! Having people use their real names can help solve some of this problem, but of course I realize there are ways people can get around this.

    • Vilen
      Vilen Rodeski

      Great post, William! Thank you for writing your thoughts down so that we can see your points.

      I absolutely agree that the ability to "Hide" posts is extremely controversial when misused, as you've noted in your examples. As it stands today, the tool is very basic and isn't fully finished compared to how we envisioned it. It just hides the post from everyone but the owner of the conversation and the person who posted it. They both can see the post, but they have no easy way to resolve the situation if they want to (like direct messaging each other, or being able to explain the reasons for hiding).

      We still have a big component of moderation to build, which would be a combination of internal tools and data that will flag trolls who hide, delete, and otherwise ruin conversations. We'll also rely on people to flag bad behavior and bad actors, as is done in many other services, but we are going to make it much more sophisticated and automated. We can only do so much with human moderation, so having a machine-learning algorithm bubble up problem makers is critical.
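      A crude placeholder for the kind of flagging Vilen describes, with a hand-written rule standing in for the eventual machine-learned model (the thresholds and stats are invented for illustration):

```python
# Hypothetical sketch: bubble up users whose hide behavior looks anomalous
# so a human moderator can review them. A real system would replace this
# rule with a trained model, as the post describes.
def flag_for_review(user_stats):
    """user_stats: dict of user_id -> (hides_issued, replies_received)."""
    flagged = []
    for user, (hides, received) in user_stats.items():
        hide_rate = hides / max(received, 1)
        if hides >= 10 and hide_rate > 0.5:  # assumed thresholds
            flagged.append(user)
    return flagged
```

      The point is only the shape of the pipeline: cheap signals surface candidates, humans (or a model) make the call.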

      As for "real names" and identities, we've debated them over the last 2 years and done a lot of research, which @Chris can point you to. It comes down to the fact that there is no real evidence that using real names actually prevents trolling or makes people behave more nicely. The research shows that using "usernames" only strengthens the behavior users already exhibit. So if you are trolling with a "real name" (like on Facebook), you'll troll even more when using a "username". Conversely, if you are being genuine using your "real name", you can open up even more when using a "username".

      It is all about trade-offs, but for us the goal is to spark great conversations and we think that "hiding" is a critical piece of this puzzle. We are still building it and need your feedback!

    • cvdavis

      Dealing with bots and real fake news must be a challenge.

    • neduro

      This has to be one of the hardest topics on the internet today. I don't envy the Cake braintrust finding the right answer!

      My opinion is that we can have our cake and eat it too (pun intended). The OP could start a thread with either flavor of moderation: the ability to hide posts would be advertised up front, maybe with a flame or a kitten next to the post title.

      If you want a bare-knuckles no-holds-barred thread, make it so (Cake would still reserve the right to remove posts that are over the line, wherever that might lie). If you want to moderate your thread and keep it on topic, start the thread with those permissions and everyone can see it and participate with that in mind.

      This might be the kind of thing that resolves itself. Give people the choice, and if they all choose the same thing, it's an easy decision. I don't think that's what will happen, but I do think Cake doesn't have to decide now.

    • William

      This is a very unique take on the idea, and I kinda like it. Knowing beforehand whether you're entering a strict or relaxed moderation thread would probably shape the discussion somewhat. But it still leaves things open, so if you feel stifled by one option, you can start a thread with the other moderation style.

      .......... I really like this. Its not a perfect fix, but its a pretty great idea.

    • William

      I think machine learning / automation would be another great way to solve this problem. One thought I've been mulling over is something akin to Reddit's shadowban system, where a bad actor is still allowed to post, but no one else can see it. Perhaps if someone was overusing the hide function, like in my initial extreme example, then after 2-3 hides in a short period of time they would be temporarily shadowbanned from the tool. After that they could still hide posts, and the posts would appear hidden on the troll's side, but on everyone else's page they would still be up.

      This possible fix would let the troll feel initially successful, but when they realise they actually aren't having their desired impact, they would be further discouraged from trolling because the system is shown to be ineffectual.
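      A toy model of that shadowban-from-the-tool idea, assuming a made-up threshold of 3 real hides before further hides become visible only to the OP:

```python
SHADOW_THRESHOLD = 3  # assumed: real hides allowed before the tool goes shadow

class Thread:
    """Sketch of a thread where excessive hiding silently stops working."""

    def __init__(self, op, posts):
        self.op = op
        self.posts = list(posts)
        self.real_hidden = set()    # hides that took effect for everyone
        self.shadow_hidden = set()  # hides applied only in the OP's own view

    def hide(self, post_id):
        # The first few hides work normally; past the threshold, further
        # hides are shadowed -- the OP sees posts vanish, nobody else does.
        if len(self.real_hidden) < SHADOW_THRESHOLD:
            self.real_hidden.add(post_id)
        else:
            self.shadow_hidden.add(post_id)

    def visible_posts(self, viewer):
        hidden = set(self.real_hidden)
        if viewer == self.op:
            hidden |= self.shadow_hidden
        return [p for p in self.posts if p not in hidden]
```

      Here a mass-hiding troll only damages their own view of the thread once they trip the threshold, which matches the "feel successful but have no impact" goal above.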

    • Chris
      Chris MacAskill

      Thanks, William. Very fascinating. We have the notion of an internal trust score for every member of Cake that would work like Google's search index.

      The success of Google's algorithm depends on not revealing how it works and frequently adjusting it to thwart people who would game it. And we've been thinking similarly. For moderation, we would look at each post for people who are new or who have needed moderation in the past, but for people who've earned trust, probably no need to burn moderator cycles.

      Since everything comes back to our mission of helping you find great conversations, for content discovery it makes sense that people who are really good at contributing to conversations earn higher trust scores and their stuff surfaces more readily. One key would be that conversation starters who use their moderation powers to improve the conversation end up with better conversations, translating to those conversations being surfaced more. People who abuse their moderation powers and frustrate people would hopefully not have their conversations surfaced as much. Make sense?
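      The surfacing idea Chris describes could be as simple as weighting engagement by the starter's (private) trust score. A hedged sketch, with invented scores and a deliberately naive formula:

```python
# Hypothetical trust-weighted discovery ranking: two conversations with
# identical engagement surface differently based on the starter's trust,
# which abuse of moderation powers would drag down.
def discovery_score(engagement, starter_trust):
    """engagement: e.g. recent posts/participants; starter_trust: 0.0-1.0."""
    return engagement * starter_trust

conversations = [
    # (starter, engagement, assumed internal trust score)
    ("careful-moderator", 120, 0.9),
    ("hide-abuser",       120, 0.2),  # same engagement, trust hit from abuse
]
ranked = sorted(conversations,
                key=lambda c: discovery_score(c[1], c[2]),
                reverse=True)
```

      As Chris notes about Google's algorithm, the real formula would stay private and shift over time to resist gaming; this only illustrates the direction of the incentive.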

    • ia

      You raise some very valid points, and as my MIL used to say, you need to understand both sides to have an opinion. If one has the power to remove posts, they have the power to remove any dissent, creating a one-sided environment. In that sense, we agree. However, having the ability to get rid of truly nasty comments, or those that are off-topic, is of great benefit when carefully thought out.

      I feel those who routinely remove dissenting opinions will find themselves with fewer participants, as people move away from topics with overly aggressive moderation.

      Maybe another way to handle it is to allow readers to branch off the original topic, sort of avoiding the train wreck, or even allowing a discussion to go in a different direction? Do you think that would help?

    • Chris
      Chris MacAskill

      Interesting, Ian. We've talked about having an option to start a related conversation because so many times in a long conversation it branches off to a side one, just like in real life. It's one of those things we have penciled in as a possible future feature.

      There seems to be a gender difference with respect to the conversation starter being able to delete a post in their conversation. I don't have huge numbers to base this on, but the handful of women we've spoken to have said adamantly, "Please never remove this feature."

    • ia

      I don't think it's gender specific. I bet if you asked those in groups considered disenfranchised, they would want similar ability in an open forum.

    • ia

      BTW, what if branching were visual? A participant could see where the topic was going in the various branches.
