Cake
    • Chris

A few days ago Kara Swisher sat down with Mark Zuckerberg and asked him what Facebook's policy is regarding false news, as he called it. I heard three things that I'm trying to make sense of:

      1. If something is verifiably false, it should be taken down.

      2. Facebook should not be the organization that determines what is true.

      3. Everyone gets things wrong sometimes, so getting things wrong should not be a reason to be taken off the platform, only a reason to become less relevant in the feed. He gave a specific example:

      Let’s take this a little closer to home. So I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong — I don’t think that they’re intentionally getting it wrong. It’s hard to impugn intent and to understand the intent. I just think as important as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public leaders who we respect do, too. I just don’t think that it is the right thing to say we are going to take someone off the platform if they get things wrong, even multiple times.

👆That quote came in response to a question about why Facebook hasn't bumped Infowars from the platform despite its claims that the Sandy Hook massacre was fake.

      What would you have us do on Cake?

    • vegasphotog

For starters: Cake is not a slave to advertisers, Cake is not a slave to advertisers, Cake is not a slave to advertisers... The "problem" really is the advertising-centric algorithm. Infowars probably generates great ad revenue for them through personal profiling. It seems to me that your handful of very conscientious team members are in this thing to promote meaningful discussions. You have already clarified that posting should be "family-friendly," thus no Toxic Brittney threads. But, using that as an example: under Facebook's advertising model, think how much money you could have made just off the TB thread.

For me, I do not think it is practical to turn off FB, but I go to great lengths to manage it. Most people would consider me almost a conspiracy theorist. Not sure how Cake is going to make money... but, as has been discussed before in regards to FB, I would pay a nominal annual fee to participate in a secure area driven by timestamped post content rather than an algorithm solely designed to maximize ad revenue.

    • Shay

      there’s a set of people who deny that the Holocaust happened.

      Infowars claiming the Sandy Hook massacre was fake

      Holocaust denial is a crime in certain countries.

      Sandy Hook was a tragedy that shook the world.

To allow people who believe that neither took place to post and to gather on your site (FB or Cake or any other online forum) is reprehensible.

There is a train of thought saying the internet should not be censored but, imo, that is nonsense.

A line in the sand has to be drawn, and both of the above are examples of what should never be allowed on a public site.

Cake has an opportunity to be a place where the venom and hate that propagate on FB / Twitter will not be allowed to fester.

      I do feel very strongly about owners of public sites taking responsibility for what is posted.

There was a very good documentary on C4, a British TV channel, about how FB filters posts and allows controversial posts to remain on the site.

      link to Guardian article on the documentary.

    • Chris

      I’m getting some feedback that I gave the impression we don’t have a strong stance on this topic. We do: we would never allow it.

I am just trying to understand what Zuck is saying. I can't reconcile "anything provably false should be taken down" with leaving Holocaust denial up.

    • Eddieb

      I'm with @Shay on this one but it does raise questions whichever way you look at it.

People who deny these atrocities usually claim some sort of tin-foil-hat conspiracy to explain them as fake, rather than "unintentionally being wrong" as Zuckerberg puts it. I believe these people shouldn't be given space to propagate their noxious beliefs and crackpot theories, along with peddlers of Fake News.

Requiring the platform to monitor posts sounds logical, but whoever does it: who watches the watchers? Who defines what's 'right' and what's 'wrong'? Who monitors those decisions to ensure they aren't being made with ulterior motives in mind? And how do you deal with cultural differences?

    • vegasphotog

      "Requiring the platform to monitor posts sounds logical but for whoever
      does it, who watches the watchers, who defines what's 'right' and whats
      'wrong',"

I think Facebook spends billions on salaries but refuses to adopt a traditional business model with a customer-service (CSR) infrastructure, solely because they know they could never process the millions upon millions of real and BS inquiries. Thus, the algorithm is supposed to be the fix-all for everything (especially their ad revenue) with very little human supervision (as a percentage of subscribers). If Cake had one billion members, there is NO WAY there could really be any sort of human management per se.

I am just talking like a schmo on a barstool somewhere... but banks have to pass a certain "stress test," after so many failed during the last financial crisis. What if social media companies had to pass their own "stress test" showing that an adequate number of employees were actually screening page owners and their posts based on infractions or complaints? I know FB has versions of that now, but clown shows like Infowars should be managed. Then again, if you go that far, how do you manage the clowns on morning talk radio who spew fear and hate all in the name of the GOP?

    • yaypie

      I think there's a tendency — especially among engineers and people working for tech industry companies like Facebook and Twitter — to want to boil moderation policies down to a set of concrete, logical rules. But this will always fail.

The acceptability of a particular word, phrase, sentence, paragraph, or image depends heavily on the context in which it's used and the intent behind its use. Context and intent are murky gray areas that require interpretation, and interpretation requires experience, empathy, and a point of view. It can't be done by an algorithm or a set of rules.

      This is the trap that I think Mark Zuckerberg, Jack Dorsey, and others have fallen into. They think they need to be "fair" to "both sides", and they think that supporting free speech requires giving a platform to all points of view.

      But what Jack and Zuck don't seem to understand is that when you elevate all points of view to the same level of importance, hate speech and abuse cause good people to flee the platform, leaving only hateful, abusive people behind. Eventually your "neutral" platform is just a cesspool of the worst people saying the worst things to each other.

      That's why platforms need to have a conscience. Free speech is a valuable policy for a government, because it restricts the government's ability to work against its citizens. But it's not always the best policy for a social network or online service.

      This doesn't necessarily mean that social networks need to proactively moderate all content, but I think it does mean that they should act swiftly, decisively, and with conscience to remove hateful or abusive content and the users who spread it once they become aware of it.

    • Richard

      I think the reason you can't parse the statements is simply that they are contradictory. Taken together, they amount to no policy at all.

    • Shay

      This doesn't necessarily mean that social networks need to proactively moderate all content, but I think it does mean that they should act swiftly, decisively, and with conscience to remove hateful or abusive content and the users who spread it once they become aware of it.

      nail on the head...

    • Dracula

I think that unless it's carefully planned, the bigger a social conglomerate grows, the more the problem of controlling content depends on how dispersed responsibility becomes. Even when moderation is automated, it may reach a point where no one really knows what is going on in the decision logic, and at the extreme no one can actually tweak the decisions anymore. I heard something similar said about YouTube, where content is parsed automatically and can be removed (or not) by machine decisions.

    • wx

      Great post, yaypie. I think it's a bullseye.

News organizations fall into the same trap of trying to balance arguments, thereby giving weight to positions that are provably wrong. The example that comes to mind is climate change: they didn't know much about the topic, they knew there was opposition, and they gave it equal time indiscriminately.

      Socially irresponsible posts should be excised from public, non-government forums. They have no value, but are dangerous.

Two problems. The first, as you suggest, is who gets to decide what's socially irresponsible. Editing speech is a concept that's open to abuse. But that doesn't mean it shouldn't be done. There's no such thing as unfettered free speech anywhere in the world, including in the United States. The hackneyed example is falsely yelling fire in a crowded theater: you will be successfully prosecuted. Common sense tells you that some things are wrong and shouldn't be permitted.

      The second problem is actually finding irresponsible speech on a massive platform. It must be a daunting challenge. But I totally agree with you that once it does come to light, it must be dealt with promptly.

As for Zuckerberg (an ethically challenged man whom I do not admire), his "rules" are self-contradictory gibberish, as Richard said. I get the impression he just doesn't want to deal with it, but is being forced to.

    • Chris

      Speaking of bullseyes, The Onion nailed it:

      Facebook Apologizes For Giving Mark Zuckerberg A Platform

      MENLO PARK, CA—In response to criticism about the social network’s failure to address the spread of falsehoods and offensive content on its site, Facebook apologized Thursday for giving Mark Zuckerberg a platform. “Lies and harassment have absolutely no place on Facebook, and we want to express our deep regret at offering someone like Mark Zuckerberg a space to spread his clearly abhorrent views,” said Monika Bickert, Facebook’s head of global policy management, adding that the company wanted to clarify any lingering doubt over its previous statements by issuing a full-throated condemnation of Mark Zuckerberg. “While someone’s intent isn’t always clear, there are some kinds of speech that are plainly beyond the pale and only meant to hurt or defame, and that’s obviously the case when we look at the types of things Mark Zuckerberg says. We support free speech, but there is a limit to First Amendment rights, and that limit is Mark. Providing Mark Zuckerberg a platform goes against all the values upon which Facebook was founded.”

    • Chris

I listened to a couple of podcasts tonight while biking, and wow, he's really getting ripped. The Charged Tech podcast hosts said he's the only tech exec with no clear moral lines. Tim Cook has them. Jeff Bezos has them. Zuck has none.

      And then there was Engadget's brutal article: Mark Zuckerberg: CEO, billionaire, troll

      You know that guy. The one who pops into a chill online community and makes everyone miserable. The one who says he's "just asking questions" about women able to do math, black people and evolution, shooting victims and paid actors, the validity of the Holocaust.

      He's the one that mods have to kick out for "JAQing off" ("Just Asking Questions") because he clearly has bad intentions to harm the community and recruit hate. The troll who feigns naïveté and uses free speech as a foil.

    • Dracula

I wonder what his behavior might become if history were to repeat itself. I mean, if humans were ultimately guided merely by cold, calculated logic over anything else, such as the fuzzy thing we call feelings.

    • Dr

      Yell "FIRE" in a crowded movie theatre.

      Is it exercising your freedom of speech?

We know it isn't. Use that reference as a litmus test when reading a post.

      Simplest benchmark I can think of.

    • Shay

      I've been thinking about this:

      the site owners have to have the strength to resist the easy option... if a post is inflammatory, it gets nuked...

      "yo mamma" was where all of the inflammatory posts went on advrider. The mods who policed that section would be ideal for policing cake. Apply the same criteria, if it belongs in the basement.. then it doesn't belong on cake.

I like it here... it's polite and troll-free... I really hope it stays like this.

    • wx

I've become a firm believer in ruthless moderation. My view is that it's the simplest way to enforce the rules. It's Pavlovian training of the writers, if you will. Actually, after looking it up, I find that it's called "operant conditioning." No need to be unpleasant, just firm and as consistent as possible (not an easy task, I grant you).

    • Chris

I listened to a great interview with Kevin Systrom, the co-founder of Instagram, and he said that from the very first day they decided their policy was to nuke the trolls. If you were there to make trouble or make people feel bad, they simply nuked you. It seems to have worked.

    • Dracula

I don't mind being beaten up. It has taught me things I'd otherwise never have learned about myself and others. Yes, it hurts, and I am not able to do it to others on purpose, ever. But the yin-yang of the universe needs both good and evil. Just my $0.02; could be totally off. Not sure if you meant CSM either; I kind of abhor politics deep down. I feel it's the worst part of humanity.

    • Dracula

What I really meant about Facebook was a hypothetical, rhetorical question: what position would it have taken, and what would it have become, if it had existed in the era leading up to the Second World War? We know what happened with radio and cultural media. And if so, why? What drives man more strongly towards evil? In today's world everyone wants simple black-or-white, yes-or-no answers. I think the emoji variety aren't enough.
