    • Mark published a post today that seemed to say they're taking Facebook in the opposite direction of Cake. We stand for better public conversation; they are going in the direction of more walled gardens.

      Some people are tweeting that Zuck was against privacy before he was for it. My take is public discourse became bad enough on Twitter and Facebook that people are seeking shelter from the trolls. It isn't that people don't want public discourse and Internet fame, it's that they fear the trolls. When Reddit took steps to limit public trolling, it became much more popular.

      I suspect he wants to put a bandage on their reputation for privacy. Perhaps talking about encryption will help. But...aren't many of Facebook's worst scandals from secret groups? They're under intense pressure to limit misinformation about vaccines, but those groups are thriving on Facebook because they are walled gardens.

      As for privacy in these groups? It has become public sport to stealth into these groups, take screen captures of the things people say, and post them on public message forums. Since Facebook demands your real name, how private is that?

    • I am kinda talked out about the anti-vaxxer movement—we had an amazing 70+ post thread over the past three weeks where I learned a lot about a lot of stuff related to it.

      But I think it was @Factotum who brought up in that thread the idea that conspiracy theory groups in general are thriving on Facebook.

      Taking it one step further, how is Facebook going to protect members of private groups from Russian troll influence during the election season next year?

      Oh wait! Maybe this is Zuck’s plan all along: deny any responsibility by allowing it to occur behind closed doors in private groups. He can make the public view look squeaky clean and say that they are removing SO MUCH misinformation from public timelines.

      Sorry but this sounds like the same old smoke and mirrors from the Mark & Sheryl show.

    • Well, whatever gets Facebook to take privacy seriously, I'm all for it. It remains to be seen how this 'new Zuck' will change FB's internal processes - which we probably won't know for quite some time. Until stuff like this stops happening regularly, it's all just talk.

      As to the heart of the matter - there is merit to what he said. Facebook was created for sharing personal stuff, and that inevitably works better if the audience isn't the whole wide world, there for all eternity, indexable and searchable. That is the opposite of Cake, which is here for people to talk about their interests, something for which you naturally want the widest audience possible.

      There is a place for both, I think.

    • I don't believe anything Z says. Or rather, I don't believe that any meaningful change is possible, as Facebook is one of the poster children of surveillance capitalism. It may do more to protect its members from third-party spying, but its own business imperative requires that it keep spying on its users itself. It may make changes around the edges--changing the functionality that the user sees in order to boost participation--but the core will not change.

    • Mark published a post today that seemed to say they're taking Facebook in the opposite direction of Cake. We stand for better public conversation; they are going in the direction of more walled gardens.

      Generally speaking, I think this is not a bad thing to do. Cake and this hypothetical "new Facebook" are on opposite ends of a spectrum of what I would consider "well-behaved online groups".

      In the other conversation @apm mentioned (and thanks for tagging me into this one! :)), I brought up the idea of groups being bad if they are both huge and closed at the same time. To expand on that idea, consider "group size" and "group openness" (as in, to what extent can non-members see that a group even exists, see what the group talks about and potentially become a part of the conversation?) as two axes of a graph.

      If a large group is "too closed", we're dealing with a lack of control. Without dissenting but valid opinions from outside the current group, the set of opinions that currently exists can easily become self-reinforcing - leading to anti-vaxxing, rampant racism/nationalism, domestic terrorism, ...

      On the other hand, if a small group is "too open", that's typically a lack of privacy. If I want to share vacation images with my family, discuss a new product or functionality with colleagues or plan a schedule for the next year with club members - I don't want the world to see. If people outside of the current group can access this information, this is a privacy problem. If they can even join the group and participate in conversations where they don't belong, we're potentially dealing with trolling, harassment, doxxing, ...

      In the middle of the graph, along the diagonal, is an area for online groups that (at least in theory) should behave better than that. Cake is somewhere towards the upper right end of this area, while "new Facebook" would be somewhere towards the lower left.

      As others have pointed out, Facebook as a company is not to be trusted - but the general idea of having small and private communities is valid in my opinion.
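      Taking the two-axes idea literally, here's a toy sketch of how a group's size and openness might map onto the risk regions described above. This is my own illustration, not from the post, and the thresholds are invented purely to make the regions concrete:

```python
def classify(size: int, openness: float) -> str:
    """Classify a group by the two axes: member count and openness.

    openness is in [0, 1]: 0 = fully closed (invisible to outsiders),
    1 = fully public (anyone can see and join). Thresholds below are
    arbitrary placeholders, only meant to sketch the idea.
    """
    if size > 10_000 and openness < 0.3:
        # Large and closed: no outside correction of opinions.
        return "echo-chamber risk"
    if size < 50 and openness > 0.7:
        # Small and open: outsiders can see or intrude on private matters.
        return "privacy risk"
    # Near the diagonal: size and openness roughly in balance.
    return "balanced"

print(classify(500_000, 0.1))  # huge closed conspiracy group
print(classify(10, 0.9))       # family photos on a public feed
print(classify(200, 0.5))      # mid-size, moderately open community
```

      On this sketch, Cake sits towards the large-and-open corner and the "new Facebook" towards the small-and-closed corner, with both staying out of the two risk regions.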

    • If this shift saves Facebook scores of unsavory headlines, it could also come with terrible consequences as its networks become havens for all sorts of illicit activities beyond the reach of law enforcement, regulators, or the media. 

      So as I understand this ⬇️ article, private groups would become more powerful with encryption tools to keep their conspiracy theories hidden.

      And on top of that, they are planning to entwine FB, Instagram and their other platforms so that regulators won’t be able to break them up.

      Naive, cynical or galaxy-brain view?

    • Privacy vs. the need for regulation is a continuous scale, and where on that scale a particular product needs to sit depends heavily on the product itself.

      Here's a great Twitter thread by Alex Stamos (ex-FB CSO) which does a great job of explaining it:

      Expand the whole thread, it's worth it, but this is the crucial graph: the more amplification a particular product enables, the greater the need for oversight and transparency. Policing 1:1 chats is totalitarian and pointless. On the other end of the scale, allowing ads to be targeted and run in secret, without oversight or disclosure, is very, very dangerous.

      Facebook operates across the whole range, but oversight and transparency need to be calibrated separately for each level of amplification. It's not a one-size-fits-all solution.

    • Is FB really going to change to a point that advertisers go "Hmmm...I'm not getting my money's worth, I think I'll spend my money at ..." ?

      I heard one story that 15 million users have left FB since 2017 over privacy concerns.

      With 2.3 billion users, do they (or Zuck) really care?

    • I have heard a couple of news stories in the last week or so that have focused on FB’s reprehensible practice of hiring entry-level workers to police the platform, exposing them to the worst expressions of humanity for hours on end. I wonder if Zuck realizes this is the next scandal he will have to address, and knows there is no way to address the problem of online abuse other than to isolate it so FB can say it allows freedom of expression without having to take action when that expression turns abusive and harmful.

    • Expand the whole thread, it's worth it, but this is the crucial graph: the more amplification a particular product enables, the greater the need for oversight and transparency.

      Fascinating, jpop. I was surprised to see the arrow go up for amplification. I wonder if he's using the word to mean the number of people reached? I think of the word as amplifying beliefs about radical ideologies (devotion to ISIS) or conspiracies that can then boil over into the mainstream. Don't those amplifications happen most effectively in private groups?

    • Could this amplification be related in some way to the "ephemeral stories" angle mentioned?

      I mean, I had to go look that up to fully grasp (hey - neophyte user here...) what the context of that was:

      Ephemeral content is massively popular with younger audiences in the Millennial and Generation Z demographics, which suggests this is not just some social media trend that will fizzle out in a few months or years. To reach a wider audience and engage your current followers more frequently, try to post at least once or twice per day on your social stories.

      Hmm. People want to read about your stuff daily??

    • I wonder if Zuck realizes this is the next scandal he will have to address, and knows there is no way to address the problem of online abuse other than to isolate it so FB can say it allows freedom of expression without having to take action when that expression turns abusive and harmful.

      @lidja I came to say exactly this, but you beat me to it. 🙂

      There are a lot of potential positives and negatives about what Zuckerberg wants to do, but I think you're exactly right and that the primary benefit to Facebook of this move will be to decrease both their human moderation burden and their vulnerability to criticism.

      When the media publishes a story about something awful happening on Facebook, Facebook will be able to say the reason they didn't take action is that it was happening in encrypted private groups and they respect users' privacy. Or the awful thing will be in the form of ephemeral messages that will literally disappear, and Facebook will say, "Whoops, by the time we heard about it the content was gone!"

    • Facebook was created for sharing personal stuff, and that inevitably works better if the audience isn't the whole wide world, there for all eternity, indexable and searchable.

      Interesting! That made me look up their mission statement...

      What is Facebook's mission statement?

      Founded in 2004, Facebook's mission is to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what's going on in the world, and to share and express what matters to them.

      From here:

      So, in layman's terms, what is FB actually saying versus doing? Selling advertising, dressed up as a generous free offer to connect people? It doesn't even remotely resemble their mission statement - I wonder why?

    • I frankly would cherish Facebook disappearing, even though I never really used it to post anything. Socializing online can lead to amazing, great interactions, but also toxic ones, and always will if not curtailed. But the sad part is - we could curtail toxic online groups, yet that won't change the way those people think and congregate; they'll just find different avenues. From a surveillance perspective it's likely better to keep this contained. The real, not-so-funny plot is how making money out of this mess relates to and shapes the spreading "culture".

      Their entire business model is based on lying, starting with that mission statement and its cheesy, cult-like "read between the lines" aim to empower people to "connect" - so conveniently leaving out how investors are supposed to actually make money out of it.

    • My favorite journalist on all things Facebook is Casey Newton. He writes about social media for The Verge, but I subscribe to his daily newsletter that comes out at 5. This afternoon he sent this long but fascinating email about his debates today with other journalists and ex + current Facebook employees. The only thing that made me jump is reading that Zuckerberg wants cake. And he wants to eat it.


      On Wednesday, Mark Zuckerberg dropped a 3,200-word blog post in which he promised to “outline our vision and principles around building a privacy-focused messaging and social networking platform.” I took him seriously, and in the hours after his post was published, contemplated a Facebook that put messaging and privacy first.

      But did I take Zuckerberg too seriously? That’s the charge made by Silicon Valley’s favorite morning columnist, Ben Thompson, in his newsletter today. We had a friendly chat about it on Twitter over direct message, so there are no hard feelings here. But I do want to dig into the opposing views about the Zuckerberg memo, because I’ve noticed a fascinating split today in opinions.

      My view is that if you accept that Facebook’s News Feed and other feed-based products will eventually fade away, as they have already begun to do in North America, Facebook will need to transform its business completely. Rallying around privacy, encryption, and ephemeral messages — while buying time to build out new businesses around commerce and payments — seems to be as good an idea as any.

      Zuckerberg nods weakly to a belief in the continuing importance of the News Feed in his post. But over the past year, he also moved top News Feed talent to parts of the company that he needs to grow faster: Adam Mosseri to Instagram; designer Geoff Teehan to the blockchain division, and so on. These moves, coupled with the decline of original sharing in the News Feed in North America, lead me to believe that Zuckerberg — ever paranoid about the company’s long-term survival — feels pressure to start building lifeboats.

      But in the aftermath of his post going up, Zuckerberg walked back some of his enthusiasm over this vision of a purely “privacy-focused messaging and social networking platform.” He told Nick Thompson at Wired:

      It’s not that Facebook and Instagram are going to be less important for what they’re doing, it’s just that people sometimes want to interact in a town square, and sometimes they want to interact in the living room, and I think that that’s the next big frontier.

      As I said on Twitter, Zuckerberg wants to have his cake, and eat it too: thriving public feeds, and fast-growing private messaging apps. Thompson, in a piece titled “Facebook’s privacy cake,” takes the same metaphor and runs with it:

      They still have the core Facebook app, Instagram, ‘Like’-buttons scattered across the web — none of that is going away with this announcement. They can very much afford a privacy-centric messaging offering in a way that any would-be challenger could not. Privacy, it turns out, is a competitive advantage for Facebook, not the cudgel the company’s critics hoped it might be.

      He goes on:

      Stop expecting companies to act against their interests. Facebook isn’t killing their core business anymore than Apple, to take a pertinent example, is willing to go to the mat to protect user data in China.

      If nothing else, this view explains why Facebook’s stock has been mostly flat since the announcement. (It was down about 2 percent today.)

      At the same time, I find this view to be surprisingly cynical. It takes as a given that Facebook’s CEO, in announcing a bold new vision for privacy-focused social networking, was in reality simply describing a high-level product roadmap for an adjacent business. It suggests that the post was published primarily for public-relations reasons: to signal a commitment to privacy from a company whose reputation on the subject is dire.

      But assuming this is the case, Facebook has put itself in a vise. On one hand it will have its advertisers demanding ever-more intrusive tracking and targeting options, as usual; on the other, there is a large and increasingly dissatisfied user base that has now been promised that the next generation of Facebook products will be private, ephemeral, and regularly purge their data. Whole divisions of Facebook will now be working at cross purposes.

      And with each new error around data privacy — there was one a few hours ago, by the way — the world will have a chance to jeer: Remember the pivot to privacy? If the company truly hoped to buy some short-term goodwill at the expense of its long-term credibility, it seems like a bad bargain.

      Perhaps, with the threat of a forced breakup of Instagram and WhatsApp looming, Zuckerberg felt that his hand was forced — and that he had to justify the unification of the apps’ back-end technology with the most consumer-friendly argument he could find. But if he can’t deliver what he promised — and if data-related scandals continue at the pace of the past 12 months — the “pivot to privacy” will be remembered as an epic folly.

      One reason for the confusion over Zuckerberg’s post may be that he uses “privacy” differently than most people do. As Konstantin Kakaes writes in MIT Tech Review:

      By narrowly construing privacy to be almost exclusively about end-to-end encryption that would prevent a would-be eavesdropper from intercepting communications, he manages to avoid having to think about Facebook’s weaknesses and missteps. Privacy is not just about keeping secrets. It’s also about how flows of information shape us as individuals and as a society. What we say to whom and why is a function of context. Social networks change that context, and in so doing they change the nature of privacy, in ways that are both good and bad.

      Russian propagandists used Facebook to sway the 2016 American election, perhaps decisively. Myanmarese military leaders used Facebook to incite an anti-Rohingya genocide. These are consequences of the ways in which Facebook has diminished privacy. They are not the result of failures of encryption.

      In any case, I found that current and former employees seemed to take the news differently. Current employees, as Peter Kafka notes here, tend to endorse Thompson’s view — that this is a cake-and-eat-it-too situation. But former employees I’ve spoken to take Zuckerberg at his word that he plans to shift the company to a more message- and group-oriented future — and that it will be very, very hard. (“Everyone thinks it’s a bad idea,” one person familiar with employee sentiment told me today. “But it’s a top-down request to get it done.”)

      That’s the thing about having your cake and eating it too. Very few people ever get to.

    • Yes, amplification in the sense of enabling a single individual (or company or group) to reach a large number of people. A force multiplier of sorts. If I want to talk against vaccines with my friends in a closed, private group, my ability to do real damage is significantly smaller than if I buy an ad word to show that same message to anyone searching 'vaccine' on Google. Or if Google itself recommends an anti-vaxx video to play next each time someone finishes watching a CDC video on vaccines.

      The former is damaging but contained - it should probably be left alone (the fix would be worse than the problem). The latter is devastating and must be regulated in some fashion. And self-regulation by tech companies themselves doesn't seem to work that well, at least right now.