    • Google chose not to disclose the data leak, the Wall Street Journal revealed Monday, in order to avoid the public relations headache and potential regulatory enforcement.

      Disclosure will likely result “in us coming into the spotlight alongside or even instead of Facebook despite having stayed under the radar throughout the Cambridge Analytica scandal”, Google policy and legal officials wrote in a memo obtained by the Journal. It “almost guarantees Sundar will testify before Congress”, the memo said, referring to the company’s CEO, Sundar Pichai. The disclosure would also invite “immediate regulatory interest”.

    • Have we as a society gotten so complacent about data leaks that this one, and subsequent leak announcements, amount to little more than a "Meh!" moment these days?

      If corporations large and small can't protect the data they require from you to use their service, or they sell it to the highest bidder because it was in the ToS you signed, is that data really worth something?

      (rhetorical question to a degree...)

    • Hi Tobias, and welcome to Cake! 😁

      I have very mixed emotions about this, strange to say. On one hand, I'm sorry to see Google get caught up in this, because I've always wanted to believe (and mostly have believed) that Google has some moral principles that guide it.

      I sorta shrugged because I thought "oh, it's just Google+. How valuable can whatever data leaked be?" But if a hacker were able to break in and steal your search history, I think that would be embarrassing to a lot of people if it became public.

    • The general user population must by this time assume that every chunk of data that's out in the Interwebz is compromised - if not now, then at some point in the future.

      The one sure way to reduce this possibility is to limit the amount of data you put out there...or go through the headaches of VPNs and other ($$$) means to protect yourself.

    • If anyone wants to use a VPN but doesn't want to spend a bunch of money before testing one, I recommend TunnelBear. It's relatively easy to set up and easy to activate. If you're frugal in usage, you can get a lot of service without spending an arm and a leg.

    • I take some issue with the word "leak" being used to describe this.

      What Google discovered was a vulnerability, and they discovered it internally; it wasn't reported to them by an external party. If someone else had discovered this vulnerability, they might have been able to exploit it to gain access to user data, and then it would have been a leak. But there's no actual indication that this happened. Google can't say with absolute certainty that it didn't happen, but so far all signs point toward this vulnerability not having been exploited. (For a sketch of the kind of bug at issue, see the example at the end of this post.)

      Obviously if there had been an actual leak, then Google would have a responsibility (both legally and ethically) to disclose it. But if we start requiring software developers to disclose every bug that could potentially have led to a data leak, then we start getting into some real slippery slope territory.

      All software has bugs. Some bugs are small and some are large, but there will always be bugs, and sometimes those bugs will be in code related to security or privacy. Chances are very good that most of the software you use on a daily basis has more than one security bug that could lead to a data leak. But chances are also very good that no one knows these bugs exist.

      Eventually, someone may discover them. The best case scenario is that the person who discovers these bugs is the developer, and they fix them before anyone else knows about them. That's what seems to have happened with Google+.

      But if we require developers to disclose any bug they discover internally that could have any implications for data security, that creates a strong incentive against trying to proactively identify security bugs internally. And it discourages companies from hiring security firms to perform audits, because if those audits reveal bugs (and they usually do), the obligation to disclose them could result in significant negative PR. So the safest thing to do financially becomes to avoid looking for bugs and just hope nobody else finds them either.

      I really hope we don't end up in this situation, because we would all suffer.
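      To make that concrete, here's a toy sketch of the kind of bug at issue. This is purely hypothetical, my own invented example rather than anything from Google's actual code, with made-up field names and functions: an API handler that returns private profile fields to any caller. The buggy version is a vulnerability; it only becomes a leak if a third party actually calls it before the fix ships.

      ```python
      # Hypothetical access-control bug, loosely in the shape of the
      # Google+ API issue. All names here are invented for illustration.

      PROFILES = {
          "alice": {
              "name": "Alice",               # public field
              "email": "alice@example.com",  # private field
              "occupation": "Engineer",      # private field
          },
      }

      PUBLIC_FIELDS = {"name"}

      def get_profile_buggy(user_id):
          # Vulnerability: returns every field to any third-party caller,
          # ignoring the user's privacy settings.
          return PROFILES[user_id]

      def get_profile_fixed(user_id, caller_is_owner=False):
          # Fix: third-party callers only see fields marked public.
          profile = PROFILES[user_id]
          if caller_is_owner:
              return profile
          return {k: v for k, v in profile.items() if k in PUBLIC_FIELDS}

      print(get_profile_buggy("alice"))  # exposes email and occupation
      print(get_profile_fixed("alice"))  # {'name': 'Alice'}
      ```

      Until someone outside actually calls the buggy version, nothing has leaked; there's just a bug waiting to be found, which is why it matters so much who finds it first.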

    • Indeed. Google has been pilloried for this, and for no good reason. There is no evidence actual data was stolen; they just noticed that it could have been, and fixed the hole. And the data that was potentially exposed was your G+ profile information, which is mostly public anyway.

      If companies have to disclose (and get raked over the coals for) every vulnerability they fix in production, we're screwed as an industry.
