A software bug in the Google+ social media platform exposed thousands of users' personal information to third-party developers, according to a report in the Wall Street Journal. The company discovered the flaw in March, but kept quiet about it until news broke this week.
It appears that Google withheld the information from government regulators, fearing the type of backlash Facebook and other social media companies have weathered since the beginning of the year over how their platforms handle and secure users' personal data. (See Facebook's Data Breach: Will It Be First Test of GDPR?)
As the news broke on Monday, Alphabet, the parent company of Google, announced several new privacy measures and guidelines for its products, including a plan to permanently shut down all consumer versions of Google+.
This transition will last about ten months, with the consumer site closing in August 2019. The enterprise version will continue, and Google will focus its efforts there.
"The review did highlight the significant challenges in creating and maintaining a successful Google+ that meets consumers' expectations. Given these challenges and the very low usage of the consumer version of Google+, we decided to sunset the consumer version of Google+," Ben Smith, a Google Fellow and vice president of engineering, wrote in an October 8 blog post.
In addition to Google+, the company plans to start restricting third-party developer access to other products, including the Android operating system, as well as Gmail.
This review of how Google works with third-party developers, as well as which APIs they can use and access, is part of an internal project dubbed Project Strobe. The company describes this project as:
...a root-and-branch review of third-party developer access to Google account and Android device data and of our philosophy around apps' data access. This project looked at the operation of our privacy controls, platforms where users were not engaging with our APIs because of concerns around data privacy, areas where developers may have been granted overly broad access, and other areas in which our policies should be tightened.
It was through Project Strobe that Google engineers found the Google+ flaw in March. The bug, according to the Journal's report published Monday, is part of an API that developers can use to access profile and contact information about people who download their apps, as well as about those users' Google+ connections.
The bug also allowed developers to collect data that was not made public, according to the report.
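To illustrate the general class of flaw described in the report, here is a hypothetical sketch, not Google's actual code or API, of how a profile endpoint that fails to filter fields by their privacy settings can leak non-public data. All names and fields here are invented for illustration:

```python
# Hypothetical illustration of a profile API that is supposed to
# filter fields by visibility but, on a buggy code path, returns
# everything. None of these names reflect Google's implementation.

PROFILE = {
    "name": {"value": "Alice Example", "public": True},
    "occupation": {"value": "Engineer", "public": False},
    "email": {"value": "alice@example.com", "public": False},
}

def get_profile(profile, respect_privacy=True):
    """Return only fields marked public; the buggy path returns all."""
    if respect_privacy:
        return {k: f["value"] for k, f in profile.items() if f["public"]}
    # Buggy behavior: per-field privacy flags are ignored entirely,
    # so fields the user kept private are exposed to the caller.
    return {k: f["value"] for k, f in profile.items()}

print(get_profile(PROFILE))                         # only the public name
print(get_profile(PROFILE, respect_privacy=False))  # also leaks email, occupation
```

The point of the sketch is that the data exposure is invisible to the user: both calls succeed, and only the server-side filtering step determines whether private fields stay private.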
It appears this flaw has been part of the API since 2015, and more than 400 different third-party apps had access to it during this time, although it's not clear whether any of the developers took advantage of the bug. The flaw could have affected nearly 500,000 Google+ users and their friends, including those using paid G Suite accounts for business applications.
When Google executives were briefed about the flaw in March, they attempted to keep the issue quiet. It was during this time that Facebook faced questions about its relationship with Cambridge Analytica and what user information it shared ahead of the 2016 US presidential election. (See How to Access the Voter Information Dirt Cambridge Analytica Has on You.)
Additionally, the European Union's General Data Protection Regulation (GDPR) was about to come into effect about two months later. This law requires companies to publicly announce data breaches within a certain timeframe, with fines that could go as high as 4% of global revenue. (See GDPR, AI & a New Age of Consent for Enterprises.)
However, when the bug was found in March, GDPR was not yet in effect, so Google did not face those disclosure requirements, which appears to have factored into the company's decision not to announce the flaw.
Jessica Ortega, website security analyst at SiteLock, which offers cloud-based security tools, wrote in an email to Security Now that even though Google did not face scrutiny under GDPR, the company was clearly trying to salvage its reputation in the wake of Facebook's issues.
Now that GDPR is in effect, companies will have to disclose data breaches faster, although the rules on disclosure timing remain unclear in some cases.
"This type of behavior may become more common among tech companies aiming to protect their reputation in the wake of legislation and privacy laws -- they may choose not to disclose vulnerabilities that they are not legally required to report in order to avoid scrutiny or fines," Ortega wrote. "Ultimately it will be up to users to proactively monitor how their data is used and what applications have access to that data by using strong passwords and carefully reviewing access requests prior to using an app like Google+."
— Scott Ferguson is the managing editor of Light Reading and the editor of Security Now. Follow him on Twitter @sferguson_LR.