Google Is Slurping Up Health Data—and It Looks Totally Legal

Last week, when Google gobbled up Fitbit in a $2.1 billion acquisition, the talk was mostly about what the company would do with all that wrist-jingling and power-walking data. It’s no secret that Google’s parent Alphabet—along with fellow giants Apple and Facebook—is on an aggressive hunt for health data. But it turns out there’s a cheaper way to get access to it: teaming up with healthcare providers.

On Monday, the Wall Street Journal reported details on Project Nightingale, Google’s under-the-radar partnership with Ascension, the nation’s second-largest health system. The project, which reportedly began last year, includes sharing the personal health data of tens of millions of unsuspecting patients. The bulk of the work is being done under Google’s Cloud division, which has been developing AI-based services for medical providers.

Google says it is operating as a business associate of Ascension, an arrangement that can grant it identifiable health information, but with legal limitations. Under the Health Insurance Portability and Accountability Act, better known as HIPAA, patient records and other medical details can be used “only to help the covered entity carry out its healthcare functions.” A major aspect of the work involves designing a health platform for Ascension that can suggest individualized treatment plans, tests, and procedures.

The Journal says Google is doing the work for free with the idea of testing a platform that can be sold to other healthcare providers, and ostensibly trained on their respective datasets. In addition to the Cloud team, Google employees with access include members of Google Brain, which focuses on AI applications.

Dianne Bourque, an attorney at the legal firm Mintz who specializes in health law, says HIPAA, while generally strict, is also written to encourage improvements to healthcare quality. “If you’re shocked that your entire medical record just went to a giant company like Google, it doesn’t make you feel better that it’s reasonable under HIPAA,” she says. “But it is.”

The federal healthcare privacy law allows hospitals and other healthcare providers to share information with their business associates without asking patients first. That’s why your clinic doesn’t need your permission to share your information with its cloud-based electronic medical record vendor.

HIPAA defines the functions of a business associate quite broadly, says Mark Rothstein, a bioethicist and public health law scholar at the University of Louisville. That allows healthcare systems to divulge all sorts of sensitive information to companies patients might not expect, without ever having to tell them. In this case, Rothstein says, Google’s services could be seen as “quality improvement,” one of HIPAA’s permitted uses for business associates. But he says it’s unclear why the company would need to know the names and birthdates of patients to pull that off. Each patient could instead have been assigned a unique number by Ascension so that they remained anonymous to Google.
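The alternative Rothstein describes—replacing names and birthdates with a unique number before the data leaves the health system—is a standard pseudonymization step. A minimal sketch of the idea, with hypothetical field names and no claim about how Ascension or Google actually structure their records:

```python
import secrets

def pseudonymize(records):
    """Replace direct identifiers with opaque IDs. The id_map linking
    real identities to pseudonyms stays with the data holder and is
    never shared with the downstream recipient."""
    id_map = {}   # (name, birthdate) -> pseudonym; kept private
    shared = []
    for rec in records:
        key = (rec["name"], rec["birthdate"])
        if key not in id_map:
            id_map[key] = secrets.token_hex(8)
        # Copy the record without name/birthdate; attach the pseudonym
        clean = {k: v for k, v in rec.items()
                 if k not in ("name", "birthdate")}
        clean["patient_id"] = id_map[key]
        shared.append(clean)
    return shared, id_map
```

Because the same patient always maps to the same opaque ID, a recipient can still link records longitudinally and train models on them, without ever seeing who the patient is.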

“The fact that this data is individually identifiable suggests there’s an ultimate use where a person’s identity is going to be important,” says Rothstein. “If the goal was just to develop a model that would be valuable for making better-informed decisions, then you can do that with deidentified data. This suggests that’s not exactly what they’re after.”

In fact, according to Bourque, Google would have to anonymize the information before it could be used to develop machine learning models it can sell in other contexts. Given the potential breadth of the data, one of the biggest remaining questions is whether Ascension has given the tech giant permission to do so.

Tariq Shaukat, president of industry products for Google Cloud, wrote in a blog post that health data would not be combined with consumer data or used outside of the scope of its contract with Ascension. However, that scope remains somewhat unclear. Shaukat wrote that the project includes moving Ascension’s computing infrastructure to the cloud, as well as unspecified “tools” for “doctors and nurses to improve care.”

“All work related to Ascension’s engagement with Google is HIPAA compliant and underpinned by a robust data security and protection effort,” Ascension said in a statement. The nonprofit health system operates 2,600 hospitals, doctors’ offices, and other facilities, primarily in the midwestern and southern US.

Health care providers see promise in mining troves of data to develop more personalized care. The idea is to establish patterns to better detect medical conditions before a patient’s symptoms get dire, or to match patients with the treatment most likely to help. (Hospitals win here too; more personalized care means more efficient care—fewer unnecessary tests and treatments.)

In past efforts, Google has used anonymized data, which can be released without patient authorization. Earlier this fall, the company announced a 10-year research partnership with the Mayo Clinic. As part of the deal—the details of which were not disclosed—Mayo moved its vast collection of patient records onto the Google Cloud. From that secure location, Google is being granted limited access to anonymized patient information with which to train its algorithms.

But even when it has used anonymized data, the company has gotten into trouble for potential privacy violations related to healthcare research. In 2017, regulators in the UK determined that a partnership between Google DeepMind and that country’s National Health Service broke the law for overly broad sharing of data. This past June, Google and the University of Chicago Medical Center were sued for allegedly failing to scrub timestamps from anonymized medical records. The lawsuit claims those timestamps could provide breadcrumbs that could reveal the identities of individual patients, a potential HIPAA violation. Both missteps underscore how easy it is to mishandle—even accidentally—highly regulated health information when you’re a company like Google that mostly works with non-medical data.
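The timestamp problem has a well-known mitigation: HIPAA’s Safe Harbor de-identification method requires that dates be generalized to no finer than the year. A hypothetical sketch of that one step—this is an illustration of the technique, not the University of Chicago’s or Google’s actual pipeline, and the field names are invented:

```python
from datetime import datetime

def generalize_dates(record, date_fields=("admit_time", "discharge_time")):
    """Coarsen precise timestamps to the year, the granularity HIPAA's
    Safe Harbor method allows for dates (hypothetical field names)."""
    out = dict(record)
    for f in date_fields:
        if f in out:
            # "2017-06-01T13:45:00" -> "2017"
            out[f] = str(datetime.fromisoformat(out[f]).year)
    return out
```

A record’s exact admission time is exactly the kind of breadcrumb the lawsuit describes: cross-referenced with another dataset (say, location logs), it can single out an individual even after names are removed.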

Google’s newest venture appears unprecedented in both its scale and the scope of information involved. It was also foreseeable. “This fusion of tech companies that have deep AI talent with big health systems was inevitable,” says Eric Topol, a professor at Scripps Research who focuses on individualized medicine.

Legal? Yep. Creepy? Yeah, kind of. But surprising? At this point, it really shouldn’t be.

