You Got a Brain Scan at the Hospital. Someday a Computer May Use It to Identify You.

Oct 23, 2019 · 15 comments
Eloquaint (Minnesota)
Did the people whose faces grace this article countenance the use of their images for such a purpose?
W. H. Post (Southern California)
@Eloquaint I assume the Mayo Clinic's IRB required the researchers to get consent from the subjects and that the subjects did consent. But you bring up an excellent point. "Informed consent" is essential in human research. Before agreeing to participate in research, people should be told what will be done to them, AND what will be done with the data that is collected about them.
Pat C (Scotland)
Cybersecurity is important in digital imaging, whether for routine care or a research project. Why anyone would want to access another's imaging escapes me, but the need for security is a topic of discussion on medical forums.
Doc (PA)
I’m a retired radiologist. I’ve known for decades that the DICOM data of CT and MRI scans can be used to make 3D reconstructions of the subjects’ facial and body contours. So nothing here is new; it is just now being reported. Furthermore, the same data can be readily fed into a 3D printer, and one can make a bust of the subject as well. This imaging data, clinical or research, is part of a set of protected health information (PHI) under HIPAA laws. Could it be hacked? Sure. Could a court order get the information? Yes. Is it readily available? No.
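As a rough illustration of the 3D reconstruction described in the comment above (not the study's actual pipeline), the sketch below stacks a DICOM series into a volume with pydicom and pulls out an approximate skin-surface mesh with scikit-image's marching cubes. The directory name and intensity threshold are assumptions made for illustration.

```python
# Illustrative sketch only: stack a DICOM series into a 3D volume and
# extract an approximate skin surface with marching cubes.
import glob

import numpy as np
import pydicom
from skimage import measure

# Hypothetical directory holding one CT/MRI series as .dcm slices
files = glob.glob("series/*.dcm")
slices = [pydicom.dcmread(f) for f in files]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))  # order by slice position
volume = np.stack([s.pixel_array.astype(np.float32) for s in slices])

# The threshold separating air from tissue is an assumed value; it depends on
# the modality and on the rescale slope/intercept in the DICOM headers.
verts, faces, normals, values = measure.marching_cubes(volume, level=200.0)
print(f"Surface mesh: {len(verts)} vertices, {len(faces)} triangles")
```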
J. G. Smith (Ft Collins, CO)
Gina is my favorite science writer. Her book "Flu" was an eye-opener for me when I read it many years ago! I always pay close attention to findings of "bird flu". I think the scans described in this article have far more positive applications than negative. Certainly we need to put safeguards in place, but this research must go forward. As scanning technology develops further, we should be able to "look" inside the brain, which has eluded us. Once this threshold is crossed, the benefits will be endless. So...keep going!!
Van Owen (Lancaster PA)
Great. Just great.
John (NYC)
> The obvious way to fix the problem would be to remove faces from M.R.I. scans stored in databases. That process, though, blurs the image of the brain.

What does this even mean? I work in neuroimaging, and none of the utilities I know of would cause blurring. Indeed, to the best of my knowledge both Harvard-MGH's mri_deface (https://surfer.nmr.mgh.harvard.edu/fswiki/mri_deface) and pydeface (https://github.com/poldracklab/pydeface) work by creating mask images (https://en.wikipedia.org/wiki/Mask_(computing)) that are used to zero out any of the 3-D pixels (voxels) that might potentially belong to facial features. OpenNeuro, a project similar to ADNI, has a whole blurb about that at the following URL: https://openfmri.org/de-identification/
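To make the masking point concrete, here is a minimal sketch (not the code of mri_deface or pydeface themselves) of how a binary face mask zeroes out voxels while leaving brain voxels untouched, so no blurring is involved. The file names are hypothetical, and the scan and mask are assumed to be NIfTI volumes on the same grid.

```python
# Minimal sketch of mask-based defacing: the mask holds 1 over voxels to keep
# and 0 over the face region, so multiplying erases the face and leaves the
# brain exactly as it was.
import nibabel as nib

img = nib.load("subject_T1w.nii.gz")   # hypothetical structural scan
mask = nib.load("face_mask.nii.gz")    # hypothetical binary mask on the same grid

defaced_data = img.get_fdata() * mask.get_fdata()   # zero out the masked voxels
defaced = nib.Nifti1Image(defaced_data, img.affine, img.header)
nib.save(defaced, "subject_T1w_defaced.nii.gz")
```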
J.S. (Northern California)
First of all, using the word 'disturbing' in the title already screams bias. And that's simply not cool. And second... there are SO many positive science aspects to this I can't even begin to list them (plus it makes really interesting art). Yes, better privacy walls need to be put in place, but the way this alarmist and breathless article is written is beneath the New York Times.
Pie Fly (Vancouver)
@J.S. I agree. Not a well-formed headline.
JRB (KCMO)
That’s great...someday, I may need help remembering!
Kevin (New York, NY)
Current neuroimaging studies anonymize data if it is to be shared outside the research institution. I think anonymization of medical imaging data should also be required for storage. Clinical applications that require a completely accurate head model usually occur close in time to the scan, so the data could be anonymized afterward.
GL Alders (Toronto)
Many thorough ethics protocols already require that MRI data be “defaced” prior to storing or sharing for this very reason. There are several procedures that can reliably deface MRI images after data collection without affecting the quality of the structural image. Skull stripping is an additional step that would make it virtually impossible to reconstruct the face. Yes, it may be time-consuming to go through the archives and deface and skull-strip the data, but this is a simple solution if there are concerns about potential privacy breaches, and it would really be good clinical practice to do this from the outset. In addition, there are programs that can strip age and sex information from MRI image headers to further anonymize the data in case a “bad actor” does access the stored imaging data. Neuroimaging ethics protocols should require defacing and skull stripping, plus removal of identifying age/sex information from image headers, before data are stored.
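As a hedged sketch of the header-scrubbing step mentioned above (one plausible way to do it with pydicom for DICOM files, not a mandated protocol), the snippet below blanks age, sex, and a few other identifying tags; the file names are placeholders, and research-format NIfTI headers would need a different tool.

```python
# Sketch: blank identifying header fields (age, sex, name, birth date) in a
# DICOM file before archiving. File names are hypothetical.
import pydicom

ds = pydicom.dcmread("scan.dcm")
for keyword in ("PatientAge", "PatientSex", "PatientName", "PatientBirthDate"):
    if keyword in ds:                        # only touch tags that are present
        ds.data_element(keyword).value = ""
ds.save_as("scan_anonymized.dcm")
```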
Chicago Guy (Chicago, Il)
Why are we investing so much time and money developing technologies whose primary beneficiary would be a police-state apparatus? Where is all this privacy-invading, eavesdropping, tracing, tracking, identifying, and collating technology going to lead us? 1984? Because there doesn't seem to be much use for it beyond that.
GL Alders (Toronto)
To stay one step ahead of the shady characters. These scientists identified a potential problem, then investigated the likelihood of someone being successful in identifying participants. The next step is to develop protocols to ensure that this won’t happen in the future.
Chicago Guy (Chicago, Il)
@GL Alders When was the last time a new technology was created that was designed with the specific intention of protecting your privacy? And I mean one that couldn't be easily co-opted to do the exact opposite. And what happens when the "shady characters" are actually anyone that disagrees with the ruling political party? I don't think that anything good will come of this and most other "identity matching" technologies. There may be some more benign uses, like for targeting ads, but nothing really "good" so to speak.