Apple and Google are releasing their jointly developed Covid-19 contact tracing feature (wisely renamed “exposure notification”). Soon every smartphone owner will face a choice about whether to activate it, weighing hard trade-offs between health and privacy.
Apple has already published the technical details, allowing security and privacy researchers to review them in advance. My analysis reveals many good things, a few bad things, and one ugly one:
· Opt-in: Contact tracing isn’t enabled automatically. The user must explicitly enable it.
· Patient Anonymity: When someone who opted in tests positive for Covid-19, cryptographic keys that enable contact notification are uploaded to a server. Because the keys are used only as opaque beacons, the patient’s identity is kept hidden from the server.
· Contact Anonymity: Contact information isn’t uploaded to the server, or even to the patient’s phone, preventing identification of people the patient came in contact with. The contact’s app knows, but isn’t supposed to notify the server without permission.
· Locationless: The system relies on Bluetooth interactions to detect proximity between people. The system does not use GPS. It doesn’t capture location data, nor are users required to turn on location services.
· Limited: Access to the contact tracing API is limited to one app per country, and that app must be approved or operated by national health authorities.
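The anonymity properties above rest on two mechanics: short-lived opaque beacons are derived from a secret daily key, and exposure matching happens on the contact’s phone rather than on the server. A minimal sketch of that flow follows; the function names are my own, and truncated HMAC-SHA256 stands in for the HKDF/AES derivation in the published spec, so this illustrates the shape of the protocol rather than its exact cryptography:

```python
import hmac
import hashlib

def beacon_for_interval(daily_key: bytes, interval: int) -> bytes:
    """Derive the opaque beacon broadcast during one time interval.

    Illustrative stand-in: the published spec derives identifiers with
    HKDF and AES; truncated HMAC-SHA256 keeps this sketch stdlib-only.
    """
    mac = hmac.new(daily_key, b"beacon" + interval.to_bytes(4, "big"),
                   hashlib.sha256)
    return mac.digest()[:16]  # 16-byte opaque identifier

def match_exposures(published_daily_keys, observed_beacons,
                    intervals_per_day=144):
    """On-device matching: re-derive every beacon a patient's published
    daily keys could have produced, then intersect with the beacons
    this phone overheard. The server never learns who met whom."""
    derived = {
        beacon_for_interval(key, i)
        for key in published_daily_keys
        for i in range(intervals_per_day)
    }
    return derived & set(observed_beacons)

# A patient's phone broadcasts a beacon derived from its secret daily key...
patient_key = bytes(16)  # placeholder key for the sketch
sent = beacon_for_interval(patient_key, 42)

# ...a nearby phone records it; after the patient tests positive and the
# key is published, the contact's phone finds the match locally.
hits = match_exposures([patient_key], [sent])
print(len(hits))  # 1 match -> possible exposure
```

Note that matching is a pure set intersection over locally held data, which is why neither the server nor the patient ever learns the contact’s identity.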
· Decloaking: An attacker in physical proximity to their victim could pierce the cloak of anonymity by collecting personal information (device characteristics, face recognition, etc.) along with the opaque Covid-19 beacon. The attacker must be close enough to make a Bluetooth connection, but this can be more than six feet. However, a beacon is only valid for 14 days, so such an attack has a limited time horizon.
· Social Graph: If decloaking were scaled up to many locations, and the collected data aggregated, it might be possible to reconstruct people’s social graphs. Think of this as your IRL social network – your friends, family, coworkers, etc. in the real physical world. This would be a treasure trove of data to ad agencies, spy agencies, or authoritarian governments.
· Stickiness: Although you don’t have to opt in to contact tracing, once it gets baked into the device operating system you won’t be able to uninstall it. Apple and Google have stated they plan to bundle this service into Android and iOS in the next few months.
· Impersonation: When two devices exchange opaque beacons via Bluetooth, there is no attempt to authenticate that a beacon legitimately belongs to the person sending it. A proximity attacker could collect hundreds of valid beacons from passersby, then rebroadcast them to other passersby, manufacturing thousands of fake contacts. When any of the original beacon owners later tests positive for Covid-19, the result would be hundreds of false exposure notifications, sowing chaos and undermining public confidence in the system.
Others, including the EFF, agree this is a serious flaw. Bruce Schneier fears false positives and negatives will lead to so many people opting out of the system that it will fail. Apple and Google must take this threat seriously. They should tie beacons to devices, perhaps via digital signature, so they can be validated.
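One way validation could work without sacrificing anonymity (a hypothetical construction of my own, not anything from the published spec) is to have each beacon carry a tag binding it to the time interval in which it was minted, keyed by the same secret daily key. The tag is meaningless to eavesdroppers, but once a patient uploads their key, a contact’s phone can check that each matching beacon was actually heard in its minting interval, so beacons rebroadcast later fail. Truncated HMAC stands in here for the digital signature suggested above:

```python
import hmac
import hashlib

def make_beacon(daily_key: bytes, interval: int) -> bytes:
    """Beacon = opaque ID || tag binding that ID to its time interval.
    (Hypothetical scheme; truncated HMAC stands in for a signature.)"""
    rpi = hmac.new(daily_key, b"id" + interval.to_bytes(4, "big"),
                   hashlib.sha256).digest()[:16]
    tag = hmac.new(daily_key, b"tag" + rpi + interval.to_bytes(4, "big"),
                   hashlib.sha256).digest()[:8]
    return rpi + tag

def verify_beacon(daily_key: bytes, beacon: bytes, heard_interval: int) -> bool:
    """After the patient's key is published, confirm the beacon was heard
    during the interval it was minted for; stale replays fail the check."""
    rpi, tag = beacon[:16], beacon[16:]
    expected = hmac.new(daily_key,
                        b"tag" + rpi + heard_interval.to_bytes(4, "big"),
                        hashlib.sha256).digest()[:8]
    return hmac.compare_digest(tag, expected)

key = bytes(16)  # placeholder daily key
b = make_beacon(key, interval=100)
print(verify_beacon(key, b, heard_interval=100))  # genuine contact: True
print(verify_beacon(key, b, heard_interval=105))  # replayed later: False
```

This only shrinks the replay window to a single interval rather than eliminating it; an attacker rebroadcasting within the same interval would still pass. The point of the sketch is that binding beacons to time, whether by MAC or signature, is what defeats large-scale rebroadcast attacks.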
The Bottom Line
Apple and Google put a lot of thought into this feature’s security and privacy, and it shows. No system is perfect, but this one is good enough for now. Some privacy risks remain, but they require physical proximity to victims (harder to achieve with people staying at home or social distancing).
During this crisis, the benefits of automated contact tracing to our health and economy outweigh the privacy risks. People should opt in and use the app. After the crisis is over, they should opt out. Apple and Google should remove the code from the OS once a Covid-19 vaccine is widely available and the pandemic is over.
Michael McCormick is an information security consultant, researcher, and founder of Taproot Security.