In an age where everything from keystrokes to footsteps is being monitored, it’s getting harder than ever to develop useful digital health privacy guidelines.
Unfortunately, those waffling on the issue include the Federal Trade Commission, which recently came up more or less dry after studying the question in depth.
The agency recently released a 55-page paper presenting the results of discussions with privacy experts about the Internet of Things, along with some recommendations. While its efforts are commendable, the agency seems reluctant to draw hard conclusions. After a big build-up citing all sorts of technological and business threats, the report fizzles out: it rejects legislation specific to the IoT, though it does offer several suggestions for “general privacy legislation,” such as requiring security on devices.
It’s not that the FTC is focusing on the wrong issues. No doubt about it, pacemakers and other critical devices can be hacked. It could be a movie: in Scene 1, a nondescript individual moves through a crowded city street, thumbing an ordinary notepad computer. In Scene 2, some time later, numerous people fall to the ground as their pacemakers fail. They just had the bad luck to pass within range of the individual with the notepad, who had infected their implants with malicious code that took effect later.
More security = more problems
That being said, it’s not surprising that the FTC isn’t ready to come out guns blazing with proposed solutions to digital health device security issues. The reality is, there are complex problems which arise when regulators demand more security.
First, computer security almost always rests on encryption, which increases the size of the data being protected. The best-known FTC case regarding device security, in which the agency forced changes to cameras used in baby monitors, involved external devices that could absorb the extra overhead. But on a small embedded device, larger data means more memory use, which in turn demands more storage and computing power, as well as more transmission time over the network. In the end, devices may have to be heavier and more costly, both serious barriers to adoption.
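To make the overhead concrete, here is a minimal sketch in Python, using only the standard library. The key and the sensor reading are made up for illustration, and the scheme (an HMAC authentication tag appended to the payload) stands in for whatever cryptographic protection a real device would use; the point is only that the fixed cryptographic overhead dwarfs a tiny medical payload:

```python
import hmac
import hashlib

# Hypothetical 4-byte sensor reading (say, a heart-rate sample).
reading = (72).to_bytes(4, "big")

# Placeholder shared secret; a real device would use a provisioned key.
key = b"device-shared-secret"

# Authenticating the reading with HMAC-SHA256 appends a 32-byte tag.
tag = hmac.new(key, reading, hashlib.sha256).digest()
protected = reading + tag

print(len(reading))    # 4 bytes of payload
print(len(protected))  # 36 bytes on the wire: a 9x increase
```

For a server shipping large video streams the extra 32 bytes are noise; for a battery-powered implant sending four-byte readings over a low-power radio, a ninefold increase in transmitted data is exactly the kind of cost the report worries about.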
Furthermore, software always has bugs. Some lie dormant for years, like the notorious Heartbleed bug in the very software that web sites around the world depend on for encrypted communications. To provide security fixes, a manufacturer has to make it easy for embedded devices to download updated software—and any bug in that procedure leaves a channel for attack.
Perhaps there is a middle ground, where devices are designed to accept updates only from particular computers in particular geographic locations. A patient would then be notified by email or text message to bring the device to the doctor, where the fix could be installed. The movie scene where malicious code gets downloaded from the street would become much less likely. But even these approaches create extra hassles that could make promising new devices less attractive to patients and providers.
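However the update channel is restricted, the core requirement is the same: the device must verify where an update came from before installing it. The sketch below illustrates that idea with a shared-secret HMAC check in Python; all names and keys are hypothetical, and a real device would more likely rely on asymmetric signatures with a manufacturer's public key burned into protected hardware:

```python
import hmac
import hashlib

# Placeholder secret, imagined as provisioned at manufacture time.
DEVICE_KEY = b"provisioned-at-manufacture"

def accept_update(firmware: bytes, tag: bytes) -> bool:
    """Agree to install firmware only if its authentication tag checks out."""
    expected = hmac.new(DEVICE_KEY, firmware, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking the tag byte by byte.
    return hmac.compare_digest(expected, tag)

firmware = b"v2.1 patch"
good_tag = hmac.new(DEVICE_KEY, firmware, hashlib.sha256).digest()

print(accept_update(firmware, good_tag))      # True: legitimate update
print(accept_update(firmware, b"\x00" * 32))  # False: forged tag rejected
```

The catch, as the article notes, is that this verification code is itself software: a bug in `accept_update` or in the key handling becomes the very channel an attacker needs.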
Involving the public
The FTC's paper demonstrates that the agency has developed a firm understanding of the problems in digital health security and privacy, but also suggests that it's not sure where to take a stand. The agency doesn't have to go it alone, though. As devices grow in sophistication and spread to a wider population, the kinds of discussions the FTC held should be extended to the general public.
For instance, suppose a manufacturer planning a new way of tracking people, or a new use for the data, convened some forums in advance, calling on potential users of the device to discuss the benefits and risks. That way, the people most affected by the policies would collectively help the manufacturer decide which trade-offs to adopt.
Even an engaged, educated public is no perfect solution. For instance, a privacy-risking choice that's OK for 95% of users may turn out harmful to the other 5%. Still, education for everyone—a goal chosen by the FTC as well—will undoubtedly help users make safer choices.