Software Security Analysis for Wearables with Jacob West


Jacob West is chief architect for security products at NetSuite and co-author of “WearFit: Security Design Analysis of a Wearable Fitness Tracker” (http://bit.ly/IEEEWearFit), which supports the IEEE Center for Secure Design’s report, “Top Ten Software Security Design Flaws.” Both reports and the Center are part of the IEEE Cybersecurity Initiative.

Question: Why use a fictitious “WearFit” device to illustrate sound software security design?

West: Wearable devices are catching on and changing the way the public interacts with computing technology. According to Forbes, nearly half of all consumers plan to buy wearable devices, including fitness trackers, by 2019. That has potentially huge implications for software security that could affect consumers, vendors, and other parties in the value chain.

The wearable form factor is particularly interesting because of how tightly integrated into our lives the devices are becoming. The intersection of technology with fitness, sleep, and other health factors changes some risks that we’ve seen in the past on desktop and Web applications.

Question: This WearFit design analysis is part of an effort to support with greater detail the Center for Secure Design’s original report on “The Top Ten Software Security Design Flaws.” Would you walk us through the process and how the report is organized?

West: My co-authors and I first agreed on a design for the system on which to base our analysis. It’s a fictitious design, but it’s based on real-world systems and constraints, and is as similar as possible to actual wearable fitness-tracking devices on the market.

The system overview is the first major section of the report, and it describes the device’s hardware and software architecture, as well as the mobile application that it employs for communication, and the backend website. The reader first needs to grasp the system design and how the device functions, independent of security concerns.

Then the report addresses the top 10 software security design flaws we have identified in the context of the WearFit device and how to avoid them, one flaw at a time.

Question: What are the report’s take-aways for vendors who create such devices, for software security designers, and for consumers?

West: For device vendors, we hope this report reflects the Center for Secure Design’s mission: to shift the industry’s focus from exclusively finding and fixing bugs in software to a more balanced approach that also looks at design flaws. We can avoid many bugs and vulnerabilities just by how carefully we build a system.

For the software security professional or system architect, much of this will be what we call in security “motherhood and apple pie.” It’s basic and for the most part well understood. But we hope that the analyses as a whole, the questions we ask, and the answers we provide will offer novel points, and that this will help expand the breadth of the architect’s toolbox when performing a similar design analysis.

As an aside on readership, the designer of software for wearable fitness-tracking devices is the most likely to be familiar with these issues and the least likely to profit from reading the report. The readers who may take away the most are those who don’t work in this specific product category but who see connections to their own work and become more aware of related issues.

For consumers, we’d like them to understand that security, broadly speaking, is a real concern. It shouldn’t be a paralyzing concern or prevent adoption of wearable or other technology, but it is one that consumers should be aware of and educated about. Wearable technology is, in essence, akin to desktops, laptops, tablets, smartphones, and other computing technologies that carry security risks. An educated consumer should understand that security risk exists, and that which devices they choose and how they use them can affect the severity of that risk.

Question: Wearable fitness trackers typically record distance traveled and heart rate. If a device is compromised, where’s the security risk in a malicious actor obtaining those metrics?

West: Originally, say in the mid-20th century, computers had no security concerns, because security was exclusively a physical problem. As the first networks were developed, viruses and worms became an issue. Today, software is being built into an ever-increasing number of everyday devices, and those devices are accessing IP-based networks used by ever-increasing numbers of other devices and other people. Such networks, at scale, are being referred to in certain contexts as the emerging Internet of Things.

As the attack surface increases, the need to “bake in” security, rather than “bolt it on,” increases as well. Security has to be built in from the design of the software all the way through development and testing, until it’s deployed or turned over to an end user.

Users’ personal fitness information now needs even greater security protections because vulnerabilities threaten the fitness data itself, personal identity information, and the network and servers that support it. Sections six and seven of the report, “Use Cryptography Correctly” and “Identify Sensitive Data and How They Should Be Handled,” cover these topics in more detail.
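As a minimal illustration of what “use cryptography correctly” can mean for stored fitness data, the sketch below uses Python and the third-party cryptography package’s Fernet recipe; the data format and key handling shown are hypothetical assumptions for the example and are not drawn from the WearFit report.

```python
# Illustrative sketch only: protecting a heart-rate sample with a vetted,
# authenticated encryption recipe instead of an ad hoc scheme.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# In a real device, the key would come from a platform keystore or secure
# element, never be generated and held in application code like this.
key = Fernet.generate_key()
cipher = Fernet(key)

heart_rate_sample = b'{"timestamp": "2016-03-01T12:00:00Z", "bpm": 72}'

# Fernet combines symmetric encryption with an integrity check, so a
# tampered token is rejected at decryption time rather than silently read.
token = cipher.encrypt(heart_rate_sample)
recovered = cipher.decrypt(token)
assert recovered == heart_rate_sample
```

The point of the sketch is the design choice: rely on a well-reviewed, authenticated primitive and keep key management outside the application code, rather than inventing a custom encryption scheme for sensitive data.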

The privacy and security implications of a fitness-tracking device may not be immediately obvious to the end user. For the software developer and the company that built the device, there is a trade-off between profitably promoting adoption of the fitness tracker and keeping the user secure and their data private. Nothing is 100 percent secure, but failure to protect the user’s security and privacy could harm the business.

Question: Your comment brings up a final question: How does the software security designer gain organizational or corporate support for best practices in secure design?

West: Despite a growing awareness of the value of sound design versus the cost of fixing bugs and the consequences of data breaches, the designer still needs an executive-level mandate from a champion in top management. In securing that support, the best argument focuses on return on investment, and that requires some data. With some homework, the security practitioner can argue that a sounder design process could have eliminated an entire class of vulnerabilities. If you can make that argument, then you can weigh an investment in secure design against cost reductions during testing or the brand erosion stemming from a data breach.

It should be a business decision based on costs and benefits. I don’t underestimate the difficulty of gathering credible data in support of that argument, but that’s the direction the conversation should take.