Moving Privacy Engineering into Practice: A Discussion with José Del Álamo and Seda Gürses


The International Workshop on Privacy Engineering was launched in 2015 with the goal of providing a forum for proposing concrete tools, techniques, and methodologies that engineers can use to build privacy into systems. Two of its organizers, General Chair José Del Álamo, assistant professor of telematics system engineering at the Universidad Politécnica de Madrid, and Program Co-Chair Seda Gürses, a visiting research collaborator at the Center for Information Technology Policy at Princeton University, spoke with us shortly after the 2016 workshop about the direction of privacy engineering in general and goals for future workshops.

Question: In your IEEE Security & Privacy article, you define privacy engineering as “an emerging research framework that focuses on designing, implementing, adapting, and evaluating theories, methods, techniques, and tools to systematically capture and address privacy issues in the development of sociotechnical systems.” Why the focus on privacy engineering? Why not just security engineering?

Del Álamo: Security is a means to deliver privacy, but it is not the only means. If you don’t have security, you don’t have privacy. However, whereas security focuses on protecting the assets of the organization, privacy looks to protect the privacy needs and interests of the users. We use the term “engineering” because we noticed that developers often embed privacy into systems in an ad hoc manner rather than addressing it through a systematic process that everyone can repeat, adopt, and test.

Gürses: In security, engineers can generally assume that the organization they’re working for is trustworthy. With privacy you might have an adversarial situation, and in order to build trust under these conditions an engineer working for a service provider might implement cryptographic protocols that prevent even the service provider from accessing user information. For example, companies like WhatsApp and Signal use encryption to protect the content of user messages from the service provider and any other parties who might have an interest in this information. But engineering for privacy doesn’t end there. Privacy engineering also aims to give users further controls over what they disclose to the service provider and third parties.
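
To make this concrete, here is a minimal sketch of end-to-end encryption between two users, written with the PyNaCl library purely for illustration; it is not the Signal protocol that WhatsApp and Signal actually deploy, but it shows the key property that the relaying service only ever handles ciphertext.

```python
# Minimal illustrative sketch, assuming the PyNaCl library (pip install pynacl).
# This is NOT the Signal protocol -- it only demonstrates that the relaying
# service provider never sees the plaintext.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at 6pm")

# The service provider only ever relays `ciphertext`.
# Bob decrypts on his device with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at 6pm"
```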

Question: Does privacy extend beyond the user/provider relationship?

Gürses: Privacy between users is also an issue. Say you upload a photo to a social network and you want to tag some of your friends in the photo. The question here is whether the photo or tags should become public before or after your friends confirm the tag. That very subtle decision doesn’t necessarily change the opacity or transparency relationship with the provider that you’ve achieved through different privacy engineering mechanisms, but it might make a big difference in the quality of the experience or the conflicts users might have with their peers in social contexts. Privacy engineering includes this kind of user experience design.
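
A hypothetical sketch of that decision point follows; the class and method names are invented for illustration and do not correspond to any real platform’s API.

```python
# Hypothetical "confirm before publish" tag workflow; all names are illustrative.
from dataclasses import dataclass, field

@dataclass
class PhotoTags:
    pending: dict = field(default_factory=dict)   # tagged user -> who tagged them
    confirmed: set = field(default_factory=set)   # tags visible to others

    def propose_tag(self, tagged_user: str, tagger: str) -> None:
        # Recorded, but not yet visible to anyone except the tagged user.
        self.pending[tagged_user] = tagger

    def confirm_tag(self, tagged_user: str) -> None:
        # Only the tagged user's confirmation makes the tag public.
        if tagged_user in self.pending:
            del self.pending[tagged_user]
            self.confirmed.add(tagged_user)

    def visible_tags(self) -> set:
        return set(self.confirmed)

tags = PhotoTags()
tags.propose_tag("bob", tagger="alice")
assert "bob" not in tags.visible_tags()   # not public before confirmation
tags.confirm_tag("bob")
assert "bob" in tags.visible_tags()       # public only after confirmation
```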

Question: How has the growth of the Internet of Things and cloud-based services affected development within privacy engineering?

Del Álamo: To begin with, the interface with the user has changed. We’re used to interfaces we can interact with–screens and keyboards. With the Internet of Things, we have all kinds of wearable technologies that can gather different information from individuals, and we interact with the technology in different ways. Privacy engineering has a place in the design of these new interfaces. We need to think about how we can provide information to users about the kinds of information they’re delivering, as well as how they can consent to or exercise control over the ways that information will be collected and used.

Gürses: One of the transformations that underlies what’s happening with both mobile phones and the Internet of Things more generally is what we can call the switch from shrink-wrapped software to services. Instead of a one-time installation of static code and functionality, any smart thing is likely to be online continuously and integrate numerous services that dynamically deliver features to the users. This shift has a significant impact not only on the privacy of the individuals using the services but also on the engineers, who can now change the code at any time. Say you release a new feature and it turns out to be a privacy disaster–users hate it, it leaks information–you can quickly push an update because the code remains under the control of the engineers. Hopefully we’ll see a lot of interesting developments in leveraging the same client-server architecture that underlies services in a way that doesn’t increase intrusiveness or reduce user autonomy but instead gives users higher-quality services while protecting privacy.
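
As a hypothetical illustration of that control (the flag and function names below are invented, not from any real feature-flag product), a service-side flag lets operators switch off a problematic feature for every user without shipping a new client.

```python
# Hypothetical server-side "kill switch": because the code runs as a service,
# flipping the flag takes effect for every subsequent request -- no client
# update required. All names are illustrative.
feature_flags = {"face_tag_suggestions": True}

def suggest_faces(photo_url: str) -> list:
    # Stand-in for a real face-suggestion pipeline.
    return ["alice", "bob"]

def render_photo_page(photo_url: str) -> dict:
    page = {"photo": photo_url}
    if feature_flags.get("face_tag_suggestions", False):
        page["suggested_tags"] = suggest_faces(photo_url)
    return page

feature_flags["face_tag_suggestions"] = False   # roll back the problematic feature
assert "suggested_tags" not in render_photo_page("https://example.org/p/1.jpg")
```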

Question: Given all of the user data that’s now at their disposal, how will organizations balance privacy with analytics?

Del Álamo: Any organization engaged in data analytics needs to address a number of privacy issues, from what data is being collected, to how it’s being processed (for example, is it being anonymized?), to how it will be used, and whether the users have consented to both the collection and use of their personal data. The General Data Protection Regulation in Europe, which takes effect in 2018, will provide some guidance on the mechanisms necessary for user protection. We also need technical and organizational means to properly balance privacy and analytics.
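
As a rough sketch of what such a technical mechanism might look like (the consent record, purposes, and helper names below are assumptions, not taken from the regulation or any product), analytics code can refuse to process events without purpose-specific consent and pseudonymize identifiers before they reach the analytics store.

```python
# Illustrative sketch only: purpose-specific consent check plus pseudonymization.
import hashlib
import hmac
import os

SECRET_SALT = os.urandom(16)  # kept separate from the analytics data store

consent_records = {
    # user_id -> purposes the user has consented to
    "user-42": {"service_delivery", "analytics"},
    "user-43": {"service_delivery"},
}

def pseudonymize(user_id: str) -> str:
    """Replace the raw identifier with a keyed hash so analytics can still
    link a user's events without storing their identity."""
    return hmac.new(SECRET_SALT, user_id.encode(), hashlib.sha256).hexdigest()

def record_for_analytics(user_id: str, event: dict):
    if "analytics" not in consent_records.get(user_id, set()):
        return None  # no consent for this purpose: do not process the event
    return {"user": pseudonymize(user_id), **event}

print(record_for_analytics("user-42", {"page": "home"}))  # processed, pseudonymized
print(record_for_analytics("user-43", {"page": "home"}))  # None
```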

Gürses: Researchers have developed a number of mechanisms that organizations can use when applying machine learning to analytics–for example, privacy-preserving recommender systems and mechanisms to promote fairness and transparency in machine learning. What we try to do with our privacy engineering concept is to say, “That’s great that researchers are coming up with interesting one-off solutions, but how do we systematize that knowledge so people outside of the research community can apply these mechanisms?” We try to turn those solutions into methods, techniques, and tools that are available and accessible to practitioners. In that sense, our goals are similar to those of events like the IEEE Cybersecurity Development Conference, but with a focus on privacy.
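
One representative example of the kind of research mechanism that could be turned into a reusable tool (chosen here as an illustration rather than one named in the interview) is the Laplace mechanism from differential privacy: the sketch below uses NumPy to release a noisy count so that no single user’s contribution is exposed.

```python
# Minimal Laplace-mechanism sketch for a counting query. Illustrative only;
# real deployments need sensitivity analysis and privacy-budget accounting.
import numpy as np

def noisy_count(true_count: int, epsilon: float) -> float:
    """A counting query has sensitivity 1, so Laplace noise with scale
    1/epsilon gives an epsilon-differentially-private release."""
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# e.g., publish how many users enabled a feature without exposing any one user
print(noisy_count(1280, epsilon=0.5))
```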

Question: You recently concluded the second International Workshop on Privacy Engineering. What are your goals for next year’s workshop?

Del Álamo: As academics, Seda and I are trained to develop or put together methods, techniques, and tools, but we don’t know exactly how companies are carrying out their privacy processes. We need to identify how we can make our work more compatible with the business processes within companies or public institutions. We hope to further develop collaborations with these organizations so we can share ideas and they can point us toward issues that inform our research.

Gürses: We would love to hear more from privacy engineering teams–what they’re struggling with, where the pain points are in collaboration between the privacy team and the rest of the development team, how they’re organized, what models work, which privacy mechanisms are harder to implement, and so on. Coming up with technical solutions is one part of the game; the other part is making them actionable for engineering teams, and for this we hope to collaborate with practitioners.

We’ve seen a step change in how industry addresses privacy. This maturation is partly due to external events–the Snowden revelations, the dispute between Apple and the FBI over encryption, the invalidation of Safe Harbor, and so on. We’ve also seen new legislation, like the General Data Protection Regulation. This is an exciting time. Industry and public organizations understand that addressing privacy is not just a legal matter but also a technical one, and we need expertise and knowledge. Our ambition is to contribute to the development of that knowledge base.