Q&A with Jim DelGrosso

Jim DelGrosso with Cigital Sign

Jim DelGrosso was named executive director of the IEEE Center for Secure Design, under the IEEE Cybersecurity Initiative, in 2014. He is a senior principal consultant at Cigital. In this interview, DelGrosso discusses the Center’s purpose, its ongoing outputs and its work in the coming year.

Question: Would you describe how the Center for Secure Design was founded and its purpose?

DelGrosso: One of the goals of our first meeting was to see whether what we all thought was a fact was, indeed, a fact. We had all assumed that common flaws were being built into software today, but everyone at their own organization does things their own way. Though some design flaws make headlines because they lead to an egregious security breach, the vast majority never get publicly reported, for obvious reasons. Still, many design flaws are caught inside organizations, either through design review or through root-cause analysis of a particular class of bugs.

In 2014 we gathered people with different backgrounds in a workshop to see if, in fact, we were all seeing common problems. Everybody brought their data to the table. We didn’t know what we were going to find. The beauty of the Center for Secure Design is that we collected this information from large and small organizations and universities, including Oracle, Google, Twitter, Hewlett-Packard, Cigital, RSA, McAfee, EMC, Harvard University, University of Washington, George Washington University, Athens University of Economics and Business, and the Sadosky Foundation. Over several days we found very strong commonalities in the problems that commercial entities, universities, and government agencies were encountering.

That led us to create the document “Avoiding the Top Ten Software Security Design Flaws,” which gives very concrete advice on things software security professionals need to think about when building systems.

Question: So one of the Center’s missions is to shift the software security field from a reactive stance on bugs and hacks to a proactive stance on sound design from the ground up?

DelGrosso: Exactly. There’s plenty of published information about bugs and quite a few tools to identify them, using static and dynamic code analysis, penetration testing, and so forth. By contrast, we’re behind in creating design tools. That’s weird, because when you design something with a flaw baked into it and defects keep surfacing throughout the software development life cycle and implementation, the obvious conclusion is that sound design could have avoided the problem.

We’re trying to change the industry’s mindset by promoting secure design best practices. We’re going to help folks avoid problems that they haven’t seen yet, perhaps because they haven’t been looking for them. But those problems are probably there.
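As a rough illustration of the distinction DelGrosso draws between implementation bugs and design flaws, consider the hypothetical sketch below (the User type, BALANCES store, and function names are illustrative, not from the interview or the Center’s report). The flawed function is clean line by line, so a bug scanner has nothing to flag, yet any authenticated user can read anyone’s balance because the design never calls for an authorization decision; the second function shows the kind of designed-in, authorize-after-you-authenticate control the Top Ten document advocates.

```python
# Hypothetical sketch: a design flaw lives in what the interface never asks
# for, not in any single line a bug scanner could flag.
from dataclasses import dataclass


@dataclass
class User:
    name: str
    role: str  # e.g. "customer" or "admin"


BALANCES = {"alice": 1200.00, "bob": 300.00}


def get_balance_flawed(user: User, account: str) -> float:
    """Design flaw: the caller is authenticated (a User exists), but the
    function was never designed to authorize the request, so any signed-in
    user can read any balance. The code is 'bug-free' line by line; the
    missing control has to be added at design time."""
    return BALANCES.get(account, 0.0)


def get_balance_sound(user: User, account: str) -> float:
    """Designed-in control: authorize after you authenticate."""
    if user.role != "admin" and user.name != account:
        raise PermissionError(f"{user.name} may not view {account}'s balance")
    return BALANCES.get(account, 0.0)


if __name__ == "__main__":
    alice = User("alice", "customer")
    print(get_balance_flawed(alice, "bob"))  # leaks Bob's balance: 300.0
    try:
        get_balance_sound(alice, "bob")
    except PermissionError as err:
        print(err)  # denied, because the check was designed in
```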

Question: What’s on the Center’s agenda for 2016?

DelGrosso: As the year unfolds, we’ll complete and release drill-down modules on all ten design flaws for a Web-based platform. We know more detail is needed to support our paper on avoiding design flaws. Then we can start to look at design flaws on another platform: we’ll pick another type of system, attract expertise in that platform, and begin looking at how design flaws apply to it. We’ll need to develop guidance that makes sense for that other type of architecture. Meanwhile, we’ll want to review the so-called “top ten” software security design flaws we’ve already identified to ensure that they’re still relevant, or revise the list to reflect inevitable changes in the world around us.

We’ve discussed, but not decided, whether it would be helpful to hold the same sort of initial workshop that led to the Center’s creation in different parts of the world, which are at different stages in the practice of software security and have very different views on what should be secure. For instance, privacy is a far bigger deal in Europe than here in the United States. Do they build software in a slightly different way? Let’s find out. Apart from geographic diversity, what about how software security is handled in different industry verticals? We’ve seen headlines about software security breaches in everything from the automotive industry to banking. We’ve just got to get people in different countries and different verticals to recognize that software design has got to be a part of the discussion.

Question: We touched on the effort needed to shift the industry from a reactive stance to a proactive one. Is it a challenge for professional software engineers to justify the time and resources needed to implement secure design or to review their own work for flaws?

DelGrosso: In industry verticals, there is constant pressure to get software out the door and be first to market. If I’m spending time creating sound, secure software design, that’s time I’m not spending on creating a new feature or a system with new capabilities. There’s a business decision to be made.

Here’s the paradox of sound software security design: if I do it well, no one sees tangible results. The software is secure and it works. On the other hand, if I scan my code, find bugs, and fix them, my bug count goes down. When you take the time to follow secure design best practices, you do not reduce a “flaw count.” When something like that is difficult to measure, it can be difficult to get corporate or managerial buy-in. So we have to change the conversation: how can we not spend time and resources on following best practices in software security design? Anyone who invests the time will find things that cannot be found any other way. We have to do it. Perhaps the justification is brand integrity, business continuity, and the absence of headline-making flaws.

Question: Does the Center for Secure Design have a role in addressing the needs of students going into the software field?

DelGrosso: To a degree, yes. Designing software and architecting systems is a kind of apprenticeship. You have to do a lot of it to figure out what works and what doesn’t; some of it is trial and error. We can teach the fundamentals in school, and it’s certainly a topic that should be covered. But to my mind, teaching secure design isn’t the same thing as teaching somebody how to program in a language. The concepts can certainly be taught: we can talk about design principles, and we can talk about design patterns. But conveying best practices in the actual building and architecting of systems and software requires more time and experience, in my opinion.

Question: What’s the urgency of the Center’s mission, in your view?

DelGrosso: Software is being integrated into almost everything, including children’s toys. It turns out that because of the way the software on certain toys is built and deployed, there are real vulnerabilities that let hackers pull information from the back-end servers. So even a toy can become a new way for attackers to do the nonsense that they do. If security is not thought through as these devices are connected to the Internet, hackers have a new attack path. With the Internet of Things, a teddy bear can create new opportunities for bad guys to do more damage in more ways. And that seems like a really horrible idea. It’d be great to avoid that, so we’ve got to get the message across on secure design and provide best practices and tools to support it.