I was recently a guest on episode 100 of Silver Bullet, where I was asked, “How much progress have we made in the last ten years with Architecture Risk Analysis (that is, finding and fixing flaws in software design)?” My response surprised some folks here at Cigital when I explained that, for the most part, I did not think we had made much progress at all. That answer was based on the common defects we find when analyzing the design of a system or application as part of our Architecture Analysis practice here at Cigital. Our clients are all over the world; they range from large to small, operate in very different markets with very different business models, and use different development methodologies. So I feel quite confident in my view of the [lack of] progress made over the last decade.
Fortunately, I am not the only one keeping tabs on commonly found defects. OWASP has been releasing its Top Ten list for many years: originally created in 2003, it was updated in 2004 and has been revised every three years since. In a recent conversation (I forget with whom), someone commented that most of the OWASP Top Ten has remained the same over the last ten years. My initial reaction was “No way!” How could we be making the same mistakes for ten years?
I decided to look at the Top Ten lists from 2004, 2007, 2010, and 2013 and see for myself how the list has changed over the years. I came across a PDF summarizing changes made to the Top Ten lists over the years and as I looked at the PDF it was immediately clear that sure enough, a number of entries have been in the list from the very beginning. Well that was depressing.
Comparison of 2003, 2004, 2007, 2010 and 2013 Releases
Why are we seeing the same problems over such a long period of time? These issues are well documented, there are numerous sample applications built to show how the vulnerabilities work, and both static and dynamic testing tools export their findings mapped to OWASP Top Ten entries. We certainly should know about them.
I suppose there are many good reasons the same defects keep occurring year after year. Maybe we aren’t teaching students about this as part of computer science. Maybe there is some hyper-focus on functionality and security is one of those things you swear you will get around to, but never do. Maybe these things are genuinely hard to solve so we keep making mistakes. But no matter what the reason may be, the fact seems to be that the same defects keep occurring.
Now, to be fair, although quite a few defects have appeared on the list for several iterations, some have shown either a steady decline or a recent significant drop. For example, CSRF first appeared in 2007 as A5. It remained in the fifth spot in 2010, but in 2013 it dropped all the way to A8. Progress! I wonder what happened. It is possible that developers learned all about this defect and started writing their code with CSRF protections. Or maybe the frameworks developers use were enhanced with a CSRF protection mechanism built right in, making it easy for developers to get this protection for “free”. I have no data to support this possibility, but it at least sounds plausible. So the broader question is: what can we do to make it easier for developers to avoid mistakes in the first place, without reinventing a wheel every time they want to get something done?
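To make the framework-level idea concrete, here is a minimal sketch of the synchronizer-token pattern that many web frameworks implement for CSRF protection. The function names and the secret-key handling below are my own illustration using Python’s standard library, not any particular framework’s API; a real framework would tie the secret to application configuration and the token to the user’s session cookie.

```python
import hashlib
import hmac
import secrets

# Hypothetical application secret; in a real framework this would come
# from configuration and persist across server restarts.
SECRET_KEY = secrets.token_bytes(32)

def generate_csrf_token(session_id: str) -> str:
    """Derive a per-session token by HMAC-ing the session identifier.

    The framework embeds this token in a hidden form field when it
    renders the page.
    """
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def validate_csrf_token(session_id: str, submitted_token: str) -> bool:
    """Recompute the expected token and compare in constant time.

    The framework calls this on every state-changing (POST/PUT/DELETE)
    request before running the application's handler.
    """
    expected = generate_csrf_token(session_id)
    return hmac.compare_digest(expected, submitted_token)

# Example flow: a legitimate form round-trip succeeds, a forged
# cross-site request (which cannot read the token) fails.
session = "session-abc123"
token = generate_csrf_token(session)
assert validate_csrf_token(session, token)
assert not validate_csrf_token(session, "attacker-guess")
```

Because the browser sends cookies automatically but an attacker’s page cannot read the token embedded in the victim’s form, the forged request arrives without a valid token and is rejected. The point of baking this into the framework is exactly the one above: developers get the protection without having to design it themselves.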
Over the coming weeks we will look at various examples of doing exactly this.
Jim DelGrosso, Senior Principal Consultant, has been with Cigital since 2006. In addition to his overarching knowledge of software security, he specializes in Architecture Analysis, Threat Modeling and Secure Design. In fact, he was a catalyst for creating Cigital’s current Architecture Analysis practice. Jim is the Executive Director for IEEE Computer Society Center for Secure Design (CSD). He also predicts that “OpenSSL will have at least one new vulnerability found in the next 12 months. You can pick the start date—it’s the ‘12 months’ that matters.” Jim relaxes and decompresses from work by playing with the dogs, listening to music, or just chilling out with a beer and a movie.
Words of Software Wisdom: Better software security is achieved by combining many activities. You can’t just do one activity like pen testing and declare victory. You also have to do secure code reviews, architecture analysis, training, and so on.
Cigital is one of the world’s largest application security firms. We go beyond traditional testing services to help our clients find, fix and prevent vulnerabilities in the applications that power their business.
Our experts also provide remediation guidance, program design services, and training that empower you to build and maintain secure applications.