Tuesday, May 26, 2009

The Challenges of Source Code Analysis

What are the challenges of Source Code/Static Analysis (SCA)?
SCA in practice is far from perfect. Simply buying a tool doesn't solve any problems. Almost certainly, making a tool available to your team will not result in adoption. In fact, it rarely does.

Think of the old adage: "you don't go to a hardware store to buy a drill, you go to buy holes." When you purchase an SCA tool, you are buying fixed defects and addressed vulnerabilities. The tool, by itself, will get you only a small part of the way there. The biggest challenges we've seen from hundreds of deployments are:

• Lack of integration into the process
• Failure to configure the technology to match business goals
• Lack of trust in what the tool is reporting
• Lack of management support
• Lack of commitment
• Insufficient staffing in number and in expertise
• Lack of ownership

When you purchase an SCA tool, your work has only just begun. Just as with most other tools, after the license and support costs, there are tangible and intangible costs to operate the tool effectively from both a technical and a process standpoint. These investments have to be made; otherwise, the tool will sit on the shelf, nobody will use it, and the purchase will have been a waste. Luckily, these costs are not generally large, and most of the issues can be addressed with good planning and allocation of the right resources and budget.

Belief in the tool's value
It is very difficult to calculate the ROI on an SCA tool. By definition, you are fixing problems before they go out into the field, and so it is difficult to predict the full extent of what the benefit would have been if you hadn’t used the tool. Several companies have tried to measure a tool’s value by correlating found results to past defects with some success. Results have varied greatly depending upon the codebase and the methodology used to analyze the results. Most see a big benefit but are not able to quantify it well.

In addition, SCA tools provide, at best, educated guesses and can therefore be incorrect. This should be fully expected. For example, SCA tools have limited ability to account for environmental assumptions such as configuration settings or usage constraints. False positives (incorrectly flagged defects) and irrelevant ("that will never happen") issues can cause users to lose confidence in the tool. Defect reports pile up or, worse yet, valid bugs are incorrectly marked as false because developers didn't trust the tool enough to investigate the reports in detail. SCA-reported defects then become low priority and remain unaddressed. Bugs and vulnerabilities you could have caught in development instead slip through into production.
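As a hypothetical sketch of such an environmental assumption (the function name and the invariant here are invented for illustration, not taken from any particular tool's output):

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical false-positive scenario: a static analyzer may flag the
 * strlen() call below as a possible NULL dereference, because it cannot
 * see the environmental guarantee that every caller passes a name loaded
 * from a validated configuration file. */
size_t name_length(const char *name)
{
    /* Invariant known only to humans: name is never NULL here. The tool
     * cannot verify this, so it reports a defect that a reviewer would
     * likely dismiss as a "that will never happen" issue. */
    return strlen(name);
}
```

Whether such a report is a false positive or a latent bug depends entirely on whether the invariant actually holds at every call site, which is exactly the judgment developers must trust the tool enough to investigate.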

The initial usage of the tool is an important time, when lasting impressions are formed. Many SCA tools report all problems equally, with little guidance as to which issues are higher priority than others. To be fair, priority is different for every organization and usually combines the severity of the effect, how likely it is to occur in production, and how it affects the functionality of the product; therefore it is typically not calculated by the tool. A developer facing a large pile of bug reports doesn't know where to begin. SCA tools do not prioritize well.
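To make those factors concrete, here is a toy scoring sketch. The 1-5 scales and the simple product are invented for illustration; every organization weighs these factors differently, which is precisely why a generic tool cannot compute priority for you:

```c
/* Toy per-organization priority score (all scales invented for this
 * sketch). Severity, production likelihood, and functional impact are
 * the factors an organization typically combines by hand. */
typedef struct {
    int severity;   /* 1 (cosmetic)    .. 5 (crash/exploit)        */
    int likelihood; /* 1 (theoretical) .. 5 (seen in production)   */
    int impact;     /* 1 (internal)    .. 5 (core functionality)   */
} defect_t;

int priority_score(const defect_t *d)
{
    /* A simple product puts a likely, severe, core-feature defect
     * (125) far ahead of a theoretical cosmetic one (1). */
    return d->severity * d->likelihood * d->impact;
}
```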

In software development organizations, there is a constant battle between what is urgent and what is important. SCA tools fit more under the important category and are often deprioritized from daily activities in an environment where everything is urgent. Some organizations unfortunately are more reactive than proactive. If everything is urgent, then in general you are understaffed. If people are pulled onto different projects and can never accomplish their goals, then you have not made a serious commitment to SCA. Get ready to put the SCA tool on the shelf.

Changing behavior is hard
Most organizations that use SCA require the software developer who owns the code to fix the problems reported. Getting participants to change their process is never easy. Extra steps are never welcome. Expect resistance from a vocal minority.

If the tool is not easy to use, it won't get used. It has to be baked into the process, and the users have to be trained. In addition, individuals have to feel that it adds value to their job. Incentives and consequences are extremely important. People who use the tool to accomplish business goals should be rewarded publicly. Likewise, there need to be penalties or consequences for not fixing problems reported by the SCA tool. Techniques like the "wall of shame" and MBOs help align everybody to the common goal but must be used carefully. Displaying high bug counts can be a powerful political force but can cause undesired effects, such as developers overzealously marking their defects as false positives or "ignores." You have to balance the culture with the proper checks and balances to have the right effect.

Management support has to be strong and steady. It's hard to prioritize quality and security over features and functionality. If management doesn't believe it is a priority, then it will never succeed. Setting up the right process and the correct incentive structures is critically important to changing behavior. That being said, management should always be evaluating whether an investment in a tool is warranted. Having the right reporting in place can help management monitor the contribution SCA is making to the overall development process. Market conditions, and therefore priorities, can change from day to day.

Giving the right support
Lack of ownership is a huge issue. In addition to management support, there needs to be ownership from both a tool-operation perspective and a user perspective. Often, only a small percentage of a single resource's time is charged with owning the SCA tool. Most SCA tools come preconfigured to accomplish goals that may be misaligned with your own. Without the proper expertise and time commitment, most SCA administrators resort to using the tool out of the box, resulting in higher false positive rates, incomplete code coverage, increased false negatives (where the tool should have reported a bug but didn't), and suboptimal performance. SCA tools are sophisticated and can be tuned to perform optimally on a specific codebase. Expecting your SCA operator (who may have many other responsibilities) to be an expert administrator of the tool is unrealistic and a setup for failure.

Companies that gain the most value from SCA allocate expert staff with "static analysis" in their titles. These experts, in turn, took years to develop the level of proficiency that transformed their organizations into big quality and security success stories.

At Code Integrity Solutions, we've worked with many leading organizations (both small and large) to help them overcome these challenges. But every organization is different. Let us know what challenges you've seen with source code analysis/static analysis in your organization.
