Tuesday, October 18, 2005

Effectiveness of web application testing tools and future trends

The two most relevant metrics for judging the effectiveness of a web application testing tool are the number of vulnerabilities discovered and the number of false positives generated. False positives often force heavy manual labor to filter the false readings out of a huge amount of data.
The most advanced tools, such as the ones that perform heuristic attack detection, have evolved from simple pattern matching (e.g., 404 error page detection) to slightly more flexible detection (e.g., user-configurable regular expressions).
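That evolution can be sketched in a few lines. This is an illustrative example, not any vendor's actual code: the fixed-string check and the pattern list are hypothetical, standing in for a hard-coded detector versus a user-configurable one.

```python
import re

# Old style: a hard-coded fixed-string check. It only recognizes
# one exact error banner and misses custom error pages.
def is_error_page_fixed(body: str) -> bool:
    return "404 Not Found" in body

# Newer style: user-configurable regular expressions, so testers can
# describe the custom error pages their own application returns.
ERROR_PATTERNS = [
    re.compile(r"404\s+not\s+found", re.IGNORECASE),
    re.compile(r"page\s+(cannot|could not)\s+be\s+found", re.IGNORECASE),
]

def is_error_page_regex(body: str) -> bool:
    # Flag the response if any configured pattern matches anywhere in it.
    return any(p.search(body) for p in ERROR_PATTERNS)
```

The regex version catches the same banner as the fixed check plus whatever patterns the tester adds, which is exactly the flexibility gain described above.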
Future trends point toward heuristic detection, which will consist of auto-generating detection rules through zero-day defense technology: the ability to learn from patterns of known vulnerability behavior and then rule out all unknown behavior as false positives (much the way some intrusion detection systems work today).
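A toy sketch of that learning idea, under stated assumptions: the signature list and the overlap-scoring scheme below are hypothetical inventions for illustration, not how any real product implements zero-day defense.

```python
# Hypothetical "learned" signatures: substrings observed in responses
# from known-vulnerable behavior (e.g., database error output,
# leaked /etc/passwd content).
KNOWN_SIGNATURES = [
    ["sql", "syntax", "error"],
    ["root:", "/bin/bash"],
]

def resembles_known_vulnerability(body: str, threshold: float = 0.5) -> bool:
    """Score a response against known-vulnerable patterns; anything
    that does not resemble one is ruled out as a false positive."""
    text = body.lower()
    for sig in KNOWN_SIGNATURES:
        matched = sum(1 for term in sig if term in text)
        if matched / len(sig) >= threshold:
            return True
    return False  # unknown behavior: treated as a false positive
```

The interesting (and debatable) design choice is the same one the post describes: everything that does not match learned behavior is discarded, which keeps manual triage low at the risk of missing genuinely novel attacks.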
Currently, security testers use multiple tools, both commercial and open source, and augment the tools' deficiencies with manual analysis of the results. Relying on tool analysis alone is no guarantee of finding the major vulnerabilities in an application. Overall, most tools find no more than perhaps 25-50% of the known vulnerabilities in a typical application.
Some tool vendors allow users to extend the product's capabilities by adding their own scripts or exploits, which can help increase the number of vulnerabilities found as well as reduce false positives. Clearly, as the technology progresses, the sophistication of these products will continue to improve. In the meantime, there is no real substitute for the tester's knowledge of application security. Testers need to focus on the most important security requirements, write a test plan, and use tools that allow for both manual and automated analysis. Tools are only one factor of the equation; the others are people and process. Before recommending a tool for your organization, perform a proof of concept, looking especially for flexibility in customizing the tool and extending its functionality.
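A vendor extension interface of the kind mentioned above might look like the following. This is a hypothetical plugin sketch: the registry, the decorator, and the `com.example.app` package name are all assumptions made up for illustration, not a real scanner's API.

```python
# Hypothetical plugin registry: testers register custom checks that
# the scanner runs against every URL/response pair.
CHECKS = []

def register_check(func):
    CHECKS.append(func)
    return func

@register_check
def check_stack_trace(url, body):
    # An application-specific rule (here keyed to a made-up package
    # name) cuts false positives versus a vendor's generic pattern.
    if "com.example.app" in body and "Exception" in body:
        return f"Stack trace disclosure at {url}"
    return None

def run_checks(url, body):
    # Collect the findings from every registered custom check.
    return [finding for check in CHECKS
            if (finding := check(url, body)) is not None]
```

Writing checks against your own application's error signatures is precisely the customization worth probing during the proof of concept suggested above.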


trustedconsultant said...

This is the first time I have heard about zero-day defense technology.
Assuming this is technology that provides countermeasures to zero-day exploits, it means that a countermeasure is ready on the same day the vulnerability is discovered (i.e., learned by the vendor). Heuristic filters are supposed to set new rules by learning from history, eventually deriving an effective vulnerability rule from past behavior (a technique used to attempt to block viruses or worms that are not yet known about).
Another way to implement zero-day technology is perhaps to research vulnerabilities before they are exploited. One security organization (PivX Solutions) used to maintain a running list of Microsoft Internet Explorer vulnerabilities that Microsoft had been made aware of but hadn't yet patched. On D-Day (calendar zero) a vulnerability is published along with its countermeasure.
Anyway, it would be interesting to know how effective heuristics are at detecting vulnerabilities either before, or on the same day (calendar zero), a countermeasure is publicly available.
Assuming we are not just looking for a simple, linearly predictable pattern, trying to predict multivariable and non-linear behavior (as in most multivariable problems) is rather challenging, as far as I know.
In current application firewall implementations, filters are either white list or black list.
It is known that "white list" filtering is better than "black list" filtering when detecting vulnerabilities. White list filtering checks for expected behavior (such as validating expected input) and rejects everything else as suspicious. Black list filtering checks for all suspicious behavior; in the case of input validation, for example, all possible unexpected character combinations. A black list is difficult to implement because it requires checking for an effectively infinite set of patterns, since such patterns are unpredictable. Their other limitation is
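The white list vs. black list distinction can be shown with a minimal input-validation sketch. The username rule and the deny-list patterns below are hypothetical examples, chosen only to illustrate the two approaches.

```python
import re

# White list: accept only what is expected (here, a made-up rule of
# 1-32 alphanumeric/underscore characters) and reject everything else.
WHITELIST = re.compile(r"^[A-Za-z0-9_]{1,32}$")

def whitelist_valid(value: str) -> bool:
    return WHITELIST.match(value) is not None

# Black list: try to enumerate dangerous patterns. The list can never
# be complete, which is exactly the limitation described above.
BLACKLIST = [
    re.compile(r"<script", re.IGNORECASE),   # naive XSS check
    re.compile(r"('|--|;)"),                 # naive SQL injection check
]

def blacklist_valid(value: str) -> bool:
    return not any(p.search(value) for p in BLACKLIST)
```

Note that a payload using double quotes instead of single quotes slips past this black list but still fails the white list, which is why white listing is considered the stronger approach.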

trustedconsultant said...

More on heuristics. In the case of email spam filters, it is about assigning a value to emails by learning from email archives. Heuristics here means statistically flagging some email components as suspicious. The filter sensitivity can be user-defined: more sensitivity means more mail being filtered, but also more false positives. That looks more like black list filtering to me.