Chad Perrin: SOB

4 April 2007

Security Analysis: Symantec ISTR XI (Vulnerability Trends Highlights)

Filed under: Security — apotheon @ 11:15

In yesterday’s analysis I presented some coverage of the “Attack Trends Highlights” of Symantec’s Internet Security Threat Report, volume XI. Today, I’ll present an analysis of the Vulnerability Trends Highlights from the Executive Summary Highlights section of Symantec’s report. I will be providing analysis of further content of the ISTR XI on a (hopefully) daily basis, until I decide I’m finished.

Interesting Numbers:

Symantec’s Internet Security Threat Report volume XI provides a number of interesting statistics in its “Vulnerability Trends Highlights”. The following is commentary on some of the more suggestive statistics.

  • 2,526 new vulnerabilities were documented by Symantec in the six-month period from 01 July through 31 December, 2006. This represents a 12% increase over the number documented by Symantec in the previous six months. This sort of metric is a lot more likely to be directly useful than many of the others Symantec counts, at least within a trivial margin of error, because of the nature of Symantec’s line of business. Of course, there’s nothing particularly surprising about a twelve percent increase in vulnerability detection: as detection techniques improve and the quantity of available software increases, the number of discovered vulnerabilities should increase as well.
  • 4% of discovered vulnerabilities were classified as high severity, according to Symantec; 69% were medium severity, and 27% low severity. Severity classification is based on a combination of factors, including ease of exploitation, the level of access or amount of direct damage an exploit can achieve, whether the vulnerability can be exploited remotely, and the potential for harm to other network-connected systems via a single exploit, among other metrics. The “industry standard” for rating the severity of vulnerabilities is FIRST’s CVSS. Unfortunately, CVSS (and its imitators) uses metrics that are highly arbitrary and not well suited to producing an actually useful measure of your systems’ security exposure; it is somewhat akin to the largely useless color-code system the government now uses to rate the current danger of terrorist attacks (see the first sketch following this list for a sense of how such scores are calculated). To determine your actual exposure due to a specific vulnerability, you generally need to read as much as you can find about the vulnerability and make an informed decision for yourself — or hire someone to do so for you, assuming that person has the appropriate skills and will apply all due rigor to the task.
  • Sixty-six percent of all disclosed vulnerabilities in this period affected Web applications, according to Symantec. This, too, is not surprising, especially as Web applications become increasingly prevalent and important. It’s worth mentioning, however, if only so that readers who did not previously realize this was a likely statistic have an opportunity to consider it. This is important not only for people involved in Web application development and deployment, but also end-users of Web applications and people who might be inclined to forget about Web applications when considering vulnerability statistics.
  • Sun Solaris suffered the longest patch times of all OSes covered by the study, according to Symantec. This is important mostly because Sun Microsystems directly disputes Symantec’s numbers. Symantec claims 63 vulnerabilities for Solaris with an average 122-day response time on patches. Sun reported 54 total “Security Sun Alerts” for the same period, only 36 of which applied to Sun Solaris. Of those, Sun’s statistics show that the majority were addressed in about five days, but averages were skewed upward by “a small minority of 3rd party applications”, meaning third-party code included with Solaris as it was shipped (see the second sketch following this list for how a few outliers can distort an average like this).
  • 68% of disclosed vulnerabilities were not confirmed by vendors, according to Symantec. It’s possible this may account for disparities such as that between Sun’s numbers and Symantec’s.
  • Microsoft Internet Explorer suffered 54 documented vulnerabilities, about 25% more than the next highest number, according to Symantec. Second place went to the Mozilla line of browsers — not a single browser, but all major Mozilla browsers counted together.
  • Mozilla browsers saw an average patch response time of two days, according to Symantec, the shortest of any browser vendor/distributor studied.
  • 12 zero day exploits were documented by Symantec during this period. Zero day exploits (or “zero-day vulnerabilities” in Symantec’s phrasing) are exploits that exist at least concurrently with, if not prior to, disclosure of the vulnerabilities they target. In general, zero day exploits are indicators that vulnerabilities were discovered because malicious security crackers were already employing them to compromise affected systems. Among other implications, this tends to suggest that the vendor/maintainer is not doing an effective job of timely vulnerability detection and resolution. The rate of zero day exploits in the second half of 2006 represented a substantial increase over the previous two six-month periods, in each of which Symantec documented only one zero day exploit, and the vast majority of these applied to Microsoft software.
  • 168 vulnerabilities were documented by Symantec for Oracle database implementations, more than the number documented for any other vendor’s database systems. I don’t find this particularly interesting, personally, beyond the humor factor, as I don’t use Oracle software and have no interest in doing so any time soon. It might be of interest to some of you, however.
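
To make the complaint about CVSS in the severity item above a bit more concrete, here is a minimal Python sketch of how a CVSS-style base score is computed. It uses the weights published for CVSS version 2; that version postdates the period this report covers and is not necessarily what Symantec or FIRST used here, so treat it purely as an illustration of the style of calculation, with fixed constants doing most of the work.

    # Illustrative sketch of a CVSS-style base score, using the weights
    # published for CVSS version 2. Note how much of the result comes from
    # fixed constants rather than anything specific to your environment.
    ACCESS_VECTOR = {"local": 0.395, "adjacent": 0.646, "network": 1.0}
    ACCESS_COMPLEXITY = {"high": 0.35, "medium": 0.61, "low": 0.71}
    AUTHENTICATION = {"multiple": 0.45, "single": 0.56, "none": 0.704}
    IMPACT = {"none": 0.0, "partial": 0.275, "complete": 0.660}

    def cvss2_base_score(av, ac, au, conf, integ, avail):
        impact = 10.41 * (1 - (1 - IMPACT[conf])
                            * (1 - IMPACT[integ])
                            * (1 - IMPACT[avail]))
        exploitability = (20 * ACCESS_VECTOR[av]
                             * ACCESS_COMPLEXITY[ac]
                             * AUTHENTICATION[au])
        f_impact = 0.0 if impact == 0 else 1.176
        return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

    # A remotely exploitable, low-complexity, unauthenticated flaw allowing
    # partial compromise of confidentiality, integrity, and availability:
    print(cvss2_base_score("network", "low", "none",
                           "partial", "partial", "partial"))  # prints 7.5

Whether 7.5 on a ten-point scale tells you anything useful about a particular system is exactly the question raised above; the number is only as meaningful as the fixed weights behind it.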
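
The Solaris patch-time dispute comes down, in part, to how a handful of outliers can drag an average far away from the typical case. The following sketch uses entirely made-up turnaround times, not Sun’s or Symantec’s actual data, just to show the effect.

    # Hypothetical patch turnaround times in days -- NOT actual Sun or
    # Symantec data. Most issues are fixed quickly, but a couple of
    # third-party stragglers are enough to inflate the mean dramatically.
    from statistics import mean, median

    patch_times = [3, 4, 5, 5, 5, 6, 6, 7, 300, 420]

    print(mean(patch_times))    # 76.1 -- the kind of "average" a report cites
    print(median(patch_times))  # 5.5  -- closer to the typical experience

Neither figure is wrong; they simply answer different questions, which is one more reason to check the methodology behind any headline number.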

Interlude for an Apology:

I fear this may be the least useful of my analyses so far. With luck it will remain the worst, and the analyses that follow will be better. I must apologize, both for its lackluster quality and for its lateness; a number of factors unrelated to the task itself cut into the time I had available for it today.

Final Notes:

  • When accurate comparisons are necessary, it is always important to know who provided which figures, and to attempt to verify them against the methodologies used to gather them.
  • Vulnerabilities with zero day exploits are the most important vulnerabilities to track when determining the likely exposure of your systems based on the software you use.
  • Vulnerabilities discovered and disclosed do not, in and of themselves, constitute a reliable measure of the security of a given piece of software. High discovery rates may indicate malicious activity, or perhaps extremely diligent security testing by developers. Low discovery rates, by contrast, could just as easily result from piss-poor security testing by developers as from a vendor’s unwillingness to reveal discovered vulnerabilities and its ability to suppress disclosure.


This has been the third installment in my security analysis of the Symantec Internet Security Threat Report, volume XI. This is a series of daily posts collected under the SOB category Security. You may follow this series (and further security-specific posts) via RSS using the Security Category RSS Feed.

Next, I will continue my overview of Symantec’s “Executive Summary Highlights”, with specific attention on the “Malicious Code Trends Highlights”, in brief.

1 Comment

  1. […] In my immediately previous analysis I presented some coverage of the “Vulnerability Trends Highlights” of Symantec’s Internet Security Threat Report, volume XI. Today, I’ll present an analysis of conditions surrounding one statistic from the Malicious Code Trends Highlights from the Executive Summary Highlights section of Symantec’s report. I will be providing analysis of further content of the ISTR XI on a (hopefully) daily basis, until I decide I’m finished. […]

    Pingback by Chad Perrin: SOB » Security Analysis: Symantec ISTR XI (Malicious Code Trends Highlights, Part 1) — 6 April 2007 @ 10:25


All original content Copyright Chad Perrin: Distributed under the terms of the Open Works License