February 9, 2011

The “Magic” Vulnerability – Revised

What started this post is that I have recently received a number of calls and messages from clients and colleagues.  The conversations have all gone basically the same way: their ASV had failed their vulnerability scan because the scan detected an unsupported operating system, and they wondered whether I had encountered this before.

My first question was usually along the lines of: “So, what vulnerabilities did they detect?”

“None,” was the confused answer at the other end of the line.

“What?  They must have detected at least one high, severe or critical vulnerability.  That is the only way you can fail,” I would reply, now also confused.

“Nope.  Nothing.  Just the fact that the OS is unsupported,” I was told.

Do not get me wrong.  I am not advocating the use of unsupported operating systems, particularly unsupported versions of Windows.  The risk, of course, is that one or more vulnerabilities show up that the vendor will not fix because the OS is no longer supported.  So there is good reason to avoid this situation.  However, there are also situations where you simply have no other choice, either due to your own organization’s issues and politics or due to software vendor issues.

This situation got me thinking and doing some research, since I did not remember ever seeing or being told that an unsupported OS was an automatic vulnerability scan failure.  I no longer do external vulnerability scanning, so my recollection of training and working on the ASV side of our business is a bit fuzzy and very rusty.  However, I had never failed a client for an unsupported OS.  So when this issue came up, my first action was to find out what had changed.

The first thing I did was review the latest version of the PCI ASV Scanning Procedures, v1.1.  I searched for terms such as ‘old’, ‘unsupported’, ‘out of date’, ‘OS’ and ‘operating system’.  No matches.  So there is nothing in the ASV Scanning Procedures that fails an organization for running an unsupported OS.  Even the PCI DSS does not call out unsupported software, so procedurally, I was thinking there was nothing explicit about unsupported OSes causing a failed vulnerability scan.

So when I made the original posting, I got a comment from one of my readers pointing me to the ASV Program Guide.  Lo and behold, at the top of page 16 is the following:

“The ASV scan solution must be able to verify that the operating system is patched for these known exploits. The ASV scanning solution must also be able to determine the version of the operating system and whether it is an older version no longer supported by the vendor, in which case it must be marked as an automatic failure by the ASV.”

So there is no “magic” vulnerability I was missing as the PCI SSC does specify that a scan automatically fails if the OS is unsupported.

But that is not the entire story.  The key to this whole process is that the vulnerability scanner used must be able to identify the operating system.  While all vulnerability scanners attempt to identify the operating system, the reliability of that identification is suspect at best.  I am not aware of any vendor of security testing tools that claims its tool will identify an operating system 100% of the time.  This is because there are many, many things that can influence the OS signature that the tools cannot control, and those things can greatly affect the tool’s ability to identify the OS, particularly when talking about external scanning.  And if an organization follows OS security hardening guidelines, a lot of unsupported OSes will not be properly or reliably identified by vulnerability scanners.  As a result, I find it hard to believe that the PCI SSC intended to have ASVs rely only on the results of a vulnerability scanner, but that seems to be the case.
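To illustrate how thin the evidence can be, here is a minimal sketch of the kind of clue an external scanner often has to work with: a banner string such as the HTTP Server header.  The host name, the mapping rules and the return strings are all illustrative assumptions, not any scanner’s actual logic, and the point is how often the honest answer is “unidentified.”

```python
# Sketch: guessing an OS from an HTTP Server banner, the way an external
# scanner often must.  "example.test" is a placeholder host, not a real
# target, and the mapping below is illustrative, not any vendor's logic.
import http.client

def grab_server_banner(host, port=80, timeout=5):
    """Fetch the HTTP Server header, if the server sends one at all."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("HEAD", "/")
        banner = conn.getresponse().getheader("Server", "")
    finally:
        conn.close()
    return banner

def guess_os(banner):
    """Naive banner-to-OS mapping.  A WAF, load balancer or hardened
    configuration in front of the server makes this guess wrong or
    empty -- which is exactly the reliability problem described above."""
    if "IIS" in banner:
        return "Windows (version unknown)"
    if "Unix" in banner or "Ubuntu" in banner or "CentOS" in banner:
        return "Some Unix/Linux (version unknown)"
    return "Unidentified"
```

For example, `guess_os(grab_server_banner("example.test"))` returns “Unidentified” whenever the banner is suppressed or rewritten, and even a match like “Microsoft-IIS” says nothing about whether the underlying Windows version is supported.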

So with this clarification, I contacted our ASV personnel, and they told me that they too have been failing vulnerability scans when they run across unsupported operating systems.  I asked whether, if the OS signature is inconclusive, there is no failure.  Yes: if the scan comes back and does not identify the OS, then they have nothing to go on to fail the scan and the scan passes.  Given the difficulties vulnerability scanners can have identifying the target operating system, such as when scanning through network firewalls, Web application firewalls, load balancers and the like, I then asked whether they feel these identifications are reliable enough to fail a scan.  I was told this is why they confirm the information with the client before issuing the report, so that the report is accurate.  So if a client is not honest, they could influence the results of their scan?  I was reluctantly told that is probably true.

Then there is the issue that not all operating systems are created equal.  Operating systems such as MVS, VMS and MCP are nowhere near as risky, if they are even risky to begin with, as Windows and Linux.  A lot of ASVs would argue that they never come across these operating systems running Web services.  However, all of them are capable of running Web services, and I personally know of a number of organizations that run their Web services from such environments.  Organizations run these older operating systems mostly because of the financial considerations of migrating to something else.  However, I can guarantee that none of the dozens of vulnerability scanners I have used in the last 10 years would accurately identify any of these operating systems, let alone tell you the version, unless some service message header information was retrieved by the tools.  And even then, most tools do not parse the header to determine the OS, so it would take human intervention to make that determination.

Regardless of the failure, most ASVs do have a review or appeal process that allows organizations to dispute findings and to submit compensating controls to address any failures.  So for organizations that cannot get rid of unsupported OSes, they can use a compensating control.  Like compensating controls for the PCI DSS, the organization is responsible for writing the compensating control and the ASV needs to assess the compensating control to ensure that it will address the issues identified by the vulnerability scan.

So, if you can fail an organization over an unsupported OS, why is it that you do not automatically fail on unsupported application software?  I went through the Program Guide, and there are all sorts of other criteria for applications but nothing about what to do if they too are unsupported.  Applications such as IBM WebSphere and Oracle Commerce can become unsupported just as easily as their OS brethren.  And in my experience, use of unsupported application software is even more prevalent than unsupported OSes, under the idea that if it is not broken and does not have vulnerabilities, why upgrade?  When I asked our ASV group if they fail organizations on unsupported applications, I got silence and then the response that they will fail an application if the vulnerability scanner reports a high, severe or critical vulnerability.  To tell you the truth, while vulnerability scanners regularly return text header information for a lot of applications, I would be hard pressed, without doing a lot of research, to find out whether the version being reported was unsupported.  However, scanners could provide this feedback if they were programmed to do so.
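The programming involved would not be exotic.  Here is a sketch of how a scanner could check the banner it already collects against an end-of-life table; the products, version patterns and dates in the table are illustrative placeholders, not an authoritative support calendar.

```python
# Sketch: flagging an unsupported application version from the Server
# header a scanner already collects.  The EOL dates below are illustrative
# placeholders, not authoritative vendor support dates.
import re
from datetime import date

EOL_TABLE = {
    # (product, major.minor) -> hypothetical end-of-support date
    ("Apache", "1.3"): date(2010, 2, 3),
    ("Microsoft-IIS", "5.0"): date(2010, 7, 13),
}

def is_unsupported(server_header, today=None):
    """Parse a Server header like 'Apache/1.3.41 (Unix)' and report
    whether the product/version pair is past its end-of-support date."""
    today = today or date.today()
    match = re.match(r"([\w.-]+)/(\d+\.\d+)", server_header)
    if not match:
        return False  # cannot tell -- the honest answer for many banners
    product, version = match.groups()
    eol = EOL_TABLE.get((product, version))
    return eol is not None and today > eol
```

Note that the default is “not unsupported” when the banner cannot be parsed, which mirrors the ASV practice described earlier: no identification, no failure.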

Then there are all of the conspiracy theories out there that say the PCI SSC and technology companies are working together to drive software sales by forcing organizations to upgrade, and there is plenty of anecdotal evidence that seems to support this argument.  In reality, it is not that software companies are colluding with regulators such as the PCI SSC so much as that software companies operate this way in order to focus development and support resources on fewer, more current versions.  It is just happenstance that regulations then cause organizations to have to update their software.

The bottom line in all of this is that you have options to avoid failing a vulnerability scan because of an unsupported OS.  The best method, and the one I most recommend, is to not use unsupported operating systems in the first place.  However, as a former CIO, I understand the real world and the issues IT departments face.  As a result, I recommend all of the following, which may or may not require you to develop a compensating control.

  • Implement not only a network firewall, but also a Web application firewall (WAF) and make sure that the rules are extremely restrictive for servers running unsupported operating systems.
  • Configure your firewalls and servers to suppress any OS signature information.  Masking the OS signature provides the benefit of not advertising to the world that the OS running the application is unsupported.  This is not a perfect solution as, nine times out of ten, the application itself will advertise the fact that the underlying OS is unsupported.  It is very important to note that this is only a stopgap measure, and you should still be actively migrating to a supported OS.
  • Implement real-time monitoring of firewalls, servers and applications.  Define very specific alerting criteria to ensure that any suspicious activity is immediately reported and operations personnel immediately follow up on any alerts to determine whether they are a false positive.
  • Implement a host-based intrusion detection/prevention solution on any servers that run the unsupported OS.  If using a HIPS solution, you may also want to consider using its preventative capabilities for certain critical incidents.
  • Implement real-time log analysis for firewall, servers and applications.  Define very specific alerting criteria to ensure that any suspicious activity is immediately reported and operations personnel immediately follow up on any alerts to determine whether they are a false positive.
  • Actively use your incident response procedures to address any incidents that are identified with any unsupported OS.
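The real-time monitoring and log-analysis bullets above come down to one idea: very specific alert patterns against a noisy log stream.  A trivial sketch, with made-up patterns and a `notify` stub standing in for whatever paging mechanism you actually use:

```python
# Sketch of the alerting idea in the list above: match log lines against
# a short list of very specific patterns and alert only on those, so the
# unsupported host gets extra scrutiny without drowning operations in
# noise.  Patterns and the notify callback are illustrative, not any
# monitoring product's configuration.
import re

ALERT_PATTERNS = [
    re.compile(r"Failed password for (?:invalid user )?\S+"),
    re.compile(r"segfault at [0-9a-f]+"),              # possible exploit attempt
    re.compile(r"POST /admin\S* HTTP/1\.[01]\" 200"),  # unexpected admin success
]

def scan_log_lines(lines, notify):
    """Call notify(line) for each line matching an alert pattern;
    return how many alerts fired."""
    hits = 0
    for line in lines:
        if any(p.search(line) for p in ALERT_PATTERNS):
            notify(line)
            hits += 1
    return hits
```

The design point is the tight pattern list: alerting on everything trains operations personnel to ignore alerts, which defeats the whole compensating control.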

9 Responses to “The “Magic” Vulnerability – Revised”


  1. Josh
    February 9, 2011 at 4:21 PM

    This started popping up when Nessus implemented plugin ID 47709 in July of 2010, “Microsoft Windows 2000 Unsupported Installation Detection.” It reports a CVSS Base Score of 10/Critical, so for most ASVs that’s going to trigger the failure.

  2. February 9, 2011 at 1:28 PM

    We ran into this back in June/July last year and I blogged about it last July: http://paymenttidbits.blogspot.com/2010/07/au-contraire-latest-pci-gotcha-from-pci.html

    I did receive a response from the PCI SSC, and they informed me that ASVs should fail the scan and it’s up to the QSAs to determine if sufficient compensating controls are in place to alleviate any risks associated with the obsolete O/S. To me, this is a significantly vague answer that will bite merchants in the rears, and it totally ignores the merchants that use SAQs.

  3. JJ
    February 9, 2011 at 9:39 AM

    It’s kind of a Catch-22 situation. Microsoft and other vendors announce that they no longer test unsupported systems for vulnerabilities. So if a high, critical or severe vulnerability test needs a slight tweak to work on Red Hat Enterprise Server 3 but then it works well, is that an issue for the protection of card data? The fact that a stock vulnerability scanner configuration will not test for it doesn’t mean it doesn’t exist and cannot be exploited.

    If companies want to run unsupported systems, (unsupported by the original vendor, that is,) I think they need a compensating control saying they are purchasing maintenance from another company, they have installed other systems to protect against the vulnerability or something else.

    Otherwise not performing system upgrades becomes a way out. A former multi-national employer of mine (> 1 billion in annual revenue) is just now switching off their NT 4 domain and their Exchange 5.5 servers. These systems went unsupported half a decade ago. Yet every year their “big 3” US auditing firm passes them on Sarbanes-Oxley even though the primary authentication mechanism in NT 4 is irrevocably broken. I asked a few of their auditors how they could possibly do this, and their reply was that the control in place was to apply vendor patches; if the vendor hadn’t issued any patches, then the control passed. The only reason they had to migrate is that their anti-virus systems, as old as they are, no longer have definition updates available, and that failed them.

  4. T. Anne
    February 9, 2011 at 8:35 AM

    So what are people supposed to do if the ASV fails them? I’m surprised they can do it in the first place if the whole point of the scan is to show compliance/minimum security based on the requirements of the PCI DSS… I should think the ASV making up their own rules shouldn’t be able to change a merchant’s compliance status for an unrequired, made-up requirement.

  5. Dr. Strangelove
    February 9, 2011 at 8:08 AM

    Dear PCI Guru,

    please be aware that the current Program Guide in force to define ASV scanning practices specifies (page 16, under “Operating Systems”) that “…The ASV scanning solution must also be able to determine the version of the operating system and whether it is an older version no longer supported by the vendor, in which case it must be marked as an automatic failure by the ASV” (verbatim).

    From Italy, with l0ve

    Dr. Strangelove

    • February 9, 2011 at 10:48 AM

      I was looking only at the ASV Scanning Procedures v1.1, not the ASV Program Guide, and you are correct. I stand corrected. However, I would argue that this is a rather narrow view of IT life and goes to show that the people writing the standard have not necessarily operated and/or administered servers and a network at all, or in quite a while. The real world is very different from the theoretical world.

      • Kat V.
        February 9, 2011 at 1:46 PM

        In the real world, things like forethought and properly configured networks/firewalls/ACLs/VLANs/etc exist to segregate your NT/Win2k box from your card data.

  6. Luis E. Rodriguez
    February 9, 2011 at 8:05 AM

    I am not surprised by the attitude of these ASVs; however, I am disappointed to see that they rely only on the functionality provided by the automated tools they are accustomed to using. Let’s say they find an unsupported OS; they should then fall back to, let’s say, manual procedures to work on this “unsupported” system, which more likely than not is kept as-is because of application needs and several other reasonable business reasons. Here is where the technical people in the business have to be smart enough to reasonably pressure the ASVs and get them to work on the issue.

    Hope to hear more on this topic.






