December 26, 2014

PCI Compliance Is Getting More Rigorous

When Visa and MasterCard trotted out their security standards back in 2002 and 2003, the large eCommerce merchants that got to see them complained that they were too much.  Fast forward more than a decade and we still hear complaints that the PCI standards are too much.  Well, if you are still complaining, things are about to get worse with version 3.  And the ever more consistent rumor is that business as usual (BAU) will be coming in v4.  If that comes to pass, I know some people who will likely jump out of windows as they did in the 1929 stock market crash.

So how is the PCI DSS getting more rigorous?

I spent some time analyzing the PCI DSS v3 as I did with v2.  Comparing v3 to v2, here are some of my findings.

  • There is an overall 11% increase in the number of tests in v3 versus v2.
  • Tests requiring some form of documentation have increased a whopping 83%. Not that 83% more documents will be required, just that there are 83% more tests where documentation is reviewed.  I will have more on this later in the post.
  • The number of tests requiring interviews is up 48%. Again, not necessarily involving more people, just more questions to be asked and answered.
  • Tests requiring an observation of a process or activity are up 31%. As with the others, this is not a wholesale jump in new observations, but more an increase in things that must be observed.
  • Tests involving sampling are up 33%. This actually is an increase in the number of things sampled, but not all of the 33% increase represents new samples.  Much of the increase is the result of the Council clarifying that QSAs must explain what was sampled, something that was implied in v2 but not explicitly requested.

Speaking of sampling, not only is the number of tests involving sampling increasing, but the PCI SSC has told all of the QSAs that the days of “poor” or “inappropriate” sampling are over.  I have seen Reports On Compliance where QSAs have literally used a sample of one out of thousands under the rationale of “they are all configured the same”.  If you only tested one, how can you even draw the conclusion that the remaining thousands truly are the same?  You cannot, and that is a big reason why the Council is getting picky about sampling.

The Council is also tired of incomplete samples.  The example most often quoted is an environment of 100 servers, half Windows-based and half Red Hat Linux.  A lot of QSAs were stopping there, sampling say five of each, and calling their work complete.  Wrong!

What the Council is pointing out is that the QSA must go deeper in some cases when choosing their samples.  In the example above, the QSA needs to know the function of those servers so that they sample them based on their function such as database server, directory server, application server, etc.  In addition, the Council is also saying that it may be necessary to consider the applications involved as well to ensure that sampling provides a more complete picture of the environment.  In an assessment involving multiple applications, it might be necessary to sample database and application servers used by each application and not just a random sample of servers.
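
The sampling approach the Council is describing is essentially stratified sampling: group the population by the attributes that matter (operating system, server function, application served) and sample from every group.  As a minimal sketch, assuming a hypothetical server inventory with `os` and `function` fields (the field names and rates are illustrative, not anything the Council prescribes):

```python
import random
from collections import defaultdict

def stratified_sample(servers, rate=0.1, minimum=2, seed=None):
    """Pick servers from every (os, function) stratum rather than
    taking one flat random sample across the whole population."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in servers:
        strata[(s["os"], s["function"])].append(s)
    sample = []
    for group in strata.values():
        k = max(minimum, round(len(group) * rate))
        k = min(k, len(group))  # cannot pick more than the stratum holds
        sample.extend(rng.sample(group, k))
    return sample

# Hypothetical inventory: 100 servers, two OSes, two functions.
inventory = (
    [{"name": f"win-db-{i}", "os": "Windows", "function": "database"} for i in range(25)]
    + [{"name": f"win-app-{i}", "os": "Windows", "function": "application"} for i in range(25)]
    + [{"name": f"rh-db-{i}", "os": "Red Hat", "function": "database"} for i in range(25)]
    + [{"name": f"rh-app-{i}", "os": "Red Hat", "function": "application"} for i in range(25)]
)

picked = stratified_sample(inventory, rate=0.1, minimum=2, seed=42)
```

Unlike the "five Windows, five Linux" shortcut, every OS/function combination is guaranteed to appear in the sample, which is the completeness the Council is after.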

Finally, sample sizes might be higher for an entity’s first assessment or for the first assessment by a QSA following a prior QSA.  The reason is that a higher sample size is warranted because things might not be as they are represented, and minimal sampling would likely not reveal any issues.  This is common in the financial audit industry in situations where a new auditor is coming into the organization or the operations of the organization have been under increased scrutiny by regulators, banks or prior auditors.

I stated earlier that documentation testing was up 83% and that this was related to more testing of the same documents already being collected.  That is not to say that the amount of documentation is not increasing.  Regarding the amount of documentation required for v3 versus v2, I am estimating a conservative increase of around 100%.  I have been hearing horror stories regarding the amount of documentation being requested for v3.  I would not be shocked if the amount of documentation a QSA requires is up by 150% to 200% in some instances, particularly in situations where the QSA was not collecting all of the relevant documentation they should have been.  A lot of this increase is because document counts now include observations, which were counted separately in v2.

Based on this information, you should not be shocked if your QSAC increases the fees they charge for assessing your PCI compliance under v3.  Someone has to conduct all of those tests and review all of the extra documentation generated.  Even QSACs that have been doing the right thing all along are seeing impacts from the increases in testing required by v3.  But it has definitely been worse for those QSACs that were doing as little as possible to get an assessment done.  They are seeing the most impact from these changes, will likely find them highly onerous, and will have difficulty justifying the huge increases in professional fees required to cover their higher costs.  As a result, I would not be surprised if a number of QSACs stop doing PCI assessments because of the new requirements put on them.

But why are the changes occurring?

The primary reason is to minimize the “wiggle room” QSAs have in their testing so that assessments from one QSA to another are more consistent.  Some flexibility has to be given to a QSA because no two organizations are alike.  In addition, what is compliant to one QSA can be non-compliant to another, even within the same QSAC.  That occurs because every individual has their own sense of risk acceptance and avoidance.  This issue can usually be taken out of the equation through discussion of the issue with the QSA and their superiors and, if necessary, development of mitigation strategies.

Under v2, a QSA that had a high risk tolerance could deem an organization compliant when the evidence would indicate that the organization is not compliant.  Or a QSA with a low risk tolerance could say one or more requirements are not in place in the same situation.  The new Reporting Template is an attempt to take the extremes out and reduce the wide swings in what is and is not compliant.  However, the new version of the PCI DSS does still allow some wiggle room for QSA/ISA judgment.

In addition to taking extremes in risk acceptance out of the assessment process, the Council is also trying to address the issue of QSAs judging organizations as PCI compliant when the QSA’s documentation does not support such a claim.  While the majority of QSAs thought this issue was addressed with the Reporting Instructions in v2, based on what the Council is telling us, it apparently was not.  So the Council is getting stricter and stricter in their guidance as to what is acceptable, through the language in the Reporting Template/Instructions as well as through their QSA training.

Another reason for the rigor is the breaches that keep occurring.  Each breach supplies information that might need to be incorporated into the PCI DSS.  One of the best examples of this is requirement 8.5.1:

“Service providers with remote access to customer premises (for example, for support of POS systems or servers) must use a unique authentication credential (such as a password/phrase) for each customer.”

This new requirement is in response to the significant number of breaches where the attacker gained access to a merchant’s cardholder data by knowing the remote access credentials of a vendor that is supporting the merchant such as those vendors that support point of sale (POS) solutions or card transaction processing.
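
The spirit of 8.5.1 can even be checked mechanically by a service provider: no remote-access credential should ever map to more than one customer.  A minimal sketch, assuming a hypothetical inventory mapping each customer to the credential identifier used to reach them (names and structure are illustrative, not from the standard):

```python
from collections import defaultdict

# Hypothetical inventory: customer -> remote-access credential identifier.
remote_access = {
    "merchant-a": "cred-001",
    "merchant-b": "cred-002",
    "merchant-c": "cred-001",  # reused credential: violates the intent of 8.5.1
}

def shared_credentials(inventory):
    """Return any credential used for more than one customer."""
    users = defaultdict(list)
    for customer, cred in inventory.items():
        users[cred].append(customer)
    return {cred: custs for cred, custs in users.items() if len(custs) > 1}

violations = shared_credentials(remote_access)
```

An empty result means each customer has a unique credential, so a credential stolen from one customer’s environment cannot be replayed against the provider’s entire customer base.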

Finally, the changes are also an attempt to circumvent some of the “legal” arguments that occur between the QSA and their client.  I am not the only QSA that has encountered clients that come up with very legal-like arguments and interpretations of what a particular test requires.  As a result, the Council has attempted to use wording in the tests and related testing guidance that reduces or even eliminates such interpretation arguments.  However, in my experience, clients that take this “legal” approach to their assessment are not going to stop.  They are not interested in security, they are interested in “checking a box”.  But the Council does no one any favors by only allowing QSAs and ISAs to read and have copies of the Reporting Template/Instructions until the client goes through their first PCI assessment under the new testing.  The Reporting Template should be a public document, not one that only QSAs and ISAs have access to.

6 Responses to “PCI Compliance Is Getting More Rigorous”


  1. anonymous sales rep for QSAC
    January 7, 2015 at 8:20 PM

    I am a sales rep for a large QSA company. While we would be able to get a 2.0 assessment done for as little as 60 hours, we are now looking at a minimum of 120.

    Customers are livid.

    Somehow one of my clients got a competitive quote for a Gap *and* a ROC, 3.0, for 15k. We were over double that for only the ROC. I’m going to warn him to be skeptical about that. Doesn’t seem possible.

    • January 8, 2015 at 5:24 AM

      As I have said many times before, Kurt Vonnegut said it best in Cat’s Cradle, “In this world, you get what you pay for.”

Some of these one- to few-person QSACs are not bad firms. Unfortunately, that is the exception, not the rule. Most small QSACs have very questionable staff and provide questionable services and advice to their clients. Unfortunately, they will continue providing services until they go out of business or one of their clients is breached and goes out of business.

Anyone can become a QSA/QSAC as long as they have the up-front cost ($5,000 USD) and pay their renewal ($2,500 USD) and training ($1,250 USD) fees and pass the examination. That is small change to get into the business and stay there. As a result, there are a lot of very small practitioners that we all compete against. Eventually they will get winnowed out, one way or another, but until then, they are competitors and a bane to our existence.

  2. Stephen Ames
    December 30, 2014 at 6:34 PM

    “Another reason for the rigor is the breaches that keep occurring. Each breach supplies information that might need to be incorporated into the PCI DSS.”

    The rigor in v3 will do nothing to reduce security breaches and resultant theft of cardholder data and other PII.

    “But the Council does no one any favors by only allowing QSAs and ISAs to read and have copies of the Reporting Template/Instructions until the client goes through their first PCI assessment under the new testing.”

Indeed! I tried to use the ROC Reporting Template for v3.0 several months back and it is not a form-fillable PDF. This is at the top of my list for the open mic next September. Why even put it out there if I can’t use it? The only other tools for non-ISA/QSA individuals are the SAQs, which are nothing but checkboxes with no free space to document test results. To me, the PCI Council is perpetuating a checkbox mentality.

    • January 1, 2015 at 6:55 AM

      Breaches are always going to occur. The goal of the PCI standards is not to completely stop them, but to reduce their impact. No different than what technology has done for bank and art museum robberies.

The ROC template is now available to the public as a PDF. QSAs and ISAs can get the Word version, which should also be made public.

  3. shift4sms
    December 30, 2014 at 1:00 PM

You mentioned: “As a result, I would not be surprised if a number of QSACs stop doing PCI assessments because of the new requirements put on them.” IMHO, unless something changes with the card brands and breach liability, the same will be happening with merchants – they will be dropping traditional card acceptance and instead start embracing alternative payments. As merchants are painfully finding out, security is not 100%, and any and all monies spent on PCI mean absolutely nothing if and when the merchant is breached. Unfortunately, I think the card brands’ acceptance costs combined with PCI costs are getting very close to the line where the costs (related to PCI and potential breach liability) outweigh the benefits of accepting traditional payments. Now the card brands could help if they figured out a way to protect “certified compliant” merchants from breach liability costs while still keeping merchants focused on security.

    • December 30, 2014 at 2:37 PM

With the implementation of end-to-end (E2EE) and point-to-point (P2PE) encryption solutions, where the sensitive authentication data (SAD) is encrypted at the swipe/dip, only the point of interaction (POI) is at risk for the merchant. For small merchants, that basically takes them almost totally out of the risk loop as long as they ensure the security of the POI. Mid-sized and large merchants will have to secure the POI as well as ensure that updates to the POI come from the correct source, as a number of these merchants perform their own POI updates or at least control when the POI is updated.

      As a result, the vast majority of the risk ends up with the transaction processors, acquiring banks and the card brands.



Welcome to the PCI Guru blog. The PCI Guru reserves the right to censor comments as they see fit. Sales people beware! This is not a place to push your goods and services.


