Archive for the 'PCI P2PE' Category

25 Nov 16

The Council’s Take On Non-Listed Encryption Solutions

On Monday, November 21, the PCI SSC posted a blog entry discussing their new Information Supplement titled ‘Assessment Guidance for Non-listed Encryption Solutions’.  After reading their post, I had a few comments of my own.

Mike Thompson, chair of the P2PE Working Group, states that:

“We are encouraged by the significant growth of the PCI P2PE Program in the last two years and the increasing number of PCI P2PE Solutions listed on our website.”

Yes, you have gotten up to 23 listed solutions, but you are still missing First Data TransArmor, Shift4 True P2PE and Verifone VeriShield, which probably comprise the vast majority of E2EE solutions used by merchants.  And most of the solutions that are validated were validated to v1.x of the standard, not the latest version.  Yes, vendors are moving over to v2.x, but only slowly.  Some of that is due to the pace at which they can get through the Council’s QA process.  But probably the larger reason is that the original cost of getting validated (large) versus what that validation was actually worth in sales (small) has made them question the value of getting revalidated.

“The Council recognizes this creates a challenge for Qualified Security Assessors (QSA) in how to complete PCI DSS assessments for these merchants and that guidance is needed.”

It creates a challenge?  There has been a documented and agreed-upon approach for E2EE solutions in place for years.  If QSAs are unaware of that approach, it is only because the Council has neglected to explain it in their training.  The fact that the Council now believes guidance is needed is therefore a problem of the Council’s own making.

That said, the guidance the Council is providing in the Information Supplement is in the best interests of the Council because it effectively recommends the solution be P2PE assessed by a P2PE QSA.

It means a few more P2PE QSAs will be needed, but not a significant increase, because there really are not that many E2EE solutions out there to drive the training of masses of P2PE QSAs like we have with PCI QSAs.  That is before considering that most solution vendors will likely ignore this recommendation unless the card brands force the issue.

But better yet, if a solution vendor has to effectively go through a P2PE assessment, why not just pay the money and have the solution listed on the Council’s Web site?  What better way to drive revenue for a standard that has attracted only a few providers because its assessment process is just as onerous and costly as that of the PA-DSS, which is also in trouble.

Never mind the fact that getting through the Council’s QA process has been a tremendous nightmare.  Most P2PE QSAs equate the QA process to the PA-DSS QA process which has become a huge problem for payment application providers.  Since the PCI SSC is legally on the hook for validated solutions listed on their Web site, the Council is going to be extremely diligent in their review of all validated solutions.

In the end, E2EE providers are not convinced that going through the process is worth the initial and ongoing effort and costs.  They are still selling their solutions without validation in higher volumes than those vendors that have gone through the P2PE validation process.  And those vendors that have been through the validation process are questioning the value of the process since it has not resulted in high sales volumes.

“PCI P2PE Solutions provide the strongest protection for payment card data and simplify PCI DSS compliance efforts.”

I have to say that this is the most hilarious statement made in this post.  There are a number of P2PE validated solutions that allow the use of 168-bit triple DES (3DES) as the encryption algorithm to protect data.  While 3DES is still considered “strong” by the US National Institute of Standards and Technology (NIST), it only barely qualifies.  NIST has been advising organizations for years to migrate away from 3DES: single DES was broken long ago, two-key 3DES provides far less than its nominal strength, and even three-key “168-bit” 3DES has an effective strength of only 112 bits.  NIST renewed those warnings in 2015, and research published in 2016 (the Sweet32 attack) demonstrated practical attacks exploiting 3DES’s small 64-bit block size.

E2EE solutions being used these days rely on the Advanced Encryption Standard (AES), which is much stronger than 3DES and has yet to be broken in any of its variants.
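To put that difference in perspective, here is a small sketch.  The effective-strength figures follow NIST SP 800-57 guidance (the 3DES numbers reflect meet-in-the-middle attacks, not nominal key length), and every additional bit of effective strength doubles the brute-force work factor:

```python
# Approximate effective key strengths in bits, per NIST SP 800-57 guidance.
# Note that three-key 3DES's effective strength is 112 bits, not its
# nominal 168, because of meet-in-the-middle attacks.
effective_strength_bits = {
    "DES": 56,           # broken: exhaustible on commodity hardware
    "2-key 3DES": 80,    # deprecated by NIST
    "3-key 3DES": 112,   # the "168-bit" 3DES some P2PE solutions allow
    "AES-128": 128,
    "AES-256": 256,
}

# Each extra bit doubles the brute-force work factor, so even the weakest
# AES variant has a keyspace 2**(128 - 112) times larger than the
# strongest 3DES variant.
advantage = 2 ** (effective_strength_bits["AES-128"]
                  - effective_strength_bits["3-key 3DES"])
print(advantage)  # 65536
```

In other words, the gap between the strongest 3DES and the weakest AES is a factor of 65,536 in brute-force effort, before even considering 3DES’s block-size weaknesses.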

“We want to make it easier for assessors, acquirers, and merchants to get the information they need to make decisions about risk and PCI DSS responsibilities when using non-listed account data encryption solutions.”

As I said earlier, there has been a process in place for years for handling such solutions.  It involves conducting a review of the implementation of the E2EE solution and ensuring that it is implemented properly.  The results of that assessment are then submitted to the acquiring bank for its approval of the scope reduction.

In the vast majority of cases, the acquiring bank or a subsidiary of the bank is also the provider of the solution, so the QSA is just ensuring that the merchant implemented the solution properly and the bank signs off on the reduction in scope.

However, on some occasions the QSA must go through a somewhat more rigorous process to prove that the solution does in fact encrypt the data and that the data stream cannot be decrypted anywhere but at the payment processor or gateway.  While this can take a bit more time, it typically is not as time consuming as the Council makes it out to be.  In every case, the processor or gateway has recommended the vendors involved, so the process is straightforward and easily accomplished, and it is only the acquiring bank that would have questions or concerns.

It is not that the P2PE approach is a bad thing.  It is just that the Council overreached when they created it.  The original process was messy, complex and non-modular, and did not allow large merchants to continue their operations as they existed.  As a result, it was not seen as necessary by the stakeholders of the standard.  Without their support, there was little reason for adoption.  And as it turned out, the existing E2EE solutions in the marketplace dominated it without validation.

At the end of the day, the Council is trying to force E2EE solution vendors to validate their solutions to the P2PE standard and make that standard relevant.  However without the force of the card brands and banks behind it, the P2PE standard will continue to be dead on arrival.

The good news is that this is only an Information Supplement, so it is binding only if merchants and solution vendors choose to follow it.  Based on the prevalence of E2EE solution implementations, I would expect that things will continue to go “as is”.

UPDATE: On Tuesday, December 6, 2016, the Council issued an FAQ on this subject as well as announced a Webinar for Thursday, December 15, at 11AM ET to give QSAs and ISAs an update on this topic. However in reading the FAQ, it still appears that the whole purpose of this Information Supplement is just to drive vendors to validate their solutions to P2PE since the recommendation is to have a P2PE-QSA validate the vendor’s solution to the P2PE standard(s) and then issue some sort of report for the merchants to use.

20 Nov 16

Revenue Generation Or Payment Security?

Late on Friday, November 18, the PCI Security Standards Council issued a draft Information Supplement titled ‘Assessment Guidance for Non-Listed Encryption Solutions’.  For those of you that follow my blog, these solutions would be what I refer to as end-to-end encryption (E2EE) solutions.  This is a draft document, but I would bet there will be a lot of discussion regarding it.  The good news is that it is a draft and an Information Supplement, so it is not yet official and is only offering a suggestion of how organizations should proceed.

The biggest recommendation that comes from this Information Supplement is the one that will cause the most heartburn and the most discussion.  The Council is recommending that a P2PE QSA assess a vendor’s E2EE solution and issue a non-listed encryption solution assessment (NESA).  As you read further into the document, the NESA is just a different name for a P2PE assessment.  So essentially, what the Council is recommending is a P2PE assessment without the Council’s QA review and without the solution being listed on the Council’s Web site.

All I can think of is that the Council is taking this approach so that First Data, Verifone and others will be forced to get their E2EE solutions P2PE validated.  After all, if you have to go through a P2PE assessment to allow merchants to use your solution, why stop there?  Why not just get it validated and listed on the Web site?

But the next thing that is troublesome is the implication that regular QSAs are not capable of adequately assessing an E2EE solution.  That somehow the mystical P2PE QSA training process imbues some sort of encryption omnipotence on those that attend and pass the test.  If you have ever looked at the P2PE Report On Validation (ROV), you know that most QSAs could easily execute it.

But I think the real reason behind this Information Supplement is revenue.  The Council is driving revenue to their bottom line with these recommendations.  There will likely have to be more P2PE QSAs and those non-listed solutions will likely end up as P2PE validated.  All of those activities generate revenue for the Council.  Revenue that is needed since the card brands have limited their funding of the Council.

Another big reason to believe this is just a revenue generator for the Council is the fact that, unlike a lot of other Information Supplements, this one was not developed by a committee of card brands, Participating Organizations, QSAs or other stakeholders.  In the 14 pages that comprise this Information Supplement, there is no page that lists any outside contributors.

So other than the Council, who could be driving this Information Supplement?

The acquiring banks?  I just completed an assessment of a merchant using an E2EE solution recommended to the merchant by their acquiring bank.  The acquiring bank is a major player in the payment processing industry, so you would assume they would have pointed me to the P2PE ROV for the testing of the E2EE solution, but they did not.

First Data, TrustCommerce and Verifone have never pointed me to the P2PE ROV for assessing their E2EE solutions.  So the payment processors are not demanding this sort of assessment.

One would think that the card brands would have each issued a press release announcing this draft, but they did not.

That only leaves us with a unilateral decision made by the Council that this was necessary.

But the real question is, how does this Information Supplement improve the security of the payment process?

Have there been a huge number of E2EE solutions that have been breached and this is a response?  I have not heard of any nor have I seen anything in the media indicating that E2EE solutions are a problem.

Are there “fly by night” vendors of E2EE solutions running rampant in the industry?  Not that I have encountered but it would not surprise me if there were a few.  That said, the merchants I have worked with in implementing E2EE solutions only worked with vendors recommended by their acquiring bank, payment processor or payment gateway.  In most of these cases, the solutions were from First Data and Verifone who are widely trusted in the industry.

I suppose this could be a proactive step to get ahead of things getting out of control with E2EE solutions.  But if that were the case, one would think that the card brands and acquiring banks would have been on board and pushing this effort as well as the Council and explaining that they were being proactive.  Nothing on that front either.

That leaves us with the only purpose of this Information Supplement is to generate revenue for the Council at the expense of merchants, E2EE vendors and ultimately consumers.

The P2PE standard has been a big flop in the industry because, surprise, surprise, it is doing nothing to help the industry.  If it had been adopted by the big players such as First Data and Verifone, then we would probably be in a different place.  But there is a reason those big players and others never got on board: the standard is too cumbersome, time consuming and onerous, just like the now failing PA-DSS process.

Do not get me wrong, every organization has to make money to subsidize its existence.  But I am troubled that the Council now appears to be generating requirements for the purposes of revenue generation rather than the securing of the payment process.

It appears that we have turned a corner and that it may not be a good corner to have turned.

30 Sep 16

2016 North American PCI Community Meeting

It was a hectic week out in Las Vegas at the Community Meeting this year.  I wish I had more time this year to just hang out with everyone, but I was in the middle of a number of assessments that needed to get done, so I was working at night and attending sessions during the day.

By the time you read this, the slide decks from the sessions will have been posted on the Council’s Web site.  So all of you that attended will be able to download those presentations.  You go to the link provided in the program guide, provide your name, organization name, email address and the password from the program guide (ve4eqepR) and you are in.

The Council tried the 20 minute “TED Talk” format again with the Wednesday sessions.  A number of the sessions I attended could have easily used an extra 10 minutes, if not a complete hour.  I know the Council is trying to move things along and get a lot of information covered, but topics like “the cloud” or EMV standards just cannot be properly discussed in 20 minutes, no matter how good the speaker or how organized the presentation.

Here are some of the more notable highlights.

The Assessor Session Is Back

Possibly the most anticipated session of the Community Meeting this year was the return of the Assessor Session after a two-year absence.  But unlike previous years, where this session occurred before the start of the Community Meeting, the returning Assessor Session was moved to the end of the Community Meeting.  I heard a number of complaints throughout the week from assessors about being at the end of the meeting.  Yet when Thursday lunch came around, there were a lot of QSAs, ISAs and ASVs that adjusted their travel schedules (Guru included) to attend this session.

While I originally agreed with people that moving the Assessor Session to the end was not a good idea, the more I have thought about it, the more I think it was better at the end.  That way, assessors can get questions about topics that come up during the meeting answered while we are all together.  I know we all want to get home, but I think the Assessor Session offers more value when it is at the end.

On the not so good side, the Council chose to use up an hour and 10 minutes to present a variety of topics, some of which took way too long to discuss.  But the larger question was why this material was not presented during the main conference.  Not only did all of the meeting attendees miss out, but there were people that did not get their questions asked.  I am also sure that running long discouraged a lot of people from asking questions as well.

That said, there were a number of good questions asked during this session and the Council rewarded five people with large PCI SSC coffee mugs for their “good” questions.

One question though really created a stir.  I will address that question regarding multi-factor authentication (MFA) as a separate post to be published later.  However I will say this about this discussion.  The Council really needs to go back and re-think their position on MFA if what they said is accurate.

The Council was asked about SAQ A and where it is headed.  The concern in the assessor community is that the mechanism that issues/controls the iFrame/redirect needs protection.  However the changes to SAQ A for v3.2 did not seem to address this obvious risk.  Based on how the question was answered, I am guessing that the hosting community is trying to keep SAQ A as simple and easy as possible regardless of the risk.

Another area that the Council agreed to review was the change to requirement 3.2 in the ROC Reporting Template.  In v3.2 of the template you can no longer mark those requirements as Not Applicable; however, it was pointed out that an ‘NA’ was still allowed in SAQ D.  The reason for seeking this clarification was related to past comments from the Council to follow the SAQs for P2PE (SAQ P2PE) and outsourced eCommerce (SAQ A) when filling out a ROC for merchants with these solutions.  It was pointed out that neither of these SAQs has requirement 3.2 in them, so how is a QSA/ISA supposed to respond to it in the reporting template if it cannot be marked as ‘NA’?

Understanding The Current Data Breach Landscape (aka Verizon DBIR Report Discussion)

When Verizon sends out Chris Novak, you know you will get a great presentation on the data breach incident report aka ‘The DBIR’.  This year was no exception albeit somewhat depressing as Chris again pointed out that most breaches are the result of sloppy operations, lax security and insecure applications.  Essentially security issues that we should have gotten past a long, long time ago but have not.

Architecting for Success

Who better to talk about success than a representative of the Jet Propulsion Laboratory (JPL), discussing how to develop spacecraft to explore the most inhospitable environments we know: outer space and planetary bodies.  Brian Muirhead was the keynote speaker on Wednesday and is the Chief Engineer for the Mars Science Laboratory, the group that designed and developed the various Mars exploration rovers.  He gave a great discussion on how to look out for problems and develop self-managing devices.  It was very interesting and, I am sure, an eye opener for many of us: we need to stop accepting the sloppy and messy solutions we get for handling cardholder data.

Internet of Things Keynote

The Thursday keynote was just a great time.  While there seemed to be very little directly relevant to PCI compliance presented by Ken Munro and an associate from Pen Test Partners, it was a fabulous time exploring the wonderful world of flawed technology, from a tea kettle to a refrigerator to a child’s doll.  In the case of the child’s doll, they removed the word filter database and thereby allowed the doll to say things that no child’s toy should say.

What was relevant to PCI was the ease with which these folks were able to reverse engineer firmware and software used by these devices.  It gave a lot of people unfamiliar with IoT and penetration testing in the room pause as to how seemingly sophisticated technology can be easily abused.

Cloud Security

While it was great to see Tom Arnold from PSC, the even better thing about this presentation was the fact that Amazon provided an actual human being, in the form of Brad Dispensa, to talk about Amazon’s EC2 Cloud.  While billed as a discussion on incident response, the session provided great insight into AWS’s EC2 service offering as well as the variety of new tools available to manage the EC2 environment and to provide auditors and assessors with information regarding the configuration of that environment.  The key takeaway from this session is that organizations using EC2 can provide everything needed for conducting a PCI assessment using their EC2 Master Console.

EMVCo

Brian Byrne from EMVCo gave a great 20 minute session on EMV.  The slide deck will be more valuable than the presentation because he had so much content to share and so little time to share it in.  Of note was his discussion of version 2.0 of Three-Domain Secure, otherwise known as 3D Secure or 3DS.  While v1.0 will remain under the control of Visa, EMVCo has taken over management and development of the 3DS standard.  The new version is in draft and only available to EMVCo members, so this was the first time I had been able to see what the new version has to offer.  But because of the time constraint, I will need to wait for the slide deck to be published to know more.

PCI Quality Assurance Program

Brandy Cumberland of the Council provided a great presentation on the Council’s quality assurance program, with which all QSAs have become familiar.  I appreciated her discussion of James Barrow, who took over the AQM program after most of us wanted to kill his predecessor for creating one of the most brutal QA programs we had ever seen.  James’ efforts to make the AQM program more relevant cannot be overstated, as he took over a very troubled affair.  This was a bittersweet discussion, as James passed away right after last year’s Community Meeting and will be greatly missed by those of us that came to know and respect him.  Brandy took over the AQM program when James left the Council and has been doing a great job ever since.  She is possibly one of the best resources the Council has and does the AQM program proud.

Application Security at Scale

The last great session of the conference I saw was from Jeff Williams of Contrast Security.  The reason this session was great was that it discussed what application developers can do to instrument their applications not only for security, but also for operational issues.  He introduced us to interactive application security testing (IAST) and runtime application self-protection (RASP).  The beauty of this approach is that applications get security in the form of embedded instrumentation that produces actionable analytics, which then allow decisions to be made to respond to threats to these applications.  It sounds like an interesting approach and concept and I cannot wait to see it in action.

As always, it was great to see and catch up with all of my friends in Las Vegas at the PCI Community Meeting.  It was also great to meet a lot of new people as well.  I look forward to seeing all of you again next year in Orlando.

10 Jun 16

Is The PCI DSS Even Relevant Any More?

First the National Retail Federation (NRF), then bloggers.  Organizations and people are piling on the PCI SSC and its standards, all because of the United States Federal Trade Commission’s (FTC) fact finding project.  It seems like PCI is now a bad three letter word.  But with the changes that have been implemented or will soon be implemented, I am starting to wonder about the relevance of the PCI DSS.  So I thought I would explore these topics and explain what has led me to that conclusion.

Ever since the FTC announced their little fact finding mission, I have consistently said that the FTC is late to the party.

Why do I think the FTC is late?

The FTC’s fact finding efforts are, I am sure, in response to the Target, Michael’s, Home Depot, etc. data breaches, which resulted in tens of millions of payment card accounts being exposed and potentially used for fraudulent purposes.  Remember, the FTC is a governmental body, so taking action can take a bit of time, in this case at least three years and longer than most people would have desired.  But they eventually got around to it.  While a fact finding effort is a valid way to get up to speed on a problem, the trouble is that the threat landscape has changed between those notorious breaches and the FTC getting its act together.

What in the threat landscape has changed?

The vast majority of mid-sized and large retailers have implemented or are in the process of implementing point-to-point encryption (P2PE) or end-to-end encryption (E2EE) and tokenization solutions to minimize their PCI scope to only the point of interaction (POI), otherwise known as the card terminal.  As a result, the threat of large scale breaches at these merchants is, or will be within the next 12 to 18 months (based on my knowledge of a large number of such efforts), near zero.  The reason is that these merchants’ point of sale (POS) and other systems will no longer have access to cardholder data (CHD) or sensitive authentication data (SAD).

How can the threat be near zero?

P2PE/E2EE and tokenization limit scope to only the POI, and the remaining threat is very, very low because of how the POI must be implemented to work with P2PE/E2EE and/or tokenization.  I am not going to discuss the security features of these solutions in detail so as not to tip the hand of those organizations implementing them.  Let me just say that there is a lot of information that must be loaded into a POI in order to swap out terminals.  Even then, there are additional controls involving the registration of the device by the merchant and/or service provider that preclude terminal swaps without generating some form of alert.

The one threat that does still remain is the use of an overlay for skimming cards.  But that risk varies from POI vendor to POI vendor and even by POI model within a vendor.  And it is not as if vendors have not taken notice of the overlay problem.  Vendors have gotten a clue and are changing the design of their POI to make them as difficult as possible to fit with an overlay.  I have a client that went with a POI that has various angles, long swipe tracks, LED lights and other features that would make an overlay not only very expensive to engineer but also very difficult to make appear seamless to customers and clerks.  Over time I expect to see all POI manufacturers adopt strategies to minimize the ability to use overlays.

The result of all of this is that merchants are no longer the risk (if they even present a risk) they were two or more years ago.

So who or what does that leave at risk?

ECommerce Web sites are still a huge problem.  EMV as it exists today does nothing to stem the problem of online fraud.  Even if a merchant has outsourced eCommerce, they still have to manage that environment as well as deal with the chargebacks and disputes that come from eCommerce card transactions.  I have heard rumors of solutions that are coming to address eCommerce, but I have yet to see any formal announcements of those solutions.  So for the foreseeable future, eCommerce will remain in scope, and merchants with an eCommerce presence will likely still have to address some form of PCI assessment for that environment.

Any merchant that has not gotten on the P2PE/E2EE and tokenization bandwagon.  All merchants should be getting POI that encrypt and/or tokenize at the swipe or dip of a customer’s card.  Adopting such solutions will leave the merchant with only having to comply with requirements in 9.9 and 12.  I know for some merchants that will mean an investment, but the payoff is extremely reduced PCI scope and effectively taking almost all of the risk out of card payments.

The organizations that end up with a huge target on their backs are any service providers, transaction processors, issuers or financial institutions that have CHD and/or SAD stored in their files and/or databases.  An unfortunate fact of life is that transaction processors, issuers and financial institutions are always going to have to have some amount of CHD/SAD in their files and databases because of the nature of their business.  It is these organizations where the full on (i.e., Report On Compliance or ROC) PCI DSS assessment will never go away.

For merchants that have moved to P2PE/E2EE/tokens, I could see a move to an annual self-verification that those solutions are still implemented and functioning as designed.  I could additionally see that, every three years or so, the card brands requiring an independent assessment by a QSA/ISA that the controls for P2PE/E2EE/token solutions are still in place and functioning correctly.  The reason for independent verification is that changes get made and those changes might affect the environment making it less secure.  For merchants not using P2PE/E2EE/tokens, I would think the current SAQs and ROC will remain in place with an annual assessment required.

Will other PCI standards be marginalized or disappear?

The PA-DSS will never leave us.  Software developers need to develop secure code, and those service providers, transaction processors, issuers and financial institutions that store CHD/SAD need applications that do so securely, so there is a built-in constituency for the PA-DSS.  ECommerce solutions are also still going to need PA-DSS validation.  Regardless of whether P2PE/E2EE and tokenization are implemented, any application potentially dealing with CHD/SAD will need to be assessed under the PA-DSS to ensure that any CHD stored is stored securely and is erased securely.  Then there are the unknowns of the future.  You never know what might come along, so there is always a possibility that some solution will need to securely store CHD or other payment related information.  The bottom line is that I find it very hard to believe that the PA-DSS could ever be dropped.

The PTS standard will also not disappear because those POI need to be validated to handle CHD/SAD securely and work properly regardless of P2PE/E2EE solutions.  The PTS is the only standard that is a card brand requirement, not a PCI DSS requirement.  It is the card brands that demand merchants use only PTS validated POI and I do not see that requirement going away when the POI is going to become the remaining target at merchants.

The ASV standard will not go anywhere, as there will still be eCommerce solutions that require vulnerability scanning.  Most merchants will implement eCommerce solutions that minimize their PCI scope using a redirect or iFrame.  But I can see it coming that even merchants using those solutions will still need their eCommerce site, now deemed out of scope, scanned for vulnerabilities.  The reason is that the invocation point of the redirect or iFrame is at risk of modification by an attacker.
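The danger at the invocation point is that an attacker alters the merchant’s page so the iFrame or redirect quietly points at a look-alike skimming site.  Here is a minimal sketch of a tamper check in Python using only the standard library; the gateway URL and markup are hypothetical examples, not any particular vendor’s:

```python
from html.parser import HTMLParser

# Hypothetical example: the gateway URL the checkout page is supposed to load.
EXPECTED_FRAME_SRC = "https://pay.example-gateway.com/checkout"

class _FrameSrcCollector(HTMLParser):
    """Collects the src attribute of every <iframe> on the page."""
    def __init__(self):
        super().__init__()
        self.frame_srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            src = dict(attrs).get("src")
            if src:
                self.frame_srcs.append(src)

def payment_frame_tampered(page_html: str) -> bool:
    """Return True if the payment iFrame is missing or points anywhere
    other than the expected gateway URL."""
    collector = _FrameSrcCollector()
    collector.feed(page_html)
    return EXPECTED_FRAME_SRC not in collector.frame_srcs

# In practice this would run against the live checkout page on a schedule,
# alerting whenever the invocation point no longer matches.
good = '<iframe src="https://pay.example-gateway.com/checkout"></iframe>'
bad  = '<iframe src="https://evil.example.net/skim"></iframe>'
print(payment_frame_tampered(good), payment_frame_tampered(bad))  # False True
```

A real deployment would also need to watch the scripts included by the page, since a JavaScript redirect can be subverted just as easily as an iFrame, which is exactly why scanning the “out of scope” page still makes sense.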

One standard I do believe will eventually go away is P2PE.  The reason is that there is very little to gain with a P2PE solution versus an E2EE solution.  Both are essentially the same; the only additional work required for E2EE is documenting that it has been implemented appropriately, submitting that documentation to the client’s acquiring bank and getting the bank to agree to the PCI scope reduction.  As a result, I believe that the P2PE standard will slowly and quietly disappear into the night, as the cost of going through the assessment process along with the Council’s filing fees just cannot be justified by a lot of influential vendors such as Verifone and First Data.

There is my rationale for where I think things are hopefully headed.  Only time will tell if the rest of the world sees things the same way.

06 Jun 16

The NRF’s Collective Amnesia

On May 23, 2016 the National Retail Federation (NRF) issued a scathing indictment of the card brands, the PCI SSC and the PCI standards, in particular the PCI DSS.  But what is truly amazing is the irony and collective amnesia expressed by this document.

The first thing that got to me was the chutzpah of the writer of this document.  Chutzpah is humorously defined as “a child who kills their parents and then throws themselves on the mercy of the court because they are an orphan.”

In this case, the writer has totally missed the whole reason why the PCI standards exist.  It was because of the NRF membership’s short-sightedness and refusal to secure their eCommerce Web sites and point of sale (POS) systems that we have the PCI standards.  If merchants had just done the right thing more than 15 years ago and secured their systems that deal with cardholder data (CHD), the PCI standards would likely never have come into existence.  Yet here we have the NRF going after the very thing they helped to create, because they do not like it.  Talk about having your cake and eating it too.

The next thing that caught my eye was the NRF’s version of history regarding PCI.  Since I have been around the attempts to secure card data since 2002, I found the NRF’s version of events interesting, if missing a lot of facts.  In the NRF’s version of history, history starts in 2003.  However, this lack of memory should not surprise anyone, as it was the NRF’s own members that are the reason the Visa Customer Information Security Program (CISP) came into existence.  Heaven forbid the NRF should admit that fact.

To correct the record, the Visa CISP actually dates back to the very late 1990s.  Visa was concerned about the growing use of eCommerce and the security of using payment cards to buy goods and services through eCommerce.  Breaches were a new thing, but Visa was concerned that they would become a big thing.  The Visa CISP was codified around late 2001 to early 2002 and was published out to a limited number of consulting firms around the summer of 2002.  By that time, merchants using the new eCommerce approach to selling their goods and services were being breached in record numbers, and customer payment information was being lost in what seemed like an almost everyday occurrence.  The good news was that eCommerce was in its infancy and the Target or Home Depot type of huge breaches were still a ways off in the future.  The bad news was that, as things were going, banks would be replacing payment cards every week.

The next piece I found interesting was this.

“Around 2003, Visa approached NRF with a proposal to impose Visa’s proprietary data security system (“Cardholder Information Security Program” or “CISP”) on brick-and-mortar retailers for in-store transactions.”

The first reason this statement is interesting is because none of the other card brands had an information security program officially published as of 2003.  MasterCard’s Site Data Protection (SDP) program would be the only one published in the fall of 2003 but it was not really rolled out until early 2004.  American Express and Discover would not come out with their programs until early and late 2004 respectively.

The second thing that I found interesting is the “brick and mortar” comment.  Brick and mortar retail had always been included in the Visa CISP.  But because of all of the eCommerce breaches going on, Visa chose to focus the CISP assessments on eCommerce (does “risk-based approach” ring a bell with anyone?).  We see this selective amnesia with banks as well when it comes to PCI.  The risk when the Visa CISP first came out was predominantly with merchants with eCommerce sites.  Banks were also under the CISP scope, but since they were heavily regulated in the US and their security was examined at least annually, Visa and the other card brands did not see them as the huge risk.  As a result, banks were not really assessed until only recently.

“NRF members balked at Visa’s plan largely because of concerns that the other card networks (e.g., MasterCard, JCB International) would also attempt to unilaterally impose their own—possibly different and conflicting—security standards on retailers.”

Given the way the merchant agreements are written (and have been written since the 1960s), the card brands through the acquiring banks can unilaterally implement whatever rules and regulations they want on the merchants.  I find it disingenuous to be calling out your displeasure with the rules and regulations when your legal counsel and management already agreed to those rules and regulations.  But to paraphrase a famous US Presidential candidate, “I voted for the agreement before I voted against it.”

That said, by the end of 2004 the remaining card brands had also introduced their security programs.  American Express and Discover were the first to recognize that multiple programs were not a good idea and told merchants that they would accept the Visa CISP assessment in lieu of their own assessment programs.  As of early 2005, American Express and Discover agreed to accept a Visa CISP review as proof of compliance with their security programs.

Even more interesting in this discussion is that MasterCard’s Site Data Protection (SDP) security program was focused entirely on eCommerce (hence the word “site” in the title), not brick and mortar.  So where the writer of the NRF paper got the idea that every program impacted brick and mortar I do not know.

But then there is the underlying message of this paper.  The NRF is essentially arguing to get rid of the PCI standards altogether.  But the NRF makes no argument as to what they would do to replace the PCI standards.  Oh, that is right, I forgot, merchants do not need to be policed.  If we followed that line of thinking, we would soon have the NRF complaining about government over-regulation in this area instead.

Speaking of which.  This paper seems to imply a mistaken belief that the FTC investigation into the PCI standards will result in their removal.  I am not sure how the writer of the NRF paper thinks that will happen.  In all my years of dealing with the government, the last thing that happens as the result of an investigation of this sort is the removal of regulations; what you get instead is the imposition of additional regulations and even more intrusive oversight.  If the NRF thinks the PCI SSC and the card brands were a pain, wait until the government starts going through their members.

As with the FTC, the NRF is actually late to the party.  The vast majority of the NRF’s large members such as Walmart, Target, Home Depot and the like have all implemented or are implementing either end-to-end encryption (E2EE) or point-to-point encryption (P2PE) solutions with tokenization.  The data is therefore encrypted at the point of interaction (POI) and can never be seen by the POS solution.  Any data returned is tokenized so that the POS and other solutions do not have CHD.  That means that the days of the large merchant data breach are almost behind us.  As a result, the only PCI scope the NRF’s members will have is the POI at their checkout counters.  Talk about scope reduction, but that does not seem to matter to the NRF.

But this is an era of piling on and I am sure that has a lot to do with this NRF white paper and the vitriol it spews.  The NRF felt the need to vent and vent they did.  Unfortunately, their argument lacks any sort of basis in fact to make their point.

09
Apr
16

Living In PCI Denial

This was one of those weeks where you see something and all you can do is shake your head and wonder what some organizations think when it comes to PCI.  What added insult to injury in this case was that the organization arguing over PCI compliance is a manufacturer of card terminals, also known as points of interaction (POI).  It shocked me that such an organization was so clueless about PCI as a whole when you would think it is their business to know.  Worse still, my client’s transaction processor and acquiring bank are also apparently clueless.

As background, I am working on a client’s Report On Compliance (ROC).  This client has almost completed their roll out of an end-to-end encryption (E2EE) solution at all of their 4,000+ retail locations.  This E2EE solution will take all but the POI at those retail locations out of scope for PCI compliance.  That is the good news.

But if there is good news, you know there must be bad news.  In reviewing their documentation of this E2EE solution, I discovered that the POI vendor is providing management and updates to the POI through a terminal management system (TMS).  Since this TMS solution/service connects directly to my client’s cardholder data environment (CDE), I naturally asked the client for a copy of the vendor’s Attestation Of Compliance (AOC) for the TMS solution/service.

I thought those worthless PCI Certificates of Compliance took the cake.  Then, BAM!  I got the following message forwarded to me by my client from the POI vendor.  I have redacted all of the potential information that could identify the relevant parties and the TMS solution/service.

“Please see the follow up note below that you can send to your QSA for review and feedback:

  1. TMS systems in our industry do not require any type of PCI certification since PCI is concerned about card holder information that would be at risk. Since [vendor solution] does not have any card holder data at all, it falls outside of PCI requirements.  [Vendor solution] is merchant configuration and estate management tool only and as such, no payment card information passes through it, or directed to it.  In addition, no secure keys are stored on [vendor solution] so transaction data cannot be decrypted with anything on [vendor solution] or POS.
  2. [Vendor] Hardware and [vendor solution] Software are all PCI PTS compliant and certified and listed on the PCI website. Transactions are encrypted in hardware using the [encryption solution] keys which again [vendor solution] has no knowledge.  Transaction information can only be decrypted by [processor] the processor.  [Vendor solution] has no knowledge of this encrypted information being sent directly from the [vendor] to the processor.
  3. The beauty and simplicity of [vendor solution] semi-integrated terminal application is that is has all transaction data go directly to the Processor ([processor]) and no customer data is directed to the POS or [vendor solution] which makes the POS out of PCI Scope by the very nature of no card holder data in their environment.
  4. [Client] has a merchant certification with [processor] for the [encryption solution] with our [vendor solution] terminal application. Any questions regarding the certification should be directed to [acquiring bank] or a [processor] representative.

Let us know if your QSA has any further questions and we can also schedule a concall with all parties to address any concerns on [vendor solution] TMS and PCI.”

The first thing that wound me up is that this vendor is a business partner of my client’s transaction processor.  The processor is also a business partner of my client’s acquiring bank.  Those two organizations put forth this vendor to my client as being able to provide POI compatible with the processor’s E2EE and tokenization solution.  Obviously from this vendor’s response, these two well-known institutions did nothing in the way of due diligence to ensure that this vendor and its services were PCI compliant.

The second thing that totally irritated me is that there is no excuse for this vendor’s uneducated response.  Granted, this vendor is new to the US market, but they have been supplying POI to other merchants all over other parts of the world.  Which then starts to make you wonder just how lame are the banks, processors, card brands and other QSAs that they have not been called on the carpet about this before.  But that is a topic for another post and a good reason why the FTC is investigating the PCI compliance industry.

So let me take apart this vendor’s response.

“TMS systems in our industry do not require any type of PCI certification since PCI is concerned about card holder information that would be at risk.”

Wrong!  On page 10 of the PCI DSS the first paragraph under ‘Scope of PCI DSS Requirements’ clearly defines what is in scope for PCI compliance.

“The PCI DSS security requirements apply to all system components included in or connected to the cardholder data environment. The cardholder data environment (CDE) is comprised of people, processes and technologies that store, process, or transmit cardholder data or sensitive authentication data. “System components” include network devices, servers, computing devices, and applications.”

The operative phrase the TMS solution/service falls under is “connected to”.  The TMS solution/service directly connects to my client’s CDE.  That solution/service may not process, store or transmit cardholder data (CHD) or sensitive authentication data (SAD), but it is directly connected to my client’s CDE.  As a result, according to the above definition, the TMS solution/service is definitely in scope for PCI compliance.

“[Vendor] Hardware and [vendor solution] Software are all PCI PTS compliant and certified and listed on the PCI website.”

PTS certification is a card brand requirement, not a PCI DSS requirement.  Nowhere in the PCI DSS does it require that a PTS certified POI be used so I really do not care about this statement as it has nothing to do with my PCI DSS assessment activities.  If PTS were a PCI DSS requirement, then all of those people using Square and the like would be non-compliant.

“In addition, no secure keys are stored on [vendor solution] so transaction data cannot be decrypted with anything on [vendor solution] or POS.”

“Transaction information can only be decrypted by [processor] the processor.”

True, your TMS solution/service does not have the encryption keys.  But the firmware delivered by the TMS solution/service does have access.  (Unless you are the first POI vendor I have ever encountered that spent the huge amount of money required to truly create a hardware-only encryption solution.)  Given the low retail price and discounting of your POI you gave my client, I very seriously doubt that is the case.  So the firmware that your TMS solution/service delivers is what is doing the encryption and therefore has access to the encryption keys.  So while the TMS solution/service does not have the keys, it could be used to deliver rogue firmware that could obtain them.

Then there is the firmware delivery itself by your TMS solution.  If someone hacks your TMS environment, how easy would it be for them to have it deliver a rogue version of your firmware?  Since my client has no AOC, I have no idea if your security measures surrounding your TMS solution are adequate to prevent such an attack.
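To illustrate the mitigation I would expect to see described in such an AOC, here is a minimal sketch of a signed-update check that would stop a compromised TMS from pushing rogue firmware.  This is purely illustrative: a real POI uses an asymmetric signature with the verification key anchored in hardware, and every name and key here is hypothetical (an HMAC stands in only to keep the example self-contained).

```python
import hashlib
import hmac

# Hypothetical key material provisioned into the terminal at manufacture.
SIGNING_KEY = b"provisioned-at-manufacture"

def sign_firmware(image: bytes) -> bytes:
    """Vendor-side: produce a signature over the firmware image."""
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def poi_accepts(image: bytes, signature: bytes) -> bool:
    """Terminal-side: recompute the signature and reject any mismatch,
    so even a hacked TMS cannot deliver an altered image."""
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

legit = b"firmware v2.1"
sig = sign_firmware(legit)
assert poi_accepts(legit, sig)            # genuine update installs
assert not poi_accepts(b"rogue image", sig)  # tampered image is refused
```

Whether the vendor actually does anything like this is precisely what the missing AOC would tell us.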

“[Client] has a merchant certification with [processor] for the [encryption solution] with our [vendor solution] terminal application.”

Such a statement ranks up there with those previously mentioned worthless PCI Certificates of Compliance.  Any QSA is required to obtain an AOC for the TMS solution/service to ensure that it is PCI compliant or the solution/service must be assessed as part of the merchant’s PCI assessment.

PCI DSS requirements under 12.8 are very clear as to everything a merchant needs to be able to provide to their QSA regarding third party PCI compliance.  Primarily of which is that AOC for your TMS solution/service among other items of evidence.

So I had a conference call with my client’s bank to discuss this situation.  I pushed back very hard when they told me that my client needs to write a compensating control for their business partner’s incompetence.  I even got an “atta boy” from the bank for identifying that they have a PCI compliance and potential security issue.  But I could not make the bank budge on the compensating control, so I am off to get that written.

The lesson to be learned from this post is that nothing can be taken for granted when doing a PCI assessment, even when your transaction processor and bank are involved.  A lot of people and QSAs would assume that a POI vendor would know better and that their bank and transaction processor had vetted the POI vendor.  So why should I have to worry about this vendor?  However, as I have pointed out, you can never take anything for granted, even when it involves organizations that you would think would know better.

This is just one way of many that could result in an organization being breached.  The TMS solution/service is a gateway directly to the merchant’s CDE.  Yet there has been no PCI assessment of that solution/service to ensure that it is PCI compliant and the risk it could be subverted has been minimized.

Thank goodness it is the weekend.  Oh, wait.  This weekend’s project is my income taxes.  Looks like I will be cranky all weekend as well.

28
Mar
16

Is The FTC Investigation A Witch Hunt?

With the FTC’s announcement of their PCI fact finding effort a few weeks back, the questions being asked in the PCI assessor community these days are:

“Is this a ‘witch hunt’ by the FTC?” and

“Are they coming after QSACs?”

First, a bit of an update since my last posting on this subject.  I have been able to determine from my sources that the nine qualified security assessor companies (QSAC) that were selected by the FTC were randomly selected from a list of QSACs provided by the PCI Security Standards Council to the FTC.  Based on that information, I have to assume that the nine QSACs selected were just the unlucky winners of this FTC fact finding effort.

Another tidbit I was able to glean from some friends in the nine QSACs is that the FTC had yet to send them the official orders for complying with the study, so the 45 day clock has not necessarily started.  That information is at least a couple of weeks old, so I would assume that they have received official notice from the FTC.

Witch Hunt?

If you review the questions being asked, it would seem to be the start of a witch hunt.  Those of us in the PCI industry know how the QSACs will likely respond to these questions and how those responses will be interpreted.

However, in further reviewing the questions, the FTC is allowing QSACs to explain their answers, so hopefully those explanations will satisfy the FTC about some of the answers they will receive.  Some of the questions that could create problems/concerns are:

  • For each year of the Applicable Time Period, state the number and percentage of clients for which You completed a Compliance Assessment and for which You declined to provide: (1) a “Compliant” designation on the Attestation of Compliance (“AOC”); or (2) an “In place” designation on the final Report on Compliance (“ROC”).
  • For each year of the Applicable Time Period, state the number and percentage of clients for which You completed a Compliance Assessment and for which You provided: (1) a “Non-compliant” designation on the AOC; or (2) a “Not in place” designation on the ROC.
  • the method by which the scope of Compliance Assessments is determined, including but not limited to, the extent to which a client or any third party, such as the PCI Security Standards Council (“PCI SSC”), a Payment Card Network, Acquiring Bank, or Issuing Bank, is permitted to provide input into the scoping of Compliance Assessments
  • the process by which the Company determines whether to use sampling as part of a Compliance Assessment, including, but not limited to, a description of the methodology used to determine that any sample is sufficiently large to assure that controls are implemented as expected. As part of Your response, provide copies of all policies and procedure related to sampling, as well as all documents related to a representative Compliance Assessment that included sampling, including all communications between the Company and the client or any third party, such as PCI SSC, a Payment Card Network, an Acquiring Bank, or an Issuing Bank;
  • the methodology and tools the Company uses to perform Compliance Assessments;
  • the guidelines and policies for interviewing a client’s employees as part of a Compliance Assessment. As part of Your response, identify any PCI DSS requirement for which client employee interviews alone could establish whether a client had satisfied the requirement;
  • the extent to which the Company communicates with clients in determining the adequacy of any compensating control. As part of Your response, provide all documents related to a representative Compliance Assessment that considered a compensating control, including all communications between the Company and the client or any third party such as PCI SSC, a Payment Card Network, an Issuing Bank or an Acquiring Bank;
  • Provide: a copy of the Compliance Assessment with the completion date closest to January 31, 2015; and a copy of a Compliance Assessment completed in 2015 that is representative of the Compliance Assessment that the Company performs. For each Compliance Assessment provided in response to this specification, the Company shall also include a copy of any contract with the client for which the Compliance Assessment was performed, all notes, test results, bidding materials, communications with the client and any other third parties, such as the PCI SSC, a Payment Card Network, an Issuing Bank or an Acquiring Bank, draft reports, the final ROC, and the AOC.

The biggest problem I see at the moment is with this last bullet.  All QSACs have non-disclosure agreements (NDA) in place between them and their clients that only allow the PCI SSC access to reports for the purposes of quality assurance assessments (AQM).  This sort of NDA has been mandated by the Council since the release of v2 of the PCI DSS.  There are no provisions for federal government agencies to have access to a client’s ROC or AOC.

As a result, I am sure there will be a lot of legal wrangling over turning over unredacted ROCs and AOCs to the FTC.  If the FTC does allow redaction of “sensitive information”, then the legal wrangling will be over what is “sensitive” information.  ROCs and AOCs contain a lot of sensitive information that will eventually become part of the public record.  If the FTC does not take appropriate measures to control access to that information, an attacker that accesses that archive of ROCs and AOCs will have a gorgeous road map as to how to hack the merchants and service providers that the ROCs and AOCs cover.

Another area that will be highly contentious will be the QSACs providing information on the tools they use for their assessments.  This sort of information is highly proprietary and guarded by QSACs.  If it is released by the FTC it could remove some of the competitive advantages of QSACs.

It will be interesting to see the responses to scoping.  The Council has been struggling to give guidance in this area for years.  It is so bad that an offshoot of the PCI scoping special interest group (SIG) issued their own Open PCI Scoping Toolkit a number of years back to provide guidance to the PCI community.

Finally, the question regarding discussions with the Council, banks, card brands and the like will also be interesting to see documented.  I know that QSAs from my firm discuss a lot of PCI compliance issues amongst ourselves as well as with banks and the brands.  We also have questions submitted to the Council from time to time when we do not have clear guidance.  However, in talking with other QSAs from other firms, this seems to be more an exception and not the rule.

The bottom line on the witch hunt question is that I do not see the QSACs as the primary entities in the crosshairs of the FTC.  If anyone is in the crosshairs, it is the card brands and acquiring banks.  The Council is driven predominantly by the card brands and their Participating Organizations (PO).  The banks are driven by regulatory requirements and recommendations from the brands.

Are QSACs In The Crosshairs?

As I just alluded to above, I do not think that QSACs are directly in the crosshairs.  But they could be, depending on the answers and explanations the FTC receives as well as what happens with this study going forward.

The unfortunate thing about the nine QSACs selected is that there are a number of notable QSACs missing from the list.  These are QSACs that those of us in the PCI community know would have extreme difficulty being put under the scrutiny of this FTC fact finding mission.  Yes, they would have nice responses to the questionnaire, but they would have difficulty supporting those great responses with their work papers and other evidence being requested.

Even with the nine selected I am sure there will be some embarrassing disclosures from the QSACs that have been asked to respond.  But for most of those embarrassments the QSACs can ultimately point to the Council and say they were told by the Council to do what they did.  That is not a good answer from a public disclosure perspective, but it is the truth.

Likely Results

If I had to look down the road and see where this is headed, I likely see a mess.  The FTC is coming to this party a day late and a dollar short.  That is because the days of the large merchant data breaches are likely coming to an end.  Why?

Most Level 1 and 2 merchants have implemented or are in the process of implementing either a point-to-point encryption (P2PE) or an end-to-end encryption (E2EE) solution paired with tokenization.  These solutions encrypt at the swipe/dip of a card at the terminal or point of interaction (POI) and return a token back to the merchants’ applications at the transaction’s completion.  These implementations are either complete or will be completed by the end of 2016.  As such, the days of getting data from large merchants’ databases and POS systems have for the most part come to an end.  For merchants that have implemented these solutions, the only device they will have that is in scope is the POI.

The result of these projects is that any attack would have to compromise the POI, which is typically controlled by the transaction processor, not the merchant.  Not that such an attack cannot be done, just that its rate of success at the moment is very low given the complexity of compromising a processor as well as creating an acceptable rogue POI payload.

I will not go into detail, but I know a lot of you are asking why replacing the POI is not an option.  The reason that attack is not viable is that the P2PE/E2EE solutions all provide some form of tracking of the POI for a variety of reasons.  As a result, merely replacing the POI with a rogue POI is not an easy task and would also require compromising the processor.

The bottom line is that any results that the FTC comes up with will likely be impacting Level 3 and 4 merchants.  Not that such merchants are necessarily small by any sense of the word.  I know of retailers that generate hundreds of millions of dollars in revenue but do less than a million card transactions in a year.

I could easily see the FTC saying that all merchants must periodically submit to an independent evaluation of their security controls against the PCI DSS or some other security standard.  I would assume that truly small merchants will push back on such a requirement pretty hard, so I would also assume that the FTC will set some sort of transaction limit so that those truly small merchants do not have to incur that expense.

I could also see the FTC or some other government agency taking over the PCI compliance program.  While I think that would bring an unnecessary level of bureaucracy to the PCI game, it would seem to be a likely outcome.  Whether or not QSACs would be forced to turn over their compliance function to such a government operation will be interesting to see play out.  One could envision something similar to the compliance operations within the FDIC, OCC and other financial institution regulatory bodies used as a model.

The net is that the FTC is again, coming to this situation late.  Most of the problem will resolve itself in a year or two making the headlines go away.  However, the FTC will use its “study” as a way to justify cleaning up a questionable or bad process.  A process that will have to radically change as merchants essentially get out of the card data business.

My advice to the FTC is to let nature take its course.  The breaches of the past have led the industry to change itself and significantly reduce the risk.  A better effort would be for the FTC to get the processors and card brands to push for adoption of P2PE/E2EE and tokenization across the board.  That would minimize the risk to merchants and leave only processors and banks at risk.

But my advice is rational and that is not typically how government institutions operate because they need to show the public that they are doing something.  As a result, they tend to over react and do things that are no longer required all in the name of proving that they deserve to remain in business.

It will definitely be interesting to see how this all plays out.



