Archive Page 4

01 Mar 21

Quick Update on PCI DSS v4

In the February 2021 Assessor newsletter, the Council announced the following.

“Because of the broad impact PCI DSS has on the payment community, the Council is seeking additional feedback into the PCI DSS v4.0 validation documents. As a result of expanding stakeholder feedback opportunities to include these supporting documents, the Council is now targeting a Q4 2021 completion date for PCI DSS v4.0. The publication and availability of PCI DSS v4.0 is still being determined. The Council will communicate the targeted publication date in the coming months.”

So we will apparently see one more iteration of v4 before it is released. According to their blog post, the comment period will start around June 2021.

See their blog post for more information.

One other important item from the newsletter for all QSAs: do not forget to register for the next All Assessor Webcast on March 18, 2021.

17 Dec 20

Quick Hits From PCI Dream Team Session 10

The following are some questions that were asked at the last PCI Dream Team session that we were unable to get to during the session.

  1. Can a PCI validated service provider omit requirements from the service provider SAQ D (SAQ D-SP) because they themselves use PCI validated service providers who meet those requirements?

    First off, per FAQ #1382, requirements cannot be “omitted” or marked “Not Tested” if you want a compliant Service Provider SAQ D or ROC.
    The way an organization should deal with requirements covered by a third party is to mark them as “In Place” with a description stating that the appropriate third party is responsible for the requirement and that the third party was PCI compliant as of its AOC date.
  2. Most QSAs are suggesting that the best way to mitigate new requirements in PCI DSS v4 is to implement P2PE. Would you agree?

    Going to a P2PE or E2EE solution is only part of the equation.  To reduce the scope the most, you would also want to implement tokenization to ensure that your systems never retain PAN.  It is important to remember that most P2PE/E2EE solutions do not automatically include tokenization.
    Also remember, only P2PE gets the immediate scope reduction without asking the acquiring bank.  However, E2EE can also result in scope reduction if properly documented and approved by your acquiring bank, so do not limit yourself to only P2PE solutions.  E2EE solutions from First Data (TransArmor) and Verifone (VeriShield) are the most widely implemented scope-reducing solutions in the marketplace and are offered through almost all payment processors.
  3. Can you give examples of connected-to tools for pushing out code – are you referring to Git, Chef, what other tools fall into this category?

    Yes, we were talking about tools such as Git, Jenkins and Chef.  But it is also more than just code that gets pushed out.  Configurations, networking, etc. are all getting pushed out by tools such as Ansible, Terraform and others in the cloud and are also in scope.
    Regardless of the PCI scoping issues, these tools create security issues for organizations because they are typically not well protected or monitored.  These tools are an organization’s software factory, and most organizations leave the factory’s doors wide open for anyone to come through and see how they construct the in-house software solutions that are supposedly the key to their success.  All of this should hit home pretty hard after the SolarWinds debacle.
  4. On the topic of end of life (EOL) software, what about open-source projects with no LTS such as React 16 since the next major version has been released?  Would I be compelled to update all my dependencies to the latest major version?

    As far as I am aware, there is no announced React 16 EOL date nor has there ever been an EOL announcement for any release of React.  That said, since React is a group of JavaScript libraries and JavaScript is a well-known attack vector, the risk of using an older React version just gets worse as time goes on.  A risk assessment for the React versions should take that all into account and drive your analysis as to when you should update React barring the vendor stating an EOL for the version.
    But there are larger issues with open source application projects that process, store or transmit cardholder data (CHD).  I wrote about this a few years back in this post and it has a link to a post on the subject from 10 years ago.
13 Dec 20

Network Segmentation Testing

NOTE: If you have not read the PCI SSC Information Supplement – Guidance for PCI DSS Scoping and Network Segmentation you must do so before using the procedures documented in this post.

How something so simple became something so complicated (or at least believed to be complicated), I will never understand.  The only thing I can point to is the fact that network segmentation testing falls within the requirements of penetration testing.  Because of that, I think people believe there is something “special” about how segmentation testing must be performed.  Never mind the fact that there is the even more basic issue of how to approach network segmentation testing.

Here is the network segmentation testing methodology for traditional IP networks.

  • Gather an inventory of all of the network segments.  Label each network segment as Cardholder Data Environment (CDE), Connected To or Out of Scope based on the definitions from the Scoping Information Supplement.
  • Make sure you have Nmap installed on a portable computer.  The reason this needs to be portable is because you will likely have to move around your facilities in order to complete all of the testing.  It is also not unusual to use diagnostic systems in the data center to accomplish this effort (they may already have Nmap installed) as well as creating VMs for this testing and then remoting into those systems.  The important thing is to have access to every network segment in your environment so that you can conduct this testing.
  • Connect your scanner to every CDE network segment and attempt to reach all of the Out of Scope network segments from the CDE.  You will want to run an Nmap scan that scans all TCP and UDP ports (i.e., 1 through 65535) against all IP addresses in a given out of scope network segment.  This likely sounds extreme, but to prove segmentation you must test all 65,535 TCP/UDP ports against all IP addresses to make sure that no traffic “leaks” to your out of scope networks.  If you do find a port open in one of your out of scope networks, you will have to track down where that leak occurs.  Example: nmap -p- -sT -sU 10.10.0.0/16 (a scripted version of this scan-and-compare approach appears after this list).
  • While in each CDE, test connections out to your Connected To network segments testing all TCP and UDP ports against all IP addresses in your Connected To network segments.  Since communication between the CDE and Connected To segments is allowed, you will need to compare the results of the Nmap scan to your documented, approved ports and firewall rules to confirm that no ports are open that are not documented and approved.
  • Finally, you will need to test that your CDE can only reach the internet through ports and IP addresses you have specified.  Obviously, you are not going to test every internet address as that would take forever.  However, what I tell my clients to do is to use every external IP address they have for business partners or other third parties they are connected to.  Again, you are going to test all TCP and UDP ports against those addresses.  If you get any unexpected results back, you are going to have to resolve those issues as there should be no external connectivity beyond what you have documented and approved.
  • Connect to every Connected To network segment and conduct testing into the CDE for all TCP and UDP ports against all IP addresses in the CDE network segment.  Again, since communication is allowed between these network segments you will need to compare the results of the Nmap scan to your documented, approved ports and firewall rules to confirm that no ports are open that are not documented and approved.
  • While in the Connected To network segments, conduct testing to all Out of Scope network segments.  Since communication is allowed between these network segments you will need to compare the results of the Nmap scan to your documented, approved ports and firewall rules to confirm that no ports are open that are not documented and approved.
  • Connect to every Out of Scope network segment and run an Nmap scan into each CDE network segment for every TCP and UDP port for all IP addresses in the CDE.  This should return no results back if the network is truly out of scope.  If it does return results, you will have to figure out why and block that traffic into the CDE.
  • Save all of your results and comparisons so that you have a record of your testing.  If you found issues, make sure you document in detail what was done to resolve those issues and conduct new scans to prove that those issues were remediated.
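
Since the same scan-and-compare step is repeated for each pair of segments, much of the list above can be scripted. Below is a minimal sketch in Python, assuming Nmap is installed on the scanning system and run with privileges sufficient for UDP scanning; the segment CIDR and the approved-port list are hypothetical placeholders, not values from this post.

```python
#!/usr/bin/env python3
# Minimal sketch: run a full TCP/UDP Nmap scan of a target segment and flag
# any open ports that are not in the documented, approved rule set.
# Assumptions: nmap is on the PATH, the script runs with privileges sufficient
# for a UDP scan, and the CIDR/approved ports below are hypothetical examples.
import subprocess
import xml.etree.ElementTree as ET

# Hypothetical approved (protocol, port) pairs for CDE -> Connected To traffic.
# For CDE -> Out of Scope testing this set would be empty.
APPROVED_PORTS = {("tcp", 443), ("udp", 123)}

def scan_segment(target_cidr: str) -> set:
    """Scan all 65,535 TCP and UDP ports on every address in the segment."""
    result = subprocess.run(
        ["nmap", "-p-", "-sT", "-sU", "-oX", "-", target_cidr],
        capture_output=True, text=True, check=True,
    )
    open_ports = set()
    root = ET.fromstring(result.stdout)
    for host in root.findall("host"):
        addr = host.find("address").get("addr")
        for port in host.findall(".//port"):
            if port.find("state").get("state") == "open":
                open_ports.add((addr, port.get("protocol"), int(port.get("portid"))))
    return open_ports

if __name__ == "__main__":
    # Example: from a CDE system, test a Connected To segment (hypothetical CIDR).
    for addr, proto, port_num in sorted(scan_segment("10.20.0.0/24")):
        status = "approved" if (proto, port_num) in APPROVED_PORTS else "NOT APPROVED - investigate"
        print(f"{addr} {proto}/{port_num}: {status}")
```

Keep the raw Nmap XML output along with the comparison results as part of the record of testing described in the last bullet.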

When you bring in newer solutions such as the Cloud, containers, serverless, microsegmentation and the like, the traditional method of network segmentation testing cannot completely test the environment.  You can conduct all of the tests documented above from outside of the environment looking into your cloud environment, but you cannot look from inside the cloud out.  That must be done manually by examining the cloud configuration information and ensuring that networks are properly segmented.
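
For the “inside the cloud looking out” part, the review is of configuration rather than live traffic. The sketch below uses AWS security groups and boto3 purely as an example, since this post does not name a provider; the “Scope” tag convention and the approved source CIDR are hypothetical and would be replaced with your own documentation.

```python
#!/usr/bin/env python3
# Minimal sketch: review cloud configuration (AWS security groups as an example)
# to confirm that only approved sources can reach CDE resources.
# Assumptions: boto3 credentials are configured, CDE security groups carry a
# hypothetical "Scope" tag with value "CDE", and APPROVED_SOURCES is illustrative.
import ipaddress
import boto3

# Hypothetical Connected To CIDRs that are allowed to reach the CDE.
APPROVED_SOURCES = [ipaddress.ip_network("10.20.0.0/24")]

def review_cde_security_groups() -> None:
    ec2 = boto3.client("ec2")
    groups = ec2.describe_security_groups(
        Filters=[{"Name": "tag:Scope", "Values": ["CDE"]}]
    )["SecurityGroups"]
    for sg in groups:
        for rule in sg["IpPermissions"]:
            for ip_range in rule.get("IpRanges", []):
                source = ipaddress.ip_network(ip_range["CidrIp"])
                approved = any(
                    source.version == net.version and source.subnet_of(net)
                    for net in APPROVED_SOURCES
                )
                if not approved:
                    ports = f"{rule.get('FromPort', 'all')}-{rule.get('ToPort', 'all')}"
                    print(f"{sg['GroupId']}: {source} may reach "
                          f"{rule.get('IpProtocol', 'all')}/{ports} - not approved")

if __name__ == "__main__":
    review_cde_security_groups()
```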

If you are like me, you are looking for a better way to deal with the Cloud as well as large networks.  There are network tools from vendors such as FireMon, AlgoSec, Skybox and Tufin that have capabilities to take the Cloud configuration information as well as firewall, router, switch and other network infrastructure configurations and provide analytical capabilities to simulate the testing above from both internal and external perspectives.  The downside of these tools of course is that they are not inexpensive and can require significant horsepower to operate.  However, they can be worth their weight in gold for their ability to analyze and understand your networks, find misconfigurations and find issues where attacks can potentially succeed.

There is no reason to pay your penetration tester to conduct network segmentation testing unless you are uncertain as to how to analyze the information from the Cloud.

12 Dec 20

The PCI DSS Is Not The Only Relevant Payment Security Standard

One of the more lively discussions at our past PCI Dream Team session involved requirement 12.8 and third party management (i.e., service providers).  What got the discussion started was when Art (Coop) Cooper made the comment that only SAQ A states that all third parties must be PCI compliant.  None of the other SAQs, nor even the ROC, states that third parties need to be PCI compliant.

All of this is very true and has been this way since the beginning of the PCI DSS.

But …  That is not the whole story.

In this instance, the PCI DSS is not the only game in town.

People forget that Visa, Mastercard, Discover, American Express and JCB (aka “The Brands”) still have their own security programs and requirements in addition to the PCI DSS.  Some of these requirements are in their Operating Rules or similar documents.  In this case, Visa, Mastercard and Discover all require that service providers be PCI compliant as defined on their respective Web sites.  In the case of Visa and Mastercard, they maintain lists of PCI compliant service providers.  That said, those lists are marketing ploys that generate revenue for Visa and Mastercard as those service providers listed pay them to be on those lists. 

While Coop’s statement is accurate that the PCI DSS does not require service providers to be PCI compliant, it is shortsighted.  The Brands do require service providers to be PCI compliant and will enforce it through the merchant agreement/contract all organizations sign in order to accept those cards for payment.

The bottom line is that, if a service provider can provide you with a current PCI Service Provider Attestation of Compliance (AOC), you can use their services and remain compliant with the Visa, Mastercard and Discover contracts.

Coop also stated that he has never seen the Brands enforce the contractual obligation when reviewing organizations’ ROCs and SAQs.  That is also a true statement but again not the complete story.  Based on what I have been told by lawyers that have been involved in breach litigation, it is the merchant agreement/contract that is used to hold breached merchants legally responsible and enforce fines, not PCI compliance or what is in any PCI document.  The PCI documents are used to influence fines and penalties, but the actual enforcement is through the contracts with the Brands.  If it is found that an organization was using non-PCI compliant service providers that just adds fuel to the fire.

As famous radio personality Paul Harvey used to say, “And that is the rest of the story.”

10 Nov 20

The PCI Dream Team Rides Again

Please join us on Thursday, December 10, at Noon ET/1700 UTC as the PCI Dream Team discusses all things PCI EXCEPT PCI DSS v4. LOL!

You can register here for this free one hour session.

As usual, if you wish to submit questions before the session, please send them to our email box at pcidreamteam AT gmail DOT com.

We look forward to all of you attending this session.

30 Sep 20

The Second Draft of PCI DSS v4 Has Been Released

Not that anyone can discuss it, since we are all under a non-disclosure agreement (NDA). However, the Council Web site quietly announced the release of the new version on September 23. The comment period is open until November 13. You can read more about it here.

If you are a QSAC or a Participating Organization, your Point of Contact for the Council can download your copy of the new PCI DSS at the PCI Portal as usual.

For the rest of you, you will continue to stay in the dark until the final version is released. That said, as far as I am aware there have been no changes to the working dates for release, which still point to a 2022 rollout.

11 Aug 20

Join Me On September 3

I am speaking at the Toronto ISACA Lunch & Learn session on PCI and the Cloud on Thursday, September 3, at Noon ET. You can go here to register.

I look forward to “seeing” you at this event.

22 Jul 20

PCI Dream Team Is Back On BrightTalk

The subject is unsupported software and devices and how to handle them. But of course, any PCI or security question is welcome. Join us on BrightTalk on Tuesday, July 28, at Noon ET, 5PM BST. You can register here or view the recording at the registration link as well.

As usual, you can submit questions live during the session, as well as any time before or after the session, by sending them to ‘pcidreamteam AT gmail DOT com’.

We look forward to “seeing” you all next week.

07 Jul 20

The Security/Compliance Disconnect

I was speaking with someone recently and they tossed out one of the most despised phrases I know:

“Compliance is NOT security!”

I told them to stop right there and take it back or our discussion was over.  Since they really wanted my opinion on the actual topic at hand, we continued.  But I felt the need to explain why I find this statement so repulsive.  Which, by the way, has nothing to do with being an auditor.

The first point I make when discussing this phrase is that security frameworks are merely the foundation for a good security program, not the whole enchilada.  They are only the starting point, and great security programs must go well beyond these frameworks.  The bottom line is that achieving compliance with any security framework means your organization can execute the basics consistently.

The next important point I like to make to people who spew this trope is this: read any of the data breach or security reports from the likes of Verizon, Trustwave, Security Metrics or any other recognized security company, and what do you see?  The organizations breached could not comply with any of the recognized security frameworks, be it PCI DSS, COBIT, NIST, HIPAA, pick your poison.  Unfortunately, as these reports point out in annoying detail, organizations rarely execute the basics consistently, because if they did, they would likely not have been breached.  Which really punches a huge hole in the whole compliance does not equal security argument.

Another point about this statement is that organizations high five over being compliant with a security framework when it really means that they are mediocre at best.  Yet time and again I hear back after PCI assessments that management is so proud that they were assessed compliant.  “Yea, we achieved mediocrity!”

Finally, there is the question of how you measure how well your security program is operating.  You need a “yardstick” of some sort, and one of the security frameworks serves as that yardstick.  Given that these frameworks are only the basics, you need to add in all the additional controls your organization has in place that go beyond the framework.  That activity typically identifies a huge gap in the security program: there are few if any additional controls.  So, there you sit with, say, the PCI DSS as your “yardstick”, and your organization cannot consistently execute the basic security controls in that framework.

Yeah, that is it!  It is the yardstick’s fault!

26 Jun 20

The 2020 PCI Community Meetings Go Virtual

A lot of us remember when the 2017 NACM in Orlando was cancelled due to Hurricane Irma.

Troy Leach announced on Twitter yesterday that the 2020 NACM would be virtual as would all of the other Community Meetings.

It will not be the same, but at least we will be virtually together.




