Archive for September, 2016

30 Sep 16

2016 North American PCI Community Meeting

It was a hectic week out in Las Vegas at the Community Meeting this year.  I wish I had more time to just hang out with everyone, but I was in the middle of a number of assessments that needed to get done, so I was working at night and attending sessions during the day.

By the time you read this, the slide decks from the sessions will have been posted on the Council’s Web site.  All of you who attended will be able to download those presentations.  You go to the link provided in the program guide, enter your name, organization name, email address and the password from the program guide (ve4eqepR), and you are in.

The Council tried the 20 minute “TED Talk” format again with the Wednesday sessions.  A number of the sessions I attended could have easily used an extra 10 minutes, if not a complete hour.  I know the Council is trying to move things along and cover a lot of information, but topics like “the cloud” or the EMV standards simply cannot be properly discussed in 20 minutes.  I do not care how good the speaker is or how organized the presentation.

Here are some of the more notable highlights.

The Assessor Session Is Back

Possibly the most anticipated session of the Community Meeting this year was the return of the Assessor Session after a two-year absence.  But unlike previous years, where this session occurred before the start of the Community Meeting, the Assessor Session was moved to the end of the Community Meeting.  I heard a number of complaints throughout the week from assessors about being at the end of the meeting.  Yet when Thursday lunch came around, there were a lot of QSAs, ISAs and ASVs who adjusted their travel schedules (Guru included) to attend this session.

While I originally agreed with people that moving the Assessor Session to the end was not a good idea, the more I have thought about it, the more I think it was better at the end.  That way, questions on topics that come up during the meeting can be answered while all of the assessors are still together.  I know we all want to get home, but I think the Assessor Session offers more value to all of us at the end.

On the not so good side, the Council chose to use up an hour and 10 minutes presenting a variety of topics, some of which took far too long to discuss.  The larger question is why this material was not presented during the main conference.  Not only did the rest of the meeting attendees miss out, but some people never got their questions asked.  I am also sure that running long discouraged a lot of people from asking questions.

That said, there were a number of good questions asked during this session and the Council rewarded five people with large PCI SSC coffee mugs for their “good” questions.

One question, though, really created a stir.  I will address that question, regarding multi-factor authentication (MFA), in a separate post to be published later.  However, I will say this much: the Council really needs to go back and re-think its position on MFA if what was said is accurate.

The Council was asked about SAQ A and where it is headed.  The concern in the assessor community is that the mechanism that issues/controls the iFrame/redirect needs protection.  However, the changes to SAQ A for v3.2 did not seem to address this obvious risk.  Based on how the question was answered, I am guessing that the hosting community is trying to keep SAQ A as simple and easy as possible regardless of the risk.

Another area that the Council agreed to review was the change to requirement 3.2 in the ROC Reporting Template.  In v3.2 of the template, those requirements can no longer be marked as Not Applicable, yet it was pointed out that an ‘NA’ is still allowed in SAQ D.  The reason for seeking this clarification was the Council’s past guidance to follow the SAQs for P2PE (SAQ P2PE) and outsourced eCommerce (SAQ A) when filling out a ROC for merchants with those solutions.  It was pointed out that neither of these SAQs contains requirement 3.2, so how is a QSA/ISA supposed to respond to it in the reporting template if it cannot be marked as ‘NA’?

Understanding The Current Data Breach Landscape (aka Verizon DBIR Report Discussion)

When Verizon sends out Chris Novak, you know you will get a great presentation on the data breach incident report aka ‘The DBIR’.  This year was no exception albeit somewhat depressing as Chris again pointed out that most breaches are the result of sloppy operations, lax security and insecure applications.  Essentially security issues that we should have gotten past a long, long time ago but have not.

Architecting for Success

Who better to talk about success than a representative of the Jet Propulsion Laboratory (JPL), discussing how to develop spacecraft that explore the most inhospitable environments we know, outer space and other planetary bodies?  Brian Muirhead was the keynote speaker on Wednesday and is the Chief Engineer for the Mars Science Laboratory, the group that designed and developed the various Mars exploration rovers.  He gave a great discussion on how to look out for problems and develop self-managing devices.  It was very interesting and, I am sure, an eye opener that we need to stop accepting the sloppy and messy solutions we get for handling cardholder data.

Internet of Things Keynote

The Thursday keynote was just a great time.  While there seemed to be very little directly relevant to PCI compliance presented by Ken Munro and an associate from Pen Test Partners, it was a fabulous time exploring the wonderful world of flawed technology, from a tea kettle to a refrigerator to a child’s doll.  In the case of the child’s doll, they removed the word filter database, thereby allowing the doll to say things that no child’s toy should say.

What was relevant to PCI was the ease with which these folks were able to reverse engineer the firmware and software used by these devices.  It gave a lot of people in the room unfamiliar with IoT and penetration testing pause as to how easily seemingly sophisticated technology can be abused.

Cloud Security

While it was great to see Tom Arnold from PSC, the even better thing about this presentation was that Amazon provided an actual human being, in the form of Brad Dispensa, to talk about Amazon’s EC2 cloud.  While billed as a discussion on incident response, the session provided great insight into AWS’s EC2 service offering as well as the variety of new tools available to manage the EC2 environment and to provide auditors and assessors with information regarding the configuration of that environment.  The key takeaway from this session is that organizations using EC2 can provide everything needed for conducting a PCI assessment using their EC2 Master Console.
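To give a flavor of what that can look like in practice, here is a minimal sketch of my own (not anything shown in the session) that pulls EC2 instance and security group configuration with the AWS boto3 SDK, the sort of configuration evidence assessors typically ask for.  The specific calls and the firewall-evidence framing are my assumptions, not the presenters’.

import json
import boto3  # assumes AWS credentials and a default region are already configured

ec2 = boto3.client("ec2")

# In-scope instances: IDs, AMIs and the security groups attached to each.
for reservation in ec2.describe_instances()["Reservations"]:
    for inst in reservation["Instances"]:
        print(inst["InstanceId"], inst["ImageId"],
              [sg["GroupId"] for sg in inst.get("SecurityGroups", [])])

# Security group rules, the kind of thing an assessor reviews as firewall rule evidence.
print(json.dumps(ec2.describe_security_groups()["SecurityGroups"], indent=2, default=str))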

EMVCo

Brian Byrne from EMVCo gave a great 20 minute session on EMV.  The slide deck will be more valuable than the presentation because he had so much content to share and so little time to share it in.  Of note was his discussion of version 2.0 of Three Domain Secure, otherwise known as 3D Secure or 3DS.  While v1.0 will remain under the control of Visa, EMVCo has taken over management and development of the 3DS standard.  The new version is in draft and only available to EMVCo members, so this was the first time I had been able to see what the new version has to offer.  But because of the time constraint, I will need to wait for the slide deck to be published to know more.

PCI Quality Assurance Program

Brandy Cumberland of the Council provided a great presentation on the Council’s quality assurance program that all QSAs have become familiar with.  I appreciated her discussion of James Barrow, who took over the AQM program after most of us wanted to kill his predecessor for creating one of the most brutal QA programs we had ever seen.  James’ efforts to make the AQM program more relevant cannot be overstated, as he took over a very troubled affair.  This was a bittersweet discussion, as James passed away right after last year’s Community Meeting and will be greatly missed by those of us that came to know and respect him.  Brandy took over the AQM program when James left the Council and has been doing a great job ever since.  She is possibly one of the best resources the Council has and does the AQM program proud.

Application Security at Scale

The last great session of the conference I saw was from Jeff Williams of Contrast Security.  The reason this session was great was that it discussed what application developers can do to instrument their applications for not only security, but also operational issues.  He introduced us to interactive application security testing (IAST) and runtime application self-protection (RASP).  The beauty of this approach is that applications get security in the form of embedded instrumentation that produces actionable analytics, which then allow decisions to be made to respond to threats to these applications.  It sounds like an interesting approach and concept and I cannot wait to see it in action.
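To make the embedded-instrumentation idea concrete, here is a toy sketch of my own, not Contrast Security’s product or approach: a wrapper observes a request handler, emits telemetry about what it sees and blocks an obviously hostile input.  Real IAST/RASP tooling works inside the runtime with far more context; the “suspicious” markers below are purely illustrative.

import functools
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("appsec")

SUSPICIOUS = ("' or 1=1", "<script>", "../")  # naive indicators, for the sketch only

def instrumented(handler):
    """Wrap a request handler with security telemetry and a crude block rule."""
    @functools.wraps(handler)
    def wrapper(request):
        payload = str(request.get("query", "")).lower()
        if any(marker in payload for marker in SUSPICIOUS):
            # Self-protection: refuse the request and emit an actionable event.
            log.warning("blocked request to %s: %r", handler.__name__, payload)
            return {"status": 403}
        log.info("clean request to %s", handler.__name__)
        return handler(request)
    return wrapper

@instrumented
def search(request):
    return {"status": 200, "results": []}

print(search({"query": "widgets"}))         # handled normally
print(search({"query": "x' OR 1=1 --"}))    # blocked by the embedded instrumentation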

As always, it was great to see and catch up with all of my friends in Las Vegas at the PCI Community Meeting.  It was also great to meet a lot of new people as well.  I look forward to seeing all of you again next year in Orlando.

29 Sep 16

Microsoft Changes Their Patching Strategy

Back in May 2016, Microsoft issued a blog entry on TechNet giving the world insight into its new patching strategy: a monthly “rollup” patch, or what a lot of people are calling a “mega-patch”.  In August, another blog entry was posted that further detailed this strategy and explained that, from October 2016 forward, this is how Microsoft would patch Windows.

But there is even more to it.  For WSUS and SCCM users, security patches will be separated from the Monthly Rollup into their own Security mega-patch.  The idea behind separating the security patches into their own mega-patch is to allow organizations to at least stay current on security.  However, there is a twist to this approach as well.  Organizations such as small businesses that do not use WSUS or SCCM will only get a single mega-patch through Windows or Microsoft Update that contains the Monthly Rollup and Security mega-patches in one.

So what could go wrong, you might be asking?

The biggest drawback to this scheme is that, should you have any issue with a mega-patch, you must back out the whole patch, not just the item that is creating the issue.  That means instead of having just one potential issue to mitigate, you could have as many issues to mitigate as the patch contains.  From a PCI compliance perspective, that could mean lots of missing patches in your Windows systems if your systems run into an issue with a mega-patch.  This can get doubly bad for organizations not using WSUS or SCCM because they will be backing out security patches as well as application patches.

But it can get even worse.  These mega-patches are cumulative, meaning that every month Microsoft rolls the previous mega-patch into the new month’s mega-patch.  Say, for example, you apply the monthly mega-patch and your point of sale (POS) application fails to work with it, so you must back it out.  Because each month’s mega-patch contains every previous one, if your vendor is slow to resolve the conflict you will not be able to patch your POS systems at all until that compatibility issue is fixed.

But I foresee small businesses running into the worst issue with this new approach.  Since small organizations likely will not be using WSUS or SCCM, they will not get a separate Security mega-patch; they will only get the single mega-patch that combines the Monthly Rollup and Security patches.  If any issue occurs with that single mega-patch, these businesses will not even get their security patches.  That creates a situation where the organization must figure out how to mitigate its inability to secure its systems, and it could mean months of security issues until the original compatibility problem is resolved.

But to add insult to injury, I can also see situations where a vendor has issues resolving a compatibility problem with a mega-patch and finally gets it fixed only to encounter a new compatibility issue with the latest mega-patch.  Based on how Microsoft is running these mega-patches, there appears to be no way to go back to a compatible and useable mega-patch.  This could result in organizations being unable to patch at all due to ongoing compatibility issues.
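To illustrate why the cumulative model is so painful, here is a hypothetical model (the months and patch names KB-A through KB-D are invented for the example) showing how a single incompatible fix keeps every later month’s rollup from being applied, so the list of missing fixes only grows.

# Hypothetical model: months and patch names are invented for illustration.
monthly_rollups = {
    "2016-10": ["KB-A", "KB-B"],
    "2016-11": ["KB-A", "KB-B", "KB-C"],          # cumulative: includes October
    "2016-12": ["KB-A", "KB-B", "KB-C", "KB-D"],  # cumulative: includes Oct and Nov
}
incompatible = {"KB-B"}  # say the POS vendor cannot tolerate this one fix

applied = set()
for month in sorted(monthly_rollups):
    rollup = set(monthly_rollups[month])
    if rollup & incompatible:
        print(month, "rollup backed out; fixes still missing:", sorted(rollup - applied))
    else:
        applied |= rollup
        print(month, "rollup applied")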

At a minimum, I think Microsoft will need to make the Security mega-patch separate from the Monthly Rollup for all organizations, not just those using WSUS or SCCM.  At least then, all organizations could apply security patches independent of the Monthly Rollup, which is the more likely of the two to create compatibility issues.

It will be interesting to see how this new patching strategy plays out.  Hopefully it does not create even more risk for users of Windows.  If it does, I would not be surprised if the PCI SSC invokes significant new controls on Windows-based solutions.  That could be the final straw in using Windows for a lot of merchants.  Time will tell.

09 Sep 16

Level 3 Versus Level 4 Merchants

There seems to be a lot of confusion over these two merchant levels.  As such I thought I would take a quick moment to clarify them.

From the respective Web sites, here are the definitions for a Level 3 Merchant.

“20,000 to 1 million ecommerce Visa transactions annually” – Visa USA

“Any merchant with more than 20,000 combined Mastercard and Maestro e-commerce transactions annually but less than or equal to one million total combined Mastercard and Maestro e-commerce transactions annually” OR “Any merchant meeting the Level 3 criteria of Visa” – MasterCard

From the respective Web sites, here are the definitions for a Level 4 Merchant.

“Merchants processing less than 20,000 Visa ecommerce transactions annually and all other merchants processing up to 1 million Visa transactions annually” – Visa USA

“All other merchants” – MasterCard

The operative factor is eCommerce transactions.  Level 3 has always been about eCommerce.  It was specifically created to identify those small merchants that were predominantly eCommerce focused.  That delineation is important because of the risk presented by card not present (CNP) payment transactions as well as the potential loss of sensitive authentication data (SAD) or cardholder data (CHD) from Web sites used for eCommerce.

However, where the confusion occurs is that both merchant levels end at 1 million total transactions from all payment sources (i.e., eCommerce, MOTO, card present, etc.).

The bottom line is that if your organization is conducting more than 20,000 payment transactions through your eCommerce payment channel, but your total number of payment transactions is less than 1 million, then you are a Level 3 Merchant.  Otherwise, your organization is a Level 4 Merchant.
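For those who like things spelled out in code, here is a minimal sketch of that rule using the thresholds quoted above.  It is illustrative only, not an official classification tool, and it deliberately refuses to classify merchants at or above 1 million total transactions since those fall into Levels 1 and 2.

def merchant_level(ecommerce_txns, total_txns):
    """Return 3 or 4 for merchants under 1 million total transactions."""
    if total_txns >= 1000000:
        raise ValueError("1 million or more total transactions is Level 1 or 2, not 3 or 4")
    if ecommerce_txns > 20000:
        return 3
    return 4

print(merchant_level(ecommerce_txns=50000, total_txns=400000))  # Level 3
print(merchant_level(ecommerce_txns=5000, total_txns=900000))   # Level 4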

Now we should all be on the same page.



