Author Archive for PCI Guru

02 Dec 16

Not Tested Clarification

In the November 2016 Assessor Newsletter from the PCI SSC, there is a clarification on what ‘Not Tested’ actually means and implies.  I am sure this will really get some service providers whipped up as it will create some issues with work they perform on behalf of their customers.

The following is taken directly from that newsletter.

“Recently, AQM has received some questions about the impact of using “Not Tested” as a response within a completed ROC. This article is intended to address a few points briefly, with published documentation to follow.

  1. Due to an oversight, the option for “Not Tested” was not included in the summary findings table within the summary overview when that table was introduced with the ROC Reporting Template for use with PCI DSS v3.2. We will publish an errata for the ROC Reporting Template shortly.
  2. Some have asked whether one can have a compliant AOC in instances where “Not Tested” was used. While PCI SSC is not able to comment on matters of compliance, we would direct you to read the verbiage at Part 3 PCI DSS Validation of the Attestation of Compliance below:

    [Image: Attestation of Compliance, Part 3: PCI DSS Validation]

How to achieve “all questions answered affirmatively” is the question. PCI SSC does not consider “Not Tested” to be an affirmative statement. The difference between “Not Tested” and “Not Applicable” is that no testing at all is performed for “Not Tested” whereas for “Not Applicable” some testing is performed to confirm a given control is truly not applicable. As such, between “Not Tested” and “Not Applicable,” only “Not Applicable” can be considered an affirmative response.

The intent in introducing “Not Tested” was to achieve a better level of transparency as to the level of compliance and this clarification supports that intent. If you have questions or suggestions, please reach out to the QSA Program Manager.”

It is that second-to-last paragraph that will likely send most people off the deep end.  The comment that the “PCI SSC does not consider “Not Tested” to be an affirmative statement” really got me going.  What exactly, then, was the point of introducing ‘Not Tested’ if you do not consider it an affirmative statement?  By the way, ‘affirmative’ used as an adjective means “asserting the truth, validity, or fact of something.”  Last I checked, ‘Not Tested’ would be considered a truth or fact.

There are a number of options for the Council to take here.

  1. Change the wording in the ‘Compliant’ box in Part 3 to reflect that an entity is compliant with all of the requirements tested.
  2. Give us a box in Part 3 that says ‘Compliant with Exceptions’ or something of that ilk which would allow those entities not testing certain requirements to still be judged compliant with what was tested.
  3. Tell QSAs that an AOC cannot be filled out for assessments that mark any requirements as ‘Not Tested’ because an AOC is not relevant.

I remember at a number of past Community Meetings various Council representatives repeatedly and emphatically telling those of us from the accounting community that PCI assessments were not SAS 70 (now SSAE 16) engagements whenever we invoked SAS 70-like rules for sampling, testing and the like.  Well, I hate to say it, but the Council is sure turning them into one with all of these pronouncements.

25 Nov 16

The Council’s Take On Non-Listed Encryption Solutions

On Monday, November 21, the PCI SSC posted a blog entry discussing their new Information Supplement titled ‘Assessment Guidance for Non-listed Encryption Solutions’.  After reading their post, I had a few comments of my own.

Mike Thompson, chair of the P2PE Working Group, states that:

“We are encouraged by the significant growth of the PCI P2PE Program in the last two years and the increasing number of PCI P2PE Solutions listed on our website.”

Yes, you have gotten up to 23 listed solutions, but you are still missing First Data TransArmor, Shift4 True P2PE and Verifone VeriShield, which probably comprise the vast majority of E2EE solutions used by merchants.  And most of the solutions that are validated were validated to v1.x of the standard, not the latest version.  Vendors are moving over to v2.x, but only slowly.  Some of that is due to the pace at which they can get through the Council’s QA process, but the larger reason is probably that the original cost of getting validated (large) compared to what that validation was actually worth in sales (small) has made them question the value of getting revalidated.

“The Council recognizes this creates a challenge for Qualified Security Assessors (QSA) in how to complete PCI DSS assessments for these merchants and that guidance is needed.”

It creates a challenge?  There has been a documented and agreed-upon approach for handling E2EE solutions in place for years.  If QSAs are unaware of that approach, it is only because the Council has neglected to cover it in their training.  So the fact that the Council now believes guidance is needed is the Council’s own fault.

That said, the guidance the Council is providing in the Information Supplement is in the best interests of the Council because it effectively recommends the solution be P2PE assessed by a P2PE QSA.

It means a few more P2PE QSAs will be needed.  There will not need to be a significant increase in P2PE QSAs because there really are not that many E2EE solutions out there that would drive the training of masses of P2PE QSAs like we have with PCI QSAs.  Let alone the fact that most solution vendors will likely ignore this recommendation unless the card brands force the issue.

But better yet, if a solution vendor has to effectively go through a P2PE assessment, why not just pay the money and have the solution listed on the Council’s Web site?  What better way to drive revenue for a standard that has attracted only a few providers because the assessment process is just as onerous and costly as the PA-DSS which is also in trouble.

Never mind the fact that getting through the Council’s QA process has been a tremendous nightmare.  Most P2PE QSAs equate the QA process to the PA-DSS QA process which has become a huge problem for payment application providers.  Since the PCI SSC is legally on the hook for validated solutions listed on their Web site, the Council is going to be extremely diligent in their review of all validated solutions.

In the end, E2EE providers are not convinced that going through the process is worth the initial and ongoing effort and costs.  They are still selling their solutions without validation in higher volumes than those vendors that have gone through the P2PE validation process.  And those vendors that have been through the validation process are questioning the value of the process since it has not resulted in high sales volumes.

“PCI P2PE Solutions provide the strongest protection for payment card data and simplify PCI DSS compliance efforts.”

I have to say that this is the most hilarious statement made in this post.  There are a number of P2PE validated solutions that allow the use of 168-bit triple DES (3DES) as the encryption algorithm protecting the data.  While 3DES is still considered “strong” by the US National Institute of Standards and Technology (NIST), it is only barely considered so.  NIST has been advising organizations for years to migrate away from 168-bit 3DES because it is only a matter of time before it, too, is broken, as its 56-bit and 112-bit predecessors were.  In fact, NIST issued a new warning on 3DES late in 2015 after a researcher earlier that year broke 168-bit 3DES implementations that used keys of fewer than six characters.

The E2EE solutions in use these days rely on the Advanced Encryption Standard (AES), which is much stronger than 3DES and has yet to be broken in any of its variants.
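To make the difference concrete, here is a minimal sketch of AES-256-GCM protecting a (fake) PAN, assuming the widely used third-party Python ‘cryptography’ package.  It is purely illustrative: real P2PE/E2EE solutions perform this inside the terminal’s secure cryptographic device with proper key management (e.g., DUKPT), not in application code like this.

```python
# Minimal sketch of AES-256-GCM encryption of a fake PAN using the
# third-party "cryptography" package. Illustrative only -- real P2PE/E2EE
# solutions do this inside the POI's secure cryptographic device.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce, unique per message

fake_pan = b"4111111111111111"              # test card number, not real data
ciphertext = aesgcm.encrypt(nonce, fake_pan, None)

# Only the party holding the key (processor/gateway) can recover the PAN.
recovered = aesgcm.decrypt(nonce, ciphertext, None)
assert recovered == fake_pan
```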

“We want to make it easier for assessors, acquirers, and merchants to get the information they need to make decisions about risk and PCI DSS responsibilities when using non-listed account data encryption solutions.”

As I said earlier, there has been a process in place for years for handling such solutions: the QSA reviews the implementation of the E2EE solution to ensure it has been implemented properly, then submits the results of that assessment to the acquiring bank for its approval of the scope reduction.

In the vast majority of cases, the acquiring bank or one of its subsidiaries is also the provider of the solution, so the QSA is usually just ensuring that the merchant implemented the solution properly and the bank signs off on the reduction in scope.

However, on some occasions the QSA must go through a somewhat more rigorous process to prove that the solution does in fact encrypt the data and that the data stream cannot be decrypted anywhere but at the payment processor or gateway.  While this can take a bit more time, it typically is not as time consuming as the Council makes it out to be.  Again, in every case the processor or gateway has recommended the vendors involved, so the process is straightforward and easily accomplished, and it is only the acquiring bank that would have questions or concerns.
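For those curious what that more rigorous verification can look like in spirit, here is a rough sketch of one sanity check: scanning a captured transaction payload for digit runs that pass a Luhn check, which would indicate cleartext PANs leaking past the POI.  The file name is hypothetical and this is no substitute for the full review.

```python
# Rough sketch: scan a captured transaction payload for digit runs that
# pass a Luhn check, i.e. potential cleartext PANs. If the E2EE device is
# encrypting at the point of interaction, no candidate should appear.
# "capture.bin" is a hypothetical dump of the POI-to-processor traffic.
import re

def luhn_ok(digits: str) -> bool:
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

with open("capture.bin", "rb") as f:
    text = f.read().decode("latin-1")   # keep every byte, no decode errors

candidates = [m for m in re.findall(r"\d{13,19}", text) if luhn_ok(m)]
print(f"{len(candidates)} possible cleartext PAN(s) found")
```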

It is not that the P2PE approach is a bad thing.  It is just that the Council overreached when they created it.  The original process was messy, complex and non-modular, and it did not allow large merchants to continue their operations as they existed.  As a result, it was not seen as necessary by the stakeholders of the standard, and without their support there was little reason for adoption.  As it turned out, the existing E2EE solutions dominated the marketplace without validation.

At the end of the day, the Council is trying to force E2EE solution vendors to validate their solutions to the P2PE standard and make that standard relevant.  However without the force of the card brands and banks behind it, the P2PE standard will continue to be dead on arrival.

The good news is that this is only an Information Supplement, so it only needs to be obeyed if merchants and solution vendors choose to obey it.  Based on the prevalence of E2EE solution implementations, I would expect things to continue “as is”.

UPDATE: On Tuesday, December 6, 2016, the Council issued an FAQ on this subject as well as announced a Webinar for Thursday, December 15, at 11AM ET to give QSAs and ISAs an update on this topic. However in reading the FAQ, it still appears that the whole purpose of this Information Supplement is just to drive vendors to validate their solutions to P2PE since the recommendation is to have a P2PE-QSA validate the vendor’s solution to the P2PE standard(s) and then issue some sort of report for the merchants to use.

20 Nov 16

Revenue Generation Or Payment Security?

Late on Friday, November 18, the PCI Security Standards Council issued a draft Information Supplement titled ‘Assessment Guidance for Non-Listed Encryption Solutions’.  For those of you that follow my blog, these solutions would be what I refer to as end-to-end encryption (E2EE) solutions.  This is a draft document, but I would bet there will be a lot of discussion regarding it.  The good news is that it is a draft and an Information Supplement, so it is not yet official and is only offering a suggestion of how organizations should proceed.

The biggest recommendation that comes from this Information Supplement is the one that will cause the most heartburn and the most discussion.  The Council is recommending that a P2PE QSA assess a vendor’s E2EE solution and issue a non-listed encryption solution assessment (NESA).  As you read further into the document, the NESA is just a different name for a P2PE assessment.  So essentially, what the Council is recommending is a P2PE assessment without the QA review and listing by the Council of the solution on their Web site.

All I can think of is that the Council is taking this approach so that First Data, Verifone and others will be forced to get their E2EE solutions P2PE validated.  After all, if you have to go through a P2PE assessment to allow merchants to use your solution, why stop there?  Why not just get it validated and listed on the Web site?

But the next thing that is troublesome is the implication that regular QSAs are not capable of adequately assessing an E2EE solution, as if the mystical P2PE QSA training process imbues some sort of encryption omnipotence on those who attend and pass the test.  If you have ever looked at the P2PE Report On Validation (ROV), I think most QSAs could easily execute it.

But I think the real reason behind this Information Supplement is revenue.  The Council is driving revenue to their bottom line with these recommendations.  There will likely have to be more P2PE QSAs and those non-listed solutions will likely end up as P2PE validated.  All of those activities generate revenue for the Council.  Revenue that is needed since the card brands have limited their funding of the Council.

Another big reason to believe this is just a revenue generator for the Council is the fact that, unlike a lot of other Information Supplements, this one was not developed by a committee of card brands, Participating Organizations, QSAs or other stakeholders.  In the 14 pages that comprise this Information Supplement, there is no page that lists any outside contributors.

So other than the Council, who could be driving this Information Supplement?

The acquiring banks?  I just completed an assessment of a merchant using an E2EE solution recommended to the merchant by their acquiring bank.  That acquiring bank is a major player in the payment processing industry, so you would assume they would have pointed me to the P2PE ROV for the testing of the E2EE solution, but they did not.

First Data, TrustCommerce and Verifone have never pointed me to the P2PE ROV for assessing their E2EE solutions.  So the payment processors are not demanding this sort of assessment.

One would think that the card brands would have each issued a press release announcing this draft, but they did not.

That only leaves us with a unilateral decision made by the Council that this was necessary.

But the real question is, how does this Information Supplement improve the security of the payment process?

Have there been a huge number of E2EE solutions that have been breached and this is a response?  I have not heard of any nor have I seen anything in the media indicating that E2EE solutions are a problem.

Are there “fly by night” vendors of E2EE solutions running rampant in the industry?  Not that I have encountered but it would not surprise me if there were a few.  That said, the merchants I have worked with in implementing E2EE solutions only worked with vendors recommended by their acquiring bank, payment processor or payment gateway.  In most of these cases, the solutions were from First Data and Verifone who are widely trusted in the industry.

I suppose this could be a proactive step to get ahead of things getting out of control with E2EE solutions.  But if that were the case, one would think that the card brands and acquiring banks would have been on board and pushing this effort as well as the Council and explaining that they were being proactive.  Nothing on that front either.

That leaves us with the conclusion that the only purpose of this Information Supplement is to generate revenue for the Council at the expense of merchants, E2EE vendors and, ultimately, consumers.

The P2PE standard has been a big flop in the industry because, surprise, surprise, it is doing nothing to help the industry.  If it had been adopted by the big players such as First Data and Verifone, then we would probably be in a different place.  But there is a reason those big players and others never got on board: the standard is too cumbersome, time consuming and onerous, just like the now failing PA-DSS process.

Do not get me wrong, every organization has to make money to subsidize its existence.  But I am troubled that the Council now appears to be generating requirements for the purposes of revenue generation rather than the securing of the payment process.

It appears that we have turned a corner and that it may not be a good corner to have turned.

08 Oct 16

The Future Of PCI?

The 2016 North American Community Meeting was a celebration of the PCI SSC’s 10th anniversary.  And as with such anniversaries, the Council provided a look back and thoughts on the future.  During these looks into the future, I found some of their assertions questionable and they caused me to question the Council’s thought processes regarding the future of the Council and their standards.

The first instance was at Stephen Orfei’s keynote on the first day.  The General Manager of the PCI SSC proudly announced that the Council trains around 5,000 people annually and that there are currently just over 2,000 QSAs and over 1,700 ISAs.  He then went on to explain that this is only the beginning and that more QSAs and ISAs would be needed.  But such a statement seems to run counter to where I think PCI is headed.

From the very beginning, the goal of the PCI standards has been to protect sensitive authentication data (SAD) and cardholder data (CHD) and to remove it from processes that do not require it.  With most merchants moving to P2PE, E2EE and tokenization, the only scope at those merchants is going to be the card terminal or point of interaction (POI).  The only organizations with SAD/CHD remaining will be transaction processors and acquiring banks.  With that premise, why would there need to be growth in QSAs?  In my opinion, with merchant scope radically shrinking, the need to increase QSA and ISA counts is a pipe dream.
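For readers less familiar with why tokenization shrinks scope so dramatically, here is a conceptual sketch (not any particular vendor’s implementation) of what a token service does: the merchant ends up holding only a surrogate value, while the PAN lives solely at the processor.

```python
# Conceptual sketch of tokenization on the processor side: the merchant
# stores only the surrogate token, never the PAN, so its systems hold no
# cardholder data. Real token services use vaulted HSMs and
# format-preserving schemes; this is illustration only.
import secrets

class TokenVault:
    def __init__(self):
        self._vault = {}                    # token -> PAN, lives at the processor

    def tokenize(self, pan: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = pan
        return token                        # this is all the merchant ever keeps

    def detokenize(self, token: str) -> str:
        return self._vault[token]           # only ever called inside the processor

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # test PAN, not real data
print(token)                                # e.g. tok_9f2c1a...; safe to store
```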

If there will be less of a need for QSAs, there will also likely be fewer QSACs.  Right now there are almost 370 QSACs in the world.  If all that will be left to actually assess are transaction processors, issuers and acquiring banks, then the number of QSACs will also have to shrink.  That means more competition for those transaction processors, issuers and acquiring banks until the QSAC numbers get to a more reasonable level based on market demand.

I could see the need for ISAs to potentially go up, but I would expect a lot of those people will just be QSAs that go in-house as the QSA numbers shrink.  With the scope of merchants shrinking so much, the need for ISAs is not going to be as large as I think the Council believes.  However, because of the silly Council rule that you cannot convert from a QSA to an ISA without going through the ISA training program, the Council will still have ISA training revenue regardless for the time being.

eCommerce will continue to be an ever larger part of merchants’ business.  But again, most merchants are moving to redirects and iFrames to reduce PCI scope.  While I fully expect the Council to adjust SAQ A to finally, realistically address the risks of even redirects and iFrames, that will likely not require any increase in ASVs, who currently number 107.  Never mind that the ASV business became a commodity long ago in the rush for every ASV to be a low-cost provider.  As a result, there is very little margin left, if any at all, in ASV scanning, and most ASVs are only in the business because they need to offer vulnerability scanning services so their clients can “one stop shop” their PCI compliance.  I really doubt there will be any growth in the number of ASVs, and I would not be surprised if the number drops over the next decade.

The next time I felt like the Council was going down the wrong path was when I attended the small merchant session.  What a waste of people’s time.  During that session, I leaned over to one of my colleagues who was there and said, “Why is this taking so long?”

“What is your problem?” They asked.

“Why are they not just telling these small merchants to go to P2PE and tokenization?  Just get this done and done right.” I said very frustrated.

In my mind the small merchant session was 45 minutes too long.  This topic is one of those rare instances where it could be discussed in one of those TED Talk like 20 minute sessions.  Small merchants are looking for a quick answer and they have one.  P2PE and tokenization.  Period.  End of discussion.  Yet the group on stage continued to blather on and on and on.

There you have it.  I feel much better now that I have that off my chest.

04 Oct 16

The Great Multi-Factor Authentication Debate

The Council brought back the Assessor Session at this year’s Community Meeting, and it took only one question to get passions flowing.  The question sought clarification of a comment made by Ralph Poore, Director, Emerging Standards at the Council, about multi-factor authentication (MFA).

First, a little background to get everyone up to speed, remembering that the US National Institute of Standards and Technology (NIST) SP800-63B standard in question is still a draft and has not been finalized.  However, everyone expects the standard to be adopted largely unchanged, with only minor wording revisions that would not affect its overall recommendations.

What NIST stated about SMS is in section 5.1.3.2, Out-of-Band Verifiers, of SP800-63B, which states:

“Due to the risk that SMS messages or voice calls may be intercepted or redirected, implementers of new systems SHOULD carefully consider alternative authenticators. If the out-of-band verification is to be made using the public switched telephone network (PSTN), the verifier SHALL verify that the pre-registered telephone number being used is not associated with a VoIP (or other software-based) service. It then sends the SMS or voice message to the pre-registered telephone number. Changing the pre-registered telephone number SHALL NOT be possible without two-factor authentication at the time of the change. OOB using the PSTN (SMS or voice) is deprecated, and may no longer be allowed in future releases of this guidance.”

NIST is only calling out that implementers of new systems should carefully consider the security implications of using SMS or voice for MFA.  NIST has not invalidated existing SMS and voice MFA solutions; it just does not want any new implementations unless there is no other choice because the implementation is already underway.  So while SMS or voice MFA can still be used in existing implementations, NIST is saying that future implementations of SMS and voice MFA are out of the question, which basically kills those solutions going forward.
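For context, the kind of “soft” authenticator that NIST is not deprecating is something like a TOTP app.  The sketch below, using only the Python standard library, shows what such an authenticator computes per RFC 6238; the shared secret shown is an example value only.

```python
# Minimal RFC 6238 TOTP sketch using only the standard library, showing
# what a "soft" authenticator app computes. This class of authenticator
# is not what NIST deprecated -- only SMS/voice over the PSTN is.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step            # 30-second time step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Shared secret provisioned to the authenticator app (example value only)
print(totp("JBSWY3DPEHPK3PXP"))
```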

With that as our background, in a Community Meeting session Ralph Poore stated that MFA to devices such as smartphones, or back to the same device or browser (i.e., “soft” solutions), was not considered secure because of statements in the NIST draft of SP800-63B.  I was attending a different session when Ralph made his statements, but I can tell you that my cell phone started buzzing with text messages from various people asking if we had all heard what we had heard.  But since there was no Q&A at that session, there was no way to clarify Ralph’s statements.

As a result, this issue was brought up in the Assessor Session to clarify those MFA comments.  Ralph stood and reiterated his remarks and that sent the room into an absolute tizzy.  It was pointed out that NIST had only invalidated SMS and voice for future two-factor authentication, not all soft token solutions such as RSA’s or Symantec’s application solutions.  However, Ralph continued to repeat his remarks saying that they had invalidated all soft solutions.  That brought the house down and people were loudly explaining that his comments were invalidating decades of recommendations for OOB MFA solutions.  Eventually the room calmed down and the Council agreed to review their position on such “soft” MFA solutions.

So that is where we are with this subject.  Time will tell if the Council revises its statements on MFA and comes into line with what NIST is saying on the subject.

30 Sep 16

2016 North American PCI Community Meeting

It was a hectic week out in Las Vegas at the Community Meeting this year.  I wish I had more time this year to just hang out with everyone, but I was in the middle of a number of assessments that needed to get done, so I was working at night and attending sessions during the day.

By the time you read this, the slide decks from the sessions will have been posted on the Council’s Web site.  So all of you that attended will be able to download those presentations.  You go to the link provided in the program guide, provide your name, organization name, email address and the password from the program guide (ve4eqepR) and you are in.

The Council tried the 20 minute “TED Talk” format again with the Wednesday sessions.  A number of the sessions I attended could have easily used an extra 10 minutes, if not a complete hour.  I know the Council is trying to move things along and get a lot of information covered, but topics like “the cloud” or EMV standards simply cannot be properly discussed in 20 minutes, no matter how good the speaker or how organized the presentation.

Here are some of the more notable highlights.

The Assessor Session Is Back

Possibly the most anticipated session of the Community Meeting this year was the return of the Assessor Session after being missing for two years.  But unlike previous years where this session occurred before the start of the Community Meeting, the return of the Assessor Session was moved to the end of the Community Meeting.  I heard a number of complaints throughout the week from assessors about being at the end of the meeting.  Yet when Thursday lunch came around, there were a lot of QSAs, ISAs and ASVs that adjusted their travel schedules (Guru included) to attend this session.

While I originally agreed with people that moving the Assessor Session to the end was not a good idea, the more I have thought about it, the more I think it was better at the end.  That way, assessors can get questions about topics that came up during the meeting answered while we are all together.  I know we all want to get home, but I think the Assessor Session offers more value to all of us at the end.

On the not-so-good side, the Council used up an hour and 10 minutes presenting a variety of topics, some of which took far too long to discuss.  But the larger question is why this material was not presented during the main conference.  Not only did the rest of the meeting attendees miss out, but some people never got to ask their questions, and I am sure that running long discouraged others from asking questions at all.

That said, there were a number of good questions asked during this session and the Council rewarded five people with large PCI SSC coffee mugs for their “good” questions.

One question, though, really created a stir.  I will address that question, regarding multi-factor authentication (MFA), in a separate post to be published later.  However, I will say this about the discussion: the Council really needs to go back and rethink its position on MFA if what was said is accurate.

The Council was asked about SAQ A and where it is headed.  The concern in the assessor community is that the mechanism that issues/controls the iFrame/redirect needs protection.  However the changes to SAQ A for v3.2 did not seem to address this obvious risk.  Based on how the question was answered, I am guessing that the hosting community is trying to keep SAQ A as simple and easy as possible regardless of the risk.

Another area the Council agreed to review was the change to requirement 3.2 in the ROC Reporting Template.  In v3.2 of the template you can no longer mark that requirement as Not Applicable; however, it was pointed out that an ‘NA’ is still allowed in SAQ D.  The reason for seeking this clarification relates to past guidance from the Council to follow the SAQs for P2PE (SAQ P2PE) and outsourced eCommerce (SAQ A) when filling out a ROC for merchants with those solutions.  It was pointed out that neither of these SAQs includes requirement 3.2, so how is a QSA/ISA supposed to respond to it in the reporting template if it cannot be marked as ‘NA’?

Understanding The Current Data Breach Landscape (aka Verizon DBIR Report Discussion)

When Verizon sends out Chris Novak, you know you will get a great presentation on the Data Breach Investigations Report, aka ‘The DBIR’.  This year was no exception, albeit somewhat depressing, as Chris again pointed out that most breaches are the result of sloppy operations, lax security and insecure applications: essentially, security issues that we should have gotten past a long, long time ago but have not.

Architecting for Success

Who better to talk about success than a representative of the Jet Propulsion Laboratory (JPL), discussing how to develop spacecraft to explore the most inhospitable environments we know: outer space and other planetary bodies.  Brian Muirhead, Wednesday’s keynote speaker, is the Chief Engineer for the Mars Science Laboratory, the group that designed and developed the various Mars exploration rovers.  He gave a great talk on how to look out for problems and develop self-managing devices.  It was very interesting and, I am sure, an eye opener: a reminder that we need to stop accepting the sloppy and messy solutions we get for handling cardholder data.

Internet of Things Keynote

The Thursday keynote was just a great time.  While there seemed to be very little directly relevant to PCI compliance presented by Ken Munro and an associate from Pen Test Partners, it was a fabulous time exploring the wonderful world of flawed technology, from a tea kettle to a refrigerator to a child’s doll.  In the case of the child’s doll, they removed the word filter database and thereby allowed the doll to say things that no child’s toy should say.

What was relevant to PCI was the ease with which these folks were able to reverse engineer firmware and software used by these devices.  It gave a lot of people unfamiliar with IoT and penetration testing in the room pause as to how seemingly sophisticated technology can be easily abused.

Cloud Security

While it was great to see Tom Arnold from PSC, the even better thing about this presentation was that Amazon provided an actual human being, in the form of Brad Dispensa, to talk about Amazon’s EC2 cloud.  While billed as a discussion on incident response, the session provided great insight into AWS’s EC2 service offering as well as the variety of new tools available to manage the EC2 environment and to provide auditors and assessors with information regarding the configuration of that environment.  The key takeaway from this session is that organizations using EC2 can provide everything needed for conducting a PCI assessment using their EC2 Master Console.
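As an illustration of the kind of configuration evidence that can also be pulled programmatically from AWS, here is a hedged sketch using the boto3 SDK to list EC2 security group rules, the sort of data a requirement 1 firewall review needs.  This is not the AWS tooling demonstrated in the session, just one way to gather similar information, and it assumes credentials and the example region are already configured.

```python
# Illustrative sketch of pulling EC2 security group rules as assessment
# evidence (e.g. for PCI DSS requirement 1 rule reviews), assuming the
# boto3 SDK and AWS credentials are already configured.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")   # region is an example

for sg in ec2.describe_security_groups()["SecurityGroups"]:
    print(f"{sg['GroupId']} ({sg.get('GroupName', '')})")
    for rule in sg.get("IpPermissions", []):
        ports = f"{rule.get('FromPort', 'all')}-{rule.get('ToPort', 'all')}"
        sources = [r["CidrIp"] for r in rule.get("IpRanges", [])]
        print(f"  inbound {rule.get('IpProtocol')} {ports} from {sources}")
```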

EMVCo

Brian Byrne from EMVCo gave a great 20 minute session on EMV.  The slide deck will be more valuable than the presentation because he had so much content to share and so little time to share it in.  Of note was his discussion of version 2.0 of Three-Domain Secure, otherwise known as 3D Secure or 3DS.  While v1.0 will remain under the control of Visa, EMVCo has taken over management and development of the 3DS standard.  The new version is in draft and only available to EMVCo members, so this was the first time I had been able to see what the new version has to offer.  But because of the time constraint, I will need to wait for the slide deck to be published to know more.

PCI Quality Assurance Program

Brandy Cumberland of the Council provided a great presentation on the Council’s quality assurance program, with which all QSAs have become familiar.  I appreciated her discussion of James Barrow, who took over the AQM program after most of us wanted to kill his predecessor for creating one of the most brutal QA programs we had ever seen.  James’ efforts to make the AQM program more relevant cannot be overstated, as he took over a very troubled affair.  This was a bittersweet discussion, as James passed away right after last year’s Community Meeting and will be greatly missed by those of us who came to know and respect him.  Brandy took over the AQM program when James left the Council and has been doing a great job ever since.  She is possibly one of the best resources the Council has and does the AQM program proud.

Application Security at Scale

The last great session of the conference I saw was from Jeff Williams of Contrast Security.  The reason this session was great was that it discussed what application developers can do to instrument their applications not only for security, but also for operational issues.  He introduced us to interactive application security testing (IAST) and runtime application self-protection (RASP).  The beauty of this approach is that applications get security in the form of embedded instrumentation that produces actionable analytics, which then allow decisions to be made in response to threats against those applications.  It sounds like an interesting approach and concept, and I cannot wait to see it in action.
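To give a flavor of the RASP concept (and only the concept; this is nothing like Contrast’s actual agent), here is a toy sketch in which instrumentation wrapped around an application function inspects input at run time and blocks obviously hostile values.

```python
# Toy illustration of the RASP idea: instrumentation embedded in the
# application observes calls at run time and blocks obviously hostile
# input. Real IAST/RASP products hook far deeper than this.
import functools

SUSPICIOUS = ("' or 1=1", "union select", "--")

def rasp_guard(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for value in list(args) + list(kwargs.values()):
            if isinstance(value, str) and any(s in value.lower() for s in SUSPICIOUS):
                raise ValueError(f"blocked suspicious input to {func.__name__}: {value!r}")
        return func(*args, **kwargs)
    return wrapper

@rasp_guard
def lookup_order(order_id: str) -> str:
    return f"SELECT * FROM orders WHERE id = '{order_id}'"   # deliberately naive

print(lookup_order("12345"))                  # allowed
# lookup_order("12345' OR 1=1 --")            # would raise ValueError
```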

As always, it was great to see and catch up with all of my friends in Las Vegas at the PCI Community Meeting.  It was also great to meet a lot of new people as well.  I look forward to seeing all of you again next year in Orlando.

29 Sep 16

Microsoft Changes Their Patching Strategy

Back in May 2016, Microsoft issued a blog entry on TechNet giving the world insight into its new patching strategy: the concept of a monthly “rollup” patch, or what a lot of people are calling a “mega-patch”.  In August, another blog entry further detailed this strategy and explained that, from October 2016 onward, this is how Microsoft would patch Windows.

But there is even more to it.  For WSUS and SCCM users, security patches will be separated from the Monthly Rollup into their own Security mega-patch.  The idea behind separating the security patches into their own mega-patch is to allow organizations to at least stay current on security.  However, there is a twist: organizations such as small businesses that do not use WSUS or SCCM will only get a single mega-patch through Windows Update or Microsoft Update, one that combines the Monthly Rollup and Security mega-patches.

So what could go wrong you might be asking?

The biggest drawback to this scheme is that, should you have any issue with a mega-patch, you must back out the whole patch, not just the item that is creating the issue.  That means instead of having just one potential issue to mitigate, you could have as many issues to mitigate as the patch contains.  From a PCI compliance perspective, that could mean lots of missing patches in your Windows systems if your systems run into an issue with a mega-patch.  This can get doubly bad for organizations not using WSUS or SCCM because they will be backing out security patches as well as application patches.

But it can get even worse.  These mega-patches are cumulative, meaning that every month Microsoft rolls the previous mega-patch into the new month’s mega-patch.  Say one month a mega-patch cannot be applied for compatibility reasons: you apply the monthly mega-patch, your point of sale (POS) application fails to work with it, and you must back it out.  If your vendor is slow to resolve the incompatibility, you will not be able to patch your POS systems at all, because month after month the mega-patches remain cumulative, so the problem carries forward until the compatibility issue is resolved.

But I foresee small businesses running into the worst issues with this new approach.  Since small organizations likely will not be using WSUS or SCCM, they will not get a separate Security mega-patch; they will only get the single mega-patch that combines the Monthly Rollup and Security patches.  If any issue occurs with that single mega-patch, those businesses will not even get their security patches, creating a situation where the organization must figure out how to mitigate its inability to secure its systems.  That could mean months of exposure until the original compatibility issue is resolved.

But to add insult to injury, I can also see situations where a vendor has issues resolving a compatibility problem with a mega-patch and finally gets it fixed only to encounter a new compatibility issue with the latest mega-patch.  Based on how Microsoft is running these mega-patches, there appears to be no way to go back to a compatible and useable mega-patch.  This could result in organizations being unable to patch at all due to ongoing compatibility issues.

At a minimum, I think Microsoft will need to make the Security mega-patch separate from the Monthly Rollup for all organizations, not just those using WSUS or SCCM.  At least then, all organizations can apply security patches independent of the Monthly Rollup which would be more likely to be the one that would create compatibility issues.

It will be interesting to see how this new patching strategy plays out.  Hopefully it does not create even more risk for users of Windows.  If it does, I would not be surprised if the PCI SSC invokes significant new controls on Windows-based solutions.  That could be the final straw in using Windows for a lot of merchants.  Time will tell.




Announcements

If you are posting a comment, be patient, as the comments will not be published until they are approved.

If your organization has a PCI opportunity, is in need of assistance with a PCI issue or if you would like the PCI Guru to speak at your meeting, you can contact the PCI Guru at pciguru AT gmail DOT com.

I do allow vendors to post potential solutions in response to issues that I bring up in posts. However, the PCI Guru does not endorse any specific products, so "Caveat Emptor" - let the buyer beware. Also, if I feel that the response is too "sales-ee", I reserve the right to edit or not even authorize the response.

