Opinions on a report (inspired by DtSR #144)

(Meta: This was going to be a very different post. This version is, IMHO, much better. It is only better because of the incredible generosity of people I deeply admire, who gave me something precious: good advice, and their time. I am deeply grateful. You know who you are.)

Before we begin, two disclaimers:

  • Disclaimer 1: When discussing my relationship with statistics, economics, and data science, I like to use the same expression I use to describe my golf and chess skills: “Don’t mistake enthusiasm for accomplishment (or competence)”. What I write here may be completely off the mark and laughed at by a first-year stats/econ student. If that is the case, let me know and I’ll gladly eat some humble pie…
  • Disclaimer 2: I am a strong supporter of (ISC)2. I have been a CISSP since December 2000 (my number is in the low 4 digits…), I dutifully keep the certification up-to-date, I vote in the elections regularly, and more. Anything I write here is meant to support the broader mission of the organization by highlighting where we can improve the deliverables for our community.

With that out of the way…

I was listening to the excellent Down the Security Rabbithole podcast, episode 144. Rafal Los, James Jardine and Michael Santarcangelo had a really nice interview with David Shearer, executive director of (ISC)2. David and the others were discussing the recent workforce report released by the (ISC)2, along with several interesting perspectives on how to engage more people in security, growth options for InfoSec professionals, etc… Loved it!

OK, let’s look at the report then. Having recently attended Tony Martin-Vegue’s excellent BSidesSF talk about issues with statistics in InfoSec, I figured reading this report from that perspective would be a good exercise… The report is based on a broad survey sent out last year. The original survey was quite long, so I know a lot of effort went into creating it. I wondered at the time if the length would be an issue. As I prepared to write this post and looked at the mailing list archives for CISSPforum, it turned out I was not the only one: several people complained about the length.

Well… In my opinion, there are several issues with the report that are worth calling out. Listing them here is both a way for me to apply what I learned to the real world and, I hope, a way to be part of the broader conversation on how to improve future versions. Wanting to act as a good member of our community, I didn’t want to just ‘throw a grenade and walk away’: underneath each issue, I suggest what, in my view, might be a better approach.

Part 1 – Key Issues

  • A major theme for me throughout the report is that many security questions are asked outside the purview of the respondents’ expected area of responsibility (as self-reported later in the report). How can we evaluate answers about software development practices given by network security teams, or vice-versa? What visibility does your typical tier-1 security analyst have over plans for advanced analytics or outsourcing? To me, this undermines the comments about all the security issues listed throughout the report.
    • My suggestion: Get more detailed information from the respondents and only ask questions that are relevant. If that kind of selection was already done, make it known in the report itself: “When polling about budgets, we only tallied responses from those who indicated they have budget management authority”, and so on. Does it make for a more complex survey? Yes. Does it make it better? Also yes.
  • I think there’s confounding in the salary data. Looking at salary ranges (p.18) for (ISC)2 members and non-members, the ranges for (ISC)2 members are significantly higher, but then so is their tenure (p.19), and their roles (as reported on p.22) also seem to skew toward more senior titles. Surely one would expect that more experience in a career results in higher wages, no?
    • My suggestion: Instead of single salary figures, break them down by role, and possibly by tenure too. This makes the underlying effects much more transparent (see the first sketch after this list).
  • The report is based on a survey and, with the methodology section being very short, I can’t be confident that the many issues associated with surveys – selection biases (selection of the sample, representativeness of the sample against the population, etc…) and response biases (non-response, voluntary response, leading questions, …) – were properly handled. There were many questions on my mind as I read the report: How was the survey distributed? What was the response rate? What were the incentives to answer? What were the exact questions? Were there leading questions? How did these factors influence the responses? How representative is the overall response pool? etc… With “producing statistically significant findings” being a design goal for the report, did we achieve it? I don’t know (but my guess is we did not).
    • My suggestion: It would be much better if the report included a more detailed survey process and methodology section, discussing some of these topics explicitly. As an example, I really like the frankness with which Verizon approaches this in their excellent VZ DBIR (see their Appendix B). Two quick sketches of what I have in mind follow this list.
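
To make the confounding point concrete, here is a minimal sketch, using made-up numbers (not data from the report), of how a role breakdown can expose a seniority effect that a single member-vs-non-member average hides:

```python
import pandas as pd

# Hypothetical salaries ($k) -- illustrative placeholders, NOT report data.
df = pd.DataFrame({
    "member": [True, True, True, True, False, False, False, False],
    "role":   ["Analyst", "Architect", "Architect", "Architect",
               "Analyst", "Analyst", "Analyst", "Architect"],
    "salary": [80, 140, 145, 150, 78, 80, 82, 142],
})

# Headline view: members appear to earn far more overall...
print(df.groupby("member")["salary"].mean())   # ~95.5 vs ~128.75

# ...but within each role the gap nearly disappears. Members simply
# skew toward the more senior, better-paid roles -- a classic confounder.
print(df.groupby(["role", "member"])["salary"].mean())
```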
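
And on the “statistically significant findings” goal: even a back-of-the-envelope margin of error would help readers judge the headline percentages. A quick sketch with placeholder numbers, since the report does not state per-question sample sizes I could use:

```python
import math

def moe_95(p: float, n: int) -> float:
    """Approximate 95% margin of error for a proportion, assuming a
    simple random sample -- an assumption surveys like this rarely meet."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

# e.g., 45% of respondents picked some option, out of n = 1000 responses
p, n = 0.45, 1000
print(f"{p:.0%} +/- {moe_95(p, n):.1%}")   # 45% +/- 3.1%
```

Note that the formula only holds under random sampling, which loops right back to the selection-bias questions above.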

Part 2 – Other Issues

  • I disliked the structure of the report: it provides minimal information about the population, then jumps straight into their opinions (“State of Security”), only to review the makeup of the population later on.
    • My suggestion: It would really help if the report were structured so that details about the population – job roles, tenure, etc… – came up front, so that we can keep them in mind while reading the sections about people’s opinions.
  • Try as I might, I have no idea how the authors decided to extrapolate the demands for the future workforce (around p.32)… How were these numbers reached? What factors were taken into consideration? What are the margins of error?
    • My suggestion: Providing a much more detailed description of the underlying assumptions, methodology, and data would make this section much more defensible and, by extension, much more useful for those using it as part of their rationale for workforce planning (the first sketch at the end of this section shows the kind of transparency I mean).
  • When looking through the ‘issues’ sections, I kept asking myself: how were these options presented to the respondents? To me, some options were not mutually exclusive, as they included both vulnerabilities and threat actors. How to choose between “hackers”, “configuration mistakes” or “application vulnerabilities”? Between “internal employees” and “configuration mistakes”? How were people led to their answers?
    • My suggestion: As the survey improves with time, a more rigorous validation of the questions will help clarify these points. Including the definitions in the report itself would also add clarity: “by ‘Hackers’ we mean this: …”
  • The choice for many charts is “3D pie”. Tony’s BSidesSF session made excellent points about how 3D pie charts are extremely deceptive, and that was clearly evident here, with the visual distortions skewing interpretation throughout. In addition to Tony’s session, there’s a great paper on pie charts here.
    • My suggestion: Replace the 3D pie charts with bar charts or another option (see the bar-chart sketch at the end of this section).
  • I think the discussion of global trends – salaries, primarily – was very brief and did not take into account what I think are relevant factors, such as exchange rates, cost of living, and inflation, to name a few. Page 18 mentions a 2.1% increase in salary for members versus 0.9% for non-members, but is that before or after inflation? Exchange rates might even work in favour of the argument that salaries are rising, but there is no discussion of rates anywhere. When one looks at how the US dollar behaved against other currencies (see the screenshots below), with greater than 20% variation, how does the survey-over-survey gain of 2.1% compare?
    • My suggestion: A more rigorous discussion of the financial conditions surrounding the respondents would help put the numbers in proper context (the inflation sketch at the end of this section shows how small a step that adjustment is).

[Screenshots: US dollar exchange-rate charts against other currencies, captured May 26, 2015]

  • I think the report is not clear on the distinction between job growth and self-identification with a role. As an example, p.21 claims that “demand for security architects leads job growth”. Based on my reading of the report, that means more people self-identified as security architects. Unless there was a review of job openings or detailed interviews with hiring managers about roles, I don’t think we can make these claims about job growth.
    • My suggestion: Clarify terms (goes back to my main point about better methodology description).
  • There were a few more quibbles, but nothing as critical as the main points above.
    • Page 21 presents the breakdown of respondent profiles. Well, if you add up the percentages listed for job titles, you get 70.9%. That means a full 29.1% of respondents (almost 3 times the top category, security analyst at 10.5%) did not identify with any of the listed titles, yet the chart leads one to think that most respondents were security analysts.
    • The “train & retrain” section (p.35) has several options that all circle around “training”. How to choose between them?
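
To illustrate the workforce-extrapolation point from earlier in this section: even a simple compound-growth model with an explicit sensitivity range would beat an unexplained number. Every input below is a placeholder I made up, not a figure from the report:

```python
# Hypothetical workforce projection -- all inputs are made-up placeholders.
current_workforce = 3_000_000   # assumed baseline headcount
years = 5

for label, annual_growth in [("low", 0.03), ("mid", 0.05), ("high", 0.08)]:
    projected = current_workforce * (1 + annual_growth) ** years
    print(f"{label} ({annual_growth:.0%}/yr): {projected:,.0f}")
```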
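
On the chart issue, swapping the 3D pies for bars is a one-liner in most tools. A minimal matplotlib sketch, with placeholder categories and values (not figures from the report):

```python
import matplotlib.pyplot as plt

# Placeholder data -- not taken from the report.
labels = ["Hackers", "Config mistakes", "App vulnerabilities", "Insiders"]
values = [34, 27, 25, 14]

fig, ax = plt.subplots()
ax.barh(labels, values)    # bar lengths compare honestly; no 3D distortion
ax.set_xlabel("% of respondents")
ax.invert_yaxis()          # largest category on top
plt.tight_layout()
plt.show()
```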
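
And on the inflation question: the adjustment itself is a one-line calculation once the figures are stated. A sketch using the report’s 2.1% member increase and a hypothetical inflation rate (I am not asserting the actual rate for the survey period):

```python
# Real (inflation-adjusted) growth from a nominal figure.
nominal_growth = 0.021   # 2.1% member salary increase, per the report (p.18)
inflation = 0.016        # hypothetical CPI figure -- placeholder only

real_growth = (1 + nominal_growth) / (1 + inflation) - 1
print(f"real growth: {real_growth:.2%}")   # ~0.49% -- a rather different story
```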

Part 3 – Positive Things and Conclusion

Not all is “doom and gloom”. I also want to highlight a few themes I really liked:

  • I particularly liked the reasoning about the challenges of on-boarding employees on page 32.
  • I wholeheartedly agree with the general sense that ‘communication skills’ are important and should be improved on.
  • Towards the end, I tend to agree with the approach listed in “The Last Word”, about working together to address several issues.

Let me make it clear: I strongly believe that the (ISC)2 is working hard to make positive changes in the InfoSec industry and deserves our support. In our industry, we suffer from not having enough good, unbiased content to discuss our issues. I applaud the initiative to go out and try to compile what the profession is thinking, what challenges we see, how we can move forward, etc…

That being said, as an (ISC)2 member, I would value a statistically sound report much more than the ‘experience’ of participating in a less rigorous survey.

One of the things I learned while writing this post is that the report is not issued by the (ISC)2 membership organization itself, but rather by our charitable foundation. The (ISC)2 Foundation then contracted the survey out. Let’s not forget that a lot of the work in defining the survey is done by volunteers, and we need to help our community where we can. For me, right now, I think I can help by writing this post. In the future, who knows?


One thought on “Opinions on a report (inspired by DtSR #144)”

  1. I appreciate you taking the time to put together a thoughtful response with feedback regarding the most recent (ISC)² Global Information Security Workforce Study. I also thank you for being a dedicated member of (ISC)² for nearly 15 years.

    You make some great points in your suggestions for improvement regarding both the survey instrument and the report. We’ve been conducting this study for the last 10 years, but there is certainly always room for improvement. I will take your suggestions into account during the de-briefing process as we begin to look ahead to the 2017 study being fielded next year.

    We’re continuously striving to be a better organization and members like you help to make that goal more attainable. Again, I thank you for your feedback.
