O’Reilly Security Conference

Disclaimer: I was a speaker at the conference. As such, O’Reilly Media covered most of my travel expenses, as well as provided me with a Speaker pass. If you think such benefits, nice though they were, had a significant impact on my opinion, to me it just means we don’t know each other very well yet. Trust me when I say that they do NOT… Happy to discuss as needed…

TL;DR: The experience of being part of the inaugural O’Reilly Security Conference was amazing. The content I watched was excellent, the venue and logistics worked really well, and I really liked the “vibe” of the conference. 10/10!

[Conference photo. Source: O’Reilly Media – click for license details.]

This longish post is about my experience at the O’Reilly Security Conference. I summarize what I learned from each session I attended, along with some general opinions. I can’t think in prose, so this is mostly in list format. Without further ado:

Format and Venue

  • 4-track conference, held at the New York Hilton Midtown.
  • Pre-conference training and tutorials, an Ignite session, then 2 days with morning keynotes followed by morning and afternoon sessions.
  • Good breaks between sessions (ranging from 15 minutes to 1 hour).
  • No idea on attendance, likely in the mid hundreds.

Tutorials and Ignite

I attended Jim Manico’s half-day tutorial on “Modern IdM” hoping to learn more about Web authentication and I was not disappointed. He covered OAuth in detail, as well as session management, and recommendations around password storage. He’s a very energetic and engaging speaker, and time flew by.

The afternoon was reserved for the Apache Drill tutorial led by Charles Givre, from Booz Allen Hamilton. Charles took us through the rationale for Apache Drill – basically a SQL-supporting unifying front-end for disparate back-end data stores – and led exercises on data manipulation. Drill can be a fantastic tool for a data scientist to easily get at disparate data sources.  I’m a SQL newbie and struggled with some of the exercises, but that is on me and not on the tutorial. He also based the exercises on a pre-configured VM that has other data science tools. This will come in very handy…

In the evening, Jerry Bell and Andrew Kalat hosted the Ignite talks (lightning-fast talks with auto-advancing slides). Jerry and Andrew host the Defensive Security podcast, probably my favourite security podcast. It was a privilege to chat with them. The talks were interesting, ranging from the need to shy away from hero-focused security work, to how we can do better at training/education, to the use of existing intelligence/data sources. Great talks, easy-going format.

Karaoke…

Then there was… karaoke… For those that are not familiar, “slide karaoke” is a fun-filled/terrifying (depending on your point of view) format where someone is presented with random slides at a fixed time interval and they have to “improv” their way to a somewhat coherent talk structure. Andrew and Jerry asked for 5 volunteers… and I was one of them…

I don’t quite remember what all my slides were, but there were references to llamas, some sort of potato-based disease, and rule breaking. 🙂  I’m just hoping I made it entertaining for the audience…

Lesson learned: Courtney Nash is a master at this: she was funny, coherent, engaging, … She’s a very tough act to follow, which just happened to be my spot in the roster… You have been warned 🙂

Seriously, though: it was great fun, and I hope others join in. It was a great environment, people were having fun, and part of being in this industry is this sense of community that we build. It was a privilege to be able to take part in that.

Keynotes

On day 1, following the intro from Allison Miller and Courtney Nash, Heather Adkins from Google kicked things off by showing us how some of the main classes of security incidents – be they insecure defaults, massive theft, or instability – have been happening in different forms since the 1980s. After pointing to the increased siloization of our industry as a possible cause, she urged us to think about broader platforms, and to design with a much longer timeframe in mind.

Richard Thieme took us through a sobering view of the psychological challenges in our career. Drawing parallels to the intelligence community and the challenges faced there, Richard rightfully reminded us to stay mindful of our needs as individuals and to build adequate support networks in our lives.

Becky Bace did a great job of comparing the challenges of infosec today with the early days of the auto industry, and of showing how we can use some of the lessons learned there to improve. Given my interest in economics and incentives, I was silently clapping pretty much the whole time.

Unfortunately I missed most of the day 2 keynotes – I look forward to watching the videos later. What I did catch was the latter part of Cory Doctorow‘s impassioned and cogent plea for more involvement from us as individuals in the immensely important debate about the very nature of property and democracy in modern society. There are key discussions and precedent-setting court cases taking place now, and many of the key societal instruments we take for granted are at risk.

Day 1 Sessions

Speak Security and Enter. Jesse Irwin led a great session focused on how to better engage with users when it comes to discussing security and privacy, and laid out well-defined steps for improving. If I could summarize her session in one idea, it would be: have more empathy for your user community – from using relatable examples, to framing the issue positively or negatively, and many other suggestions. Hearing her tell of the adventure of teaching security to 8-year-olds was priceless!

Notes from securing Android. Adrian Ludwig from the Google Android team took us through a data-driven journey into the Android security ecosystem. After reminding us that Android security must accommodate everything from $20 phones to modified units used by world leaders, Adrian focused on three aspects: active protections provided by the Google ecosystem, options available for enterprise decisions (such as whether to allow external app stores), and details about the Android OS itself. He made a very compelling case that the security architecture of a modern Android-powered device such as the Google Pixel rivals the other options in the mobile ecosystem (iOS, Windows Phone). This was one of the best talks I attended.

Groupthink. Laura Mather has had a very interesting career, including time at the NSA, eBay, founding SilverTail (where I had the pleasure of working for her), and now leading Unitive. Her talk was not a ‘security’ talk, but rather a look into the issue of groupthink, often caused by unconscious biases. Fundamentally, the variety of challenges in the modern security environment should be met by having a diverse workforce generate ideas based on diverse points of view. To achieve this, we need to address the lack of diversity. Laura pointed out specific ways to avoid unconscious bias in hiring, particularly being aware, as an interviewer or hiring manager, of the pull toward someone “just like me”. Hiring decisions should be matched on values, not on superfluous characteristics that lead to biased outcomes.

UX of Security Software. Audrey Crane leads a design firm, and made the case for proper UX design taking into account the people who will actually use the product. Her firm conducted research into usage habits related to SOC roles, and came up with a few personas (different from the typical ‘marketing’ personas) and then showed an interface design that takes those personas into account. Her recommendations are for vendors to take this aspect of the product creation process seriously, and for buyers of software to not only demand better software from a usability perspective, but to actively try out any software being purchased with the intended audience.

Social Scientist. Andrea Limbago brought a “social scientist” perspective to the broad issues around information security. She framed the discussion in terms of human elements, geopolitical trends, and data-driven security. The human elements section looked at a structure-agent dynamic (top-down versus behavioural) and advocated approaches to evolving the security subculture. Very interesting, as were the comments around security still operating in a cold-war framework, and the gap in the usage of data within security conversations.

Day 2 Sessions

Are we out of the woods?. Kelly Harrington from the Google Chrome team talked about Web security issues. She covered some key issues – how updates are not universal, how older devices get attacked, and the scourge of what Google calls Unwanted Software – and delved into details about the exploit kits (Angler, Rig, and others), trends of attacks on routers, plus examples of malicious behaviour by Unwanted Software. She wrapped up by sharing a little about what Google’s Safe Browsing API does and by giving actionable advice on web security. This was a great talk to complement the one on Android security. Finally, extra points for her for the Jane Austen references… 🙂

Criminal Cost Modelling. Chris Baker – a data scientist at Dyn – took us through a whirlwind tour of some underground markets and the actual data he found there for pricing stolen goods, exploit kits, or DDoS services. It was refreshing to see someone dive beyond “oh, underground markets exist” into actual markets, prices, goods, and the possible economic issues that exist in those markets. I loved this session. If there was one session I wish could have been longer, it was this one. I’ll be watching the video when it comes out, many times over.

Economics of CyberSecurity. This session was delivered by yours truly. Happy to announce that slides are available here. I focused on how a brief understanding of economic concepts – Marginal Cost of Information Goods, Information Asymmetry, Externalities, and concepts from Behaviour Economics – can help us rethink some of the broad challenges we face. I hope the audience liked it. I was happy with my delivery and did pick up on a few things I want to improve. I really hope to have the opportunity to keep bringing this message to others.

No Single Answer. Nick Merker – now a lawyer but formerly an infosec professional – and Mark Stanislav – now a security officer with experience as security consultant – focused on cyber insurance. Their session went into the difference between first-party and third-party insurance, then delved into the details of what cyber insurance options exist, what they typically cover (or not), and how these products are currently priced and sold. They also covered some misconceptions around the role of insurance in a risk management program, how infosec should play a role when purchasing cyber insurance products, and how a well-defined and executed security program can help with insurance premiums. I learned a ton, and really liked the session.

Sponsors/Logistics/Others

The sponsor area was relatively small (maybe 10-15 sponsors total) but the people I spoke to were knowledgeable and the selection was varied. Not so much your typical security vendors, but more those offering solutions that fit into a more modern architectural view of security. There were options for web app security, container security, source code security, etc… I did not focus much on it, given my role as an individual contributor.

The conference schedule and details were available via the O’Reilly app (iOS and Android) and things worked well. One suggestion I have is that the app could offer a toggle for ‘hide past events’ on the Full Schedule view, as that would help people choose their next sessions without having to scroll around so much…

Food options during the breaks were varied and quite nice. I loved that we had sushi available on one of the food stations.

As a Speaker

My “field report” would not be complete without a comment about my experience proposing the talk and later as a speaker.

The submission process was well defined, the guidelines for what should go in the submission were clear, and the timelines were very fair. I followed the process via the website and the questions I asked the speaker management team were answered promptly and efficiently. Major thanks to Audra Montenegro (no relation) and her team.

The organizing committee has been very transparent about what their side of the selection process was like. This is tremendously insightful and helpful for future proposals. I particularly liked the use of blind reviews. Blind reviews help us as an industry increase the quality of the content that makes it onto the stage, AND increase the chance of hearing from a more diverse pool of contributors. What’s not to like?

Prior to the event, I was able to connect with Courtney Allen and we collaborated on a short email-based interview (which you can find here). She was fantastic to work with and has a keen insight into the role that O’Reilly Media can play in the security landscape.

Bottom line is: if you have defensive-focused security content you want to present, you’re open to being evaluated on the merits of your content, and want to work with great people putting it together, O’Reilly Security should definitely be on your short list of conferences to submit to.

On the economics of ransomware

We blinked, and the world changed on us.

This [long] post is not meant as doom&gloom on the scourge of ransomware, but rather a look at some basic economic aspects of this type of attack, and some recommendations for the future.

So far, 2016 is definitely the year of ransomware. Every vendor is talking about it in their pitches, the media is all over it (good articles here and here), etc. This post just adds to that cacophony, though hopefully with a different perspective.

“Prior Art”: Lots of people are now talking about ransomware, and I’m sure many have in the past too. I’d be remiss if I didn’t point out that Anup Ghosh of Invincea wrote a scarily prophetic blog post on this back in July of 2014! Check it out here. Also, I liked Daniel Miessler’s piece here.

Note: as I discuss these topics, I may sound insensitive to the plight of the victims. It’s absolutely not that: I think ransomware is a scourge that should be eradicated and that we should bring the full force of law enforcement to bear, though I’m pessimistic that it can be done.

There are several aspects of ransomware that make it interesting from an economic angle. Let’s explore some of them.

The “Taming” of Externality

First and foremost, to me, ransomware is the first major, wide-scale threat that significantly reduces the inherent externality of poor security practices. What does that mean?

Up until now, poor security practices by end users resulted in relatively light consequences for the users themselves. In some cases, being used as a spam relay might not even have been noticeable; at worst, there was the rare circumstance where malware meant having to reformat one’s PC. Yes, annoying and potentially painful, but manageable. From a behavioural economics perspective, biases such as mental accounting made it even less painful.

The broader costs of that infection – spam being generated, systems that had to be wiped, etc… – were largely invisible to the user. In market economics terms, all these costs were externalities. This means that the agent in the transaction – the user – was not taking those costs into consideration when making their choice – in this case, the poor security practices that led to an infection.
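
In textbook terms, the gap looks like this (a standard externality formulation, my notation rather than anything from the sources above):

    \mathrm{SMC}(q) = \mathrm{PMC}(q) + \mathrm{EMC}(q)

where q is the level of risky online behaviour, PMC is the private marginal cost the user actually feels, EMC is the external marginal cost everyone else bears, and SMC is the social marginal cost. The user settles at the level of q where their private benefit matches PMC; because EMC is invisible to them, the resulting level of risky behaviour is higher than the social optimum.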

Enter ransomware. Now the user is faced with the painful choice of paying the ransom – actual money being taken from them – or facing the imminent destruction of their data. Worse, depending on how that strain of ransomware behaves, it may have infected network drives and potentially backups as well. This triggers several well-known behavioural quirks/biases, including:
  • The salience of paying. It’s pretty clear that there is money being lost, and it’s your money (or your organization’s).
  • The immediacy of the request. It’s not something that can be postponed. Criminals know this, and exploit it: in many cases, ransoms increase as time passes.
  • Loss aversion. From Kahneman and Tversky’s work, we know that people tend to be loss averse (a compact formulation follows this list).
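
That compact formulation is the prospect-theory value function from Tversky and Kahneman’s 1992 paper (a general result, not anything ransomware-specific):

    v(x) = \begin{cases} x^{\alpha} & \text{if } x \ge 0 \\ -\lambda (-x)^{\beta} & \text{if } x < 0 \end{cases}

Their median estimates were \alpha \approx \beta \approx 0.88 and \lambda \approx 2.25: a loss stings roughly 2.25 times as much as an equivalent gain pleases. A ransom note is engineered to sit squarely in the loss domain.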

All of this is, naturally, horrible for the user. From an economic perspective, though, it is interesting that this, in a way, “reduces” the externality of a poor security choice. The user now knows full well that their poor choice/practice may result in a non-negligible cost. [Edit: as someone provided feedback to me, just another way of saying “the chickens come home to roost”.] They’re understandably concerned, and rightly so. I don’t see this diminishing soon.

“To Pay or Not To Pay, that is the Question”

The second interesting point is analyzing the dilemma of deciding to pay the ransom or not. Even law enforcement seems ambivalent: recent advice has included both “pay” and “don’t pay”.

There are two things to look at:
  • First, from a societal perspective, the issue is similar to the Tragedy of the Commons, a well-known economic problem. In the traditional Tragedy of the Commons, individuals overconsume a shared resource, leading to depletion. Ransomware is not quite the same: to me, it is closer to the “Unscrupulous Diner’s Dilemma”, a variation of the more traditional Prisoner’s Dilemma in which a group ends up collectively paying more, even though each member would have preferred to pay less. In the case of ransomware, the individual decision to pay negatively affects the community by supplying the criminals with funds that both reward the crime and can be reinvested in future capabilities, thus costing everyone more in the future.
  • Individually, people and organizations should recognize that the rational economic decision is not simply “is the cost of paying the ransom less than the loss associated with losing the data?”. The decision should be based on that cost, sure, but should also take into account the points below (a back-of-the-envelope sketch follows this list):
    • Is that the end of it? Will paying the ransom this one time be an exception? In most cases, hardly… As ransomware proliferates, different gangs will keep attacking.
    • Will paying the ransom even get the data back in the first place? As @selenakyle nicely pointed out recently, there’s little recourse if things go wrong…
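
To make that fuller calculation concrete, here is a back-of-the-envelope sketch in Python. Every name, probability, and dollar figure in it is mine and purely illustrative – plug in your own estimates:

    # Crude expected-cost comparison for "pay" vs "don't pay".
    # All probabilities and dollar figures are illustrative placeholders.

    def cost_of_paying(ransom, p_recovery, data_value, p_reattack, future_cost):
        """Pay: lose the ransom for sure, still lose the data if the criminals
        don't deliver, and paying may mark you as a repeat target."""
        return ransom + (1 - p_recovery) * data_value + p_reattack * future_cost

    def cost_of_not_paying(data_value, p_restore, cleanup_cost):
        """Don't pay: lose the data unless backups restore it, plus cleanup."""
        return (1 - p_restore) * data_value + cleanup_cost

    pay = cost_of_paying(ransom=500, p_recovery=0.7, data_value=10000,
                         p_reattack=0.3, future_cost=2000)
    dont = cost_of_not_paying(data_value=10000, p_restore=0.9, cleanup_cost=1500)
    print("E[cost | pay]     = $%.0f" % pay)    # $4100 with these numbers
    print("E[cost | not pay] = $%.0f" % dont)   # $2500 with these numbers

Note how the answer swings on p_restore: with poor backups (say 0.3), “not pay” balloons to $8,500 and paying suddenly looks “rational” – which is exactly the situation criminals price their ransoms for.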

At the end of the day, we’re back to externalities:
  • Those recommending “don’t pay” don’t bear the cost of the advice: lost data, etc…
  • Those choosing to “pay” don’t [immediately] bear the indirect cost of enabling the criminals to continue their efforts.

A more realistic approach to handling ransomware should keep these in mind.

“Thy ransomware’s not accidental, but a trade.”

There seems to be consensus that what has enabled the rise of ransomware is, among other things, the maturity of bitcoin. That was the point clearly made by @jeremiahg and @DanielMiessler (here). I agree: bitcoin seems to have tipped it, along with other changes to the overall ecosystem that appear to have made ransomware a more viable attack.

Like legitimate businesses, criminals have explored ‘efficiencies’ in their supply chain. As the main example: bitcoin (the peer-to-peer exchange system, not the currency itself) has removed significant “friction” from the system. Whereas before the cashout scheme might involve several steps – each of which incurred fees for the criminal – the ubiquity of bitcoin has made the cashout process faster and cheaper. Taking out the middlemen, if you will.

Regarding bitcoin specifically, there are a couple of interesting points:
  • More than being anonymous “enough”, bitcoin is a reliable and fast payment system. Even though it doesn’t provide full anonymity – transactions on the blockchain can be traced to wallets – bitcoin is sufficiently opaque that the tradeoff of limited traceability against ubiquity and speed made it the currency/payment system of choice.
  • This leaves an interesting question about the bitcoin exchanges: can we expect them to work against their own self-interest by restricting these transactions? What sort of defensive approach can we expect the exchanges to take? The danger of people equating bitcoin with ransomware is real, and the industry is right to defend itself.

All in all, from looking at the underground ecosystem, it looks like ransomware is a ‘killer app’: profitable, easy to use, etc…

“Much Ado About Nothing”? Maybe…

Finally, ransomware seems to have exploded into our collective attention, but is it really such an epidemic? While we deal with the onslaught of news/articles/posts about ransomware (including, of course, this post…), let’s recognize that there is very little incentive to “underreport” ransomware infections. To wit:
  • InfoSec vendors can point to ransomware as the new ‘boogeyman’ that every organization should spend more resources to protect against.
  • Within organizations, as with “Y2K”, “SOX”, and “PCI” before it, we may now start to see “ransomware” as the shibboleth that enables projects to be funded.
  • Media sites latch on to the stories, knowing the topic draws attention. As an example, a lot has been made of the incident where a Canadian university opted to pay $20,000 CAD. Would there have been such bombastic coverage if the cause of the loss had been, say, a ruptured water main caused by operator error? Not likely…

I can’t help but wonder if this is not a manifestation of a couple of things:
  • one, a variation of what’s called the Principal-Agent problem: an economic transaction where an agent is expected to act on behalf of a principal but instead acts for their own benefit – in this case, inflating the issue of ransomware above and beyond other relevant topics.
  • two, just your garden-variety ‘availability bias’ from behavioural economics, where the ease with which we recall something inflates its perceived rate of occurrence.

In either case, we can take a peek at the well-known Verizon Data Breach Investigations Report (DBIR). What do we see? The DBIR shows that ransomware, even as a subset of crimeware, is not as prevalent as other attacks. See figure 21 on page 24 of the 2016 report.

“’Advice’ once more, dear friends, advice once more”

Wrapping up, then. There is a fantastic paper by Cormac Herley, from Microsoft Research – So Long, and No Thanks for the Externalities – in which he discusses how ignoring security advice can be the rational economic decision for users once you take into account the cost of acting on that advice. The paper is from 2009 and is still extremely relevant. I consider it mandatory reading for any security professional.

Taking that into account, how should we frame security advice about ransomware?
(One could argue that ransomware is precisely the change in costs that invalidates the paper’s conclusions here – that might be an interesting avenue to pursue…)

At least to me, too much of the security advice we see about ransomware is not taking into account the aggregate cost of acting on such advice.

Ultimately, the protection methods have to be feasible to implement. With that in mind, here are a few recommendations.

For individuals:
  • Be aware of your own limitations and biases as you interact online. To the extent that it is possible, incorporate safe practices.
  • Leverage the automated protections you have available – modern browsers have sophisticated security features, your email provider spends a ton of resources to identify malicious content, etc…
  • Devise and implement a backup system that fits your comfort level, balancing the frequency of backups with their associated hassle.
  • Periodically check and, where possible, reduce your exposure by moving content to off-line or read-only storage. Just like you wouldn’t walk around a risky neighbourhood at night with your life savings in your pockets, make a practice of limiting how much data is exposed (see the sketch after this list).
  • If infected, don’t panic. Keep calm and, if you do choose to pay, act promptly to avoid the increases in demands.
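
To illustrate the “reduce your exposure” point above, here’s a minimal sketch (the paths and age threshold are placeholders; this is an idea to adapt, not a hardened tool):

    # Sweep files untouched for N days into an archive tree and mark them
    # read-only, shrinking what a ransomware infection can reach and rewrite.
    import os
    import shutil
    import stat
    import time

    SOURCE = os.path.expanduser("~/Documents")   # placeholder
    ARCHIVE = os.path.expanduser("~/Archive")    # ideally a separate volume
    MAX_AGE_DAYS = 180

    cutoff = time.time() - MAX_AGE_DAYS * 86400

    for root, _dirs, files in os.walk(SOURCE):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getmtime(src) < cutoff:
                dst = os.path.join(ARCHIVE, os.path.relpath(src, SOURCE))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)
                # Read-only is a speed bump, not real protection, but it
                # defeats naive bulk-encryption loops running as the user.
                os.chmod(dst, stat.S_IRUSR)

Truly off-line media or a read-only network share achieves the same end with stronger guarantees.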

For corporate users, similar advice applies, boiling down to “don’t base your security architecture on the presumption that users are infallible at detecting and reacting to security threats”. Back it up with technology. On a tactical level, a few extra things come to mind:

Prevention
  • Verify that current perimeter- and endpoint-based scanning of executables/attachments is able to identify/catch current strains of malware (ask your vendor, then check to make sure). It might be a sandbox approach, endpoint agents, gateway scanning, whatever. Belt & suspenders is a good approach, albeit costly.

Detection
  • Consider application-level monitoring of file-system activity (system calls) on the endpoints. This includes watching for known ransomware extensions, as well as suspicious bulk changes to files.
  • Consider monitoring data-center activity for bulk file changes, such as mass encryption. Yes, there may be false positives. (A minimal sketch of this idea follows.)
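
A minimal sketch of the bulk-change idea, using only the Python standard library. The extension list and thresholds are illustrative; real products do far more (entropy checks, ransom-note filenames, etc.):

    # Poll a directory tree and flag two crude ransomware signals:
    # known ransom extensions, and bursts of recently modified files.
    import os
    import time

    WATCH_DIR = "/srv/shares"                          # placeholder path
    SUSPECT_EXT = {".locky", ".crypt", ".encrypted"}   # illustrative only
    BURST_THRESHOLD = 100                              # changed files per window
    POLL_SECONDS = 60

    def scan(path, since):
        hits, churn = [], 0
        for root, _dirs, files in os.walk(path):
            for name in files:
                full = os.path.join(root, name)
                if os.path.splitext(name)[1].lower() in SUSPECT_EXT:
                    hits.append(full)
                try:
                    if os.path.getmtime(full) >= since:
                        churn += 1
                except OSError:
                    pass    # file vanished mid-scan
        return hits, churn

    while True:
        hits, churn = scan(WATCH_DIR, time.time() - POLL_SECONDS)
        if hits:
            print("ALERT: ransom-style extensions:", hits[:5])
        if churn > BURST_THRESHOLD:
            print("ALERT: %d files changed in %ds window" % (churn, POLL_SECONDS))
        time.sleep(POLL_SECONDS)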

Response/Containment
  • Re-visit the practices that allow users to mount multiple network shares.
  • Make sure the Incident Response playbooks include ransomware as a specific scenario. Prepare for a single-machine infection, multiple machines hit, as well as the scenario where both local and networked files are encrypted. While I’m skeptical of survey data such as this, getting familiar with how bitcoin transactions work might be a worthwhile investment.

To me, ransomware is here to stay: it leverages too many human and economic aspects to simply vanish. As with many other “security” issues, this is just another one that was never just a technology problem, but a social and economic one. InfoSec professionals should keep that in mind, remembering that the solutions are not always technical…

And to purveyors and enablers of ransomware, “a plague on your houses!”

Professional Certifications & [Behavioural] Economics

I was thrilled to see many people responded well to my earlier post on certifications in the context of information economics (particularly information asymmetry). There was lots of interesting feedback, including some that were somewhat critical of certifications in general.

This led to an interesting question – what are the negative aspects of professional certifications, if any?

Again, we can use economics. There are quite a few things to keep in mind…

First, there’s no denying that pursuing certifications can have significant costs, both explicit and implicit. Clear explicit costs include preparation materials, tuition, and travel, not to mention the exam fees themselves: some run into thousands of dollars. It pains me to see how expensive it can be to prepare for some certifications, knowing that in many cases candidates may be pinning too much hope on just having “that” cert.

There are also implicit costs. Think of the hours of studying, preparing materials, etc… It is not uncommon for some of the more advanced certifications to eat up hundreds if not thousands of hours of preparation. The ‘opportunity cost’ of missing out on months or years of ‘regular’ life can be staggering.
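
A quick illustrative calculation of that total (all numbers invented to show the shape of the math, not actual prices):

    \text{total} = \underbrace{700}_{\text{exam fee}} + \underbrace{2000}_{\text{materials/training}} + \underbrace{300\,\text{h} \times 50/\text{h}}_{\text{opportunity cost}} = \$17{,}700

Even with conservative numbers, the implicit hours tend to dwarf the sticker price of the exam itself.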

Are these costs worth it? It depends, of course. In many cases, I think the answer is yes, but I want people to know what it is they’re getting themselves into. More than ever: Caveat emptor (buyer beware!).

I also wanted to explore something else: how can having a certification negatively affect you? This brings us to the extremely interesting field of behavioural economics.

Behavioural Economics

Behavioural Economics is not one of the ‘foundational pillars of economics’ – those would be macroeconomics and microeconomics (of which information economics is a subset). Rather, it is a multi-disciplinary application of several fields – sociology, [micro]economics, psychology, finance, … It looks at how sociological, psychological, cognitive, or emotional factors can affect economic decisions and processes.

Behavioural economics has taken the world by storm over the past few decades, notably with the work of Daniel Kahneman and Amos Tversky on Prospect Theory, followed by many others. It was Nobel-worthy (though sadly Amos passed away before the prize was awarded, and Nobels are not given posthumously). For those who are interested, I highly recommend the works of Dan Ariely, a popular researcher from Duke University. Dan has several books, blog posts, online courses, and even a movie on Behavioural Economics.

Within behavioural economics, one area of great interest is cognitive biases – how our quirky little minds often behave in non-rational but predictable ways. There’s dozens of biases that have been identified – I recommend this Wikipedia page as a starter…

There’s lots of discussion about why these biases exist. My simplistic take is that the human mind evolved over millions of years and is not yet adapted to the changes that civilization has introduced over the past 10,000 years or so. The behaviour that would save you from being eaten by a wild animal on the savannah or help you survive a harsh winter is the same behaviour that nowadays makes you susceptible to bad products on late-night TV and binge eating…

Let’s look at just a few biases, effects, …:
  • Endowment Effect. This is the notion that if you happen to “own” something, you value it more than if you don’t.
  • Loss Aversion. Somewhat related to endowment, this is the key insight that one feels the pain of loss of a certain amount ‘x’ as greater than the pleasure of gaining the same amount ‘x’.
  • Availability Bias.  You’ll attribute more importance/frequency to information that you have come across recently.
  • Cognitive Dissonance. The stress caused by holding contradictory thoughts and the rationalizations that are done to resolve this.
  • Social Proof and variations (group bias & others). When one assumes the behaviours of others to be correct.
  • Sunk Cost Fallacy. Continuing to invest in something because so much has been spent on it already.
There’s *tons* of information on biases, influence, manipulation, etc… – too much to list here. A particularly popular author on the topic is Robert Cialdini. Well worth taking the time, trust me.

Cognitive Biases & Professional Certifications

So, how does all this apply to professional certifications? Quite well, actually.

Note: I’m more familiar with the network security space, so that’s where my examples come from. When thinking of certifications, I’m thinking of the likes of Cisco, Microsoft, VMware, Juniper, Novell, or groups such as SANS, CompTIA, or (ISC)2. In this space, many vendors have formal certification programs, often with multiple levels of certifications (associate, junior, senior, master, …) and regular recertification requirements. This makes sense, as today’s technology darling ends up as tomorrow’s legacy option, supplanted by something new.

In the IT industry, a company that sells, distributes, or provides services for products often receives vendor benefits tied to certifications: better margins, warranties, marketing dollars, easier access to support resources, etc… This incentivizes having and maintaining a healthy number of professionals on staff with the required level of certifications. This, in turn, means that someone working at these companies is strongly incentivized (or even required) to obtain these certifications.

Even if you don’t work for an IT reseller/distributor/integrator/…, there’s a strong message from vendors incentivizing you to certify, to show your skills, etc…

Why is that?

Because, amongst other things, having a professional certification from a vendor influences you, even if just a little.

If you, as a professional, worked to obtain a professional certification from a traditional “vendor”, you can expect the following to occur unconsciously:
  • due to a desire to resolve any cognitive dissonance, you’ll hold a generally more positive opinion of that vendor. “If I went through the effort of certifying on that vendor’s product *and* I consider myself a good person, then that vendor must be good too.”
  • because of the endowment effect, you’ll likely hold a more positive opinion of others who have the same certification. This may come through in sales calls, hiring, etc…
  • the availability bias will kick in when thinking of alternatives, meaning you may have an easier time recalling a specific vendor’s offerings or technology, particularly if they refer to [re]certification topics.
  • social proof will kick in when you see that certification prominently displayed by vendors at trade shows, events, … Vendors often offer certification exams at their shows (sometimes even waiving the exam fees): it is extremely convenient for the test taker, and the sight of hundreds or thousands of your peers taking those exams is a shining example of social proof in action.
  • it’ll likely be really difficult to let go of that cert, or of that particular vendor. That communication saying “your certification has now expired” is really painful. Such is the impact of the sunk cost effect (and loss aversion).

Now, this does NOT mean that everyone is going to mindlessly give in to their biases, but that these biases exist, and some will give in sometimes. Given enough nudges, that’s a powerful effect…

Vendors know this, and use it as an instrument. It helps them sell more product – be it an IT product, training, or a certification. It helps them retain their customer base and maintain a wide network of partners, which expands their reach, and so on…

They’re well within their rights to do so, just as you are within your rights to be aware of it and judge things on their proper merits.

Wrapping up

We’re all biased and susceptible to manipulation at different levels. (Yours truly included: among my many, many failures, I once fell – hard – for the “free Disney tickets pitch”. It hurt, it cost me money and stress, but I learned my lesson and moved on.)

I think professional certifications can be wonderful things:
  • They can provide a roadmap for learning and checkpoints for measuring your skill.
  • They can be a very effective (though not perfect) means of resolving the information asymmetry inherent in professional situations, both as signals and as screens.
  • They can help establish relationships with like-minded professionals.
That being said, we just saw how there are potential negative aspects to certifications: explicit & implicit costs, along with being more vulnerable to cognitive biases that may work against your best interests.

Again, I’m absolutely not against professional certifications – quite the opposite, actually! It is precisely because I value them that I want people to be cognizant of what the benefits and, yes, the pitfalls of certifications might be. That guy Sun Tzu said it best… 🙂