The New State of Security
Let’s begin with a little history. In the early 1990s, the internet was a new and fresh commodity in both the public and private sectors. It grew quickly, carrying latent issues that would later cause serious problems. As the new millennium approached, so did the first IT disruption of massive proportions. The now-notorious “Y2K” bug precipitated a year’s worth of worldwide concern about major outages of technology platforms heading into the New Year. The issue was simple: software and hardware that stored the year with 2 digits instead of 4 would throw off every time-based calculation when the digits rolled from 99 (1999) to 00 (2000). Time would reset backwards, not forwards.
The fallout was potentially disastrous. How would banks use software to forecast interest rates if the year was wrong? Could they amortize bank loans? Bill their clients on a schedule? Continue automated withdrawals? Any critical function automated off a calendar was in jeopardy — everything from nuclear power plants to hospitals was affected. Massive efforts to avoid a crisis were put into play. Finally, the New Year came and went, and all the “Y2K” panic seemed overhyped. No major problems were reported.
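The core bug can be shown in a few lines. This is a minimal, hypothetical sketch (not from any actual pre-Y2K system): a duration computed from a 2-digit year goes negative the moment the year rolls from 99 to 00, while the 4-digit fix behaves correctly.

```python
def years_elapsed_2digit(start_yy: int, now_yy: int) -> int:
    """Naive duration calculation typical of pre-Y2K code:
    both years are stored as two digits (00-99)."""
    return now_yy - start_yy


def years_elapsed_4digit(start_yyyy: int, now_yyyy: int) -> int:
    """The fix: store and subtract full four-digit years."""
    return now_yyyy - start_yyyy


# A hypothetical loan opened in 1995, checked in 1999 and again in 2000:
print(years_elapsed_2digit(95, 99))      # 4   -- correct before the rollover
print(years_elapsed_2digit(95, 0))       # -95 -- time appears to run backwards
print(years_elapsed_4digit(1995, 2000))  # 5   -- correct with 4-digit years
```

Any interest, billing, or scheduling logic built on that first function would suddenly see negative durations on January 1, 2000 — which is exactly why remediation meant auditing every date field in every core system.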
Centralized Systems Made Y2K Easy to Solve
Let’s take a moment to go through some statistics from the years:
- United States cost to repair from Y2K – $100 Billion (Chandrasekaran,1999)
- 738 Million internet users in 2000 globally
- 3.2 Billion internet users in 2015 globally (Smiths, 2017)
- Worldwide IT spend for 2018 – $3.7 Trillion
- Expected worldwide IT spend for 2019 – $3.8 Trillion
- That is an increase of 3.2% from 2018 (Gartner, 2018)
These numbers speak volumes on where we were compared to where we are now. In 2018, total global IT spend was around $3.7 trillion. So, we averted a ‘worldwide disaster’ and ‘economic collapse’ by spending only about 3% of that amount ($100B) over the course of roughly five years to fix the Y2K bug — now that’s efficiency! But what were IT teams dealing with in the year 2000?
Perhaps a better question to ask is, what were they not dealing with?
Personal laptops weren’t in mass use at that point, and tablets sure as hell weren’t. The iPhone hadn’t been invented yet, and the ‘Cloud’ still referred to moisture in the sky. Software as a service wasn’t an established market yet, and no, there wasn’t an app for that. The BlackBerry had only just been born in 1999. No one would have known what you meant by BYOD, and few IT teams were strategizing on mobile device management (MDM). Oh, and lest we forget, the original 802.11 Wi-Fi standard was only released in 1997.
Not to mention, Justin Timberlake and Britney Spears were America’s top couple. Does that help put it in perspective?
Indeed, this was a time where business IT platforms were highly centralized and meticulously controlled. Technology programs used by employees could typically be counted on half of one hand. Remediating a few core systems of a business with a multi-year timeline wasn’t an impossible task.
It’s interesting to note that while Y2K was treated as a general security issue in 2000, cybersecurity was rarely its own job position. Reflecting the state of security in the ’90s, security almost always fell under systems engineering roles. The security concerns of yesteryear were simply the uptime and integrity of a few main business systems.
A lot has changed, as we know. Security teams have evolved out of systems engineering into large, separate teams of dedicated security professionals (both IT and business), tasked with the expanding responsibility of countering a widening array of external and internal threats.
The Accelerating Threats of Decentralized Technology
The reasons seem obvious as to why IT security teams need to be so much more dynamic today. Mobile users, IoT, social engineering, increased connectivity, ‘the cloud,’ etc. Despite it being obvious, I still hear some clients say “we want to improve our edge security.” And in my own head I’m screaming:
EDGE!? WHAT DO YOU MEAN BY EDGE!? IT DOESN’T EXIST ANYMORE!
There is no edge. There is no perimeter. There is an ever-expanding series of doorways constantly being opened by people other than technology teams — internally and externally. Any business spending money to improve its “edge” while forgoing spend on social engineering training to educate its employees on how to avoid email phishing scams is missing the point.
Clients, customers, employees, partners, vendors, friends, and hackers all pose substantial security risk to any business, regardless of whether their intent is malicious or inadvertent. Connectivity is anywhere and everywhere. Data is being pushed in every direction, in and out of the office, and Godspeed to the careers of any IT team resisting this advancement and impeding the productivity of the business. Successful IT teams are enabling the decentralization of technology while creating frictionless security controls across all risk mediums — and that isn’t an easy task.
To meet these growing threats, the market has seen a proliferation of new security hardware, software, and tools. With this, a new problem has been born. Consider just a few of the categories:
- Intrusion Detection
- Network Monitoring
- Data Loss Prevention
- Email Encryption
- Identity Services
- Disaster Recovery
- Cloud Security
- Big Data Security
- Governance/Compliance Management
- SSL & Digital Certificate Authority & Management
The above list just scratches the surface of the areas in which security and IT teams need solutions. There are hundreds of products and tools in each of these categories. It’s an overwhelming portfolio to choose from, but too many teams make the mistake of over-evaluating the features of each tool and comparing product vs. product in endless cycles. I use this analogy: a boat is taking on water, and the captain is more worried about what color tape to use to plug the leak.
What’s even more damning is when IT teams pick the most robust security tool while lacking any internal skill or availability to deploy, tune, and actually utilize it. Given the often-limited resources IT teams have, I have seen countless purchases of very expensive security tools that go un-deployed, underused, and unmonitored for months and even years. More often than not, this creates friction with the business units allocating budget to technology, who consistently feel a lack of return on their investment.
Security is Not a Product, it’s a Procedure
Effective security and IT teams are evolving quickly, understanding that their impact is greatest when they focus on building a program that is augmented by security products. It’s an ongoing operation with checks and balances, tools and people, changes and improvements. They realize that the threat landscape is evolving far too quickly to get hung up on any single feature of a product. Tools are useless if they are not utilized to their maximum abilities. Security and IT teams need good employees and strong partners they trust more than they need products. I’ll take that one step further:
BAD products implemented and managed by a strong team and trusted partner are far better than GREAT products that are poorly implemented by weak teams and incompetent partners.
It is disheartening to see technology partners and product manufacturers alike rave on and on about how stellar and secure their products are, never once thinking to ask whether their product actually fits into an existing program, or how their client would implement, support, or fully leverage it.
This is something your IT team must learn to ask for themselves.
The days of centralized technology platforms are long gone. We can no longer avoid disaster by making one or two adjustments to a handful of systems or buying a product. Security and IT teams must shift their focus to building programs connected to the business in deeply valuable and impactful ways. The IT security teams of today must:
- Move away from being experts in security technology to being experts in identifying their organization’s biggest risks and quantifying the risks and opportunities of growth.
- Find strong and trusted partners to be their new experts in security technology, and who can provide varying options with full transparency.
- Build a strong team and process that manages security operations with dedicated, scheduled time.
None of this is groundbreaking, though — I believe most would find these ideas accurate or, at minimum, common sense. But one doesn’t have to look far to see that common sense is not adopted everywhere. Equifax now spends roughly $200M a year on security upgrades after the notorious breach of 2017 (Sakelaris, 2017). As it turns out, that massive breach was not a failure of product, and it wasn’t a missing feature. It happened because a process wasn’t followed and a server wasn’t patched.
That’s it. A 30-minute patch.
So, as we reflect on another New Year, saying goodbye to 2018 and hello to 2019, we need to remember the times that put our tech world on high alert. We should remember past incidents and how we can adjust for the future. There will always be security alerts, attacks, and malware, but with the right process and team procedures, those “alerts” will become less and less detrimental. Remember: it is not about the products you have in place, it is how your team implements those products and utilizes them to the fullest degree.