K.C.'s 2022 summary of the Information Security field

A few years ago, a friend reached out with a question:  Her daughter was studying cyber-security and software development at a university, and she wanted to know my take on Information Security as a career field.  

So I sat down and wrote up the following summary.   I think it's largely still accurate, and I've been asked for it again, so I figured I should just post it somewhere I can link to it.

Disclaimer 1:  This was written in 2022, prior to the AI uprising and the sudden massive contraction of the tech industry.  

Disclaimer 2:  I'm a Security Director / Architect / Engineer.  Below is my view on the industry landscape.  It is by no means complete and is likely an artifact of my personal journey.  Others' views will be different.  Deal with it.  🕶️

-=-=- 

Greetings!

Infosec / cyber security is a VAST field.  It's awesome that you're interested in software development as well, though!  If you become a competent programmer, that puts you in the top 10% of people in the cyber security field right off the bat.  (Hilarious, sad, and true.)

Speaking very broadly, you can divide infosec into some categories.  I'll write a brief summary of them as I see them:

Defensive side


IT Security / Blue Team
"Blue Teaming" deals with keeping networks, infrastructure, devices, and employees themselves safe.

At a non-tech firm:  This is the most ubiquitous infosec job out there, because it's a role that every organization with any real technology infrastructure needs.  It's also the most varied.  At a large non-tech-based company, these are the people who set up remote access solutions, maintain firewall rules, work out security changes for external vendors / custom applications, handle "vulnerability management" and patching of all the company-owned assets, manage the device / network inventory, &etc.   It's the least exciting and most thankless, and also the part of the field most approachable without specialized skills.  TO BE SURE: It does have a lot of skilled people in it, and in fact it's where I got my start.  It also has a lot of... not so skilled... people in it, just due to its vast and varied nature.

At a tech firm:  This same role at a technology based company adds a lot of complexity and specialized skills.  You're more likely to need to code, build automation, understand infrastructure (be it "on-premise" or "cloud"), and possibly even build your own security management tools from scratch, depending on how unique / specialized the company's infrastructure is. 

This is why "blue team" roles at, say, Delta Air Lines pay $60k/year but the same role at Google pays $120k/year.  Delta's "Security Engineers" click on firewall and VPN rules all day, maybe look at logs.  Google's "Security Engineers" write code and design systems to do heuristics and vet employee behavior to look for anomalies within their all-custom written infrastructure.  (I say this firsthand, having held that role at both companies.) 
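To make the tech-firm version concrete, here's a toy sketch of the kind of anomaly heuristic a blue teamer at a big tech shop might write and tune.  (Entirely hypothetical, not anyone's actual tooling: flag logins from (user, country) pairs that rarely appear in the login history.)

```python
# Toy anomaly heuristic (hypothetical): flag logins whose (user, country)
# pair has been seen fewer than `min_seen` times in the history.
from collections import Counter

def unusual_logins(events, min_seen=3):
    """Return login events whose (user, country) pair is rare in history."""
    history = Counter((e["user"], e["country"]) for e in events)
    return [e for e in events if history[(e["user"], e["country"])] < min_seen]

events = [
    {"user": "kc", "country": "US"},
    {"user": "kc", "country": "US"},
    {"user": "kc", "country": "US"},
    {"user": "kc", "country": "RU"},  # one-off location -> flagged
]
print(unusual_logins(events))  # -> [{'user': 'kc', 'country': 'RU'}]
```

A real system would baseline per-user behavior over time, weight by device and hour-of-day, and feed a review queue rather than a print statement, but the shape is the same: model "normal," then surface deviations.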


Policy & Compliance
Do you wish your life looked more like a lawyer's, but without the status and glamour of saying you're a lawyer?  Compliance might be right for you!  Seriously, this job is all paperwork.  If you're interested in the legal side of things, God knows there's a lot of demand for people who are willing to sit down and mediate between IT / engineering and compliance auditors.  That's really what the compliance field is:  you take the requirements from the auditors, you get the answers from the engineers / IT staff, and you discuss "compensating controls" or "mitigating factors" for the (oft-badly-written) compliance requirements your company isn't meeting.   And that's as rosy as I can make it sound.


Digital Forensics / Incident Response (DFIR)
This is a specialized subset of the above... simply put, these are the people who reconstruct what went wrong and figure out how to respond when shit hits the fan.  E.g. you discover that a system has been compromised and is leaking your users' personal information to an organized crime syndicate.   Or you find out that someone hacked into the systems and transferred millions of dollars to a foreign bank.  Now what?  Well, that's where DFIR comes in.  This team knows how to triage to determine what happened while not contaminating evidence, not accidentally covering up the attacker's tracks, or in some cases not even tipping off the attacker that you know they're there.  It's a weird intersection of information security, data science, legal processes, and investigation / detective work.  Needless to say, it's not a field that every company staffs or maintains.  Many large organizations instead contract out this work to vendors (like Mandiant, or SecureWorks, or even big consulting firms like Gartner or IBM.)   But specialized / tech firms like Google, Facebook, Microsoft, &etc have to have their own full-time DFIR teams... because their entire infrastructure is specialized / custom (so a Mandiant wouldn't know where to begin anyway) and because the demand is high enough to warrant keeping a full team of specialists busy looking for bad stuff.  You're likely to find DFIR teams at any of the smaller tech companies that build online products that touch the public (e.g. everything from Activision / EA Games to Patreon) for similar reasons.

A related offshoot of DFIR is the "community safety" / privacy / anti-abuse teams.  But these are very company specific, as the "privacy" needs of, say, Facebook are very different from the privacy needs of EA Games, or OnlyFans.  


Offensive side

Red Teaming
The "red team" is the opposite of the blue team.  They test a (usually large) organization's security by carrying out simulated (or often, actual) attacks against everything from the physical security mechanisms to the employee access methods to the products themselves.  This work is generally 30% preparation, 20% hacking, 50% report writing.  (Because the coolest hacks in the world aren't useful to your employer if you can't document what went wrong and help them figure out how to prevent it!)  This role is usually only the domain of major tech firms or software development houses... but the similar fields of "Offensive Security" / "Penetration Testing" are becoming more commonplace at non-tech firms that build critical infrastructure or systems.  (E.g. gas and utility companies, avionics / aerospace, public transportation, defense, &etc.)


Bug Bounty Programs
Similar to "Red Teams" except crowdsourced.  Every company that has a "bug bounty" program has to have a team of people to ingest the vulnerability reports, triage to determine whether the vulnerability is "real" and also meets the scope of the program, determine the "severity" (which usually directly determines the payout to the bug hunter who found it), and help the application development or infrastructure team assess and prioritize fixing it.

Bug hunters that participate in bug bounty programs are kind of the "mercenaries" of the information security world.  They're usually freelancers who pick an org with a bounty program (e.g. Microsoft, or Google), then pick a specific area (e.g. Microsoft Azure Cloud), then pick a product within that area, and pour hours into it, hoping they can find some critical flaw and submit it to Microsoft for a reward.


Consulting / "Pen Testing" Organizations 
There are, of course, vendors who will do this for you.  And those vendors employ professional "pen testers" for hire.  Double the amount of pre- and post-exploitation paperwork in this case, because unlike an embedded "red team", you knew nothing about your target before you got the contract to test it.  Small government agencies, medium-sized companies that make valuable / money-handling products, casino games, &etc. are the types of orgs that hire pen testing orgs.


Development / Research


Application Security (AppSec)
Application Security deals with making sure custom applications are secure / don't behave in ways other than the developers intended.  Application developers are told "security is everyone's responsibility," but the truth of the matter is, their focus is on making whatever they're responsible for work.  Making it secure is often secondary.  Any given function written by a developer is likely to be tested against obvious errors to make sure it works; making sure it's secure means ensuring it behaves as expected when every possible bad thing that can go wrong is forced to go wrong.  Because that's exactly what a hacker will do.

As famed cryptographer Bruce Schneier once quipped:  Writing a reliable computer program requires "programming Murphy's computer." (Whatever can go wrong, will go wrong.)  But writing a secure computer program requires "programming Satan's computer."  (Whatever can go wrong will be forced to go wrong, again and again, in every possible combination.)
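That "Satan's computer" mindset can be sketched with a toy example (the discount-code checker here is hypothetical): the "Murphy" test checks that valid input works, while the "Satan" tests force the hostile inputs an attacker would actually send.

```python
# A hypothetical function hardened against hostile input.
def apply_discount(price, percent):
    """Return price after a percent discount, rejecting hostile input."""
    if not isinstance(percent, (int, float)) or not (0 <= percent <= 100):
        raise ValueError("discount must be a number between 0 and 100")
    return price * (1 - percent / 100)

# Murphy's computer: the case a developer naturally tests.
assert apply_discount(100, 10) == 90.0

# Satan's computer: negative discounts (price increases), >100% discounts
# (negative prices), type confusion, and NaN all get forced on purpose.
for evil in (-10, 150, "10; DROP TABLE orders", float("nan")):
    try:
        apply_discount(100, evil)
        raise AssertionError("hostile input was accepted")
    except ValueError:
        pass  # correctly rejected
```

Note that the NaN case slips past a naive `percent > 100` check but fails the inclusive range test, because every comparison against NaN is false; that's exactly the sort of edge an attacker hunts for and a happy-path test never exercises.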

The "grunt work" of AppSec is things like "build a login system for this application" or "hey we built this, do we need logs?  How much logging?  How much detail?  Where does it go?" or "how do we store and authorize seven different privilege levels / roles for our app?"  The "oh shit!" work of AppSec is "Hey we just found out that our already deployed, super valuable app is being exploited and we need to fix it ASAP!  Oh, and we don't know how they're exploiting it yet..."

AppSec is a subset of larger application development / software engineering in general.  And usually an underfunded / understaffed one.  <snark>


Cryptography
Cryptography is the field of deep, deep mathematics + software engineering used to build algorithms that the rest of the industry has strong opinions about but, when pressed, doesn't understand as well as it claims to.  The best cryptographers in the world, that is to say, the only ones truly qualified to look at a cryptographic system and tell you whether it's actually secure or not, would all fit in a minivan together.  Crazy specialized field.  If you're really a cryptographer, you're either a PhD candidate somewhere or you work for a shadowy government agency that probably has a cover story for how you're just an "analyst".

Everyone else just uses the products of those aforementioned experts.  Anyone who doesn't fit the former description but designs their own encryption scheme is an idiot.  Beware. 

Tangent: do not confuse "cryptography" and "cryptocurrencies".  One is a very critical field that depends on extremely smart experts and is an integral part of how modern society functions.   The other is a pyramid scheme full of manchildren who claim they want to "decentralize the system" while ultimately just trying to scam each other to make themselves rich, fucking over everyone else and the environment in the process.   "Crypto" actually stands for cryptosporidium - the parasite that gives you diarrhea from public pools.  But I digress...


Security Research
I have to put a catch-all here for general "research" work because there is sort of a long tail of non-specific research, academic, and development work that falls into this bucket.  Everything from the development of new authentication & biometric systems to standardization of things like End-to-End Encryption (E2EE) for instant messaging, transaction security for online payments, client-based security, reverse engineering malicious software, &etc.  These are all specialized areas that are usually offshoots of someone's specialization in a college program, but occasionally someone self-taught in the open-source world comes out of left field with something in one of these areas and blows everyone away.   (Which is awesome.)


So there's a rundown of "cyber security" areas as I see them.  Though I probably missed some.  And there are a lot of offshoots / in-between roles.  E.g. "SRE / Infrastructure" has a LOT of overlap with security, and you'll often find infrastructure engineers embedded in security teams or vice-versa.   Same for the SWE / AppSec areas.   It's an ever-evolving field.

Personally?  I did not go to school for it.  I started as a computer tech as a teenager, moved on to being a network engineer in the 1990s, and naturally gravitated to security because it was an area I was always interested in and good at, even before it was a commercially viable field.  So I've kind of grown up with it, self-taught all the way.  I have my CISSP certification (though don't start there; it's not really a hands-on or entry-level cert).   I think certifications are a perfectly fine way to go, but personal experience is even more important IMHO. 

There are lots of online programs covering all of the above.

Happy to continue to answer questions as you think of them!

Cheers,
 - K.C.

-=-=-
