In 2009, I moved from New Jersey to Virginia and became a business partner at a consulting and product company. Despite having worked in software since 1998, it was only then that I started to become comfortable in my own code. These retrospective posts are rewrites of thoughts I had back then on the industry and my career up to that point.
When a company's network gets compromised by a virus, it can set off panic alarms throughout the workplace. With Internet access a "must" for every company (every company is an IT company), exposure and risk are both high, and the single biggest threat is the human element. The great computer and network penetrations of the '80s and '90s, Kevin Mitnick's adventures being a good example, were largely exercises in social engineering.
"A company can spend hundreds of thousands of dollars on firewalls, intrusion detection systems and encryption and other security technologies, but if an attacker can call one trusted person within the company, and that person complies, and if the attacker gets in, then all that money spent on technology is essentially wasted."—Kevin Mitnick
I've had a few experiences cleaning up enterprise networks in the past, and the cause is always some mixture of careless human behavior and misconfiguration. Whether you're getting hacked because someone doesn't understand login hygiene or there's a legitimate worm moving through your system, the care needed to clean up networks, computers, and infrastructure is unparalleled, to the point where some security experts suggest completely wiping machines, and even replacing hardware, in highly sensitive situations. That was the case after the January 6th insurrection, when many security experts were concerned about trespassers traipsing around Congressional offices, noting among other things that Nancy Pelosi's computer had been left unlocked.
In one of my own early cases (circa 2004), the company I worked for was called in to help one of our clients fight a network-wide infection that had crippled their entire operation. The infection permeated every single desktop computer and consisted of three separate viruses. Each time any one computer was cleaned, a virus was pushed back out to it from one of the others. The network was toast and the company was handcuffed.
Time for the keyboard cowboys? Most of us were young at the time and just naive enough not to care about going home until the job was finished.
We showed up at 7:00 a.m. on Thursday morning and set about a network cleaning process that ran 21 straight hours: no sleep and very little food.
The entire network needed to be brought down, and each computer was isolated and cleaned. We needed to kill very specific processes and run very specific tools based on security data we had researched (i.e., Googled like a boss, plus hitting a few well-known security resources that cataloged viruses at the time) just to get each computer running to a point where anti-virus software could work. After running a few different anti-virus programs (in a situation like this, if you're going to run one, you might as well run them all), the computer was then safe to reboot. At that point, we installed a Microsoft-based anti-virus product to help keep the individual computers clean. _This was in the early days of Microsoft taking security seriously, so you still had to download and install their first attempt at what would become Windows Defender._
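The first step on each isolated machine, matching running processes against a researched denylist, can be sketched roughly like this. This is a minimal illustration, not the tooling we actually used; the process names are placeholders, since the real malware names are long forgotten:

```python
# Hypothetical denylist built from virus research; these names are
# placeholders, not the actual malware from this incident.
MALICIOUS_NAMES = {"msblast.exe", "dllhost32.exe", "svch0st.exe"}

def pids_to_kill(process_table, denylist=MALICIOUS_NAMES):
    """Given (pid, name) pairs from a process listing, return the pids
    whose names match the denylist, case-insensitively."""
    return [pid for pid, name in process_table
            if name.lower() in denylist]

# On the isolated machine, the returned pids would then be terminated
# (e.g. os.kill(pid, signal.SIGTERM) on a Unix-like system, or the
# task manager / taskkill on Windows) before running the removal
# tools and anti-virus scans, and only then rebooting.
```

Only after the malicious processes were dead could the removal tools and anti-virus scanners actually finish their work without being re-infected mid-run.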
Once all of the desktop computers were clean, we needed to start working on the servers. Not only did the servers need to be cleaned of any infection, but permissions needed to be reset (and reduced), and the network's wiring was such a mess that nobody knew which network cables went where. We ended up having to rewire and color-code the entire network. By the way, don't underestimate the security benefits of well-organized server racks and cables. Had we been dealing with malicious hardware hidden in that tangle, nobody would have found a thing.
21 hours later, the network was clean, the desktop computers were hooked back into the network and running anti-virus software, and all seemed right in the client’s world again. The pizza was ordered and the CEO ventured in with a case of beer as a bonus.
No network is so compromised that it can't be cleaned (unless you're dealing with a highly sensitive intrusion that warrants a wipe-and-replace strategy), but at 21 straight hours with a team full of network IT professionals, this was the single most expensive project that particular client had ever needed to sign off on. All of it could have been avoided with proper security permissions on the servers and anti-virus/anti-spyware software on the desktop and laptop PCs (there was none when we first looked at the computers; again, not rocket science, just human oversight).
But when I think back on that foray into high-intensity troubleshooting, I'm also reminded that this was when I discovered that the same company stored passwords and credit card information in their databases as plain text. On top of that, the CEO of the company I worked for (which had built this software before I joined) did nothing to remediate a clear policy violation even after I told him. So when I talk about the human element of security risk, it isn't just the people who don't know better; it's also the people who do know better but don't do anything about it.
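The frustrating part is that doing this right doesn't take much. A minimal sketch of salted password hashing using only Python's standard library (the iteration count is illustrative, and credit card data has its own, stricter rules under PCI DSS):

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; tune for your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); store both, never the plain-text password."""
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                 salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                    salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

With a scheme like this, even a full database dump doesn't hand an attacker anyone's password; the plain-text approach that company used hands over everything at once.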
Software might be eating the world, but what are we doing to ensure that the people building and implementing that software follow best practices? Does software engineering need licensure like other engineering fields? This has always been an issue with software development and technology, but as we move further into advanced mathematics for future-tech like artificial intelligence and blockchain, we see layers built around these technologies not just to make them easier for users, but easier for mid-tier and front-end application developers to interface with. If that's the case, does the black box just get more mysterious?