Here’s the scenario. It’s 1999. I’m working as a PM consultant and our client is (really big tech company). The job is to install a test environment of our software. Our software has been purchased by (really big tech company) because we have something they need in order to complete their massive drive to get on the Internets and be what was then called an e-commerce player. The thing is, as big as this tech company was, like most dinosaurs, they were a bit behind in the whole Internets thing. This was my third install, but it was also my first big-league install. That is, at the previous two installs there was some leeway and room for error. But those days were gone. I had to get this install right without delay, which meant that when we hit the “run” command, all systems did exactly that. And let me tell you, the whole thing felt real heavy.
Lawrence (names have been changed to protect family members still living) was my engineer. He was from Suffolk originally but somehow fell in love with some chick from Munich and they both lived in Belgium. At the time I was living in Düsseltal. Since Lawrence and I were consultants it didn’t really matter where we lived. Our motto in a world of out-sourcing run amok: be wherever you can give your bank transfer code to get paid. The small dot com that we worked for liked what we did and so they kept us together on projects like this. And so, we were stuck in the SW of Germanland, not far from the Black Forest.
I arrived by plane and then taxi. Lawrence arrived in his new Jaguar X-10. We were assigned a ten-square-meter glass enclosure in the middle of a football-field-sized hall filled with automaton cubicles that were all meticulously categorized via huge signs suspended from a fifteen-meter ceiling. The signs read “Customer Service” or “OS Operations” or “Printing Products” or “C++” or “Unix”, etc. Details aside, the thing to remember is that Lawrence and I were in a place that could only be compared to the holding pens of the Roman Coliseum during the games. For that is how we felt; that is how we were motivated. We didn’t care if we were out-sourced gladiators either. We had one up on all the salaried employees in that huge, multiple-football-field-sized room: they had called us!
The ten-square-meter room was made of plexiglass and it had no roof. It was labeled “Installer Box” and everything in it was (supposed to be) disconnected from the rest of (really big tech company). Our job was simple. Install our software, get it working, prove that it works, turn it over to the client and then they’ll call us when they need us. That’s pretty much how I got 2k Euros/day back then. But that’s neither here nor there. Upon arrival at (really big tech company) we were briefed on protocol and told several times that preventing virus and malicious-code contamination was of the highest priority. The thing is, we had a lot of code with us. For this install we had twenty-five CDs. Each CD had to be unzipped and copied to a CPU. Even though I was a relatively useless participant at this point, because I was just the PM, the install, compile, and run process took about four hours. Once that was done and I got the green light from Lawrence, I would then notify the client (the PM is the front man). Then the geeks would surround us and we’d flip the switch and they all would get goo-goo-eyed at our brilliance; a few would even ask if we had any job openings.
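For the curious: the grunt work of that four-hour slog boils down to something like the sketch below. This is my own minimal illustration, not our vendor’s actual tooling; the paths and the build step are invented for the example.

```python
# Hedged sketch of the install routine: unpack each CD archive onto the
# target box, then compile and launch. Paths here are made up.
import zipfile
from pathlib import Path

def install_archives(archives, target):
    """Unpack each zip archive into `target`; return the extracted file names."""
    extracted = []
    for arc in archives:
        with zipfile.ZipFile(arc) as z:
            z.extractall(target)           # one CD's worth of code at a time
            extracted.extend(z.namelist())
    return extracted

# After all twenty-five archives are unpacked, the build-and-run step would
# have been something like (hypothetical commands):
#   subprocess.run(["make", "-C", target], check=True)
#   subprocess.run([str(Path(target) / "bin" / "run")], check=True)
```

Multiply that loop by twenty-five CDs on 1999 hardware and you see where the four hours went.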
Now, dear worst reader, with that little scenario in mind, let’s worst write today about computer security. As I’ve said, there is one major thing that most people never even think about when it comes to the gadgets they use on a daily basis, which makes them no different from the big gadgets governments and corporations use. Software does not work. Let me put that another way. Software can never work. Wait. One more try. Software must fail. Did I make that clear? The problem is, as clear as I made that, it ain’t the half of the potential problem at hand. There is one other thing that’s really, really sucky about software: there is no way to make software completely immune to malicious code.
The fact is, while we were installing software on supposedly “secure” machines in an install-box, there was (and is) no way (really big tech company) could have prevented us, had we been so inclined, from disrupting their entire network. Plexiglass-encased room or not, while I sat there and patiently waited for Lawrence to complete the daunting task of our install, I counted at least three significant breaches in their security that would have allowed me a major hacker home run.
The first breach was that prior to our using the install-box someone else had used it and forgotten his/her external disk drive. The second breach was that one of the computers in the room had recently been connected to the company’s network, because I could see the connection in the network preferences. Either a network cable had been run into the room or someone had taken that PC somewhere else in the building. In fact, I knew the date of the connection (the previous work day) and I could also see the network address. The preference panel also showed that this connection was made on a regular basis, so there was no reason to believe that it wouldn’t be connected again. The third breach, and this was the doozy, was that although we thought, after the install and setup, that we’d be working from dumb terminals, our hosts told us to go ahead and work directly from the PC that we installed on, which meant that we would have to be given administrator access to it above and beyond just installing our system. Trust among colleagues all on the same team, eh.
As if the war on terror ain’t enough, there has to be something in the wings to preoccupy the paranoid mindset of all future warriors. And in this case the preoccupation might have been worth it. The lack of security mentioned above represents a best-case scenario for hackers the world over looking for ways to get access to computers in order to partake in the up-and-coming cyber wars. With encryption technology it’s actually not that easy to get access to systems via networks. The best way to do it is to have direct physical access to those systems. And since software simply doesn’t work, a system must be constantly accessed by someone, somehow. Of course, my scenario was then (1999) and this is now. Have things changed? The great thing about computers: other than design and screen clarity, nothing changes. During the time I was helping companies get online so they could sell shit, I had no idea that the breaches I discovered would turn out to be the same ones used years later to bring down the computer-controlled centrifuges the Iranians were using to enrich uranium. The thing is, computer experts all know that software is the problem. They are also well aware of the potential for being attacked. One of the easiest ways to prevent such an attack is to simply not give access to the system. What a pickle they all must be in, eh. Obviously, when I was installing our system at (really big tech company) they thought they were secure by putting us in an install-box. The reality is, if I wanted to, I could have easily injected malicious code into their network via one, two, or all three breaches in their security.
When I read this arstechnica article I couldn’t help but recall my dot-com days: bored in an install-box, looking busy and waiting for Lawrence to do his thing. Wow, I thought, nothing has changed; if you get access to the system it’s easy as pie to take it down. According to an upcoming book by David Sanger, “Confront and Conceal: Obama’s Secret Wars and Surprising Use of American Power”, the Americans and/or the Israelis were able, through a double agent, to inject malicious code, probably via a simple USB stick, into the German-made CPUs that controlled the centrifuges. The code tells the centrifuges to over-spin, which ends up breaking them. A whole bunch of these really expensive and hard-to-get centrifuges were taken down, practically halting the Iranians’ ability to continue with their nuclear ambitions. The details of what this code does are mind-boggling. But what’s more amazing is the secret-agent-like method that was employed to get the code into the system. Oh, and from what they are reporting, there is one thing the brilliant hackers forgot: a way to turn the code off. Even though it did what it was supposed to do, it didn’t stop there. Supposedly it’s out in the Internets as you read this. But who really cares. We’re on the right side of all these wars, right?
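The destructive idea itself is simple enough to model in a few lines. What follows is my own toy illustration, not Stuxnet’s actual logic, and the RPM numbers are invented: a rotor has a mechanical limit, and code that pushes the setpoint past that limit wrecks the machine.

```python
# Toy model of the over-spin attack. All numbers are invented for
# illustration; real centrifuge parameters are classified anyway.

RATED_RPM = 63_000   # assumed normal operating speed
BURST_RPM = 80_000   # assumed mechanical failure threshold

def spin_up(setpoint, rpm=0, step=5_000):
    """Ramp the rotor toward `setpoint`; return (final_rpm, still_intact)."""
    while rpm < setpoint:
        rpm = min(rpm + step, setpoint)
        if rpm > BURST_RPM:
            return rpm, False   # rotor disintegrates past its limit
    return rpm, True

# An honest controller asks for RATED_RPM and the rotor survives.
# A malicious controller asks for a setpoint past BURST_RPM and it doesn't.
```

The real malice, by all accounts, was in hiding that ramp from the operators while feeding their monitoring screens normal-looking numbers.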
- Confirmed: US and Israel created Stuxnet, lost control of it | Ars Technica
- Stuxnet – Wikipedia, the free encyclopedia