Josh:

Back when I was in college,

I worked in IT at a manufacturing company

over the summer and during breaks between semesters.

It was your typical run-of-the-mill stuff,

resetting passwords, replacing sticks of RAM,

cleaning up random viruses,

but there was one project that was truly different for me.

There was this old computer in the

warehouse. It ran a crane that would go down rows of shelves, pick up the products this company

produced, which could weigh over a thousand pounds, or 450 kilograms for my European friends,

and move them to different staging areas. This computer was old. The fan vents were caked with

dust, and it no longer had a support contract for the hardware. It needed to be upgraded,

but no one wanted to do it. It's one thing to upgrade a computer that runs some software,

but you quickly realize no one wants the responsibility of altering a computer that

integrates with real-world hardware. It's like dealing with a printer, and if there's one thing

common among all IT people, it's that no one wants to deal with printers. Long story short, I got it all to work.

I swapped the custom control card into the new server they had purchased, upgraded the OS,

and got it up and running. Granted, it took a few tries and didn't work initially,

but eventually, we got it. That was my first and only time dealing with an industrial control

system, or ICS for short. But across the globe, in 2010, something was uncovered. A cyber weapon

that caused real, physical destruction to industrial control systems. This wasn't your typical malware

that steals credit cards or encrypts files. This was something that literally destroyed industrial

equipment by making it tear itself apart. The fascinating part is that when the damage was

happening, no one knew why. Centrifuges were mysteriously failing at Iran's nuclear facilities,

and engineers were baffled. It wasn't until later that the truth came out. An incredibly sophisticated

computer worm called Stuxnet had infiltrated their systems and was sabotaging the equipment from the

inside. From nation states using malware as a weapon to zero-day exploits, on this episode of In the Shell.

It takes you longer to do something by putting it into a computer and calling it up again than if you just kept simple records yourself in the house.

Let's start with what made Stuxnet different. Most malware is pretty indiscriminate. It spreads far and

wide, trying to infect as many systems as possible. Stuxnet, though, it was like a guided missile. It was

programmed to check every system it infected against a very specific set of criteria. If the system didn't

match exactly what it was looking for, it would sit dormant, doing nothing. So what was it looking for?

Stuxnet was specifically designed to target Siemens S7 PLCs, programmable logic controllers. These are

specialized computers that control industrial equipment. It was looking for ones connected to

very specific types of frequency converter drives made by two companies, one in Finland and one in Iran.

But even that wasn't specific enough. The malware would also check the frequency these drives were running

at. It was only interested in systems operating between 807 and 1210 hertz. Most industrial equipment

doesn't run at these frequencies. But you know what does? Nuclear centrifuges used for uranium enrichment.
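To make that targeting logic concrete, here's a minimal sketch of the kind of multi-criteria check described above. Everything in it is a hypothetical stand-in, not Stuxnet's actual code, though the 807 to 1,210 hertz window matches the range reported by researchers.

```python
# Illustrative sketch only: a simplified version of the target check
# described above. All names and values are hypothetical stand-ins.

TARGET_VENDORS = {"vendor_fi", "vendor_ir"}  # the two drive makers (Finland, Iran)
FREQ_MIN_HZ, FREQ_MAX_HZ = 807, 1210         # reported target frequency window

def is_target(plc_model: str, drive_vendor: str, drive_freq_hz: float) -> bool:
    """Return True only if every criterion matches; otherwise the worm stays dormant."""
    if plc_model != "siemens_s7":
        return False
    if drive_vendor not in TARGET_VENDORS:
        return False
    return FREQ_MIN_HZ <= drive_freq_hz <= FREQ_MAX_HZ

# A system that misses any single check is left alone:
print(is_target("siemens_s7", "vendor_fi", 1064))  # → True, matches every criterion
print(is_target("siemens_s7", "vendor_fi", 60))    # → False, wrong frequency
```

The point of the sketch is the guided-missile behavior: any one failed check short-circuits to "do nothing," which is why the worm sat dormant on the vast majority of machines it infected.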

Think about how much intelligence gathering had to go into this. The attackers needed to know exactly

what equipment Iran was using in its nuclear program, right down to the specific models and configurations.

This wasn't something a couple of hackers could throw together in their basement.

This required nation state level resources and intelligence. The way Stuxnet spread was equally clever.

Since many of these industrial systems weren't connected to the internet for security reasons,

the malware was designed to propagate through USB drives.

Once it infected a computer, it would copy itself to any USB drive connected to that machine.

When that USB was plugged into another computer, it would infect that system too.
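That sneakernet spread can be reduced to a toy simulation. This is purely illustrative; the machine and drive names are invented, and the logic is just the two rules described above, infected machines contaminate drives, and contaminated drives infect machines.

```python
# Illustrative sketch only: a toy simulation of USB "sneakernet" spread.
# Machines and drives are invented labels, not real systems.

def simulate_spread(events, initially_infected):
    """events: ordered list of (machine, usb_drive) plug-in events."""
    infected_machines = set(initially_infected)
    infected_drives = set()
    for machine, drive in events:
        if machine in infected_machines:
            infected_drives.add(drive)      # worm copies itself onto the drive
        elif drive in infected_drives:
            infected_machines.add(machine)  # drive carries the worm to a new machine
    return infected_machines

# One internet-facing PC seeds a drive that later reaches another machine:
events = [("office_pc", "usb1"), ("engineer_laptop", "usb1"), ("plc_workstation", "usb2")]
print(sorted(simulate_spread(events, {"office_pc"})))  # → ['engineer_laptop', 'office_pc']
```

Notice that the air-gapped workstation stays clean only because no contaminated drive ever reached it; one careless plug-in event is all it takes to bridge the gap.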

Once Stuxnet found its target, it recorded normal operational data from the centrifuges for about three weeks.

Then, while it sabotaged the centrifuges by altering the rotation speeds, it replayed the recorded data to the operators.

So while the centrifuges were literally tearing themselves apart, everything looked normal on the monitors.

It's like something out of a heist movie, where they loop the security camera footage while the thieves sneak past.
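The record-then-replay trick can also be sketched as a toy loop. Again, this is a hypothetical illustration of the concept, not real control-system code: first record "normal" readings, then loop them back to the operators while the true values go haywire.

```python
# Illustrative sketch only: the record-then-replay deception described
# above, reduced to a toy class. All names are hypothetical.

from collections import deque

class ReplayingMonitor:
    def __init__(self, record_len: int):
        self.recorded = deque(maxlen=record_len)  # stands in for ~3 weeks of readings
        self.sabotaging = False

    def read_sensor(self, true_value: float) -> float:
        if not self.sabotaging:
            self.recorded.append(true_value)  # phase 1: record normal operation
            return true_value
        # phase 2: replay old "normal" readings to the operator screens
        old = self.recorded.popleft()
        self.recorded.append(old)             # loop the recording, heist-movie style
        return old

mon = ReplayingMonitor(record_len=3)
for v in (1000.0, 1001.0, 999.0):             # normal operation gets recorded
    mon.read_sensor(v)
mon.sabotaging = True
print(mon.read_sensor(5000.0))                # → 1000.0: reality is wild, screens look normal
```

The operators' displays only ever show values from the recorded window, which is why the engineers at Natanz saw nothing wrong while the hardware was being driven to destruction.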

The damage was significant.

Reports suggest that Stuxnet destroyed about 1,000 centrifuges,


at Iran's Natanz facility. But the attack showed remarkable restraint. The worm could have destroyed

much more, but it was programmed to limit its damage. It seems the goal wasn't to completely

destroy Iran's nuclear program, but to slow it down and make it unreliable. Stuxnet is widely

believed to have been part of a covert operation called Operation Olympic Games, reportedly

orchestrated by the United States and Israel to sabotage Iran's nuclear ambitions. This marked

one of the first known uses of a cyber weapon to achieve military objectives, representing

a significant shift in modern warfare. While neither country has officially admitted responsibility,

the level of sophistication, the intelligence required, and the specific target all point

to a state-sponsored attack. Stuxnet was described by experts as the most sophisticated

piece of malware ever seen. It used four zero-day exploits, which is like finding four golden

tickets in Wonka bars, except each ticket is worth hundreds of thousands or millions

of dollars on a black market. It even used stolen digital certificates to make itself

look legitimate. But here's the thing that keeps security researchers up at night. Stuxnet's

code is out there. It's been studied, dissected, and parts of it have been repurposed. The

techniques it pioneered have influenced a whole new generation of malware. As one researcher

put it, Pandora doesn't go back in the box. And speaking of that box, let's talk about

Stuxnet's legacy. In the years following its discovery, we've seen a dramatic increase in

attacks targeting industrial control systems. The Ukraine power grid attacks in 2015 and

2016, the attack on Saudi Arabian petrochemical facilities

in 2017, and the Colonial Pipeline incident all followed the blueprint Stuxnet created,

targeting critical infrastructure through their control systems.

More importantly, Stuxnet changed how we think about cybersecurity in industrial environments.

Before Stuxnet, many industrial environments operated under the assumption that their

specialized, isolated systems were safe from external threats.

This reliance on security through obscurity was shattered, as the malware showed that

even highly specific and air-gapped systems could be compromised.

It also highlighted vulnerabilities in SCADA, supervisory control and data acquisition,

systems, leading to increased scrutiny and investment in securing critical infrastructure.

This led to some fascinating developments in industrial cybersecurity.

Companies started implementing air-gap monitoring

systems that detect when an air gap has been breached. USB drives are now treated like

potential weapons, with some facilities banning them entirely, or using specialized USB sanitization

stations to thoroughly scan drives before use. There's also been a philosophical shift in how

we approach industrial system design. The idea of defense in depth using multiple layers of security

has become standard practice. Modern industrial systems are now built with the assumption that

they will be targeted, not if, but when. One particularly interesting development is the

rise of digital twins, virtual copies of industrial systems used to test for vulnerabilities without

risking the actual equipment. This technology was partly developed in response to the realization,

thanks to Stuxnet, that even the most isolated systems could be compromised.

It also changed how we think about attribution in cyberattacks.

The level of sophistication became a calling card in itself.

When security researchers see characteristics like multiple zero-day exploits,

highly targeted attacks, and complex evasion techniques,

they start looking for state involvement.

The ripples from Stuxnet continue to spread.

Every major nation is now believed to have some form of cyberweapons program.

We've entered a digital arms race where countries stockpile zero-day exploits

like nuclear weapons of the code world.

And just like with nuclear weapons,

there's ongoing debate about the rules of engagement,

international treaties,

and what constitutes an act of war in cyberspace.

In the Shell is written, researched, and recorded by me,

the Air Gap Engineer.

That brings us to...

The end of Season 1.

Thanks for sticking around.

I'm going to take a short break, and I'll be back with Season 2.

Season 2 is going to be stories of influential people in tech, so stay tuned.

If you're listening to this in an app that lets you rate shows, please take a minute to rate this one.

I would truly appreciate it.

That's it. Take care, and I'll see you next time.
