Josh:

It was the year 1999.

My family had what everyone had at the time, the family computer.

My mom and I had recently left Fry's Electronics, a store which is, sadly, no longer

in business.

She let me pick out a computer game for Christmas, and I picked NASCAR 2000.

I was never into NASCAR, but I wanted a cool car game that promised the best graphics.

If you saw it now, you'd wonder how those graphics could have been considered good,

but it was the 90s. I checked the back of the box to make sure our family computer met

the requirements. RAM, CPU, it all looked good. But when I got home and installed it, it didn't

work. So my dad started looking into it. He talked to some people at work to get some ideas on what to

do. He kept tinkering for the next few days until finally, on December 31st, 1999, around 8pm, I was

able to play my game. Ivan had a joystick that one of our family friends gave us, so playing this game

was a blast. And the part that ended up being the issue was the graphics card. And on a side note,

this was Y2K, and everyone sat around with flashlights and emergency supplies, waiting for the

computers that tracked the year with only two digits instead of four to turn over to the year 2000

and reset to 00, bringing the entire system as we know it crumbling to the ground. But hey,

my computer game worked. The piece of a computer that turns bits into graphics, on this episode of

In The Shell. The wrong thing to do is just go out and buy a computer and then learn about it. You'll

learn, but you'll learn a lot of things that maybe you didn't want to learn. A computer that you buy

today will likely be obsolete six months from now, and there's not a dang thing that you can do about it.

My name is Josh, and I'm able to keep this podcast independent and advertisement-free

because of support from listeners like you. If you are finding value in what I'm doing here,

consider becoming a paid supporter at members.sideofburritos.com. And as a thank you,

members get early access to new videos, ad-free versions of everything, bonus content,

and access to a live monthly Q&A. Thanks for considering. Now let's get back to the show.

A GPU, or graphics processing unit, is essentially a specialized processor designed to handle lots of

calculations at the same time, especially for rendering images and videos. The graphics part

of its name comes from its original purpose, which is taking on the heavy lifting of drawing everything

you see on your screen, from the 3D landscapes in video games, to the smooth playback of videos,

so that the main processor, the CPU, doesn't get bogged down.

In simple terms, if the CPU is the brain of your computer, tackling many different tasks

one by one, then the GPU is like a muscle that excels at performing hundreds of thousands of tiny tasks

simultaneously. Modern GPUs contain hundreds to thousands of smaller cores

that work in parallel, which means they can crunch through data with massive throughput.

This makes them ideal for tasks that can be broken into many parts done at once,

like coloring millions of pixels on a screen, or doing the math for 3D graphics and images.
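To make that "many tiny tasks at once" idea concrete, here's a toy sketch in plain Python. This is my own illustration, not something from the episode: a little "shader" function that brightens one pixel. Because no pixel depends on any other, a GPU can hand each one to a separate core and compute them all at the same time. Plain Python runs the loop serially; the point is that the work is independent.

```python
def brighten(pixel, factor=1.2):
    """Scale an (r, g, b) pixel by a factor, clamping each channel to 255."""
    return tuple(min(255, int(c * factor)) for c in pixel)

# A tiny 2x2 "image"; a real frame has millions of these pixels.
image = [(100, 150, 200), (255, 0, 0), (0, 255, 0), (10, 10, 10)]

# The same operation applied to every element -- "embarrassingly parallel" work,
# exactly the shape of problem a GPU's thousands of cores are built for.
brightened = [brighten(p) for p in image]
print(brightened)
```

On a GPU, each of those `brighten` calls could run on its own core simultaneously instead of one after another.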

For example, in a video game, the GPU rapidly calculates lighting, textures, and geometry for

every frame, drawing complex scenes many times per second. Without a GPU, your CPU would struggle

to draw graphics and run the game logic at the same time. Over the years, GPUs have also become

more programmable and flexible. This means that beyond just drawing images, people found ways to use

GPUs for other computation-heavy tasks, not just graphics. Today, GPUs aren't only for pretty visuals,

they are central to fields like AI, artificial intelligence, which is just a bunch of if statements, I am just kidding, scientific

simulations, video editing, and more. In fact, the recent AI boom is largely fueled by GPUs

because training AI models involves tons of math that GPUs handle really well.
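That "tons of math" is mostly matrix multiplication: inputs times weights, layer after layer, billions of times during training. Here's a rough sketch of my own, not from the episode, showing the core operation. Notice that every output cell is an independent dot product, which is why a GPU's many cores can compute them all at once.

```python
def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    cols_b = list(zip(*b))  # the columns of b
    # Each output cell is one dot product, independent of all the others.
    return [[sum(x * y for x, y in zip(row, col)) for col in cols_b]
            for row in a]

# A toy neural-network layer: one sample with two features, a 2x2 weight matrix.
inputs = [[1, 2]]
weights = [[3, 4], [5, 6]]
print(matmul(inputs, weights))
```

Scale those matrices up to thousands of rows and columns, repeated billions of times, and the GPU's parallelism is the difference between training in days versus months.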

So what's inside a GPU, and how is it different from a CPU under the hood?

The key is parallel processing. A typical CPU might have 4 to 8 cores, or a dozen or two

in high-end chips, each designed for general purpose computing, and optimized for doing

one thing at a time very quickly. A GPU, by contrast, has many more cores, often hundreds

or thousands, but each core is a bit simpler and focused on a specific kind of operation

that can be done in parallel. For instance, an advanced graphics card today might have over 20,000 individual

processing cores on one chip. To compare GPUs and CPUs, you can think of it this way.

If you have a huge task that can be split into lots of identical subtasks, a GPU shines.

Imagine painting a giant mural composed of thousands of small tiles. A CPU is like a

single master painter who paints each tile one by one with great precision. A GPU is like 1,000

painters working side by side, each quickly coloring their own tiles simultaneously.

The GPU's approach would finish the mural much faster for this kind of repetitive task.

In technical terms, GPUs use architectures that allow them to perform the same operation on many

pieces of data at once. This is something called SIMD, Single Instruction Multiple Data, or vector processing.

It's perfect for graphics, where say the same shading calculation

needs to be applied to millions of pixels, or the same geometric transform to thousands of vertices.
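Here's a tiny sketch of that second case, the same geometric transform applied to many vertices. Again, this is my own illustration in plain Python, not from the episode: one instruction (a 2D translation), many pieces of data (the vertices), which is the SIMD pattern in miniature.

```python
def translate(vertices, dx, dy):
    """Apply the same shift to every (x, y) vertex -- one instruction, many data."""
    return [(x + dx, y + dy) for x, y in vertices]

# Move a triangle's three vertices by the same offset. A GPU's vector units
# do exactly this, but to thousands of vertices in a single pass.
triangle = [(0, 0), (1, 0), (0, 1)]
print(translate(triangle, 10, 5))
```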

To enable this parallel processing, GPUs are optimized in a few ways.

They have wide memory buses and specialized memory, VRAM, to feed the data to all those

cores at high speed. VRAM, or video RAM, is the dedicated memory on a graphics card that stores

things like textures, images, and other data the GPU is working on, so that the data is close by

and can be accessed extremely quickly. If the GPU had to constantly reach out to your regular system

RAM over the motherboard,

it would be too slow. The trade-off with the GPU's design is that its cores are not as flexible

or as fast at single-threaded tasks as a CPU's core. It's not an exaggeration to say that GPUs

are in high demand today. Gamers covet powerful GPUs because a better GPU means higher game detail,

smoother frame rates, and the ability to play at high resolutions or on multiple monitors.

But beyond the traditional graphics uses, GPUs have entered the godlike tier because of AI and

machine learning. Because how are you supposed to survive without a fridge that has AI built in

to tell you that you are running low on something? You can't possibly be expected to open the door

and look. Training a modern deep learning model involves doing billions

of math operations. CPUs can do this, but GPUs do it far faster. This has made GPUs indispensable

for AI research and development, and for companies committed to losing money, never making a profit,

and stealing information, also known as OpenAI. This exploding AI demand, since around late 2022,

has led to companies grabbing as many high-end GPUs as they can, often before they even leave

the factory. Another big demand driver a few years ago was cryptocurrency mining. Certain

cryptocurrencies could be mined efficiently with GPUs, and during crypto booms, miners purchased

GPUs in large quantities, causing prices to skyrocket. The mid-1990s saw the rise of the

first 3D graphics cards for PCs.

Companies like 3dfx, with its Voodoo cards, and ATI, which is now part of AMD, created

add-in cards that could render 3D graphics much faster than any CPU could, enabling a

new era of PC gaming with smooth 3D visuals.

The term GPU itself was popularized in 1999 by NVIDIA.

That year, NVIDIA introduced the GeForce 256, marketing it as the world's first graphics

processing unit.

Over about two decades, GPUs went from a niche add-on for playing Quake III Arena in high detail

to a cornerstone of modern computing.

GPUs have truly graduated from being the graphics sidekick of the CPU to driving the future of

tech.

In The Shell is written,

researched, and recorded by me, the Parallel Processing Podcaster, PPP for short.

If you're listening in an app that lets you rate shows, please take a minute to rate this one.

I would truly appreciate it.

I miss the days when GPUs used to generate graphics.

Now, they just generate opinions.

That's it. Take care, and I'll see you next time.
