AI Computing Power For Desktop PCs

You’d have to be living under a rock or completely ‘off-grid’ not to have heard about the developments in artificial intelligence (AI) over the past few years.

ChatGPT was really the first ‘mainstream’ AI product that many people heard about and could actually use. Now it seems every new tech product has some kind of AI feature built into it.

Despite all the overblown hype, I do think it is safe to say that AI is here to stay and is only going to get better over the next few years.

AI has already started to be integrated into Windows, and I think in time many other products will roll out AI features.

To take advantage of these new features you’re going to need a computer capable of powering them, and unlike most desktop software, AI performance is not tied to CPU performance.

AI Models Run On Graphics Cards

Graphics cards, not CPUs, power most AI workloads and functions, and the reason for this lies in how AI models actually work.

When you ask an AI to do something like create an image, it first interprets your request and then generates a rough initial image based on its training data. That image is then assessed and refined over and over again until the model decides it has completed the task.

Each of these refinement passes involves the model carrying out literally billions of simple mathematical calculations.

The quicker you can do these calculations, the faster your image is generated.
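
To put a very rough number on that, here is a toy sketch in Python. It is not a real image model and the layer size is made up purely for illustration, but it shows how even one small slice of one refinement step already adds up to tens of millions of individual multiplications and additions.

```python
# Toy sketch only - not a real image model. One 'layer' here is a single
# matrix-vector multiply, and the sizes are made-up illustrative numbers.
import numpy as np

weights = np.random.rand(4096, 4096).astype(np.float32)   # 4,096 x 4,096 made-up weights
activations = np.random.rand(4096).astype(np.float32)     # 4,096 input values

# One matrix-vector multiply: roughly 4,096 x 4,096 x 2 multiply-adds
output = weights @ activations

print(f"Operations in this single toy layer: {4096 * 4096 * 2:,}")
# A real model chains hundreds of layers like this for every refinement pass,
# which is how the totals quickly climb into the billions.
```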

Standard desktop processors are designed and optimised to have CPU cores that are fast and powerful when dealing with complex logic executed one step after another.

This is what most software needs to run well; it is very rare for everyday software to perform anywhere near as many calculations as an AI model does.

Desktop CPUs do have multiple processing cores (top-performing ones have up to 24), and these can execute instructions simultaneously, making them great for multi-tasking workloads.

Graphics cards tend to have far more processing cores; however, each individual core is nowhere near as powerful as one on a CPU.

For AI workloads, though, the individual calculations are not particularly complex; there are simply an enormous number of them to work through as quickly as possible.

A really high-end graphics card might have over 16,000 cores, all of which can work simultaneously.

Compared to the 24 cores on even a high-end desktop processor, it is easy to see why graphics cards are the best tool for the job when it comes to AI workloads: they can churn through this kind of highly parallel maths hundreds or even thousands of times faster than a traditional CPU can.
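
You can get a feel for the difference without any special hardware. The short Python sketch below is only an analogy (NumPy hands the maths to optimised bulk-arithmetic routines on the CPU rather than to a real graphics card), but it shows how processing a big pile of simple calculations all at once beats working through them one at a time. The exact timings will vary from machine to machine.

```python
# Analogy only: a plain Python loop (one value at a time) versus handing the
# whole calculation to NumPy's optimised bulk routines in one go.
import time
import numpy as np

values = np.random.rand(5_000_000).astype(np.float32)

# Sequential style: one value at a time
start = time.perf_counter()
total = 0.0
for v in values:
    total += v * v
loop_time = time.perf_counter() - start

# Bulk style: the whole array in a single call
start = time.perf_counter()
total_bulk = float(np.dot(values, values))
bulk_time = time.perf_counter() - start

print(f"One at a time: {loop_time:.3f}s  All at once: {bulk_time:.4f}s  "
      f"Speed-up: {loop_time / bulk_time:.0f}x")
```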

Running AI Models On Your Local Computer

When you interact with services like ChatGPT you are not running the model on your own computer; it runs on cloud servers paid for by Microsoft and ChatGPT’s creator, OpenAI.

When you enter a query it is sent to their servers, processed there, and the output is then sent back to your computer for you to see.

The server infrastructure needed to power all of this costs hundreds of millions of pounds to run, which is why most AI applications require payment to use; the free tiers are usually capped and sometimes offer a limited service.

There are also security issues with this kind of setup: OpenAI see and record every single query that ChatGPT is asked. For anyone dealing with confidential data, using hosted AI services may not be an option.

If you could run an AI model on your local computer, this would remove the need to pay for a third-party hosted service; it would also remove the security concerns, as no data would leave your local PC.

It is widely expected that as AI gets more efficient and further integrated into desktop software, many AI features will run on your local computer rather than in the cloud.
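
This is already possible for chatbot-style models today. As a minimal sketch, assuming you have installed a free local-model tool such as Ollama (https://ollama.com) and downloaded a small model beforehand, a query can be answered entirely on your own machine; the model name below is just an example.

```python
# Minimal sketch: ask a locally hosted model a question via Ollama's local
# web API. Nothing in this request leaves your own PC.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",   # Ollama's default local address
    json={
        "model": "llama3.2",                 # example model - use whichever you downloaded
        "prompt": "In one sentence, why do graphics cards suit AI workloads?",
        "stream": False,                     # return the whole answer in one go
    },
    timeout=120,
)
print(response.json()["response"])
```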

Based on this it could become important to take AI processing performance into consideration when buying a new computer.

Measuring AI Performance

There is no ‘easy’ way to measure how much better one graphics card will perform than another; different AI models and different queries have different requirements.

One figure that is published for most graphics cards is something called a TOPS score.

TOPS, or Trillion Operations Per Second, is exactly what it sounds like: a measure of how many trillion operations the graphics card (or processor) can perform each second.

Generally speaking, the higher the TOPS score, the faster it will perform on AI-type workloads.
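
As a rough, back-of-the-envelope way to see what the number means, you can treat TOPS as a best-case ceiling on throughput: divide the number of operations a job needs by the TOPS rating (times a trillion) and you get the minimum time it could take. The figures in the sketch below are made up purely for illustration, and real-world performance also depends on memory speed and software.

```python
# Back-of-the-envelope sketch: hypothetical numbers for illustration only.
def rough_seconds(total_operations: float, tops: float) -> float:
    """Best-case time in seconds for a chip rated at `tops` trillion ops/sec."""
    return total_operations / (tops * 1e12)

job = 50e12  # imaginary workload needing 50 trillion operations

print(f"At 10 TOPS:   ~{rough_seconds(job, 10):.1f} seconds")
print(f"At 100 TOPS:  ~{rough_seconds(job, 100):.1f} seconds")
print(f"At 1000 TOPS: ~{rough_seconds(job, 1000):.2f} seconds")
```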

To help you compare different graphics setups on our computers, we have added an AI Performance TOPS score to each graphics card option we offer. You can see the TOPS score alongside the star ratings for general CPU speed and multi-tasking capability on every computer we sell.

Switching the graphics option will change the TOPS score displayed.

Microsoft Copilot+ PCs, NPUs & Laptop Processors

As mentioned, Microsoft are starting to integrate AI features directly into Windows, and they also have a new certification of sorts for what they call ‘Copilot+ PCs’.

These are basically laptop computers whose processor has an NPU capable of 40 TOPS or more.

Despite what I’ve said about graphics cards being better suited to running AI models, newer laptop processors have something called an NPU, or Neural Processing Unit, built into them.

These NPUs generally offer reasonable AI performance without the need for a powerful graphics card.

Newer desktop processors also have NPUs built in, but their TOPS scores are low; even the fastest desktop processor at the moment, the Intel Core Ultra 9 285K, has an NPU rated at only 13 TOPS.

A new Copilot+ laptop processor will generally have an NPU capable of around 40 – 45 TOPS.

Why are desktop processors weaker here? Because with a desktop PC you would generally add a dedicated graphics card, which then handles the AI workload.

The lowest-powered graphics card we use has a TOPS score of 66, which is already well ahead of even the strongest Copilot+ laptops.

We have graphics card options with TOPS scores of 242 and 836 on our Extreme and Trader Pro PCs; these are obviously far better suited to running AI models and workloads.
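
Putting those numbers side by side makes the gap clear. These are headline figures only, and real-world results will depend on the model and software being used.

```python
# Quick comparison using the headline TOPS figures mentioned above.
copilot_npu = 45  # roughly the NPU rating of a current Copilot+ laptop
options = {
    "Entry graphics card": 66,
    "Extreme PC graphics option": 242,
    "Trader Pro PC graphics option": 836,
}

for name, tops in options.items():
    print(f"{name}: {tops} TOPS (about {tops / copilot_npu:.1f}x a Copilot+ laptop NPU)")
```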

Hopefully this gives you an overview of why graphics cards, not CPUs, matter most for AI workloads, and a way to assess the relative performance of one graphics setup over another.

If you have any further questions or want some specific advice on a new computer then just let us know, we are here to help!

Written by Darren @ Multiple Monitors

Last Updated: January, 2025