Your computer can probably do inference just fine.

What makes a PC an AI PC? No one has come up with a definition that goes beyond FOMO marketing, with some vague hand-waving at the presence of “neural processing units” and other features only available on the latest and greatest silicon.

While Intel suggests that application developers will soon infuse all software with AI – and that PCs should be ready for them – the workloads that matter most right now are those that power large language models. Inference is where the action is.

So any AI PC worth the name needs to be able to infer quickly and well.

Inference is the process that transforms a given prompt into a response. Doing so requires a machine that can crunch through multiple gigabytes of data – and perform even more matrix multiplications.
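To make that concrete, here's a toy sketch of why generating each token boils down to big matrix multiplications. The matrix sizes and names here are illustrative, not those of any real model – a production LLM repeats this pattern across dozens of transformer layers and billions of weights:

```python
import numpy as np

# Toy vocabulary and hidden dimensions – real models use tens of
# thousands of tokens and thousands of hidden dimensions.
rng = np.random.default_rng(0)
vocab, hidden = 1000, 64
W_embed = rng.standard_normal((vocab, hidden))  # token -> vector table
W_out = rng.standard_normal((hidden, vocab))    # vector -> logits matrix

def next_token(token_id: int) -> int:
    h = W_embed[token_id]          # look up the token's vector
    logits = h @ W_out             # one big matrix multiply
    return int(np.argmax(logits))  # pick the most likely next token

t = next_token(42)
```

Every one of those multiplies has to touch the weight matrices, which is why the whole model needs to sit in fast memory.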

While such a compute-intensive task might seem to stretch the capabilities of a desktop PC (ChatGPT and its ilk run in massive data centers), I’ve discovered that much of the hardware we already own can do a reasonable job of inference. Anyone can run a “good enough” chatbot on a PC – provided the PC has enough RAM and a mid-range GPU.

That means a GPU with at least 8GB of VRAM (and not one from Intel – at least at the moment, due to lack of driver support). It’s not too expensive – which is good, because it’s table stakes. As for RAM, you’ll be happier with 32GB than 16GB, while 8GB won’t cut it. And that’s about it. That’s all you need.
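Why 8GB of VRAM is enough comes down to simple arithmetic: a model's weights occupy roughly (parameter count × bits per weight ÷ 8) bytes, plus some working memory for the context cache. Here's a rough calculator – the 1GB overhead figure is a ballpark assumption of mine, not a measured value:

```python
def model_vram_gb(n_params: float, bits_per_weight: float,
                  overhead_gb: float = 1.0) -> float:
    """Rough VRAM needed to run a model: weights plus a ballpark
    allowance for the context cache and activations."""
    weights_gb = n_params * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# A 7-billion-parameter model at its native 16 bits per weight
# needs ~15 GB – beyond a mid-range card.
full_fat = model_vram_gb(7e9, 16)   # ≈ 15.0

# Quantized down to 4 bits per weight, it needs ~4.5 GB –
# fitting comfortably in 8GB of VRAM.
quantized = model_vram_gb(7e9, 4)   # ≈ 4.5
```

That quantization step – trading a little output quality for a 4x smaller memory footprint – is what puts "good enough" chatbots within reach of ordinary hardware.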

Apple has gotten this remarkably right with its M-series SoCs – possibly entirely by accident. Because they treat memory as “unified” – it’s all VRAM if it needs to be – all modern M-series Macs with at least 16GB of RAM can act as AI PCs.

As it happens, four of the PCs I’ve bought in the last decade meet these specs – including an eight-year-old top-of-the-line monster, bought as a “VR PC,” with an Intel 6700K CPU and a then top-of-the-line Nvidia GTX 980 Ti GPU. It’s a bit slower than the newer machines in my collection – more or less in line with Moore’s Law – but it runs a chatbot locally just fine.

Although none of them were purchased as AI PCs, my more recent acquisitions each run inference at speeds comparable to OpenAI’s GPT-3.5-Turbo – with output quality not far off that of their big cloud-hosted cousin.

To test these systems, I fed them a fairly sophisticated prompt, instructing them to act as an “agent” – solving a problem by creating JSON files that are consumed by another program. (“Small pieces loosely joined” still applies in the age of AI.) And it all … just works.
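The fiddly part of that "agent" pattern is reliably getting structured JSON out of a chatty model, which likes to wrap its answers in pleasantries. A minimal sketch of the glue – the field names and the example reply are hypothetical, not from my actual prompt:

```python
import json

def extract_json(reply: str) -> dict:
    """Pull the JSON object out of a chatty model reply.
    Models often surround JSON with prose, so scan for the outermost
    braces rather than json.loads()-ing the whole string."""
    start = reply.find("{")
    end = reply.rfind("}")
    if start == -1 or end <= start:
        raise ValueError("no JSON object found in model reply")
    return json.loads(reply[start:end + 1])

# A typical chatty reply, with the payload buried in the middle
reply = 'Sure! Here is the task file: {"task": "summarise", "input": "contract.txt"} Hope that helps!'
task = extract_json(reply)
# task can now be written to disk for the next program in the chain
```

Small pieces, loosely joined by a few lines of parsing.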

So it turns out I already have AI PCs all over the place. Who knew? What’s more, there are plenty of AI PCs out there. Tens of millions of them. Which is good – because we’re going to need them.

While Microsoft would have us paste every sensitive document into Copilot for analysis – thereby giving Redmond’s data centers a look at our top-secret workflows – many documents are simply too sensitive to be shared that way.

What about documents so hot they shouldn’t even be on a networked machine? What about medical records? Or confidential documents? Much of what we work with has strings attached, preventing it from being freely shared.

Legal documents often fall into this category – exactly the sort of work where the recently released SaulLM legal language model can be of great help. If a lawyer or paralegal had this model running on a local PC, they could freely feed documents into it, ask it for close readings, even have it draft new contract clauses – all without worrying about leaking confidential information. Many law firms would run screaming toward such a tool: “Shut up and take my client’s money!”

I prototyped exactly that tool last weekend – quantizing the full-fat open source SaulLM model down to a size that runs on my fleet of AI PCs. I’m not a lawyer, so I can’t fully judge the quality of its output. But I pasted extensive sections of a prospective business partner’s agreement into it, then asked it some tough questions – without any fear of a leak.
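For the curious, that cutting-down-to-size step is quantization, and llama.cpp's tooling makes it a two-command job. The paths, model directory name, and quantization level below are illustrative assumptions – I'm presuming the SaulLM-7B checkpoint has already been downloaded from Hugging Face and llama.cpp has been built locally:

```shell
# Convert the Hugging Face checkpoint to GGUF at 16-bit precision ...
python convert_hf_to_gguf.py ./Saul-7B --outfile saullm-7b-f16.gguf

# ... then quantize to ~4 bits per weight, shrinking ~14GB of weights
# to roughly a third of that
./llama-quantize saullm-7b-f16.gguf saullm-7b-q4_k_m.gguf Q4_K_M

# Chat with it locally, offloading all layers to the GPU
./llama-cli -m saullm-7b-q4_k_m.gguf -ngl 99 -i
```

Nothing in that pipeline ever touches the network, which is the whole point.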

Even if lawyers don’t love it, the rest of us – faced with endless, mindless terms and conditions that we ignore at our peril – will find it very useful.

Similarly, hardware vendors – whose revenue cycles depend on you throwing away the PC you bought for working from home three years ago – won’t like to admit it, but you may already own a perfectly serviceable AI PC. Just not the shiny new AI PC they want you to rush out and buy. ®
