AI’s ‘mad cow disease’ problem creeps into earnings season.


Here’s a takeaway from today’s Morning Brief, which you can sign up to receive in your inbox every morning.

Mentioning AI on an earnings call used to be enough to get Wall Street excited. But a more sobering reality is emerging.

The grand ambitions of AI are fueled by enormous costs, from extreme demands on natural resources to huge hardware investments. Set against the unpredictability of AI development, Big Tech's exorbitant spending looks harder to justify.

If 2024 is the year of “Show Me,” we’re still waiting.

Now earnings season is upon us, and AI is once again driving a handful of mega-stocks. But the latest wave of skepticism suggests the hyped-up comeback may never come.

The web's wealth of content, the material that teaches generative models to create compelling photos or persuasive LinkedIn posts, is itself a finite resource. Even the internet ends somewhere.

This has sparked a mad dash among AI companies to find more content: scraping copyrighted works, transcribing videos to text, or feeding AI-generated content back to AI systems as training data.

But research has shown that reliance on synthetic data degrades the quality and reliability of AI models, highlighting a major limitation in the promise of advanced AI.

Researchers at Rice University compared training generative models on synthetic material to "feeding cattle with the remains (including brains) of other cattle," the practice that gave rise to mad cow disease.
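The degradation the researchers describe can be illustrated with a toy simulation (my own sketch, not the Rice team's method, and with made-up numbers): a "model" that is just an empirical distribution over token types, retrained each generation only on samples drawn from the previous generation. Any type that misses a sample vanishes forever, so diversity can only shrink.

```python
import random
from collections import Counter

random.seed(42)

# Toy "model": an empirical distribution over 50 token types.
# Each generation, the next model is fit only to samples drawn
# from the previous one -- synthetic data training on itself.
vocab = list(range(50))
weights = [1.0] * 50

for gen in range(20):
    # Draw 100 synthetic samples from the current model.
    sample = random.choices(vocab, weights=weights, k=100)
    counts = Counter(sample)
    # Refit: token types that drew zero samples are gone for good.
    vocab = list(counts)
    weights = [counts[t] for t in vocab]
    print(f"gen {gen + 1:2d}: {len(vocab)} token types survive")
```

Run it and the surviving vocabulary shrinks generation over generation and never recovers, a crude stand-in for the loss of diversity and tail knowledge that the research warns about.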

The explosion of AI tools has already flooded the web with artificial content, which makes up a growing share of the internet. You've probably seen the results of this search-engine gaming: limp, artificial, and ultimately useless articles that grab your clicks and attention when you're looking for reliable, human-written information.

This, of course, means that existing AI systems have likely already ingested AI-generated material.

“It’s really about brains breaking the brains of the future,” said Richard Baraniuk, a professor of electrical and computer engineering at Rice University who co-authored the paper.

The finite supply of human-made material is just the latest example of the AI story running into hard limits. There are plenty to choose from.

Rene Haas, CEO of chip design company Arm (ARM), said earlier this week that AI models are “simply insatiable in terms of their thirst” for electricity.

“By the end of the decade, AI data centers could consume 20% to 25% of US electricity needs,” Haas told the Wall Street Journal. “It’s hardly very sustainable.”

And those words are coming from a CEO, not a hater.

His comments echo a January report from the International Energy Agency, which found that a ChatGPT query requires about 10 times as much power as the average Google search. The agency expects electricity demand from the AI industry to grow at least tenfold from 2023 levels by 2026.

Other drags on the AI dream are closer to home.

Tech companies are scrambling to reduce their reliance on external suppliers of AI chips, spending billions on hardware and infrastructure. Google (GOOG, GOOGL) and Meta (META) unveiled new homegrown chips this week, showcasing their expensive bets.

Heavy investment is the ticket to prosperity in an AI-led future. But mounting costs, like the caveats around data and resources, raise the pressure on these companies to prove it.


Hamza Shaban is a Yahoo Finance reporter covering markets and the economy. Follow Hamza on Twitter @hshaban.

Click here for the latest technology news that will impact the stock market.

Read the latest financial and business news from Yahoo Finance
