Microsoft is steadily ramping up generative AI innovation and monetization.


Microsoft (MSFT) is rapidly monetizing AI across its product portfolio and is increasing its investments to drive continued innovation in this space.

Microsoft's early AI success helped propel the stock to a new high of $433.60 last week. Recently trading near $415, Microsoft shares are up 10.4% YTD.

As more and more customers use Microsoft's platforms and tools to build their own AI solutions, Azure is growing its cloud market share. The company saw a spike in the number of large, long-term Azure deals among its enterprise customer base in fiscal Q3 (March). The number of $100 million+ Azure deals grew more than 80% year-over-year, while the number of $10 million+ deals more than doubled.

In FQ3, Microsoft Cloud revenue rose 23% to $35.1 billion. Azure's revenue growth of 31% beat guidance by 300 basis points and topped the consensus estimate of 28.6%. For FQ4 (June), Azure is expected to grow by 30% to 31%. After the FQ3 report, JP Morgan raised its price target on Microsoft to $470 from $440, saying Azure growth could accelerate over the next 12 months.

Azure's big deals in the March quarter helped lift total commercial bookings growth to 31% in constant currency, a significant acceleration from the 9% growth in FQ2. Commercial RPO of $235 billion increased 21% in constant currency, a faster pace than the 16% growth in FQ2. Microsoft's total revenue rose 17% to $61.9 billion, topping the consensus estimate of $60.8 billion.

Morgan Stanley believes that Microsoft has plenty of runway to advance in AI, as the era of innovation is just beginning. In FQ3, Azure AI contributed nearly 700 basis points of growth, up from 600 basis points in the previous quarter. Morgan Stanley maintained its price target on Microsoft at $520.

Goldman Sachs raised its target to $515, saying the company offers a unique growth profile at scale, with the potential to grow both revenue and earnings by double digits in FY25 (June). Microsoft is well positioned to capture generative AI revenue share through its broad suite of AI services and productivity focus, the firm says. Goldman Sachs also raised its Azure forecast to better reflect the confluence of factors that could sustain 25% growth through FY25.

On the FQ3 earnings call, Microsoft CEO Satya Nadella said Azure has “become the port of call for anyone doing an AI project.” AI is bringing new customers to Azure and powering the expansion of the installed base. Nadella also pointed out that AI engagements don't just sit on their own. While AI projects obviously start with calls to AI models, they also leverage ancillary services such as vector databases, developer tools, and Azure Search.

Microsoft is proving to be a pioneer in AI development. Although large language models (LLMs) are getting most of the attention, their size means they can require significant computing resources to operate. Microsoft has introduced a new class of highly capable small language models (SLMs) that will make AI accessible to more people. These SLMs offer many of the capabilities found in LLMs but are trained on less data.

Microsoft recently announced the Phi-3 family of smaller models. These SLMs outperform models of the same size, and of the next size up, on a variety of benchmarks that assess language, coding, and math abilities. SLMs are practical for organizations with limited resources: they are designed to perform well on simple tasks and can be fine-tuned to meet specific needs. Users can choose whichever large or small model best suits their use case.

LLMs are better suited to applications that require the orchestration of complex tasks involving advanced reasoning, data analysis, and contextual understanding. For organizations that want to build applications that run locally on a device (as opposed to in the cloud), where a task does not require extensive reasoning or needs a quick response, an SLM will do. SLMs are also a good fit for regulated industries and sectors that need high-quality results but want to keep data on-premises.

There is a long-term opportunity for more capable SLMs to run on smartphones and other devices at the edge. That means AI-infused car computers, traffic systems, smart sensors on the factory floor, remote cameras, and devices that monitor environmental compliance. By keeping data on a single device, users can minimize latency and maximize privacy.

