Chip Firms Flock to AI's Promise -- WSJ

Nvidia, Intel and AMD take steps to counter slowdown in sales of PCs and smartphones


Chip makers are racing to develop artificial-intelligence products to fuel growth as sales of smartphones and personal computers cool.

Nvidia Corp., Intel Corp., Advanced Micro Devices Inc. and a raft of startups are crafting new processors to tap into a broader market for AI hardware and software that is growing 50% a year, according to International Data Corp.

Growth has been elusive in some areas of the semiconductor industry, contributing to a wave of consolidation capped this week when networking specialist Broadcom Ltd. bid $105 billion for smartphone-chip leader Qualcomm Inc. in what would be the sector's biggest deal to date.

Global spending on AI-related hardware and software could expand to $57.6 billion in 2021 from $12 billion this year, IDC estimates. Of that, a sizable portion will go into data centers, which by 2020 are expected to devote a quarter of capacity to AI-related computations, IDC projects.

In recent years, certain AI techniques have become central to the ability of, say, Amazon.com Inc.'s Echo smart speaker to understand spoken commands. They enable the Nest security camera from Google parent Alphabet Inc. to distinguish familiar people from strangers so it can send an alert. They also allow Facebook Inc. to match social-media posts with ads most likely to interest the person doing the posting.

The biggest internet companies -- Google and Facebook as well as Amazon, International Business Machines Corp., Microsoft Corp. and their Chinese counterparts -- are packing their data centers with specialized hardware to accelerate the training of AI software to, for instance, translate documents.

The online giants all are exploiting an AI approach known as deep learning that allows software to find patterns in digital files such as imagery, recordings and documents. It can take time for such programs to discover meaningful patterns in training data. Internet giants want to improve their algorithms without waiting weeks to find out whether the training panned out.
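To give a concrete, if simplified, picture of what that training entails, here is a minimal sketch using the PyTorch library. The tiny model, the random stand-in data and the library itself are illustrative assumptions, not details from the article; real training runs repeat this loop over vastly larger datasets, which is why they can take weeks.

```python
# Illustrative sketch (not from the article): a minimal deep-learning
# training loop in PyTorch. The model and data here are toy stand-ins.
import torch
import torch.nn as nn

# A tiny network that maps 28x28 images to 10 classes.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Random stand-ins for a real labeled dataset.
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))

# Patterns emerge only by repeating this loop many times over huge
# datasets -- the work that specialized AI hardware is built to speed up.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)   # how wrong the model currently is
    loss.backward()                         # compute gradients
    optimizer.step()                        # nudge the weights toward lower error
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```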

The chip makers are vying to help them do it faster.

Much of Nvidia's 24-year history was spent making high-end graphics chips for personal computers. Lately, its wares have proven faster than conventional processors in training AI software.
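The reason graphics chips help, in rough terms, is that training is dominated by large matrix multiplications, which GPUs execute in parallel far faster than general-purpose processors. A minimal sketch, again using PyTorch as an assumed example library rather than anything the article specifies:

```python
# Illustrative sketch (not from the article): the same kind of arithmetic
# moved onto an Nvidia GPU when one is available.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b   # one of millions of such multiplications in a training run
```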

Nvidia, which reported third-quarter earnings Thursday that exceeded Wall Street's forecasts for revenue and profit, has nearly tripled its sales to data centers in the past 12 months to roughly $1.6 billion, largely driving a nearly sevenfold increase in the stock price to $205.32 as of Thursday's close.

The recent optimism around Nvidia "is what everyone in the space has been experiencing," said Mike Henry, chief executive of AI-chip startup Mythic Inc. Mr. Henry said an "explosion of interest" has yielded $15 million in investments in Mythic from venture outfits including Silicon Valley firm DFJ.

Private investors have nearly doubled their total stake in AI hardware to $252 million this year, according to PitchBook Data Inc.

Nvidia's chief rivals aren't standing still. Last year, Intel bought Nervana Systems for an undisclosed sum. The chip giant is working with Facebook and others to deliver Nervana-based chips, intended to outdo Nvidia for AI calculations, by the end of the year.

Beyond Nervana, Intel is prioritizing AI performance throughout its data-center product line. Revenue in the divisions responsible for server processors and programmable chips was up 7% and 10%, respectively, from the prior year, Intel said in its third-quarter earnings report last month.

"We don't disclose our total investments in these things, but it's a large effort" on par with its work to lead the way in conventional processing, said Naveen Rao, who leads Intel's AI group.

Advanced Micro Devices, which competes with Nvidia for graphics in game systems, recently shipped its own AI-focused graphics processor line called Radeon Instinct. Baidu Inc. is a customer, and AMD said on its most recent earnings call that other cloud providers are as well.

Some companies aren't waiting for the big chip vendors. Google has designed its own AI accelerators, seeking an advantage by tailoring silicon specifically for its software.

"This field is only getting started," said Ian Buck, Nvidia's head of accelerated computing, "and it's being invented and reinvented, it feels like, every quarter."

Write to Ted Greenwald at Ted.Greenwald@wsj.com

(END) Dow Jones Newswires

November 10, 2017 02:47 ET (07:47 GMT)