Chinese retailer and cloud infrastructure provider Alibaba is the latest company to design its own processors for running artificial intelligence software. It joins a crowded roster of companies already working on similar custom designs, including Alphabet, Facebook and Apple.
The trend could eventually threaten the traditional relationship between big buyers and big suppliers. In particular, chipmaker Nvidia, whose stock has surged as its graphics processing chips have become common for powering AI-based applications, could see its data center business suffer as these roll-your-own-chip projects mature.
The companies are betting that their own chips can help their AI applications run better while lowering costs, as running hundreds of thousands of computers in a data center isn't cheap. It could also reduce their dependency on the few vendors (like Nvidia) who make the types of graphics processors that excel at performing the functions modern AI applications require.
On Thursday, Alibaba said that its recently formed research and development arm -- dubbed the Academy for Discovery, Adventure, Momentum and Outlook -- has been working on an AI chip called the Ali-NPU. The chips will become available for anyone to use through Alibaba's public cloud, a spokesman told CNBC.
The idea is to strengthen the Alibaba cloud and enable the future of commerce and a variety of AI applications within many industries, the spokesman said. In the fourth quarter Alibaba held 4 percent of the cloud infrastructure services market, meaning that it was smaller than Amazon, Microsoft, IBM and Google, according to Synergy Research Group.
Alibaba's research academy has been opening offices around the world, including in Bellevue, Washington, near Microsoft headquarters. Last year Alibaba hired Qualcomm employee Liang Han as an "AI chip architect" in the Silicon Valley city of Sunnyvale. Job listings show that Alibaba is looking to add more people to the effort at that location.
The activity bears a resemblance to Google-parent Alphabet's efforts.
Internally, Alphabet engineers have been using Google's custom-built tensor processing units, or TPUs, to accelerate their own machine learning tasks since 2015. Last year Google announced a second-generation TPU that could handle more challenging computing work, and in February Google started letting the public use second-generation TPUs through its cloud.
The second-generation Google AI chip can be used in place of graphics processing units from the likes of Nvidia, chips that can do more than just train AI models.
The Alibaba and Google server chip programs are still in their relative infancy, at least compared with Nvidia's GPU business in data centers.
Indeed, Google and Nvidia remain partners, and Nvidia's GPUs remain available on the Google cloud alongside the TPUs. Alibaba also offers Nvidia GPUs through its cloud and will continue to do so after the Ali-NPU comes out, the spokesman said.
In a note last July, analysts Matthew Ramsay and Vinod Srinivasaraghavan with Canaccord Genuity said that with the release of Nvidia's latest GPUs, they have "increased confidence Nvidia will ... more successfully defend pricing as data center sales scale and in-house and merchant ASIC [application-specific integrated circuit] offerings increase."
Earlier this week it became clear that Facebook is also exploring chip development, an initiative that could one day lead the company to develop its own AI chips. That wasn't a complete surprise, though: last year Intel said it was working with Facebook on a new chip it had built for AI. But Intel hasn't been involved in Google's TPU or Alibaba's Ali-NPU.
Facebook's AI chip could improve operations for internal researchers -- training systems faster could mean more rapid experimentation -- and boost the efficiency of systems doing calculations for the billions of people who use the company's apps. The company's push differs from Alibaba's and Google's in that it's not primarily about giving customers an innovative type of hardware that could bring performance gains.
Meanwhile, Apple has built a "neural engine" into the chip inside the top-of-the-line iPhone X; Microsoft is working on an AI chip for the next version of its HoloLens mixed-reality headset; and Tesla has been developing an AI chip for its vehicles.
But all those devices are different from the servers that would house AI chips from the likes of Google and Alibaba. Data center servers would have more power, direct network connectivity and more data storage on board.