November 22, 2024

Apple says its AI models are trained on custom Google chips

Sundar Pichai and Tim Cook

Source: Reuters; Apple

Apple said Monday that the artificial intelligence models powering Apple Intelligence, its AI system, were pre-trained on processors designed by Google, a sign that big tech companies are looking for alternatives to Nvidia when it comes to training advanced AI.

Apple’s choice of Google’s homegrown Tensor Processing Unit (TPU) for training is detailed in a technical paper the company just published. Separately, Apple released a preview version of Apple Intelligence for some devices on Monday.

Nvidia’s expensive graphics processing units (GPUs) dominate the market for high-end AI training chips and have been in such high demand over the past two years that they have been difficult to buy in the quantities needed. OpenAI, Microsoft and Anthropic use Nvidia’s GPUs for their models, while other tech companies, including Google, Meta, Oracle and Tesla, are snapping them up to build out their AI systems and offerings.

Last week, both Meta CEO Mark Zuckerberg and Alphabet CEO Sundar Pichai made comments suggesting that their companies and others in the industry might be overinvesting in AI infrastructure, but acknowledged that the business risks of doing otherwise were too high.

“The downside of being left behind is that you’re going to be out of position on the most important technology for the next 10 to 15 years,” Zuckerberg said on a podcast with Bloomberg’s Emily Chang.

Apple did not mention Google or Nvidia by name in its 47-page paper, but it did note that its Apple Foundation Model (AFM) and AFM server were trained on “cloud TPUs.” That means Apple rented servers from a cloud provider to perform the calculations.


“This system allows us to train AFM models efficiently and scalably, including on-device AFM, server-based AFM, and larger models,” Apple said in its research paper.

Representatives for Apple and Google did not respond to requests for comment.

Apple revealed its AI plans later than many of its peers, which loudly embraced generative AI shortly after OpenAI launched ChatGPT in late 2022. On Monday, Apple introduced Apple Intelligence. The system includes several new features, such as a refreshed look for Siri, better natural language processing and AI-generated summaries in text fields.

Over the next year, Apple plans to launch generative AI-based features, including image generation, emoji generation, and an enhanced Siri that can access a user’s personal information and take actions within apps.

In Monday’s paper, Apple said the on-device AFM was trained on a single “slice” of 2,048 TPU v5p chips working together. These are the most advanced TPUs, first released in December. The AFM server was trained on 8,192 TPU v4 chips configured to work together as eight slices across a data center network, according to the paper.

Google’s latest TPUs cost less than $2 per chip per hour when reserved for three years in advance, according to Google. Google first introduced its TPUs in 2015 for internal workloads and made them available to the public in 2017. They are now among the most mature custom chips designed for AI.

Google, however, is still one of Nvidia’s biggest customers. It uses Nvidia’s GPUs as well as its own TPUs to train AI systems, and it sells access to Nvidia’s technology on its cloud.


Apple previously said that inference, which means taking a pre-trained AI model and running it to generate content or make predictions, will happen partly on Apple chips in its own data centers.

This is Apple’s second technical paper on its AI system, following a more general version released in June. Apple said at the time that it was using TPUs while developing its AI models.

Apple is scheduled to report its quarterly results after the close of trading on Thursday.
