
Google’s Hugging Face deal puts ‘supercomputer’ power behind open-source AI

Illustration by Alex Castro / The Verge

Google Cloud’s new partnership with AI model repository Hugging Face lets developers build, train, and deploy AI models without needing to pay for a Google Cloud subscription. Outside developers using Hugging Face’s platform will now have “cost-effective” access to Google’s tensor processing units (TPUs) and GPU supercomputers, which will include thousands of Nvidia’s in-demand and export-restricted H100s.

Hugging Face is one of the more popular AI model repositories, storing open-source foundation models like Meta’s Llama 2 and Stability AI’s Stable Diffusion. It also hosts many datasets for model training.
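For developers, working with a model hosted on Hugging Face typically means pulling it down through the company’s transformers library. The sketch below is an illustration of that general workflow rather than anything specific to the Google Cloud deal; the Llama 2 repo ID is used as an example, and it is gated, so access must be requested and a Hugging Face token configured first.

    # Minimal sketch: load a Hub-hosted model and run a short generation.
    # The repo ID is illustrative; Llama 2 is gated, so request access and
    # authenticate (e.g. via `huggingface-cli login`) before running this.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-hf"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer("Open-source AI models are", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))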

The platform hosts over 350,000 models that developers can work with, and they can upload their own models to Hugging Face much like coders put their code up on GitHub. Valued at $4.5 billion, Hugging Face raised $235 million over the last year from Google, Amazon, Nvidia, and other investors.
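Publishing a model follows a pattern similar to pushing code to GitHub. A minimal sketch using the huggingface_hub client is below; the repo ID and local folder path are placeholders, and an authenticated write token is assumed.

    # Minimal sketch: create a Hub repository and upload a local model folder.
    # "your-username/my-model" and "./my-model" are placeholders; run
    # `huggingface-cli login` first so the client has a write token.
    from huggingface_hub import HfApi

    api = HfApi()
    api.create_repo(repo_id="your-username/my-model", exist_ok=True)
    api.upload_folder(
        folder_path="./my-model",          # local weights, config, tokenizer files
        repo_id="your-username/my-model",  # destination repo on the Hub
    )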

Google said that Hugging Face users can begin using the AI app-building platform Vertex AI and Google Kubernetes Engine (GKE), which helps them train and fine-tune models, “in the first half of 2024.”
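The Hugging Face-specific integration had not shipped at the time of writing, but serving a model on Vertex AI today generally looks like the sketch below, using the google-cloud-aiplatform SDK. The project ID, region, bucket path, serving container image, and machine settings are all placeholders.

    # General-purpose sketch of serving a model artifact on Vertex AI; all
    # names here (project, bucket, container image) are placeholders, not
    # part of the announced Hugging Face integration.
    from google.cloud import aiplatform

    aiplatform.init(project="my-gcp-project", location="us-central1")

    model = aiplatform.Model.upload(
        display_name="llama-2-7b",
        artifact_uri="gs://my-bucket/llama-2-7b/",  # exported model files
        serving_container_image_uri="us-docker.pkg.dev/my-project/serving/llm-server:latest",
    )
    endpoint = model.deploy(
        machine_type="n1-standard-8",
        accelerator_type="NVIDIA_TESLA_T4",  # GPU-backed shape; size to the model
        accelerator_count=1,
    )
    print(endpoint.resource_name)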

Google said in a statement that its partnership with Hugging Face “furthers Google Cloud’s support for open-source AI ecosystem development.” Some of Google’s models are on Hugging Face, but its banner models, such as the large language model Gemini, which now powers the chatbot Bard, and the text-to-image model Imagen, are not on the repository and remain more closed source.

------------
Read More
By: Emilia David
Title: Google’s Hugging Face deal puts ‘supercomputer’ power behind open-source AI
Sourced From: www.theverge.com/2024/1/25/24050445/google-cloud-hugging-face-ai-developer-access
Published Date: Thu, 25 Jan 2024 18:11:56 +0000
