Generative AI

Google Cloud and Hugging Face Partner for Generative AI

Parvin Mohmad

Google Cloud and Hugging Face team up to accelerate generative AI innovation

Google and Hugging Face have formed a partnership to advance open AI and machine learning development. The partnership will combine Hugging Face's platform with Google Cloud's infrastructure, including Vertex AI, to make generative AI more accessible and effective for developers. It will allow Hugging Face users and Google Cloud customers to seamlessly deploy models for production on Google Cloud using Inference Endpoints, accelerate applications with TPUs on Hugging Face Spaces, and manage usage through their Google Cloud account.
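As a rough illustration of what that deployment flow could look like from the Hugging Face side, the sketch below uses the huggingface_hub client's create_inference_endpoint helper. The endpoint name, model repository, vendor value ("gcp"), region, and instance identifiers are assumptions made for the example; the Google Cloud-backed options were not yet publicly documented at the time of the announcement.

from huggingface_hub import create_inference_endpoint

# Minimal sketch: provision a managed Inference Endpoint for an open model.
# The vendor, region, and instance identifiers are placeholders; Google
# Cloud-backed values would follow the same pattern once available.
endpoint = create_inference_endpoint(
    "my-llm-endpoint",                               # hypothetical endpoint name
    repository="mistralai/Mistral-7B-Instruct-v0.2", # example open model
    framework="pytorch",
    task="text-generation",
    vendor="gcp",                                    # assumed value for GCP support
    region="us-central1",
    accelerator="gpu",
    instance_size="x1",                              # placeholder instance identifiers
    instance_type="nvidia-l4",
    type="protected",
)

endpoint.wait()                                      # block until the endpoint is running
print(endpoint.client.text_generation("Explain TPUs in one sentence."))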

Developers will be able to train, fine-tune, and serve open models on Google Cloud quickly and cost-effectively, using AI-optimized hardware such as TPUs and GPUs. The partnership also supports deployments on Google Kubernetes Engine (GKE), enabling the development of new generative AI applications.

The move is seen as a significant AI play for Alphabet, Google's parent company, and has been compared to the partnership between Microsoft and OpenAI, though Hugging Face's head of product, Jeff Boudier, remarked that the Google-Hugging Face alliance is completely different.

Google's Tensor Processing Units (TPUs) are specialized hardware designed to accelerate machine learning workloads, particularly those involving large matrix computations. Unlike general-purpose graphics processing units (GPUs), which are built for parallel processing across a wide range of compute workloads, TPUs are developed specifically for AI and ML, concentrating on tensor operations to achieve higher throughput and greater energy efficiency. They are also designed to reduce power consumption per operation, which translates into lower energy costs and a smaller carbon footprint. Through this collaboration, Hugging Face users will be able to take advantage of TPUs made accessible via Google Cloud.
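To make the distinction concrete, the short JAX sketch below shows the kind of dense tensor operation TPU matrix units are built to accelerate. It is purely illustrative: the matrix sizes are arbitrary, and on a machine without a TPU the same code simply runs on CPU or GPU.

import jax
import jax.numpy as jnp

# On a TPU-backed runtime (for example a Space with a TPU assigned),
# jax.devices() lists the TPU cores; elsewhere it falls back to CPU or GPU.
print(jax.devices())

# A large matrix multiplication: the dense tensor operation TPUs specialize in.
key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
a = jax.random.normal(k1, (4096, 4096))
b = jax.random.normal(k2, (4096, 4096))

matmul = jax.jit(jnp.dot)            # XLA compiles the op for the local accelerator
result = matmul(a, b).block_until_ready()
print(result.shape)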

Vertex AI is Google's cloud-based machine learning and MLOps platform. Thanks to the bidirectional integration, Hugging Face customers can now use Vertex AI as a deployment platform to host and serve open models. They may also choose GKE, a managed Kubernetes service, for hosting models when they need fine-grained management and customization.
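A rough sketch of that Vertex AI path using the google-cloud-aiplatform SDK is shown below. The project, region, serving container image URI, machine and accelerator types, and model ID are placeholders chosen for illustration, since the exact Hugging Face serving images for Vertex AI had not been published when the partnership was announced.

from google.cloud import aiplatform

# Placeholder project and region.
aiplatform.init(project="my-gcp-project", location="us-central1")

# Register the open model with Vertex AI. The container URI and the MODEL_ID
# environment variable are assumptions; a published Hugging Face serving image
# would slot in here.
model = aiplatform.Model.upload(
    display_name="mistral-7b-instruct",
    serving_container_image_uri="gcr.io/my-gcp-project/hf-text-generation:latest",
    serving_container_environment_variables={
        "MODEL_ID": "mistralai/Mistral-7B-Instruct-v0.2",
    },
    serving_container_ports=[8080],
)

# Deploy the registered model to a managed endpoint on GPU-backed machines.
endpoint = model.deploy(
    machine_type="n1-standard-8",
    accelerator_type="NVIDIA_TESLA_T4",
    accelerator_count=1,
)

print(endpoint.predict(instances=[{"inputs": "What does Vertex AI do?"}]))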

Hugging Face has received considerable funding from major tech companies, including Google. The startup raised US$235 million in a Series D round from Google, Amazon, Nvidia, Intel, AMD, Qualcomm, IBM, Salesforce, and others, valuing it at US$4.5 billion. Hugging Face's commitment to open source and open models has quickly made it the go-to platform for hosting models, datasets, and inference endpoints. Almost all open model vendors, including Meta, Microsoft, and Mistral, make their models available through the Hugging Face Hub.

Google offers foundation models that are available only through its public cloud platform. Gemini, one of the most capable large language models, was introduced last month. Vertex AI also includes models such as Imagen, Chirp, and Codey. Hugging Face's integration with Google Cloud lets businesses design and deploy generative AI applications in the cloud using both proprietary and open models.
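For comparison, calling one of those proprietary models from the same platform looks roughly like the sketch below, which uses the Vertex AI SDK's Gemini preview interface. The project and region are placeholders, and the snippet assumes a google-cloud-aiplatform version that ships the generative models preview module.

import vertexai
from vertexai.preview.generative_models import GenerativeModel

# Placeholder project and region; requires Gemini access on Vertex AI.
vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-pro")
response = model.generate_content(
    "In two sentences, what does the Google Cloud and Hugging Face partnership cover?"
)
print(response.text)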

The partnership between Google and Hugging Face is expected to democratize AI by making it simple for businesses to build their own AI using open models and technologies. As Hugging Face establishes itself as the central hub for open-source AI, the collaboration is expected to further grow its repository of models and AI software.

Hugging Face Hub users can expect the new capabilities, including the Vertex AI and GKE deployment options, to become available in the first half of 2024.
