Hugging Face partners with AWS on open-source machine learning amid the AI arms race

Impressive strides in large language models (LLMs) are showing signs of what could be the beginning of a major shift in the tech industry. AI startups and big tech companies are finding new ways to put advanced LLMs to use in everything from composing emails to generating software code.
However, the promise of LLMs has also sparked an arms race among tech giants. In their efforts to build up their artificial intelligence arsenals, big tech companies threaten to push the sector toward less openness and more secrecy.
In the midst of this competition, Hugging Face is charting a different strategy: providing scalable access to open-source AI models. Hugging Face is collaborating with Amazon Web Services (AWS) to support the adoption of open-source machine learning (ML) models. In an era when advanced models are becoming increasingly hard to reach or hidden behind walled gardens, an easy-to-use open-source alternative could expand the market for applied machine learning.
Open-source models
While large-scale machine learning models are useful, setting them up and running them requires special expertise that few companies possess. The new partnership between Hugging Face and AWS will attempt to address these challenges.
Developers can use Amazon’s cloud infrastructure and tools to easily fine-tune and deploy the most advanced models from Hugging Face’s ML repository.
The two companies started working together in 2021 with the introduction of Hugging Face Deep Learning Containers (DLCs) on SageMaker, Amazon’s cloud-based machine learning platform. The new partnership will expand the availability of Hugging Face models to other AWS products and to Amazon’s cloud-based AI accelerator hardware to speed up training and inference.
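As a rough illustration of that workflow, the sketch below deploys a Hugging Face Hub model to a SageMaker endpoint through the Hugging Face DLCs using the sagemaker Python SDK. The model ID, task, container versions and instance type are illustrative assumptions, not details from the partnership announcement.

```python
# Minimal sketch: deploying a Hugging Face Hub model to a SageMaker endpoint
# via the Hugging Face Deep Learning Containers (DLCs).
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

# Works inside a SageMaker environment; elsewhere, pass an IAM role ARN instead.
role = sagemaker.get_execution_role()

hub_env = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # assumed example model
    "HF_TASK": "text-classification",
}

model = HuggingFaceModel(
    env=hub_env,
    role=role,
    transformers_version="4.26",  # container versions are assumptions
    pytorch_version="1.13",
    py_version="py39",
)

# Spin up a real-time inference endpoint; the instance choice is illustrative.
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")

print(predictor.predict({"inputs": "Open-source models are easier to adopt now."}))

predictor.delete_endpoint()  # clean up to avoid ongoing charges
```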
“Since we started offering Hugging Face natively in SageMaker, usage has grown exponentially, and we now have more than 1,000 customers using our solutions every month,” Jeff Boudier, Hugging Face’s product manager, told VentureBeat. “Through this new partnership, we are now working with the engineering teams that build new efficient hardware for AI, such as AWS Trainium and AWS Inferentia, to build solutions that are directly usable on Elastic Compute Cloud (EC2) and Elastic Kubernetes Service (EKS).”
AI arms race
Tech leaders have been talking about the transformative nature of machine learning for several years. But never before has this transformation been felt as acutely as in the past few months. The release of OpenAI’s ChatGPT set the stage for a new chapter in the race for AI dominance.
Microsoft recently poured $10 billion into OpenAI and is working on integrating LLMs into its products. Google has invested $300 million in Anthropic, an OpenAI competitor, and is trying to protect its online search empire against the proliferation of LLM-powered products.
There are clear benefits to these partnerships. With financial backing from Microsoft, OpenAI was able to train very large and expensive machine learning models on specialized hardware and deploy them at scale to millions of people. Anthropic will also receive special access to Google Cloud Platform through the new partnership.
However, competition among big tech companies also has trade-offs in this area. For example, since the beginning of its partnership with Microsoft, OpenAI has stopped making most of its machine learning models open source and now serves them through a paid application programming interface (API). It has also become locked into the Microsoft cloud platform, and its models are available only on Azure and Microsoft products.
Hugging Face, on the other hand, remains committed to providing open-source models. Through the partnership between Hugging Face and Amazon, developers and researchers will be able to deploy open-source models such as BLOOMZ (a GPT-3 alternative) and Stable Diffusion (a DALL-E 2 competitor).
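For a sense of what “open” means in practice, here is a minimal sketch of running one of these models locally with the transformers library; the small BLOOMZ checkpoint and generation settings are assumptions chosen to keep the example lightweight.

```python
# Minimal sketch: running an open-source model like BLOOMZ locally with transformers.
from transformers import pipeline

# A small BLOOMZ variant is assumed here purely for illustration.
generator = pipeline("text-generation", model="bigscience/bloomz-560m")

print(generator("Translate to French: Open models benefit everyone.", max_new_tokens=30))
```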
“This is an alliance between the leader in open source machine learning and the leader in cloud services to build the next generation of open source models and solutions to use them together. Everything we build together will be open source and publicly accessible,” Boudier said.
Hugging Face also aims to avoid the kind of lock-in that other AI companies are facing. While Amazon will remain its preferred cloud provider, Hugging Face will continue to work with other cloud platforms.
“This new partnership is non-exclusive and does not change our relationships with other cloud providers,” Boudier said. “Our mission is to democratize good machine learning, and to do that, we need to allow users to use our models and libraries anywhere. We will continue to work with Microsoft and other clouds to serve customers everywhere.”
Open and transparent
The API model offered by OpenAI is a convenient option for companies without in-house ML expertise. Hugging Face has also been providing a similar service through its Inference Endpoints and Inference API products. However, APIs can be limiting for organizations that want more flexibility in modifying models and integrating them with other machine learning architectures. They are also inconvenient for research that requires access to models’ weights, gradients, and training data.
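To make the contrast concrete, the following sketch calls a hosted model over Hugging Face’s Inference API with plain HTTP; the model ID and the environment variable used for the access token are assumptions for illustration.

```python
# Minimal sketch: calling a hosted model through Hugging Face's Inference API,
# the kind of managed access the article contrasts with self-hosted deployments.
import os
import requests

# The model ID is an assumed example; any Hub model served by the API works the same way.
API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": f"Bearer {os.environ['HF_API_TOKEN']}"}  # assumed env var for the token

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "Open source keeps research reproducible."},
)
print(response.json())
```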
Scalable, easy-to-deploy cloud tools such as those offered by Hugging Face will enable these types of applications. At the same time, the company is developing tools to detect and flag abuse, bias, and other problems with ML models.
“Our vision is openness and transparency [are] the way forward for ML,” Boudier said. “ML is driven by science and science requires reproducibility. Ease of use makes everything accessible to the end user, so everyone can understand what the model can and cannot do, [and] how they should and should not be used.”