AWS and Hugging Face expand partnership to make AI more accessible



Amazon Web Services (AWS) and Hugging Face have announced an expanded collaboration to accelerate the training and deployment of models for generative AI applications.

Hugging Face's stated mission is ‘to democratise good machine learning, one commit at a time.’ The company is best known for its Transformers library for PyTorch, TensorFlow and JAX, which supports tasks ranging from natural language processing to computer vision to audio.
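To illustrate, the Transformers library exposes these tasks through a high-level `pipeline` API. A minimal sketch, assuming the `transformers` package and a PyTorch or TensorFlow backend are installed (the model checkpoint named here is one of the library's publicly hosted sentiment models, chosen for illustration):

```python
# Sketch: running a sentiment-analysis task with the Transformers pipeline API.
# The model checkpoint is downloaded from the Hugging Face Hub on first use.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # illustrative checkpoint
)

result = classifier("AWS and Hugging Face expand their partnership.")
print(result[0]["label"], round(result[0]["score"], 3))
```

Swapping the task string (for example to `"image-classification"` or `"automatic-speech-recognition"`) and an appropriate checkpoint is all that is needed to move between modalities, which is what makes the library's coverage of NLP, vision, and audio practical for non-specialists.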

There are more than 100,000 free and accessible machine learning models on Hugging Face, which are collectively downloaded more than one million times per day by researchers, data scientists, and machine learning engineers.

Under the partnership, AWS will become Hugging Face's preferred cloud provider, giving developers access to tools such as Amazon SageMaker, AWS Trainium, and AWS Inferentia to optimise the performance of their models for specific use cases at a lower cost.

The need to make AI open and accessible to all is at the heart of this announcement, as both companies noted. Hugging Face said that the two companies will ‘contribute next-generation models to the global AI community and democratise machine learning.’

“Building, training, and deploying large language and vision models is an expensive and time-consuming process that requires deep expertise in machine learning,” an AWS blog noted. “Since the models are very complex and can contain hundreds of billions of parameters, generative AI is largely out of reach for many developers.”

“The future of AI is here, but it’s not evenly distributed,” said Clement Delangue, CEO of Hugging Face, in a company blog. “Accessibility and transparency are the keys to sharing progress and creating tools to use these new capabilities wisely and responsibly.”

Readers of AI News will know of the democratisation of machine learning from the AWS perspective. Speaking in September, Felipe Chies outlined the proposition:

“Many of our API services require no machine learning for customers, and in some cases, end users may not even realise machine learning is being used to power experiences. The services make it really easy to incorporate AI into applications without having to build and train ML algorithms.

“If we want machine learning to be as expansive as we really want it to be, we need to make it much more accessible to people who aren’t machine learning practitioners. So when we built [for example] Amazon SageMaker, we designed it as a fully managed service that removes the heavy lifting, complexity, and guesswork from each step of the machine learning process, empowering everyday developers and scientists to successfully use machine learning.”

This announcement can be seen not just in the context of democratising the technology, but from a competitive standpoint. Microsoft’s moves in the market with OpenAI, and its ChatGPT-influenced Bing – albeit with the odd hiccup – have created waves; likewise Google with Bard, again not entirely error-free. Either way, the stakes for the biggest of big tech have increased and the battleground for the ‘AI wars’ has intensified. Hugging Face has an existing relationship with Microsoft, having announced an endpoints service to securely deploy and scale Transformer models on Azure in May.

Picture credit: Hugging Face

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.
