
We had a great discussion yesterday on using Hugging Face Transformers on AWS. From getting started all the way to using hardware acceleration, we covered quite a bit of ground. Check it out!
In this video, I use AutoTrain, an AutoML product designed by Hugging Face, to train a multi-class classification model on tabular data (the PetFinder dataset from Kaggle). A quick way to load the dataset from the Hub is sketched after the links below.
Original dataset: https://www.kaggle.com/competitions/petfinder-adoption-prediction/
Dataset on the hub: https://huggingface.co/datasets/juliensimon/autotrain-data-petfinder-demo
Model on the hub: https://huggingface.co/juliensimon/autotrain-tabular-demo-762523398
Notebook: https://gitlab.com/juliensimon/huggingface-demos/-/tree/main/autotrain-petfinder…
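If you want to peek at the data before watching, here is a minimal sketch (not the notebook itself) that pulls the dataset from the Hub with the datasets library; it assumes the repo loads directly and exposes a standard train split.

from datasets import load_dataset
# Minimal sketch: load the PetFinder tabular dataset from the Hugging Face Hub.
# Assumes the repo can be loaded as-is and exposes a "train" split.
dataset = load_dataset("juliensimon/autotrain-data-petfinder-demo")
print(dataset)              # available splits and column names
print(dataset["train"][0])  # one row of tabular features and its label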
This video is a technical deep dive on the demo presented in https://youtu.be/I_hqzdqQ5vE, where I run multilingual voice queries on financial documents, using two state-of-the-art Transformer models for speech-to-text and semantic search in less than 100 lines of Python.
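To give a rough idea of the pattern, here is a hedged sketch (not the code from the video): the model names, audio file, and passages below are placeholders I picked for illustration.

from transformers import pipeline
from sentence_transformers import SentenceTransformer, util
# 1. Speech-to-text: transcribe a (possibly non-English) voice query.
#    Whisper is an illustrative choice, not necessarily the model used in the demo.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")
query_text = asr("voice_query.wav")["text"]
# 2. Semantic search: embed the query and document passages, rank by similarity.
embedder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
passages = ["Revenue grew 12% year over year.", "Operating margin was 21%."]
query_emb = embedder.encode(query_text, convert_to_tensor=True)
passage_embs = embedder.encode(passages, convert_to_tensor=True)
for hit in util.semantic_search(query_emb, passage_embs, top_k=2)[0]:
    print(f'{hit["score"]:.3f}  {passages[hit["corpus_id"]]}')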
This 5-minute video is for anyone curious about state-of-the-art Machine Learning, developer or not :) I introduce you to the value of Transformers and Hugging Face, and how they bring Software Engineering agility to Machine Learning.
To prove my point, I show a web app running multilingual voice queries on…
It’s that time of the year again! AWS re:Invent 2021 is over, and it’s time to update my maps once again:
- Overall map for all AI and ML services,
- Model deployment map,
- SageMaker map.
As usual, you’ll find the latest XMind and high-resolution PNG versions on Gitlab: https://gitlab.com/juliensimon/awsmlmap.
Enjoy, and…