Arcee.ai Llama-3.1-SuperNova-Lite is officially the #1 8-billion-parameter model
The results came in 15 minutes ago. Arcee.ai Llama-3.1-SuperNova-Lite is officially the #1 8-billion-parameter model on the Hugging Face LLM Leaderboard.
⚡ Model page: https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite
⚡ Notebook to deploy on SageMaker (GPU): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker.ipynb
⚡ Notebook to deploy on SageMaker (Inferentia2): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker-inf2.ipynb
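Once one of the notebooks above has deployed the model, a quick sanity check from Python might look like the sketch below. Everything here is an assumption, not taken from the post: the endpoint name is a placeholder (use whatever name the notebook actually creates), and the payload follows the TGI-style JSON schema that Hugging Face LLM containers on SageMaker commonly accept.

```python
import json


# Placeholder endpoint name -- substitute the endpoint the notebook created.
ENDPOINT_NAME = "llama-supernova-lite"


def build_payload(prompt, max_new_tokens=256, temperature=0.7):
    """Build a request body in the TGI-style JSON format commonly used by
    Hugging Face LLM containers on SageMaker (an assumption here)."""
    return {
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": temperature,
        },
    }


def invoke(prompt):
    """Send the prompt to the deployed endpoint.

    Requires AWS credentials and a live endpoint, so boto3 is imported
    lazily to keep the rest of the sketch runnable offline.
    """
    import boto3

    client = boto3.client("sagemaker-runtime")
    response = client.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=json.dumps(build_payload(prompt)),
    )
    return json.loads(response["Body"].read())


if __name__ == "__main__":
    # Inspect the request body without calling AWS.
    print(json.dumps(build_payload("Why is the sky blue?"), indent=2))
```

If your container expects a different schema (for example, a chat/messages format), adjust `build_payload` accordingly; the notebook output shows the exact request shape it configures.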
This model is a scaled-down version of our SuperNova Llama-3.1-70B, which we believe is the best 70B model available today.
⚡ SuperNova blog post: https://blog.arcee.ai/meet-arcee-supernova-our-flagship-70b-model-alternative-to-openai/
⚡ Deploy SuperNova from the AWS Marketplace: https://aws.amazon.com/marketplace/pp/prodview-sb2ndlhwmzbhi
#ai #slm #byebyeopenai
