Arcee.ai Llama-3.1-SuperNova-Lite is officially the πŸ₯‡ 8-billion-parameter model on the Hugging Face LLM Leaderboard

Sep 19, 2024

The results came in 15 minutes ago: Arcee.ai Llama-3.1-SuperNova-Lite is officially the πŸ₯‡ 8-billion-parameter model on the Hugging Face LLM Leaderboard.

➑ Model page: https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite

➑ Notebook to deploy on SageMaker (GPU): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker.ipynb

➑ Notebook to deploy on SageMaker (Inferentia2): https://github.com/arcee-ai/aws-samples/blob/main/model_notebooks/sample-notebook-llama-supernova-lite-on-sagemaker-inf2.ipynb
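The notebooks above walk through the full deployment. Once an endpoint is up, invoking it is a short boto3 call. The sketch below assumes the standard Hugging Face TGI serving container (which accepts an `inputs` + `parameters` JSON body) and a hypothetical endpoint name — substitute the name your deployment notebook actually creates:

```python
import json


# Hypothetical endpoint name -- use the one created by the deployment notebook.
ENDPOINT_NAME = "llama-supernova-lite"


def build_payload(prompt: str, max_new_tokens: int = 256,
                  temperature: float = 0.7) -> bytes:
    """Build a TGI-style JSON request body for the Hugging Face LLM container."""
    return json.dumps({
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": temperature,
            "do_sample": True,
        },
    }).encode("utf-8")


def invoke(prompt: str) -> str:
    """Call a live SageMaker endpoint (needs AWS credentials + a deployed model)."""
    import boto3  # imported here so build_payload stays usable offline

    client = boto3.client("sagemaker-runtime")
    response = client.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=build_payload(prompt),
    )
    # TGI returns a JSON list of {"generated_text": ...} objects.
    return json.loads(response["Body"].read())[0]["generated_text"]
```

The same call shape works for both the GPU and the Inferentia2 endpoints, since both sit behind the standard SageMaker runtime API.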

This model is a scaled-down version of our SuperNova Llama-3.1-70B, which we believe is the best 70B model available today.

➑ SuperNova blog post: https://blog.arcee.ai/meet-arcee-supernova-our-flagship-70b-model-alternative-to-openai/

➑ Deploy SuperNova from the AWS Marketplace: https://aws.amazon.com/marketplace/pp/prodview-sb2ndlhwmzbhi

#ai #slm #byebyeopenai
