Hosting open source LLMs on optimized AWS instances

AI INFRA
Support for the design and implementation of open source LLMs on dedicated, optimized AWS Inferentia and Trainium instances.