Accenture launches AI Refinery framework with NVIDIA AI Foundry for custom Llama 3.1 models

Accenture has unveiled its Accenture AI Refinery framework, built on NVIDIA AI Foundry, to help clients develop custom large language models (LLMs) using the newly introduced Llama 3.1 collection. The framework, part of Accenture’s foundation model services, is designed to broaden the use of generative AI by letting businesses create LLMs tailored to their own domains and business processes.

The AI Refinery framework includes four main components: domain model customization and training, the Switchboard platform for model selection, an enterprise cognitive brain for indexing corporate knowledge, and an agentic architecture for autonomous AI actions. A hypothetical sketch of the model-selection idea appears below.
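Accenture has not published API details for the Switchboard, but the underlying model-selection idea can be illustrated with a minimal, hypothetical routing sketch in Python. The model names, domains, `Request` type, and `select_model` helper below are assumptions for illustration only, not Accenture's or NVIDIA's actual interfaces.

```python
from dataclasses import dataclass

# Hypothetical registry of domain-tuned Llama 3.1 variants, keyed by business domain.
# These names are illustrative placeholders, not an actual model catalog.
DOMAIN_MODELS = {
    "marketing": "llama-3.1-8b-marketing-ft",
    "finance": "llama-3.1-70b-finance-ft",
    "general": "llama-3.1-8b-instruct",
}

@dataclass
class Request:
    domain: str
    prompt: str

def select_model(request: Request) -> str:
    """Pick a domain-tuned model if one exists, otherwise fall back to a general model."""
    return DOMAIN_MODELS.get(request.domain, DOMAIN_MODELS["general"])

if __name__ == "__main__":
    req = Request(domain="marketing", prompt="Draft a product launch email.")
    print(select_model(req))  # -> llama-3.1-8b-marketing-ft
```

In a production setting, a router like this would typically weigh cost, latency, and accuracy rather than domain alone, but the fallback-to-general pattern conveys the basic "switchboard" concept.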

Julie Sweet, Accenture's chair and CEO, stated, “The world’s leading enterprises are looking to reinvent with tech, data and AI. They see how generative AI is transforming every industry and are eager to deploy applications powered by custom models. Accenture has been working with NVIDIA technology to reinvent enterprise functions and now can help clients quickly create and deploy their own custom Llama models to power transformative AI applications for their own business priorities.”

Jensen Huang, founder and CEO of NVIDIA, commented, “The introduction of Meta’s openly available Llama models marks a pivotal moment for enterprise generative AI adoption, and many are seeking expert guidance and resources to create their own custom Llama LLMs. Powered by NVIDIA AI Foundry, Accenture’s AI Refinery will help fuel business growth with end-to-end generative AI services for developing and deploying custom models.”

Accenture is using the AI Refinery framework to transform its own functions, initially focusing on marketing and communications before expanding to other areas. The framework’s services will be available to all clients using Llama in the Accenture AI Refinery, leveraging NVIDIA technologies and cloud options.
