Dell and Red Hat collaborate to bring Linux AI workloads to PowerEdge servers


Dell Technologies and Red Hat have announced a partnership to bring Red Hat Enterprise Linux AI (RHEL AI) to Dell PowerEdge servers, strengthening support for open-source AI workloads. The collaboration positions RHEL AI as a preferred platform for deploying AI applications on Dell's PowerEdge R760xa servers.
RHEL AI, designed as an AI-optimized operating system, aims to help organizations more easily develop, test, and run large language models (LLMs) for enterprise applications. By validating RHEL AI on Dell hardware, the partnership seeks to simplify the implementation of AI strategies while scaling IT infrastructure.
Meta’s Llama AI engine sees rapid growth in open source adoption


Meta’s Llama AI engine has seen substantial growth in the open source AI landscape, emerging as a leading force in the industry. With nearly 350 million downloads to date, Llama models have experienced a significant increase in adoption over the past year. In July alone, the models were downloaded over 20 million times, positioning Llama as a prominent open source model family.
Since the introduction of Llama 3.1, which expanded the context length to 128K tokens and added support for multiple languages, usage by token volume across major cloud service providers has more than doubled in three months. This growth reflects an increasing preference for Llama within the developer community, making it a notable competitor in the AI field.
Google releases open source Magika content type detection tool on GitHub


Google has decided to make Magika open source, but what exactly is it? Well, it is an innovative AI-powered system that the search giant designed to revolutionize the way binary and textual file types are identified. Magika stands out for its ability to deliver precise file identification within milliseconds, even when operating on a CPU.
Magika employs a custom, highly optimized deep-learning model that has been meticulously designed and trained using Keras. The model is remarkably lightweight, weighing in at about 1MB. For inference, Magika uses ONNX as its engine, ensuring that files are identified swiftly -- almost as quickly as non-AI tools -- even on a CPU.
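For context, the classic non-AI tools Magika is competing with identify files by matching hard-coded "magic byte" signatures at the start of the data. The sketch below is a hypothetical, minimal illustration of that traditional approach (real tools such as `file` use far larger signature databases), showing the ambiguity gap a trained model is meant to close:

```python
# Minimal, hypothetical sketch of classic magic-byte file-type detection --
# the non-AI baseline that Magika's deep-learning model improves on.

SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "png",
    b"%PDF-": "pdf",
    b"PK\x03\x04": "zip",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def detect(data: bytes) -> str:
    """Return a best-guess type label for the given bytes."""
    for magic, label in SIGNATURES.items():
        if data.startswith(magic):
            return label
    # No signature matched: fall back to a crude text/binary split.
    # This is exactly where signature tables struggle and an ML model
    # can distinguish, say, Python source from Markdown.
    try:
        data.decode("utf-8")
        return "text"
    except UnicodeDecodeError:
        return "unknown-binary"

print(detect(b"%PDF-1.7 sample"))   # pdf
print(detect(b"hello, world"))      # text
```

Note how the fallback can only say "text"; Magika's model instead classifies the specific textual format, which is the hard part for signature-based tools.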
Intel teams up with Accenture to launch 34 open-source AI reference kits


In a powerful collaboration with Accenture, Intel has rolled out an impressive collection of 34 open-source AI reference kits. These kits are potential game-changers, aimed at simplifying and speeding up the process of deploying AI for data scientists and developers.
Think about it, dear readers -- every kit is a treasure chest of AI tools. Inside, you'll find model code, training data, instructions for setting up the machine learning pipeline, libraries, and oneAPI components. All of these are designed to optimize AI and make it more accessible, regardless of whether you're working in an on-premises, cloud, or edge environment.
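To make those stages concrete, here is a minimal, hypothetical end-to-end pipeline in the spirit of what a reference kit packages: data loading, preprocessing, training, and evaluation. It is not taken from any actual Intel kit, uses a stand-in dataset, and relies on plain scikit-learn rather than Intel's oneAPI-optimized components:

```python
# Hypothetical sketch of the stages a reference kit bundles:
# 1) load data, 2) set up a preprocessing + model pipeline,
# 3) train, 4) evaluate. A real kit ships its own data, model
# code, and oneAPI-accelerated libraries.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# 1. Load a stand-in dataset.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# 2. Define the machine learning pipeline: scaling, then a classifier.
pipeline = make_pipeline(StandardScaler(), LogisticRegression(max_iter=200))

# 3. Train and 4. evaluate.
pipeline.fit(X_train, y_train)
accuracy = pipeline.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

The value of a kit is that all four stages arrive pre-wired and documented, so teams swap in their own data and model rather than assembling the pipeline from scratch.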
BetaNews, your source for breaking tech news, reviews, and in-depth reporting since 1998.