Paul Krill
Editor at Large

Red Hat ships AI platform for hybrid cloud deployments

news
Feb 27, 2026 · 2 mins

Red Hat AI Enterprise is an integrated AI platform for deploying, managing, and scaling AI-powered applications on any infrastructure, Red Hat said.

Red Hat logo and sign at the open-source software company's office in Sunnyvale, California. Red Hat has its corporate headquarters in Raleigh, North Carolina.
Credit: Michael Vi/Shutterstock

Red Hat has made its Red Hat AI Enterprise platform generally available, aiming to simplify the development and deployment of AI-powered applications in hybrid clouds.

Availability of the platform was announced February 24. Engineered to close the “production gap” for AI, Red Hat AI Enterprise unifies the AI model and application life cycles—from model development and tuning to high-performance inference—on standard, centralized infrastructure. The goal, Red Hat said, is to accelerate delivery, increase operational efficiency, and mitigate risk by providing a comprehensive, all-in-one experience, letting users move away from treating AI as a disjointed, bespoke effort and toward a scalable, repeatable factory process. Red Hat AI Enterprise is powered by the Red Hat OpenShift cloud application platform.

Red Hat cited the following business benefits of Red Hat AI Enterprise:

  • Accelerated time-to-value, with a ready-to-use environment for teams to “develop once and deploy anywhere” without rewriting code.
  • Increased operational efficiency, simplifying workflows from code commits to model serving.
  • Mitigated risk and governance, with a foundation for digital sovereignty, giving organizations control over where data and models reside.

For platform engineers, AI engineers, and application developers, Red Hat AI Enterprise provides a foundation for modern AI workloads, Red Hat said. This includes AI life-cycle management, high-performance inference at scale, agentic AI innovation, integrated observability and performance modeling, and trustworthy AI and continuous evaluation. Tools are provided for dynamic resource scaling, monitoring, and security. For zero-downtime maintenance, rolling platform updates keep the AI stack current and protected without disrupting active inference services, according to Red Hat.

Paul Krill

Paul Krill is editor at large at InfoWorld. Paul has been covering computer technology as a news and feature reporter for more than 35 years, including 30 years at InfoWorld. He has specialized in coverage of software development tools and technologies since the 1990s, and he continues to lead InfoWorld’s news coverage of software development platforms including Java and .NET and programming languages including JavaScript, TypeScript, PHP, Python, Ruby, Rust, and Go. Long trusted as a reporter who prioritizes accuracy, integrity, and the best interests of readers, Paul is sought out by technology companies and industry organizations who want to reach InfoWorld’s audience of software developers and other information technology professionals. Paul has won a “Best Technology News Coverage” award from IDG.
