
Michael Malzahn
@ironwoodmike
Father of two – Kayla and Brianna. Husband. Technology Pusher @Redhat. Golfer. Orangetheory. National Ski Patrol.
ID: 3248768304
18-06-2015 15:07:46
176 Tweets
129 Followers
951 Following
Salesforce majors in customer relationships, and when they wanted to take #CX to the next level, they turned to Red Hat Enterprise Linux. See why: red.ht/462AM9H #DF23


Congratulations to @redhat on Red Hat OpenShift winning Best Hybrid Cloud Product of the Year, with the judges calling it "the gold standard of hybrid cloud computing"!
Want a close look at Red Hat’s developer strategy and roadmap? Join us TODAY at Red Hat Summit for a live demo of Red Hat's newest technologies that focus on improving developer experiences: red.ht/44nuOQX #PlatformEngineering #PlatformAsProduct


#InstructLab puts LLM development into the hands of the open source community, right where it should be. And who better to contribute than #RHSummit and #AnsibleFest attendees? Check out the InstructLab Lounge in the expo hall.
El Capitan is expected to deliver more than two exaflops of processing power and will be used by all three Tri-Lab national laboratories. See the future of exascale supercomputing, brought to you by Red Hat Enterprise Linux and Lawrence Livermore National Laboratory. red.ht/3JB4Gsr #RHSummit


Discover how Red Hat Ansible helps Rockwell Automation save time and standardize system deployments around the globe: red.ht/3SPaGSj

Today, Red Hat signed a definitive agreement to acquire Neural Magic (now part of Red Hat AI), a pioneer in software and algorithms that accelerate #GenAI inference workloads. Read how we are accelerating our vision for #AI’s future to break through the challenges of wide-scale enterprise AI and

AI needs to be scalable, trainable and everywhere. That's why I'm especially excited about today's news. Neural Magic (now part of Red Hat AI) is a pioneer in software that accelerates generative AI. With this technology and our shared commitment, Red Hat is building the future of AI with open source
