

The first Maserati was launched in 1926. The first Ferrari was launched in 1947. And the first Porsche was launched in 1948. And my personal favourite, the first Land Rover, was also launched in 1948.
What do all of these legendary vehicles have in common?
Each predates the mainframe and COBOL, yet nobody calls them outdated. Why? Because they have continually evolved, embracing modern engineering, cutting-edge technology, and innovation while maintaining the efficiency, performance, reliability, and excellence they were built on. The same is true of the mainframe.
Yet despite decades of continuous transformation, some critics still cling to the myth that mainframes are outdated, inefficient, and unable to integrate with modern IT systems. This couldn’t be further from the truth. IBM’s z16, launched in 2023, was built for AI, and the z17, due to launch this year, is poised to handle new workloads with unparalleled security, scalability, and efficiency. COBOL, the backbone of critical applications, is as easy to use as any modern programming language when paired with the right tools.
The problem isn’t the mainframe; it’s how we’ve managed and transformed the applications running on it. Instead of walking away from the most reliable, secure, and highest-performing computing platform in history, we should focus on how it is evolving to support new workloads, AI-driven insights, and hybrid cloud integration.
A Rapidly Modernizing Space
The mainframe isn’t standing still. It’s taking on more mission-critical workloads than ever, supporting everything from AI-powered fraud detection to high-speed financial transactions. In fact, 72 percent of the world’s compute runs on mainframes, while the platform accounts for just 8 percent of IT costs.
Mainframe transformation involves two things. First, development teams need to harness mainframes’ computing power, scale, and data storage capabilities. Second, they need those mainframe systems to align with the automation capabilities their cousins in the cloud have adopted, making the mainframe software development life cycle more efficient, eliminating manual processes, and increasing the quality and velocity of legacy applications.
DevOps workflows alone won’t get us there, but tools are bridging the gap.
When it comes to tools, shops need mainframe code to be managed just like cloud or distributed applications, enabling continuous integration/continuous delivery (CI/CD) pipelines, automated testing, and version control while maintaining compatibility with legacy environments.
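As a rough illustration of that pipeline parity, here is a minimal Python sketch of a CI gate for mainframe code, written under stated assumptions: the stage names and the placeholder stage bodies are hypothetical stand-ins for shop-specific steps (a COBOL compile job, an automated test harness), not any vendor’s actual tooling.

```python
# Minimal sketch of a CI/CD gate for mainframe code. Each stage
# returns True on success; the pipeline stops at the first failure,
# just as a cloud-side CI pipeline would. Stage bodies are
# hypothetical placeholders for real compile/test jobs.
from typing import Callable, List, Tuple


def compile_stage() -> bool:
    # Placeholder: in practice, submit a compile job for the changed
    # COBOL members and check the return code.
    return True


def unit_test_stage() -> bool:
    # Placeholder: run automated tests against the compiled program.
    return True


def run_pipeline(stages: List[Tuple[str, Callable[[], bool]]]) -> List[str]:
    """Run stages in order; return the names of the stages that ran."""
    ran = []
    for name, stage in stages:
        ran.append(name)
        if not stage():
            print(f"pipeline failed at: {name}")
            break
    else:
        print("pipeline passed")
    return ran


if __name__ == "__main__":
    run_pipeline([
        ("compile", compile_stage),
        ("unit-test", unit_test_stage),
    ])
```

The point of the sketch is the shape, not the stage bodies: once mainframe build and test steps are callable like this, they can slot into the same automated pipelines the distributed side already uses.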
Culture and the developer experience also play an important role in mainframe transformation. If the developer experience for engineers is subpar, a boost to efficiency is unlikely to emerge. Removing manual bottlenecks, reducing or eliminating context switching, streamlining archaic development processes, and adopting an agile culture are all straightforward ways to improve the developer experience.
Fine-Tuning the Mainframe for Government Efficiency
Customers I talk to often describe three very different, but equally valid, paths for fine-tuning their mainframe strategy. Some government agencies choose a slow-and-steady approach, expanding their mainframe footprint over time as needs evolve. “Our workloads are growing as our population grows,” one CIO told me. “We’re not moving off the mainframe; we’re growing with it.” For these agencies, there’s a natural rhythm of growth that doesn’t require radical change, just thoughtful investment as usage expands.
Others are leaning into modernization by refactoring the code itself. With the help of generative AI-powered code assistants, customers tell me they’re finally able to tackle decades-old applications with confidence. These tools explain unfamiliar code in plain language, document it automatically, and suggest best practices for making changes. For government teams with limited access to senior mainframe developers, this new level of code intelligence helps bridge the skills gap and enables faster, safer transformation of core applications.
And then there are the agencies doubling down, reinvesting in the mainframe by upgrading to the latest zSystems and embracing DevOps practices across the board. “If we can do it on the distributed side, we should be able to do it on the mainframe,” one agency leader told me. By staying current, these organizations reduce technical debt, support modern development tools, and ensure seamless integration into their enterprise-wide DevOps workflows.
Future-Proofing the Mainframe
The developers working with mainframes are also excited about their future. A 2024 Forrester report found that “among global infrastructure hardware decision-makers, 61% said that their firm uses a mainframe. Of those that use mainframes, 54% indicated that their organization would increase its use of a mainframe over the next two years.”
There’s also a vast ecosystem of vendors building tools to modernize the mainframe.
That’s why you see more and more talk of artificial intelligence, graphical scanning, and mapping tools to parse, map, and refactor legacy code bases and monolithic code into more manageable assets. AI also gives organizations the ability to onboard new resources quickly and get them familiar with the code base sooner so they become more productive. Developers can pinpoint necessary changes faster, reducing planning time and accelerating updates.
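To make the scanning-and-mapping idea concrete, here is a small, hypothetical Python sketch of the kind of analysis such tools automate: walking COBOL source text for statically bound `CALL` statements to build a program-to-program call map, so a newcomer can see how monolithic programs relate before changing them. The regex, program names, and source fragments are illustrative only; real modernization tools do far deeper parsing than this.

```python
# Hypothetical sketch: extract static CALL targets from COBOL source
# text to build a simple program-to-program dependency map.
import re
from typing import Dict, List

# Matches statically bound calls such as: CALL 'PAYCALC' or CALL "PAYCALC"
CALL_PATTERN = re.compile(r"\bCALL\s+['\"]([A-Z0-9-]+)['\"]", re.IGNORECASE)


def call_map(sources: Dict[str, str]) -> Dict[str, List[str]]:
    """Map each program name to the programs it statically CALLs."""
    return {
        program: sorted(set(CALL_PATTERN.findall(text)))
        for program, text in sources.items()
    }


if __name__ == "__main__":
    # Illustrative source fragments, not a real code base.
    sources = {
        "BILLING": "PROCEDURE DIVISION.\n    CALL 'TAXCALC' USING WS-REC.\n    CALL 'PRINTER'.",
        "TAXCALC": "PROCEDURE DIVISION.\n    CALL 'RATETBL' USING WS-RATE.",
    }
    print(call_map(sources))
```

Even this toy version shows the payoff the vendors are chasing: a map like this is exactly what lets a developer pinpoint where a change will ripple before touching the code.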
These developments are promising, and I don’t doubt that they could allow government agencies to harness the mainframe’s data storage and processing power while also adopting the agility that has been the hallmark of Silicon Valley.