Is Software-Defined A New Liberty For IT?


Adrian Bridgwater, Contributor

March 18, 2016

The information technology industry’s problem is that it keeps reinventing itself. But this is a good thing. Processing and compute speeds get faster, storage capacities get bigger and cheaper… and our endpoint devices get more functionality and better interconnectivity.

But constant reinvention is also a bad thing. The IT industry ends up in a vicious cycle of legacy obsolescence forcing updates that often lead to expensive ‘rip and replace’ practices. The constant march towards version 2.0 of everything very often ends up being wasteful and expensive.

Software-defined saviour?

Thinking over the last decade especially has driven us towards the notion that we can avoid, or at least insure against, a good proportion of the expense incurred in upgrade cycles by embracing software-defined computing.

Being able to ‘describe’ a computer through software definition alone is central to the notions of virtualization and abstraction that underpin cloud computing. As we know, there is no real cloud: it’s a server in a datacenter to which we have applied management software in order to ‘define instances’ of compute power, storage and networking intelligence.

In cloud, we define the computer through software. When we need a new computer, we disassemble, retire or simply park the old software-defined instance and move on to define a new virtual machine.
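To make that idea concrete, here is a minimal, purely illustrative sketch (the class and method names are hypothetical and do not correspond to any real cloud provider’s API) of how a machine can be ‘described’ as nothing more than a piece of software state that is defined, parked and retired on demand:

```python
# Illustrative only: a toy model of a 'software-defined' machine.
# None of these names correspond to a real cloud provider's API.
from dataclasses import dataclass


@dataclass
class VirtualMachine:
    name: str
    cpus: int
    memory_gb: int
    storage_gb: int
    state: str = "defined"   # defined -> running -> parked / retired

    def start(self):
        self.state = "running"

    def park(self):
        # 'Parking' keeps the definition around without consuming compute.
        self.state = "parked"

    def retire(self):
        self.state = "retired"


# Defining a 'new computer' is just creating a new description in software.
old_vm = VirtualMachine("legacy-app-host", cpus=4, memory_gb=16, storage_gb=500)
old_vm.start()

# When we need a new machine, we park or retire the old definition...
old_vm.park()

# ...and define a fresh one, with no physical 'rip and replace' involved.
new_vm = VirtualMachine("legacy-app-host-v2", cpus=8, memory_gb=32, storage_gb=500)
new_vm.start()
```

The point of the sketch is simply that the ‘upgrade’ is a change of description, not a change of hardware.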

It is by building on these techniques that LzLabs is attempting to create mainframe systems out of mere software. In what is claimed to be the world’s first Software Defined Mainframe, the firm wants old mainframe users (there are many in financial IT and some governmental use cases) to move their legacy applications and data to open Linux server and cloud platforms.

70% of the world runs on old stock

LzLabs claims that more than 3,000 of the world’s largest companies have no escape from expensive and outdated application architectures (many of them mainframe), which still power 70% of the world’s commercial transactions. The LzLabs Software Defined Mainframe supports major legacy operating environments and software languages.

“Despite an almost universal desire to liberate mainframe applications to improve interoperability, business agility and to reduce costs, the risk and complexity of rewriting or recompiling code have been assessed as too high by many mainframe customers”, said Thilo Rockmann, chairman of LzLabs. “What was required was a seamless way to allow the customer’s application code and data to run unchanged in a modern environment.  LzLabs has worked for five years to build exactly this solution – the Software Defined Mainframe.”

The LzLabs Software Defined Mainframe will be offered to customers both for use within their own datacentres running on Red Hat Linux-based computers and for deployment via the Microsoft Azure cloud platform.

According to LzLabs, “With over 70 per cent of commercial transactions occurring on mainframe-based systems, organizations have become dependent on legacy applications stuck behind outdated application programming interfaces (APIs). Historically, these organizations have been forced to abandon compatibility with the mainframe in order to move legacy applications and data to Linux or the cloud.  Abandoning compatibility makes migration very difficult as critical data will need to be converted and complex applications must be rewritten or recompiled and tested in a new environment. LzLabs Software Defined Mainframe protects customers’ investment in their business processes by eliminating recompilations of COBOL and PL/I programs, data conversion and complex testing.”

Mainframe brain drain

So what’s interesting about this story? Well, for one, it’s worth embracing the fact that software-defined computing is changing the way IT works and its virtual impact is today very real. Secondly, this is another example of so-called ‘container’ technology: LzLabs has developed a managed software container designed to migrate applications from mainframes onto Linux computers or private, public and hybrid cloud environments.

Thirdly, the suggestion (if all this works as well as it is claimed to… and it won’t in every instance) is that when legacy application programs are placed into the container, the customers’ applications are actually enhanced because decades-old APIs (and other componentry) are exchanged for newer, more contemporary ones.
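As a rough illustration of that last point (a sketch only, with hypothetical names, and not a description of LzLabs’ actual mechanism), the container can be thought of as an adapter layer: the legacy program keeps calling the interface shape it has always called, while the call is translated onto a modern equivalent underneath.

```python
# Illustrative only: a toy 'adapter' showing how a legacy-style API call
# might be mapped onto a modern backend inside a migration container.
# These names are hypothetical and do not describe LzLabs' implementation.


class ModernRecordStore:
    """Stands in for a contemporary backend, e.g. a relational database."""

    def __init__(self):
        self._rows = {}

    def upsert(self, key: str, value: dict):
        self._rows[key] = value

    def fetch(self, key: str) -> dict:
        return self._rows[key]


class LegacyDatasetAPI:
    """Exposes the call shape a decades-old program expects,
    but delegates the actual work to the modern backend."""

    def __init__(self, backend: ModernRecordStore):
        self._backend = backend

    def write_record(self, record_id: str, payload: dict):
        self._backend.upsert(record_id, payload)

    def read_record(self, record_id: str) -> dict:
        return self._backend.fetch(record_id)


# The 'unchanged' legacy program only ever sees LegacyDatasetAPI...
api = LegacyDatasetAPI(ModernRecordStore())
api.write_record("CUST0001", {"name": "A. Customer", "balance": 120.50})
print(api.read_record("CUST0001"))
```

If something like this works as claimed, the legacy code is untouched while the plumbing beneath it is modernised; if it doesn’t, the adapter simply becomes another layer to debug.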

It’s easy to pop caustic remarks at LzLabs because a) mainframes aren’t sexy b) some of this technology has yet to fully prove its worth and c) the firm uses a funny superscript z in its name. That said, native mainframe inside software-defined container on cloud might well cook up as a nice pizza.

This article was written by Adrian Bridgwater from Forbes and was legally licensed through the NewsCred publisher network.
