IT modernization increasingly means something new. Go back five years, and certainly ten, and it was primarily an efficiency story – which is to say, primarily a cost-cutting story. Out with Solaris and in with Linux. Out with legacy Unix hardware and in with x86. Out with a proprietary database and in with MySQL.
We also saw the continued mainstreaming of virtualization, which improved the historically dismal utilization of mass-market servers without requiring much fundamental change in how servers were operated.
This was a common theme of that era of IT modernization. Make things more efficient but don’t upset existing processes, workflows, and development methods. Don’t require any large-scale upskilling of staff.
But that’s changing. While some existing platforms are being left as is or only lightly refreshed, many others are being modernized through major refits or other fundamental platform shifts.
[ Related read: Kubernetes: 6 open source tools to put your cluster to the test ]
1. Container platforms begin with Kubernetes but they don’t end there
Containers have brought the most visible change to IT platforms, and they show no signs of slowing down. Red Hat’s The State of Enterprise Open Source 2021 report found that 72 percent of respondents plan to increase their container usage over the next 12 months. Unlike virtual machines, containers demand – or at least align naturally with – many other trends in technology and operational practice, including DevSecOps, automation, and microservices. As a result, containers can lead to fairly significant rethinking of established practices, which makes the pace of container adoption that much more remarkable.
Complementary to containers and a necessity at scale is Kubernetes, which provides the orchestration needed for production deployments. Yet, even that’s not the full story. Kubernetes has effectively served as the nexus for a vast number of cloud-native projects that have grown up around it in areas such as registries, security and compliance, networking, storage, service meshes, and CI/CD. Thus the new foundation for IT is container-optimized Linux, Kubernetes, and an integrated suite of other tools targeting developers, operators, and other individuals interacting with the platform.
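To make that concrete, here is a minimal sketch of programmatic interaction with such a platform, using the official Kubernetes Python client to list the deployments in a namespace. The “production” namespace is a hypothetical example, and the sketch assumes a working kubeconfig (the same credentials kubectl uses) – it is an illustration, not any particular vendor’s platform.

```python
# Minimal sketch using the official Kubernetes Python client
# (pip install kubernetes). The "production" namespace is a
# hypothetical example; substitute your own.
from kubernetes import client, config

def list_production_deployments():
    # Loads credentials from ~/.kube/config, the same file kubectl uses.
    config.load_kube_config()
    apps = client.AppsV1Api()
    deployments = apps.list_namespaced_deployment(namespace="production")
    for d in deployments.items:
        ready = d.status.ready_replicas or 0
        print(f"{d.metadata.name}: {ready}/{d.spec.replicas} replicas ready")

if __name__ == "__main__":
    list_production_deployments()
```

That same Kubernetes API surface is what the surrounding ecosystem of registries, CI/CD tools, and operators builds against, which is much of why so many cloud-native projects have grown up around it.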
[ Kubernetes terminology, demystified: Read How to explain Kubernetes in plain English and get our Kubernetes glossary cheat sheet for IT and business leaders. ]
2. Where you run and consume applications and services is up to you
Another change lies in where and how the applications and services running on the platform – and indeed the platform itself – can be consumed.
Today, platforms like Kubernetes don’t need to run on on-premises bare metal servers or virtual machines. Nor do you need to configure and operate your own application and data services. You still can, of course, but you also have the option of consuming services that someone else integrates, operates, and delivers for you. And, with open source technologies, you can have a consistent experience across hybrid cloud environments.
Coco Jaenicke, Red Hat Director of Product Marketing, Managed Application Services, puts it this way: “Managed cloud services – functionality that is hosted and managed in the cloud – provide a clean separation of the service’s features and effort that goes into administering the service. They provide the best of both worlds if you are looking at them through the lens of a development team under pressure – they provide the technology you want with none of the hassles of acquiring hardware, managing uptime, or updating software.”
[ Read also: OpenShift and Kubernetes: What’s the difference? ]
3. Digital transformation is about more than platforms. But it needs solid platforms
Digital transformation got a lot of CIO attention this past year at events like the MIT Sloan CIO Symposium – and it still isn’t a product or a solution that anyone can buy. Rather, it’s best described as a continuous process involving new technologies, new ways of working, and a culture of experimentation. Fostering that culture leads to faster and more frequent experimentation and, through continuous improvement, to better outcomes.
But just because the technology component often isn’t (and shouldn’t be) front and center in digital transformation projects doesn’t mean that a technology toolbox, including a foundational platform, is unimportant. Anything but. Look back at the words in that definition of digital transformation and it’s easy to see why traditional, rigid platforms, often intended to support monolithic, long-lived applications, might not fit the bill.
Digital transformation is responsible in no small part for the acceleration of both containerized environments and the consumption of cloud services.
4. Boring is still good
Some things haven’t changed, though. When it comes to production infrastructure, do you really want an exciting 3 AM page? I’m guessing you don’t. Save the excitement for solving business and customer problems with the right tools while the platform just chugs along.
Deon Ballard, Principal Product Marketing Manager, Red Hat Enterprise Linux, told me in an interview: “Sometimes you want boring. Sometimes you want reliable and predictable and all the things that you generally don’t want your first date to describe you as, you really want your operating system customers to describe you like that.”
5. Security is still essential
The need to secure your platforms (as well as everything else in your business) is another constant. It’s certainly not a new requirement.
That said, while not new, it’s an arguably more intense problem that does call out for new approaches that are more systematic, more automated, and more continuous than has often been the case in the past. For example, companies writing software are making use of more and more open source code. Lucy Kerner, Red Hat Director of Security Global Strategy and Evangelism, observes that “Open source brings advantages, such as access to the latest innovation, transparency, collaboration, cost savings, potentially higher quality software, and more eyes on the code. But you need to put safeguards in place to consume open source securely. Especially now, organizations are consuming cloud technologies more and more, and oftentimes this means giving developers more and more control of the full development lifecycle... But developers aren’t security experts, so this can lead to challenges if mitigations and security gates aren’t in place.”
[ Get the checklist: Top security and compliance considerations for IT modernization. ]
She advises putting “a secure development lifecycle in place where you’re regularly doing things like code and vulnerability scanning, extensive quality engineering, and testing per release.”
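As a hedged sketch of what one such security gate might look like, the snippet below fails a CI stage when a vulnerability report contains findings at or above a chosen severity. The report file name, its JSON shape, and the threshold are all illustrative assumptions, not any particular scanner’s real output format.

```python
# Hypothetical CI security gate: fail the build if a vulnerability
# report contains findings at or above a severity threshold.
# The report format and file name are illustrative assumptions,
# not any particular scanner's actual output schema.
import json
import sys

SEVERITY_RANK = {"LOW": 1, "MEDIUM": 2, "HIGH": 3, "CRITICAL": 4}

def gate(report_path: str, threshold: str = "HIGH") -> int:
    with open(report_path) as f:
        findings = json.load(f)  # expected: a list of {"id", "severity"} dicts
    blockers = [
        v for v in findings
        if SEVERITY_RANK.get(v.get("severity", "").upper(), 0)
        >= SEVERITY_RANK[threshold]
    ]
    for v in blockers:
        print(f"BLOCKING: {v['id']} ({v['severity']})")
    # A non-zero exit status fails the pipeline stage.
    return 1 if blockers else 0

if __name__ == "__main__":
    sys.exit(gate(sys.argv[1] if len(sys.argv) > 1 else "scan-report.json"))
```

Wired into a pipeline, a non-zero exit status from a script like this is what turns scanning from a report someone might read into a gate that actually blocks a release.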
New demands
Put this all together and a pretty consistent picture emerges. The fundamental requirements for an IT foundation haven’t really changed: it’s expected to be reliable, stable, and secure, and to handle the demands placed on it. Those demands have changed, though, so the specifics of the foundation need to change too. It has to be flexible, whether that means a containerized platform an organization operates on its footprint of choice or a service that an organization’s developers simply consume.
[ Learn more about hybrid cloud strategy. Get the free eBooks, Hybrid Cloud Strategy for Dummies and Multi-Cloud Portability for Dummies. ]