Cutting out the cancer in the code.
Customers are accustomed to immediate, regular updates on their personal devices, so it can be very difficult for them to understand why a massive multi-billion-dollar organization like a bank, energy company or supermarket chain is not as flexible and nimble as the phones in their pockets.
But anyone who looks under the hood will understand: the technology a large organization inherits is usually a kind of Frankenstein's monster of additions and changes, and it would be incredibly expensive, not to say impractical, to replace it wholesale. A large enterprise usually means decades of large, expensive, cumbersome, fragmented, and often incompatible implementations. And these legacy systems weren't designed with modern agility needs in mind.
You know your systems need to run in the cloud. And you know you need to make sure things are secure and that all the old stuff not only works together, but will also work with current and future implementations. So what do you do?
This is where containerization comes in.
A container essentially wraps your legacy systems in a cloud-native “skin,” so they operate in the cloud as if they were cloud-native systems.
Because containers expose a common interface, they can all talk to each other, acting as translators that help your fragmented legacy systems communicate with each other and with new technology implementations and applications. This hybrid approach is gaining momentum and is known in DevOps as “lift and shift,” because a containerized monolithic on-premises application can be “lifted and shifted” somewhere else (e.g. a modern public or private cloud).
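As a minimal sketch of what that wrapping looks like in practice, a lift-and-shift of an existing service might be little more than a short Dockerfile. The base image, jar name and port below are all hypothetical assumptions, not a prescription:

```dockerfile
# Hypothetical sketch: package an unmodified legacy Java service
# as a container image, without changing the application itself.
FROM eclipse-temurin:17-jre

# Copy the existing, unchanged legacy artifact into the image.
COPY legacy-billing.jar /app/app.jar

# The port the legacy service already listens on.
EXPOSE 8080

# Start the service exactly as it ran on-premises.
ENTRYPOINT ["java", "-jar", "/app/app.jar"]
```

The point of the sketch is that the legacy code itself is untouched; only its packaging changes, which is what makes the approach attractive for systems that are too expensive to rewrite.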
Containers also act as a shield, isolating your workloads from malicious attacks.
The result is a cloud-native, flexible, dynamic, and protected environment that supports your existing and future systems.
But what about the cybersecurity vulnerabilities already present in your systems? Sometimes the threat really comes from inside the house.
A strong philosophy based on the free exchange of knowledge has historically underpinned the software engineering community and profession. It is often referred to as the open source ethos, and we largely have the many fathers of the internet to thank for it: at the very dawn of the internet age, they had a utopian vision of what software development should look like.
It has many advantages. Rather than reinventing the wheel over and over again, engineers can lean on the work of their peers and predecessors. This is one of the reasons why growth in this space has been so rapid over the past two decades.
But it also has its downsides and (especially for businesses) its dangers.
If you run a business that uses software of any kind, you probably have open source code in the bowels of your systems somewhere. Your IT people have been using open source code in your technology products from the very beginning. That's fine. It's normal, great even! But how do you know whether errors (or worse, malware) have been introduced into the billions of lines of code already in use in your systems?
Honestly, you don't. We can speak of these anomalies as a kind of “cancer in the code.” They could be small mistakes that cause problems as you build and develop, or malicious lines smuggled in among the good stuff. Basically, your risk is low as long as you don't run into a) bad actors, or b) bad developers. We are now able to scan code for these cancers, remove them, and provide a clean resource for developers to tap into.
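At its core, one common form of this scanning is simple to picture: compare what your software depends on against a database of known-bad components. The sketch below is a toy illustration, not any vendor's actual scanner, and the package names and advisory entries are all made up:

```python
# Toy sketch of a dependency scanner: flag any dependency whose
# version appears in a (hypothetical) database of known advisories.

# Hypothetical advisory database: package name -> vulnerable versions.
KNOWN_VULNERABLE = {
    "leftpad": {"1.0.0"},
    "fastparse": {"2.1.0", "2.1.1"},
}

def scan(dependencies):
    """Return the (package, version) pairs that match a known advisory."""
    return [
        (pkg, ver)
        for pkg, ver in dependencies.items()
        if ver in KNOWN_VULNERABLE.get(pkg, set())
    ]

# Hypothetical project manifest: one vulnerable, one clean dependency.
deps = {"leftpad": "1.0.0", "requests": "2.31.0"}
print(scan(deps))  # flags only the vulnerable leftpad release
```

Real scanners work against continuously updated advisory feeds and also inspect the code itself, but the comparison step above is the essence of "finding the cancer" before it ships.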
Even after that, if some strands still slip through, the moment they try to activate, we have technology that stops and isolates them. While the container protects you from the outside, containerized security acts as an immune system, protecting you from the inside.
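To make the "immune system" idea concrete, here is a toy sketch of runtime policy enforcement. Everything in it (the allowlist, paths, and function names) is hypothetical; real tools enforce policies like this at the kernel or container-runtime level, not in application code:

```python
# Toy sketch of runtime enforcement inside a container: only processes
# on an explicit allowlist may start; anything else is blocked and
# quarantined for review.

ALLOWED_PROCESSES = {"/usr/bin/java", "/app/app.jar"}  # hypothetical policy

def on_process_start(path, quarantine):
    """Return True if the process may run; otherwise record it and block."""
    if path in ALLOWED_PROCESSES:
        return True
    quarantine.append(path)  # isolate the unexpected binary for review
    return False

blocked = []
print(on_process_start("/usr/bin/java", blocked))     # allowed: expected workload
print(on_process_start("/tmp/cryptominer", blocked))  # blocked: not on the allowlist
print(blocked)                                        # the isolated stragglers
```

The design choice mirrors the article's point: instead of trying to enumerate every possible threat, the runtime only permits behavior the container is known to need, so anything that "activates" unexpectedly is stopped by default.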
Elvis Jusic is the former A/NZ Manager of AquaSec.