Trends that will shape the future of DevOps

Jul 05, 2021 / by Esko Hannula

Everyone, including me, loves predicting future trends. But what about trends that aren’t trendy yet?

Let me get my crystal ball and take a look at some topics everyone is talking about — and one thing that you should really pay attention to. 👀

Right now, it looks like three major trends will shape the world of software companies in the future: hyper-automation, quality intelligence, and dependencies. Businesses already need to start adapting to them to remain competitive.

The underlying trend, the base of all of these, is the accelerated delivery of value through software, and it manifests itself in the adoption of DevOps. It has shaped the way IT works today, from internal processes to cloud infrastructure.

If you want to find out more about DevOps and DevOps automation, you can find our DevOps automation guide here.

Continuous development and a high release pace — increasingly even continuous deployment — are a reality already. Businesses are adapting to these trends in different ways and at different speeds. 

In this blog post, I will be looking at three key trends and emerging technologies I see shaping the future, and at how AI plays a role in each of them.

Find out more

📚 Five steps you need to apply DevOps now
📚 4 benefits of DevOps automation — and 4 ways to make the most of it
🎧 Webinar - How open-source has fuelled DevOps

Hyper-automation


The term itself is pretty self-explanatory: hyper-automation is the orchestrated automation of anything you can imagine, and with this, we have truly only seen the beginning.

We’re already starting to automate many crucial processes in our everyday lives, and increasingly so. 

Gartner's 2020 technology trends report already highlights hyper-automation as “a disciplined approach to rapidly identify, vet and automate as many business and IT processes as possible through the orchestrated use of multiple technologies, tools or platforms.”

This can mean anything from automated testing and deployments to automated procurement. 
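To make this a bit more concrete, here is a rough Python sketch of what a small, orchestrated automation chain could look like. The step names and commands are made up for illustration; in reality each step would call whatever test, deployment, or procurement tooling you already have.

```python
import subprocess

# A hypothetical hyper-automation chain: every step is something an existing
# tool already automates. The orchestration layer only decides the order
# and stops the chain as soon as a step fails.
PIPELINE = [
    ("run automated tests",  ["pytest", "--maxfail=1"]),
    ("deploy to staging",    ["./deploy.sh", "staging"]),
    ("order extra capacity", ["python", "procure_capacity.py", "--cpus", "8"]),
]

def run_pipeline(steps):
    for name, command in steps:
        print(f"Running step: {name}")
        result = subprocess.run(command)
        if result.returncode != 0:
            print(f"Step failed: {name} (exit code {result.returncode})")
            return False
    print("All steps completed.")
    return True

if __name__ == "__main__":
    run_pipeline(PIPELINE)
```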

Find out more

🎧  RoboCon: Hyperautomation or Hype-automation: the automation landscape of the future

Quality Intelligence


The best decisions are made based on proper and adequate information. However, information is rarely complete and timely, so most important decisions require judgment. 

In the traditional way of building software, you never quite know what is going on until your product is ready. In DevOps, you must know.

DevOps does not automatically make development industrial. Agile methods and DevOps have brought much more transparency into production, so that an external observer — such as a manager — can have better visibility into the process. 

But, at the same time, the need for such transparency is higher than ever. 

Once you have automated your production line, you can also automate measurement. This instantly gives you access to much more data to make sense of.

The downside is that while you’re better equipped with data, the process is running so much faster that data-based decisions are also expected to be made a lot quicker.
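As a simple illustration of what automated measurement can look like, here is a small Python sketch that turns raw test-run records into quality signals someone could act on quickly. The data and the threshold are invented; in practice the records would come from your CI system or test automation platform.

```python
from statistics import mean

# Invented test-run records for three suites.
test_runs = [
    {"suite": "checkout",  "passed": 48, "failed": 2, "duration_s": 310},
    {"suite": "loyalty",   "passed": 30, "failed": 0, "duration_s": 120},
    {"suite": "warehouse", "passed": 22, "failed": 5, "duration_s": 540},
]

def quality_summary(runs, pass_rate_threshold=0.95):
    """Turn raw run data into a per-suite signal plus an overall pass rate."""
    signals = []
    for run in runs:
        total = run["passed"] + run["failed"]
        pass_rate = run["passed"] / total
        status = "OK" if pass_rate >= pass_rate_threshold else "NEEDS ATTENTION"
        signals.append((run["suite"], round(pass_rate, 2), status))
    overall = round(mean(rate for _, rate, _ in signals), 2)
    return signals, overall

signals, overall = quality_summary(test_runs)
for suite, rate, status in signals:
    print(f"{suite}: pass rate {rate} -> {status}")
print(f"Overall pass rate: {overall}")
```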

And that is why we added quality intelligence to Qentinel Pace. 

We were frustrated for a long time: we had been testing software for years but kept struggling to break down the findings and their implications in an understandable way. Now, this can be done objectively and understandably.

This is why I think quality intelligence is a significant trend. It democratizes decision-making when it comes to software development. 

Read more

📚 What is Quality Intelligence? 

Dependencies — the trend you should really care about


When it comes to actually making the most of trends like hyper-automation, tackling dependencies will be the defining moment between success and failure for businesses worldwide.

One thing the world has failed to realize is that there is no single DevOps pipeline. 

There are many. Each solution provider that contributes to a complex business system architecture has their own DevOps pipeline, and it operates in isolation from the other providers’ pipelines. 

Then the companies that build these business processes form their own pipelines, either knowingly or implicitly. On top of that, there are integration providers who juggle other parts of the puzzle.

DevOps is all about bringing development and operational IT closer together. 

Companies may have different ecosystems that are closely interlinked with each other. 

Let’s look at an example. A company is using Salesforce, and that’s where they have their sales and subscription data. Then, they have an ERP system, for example an SAP S/4HANA system, that controls their demand-supply chain. 

Somewhere, close to the shop floor, there is an old warehouse management system that is rarely modified. Finally, they have a custom-built loyalty program that allows their end-customers to collect points whenever they buy something from the company. 

There is an integration partner for Salesforce, and the custom loyalty program is built in-house with integrations to the ERP and Salesforce environments. 

So, in this scenario, there are at least four different development and operation teams involved. Each of the solution providers has their own DevOps team running continuous updates.

Of course, this fictional company also has its own DevOps team, as does the implementation partner. 

These different DevOps teams are applying DevOps principles as part of their solution delivery, but not as part of the overall, highly customized architecture that the company has in place. Chances are, they are not even aware of each other.

This means that updates from one solution may impact the entire value chain, and, in the worst case, this can cause errors.
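To show what reasoning about these dependencies can look like in practice, here is a small Python sketch that models the example landscape above as a dependency graph and asks which systems an update could ripple into. The graph is made up, which is rather the point: most organizations have never written theirs down.

```python
from collections import deque

# A made-up dependency graph for the example company:
# an edge A -> B means "a change in A can affect B".
DOWNSTREAM = {
    "salesforce":      ["loyalty_program", "erp"],
    "erp":             ["warehouse_mgmt", "loyalty_program"],
    "loyalty_program": [],
    "warehouse_mgmt":  [],
}

def impacted_systems(changed_system, graph):
    """Breadth-first walk: everything reachable from the changed system."""
    seen, queue = set(), deque([changed_system])
    while queue:
        current = queue.popleft()
        for downstream in graph.get(current, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return sorted(seen)

print(impacted_systems("salesforce", DOWNSTREAM))
# -> ['erp', 'loyalty_program', 'warehouse_mgmt']
```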

On top of different systems, you need to take into account that each customer likely combines a variety of different business systems into a unique architecture that is tailor-made for their specific business needs.

To summarize: The average business process architecture is already complex today. Corporate IT needs to handle continuous releases from multiple sources that are highly integrated with each other. 

Once you add a way of developing software where each individual part can change rapidly, you create a world we have not seen before. 

What's the simple solution?


The most substantial advice I can give to companies is to tackle interdependencies. I know it may be obvious, but let’s ask ourselves: How secure do we feel when we run an SAP, Salesforce, or Dynamics update? Or better yet, ask your IT colleagues.

End users and customers will play a significant role in pushing companies to resolve these issues, because if something goes wrong, they are the ones who will see the effect in action.

While DevOps is in a way part of the challenge, it’s also part of the solution. 

DevOps has brought about frequent releases, but typically these are happening in small increments. Big bang releases are far less common, and this means the risk that each release brings is delivered in small chunks and becomes more manageable.

Here are three things that I believe will be crucial in resolving this challenge of dependencies:

  1. Continuous automated testing (with tools like Qentinel Pace).
  2. Set up ways to continuously monitor your systems so you can see when things go wrong (a minimal sketch of this follows the list).
  3. DevOps, product teams, and integration partners need to adjust their ways of working and manage dependencies. They know the interfaces, but not which potential dependencies might exist within the system.
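For the monitoring point above, here is a minimal Python sketch. The health-check URLs are placeholders; a real setup would run checks on a schedule and feed an alerting tool rather than print to the console.

```python
import urllib.error
import urllib.request

# Placeholder health-check endpoints for the systems in the value chain.
HEALTH_CHECKS = {
    "salesforce_integration": "https://example.com/health/salesforce",
    "erp_interface":          "https://example.com/health/erp",
    "loyalty_program":        "https://example.com/health/loyalty",
}

def check_endpoints(endpoints, timeout_s=5):
    """Return (system, reason) pairs for endpoints that look unhealthy."""
    failures = []
    for name, url in endpoints.items():
        try:
            with urllib.request.urlopen(url, timeout=timeout_s) as response:
                if response.status != 200:
                    failures.append((name, f"HTTP {response.status}"))
        except (urllib.error.URLError, OSError) as exc:
            failures.append((name, str(exc)))
    return failures

if __name__ == "__main__":
    for name, reason in check_endpoints(HEALTH_CHECKS):
        print(f"ALERT: {name} looks unhealthy ({reason})")
```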

Find out more

🎧  Webinar recording: Test Automation in DevOps

AI to the rescue


Even today, you can use AI, not yet to solve problems, but to facilitate solutions. 

For example, if you have collected data on how your software is used, you can use AI to see how many processes are likely to be affected when changes are made. This way you can easily estimate how big the impact would be.

For example, in Qentinel Pace, when you run tests, there is a piece of AI that tries to figure out whether something has changed in an unexpected way. If something has already gone wrong, you can use AI to figure out where the issue occurred. 

With a multi-vendor setup, the most challenging part is often not finding the solution but finding the party who is expected to resolve the problem. 

For AI to be usable and successful, you need data on the past behavior of systems. You can only predict the future if you know what happened in the past — or if you have that crystal ball. 

Larger companies have done this for a long time, but it hasn’t quite become mainstream yet. Microsoft is a well-known example: they build telemetry into the software they ship and gather usage data and user feedback to find out what they need to fix most urgently.

How AI and test automation will develop


And here we are, back at my favorite topic, test automation. 

Today, the actual running of tests is typically already automated. Test case design and maintenance have become the real bottlenecks. 

Test case maintenance


When a test fails, there are two potential reasons. 

First, there may be a bug or an error, and in that case, it’s good that the case failed.

Second, it’s equally likely that the case failed not because the software has an error but because the software has been changed so that the test case no longer works. 

Even the technology we have today can distinguish between these things and automatically resolve the errors. 

Qentinel Pace will inform you when a test fails, give you a potential reason, and even provide instructions on how to fix it. The more data is gathered, the more intelligent the system becomes.

Test case design


This is another area developing as we speak. DevOps teams are working with telemetry and gathering user behavior data. 

That user behavior data enables us to create test cases from real behavior and replay them in your environment. This way, you’re essentially testing real user behavior. Note, however, that this approach is subject to privacy controls.
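As a simplified illustration, here is a Python sketch that turns recorded user events into keyword-style test steps. The event format and the step names are invented; a real implementation would depend on how your telemetry is structured and what your test framework expects.

```python
# Invented example of a recorded user session, e.g. from product telemetry.
recorded_session = [
    {"action": "open",  "target": "login_page"},
    {"action": "type",  "target": "username_field", "value": "demo_user"},
    {"action": "click", "target": "login_button"},
    {"action": "open",  "target": "loyalty_points_page"},
]

def session_to_test_steps(events):
    """Translate raw usage events into human-readable test steps."""
    steps = []
    for event in events:
        if event["action"] == "open":
            steps.append(f"Go To    {event['target']}")
        elif event["action"] == "type":
            steps.append(f"Input Text    {event['target']}    {event['value']}")
        elif event["action"] == "click":
            steps.append(f"Click Element    {event['target']}")
    return steps

for step in session_to_test_steps(recorded_session):
    print(step)
```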

Test selection


This is another emerging trend when it comes to test automation. 

Imagine you are running automated tests, you gather data, and at some point, you have a large number of test cases to run. 

Even though the tests are automated, a robot is no magician either. This is a great place to use machine learning to understand which cases are most likely to detect an error. This helps you prioritize the tests that are most important to run. 
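Here is a rough Python sketch of that idea, with invented historical data and a simple scikit-learn model standing in for whatever you would actually train. For each test we look at two made-up features: how much the changed code overlaps with what the test touches, and the test’s recent failure rate.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented history: features per past execution plus whether the test failed.
X_history = np.array([
    [0.9, 0.30], [0.1, 0.05], [0.7, 0.20], [0.2, 0.02],
    [0.8, 0.40], [0.0, 0.01], [0.6, 0.10], [0.3, 0.05],
])
y_history = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = the test failed

model = LogisticRegression().fit(X_history, y_history)

# Candidate tests for the next run, with the same two features.
candidates = {
    "test_checkout_flow":  [0.8, 0.25],
    "test_loyalty_points": [0.1, 0.02],
    "test_warehouse_sync": [0.5, 0.15],
}
scores = model.predict_proba(np.array(list(candidates.values())))[:, 1]

# Run the tests most likely to catch a problem first.
for name, score in sorted(zip(candidates, scores), key=lambda pair: -pair[1]):
    print(f"{name}: estimated failure probability {score:.2f}")
```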

Read more

📚 Automated SAP testing – Building block for business processes

What you can do today to be ready for tomorrow


It’s great to talk about trends and speculate how things will change. However, let’s finish this blog post with two concrete pieces of advice that everyone can already start with today.

  1. Go and talk to your software providers, or at least the most advanced of them. Find out their ways of working and think about what this may mean for your business (processes). 
  2. Start thinking of how you can collect more data about the usage of your information systems. You can use that data to figure out the critical functions of your product, which parts are likely to be affected by updates, and where to focus your mitigation efforts (a small sketch of this follows below). Or, if you want to look at it the other way around, you know where your users spend most of their time, so you can optimize that part. 
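For the second piece of advice, here is a minimal Python sketch of how usage data could be aggregated to find the critical functions. The events are invented; in practice they would come from your product analytics or telemetry.

```python
from collections import Counter

# Invented usage events exported from product analytics.
usage_events = [
    {"user": "u1", "feature": "create_invoice"},
    {"user": "u2", "feature": "create_invoice"},
    {"user": "u1", "feature": "export_report"},
    {"user": "u3", "feature": "create_invoice"},
    {"user": "u2", "feature": "update_customer"},
]

# Rank features by how often they are used: the top of this list is where an
# unnoticed regression hurts the most, and where testing and monitoring
# effort pays off first.
feature_usage = Counter(event["feature"] for event in usage_events)
for feature, count in feature_usage.most_common():
    print(f"{feature}: {count} uses")
```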

 

And if you’re ready to try our cloud-based test automation tool: Qentinel Pace is the all-in-one platform for automated software testing.

The best part is that you can try it for free.

Start Free Trial

Topics: DevOps, Software Testing, Test Automation, Agile, Quality Assurance

Esko Hannula

Written by Esko Hannula