When electricity first became available for industrial use, factories began converting from steam power to electric power to drive their machinery.

In 1900, just 5% of factories used electrical power. Large steam engines drove very long drive shafts that ran the length of a factory, machines connected to the drive shafts via belts, and a whole apparatus of belt maintenance and lubrication was needed to keep everything turning (and to keep it from catching fire).

The obvious thing to do was replace the steam engine with an electric motor, which got rid of a hot engine that required constant attention and a lot of coal. Costs were reduced and reliability improved, but core productivity didn’t budge. Factories were no more productive, and factory owners wondered if electric power was worth the expense of switching over.

Today, there is a similar paradox associated with “lifting and shifting” applications to public clouds like AWS, Azure and Google Cloud. It can deliver modest improvements in cost and uptime, and it can be compelling for disaster recovery or data center exits. But most of the time, it yields very little improvement in productivity or other key business metrics.

Using electric motors as a steam engine substitute didn’t play to one of the key strengths of electricity: it was far easier to distribute around a factory. Instead of aligning machines around long drive shafts, innovators arranged their machines along an assembly line, so that the output of one machine could be fed into the next. This was possible because each machine could be powered by its own electric motor. Productivity exploded as a result. Henry Ford’s moving assembly line for the Model T was only possible because of this innovation.

New tools require new ways of working

Similarly, cloud adoption is most compelling when the application is refactored to take advantage of the operational and cost advantages of public cloud services, including on-demand capacity, elastic managed services (e.g. on-demand databases), autoscaling and more. Costs decline, flexibility increases and the application becomes easier to change.

The challenge that many Diffblue customers face when migrating Java applications to the cloud is that they lack good tests—especially unit tests—which makes it difficult to refactor without silently “breaking” the app. That slows down the migration, and makes it a lot riskier. One company showed us its “Cloud Passport” system, where an application must have a “stamped passport” before it can be migrated to the cloud. One of the stamps is “adequate unit test coverage”—which was a big challenge for them. We have another customer doing a similar migration to Microsoft Azure, using Diffblue Cover to automate their unit testing as they refactor.

Automatically generated, human-readable unit tests

Diffblue Cover automatically writes human-like tests for existing Java applications, so you can validate that an application still functions properly in its new cloud environment. Because Cover writes unit tests, you can quickly pinpoint components whose behavior has changed and diagnose problems. And because Cover can automatically update tests as the code changes, the unit tests it writes keep pace with the application, staying up to date as it is modernized and becomes more cloud-native in design and architecture.
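To make the idea concrete, here is a minimal sketch of the kind of unit test that pins down existing behavior before a refactor. The class and values are hypothetical, invented for illustration (not actual Diffblue Cover output), and plain `assert` statements stand in for a JUnit test so the example is self-contained:

```java
// Hypothetical legacy class slated for cloud migration; names are illustrative only.
class InvoiceCalculator {
    double totalWithTax(double subtotal, double taxRate) {
        return subtotal + subtotal * taxRate;
    }
}

public class InvoiceCalculatorTest {
    public static void main(String[] args) {
        InvoiceCalculator calc = new InvoiceCalculator();

        // Pin the current behavior: if a refactor changes the result,
        // these checks fail immediately instead of the bug surfacing in production.
        assert Math.abs(calc.totalWithTax(100.0, 0.2) - 120.0) < 1e-9;
        assert calc.totalWithTax(0.0, 0.2) == 0.0;

        System.out.println("ok");
    }
}
```

A suite of tests like this acts as a safety net: the application can be restructured for the cloud while the tests continuously confirm that its observable behavior has not changed. (Run with `java -ea` so assertions are enabled.)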

The result? Faster cloud migrations with less risk.

To see how Diffblue Cover can work for your organization, download our free IntelliJ plugin, or get a free trial of the CLI tool.