Code castles made of sand fall into the dependen-sea eventually

Most modern applications depend on third-party libraries for key functions (Java applications, for example, use 107 libraries on average), and this expansion in open source library usage has been a good thing for everyone. Why reinvent the wheel when you can download the design for a perfectly functional wheel from the Maven Central Repository?

But when your application relies on so many moving pieces that are frequently being updated, ignoring even a few updates means it will soon be running on out-of-date dependencies. The longer you wait between updates, the more updates you’ll have to do at once, the more painful each one becomes, and the more likely it is that you’ll end up trying to fix everything under time pressure from a security emergency that arrives at the worst possible moment.

In some ways, keeping dependencies up-to-date is like building a sandcastle right next to the ocean: the castle takes time and work to build, and you’re proud of the result, but as soon as you’ve made some progress, a wave of updates wipes out your work and leaves you back where you started.

Current strategy: Seek higher ground

This constant race against both code-breaking updates and potential security vulnerabilities is why a lot of repository owners effectively “seek higher ground”: they move away from the shore and stop implementing updates for their dependencies in favor of trying to maintain current functionality. This works until the application dries out and crumbles because it’s too deprecated and insecure to keep standing, and the dependencies that rely on each other are no longer supported.

Don’t fight the waves of updates: Build a moat

You really shouldn’t have to choose between security and functionality. At the risk of overstretching this metaphor, the key to withstanding the waves of updates is finding a method that embraces them and incorporates them into your application automatically, the way a moat simultaneously protects your sandcastle and provides a fresh source of wet sand to build with.

In an ideal world, there would be a solution for each step of the dependency-updating process that could:

  • Help you identify outdated dependencies and potential vulnerabilities as they occur

  • Recommend fixes for these vulnerabilities

  • Identify whether proposed fixes introduce code-breaking changes

  • Propose automatically generated fixes that can optionally be integrated seamlessly (sketched just after this list)

  • Refactor your application’s own code in cases where implementing an update would break something
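
To make the fix-related bullets concrete: at the dependency level, an automatically generated fix is usually just a version bump in the project’s pom.xml, and the hard part is verifying that your own code still compiles and behaves correctly against the new version. The coordinates and versions below are purely illustrative.

    <!-- pom.xml (illustrative coordinates and versions only) -->
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <!-- an automated fix would bump this version, e.g. from 2.13.0 to 2.15.2,
           and then verify that nothing in your code breaks -->
      <version>2.15.2</version>
    </dependency>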

For the first two steps, Software Composition Analysis (SCA) tools and dependency file scanners can tell you which of your dependencies are outdated and suggest fixes. However, most repo owners wouldn’t want to integrate fixes continuously or automatically unless they could also guarantee that these fixes wouldn’t break anything. This is why the later steps can currently only be done manually.
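
As a rough illustration of what those first two steps look like for a Java Maven project today, two widely used examples of such scanners (not the tool described in this post) are the versions-maven-plugin and the OWASP dependency-check plugin:

    # List dependencies that have newer versions available
    mvn versions:display-dependency-updates

    # Scan dependencies for known vulnerabilities (CVEs)
    mvn org.owasp:dependency-check-maven:check

Both produce reports that you still have to read and act on by hand, which is exactly where the manual part of the process begins.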

But if it were possible to automate this process from start to finish, with automatically generated and verified fixes, the entire challenge of trying to outbuild the waves of updates would be eliminated; everyone could keep their project dependencies fully up-to-date without having to think about it. Updates could be done daily, instead of monthly at best, and applications would stay as secure as possible.

This is the problem we hope to solve with our brand-new tool, which aims to automate every one of these steps. Currently, we can identify outdated dependencies and recommend fixes for Java Maven repositories without requiring any login or installation, and our AI for code is learning how to help you keep everything running when new updates are implemented.

Try it out with your repository and stay tuned to learn about upcoming features!