Software Knowledge Transfer and Overcoming the Brain Drain

Mar 14, 2024 | IT Thought Leadership

Every organization is built on institutional knowledge. That could be trade secrets, essential product information, knowledge about customer relationships, or even where the coffee is stored for the break room. Every employee holds information relevant to business operations, and when an employee leaves, that institutional knowledge goes with them.

Capturing institutional knowledge about the mechanics of custom-built software has its own unique challenges. No two programmers think exactly alike, so documenting software development and coding changes is essential to keep products running. When a developer leaves or is let go, the remaining development team must rely on notes and documentation to understand the departed developer's code. As applications age, supporting legacy software becomes increasingly difficult, since the original development team is likely long gone.

When an organization suffers a disruptive loss of programmer knowledge, software is sure to break if the remaining developers lack easy access to knowledge about all application elements and their dependencies. The recent upheaval at Twitter is a good example. After 15 years, the Twitter code base has become very complex, and the ongoing loss of technical personnel to layoffs and voluntary departures has left the company without the staff or expertise to maintain its software. One Twitter executive said that, because of the institutional knowledge lost, any new code will likely lead to global outages, since the existing engineering team lacks expertise in legacy Twitter systems.

Dealing with Technical Debt

74% of organizations lack a formal method of capturing and retaining institutional knowledge, including technical knowledge. Experts estimate that Fortune 500 companies lose $31.5 billion a year to failures to share information.

The problem is particularly acute in software environments. Research shows that 33% of companies surveyed cite legacy software and the lack of skilled talent to modernize applications as the biggest impediments to digital transformation. Engineers spend hours each week just trying to comprehend how the software works so they can assess the impact of even the smallest change. Maintaining legacy applications is made harder still by the fact that aging software is often cobbled together from multiple programming languages, frameworks, and database technologies.

Custom-built applications are a big part of the problem, and it takes time to bring a new developer up to speed on company applications. On the job, developers spend about 5% of their time writing new code, 20% updating legacy code, and 60% understanding the code already in a project.

The challenge of software maintenance becomes more acute when you factor in the lack of qualified developers. According to IDC, the shortage of qualified developers will increase from 1.4 million in 2021 to 4.0 million by 2025. That means fewer developers will have to do more work, which means working smarter, not harder. It also means developers will increasingly rely on automated software intelligence to shorten development time and increase code quality.

Software Decoding Software

New software intelligence technologies are evolving to make it easier to reverse engineer and "understand" custom software and legacy applications. Rather than spending hours or days deciphering someone else's code, these software intelligence technologies automate knowledge extraction, drilling down to create a blueprint of an application's inner workings.
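To make the "blueprint" idea concrete, here is a minimal sketch of one building block of such tooling: statically scanning source code to map which modules depend on which, without ever executing the application. The sample modules are hypothetical, and a real software intelligence product would go far deeper (call graphs, data flows, cross-language links); this only illustrates the principle.

```python
# Minimal sketch: statically extract a module-level dependency blueprint
# from Python source using the standard-library ast module.
# The "legacy" codebase below is a made-up example, not a real system.
import ast
from collections import defaultdict

def extract_imports(source: str) -> set[str]:
    """Return the top-level module names that a piece of source imports."""
    tree = ast.parse(source)
    deps = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            deps.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            deps.add(node.module.split(".")[0])
    return deps

def build_blueprint(modules: dict[str, str]) -> dict[str, set[str]]:
    """Map each module name to the set of modules it depends on."""
    graph = defaultdict(set)
    for name, source in modules.items():
        graph[name] = extract_imports(source)
    return dict(graph)

# Hypothetical legacy codebase, inlined for the example.
legacy = {
    "billing": "import db\nfrom reports import summary",
    "reports": "import db",
    "db": "import sqlite3",
}
print(build_blueprint(legacy))
```

Because the analysis is purely static, it works even when the original authors are gone: the dependency map is recovered from the code itself rather than from anyone's memory.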

With software intelligence, organizations can worry less about knowledge transfer and software complexity. Rather than relying on sparse documentation and watercooler chats with subject matter experts, software intelligence technology provides an accurate map of the application as it exists today, one that anyone can navigate on their own to find the answers they need in minutes rather than spinning their wheels for days or weeks at a time.

A common challenge fueled by the drive for digital transformation is application modernization. Enterprise software originally written to run in a data center needs to be made easier to change, and is often ported to the cloud to cut costs and take advantage of scalability and accessibility. Developers need to figure out how best to modernize or refactor each application, evaluating it for cloud readiness: whether it can run in the new environment at all, and how it fares on interoperability, resiliency, resource efficiency, security, and integration.

Software intelligence technology can be applied in this scenario too: assessing cloud readiness, recommending the best use of cloud-native services, and identifying open-source and third-party components that could pose intellectual property or security risks. Based on this intelligence, IT executives and developers can determine whether it is more cost-effective to refactor, rebuild, or retire an application. After an initial lift-and-shift, they can continue to optimize applications for their new cloud environment and realize the cloud benefits they were after in the first place that much sooner.
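The refactor/rebuild/retire decision described above can be sketched as a simple scoring exercise. The criteria, weights, and thresholds below are illustrative assumptions only, not any vendor's actual scoring model; in practice the scores would come from the automated assessment rather than being typed in by hand.

```python
# Hedged sketch of turning assessment findings into a refactor / rebuild /
# retire recommendation. All criteria and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class AppAssessment:
    name: str
    runs_in_cloud: bool      # can it run in the target environment at all?
    interoperability: int    # 0-10 scores from an automated assessment
    resiliency: int
    resource_efficiency: int
    security: int
    business_value: int      # how much the business still needs it

def recommend(app: AppAssessment) -> str:
    if app.business_value <= 2:
        return "retire"      # not worth any modernization spend
    readiness = (app.interoperability + app.resiliency +
                 app.resource_efficiency + app.security) / 4
    if app.runs_in_cloud and readiness >= 6:
        return "refactor"    # adapt incrementally after a lift-and-shift
    return "rebuild"         # cheaper to rewrite than to untangle

legacy_erp = AppAssessment("legacy-erp", True, 7, 6, 5, 8, 9)
print(recommend(legacy_erp))  # refactor
```

The value of the software intelligence layer is that it supplies these scores from analysis of the code itself, so the decision rests on evidence rather than on the recollections of whoever last touched the application.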

Keeping pace with changing technology becomes a greater challenge if you no longer have the in-house expertise to maintain existing systems. Corporations are feeling increased pressure to stay current with the latest enterprise solutions. With the ongoing shortage of technical talent, programmers will increasingly rely on automated software intelligence capabilities to shorten development time and safely upgrade legacy solutions to meet new business demands.

This article originally appeared in DevOps Digest.