Developer be Agile, Developer be Quick; Use Automated Analysis, it Does the Trick

by Jonathan Bloom

All business-critical applications consist of many intertwined components. In Agile Development, these components are built individually in “scrums,” but eventually have to coexist and work together, possibly across many layers (UI, data, business logic). This underscores a fundamental problem with applications created using Agile techniques: How do you ensure that the end product performs reliably and dependably once it reaches the production environment?

Some believe that ensuring software quality requires setting aside the technology and focusing just on the basics – the people and the process. This was the stance Bola Rotibi took recently in her piece for Application Development Trends titled “Want Quality Software? Focus on People and Processes -- Not Technology.” In the article, Rotibi puts the first step toward achieving software quality in very simple terms: “quality people equipped with the right processes are resolute criteria for the timely and successful delivery of quality software.”

She later points out that one of the best examples of “people and process” can be found within Agile Development, noting:

Any organization that has achieved success through Agile development practices will know that people attitude, culture, training, education and management buy-in and support are essential criteria for quality delivery and stakeholder satisfaction.

We generally agree with Rotibi that the tenets of people and process inherent in the ideal Agile environment lay the groundwork for success. Where we differ is on the need for technology – namely automated analysis and measurement – applied at the right point in the process.

The Agile Conundrum

The very nature of Agile programming conspires against software quality. Bits and pieces of functionality that will eventually become interdependent are created and tested separately in different scrums. New functionality is often added on top of old, which further muddies the architectural waters, threatens reliability and performance, and increases the cost to modify and maintain the software. Moreover, as the number of lines of code grows, architectural complexity grows exponentially.

At this point, performance bottlenecks and structural quality lapses become very hard to detect. It becomes very difficult to see and measure the structural quality – a measure of how the architecture and the implementation hang together to ensure reliable, dependable, mission-critical performance – of the application as a whole. Being able to reliably find and fix critical architectural bottlenecks in a rapidly evolving code base is the key to developing high-quality applications using Agile techniques.

Unfortunately, even the most qualified software engineer does not have the requisite insight into the thousands of classes, modules, batch programs, and database objects that need to work flawlessly together in the production environment. Most issues of robustness and performance are not hidden behind one specific artifact of code but exist in the interaction between multiple components created in separate scrums.

You Can’t Hit What You Can’t See

This is where you need to involve technology by introducing a system of automated analysis and measurement, one that provides comprehensive visibility into component interconnections and assesses the structural quality of the application software as a whole, rather than each part individually. This matters because even if each scrum develops a perfect portion of the project, the entire project is doomed if those modules do not interconnect properly. Providing visibility and quantification gives software engineers the information they need to ensure high performance and reliability, no matter how rapidly the code base evolves.
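To make the idea concrete, the sketch below shows one minimal, hypothetical form such analysis could take: it scans a Python source tree, maps each module to an architectural layer by its top-level package, and flags imports that bypass the intended layering (for example, UI code reaching directly into the data layer). The package names ("ui", "logic", "data") and the layering rules are illustrative assumptions, not a description of CAST's platform or any other specific tool.

    # Illustrative sketch only: a toy layer-dependency checker, not any vendor's tool.
    # Assumed layout: top-level packages "ui", "logic" and "data" correspond to layers.
    import ast
    import pathlib

    # Hypothetical layering rule: each layer may only depend on the layers listed here.
    ALLOWED = {
        "ui": {"ui", "logic"},       # UI may call business logic, never the data layer directly
        "logic": {"logic", "data"},  # business logic may call the data layer
        "data": {"data"},            # data layer depends on nothing above it
    }

    def layer_of(module):
        """Map a dotted module name to its architectural layer via its top-level package."""
        top = module.split(".")[0]
        return top if top in ALLOWED else None

    def imports_in(path):
        """Collect the modules a single source file imports."""
        tree = ast.parse(path.read_text(), filename=str(path))
        found = set()
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                found.update(alias.name for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                found.add(node.module)
        return found

    def check(root):
        """Walk a source tree and report imports that violate the layering rules."""
        violations = []
        root_path = pathlib.Path(root)
        for path in root_path.rglob("*.py"):
            src_module = ".".join(path.with_suffix("").relative_to(root_path).parts)
            src_layer = layer_of(src_module)
            if src_layer is None:
                continue
            for target in imports_in(path):
                dst_layer = layer_of(target)
                if dst_layer and dst_layer not in ALLOWED[src_layer]:
                    violations.append(f"{src_module} ({src_layer}) -> {target} ({dst_layer})")
        return violations

    if __name__ == "__main__":
        for v in check("src"):
            print("Layering violation:", v)

A production-grade platform does this across many languages and tiers at once, resolving not just imports but calls into databases, frameworks, batch jobs and interfaces; the sketch only conveys the shape of the idea – an automated, whole-application view of how components connect.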

CAST’s Application Intelligence Platform does what the human eye cannot do. It can read, analyze and semantically understand most kinds of source code, including scripting and interface languages, 3GLs, 4GLs, Web and mainframe technologies, across all layers of an application (UI, logic and data). By analyzing all tiers of a complex application, CAST measures quality and adherence to architectural and coding standards, while providing real-time system blueprints.

So yes, producing high quality software via Agile IS about people and process, but it is ALSO about technology!

Jonathan Bloom – Writer, Blogger & PR Consultant
Jonathan is an experienced writer with over 20 years writing about the Technology industry. Jon has written more than 750 journal and magazine articles, blogs and other materials that have been published throughout the U.S. and Canada. He has expertise in a wide range of subjects within the IT industry including software development, enterprise software, mobile, database, security, BI, SaaS/Cloud, Health Care IT and Sustainable Technology. In his free time, Jon enjoys attending sporting events, cooking, studying American history and listening to Bruce Springsteen music.