Learning to survive in the new normal of IT Program Management

The sequestration has hit a lot of organizations hard, and IT-intensive programs aren't ducking the proverbial bullet. In the decade since 9/11, organizations had more money and resources to give development teams to fix their applications' performance issues. But now that the nation is trying to fix its fiscal woes, every day and dollar counts.

It's more important than ever to catch mistakes that can impact testing time and production stability early in the development cycle, so precious time and money aren't squandered making late corrections through rework. No longer do we have the flexibility in budget, staff, or systems to compensate for the development mistakes and errors accumulated over the lifecycle of enterprise-class software and IT systems.

I’ll be speaking more about this topic in a webinar, The New Normal in Government IT Program Management, on April 10th with Bill Curtis from CAST. Click here to register and find out how you can keep your federal IT system from finding its way to the chopping block. Or you can read more about it in a release we put out today.

What do I need to be prepared for?

For organizations not prepared for this New Normal of application development, there is potential for their applications to enter the "program death spiral." There is more pressure than ever for program managers to hit cost, schedule, and performance targets; otherwise their programs will receive extensive scrutiny that could lead to further funding reductions or, ultimately, termination.

Once a program starts having problems, the ‘buzzards start circling’ in the form of other programs that are doing well and need more resources. Instead of dumping money into a “problem program” to fix it, it’s possible that the program could be sacrificed so that the resources can be reallocated to programs that are doing better. This is new territory for many program managers.

The application development component of a program isn't immune to oversight

Just like every other part of the federal government, program managers face increasing amounts of oversight of their systems, processes, and results. Now, if a program manager's app isn't performing to expectations, it gets the attention not only of management but also of the offices that provide oversight. The app will then be further scrutinized by 'experts,' and there's a real chance the program's funding could be cut. It's much harder to justify the cost of fixing an application when it has already entered a death spiral.

Carefully balance tools and time to execute efficiently

I know what you're thinking: "What kind of balancing act am I going to have to do to keep my applications from entering this 'death spiral'?" It's not so much a question of balance as a question of smart program execution. Program managers are caught in a fresh round of the age-old problem of doing more with less. You have to do a lot of things, and you have to do them all very well.

Program managers need to pay attention to the software quality metrics their applications produce during development so they can effectively identify and manage risk. The automation exists to catch design, coding, and integration errors and to fix them in cycle to keep development within cost, schedule, and performance targets.
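In practice, "paying attention to quality metrics" often takes the form of an automated quality gate that fails a build when defect counts exceed agreed limits, forcing errors to be fixed in-cycle. Here is a minimal sketch of that idea; the metric names and thresholds are hypothetical, and a real program would pull these numbers from its static-analysis or quality-measurement tool's report.

```python
# Hypothetical quality gate: metric names and limits are illustrative,
# not taken from any specific tool.
THRESHOLDS = {
    "critical_violations": 0,    # structural/security flaws: none allowed
    "major_violations": 25,      # e.g., error-handling or performance issues
    "code_duplication_pct": 10,  # duplicated code as a percent of the codebase
}

def quality_gate(metrics):
    """Return the names of metrics that exceed their thresholds.

    An empty list means the build passes the gate; anything else should
    be fixed in-cycle, before the program reaches acceptance testing.
    """
    return [
        name for name, limit in THRESHOLDS.items()
        if metrics.get(name, 0) > limit
    ]

# Example: one out-of-bounds metric flags the build for in-cycle rework.
report = {"critical_violations": 2, "major_violations": 12, "code_duplication_pct": 8}
print(quality_gate(report))  # ['critical_violations']
```

Wiring a check like this into each development cycle is what makes the risk visible early, while fixes are still cheap, rather than at final acceptance test.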

Those who ignore quality defects -- not measuring against known industry benchmarks and best practices -- can find themselves going into a final acceptance test only to discover their system is not stable or hitting performance targets. The result: the system fails the test, and they have to go back, fix their mistakes, and start the process over again -- a time-consuming and costly rework process that makes everyone look bad.

This is not simply about balancing development time and cost. This is about ongoing technical risk and due diligence throughout the development lifecycle -- catching quality errors and fixing them where it is most cost effective.

Equip the industry with approved tools and policy changes

This is a perfect time for policy leaders at various government headquarters to offer some guidance and direction to the system development and sustainment program management teams. Program managers may not be aware of the tools at their disposal to ensure their programs are delivering the way they should.

The real challenge will be getting large federal organizations (DoD, DHS, etc.) to institute a broad policy requiring program managers to employ quality tools to collect metrics and report on them. Further complicating the issue, they'll then have to communicate to their entire agency which enterprise tools are approved and available for program managers to utilize.

Once the policy and vetting steps are complete, leadership can follow up with a memo saying, 'use any of the approved tools, take action on the key findings, and report out on quality metrics to ensure that your programs execute on their cost, schedule, and performance targets.'

In this new world of application development, measuring "quality" may not seem like a "must do." But think about it this way: what you are really managing is risk -- because the minute your application underperforms, or your program misses a milestone, it could be next in line for the chopping block, along with your next promotion!

You can hear about this and more in my upcoming webinar, The New Normal in Government IT Program Management, taking place on April 10, 2013. Register here and ensure your applications stay out of the program death spiral.

Gary Winkler is Founder and President of Cyber Solutions & Services, which provides government and industry organizations with value-based, leading-edge IT/cyber solutions and services. Gary was previously with the U.S. Army for more than 25 years, ultimately leading the Program Executive Office, Enterprise Information Systems (PEO EIS). In this role, he led a $4 billion portfolio of large-scale information technology systems, applications, and communications infrastructure programs. He was honored as GCN's Defense Agency Executive of the Year in 2010.

Filed in: CAST News