Scientific Resource Management

If you're like most of us (don't deny it), here's how you do resource estimation for IT projects.

You figure out roughly how many bodies you need in which skill sets by glancing at the project description and duration. You let your experience of past projects guide you, and while you're at it, you throw an eye of newt into the cauldron that's boiling out back.

But what if you could get data on application size, technology distribution, the complexity of artifacts, the number of critical structural quality problems in those artifacts, and architectural visibility over the entire application (even as it's evolving)? That data would let you staff much more optimally, taking into account not just the number of bodies needed for the job but the specific skills they need and, perhaps most importantly, the required level of expertise in each skill.
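
To make this concrete, here is a minimal sketch of what metric-driven staffing might look like. Everything in it is an illustrative assumption (the metric names, the thresholds, the productivity constant), not the output of any particular tool:

```python
# Hypothetical sketch: turning application metrics into a staffing estimate.
# The thresholds and productivity figures below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class TechProfile:
    technology: str          # e.g. "Java", "PL/SQL"
    kloc: float              # size of this technology layer
    avg_complexity: float    # mean artifact complexity (e.g. cyclomatic)
    critical_defects: int    # critical structural quality problems found

def required_expertise(profile: TechProfile) -> str:
    """Map complexity and defect density to an expertise level."""
    defect_density = profile.critical_defects / max(profile.kloc, 1)
    if profile.avg_complexity > 20 or defect_density > 2:
        return "expert"
    if profile.avg_complexity > 10 or defect_density > 0.5:
        return "mid"
    return "junior"

def headcount(profile: TechProfile, months: float) -> int:
    """Rough headcount from size, assuming a (made-up) productivity
    of 2 KLOC touched per person-month."""
    person_months = profile.kloc / 2.0
    return max(1, round(person_months / months))

app = [
    TechProfile("Java", kloc=120, avg_complexity=14, critical_defects=95),
    TechProfile("PL/SQL", kloc=40, avg_complexity=25, critical_defects=110),
]

for p in app:
    print(p.technology, headcount(p, months=6), required_expertise(p))
```

The point isn't the specific numbers; it's that once size, complexity, and defect data exist per technology layer, staffing becomes a calculation you can defend rather than a hunch.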

Now imagine an automated system that can give you all of this information. Such a system will increase resource efficiency in two ways:

1. It will increase resource allocation efficiency throughout the project

  • Optimal resourcing at the start when the project is being staffed
  • Resource flexibility during the course of the project: once the project is stable, you can move people in and out without productivity loss because knowledge is transferred faster
  • Resource substitution during the course of the project: with the additional information this automated system provides, you often don't need an expert where previously you needed one. Since labor rates escalate rapidly with expertise level (for example, 4 to 8 to 12 lakhs a month in India for the same skill at increasing expertise levels), each substitution yields substantial savings (see the cost sketch after this list)

2. It will increase resource execution efficiency. The architectural visibility will help you:

  • Find problems faster, fix them faster, and fix them once and for all
  • Sequence activities in the right way to minimize rework and avoid bottlenecks
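
As a back-of-the-envelope illustration of the substitution point above, using the article's example rates (the team compositions are hypothetical):

```python
# Illustrative cost comparison for resource substitution.
# Monthly rates are the article's example figures (lakhs INR per month);
# the before/after team mixes are made up for illustration.

RATES = {"junior": 4, "mid": 8, "expert": 12}  # lakhs/month

def monthly_cost(team: dict) -> int:
    """Total monthly cost of a team given headcount per expertise level."""
    return sum(RATES[level] * n for level, n in team.items())

# Without architectural visibility: hedge by staffing extra experts.
before = {"expert": 3, "mid": 4, "junior": 2}
# With it: substitute two experts with mid-level developers.
after = {"expert": 1, "mid": 6, "junior": 2}

print(monthly_cost(before))  # 3*12 + 4*8 + 2*4 = 76 lakhs/month
print(monthly_cost(after))   # 1*12 + 6*8 + 2*4 = 68 lakhs/month
```

Even in this toy example, swapping two experts for mid-level developers saves 8 lakhs a month; over a year-long project, that compounds into a material line item.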

So the automated system described up front takes the guesswork out of resourcing. You can now staff scientifically instead of being swayed by stressful, decision-corroding factors like how difficult your client is to work with and how good your project manager is.

The question is: do such automated systems exist? The answer is, yes, of course!
