The Software Intelligence Blog

Tag: automated analysis and measurement

  • CI/CD DevOps: Enhancing Continuous Delivery with Software Intelligence

    Software today is more complex than it has ever been. New technologies emerge rapidly, and as applications evolve to use them, gaps occur. Some gaps result in “technical debt,” an industry term for the additional work created when development practices fall short of ideal craftsmanship.
  • CAST Releases Application Intelligence Platform (AIP) 8.1

    CAST is pleased to announce the release of AIP 8.1, a continuation of the big step forward made in AIP 8.0. AIP 8.1 extends the functionality of the Application Intelligence Platform to provide greater technology support, improved reporting and new code-viewing capabilities in the Application Engineering Dashboard (AED).

     Java 8 Support

    Java 8 is quickly being adopted by Java developers. CAST now fully supports Java 8 and can help you find flaws linked to the use of the very popular Java 8 lambda functions, among others.
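
    The excerpt does not say which lambda-related flaws the analysis targets, so purely as an illustration, here is a minimal Java sketch of one common pitfall a static analyzer might flag: a lambda (here a method reference) passed to a parallel stream that mutates shared, non-thread-safe state. The class and variable names are hypothetical.

        import java.util.ArrayList;
        import java.util.List;
        import java.util.stream.Collectors;
        import java.util.stream.IntStream;

        public class LambdaFlawExample {
            public static void main(String[] args) {
                // Flawed pattern: the method reference mutates a shared,
                // non-thread-safe list from a parallel stream, so the result
                // is unpredictable and may even fail at runtime.
                List<Integer> unsafe = new ArrayList<>();
                IntStream.range(0, 1_000).parallel().boxed().forEach(unsafe::add);

                // Safer equivalent: let the stream build the collection itself,
                // keeping the pipeline free of side effects.
                List<Integer> safe = IntStream.range(0, 1_000)
                        .boxed()
                        .collect(Collectors.toList());

                System.out.println("unsafe size (may vary): " + unsafe.size());
                System.out.println("safe size: " + safe.size());
            }
        }

    Keeping lambdas side-effect free, as in the second version, is exactly the kind of property an automated structural check can verify at scale.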

  • QAI QUEST: Fixing Quality Issues with Automated Code Review

    Recently I had the pleasure of speaking at QAI QUEST 2016, which showcases the latest techniques for software quality measurement and testing. It was a content-rich program with more than three days of diving deep into topics like DevOps, open source, security, mobile, and more. But what struck me most, above all the event chatter, is that even the brightest of companies are still having a difficult time identifying and fixing code quality errors.

  • CAST User Group on Function Point Analysis: Key Findings

    On April 6th, CAST held a user group meeting on the topic of function point analysis and software productivity measurement. The meeting gathered more than 20 software measurement professionals from major companies in the banking, IT consulting, telecom, aviation and public sectors for a two-hour working session to discuss the benefits of function point analysis.

    The event featured presentations including:

    1. An IBM case study on how IBM worked with CAST to integrate and secure an Automated Function Point (AFP) approach with a major player in the aeronautics sector within TMA Systems
    2. A functional sizing case study
    3. Updates on the new CISQ standards for Automated Function Points
    4. The importance of internal and external benchmarking
  • Software Risk Management: Risk Governance in the Digital Transformation

    Software Risk Management in Digital Transformation was the focus during the 4th edition of the Information Technology Forum, hosted by the International Institute of Research (IIR). Massimo Crubellati, CAST Italy Country Manager, discussed how Digital Transformation processes are changing the ICT scenario and why software risk management and prevention is mandatory.


    Massimo shared our recipe for evolving Digital Governance: include a dedicated ICT Risk chapter in the design of the digital transformation’s governance structure, the most important aspect of which is determining which methods and key performance indicators will be used to measure the operational risk inherent in the application portfolio. Measurement needs to be continuous and structural, and it must assess the inherent weaknesses of application assets by analyzing the correlations between the layers that compose them. The result is not only effective prevention of direct damage, ensuring service resilience, but also a reduction in maintenance and application management costs.

  • IT Leaders Address the Value of Software Measurement & Government Mandates Impacting Development

    IT leaders from throughout the federal government discussed how software measurement can positively impact their development processes at CAST’s recent Cyber Risk Measurement Workshop in Arlington, VA, just outside Washington, D.C. The event brought together more than 40 IT leaders from several government agencies, including the Department of Defense and the Department of State, as well as system integrators and other related organizations. The group shared how their respective organizations are driving value to end users and taxpayers.

  • Software Risk: Executive Insights on Application Resiliency

    Software risks to the business, specifically Application Resiliency, headlined a recent executive roundtable hosted by CAST and sponsored by IBM Italy, ZeroUno and the Boston Consulting Group. European IT executives from the financial services industry assembled to debate the importance of mitigating software risks to their business.

  • The Rule of Three: NYSE, UAL, and WSJ Operations Foiled by Their Own Systems

    The events of last Wednesday proved that things often do come in threes. The “rule of three” reared its ugly head, as technical failures occurred at three large American organizations: the New York Stock Exchange, United Airlines, and The Wall Street Journal. United Airlines grounded all flights nationwide, wasn't able to conduct background checks of passengers, and left flight attendants handwriting tickets (many of which were not accepted by TSA agents). Then, the NYSE suspended trading for almost four hours, the first time in a decade that trading was halted during regular business hours. The Wall Street Journal's homepage also faced difficulties and was offline for almost an hour.

  • Function Points Analysis: On Point at Federal Productivity Workshop

    In business, measurement is key. It’s not a new concept, of course, but information technology now allows it to be implemented to a higher degree than ever before. Function point analysis is one of those areas where, like initiatives such as Six Sigma, the ability to measure can help ensure ultimate success.

  • Is Application Security Risk a Result of Outsourcing?

    There’s a common belief in the software development space that when companies choose application outsourcing of their projects, the control they relinquish by doing so results in lower application quality and puts their projects at risk. Once again, however, CAST’s biennial CRASH Report, which reviews the structural quality of business critical applications, has disproved this theory.

  • Agile-Waterfall Hybrid Best for Structural Quality According to CRASH Report Findings

    For the last half-decade, a debate has raged over which project management method reigns supreme – Agile or Waterfall. To determine which held the advantage, some looked at the management techniques and the fluidity with which projects were completed, while others judged the debate by pointing to the structural quality of the applications being developed.

  • Making Software Quality the First Measure of Software Security

    Reading the news these days, one would think that software security is something layered on top of existing software systems. The truth, however, is that software security needs to be woven into the very fabric of every system, and this begins with eliminating vulnerabilities by measuring software quality as the system is built.

    During the CAST Software Quality Fall Users Group, Dr. Carol Woody, a senior member of the technical staff at the Software Engineering Institute (SEI) at Carnegie Mellon University whose research focuses on cyber security engineering, discussed the importance of software quality as a basis for security.

  • Automated Function Points Provide Data-Driven Captives Management

    Last month in this space I wrote about the importance of optimizing the cost-effectiveness of Captives (i.e., Global In-House Centers) by setting metrics and enhancing process transparency for better management of them. For these management methods to work, though, an organization needs to employ automated function points as a way to gain insight into current costs and supplied value, which can then be used to enhance the output received from current or future providers.

  • Digital Transformation Keeps Software Complexity from Becoming a CIO’s Legacy

    They say “if something works, don’t fix it.” This old adage may be why some organizations hold onto legacy systems longer than they should, but it is also why these same organizations struggle with software complexity. In fact, according to the GAO, the federal government spends 80 percent of its $86.4 billion IT budget on legacy systems.

  • VIDEO: IT Expert Calls Upon Automated Function Points for Vendor Management

    Barbara Beech, an expert in the field of IT development for telecommunications companies, recently spoke to CAST in a video chat about her experience using software analysis and measurement as well as automated function points to gain visibility into IT vendor deliverables.

    As a solution to gaining visibility into IT vendor deliverables, Beech points to the CAST Automated Function Points (AFP) capability – an automated function point counting method based on rules defined by the International Function Point Users Group (IFPUG). CAST automates the manual counting process by using structural information retrieved from source code analysis, database structures and transactions.
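
    As a rough illustration of the arithmetic behind IFPUG-style counting (not of how CAST AFP itself is implemented), the Java sketch below applies the standard IFPUG complexity weights to a set of hypothetical component counts; an automated counter derives those counts from the analyzed code and database structure rather than hard-coding them.

        import java.util.EnumMap;
        import java.util.Map;

        public class FunctionPointSketch {

            enum Component { EI, EO, EQ, ILF, EIF }   // transactional and data functions
            enum Complexity { LOW, AVERAGE, HIGH }

            // Standard IFPUG weights per component type, indexed by complexity level.
            static final Map<Component, int[]> WEIGHTS = new EnumMap<>(Component.class);
            static {
                WEIGHTS.put(Component.EI,  new int[]{3, 4, 6});
                WEIGHTS.put(Component.EO,  new int[]{4, 5, 7});
                WEIGHTS.put(Component.EQ,  new int[]{3, 4, 6});
                WEIGHTS.put(Component.ILF, new int[]{7, 10, 15});
                WEIGHTS.put(Component.EIF, new int[]{5, 7, 10});
            }

            static int weight(Component c, Complexity x) {
                return WEIGHTS.get(c)[x.ordinal()];
            }

            public static void main(String[] args) {
                // Hypothetical component counts for a small application.
                int ufp = 0;
                ufp += 12 * weight(Component.EI,  Complexity.AVERAGE);  // external inputs
                ufp +=  8 * weight(Component.EO,  Complexity.LOW);      // external outputs
                ufp +=  5 * weight(Component.EQ,  Complexity.AVERAGE);  // external inquiries
                ufp +=  4 * weight(Component.ILF, Complexity.AVERAGE);  // internal logical files
                ufp +=  2 * weight(Component.EIF, Complexity.LOW);      // external interface files
                System.out.println("Unadjusted function points: " + ufp);
            }
        }

    The value of automation lies not in this arithmetic, which is trivial, but in identifying and classifying the components consistently from the code itself.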

  • IT RISK MANAGEMENT: A Conversation with BCG’s Benjamin Rehberg

    Benjamin Rehberg, Partner and Managing Director at the Boston Consulting Group and former consultant for IBM Global Business Services, discusses the importance of both IT risk management and application portfolio management (APM) in a video conversation with CAST. He looks at the challenges facing IT leaders and the need for software measurement, and discusses how IT transformation can improve business operations.

  • Five Reasons You MUST Measure Software Complexity

    There’s an old adage in the IT industry: you can’t manage what you can’t measure. Knowing how complex an organization’s application portfolio is provides insight into how best to manage it. The problem is that the issues that create software complexity – legacy system remnants, antiquated code, overwritten and rewritten code, the integration of formerly proprietary applications, and so on – are the same things that make it difficult to measure.

    With multiple system interfaces and complex requirements, the complexity of software systems sometimes grows beyond control, rendering applications and portfolios too costly to maintain and too risky to enhance. Left unchecked, software complexity can run rampant in delivered projects, leaving behind bloated, cumbersome applications. In fact, Alain April, an expert in the field of IT maintenance, has stated, “the act of maintaining software necessarily degrades it.”
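
    No single metric is named here, so purely for illustration, the sketch below approximates one classic measure, McCabe’s cyclomatic complexity, by counting decision points in a method’s source text; a production analyzer works on a parsed syntax tree across a whole portfolio, and the method shown is hypothetical.

        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class ComplexitySketch {

            // Decision points: branching keywords plus short-circuit and ternary operators.
            private static final Pattern DECISION_POINTS =
                    Pattern.compile("\\b(if|for|while|case|catch)\\b|&&|\\|\\||\\?");

            // Cyclomatic complexity is roughly the number of decision points plus one.
            static int approximateComplexity(String methodSource) {
                Matcher m = DECISION_POINTS.matcher(methodSource);
                int decisions = 0;
                while (m.find()) {
                    decisions++;
                }
                return decisions + 1;
            }

            public static void main(String[] args) {
                String method =
                        "int clamp(int v, int lo, int hi) {\n"
                        + "    if (v < lo) return lo;\n"
                        + "    if (v > hi) return hi;\n"
                        + "    return v;\n"
                        + "}\n";
                // Two if statements -> complexity 3.
                System.out.println("Approximate cyclomatic complexity: "
                        + approximateComplexity(method));
            }
        }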

  • Top 5 Reasons to Use Code Analysis Tools with Automation to Establish Vendor Management Metrics

    As IT organizations face growing demands from the business, their IT systems have become increasingly complex. Today’s applications are typically a heterogeneous web of systems and software from an array of vendors and custom development.

  • Closing the Back Door through Code Analysis

    Have you performed code analysis on your software recently? If not, you are in good company as many companies are failing to do the one thing that could improve their software security – making sure the software isn’t vulnerable to an attack to begin with.

  • 5 Keys to Optimizing Cost-Effectiveness of Captives

    Companies seeking to reduce time to market while improving application quality today usually choose between assigning application development projects to in-house teams or to outsourced system integrators (SIs). However, the cost arbitrage of Global In-House Centers (GIC), better known in the industry as “Captives,” continues to provide advantages in cost competitiveness that cannot be overlooked.