IT: Scorecard Metrics for Applications Development and Maintenance

    #292169

    We are seeking key scorecard metrics for IT services specific to:

    • applications development and testing;
    • maintenance

    Thank you in advance for any assistance you can provide.

    Remember, if you wish to donate a scorecard or other type of template, SIG will sanitize it and return it for your approval before sharing.

    #293341

    INTAKE

    • Number of new requests vendor is asked to evaluate
    • Percentage of requests that get approved
    • Duration to complete an estimate for a new request
    • Percentage of estimates that have to be re-worked
    • (NOTE: Much of the focus in this process is on evaluating the intake process itself, not the vendor. In a mature IT organization, the vendor is not asked to estimate a high number of requests that never get approved; a good vetting process screens out requests that are unlikely to be approved.)
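    To make the arithmetic concrete, here is a rough Python sketch of how these intake KPIs could be computed from a simple request log. The field names and figures are invented for illustration and don't come from any particular tracking tool.

        from datetime import date

        # Hypothetical intake log: one record per request sent to the vendor for estimation.
        requests = [
            {"id": "REQ-001", "submitted": date(2024, 1, 3), "estimate_done": date(2024, 1, 10),
             "approved": True,  "estimate_reworked": False},
            {"id": "REQ-002", "submitted": date(2024, 1, 5), "estimate_done": date(2024, 1, 19),
             "approved": False, "estimate_reworked": True},
            {"id": "REQ-003", "submitted": date(2024, 2, 1), "estimate_done": date(2024, 2, 8),
             "approved": True,  "estimate_reworked": False},
        ]

        total = len(requests)  # new requests the vendor was asked to evaluate
        approval_rate = sum(r["approved"] for r in requests) / total * 100
        rework_rate = sum(r["estimate_reworked"] for r in requests) / total * 100
        avg_estimate_days = sum((r["estimate_done"] - r["submitted"]).days for r in requests) / total

        print(f"Requests evaluated:   {total}")
        print(f"Approval rate:        {approval_rate:.0f}%")
        print(f"Estimate rework rate: {rework_rate:.0f}%")
        print(f"Avg days to estimate: {avg_estimate_days:.1f}")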

    APPLICATION DEVELOPMENT

    • % of projects that meet schedule (NOTE: the scheduling process is typically split into two parts: a timeline for gathering requirements, then a timeline for developing and implementing the system based on those requirements. Asking a vendor to create a single timeline and meet it before requirements are fully captured is not reasonable)
    • % of projects that meet budget
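    For what it's worth, these two delivery metrics reduce to simple counts over closed projects. A minimal sketch along the same lines, again with invented project names, durations, and dollar figures (and, for simplicity, a single delivery timeline per project rather than the two-part split described above):

        # Hypothetical closed-project records; all values are illustrative only.
        projects = [
            {"name": "Portal rewrite",    "planned_days": 120, "actual_days": 118,
             "budget": 250_000, "actual_cost": 262_000},
            {"name": "Billing interface", "planned_days": 60,  "actual_days": 75,
             "budget": 90_000,  "actual_cost": 88_500},
            {"name": "Reporting upgrade", "planned_days": 45,  "actual_days": 45,
             "budget": 40_000,  "actual_cost": 40_000},
        ]

        on_schedule = sum(p["actual_days"] <= p["planned_days"] for p in projects)
        on_budget = sum(p["actual_cost"] <= p["budget"] for p in projects)

        print(f"% of projects meeting schedule: {on_schedule / len(projects) * 100:.0f}%")
        print(f"% of projects meeting budget:   {on_budget / len(projects) * 100:.0f}%")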

    TESTING

    • Number of defects uncovered during unit / system testing (testing that happens BEFORE users test the system)
    • Number of defects uncovered during User Acceptance Testing (often broken down into categories – for example: Severity 1 defects, which make the system unusable or produce invalid results; Severity 2 defects, which break a key use case or force users to find workarounds in the system; Severity 3 defects, which are typos or small errors that won't impact data quality or performance)
    • Number of cycles of testing – assumes there are repeated iterations of testing until the system is defect-free

    POST IMPLEMENTATION

    • Number of defects uncovered during the first 30 days (using the severity levels described above)
    • Number of defects uncovered during the first 90 days
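    The testing and post-implementation metrics above all come down to counting defects by severity and by time window. A minimal sketch, assuming a flat defect log that records the phase, the severity (on the 1/2/3 scale described above), and the date found, plus a hypothetical go-live date:

        from collections import Counter
        from datetime import date

        GO_LIVE = date(2024, 3, 1)  # hypothetical go-live date

        # Hypothetical defect log: phase is "uat" or "post" (post-implementation).
        defects = [
            {"phase": "uat",  "severity": 1, "found": date(2024, 2, 10)},
            {"phase": "uat",  "severity": 3, "found": date(2024, 2, 12)},
            {"phase": "post", "severity": 2, "found": date(2024, 3, 20)},
            {"phase": "post", "severity": 3, "found": date(2024, 5, 2)},
        ]

        uat_by_severity = Counter(d["severity"] for d in defects if d["phase"] == "uat")
        post_defects = [d for d in defects if d["phase"] == "post"]
        within_30 = sum((d["found"] - GO_LIVE).days <= 30 for d in post_defects)
        within_90 = sum((d["found"] - GO_LIVE).days <= 90 for d in post_defects)

        print("UAT defects by severity:", dict(uat_by_severity))
        print(f"Defects in first 30 days: {within_30}")
        print(f"Defects in first 90 days: {within_90}")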

    MAINTENANCE

    • If Maintenance includes minor enhancements, some of the KPIs will be similar to the above, especially around intake. The biggest issue I see in this space is the lack of a filter, which results in a growing backlog of requests
    • Number of changes related to missed requirements
    • Number of changes related to new customer requirements

     

    Scott Glenn | The Hackett Group | [email protected]

    #293343
    Anonymous (Guest)

    I am completing a total revamp of our scorecards and SLAs for two major vendors. My internal business partners put me onto ITIL, which is the gold standard for IT metrics. Here's the site:

    http://www.itlibrary.org/

    Another good starting point:

    http://wiki.en.it-processmaps.com/index.php/ITIL_Key_Performance_Indicators
