Code Guard app

The Code Guard app reports on used standard vs. custom code, used vs. unused custom code, used and unused code by Application Area, and the quality of used custom code.

A user with LiveCompare Editor privileges must prepare this app by running the Create Object Links Cache workflow for the ‘Analysis System’ RFC Destination, and making sure that performance history data is available for the ‘Performance History System’ RFC Destination. See the ‘Prerequisites’ section below for details.

DevOps categories

Development, Operations.

Parallel impact analysis

You may run the Code Guard app in parallel with other impact analysis apps and workflows. See here for details.

Prerequisites

Before running the Code Guard app for the first time, you will need to run the Create Object Links Cache workflow from the Prerequisites package to create an object links cache database for the system to be analyzed. Check the system’s RFC Destination in the LiveCompare hierarchy first, and select the OLC tab to verify whether an object links cache database has already been created. The Create Object Links Cache workflow may be run incrementally to update the object links cache database with any recent object dependency changes.

You will also need to make sure that performance history data is available on the RFC Destination selected for the ‘Performance History’ system. Select the RFC Destination in the LiveCompare hierarchy and click the PHD tab. Select the source for performance history data, and if necessary, the number of months of data to retrieve, then click ‘Update Data’. The performance history data may also be retrieved using a schedule. See the Retrieve performance history data help topic for details.

Run the app

To run the Code Guard app, select the app from the Apps screen and create an app variant. Set the ‘Analysis System’ field to the RFC Destination for the system to be analyzed, and the ‘Performance History’ field to the RFC Destination for the system from which performance history data has been obtained. Edit the ‘AAQ Select List’ parameter if required to specify the quality scenarios, categories, and rules to be used in the analysis.

Click ‘Run’. When the variant has completed, its results may be accessed from the App Cockpit screen.

App results

The Code Guard app generates a Dashboard report which includes the following charts:

  • Used Standard vs Used Custom
  • Used vs Unused Custom Code
  • Used by Application Area. The top three used Application Areas are shown, and the remaining used Application Areas are grouped into the ‘Other’ category.
  • Unused by Application Area. The top three unused Application Areas are shown, and the remaining unused Application Areas are grouped into the ‘Other’ category.
  • Quality Score Card Summary
  • ABAP Failures Summary
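The two 'by Application Area' charts keep only the top three areas and fold the rest into 'Other'. As an illustration only (the area names and counts below are made up, not taken from any report), that grouping amounts to:

```python
from collections import Counter

def top_three_plus_other(area_counts):
    """Group per-area counts into the top three areas plus an 'Other' bucket."""
    ranked = Counter(area_counts).most_common()   # sorted by count, descending
    top, rest = ranked[:3], ranked[3:]
    grouped = dict(top)
    if rest:
        grouped["Other"] = sum(count for _, count in rest)
    return grouped

# Hypothetical object counts per Application Area.
counts = {"FI": 120, "MM": 95, "SD": 80, "PP": 30, "HR": 12}
print(top_three_plus_other(counts))
# → {'FI': 120, 'MM': 95, 'SD': 80, 'Other': 42}
```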

The Dashboard report also includes links to the following reports:

Code Guard Details

This Excel report includes the following spreadsheets:

Dashboard

This spreadsheet includes the following charts:

  • Used Standard vs Used Custom
  • Used vs Unused Custom Code
  • Used by Application Area. The top three used Application Areas are shown, and the remaining used Application Areas are grouped into the ‘Other’ category.
  • Unused by Application Area. The top three unused Application Areas are shown, and the remaining unused Application Areas are grouped into the ‘Other’ category.
  • Quality Score Card Summary
  • ABAP Failures Summary

Help (Quality Scorecard Chart)

This spreadsheet lists descriptions for the quality scores used in each of the categories.

Used

This spreadsheet provides details for the objects that were used according to the available performance history data.

Unused

This spreadsheet provides details for the custom objects that were unused according to the available performance history data.
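Once the Code Guard Details workbook has been saved locally, the Used and Unused sheets can be combined to compute an unused-custom-code ratio. This is a minimal sketch: it assumes one row per object in each sheet, and uses small stand-in frames rather than a real workbook.

```python
import pandas as pd

def summarize_usage(used_df, unused_df):
    """Return custom-code usage counts and the unused percentage.

    used_df and unused_df stand in for the 'Used' and 'Unused' sheets of
    the Code Guard Details workbook (one row per object is an assumption).
    """
    used, unused = len(used_df), len(unused_df)
    total = used + unused
    return {"used": used, "unused": unused,
            "unused_pct": round(100 * unused / total, 1)}

# In practice the sheets could be loaded with, for example:
#   sheets = pd.read_excel("CodeGuardDetails.xlsx", sheet_name=["Used", "Unused"])
# Here, small stand-in frames with made-up object names are used instead.
used_df = pd.DataFrame({"OBJECT": ["ZPROG1", "ZPROG2", "ZCLASS1"]})
unused_df = pd.DataFrame({"OBJECT": ["ZPROG9"]})
print(summarize_usage(used_df, unused_df))
# → {'used': 3, 'unused': 1, 'unused_pct': 25.0}
```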

Quality Scorecard

This spreadsheet provides a quality score for each custom object, in each of the analyzed categories.

Rule Break Summary

This spreadsheet provides a summary of the quality rules that were triggered for each custom object.

Rule Break Detail

This spreadsheet provides details for each used custom object that triggered one or more quality rules. The results are filtered so that only rules in the Error class are shown. In this spreadsheet:

  • The LINE_NUMBER column stores the ABAP source code line number in which the violation was detected. For Web Dynpro (WDYN) objects, methods are stored in include files, and the line number refers to the position of the source code line within its method. For other object types, if the INCLUDE column for the violation contains a value, the line number refers to a line in the INCLUDE file.
  • The SEQ column stores the ABAP source code line number in which the violation was detected. For Web Dynpro (WDYN) objects, this column refers to a line number in the object’s INCLUDE file. This column may be used when sorting the Rule Break Detail dataset to display the violations for each object ordered by line number.
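The SEQ-based sort described above can be done directly on the exported sheet. The SEQ column name comes from the report; the object-name column (OBJECT_NAME here), rule IDs, and values are stand-ins for illustration.

```python
import pandas as pd

# Stand-in rows for the Rule Break Detail sheet.
detail = pd.DataFrame({
    "OBJECT_NAME": ["ZPROG1", "ZPROG2", "ZPROG1"],
    "RULE":        ["R010",   "R020",   "R005"],
    "SEQ":         [120,      15,       40],
})

# Sort so that each object's violations appear in source-line order,
# which is what the SEQ column is intended to support.
ordered = detail.sort_values(["OBJECT_NAME", "SEQ"]).reset_index(drop=True)
print(ordered)
```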

Analysis Input Data

This Excel report contains a copy of the input parameters used to produce the app’s Dashboard report. The value of each input parameter is stored in a separate worksheet, which is named after the parameter whose value it contains.
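Because each parameter lives in its own worksheet named after the parameter, the whole report can be read back into a name-to-value mapping in one call. This sketch builds a tiny in-memory stand-in for the workbook (the parameter names and values are made up) and then reads it the same way a saved copy of the report could be read.

```python
from io import BytesIO

import pandas as pd

# Build a stand-in for the Analysis Input Data workbook: one worksheet
# per parameter, named after the parameter (hypothetical names/values).
buf = BytesIO()
with pd.ExcelWriter(buf) as writer:
    pd.DataFrame({"Value": ["DEV100"]}).to_excel(
        writer, sheet_name="Analysis System", index=False)
    pd.DataFrame({"Value": ["PRD100"]}).to_excel(
        writer, sheet_name="Performance History", index=False)
buf.seek(0)

# sheet_name=None loads every worksheet into a dict keyed by sheet name,
# i.e. a mapping from parameter name to its stored value.
params = pd.read_excel(buf, sheet_name=None)
for name, frame in params.items():
    print(name, "=", frame["Value"].iloc[0])
```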

Standard apps