Developer Impact Analysis

The Developer Impact Analysis workflow performs a continuous impact analysis for developer changes on the specified SAP system. Transportable tasks are obtained from the E070 table on the analysis system; these provide the set of changing objects to be used by the analysis. If an SAP Solution Manager System is provided, the workflow’s results include the ChaRM ID associated with each task.
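As an illustration of this selection step, the sketch below filters E070-style rows down to the unprocessed, released tasks. The field names (TRKORR, STRKORR, TRSTATUS) are standard E070 columns, but the selection criteria and the helper itself are illustrative assumptions, not LiveCompare's actual query:

```python
# Sketch only: the status values and selection rules are assumptions.
def select_tasks(rows, processed):
    """Return task IDs from E070-style rows not yet handled by a previous run."""
    tasks = []
    for row in rows:
        is_task = bool(row.get("STRKORR"))     # tasks carry a parent request ID
        released = row.get("TRSTATUS") == "R"  # assume 'R' marks a released task
        if is_task and released and row["TRKORR"] not in processed:
            tasks.append(row["TRKORR"])
    return tasks

rows = [
    {"TRKORR": "DEVK900101", "STRKORR": "DEVK900100", "TRSTATUS": "R"},
    {"TRKORR": "DEVK900102", "STRKORR": "DEVK900100", "TRSTATUS": "D"},
    {"TRKORR": "DEVK900100", "STRKORR": "",           "TRSTATUS": "D"},
]
print(select_tasks(rows, processed={"DEVK900099"}))  # ['DEVK900101']
```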

If there are any objects with ABAP changes to analyze, the workflow generates an Excel report which is emailed to the specified recipients. The report includes:

  • The objects, screens and users impacted by each changing object. A maximum of 10 impacted objects are reported for each changing object.

  • The most-at-risk objects that are recommended for testing.

A code quality analysis is performed for each impacted changing object that has associated ABAP code. By default, for includes (INCL), functions (FUNC) and programs (PROG), the code quality analysis is performed only for lines of code that have changed between the current version of the object and the previous version.
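The notion of "lines that have changed between two versions" can be pictured with a standard diff, as in this Python sketch (the helper is hypothetical and not part of LiveCompare):

```python
import difflib

def changed_lines(previous, current):
    """Return 1-based line numbers in `current` that differ from `previous`."""
    matcher = difflib.SequenceMatcher(None, previous, current)
    changed = []
    for tag, _i1, _i2, j1, j2 in matcher.get_opcodes():
        if tag in ("replace", "insert"):  # lines added or rewritten
            changed.extend(range(j1 + 1, j2 + 1))
    return changed

prev = ["REPORT zdemo.", "WRITE 'old'.", "ENDFORM."]
curr = ["REPORT zdemo.", "WRITE 'new'.", "WRITE 'extra'.", "ENDFORM."]
print(changed_lines(prev, curr))  # [2, 3]
```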

If any impacted changing objects have ABAP Unit tests in the specified system, the tests are run and their results are included in the workflow’s Excel report.

The workflow should be scheduled to run periodically. During each run, an Impact Analysis is performed for the transportable tasks from the analysis system that have not yet been processed.

By default, the Developer Impact Analysis workflow reports the following object types as used, impacted or most-at-risk.

Object Type   Description
CLAS          Class
FUNC          Function Module
IDOC          IDoc
PROG          Program
SSFO          Smart Form
TCOD          Transaction
WAPA          BSP Application

However, a user with Administrator privileges may customize these defaults as follows.

  1. Select the Administration > Configuration > Impact Analysis folder in the LiveCompare hierarchy.

  2. In the TypesToFind section, deselect the object types to be excluded from the used, impacted and most-at-risk results. Note that if no items are selected, all the above object types will be used as the default.

  3. Click ‘Save’ to save your changes.

These changes affect all subsequent runs of the Developer Impact Analysis workflow, but they do not affect any existing workflow results.
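The fallback behavior described in step 2 above, where an empty selection means all default object types are used, can be expressed as a small sketch (the helper name is illustrative, not a LiveCompare API):

```python
# The seven default object types documented above.
DEFAULT_TYPES = {"CLAS", "FUNC", "IDOC", "PROG", "SSFO", "TCOD", "WAPA"}

def effective_types(selected):
    """Mirror the documented fallback: an empty selection means all defaults."""
    return set(selected) or set(DEFAULT_TYPES)

print(sorted(effective_types([])))                # all seven defaults
print(sorted(effective_types(["PROG", "TCOD"])))  # ['PROG', 'TCOD']
```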

The workflow populates the Developer chart in the LiveCompare Dashboard.

Parallel impact analysis

You may run the Developer Impact Analysis workflow in parallel with other impact analysis apps and workflows. See the parallel impact analysis documentation for details.

Prerequisites

The Developer Impact Analysis workflow uses a Pipeline to identify the Analysis, Comparison, Usage and SAP Solution Manager systems to be used in the analysis. Before the Developer Impact Analysis workflow is run, you must create a Pipeline that includes the appropriate systems. In the Pipeline:

  • The Analysis System field stores the RFC Destination for your Analysis system.

  • The Comparison System field stores the RFC Destination for the system on which to compare the most-at-risk executables. Note that no comparison results are included in the workflow’s reports if a comparison system is not specified.

  • The Usage System field stores the RFC Destination for the system from which performance history data has been obtained.

  • If required, the SAP Solution Manager System field stores the RFC Destination from which to retrieve the ChaRM IDs associated with each transportable task.

The Create Object Links Cache workflow from the Prerequisites templates folder should be run to create an object links cache database for the Analysis system.

Make sure that performance history data is available on the Usage system. Select the system’s RFC Destination in the LiveCompare hierarchy and click the PHD tab. Select the source for performance history data and, if necessary, the number of months of data to retrieve, then click ‘Update Data’. The performance history data may also be retrieved using a schedule.

The ChangingObjectsToIgnore External Data Source removes tables whose names begin with ENH or AGR from the set of changing objects. If required, this External Data Source may be edited in LiveCompare Studio using the ‘Replace Data File’ option.
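The prefix filtering that this External Data Source performs can be pictured as follows (a hypothetical sketch, not LiveCompare code; the sample names are invented):

```python
def filter_changing_objects(names, prefixes=("ENH", "AGR")):
    """Drop object names that start with any of the ignored prefixes."""
    return [n for n in names if not n.upper().startswith(prefixes)]

objs = ["ENHHEADER", "AGR_USERS", "ZMYTABLE", "MARA"]
print(filter_changing_objects(objs))  # ['ZMYTABLE', 'MARA']
```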

LiveCompare should be configured to send emails. See the Guided Configuration - Email help topic for details.

Prepare the workflow

The Developer Impact Analysis workflow should be run in a separate workspace for each system to be analyzed. To prepare the Developer Impact Analysis workflow, carry out the following steps.

  1. Create a new workspace whose name reflects the system to be analyzed, for example DIA - <Analysis System>.

  2. Select the Templates > Impact Analysis > Developer Impact Analysis template in the LiveCompare hierarchy and choose ‘Copy to Workspace’ from the context menu.

  3. Select DIA - <Analysis System> as the target workspace, and click Copy. A number of dependent templates will also be copied.

  4. Copy the Templates > Impact Analysis > Initialize Task Store workflow to the DIA - <Analysis System> workspace.

  5. If required, configure the Initialize Task Store workflow to set the date from which to retrieve developer tasks.

  6. Run the Initialize Task Store workflow in the DIA - <Analysis System> workspace.

Select the Developer Impact Analysis workflow in the DIA - <Analysis System> workspace, and configure the workflow as follows.

  1. Set the Pipeline parameter to the Pipeline that includes your Analysis, Comparison, Usage and SAP Solution Manager systems.

  2. Set the Compare Changing Objects? Boolean parameter to specify whether the most-at-risk executables will be compared on the Analysis and Comparison systems.

  3. Set the Changed Code Only? Boolean parameter to specify whether code quality analysis should be performed for changed code only. This applies to includes (INCL), functions (FUNC) and programs (PROG).

  4. If the Changed Code Only? Boolean parameter is set to true, set the Compare Revision Integer parameter to the number of revisions to go back when identifying changed code. The default value is 1, which means use the previous revision.

  5. Set the Development Email String List parameter to a list of email recipients for the Developer Impact Analysis report. Each email address should be stored as a separate string entry.

  6. Set the From String parameter to the EmailFromAddress value stored in the Configuration - Email screen. You may need to check this setting with a LiveCompare Administrator.
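The effect of the Compare Revision parameter in step 4, i.e. how many revisions to step back from the current version when identifying changed code, can be illustrated with a sketch (the version history and helper are hypothetical):

```python
def baseline_version(versions, compare_revision=1):
    """Given versions ordered oldest-to-newest, return the version
    `compare_revision` steps behind the current one (default: the previous)."""
    if compare_revision < 1 or compare_revision >= len(versions):
        raise ValueError("compare_revision out of range")
    return versions[-1 - compare_revision]

history = ["v1", "v2", "v3", "v4"]   # v4 is the current version
print(baseline_version(history))     # 'v3' (the default, one revision back)
print(baseline_version(history, 3))  # 'v1'
```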

Schedule the workflow

The Developer Impact Analysis workflow should be run using a schedule. The following workflows will need to be scheduled. To schedule a workflow, select it in the LiveCompare hierarchy and choose ‘Schedule Run’ from the context menu.

Create Object Links Cache

This workflow should be copied from the Prerequisites templates folder to each of your DIA - <Analysis System> workspaces. It should be configured to create an object links cache database for the analysis system and then scheduled to run once each week. If possible, the workflow should be scheduled to run when no developer tasks are being performed on the analysis system (for example, outside office hours).

Developer Impact Analysis

This workflow performs an impact analysis for the most recent set of transportable tasks submitted on the Analysis system, and sends report links to the specified email recipients. It should be scheduled to run as required, for example several times each day, depending on how frequently you wish to review the impact analysis reports.

Workflow results

The Developer Impact Analysis workflow generates an Excel report which includes the following spreadsheets.

Home

This spreadsheet lists changing objects (in the CHILD_TYPE and CHILD_NAME columns) and their associated change IDs and change types. The CHANGE_ID column may store a transport name or a ChaRM change request. Changing objects that do not impact other objects are also included.

If an SAP Solution Manager System is provided in the Developer Impact Analysis workflow, the CHANGE_ID column contains the ChaRM ID associated with each task.

The IMPACTED_OBJECTS column displays the number of impacted objects for each changing object. Click a link in this column to display the impacted objects in the Impacted Objects spreadsheet.

Click a hyperlink in the CHILD_NAME column to display an Object Differences report for the selected object.

The UNIT_TESTS column displays the number of unit tests for each impacted object. Click a link in this column to display the unit tests in the Unit Tests spreadsheet.

The QUALITY_ISSUES column is not used.

Impacted Objects

This spreadsheet lists used impacted objects in the NAME and TYPE columns, and the objects that impact them in the CHILD_TYPE and CHILD_NAME columns. Details for the used impacted objects are listed in the DESCRIPTION, APP_AREA and DEVCLASS columns.

For each used impacted object:

  • The DYNP column lists the number of impacted screens for the object in the most-at-risk results. Click a hyperlink to display the object’s screens in the Impacted DYNPs spreadsheet.

  • The USAGE column lists the object’s usage count according to the available performance history data.

  • The USERS column lists the number of users of the object according to the available performance history data. Click a hyperlink to display the object’s users in the Impacted Users spreadsheet.

  • The CUSTOM column is set to ‘Yes’ for custom used objects.

  • The MAR column is set to ‘Y’ for impacted objects that are most-at-risk and recommended for testing. Each changing object will have no more than one most-at-risk executable. The BUSINESS_CRITICAL column is set to ‘Y’ for Business Critical executables, and to <blank> for objects that are not identified as Business Critical.

Impacted DYNPs

This spreadsheet lists each used object and its associated screens (including each screen’s descriptive text). If a hyperlink is selected in the Impacted Objects spreadsheet’s DYNP column, the Impacted DYNPs spreadsheet lists the used object’s associated screens.

Impacted Users

This spreadsheet lists each impacted object, and its usage count for each user according to the available performance history data. If a hyperlink is selected in the Impacted Objects spreadsheet’s USERS column, the Impacted Users spreadsheet lists the users of the associated object.

Unit Tests

This spreadsheet lists impacted objects with ABAP Unit Tests, and each Unit Test’s class and method. The TEXT column stores a description of the Unit Test’s result. The FAILURE, EXCEPTION, RUNTIME and WARNING columns indicate the number of failed tests, and the number of tests that generated exceptions, runtime errors or warnings.

Rule Break Summary

This spreadsheet provides a summary of the quality rules that were triggered for each impacted object with associated ABAP code.

Rule Break Detail

This spreadsheet provides details for each impacted object that triggered one or more quality rules. The results are filtered so that only rules in the Error class are shown. In this spreadsheet:

The LINE_NUMBER column stores the ABAP source code line number in which the violation was detected. For Web Dynpro (WDYN) objects, methods are stored in include files, and the line number refers to the position of the source code line within its method. For other object types, if the INCLUDE column for the violation contains a value, the line number refers to a line in the INCLUDE file.

The SEQ column stores the ABAP source code line number in which the violation was detected. For Web DYNPRO (WDYN) objects, this column refers to a line number in the object’s INCLUDE file. This column may be used when sorting the Rule Break Detail dataset to display the violations for each object ordered by line number.
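A sort on the OBJECT and SEQ columns of the kind described above might look like this (the rule-break rows are invented sample data):

```python
rule_breaks = [
    {"OBJECT": "ZPROG2", "RULE": "R10", "SEQ": 5},
    {"OBJECT": "ZPROG1", "RULE": "R07", "SEQ": 42},
    {"OBJECT": "ZPROG1", "RULE": "R03", "SEQ": 8},
]

# Sort by object, then SEQ, so each object's violations appear
# in source-line order.
ordered = sorted(rule_breaks, key=lambda r: (r["OBJECT"], r["SEQ"]))
for r in ordered:
    print(r["OBJECT"], r["SEQ"], r["RULE"])
```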

Test Data

This spreadsheet is not used.

Help

This spreadsheet provides help for each of the spreadsheet reports.

Run Developer Impact Analysis with the Smart Impact app

If the Developer Impact Analysis workflow is scheduled to run at the same time as the Smart Impact app, there may be a delay while the Smart Impact app updates an object link cache database that is required by the Developer Impact Analysis workflow.

To resolve this, a second RFC Destination that refers to the same SAP analysis system may be created in LiveCompare. The first RFC Destination may be used with the Developer Impact Analysis workflow, and the second RFC Destination may be used with the Smart Impact app.

LiveCompare Dashboard