Configurator Impact Analysis

The Configurator Impact Analysis workflow performs an impact analysis for customizing transports on the specified SAP system. Customizing transports contain changes related to data, including Configuration and Master Data. Transportable tasks are obtained from the E070 table on the analysis system; these provide the set of changing objects to be used by the analysis.
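
A minimal sketch of reading task headers from the E070 table over RFC, using the pyrfc library. The connection parameters and the WHERE clause are illustrative assumptions; the workflow applies its own selection logic internally.

    from pyrfc import Connection

    def read_e070_tasks(conn_params, where="STRKORR <> ''"):
        """Return E070 rows for tasks that belong to a higher-level request."""
        fields = ["TRKORR", "TRFUNCTION", "TRSTATUS", "STRKORR", "AS4USER", "AS4DATE"]
        conn = Connection(**conn_params)  # e.g. ashost, sysnr, client, user, passwd
        try:
            result = conn.call(
                "RFC_READ_TABLE",
                QUERY_TABLE="E070",
                DELIMITER="|",
                FIELDS=[{"FIELDNAME": f} for f in fields],
                OPTIONS=[{"TEXT": where}],
            )
        finally:
            conn.close()
        # Each DATA line is a delimited work area; split it back into named fields.
        return [dict(zip(fields, (v.strip() for v in row["WA"].split("|"))))
                for row in result["DATA"]]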

The workflow generates an Excel report which is emailed to the specified recipients. The report includes:

  • The objects, screens and users impacted by each changing object. A maximum of 10 impacted objects are reported for each changing object.

  • The most-at-risk objects that are recommended for testing.

The workflow should be scheduled to run periodically. During each run, an Impact Analysis is performed for the transportable tasks from the analysis system that have not yet been processed.
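
The "not yet processed" behaviour can be pictured as a set difference against a persistent task store. This is only a sketch; the workflow keeps its own task store (initialized by the Initialize Task Store workflow described below), and the JSON file here is just a stand-in.

    import json
    from pathlib import Path

    def unprocessed_tasks(current_task_ids, store_path="processed_tasks.json"):
        """Return the tasks from the analysis system not seen in earlier runs."""
        store = Path(store_path)
        processed = set(json.loads(store.read_text())) if store.exists() else set()
        return sorted(set(current_task_ids) - processed)

    def mark_processed(task_ids, store_path="processed_tasks.json"):
        """Record tasks after their impact analysis has been run."""
        store = Path(store_path)
        processed = set(json.loads(store.read_text())) if store.exists() else set()
        processed.update(task_ids)
        store.write_text(json.dumps(sorted(processed)))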

By default, the Configurator Impact Analysis workflow reports the following object types as used, impacted or most-at-risk.

Object Type    Description
-----------    ---------------
CLAS           Class
FUNC           Function
IDOC           IDoc
PROG           Program
SSFO           Smart Form
TCOD           Transaction
WAPA           BSP Application

However, a user with Administrator privileges may customize these defaults as follows.

  1. Select the Administration > Configuration > Impact Analysis folder in the LiveCompare hierarchy.

  2. In the TypesToFind section, deselect the object types to be excluded from the used, impacted and most-at-risk results. Note that if no items are selected, all the above object types will be used as the default.

  3. Click ‘Save’ to save your changes.

These changes affect all subsequent runs of the Configurator Impact Analysis workflow, but they do not affect any existing workflow results.
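
A minimal sketch of the TypesToFind behaviour described above, assuming the selection arrives as a simple collection of type codes: if nothing is selected, the full default list applies.

    DEFAULT_TYPES = {"CLAS", "FUNC", "IDOC", "PROG", "SSFO", "TCOD", "WAPA"}

    def effective_types(selected_types):
        """Return the object types to report, falling back to the defaults."""
        return set(selected_types) or set(DEFAULT_TYPES)

    # effective_types([])               -> all seven default types
    # effective_types(["PROG", "TCOD"]) -> {"PROG", "TCOD"}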

The workflow populates the Configurator chart in the LiveCompare Dashboard.

Prerequisites

The workflow requires access to RFC Destinations for an Analysis system to which developer changes will be submitted, and a Performance History System from which performance history data will be retrieved.

The Create Object Links Cache workflow from the Prerequisites templates folder should be run to create an Object Links Cache database for the Analysis system.

You should make sure that performance history data is available on the RFC Destination selected for the Performance History System. Select the RFC Destination in the LiveCompare hierarchy and click the PHD tab. Select the source for performance history data, and if necessary, the number of months of data to retrieve, then click ‘Update Data’. The performance history data may also be retrieved using a schedule. See the Retrieving Performance History Data help topic for details.

If you plan to use the workflow to identify test hits and gaps in a Test Repository, the Create Test Repository Cache workflow from the Prerequisites templates folder must be run first in order to populate the Test Repository’s cache. Refer to the template’s Help file for details.

The ChangingObjectsToIgnore External Data Source removes tables whose names begin with ENH or AGR from the set of changing objects. If required, this External Data Source may be edited in the LiveCompare Studio using the ‘Replace Data File’ option.
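
A minimal sketch of that filter, assuming each changing object carries a "name" field; the real exclusion list lives in the External Data Source itself.

    IGNORED_PREFIXES = ("ENH", "AGR")

    def filter_changing_objects(changing_objects):
        """Drop table entries whose names begin with an ignored prefix."""
        return [obj for obj in changing_objects
                if not obj["name"].upper().startswith(IGNORED_PREFIXES)]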

LiveCompare should be configured to send emails. See the Integrating LiveCompare with your Email System section of the LiveCompare Installation and Configuration Guide for details.

Preparing the Workflow

The Configurator Impact Analysis workflow should be run in a separate workspace for each system to be analyzed. To prepare the Configurator Impact Analysis workflow, carry out the following steps.

  1. Create a new workspace whose name reflects the system to be analyzed, for example CIA - <Analysis System>.

  2. Select the Templates > Impact Analysis > Configurator Impact Analysis template in the LiveCompare hierarchy and choose ‘Copy to Workspace’ from the context menu.

  3. Select CIA - <Analysis System> as the target workspace, and click ‘Copy’. A number of dependent templates will also be copied.

  4. Copy the Templates > Impact Analysis > Initialize Task Store workflow to the CIA - <Analysis System> workspace.

  5. If required, configure the Initialize Task Store workflow to set the date from which to retrieve developer tasks. See the template’s help file for details.

  6. Run the Initialize Task Store workflow in the CIA - <Analysis System> workspace.

Select the Configurator Impact Analysis workflow in the CIA - <Analysis System> workspace, and configure the workflow as follows.

  1. Set the Analysis System RFC Destination parameter to the RFC Destination for your Analysis system.

  2. Set the Performance History System RFC Destination to the RFC Destination for the system from which performance history data has been obtained.

  3. Set the Comparison System RFC Destination to the RFC Destination for the system on which to compare the most-at-risk executables.

  4. Set the Compare Changing Objects? Boolean parameter to specify whether the most-at-risk executables will be compared on the Analysis and Comparison systems.

  5. If required, set the Test Repository parameter to a Test Repository in which to test asset hits and gaps using the names of the most-at-risk objects.

  6. Set the Test Search Paths String List parameter to limit the search for test assets to one or more project folders in the specified Test Repository. Each path should begin with the Subject folder, and a backslash (\) should be used to separate path components, for example Subject\Release1 (see the sketch after these steps).

  7. Set the Create test execution? Boolean parameter to specify whether the matching test assets will be used to create test execution lists in the specified Test Repository.

  8. Set the Execution Name String parameter to the name to be used for the test execution folder in which to store the matching test assets.

  9. Set the Execution Path String parameter to the path of an existing Test Repository folder in which to store the test executions, for example Execution/TestExecutions. If this property is not set, the test execution will be created in the default Executions folder.

  10. Set the Email To String List parameter to a list of email recipients for the Configurator Impact Analysis report. Each email address should be stored as a separate string entry.

  11. Set the From String parameter to the EmailFromAddress value stored in LiveCompare’s Configuration – Email screen. You may need to check this setting with a LiveCompare Administrator.
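
The format rules in steps 6, 9 and 10 can be summarized in a short sketch. The helper names are hypothetical; LiveCompare performs its own validation of these parameters.

    def valid_test_search_path(path):
        """A search path starts at the Subject folder, with backslash separators."""
        parts = path.split("\\")
        return len(parts) > 0 and parts[0] == "Subject" and all(parts)

    def execution_path_or_default(execution_path):
        """If no Execution Path is set, the default Executions folder is used."""
        return execution_path or "Executions"

    def valid_email_to(email_to):
        """Email To is a string list holding one address per entry."""
        return all(isinstance(addr, str) and "@" in addr for addr in email_to)

    # Example values matching the steps above:
    assert valid_test_search_path("Subject\\Release1")
    assert execution_path_or_default("") == "Executions"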

Scheduling Workflows

The Configurator Impact Analysis workflow and its prerequisite workflows should be run using a schedule. The following workflows need to be scheduled. To schedule a workflow, select it in the LiveCompare hierarchy and choose ‘Schedule Run’ from the context menu.

Create Object Links Cache

This workflow should be copied from the Prerequisites templates folder to each of your CIA - <Analysis System> workspaces. It should be configured to create an Object Links Cache database for the analysis system and then scheduled to run once each week. If possible, the workflow should be scheduled to run when no developer tasks are being performed on the analysis system (for example, outside office hours).

Create Test Repository Cache

If you plan to use the Configurator Impact Analysis workflow to identify hits and gaps in a Test Repository, the Create Test Repository Cache workflow should be copied from the Prerequisites templates folder to each of your CIA - <Analysis System> workspaces. It should be configured to update the cache for the required Test Repository, using performance history data from the Performance History System, and then scheduled to run once each week.

Configurator Impact Analysis

This workflow performs an impact analysis for the most recent set of transportable tasks submitted on the Analysis system, and sends report links to the specified email recipients. It should be scheduled to run as often as required, for example once or several times each day, depending on how frequently you wish to review the impact analysis reports.

Workflow Results

The Configurator Impact Analysis workflow generates an Excel report which includes the following spreadsheets.

Home

This spreadsheet lists changing objects (in the CHILD_TYPE and CHILD_NAME columns) and their associated task IDs and change types. Changing objects that do not impact other objects are also included.

The IMPACTED_OBJECTS column displays the number of impacted objects for each changing object. Click a link in this column to display the impacted objects in the Impacted Objects spreadsheet.

Click a hyperlink in the CHILD_NAME column to display an Object Differences report for the selected object.
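
The sheet-to-sheet links can be pictured with a small openpyxl sketch. This is not LiveCompare's implementation, and the row values and the task ID and change type column headers are placeholders; only CHILD_TYPE, CHILD_NAME and IMPACTED_OBJECTS are taken from the report description above.

    from openpyxl import Workbook

    wb = Workbook()
    home = wb.active
    home.title = "Home"
    impacted = wb.create_sheet("Impacted Objects")

    home.append(["CHILD_TYPE", "CHILD_NAME", "TASK_ID", "CHANGE_TYPE", "IMPACTED_OBJECTS"])
    impacted.append(["NAME", "TYPE", "CHILD_TYPE", "CHILD_NAME"])

    # The count in IMPACTED_OBJECTS doubles as an internal link to the detail sheet.
    home.append(["TABU", "ZEXAMPLE_TABLE", "DEVK900123", "Modified",
                 '=HYPERLINK("#\'Impacted Objects\'!A2", 3)'])
    wb.save("configurator_sketch.xlsx")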

Impacted Objects

This spreadsheet lists used impacted objects in the NAME and TYPE columns, and the objects that impact them in the CHILD_TYPE and CHILD_NAME columns. Details for the used impacted objects are listed in the DESCRIPTION, APP_AREA and DEVCLASS columns.

For each used impacted object (see the sketch after this list):

  • The DYNP column lists the number of impacted screens for the object in the most-at-risk results. Click a hyperlink to display the object’s screens in the Impacted DYNPs spreadsheet.

  • The USAGE column lists the object’s usage count according to the available performance history data.

  • The USERS column lists the number of users of the object according to the available performance history data. Click a hyperlink to display the object’s users in the Impacted Users spreadsheet.

  • The TEST_HITS column lists the number of matches found in the specified Test Repository. Click a hyperlink to display details for the matched test assets in the Test Hit Details spreadsheet.

  • The CUSTOM column is set to ‘Yes’ for custom used objects.

  • The BUSINESS_CRITICAL column is set to ‘Y’ for Business Critical executables, and to <blank> for objects that are not identified as Business Critical.

  • The MAR column is set to ‘Y’ for impacted objects that are most-at-risk and recommended for testing. Each changing object will have no more than one most-at-risk executable.
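
A sketch of an Impacted Objects row as a plain data structure, using the columns described above; the field types are assumptions (the report itself stores ‘Yes’, ‘Y’ and blank markers rather than booleans).

    from dataclasses import dataclass

    @dataclass
    class ImpactedObjectRow:
        name: str                # NAME
        obj_type: str            # TYPE
        child_type: str          # CHILD_TYPE of the changing object that impacts it
        child_name: str          # CHILD_NAME of the changing object
        description: str         # DESCRIPTION
        app_area: str            # APP_AREA
        devclass: str            # DEVCLASS
        dynp: int                # DYNP: impacted screens in the most-at-risk results
        usage: int               # USAGE: usage count from performance history data
        users: int               # USERS: number of users from performance history data
        test_hits: int           # TEST_HITS: matches in the specified Test Repository
        custom: bool             # CUSTOM: True for custom used objects
        business_critical: bool  # BUSINESS_CRITICAL: True for Business Critical executables
        mar: bool                # MAR: True if most-at-risk and recommended for testing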

Impacted DYNPs

This spreadsheet lists each used object and its associated screens (including each screen’s descriptive text). If a hyperlink is selected in the Impacted Objects spreadsheet’s DYNP column, the Impacted DYNPs spreadsheet lists the used object’s associated screens.

Impacted Users

This spreadsheet lists each impacted object, and its usage count for each user according to the available performance history data. If a hyperlink is selected in the Impacted Objects spreadsheet’s USERS column, the Impacted Users spreadsheet lists the users of the associated object.

Test Hit Details

This spreadsheet includes the details for each test asset that matched a most-at-risk executable in the specified Test Repository. It also displays the total usage, top user and top usage details for each of the matched most-at-risk executables. If the spreadsheet is accessed via a hyperlink, it displays the details for the matching test asset.

For Tosca Test Repositories, the TEST_PATH column may contain a hyperlink to the test. The HAS_DATA column is set to ‘Y’ for Tosca test cases that match affected data, and to <blank> for test cases that do not match affected data. Affected data is defined by the key fields of table rows that are different, in the Analysis system only (added data), or in the Comparison system only (deleted data).
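
The affected-data rule can be pictured as a comparison of rows keyed by their key fields. A minimal sketch, assuming each side is supplied as a mapping from a key-field tuple to the remaining field values:

    def classify_affected_data(analysis_rows, comparison_rows):
        """Classify keys as changed, added (Analysis only) or deleted (Comparison only)."""
        changed = {key for key in analysis_rows.keys() & comparison_rows.keys()
                   if analysis_rows[key] != comparison_rows[key]}
        added = set(analysis_rows.keys() - comparison_rows.keys())
        deleted = set(comparison_rows.keys() - analysis_rows.keys())
        return changed, added, deleted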

Test Data

This spreadsheet is not used.

Help

This spreadsheet provides help for each of the spreadsheet reports.

Running Configurator Impact Analysis with the Smart Impact App

If the Configurator Impact Analysis workflow is scheduled to run at the same time as the Smart Impact App, there may be a delay while the Smart Impact App updates an Object Links Cache database that is required by the Configurator Impact Analysis workflow.

To resolve this, a second RFC Destination that refers to the same SAP analysis system may be created in LiveCompare. The first RFC Destination may be used with the Configurator Impact Analysis workflow, and the second RFC Destination may be used with the Smart Impact App.