Configurator Impact Analysis
The Configurator Impact Analysis workflow performs an impact analysis for customizing transports on the specified SAP system. Customizing transports contain changes related to data, including Configuration and Master Data. Transportable tasks are obtained from the E070 table on the analysis system; these provide the set of changing objects to be used by the analysis. If an SAP Solution Manager System is provided, the workflow’s results include the ChaRM ID associated with each task.
If there are any objects with data changes to analyze, the workflow generates an Excel report which is emailed to the specified recipients. The report includes:
- The objects, screens and users impacted by each changing object. A maximum of 10 impacted objects are reported for each changing object.
- The most-at-risk objects that are recommended for testing.
The workflow should be scheduled to run periodically. During each run, an Impact Analysis is performed for the transportable tasks from the analysis system that have not yet been processed.
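The per-run selection described above can be pictured as a simple filter over task records. The sketch below is illustrative only: TRKORR is the real E070 request/task ID field, but the row shape, the helper function and the way LiveCompare actually tracks processed tasks are assumptions, not the product's implementation.

```python
# Illustrative sketch: pick up only the transportable tasks that have not
# yet been analyzed. The real workflow reads E070 over RFC and records
# processed tasks in its task store; this helper is hypothetical.

def unprocessed_tasks(e070_rows, processed_ids):
    """Return task rows whose TRKORR is not in the set already processed."""
    return [row for row in e070_rows if row["TRKORR"] not in processed_ids]

# Example: two tasks were analyzed in earlier runs, one is new this run.
rows = [
    {"TRKORR": "DEVK900101"},
    {"TRKORR": "DEVK900102"},
    {"TRKORR": "DEVK900103"},
]
done = {"DEVK900101", "DEVK900102"}
print([r["TRKORR"] for r in unprocessed_tasks(rows, done)])  # ['DEVK900103']
```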
By default, the Configurator Impact Analysis workflow reports the following object types as used, impacted or most-at-risk.
| Object Type | Description |
| --- | --- |
| CLAS | Class |
| FUNC | Function |
| IDOC | IDoc |
| PROG | Program |
| SSFO | Smart Form |
| TCOD | Transaction |
| WAPA | BSP Application |
However, a user with Administrator privileges may customize these defaults as follows.
- Select the Administration > Configuration > Impact Analysis folder in the LiveCompare hierarchy.
- In the TypesToFind section, deselect the object types to be excluded from the used, impacted and most-at-risk results. Note that if no items are selected, all the above object types will be used as the default.
- Click ‘Save’ to save your changes.
These changes affect all subsequent runs of the Configurator Impact Analysis workflow, but they do not affect any existing workflow results.
The workflow populates the Configurator chart in the LiveCompare Dashboard.
Parallel impact analysis
You may run the Configurator Impact Analysis workflow in parallel with other impact analysis apps and workflows. See here for details.
Prerequisites
The Configurator Impact Analysis workflow uses a Pipeline to identify:
- The Analysis, Comparison, Usage and SAP Solution Manager systems to be used in the analysis.
- One or more Most-at-risk Search Test Repositories that will be searched to find test assets that match the most-at-risk executables.
- A Most-at-risk Gaps Test Repository, in which test requirements are to be created for most-at-risk objects that do not have matching test assets in the specified search Test Repositories.
- One or more Most-at-risk Hits Execution Test Repositories, in which tests that match most-at-risk objects are to be created and executed.
Before the Configurator Impact Analysis workflow is run, you must create a Pipeline that includes the appropriate systems. In the Pipeline:
- The Analysis System field stores the RFC Destination for your Analysis system.
- The Comparison System field stores the RFC Destination for the system on which to compare data changes for the most-at-risk executables. Note that if a comparison system is not specified, the workflow’s reports will not include comparison results.
- The Usage System field stores the RFC Destination for the system from which performance history data has been obtained.
- If required, the SAP Solution Manager System field stores the RFC Destination from which to retrieve the ChaRM IDs associated with each transportable task.
- The Most-at-risk Search Test Repositories fields store the Test Repositories in which the most-at-risk objects are searched to find hits and gaps. Each Most-at-risk Search Test Repository may have one or more search paths to limit the folders that are searched. Each path should begin with the Subject folder, and a backslash (\) should be used to separate path components, for example Subject\Release1.
- The Most-at-risk Hits Execution Test Repositories fields store the Test Repositories in which execution lists are to be created. For each most-at-risk hits execution Test Repository:
  - Set the Execution Name field to the name of the test execution folder in which execution lists are to be created.
  - Set the Execution Path field to the path of an existing Test Repository folder in which to store the test executions, for example Execution/TestExecutions. If this property is not set, the test execution will be created in the default Executions folder.
  - Set the Execution List Path field to the path of an execution list to reuse when creating Tosca test executions, for example Execution/MyTests/LiveCompare_2023-01-31 09:47:17/MyExecutionList.
  - Set the Options field as appropriate to either create test runs, or to create and run test runs. Executing tests requires Tosca to be configured using DEX.
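The search-path convention described above (each path starts at the Subject folder, with backslash-separated components) can be sketched with a small validation helper. The helper itself is hypothetical, not a LiveCompare API:

```python
# Illustrative check of the Most-at-risk Search Test Repository path rules:
# the first component must be the Subject folder, components are separated
# by backslashes, and no component may be empty.

def is_valid_search_path(path: str) -> bool:
    parts = path.split("\\")
    return parts[0] == "Subject" and all(parts)

print(is_valid_search_path(r"Subject\Release1"))  # True
print(is_valid_search_path("Subject/Release1"))   # False: wrong separator
print(is_valid_search_path(r"Release1\Subject"))  # False: must start at Subject
```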
- The Create Object Links Cache workflow from the Prerequisites templates folder should be run to create an object links cache database for the Analysis system.
You will need to make sure that performance history data is available on the Performance History System. Select the RFC Destination in the LiveCompare hierarchy, click the PHD tab and set a schedule for the retrieval of performance history data. You can also retrieve performance history data for an RFC Destination using the Collect Performance History Data action. See the Retrieve performance history data topic for details.
If you plan to use the workflow to identify test hits and gaps in a Test Repository, a LiveCompare Editor must first run the Create Test Repository Cache workflow from the Prerequisites templates folder to populate the Test Repository’s cache.
Two External Data Sources filter the workflow’s inputs:

- The ChangingObjectsToIgnore External Data Source removes tables whose names begin with ENH or AGR from the set of changing objects.
- The TransportsToIgnore External Data Source contains regular expressions which are used to filter out transports containing custom objects.
If required, these External Data Sources may be edited in the LiveCompare Studio using the ‘Replace Data File’ option.
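The effect of the two External Data Sources can be sketched as follows. The ENH/AGR prefix rule comes from the description above; the transport pattern is a purely hypothetical example of a TransportsToIgnore entry, and the helper functions are illustrative, not LiveCompare APIs:

```python
import re

# Illustrative versions of the two filters. Table names beginning with ENH
# or AGR are dropped from the changing objects; transport IDs matching any
# TransportsToIgnore regular expression are dropped from the analysis.

IGNORED_TABLE_PREFIXES = ("ENH", "AGR")
transport_patterns = [re.compile(r"^DEVK9999")]  # hypothetical ignore rule

def keep_changing_object(table_name: str) -> bool:
    return not table_name.startswith(IGNORED_TABLE_PREFIXES)

def keep_transport(transport_id: str) -> bool:
    return not any(p.search(transport_id) for p in transport_patterns)

print(keep_changing_object("AGR_USERS"))  # False: AGR prefix is ignored
print(keep_changing_object("T001"))       # True
print(keep_transport("DEVK999901"))       # False: matches an ignore pattern
```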
LiveCompare should be configured to send emails. See the Guided Configuration - Email help topic for details.
Prepare the workflow
The Configurator Impact Analysis workflow should be run in a separate workspace for each system to be analyzed. To prepare the Configurator Impact Analysis workflow, carry out the following steps.
- Create a new workspace whose name reflects the system to be analyzed, for example CIA - <Analysis System>.
- Select the Templates > Impact Analysis > Configurator Impact Analysis template in the LiveCompare hierarchy and choose ‘Copy to Workspace’ from the context menu.
- Select CIA - <Analysis System> as the target workspace, and click Copy. A number of dependent templates will also be copied.
- Copy the Templates > Impact Analysis > Initialize Task Store workflow to the CIA - <Analysis System> workspace.
- If required, configure the Initialize Task Store workflow to set the date from which to retrieve developer tasks. See the template’s help file for details.
- Run the Initialize Task Store workflow in the CIA - <Analysis System> workspace.
Select the Configurator Impact Analysis workflow in the CIA - <Analysis System> workspace, and configure the workflow as follows.
- Set the Pipeline parameter to the Pipeline that includes your Analysis, Comparison, Usage and SAP Solution Manager systems, and the Test Repositories to be used in the analysis.
- Set the Compare Changing Objects? Boolean parameter to specify whether the most-at-risk executables will be compared on the Analysis and Comparison systems.
- If required, edit the Execution List Configuration Table parameter to provide one or more configuration entries to be associated with the test execution in Tosca Test Repositories, for example /Configurations/Environment1/TDS.
- Set the Configuration Email String List parameter to a list of email recipients for the Configurator Impact Analysis report. Each email address should be stored as a separate string entry.
- Set the From String parameter to the EmailFromAddress value stored in the Configuration - Email screen. You may need to check this setting with a LiveCompare Administrator.
Schedule the workflow
The Configurator Impact Analysis workflow should be run on a schedule, together with the prerequisite workflows listed below. To schedule a workflow, select it in the LiveCompare hierarchy and choose ‘Schedule Run’ from the context menu.
Create Object Links Cache
This workflow should be copied from the Prerequisites templates folder to each of your CIA - <Analysis System> workspaces. It should be configured to create an object links cache database for the analysis system and then scheduled to run once each week. If possible, the workflow should be scheduled to run when no developer tasks are being performed on the analysis system (for example, outside office hours).
Create Test Repository Cache
If you plan to use the Configurator Impact Analysis workflow to identify hits and gaps in a Test Repository, the Create Test Repository Cache workflow should be copied from the Prerequisites templates folder to each of your CIA - <Analysis System> workspaces. It should be configured to update the cache for the required Test Repository, using performance history data from the Performance History System, and then scheduled to run once each week.
Configurator Impact Analysis
This workflow performs an impact analysis for the most recent set of transportable tasks submitted on the Analysis system, and sends report links to the specified email recipients. Schedule it to run as often as required, for example one or more times each day, depending on how frequently you wish to review the impact analysis reports.
Workflow results
The Configurator Impact Analysis workflow generates an Excel report which includes the following spreadsheets.
Home
This spreadsheet lists changing objects (in the CHILD_TYPE and CHILD_NAME columns) and their associated task IDs and change types. Changing objects that do not impact other objects are also included.
If an SAP Solution Manager System is provided in the Configurator Impact Analysis workflow, the CHANGE_ID column contains the ChaRM ID associated with each task.
The IMPACTED_OBJECTS column displays the number of impacted objects for each changing object. Click a link in this column to display the impacted objects in the Impacted Objects spreadsheet.
Click a hyperlink in the CHILD_NAME column to display an Object Differences report for the selected object.
Impacted Objects
This spreadsheet lists used impacted objects in the NAME and TYPE columns, and the objects that impact them in the CHILD_TYPE and CHILD_NAME columns. Details for the used impacted objects are listed in the DESCRIPTION, APP_AREA and DEVCLASS columns.
For each used impacted object:
- The DYNP column lists the number of impacted screens for the object in the most-at-risk results. Click a hyperlink to display the object’s screens in the Impacted DYNPs spreadsheet.
- The USAGE column lists the object’s usage count according to the available performance history data.
- The USERS column lists the number of users of the object according to the available performance history data. Click a hyperlink to display the object’s users in the Impacted Users spreadsheet.
- The TEST_HITS column lists the number of matches found in the specified Test Repository. Click a hyperlink to display details for the matched token in the Test Hit Details spreadsheet.
- The CUSTOM column is set to ‘Yes’ for custom used objects.
- The BUSINESS_CRITICAL column is set to ‘Y’ for Business Critical executables, and to <blank> for objects that are not identified as Business Critical.
- The MAR column is set to ‘Y’ for impacted objects that are most-at-risk and recommended for testing. Each changing object will have no more than one most-at-risk executable.
Impacted DYNPs
This spreadsheet lists each used object and its associated screens (including each screen’s descriptive text). If a hyperlink is selected in the Impacted Objects spreadsheet’s DYNP column, the Impacted DYNPs spreadsheet lists the used object’s associated screens.
Impacted Users
This spreadsheet lists each impacted object, and its usage count for each user according to the available performance history data. If a hyperlink is selected in the Impacted Objects spreadsheet’s USERS column, the Impacted Users spreadsheet lists the users of the associated object.
Test Hit Details
This spreadsheet includes the details for each test asset that matched a most-at-risk executable in the specified Test Repository. It also displays the total usage, top user and top usage details for each of the matched most-at-risk executables. If the spreadsheet is accessed via a hyperlink, it displays the details for the matching test asset.
For Tosca Test Repositories, click a link in the TEST_NAME column to locate the corresponding test in an open Tosca workspace.
The HAS_DATA column is set to ‘Y’ for Tosca test cases that match affected data, and to <blank> for test cases that do not match affected data. Affected data is defined by the key fields of table rows that are different, in the Analysis system only (added data), or in the Comparison system only (deleted data).
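As a minimal sketch of this definition of affected data, the hypothetical helper below computes the key-field tuples of rows that differ between the two systems, exist only on the Analysis system (added), or exist only on the Comparison system (deleted). The row shape and key fields are assumptions for illustration:

```python
# Illustrative computation of affected data: index each system's rows by
# their key-field tuple, then classify keys as added, deleted or changed.

def affected_keys(analysis_rows, comparison_rows, key_fields):
    def index(rows):
        return {tuple(r[k] for k in key_fields): r for r in rows}
    a, c = index(analysis_rows), index(comparison_rows)
    added = set(a) - set(c)                              # Analysis only
    deleted = set(c) - set(a)                            # Comparison only
    changed = {k for k in set(a) & set(c) if a[k] != c[k]}  # rows differ
    return added | deleted | changed

# Hypothetical table rows keyed by BUKRS (company code).
analysis = [{"BUKRS": "1000", "WAERS": "EUR"}, {"BUKRS": "2000", "WAERS": "USD"}]
comparison = [{"BUKRS": "1000", "WAERS": "GBP"}]
print(sorted(affected_keys(analysis, comparison, ["BUKRS"])))
# [('1000',), ('2000',)] -- '1000' changed, '2000' added
```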
The COMMON_TERMS column lists the search terms that were matched to identify each matching test asset. This column is populated for Tosca, qTest, ALM and SAP Solution Manager Test Repositories only.
Test Data
This spreadsheet is not used.
Help
This spreadsheet provides help for each of the spreadsheet reports.
Run Configurator Impact Analysis with the Smart Impact app
If the Configurator Impact Analysis workflow is scheduled to run at the same time as the Smart Impact app, there may be a delay while the Smart Impact app updates an object link cache database that is required by the Configurator Impact Analysis workflow.
To resolve this, a second RFC Destination that refers to the same SAP analysis system may be created in LiveCompare. The first RFC Destination may be used with the Configurator Impact Analysis workflow, and the second RFC Destination may be used with the Smart Impact app.