Rpasce 190 Impg
April 2022
Oracle Retail Cloud Edition Implementation Guide, Release 19.0
F25568-09
This software and related documentation are provided under a license agreement containing restrictions on use and
disclosure and are protected by intellectual property laws. Except as expressly permitted in your license agreement or
allowed by law, you may not use, copy, reproduce, translate, broadcast, modify, license, transmit, distribute, exhibit,
perform, publish, or display any part, in any form, or by any means. Reverse engineering, disassembly, or
decompilation of this software, unless required by law for interoperability, is prohibited.
The information contained herein is subject to change without notice and is not warranted to be error-free. If you find
any errors, please report them to us in writing.
If this is software or related documentation that is delivered to the U.S. Government or anyone licensing it on behalf of
the U.S. Government, then the following notice is applicable:
U.S. GOVERNMENT END USERS: Oracle programs, including any operating system, integrated software, any
programs installed on the hardware, and/or documentation, delivered to U.S. Government end users are "commercial
computer software" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental
regulations. As such, use, duplication, disclosure, modification, and adaptation of the programs, including any
operating system, integrated software, any programs installed on the hardware, and/or documentation, shall be subject
to license terms and license restrictions applicable to the programs. No other rights are granted to the U.S. Government.
This software or hardware is developed for general use in a variety of information management applications. It is not
developed or intended for use in any inherently dangerous applications, including applications that may create a risk of
personal injury. If you use this software or hardware in dangerous applications, then you shall be responsible to take all
appropriate fail-safe, backup, redundancy, and other measures to ensure its safe use. Oracle Corporation and its
affiliates disclaim any liability for any damages caused by use of this software or hardware in dangerous applications.
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their
respective owners.
Intel and Intel Xeon are trademarks or registered trademarks of Intel Corporation. All SPARC trademarks are used
under license and are trademarks or registered trademarks of SPARC International, Inc. AMD, Opteron, the AMD logo,
and the AMD Opteron logo are trademarks or registered trademarks of Advanced Micro Devices. UNIX is a registered
trademark of The Open Group.
This software or hardware and documentation may provide access to or information about content, products, and
services from third parties. Oracle Corporation and its affiliates are not responsible for and expressly disclaim all
warranties of any kind with respect to third-party content, products, and services unless otherwise set forth in an
applicable agreement between you and Oracle. Oracle Corporation and its affiliates will not be responsible for any loss,
costs, or damages incurred due to your access to or use of third-party content, products, or services, except as set forth
in an applicable agreement between you and Oracle.
The following restrictions and provisions only apply to the programs referred to in this section and licensed to you. You
acknowledge that the programs may contain third party software (VAR applications) licensed to Oracle. Depending
upon your product and its version number, the VAR applications may include:
(i) the MicroStrategy Components developed and licensed by MicroStrategy Services Corporation (MicroStrategy) of
McLean, Virginia to Oracle and imbedded in the MicroStrategy for Oracle Retail Data Warehouse and MicroStrategy
for Oracle Retail Planning & Optimization applications.
(ii) the Wavelink component developed and licensed by Wavelink Corporation (Wavelink) of Kirkland, Washington, to
Oracle and imbedded in Oracle Retail Mobile Store Inventory Management.
(iii) the software component known as Access Via™ licensed by Access Via of Seattle, Washington, and imbedded in
Oracle Retail Signs and Oracle Retail Labels and Tags.
(iv) the software component known as Adobe Flex™ licensed by Adobe Systems Incorporated of San Jose, California,
and imbedded in Oracle Retail Promotion Planning & Optimization application.
You acknowledge and confirm that Oracle grants you use of only the object code of the VAR Applications. Oracle will
not deliver source code to the VAR Applications to you. Notwithstanding any other term or condition of the agreement
and this ordering document, you shall not cause or permit alteration of any VAR Applications. For purposes of this
section, "alteration" refers to all alterations, translations, upgrades, enhancements, customizations or modifications of
all or any portion of the VAR Applications including all reconfigurations, reassembly or reverse assembly,
re-engineering or reverse engineering and recompilations or reverse compilations of the VAR Applications or any
derivatives of the VAR Applications. You acknowledge that it shall be a breach of the agreement to utilize the
relationship, and/or confidential information of the VAR Applications for purposes of competitive discovery.
The VAR Applications contain trade secrets of Oracle and Oracle's licensors and Customer shall not attempt, cause, or
permit the alteration, decompilation, reverse engineering, disassembly or other reduction of the VAR Applications to a
human perceivable form. Oracle reserves the right to replace, with functional equivalent software, any of the VAR
Applications in future releases of the applicable program.
Contents
Preface ................................................................................................................................................................. ix
Audience......................................................................................................................................................... ix
Documentation Accessibility ......................................................................................................................... ix
Related Documents ........................................................................................................................................ ix
Customer Support........................................................................................................................................... ix
Improved Process for Oracle Retail Documentation Corrections.................................................................. x
Oracle Retail Documentation on the Oracle Technology Network ............................................................... x
Conventions.................................................................................................................................................... x
1 Implementation
Required Skills............................................................................................................................................ 1-1
Batch Framework....................................................................................................................................... 1-1
Batch Processes Under the Control of the Implementer ....................................................................... 1-2
Batch Processes Not Under the Control of the Implementer ................................................................ 1-2
POM Jobs and Batch Exec Service....................................................................................................... 1-2
Batch Framework Service Catalog ........................................................................................................... 1-3
Batch Exec Service ............................................................................................................................... 1-3
Load Measures: measload ..................................................................................................................... 1-4
Export Measures: measexport ............................................................................................................... 1-5
Mace Calculation Service: calc............................................................................................................. 1-7
Load Hierarchies: hierload.................................................................................................................... 1-7
Export Hierarchy: hierexport ................................................................................................................ 1-8
Load PDS Dimension: loaddimdata...................................................................................................... 1-9
Trigger a BDI Export Flow: bdiexport ................................................................................................. 1-9
Wait for Trigger File: waittrigger ......................................................................................................... 1-9
Send a Trigger File: sendtrigger......................................................................................................... 1-10
Extract Input Files from Archive: unpack.......................................................................................... 1-11
Transform File Service....................................................................................................................... 1-11
Custom Function: ap_set_datr .......................................................................................................... 1-14
Convert Informal Positions to Formal: formalize .............................................................................. 1-15
Rename Positions in a Hierarchy: renamepositions........................................................................... 1-15
Synchronize NA value: syncna .......................................................................................................... 1-16
Workspace Refresh by Template Name: refresh .............................................................................. 1-16
Workspace Rebuild by Template Name: rebuild ............................................................................... 1-16
Workspace Delete by Template Name: delete ................................................................................... 1-16
Run Segment Build Queue: autobuild ............................................................................................... 1-17
Create Backup of Domain: backup .................................................................................................... 1-17
Initialize Testing Environment: initrpac ............................................................................................ 1-17
Execute Automated Tests: runrpac .................................................................................................... 1-18
Integrating Planning Version 19 with Merch and RI Version 22 and Above.................................... 1-18
Setting Up V19 Planning and V22 RMS/RI Integration ............................................................ 1-19
Export Measure Sample.............................................................................................................. 1-19
Transforming the File Service Sample ....................................................................................... 1-19
Automated Testing with RPAC ............................................................................................................. 1-20
Domain Creation ..................................................................................................................................... 1-21
SFTP Upload Location....................................................................................................................... 1-21
config .......................................................................................................................................... 1-22
batch_control .............................................................................................................................. 1-22
input ............................................................................................................................................ 1-22
jse ................................................................................................................................................ 1-22
Bootstrap Environment ...................................................................................................................... 1-23
OAT Parameters................................................................................................................................. 1-23
Config Name .............................................................................................................................. 1-23
Partition Dim .............................................................................................................................. 1-23
JSE Jar Files................................................................................................................................ 1-23
Input Archive .............................................................................................................................. 1-23
Batch Group ............................................................................................................................... 1-23
Overwrite .................................................................................................................................... 1-24
Domain Build ..................................................................................................................................... 1-24
2 In-Context Help
Navigating to Help Topics on RPASCE ................................................................................................... 2-1
Creating the Contextual Help Configuration File................................................................................... 2-3
Using JSON in the Contextual Help Configuration File....................................................................... 2-3
Structure of Contextual Help Configuration File.................................................................................. 2-3
Help Topic Building Block ................................................................................................................... 2-4
Key Naming Convention ...................................................................................................................... 2-5
JSON Structure of Contextual Help Configuration File ....................................................................... 2-5
Editing the Contextual Help Configuration File ................................................................................... 2-6
Batch Job Admin Installation .................................................................................................................. B-1
RPASCE Process Flows....................................................................................................................... B-3
Send Us Your Comments
Note: Before sending us your comments, you might want to check that you
have the latest version of the document and whether your concerns are already
addressed. To do this, access the Online Documentation available on the
Oracle Technology Network Web site. It contains the most current
Documentation Library plus all documents revised or released recently.
Preface
This Implementation Guide provides detailed information useful for implementing and
configuring the application. It helps you to understand the behind-the-scenes processing of the
application.
Audience
This document is intended for administrators and system implementers of RPASCE.
Documentation Accessibility
For information about Oracle's commitment to accessibility, visit the Oracle Accessibility
Program website at http://www.oracle.com/pls/topic/lookup?ctx=acc&id=docacc.
Related Documents
For more information, see the following documents in the Oracle Retail Predictive Application
Server Cloud Edition documentation set:
■ Oracle Retail Predictive Application Server Cloud Edition Configuration Tools User Guide
■ Oracle Retail Predictive Application Server Cloud Edition Release Notes
■ Oracle Retail Predictive Application Server Cloud Edition Security Guide
■ Oracle Retail Predictive Application Server Cloud Edition User Guide
Customer Support
To contact Oracle Customer Support, access My Oracle Support at the following URL:
https://support.oracle.com
When contacting Customer Support, please provide the following:
■ Detailed step-by-step instructions to re-create
■ Exact error message received
■ Screen shots of each step you take
Conventions
The following text conventions are used in this document:
Convention Meaning
boldface Boldface type indicates graphical user interface elements associated with an
action, or terms defined in text or the glossary.
italic Italic type indicates book titles, emphasis, or placeholder variables for which
you supply particular values.
monospace Monospace type indicates commands within a paragraph, URLs, code in
examples, text that appears on the screen, or text that you enter.
1
Implementation
RPASCE acts as a platform to create tailored solutions or migrate existing on-premise solutions
into the cloud. This guide addresses the process of preparing a custom solution for use in either
of these Cloud Service environments.
Because Oracle Retail Cloud Service applications do not support any back-end server access,
implementation is different from an RPAS on-premise implementation. The applications
provide online tools to cover all the necessary facets of an RPASCE application roll-out and
administration. This includes:
■ Building and patching domains from your custom configuration
■ Defining nightly, weekly, or ad hoc batch process sequences
■ Scheduling recurring batch processes
Required Skills
Since the implementations are based on a retailer- or implementer-provided configuration,
working knowledge of the RPASCE configuration tools is essential. The RPASCE
configuration tools are supported for offline use on a Windows 7 or 10 system. They are
available in the applicable Starter Kits, and their use is detailed in the Oracle Retail Predictive
Application Server Cloud Edition Configuration Tools User Guide.
In addition to supplying an RPASCE configuration, the implementer must also prepare the
retailer to provide RPASCE hierarchy and measure data load files, as well as to take RPASCE
exported measure data files for any downstream integration needs. While the implementer does
not call the RPASCE loadHier, loadMeasure, or exportMeasure utilities directly, knowledge of
their usage gained from the RPASCE Administration Guide is helpful.
Data files for loading into the applications and exported files for integration with other systems
are sent and received from the RPASCE cloud environment via an SFTP site. Knowledge of the
use of SFTP software, including an ability to automate such uploads and downloads, is a
necessary prerequisite for routine nightly or weekly batch processing jobs.
Batch Framework
RPASCE operations require that the administrative user, who does not have command-line
server access, be able to select, initiate, and schedule RPASCE batch activities.
The RPASCE platform includes an Online Administration Tool (OAT) capability, which allows
simple parameterization and scheduling of pre-configured batch tasks. RPASCE introduces
an enhancement to the OAT framework that allows a sequence of several batch tasks to be
defined. This sequence is built from a list of available batch services, such as measure loading,
calculation, segment workspace refresh, and so on. These service tasks run in a defined order,
so that you can know, for example, that your daily data updates have been loaded before your
workspace refresh tasks are run. The batch tasks are configured to run under the existing OAT
framework, so that scheduling them to run once, or on a repeating basis, is the same as for other
OAT tasks.
The batch task sequences are defined in a small set of text files, which are specified below, with
some examples.
In this example, the job_weekly is configured to execute the batch_weekly control set, which is
a set of tasks within the batch_exec_list that can be configured in the same way as you would if
not using POM. The "*" before the control set name indicates that this control set is optional,
that is, if it does not exist, the batch execution of the POM job just ignores it without reporting
any error.
For more details about the RPASCE Schedule for POM, refer to the Oracle Retail Predictive
Application Server Cloud Edition Administration Guide.
The first column is an identifier, which may be repeated on several lines to define a grouping of
tasks to be run together. The second column indicates which task from the catalogue is being
requested. The third column gives parameter details for that task (as necessary). Comments
may be placed in the batch_exec_list.txt file by starting a comment line with the hash sign (#).
Here is a sample batch_exec_list.txt file for reference:
# Daily Batch Cycle
daily | waittrigger | daily_upd.txt~ftp~3600
daily | unpack | daily_upd.tar.gz
daily | calc | exp_set
daily | measexport | daily_exp_set
daily | measload | load_oo_list
daily | sendtrigger | batch_load_complete.txt~ftp
daily | calc | batch_oo
daily | rebuild | rebuild_daily_group
In this sample file, three batch task groups are specified: daily, load_oo, and weekly. Note that
these names are implementer-defined identifiers; there is nothing special about the names
"daily" or "weekly". Each identifier is thus associated with a sequence of tasks, which will run
in the order they are listed in the file.
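As a further illustration, a weekly task group could chain the hierarchy load tasks described
later in this chapter ahead of a weekly calculation. The following lines are a sketch assembled
from the samples elsewhere in this guide, not a complete production configuration:
weekly | hierload | clnd~14~N
weekly | hierload | prod~~N
weekly | hierload | loc~14~N
weekly | calc | batch_week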
Note also that no information is provided about times or schedules on which these task groups
should be run. Scheduling information must be specified in the RPASCE Online Administration
Tool.
The services listed for each batch task group are run in the order specified when that type of
batch run is requested through the OAT interface. Details on the individual batch services and
what their service parameters mean are described in the following sections.
In this example, data files are to be copied from the incoming SFTP location, and if any files
for the listed measures are absent, an error condition will be reported. The measure drtyoou is
found in its own file, whereas the measures drtyoor and drtyooc are both to be loaded from the
same file. Note that the properties for those measures will need to have been set with the same
file name in the domain configuration in order for this feature to be used.
The Validate option checks for required data files in the <domain>/input directory, as well as
either the FTP or Cloud data location (if the C parameter is given). This allows the measload
task's Validate option to correctly detect files that were previously placed in the
<domain>/input location by an unpack batch task.
When measure data files are loaded, some lines in the file may be rejected (possibly due to an
incorrectly formatted input file or a position that does not exist in the dimension). The RPASCE
measure load process does not, by default, treat these rejected lines as errors and will continue
loading any valid lines from the rest of the file. In order to detect when rejected lines were
encountered, since the batch framework does not report this as an error, the loadmeas batch task
writes the rejected records count into its own log file and also creates a rejected records
warning file in the outgoing FTP area.
The warning file has no contents, providing all relevant information in the file name itself. The
file name indicates the name of the measure, the count of rejected records, and a timestamp to
indicate when the task was run.
In the example below, the measure apcpfcstslsu had four rejected records when it was loaded
on 26-April-2018 at 7:52am:
warning.eebatch_loadmeas.apcpfcstslsu.rejected.4.20180426075212
If the optional |R| parameter is given in the control file, the numerical value indicates a limit to
the number of rejected lines, above which the rejections will be reported as an error, rather than
a warning. For example, in the load_oo config shown above, the limit is given as 200. If, while
loading any particular measure in this load group, more than 200 rejected record lines are
detected, then the task will halt, reporting an error, and the batch sequence that includes this
task will also halt. In this way, if a badly formatted or corrupted data file was uploaded, later
batch steps such as calculations or workbook refreshes will not be run against bad data.
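Following the group, parameter, and value pattern used by the other control files, a measure
load control set with a rejection limit can be sketched as follows. These entries are illustrative
only; the measure names are taken from the example discussed above, and the exact file layout
is documented in the Oracle Retail Predictive Application Server Cloud Edition Administration
Guide:
load_oo|M|drtyoou
load_oo|M|drtyoor
load_oo|M|drtyooc
load_oo|R|200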
– I - Flag to use an individual output file for each measure (optional). The default file
name is measure_name.csv.ovr unless it is overridden by the output file name option
of the M parameter.
– S - File share destination. Keywords: ftp, temp, cloud:<app>, where <app> is one of:
ri, mfp, rdf, ap, rms. (For the rare case where multiple instances of a single RPASCE
application are to be deployed, the second instance may be integrated by using the
values mfp2, rdf2, or ap2.) The cloud:<app> keyword sends the output file to the
indicated Oracle RGBU Cloud Service application, if configured for your
environment. The temp keyword sends the output to an internal temporary location,
where it will not be accessible externally but can be used by other configured batch
tasks such as the transform file service (by specifying temp as the input value for the
subsequent task).
– C - Compress output (optional). If a single file is output, it is compressed as .gz; if
multiple files are output, they are compressed as .tar.gz.
– D - Delimiter. Optional character to use in place of comma; to select the | character as
the delimiter, specify the keyword "PIPE".
Note: D simply replaces all commas with the delimiter. It does not work
well with string measure values that include commas.
– U - Uppercase position name (optional). Does not have third column option. If it is
specified, the position name in the output file will be converted to an uppercase name.
– P - useDate parameter (optional). Requires either the value "start" or "end" for the
third column. It is used for a measure that has a high level on the CLND hierarchy (for
example, Mnth). When specified, it will replace the measure CLND level position of
each data record with the lowest CLND level (for example, Day) position
corresponding to that higher level position. The values start or end are used to
determine whether the starting day position or the ending day position of the
corresponding mnth period will be output.
– N - Specify when to skip NA values in export (optional). Valid options for the third
column are "never", "allna", and "anyna".
* never - Export the corresponding data point even if all of the measure's values
are NA values. This option essentially exports all data points in the logical space.
* allna - Do not output the corresponding data point if all measure values are NA
values (default mode).
* anyna - Do not output the corresponding data point if any one measure value is an
NA value.
– L - no-merge export. As part of the export measure operation, the measure is exported
into a file in each of the local domains. In the last step of the export process, the
exported files from the local domains are merged into a single file in the master
domain. If this parameter is specified, the export operation will skip the final step and
will not merge the exported data from all the local domains into a single file. Each file
will have a suffix with the name of the corresponding local domain. If the export
intersection is HBI (non-partitioned), this parameter is ignored. If the C option for
compression is specified along with the L option, then the local domain files will be
compressed separately as well.
– T - Append a unique identifier as a suffix to the file name (optional). This generates a
unique name to ensure that parallel exports can proceed. This flag can only be used
with default file naming, without the |O| flag, and is specifically designed to work
with the intradayexport() expression.
■ Parameter value. Relative to the parameter type selected above.
Here is an example control file for the Export Measure service:
# Export PoC Plan CP
lpcp|F|lpcpexportb
lpcp|S|ftp
lpcp|M|lpcpbopc
lpcp|M|lpcpbopr
lpcp|M|lpcpbopu
lpcp|M|lpcpeopc
lpcp|M|lpcpeopr
For the lpcp export group, the implementer has given a Filter Mask measure, has indicated that
the file will be published to the SFTP server location, and has given a list of several measures
to be included in the output.
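Optional parameters follow the same pattern. For example, the following lines, which are
illustrative additions to the lpcp group above, would compress the output, switch the delimiter
to the pipe character, and skip any data point in which any one measure value is NA:
lpcp|C|
lpcp|D|PIPE
lpcp|N|anyna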
The first column provides an identifier for each group of calc instructions. These identifiers are
used to select calculations to be run either directly, or as part of a Batch Exec run. (The
identifiers can match the Batch Exec identifier, but this is not required.) The second column,
which references either the (G)lobal or (L)ocal domain, is maintained for batch control file
compatibility with earlier RPAS versions, but is no longer used. All expressions will be
executed in the Global domain context. The G or L value is required; a runtime error will occur
if it is missing.
The third column indicates whether the calculation to be run is a registered rule group or an
individual expression. This value is also required and must be specified as either 'group' or
'expression'; a runtime error will occur if it is missing. The final column provides either the
name of the rule group to be executed or the text of the expression to be run.
As with the other control files, any line starting with # is ignored and can be used to comment
or document the file, as needed.
Here is an example file for the calculation service:
# Calc Set for Batch Aggregation Weekly
batch_week | G | group | Batch_GB
batch_week | L | group | Batch_AggW
batch_week | L | group | Batch_InvRoll
batch_week | G | expression | LTWPNSlsR = DRTYNSls1R+DRTYNSls2R
batch_week | G | expression | LTWPNSlsU = DRTYSls1U+DRTYSls2U-DRTYRtn1U-DRTYRtn2U
Each line in the hierarchy load control file provides the details for loading one hierarchy;
multiple hierarchies may be loaded, each on a separate line.
The parameter column provided in the Batch Exec file contains three values, separated by the ~
character. The values are: hierarchy to be loaded, purgeAge value, and whether User Defined
Dimensions (UDD) are included.
The example from the Batch Exec section, above, contains these sample values:
weekly | hierload | clnd~14~N
weekly | hierload | prod~~N
weekly | hierload | loc~14~N
This indicates that in the weekly batch execution, the CLND hierarchy with a purgeAge value
of 14 days is loaded, and there are no UDDs in the CLND hierarchy. Note that the purgeAge
value is omitted from the prod example line; this allows the value set in the configuration to be
used. Be aware that a run-time error will occur if a purgeAge value is set neither in the
configuration nor in the batch control file.
The Hierarchy Load task checks for data files first in the Cloud integration area (supports
app-to-app integration data sharing) and then in the incoming SFTP area.
For RMS-shared hierarchies that require User Defined Dimension (UDD) roll-up updates,
additional data files can be uploaded and these are processed by the hierload task as well. For
each UDD, you can provide a comma-separated values (CSV) file, with a filename of the
pattern: [udd dimension name].csv.dat.
Lines in the CSV file must contain three columns:
■ Source position
■ UDD position
■ UDD label
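For illustration only, a hypothetical brand.csv.dat UDD file (all positions and labels invented) might contain lines such as:

```
10001234,brand_acme,Acme
10001235,brand_acme,Acme
10001236,brand_zeta,Zeta
```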
values mfp2, rdf2, or ap2.) This sends the output file to the indicated Oracle RGBU
Cloud Service application, if configured for your environment.
■ Parameter Value, if required by the parameter type.
Here is an example control file for the Export Hier service:
# Export PROD hierarchy, compressed
prod_export|H|prod
prod_export|T|F
prod_export|O|prod_exp.dat
prod_export|C|
prod_export|S|ftp
In this example, the prod_export grouping indicates that only the formal positions in the PROD
hierarchy will be written to a compressed file prod_exp.dat.gz and placed in the outgoing SFTP
server location.
This task, when run, looks for <hier>.csv.dat files in the incoming SFTP area under the
subdirectory rdm_input/dimdata. If no incoming data files are found, a log message will
indicate this, and then the Batch process will continue without error.
By default, the waittrigger task waits up to 3 hours for the trigger file to appear before timing
out and reporting an error. A shorter timeout may optionally be specified, given as the number
of seconds to wait.
The waittrigger task requires only an entry in the batch_exec_list.txt control file; no separate
control file is required. As seen in the example above:
daily | waittrigger | daily_upd.txt~ftp~3600
This example daily batch task waits up to one hour for the file daily_upd.txt to be present in the
incoming FTP location. The third column uses the tilde (~) character as a separator and gives
two or three parameters:
■ the trigger file name. Simple file names only, no paths.
■ a location keyword that indicates where the trigger file will be found:
– ftp: the FTP server "input" directory
– cloud: the RGBU Cloud data share location, used when multiple RGBU Cloud apps
are integrated together
– input: the current domain's input directory
■ (optional) number of seconds to wait before timing out
When the trigger file is configured to be found in the FTP area, it should be placed under the
input directory (which will be the same directory location for any associated data files or data
file archives).
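The waittrigger behavior can be sketched as a simple polling loop. This is an illustration of the semantics described above, not the task's actual implementation:

```python
import os
import time


def wait_for_trigger(path: str, timeout_seconds: int = 3 * 3600,
                     poll: float = 1.0) -> bool:
    """Poll for a trigger file until it appears or the timeout elapses.

    Sketch of the waittrigger semantics (default timeout 3 hours); returns
    True if the file was found, False if the wait timed out.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll)
    return os.path.exists(path)
```

As in the batch example above, a real run would wait on a file such as daily_upd.txt with a 3600-second timeout.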
This control line indicates that the file batch_load_complete.txt will be created in the SFTP
area, once batch execution successfully reaches this point in the daily batch sequence.
Note that no automatic clean-up of the trigger file is performed, so other processes that look for
the presence of this trigger file must remove it. If a trigger file from the previous batch run is
still in place during a subsequent batch run, the file will remain in place and the file's timestamp
will be updated.
The task specifies that the archive file daily_upd.tar.gz is expected to be in the FTP input
directory, and it will be unpacked into the domain’s input directory before any subsequent batch
tasks are performed.
This example shows an input file split into multiple files using multiple 'X' options, selecting
columns by number. In the above example, the output files are created in the domain input
directory.
Example 2: To split a single file into multiple files based on column IDs and also to filter
records based on a column value.
rms_inv1|F|rms_inv.csv.ovr
rms_inv1|I|cloud
rms_inv1|V|
rms_inv1|L|5|N
rms_inv1|X|drtyeop1u.csv.ovr|1,2,3,6
rms_inv1|X|drtyeop1c.csv.ovr|1,2,3,7
rms_inv1|X|drtyeop1r.csv.ovr|1,2,3,8
In this example, the file is first filtered to only those records whose fifth column value is 'N';
those records are then split into multiple output files.
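The filter-then-split behavior can be sketched in Python. This is an illustration of the 'L' and 'X' option semantics on invented data, not the framework's implementation:

```python
import csv


def filter_and_split(in_path, filter_col, filter_val, outputs):
    """Sketch of 'L|5|N' followed by several 'X' entries.

    Keeps only rows whose filter_col (1-based) equals filter_val, then
    writes each output file with its own subset of columns.
    outputs: mapping of output path -> list of 1-based column numbers.
    """
    with open(in_path, newline="") as f:
        rows = [r for r in csv.reader(f) if r[filter_col - 1] == filter_val]
    for out_path, cols in outputs.items():
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            for r in rows:
                writer.writerow([r[c - 1] for c in cols])
```

For the control lines above, this corresponds to filter_col=5, filter_val="N", and one outputs entry per 'X' line (for example, drtyeop1u.csv.ovr mapped to columns [1, 2, 3, 6]).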
Example 3: To copy columns and swap columns before writing the output file.
rms_curh|F|rms_curr.csv.ovr
rms_curh|I|cloud
rms_curh|C|3
rms_curh|W|2|6
rms_curh|U|
rms_curh|X|curh.csv.dat|2,3
In this example, the original file only contains five columns. The third column is copied to the
end of the file as the sixth column due to the use of option 'C'. Then, columns 2 and 6 are
swapped due to the use of option 'W'. Finally, columns 2 and 3 are written out after duplicate
records are removed due to the use of option 'U'.
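The transformation in this example can be sketched in Python, following the option semantics described above (an illustration only, not the framework's code):

```python
def copy_swap_dedup(rows):
    """Sketch of 'C|3', 'W|2|6', 'U', and 'X|...|2,3' on 5-column records."""
    out, seen = [], set()
    for row in rows:
        row = list(row)
        row.append(row[2])               # C|3: copy column 3 to the end (column 6)
        row[1], row[5] = row[5], row[1]  # W|2|6: swap columns 2 and 6
        rec = (row[1], row[2])           # X|...|2,3: keep columns 2 and 3
        if rec not in seen:              # U: drop duplicate output records
            seen.add(rec)
            out.append(rec)
    return out
```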
Example 4: To add a constant value to a file and to join two columns based on a separator.
rms_patt3|F|rms_prod.csv.dat
rms_patt3|I|cloud
rms_patt3|L|22|NA|N
rms_patt3|A|BRAND
rms_patt3|J|34|22|_
rms_patt3|X|drdvprdattt.csv.ovr.3|1,34,35
In this case, it is necessary to add a constant value 'BRAND', concatenate it with another
column, and export both columns.
In this example, the original file only contains 33 columns. It is first filtered for records not
equal to 'NA' in column 22. Then it adds a constant value 'BRAND' in column 34. Then,
columns 34 and 22 are joined, using the separator '_' that is added as column 35. Finally, the
newly added columns 34 and 35 are extracted into an output file.
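This sequence of options can be sketched in Python on invented 33-column records. It is an illustration of the described semantics, not the framework's implementation:

```python
def add_constant_and_join(rows, filter_col=22, filter_val="NA",
                          constant="BRAND", join_col=22, sep="_"):
    """Sketch of 'L|22|NA|N', 'A|BRAND', 'J|34|22|_', and 'X|...|1,34,35'."""
    out = []
    for row in rows:
        if row[filter_col - 1] == filter_val:  # L|22|NA|N: drop rows equal to 'NA'
            continue
        row = list(row) + [constant]           # A|BRAND: constant becomes column 34
        row.append(row[33] + sep + row[join_col - 1])  # J|34|22|_: column 35
        out.append((row[0], row[33], row[34]))  # X|...|1,34,35
    return out
```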
Example 5: The following sample shows the use of the 'E' option to create an output file with a
different delimiter and the 'Z' option to compress the output file.
mfp_exp_ri|F|ri_mpop_plan.dat
mfp_exp_ri|F|ri_mpcp_plan.dat
mfp_exp_ri|I|temp
mfp_exp_ri|V|
mfp_exp_ri|X|W_RTL_PLAN1_PROD1_LC1_T1_FS.dat|4-
mfp_exp_ri|O|cloud:ri
mfp_exp_ri|E|PIPE
mfp_exp_ri|Z|RI_MFP_DATA|W_RTL_PLAN|dat|Y
In this example, the use of multiple 'F' options merges the two source files and creates one
output file containing only columns from column 4 onward, delimited by commas. However, the
final output file is written with a PIPE delimiter due to the use of option 'E'.
In addition, the use of the 'Z' option compresses the output files matching the pattern
'W_RTL_PLAN*.dat' created at the cloud:ri location into a single compressed file,
'RI_MFP_DATA.zip', and deletes the generated files after compression.
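The compress-and-delete behavior of the 'Z' option can be sketched with the standard zipfile module (an illustration of the semantics only):

```python
import glob
import os
import zipfile


def compress_outputs(pattern, zip_name, delete_after=True):
    """Sketch of the 'Z' option: bundle files matching a pattern into
    <zip_name>.zip and, when the trailing flag is 'Y', delete the originals."""
    files = sorted(glob.glob(pattern))
    with zipfile.ZipFile(zip_name + ".zip", "w", zipfile.ZIP_DEFLATED) as zf:
        for path in files:
            zf.write(path, arcname=os.path.basename(path))
    if delete_after:
        for path in files:
            os.remove(path)
    return files
```

For this example, the call would resemble compress_outputs("W_RTL_PLAN*.dat", "RI_MFP_DATA").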
Example 6: The following option shows the use of the 'M' option to copy a set of files from
one location to another location.
copy_dom_in|I|dom_in
copy_dom_in|O|ftp_out
copy_dom_in|M|*.dat|N
This example copies all files matching the pattern '*.dat' from the dom_in location to the
ftp_out location. Due to the use of option 'N', the input files are not deleted; the files are only
copied. To move the files instead, option 'Y' should be used.
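The copy-or-move behavior of the 'M' option can be sketched as follows (illustrative only; directory names are placeholders for the configured locations):

```python
import glob
import os
import shutil


def copy_files(src_dir, dst_dir, pattern, delete_src=False):
    """Sketch of the 'M' option: copy files matching a pattern; the trailing
    Y/N flag decides whether source files are removed (Y moves, N copies)."""
    copied = []
    for path in glob.glob(os.path.join(src_dir, pattern)):
        shutil.copy(path, dst_dir)
        if delete_src:  # 'Y' option: delete the source, turning copy into move
            os.remove(path)
        copied.append(os.path.basename(path))
    return sorted(copied)
```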
This control line indicates that the weekly batch task will look for a rename positions data file
for the PROD hierarchy, prod.rn.dat, and will carry out the renamings specified.
The example contains only one refresh group, with four template pattern names to match. All
workspaces in the global domain or any local domains that are built from templates matching
those patterns will be refreshed.
The example contains only one rebuild group, with two template pattern names to match. All
workspaces in the global domain or any local domains that are built from templates matching
those patterns will be rebuilt.
Note that nothing is required after the second pipe (|) character.
Note: Incrementally adding test collateral files is not supported; previous
file sets of each type are removed before unpacking the new archive, so any
updated archive must contain all collateral files of that type. This prevents
stale test scripts or data files from being left in the testing environment,
which could otherwise cause unexpected test failures.
The second task carried out by initrpac is to place the contents of the input.tar.gz (or input.zip)
into the <domain>/input directory. This will be used to place any hierarchy load (.dat) or
measure load (.ovr, .clr, .rpl) files into position so that subsequent batch tasks may set the
domain into a known state, ready for automated tests to run and verify the expected result
values.
The initrpac task entry in the batch_exec_list.txt control file does not require any parameters. It
would normally be placed as the first entry in a test-enabled alternate version of a daily or
weekly batch execution sequence. See the full example in Automated Testing with RPAC.
rpac_validate | initrpac |
In this case, the test file RT01_MT_WB.xml will be executed under an identifying title
"MFPCS_Sample_Test1". See Automated Testing with RPAC for a full example of a
test-enabled batch execution sequence.
Summary test results will be visible in the output log for the batch execution (visible in the
Online Administration dashboard), and full test result details will be available in the log file
archive that is sent to the FTP server after the batch execution completes.
Note: If all Cloud Services are version 19, then SFTP will continue to be
used. Additionally, SFTP support for integration with an on-premise or
external system is not impacted.
The following destinations are supported for this integration in RPASCE batch control files.
■ cloud:rms
■ cloud:rms2
■ cloud:ri
■ cloud:alloc
Note: cloud:rms, cloud:ri, cloud:alloc are already being used with GBUCS,
and these are the folder structures in the SFTP server. cloud:rms2 is new and
can be used if integration with two Merchandising Environments hosted on
CFS is required (for example, integration with MFCS STAGE and PROD).
In this sample, ftp is the destination that moves the file to the SFTP server.
When integrating V19 Planning with V22 Merchandising, the entry will look like this.
# Export PoC Plan CP
lpcp|F|lpcpexportb
lpcp|S|cloud:rms
lpcp|M|lpcpbopc
lpcp|M|lpcpbopu
lpcp|M|lpcpeopc
lpcp|M|lpcpeopu
If there is a requirement to send the file to more destinations, then an additional entry can be
added.
# Export PoC Plan CP
lpcp|F|lpcpexportb
lpcp|S|cloud:rms
lpcp|S|cloud:ri
lpcp|M|lpcpbopc
lpcp|M|lpcpbopu
lpcp|M|lpcpeopc
lpcp|M|lpcpeopu
Automated Testing with RPAC
rms_oo|V|
rms_oo|X|drtyoou.csv.rpl|1,2,3,6
rms_oo|X|drtyooc.csv.rpl|1,2,3,7
sequence, if needed. Here is an example batch execution sequence that shows how an existing
weekly batch specification might be augmented with RPAC tests:
# Standard Weekly Batch Cycle
weekly | unpack | weekly_sales.tar.gz~ftp
weekly | hierload | prod~14~N
weekly | hierload | loc~14~N
weekly | measload | load_oo_list
weekly | calc | batch_fcst
weekly | autobuild |
The first section, labeled "weekly", represents a weekly batch sequence that might run at
midnight every Saturday. Note that updated hierarchy and measure data files for the week are
sent through FTP in an archive file named "weekly_sales.tar.gz" using the unpack task.
The second section shows how the weekly batch sequence has been augmented with RPAC
tests and named "validate". Note that the unpack task from the weekly sequence has been left
out, and in its place initrpac is called to place the test data input files into the domain. If new or
updated RPAC test collateral files have been placed on the FTP server, they will be brought in
at this point and used.
There are two sets of RPAC tests in this sequence, specified by the runrpac task entries. The
first runs immediately after the hierarchy and measure files are loaded, and validates expected
values in the domain. The second test set is executed after some further calculations have been
run, and builds one or more segments, then validates values within them as well.
When RPAC-enabled batch sequences are run, the primary log file, which is available through
the Online Administration dashboard as well as through the FTP log archive package, will
show a brief summary of test results. Full test details and log files are available in the complete
log archive package from the batch exec run, available in the FTP area once the execution has
completed.
For full details on the contents of an RPAC test .xml file, and all the tags and attributes that are
available for specifying RPAC tests, see "Appendix B: RPAS Test Automation" in Oracle
Retail Predictive Application Server Fusion Client Administration Guide. Note that the latest
version of this guide specifies which RPAC features are available for Cloud deployments. Due
to Cloud security constraints, some RPAC features, primarily the <SHELL> tag, have been
disabled; however, inclusion of RPAC tests as a step in existing batch execution sequences
should fully compensate for this restriction.
Domain Creation
This section describes domain creation.
they are available for use by the application. Two methods are available for sending this
transfer signal:
■ To move individual files one at a time, after uploading each file, a second file with the
same file name but with ".complete" appended must be uploaded to the same directory. For
example, after uploading the input file archive measloads.tar.gz to the incoming input
directory, the file measloads.tar.gz.complete must also be uploaded to the input
directory.
■ For multiple uploaded files, including files in different directories, ensure that there is a
directory at the top level of the incoming SFTP area called "COMMAND" (which must be
uppercase). Into that directory place a file "COMPLETE" (which must be uppercase).
Note that this does not need to be done in every subdirectory, only at the top level. An
important note here is that there is no guarantee regarding the order in which the several
uploaded files will be transferred to the internal file area where the RPASCE applications
can see them, and in particular, larger files will take longer to transfer than smaller files.
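The two signalling conventions can be sketched against a local stand-in for the incoming SFTP area. Real uploads would go over SFTP; all paths here are invented placeholders:

```python
from pathlib import Path

# Local stand-in for the incoming SFTP area (illustrative paths only).
incoming = Path("incoming_sftp")
(incoming / "input").mkdir(parents=True, exist_ok=True)

# Convention 1: per-file marker -- upload the archive, then "<name>.complete".
archive = incoming / "input" / "measloads.tar.gz"
archive.write_bytes(b"")  # stands in for the uploaded archive
(archive.parent / (archive.name + ".complete")).touch()

# Convention 2: one top-level COMMAND/COMPLETE marker covering many uploads.
(incoming / "COMMAND").mkdir(exist_ok=True)
(incoming / "COMMAND" / "COMPLETE").touch()
```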
For the purposes of building the domain, four subdirectories in the SFTP site are used:
config
For uploading the domain configuration into the cloud environment, create an archive (either
.zip or .tar.gz) containing the contents of the config directory (without the top level config
folder). This archive file must be named as <config_name>_config.zip or
<config_name>_config.tar.gz. This archive file must be placed in the config subdirectory on the SFTP server. It
may be updated as often as necessary in support of domain build or patch activities. Remember
to signal the internal transfer of these files from the incoming SFTP area by one of the two
methods detailed above.
Example
The ascs_config.zip must contain the following contents:
■ ascs folder - this is the folder with ascs application configuration.
■ ascsDashboardSettings.json - required for dashboard.
■ ascsHelpConfig.json - required for Help.
batch_control
The set of batch process control files, as detailed in the previous section, must be uploaded to
the batch_control subdirectory within the incoming SFTP location. These files are placed into
the domain environment when the domain is built and can be updated later by running the
domain patch task. Remember to signal the internal transfer of these files from the incoming
SFTP area by one of the two methods detailed above.
input
The initial domain creation process requires at least the .dat files for all hierarchies specified in
the domain configuration. Normally, it is desirable to have an initial set of measure data load
files available at domain build time as well. All of these files must be placed in the input
directory of the incoming SFTP location. In addition to the domain build and patch processes,
any use of the measload or hierload tasks in the batch framework always checks for incoming
data files from this directory. Remember to signal the internal transfer of these files from the
incoming SFTP area by one of the two methods detailed above.
jse
RPASCE includes optional support for one or more Java Special Expression (JSE)
extension libraries. JSE extensions are encoded into one or more Java .jar files, and the new
expressions may then be used in your domain configuration. To include your .jar file(s) in the
domain environment, they must be uploaded via SFTP to a subdirectory named "jse". The .jar
file(s) must then be named in the Bootstrap OAT parameters, as described in JSE Jar Files.
Bootstrap Environment
A newly provisioned RPASCE cloud environment is set up with a bootstrap configuration that
allows the implementer to log into the RPASCE Client and access the Online Administration
Tool (OAT) interface before the domain has been built. The bootstrap OAT configuration
allows only tasks required to construct a domain. Once the domain has been constructed, both
the domain tasks and activities as well as the bootstrap activities will be available. This allows
the domain to be re-built from scratch multiple times, should this be required.
OAT Parameters
A few parameters must be specified when initiating a domain build process through OAT. The
implementer must supply these values:
Config Name
The name under which the configuration has been saved. For those familiar with the RPASCE
domain construction process, this is the name that is internally passed as the -cn parameter to
rpasInstall. A drop-down list offers choices based on the available domain config archive files
in the incoming FTP area.
Partition Dim
The dimension on which the domain will be partitioned. The domain is constructed with one
subdomain for each position in the given dimension. This must be a level of separation that fits
with the intended workflow for individual users so that, when possible, most users' daily tasks
relate to only one subdomain. This lessens contention when many users are active in the
system.
Input Archive
If input files with hierarchy and measure data are being sent in an archive file as described
above, the archive file name must be given here. A drop-down list offers the names of any
archive files in the FTP input directory area. If individual data files are being sent, this selection
may be left blank.
Batch Group
Once a domain has been built successfully, a named group of batch operations may be specified
(typically including measure data loads and mace calculations). This operation sequence must
be one batch_type entry in the Batch Exec control file, batch_exec_list.txt (described above in
Batch Exec service section).
Overwrite
If the domain has already been built once and must be rebuilt from scratch, which might occur
because a non-patchable change has been made to the configuration, this option must be
selected. If it is left in the default unselected state, the domain build process halts and reports
an error rather than overwrite the existing domain.
Domain Build
The domain build process automatically carries out the following steps:
1. Basic validation of the given config name and partition dimension.
2. Ensure that a configuration with the given config name has been uploaded.
3. If the overwrite flag is false, ensure that there is no existing domain; an error is reported if
a domain exists.
4. If the overwrite flag is true, remove the existing domain.
5. Build the domain using the config name and the partition dimension as specified in the
OAT parameter screen.
6. Copy any users and user groups from the bootstrap domain environment into the domain
environment.
7. Copy the uploaded batch control text files into the domain from the SFTP location.
8. Run post-domain-build batch group.
9. Add the domain details into the provisioned RPASCE Client configuration.
Once the Bootstrap Domain task has completed, you only need to log out of the RPASCE
Client and then log back in again to see the tasks and menus associated with your newly built
domain. (It is no longer required to restart the RPASCE Client, and this option has been
removed from the OAT menus.)
This chapter describes how to configure In-Context Help for solutions based on RPASCE.
In-Context Help is a resource to access relevant help topics, in the format of html and video,
within the application. At present, it focuses on help topics related to the dashboard and the
workspace. The In-Context Help file is located under config/Help in the RPASCE Client install
folder. The naming convention is <solution-name>HelpConfig.json.
Dashboard
The help topics for the dashboard are added to the following two levels:
■ All: The generic topics related to MFP or A&IP are added to this level.
■ Report: This consists of topics related to dashboards such as the effective usage, how to
analyze the metrics, and so on.
Figure 2–1 shows the view of a dashboard.
The help topics for the dashboard are visible on the right side panel, as shown in Figure 2–2.
Workspace
The workspace contains the actual content related to MFP or A&IP. Here the topics are aligned
with respect to the different levels of the Taskflow.
Figure 2–3 illustrates the workspace for the product MFPRCS.
guide/output/introduction.htm#introduction",
"type" : "document",
"imageSrc" : "",
"color" : "turquoise"
} ],
"reports" : {
"helpTopics" : [ ],
"reports.dashboards.id" : {
"helpTopics" : [ {
"name" : "Using the dashboard",
"description" : "Manipulate the dashboard in order to effectively analyze plan metrics",
"url" : "http://docs.oracle.com/cd/E75764_01/merchfinplan/pdf/cloud/161/html/retail_implementer_guide/output/dashboard.htm#dashboard",
"type" : "document",
"imageSrc" : "",
"color" : "turquoise"
} ]
}
},
"workbooks" : {
"maxTopics" : "2.0",
"helpTopics" : [ ],
"mfprcs.Activity1.Task1" : {
"helpTopics" : [ {
"name" : "Overview of Merch Plan Targets",
"description" : "Learn about the steps associated with creating and monitoring targets",
"url" : "http://docs.oracle.com/cd/E75764_01/merchfinplan/pdf/cloud/161/html/retail_implementer_guide/output/CreateMerchPlanTargets.htm#create_merch_plan_targets_task",
"type" : "document",
"imageSrc" : "",
"color" : "turquoise"
} ]
}
}
To add a topic under sub-level Step, the implementer must search for the Step key and add
the help topic. For editing, the implementer must search for a particular help topic and edit
any of the properties as required.
■ Adding or editing the help topic for sub-level View under level WORKBOOKS.
To add a topic under sub-level View, the implementer must search for the View key and
add the help topic. For editing, the implementer must search for a particular help topic and
edit any of the properties as required.
This appendix describes all non-success exit codes from the Batch Framework services and
batch administration tasks.
All EE batch scripts have consistent exit codes. Codes from 1 to 22 come from the BSA
framework (although only 6 and 13 are commonly used by EE batch and so are included in the
table below). Codes of 30 and above are from EE batch scripts themselves and are also listed in
Table A–1.
Table A–1 lists the common (non-success) exit codes from the EE batch scripts and the BSA
framework.
Note that in a live OCI-provisioned environment, it is not expected that customers will see any
of these error codes except 31 through 33. These codes indicate issues in the customer-provided
batch config files.
Table A–2 lists additional exit codes from eebatch_exporthier.ksh, eebatch_exportmeas.ksh,
eebatch_loadhier.ksh, and eebatch_loadmeas.ksh that result from the exit codes of the
underlying RPASCE binary utilities (exportHier, exportMeasure, loadHier, and loadMeasure).
The exit codes from the binary utilities are reported by the EE Batch Framework as being 100
more than the raw utility results. This prevents overlap between the BSA/EE script result codes
and the RPASCE binary utility result codes. For example, if loadHier itself returns an error
code of 5, the EE batch framework reports the error as code 105.
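The offset rule amounts to simple arithmetic, sketched here for clarity:

```python
def reported_exit_code(raw_utility_code: int) -> int:
    """The EE Batch Framework reports RPASCE utility exit codes offset
    by 100, keeping them distinct from BSA/EE script codes (sketch)."""
    return 100 + raw_utility_code
```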
It is not expected that customers will encounter any of the RPASCE exceptions, internal errors,
or C++ exceptions, which indicate corrupted data or a programming error.
3. Run the deployer script to create the data sources and deploy BDI RPAS Batch Job Admin.
./bdi-job-admin-deployer.sh -setup-credentials -deploy-job-admin-app
These are the details for the credential aliases used in the installation.
■ bdiAppServerAdminServerUserAlias: WebLogic admin server credentials
■ bdiJobAdminUiUserAlias: Credentials for Admin Role user for Job Admin app
■ bdiJobOperatorUiUserAlias: Credentials for Operator Role user for Job Admin app
■ bdiJobMonitorUiUserAlias: Credentials for Monitor Role user for Job Admin app
■ bdiJobAdminDataSourceUserAlias: Credentials for the Data Source of the Job Admin
Schema
■ bdiRxmReceiverServiceDataSourceUserAlias: Credentials for the Data Source of the
Job Receiver Schema
■ batchInfraDataSourceUserAlias: Credentials for the Data Source of the Batch Infra
Schema
■ ItemHdrAndUdaItemLov_Fnd_ProcessFlow_From_RMS
■ Supplier_Fnd_ProcessFlow_From_RMS
■ Inventory_Tx_ProcessFlow_From_RMS
■ OnOrder_Tx_ProcessFlow_From_RMS
■ StockOut_Tx_ProcessFlow_From_RMS
■ TranData_Tx_ProcessFlow_From_RMS
■ WeeklySales_Tx_ProcessFlow_From_RMS