Advanced Datastore Object
The DataStore object (advanced) is the central object for data storage and data consolidation in
the BW system.
If the required properties are set accordingly, the DataStore object (advanced) can be used in the
various layers of the data warehouse. To simplify the modeling of the DataStore object
(advanced), we provide templates that can be used to generate the required properties. The
DataStore object (advanced) can contain InfoObjects and fields. This allows you to load data into
the BW without having to assign InfoObjects. Thanks to its new request management, the DataStore object (advanced) is particularly well suited to frequent loading and large amounts of data.
The DataStore object (advanced) consists of a maximum of three tables: the inbound table, the change log, and the table of active data.
You might not need all three tables. This depends on how you want to use the DataStore object.
The data is initially loaded into the inbound table. The data is either read from the inbound table
directly, or it is activated first and then read from the table of active data. This depends on how
the data is used. The change log contains the change history for the delta update from the
DataStore object to other data targets.
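The following Python sketch is purely conceptual (it is not SAP code, and the field names MATERIAL and QUANTITY are invented for illustration). It only shows the roles of the three tables: loading fills the inbound table, activation moves the data to the table of active data, and the change log records what changed for later delta updates.

```python
# Conceptual model of the three ADSO tables (not SAP code).
# Field names such as MATERIAL and QUANTITY are illustrative only.

inbound_table = []   # every load request lands here first
active_table = {}    # activated data, keyed by the semantic key
change_log = []      # change history, used for delta updates to other targets

def load(request_records):
    """Loading always writes to the inbound table first."""
    inbound_table.extend(request_records)

def activate():
    """Activation moves the data to the active table and logs the change."""
    for rec in inbound_table:
        key = rec["MATERIAL"]
        change_log.append({"old": active_table.get(key), "new": rec})
        active_table[key] = rec
    inbound_table.clear()

load([{"MATERIAL": "M1", "QUANTITY": 10}])
activate()
print(active_table)   # {'M1': {'MATERIAL': 'M1', 'QUANTITY': 10}}
print(change_log)     # [{'old': None, 'new': {'MATERIAL': 'M1', 'QUANTITY': 10}}]
```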
Modeling Properties
The modeling properties enable you to control how you use your DataStore object (advanced).
Activate Data: In general, the data is always written to the inbound table. If you choose Activate
Data, the data is written to the table for active data (during the activation/compression process).
Write Change Log: If you choose this option, the delta (new and changed records) is saved in
the change log. The change log is used to extract the delta. You can only delete data from the
DataStore object if the object has a change log.
Keep Inbound Data, extract from Inbound Table: If you choose this option, no data is
saved in the change log. The extraction process always reads the data in the inbound table again
- for delta extraction or full extraction.
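To make the Write Change Log option concrete, here is a conceptual Python sketch (not SAP code; the record modes and field names are simplified assumptions). During activation, an overwritten record produces a before image and an after image in the change log, which is exactly what a downstream target needs to apply the delta.

```python
# Conceptual delta calculation during activation (not SAP code).
# Simplified record modes: 'N' = new image, 'X' = before image, '' = after image.

active_table = {"M1": {"MATERIAL": "M1", "QUANTITY": 10}}
change_log = []

def activate(inbound_records):
    for rec in inbound_records:
        old = active_table.get(rec["MATERIAL"])
        if old is None:
            change_log.append({**rec, "RECORDMODE": "N"})
        else:
            change_log.append({**old, "RECORDMODE": "X"})   # old version of the record
            change_log.append({**rec, "RECORDMODE": ""})    # new version of the record
        active_table[rec["MATERIAL"]] = rec

activate([{"MATERIAL": "M1", "QUANTITY": 25},
          {"MATERIAL": "M2", "QUANTITY": 5}])

# A delta-capable target subtracts before images and adds new/after images:
# M1 -> 10 - 10 + 25 = 25, M2 -> 0 + 5 = 5.
for entry in change_log:
    print(entry)
```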
Unique Data Records: If you only load unique data records (data records with non-recurring key
combinations) into the DataStore object, you can select this property. This means the system
does not check whether the record already exists. You have to be sure that no duplicate records
are loaded. This means that the table of active data will only contain unique data records. Data
aggregation is not allowed.
Snapshot Support: If your DataSource only allows full updates, you can use the Snapshot Support flag to make sure that records deleted in the source are also updated as deletions. Upon activation, the
system recognizes records that are in the table of active data but not in the load request. These
are written to the change log as reverse images.
Note: Make sure that all records are contained during every load, as data could otherwise be
lost.
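The following conceptual Python sketch (not SAP code, with invented field names) illustrates the snapshot logic described above: the full load is compared with the table of active data, and records that are missing from the load are written to the change log as reverse images.

```python
# Conceptual sketch of Snapshot Support during activation (not SAP code).

active_table = {
    "M1": {"MATERIAL": "M1", "QUANTITY": 10},
    "M2": {"MATERIAL": "M2", "QUANTITY": 5},
}
change_log = []

def activate_snapshot(full_load):
    loaded_keys = {rec["MATERIAL"] for rec in full_load}
    # Records found in the active table but missing from the full load are
    # treated as deleted and written to the change log as reverse images ('R').
    for key in list(active_table):
        if key not in loaded_keys:
            change_log.append({**active_table.pop(key), "RECORDMODE": "R"})
    # Remaining records are updated as usual (before images omitted for brevity).
    for rec in full_load:
        active_table[rec["MATERIAL"]] = rec
        change_log.append({**rec, "RECORDMODE": ""})

activate_snapshot([{"MATERIAL": "M1", "QUANTITY": 12}])   # M2 is missing from the load
print(change_log)   # M2 appears as a reverse image
```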
Direct Update: With this property, you create a DataStore object (advanced) for direct writing. The data is then written directly to the table of active data using a DTP or an API.
All Characteristics are Key, Reporting on Union of Inbound and Active Table
If you select this property, all the characteristics are included in the key. The system accesses
the inbound table and the active table (using a union across both tables) in the query. In this
case, you should only load additive deltas. The data is aggregated. The properties are
comparable to those of an InfoCube.
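As a conceptual Python sketch (not SAP code; the characteristics and the key figure are invented for illustration), this shows the InfoCube-like behavior: records with the same characteristic combination are aggregated rather than overwritten, and the query reads the union of the inbound and active tables.

```python
from collections import defaultdict

# Conceptual sketch of the InfoCube-like behavior (not SAP code): all
# characteristics belong to the key, so the key figure is aggregated,
# never overwritten, and only additive deltas should be loaded.

inbound_table = [
    {"MATERIAL": "M1", "PLANT": "P1", "QUANTITY": 10},
    {"MATERIAL": "M1", "PLANT": "P1", "QUANTITY": 5},   # additive delta for the same key
]
active_table = [
    {"MATERIAL": "M1", "PLANT": "P1", "QUANTITY": 100},
]

def query():
    """A query reads the union of the inbound and active tables and aggregates."""
    totals = defaultdict(float)
    for rec in inbound_table + active_table:
        totals[(rec["MATERIAL"], rec["PLANT"])] += rec["QUANTITY"]
    return dict(totals)

print(query())   # {('M1', 'P1'): 115.0}
```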
Planning Mode
If you set the Planning Mode flag, you can use the DataStore object (advanced) for planning.
Inventory
If you set the Inventory flag, you can use non-cumulative key figures in your DataStore object.
You can then add non-cumulative key figures to the DataStore object (advanced) and define
validity characteristics.
SAP HANA Dynamic Tiering
If you use SAP IQ as extended storage for your BW system, you can set the SAP HANA Dynamic Tiering flag. The data is then only saved in the persistency layer. If you set this flag at
a later point in time when the tables already contain data, a remodeling request is created, which
you can start in the Remodeling Monitor. Otherwise, the tables are written directly to SAP IQ. If
the definition of the DataStore object (advanced) necessitates activation, this is performed in
ABAP.
You can use the DataStore object (advanced) in various data warehouse layers by selecting the
required templates or by selecting the required properties.
The templates are structured according to two aspects: by the data warehousing layers and by the classic BW InfoProviders.
If you are acquainted with the classic BW InfoProviders and want to work with the modeling for
the DataStore object (advanced), you can choose your template from the Classic
Objects category. If you want to work with the layer architecture, you can choose your template
from the Enterprise Data Warehouse Architecture category.
1) Data Acquisition Layer (Including Corporate Memory) = Write-Optimized DSO
With the template for the data acquisition layer, a DataStore object (advanced) is created with
fields.
This type of modeled object corresponds to a persistent staging area (PSA) and acts as an
incoming storage area in BW for data from source systems.
The corporate memory contains the complete history of the loaded data.
It represents an intermediate layer. The data is posted to other DataStore objects (advanced)
that serve as architected data marts. It is filled separately from the update in the architected data
marts. The template for the corporate memory is used to create a DataStore object (advanced)
with InfoObjects or fields. Fields are useful if you load data from external sources or if you want
flexible modeling and you only want to assign InfoObjects at a higher point in the data flow.
The corporate memory can have two flavors: Focus on compressing data, or focus on reporting
and analyzing data. For the corporate memory with focus on compression, the Activate
Data property is selected under Properties:
The requests are loaded into the inbound table. The data is stored at a granular level of detail.
If the data is not required with this level of detail, it can be compressed in order to save space.
Before you activate (and thereby compress) the data, make sure that all the data has been
updated from the inbound table using delta and that all the data is consistent. During activation,
the data is aggregated in accordance with the semantic key and is written to the active data
table. In the query, you will then only see the data that has been activated. To save memory
space, the change log is not filled. Therefore you cannot perform request-based deletion of data
from the DataStore object. You can only delete data selectively.
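The following conceptual Python sketch (not SAP code) illustrates this compression-oriented activation, assuming a summation key figure and an invented semantic key: the inbound requests are aggregated by the semantic key, the granular detail is dropped, and without a change log only selective deletion remains possible.

```python
from collections import defaultdict

# Conceptual sketch of activation as compression (not SAP code), assuming a
# summation key figure. The semantic key here is (MATERIAL, CALMONTH).

inbound_table = [
    {"MATERIAL": "M1", "CALMONTH": "202401", "QUANTITY": 10},
    {"MATERIAL": "M1", "CALMONTH": "202401", "QUANTITY": 7},
    {"MATERIAL": "M2", "CALMONTH": "202401", "QUANTITY": 3},
]
active_table = {}

def activate_compress():
    """Aggregate the inbound records by the semantic key; the detail is dropped."""
    totals = defaultdict(float)
    for rec in inbound_table:
        totals[(rec["MATERIAL"], rec["CALMONTH"])] += rec["QUANTITY"]
    for (material, calmonth), qty in totals.items():
        row = active_table.setdefault(
            (material, calmonth),
            {"MATERIAL": material, "CALMONTH": calmonth, "QUANTITY": 0.0})
        row["QUANTITY"] += qty
    inbound_table.clear()

def delete_selectively(material):
    """Without a change log there is no request-based deletion, only selective deletion."""
    for key in list(active_table):
        if key[0] == material:
            del active_table[key]

activate_compress()
print(active_table)   # three inbound rows compressed to two active rows
delete_selectively("M2")
print(active_table)
```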
For the corporate memory with focus on reporting and analysis, Activate Data and Keep
Inbound Data, extract from Inbound Table are selected under Properties:
This type of object also stores data at granular level. The data can be activated, but is stored
redundantly in the inbound table in order to prevent the detailed information from being lost. This
also makes it possible to delete the data from the active table and to create it again from the
inbound table.
The data is only extracted from the inbound table. When a query is executed, the active table is
accessed.
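A conceptual Python sketch (not SAP code, with invented field names) of this reporting-oriented flavor: because the inbound data is kept, the active table can be deleted and rebuilt from the inbound table at any time, while extraction always re-reads the inbound table.

```python
# Conceptual sketch: keep the inbound data and rebuild the active table on demand
# (not SAP code). Extraction reads the inbound table; queries read the active table.

inbound_table = [
    {"REQUEST": 1, "MATERIAL": "M1", "QUANTITY": 10},
    {"REQUEST": 2, "MATERIAL": "M1", "QUANTITY": 25},
]
active_table = {}

def activate():
    """Overwrite by key; the inbound detail is kept, not cleared."""
    for rec in sorted(inbound_table, key=lambda r: r["REQUEST"]):
        active_table[rec["MATERIAL"]] = rec

def rebuild_active():
    """The active table can be deleted and recreated from the inbound table."""
    active_table.clear()
    activate()

def extract(from_request=0):
    """Delta or full extraction always re-reads the inbound table."""
    return [rec for rec in inbound_table if rec["REQUEST"] > from_request]

activate()
rebuild_active()
print(active_table)      # queries read the active table
print(extract(1))        # extraction reads the inbound table (request 2 only)
```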
4) Data Warehouse Layer - Delta Calculation = Standard DSO
The Data Warehouse Layer can have two different flavors: with delta calculation or as a data mart.
With the template for the Data Warehouse Layer with delta calculation, the Activate
Data and Write Change Log properties are selected under Modeling Properties:
You can also choose the optional property Unique Data Records, if you are only loading unique
data records.
Requests are loaded into the inbound table. If you want to execute a query on these requests,
they must be activated first. The data is written to the active data table, and the history is stored
in the change log. The change log is also used for the rollback, so that activated requests can
also be deleted again.
This type of modeled object corresponds to a standard DataStore object (classic). Unlike the InfoCube-like DataStore object, it does not provide stable navigation during reporting. When a query is executed, the active table is accessed.
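The following conceptual Python sketch (not SAP code) illustrates how the change log makes a rollback of activated requests possible: each request leaves new, before, and after images in the change log, and rolling back a request replays those images in reverse.

```python
# Conceptual sketch of request rollback via the change log (not SAP code).

active_table = {}
change_log = []   # entries: (request_id, recordmode, record)

def activate(request_id, records):
    for rec in records:
        key = rec["MATERIAL"]
        old = active_table.get(key)
        if old is None:
            change_log.append((request_id, "N", dict(rec)))
        else:
            change_log.append((request_id, "X", dict(old)))   # before image
            change_log.append((request_id, "", dict(rec)))    # after image
        active_table[key] = dict(rec)

def rollback(request_id):
    """Undo an activated request using its change log images."""
    for req, mode, rec in reversed(change_log):
        if req != request_id:
            continue
        if mode == "N":
            active_table.pop(rec["MATERIAL"], None)   # record did not exist before
        elif mode == "X":
            active_table[rec["MATERIAL"]] = rec       # restore the old version
    change_log[:] = [e for e in change_log if e[0] != request_id]

activate(1, [{"MATERIAL": "M1", "QUANTITY": 10}])
activate(2, [{"MATERIAL": "M1", "QUANTITY": 25}])
rollback(2)
print(active_table)   # back to {'M1': {'MATERIAL': 'M1', 'QUANTITY': 10}}
```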
The Data Warehouse Layer (Data Mart) contains the objects that are used to perform queries for
analysis.
With the Data Warehouse Layer (Data Mart) template, the Activate Data and All
Characteristics are Key, Reporting on Union of Inbound and Active Table properties are
selected under Modeling Properties:
This type of modeled object corresponds to a standard InfoCube.
The inbound table corresponds to the InfoCube's F table, while the active data table corresponds
to the E table.
Reporting on this type of DataStore object is consistent and provides stable navigation. A query
can be executed straight after loading; you do not need to activate the data beforehand. You can load deltas, for example from another DataStore object (advanced). The data is aggregated, so key figures cannot be overwritten, for example. As the change log is not filled, you
cannot delete any data from the DataStore object.
When a query is executed, the active table and the inbound table are accessed.
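As a conceptual Python sketch (not SAP code), this illustrates why no activation is needed before reporting: the query always reads the union of the inbound table (the F-table counterpart) and the active table (the E-table counterpart), so activation merely compresses the data and does not change the query result.

```python
from collections import defaultdict

# Conceptual sketch: inbound table ~ F table, active table ~ E table (not SAP code).

inbound_table = [{"MATERIAL": "M1", "QUANTITY": 5}]     # freshly loaded request
active_table = [{"MATERIAL": "M1", "QUANTITY": 100}]    # previously activated data

def query():
    """The query always reads the union of both tables and aggregates."""
    totals = defaultdict(float)
    for rec in inbound_table + active_table:
        totals[rec["MATERIAL"]] += rec["QUANTITY"]
    return dict(totals)

before = query()          # the new request is visible without activation

def activate():
    """Activation only merges inbound data into the active table (like compression)."""
    totals = defaultdict(float)
    for rec in inbound_table + active_table:
        totals[rec["MATERIAL"]] += rec["QUANTITY"]
    active_table[:] = [{"MATERIAL": m, "QUANTITY": q} for m, q in totals.items()]
    inbound_table.clear()

activate()
after = query()
assert before == after    # stable navigation: same result before and after activation
print(after)              # {'M1': 105.0}
```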
6) DataStore Object (advanced) for Direct Update
In a DataStore object (advanced) for direct update, you can load the data directly into the table
for active data.
You can load the data via an API. You can also perform the initial load using a DTP, but you should perform all subsequent loads using the API.
The system still performs certain checks, including SID handling and time-consistency checks. However, it does not check for overlaps with previously archived requests.
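The following Python sketch is a conceptual illustration only; it does not use the actual SAP write API, whose name and signature are not given here. It simply shows the idea of direct update: the data goes straight into the table of active data, with no inbound table, change log, or activation step.

```python
# Conceptual sketch of a direct-update ADSO (not the actual SAP write API).

active_table = {}   # the only table that is filled

def write_direct(records, key_fields=("MATERIAL",)):
    """Write records straight into the table of active data.
    In the real system this is done via a DTP or the write API."""
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        active_table[key] = rec   # no inbound table, no change log, no activation

write_direct([{"MATERIAL": "M1", "QUANTITY": 10}])
write_direct([{"MATERIAL": "M1", "QUANTITY": 12}])   # overwrite by key
print(active_table)
```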
/BIC/A(DSOTECHNAME)5 = Reference point table (only available when the Inventory option is selected)
Validity Table: The validity table stores the time intervals for which non-cumulative values have been loaded into the ADSO.
Reference Point Table: The reference point table contains reference points for non-cumulative key figures. Unlike for InfoCubes, reference points are stored in a separate table for Advanced DSOs.
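To illustrate how these two tables work together, here is a conceptual Python sketch (not SAP code, with invented values): the reference point table holds the current non-cumulative value per characteristic combination, the validity table limits the time interval that can be evaluated, and historical values are derived by rolling the reference point back over the recorded movements.

```python
# Conceptual sketch of non-cumulative key figure evaluation (not SAP code).

# Reference point: current stock per material (stored in the reference point table).
reference_point = {"M1": 50}

# Movements (delta records) that led to the current stock.
movements = [
    {"MATERIAL": "M1", "CALDAY": "20240110", "STOCK_CHANGE": +20},
    {"MATERIAL": "M1", "CALDAY": "20240120", "STOCK_CHANGE": -5},
]

# Validity interval per material (stored in the validity table).
validity = {"M1": ("20240101", "20240131")}

def stock_on(material, calday):
    """Roll the reference point back over the movements after the requested day."""
    first, last = validity[material]
    if not (first <= calday <= last):
        raise ValueError("date outside the validity interval")
    stock = reference_point[material]
    for mv in movements:
        if mv["MATERIAL"] == material and mv["CALDAY"] > calday:
            stock -= mv["STOCK_CHANGE"]
    return stock

print(stock_on("M1", "20240131"))   # 50: the reference point holds the current stock
print(stock_on("M1", "20240105"))   # 35: 50 - 20 + 5, rolled back past both movements
```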
The Advanced DSO does not support navigation attributes for reporting, but does support them for extraction.