Article published by: john683 on Thursday, June 09, 2016 - Viewed 361 times

Category: Computers

SAP BI: Introduction to Standard and Write-Optimized DSO



About a Standard DSO:

A standard DSO consists of three transparent tables on the database:

Activation Queue: Holds the records that are to be updated but have not yet been activated.

Active Data: Table that holds the active data.

Change Log: Holds the change history for delta loads.

Data Transfer Process To A DSO:

•Data is first loaded into the Activation Queue, also called the new data table.

•Upon "activation", the data is transferred from the new data table to the active data table,

•and then to the change log table. The change log table holds the changed or modified data.
Note: the data in the change log table is stored redundantly and can be deleted after the records have been activated. The sketch below illustrates this flow.
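
The flow can be pictured with a small Python sketch. This is a conceptual model only, not SAP code: the table layout, the "doc_number" key field, and the "amount" key figure are assumptions chosen for illustration.

# Conceptual model of a Standard DSO's three tables (illustrative only).
activation_queue = []   # new data table: loaded records, not yet activated
active_data = {}        # active data table, keyed by the DSO key fields
change_log = []         # change history used for delta loads

def load(record):
    # Loading only writes to the activation queue (new data table).
    activation_queue.append(record)

def activate():
    # Activation moves queued records into the active data table and
    # writes before/after images to the change log.
    global activation_queue
    for rec in activation_queue:
        key = rec["doc_number"]                 # assumed key field
        before = active_data.get(key)
        if before is not None:
            # Before image: the old record with a reversed key figure.
            change_log.append({**before, "amount": -before["amount"]})
        active_data[key] = rec                  # overwrite semantics
        change_log.append(rec)                  # after image
    activation_queue = []                       # the queue is emptied

load({"doc_number": "4711", "amount": 100})
activate()
load({"doc_number": "4711", "amount": 150})    # same key: overwritten
activate()
# change_log now holds +100, -100, +150: exactly the images a delta
# load needs, and it can be deleted once the records are activated.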

Creation Of Standard DSO:

Step 1:

•Go to transaction code RSA1.

•Click the OK button.

Step 2:

•Navigate to the Modeling tab -> InfoProvider.

•Right-click on the InfoArea.

•Click on "Create DataStore Object" in the context menu.

Settings in DSO:

Type of DataStore Object: This option can be used to change the type of the DSO. By default, Standard DSO is selected; this can be changed to Write-Optimized or Direct Update DSO.

SID Generation upon Activation: Generates a Surrogate ID (SID) for every master data value when this option is checked (see the sketch after this list).

Unique Data Records: This option can be used when the DSO will never hold duplicate records.

Set Quality Status to "OK" Automatically: This setting sets the quality status to OK after the data load has finished.

Activate Data Automatically: DSO activation is automated by using this setting.

Update Data Automatically: Data loaded into the DSO can be automatically loaded to its target objects by using this setting.
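
The SID setting rests on the surrogate-ID idea: every master data value is mapped once to a small integer, and queries then join on the integers. The Python sketch below is a simplification for illustration only; the real SID tables are system-generated, and the function name here is made up.

# Illustrative surrogate-ID (SID) lookup: each master data value
# is mapped to an integer the first time it is seen.
sid_table = {}      # master data value -> SID
next_sid = 1

def get_or_create_sid(value):
    # Return the SID for a master data value, creating one on first use.
    global next_sid
    if value not in sid_table:
        sid_table[value] = next_sid
        next_sid += 1
    return sid_table[value]

# Upon activation, every characteristic value receives a SID.
for customer in ["C100", "C200", "C100"]:
    print(customer, "->", get_or_create_sid(customer))
# prints: C100 -> 1, C200 -> 2, C100 -> 1 (the existing SID is reused)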

Step 3:

•Enter the Technical Name.

•Enter the Description.

•Click on the "Create" button.

About a Write-Optimized DSO:

A Write-Optimized DSO is used when a DataStore object is required for storing records at the lowest level of granularity, for example addresses, and when overwrite functionality is not required. It consists of the active data table only, so no data activation is needed, which speeds up data processing. The DataStore object is available immediately for further processing; it is used as a temporary storage area for large sets of data.

The Write-Optimized DSO has been primarily designed to be the initial staging area for source system data, from where the data can be transferred to a Standard DSO or an InfoCube.

•The PSA receives the data unchanged from the source system.

•Data is posted at document level; after it has been loaded into the standard DSOs, the data is deleted.

•Data is posted to the corporate memory write-optimized DSO from the pass-through write-optimized DSO.

•Data is distributed from the write-optimized "pass-through" DSO to the standard DSOs according to business requirements, as sketched below.
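
This layering can be summarized in a rough Python sketch. All names are illustrative; it only shows how data is handed from one layer to the next, not real SAP interfaces.

# Illustrative staging flow: source -> PSA -> pass-through
# write-optimized DSO -> corporate memory and standard DSOs.
def load_psa(source_records):
    return list(source_records)        # PSA keeps the data unchanged

def pass_through(psa_records):
    return list(psa_records)           # document level, no activation

def distribute(records):
    corporate_memory = list(records)   # long-term, insert-only copy
    # Standard DSOs apply business rules; here simply keyed by document.
    standard_dso = {r["doc"]: r for r in records}
    return corporate_memory, standard_dso

psa = load_psa([{"doc": "1", "amount": 10}, {"doc": "2", "amount": 20}])
corporate_memory, standard_dso = distribute(pass_through(psa))
# After loading into the standard DSOs, the pass-through copy can be deleted.
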
Write-Optimized DSO Properties:

•It is used for the initial staging of source system data.

•The data stored is of the lowest granularity.

•Data loads can be faster, since there is no separate activation step.

•Each record has a technical key, and hence aggregation of records is not possible; new records are always inserted (see the sketch after this list).
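
The insert-only behaviour follows directly from the generated technical key, as this simplified Python sketch shows (field and request names are illustrative):

# Each record receives a generated technical key (Request ID, Data
# Packet, Record Number), so identical payloads never collide.
wo_dso = {}
record_no = 0

def insert(request_id, data_packet, record):
    # Every insert gets a fresh technical key; nothing is overwritten.
    global record_no
    record_no += 1
    wo_dso[(request_id, data_packet, record_no)] = record

insert("REQ1", 1, {"customer": "C100", "amount": 100})
insert("REQ2", 1, {"customer": "C100", "amount": 100})  # same payload
print(len(wo_dso))  # 2: both records are kept, no overwrite/aggregation
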
Creation Of Write-Optimized DSO:

Step 1:

•Go to transaction code RSA1.

•Click the OK button.

Step 2:

•Navigate to the Modeling tab -> InfoProvider.

•Right-click on the InfoArea.

•Click on "Create DataStore Object" in the context menu.

Step 3:

•Enter the Technical Name.

•Enter the Description.

•Click on the "Create" button.

Step 4:

•Click on the Edit button next to "Type of DataStore Object".

Step 5:

•Choose the type "Write-Optimized".

•The technical key consists of the Request ID, Data Packet, and Record Number. No additional objects can be included under it.

•Semantic keys are similar to key fields; however, here uniqueness is not considered for overwrite functionality. They are instead used in conjunction with the setting "Do not check uniqueness of data".

•The purpose of the semantic key is to identify erroneous or duplicate incoming records.

•Duplicate records are written to the error stack in the subsequent request. The records in the error stack can be handled or reloaded by defining a Semantic Group in the DTP.

•Semantic Groups need not be defined if there is no possibility of duplicate or erroneous records.

•If the checkbox "Allow Duplicate Data Records" is not checked, the data coming from the source is checked for duplicates, i.e., if a record with the same semantic keys already exists in the DSO, the current load is terminated.

•If the checkbox is selected, duplicate records are loaded as new records; semantic keys have no significance in this case. The sketch below illustrates both behaviours.
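
The effect of the semantic key and the "Allow Duplicate Data Records" checkbox can be sketched in Python as below. This is a simplified model: the semantic key fields ("order", "item") are assumptions, and routing duplicates to an error stack stands in for what a Semantic Group in the DTP enables; without one, the load would simply terminate.

# Simplified duplicate check on the semantic key of a write-optimized DSO.
def load(records, seen_semantic_keys, allow_duplicates):
    loaded, error_stack = [], []
    for rec in records:
        sem_key = (rec["order"], rec["item"])   # assumed semantic key
        if not allow_duplicates and sem_key in seen_semantic_keys:
            error_stack.append(rec)             # duplicate -> error stack
            continue
        seen_semantic_keys.add(sem_key)
        loaded.append(rec)                      # accepted as a new record
    return loaded, error_stack

seen = {("4711", 10)}
ok, errors = load([{"order": "4711", "item": 10}], seen, allow_duplicates=False)
# ok == [], errors holds the duplicate record
ok, errors = load([{"order": "4711", "item": 10}], seen, allow_duplicates=True)
# with the checkbox set, the duplicate is loaded as a new record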

Step 6:

•Activate the DSO.

Folkstrain offers the best online training for SAP BI in the USA and globally, with professionals and flexible timings @ sap bi online training

---


Keywords: sap bi online training

By: john683

Article Directory: http://www.articlecatalog.com



