Thursday, 9 June 2011

Persistent Staging Area (PSA)

The persistent staging area (PSA) is the entry layer for data in BI.
A transparent PSA table is created for every DataSource that is activated. Each PSA table has the same structure as its DataSource, supplemented by key fields for the request ID, the data package number, and the data record number.
InfoPackages load the data from the source into the PSA. The data in the PSA is then processed further with data transfer processes.
Using the context menu entry Manage for a DataSource in the Data Warehousing Workbench, you can go to PSA maintenance for the data records of a request or delete request data from the PSA table of that DataSource. You can also reach PSA maintenance from the monitor for the requests of the load process.
Using partitioning, you can separate the dataset of a PSA table into several smaller, physically independent, and redundancy-free units. This separation can mean improved performance when you update data from the PSA.
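
To make the technical key fields mentioned above more concrete, here is a minimal ABAP sketch (not taken from the original post) that reads the records of one load request from a PSA table with a dynamic SELECT. The PSA table name /BIC/B0001234000, the request ID and the key field name REQUEST are placeholders/assumptions based on the usual PSA layout described above.

" Hedged sketch: read the data records of one request from a PSA table.
" Table name, request ID and the key field REQUEST are placeholders.
DATA: lv_psa_table TYPE tabname VALUE '/BIC/B0001234000',
      lr_data      TYPE REF TO data.

FIELD-SYMBOLS <lt_psa> TYPE STANDARD TABLE.

" Build an internal table with the same line type as the PSA table
CREATE DATA lr_data TYPE STANDARD TABLE OF (lv_psa_table).
ASSIGN lr_data->* TO <lt_psa>.

" Read all data records of a single request
SELECT * FROM (lv_psa_table)
  INTO TABLE <lt_psa>
  WHERE request = 'REQU_EXAMPLE12345678901234567'.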

Constraints
When tRFC is used to transfer data, the number of fields is limited to a maximum of 255 and the length of a data record is limited to 1962 bytes.
RSDSSEG: DataSource and its corresponding PSA table
RSDODSO: Directory of all DataStore objects

How to find the PSA table
1) Go to RSA1 and select the DataSource whose PSA table name you want to find.

  


2) Choose Goto -> Technical Attributes.

3) In the PSA properties you can find the PSA table name.

4) Go to SE16, enter the PSA table name, and display its contents.
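
As an alternative to the SE16 lookup above, the PSA table name can also be read from the metadata table RSDSSEG mentioned earlier. The following ABAP sketch is only an illustration: the field names DATASOURCE, LOGSYS, OBJVERS and PSA are assumptions about the table layout and should be verified in SE11 before use.

" Hedged sketch: look up the PSA table of a DataSource in RSDSSEG.
" Field names DATASOURCE, LOGSYS, OBJVERS and PSA are assumptions.
DATA lv_psa_table TYPE tabname.

SELECT SINGLE psa
  FROM rsdsseg
  INTO lv_psa_table
  WHERE datasource = '0MATERIAL_ATTR'   " example DataSource
    AND logsys     = 'SRCCLNT100'       " placeholder source system
    AND objvers    = 'A'.               " active version

IF sy-subrc = 0.
  WRITE: / 'PSA table:', lv_psa_table.
ENDIF.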

How to find the tablespace corresponding to a PSA table

1) Go to SE11, enter the PSA table name, and press F8.

2) Choose Utilities -> Database Object -> Database Utility (transaction SE14).


3) Choose Storage Parameters.

Here you can find the tablespace corresponding to the PSA table.
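
For an ABAP-side check, the data class of the PSA table can be read from the DDIC table DD09L; on Oracle-based systems the data class maps to a tablespace via table TAORA. This is only a hedged sketch: the field names, and DD09L/TAORA as the right place for this lookup, are assumptions to verify in SE11.

" Hedged sketch: derive the tablespace of a PSA table from its data class.
" DD09L holds the technical settings (data class TABART); on Oracle the
" data class maps to a tablespace via TAORA. Field names are assumptions.
DATA: lv_tabart     TYPE dd09l-tabart,
      lv_tablespace TYPE taora-tabspace.

SELECT SINGLE tabart
  FROM dd09l
  INTO lv_tabart
  WHERE tabname  = '/BIC/B0001234000'   " placeholder PSA table name
    AND as4local = 'A'.                 " active version

IF sy-subrc = 0.
  SELECT SINGLE tabspace
    FROM taora
    INTO lv_tablespace
    WHERE tabart = lv_tabart.
  WRITE: / 'Data class:', lv_tabart, 'Tablespace:', lv_tablespace.
ENDIF.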

Thursday, 2 June 2011

Data Transfer Process


The Data Transfer Process (DTP) is a new concept in SAP BI (BW 7.0); it did not exist in BW 3.5. In SAP BW 3.5, the InfoPackage loaded data all the way to the data targets, whereas in SAP BI the InfoPackage loads data only up to the PSA.


Loading from the PSA to the data targets is done with a data transfer process.

The data transfer process makes the transfer processes in the data warehousing layer more transparent. Optimized parallel processing improves the performance of the transfer process. You can use the data transfer process to separate delta processes for different targets and to use filter options between persistent objects on various levels. For example, you can use filters between a DataStore object and an InfoCube.
Data transfer processes are used for standard data transfer, for real-time data acquisition and for accessing data directly. 


Features of DTP

  1. Loading data from one layer to another (except InfoSources).
  2. Separation of the delta mechanism for different data targets.
  3. Enhanced filtering in the data flow.
  4. Improved transparency of staging processes across data warehouse layers.
  5. Improved performance through optimized parallelization.
  6. Enhanced error handling in the form of an error stack.
  7. Enables real-time data acquisition.
The most important feature of the DTP is its easy error handling.
Process #1 - Enhanced Filtering, Debugging and Error Handling Options
 
Process #2 - Handling Data Records with Errors
  1. Using the error handling settings on the Update tab page of the data transfer process, you can specify how the system reacts if errors occur in the data records when data is transferred from a DTP source to a DTP target.
  2. These settings were previously made in the InfoPackage. When data transfer processes are used, InfoPackages write to the PSA only. Error handling settings are therefore no longer made in the InfoPackage, but in the data transfer process.
Process #3 - Error Handling Features
  1. In the scheduler you can choose to:
     - abort the process when errors occur,
     - process the correct records but not allow reporting on them, or
     - process the correct records and allow reporting on them.
  2. You can define the number of incorrect records after which the whole request is treated as incorrect.
  3. Invalid records can be written to an error stack.
  4. Keys must be defined for the error stack to enable error handling for DataStore objects.
  5. Temporary data storage can be switched on/off for each substep of the loading process.
  6. Invalid records can be updated to the data targets after their correction.
 
Process #4 - Error Stack
  1. Stores erroneous records.
  2. Keeps the correct sequence of records, which is needed for consistent DataStore handling.
  3. The key of the error stack defines which data records are held back from the update after an erroneous data record.
  4. After correction, the error DTP updates the data from the error stack to the data target.
Note: Once the request in the source object is deleted, the related data records in the error stack are automatically deleted.


Creating Data Transfer Processes from the Object Tree in the Data Warehousing Workbench
          1) In the Data Warehousing Workbench, display the object tree and select the data target object.
          2) In the context menu, choose Create Data Transfer Process.
The dialog box for creating a data transfer process appears.
Proceed as described in step three and onwards in the procedure for creating a data transfer process using a process chain.

Creating Data Transfer Processes Using Process Chains
          1)  Use Drag&Drop or double-click to include the process in the process chain.   
          2) To create a data transfer process as a new process variant, enter a technical name and choose Create. The dialog box for creating a data transfer process appears.
          3) Select the type of data transfer process:
For VirtualProviders, the only available option is DTP for Direct Access.
For DataStore objects, you can choose between Standard DTP and DTP for Real-Time Data Acquisition.
          4) Select the source object type and the object from which you want to transfer data into the target. If only one source object exists, it is selected by default.
          5) Choose Continue.
          6) The data transfer process maintenance screen appears.
The header data for the data transfer process shows the description, ID, version, and status of the data transfer process, along with the delta status.
          7) On the Extraction tab page, determine the parameters:
                            a.      Choose the extraction mode.
You can choose full mode or delta mode. You do not need to initialize the delta process explicitly for the delta transfer.
                            b.      If necessary, determine filter criteria for the delta transfer.
This means that you can use multiple data transfer processes with disjunctive selection conditions to transfer small sets of data from a source efficiently into one or more targets, instead of transferring large volumes of data. You can specify individual selections, multiple selections, intervals, selections based on variables, or routines (see the filter routine sketch after this procedure). Choose Change Selection to change the list of InfoObjects that can be selected.
                            c.      Apply further settings, which are dependent on the source object.
          8) On the Update tab page, determine the parameters:
                            a.      Apply the settings for error handling: determine how the system updates the valid records when errors occur, how many errors are allowed before the load process terminates, and whether the load process should be treated as erroneous if records are aggregated, sorted, or added during the transformation.
                            b.      Under Semantic Groups determine the key fields for the error stack. 
          9) Make the settings for the temporary storage by choosing Goto -> Settings for DTP Temporary Storage. Here you determine the steps in the process run of the program after which you want to store data to the temporary storage, the level of detail to which you want to store data, and when you next want to delete the temporary storage.
         10) On the Execute tab page, determine the parameters:
On this tab page, the process flow of the program for the data transfer process is displayed in a tree structure.
                            a.      Specify the status that you want the system to adopt for the request if warnings are displayed in the log.
                            b.      Specify how you want the system to determine the overall status of the request.
The system automatically determines the processing mode for the background processing of the respective data transfer process.
          11) Check the data transfer process, save and activate it.
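
Step 7b mentions that filter values can also be supplied by a routine. The sketch below only illustrates the idea of such a routine, restricting 0CALDAY to the current month: the FORM frame and the range structure RSSDLRANGE are assumptions about the code frame the system generates for a DTP filter routine; in practice you keep the generated frame and only add the logic in between.

" Hedged sketch of a DTP filter routine: restrict 0CALDAY to the current
" month. The FORM signature and structure RSSDLRANGE are assumptions
" about the generated frame - reuse the frame generated in your system.
FORM compute_calday
  TABLES   l_t_range STRUCTURE rssdlrange
  CHANGING p_subrc   LIKE sy-subrc.

  DATA ls_range LIKE LINE OF l_t_range.

  ls_range-fieldname = 'CALDAY'.
  ls_range-sign      = 'I'.
  ls_range-option    = 'BT'.
  CONCATENATE sy-datum(6) '01' INTO ls_range-low.   " first day of month
  ls_range-high      = sy-datum.                    " today
  APPEND ls_range TO l_t_range.

  p_subrc = 0.
ENDFORM.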
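Once the data transfer process has been embedded in a process chain, the chain (and with it the DTP) can also be triggered from ABAP. The following is only a hedged sketch using the process chain API function module RSPC_API_CHAIN_START; the parameter names I_CHAIN and E_LOGID and the chain name are assumptions to be checked in SE37.

" Hedged sketch: start a process chain that contains the DTP from ABAP.
" RSPC_API_CHAIN_START is part of the process chain API; the parameter
" names I_CHAIN and E_LOGID are assumptions - check the interface in SE37.
DATA lv_logid TYPE rspc_logid.

CALL FUNCTION 'RSPC_API_CHAIN_START'
  EXPORTING
    i_chain = 'ZPC_LOAD_SALES'   " placeholder process chain name
  IMPORTING
    e_logid = lv_logid.

WRITE: / 'Process chain started, log ID:', lv_logid.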
       

Common Errors While Loading Through DTP

A DTP can fail for the following reasons:
1. Erroneous records in one or more data packages
2. A short dump

We have two options to correct the request:
1. Repair
2. Repeat

If the request is red in the data target because of a short dump and there is no data consistency issue, we can repair the existing failed load.
If there are erroneous records in the PSA, we can edit them there and then repair the failed load.

If the request is red in the data target because of a data consistency issue, you first have to delete the request from the data targets and then repeat the DTP.