
Sunday, December 27, 2009

Happy New Year

Dear Friend,
Wishing you a Merry Christmas and a prosperous New Year.

Where did the year go?

It's December... and we realize that we started January with giant strides and, in the blink of an eye, 2009 is already behind us!
A big "Thank You" to each and every one of you for the impact you had on my life this year, especially for all the support, telephone calls, SMSes and e-mails I received (not forgetting the shoulders to lean on when I needed them), the happy moments to smile about and the sad ones to cry about.
Without you, I'm sure that 2009 would have been extremely boring.
From my side I wish you all a MAGICAL FESTIVE SEASON filled with Loving Wishes and Beautiful Thoughts.
May 2010 mark the beginning of a Tidal Wave of Love, Happiness and Bright Futures.


To those who need money, may your finances overflow
To those who need caring, may you find a good heart
To those who need friends, may you meet lovely people
To those who need life, may you find GOD.
Wishing you one more fantastic new year... hope it brings all the happiness and success you have been looking for so long!

Cheers!
Ramu

Wednesday, December 2, 2009

Major changes in BI 7.0

Below are the major changes in BI 7.0 (the SAP NetWeaver 2004s release) compared with earlier versions.
1. InfoSets can now include InfoCubes as well.
2. The remodeling transaction helps you add new key figures and characteristics and handles historical data without much hassle. This is available only for InfoCubes.
3. The BI Accelerator (for now only for InfoCubes) helps reduce query runtime by a factor of roughly 10 to 100. The accelerator is a separate appliance and comes at an additional cost.
4. Monitoring has been improved with a new portal-based cockpit, which means you need an Enterprise Portal resource on your project to implement the portal.
5. Search functionality has improved: you can search for any object.
6. Transformations are in and the old routines are passé. Yes, you can still revert to the old transactions too.
7. The ODS object has been renamed to DataStore object.
8. Introduction of the write-optimized DataStore object, which has no change log and whose requests do not need activation.
9. Unification of transfer and update rules.
10. Introduction of the "end routine" and "expert routine" (see the sketch after this list).
11. Push of XML data into the BI system (into the PSA) without the Service API or delta queue.
12. The BI Accelerator (see point 3) significantly improves query performance.
13. Loading through the PSA has become mandatory.
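To give a feel for the routine concept mentioned in point 10, here is a minimal sketch of the editable part of an end routine in a BI 7.0 transformation. The surrounding class, the RESULT_PACKAGE table and its generated row type come from the transformation framework; the field /BIC/ZAMOUNT is purely an illustrative assumption, not part of any delivered content.

* Minimal sketch of an end routine body in a BI 7.0 transformation.
* The enclosing class, the RESULT_PACKAGE table and the row type are
* generated by the framework; /BIC/ZAMOUNT is an illustrative field.
    FIELD-SYMBOLS <result_fields> TYPE _ty_s_tg_1.

    LOOP AT result_package ASSIGNING <result_fields>.
*     Example rule: drop records that arrive without an amount
      IF <result_fields>-/bic/zamount IS INITIAL.
        DELETE result_package.
      ENDIF.
    ENDLOOP.

In an expert routine, similar coding replaces the entire rule set, including the field mappings, so you map the source package to the result package yourself.
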
Metadata Search (Developer Functionality):
1. It is possible to search BI metadata (such as InfoCubes, InfoObjects, queries, Web templates) using the TREX search engine. This search is integrated into the Metadata Repository, the Data Warehousing Workbench and to some degree into the object editors. With the simple search, a search for one or all object types is performed in technical names and in text.
2. During the text search, lower and uppercase are ignored and the object will also be found when the case in the text is different from that in the search term. With the advanced search, you can also search in attributes. These attributes are specific to every object type. Beyond that, it can be restricted for all object types according to the person who last changed it and according to the time of the change.
3. For example, you can search in all queries that were changed in the last month and that include both the term "overview" in the text and the characteristic customer in the definition. Further functions include searching in the delivered (A) version, fuzzy search and the option of linking search terms with "AND" and "OR".
4. "Because the advanced search described above offers more extensive options for search in metadata, the function ""Generation of Documents for Metadata"" in the administration of document management (transaction RSODADMIN) was deleted. You have to schedule (delta) indexing of metadata as a regular job (transaction RSODADMIN).

• Effects on Customizing
• Installation of TREX search engine
• Creation of an RFC destination for the TREX search engine
• Entering the RFC destination into table RSODADMIN_INT
• Determining relevant object types
• Initial indexing of metadata
Remote Activation of DataSources (Developer Functionality):
1. When activating Business Content in BI, you can activate DataSources remotely from the BI system. This activation is subject to an authorization check. You need role SAP_RO_BCTRA; authorization object S_RO_BCTRA is checked. The authorization is valid for all DataSources of a source system. When the objects are collected, the system checks the authorizations remotely and issues a warning if you lack authorization to activate the DataSources.
2. In BI, if you trigger the transfer of the Business Content in the active version, the results of the authorization check are based on the cache. If you lack the necessary authorization for activation, the system issues a warning for the DataSources. BW issues an error for the corresponding source-system-dependent objects (transformations, transfer rules, transfer structure, InfoPackage, process chain, process variant). In this case, you can use Customizing for the extractors to manually transfer the required DataSources in the source system from the Business Content, replicate them in the BI system, and then transfer the corresponding source-system-dependent objects from the Business Content. If you have the necessary authorizations for activation, the DataSources in the source system are transferred to the active version and replicated in the BI system. The source-system-dependent objects are activated in the BI system.
3. The source system and/or BI system must have at least the SAP NetWeaver 2004s BI Service API; otherwise remote activation is not supported. In that case, you have to activate the DataSources in the source system manually and then replicate them to the BI system.
Copy Process Chains (Developer Functionality):
You find this function in the Process Chain menu and use it to copy the process chain you have selected, along with its references to process variants, and save it under a new name and description.
InfoObjects in Hierarchies (Data Modeling):
1. Up to Release SAP NetWeaver 2004s, it was not possible to use InfoObjects with a length longer than 32 characters in hierarchies. These types of InfoObjects could not be used as a hierarchy basic characteristic and it was not possible to copy characteristic values for such InfoObjects as foreign characteristic nodes into existing hierarchies. From SAP NetWeaver 2004s, characteristics of any length can be used for hierarchies.
2. To load hierarchies, the PSA transfer method has to be selected (which is always recommended for loading data anyway). With the IDOC transfer method, it continues to be the case that only hierarchies can be loaded that contain characteristic values with a length of less than or equal to 32 characters.
Parallelized Deletion of Requests in DataStore Objects (Data Management) :
You can now delete active requests in a DataStore object in parallel. Previously, requests were deleted serially within a single LUW; deletion can now be processed by package and in parallel.
Object-Specific Setting of the Runtime Parameters of DataStore Objects (Data Management):
Now you can set the runtime parameters of DataStore objects by object and then transport them into connected systems. The following parameters can be maintained:
- Package size for activation
- Package size for SID determination
- Maximum wait time before a process is designated lost
- Type of processing: Serial, Parallel (batch), Parallel (dialog)
- Number of processes to be used
- Server/server group to be used

Enhanced Monitor for Request Processing in DataStore Objects (Data Management):
1. For the request operations executed on DataStore objects (activation, rollback and so on), there is now a separate, detailed monitor. In previous releases, request-changing operations were displayed in the extraction monitor; when the same operation was executed multiple times, it was difficult to assign the messages to the respective operations.
2. To allow simpler error analysis and to reveal optimization potential when configuring runtime parameters, as of SAP NetWeaver 2004s all messages relevant to DataStore objects are displayed in their own monitor.

Write-Optimized DataStore Object (Data Management):

1. Up to now it was necessary to activate the data loaded into a DataStore object to make it visible to reporting or to be able to update it to further InfoProviders. As of SAP NetWeaver 2004s, a new type of DataStore object is introduced: the write-optimized DataStore object.

2. The objective of the new object type is to save data as efficiently as possible in order to further process it as quickly as possible, without the additional effort of generating SIDs, aggregation and data-record-based delta handling. Data that is loaded into write-optimized DataStore objects is available immediately for further processing. The activation step that was necessary up to now is no longer required.

3. The loaded data is not aggregated. If two data records with the same logical key are extracted from the source, both records are saved in the DataStore object. During loading, for reasons of efficiency, no SID values can be determined for the loaded characteristics. The data is still available for reporting. However, in comparison to standard DataStore objects, you can expect to lose performance because the necessary SID values have to be determined during query runtime.

Deleting from the Change Log (Data Management):

The Deletion of Requests from the Change Log process type supports the deletion of change log files. You select DataStore objects to determine the selection of requests. The system supports multiple selections. You select objects in a dialog box for this purpose. The process type supports the deletion of requests from any number of change logs.
Using InfoCubes in InfoSets (Data Modeling):

1. You can now include InfoCubes in an InfoSet and use them in a join. InfoCubes are handled logically in InfoSets like DataStore objects. This is also true for time dependencies. In an InfoCube, data that is valid for different dates can be read.

2. For performance reasons you cannot define an InfoCube as the right operand of a left outer join. SAP does not generally support more than two InfoCubes in an InfoSet.

Pseudo Time Dependency of DataStore Objects and InfoCubes in InfoSets (Data Modeling) :

In BI only master data can be defined as a time-dependent data source. Two additional fields/attributes are added to the characteristic. DataStore objects and InfoCubes that are being used as InfoProviders in the InfoSet cannot be defined as time dependent. As of SAP NetWeaver 2004s, you can specify a date or use a time characteristic with DataStore objects and InfoCubes to describe the validity of a record. These InfoProviders are then interpreted as time-dependent data sources.

Left Outer: Include Filter Value in On-Condition (Data Modeling) :

1. The global properties in InfoSet maintenance have been enhanced by one setting Left Outer: Include Filter Value in On-Condition. This indicator is used to control how a condition on a field of a left-outer table is converted in the SQL statement. This affects the query results:
• If the indicator is set, the condition/restriction is included in the on-condition in the SQL statement. In this case the condition is evaluated before the join.
• If the indicator is not set, the condition/restriction is included in the where-condition. In this case the condition is only evaluated after the join.
• The indicator is not set by default.

Key Date Derivation from Time Characteristics (Data Modeling) :

Key dates can be derived from the time characteristics 0CALWEEK, 0CALMONTH, 0CALQUARTER, 0CALYEAR, 0FISCPER, 0FISCYEAR: It was previously possible to specify the first, last or a fixed offset for key date derivation. As of SAP NetWeaver 2004s, you can also use a key date derivation type to define the key date.

Repartitioning of InfoCubes and DataStore Objects (Data Management):

With SAP NetWeaver 2004s, the repartitioning of InfoCubes and DataStore objects on the database that are already filled is supported. With partitioning, the runtime for reading and modifying access to InfoCubes and DataStore objects can be decreased. Using repartitioning, non-partitioned InfoCubes and DataStore objects can be partitioned or the partitioning schema for already partitioned InfoCubes and DataStore objects can be adapted.

Remodeling InfoProviders (Data Modeling):


1. As of SAP NetWeaver 2004s, you can change the structure of InfoCubes into which you have already loaded data, without losing the data. You have the following remodeling options:
2. For characteristics:
• Inserting, or replacing characteristics with: Constants, Attribute of an InfoObject within the same dimension, Value of another InfoObject within the same dimension, Customer exit (for user-specific coding).
• Delete
3. For key figures:
• Inserting: Constants, Customer exit (for user-specific coding).
• Replacing key figures with: Customer exit (for user-specific coding).
• Delete
4. SAP NetWeaver 2004s does not support the remodeling of InfoObjects or DataStore objects. This is planned for future releases. Before you start remodeling, make sure:
(A) You have stopped any process chains that run periodically and affect the corresponding InfoProvider. Do not restart these process chains until remodeling is finished.
(B) There is enough available tablespace on the database.

5. After remodeling, check which BI objects connected to the InfoProvider (transformation rules, MultiProviders, queries and so on) have been deactivated. You have to reactivate these objects manually.

Parallel Processing for Aggregates (Performance):

1. The change run, rollup, condensing and checking of multiple aggregates can be executed in parallel. Parallelization takes place per aggregate. The parallel processes always run in the background, even when the main process is executed in dialog.

2. This can considerably decrease execution time for these processes. You can determine the degree of parallelization and determine the server on which the processes are to run and with which priority.

3. If no setting is made, a maximum of three processes are executed in parallel. This setting can be adjusted for a single process (change run, rollup, condensing of aggregates and checks). Together with process chains, the affected setting can be overridden for every one of the processes listed above. Parallelization of the change run according to SAP Note 534630 is obsolete and is no longer being supported.

Multiple Change Runs (Performance):

1. You can start multiple change runs simultaneously. The prerequisite for this is that the lists of the master data and hierarchies to be activated are different and that the changes affect different InfoCubes. After a change run, all affected aggregates are condensed automatically.

2. If a change run terminates, the same change run must be started again. You have to start the change run with the same parameterization (same list of characteristics and hierarchies). SAP Note 583202 is obsolete.

Partitioning Optional for Aggregates (Performance):

1. Up to now, the aggregate fact tables were partitioned if the associated InfoCube was partitioned and the partitioning characteristic was in the aggregate. Now it is possible to suppress partitioning for individual aggregates. If aggregates do not contain much data, very small partitions can result. This affects read performance. Aggregates with very little data should not be partitioned.

2. Aggregates that are not to be partitioned have to be activated and filled again after the associated property has been set.

MOLAP Store (Deleted) (Performance):

Previously you were able to create aggregates either on the basis of a ROLAP store or a MOLAP store. The MOLAP store was a platform-specific means of optimizing query performance. It used Microsoft Analysis Services and was therefore only available on the Microsoft SQL Server database platform. Because HPA indexes, available with SAP NetWeaver 2004s, are a platform-independent alternative to ROLAP aggregates with high performance and low administrative cost, the MOLAP store is no longer supported.

Data Transformation (Data Management):

1. A transformation has a graphical user interface and, together with the data transfer process (DTP), replaces the transfer rules and update rules. Transformations are generally used to convert an input format into an output format. A transformation consists of rules; a rule defines how the data content of a target field is determined. Various rule types are available, such as direct transfer, currency translation, unit of measure conversion, routine, and read from master data.

2. Block transformations can be realized using data-package-based rule types such as the start routine (a minimal sketch follows the list of limitations below). If the output format has key fields, the defined aggregation behavior is taken into account when the transformation writes into the output format. Using a transformation, every (data) source can be converted into the format of the target with a single transformation (one-step procedure). An InfoSource is only required for complex transformations (multistep procedures) that cannot be performed in one step.

3. The following functional limitations currently apply:
- You cannot use hierarchies as the source or target of a transformation.
- You cannot use master data as the source of a transformation.
- You cannot use a template to create a transformation.
- No documentation has been created in the metadata repository yet for transformations.
- In the transformation there is no check for referential integrity, the InfoObject transfer routines are not considered, and routines cannot be created using the return table.
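As referenced above, here is a minimal sketch of a start routine body, the data-package-based rule type mentioned in point 2. The enclosing class and the SOURCE_PACKAGE table are generated by the framework; the field /BIC/ZSTATUS and its filter value are illustrative assumptions only.

* Minimal sketch of a start routine body in a BI 7.0 transformation.
* SOURCE_PACKAGE and its row type are generated by the framework;
* the field /BIC/ZSTATUS and the value 'X' are purely illustrative
* assumptions about the source structure.
    DELETE source_package WHERE /bic/zstatus <> 'X'.
* Records removed here never reach the individual rules, which keeps
* the subsequent rule processing small.
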

Quantity Conversion :

As of SAP NetWeaver 2004s you can create quantity conversion types using transaction RSUOM. The business transaction rules of the conversion are established in the quantity conversion type. The conversion type is a combination of different parameters (conversion factors, source and target units of measure) that determine how the conversion is performed. In terms of functionality, quantity conversion is structured similarly to currency translation. Quantity conversion allows you to convert key figures with units that have different units of measure in the source system into a uniform unit of measure in the BI system when you update them into InfoCubes.

Data Transfer Process :

You use the data transfer process (DTP) to transfer data within BI from a persistent object to another object in accordance with certain transformations and filters. In this respect, it replaces the InfoPackage, which only loads data to the entry layer of BI (PSA), and the data mart interface. The data transfer process makes the transfer processes in the data warehousing layer more transparent. Optimized parallel processing improves the performance of the transfer process (the data transfer process determines the processing mode). You can use the data transfer process to separate delta processes for different targets and you can use filter options between the persistent objects on various levels. For example, you can use filters between a DataStore object and an InfoCube. Data transfer processes are used for standard data transfer, for real-time data acquisition, and for accessing data directly. The data transfer process is available as a process type in process chain maintenance and is to be used in process chains.

ETL Error Handling :

The data transfer process supports you in handling data records with errors. The data transfer process also supports error handling for DataStore objects. As was previously the case with InfoPackages, you can determine how the system responds if errors occur. At runtime, the incorrect data records are sorted and can be written to an error stack (request-based database table). After the error has been resolved, you can further update data to the target from the error stack. It is easier to restart failed load processes if the data is written to a temporary store after each processing step. This allows you to determine the processing step in which the error occurred. You can display the data records in the error stack from the monitor for the data transfer process request or in the temporary storage for the processing step (if filled). In data transfer process maintenance, you determine the processing steps that you want to store temporarily.

InfoPackages :

InfoPackages only load the data into the input layer of BI, the Persistent Staging Area (PSA). Further distribution of the data within BI is done by the data transfer processes. The following changes have occurred due to this:
- New tab page: Extraction -- The Extraction tab page includes the settings for adaptor and data format that were made for the DataSource. If data transfer from files occurred, the External Data tab page is obsolete; the settings are made in DataSource maintenance.
- Tab page: Processing -- Information on how the data is updated is obsolete because further processing of the data is always controlled by data transfer processes.
- Tab page: Updating -- On the Updating tab page, you can set the update mode to the PSA depending on the settings in the DataSource. In the data transfer process, you now determine how the update from the PSA to other targets is performed. Here you have the option to separate delta transfer for various targets.

For real-time acquisition with the Service API, you create special InfoPackages in which you determine how the requests are handled by the daemon (for example, after which time interval a request for real-time data acquisition should be closed and a new one opened). For real-time data acquisition with Web services (push), you also create special InfoPackages to set certain parameters for real-time data acquisition such as sizes and time limits for requests.
PSA :

The persistent staging area (PSA), the entry layer for data in BI, has been changed in SAP NetWeaver 2004s. Previously, the PSA table was part of the transfer structure. You managed the PSA table in the Administrator Workbench in its own object tree. Now you manage the PSA table for the entry layer from the DataSource. The PSA table for the entry layer is generated when you activate the DataSource. In an object tree in the Data Warehousing Workbench, you choose the context menu option Manage to display a DataSource in PSA table management. You can display or delete data here. Alternatively, you can access PSA maintenance from the load process monitor. Therefore, the PSA tree is obsolete.

Real-Time Data Acquisition :

Real-time data acquisition supports tactical decision-making. You use real-time data acquisition if you want to transfer data to BI at frequent intervals (every hour or minute) and access this data in reporting frequently or regularly (several times a day, at least). In terms of data acquisition, it supports operational reporting by allowing you to send data to the delta queue or PSA table in real time. A daemon then transfers the data at frequent, regular intervals to DataStore objects that have been released for reporting (the ODS layer), and the data is stored persistently in BI. You can use real-time data acquisition for DataSources in SAP source systems that have been released for real time, and for data that is transferred into BI using the Web service (push). The daemon controls the transfer of data into the PSA table and its further posting into the DataStore object. In BI, InfoPackages are created for real-time data acquisition; these are scheduled using an assigned daemon and are executed at regular intervals. With certain data transfer processes for real-time data acquisition, the daemon takes on the further posting of data from the PSA to DataStore objects. As soon as data is successfully posted to the DataStore object, it is available for reporting; refresh the query display to see the up-to-date data. In the query, a time stamp shows the age of the data. The monitor for real-time data acquisition displays the available daemons and their status. Under the relevant DataSource, the system displays the InfoPackages and data transfer processes with requests that are assigned to each daemon. You can use the monitor to execute various functions for the daemon, DataSource, InfoPackage, data transfer process, and requests.

Archiving Request Administration Data :

You can now archive log and administration data requests. This allows you to improve the performance of the load monitor and the monitor for load processes. It also allows you to free up tablespace on the database. The archiving concept for request administration data is based on the SAP NetWeaver data archiving concept. The archiving object BWREQARCH contains information about which database tables are used for archiving, and which programs you can run (write program, delete program, reload program). You execute these programs in transaction SARA (archive administration for an archiving object). In addition, in the Administration functional area of the Data Warehousing Workbench, in the archive management for requests, you can manage archive runs for requests. You can execute various functions for the archive runs here.

After an upgrade, use BI background management or transaction SE38 to execute report RSSTATMAN_CHECK_CONVERT_DTA and report RSSTATMAN_CHECK_CONVERT_PSA for all objects (InfoProviders and PSA tables). Execute these reports at least once so that the available request information for the existing objects is written to the new table for quick access, and is prepared for archiving. Check that the reports have successfully converted your BI objects. Only perform archiving runs for request administration data after you have executed the reports.
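If you prefer to wrap the one-off conversion in a small program rather than starting the two reports from SE38 manually, a minimal sketch could look like this (a hypothetical wrapper; both reports run with their default selections):

REPORT z_convert_request_admin.

* Hypothetical wrapper: runs the two conversion reports quoted above
* once, with their default selections, after the upgrade.
SUBMIT rsstatman_check_convert_dta AND RETURN.  " InfoProviders
SUBMIT rsstatman_check_convert_psa AND RETURN.  " PSA tables

WRITE: / 'Conversion reports submitted - check their logs.'.
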

Flexible process path based on multi-value decisions :

The workflow and decision process types support the event "Process ends with complex status". When you use this process type, you can control the process chain on the basis of multi-value decisions. The process does not have to end simply successfully or with errors; for example, the day of the week can be used to decide whether the process was successful and how the process chain is processed further. With the workflow option, a user makes this decision; with the decision process type, the final status of the process, and therefore the decision, is determined on the basis of conditions. These conditions are stored as formulas.

Evaluating the output of system commands :

You use this function to decide whether the system command process is successful or has errors, based on whether the output of the command includes a character string that you have defined. This allows you to check, for example, whether a particular file exists in a directory before loading the data. If the file is not in the directory, the load process can be repeated at predetermined intervals.
Repairing and repeating process chains :

You use this function to repair processes that were terminated. You execute the same instance again, or repeat it (execute a new instance of the process), if this is supported by the process type. You call this function in log view in the context menu of the process that has errors. You can restart a terminated process in the log view of process chain maintenance when this is possible for the process type.

If the process cannot be repaired or repeated after termination, the corresponding entry is missing from the context menu in the log view of process chain maintenance. In this case, you are able to start the subsequent processes. A corresponding entry can be found in the context menu for these subsequent processes.

Executing process chains synchronously :

You use this function to schedule and execute the process in the dialog, instead of in the background. The processes in the chain are processed serially using a dialog process. With synchronous execution, you can debug process chains or simulate a process chain run.

Error handling in process chains:

You use this function in the attribute maintenance of a process chain to classify all the incorrect processes of the chain as successful, with regard to the overall status of the run, if you have scheduled a successor process Upon Errors or Always. This function is relevant if you are using metachains. It allows you to continue processing metachains despite errors in the subchains, if the successor of the subchain is scheduled Upon Success.

Determining the user that executes the process chain :

You use this function in the attribute maintenance of a process chain to determine which user executes the process chain. In the default setting, this is the BI background user.

Display mode in process chain maintenance :

When you access process chain maintenance, the process chain display appears. The process chain is not locked and does not call the transport connection. In the process chain display, you can schedule without locking the process chain.

Checking the number of background processes available for a process chain :

During the check, the system calculates the number of parallel processes according to the structure of the tree. It compares the result with the number of background processes on the selected server (or the total number of all available servers if no server is specified in the attributes of the process chain). If the number of parallel processes is greater than the number of available background processes, the system highlights every level of the process chain where the number of processes is too high, and produces a warning.

Open Hub / Data Transfer Process Integration :

As of SAP NetWeaver 2004s SPS 6, the open hub destination has its own maintenance interface and can be connected to the data transfer process as an independent object. As a result, all data transfer process services for the open hub destination can be used. You can now select an open hub destination as a target in a data transfer process. In this way, the data is transformed as with all other BI objects. In addition to the InfoCube, InfoObject and DataStore object, you can also use the DataSource and InfoSource as a template for the field definitions of the open hub destination. The open hub destination now has its own tree in the Data Warehousing Workbench under Modeling. This tree is structured by InfoAreas.
The open hub service with the InfoSpoke that was provided until now can still be used. We recommend, however, that new objects are defined with the new technology.

Thanks,
Ramu

Monday, November 2, 2009

SAP BI Related Table information

Transfer Structure
RSTS Transfer Structure List
RSTSFIELD Transfer Structure fields
RSTSRULES  Transfer Structure rules
RSAROUTT Text name of Transfer Routine
DD03T Text for R/3 Transfer structure Objects

Update Rules
RSUPDROUT Update rules List
RSUPDDAT   Update rules with routines
RSUPDKEY    Update rule key fields
RSUPDINFO InfoProvider to Infosource correlation
Embedded ABAP coding for Transfer / Update Rules
RSAABAP ABAP source code per object routine
InfoPackage
RSLDPIO Links datasource to infopackages
RSLDPIOT  InfoPackage Text Description
RSLDPRULE ABAP source code for InfoPackages
RSLDPSEL   Hardcoded selections in InfoPackages
RSMONICDP Contains the request-id number by data target
RSPAKPOS List of InfoPackage Groups / InfoPackages
ProcessChain
RSEVENTCHAIN   Event Chain Processing Event Table
RSEVENTHEAD    Header for the event chain
RSEVENTHEADT    Header for the event chain
RSPCCHAIN Process chain details
RSPCCHAINATTR  Attributes for a Process Chain
RSPCCHAINEVENTS  Multiple Events with Process Chains
RSPCCHAINT  Texts for Chain
RSPCCOMMANDLOG System Command Execution Logs (Process Chains)
RSPCLOGCHAIN  Cross-Table Log ID / Chain ID
RSPCLOGS Application Logs for the Process Chains
RSPCPROCESSLOG Logs for the Chain Runs
RSPCRUNVARIABLES  Variables for Process Chains for Runtime
RSPC_MONITOR Monitor individual process chains
Queries
RSZELTDIR Directory of the reporting component elements
RSZELTTXT Texts of reporting component elements
RSZELTXREF Directory of query element references
RSRREPDIR Directory of all reports (Query GENUNIID)
RSZCOMPDIR Directory of reporting components
RSZRANGE Selection specification for an element
RSZSELECT Selection properties of an element
RSZELTDIR Directory of the reporting component elements
RSZCOMPIC Assignment of reusable component <-> InfoCube
RSZELTPRIO Priorities with element collisions
RSZELTPROP Element properties (settings)
RSZELTATTR Attribute selection per dimension element
RSZCALC Definition of a formula element
RSZCEL Query Designer: Directory of Cells
RSZGLOBV Global Variables in Reporting
Workbooks
RSRWBINDEX List of binary large objects (Excel workbooks)
RSRWBINDEXT Titles of binary objects (Excel workbooks)
RSRWBSTORE Storage for binary large objects (Excel workbooks)
RSRWBTEMPLATE Assignment of Excel workbooks as personal templates
RSRWORKBOOK Where-used list for reports in workbooks
Web templates
RSZWOBJ Storage of the Web Objects
RSZWOBJTXT Texts for Templates/Items/Views
RSZWOBJXREF Structure of the BW Objects in a Template
RSZWTEMPLATE Header Table for BW HTML Templates
InfoObject
RSDIOBJ Directory of all InfoObjects
RSDIOBJT Texts of InfoObjects
RSDATRNAV Navigation Attributes
RSDATRNAVT Navigation Attributes
RSDBCHATR Master Data Attributes
RSDCHABAS Basic Characteristics (for Characteristics, Time Characteristics, and Units)
RSDCHA  Characteristics Catalog
RSDDPA Data Package Characteristic
RSDIOBJCMP Dependencies of InfoObjects
RSKYF  Key Figures
RSDTIM Time Characteristics
RSDUNI Units
InfoCube
RSDCUBE  Directory of InfoCubes
RSDCUBET Texts on InfoCubes
RSDCUBEIOBJ Objects per InfoCube (where-used list)
RSDDIME Directory of Dimensions
RSDDIMET Texts on Dimensions
RSDDIMEIOBJ  InfoObjects for each Dimension (Where-Used List)
RSDCUBEMULTI InfoCubes involved in a MultiCube
RSDICMULTIIOBJ MultiProvider: Selection/Identification of InfoObjects
RSDICHAPRO Characteristic Properties Specific to an InfoCube
RSDIKYFPRO Flag Properties Specific to an InfoCube
RSDICVALIOBJ InfoObjects of the Stock Validity Table for the InfoCube
Aggregates
RSDDAGGRDIR Directory of Aggregates
RSDDAGGRCOMP Description of Aggregates
RSDDAGGRT Text on Aggregates
RSDDAGGLT Directory of the aggregates, texts
ODS Object
RSDODSO Directory of all ODS Objects
RSDODSOT Texts of all ODS Objects
RSDODSOIOBJ InfoObjects of ODS Objects
RSDODSOATRNAV Navigation Attributes for ODS Object
RSDODSOTABL Directory of all ODS Object Tables
PSA
RSTSODS  Directory of all PSA Tables
DataSource (= OLTP Source)
ROOSOURCE   Header Table for SAP BW DataSources (SAP Source System/BW System)
RODELTAM BW Delta Procedure (SAP Source System)
RSOLTPSOURCE Replication Table for DataSources in BW
InfoSource
RSIS  Directory of InfoSources with Flexible Update
RSIST Texts on InfoSources with Flexible Update
RSISFIELD InfoObjects of an InfoSource
Communications Structure
RSKS Communications Structure for InfoSources with Flexible Update
RSKS Communications Structure (View) for Attributes for an InfoSource with Direct Update
RSKSFIELD Texts on InfoSources with Flexible Update
RSISFIELD InfoObjects of an InfoSource with Flexible Update
Transfer Structure
RSTS Transfer Structure in SAP BW
ROOSGEN Generated Objects for a DataSource (Transfer Structure, for example in SAP Source System)
Mapping
RSISOSMAP Mapping Between InfoSources and DataSources (=OLTP Sources)
RSOSFIELDMAP Mapping Between DataSource Fields and InfoObjects
InfoSpoke
RSBSPOKESELSET InfoSpoke Directory and Selection Options
RSBSPOKEVSELSET InfoSpoke Directory with Selection Options and Versioning
RSBSPOKE List of all InfoSpokes with attributes maintained in transaction RSBO, including the names of the source and target structures
RSBSPOKET List of all InfoSpokes with the short and long descriptions (only one of these can be maintained)
RSBSTEPIDMESS Contains all the messages recorded during the execution of an InfoSpoke; entries can be added using the ABAP class/method i_r_log->add_sy_message
SAP BW Statistics
RSDDSTAT Basic Table for InfoCubes/Queries
RSDDSTATAGGR Detail Table for Aggregate Setup
RSDDSTATAGGRDEF Detail Table of Navigation for each InfoCube/Query
RSDDSTATCOND InfoCube Compression
RSDDSTATDELE InfoCube Deletions
RSDDSTATWHM Warehouse Management
Misc
RSFEC BW Frontend Check. Useful for checking the installed SAP GUI versions on user machines.
RSSELDONE InfoPackage selection and job program, there in field UPDMODE the update status (INIT/DELTA/FULL)
RSPSADEL PSA Table deletion
TBTCP Job Schedule Definition
TBTCO Job Schedule Result
RSMONMESS Monitor Messages
RSERRORLOG Check loading errors in table
V_RSZGLOBV Report Variables view table
DEVACCESS Developer Keys table
TSTC All Transactions in the system
ROIDOCPRMS Control parameters for data transfer from the source system
SMEN_BUFFC Objects in User's Favorites
Web Item
RSZWITEM Header Table for BW Web Items
RSZWMDITEM BW Web Metadata: Template Item ( Dataprovider, Item, ... ).
RSZWITEMXREF Cross Reference of Web Items
RSZWMIMEIOBUFFER Buffer for Translation MIME Rep. to IO
RSZWOBJ Storage of the Web Objects
RSZWOBJTXT Texts of Templates/Items/Views
RSZWOBJXREF Structure of the BW Objects in a Template
RSZWTEMPLATE Header Table for BW HTML Templates
Archiving
RSARCHIPRO BW Archiving: General Archiving Properties
RSARCHIPROIOBJ BW Archiving: General Archiving Properties
RSARCHIPROLOC BW Archiving: General Local Properties
RSARCHIPROLOCSEL BW Archiving: Archived Data Area
RSARCHIPROPID BW Archiving: Program References of InfoProvider
RSARCHREQ BW Archiving: Archiving Request
RSARCHREQFILES BW Archiving: Verified Archive Files
RSARCHREQSEL BW Archiving: Request-Selections
Open Hub Destination
RSBOHSERVICETP Open Hub: Service Types
RSBREQUESTDELTA Open Hub: Cross Reference Outbound/Inbound
RSBREQUESTMESS Open Hub: Log for a Request
RSBREQUID Open Hub: Requests
RSBREQUID3RD Open Hub: Status of 3rd Party Requests
RSBREQUIDRUN Open Hub: Table with Status for a Request
RSBSTRUCTURE Open Hub: Generated Structures and Tables
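As a quick illustration of how these tables can be queried, here is a minimal ABAP sketch that lists recent InfoPackage loads and their update mode from RSSELDONE (the UPDMODE field is mentioned under Misc above; the request number field RNR used alongside it is an assumption and may differ in your release):

REPORT z_show_updmode.

* Minimal sketch against the standard table RSSELDONE listed above.
* UPDMODE (INIT/DELTA/FULL) is quoted in the list; the field RNR is
* an assumption.
DATA: lt_seldone TYPE STANDARD TABLE OF rsseldone,
      ls_seldone TYPE rsseldone.

SELECT * FROM rsseldone
         INTO TABLE lt_seldone
         UP TO 50 ROWS.

LOOP AT lt_seldone INTO ls_seldone.
  WRITE: / ls_seldone-rnr, ls_seldone-updmode.
ENDLOOP.
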

Wednesday, October 21, 2009

SAP BW Open Hub - Introduction

In general, the SAP BW Open Hub Service is a mechanism that reads data from a SAP BW object, namely an InfoCube, DataStore object or InfoObject, and places the contents into a staging area for consumption. It provides APIs for all the required steps: starting the Open Hub Service job, reading the data, and telling it when the data has been completely read.
The interaction between DataServices and the Open Hub Service consists of the following steps:
1. A DataServices Job is started, including many processing steps. One of these steps is a DataFlow with an Open Hub Destination as source.
2. Before the DataFlow can start reading the data, a DataTransferProcess (DTP) has to run to fill the Destination with the data. A DTP cannot be started directly from DataServices, only a ProcessChain containing one, so a SAP BW ProcessChain is associated with each Open Hub Destination reader.
3. The ProcessChain can potentially include many steps; at one point it will call a DTP that reads the source data and writes it into the Open Hub Destination.
4. Once all the data is copied into the Destination, the DTP pauses and sends a notification to the DataServices RFCServer, which therefore has to be configured first.
5. The RFCServer writes the notification information into a repository table (AL_SAP_JOB), and with that the DataServices DataFlow can finally start reading the data from the OpenHub Destination.
6. When the DataFlow completes, we have to tell the DTP that this OpenHub Destination was successfully read.
7. A BAPI call is therefore made to update the status of the DTP to green, and with that...
8. ...the ProcessChain can continue until its logical end.
9. In parallel, the next DataServices task is started.



The above architecture has a couple of consequences one needs to be aware of:
• Can a ProcessChain contain multiple steps? Yes, but just one DTP meant for DataServices. Otherwise each OpenHub Destination reader would trigger the same ProcessChain, executing it twice.
• Can two different OpenHub Destinations be used in parallel? Yes, no problem.
• Can the same OpenHub Destination be used in different DataFlows? Yes, but not in parallel. Each DTP will truncate the Destination table and load its data into it. So it is neither good to truncate the data while another DTP is still in the process of copying it, nor should the truncation be done while the Destination holds data that DataServices has not yet read completely. In other words, a DTP is done once its status is set to green - by DataServices.
• What if multiple developers are working on the same dataflow? The same argument as with other tables applies: should a target table be loaded by two developers at the same time? No. However, the actual OpenHub Destination table is editable, to enable the same dataflow to be tested in parallel. Just to mention that...
• Can a DTP be started manually via BW, with DataServices just reading that data, e.g. for testing?
• In step 5 the RFCServer writes into the AL_SAP_JOB table of the repo. Which repo, in case there are multiple?

SAP BW Open Hub Destination

Configuring SAP BW Open Hub Destination

The first step, after setting up a SAP BW external source system and configuring the DataServices RFCServer, is creating the Open Hub Destination (OHD). This is a "table" with a structure and will act as the interface between SAP BW and DataServices.



For this structure we would have to define all the columns, but usually our goal is to read the data from an existing object and therefore use the same columns. In the dialog below you can choose a template for the structure; since we want to read the data from the InfoCube ZDS_SALES, we use it as the template and thus get all of the InfoCube's columns in the Open Hub Destination.



We need a "3rd Party" OHD with RFC Destination being DataServices, the one we created in the first paragraph.


As we used the InfoCube as a template when creating the OHD, we do not start with an empty list of columns but with all columns of this template object. We could refine that list, but since we want to copy all the data 1:1, there is no need to.


As a last step we activate the OHD by clicking the Activate button in the toolbar.

The OpenHub Destination now exists but is completely disconnected, so we create a Transformation to specify where the data should come from.

In our case, the InfoCube ZDS_SALES is the source.

By default, all columns are mapped directly, which is exactly what we need in our case, so there is nothing more to do.

Now we need a DTP to execute the OHD load.



At the end, everything is saved and activated.

Wednesday, October 14, 2009

Currency translation in Bex

Why Currency translation in Bex
• Depending on the business scenario, we need to perform currency translation in BI reports using exchange rates with varying time references.
• For ad-hoc reporting, translation at a particular date's exchange rate is required.
• The target currency is not fixed, so the translation cannot be defined in the update/transfer rules.
• The time reference (date) on which the translation should be based is not fixed, so the translation cannot be defined in the update/transfer rules.

How to achieve currency translation in Bex (Implementation part)
1. The transaction code to create a currency translation type is RRC1.
2. Provide a technical name and description. Be careful with the description: just by looking at it, one should be able to figure out which time reference and which currency the translation type refers to, as this will be the only identifier in the Query Designer / BEx.
3. The visible tab pages are Exchange Rate, Source Currency, Target Currency and Time Reference. Let's look at the significance of each tab page.

Exchange Rate
1. Exchange Rate Type: In the general case this is M (standard translation at average rate).
Source Currency
1. Source Currency from Data Record: Fetches the currency from the data record of the key figure at query runtime. This is generally the default setting.
2. InfoObject for determining the source currency: An InfoObject that holds a currency can be specified; it is taken as the reference for converting to the target currency.
Target Currency
1. Selection of Target Currency with Translation: This option is selected when the target currency is not fixed; in the Query Designer / at runtime, the system asks for the target currency.
2. Fixed Target Currency: This option is selected when the target currency is fixed and will not change for reporting. In Indian scenarios, companies prefer to see amounts in currencies like "USD" or "EUR".
3. InfoObject for determining the target currency: An InfoObject that holds a currency can be specified; it is taken as the reference for converting to the target currency.
4. Target Currency from Variable: This option is selected when the target currency is not fixed. At query design time / runtime, the system asks for a variable containing the target currency. If such a variable does not yet exist, it needs to be created from the Query Designer; it is then added to the selection screen / input dialog when the query is executed.
Time References
Fixed Time Reference:
1. Current Date: Select this option when the report should convert all key figures into the target currency based on the exchange rate as of today's date.
2. Key Date: Select this option when the report should convert all key figures into the target currency based on the exchange rate of a specified date. Generally this date is the end of the fiscal year, but it changes depending on requirements.

3. Time-Based Value from Variable: Select this option when the report should convert all key figures into the target currency based on the exchange rate of a date specified at query runtime.
Variable Time Reference: The options available here are:

1 Fiscal year closing
2 Start of fiscal year
3 End of period
4 Beginning of period
5 End of calendar year
6 Start of calendar year
7 End month
8 Start month
9 End of week
A To the exact day
B End of Calendar Year/ Quarter
C Start of Calendar Year/ Quarter
Standard InfoObject: Based on the granularity selected in the variable time reference, the system assigns a time characteristic to the translation type.
For example, if "Fiscal year closing" is selected, the standard object would be 0FISCYEAR;
for end of week, the standard object would be 0CALWEEK;
for end of month, the standard object would be 0CALMONTH.
Special InfoObject: If this option is selected, we can specify an InfoObject other than the standard time characteristics.
Example: Suppose values must be translated at the exchange rate of each transaction's date, for a large number of transactions. In this case we would select variable time reference A (to the exact day) and the special InfoObject 0PSTNGDT (Posting Date). With this setting, for each posting date the system finds the exchange rate and converts the key figures using the rate valid on that posting date.

Query Key Date: If this option is selected, the system takes the query key date into consideration when translating amounts to the target currency (specified under Query -> Properties -> Query Key Date). Please note that if you have time-dependent master data characteristics in the report, this setting also affects how that master data is displayed.
Example: This option is useful in scenarios where the user wants to run a currency translation as of a specified date.
Disadvantages of currency translation in BEx
• Increases the execution time of the query.
• Increases the load on the system, as the system performs a currency translation for every record.
Advantages of currency translation in BEx
• No need to change the data model.
• Flexibility in the dates that are taken into account for the translation.

Technical and Functional upgrade from 3.1 to 7.0 - Check List


Before the technical upgrade

1. Make sure that all transports in the DEV system are released and imported into all downstream systems (QA and PRD).
2. Check for inconsistent InfoObjects and repair them as much as possible.
3. Clean up inconsistent PSA directory entries.
4. Check the consistency of PSA partitions.
5. Check compounding consistency in MultiProviders.

Right before the technical Upgrade procedure:

1. Apply the latest SPAM patch.
2. Download the most recent Support Package (SP) stack and the most recent BI support package. It is recommended to upgrade to the latest version of all relevant support packages during the upgrade.
3. Check for the newest versions of the SAP Notes for the upgrade.
4. Ensure that the correct Java runtime environment version is installed on the server.
5. Ensure DB statistics are up to date prior to the upgrade.
6. Check for inactive update rules and transfer rules. All update and transfer rules should be active.
7. Check for inactive InfoCubes and aggregates. All InfoCubes should be activated.
8. Check for inactive InfoObjects. All InfoObjects should be activated.
9. Check for inactive ODS objects. All ODS objects should be activated.
10. Make sure all ODS data requests have been activated.
11. Data loads and other operational tasks (e.g. change runs) should not be executed while SAPup runs, so reschedule InfoPackages and process chains. SAPup automatically locks background jobs.
12. Give special consideration to modifications to the characteristics 0CURRENCY, 0UNIT, 0DATE, 0DATEFROM, 0DATETO, 0SOURSYSTEM and 0TIME.
13. For Unicode systems, special reports must be run: execute reports RUTTTYPACT and UMG_POOL_TABLE.
14. Complete any data mart data extractions and suspend any further data mart extractions.
15. For 3.0 systems only: run SAP_FACTVIEWS_RECREATE from transaction SE38 before running SAPup.
16. Back up your system before executing PREPARE.

Notes for Upgrade

1. Review SAP notes 964418, 965386 and 934848, and plan to incorporate the installation of the new technical content into tasks performed following the technical upgrade procedure.
2. Review note 849857 to prevent potential data loss in the PSA/change log. Review note 856097 if issues are encountered with partitioning.
3. Review note 339889 to check PSA partition consistency.
4. Review SAP note 920416, which discusses a potential issue with compounding in MultiProviders.
5. Review note 1013369 for a new intermediate SAP NetWeaver 7.0 BI ABAP Support Package strategy.
6. Review note 449891, and also notes 883843 and 974639, to execute the routine for deleting temporary BI database objects.
7. Review note 449160 and execute program RSUPGRCHECK to locate any inactive update and transfer rules.
8. Review note 449160 and execute program RSUPGRCHECK to locate any inactive InfoCubes.
9. Review note 449160 and execute program RSUPGRCHECK to locate any inactive InfoObjects.
10. Review notes 449160 and 861890 and execute program RSUPGRCHECK to locate any inactive ODS objects.
11. Refer to note 996602. If modifications have been made to the characteristics 0CURRENCY, 0UNIT, 0DATE, 0DATEFROM, 0DATETO, 0SOURSYSTEM or 0TIME, create or locate a change request containing them, sourced from the BI development system. This change request will be re-imported not only into the BI development system but also into any other systems following SAPup.
12. Review notes 544623 and 813445 to run special reports for any UNICODE SAP system.
13. See notes 506694 and 658992 for more information. The SAP Service API (S-API), which is used for internal and BI data mart extraction, is upgraded during the upgrade; therefore the delta queues must be emptied prior to the upgrade to avoid any possibility of data loss.
14. For release SAP NetWeaver 7.0 there is a completely new workload statistics collector, which is incompatible with earlier workload statistics data. To preserve the data for use after the upgrade, follow the steps in SAP notes 1005238 and 1006116.
15. For BW 3.0B systems: execute report SAP_FACTVIEWS_RECREATE from SE38 before running SAPup to prevent problems with the /BIC/VF fact views. For more information, see SAP Note 563201.
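The inactive-object checks from points 7 to 10 above can be wrapped in a single run; here is a minimal sketch, assuming RSUPGRCHECK is executed with its default selections as described in note 449160:

REPORT z_pre_upgrade_checks.

* Hypothetical wrapper for the check described in notes 449160/861890:
* run RSUPGRCHECK with its default selections to list inactive update
* and transfer rules, InfoCubes, InfoObjects and ODS objects.
SUBMIT rsupgrcheck AND RETURN.

WRITE: / 'RSUPGRCHECK submitted - review its output for inactive objects.'.
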

Tuesday, October 13, 2009

Vijaya Lakshmi Nehru Pandit


Vijaya Lakshmi Nehru Pandit (August 18, 1900 - December 1, 1990) was an Indian diplomat and politician, sister of Indian Prime Minister Jawaharlal Nehru.

In 1921 she married Ranjit Sitaram Pandit, who died on January 14, 1944. She was the first Indian woman to hold a cabinet post. In 1937 she was elected to the provincial legislature of the United Provinces and was designated minister of local self-government and public health. She held the latter post until 1939 and again from 1946 to 1947. In 1946 she was elected to the Constituent Assembly from the United Provinces.

Following India's independence from the British in 1947, she entered the diplomatic service and became India's ambassador to the Soviet Union from 1947 to 1949, the United States and Mexico from 1949 to 1951, Ireland from 1955 to 1961 (during which time she was also the Indian High Commissioner to the United Kingdom), and Spain from 1958 to 1961. Between 1946 and 1968 she also headed the Indian delegation to the United Nations. In 1953, she became the first woman President of the United Nations General Assembly.

In India, she served as governor of Maharashtra from 1962 to 1964, after which she was elected to the Indian Lok Sabha from Phulpur, her brother's former constituency. She held office from 1964 to 1968. Pandit was a harsh critic of her niece, Indira Gandhi, after Gandhi became Prime Minister in 1966, and she retired from active politics after relations between them soured. On retiring she moved to Dehradun in the Doon Valley in the Himalayan foothills.

In 1979 she was appointed the Indian representative to the UN Human Rights Commission, after which she retired from public life. Her writings include The Evolution of India (1958) and The Scope of Happiness: A Personal Memoir (1979).

Her daughter Nayantara Sahgal, who later settled in her mother's house in Dehradun, is a well-known novelist.

Friday, October 9, 2009

Tax exemptions sought on donations

Hyderabad
The state government on Wednesday announced that 100 per cent tax exemption could be claimed on donations given to the Chief Minister’s Relief Fund to help the flood victims.
The revenue minister, Mr Dharmana Prasada Rao, told mediapersons that the exemption could be availed under 80 (G). The donors can also avail the online facility at http://cmo.ap.gov.in/cmrf and make payments through debit and credit cards, he said.
Under 80 (G) the donation will be deducted from the gross taxable income and tax will be calculated on the remaining amount.
For instance, a person earning Rs 10 lakh per annum has to pay a tax of Rs 1.79 lakh in normal circumstances. If a donation of Rs 10,000 is given to the CMRF, the person will get a tax benefit of Rs 3,000 (30 per cent of the donated amount, at the highest tax slab).
“If the donor decides to respond to the calamity and anyway give financial assistance at his level, the 100 per cent exemption under 80 (G) will give some tax benefit,” said Mr Giridhar Toshniwal, a chartered accountant.
The revenue minister also announced that the kin of the flood victims would get Rs 1 lakh from the Prime Minister’s Relief Fund in addition to Rs 2 lakh already announced by the state government. He said the state would also provide Rs 10,000 for loss of cattle.
The minister said the loss incurred in the energy sector because of damage to substations and poles was Rs 236.79 crore.
Power has been fully restored in Kurnool district and would be restored in Mahbubnagar by Thursday evening. There were 179 breaches on roads, of which 50 had already been repaired.
Mr Prasada Rao said the government allotted Rs 75 crore for roads, Rs 36 crore for rural water supply and Rs 40 crore to municipal administration.

Thursday, October 8, 2009

Rani of Jhansi


Rani Lakshmi Bai of Jhansi, whose heroism and superb leadership set an outstanding example for all future generations of women freedom fighters, was married to Gangadhar Rao, head of the state of Jhansi. After his death the British did not allow her to adopt a successor, and Jhansi was annexed.
With the outbreak of the Revolt she became determined to fight back. She used to go into the battlefield dressed as a man. Holding the reins of her horse in her mouth, she used the sword with both hands. Under her leadership the Rani's troops showed undaunted courage and returned shot for shot. Considered by the British the best and bravest military leader of the rebels, this sparkling epitome of courage died a hero's death on the battlefield.

The first name that comes to mind is that of the famous Rani Laxmibai of Jhansi. Dressed in men’s clothes, she led her soldiers to war against the British. Even her enemies admired her courage and daring. She fought valiantly and although beaten she refused to surrender and fell as a warrior should, fighting the enemy to the last. Her remarkable courage inspired many men and women in India to rise against the alien rule.

Jhalkari Bai The Leader

Jhalkari Bai was the female leader of a resistance struggle against British rule in India.
She came from a very poor family in Bundelkhand and once killed a tiger unaided, using only her axe. She bore a close resemblance to Rani Laxmibai, who convinced her to join the women's wing of her army. Jhalkari Bai defended Jhansi fort from British raids in 1857-1858. Just before the fort was about to be taken, Bai convinced Laxmibai to escape. Bai then impersonated Laxmibai and took command of the army. By the time the British discovered that Bai was an impostor, Laxmibai was already far away. Bai then fought the British army but was eventually compelled to surrender. However, the British general released her because he was impressed with her courage and leadership ability.