
Thursday, May 6, 2010

List of the frequently used ABAP Programs in SAP BI
• RSCDS_NULLELIM: Delete fact table rows where all Key Figure values are zero. See Note 619826.
• RSDG_CUBE_ACTIVATE: Activation of InfoCubes
• RSDG_CUBE_COPY: Make Copies of InfoCubes
• RSDG_CUBE_DELETE: Delete InfoCubes
• RSDG_DODS_REPAIR: Activation of all ODS Objects with Navigation Attributes
• RSDG_ODSO_ACTIVATE: Activation of all ODS Objects
• RSDG_IOBJ_ACTIVATE: Activation of all InfoObjects
• RSDG_IOBJ_DELETE: Deletion of InfoObjects
• RSDG_IOBJ_REORG: Repair InfoObjects
• RSDG_IOBJ_REORG_TEXTS: Reorganization of Texts for InfoObjects
• RSDG_MPRO_ACTIVATE: Activating MultiProviders
• RSDG_MPRO_COPY: Make Copies of MultiProviders
• RSDG_MPRO_DELETE: Deleting MultiProviders
• RS_COMSTRU_ACTIVATE_ALL: Activate all inactive Communication Structures
• RS_TRANSTRU_ACTIVATE_ALL: Activate Transfer Structure
• RSAU_UPDR_REACTIVATE_ALL: Activate Update Rules
• RRHI_HIERARCHY_ACTIVATE: Activate Hierarchies
• SAP_AGGREGATES_ACTIVATE_FILL: Activating and Filling the Aggregates of an InfoCube
• SAP_AGGREGATES_DEACTIVATE: Deactivating the Aggregates of an InfoCube
• RS_PERS_ACTIVATE: Activating Personalization in BEx (inactive entries are highlighted)
• RSSM_SET_REPAIR_FULL_FLAG: Convert Full Requests to Repair Full Requests
• SAP_INFOCUBE_DESIGNS: Print a list of the InfoCubes in the system and their layouts
• SAP_ANALYZE_ALL_INFOCUBES: Create DB Statistics for all InfoCubes
• SAP_CREATE_E_FACTTABLES: Create Missing E-Fact Tables for InfoCubes and Aggregates
• SAP_DROP_EMPTY_FPARTITIONS: Locate/Remove Unused or Empty partitions of F-Fact Table
• SAP_DROP_TMPTABLES: Remove Temporary Database Objects
• SAP_RSADMIN_MAINTAIN: Add, change, delete RSADMIN table entries
• CUBE_SAMPLE_CREATE: A fast way to put some "sample" records into an InfoCube. No need to use flat files; just enter the values in an ALV grid, or have the cube filled with random values.
• SAP_CONVERT_NORMAL_TRANS: Convert a Basis Cube to a Transactional Cube and vice versa (a sketch of running these reports follows the list).
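
Most of the programs above are executable ABAP reports, so besides running them directly in SE38/SA38 you can also call them from a small wrapper report. A minimal sketch (the report name ZBW_HOUSEKEEPING is my own; some of the reports have selection screens, so a productive version would supply a variant):

REPORT zbw_housekeeping.

* List all InfoCubes in the system together with their table layouts.
SUBMIT sap_infocube_designs AND RETURN.

* Remove temporary database objects left over from queries and loads.
SUBMIT sap_drop_tmptables AND RETURN.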

List of the frequently used Function Modules in SAP BW

1.RRMX_WORKBOOK_DELETE: Delete BW Workbooks permanently from Roles & Favorites
2.RRMX_WORKBOOK_LIST_GET: Get list of all Workbooks
3.RRMX_WORKBOOK_QUERIES_GET: Get list of queries in a workbook
4.RRMX_QUERY_WHERE_USED_GET: Lists where a query has been used
5.RRMX_JUMP_TARGET_GET: Get list of all Jump Targets
6.RRMX_JUMP_TARGET_DELETE: Delete Jump Targets
7.MONI_TIME_CONVERT: Used for Time Conversions.
8.CONVERT_TO_LOCAL_CURRENCY: Convert Foreign Currency to Local Currency.
9.CONVERT_TO_FOREIGN_CURRENCY: Convert Local Currency to Foreign Currency.
10.TERM_TRANSLATE_TO_UPPER_CASE: Used to convert all texts to UPPERCASE
11.UNIT_CONVERSION_SIMPLE: Used to convert any unit to another unit. (Ref. table: T006)
12.TZ_GLOBAL_TO_LOCAL: Used to convert timestamp to local time
13.FISCPER_FROM_CALMONTH_CALC: Convert 0CALMONTH or 0CALDAY to Fiscal Year or Fiscal Period
14.RSAX_BIW_GET_DATA_SIMPLE: Generic Extraction via Function Module
15.RSAU_READ_MASTER_DATA: Used in Data Transformations to read master data InfoObjects
16.RSDRI_INFOPROV_READ: Read data from an InfoProvider (InfoCube or DataStore object) in ABAP
17.RSDRI_INFOPROV_READ_DEMO: Demo showing how RSDRI_INFOPROV_READ is called
18.RSDRI_INFOPROV_READ_RFC: Used to read InfoCube or DataStore object data through RFC
19.DATE_COMPUTE_DAY: Returns the number of the weekday a date falls on (1 = Monday)
20.DATE_TO_DAY: Returns the name of the weekday a date falls on.
21.DATE_GET_WEEK: Returns the week (format YYYYWW) that a date falls in.
22.RP_CALC_DATE_IN_INTERVAL: Add/Subtract Years/Months/Days from a Date.
23.RP_LAST_DAY_OF_THE_MONTHS: Returns the last day of the month for a given date.
24.SLS_MISC_GET_LAST_DAY_OF_MONTH: Determine the Last Day of the Month.
25.RSARCH_DATE_CONVERT: Used for Date Conversions; can be used in InfoPackage routines.
26.RSPC_PROCESS_FINISH: Finish a process (of a process chain); can be used to trigger the continuation of a chain
27.DATE_CONVERT_TO_FACTORYDATE: Returns factory calendar date for a date
28.CONVERSION_EXIT_PDATE_OUTPUT: Conversion Exit for Domain GBDAT: YYYYMMDD - DD/MM/YYYY
29.CONVERSION_EXIT_ALPHA_INPUT: Conversion exit ALPHA, external->internal
30.CONVERSION_EXIT_ALPHA_OUTPUT: Conversion exit ALPHA, internal->external
31.RSAOS_METADATA_UPLOAD: Upload of metadata from R/3
32.RSDMD_DEL_MASTER_DATA: Deletion of master data
33.RSPC_CHAIN_ACTIVATE_REMOTE: To activate a process chain after transport
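
Most of these function modules are easy to try out in a throwaway report before you use them in a transfer routine or InfoPackage routine. A minimal sketch calling two of the modules from the list (report and variable names are my own):

REPORT zbw_fm_sandbox.

DATA: lv_ext  TYPE c LENGTH 18 VALUE '4711',
      lv_int  TYPE c LENGTH 18,
      lv_date TYPE sy-datum.

* Pad an external value with leading zeros (ALPHA internal format).
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = lv_ext
  IMPORTING
    output = lv_int.

* Calculate the date 30 days before today.
CALL FUNCTION 'RP_CALC_DATE_IN_INTERVAL'
  EXPORTING
    date      = sy-datum
    days      = 30
    months    = 0
    years     = 0
    signum    = '-'
  IMPORTING
    calc_date = lv_date.

WRITE: / 'Internal value:', lv_int,
       / '30 days ago:', lv_date.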

Monday, April 26, 2010

The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI & Business Objects Roadmap


What is SAP’s strategic direction for Enterprise, Formatted Reporting?
1. SAP's direction for enterprise, formatted reporting is Crystal Reports (CR), the de-facto industry standard for enterprise reporting. Crystal Reports is included as part of the SAP BusinessObjects BI suite, a premium offering. Crystal Reports is already integrated with SAP BW and SAP ERP today, using advanced connectivity methods to access virtually all SAP data objects needed for enterprise reporting. BEx Report Designer will remain in the NetWeaver BW portfolio for lightweight reporting on NetWeaver BW data, but beyond Release 7.0 EhP1 no further enhancements of the Report Designer are planned. To help SAP customers who have already invested in the BEx Report Designer transition to the full version of Crystal Reports, SAP will provide a 'base' version of Crystal Reports entitled 'Crystal Reports for NetWeaver BW'.
2. The base version will be restricted to SAP NetWeaver BW queries and views and limited in terms of productive reports.
3. Crystal Reports for SAP NetWeaver BW will be available as part of the NetWeaver license to all NetWeaver customers, with planned availability in 1H 2010.

What is SAP’s strategic direction for OLAP Analysis?
1. SAP is developing a new OLAP analysis tool for advanced analysis capabilities on SAP NW BW. The codename for this is Pioneer. Pioneer will combine Voyager's intuitive user interface with the powerful OLAP capabilities of the BEx Analyzer tools.
2. It will be available as a web client as well as an Excel client with tight integration between the two.
3. Pioneer will be a new state-of-the art OLAP client based on Ajax (Web 2.0) technology using advanced connectivity as the universal access layer for SAP & Non-SAP multidimensional data
4. Pioneer's combination of high end analysis capabilities, with ease-of-use and elegant visualization control & personalization features will reduce the need for specialized design-time tools and enable end users to make simple design & UI changes directly during data analysis.
5. Pioneer will be included as part of the SAP BusinessObjects Business Intelligence suite with planned availability in 1H 2010.
6. BEx Analyzer and BEx Web Analyzer will remain the base offering for NetWeaver BW and will continue to be shipped with NetWeaver. However, feature extensions for BEx Analyzer & BEx Web Analyzer will be limited.

What is SAP’s strategic direction for Dashboards?
Xcelsius will continue to be the tool of choice for the creation of standalone dashboards and as the visual content creation tool for the desktop or portal page. For the creation of composite dashboards that combine multiple BI content types, Xcelsius should be used in conjunction with SAP BusinessObjects Dashboard Builder (for non-SAP-NetWeaver customers or customers that have deployed the InfoView portal), or with SAP Composition Environment (Web Page Composer, Visual Composer) for those customers that have deployed the NetWeaver platform.
Some of the enhancements planned for future Xcelsius releases include:
- Enhancements to encompass functionality from BEx Web Application Designer, in particular data-binding functionality with SAP NW BW.
- Integration with Pioneer: it will be possible to build Xcelsius dashboards seamlessly on Pioneer analyses.
- Improved data access to directly use BEx queries and query views with BI Consumer Services, just like the NetWeaver BW BEx tools do in NW BW 7.0.
Xcelsius is available now as a premium offering and is part of the SAP BusinessObjects BI Suite. Beyond NW release 7.0 EhP1, only very limited enhancements to BEx Web Application Designer are planned. BEx WAD will remain in the NW offering as a base solution. SAP and BusinessObjects will provide a service-based offering to move from SAP NW BI to the BusinessObjects premium offering.

What is SAP’s strategic direction for Ad-hoc Query and Reporting?
1. Web Intelligence is the flagship product for web-based ad-hoc query and reporting.
2. The focus of Web Intelligence lies in the creation of light-weight reports in an ad hoc web environment.
3. At present, ad-hoc query and reporting is not covered extensively by the SAP NW BW tools. WebI clearly extends the value of SAP NW BW by addressing business users' needs for answering business questions in an agile and easy-to-use way.

What is SAP's strategic direction for Master Data Management?
SAP NW MDM and SAP BusinessObjects Data Services (Data Quality) are naturally complementary products. MDM provides a modeling environment, a central repository, generic data cleansing and matching capabilities, and synchronization and workflow capabilities. SAP BusinessObjects Data Quality provides first-class address cleansing and fuzzy matching of customer data that greatly enhance those of MDM. Today, there is already integration between the two products. Going forward, SAP intends to release an integrated version of SAP BusinessObjects Data Quality with MDM for enhanced cleansing and matching of addresses and customer data. In a typical workflow, customers will be able to start a Data Quality initiative locally with address cleansing against a CRM system, and move up towards central management, cleansing and distribution across the complete system landscape. At the same time, SAP will continue to offer SAP NW MDM and SAP BusinessObjects Data Services.

What is the future of the BEx Query Designer?
The BEx Query Designer is not affected by the roadmap. It remains a central tool for SAP NetWeaver BW.

What is SAP’s strategic direction for composite applications?
1. CE is and will continue to be SAP's flagship product for building composite applications, which can include both BI information and transactional/operational steps.
2. VC is an integral part of SAP NetWeaver's Composition Environment (CE).
3. SAP continues to invest in CE and CE will also be enhanced with premium functionality from SAP BusinessObjects.

What is SAP’s strategic direction for the data warehouse?
SAP NetWeaver BW continues to be a strategic focus and investment area for SAP. BW provides a complete data warehouse solution including a sophisticated modeling environment, industrial-strength data provisioning and dissemination, and an enterprise-ready solution for handling business intelligence. SAP NetWeaver BW thus offers an efficient approach for dealing with BI, giving enterprises the opportunity to standardize on one technology that runs their transactional and information environment.

What is SAP's strategic direction for the SAP NetWeaver BW Accelerator?
SAP NetWeaver BW Accelerator continues to be a strategic focus for SAP. SAP BW Accelerator ensures high-performance access to SAP NetWeaver BW data, which directly benefits the SAP BusinessObjects BI tools. As part of this strategic focus, SAP has leveraged the power of the SAP BW Accelerator by combining it with an intuitive BI front end, which has been released as SAP BusinessObjects Explorer.

Does SAP plan to discontinue support for products in the current BEx portfolio?
SAP will continue to offer support for the SAP BEx BI tools based on the customer's existing maintenance policy with SAP. However, all current and future innovation and development in the BI front end is focused on the SAP BusinessObjects BI solutions, while the SAP BI tools within NetWeaver are no longer the strategic direction.
What is the strategic direction for Visual Composer?
SAP encourages customers to use Visual Composer and will continue to provide support for the application. Over the past years, SAP has invested in reshaping Visual Composer to be an integral part of SAP NetWeaver Composition Environment (CE). The key achievements include: support for clustered services, componentization support, support for lifecycle management of modeling artifacts, model debugging, and support for the SAP standard UI technology, Web Dynpro. With Visual Composer enhancements coming later in 2009, SAP will complete the Visual Composer road map as laid out during TechEd '08. This includes integration capabilities into SAP NetWeaver Business Process Management (BPM). This means that Visual Composer is a mature capability that is fully supported by SAP. Besides Visual Composer, customers can of course continue to use the capabilities of Web Dynpro Java to model UIs directly.
What is the strategic direction for BEx Web Application Designer?
Web Application Designer (WAD) is no longer the strategic direction in terms of dashboarding and data visualization, but will continue to be supported under SAP's maintenance policy. Xcelsius is the clear direction and recommended product for dashboards and sophisticated visual presentations. However, the ability in WAD to build web templates for web application design and advanced analysis will be provided by Pioneer going forward.

What is SAP and BOBJ’s future offering for Information Broadcasting?
Information Broadcasting is no longer the strategic focus for information delivery and publishing. The SAP BusinessObjects BI suite offers a mature and robust suite of information delivery capabilities that customers can take advantage of today. However, for those SAP NetWeaver BW customers who want to continue to use Information Broadcasting, SAP will continue to support the product under SAP's maintenance policy.

Does SAP recommend upgrading an existing BW installation to SAP NetWeaver 7.0?
1. SAP definitely recommends upgrading your existing BW 3.x systems to SAP NetWeaver BW 7.0.
2. Although the SAP BusinessObjects BI tools and platform can run in conjunction with SAP NetWeaver '04 BW (SAP BW 3.5), SAP NetWeaver 7.0 BW provides significant advantages, in particular in the area of the BW server.
3. With this release, you can take advantage of the significant performance benefits and minimized maintenance of SAP NetWeaver BW Accelerator.
4. The new security approach for analysis authorizations guarantees easy and transparent definition of multidimensional access rights.
5. Important options also complement the data warehousing layer, including the write-optimized DataStore object for a very fast staging layer, and the new transformation concept and data flow design (DTP).
6. Real-time BI is facilitated by new methods to continuously request data from SAP sources and web service data (real-time data acquisition).
7. In information lifecycle management, SAP NetWeaver 7.0 offers interfaces to Near-Line Storage (NLS) systems that enable data to be exported from the operational database into NLS partitions under the control of the particular NLS vendor solution. Several vendors are already certified. From the query perspective, the existence of NLS partitions for basic cubes or DataStore objects is completely transparent. If a BEx query is marked with the additional flag 'Read NLS as well', the data in external NLS partitions is directly accessible without the need for reloading. NLS partitions of an InfoProvider are read-only and therefore typically contain older data that is not of interest to the entire business user community. The NLS concept thus reduces TCO for database management and also keeps old data that nobody wants to look at away from the BW Accelerator; NLS therefore has to be seen as complementary to the BWA approach. In order to establish NLS as a cornerstone of the SAP NetWeaver BW Data Warehouse, NLS is supported in the BI 7.0 release for basis cubes and DataStore objects (from 7.01 for write-optimized DataStore objects). From BW release 7.1 on, MultiProviders will be supported as well, in order to strengthen SAP NetWeaver BW's OLAP and Data Mart extension strategy through NLS.

8. Beyond these highlights, many more usability and performance improvements have been made, including new administration and monitoring facilities (such as the Administration Cockpit), and further modelling options.
9. In general, we recommend the NetWeaver BW 7.0 upgrade because planned future investments into an even more seamless integration of SAP BusinessObjects tools with NetWeaver BW (for example, plans around direct connectivity of Xcelsius or Pioneer on top of NetWeaver BW queries) will be based on NetWeaver BW 7.0 as the source system. Therefore you can expect the integration of SAP BusinessObjects tools on top of NW BW to be even tighter with BI 7.0 than with older BW releases.
10. For users of the NetWeaver Composition Environment, especially Visual Composer 7.1, an upgrade to NetWeaver BI 7.0 is also highly recommended, as Composition Environment & Visual Composer 7.1 do not connect to NW BI releases prior to NetWeaver BI 7.0. (Accessing 3.x NW BI is possible in theory via XMLA but is not recommended by SAP due to the many functional restrictions).
11. Whereas the technical upgrade to NW BI 7.0 is generally recommended (see the reasons above), active usage of the BEx 7.0 tools depends on specific customer requirements and is not always necessary. However, we do not recommend using BEx Report Designer, as SAP already offers a far superior reporting tool in Crystal Reports. BEx Web Application Designer should be used if a specific project requires it, but keep in mind that there will be no future development for BEx Web Application Designer and the BEx Web Runtime. We recommend evaluating NetWeaver Composition Environment 7.1.1 as well as tools such as Xcelsius and Web Intelligence. BEx Analyzer can be used in its 3.5 or 7.0 version. There will be migration support for standard 3.5 & 7.0 workbooks that do not include Visual Basic coding.

Does SAP provide a transition path from the BEx Analyzer to current or future SAP BusinessObjects BI tools?
In the short term, SAP recommends that customers look to the full SAP BusinessObjects BI Suite to meet all new and future BI requirements. However, for those customers who see value in the BEx Analyzer and want to continue to use the tool to meet advanced analysis needs, SAP recommends the continued use of the BEx Analyzer. For the mid to long term, SAP is developing the next-generation BEx Analyzer tool, called Pioneer, which will be available in mid 2010. Pioneer will provide a superset of capabilities from SAP BEx Analyzer (Excel), SAP BEx Analyzer (Web), and SAP BusinessObjects Voyager. End users will find the transition from the BEx Analyzer to Pioneer easy, as Pioneer will allow the consumption of most BEx workbooks and the re-use of the same BEx queries that the BEx Analyzer uses today.

How do the SAP BusinessObjects BI products integrate with the SAP NetWeaver Portal?
Reports, analyses and dashboards built with SAP BusinessObjects BI technology can be integrated into the SAP NetWeaver Portal on an iView level. SAP is currently working on even tighter integration. Crystal Reports is already tightly integrated with the NW Portal's Knowledge Management services.

When should I use SAP BusinessObjects Rapid Marts instead of SAP NW Data Warehouse?
1. If your long-term company strategy is going clearly towards an enterprise-wide Data Warehouse approach, you can get started right away using SAP NetWeaver BW Data Warehouse.
2. If you are planning a project with limited scope, in terms of either size and reach (for example, a specific BI solution for a business department) or time (as a quick win or an intermediate solution), you should consider an approach using SAP BusinessObjects Rapid Marts. You can then extend to a full-blown enterprise data warehouse later on by bringing in an SAP NetWeaver BI Data Warehouse.

Do I need an additional server for the SAP BusinessObjects BI platform?
1. Yes, an additional server is required for the SAP BusinessObjects BI platform (BusinessObjects Enterprise, BOE).
2. The SAP BusinessObjects platform provides the infrastructure for most of the client tools, such as the universe repository, scheduling services, caching services, etc.
3. BOE supports various platforms, including Linux, Microsoft Windows, Sun Solaris, IBM AIX and HP-UX. From a DI&DQ (Data Integration & Data Quality) perspective, the amount of additional resources needed for the Data Services server should be reviewed.

Will the SAP BusinessObjects BI platform and SAP NetWeaver be integrated into one single platform?
SAP NetWeaver and SAP BusinessObjects BI serve different IT and business user needs, and both have unique platform requirements. For example, long-standing SAP NetWeaver and BusinessObjects customers potentially run data warehouse strategies based on distinct platforms. Therefore it is not planned for the foreseeable future that the complete SAP BusinessObjects BI platform and the SAP NetWeaver platform will run on the same platform. However, SAP is working on a platform rationalization strategy that will reduce the overall administration effort when running both, with the aim of lowering the overall cost of ownership. Examples include the sharing of services such as server management, the administration console, the user store, authorizations, job scheduling, lifecycle management, and content delivery.

Can the SAP BusinessObjects BI platform be deployed on an SAP NetWeaver system with usage type AS Java?
No, as it is not a pure J2EE application but a mix of Java and C++. SAP is evaluating whether the SAP BusinessObjects BI platform/server can be shipped as a "standalone engine" (as is the case for BWA and liveCache), interoperating with the SAP NetWeaver system at low TCO.

Thursday, April 15, 2010

Real-Time Data Acquisition

Definition
Real-time data acquisition supports operational reporting and analysis by allowing you to send data to the delta queue or PSA table in real time and to pass this data to InfoProviders in the operational DataStore layer at regular, short intervals using a daemon.
The data is stored persistently in the BI system.
Purpose
Real-time data acquisition supports tactical decision-making.
It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real-time.
A daemon is used to transfer the data to the DataStore objects in the operational DataStore layer at frequent, regular intervals.
Prerequisites
1. The DataSource has to support real-time data acquisition.
1.1 BI Content DataSources have to be delivered with the property for supporting real-time data acquisition.
1.2 For generic DataSources, the real-time indicator has to be set in the generic delta settings.
Process Flow
1. Data is loaded into BI at frequent, regular intervals
1.1 and is then posted to the DataStore objects that are available for operational reporting.
1.2 Data is available for reporting as soon as it has been successfully posted to the DataStore object and activated.
Web Service
A: You use this service to write data from the source into the PSA.
B: Only an InfoPackage (full upload) is required to determine specific parameters for real-time data acquisition.
Using the Service API
A: Data from the SAP source system can be loaded into the PSA using an InfoPackage created specifically for this purpose.
B: You have to simulate the initialization of the delta process for the DataSource beforehand.
Challenges
1. You can only use real-time data acquisition to fill DataStore objects.
1.1 Data is first transferred into the PSA and then into the DataStore object.
1.2 The DataStore object cannot be used as the source for a further real-time data transfer to another DataStore object.
1.3 Master data cannot be transferred to the BI system with real-time data acquisition.
1.4 Navigation attributes of the characteristics can no longer be used in aggregates, because aggregates cannot react to real-time updates: the change run cannot be triggered automatically in the RDA process for the loaded data.
2. DataSources that are used for real-time data acquisition cannot be used in the delta process for standard data transfer.
2.1 A data transfer with RDA and a scheduled data transfer cannot be executed simultaneously in the delta process for a DataSource, because there may be only one entry in the delta queue for each DataSource and target system.
2.2 Data can, however, be updated from the PSA to one DataStore object with a DTP for real-time data acquisition, and to another DataStore object with a standard DTP.
3. If you load data into a DataStore object with real-time data acquisition, you cannot load data into this DataStore object simultaneously with an additional DTP.
3.1 This is because there can only be one open activation request in a DataStore object.
3.2 Real-time data acquisition keeps an activation request open parallel to each DTP request.
3.3 A further data transfer process cannot load into the same DataStore object as long as an activation request is open.
4. Depending on requirements, you can nevertheless merge the data that you load with real-time data acquisition with data from additional DataSources in an InfoProvider.
4.1 The DataStore object into which you load data with real-time data acquisition can be used in a MultiProvider or InfoSet.
4.2 Using a process chain, you can restrict the time in which you load data into the DataStore object with real-time data acquisition. You can load into the same DataStore object with a different data transfer process during the remainder of the time.

Daemon
Definition:
A background process that processes the InfoPackages and data transfer processes assigned to it at regular intervals.
Use: The daemon controls and monitors the transfer process for real-time data acquisition.
1. If you are transferring data from an SAP source system, the daemon starts the transfer of data into the PSA using an InfoPackage.
2. It controls the status of the data transfer.
3. It starts the further processing of data into the DataStore object using a data transfer process.
4. It closes and opens requests when threshold values are reached.
5. It triggers the subsequent process chains.

Function: When extracting data, the daemon works on the basis of the list of DataSources assigned to it in the InfoPackages. It:
1. extracts the data from the source systems,
2. transfers it to the PSA tables and DataStore objects,
3. informs the service API in the source system when the data for the target system has been successfully updated,
4. deletes the updated data from the delta queue of the source system once the PSA request has been successfully closed and a new delta request has been opened.
When data is further updated from the PSA to the DataStore objects, the daemon works on the basis of the list of sources (DataSources) and targets (DataStore objects) assigned to it in the data transfer processes:
1. It transfers the data to the DataStore object.
2. The data is activated directly in standard DataStore objects and written to the change log.

Operating Mode
1. The daemon runs in a permanent background job and, between extraction processes, switches to idle mode, during which the background job continues execution.
a. If you are using the service API to transfer data, the RFC connection to the source system remains open.
i. This implies that a permanent RFC connection is required between each source system and BI for real-time data acquisition using the service API.
b. To prevent the system from using too much main memory and to avoid keeping the RFC connection open for too long, the daemon stops itself on a regular basis and reschedules itself.
i. As a result, the main memory can be released and a new RFC connection is opened.
ii. This happens without having to close the request for real-time data acquisition.

Error Handling
1. The daemon writes each successfully executed step to a control table.
a. If the extraction or update is terminated, the daemon restarts itself and continues the process from the point at which it was terminated.
i. It repeats the entire substep with the granularity of a data package.
(For example, if execution of the DTP is terminated, the daemon gets the new data from the source and loads it into the InfoProvider together with the data packages that terminated previously.)

DTP Process
1. The system synchronizes InfoPackage requests (PSA requests), data transfer process requests (DTP requests) and change log requests in the standard DataStore object, and these requests remain open across several load processes.
a. When a daemon is started, it opens a PSA request in the first load process.
b. The system opens a DTP request and a change log request when the PSA request contains data.
c. The daemon uses the threshold values defined in the InfoPackage to determine when to close a request.
i. The data transfer process copies these settings from the InfoPackage.
ii. When a PSA request is closed, the related DTP and change log requests in the same load process are closed too.
d. When the requests are closed, new requests are opened automatically the next time the daemon accesses data, and the data transfer for real-time data acquisition continues with these new requests.
e. If you are transferring data from an SAP source system, data is automatically deleted from the delta queue when the new PSA request is opened.
i. You can only update data from the DataStore object to a further InfoProvider (an InfoCube, for example) once the DTP and change log requests are closed.
f. In the monitor for real-time data acquisition, you can close PSA and DTP requests manually.

Sunday, December 27, 2009

Happy New Year

Dear Friend,
Wishing you a Merry Christmas and a Prosperous New Year.

Where did the year go?

It's December... and we realize that we started January with giant strides and, within the blink of an eye, 2009 is on its back!
A big "Thank You" to each and every one of you for the impact you had on my life this year, especially for all the support, telephone calls, SMSes and e-mails I received (not forgetting those shoulders when I needed them), the happy moments to smile about and the sad ones to cry about...
Without you, I'm sure that 2009 would have been extremely boring.
From my side I wish you all a MAGICAL FESTIVE SEASON filled with Loving Wishes and Beautiful Thoughts.
May 2010 mark the beginning of a Tidal Wave of Love, Happiness and Bright Futures.


To those who need money, may your finances overflow
To those who need caring, may you find a good heart
To those who need friends, may you meet lovely people
To those who need life, may you find GOD.
Wishing you one more fantastic new year... Hope this year brings all the happiness and success you have long been looking for...!!!

Cheers!
Ramu

Wednesday, December 2, 2009

Major Changes in BI 7.0

Below are the major changes in BI 7.0 or 2004S version when compared with earlier versions.
1. In InfoSets you can now include InfoCubes as well.
2. The Remodeling transaction helps you add new key figures and characteristics and handles historical data as well, without much hassle. This is only for InfoCubes.
3. The BI Accelerator (for now only for InfoCubes) helps in reducing query run time by almost a factor of 10 to 100. The BI Accelerator is a separate box and would cost more.
4. Monitoring has been improved with a new portal-based cockpit, which means you would need an EP person on your project for implementing the portal!
5. Search functionality has improved! You can search for any object.
6. Transformations are in and routines are passé! Yes, you can always revert to the old transactions too.
7. ODS has been renamed DataStore.
8. Inclusion of the write-optimized DataStore object, which has no change log and whose requests do not need any activation.
9. Unification of transfer and update rules.
10. Introduction of the "end routine" and "expert routine".
11. Push of XML data into the BI system (into the PSA) without the Service API or delta queue.
12. Introduction of the BI Accelerator, which significantly improves performance.
13. Loading through the PSA has become a must.
Metadata Search (Developer Functionality) :
1. It is possible to search BI metadata (such as InfoCubes, InfoObjects, queries, Web templates) using the TREX search engine. This search is integrated into the Metadata Repository, the Data Warehousing Workbench and to some degree into the object editors. With the simple search, a search for one or all object types is performed in technical names and in text.
2. During the text search, lower and uppercase are ignored and the object will also be found when the case in the text is different from that in the search term. With the advanced search, you can also search in attributes. These attributes are specific to every object type. Beyond that, it can be restricted for all object types according to the person who last changed it and according to the time of the change.
3. For example, you can search in all queries that were changed in the last month and that include both the term "overview" in the text and the characteristic customer in the definition. Further functions include searching in the delivered (A) version, fuzzy search and the option of linking search terms with "AND" and "OR".
4. "Because the advanced search described above offers more extensive options for search in metadata, the function ""Generation of Documents for Metadata"" in the administration of document management (transaction RSODADMIN) was deleted. You have to schedule (delta) indexing of metadata as a regular job (transaction RSODADMIN).

Effects on Customizing:
• Installation of the TREX search engine
• Creation of an RFC destination for the TREX search engine
• Entering the RFC destination into table RSODADMIN_INT
• Determining the relevant object types
• Initial indexing of metadata
Remote Activation of DataSources (Developer Functionality):
1. When activating Business Content in BI, you can activate DataSources remotely from the BI system. This activation is subject to an authorization check. You need role SAP_RO_BCTRA; authorization object S_RO_BCTRA is checked. The authorization is valid for all DataSources of a source system. When the objects are collected, the system checks the authorizations remotely, and issues a warning if you lack authorization to activate the DataSources.
2. In BI, if you trigger the transfer of the Business Content in the active version, the results of the authorization check are based on the cache. If you lack the necessary authorization for activation, the system issues a warning for the DataSources. BW issues an error for the corresponding source-system-dependent objects (transformations, transfer rules, transfer structure, InfoPackage, process chain, process variant). In this case, you can use Customizing for the extractors to manually transfer the required DataSources in the source system from the Business Content, replicate them in the BI system, and then transfer the corresponding source-system-dependent objects from the Business Content. If you have the necessary authorizations for activation, the DataSources in the source system are transferred to the active version and replicated in the BI system. The source-system-dependent objects are activated in the BI system.
3. Source systems and/or BI systems have to have BI Service API SAP NetWeaver 2004s at least; otherwise remote activation is not supported. In this case, you have to activate the DataSources in the source system manually and then replicate them to the BI system.
Copy Process Chains (Developer Functionality):
You find this function in the Process Chain menu and use it to copy the process chain you have selected, along with its references to process variants, and save it under a new name and description.
InfoObjects in Hierarchies (Data Modeling):
1. Up to Release SAP NetWeaver 2004s, it was not possible to use InfoObjects with a length longer than 32 characters in hierarchies. These types of InfoObjects could not be used as a hierarchy basic characteristic and it was not possible to copy characteristic values for such InfoObjects as foreign characteristic nodes into existing hierarchies. From SAP NetWeaver 2004s, characteristics of any length can be used for hierarchies.
2. To load hierarchies, the PSA transfer method has to be selected (which is always recommended for loading data anyway). With the IDOC transfer method, it continues to be the case that only hierarchies can be loaded that contain characteristic values with a length of less than or equal to 32 characters.
Parallelized Deletion of Requests in DataStore Objects (Data Management) :
Now you can delete active requests in a DataStore object in parallel. Up to now, the requests were deleted serially within an LUW. This can now be processed by package and in parallel.
Object-Specific Setting of the Runtime Parameters of DataStore Objects (Data Management):
Now you can set the runtime parameters of DataStore objects by object and then transport them into connected systems. The following parameters can be maintained:
- Package size for activation
- Package size for SID determination
- Maximum wait time before a process is designated lost
- Type of processing: Serial, Parallel (batch), Parallel (dialog)
- Number of processes to be used
- Server/server group to be used

Enhanced Monitor for Request Processing in DataStore Objects (Data Management):
1. For the request operations executed on DataStore objects (activation, rollback and so on), there is now a separate, detailed monitor. In previous releases, request-changing operations were displayed in the extraction monitor; when the same operations were executed multiple times, it was very difficult to assign the messages to the respective operations.
2. In order to guarantee a more simple error analysis and optimization potential during configuration of runtime parameters, as of release SAP NetWeaver 2004s, all messages relevant for DataStore objects are displayed in their own monitor.

Write-Optimized DataStore Object (Data Management):

1. Up to now it was necessary to activate the data loaded into a DataStore object to make it visible to reporting or to be able to update it to further InfoProviders. As of SAP NetWeaver 2004s, a new type of DataStore object is introduced: the write-optimized DataStore object.

2. The objective of the new object type is to save data as efficiently as possible in order to be able to further process it as quickly as possible, without additional effort for generating SIDs, aggregation and data-record-based delta handling. Data that is loaded into write-optimized DataStore objects is available immediately for further processing. The activation step that has been necessary up to now is no longer required.

3. The loaded data is not aggregated. If two data records with the same logical key are extracted from the source, both records are saved in the DataStore object. During loading, for reasons of efficiency, no SID values can be determined for the loaded characteristics. The data is still available for reporting. However, in comparison to standard DataStore objects, you can expect to lose performance because the necessary SID values have to be determined during query runtime.

Deleting from the Change Log (Data Management):

The Deletion of Requests from the Change Log process type supports the deletion of change log files. You select DataStore objects to determine the selection of requests. The system supports multiple selections. You select objects in a dialog box for this purpose. The process type supports the deletion of requests from any number of change logs.
Using InfoCubes in InfoSets (Data Modeling):

1. You can now include InfoCubes in an InfoSet and use them in a join. InfoCubes are handled logically in InfoSets like DataStore objects. This is also true for time dependencies. In an InfoCube, data that is valid for different dates can be read.

2. For performance reasons you cannot define an InfoCube as the right operand of a left outer join. SAP does not generally support more than two InfoCubes in an InfoSet.

Pseudo Time Dependency of DataStore Objects and InfoCubes in InfoSets (Data Modeling) :

In BI only master data can be defined as a time-dependent data source. Two additional fields/attributes are added to the characteristic. DataStore objects and InfoCubes that are being used as InfoProviders in the InfoSet cannot be defined as time dependent. As of SAP NetWeaver 2004s, you can specify a date or use a time characteristic with DataStore objects and InfoCubes to describe the validity of a record. These InfoProviders are then interpreted as time-dependent data sources.

Left Outer: Include Filter Value in On-Condition (Data Modeling) :

1. The global properties in InfoSet maintenance have been enhanced by a new setting, Left Outer: Include Filter Value in On-Condition. This indicator controls how a condition on a field of a left outer table is converted in the SQL statement. This affects the query results:
• If the indicator is set, the condition/restriction is included in the on-condition in the SQL statement. In this case the condition is evaluated before the join.
• If the indicator is not set, the condition/restriction is included in the where-condition. In this case the condition is only evaluated after the join.
• The indicator is not set by default.
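
The effect is easiest to see at the SQL level. Here is a generic sketch in ABAP Open SQL; the tables ZTAB_A/ZTAB_B and their fields are invented for illustration, and older releases restrict literals in the ON condition of an outer join:

TYPES: BEGIN OF ty_result,
         customer TYPE c LENGTH 10,
         revenue  TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_result.
DATA lt_result TYPE STANDARD TABLE OF ty_result.

* Indicator set: the restriction is part of the ON condition and is
* evaluated before the join. Customers without a 2009 record are kept,
* with an initial revenue value.
SELECT a~customer b~revenue
  INTO TABLE lt_result
  FROM ztab_a AS a LEFT OUTER JOIN ztab_b AS b
  ON a~customer = b~customer AND b~calyear = '2009'.

* Indicator not set: the restriction moves to the WHERE clause and is
* evaluated after the join. Customers without a 2009 record are dropped,
* as if an inner join had been used.
SELECT a~customer b~revenue
  INTO TABLE lt_result
  FROM ztab_a AS a LEFT OUTER JOIN ztab_b AS b
  ON a~customer = b~customer
  WHERE b~calyear = '2009'.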

Key Date Derivation from Time Characteristics (Data Modeling) :

Key dates can be derived from the time characteristics 0CALWEEK, 0CALMONTH, 0CALQUARTER, 0CALYEAR, 0FISCPER, 0FISCYEAR: It was previously possible to specify the first, last or a fixed offset for key date derivation. As of SAP NetWeaver 2004s, you can also use a key date derivation type to define the key date.

Repartitioning of InfoCubes and DataStore Objects (Data Management):

With SAP NetWeaver 2004s, the repartitioning of InfoCubes and DataStore objects on the database that are already filled is supported. With partitioning, the runtime for reading and modifying access to InfoCubes and DataStore objects can be decreased. Using repartitioning, non-partitioned InfoCubes and DataStore objects can be partitioned or the partitioning schema for already partitioned InfoCubes and DataStore objects can be adapted.

Remodeling InfoProviders (Data Modeling):


1. As of SAP NetWeaver 2004s, you can change the structure of InfoCubes into which you have already loaded data, without losing the data. You have the following remodeling options:
2. For characteristics:
• Inserting, or replacing characteristics with: Constants, Attribute of an InfoObject within the same dimension, Value of another InfoObject within the same dimension, Customer exit (for user-specific coding).
• Delete
3. For key figures:
• Inserting: Constants, Customer exit (for user-specific coding).
• Replacing key figures with: Customer exit (for user-specific coding).
• Delete
4. SAP NetWeaver 2004s does not support the remodeling of InfoObjects or DataStore objects. This is planned for future releases. Before you start remodeling, make sure:
(A) You have stopped any process chains that run periodically and affect the corresponding InfoProvider. Do not restart these process chains until remodeling is finished.
(B) There is enough available tablespace on the database.

After remodeling, check which BI objects connected to the InfoProvider (transformation rules, MultiProviders, queries and so on) have been deactivated. You have to reactivate these objects manually.

Parallel Processing for Aggregates (Performance):

1. The change run, rollup, condensing and checking of multiple aggregates can be executed in parallel. Parallelization takes place across the aggregates. The parallel processes are always executed in the background, even when the main process is executed in dialog.

2. This can considerably decrease execution time for these processes. You can determine the degree of parallelization and determine the server on which the processes are to run and with which priority.

3. If no setting is made, a maximum of three processes are executed in parallel. This setting can be adjusted for a single process (change run, rollup, condensing of aggregates and checks). Together with process chains, the affected setting can be overridden for every one of the processes listed above. Parallelization of the change run according to SAP Note 534630 is obsolete and is no longer being supported.

Multiple Change Runs (Performance):

1. You can start multiple change runs simultaneously. The prerequisite for this is that the lists of the master data and hierarchies to be activated are different and that the changes affect different InfoCubes. After a change run, all affected aggregates are condensed automatically.

2. If a change run terminates, the same change run must be started again. You have to start the change run with the same parameterization (same list of characteristics and hierarchies). SAP Note 583202 is obsolete.

Partitioning Optional for Aggregates (Performance):

1. Up to now, the aggregate fact tables were partitioned if the associated InfoCube was partitioned and the partitioning characteristic was in the aggregate. Now it is possible to suppress partitioning for individual aggregates. If aggregates do not contain much data, very small partitions can result. This affects read performance. Aggregates with very little data should not be partitioned.

2. Aggregates that are not to be partitioned have to be activated and filled again after the associated property has been set.

MOLAP Store (Deleted) (Performance):

Previously you were able to create aggregates either on the basis of a ROLAP store or on the basis of a MOLAP store. The MOLAP store was a platform-specific means of optimizing query performance. It used Microsoft Analysis Services and, for this reason, it was only available for a Microsoft SQL server database platform. Because HPA indexes, available with SAP NetWeaver 2004s, are a platform-independent alternative to ROLAP aggregates with high performance and low administrative costs, the MOLAP store is no longer being supported.

Data Transformation (Data Management):

1. A transformation has a graphical user interface and, together with the functionality of the data transfer process (DTP), replaces the transfer rules and update rules. Transformations are generally used to transform an input format into an output format. A transformation consists of rules. A rule defines how the data content of a target field is determined. Various rule types are available, such as direct transfer, currency translation, unit of measure conversion, routine, and reading from master data.

2. Block transformations can be realized using different data package-based rule types such as start routine, for example. If the output format has key fields, the defined aggregation behavior is taken into account when the transformation is performed in the output format. Using a transformation, every (data) source can be converted into the format of the target by using an individual transformation (one-step procedure). An InfoSource is only required for complex transformations (multistep procedures) that cannot be performed in a one-step procedure.

3. The following functional limitations currently apply:
- You cannot use hierarchies as the source or target of a transformation.
- You cannot use master data as the source of a transformation.
- You cannot use a template to create a transformation.
- No documentation has been created in the metadata repository yet for transformations.
- In the transformation there is no check for referential integrity, the InfoObject transfer routines are not considered, and routines cannot be created using the return table.
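
For illustration, this is roughly what a start routine looks like in the new concept. The class frame and the type of SOURCE_PACKAGE are generated by the transformation editor; the field /BIC/ZSTATUS and the filter value are hypothetical:

METHOD start_routine.
* SOURCE_PACKAGE holds the current data package from the source.
* Records deleted here never reach the transformation rules.
  DELETE source_package WHERE /bic/zstatus = 'D'.
* To terminate the whole request instead, raise the generated
* exception CX_RSROUT_ABORT.
ENDMETHOD.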

Quantity Conversion :

As of SAP NetWeaver 2004s you can create quantity conversion types using transaction RSUOM. The business transaction rules of the conversion are established in the quantity conversion type. The conversion type is a combination of different parameters (conversion factors, source and target units of measure) that determine how the conversion is performed. In terms of functionality, quantity conversion is structured similarly to currency translation. Quantity conversion allows you to convert key figures with units that have different units of measure in the source system into a uniform unit of measure in the BI system when you update them into InfoCubes.
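
The conversion types themselves are maintained in RSUOM and used in queries or transformation rules, but the underlying T006-based conversion is easy to test directly with the function module UNIT_CONVERSION_SIMPLE mentioned in the earlier list. A minimal sketch (units and values are examples):

DATA: lv_qty_kg TYPE p LENGTH 13 DECIMALS 3 VALUE '2500.000',
      lv_qty_to TYPE p LENGTH 13 DECIMALS 3.

* Convert 2,500 KG to tonnes using the factors maintained in T006.
CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
  EXPORTING
    input                = lv_qty_kg
    unit_in              = 'KG'
    unit_out             = 'TO'
  IMPORTING
    output               = lv_qty_to
  EXCEPTIONS
    conversion_not_found = 1
    OTHERS               = 2.
IF sy-subrc <> 0.
  WRITE / 'No conversion maintained between the two units.'.
ENDIF.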

Data Transfer Process :

You use the data transfer process (DTP) to transfer data within BI from a persistent object to another object in accordance with certain transformations and filters. In this respect, it replaces the InfoPackage, which only loads data to the entry layer of BI (PSA), and the data mart interface. The data transfer process makes the transfer processes in the data warehousing layer more transparent. Optimized parallel processing improves the performance of the transfer process (the data transfer process determines the processing mode). You can use the data transfer process to separate delta processes for different targets and you can use filter options between the persistent objects on various levels. For example, you can use filters between a DataStore object and an InfoCube. Data transfer processes are used for standard data transfer, for real-time data acquisition, and for accessing data directly. The data transfer process is available as a process type in process chain maintenance and is to be used in process chains.

ETL Error Handling :

The data transfer process supports you in handling data records with errors. The data transfer process also supports error handling for DataStore objects. As was previously the case with InfoPackages, you can determine how the system responds if errors occur. At runtime, the incorrect data records are sorted and can be written to an error stack (request-based database table). After the error has been resolved, you can further update data to the target from the error stack. It is easier to restart failed load processes if the data is written to a temporary store after each processing step. This allows you to determine the processing step in which the error occurred. You can display the data records in the error stack from the monitor for the data transfer process request or in the temporary storage for the processing step (if filled). In data transfer process maintenance, you determine the processing steps that you want to store temporarily.

InfoPackages :

InfoPackages only load the data into the input layer of BI, the Persistent Staging Area (PSA). Further distribution of the data within BI is done by the data transfer processes. The following changes have occurred due to this:
- New tab page: Extraction -- The Extraction tab page includes the settings for adapter and data format that were made for the DataSource. For data transfer from files, the External Data tab page is obsolete; the settings are made in DataSource maintenance.
- Tab page: Processing -- Information on how the data is updated is obsolete because further processing of the data is always controlled by data transfer processes.
- Tab page: Updating -- On the Updating tab page, you can set the update mode to the PSA depending on the settings in the DataSource. In the data transfer process, you now determine how the update from the PSA to other targets is performed. Here you have the option to separate delta transfer for various targets.

For real-time acquisition with the Service API, you create special InfoPackages in which you determine how the requests are handled by the daemon (for example, after which time interval a request for real-time data acquisition should be closed and a new one opened). For real-time data acquisition with Web services (push), you also create special InfoPackages to set certain parameters for real-time data acquisition such as sizes and time limits for requests.
PSA :

The persistent staging area (PSA), the entry layer for data in BI, has been changed in SAP NetWeaver 2004s. Previously, the PSA table was part of the transfer structure. You managed the PSA table in the Administrator Workbench in its own object tree. Now you manage the PSA table for the entry layer from the DataSource. The PSA table for the entry layer is generated when you activate the DataSource. In an object tree in the Data Warehousing Workbench, you choose the context menu option Manage to display a DataSource in PSA table management. You can display or delete data here. Alternatively, you can access PSA maintenance from the load process monitor. Therefore, the PSA tree is obsolete.

Real-Time Data Acquisition :

Real-time data acquisition supports tactical decision making. You use real-time data acquisition if you want to transfer data to BI at frequent intervals (every hour or minute) and access this data in reporting frequently or regularly (several times a day, at least). In terms of data acquisition, it supports operational reporting by allowing you to send data to the delta queue or PSA table in real time. You use a daemon to transfer DataStore objects that have been released for reporting to the ODS layer at frequent regular intervals. The data is stored persistently in BI.

You can use real-time data acquisition for DataSources in SAP source systems that have been released for real time, and for data that is transferred into BI using the Web service (push). A daemon controls the transfer of data into the PSA table and its further posting into the DataStore object. In BI, InfoPackages are created for real-time data acquisition. These are scheduled using an assigned daemon and are executed at regular intervals. With certain data transfer processes for real-time data acquisition, the daemon takes on the further posting of data to DataStore objects from the PSA. As soon as data is successfully posted to the DataStore object, it is available for reporting. Refresh the query display in order to display the up-to-date data. In the query, a time stamp shows the age of the data.

The monitor for real-time data acquisition displays the available daemons and their status. Under the relevant DataSource, the system displays the InfoPackages and data transfer processes with requests that are assigned to each daemon. You can use the monitor to execute various functions for the daemon, DataSource, InfoPackage, data transfer process, and requests.

Archiving Request Administration Data :

You can now archive log and administration data requests. This allows you to improve the performance of the load monitor and the monitor for load processes. It also allows you to free up tablespace on the database. The archiving concept for request administration data is based on the SAP NetWeaver data archiving concept. The archiving object BWREQARCH contains information about which database tables are used for archiving, and which programs you can run (write program, delete program, reload program). You execute these programs in transaction SARA (archive administration for an archiving object). In addition, in the Administration functional area of the Data Warehousing Workbench, in the archive management for requests, you can manage archive runs for requests. You can execute various functions for the archive runs here.

After an upgrade, use BI background management or transaction SE38 to execute report RSSTATMAN_CHECK_CONVERT_DTA and report RSSTATMAN_CHECK_CONVERT_PSA for all objects (InfoProviders and PSA tables). Execute these reports at least once so that the available request information for the existing objects is written to the new table for quick access, and is prepared for archiving. Check that the reports have successfully converted your BI objects. Only perform archiving runs for request administration data after you have executed the reports.
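
If you prefer to script this step, a small wrapper report can submit both conversion reports in one go (the report name is my own; in a production system you would typically run them as background jobs):

REPORT zbw_convert_request_info.

* Convert request information for all InfoProviders ...
SUBMIT rsstatman_check_convert_dta AND RETURN.

* ... and for all PSA tables.
SUBMIT rsstatman_check_convert_psa AND RETURN.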

Flexible process path based on multi-value decisions :

The workflow and decision process types support the event Process ends with complex status. When you use this process type, you can control the process chain process on the basis of multi-value decisions. The process does not have to end simply successfully or with errors; for example, the week day can be used to decide that the process was successful and determine how the process chain is processed further. With the workflow option, the user can make this decision. With the decision process type, the final status of the process, and therefore the decision, is determined on the basis of conditions. These conditions are stored as formulas.

Evaluating the output of system commands :

You use this function to decide whether the system command process is successful or has errors. You can do this if the output of the command includes a character string that you defined. This allows you to check, for example, whether a particular file exists in a directory before you load data to it. If the file is not in the directory, the load process can be repeated at pre-determined intervals.

Repairing and repeating process chains:

You use this function to repair a terminated process (execute the same instance again) or to repeat it (execute a new instance of the process), provided the process type supports this. You call the function in the log view of process chain maintenance, in the context menu of the process that has errors.

If the process cannot be repaired or repeated after termination, the corresponding entry is missing from the context menu in the log view of process chain maintenance. In this case, you can start the subsequent processes; a corresponding entry is available in their context menus.

Executing process chains synchronously:

You use this function to schedule and execute the process in the dialog, instead of in the background. The processes in the chain are processed serially using a dialog process. With synchronous execution, you can debug process chains or simulate a process chain run.
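A chain can also be started programmatically for such tests. Here is a minimal sketch using the process chain API function module RSPC_API_CHAIN_START; note that the I_SYNCHRONOUS parameter is an assumption that depends on the release and support package level, and the chain ID is hypothetical:

REPORT zstart_chain_sync.

DATA lv_logid TYPE rspc_logid.

* Start a process chain serially in dialog so that breakpoints in
* the individual processes are reached.
CALL FUNCTION 'RSPC_API_CHAIN_START'
  EXPORTING
    i_chain       = 'ZMY_CHAIN'     " hypothetical chain ID
    i_synchronous = 'X'             " assumption: release-dependent
  IMPORTING
    e_logid       = lv_logid.

WRITE: / 'Log ID of this run:', lv_logid.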

Error handling in process chains:

You use this function in the attribute maintenance of a process chain to classify processes that end with errors as successful with regard to the overall status of the run, provided a successor process is scheduled Upon Errors or Always. This is relevant when you use metachains: it allows a metachain to continue despite errors in a subchain whose successor is scheduled Upon Success.

Determining the user that executes the process chain:

You use this function in the attribute maintenance of a process chain to determine which user executes the process chain. In the default setting, this is the BI background user.

Display mode in process chain maintenance:

When you access process chain maintenance, the process chain first appears in display mode: the chain is not locked, and the transport connection is not called. In display mode, you can schedule the process chain without locking it.

Checking the number of background processes available for a process chain:

During the check, the system calculates the number of parallel processes according to the structure of the tree. It compares the result with the number of background processes on the selected server (or the total number of all available servers if no server is specified in the attributes of the process chain). If the number of parallel processes is greater than the number of available background processes, the system highlights every level of the process chain where the number of processes is too high, and produces a warning.
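To see what the check compares against, the number of background work processes on a server can be read with the standard function module TH_WPINFO (a minimal sketch; the field name WP_TYP is an assumption to verify in structure WPINFO):

REPORT zcount_btc_processes.

DATA: lt_wp  TYPE STANDARD TABLE OF wpinfo,
      lv_btc TYPE i.

FIELD-SYMBOLS <ls_wp> TYPE wpinfo.

* Read the work process list of the current server (as in SM50).
CALL FUNCTION 'TH_WPINFO'
  TABLES
    wplist = lt_wp
  EXCEPTIONS
    OTHERS = 1.

IF sy-subrc = 0.
  " Assumption: WPINFO-WP_TYP holds the type as text, e.g. 'BTC'.
  LOOP AT lt_wp ASSIGNING <ls_wp> WHERE wp_typ = 'BTC'.
    lv_btc = lv_btc + 1.
  ENDLOOP.
  WRITE: / 'Background work processes on this server:', lv_btc.
ENDIF.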

Open Hub / Data Transfer Process Integration:

As of SAP NetWeaver 2004s SPS 6, the open hub destination has its own maintenance interface and can be connected to the data transfer process as an independent object. As a result, all data transfer process services for the open hub destination can be used. You can now select an open hub destination as a target in a data transfer process. In this way, the data is transformed as with all other BI objects. In addition to the InfoCube, InfoObject and DataStore object, you can also use the DataSource and InfoSource as a template for the field definitions of the open hub destination. The open hub destination now has its own tree in the Data Warehousing Workbench under Modeling. This tree is structured by InfoAreas.
The open hub service with the InfoSpoke that was provided until now can still be used; however, we recommend that you create new objects with the new technology.

Thanks,
Ramu

Monday, November 2, 2009

SAP BI Related Table information

Transfer Structure
RSTS Transfer Structure List
RSTSFIELD Transfer Structure fields
RSTSRULES  Transfer Structure rules
RSAROUTT Text name of Transfer Routine
DD03T Text for R/3 Transfer structure Objects

Update Rules
RSUPDROUT Update rules List
RSUPDDAT   Update rules with routines
RSUPDKEY    Update rule key fields
RSUPDINFO InfoProvider to Infosource correlation
Embedded ABAP coding for Transfer / Update Rules
RSAABAP ABAP source code per object routine
InfoPackage
RSLDPIO Links datasource to infopackages
RSLDPIOT  InfoPackage Text Description
RSLDPRULE ABAP source code for InfoPackages
RSLDPSEL   Hardcoded selections in InfoPackages
RSMONICDP Contains the request-id number by data target
RSPAKPOS List of InfoPackage Groups / InfoPackages
ProcessChain
RSEVENTCHAIN   Event Chain Processing Event Table
RSEVENTHEAD    Header for the event chain
RSEVENTHEADT    Header for the event chain
RSPCCHAIN Process chain details
RSPCCHAINATTR  Attributes for a Process Chain
RSPCCHAINEVENTS  Multiple Events with Process Chains
RSPCCHAINT  Texts for Chain
RSPCCOMMANDLOG System Command Execution Logs (Process Chains)
RSPCLOGCHAIN  Cross-Table Log ID / Chain ID
RSPCLOGS Application Logs for the Process Chains
RSPCPROCESSLOG Logs for the Chain Runs
RSPCRUNVARIABLES  Variables for Process Chains for Runtime
RSPC_MONITOR Monitor individual process chains
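
As an illustration of how these tables connect, here is a minimal sketch that reads the most recent run of a chain from RSPCLOGCHAIN; the resulting LOG_ID is the key into RSPCPROCESSLOG for the individual process logs of that run. The chain ID is hypothetical, and the DATUM/ZEIT field names are assumptions to verify in SE11:

REPORT zlatest_chain_run.

DATA ls_log TYPE rspclogchain.

* Find the most recent log ID of a process chain.
SELECT * FROM rspclogchain
  INTO ls_log
  UP TO 1 ROWS
  WHERE chain_id = 'ZMY_CHAIN'                " hypothetical chain ID
  ORDER BY datum DESCENDING zeit DESCENDING.  " assumption: date/time fields
ENDSELECT.

IF sy-subrc = 0.
  WRITE: / 'Latest log ID:', ls_log-log_id.
ENDIF.
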
Queries
RSZELTDIR Directory of the reporting component elements
RSZELTTXT Texts of reporting component elements
RSZELTXREF Directory of query element references
RSRREPDIR Directory of all reports (Query GENUNIID)
RSZCOMPDIR Directory of reporting components
RSZRANGE Selection specification for an element
RSZSELECT Selection properties of an element
RSZCOMPIC Assignment reusable component <-> InfoCube
RSZELTPRIO Priorities with element collisions
RSZELTPROP Element properties (settings)
RSZELTATTR Attribute selection per dimension element
RSZCALC Definition of a formula element
RSZCEL Query Designer: Directory of Cells
RSZGLOBV Global Variables in Reporting
Workbooks
RSRWBINDEX List of binary large objects (Excel workbooks)
RSRWBINDEXT Titles of binary objects (Excel workbooks)
RSRWBSTORE Storage for binary large objects (Excel workbooks)
RSRWBTEMPLATE Assignment of Excel workbooks as personal templates
RSRWORKBOOK Where-used list for reports in workbooks
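
RSRWORKBOOK together with RSRREPDIR (see the Queries tables above) answers the question "which workbooks embed this query?". A minimal sketch, assuming RSRREPDIR-COMPID holds the query technical name and the join runs over GENUNIID; the query name is hypothetical:

REPORT zworkbooks_for_query.

DATA: lv_genuniid TYPE rsrrepdir-genuniid,
      lt_books    TYPE STANDARD TABLE OF rsrworkbook.

* Resolve the query's internal GENUNIID, then read the where-used
* list of workbooks for it.
SELECT SINGLE genuniid FROM rsrrepdir
  INTO lv_genuniid
  WHERE compid = 'ZSALES_Q01'.     " hypothetical query name

IF sy-subrc = 0.
  SELECT * FROM rsrworkbook
    INTO TABLE lt_books
    WHERE genuniid = lv_genuniid.
  WRITE: / 'Workbooks using the query:', sy-dbcnt.
ENDIF.
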
Web templates
RSZWOBJ Storage of the Web Objects
RSZWOBJTXT Texts for Templates/Items/Views
RSZWOBJXREF Structure of the BW Objects in a Template
RSZWTEMPLATE Header Table for BW HTML Templates
InfoObject
RSDIOBJ Directory of all InfoObjects
RSDIOBJT Texts of InfoObjects
RSDATRNAV Navigation Attributes
RSDATRNAVT Texts on Navigation Attributes
RSDBCHATR Master Data Attributes
RSDCHABAS Basic Characteristics (for Characteristics, Time Characteristics, and Units)
RSDCHA  Characteristics Catalog
RSDDPA Data Package Characteristic
RSDIOBJCMP Dependencies of InfoObjects
RSDKYF Key Figures
RSDTIM Time Characteristics
RSDUNI Units
InfoCube
RSDCUBE  Directory of InfoCubes
RSDCUBET Texts on InfoCubes
RSDCUBEIOBJ Objects per InfoCube (where-used list)
RSDDIME Directory of Dimensions
RSDDIMET Texts on Dimensions
RSDDIMEIOBJ  InfoObjects for each Dimension (Where-Used List)
RSDCUBEMULTI InfoCubes involved in a MultiCube
RSDICMULTIIOBJ MultiProvider: Selection/Identification of InfoObjects
RSDICHAPRO Characteristic Properties Specific to an InfoCube
RSDIKYFPRO Key Figure Properties Specific to an InfoCube
RSDICVALIOBJ InfoObjects of the Stock Validity Table for the InfoCube
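
A simple use of the InfoCube directory: list all InfoCubes in their active version (a minimal sketch; OBJVERS = 'A' marks the active version throughout the RSD* directory tables):

REPORT zlist_active_cubes.

DATA lt_cubes TYPE STANDARD TABLE OF rsdcube.

FIELD-SYMBOLS <ls_cube> TYPE rsdcube.

* Read all InfoCubes in their active version from the directory table.
SELECT * FROM rsdcube
  INTO TABLE lt_cubes
  WHERE objvers = 'A'.

LOOP AT lt_cubes ASSIGNING <ls_cube>.
  WRITE / <ls_cube>-infocube.
ENDLOOP.
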
Aggregates
RSDDAGGRDIR Directory of Aggregates
RSDDAGGRCOMP Description of Aggregates
RSDDAGGRT Text on Aggregates
RSDDAGGLT Directory of the aggregates, texts
ODS Object
RSDODSO Directory of all ODS Objects
RSDODSOT Texts of all ODS Objects
RSDODSOIOBJ InfoObjects of ODS Objects
RSDODSOATRNAV Navigation Attributes for ODS Object
RSDODSOTABL Directory of all ODS Object Tables
PSA
RSTSODS  Directory of all PSA Tables
DataSource (= OLTP Source)
ROOSOURCE   Header Table for SAP BW DataSources (SAP Source System/BW System)
RODELTAM BW Delta Procedure (SAP Source System)
RSOLTPSOURCE Replication Table for DataSources in BW
InfoSource
RSIS  Directory of InfoSources with Flexible Update
RSIST Texts on InfoSources with Flexible Update
RSISFIELD InfoObjects of an InfoSource
Communication Structure
RSKS Communication Structure for InfoSources with Flexible Update
RSKS Communication Structure (View) for Attributes of an InfoSource with Direct Update
RSKSFIELD Texts on InfoSources with Flexible Update
RSISFIELD InfoObjects of an InfoSource with Flexible Update
Transfer Structure
RSTS Transfer Structure in SAP BW
ROOSGEN Generated Objects for a DataSource (Transfer Structure, for example in SAP Source System)
Mapping
RSISOSMAP Mapping Between InfoSources and DataSources (=OLTP Sources)
RSOSFIELDMAP Mapping Between DataSource Fields and InfoObjects
InfoSpoke
RSBSPOKESELSET InfoSpoke Directory and Selection Options
RSBSPOKEVSELSET InfoSpoke Directory with Selection Options and Versioning
RSBSPOKE List of all InfoSpokes with the attributes maintained in transaction RSBO, including the names of the source and target structures
RSBSPOKET List of all InfoSpokes with the short and long descriptions (only one of these can be maintained)
RSBSTEPIDMESS Contains all messages recorded during the execution of an InfoSpoke; entries can be added using the ABAP method i_r_log->add_sy_message
SAP BW Statistics
RSDDSTAT Basic Table for InfoCubes/Queries
RSDDSTATAGGR Detail Table for Aggregate Setup
RSDDSTATAGGRDEF Detail Table of Navigation for each InfoCube/Query
RSDDSTATCOND InfoCube Compression
RSDDSTATDELE InfoCube Deletions
RSDDSTATWHM Warehouse Management
Misc
RSFEC BW Frontend Check. Useful for checking the installed SAP GUI versions on user machines.
RSSELDONE InfoPackage selections and job program; field UPDMODE holds the update mode (INIT/DELTA/FULL)
RSPSADEL PSA Table deletion
TBTCP Job Schedule Definition
TBTCO Job Schedule Result
RSMONMESS Monitor Messages
RSERRORLOG Check loading errors in table
V_RSZGLOBV Report Variables view table
DEVACCESS Developer Keys table
TSTC All Transactions in the system
ROIDOCPRMS Control parameters for data transfer from the source system
SMEN_BUFFC Objects in User's Favorites
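
For example, RSSELDONE can be read to find the update mode of a load request (a minimal sketch; the field name RNR for the request number is an assumption to verify in SE11, and the request ID is a placeholder):

REPORT zrequest_update_mode.

DATA ls_seldone TYPE rsseldone.

* Read the update mode (INIT/DELTA/FULL) a request was loaded with.
SELECT SINGLE * FROM rsseldone
  INTO ls_seldone
  WHERE rnr = 'REQU_EXAMPLE123'.   " placeholder request ID

IF sy-subrc = 0.
  WRITE: / 'Update mode:', ls_seldone-updmode.
ENDIF.
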
Web Item
RSZWITEM Header Table for BW Web Items
RSZWMDITEM BW Web Metadata: Template Item ( Dataprovider, Item, ... ).
RSZWITEMXREF Cross Reference of Web Items
RSZWMIMEIOBUFFER Buffer for Translation MIME Rep. to IO
Archiving
RSARCHIPRO BW Archiving: General Archiving Properties
RSARCHIPROIOBJ BW Archiving: General Archiving Properties
RSARCHIPROLOC BW Archiving: General Local Properties
RSARCHIPROLOCSEL BW Archiving: Archived Data Area
RSARCHIPROPID BW Archiving: Program References of InfoProvider
RSARCHREQ BW Archiving: Archiving Request
RSARCHREQFILES BW Archiving: Verified Archive Files
RSARCHREQSEL BW Archiving: Request-Selections
Open Hub Destination
RSBOHSERVICETP Open Hub: Service Types
RSBREQUESTDELTA Open Hub: Cross Reference Outbound/Inbound
RSBREQUESTMESS Open Hub: Log for a Request
RSBREQUID Open Hub: Requests
RSBREQUID3RD Open Hub: Status of 3rd Party Requests
RSBREQUIDRUN Open Hub: Table with Status for a Request
RSBSTRUCTURE Open Hub: Generated Structures and Tables