
Passing Filter Values Through URL In Bex WAD


This blog explains the use of BI commands to pass variable parameters to a BEx query.

 

Scenario:

 

It is a fairly common scenario that we have to open a BEx query from another Web Application Designer report or from some other portal. This is generally implemented by having a button or URL pointing to the target query.

 

At times, it is also required to pass the selections made on the landing page to the target query so that the output is filtered according to the values passed.

 

Solution:

 

In order to achieve this, we can make use of the standard BI commands. One such command is SET_VARIABLES_STATE, which sets the value of a variable used in the BEx query.


Consider the following example, where a query has a single-value variable ZVAR_DATE created on InfoObject ZDATE. When the user clicks the URL of the target query on the landing page, the date value selected on the landing page should be passed to that query.


Below is the BI command that we need to append to the actual query URL, with the required <variable_name> and <variable_value>:


<Generated_URL>&BI_COMMAND_1-BI_COMMAND_TYPE=SET_VARIABLES_STATE&BI_COMMAND_1-VARIABLE_VALUES-VARIABLE_VALUE_1-VARIABLE=<variable_name>&BI_COMMAND_1-VARIABLE_VALUES-VARIABLE_VALUE_1-VARIABLE_TYPE=VARIABLE_INPUT_STRING&BI_COMMAND_1-VARIABLE_VALUES-VARIABLE_VALUE_1-VARIABLE_TYPE-VARIABLE_INPUT_STRING=<variable_value>


In our case, we replace <variable_name> with ZVAR_DATE and <variable_value> with 01.01.2014 (or any other required date).
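With those substitutions the appended command would read (as a single line):

<Generated_URL>&BI_COMMAND_1-BI_COMMAND_TYPE=SET_VARIABLES_STATE&BI_COMMAND_1-VARIABLE_VALUES-VARIABLE_VALUE_1-VARIABLE=ZVAR_DATE&BI_COMMAND_1-VARIABLE_VALUES-VARIABLE_VALUE_1-VARIABLE_TYPE=VARIABLE_INPUT_STRING&BI_COMMAND_1-VARIABLE_VALUES-VARIABLE_VALUE_1-VARIABLE_TYPE-VARIABLE_INPUT_STRING=01.01.2014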


Similarly, it is possible to pass values for multiple variables by appending this command to the query URL N times, where N is the number of variable values that need to be passed.


Thanks for reading this blog


Appreciate any suggestions/feedback!




Run time population of fields in BEx Query


In this blog I am going to demonstrate how fields can be populated at runtime in a BEx query.

 

Scenario:

 

A firm has confidential data, such as the price of a material that is yet to be released to market, which it does not want to store in a DataStore object or an InfoCube directly. Instead, the price is stored in encrypted form in a database table. However, certain users such as product managers or board members should be able to view the decrypted prices in the final BEx query. In this case, the decrypted values have to be populated at runtime of the BEx query.

 

Solution:

 

We can make use of a classic BAdI implementation on RSR_OLAP_BADI. This BAdI is executed for the specified characteristics/key figures and populates their values at runtime based on the code written in the implementation.

 

To do this, we need virtual characteristics/key figures in the underlying InfoProvider. Say we already have the decrypted value stored in the InfoObject ZDEC_VALUE of an InfoProvider; then we need to create a dummy key figure, say ZENC_VALUE, on the same InfoProvider.

 

Once the above step is complete, create a custom implementation on RSR_OLAP_BADI, which has three pre-defined methods: INITIALIZE, DEFINE and COMPUTE. It is mandatory to define and initialize the fields in the INITIALIZE and DEFINE methods in order to use them in the COMPUTE method. In the COMPUTE method we can implement the logic to decrypt the values and assign them to the dummy virtual characteristic/key figure.
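As an illustration, here is a minimal sketch of the DEFINE method, assuming the classic BAdI interface IF_EX_RSR_OLAP_BADI; the InfoProvider name ZIC_PRICE is a made-up placeholder, and the key figure names are the ones from the example above:

  METHOD if_ex_rsr_olap_badi~define.
    " Register the key figures that the COMPUTE method will read and fill.
    CASE i_s_rkb1d-infocube.
      WHEN 'ZIC_PRICE'.                     " placeholder InfoProvider name
        APPEND 'ZDEC_VALUE' TO c_t_kyfnm.   " stored key figure read in COMPUTE
        APPEND 'ZENC_VALUE' TO c_t_kyfnm.   " dummy/virtual key figure filled at runtime
    ENDCASE.
  ENDMETHOD.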

 

With this in place, every query that contains the InfoObjects specified in the BAdI runs through the business logic, and the fields are populated accordingly.

 

It is possible to create new custom methods other than the pre-defined methods.

 

Note:

 

It is also possible to restrict the BAdI so that it runs only for queries built on a particular InfoProvider, or even for one particular query. The query restriction can be handled with a TVARVC variable that holds the list of queries, which is looked up before the actual logic in the COMPUTE method is executed. The InfoProvider restriction can be added directly in the FILTERS area of the BAdI implementation.
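A rough sketch of such a TVARVC lookup is shown below; the TVARVC variable name ZOLAP_BADI_QUERIES is a placeholder, and it is assumed that the query name was stored in an attribute (e.g. p_compid, filled in the DEFINE method from I_S_RKB1D-COMPID):

  DATA: lt_tvarvc  TYPE STANDARD TABLE OF tvarvc,
        ls_tvarvc  TYPE tvarvc,
        lr_queries TYPE RANGE OF rszcompid,
        ls_query   LIKE LINE OF lr_queries.

  " Read the list of allowed queries from the TVARVC selection variable
  SELECT * FROM tvarvc INTO TABLE lt_tvarvc
    WHERE name = 'ZOLAP_BADI_QUERIES'
      AND type = 'S'.

  LOOP AT lt_tvarvc INTO ls_tvarvc.
    ls_query-sign   = ls_tvarvc-sign.
    ls_query-option = ls_tvarvc-opti.
    ls_query-low    = ls_tvarvc-low.
    ls_query-high   = ls_tvarvc-high.
    APPEND ls_query TO lr_queries.
  ENDLOOP.

  " Skip the virtual key figure logic for queries not listed in TVARVC
  IF lr_queries IS NOT INITIAL AND p_compid NOT IN lr_queries.
    RETURN.
  ENDIF.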

 

A sample piece of code for the INITIALIZE method can be copied from the example class CL_EXM_IM_RSR_OLAP_BADI and adapted to the requirement.

 

Thanks for reading this blog.

 

Appreciate any comments/feedback!

Few pitfalls to avoid during BEX workbook upgrade from 3.x to 7.x – Part 1


I am sure most of you have come across multiple issues during a BEx workbook upgrade from 3.x to 7.x. Most customers use BEx workbooks pretty heavily, and some workbooks have a huge number of tabs. I have seen workbooks with 50+ tabs. Do you believe it? I am not kidding…

 

BEx 7.x is based on the .NET framework. A process can allocate a maximum of approximately 1.2 gigabytes (GB) in the .NET framework, regardless of the memory capacity of the front-end. This corresponds to approximately 750,000 data cells.

 

Following are some of the potential solutions, if you get the error “Client out of memory” in the upgraded workbook.

 

  1. Avoid refreshing the whole workbook. Refresh only the queries that you require by enabling the following features (available only from a certain support package level):
    a. Allow refresh of single queries as a global setting in BEx
    b. Single data provider refresh functionality
  2. Split the workbook into multiple workbooks. It's time to re-engineer and remove unwanted tabs.
  3. Convert the BEx workbook to the newer tool SAP BO Analysis for Office. It's a bigger decision; if you have plans to move to this tool in the future, this is a good time to move forward. One of the main prerequisites is that you have to be on SAP BW 7.01 SP11 or higher.

 

Net-net: plan to upgrade the back-end and front-end patches to leverage the above-mentioned features so that you can avoid memory-related errors in the upgraded workbook.

 

In the next part, I am going to talk about the features missing in BEx 7.x workbooks compared to BEx 3.x workbooks and provide alternative options to overcome this.

Use the reporting name in Exit-variables for authorization - Addendum


This blog has been translated with Google Translate. The original blog can be found here: ekessler.de

 

 

In the blog Use the reporting name in Exit-variables for authorization I showed how the name of a report can be used when processing exit variables in the context of authorization. In this blog, I show which restrictions have to be taken into account.

 

 

The customer exit is traversed twice in the context of authorization. In the first pass the COMPID field is not yet available in memory, and the call

IMPORT compid = l_compid FROM MEMORY ID 'COMPID'.

returns nothing for l_compid. With the following code we ensure that all users of the report 'ZTKE_EXIT_VAR_AUTH' only obtain the information on the country Germany (DE).

 

METHOD if_rsroa_variables_exit_badi~process.

  DATA: l_compid TYPE rszcompid,
        ls_range TYPE rrrangesid.

  IMPORT compid = l_compid FROM MEMORY ID 'COMPID'.

  CASE l_compid.
    WHEN 'ZTKE_EXIT_VAR_AUTH'.
      ls_range-sign = 'I'.
      ls_range-opt  = 'EQ'.
      ls_range-low  = 'DE'.
      APPEND ls_range TO c_t_range.
    ...
  ENDCASE.

ENDMETHOD.

 

In the first call, in I_STEP = 0, the COMPID is not yet determined, so no value comes back for the exit variable processed here as part of the authorization. In the second call, the COMPID can be determined and the authorization is restricted to DE.

 

The first call has the consequence that when executing the report we get the warning

You do not have analysis authorization for any char. values of char. 0COUNTRY

see Figure 2.1.

 

By default, the variable values determined in the context of the authorization check are buffered. To ensure that the values can be evaluated in the second pass, the buffering can be switched off in RSECADMIN (see Figure 2.1). This is only possible system-wide!

 

Figure_2_1.jpg

Figure 2.1: Deactivating variable buffering

 

After the buffer has been switched off, we no longer get the warning about the missing analysis authorization.

Unfortunately, there is no alternative to switching off the buffer. The analysis authorization concept clearly states that analysis authorizations should be granted on the data and not on the reporting object.

Everything you always wanted to know about the processing of customer exit variables, but ...


The original blog can be found here: ekessler.de

 

In this blog I would like to try to bring some clarity into the processing of exit variables (EXIT_SAPLRRS0_001). The BAdI RSROA_VARIABLES_EXIT_BADI, which appeared with BW 7.30, has not exactly simplified dealing with exit variables. In addition, the BAdI is unfortunately not documented in the SAP Help. Furthermore, with 7.4 the domain RSCHAVL has changed from CHAR 60 to SSTRING.


All exit variables described here are used to limit the value range of a report or to extend the scope of authorizations. In addition, the properties of exit variables described here also apply to exit variables that are used as part of staging in DTPs or InfoPackages.


First, however, let us deal with the different types of exit variables and their processing order.


1.1 Variable types


When I talk about exit variables, I distinguish the following types of use:

  • Ready for input
  • Not ready for input
  • Use for authorization or staging


Ready-for-input variables are used when the user should be given the opportunity to influence the report result individually. The basic concept of an input-ready variable is that the user determines the value for the variable and that this value cannot be changed by internal processing (customer exit). In section 1.5, "Overriding input values", I describe how this concept can be bypassed and how the user-entered value of an input-ready variable can be overwritten in the customer exit.


Ready-for-input variables are processed in I_STEP = 1 and I_STEP = 3, see section 1.2, "Processing steps (I_STEP)".


Not-ready-for-input variables are used when the value is to be determined by rules. Here, rules are often defined (implemented) in which the variable values for not-input-ready variables are determined depending on input-ready variables.

 

Not-ready-for-input variables are processed in I_STEP = 1 and I_STEP = 2, see section 1.2, "Processing steps (I_STEP)".


Exit variables can also be used as part of authorization or staging. For exit variables used here, it must be considered that there is usually no interaction with a user. This means the processing order is different.


Therefore, it must be ensured that combinations such as ready for input, mandatory and no default value, which would require a variable dialog, are avoided. Staging processes (DTP, InfoPackage) are usually scheduled processes that are executed by background users.


Variables used for authorization and staging are processed only in I_STEP = 0, see section 1.2, "Processing steps (I_STEP)".


1.2 Processing steps (I_STEP)


Exit variables are processed in one or more steps, the I_STEPs, depending on their usage and purpose. In the section "Dependencies for Variables of Type Customer Exit" of the SAP Help, the I_STEPs are briefly explained. Unfortunately, the description in the online help is incomplete and omits examples entirely. Therefore, I will briefly explain each step again using examples.


1.2.1 Authorization and Staging (I_STEP = 0)


In I_STEP = 0, exit variables used in authorization and in staging are processed. Figure 1.1 shows the use of an exit variable within the authorization. For the processing of exit variables within the authorization, only I_STEP = 0 is traversed.

 

Figure_1_1.jpg

Figure 1.1: Exit variables within the authorization


Figure 1.2 shows the use of an exit variable in staging, using the example of a selection within an InfoPackage.

Figure_1_2.jpg

Figure 1.2: Exit variables within the staging

 

1.2.2 Initialization (I_STEP = 1)


I_STEP = 1 is used for the initialization of exit variables and is run separately for each exit variable. First the input-ready variables and then the not-input-ready variables are processed, see Figure 1.7. (The order can differ depending on the release; here BW 7.31 SP06!)


Figure 1.3 shows a typical example of initializing an input-ready variable. The variable is initialized with the current month of last year.

 

Figure_1_3.jpg

Figure 1.3: Initialization
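A minimal sketch of such an initialization in the customer-exit style (ZXRSRU01); the variable name ZTKE_MONTH is taken from this blog, while the YYYYMM (0CALMONTH-style) value format is an assumption:

  DATA: ls_range TYPE rrrangesid,
        l_date   TYPE sy-datum.

  CASE i_vnam.
    WHEN 'ZTKE_MONTH'.
      IF i_step = 1.                " initialization before the variable dialog
        l_date    = sy-datum.
        l_date(4) = l_date(4) - 1.  " same month of the previous year
        ls_range-sign = 'I'.
        ls_range-opt  = 'EQ'.
        ls_range-low  = l_date(6).  " YYYYMM
        APPEND ls_range TO e_t_range.
      ENDIF.
  ENDCASE.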

 

1.2.3 Derivation of variable values (I_STEP = 2)

 

I_STEP = 2 is used to derive the values of the not-input-ready exit variables. Again, the variables are processed separately, analogous to I_STEP = 1. To derive the values of not-input-ready exit variables, all previously determined variable values are available in the parameter I_T_VAR_RANGE. In section 1.5, "Overriding input values", I describe how input-ready variables can also be processed here.

 

Figure 1.4 shows how the value of the variable currently being processed (the check of the variable name is not shown here) is derived from the value of the variable ZTKE_MONTH.

 

Figure_1_4.jpg

Figure 1.4: Derivation of variables
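A minimal sketch of such a derivation in I_STEP = 2 (customer-exit style); here the derived variable simply reuses the month of ZTKE_MONTH, which is of course only a placeholder for the real derivation rule:

  DATA: ls_var   TYPE rrrangeexit,
        ls_range TYPE rrrangesid.

  IF i_step = 2.
    " All values determined so far are available in I_T_VAR_RANGE
    READ TABLE i_t_var_range INTO ls_var WITH KEY vnam = 'ZTKE_MONTH'.
    IF sy-subrc = 0.
      ls_range-sign = 'I'.
      ls_range-opt  = 'EQ'.
      ls_range-low  = ls_var-low.
      APPEND ls_range TO e_t_range.
    ENDIF.
  ENDIF.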

 

1.2.4 Validation (I_STEP = 3)

 

I_STEP = 3 is used to validate all recorded variables. In I_STEP = 3, all previously recorded values are available in the parameter I_T_VAR_RANGE for testing and validation.

 

The I_T_VAR_RANGE parameter contains only the variables that contain a value, i.e. only variables for which:

  • the value was set by a default value, or
  • the value was set by an implementation (I_STEP = 1 or I_STEP = 2), or
  • the user entered a value in the variable dialog

 

In I_STEP = 3, the values of the individual variables cannot be changed. It is possible to generate messages which are displayed with the report result or the variable dialog. If the validation of the variables shows that it makes no sense to run the report, throwing an exception (RAISE EXCEPTION) prevents the report from being run. The exception means that the user has to re-enter the values in the variable dialog.

 

Figure 1.5 shows how the values of the two variables ZYEARFROM and ZYEARTO are determined and then compared in I_STEP = 3. If the FROM value is greater than the TO value, a message is issued and the RAISE statement wrong_value prevents the report from being run. The user then has the opportunity to correct the values in the variable dialog.

 

Figure_1_5.jpg

Figure 1.5: Validation - Customer Exit
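A minimal sketch of such a validation in the customer exit; the message class and number passed to RRMS_MESSAGE_HANDLING are placeholders:

  DATA: ls_from TYPE rrrangeexit,
        ls_to   TYPE rrrangeexit.

  IF i_step = 3.
    READ TABLE i_t_var_range INTO ls_from WITH KEY vnam = 'ZYEARFROM'.
    READ TABLE i_t_var_range INTO ls_to   WITH KEY vnam = 'ZYEARTO'.
    IF ls_from-low > ls_to-low.
      CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
        EXPORTING
          i_class  = 'RSBBS'      " placeholder message class
          i_type   = 'E'
          i_number = '000'        " placeholder message number
          i_msgv1  = 'From-year is greater than to-year'.
      RAISE wrong_value.          " forces the variable dialog to reappear
    ENDIF.
  ENDIF.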

 

Figure 1.6 shows, similar to the example in Figure 1.5, how running the report can be prevented in the object-oriented context. Here, in the object-oriented context, an exception must be thrown.

 

Figure_1_6.jpg

Figure 1.6: Validation - BAdI

 

1.3 Execution Order of I_STEP

 

Figure 1.7 shows the sequence in which the individual I_STEPs are run through for a BEx report. I distinguish two phases:

  • Präparation (preparation phase) and
  • Validierung (validation phase)

 

The I_STEPs of the preparation phase are run before the variable dialog; the I_STEPs of the validation phase are only run through if the values of the input-ready variables are changed in the variable dialog.

 

Figure_1_7.jpg

Figure 1.7: Processing of exit variables (I_STEPs)

 

That is, the calling SAP standard processing procedure initially assumes that the user accepts the default values of the variable dialog without changing them. In this case, the validation phase is not run again!

 

The processing steps of the validation phase are only run through if the values in the variable dialog were changed by the user.

 

1.4 Processing of exit variables

 

With BW 7.3, the BAdI RSROA_VARIABLES_EXIT_BADI was introduced alongside the customer exit EXIT_SAPLRRS0_001. The blog Coexistence of BAdI RSROA_VARIABLES_EXIT_BADI and Customer-Exit EXIT_SAPLRRS0_001 shows how the BAdI and the customer exit behave in a BW 7.3 system.

 

Figure 1.8 shows the individual processing blocks which are executed as part of the processing of exit variables.

 

Figure_1_8.jpg

Figure 1.8: Variable processing

 

The standard processing flow first checks whether an active BAdI implementation of type RSROA_VARIABLES_EXIT_BADI is available. The technical name of the InfoObject on which the currently processed exit variable is based is used as the filter value. From a technical perspective, this check is performed within the function module RRS_VAR_EXIT via GET BADI.

 

The blog New BAdI RSROA_VARIABLES_EXIT_BADI describes the creation of the BAdI in detail.

 

1.5 Overriding input values

 

The basic principle for input-ready variables was initially that user-entered values cannot be overwritten. By default, an input-ready variable is not processed individually again after the variable dialog.

 

In I_STEP = 3, the variable can indeed be validated but not changed. If it is determined during validation that the user-entered value is not meaningful, a message informing the user is generated in I_STEP = 3. In addition, an exception is thrown. The exception ensures that the variable dialog appears again.

 

With the introduction of the parameter E_CHECK_AGAIN (see Note 1272242 - Renewed variable verification in I_STEP = 2), this concept was abandoned. The parameter allows the developer to overwrite the user-entered value of an input-ready variable after the variable dialog in I_STEP = 2 as needed.

 

As described in section 1.2, "Processing steps (I_STEP)", input-ready variables are only processed in I_STEP = 1 and I_STEP = 3, so their value can only be changed (initialized) in I_STEP = 1. To ensure that an input-ready variable is processed again in I_STEP = 2, the export parameter E_CHECK_AGAIN (E_CHECK_AGAIN = 'X') must be set for this variable in I_STEP = 1. If the parameter E_CHECK_AGAIN is set, this input-ready variable is processed in I_STEP = 2 like a not-input-ready variable, after the variable dialog.
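A minimal sketch of this mechanism in the customer exit, again using the input-ready variable ZTKE_MONTH from this blog; the value forced in I_STEP = 2 (the current month) is only an example:

  DATA: ls_range TYPE rrrangesid.

  CASE i_vnam.
    WHEN 'ZTKE_MONTH'.
      IF i_step = 1.
        e_check_again = 'X'.          " request renewed processing in I_STEP = 2
      ELSEIF i_step = 2.
        ls_range-sign = 'I'.
        ls_range-opt  = 'EQ'.
        ls_range-low  = sy-datum(6).  " overwrite the user entry, e.g. with the current month
        APPEND ls_range TO e_t_range.
      ENDIF.
  ENDCASE.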

 

 

RSR_VARIABLE_F4_RESTRICT_BADI - With Compounding Object Restrictions


Issue:

 

I have ZGL_ACCNT with two compounding objects: (1) Chart of Accounts and (2) Logical Source System.

 

I created a variable on it and use it in a BEx report.

 

As per my requirement, I need to restrict the F4 values for the G/L Account variable to a specific chart of accounts.

 

Example (P table):

G/L Account     Chart of Accounts     Source System
123             A                     ABC
123             B                     ABC

 

 

As per my requirement, I need to show F4 values only for chart of accounts 'A'.

 

 

So, I implemented a BAdI using the RSR_VARIABLE_F4_RESTRICT_BADI enhancement and wrote a SELECT statement that pulls only the rows where chart of accounts equals 'A' and fills C_T_RANGE, for example as sketched below.
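A rough sketch of such a selection; the P-table name /BIC/PZGL_ACCNT and its /BIC/ field names follow the usual naming convention and are assumptions, not code from the original post:

  " Read only the G/L accounts of chart of accounts 'A' from the P table,
  " assuming t_zgl_accnt has the fields zgl_accnt, zchrt_acc, zsosys in this order
  SELECT /bic/zgl_accnt /bic/zchrt_acc /bic/zsosys
    FROM /bic/pzgl_accnt
    INTO TABLE t_zgl_accnt
    WHERE /bic/zchrt_acc = 'A'
      AND objvers        = 'A'.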

 

In debug, I could see only the 123 / A record in C_T_RANGE.

 

But if I execute the report and press F4, I see both 123 A and 123 B. I was wondering how 123 B could show up, since I had filled the C_T_RANGE table with 123 A only.

 

So at first I was not able to meet the customer requirement.

 

Here is the solution. You can find a number of documents on how to create (http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/4009b0a8-7adc-2e10-48b3-a111c8f407af?QuickLink=index&…) an RSR_VARIABLE_F4_RESTRICT_BADI implementation; in addition, we need to add the piece of code below so that the F4 help works as expected when the characteristic has compounding objects (returning only 123 A in the example above).



LOOP AT t_zgl_accnt INTO w_zgl_accnt.

  " Main InfoObject: G/L account
  l_s_range-iobjnm = i_iobjnm.        " ZGL_ACCNT
  l_s_range-sign   = 'I'.
  l_s_range-option = 'EQ'.
  l_s_range-low    = w_zgl_accnt-zgl_accnt.
  APPEND l_s_range TO c_t_range.

  " Compounding object 1: chart of accounts
  l_s_range-iobjnm = 'ZCHRT_ACC'.
  l_s_range-sign   = 'I'.
  l_s_range-option = 'EQ'.
  l_s_range-low    = w_zgl_accnt-zchrt_acc.
  APPEND l_s_range TO c_t_range.

  " Compounding object 2: logical source system
  l_s_range-iobjnm = 'ZSOSYS'.
  l_s_range-sign   = 'I'.
  l_s_range-option = 'EQ'.
  l_s_range-low    = w_zgl_accnt-zsosys.
  APPEND l_s_range TO c_t_range.

ENDLOOP.

 

Note: All compounding objects need to be added to C_T_RANGE (here chart of accounts and logical source system), even if they are not required by the user. Only then will the data be restricted and displayed as expected when you press F4.

 

Thank you,

Nanda

 

 

 

 




Framework for Customer Exit OLAP Variables


Customer Exit OLAP Variables require ABAP coding which often can be found in a single INCLUDE program ZXRSRU01 as part of legacy Enhancement RSR00001 (a.k.a. Customer Exit EXIT_SAPLRRS0_001). Usually the INCLUDE program becomes bigger and bigger. Moreover, various people and/or project teams are working in the same place. You can imagine that it is dangerous because one programming or transport sequence mistake can destabilize the entire system. The solution is encapsulation of the coding of individual variables and you will need a framework to realize it.

In this blog I would like to discuss the latest enhancement technology to be used. Moreover, I will present an alternative approach for implementing a framework and the advantages it can offer you.

Please refer to my document Implementing Framework for Customer Exit OLAP Variables for a detailed step-by-step instruction for implementing the framework.

New Enhancement Spot

In SAP BW release 7.3 a new Enhancement Spot RSROA_VARIABLES_EXIT for Customer Exit OLAP Variables was introduced. It contains BAdI RSROA_VARIABLES_EXIT_BADI, which can be implemented multiple times using an appropriate filter. It also contains a default implementation SMOD_EXIT_CALL to call program ZXRSRU01 as part of legacy Enhancement RSR00001.

I can recommend reading blog New BAdI RSROA_VARIABLES_EXIT_BADI (7.3) for more detailed information. Although it is a big improvement, I would like to suggest a more sophisticated approach to take the encapsulation to the next level.

Framework

My alternative approach indirectly uses standard SAP’s new BAdI RSROA_VARIABLES_EXIT_BADI. Instead of implementing this BAdI for every new variable, I introduce an intermediate layer: a new custom Enhancement Spot YBW_OLAP_VAR with 4 BAdIs (corresponding to the processing steps). The following 4 BAdI Definitions are available:

 

  • YBW_OLAP_VAR_BEFORE_POPUP - OLAP Customer Exit Variables - Before Popup (i_step = 1);
  • YBW_OLAP_VAR_AFTER_POPUP- OLAP Customer Exit Variables - After Popup (i_step = 2);
  • YBW_OLAP_VAR_VALIDATION- OLAP Customer Exit Variables - Validation (i_step = 3);
  • YBW_OLAP_VAR_AUTHORIZATION- OLAP Customer Exit Variables - Authorization (i_step = 0).

 

BAdI RSROA_VARIABLES_EXIT_BADI will be implemented only once, in a generic way. Here the respective custom BAdI implementation will be called dynamically according to the processing step and a filter on the variable name (except in processing step 3, where I propose to filter on the query name).
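To make the idea concrete, here is a rough sketch of such a generic dispatcher; the custom BAdI names come from the list above, but the filter name, method name and signature of the custom BAdIs are assumptions and would have to match your own Enhancement Spot definition:

  METHOD if_rsroa_variables_exit_badi~process.
    CASE i_step.
      WHEN 1.  " before pop-up
        DATA lo_before TYPE REF TO ybw_olap_var_before_popup.
        GET BADI lo_before
          FILTERS variable_name = i_vnam.        " assumed filter name
        CALL BADI lo_before->process             " assumed method name/signature
          EXPORTING i_vnam        = i_vnam
                    i_t_var_range = i_t_var_range
          CHANGING  c_t_range     = c_t_range.
      WHEN 2.  " after pop-up: analogous call of YBW_OLAP_VAR_AFTER_POPUP
      WHEN 3.  " validation: YBW_OLAP_VAR_VALIDATION, filtered on the query name
      WHEN 0.  " authorization: YBW_OLAP_VAR_AUTHORIZATION
    ENDCASE.
  ENDMETHOD.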

Rationale

The main reason for implementing such a framework is achieving a safer system that is much better/easier to manage and support. The main advantages are in my opinion:

 

  • Every BAdI implementation is independent of all the others; the encapsulated pieces of coding (residing in their own implementing class) can be maintained and transported independently, thereby eliminating risks which could destabilize the system;
  • The BAdI implementation is filtered on variable name (except processing step 3 where I propose to filter on query name), only one BAdI implementation will be processed every time;
  • The BAdI implementations are grouped by processing step (i.e. before pop-up, after pop-up, etc.) which increases transparency;
  • There is no need anymore to program a filter on variable name (CASE I_VNAM. etc.) and the processing step (e.g. IF I_STEP = 1.) in a nested way, the framework is taking care of that and contains appropriate error handling;
  • The Enhancement Spot technology is the latest and way forward technology for enhancing the system, it's using ABAP-OO programming technology and can optionally (not mandatory) be combined with the Switch Framework, you can find more information in the blog Simplify & structure your Enhancements by using Switchable Kernel BAdIs.

Migration

Most likely you are not working in an empty greenfield SAP BW system, and you might find a considerable implementation effort in INCLUDE program ZXRSRU01 and/or any preceding framework (e.g. using Function Modules). If you decide to switch over to a new way of working, you will not be able to migrate all existing content in one go. Usually you will migrate gradually using a "phase in, phase out" approach. In other words, the legacy implementation and the new implementation have to live together in the same system for the time being.

This is possible, but one "hidden" aspect will pop up as soon as you create your first new implementation: the standard SAP BAdI implementation SMOD_EXIT_CALL will not be called anymore. This implementation is used to call INCLUDE program ZXRSRU01 as part of Enhancement RSR00001. Since this implementation is flagged as a "default implementation", it will not be called as soon as any normal implementation is found.

The solution is creating another RSROA_VARIABLES_EXIT_BADI implementation which replaces the SMOD_EXIT_CALL implementation. It must be an exact copy of the standard SAP implementation so that it will call the legacy implementation in the same way.

You can find more information in blog Coexistence of BAdI RSROA_VARIABLES_EXIT_BADI and Customer-Exit EXIT_SAPLRRS0_001.

Conclusion

In this blog we discussed a new way of implementing Customer Exit OLAP Variables. The framework consists of an intermediate layer: a custom Enhancement Spot and 4 BAdIs which correspond to the processing steps. The respective BAdI will be called dynamically using a single generic implementation of standard SAP BAdI RSROA_VARIABLES_EXIT_BADI. After an overview of the advantages of such an approach, we discussed the migration topic, which is applicable in almost all cases.

If you are interested in knowing all about it, please refer to my document Implementing Framework for Customer Exit OLAP Variables, in which I share detailed technical instructions on how to implement such a framework.

How to Debug Hierarchy Exit ZXRSAU04


In many BI projects it is necessary to create a custom hierarchy either by combining several logically linked characteristics, or by creating a subset hierarchy (diluted hierarchy according to business rules).

 

One of the ways to do it is to use enhancement RSAP0001 with the EXIT_SAPLRSAP_004 component in the ZXRSAU04 include.

 

Here, after CASE/WHEN "8DATA_SOURCE_NAME", we can place ABAP code which builds a hierarchy according to our logic (of course, an important prerequisite is to generate the export DataSource in RSA1 for the characteristic).

 

If our coding is OK and the hierarchy was created successfully - well done.

 

But if something was wrong (dump, errors, erroneous hierarchy structure) we would want to debug the coding.

 

Setting any breakpoint in ZXRSAU04 (a hard-coded BREAK-POINT, a checkpoint, or an external breakpoint) does not bring us into a debugger session.

 

In this article I will explain one of the ways to debug ZXRSAU04 exit.

 

First, go to SE38 and display the program SAPLRSAP.

1.png

 

Then, go to include LRSAPF06, line 475.

In line 475 we can see that customer function '004' is called (that is the call to EXIT_SAPLRSAP_004), so put an external breakpoint here.

2.png

 

The line number may vary depending on your system version, so if there is no such call in line 475, just search for it nearby.

 

Then execute an InfoPackage which loads the custom hierarchy from "8DATA_SOURCE_NAME" (the debugger is still not called).

 

After the InfoPackage load has finished, go to SM37, find your job (it should have the prefix BIREQU_*), select the line, type JDBG in the command field and press Enter.

3.png

Debugger will appear.

4.png

 

Press F8 and you will get to your breakpoint.

5.png

 

Finally, pressing F5 will get you into your ABAP code.

 

Nice debugging


APD Report Output with Negative Sign


Purpose: Making data available from a BW system to another system is a common and regular activity.

 

When extracting data from a report or any other source we may get negative values like 100-, and while writing the same data to flat files we may have the requirement to convert trailing-sign values such as 200- into -200.

 

Here is the solution.

 

  1. Drag in the report
  2. Add a ROUTINE transformation
  3. Connect the report to the ROUTINE transformation
  4. Double-click the ROUTINE transformation and add the required fields from Field List to Source Flds
  5. In the TargetFlds tab add one extra column along with the normal amount field; here SIGN_AMT of type 0TXTLG

 

Untitled.png

  6. Write code like the one below in the ROUTINE tab of the ROUTINE transformation.

DATA: ls_source   TYPE y_source_fields,
      ls_target   TYPE y_target_fields,
      res_amt(60) TYPE c.

LOOP AT it_source INTO ls_source.

  " Assign the remaining source fields to the target structure here (see note below)

  IF ls_source-amount LT 0.
    " Negative value: move the sign to the front, e.g. 200- becomes -200
    CLEAR res_amt.
    res_amt = ls_source-amount * -1.
    CONCATENATE '-' res_amt INTO ls_target-sign_amt.
    CONDENSE ls_target-sign_amt NO-GAPS.
  ELSE.
    ls_target-sign_amt = ls_source-amount.
    CONDENSE ls_target-sign_amt NO-GAPS.
  ENDIF.

  APPEND ls_target TO et_target.

ENDLOOP.

 

Note:

 

  1. You have to assign the remaining source fields to the target
  2. Create a target such as a direct update DSO or a flat file and connect the ROUTINE transformation to the data target
  3. Execute the APD

 

 

 

Thank you,

Nanda

BI reports safety belt in Portal and Bex Analyzer


Hi All,

 

Here is my first blog post ..


SAP has suggested a few safety-belt SAP Notes to avoid server (Java stack) crashes and to improve server responsiveness.

 

Many end users run BEx reports without filters, or run an entire year of data. In these cases the server slows down or causes issues (slowness) for other end users when they try to execute reports in the portal. To avoid those issues, follow the steps below.

 

Set Bex report execution limitation in Portal:

By default the BEx report execution cell limit is 1,000,000. This has to be set lower, based on the client requirement, for example 650,000 cells.

Check default settings in below table.

 

SE16 >> table "RSADMIN" >> execute. (If there are no settings for BICS_DA_RESULT_SET_LIMIT_DEF or BICS_DA_RESULT_SET_LIMIT_MAX, please set them.)

 

Difference between DEF and MAX: DEF is the limit applied when the user runs the report. If the user wants to extend the cell limit, it can be increased up to the MAX limit via the SETTINGS option.

 

Set the values: SE38 >> program SAP_RSADMIN_MAINTAIN >> execute (F8), then provide the parameters

BICS_DA_RESULT_SET_LIMIT_DEF = 500000 cells

BICS_DA_RESULT_SET_LIMIT_MAX = 650000 cells, then OK. We can also change the cell limit later by clicking Modify.

 

Portal execution limit msg.jpg

 

entend limit in web.jpg

 

 

For more details see SAP Note 1127156.

 

Set BEx report download to Excel/PDF limitation in Portal:


These limitations enhance the above settings.

The memory consumption may be larger for the export to Excel or to PDF than for the execution in the Web; therefore, a lower value can be specified as the safety belt for Excel and PDF.

BICS_DA_RESULT_SET_LIMIT_XLS =300000

BICS_DA_RESULT_SET_LIMIT_PDF = 300000

The execution default limit is 500,000 and the max limit is 650,000 in the portal, but the download to Excel/PDF is limited to 300,000 cells once we set the XLS and PDF parameters in the RSADMIN table.

 

Message when the Excel limit is crossed:

 

excel.jpg

After opening Excel:

 

excel mesg.jpg

 

 

Message when the PDF limit is crossed:


pdf.jpg



pdf msg.jpg


For more details see SAP Note 1622134.

 

 

Set report execution limitation in BEx Analyzer:


We know BEx reports can be executed both on the Web and in the Analyzer.

We discussed the web safety-belt settings above; below are the Analyzer settings.

 

ANALYZER_LIMIT_DEF: The default number of maximum cells if no further setting is specified. The default number can be overwritten locally in BEx Analyzer for each data provider.

ANALYZER_LIMIT_MAX: the absolute maximum number of cells (see below). The maximum number cannot be overwritten in BEx Analyzer.


ANALYZER_LIMIT_DEF=500000

ANALYZER_LIMIT_MAX = 750000


Message when the limit is crossed:

Analyzer msg.jpg

For more details see SAP Note 1411545.


How to select Max limit in Analyzer


After executing the report in the Analyzer, click the workbook settings as shown below to increase the MAX limit. Increasing the MAX limit is report-specific.

 

Analyzer prop.jpg

Data Provider >> Properties

propertes.jpg

 


cell limit inc.jpg


Thanks

Nageswara Polaka


New tool for BW/BI integration and performance testing -> BExTest


How to perform integration tests of BEx Analyzer Workbooks and the corresponding queries?

 

We have all encountered this problem - new support packages are installed, changes are made to your BW production system and you are tasked with manually testing all your BEx workbooks to make sure that nothing is broken. This is a task which can easily take many hours or days and you can never guarantee that you did not overlook some query or query condition.

 

 

The solution

 

To help you in automating these integration testing tasks, there is a new product on the market: BExTest.

 

bextest1.png

In BExTest, you can define a set of BI queries and accompanying filter conditions.

These test sets can then be executed with two different test methods:

 

  • The queries are executed and then compared to a stored set of query result files.
  • They are executed against two different environments, e.g. your production and your testing environment to see if there are any differences between the results (that works of course only if the underlying data are the same).

 

Differences in the tests are highlighted and you get a verdict of Pass or Fail for your test run.

The queries are either executed by using the SAP webservice QUERY_VIEW_DATA or by controlling your BEx analyzer directly.

 

Because for every query the query response time is displayed, the tool is also suitable for repeated performance tests.

 

A command line client for Windows is also available to run these integration tests automatically and to integrate BExTest into an existing automated test tool.

 

And there is a free evaluation version available, no registration needed!

 

 

What is your impression? Comments, questions and feature requests are welcome!

Trick to view BEx Variable Properties without going in edit mode


Hi Folks,

 

This is my first blog post in SDN and I want to share a simple Bex trick with you. Some of you might already know it.

 

While creating a BEx query, we often want to use variables that are already available and ready for use. It is always recommended to re-use objects instead of creating new ones. These re-usable objects may or may not have been created by you or your colleagues, and you may want to confirm their behavior before you use them in your query. Sometimes it is tricky to find which variable suits your requirement best, as you wouldn't know the purpose of every variable available for a certain characteristic. Note that for all standard variables, SAP states their properties in the description itself. However, if your custom variable's description is not maintained according to such a naming convention, one needs to look at the properties of the variable to determine its behavior.

 

The general way to check the re-usability of a variable is to check its properties, and mostly this is done by getting the required characteristic into the filter area, then right-click -> Restrict -> Variables. When the list of all variables on that characteristic is displayed, we select one and click 'Edit Variable' to see its properties.

 

Bex_Variable_edit.JPG

 

 

Here, there is a chance of altering the variable values unknowingly, which would change the behavior of the variable across all queries.

 

So the simpler way to check the variable properties without going into change mode is as follows.

 

In the InfoProvider pane, expand the dimensions down to the characteristic that you need.

Expand the characteristic further and you will see the following three options, which can be expanded further:

 

1. Attributes

2. Characteristic Value variables

3. Characteristic values

 

Expand the node 'Characteristic Value Variables' and you will see the list of all variables for that characteristic.

 

Click on any variable and you will be able to see its properties in the properties pane on the right-hand side of your BEx Query Designer.

 

variable Properties.JPG

 

Here you can see all the properties of the desired variable with ease, and without being in edit mode.

 

It also makes it easier to check multiple variables one after another.

 

 

Hope this helps you.

 

 

-Swati Gawade.

Reorganizing and deleting bookmarks


Bookmarks are saved navigational states of a Web application. In the bookmark functional area of the Reporting Agent, one can manage the bookmarks created in the system for all BW system users.

Many times we no longer require the bookmarks, and they need to be reorganized or deleted.

 

Creation of Bookmarks in BI system:-

  •    in BEx Web Analyzer
  •    once you personalize a web application
  •    using Bookmark function in the context menu of a web application.

 

Reorganization of bookmarks:

We can follow the simple steps below to reorganize bookmarks:

 

1. Go to SA38

 

2. Run the report RSWR_BOOKMARK_REORG
1.png

3. You get the screen below to fill in the details.


General Criteria

 

Created by User: You can enter the user name .

Time of Attachment: Enter the timeframe.

 

Bookmark Points to Following Object

Type: Choose the basis of bookmark e.g a report, a query view etc

Technical ID: technical name of web template


1.png

 

 

4. Once you fill in the details and execute, it gives you a list of bookmarks from which you can choose what to delete:

 

 

1.png

 

We can also delete/reorganize bookmarks from the Broadcasting tab of RSA1.

 

We need to go to RSA1 --> Administration --> Broadcasting --> Reorganization --> Bookmarks

 

 

1.png

Then we choose Bookmarks:

 

1.png

Then we fill in the number of days for deletion and reorganization, and simulate using "Test Call".

 

1.png

 

It displays the simulated results of what it will do:

 

 

1.png

You can confirm and re-run it with "Test Call" unchecked to perform the deletion.

Hope this helps you in deleting/reorganizing your bookmarks.

 

cheers,

vinay singh

How to build geomaps with BW/BEx data using SAP BusinessObjects BI Location Intelligence


This post is about geovisualization and spatial analysis of BEx data with SAP BusinessObjects BI Location Intelligence.

SAP BusinessObjects BI Location Intelligence is a geomapping extension for SAP BI (See SAP website).

 

We are going to see how to get the following map:

This post is specific to SAP Webi on BW/BEx.

You can look at this post to get a similar topic specific to geomapping with Design Studio:

Interested in Location Analytics for Design Studio ? Take a look at this blog

It is assumed that 80% of enterprise data is geolocated, and this will increase in the future with mobility, the Internet of Things and smart grids, amongst other factors.


So what is geolocated data?

The most common geolocated data is a point location with Latitude/Longitude coordinates.

Note that coordinates can be defined according to different coordinate systems such as Lat/long in degrees and X/Y in meters just to give a couple examples.

Geolocated data can also be related to geographical areas such as zip code, counties, administrative boundaries or your own business location such as your points of sale, territories, technical assets or networks.

If you have the above type of data in your BW/BEx systems, you should be interested in this post.


Let's see how we can leverage this data through the power of SAP BI 4.x Web Intelligence and its Location Intelligence extension.

 

Step1- Build your Webi document using a BEx query


Here is a broad overview as the purpose of this post is not to go into the complex details. Refer to the following document to get more information about this:

http://help.sap.com/businessobject/product_guides/boexir4/en/xi4sp7_bex_queries_en.pdf

 

1. Build a BEx query with location lat/long coordinates and associated business measures.

This is done in SAP BEx Query Designer.

Check that your lat/long coordinates consist of numerical data.

 

2. Define a BICS connection to your BEx query

This can be done in the SAP BusinessObjects CMC or in the SAP BusinessObjects Information Design Tool.

You can define a connection to a single BEx query or to an InfoProvider containing several BEx queries.

 

 

3. Create a Web Intelligence query on your BEx query

  • Go and connect to the SAP BusinessObjects BI Launchpad
  • Create a new Webi document using the Rich Internet Application client

 

4. Add the appropriate data in your Webi query

  • Open the query panel to access the BEx query data
  • Add the lat/long coordinates and the required business measures
  • Add prompts / filters according business needs
  • Test your query to check that you get one record per location with coordinates and measures
  • Save and close your Webi document

 

And now, let's go for geomaps!

 

Step2- Add a geomap in our Webi report

 

1. Open your Webi report with the HTML Viewer (See user preferences)

 

2. Insert a map container in your report

  • Add a blank cell in your report
  • Define its size and properties according online help

 

Your report should look like this:

 

3. Go to the Cartography menu and choose Map Document:

 

4. Define the different settings in the map wizard below:

  • Choose a cartographic view among the ones made available by the administrator
  • Choose the viewer to use (Flex for advanced use, HTML5 for more intuitive UX)
  • Choose your Query with BEx data in the pick list
  • Choose the fields to use for X/Y coordinates

 

5. Submit your settings and refresh your Webi report to display the map

 

6. It's time to visualize your business data on your map

  • Go to the Menu Layers
  • Open the Thematic Manager
  • Add a Layer
  • Define the settings of the map visualization (Business measure, color, size...) in the following wizard

 

  • Apply to see the map as below:

 

 

In conclusion, you can get more insight into your BW/BEx data with dynamic geovisualization by taking advantage of the power of Webi and its Location Intelligence extension.

Have a look at the new map we got in just a few seconds!

We simply refreshed the Webi query (and thus the underlying BEx data) and changed the background map and the visualization settings to a heatmap.

 

Work Around on Applying Conditions on Characteristics


     In some discussions, I realized that there is a need to define a condition based on the value of a characteristic. It is very easy to create a condition on key figure values, but when it comes to characteristics, we need a workaround. In this blog I will discuss a few alternatives and explain the procedure for one of them with a small scenario.

 

Scenario:


     Suppose we have a scenario with material type, materials and their related prices. We want to apply a condition on material type: when the material type is "A" we do not want to show materials with a price of 0, but for any other material type we need to show all materials in the BEx query result.

 

Alternative Ways:


1. Change in design of the model


We can create a new characteristic which marks the records to be shown in the report. The data type may be CHAR1, and in the transformation rule we can assign 1 to the records that should be shown and 0 to the records which we don't want in the query result. Then in the Query Designer we can simply apply a filter on this characteristic in the characteristic restrictions tab.

 

This is a solution for situations where we are free to change the model. In many cases, changing the model may not be feasible due to the huge amount of data that would have to be reloaded. There are also cases where customers simply prefer not to change the model! Then BI experts are restricted to finding a solution on the BEx side.

 

2. Replacement Path Variables in BEx Query Design

 

We can use this approach only if the characteristic's data type is numeric. With this approach, we create a formula variable with replacement path that returns the key value of the characteristic. With this variable we can then write an if-else statement in a formula.

 

This solution is restricted by the data type of the characteristic; if the characteristic is defined as CHAR, this solution is also useless.

 

3. Creating calculated key figures and using conditions in BEx Query Design

 

With this approach we define calculated key figures (CKF) to count the records we want to show in the query result. Based on the value of these CKFs, we define a condition to show the results.

 

When the other approaches mentioned above are not applicable, this solution is what we are left with. Now I will go into the details of this approach.

 

Suppose we have data:

 

Material   Material Type   Unit   Price
ABC129     A               USD      912
ABC128     A               USD      178
ABC127     A               USD        0
ABC126     A               USD      167
ABC125     A               USD      154
ABC124     A               USD        0
ABC123     A               USD        0
ABC135     B               USD        0
ABC134     B               USD       25
ABC133     B               USD        0
ABC132     B               USD      266
ABC131     B               USD      187
ABC130     B               USD      644

 

          We want to show all records of material type B, but we also want to hide the 0 prices for material type A. The final report should look like this:

 

Material   Material Type   Unit   Price
ABC129     A               USD      912
ABC128     A               USD      178
ABC126     A               USD      167
ABC125     A               USD      154
ABC135     B               USD        0
ABC134     B               USD       25
ABC133     B               USD        0
ABC132     B               USD      266
ABC131     B               USD      187
ABC130     B               USD      644

 

          We create a query on the InfoProvider, adding material to the rows and price to the columns.

 

1.JPG

     Then we create two calculated key figures to count the number of materials we want to include for each material type. We want to show the materials where the price is greater than 0 for material type A, and all materials of type B. The first CKF counts the number of materials where the price has a value.

 

2.JPG

 

     In the aggregation tab we select counter for all detailed values that are not zero, null or error:

 

3.JPG

     We create another CKF to count all materials whatever the price is. In the same way, we add price in the general tab and select counter for all detailed values in the aggregation tab:



4.JPG


5.JPG


     In the next step we use these CKFs. We can either create a selection or a restricted key figure (RKF) for this purpose. We add the CKF where all materials are counted and restrict the material type to type "A":


6.JPG


     In the same way, we create another selection (or RKF):


7.JPG


     Now when we run this query, we see that in each of the rows we want to show, either the "Type A > 0" column is 1 or the "Type B all prices" column is 1:


8.JPG


     At this point we have successfully eliminated the 0 values for type A. The remaining procedure includes adding a formula to sum up the two columns "Type A > 0" and "Type B all prices". Then we can add a condition on this new formula:


9.JPG


     Now we have the report shown as:


10.JPG

     When we create the condition:


11.JPG


     And we make sure that we don’t suppress zeros in the query properties:


12.JPG


     We hide the formula and selections:


13.JPG


And when we run the report we get:

14.JPG


We can use this procedure in all cases where we can count the number of detailed records.

 

            Hope it helps

 

Regards

 

Yasemin Uluturk





Very Important BICS Notes for BW 7.40


Hello BW People!

 

Recently, here at Product Support, we have been informed by our development team that they had detected some BICS issues mainly in BW 7.40 SP 8 and SP 9.

 

They affect all the reporting tools which make use of BICS, such as WebI, Design Studio, Analysis for Office, BEx Web Java Runtime, etc.

 

Those issues are related to filters, hierarchy usage and variables. So, in case you have BW 7.40 and see any issues with filters not being applied, hierarchies not correctly displayed/validated in prompts/variable screen, variables not being filled or executed in case of customer exits, please take a look at the notes below.

 

 

Note                                                                              Applicable for BW 7.40 SP
2101236 - Missing data in query with artificial characteristic Currency/Unit     7, 8, 9
2050425 - BICS: Dynamic filter missing and inconsistent state (N_SX_META_DATA)   5, 6, 7, 8
2101188 - BICS: Filter on structure element is reset to initial selection        7, 8, 9
2092810 - Filter variable on structure becomes ignored                           8, 9
2064630 - BICS: Compoundment on Hierarchy Node Variable does not work            5, 6, 7, 8
2096560 - BEx Web 7.40: Termination when using exceptions or conditions          6, 7, 8, 9, 10
2068075 - BICS: Values of exit variables remain empty after refresh              6, 7, 8, 9
2052141 - Variable values not correctly resolved                                 5, 6, 7, 8
2109550 - Bookmark deletes filter selection                                      8, 9, 10
2107060 - RSBOLAP: Correction for SAP Note 2065089 (reader reset)                7, 8, 9, 10
2094538 - BICS: struc. member change after submit incomplete                     8, 9

 

As clearly stated in SAP Note 2000326, please do apply these notes in any case, as they solve known issues and have no reported side effects.

 

With this you will avoid many known BICS issues that are often not easy to detect, trace and resolve quickly, even with the help of SAP Support.

 

Kind Regards,
Marcio

Decimal Places Setting in Bex Query Designer


I have seen many questions in SDN Business explorer space regarding the display settings of Decimal points. This blog will explain these settings in detail.

 

Background:

In the BEx Query Designer, we can set the number of decimal places with which a key figure is displayed in the output. This setting can be made for each key figure individually by selecting the key figure and going to the 'Display' tab in the 'Properties' section of the Query Designer.

 

Pic1.jpg

 

If the 'Use Default Settings' checkbox is ticked, the number of decimal places displayed is determined from the key figure InfoObject properties.

 

To view the default properties of the key figure, go to the display of the InfoObject; in the 'Additional Properties' tab you will be able to see the default properties set for that key figure.

 

pic2.jpg

Problem Description:

As per the business requirement, the number of decimal places displayed should be 2, whereas the query output displays 3 decimal places.

 

Solution:

Step 1: Go to the decimal settings in the Query designer as explained in background.

 

Step 2: Check if the ‘Number of Decimal Places’ dropdown has been set to 0.00 (as the requirement is set to display 2 decimal places)

 

Step 3: If not, then set it to display 2 decimal places as shown below.

 

pic3.jpg

Step 4: If yes, then check whether the 'Use Default Settings' checkbox is ticked, as shown below.

 

pic4.jpg

Step 5: If yes, then uncheck the checkbox.

 

Now your BEx Query Designer settings display the correct number of decimal places as set in the 'Number of Decimal Places' dropdown. Previously it displayed numbers up to 3 decimal places as per the default InfoObject settings for this key figure, as explained in the background section.

 

To confirm, you can simply check the settings through transaction RSA1 or RSD1.

 

Note:

Sometimes after an upgrade, the decimal settings are changed. For this issue refer to SAP Note.

When your BEx report execution turns into Error- General Steps to follow


Whenever our BEx query execution turns into an error, or users report an error in query execution, one should follow some basic steps before turning into a panicked maniac.


Sometimes we have worked hard on a query development, but the query gives an error instead of the desired output. This can be very frustrating, as we have already spent a lot of time on the development and the error causes us further distress.


One can follow some basic steps below to make sure that there are no obvious mistakes.

 

1. Use the Generate Report option in transaction RSRT. This will either resolve the issue or give you a better description of the error.

 

2. Test all the InfoProvider objects in RSRV. This will let you know if there is any issue with the InfoProviders, their indices or their structures.

 

3. Run RSRV test on all the master data objects involved in the info-providers. Sometimes the issue lies in the SID values or the indexes built on the master data. Often, you can find garbage master data causing issues as well.

 

4. Check whether the report gives the problem only for your user ID or only on your system. Try checking on other systems and ask your colleagues to try with their IDs.

 

5. Check the patch level of the BEx software and the GUI. Is it old? Does it need to be upgraded?

 

6. If the error occurs only in a particular system, then try to re-transport all the objects in the query (structures of the MultiProviders and their respective cubes). If it's not a production system, you can drop the data and reload it from the underlying DSOs/data providers.

 

7. Check whether any warnings were displayed when you transported the query or any other object involved in the query. Sometimes these mild, harmless warning messages turn into a real nightmare.

 

8. Check if you have any customer exit code in your query, i.e. any code that needs to be executed before the selection screen is displayed. Check whether the CMOD code was moved to the appropriate system.


9. Check whether there are any SAP Notes that need to be applied to your system to fix the issue.

 

 

 

Hope this helps.

 

-Swati.

"Client Out of Memory" - BEx Analyzer 7x Limitations


Recently I worked on a project where we upgraded our BEx components from 3.x to 7.x and encountered numerous issues. In this technical blog I describe one interesting issue related to a BEx Analyzer limitation.


Problem Statement


As part of the BEx upgrade project, BW workbooks upgraded from BEx 3.x to 7.x started experiencing issues when they had a huge data set (1 million data cells and beyond). Business users were executing workbooks, either stored on the local machine or on the server, with 10,000+ rows and 90+ columns; upon execution they got the popup "Client Out of Memory".


This phenomenon is described in SAP Notes 1411545 and 1040454. We understood from SAP that there is a design limitation for the 7.x Analyzer (0.75 million data cells), because BEx 7.x leverages the MS .NET framework, which has this memory limit.


As per the note, the number of cells that can be displayed, as defined by SAP, is 750,000, or 1.2 GB, but our workbooks had 1+ million data cells. The memory consumption is calculated as: allocated memory (MB) = 100 + ((# Rows * # Columns) * 0.0016).


Only the result set in the workbook should be considered for the number of rows and columns in the above calculation.
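As a worked example with the figures mentioned above: a result set of 10,000 rows x 90 columns is 900,000 data cells, so the allocated memory is roughly 100 + (900,000 * 0.0016) = 1,540 MB, which is above both the 750,000-cell guideline and the ~1.2 GB .NET limit.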

 

Alternative Approach/Workarounds


1. Cut down the data set size by including additional filters, so that the number of rows and columns is reduced. SAP says: "Anything extracted beyond ~.5M cells is considered as ETL requirement. BEx is a reporting and analytical tool and this should not be used for data extraction as an ETL tool."

 

2. If your result set has a large number of columns due to drill-across fields for all key figures/measures, then change the way the reports are executed by rearranging the drill-downs/drill-acrosses so that the number of columns is reduced, as we have no control over the number of rows.


3. Every workbook has a corresponding query associated with it. If your workbooks have only one query in the back end, ask the users to access the corresponding report from either the 3.x or 7.x Analyzer instead of the workbook, as queries have a separate memory restriction which can be extended in RSADMIN.


With this workaround you can instantly run the query in Excel and you also have refresh functions as in workbooks, but the look and feel might be slightly different.

 

4. If the workbooks can be accessed in both 3.x and 7.x, then you can ask the users to open the 3.x Analyzer and access the migrated workbooks there, as 3.x workbooks don't have the .NET limitation.


5. Execute the report on the Web and download it to CSV, as the query can handle a huge data set, presuming you maintain the appropriate settings in RSADMIN.


6.  Execute the report in smaller chunks using a custom ABAP program and then merge the files together

 

All the above workarounds are limited to SAP BW/BI versions up to 7.30 and presume that you enable both the 3.x and 7.x front-end tools.


I also did some research on a couple of other BI tools (Hadoop, QlikView, Tableau, ...). Some of them don't have the restriction that the Analyzer has, but most of them are meant for small data packets and not for huge data volumes.


One potential alternative is SAP BusinessObjects Analysis for Office, a premium alternative to BEx Analyzer. When I googled, I found that the combination of a 64-bit OS + 64-bit Excel + 64-bit Analysis for Office can handle huge data sets, but there are mixed reviews about this tool.


Relevant SAP Notes

1411545 & 1040454

1973478 - BEx Analyzer: Safety Belt option for large Query Results

1411545 - BExAnalyzer: safety belt for large resultsets

1860872 - Report not executable in BEx Analyzer 7x / Client out of Memory

2061104 - BEx Analyzer: Throws out of memory exception when there are many data providers and hence the size of the XML is very large.

2041337 - BExAnalyzer: Performance in Serialization of Structure Members and Cells

1958613 - Optimizing the BEx Analyzer Performance - Known Corrections and Best Practice Guidelines

1466118 - Hardware & Software requirements for Analysis, edition for MS Office


Thanks

Abhishek Shanbhogue

OLAP Cache


Use of the OLAP Cache

•The OLAP cache improves query performance by reading data out of the cache instead of requesting it from the database.

•After a query is run for the first time, the data set transferred from the database is saved in the storage systems of the OLAP cache.

•This cache entry can then be reused the next time the query is run. As it is a 'global' cache, different work processes can use the cache entries, and the data content is stored independently of user sessions.

Local Cache vs. Global Cache

•The OLAP processor accesses two types of cache: the local cache and the global cache.

•The local cache keeps only the data of the current work process (roll memory), and only one user can access this data during that particular work process.

•Unlike the local cache, the global OLAP cache can be accessed by various query instances; its data content is stored independently of user sessions. When the global cache is switched off, the local cache is still active.

•The global cache is also called the “cross-transaction cache”.

OLAP Cache Settings

The following is a list of the various levels at which the OLAP cache can be configured to enhance query performance.

Global (Basis-only setting): This setting ensures that all queries use the OLAP cache by default, but it can be overridden at the InfoProvider and/or query level. The Global Cache Settings section below shows how to verify this. In some cases, it may be necessary to switch the OLAP cache off.

InfoProvider: This setting ensures that all queries using the InfoProvider initially default to the settings of the InfoProvider. These settings can be overridden at the query level.

Query: Any settings made at this level apply only to that specific query. One can, for example, choose not to use the cache at this level, which would impact only the given query.

Global Cache Settings

•The global activation of the cache should be performed by your Basis team and should not be changed by anyone else (recommended). Check in transaction RSRCACHE that this has been done before going ahead with the settings at query level. If the green light is not shown beside Cache, please contact your Basis team.

•A prerequisite for the OLAP cache is that the shared memory (SHM) size should be at least equal to the global cache size.

•The global cache size should normally not be smaller than 300-500 MB.

Info Provider Level Cache Settings

•OLAP-relevant settings can be changed at the InfoProvider level by users with the relevant authorizations.

•These can be displayed or changed by executing transaction code RSDIPROP directly.

•Even if the cache has been activated globally on the server, it can be deactivated at either the InfoProvider or query level by setting the cache mode to 0.


Query Level Cache Settings

•To change the query properties for all or selected queries of an InfoProvider, we can use the Query Mass Maintenance feature in the Query Monitor.

•For this, use the menu path RSRT → Environment → Query Mass Maintenance. For a single query, you can select the "Properties" of that particular query in RSRT.

•To view the cache settings for all or selected queries, the RSRREPDIR table can be checked. The steps are shown below:

  • Execute transaction SE16 and enter table RSRREPDIR
  • The CACHEMODE field shows the setting of each individual query. The numbers in this field correspond to the cache mode settings (for example, 0 = cache inactive). A small ABAP sketch below shows the same lookup.
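
If you prefer a quick list to browsing SE16, the same lookup can be done with a few lines of ABAP. This is only a sketch, assuming the standard INFOCUBE, COMPID and CACHEMODE fields of table RSRREPDIR; the InfoProvider name ZSALESCUBE is a placeholder.

REPORT z_list_query_cachemode.

* Sketch: list each query of one InfoProvider with its cache mode,
* i.e. the same RSRREPDIR/CACHEMODE information that SE16 displays.
DATA: lt_repdir TYPE TABLE OF rsrrepdir,
      ls_repdir TYPE rsrrepdir.

SELECT * FROM rsrrepdir
  INTO TABLE lt_repdir
  WHERE infocube = 'ZSALESCUBE'.

LOOP AT lt_repdir INTO ls_repdir.
  WRITE: / ls_repdir-compid, ls_repdir-cachemode.
ENDLOOP.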

Cache monitoring

The cache monitor is the central monitoring tool for the OLAP cache. It is accessed using transaction code RSRCACHE and can be used to check whether a query was answered from the OLAP cache.

Invalidation of Cache

The OLAP Cache entry is invalidated in the following cases:

•Transaction Data Load to the InfoProvider

•Deletion of Transaction Data from the InfoProvider

•Master data load, hierarchy load and subsequent change run of InfoObjects contained in the query definition

•Currency conversion rate change

Switching Off OLAP Cache

We may need to switch the OLAP cache off in some instances, including but not limited to performance testing and performance tuning, since this effectively removes the cache overhead. The following are the options for switching it off:

  • Using transaction code RSRT to switch the cache off at query level for existing queries
  • Using transaction code RSDIPROP to switch it off for all new queries to be developed on an InfoProvider
  • Using RSRCACHE to make the cache inactive globally

We would want to switch it off:

  • for very fast Queries
  • for an Info Provider that gets loaded frequently in a Day
  • for ad hoc queries that are not used frequently
  • for one time query execution (using transaction RSRT->debug)

Priming the Cache (Filling the Cache)

•The OLAP cache can be filled with the Query result set before any user executes a Query.

•This can be done with the BEx Broadcaster and ensures that the first user of a query has a greater chance of hitting the OLAP cache.

•This can be automated using Process Chains (Process type: Trigger event data change for Broadcaster).

Restrictions and Considerations

•Queries created and run by a small user group that never uses the same query variables twice do not execute faster from the cache.

•The cache is not automatically turned on or off by the system; if you set a query to use the cache, it will always use the cache.

•Since the system has no information about changes to the data store on which a remote provider is based, queries on a remote cube, on a MultiProvider with a remote PartProvider, or on an InfoSet that uses a direct-update DSO generally cannot use any data from the OLAP cache.

Recommendations

•Best practice is to identify the most frequently run queries, let them use the cache, and create them as views for Web templates to be run in batch. For less frequently run queries, criticality and business priority should also be considered; even if a query is not run frequently, it may be wise to use the OLAP cache to ensure that all users have the same end-user experience.

•Aggregates and the OLAP cache can be used together to enhance query response time. The data is cached from the aggregates, which are a smaller data set compared to the entire InfoProvider, thereby reducing the DB time.

•A regular audit of the cache should be part of the OLAP cache strategy. Program RSR_CACHE_RSRV_CHECK_ENTRIES can be scheduled to run on a regular basis to remove unused cache entries (a small scheduling sketch follows below).
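

As a rough illustration of automating that audit, the sketch below schedules the program as a background job using the standard JOB_OPEN / SUBMIT ... VIA JOB / JOB_CLOSE pattern. The job name is a placeholder, a variant may be needed if the program expects selections, and in practice a recurring job is usually just defined in SM36.

REPORT z_schedule_cache_cleanup.

* Sketch: run the cache audit program once as a background job.
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_OLAP_CACHE_AUDIT',
      lv_jobcount TYPE tbtcjob-jobcount.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

* Supply a variant here if RSR_CACHE_RSRV_CHECK_ENTRIES needs selections.
SUBMIT rsr_cache_rsrv_check_entries
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount  = lv_jobcount
    jobname   = lv_jobname
    strtimmed = 'X'.  " start immediately; recurring runs are defined in SM36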

 

 

Regards,

Vinay B
