BPC Blogs




Running BPC process chains within non-BPC process chains
Posted by Pravin Datar in pravin.datar on Jan 26, 2009 10:59:42 AM

Business Planning and Consolidation, version for NetWeaver, makes extensive use of process chains for running BPC processes. These process chains are invoked automatically by the BPC application when a BPC user executes processes from the front end. Should these process chains be executed exclusively within BPC, or should we be able to execute them outside BPC using native NetWeaver BW, perhaps from within custom process chains of our own? Is there any need to do so? And finally, is there any way to do it? Let us try to answer these questions in this blog.

Let us begin by seeing whether there is any business reason to run a BPC process chain outside the BPC application. To do that, we need to understand how the optimization process works in BPC version for NetWeaver.

Optimizing the BPC data model: A dimension in BPC is equivalent to a characteristic in NetWeaver BW, and dimension members in BPC are equivalent to characteristic values in NetWeaver BW. Taking this further, when a user creates a dimension in BPC version for NetWeaver, a NetWeaver BW characteristic is generated for it in the BPC namespace. When a user creates a dimension member for that dimension, a characteristic value is generated in the master data of the characteristic corresponding to that BPC dimension. When a user creates a BPC application by selecting a few of the BPC dimensions, an infocube (as well as a multiprovider containing that infocube) is generated in the BPC namespace that includes all the characteristics corresponding to the selected BPC dimensions. (You can read more about the BPC namespace in the blog 'A reservation of a different kind – why, what and how of BPC namespace'.)

We should distinguish the BPC dimension from the NetWeaver BW dimension.
In NetWeaver BW, the term dimension is used to group characteristics. How are the characteristics in a BPC infocube organized among the NetWeaver BW dimensions of the generated infocube? It depends upon the number of dimensions included in the BPC application. If the number of BPC dimensions in the application is 13 or fewer, then all of them are automatically modeled as line item dimensions in the BPC infocube. This is because NetWeaver BW allows up to 13 user-defined dimensions in an infocube. If the number of BPC dimensions exceeds 13, then the BPC infocube model is automatically generated for those BPC dimensions. The data model generated when the cube is created may not remain the most optimized one as the fact table of the cube begins to grow. BPC version for NetWeaver therefore gives the BPC user the option to optimize the data model from the front end. There are two options to optimize: Lite optimize and Full optimize.

The Lite optimize option does not make any changes to the data model. It just closes the open request, compresses and indexes the cube, and updates database statistics. The Full optimize option is the one that may rearrange the characteristics among the 13 user-defined NetWeaver BW dimensions. The Full optimize process checks whether the size of each dimension table is less than 20% of the fact table and creates as many line item dimensions as possible. To carry out this reconfiguration, it takes the appset offline; creates a shadow cube with the optimal data model; links the new optimal cube to the multiprovider for the application; moves data to the shadow cube; deletes the original cube; closes the open request; compresses and indexes the cube; updates database statistics; and brings the appset online again. Though this results in a new infocube, the multiprovider remains the same, and all BPC reports are built on the multiprovider, not on the underlying infocube.
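As a rough illustration (plain Python, not SAP code), the 20% check described above can be sketched as follows. The threshold comes from the text; the cube statistics are hypothetical example figures.

```python
# Sketch of the Full optimize heuristic: a dimension whose dimension table
# is smaller than 20% of the fact table is a line item dimension candidate.

def line_item_candidates(fact_rows, dim_table_rows, threshold=0.2):
    """Return dimensions whose dimension table row count is below
    `threshold` (20%) of the fact table row count."""
    return sorted(
        dim for dim, rows in dim_table_rows.items()
        if rows < threshold * fact_rows
    )

# Hypothetical cube statistics for illustration only
fact_rows = 1_000_000
dims = {"ENTITY": 50_000, "ACCOUNT": 300_000, "TIME": 400}

print(line_item_candidates(fact_rows, dims))  # ['ENTITY', 'TIME']
```

Here ACCOUNT stays a regular dimension because its table exceeds the 20% threshold, while the small ENTITY and TIME tables qualify.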
Hence this optimization does not affect the BPC reports on this data.

Using ETL for BPC infocubes: Since the data that the BPC user enters from the BPC front end is stored in the underlying real-time infocube for that application, one may ask whether it is possible to load data into that cube with the normal NetWeaver BW ETL process. The answer is 'yes' - but with a caveat. We can use NetWeaver BW ETL for BPC infocubes; for example, a DTP can load data from a flat file into a BPC infocube. Now if the BPC user chooses to do a Full optimize for this application, it may result in a new infocube with a more optimal data model. That new infocube, though it gets automatically linked to the multiprovider for the BPC application, at present does not inherit the ETL structures that were built on the original cube. So in our example, if the BPC user executes a Full optimize for the Finance application, the new optimal infocube for the Finance application may not inherit the DTP created on the original /CPMB/DZID30P infocube. The source system, data source, infosource etc. will remain, but the transformation that links them to the infocube will be deleted and has to be recreated. If this optimization happens in the production system, then the transformation may have to be recreated and transported up the landscape.

A way to obviate such a situation is to execute the process chains used by BPC to load data with native NetWeaver BW tools, outside the BPC application. In the example above, a flat file is loaded into the BPC infocube using NetWeaver BW ETL tools. However, the BPC application itself offers the front-end functionality of Data Manager to load data either from a flat file or from any other infoprovider, and Data Manager uses BPC process chains in the background to load the data.
If we can run these process chains outside BPC - from the EDW layer using native NetWeaver BW - then not only can we integrate them with custom process chains, but we also obviate the issue of ETL structures getting deleted on Full optimize. Running BPC process chains outside BPC is also important if we are using open hub and want to automate the flat file load to BPC cubes, by creating a user-defined process chain that integrates the file creation of the open hub with the loading of that file into the BPC cube. If our user-defined (custom) process chain (created in transaction 'rspc') can run the BPC process chain that loads data into the BPC cube, then we have an 'industrial strength' solution for loading data into BPC infocubes using the NetWeaver toolset. The question now becomes how to accomplish this. Let us try to understand the steps involved.

Steps in using a BPC process chain within a non-BPC process chain: The first step is to upload the flat file. If we want to use open hub, the open hub can place the file at any specified location on the BPC application server, or we can upload the flat file to the BPC file service (transaction 'ujfs').

The second step is to create a transformation file using the BPC front end. Though we want to run the BPC process chain with native NetWeaver tools, this is the only step that we have to do with the BPC front end. This is because the BPC process chain looks for the XML version of the transformation file. When we process the transformation file from the BPC front end, this XML version is automatically created and stored in the file service.

The third step is to create an answer prompt file that passes the required parameters to the BPC process chain. This file should be a tab delimited file.
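As a sketch (plain Python, not part of the original post), such a tab-delimited parameter file could be produced with a few lines of code. The parameter names match the answer prompt format described next; the file paths and flag values are hypothetical placeholders, not real file-service paths.

```python
# Sketch: write a tab-delimited answer prompt file for the BPC process chain.
# Parameter names come from the blog text; paths/values are illustrative.

params = [
    ("%FILE%", r"\EXAMPLE\DATAFILES\data.csv"),                 # hypothetical path
    ("%TRANSFORMATION%", r"\EXAMPLE\TRANSFORMATION\trans.xml"), # hypothetical path
    ("%CLEARDATA%", "1"),
    ("%RUNLOGIC%", "0"),
    ("%CHECKLCK%", "1"),
]

with open("answer_prompt.txt", "w") as f:
    for key, value in params:
        f.write(f"{key}\t{value}\n")   # one parameter per tab-delimited line
```

The resulting file has one parameter per line, with the name and its value separated by a tab, matching the format the program expects.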
The format of the answer prompt file is as follows:

%FILE% 'csv file path in file service'
%TRANSFORMATION% 'transformation file path in file service'
%CLEARDATA% 1/0
%RUNLOGIC% 1/0
%CHECKLCK% 1/0

The fourth step is to run program ujd_test_package with the right appset and application. We should use the answer prompt file created in the step above and save a variant for the program. Please note that the ujd_test_package program was originally designed to assist in debugging data manager packages. Hence it may not be a bad idea to copy this program to a user-defined program and use the user-defined program in the next step - just to be on the safer side, so that if future development changes the nature of this program, we shouldn't get unnecessary surprises!

Now, in the final step, we are ready to create our custom process chain that executes the BPC process chain. Create a user-defined process chain in transaction 'rspc' and include a process type to execute an ABAP program. Include the ujd_test_package program (or the user-defined program created from it) with the saved variant. Activate the process chain and execute it. Thus we can run the BPC process chain from within non-BPC process chains. These steps will work not only for the process chain that loads a flat file into a BPC infocube with open hub, but also for loading data from another infoprovider into a BPC infocube (using the BPC process chain that loads data from an infoprovider).

BPC Script logic for Dummies? (Part 1)
Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on May 24, 2011 10:45:56 AM

Even though I have long experience with BPC, if someone asks me to write a script logic, I might not write that code within 10 minutes. It is not a problem of knowledge: writing script logic requires understanding your financial requirements and knowing your application, such as account member IDs and the names of properties. It is like books: writing a book is not easy, but reading one is a different story. There are some documents, help files and how-to guides about script logic, but end users may feel it is not easy. I agree with that, but if you understand its structure and concepts, I can guarantee you will be able to read and understand what a script logic means and what its purpose is. On top of that, you might be able to modify or create one. Let's learn it step by step.

1. Understand the 3 logic parts.
Logic is a special script engine, and it consists of 3 parts: Scoping, Calculation (create/record) and Writing.

2. Scoping.
BPC is based on NW BI or MSAS, which holds a lot of data, so if you don't specify a scope, it will take a lot of time to read the data. Let's say you need to calculate only one account, like 'Discounted External Sales', based on External Sales, for 2011.January actual data. How can we scope this from a big database? The answer is *XDIM_MEMBERSET.

*XDIM_MEMBERSET is used to scope data by each dimension. Here is the grammar of XDIM_MEMBERSET:

*XDIM_MEMBERSET <DIMENSIONNAME> = <MEMBERNAME 1>,<MEMBERNAME 2>,...,<MEMBERNAME n>

Now, let's scope the above example. We can make our script as below:

*XDIM_MEMBERSET TIMEDIM=2011.JAN for scoping 2011.January,
*XDIM_MEMBERSET CATEGORYDIM=ACTUAL for scoping actual, and
*XDIM_MEMBERSET ACCOUNTDIM=EXTSALES for scoping external sales.

(Note: we need to scope External Sales because Discounted External Sales will be calculated based on External Sales.)

3. Calculation.
We just finished scoping, so it is time to calculate (create) data using the *REC statement. (Note: *REC means 'Record'.)
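To build intuition for what those three *XDIM_MEMBERSET lines do, here is a small simulation in plain Python (not BPC code); the records are made-up sample data.

```python
# Simulation of *XDIM_MEMBERSET scoping: keep only the records whose
# dimension values match every scoped member list.

def xdim_memberset(records, **scope):
    """Filter records: each scoped dimension must match one of its members."""
    return [
        r for r in records
        if all(r[dim] in members for dim, members in scope.items())
    ]

records = [
    {"ACCOUNT": "EXTSALES", "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "VALUE": 10000},
    {"ACCOUNT": "EXTSALES", "TIME": "2011.FEB", "CATEGORY": "ACTUAL", "VALUE": 12000},
    {"ACCOUNT": "CASH",     "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "VALUE": 9000},
    {"ACCOUNT": "EXTSALES", "TIME": "2011.JAN", "CATEGORY": "BUDGET", "VALUE": 11000},
]

scoped = xdim_memberset(
    records,
    TIME=["2011.JAN"], CATEGORY=["ACTUAL"], ACCOUNT=["EXTSALES"],
)
print(scoped)  # only the first record survives the scope
```

Only the 2011.JAN / ACTUAL / EXTSALES record remains in scope, which is exactly why scoping first keeps the logic fast: everything downstream works on that small set.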
Unlike other script engines, there is no temporary variable in the logic script engine, so *REC creates a record with the same structure as the fact table, and it replaces or changes values using the *REC command. For example:

*REC (FACTOR = 0.9, ACCOUNT="DISCOUNT_EXTSALES")

which means: multiply the current scoped record by 0.9 and replace the account member with DISCOUNT_EXTSALES. Here is an example of what happens with the above statement:

<Scoped record>
EXTSALES, 2011.JAN, ACTUAL, 10000
<Generated record>
DISCOUNT_EXTSALES, 2011.JAN, ACTUAL, 9000

Here is the grammar of the *REC statement:

*REC[([FACTOR|EXPRESSION={Expression}[,{dim1}={member},{dim2}=...])]

Using that grammar, what if you want to put the generated record into the BUDGET category? Then the statement should be:

*REC (FACTOR = 0.9, ACCOUNT="DISCOUNT_EXTSALES", CATEGORY="BUDGET")

<Scoped record>
EXTSALES, 2011.JAN, ACTUAL, 10000
<Generated record>
DISCOUNT_EXTSALES, 2011.JAN, BUDGET, 9000

Now you want to put an 80% value into FORECAST at the same time - what should we do? We can use another *REC statement at the same time:

*REC (FACTOR = 0.9, ACCOUNT="DISCOUNT_EXTSALES", CATEGORY="BUDGET")
*REC (FACTOR = 0.8, ACCOUNT="DISCOUNT_EXTSALES", CATEGORY="FORECAST")

<Scoped record>
EXTSALES, 2011.JAN, ACTUAL, 10000
<Generated records>
DISCOUNT_EXTSALES, 2011.JAN, BUDGET, 9000
DISCOUNT_EXTSALES, 2011.JAN, FORECAST, 8000

Getting easier? I hope so. Please keep in mind the rules below:
a. Each *REC instruction generates ONE new record. You scoped 1 record, but you can create multiple records using multiple *REC statements.
b. Each source record can generate as many records as desired.
c. The destination cell can be the same. A *REC statement always generates a new record, so it doesn't matter even if the destination is the same: all values will be accumulated. Currency translation is the best example, because you need multiple converted currencies from one local currency record; you can imagine there will be multiple *REC statements in your currency conversion script logic.

4. Writing: *COMMIT.
As a final step, we need to write the data into the database. Fortunately, this statement is a really simple one: unlike other commands, it doesn't have any parameters. Just use *COMMIT. When the BPC script engine executes *COMMIT, the generated records are posted to the table using the BPC sending engine - the same engine used when you submit data from the Excel workbook.

We have now reviewed the three main parts of BPC script logic as a first step. I will explain advanced scoping, recording and the commit command in the next posts.

BPC Script logic for Dummies? (Part 2)
Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Jun 20, 2011 10:19:13 AM

I explained the basic 3 parts of script logic in the last post: Scoping, Calculation and Writing. In this post we will find out more advanced features for scoping.

1. Scope using a member property.
We found out how to use *XDIM_MEMBERSET last time; it scopes based on member IDs. What if a user wants to scope members based on a specific property value? For example, a user wants to filter the Account dimension members that are assets. To achieve this, we need to use the ACCTYPE property, which holds the type of each account; AST is the value for asset accounts. (Note: the values are based on the APSHELL of BPC.)

The command is *XDIM_FILTER. The usage is:

*XDIM_FILTER <DIMENSIONNAME> = [DIMENSIONName].Properties("Property name") = "Property value"

So the above example can be written as:

*XDIM_FILTER ACCOUNT = [account].properties(ACCTYPE='AST')

Let's say the Account dimension has these members:

ID          ACCTYPE
EXTSALES    INC
CASH        AST
TAXES       EXP
NETINCOME   INC

Then *XDIM_FILTER against ACCOUNT will select the CASH member only. Let's assume you already used multiple *XDIM_MEMBERSET commands:

*XDIM_MEMBERSET TIME = 2011.JAN
*XDIM_MEMBERSET CATEGORY = BUDGET

and the data below was selected from the fact table:

EXTSALES,  2011.JAN, BUDGET, 9000
CASH,      2011.JAN, BUDGET, 3000
TAXES,     2011.JAN, BUDGET, 800
NETINCOME, 2011.JAN, BUDGET, 1500

Now if you add *XDIM_FILTER against the ACCOUNT dimension:

*XDIM_MEMBERSET TIME = 2011.JAN
*XDIM_MEMBERSET CATEGORY = BUDGET
*XDIM_FILTER ACCOUNT = [account].properties(ACCTYPE='AST')

then only one record will be selected from the above result, because CASH is the only account member that has the 'AST' value of the ACCTYPE property:

<Result>
CASH, 2011.JAN, BUDGET, 3000

Then someone might ask this question: "Can we scope based on the value? For example, can we select all data where an account value is greater than 100?"
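Before turning to that question, the property-based filtering just described can be mimicked in a few lines of plain Python (a simulation for intuition, not BPC code), using the ACCTYPE table from the example.

```python
# Simulation of *XDIM_FILTER: select dimension members by a property value.

acctype = {"EXTSALES": "INC", "CASH": "AST", "TAXES": "EXP", "NETINCOME": "INC"}

def xdim_filter(members, prop_table, value):
    """Keep the members whose property value equals `value` (e.g. ACCTYPE='AST')."""
    return [m for m in members if prop_table.get(m) == value]

print(xdim_filter(list(acctype), acctype, "AST"))  # ['CASH']
```

As in the blog example, only CASH survives, and any record-level scope is then restricted to that member.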
Of course, we can do it. The command is *XDIM_GETMEMBERSET. Unlike the other commands, it needs an *ENDXDIM command to close the definition. Here is the grammar of XDIM_GETMEMBERSET and an example:

*XDIM_GETMEMBERSET {dimension} [={member set}] [*APP={application}] //optional
[*XDIM_MEMBERSET {dimension} [={member set}] //as many of these as needed
[*QUERY_TYPE= 0 | 1 | 2] //optional
*CRITERIA {expression} //required
*ENDXDIM

*XDIM_GETMEMBERSET P_CC=[P_CC].[H1].[AAPJ].CHILDREN *APP=PLANNING
*XDIM_MEMBERSET P_DataSrc=INPUT
*CRITERIA [P_ACCT].[H1].[CE0001000]>1000
*ENDXDIM

It will get the data that satisfies all of the following:
a. children of AAPJ in the P_CC dimension, AND
b. from the PLANNING application, AND
c. the INPUT member of the P_DataSrc dimension, AND
d. the CE0001000 member's value in the P_ACCT dimension is greater than 1000.

Let's assume the fact table has the records below:

CE0001000, KOREA,  2011.JAN, ACTUAL, INPUT, 5000
CE0001000, JAPAN,  2011.JAN, ACTUAL, INPUT, 2500
CE0001000, CHINA,  2011.JAN, ACTUAL, INPUT, 1100
CE0001000, KOREA,  2011.FEB, ACTUAL, INPUT, 2222
CE0001000, JAPAN,  2011.FEB, ACTUAL, INPUT, 1050
CE0002000, KOREA,  2011.JAN, ACTUAL, INPUT, 2500
CE0002000, CHINA,  2011.FEB, ACTUAL, INPUT, 1999
CE0003000, JAPAN,  2011.JAN, ACTUAL, INPUT, 1999
CE0003000, CHINA,  2011.FEB, ACTUAL, INPUT, 345
CE0001000, TURKEY, 2011.JAN, ACTUAL, INPUT, 1999
CE0001000, KOREA,  2011.JAN, BUDGET, ADJ,   3000
CE0001000, CHINA,  2011.FEB, BUDGET, ADJ,   345
CE0001000, JAPAN,  2011.FEB, BUDGET, ADJ,   450

Which records will be selected? The answer is:

CE0001000, KOREA, 2011.JAN, ACTUAL, INPUT, 5000
CE0001000, JAPAN, 2011.JAN, ACTUAL, INPUT, 2500
CE0001000, CHINA, 2011.JAN, ACTUAL, INPUT, 1100
CE0001000, KOREA, 2011.FEB, ACTUAL, INPUT, 2222
CE0001000, JAPAN, 2011.FEB, ACTUAL, INPUT, 1050

The records below will not be selected, even though the account is CE0001000, because the value is less than 1000, or the datasrc is not INPUT, or the entity is not a child member of AAPJ (Asia Pacific):

CE0001000, TURKEY, 2011.JAN, ACTUAL, INPUT, 1999 (Turkey is not a child member of AAPJ)
CE0001000, KOREA,  2011.JAN, BUDGET, ADJ,   3000 (datasrc is not INPUT)
CE0001000, CHINA,  2011.FEB, BUDGET, ADJ,   345  (datasrc is not INPUT)
CE0001000, JAPAN,  2011.FEB, BUDGET, ADJ,   450  (value is less than 1000)

Here are some important notes for using this command:
Note 1: This command works only for BPC MS.
Note 2: If you don't specify a dimension's scope, it will be performed on the corresponding members of the pre-selected region, which is defined with *XDIM_MEMBERSET on a previous line or passed by Data Manager.
Note 3: This command generates an MDX statement, so it takes more time to execute. If your data set has only base members, you can use *XDIM_GETINPUTSET instead (please refer to the help file).

3. *XDIM_ADDMEMBERSET.
When a user wants to add more members on top of the currently scoped data, the command to use is *XDIM_ADDMEMBERSET. Let's say a user wants to add the USASales entity on top of a predefined member set. In that case the user defines:

*XDIM_ADDMEMBERSET Entity = USASales

The main reason why we need this:
a user may always want to run a specific member set whenever the logic runs, on top of the scope that was passed in. And what if your dimension members are updated frequently? As far as I know, almost every customer updates their dimensions at least once a month. If a customer changes their dimension members, what will happen in your script logic? You can use *XDIM_FILTER, but it may not work in every case. Therefore, we should use *XDIM_MEMBERSET and *XDIM_ADDMEMBERSET together.

4. Dynamic scoping and saving scope into a variable.
Sometimes *XDIM_MEMBERSET doesn't work with some specific functions like BAS(parent). (*Note: in BPC MS, BAS() will not work with *XDIM_MEMBERSET - for example, *XDIM_MEMBERSET = BAS(US),CANADA will not work - while in BPC NW this behaves differently.) Then we can use the *SELECT and *MEMBERSET commands as a dynamic scoping tool.

Sometimes we also need to save our scoped data into a script logic variable. Like other script engines, logic script supports variables to hold some data. A variable is defined using the % symbol, e.g. %MYTIME%, %CUR% etc. So how can we save some data into a variable, and when can it be used? Usually, the variable is filled using the *SELECT or *MEMBERSET command. Both of them are scoping commands, but *SELECT will be faster, because it creates an SQL statement. Here is an example of the *SELECT command:

*SELECT(%REPORTING_CURRENCIES%, "ID", "CURRENCY", "[GROUP] = 'REP'")

This command gets the member IDs (what) from the currency dimension (from) where the GROUP property has the value 'REP' (where). Let's assume the Currency dimension has the members below:

ID   GROUP
USD  REP
EUR  REP
KRW
JPY

Actually, it will create an SQL statement as below:

SELECT ID FROM mbrCurrency WHERE [GROUP] = 'REP'

After it executes the SQL command, all results are saved into the %REPORTING_CURRENCIES% variable, so the statement above is effectively converted to:

*XDIM_MEMBERSET CURRENCY = USD,EUR

The variable can be used anywhere in the logic, like in this example:

*XDIM_MEMBERSET CURRENCY = %REPORTING_CURRENCIES%

Here is an example of *MEMBERSET, which produces the same result but executes an MDX statement instead of SQL:

*MEMBERSET(%REPORTING_CURRENCIES%, Filter{[CURRENCY].members.properties(GROUP='REP')})

Note: the *MEMBERSET command is only supported by the MS version. When you define and fill a variable using *SELECT or *MEMBERSET, please remember it as a 'MEMBERSET variable'.

We reviewed the key commands of scoping today. Again, script logic consists of 3 parts: Scoping, Calculation and Writing. We will review advanced calculation commands and control commands like *IF or *FOR - *NEXT in the next posts.

BPC Script logic for Dummies? (Part 3)
Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Aug 4, 2011 7:37:49 PM

I am sorry for the late posting of this series, but I had to take my vacation and needed to get some training about HANA. Let's start to learn how to calculate and write some data using script logic.

1. Basic concept of Writing and the *REC statement.
As we saw in the first posting of this series, the *REC statement is used for writing data. You need to keep in mind that *REC creates records based on the scoped records. For example, if your scoped record is as below:

<Scoped record>
EXTSALES, 2011.JAN, ACTUAL, USA, 10000

and your *REC statement is:

*REC (FACTOR = 0.9, ACCOUNT="DISCOUNTED_EXTSALES", CATEGORY="BUDGET")

then your generated record will be:

<Generated record>
DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, USA, 9000

As you can see, we changed the account value, the category value and the signeddata value (or measure value) using the *REC statement. Any dimension that is not specified in the *REC statement stays the same as the scoped data, so 2011.JAN and each country (entity) are not changed.

What if your scope is not a single record but multiple records?

<Scoped records>
EXTSALES, 2011.JAN, ACTUAL, KOREA,  3000
EXTSALES, 2011.JAN, ACTUAL, USA,    10000
EXTSALES, 2011.JAN, ACTUAL, CANADA, 5000

Then your generated records will be:

<Generated records>
DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, KOREA,  2700
DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, USA,    9000
DISCOUNTED_EXTSALES, 2011.JAN, BUDGET, CANADA, 4500

2. Grammar of the *REC statement.
Here is the grammar of the *REC statement; specify the dimension names and the members whose values you want to change:

*REC[([FACTOR|EXPRESSION={Expression}[,{dim1}={member},{dim2}=...])]

3. What is the difference between FACTOR and EXPRESSION?
The FACTOR is a factor (multiplier) by which the retrieved amount is multiplied. You can use FACTOR or EXPRESSION for various calculations on the signeddata value (or measure value).
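The multi-record FACTOR behaviour just described can be simulated in plain Python (a sketch for intuition, not BPC code), reusing the Korea/USA/Canada example from the text.

```python
# Simulation of *REC: each scoped record yields one new record, with the
# named dimensions overridden and the signed data multiplied by FACTOR.

def rec(scoped, factor=1.0, **overrides):
    out = []
    for r in scoped:
        new = dict(r, **overrides)                    # unspecified dims stay as scoped
        new["VALUE"] = round(r["VALUE"] * factor, 2)  # FACTOR multiplies signed data
        out.append(new)
    return out

scoped = [
    {"ACCOUNT": "EXTSALES", "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "ENTITY": "KOREA",  "VALUE": 3000},
    {"ACCOUNT": "EXTSALES", "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "ENTITY": "USA",    "VALUE": 10000},
    {"ACCOUNT": "EXTSALES", "TIME": "2011.JAN", "CATEGORY": "ACTUAL", "ENTITY": "CANADA", "VALUE": 5000},
]

generated = rec(scoped, factor=0.9,
                ACCOUNT="DISCOUNTED_EXTSALES", CATEGORY="BUDGET")
for g in generated:
    print(g["ENTITY"], g["CATEGORY"], g["VALUE"])
# KOREA BUDGET 2700.0 / USA BUDGET 9000.0 / CANADA BUDGET 4500.0
```

Note how TIME and ENTITY are carried over unchanged, exactly as the blog states for dimensions not named in the *REC statement.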
The EXPRESSION is any formula that will result in the new value to post. The formula can include regular arithmetic operators, fixed values, and the script logic keyword %VALUE%, which represents the original retrieved value of the scoped record. Here is an example:

<Scoped record>
EXTSALES, 2011.JAN, ACTUAL, 10000

*REC(FACTOR=6/2)

<Generated record>
EXTSALES, 2011.JAN, ACTUAL, 30000

What if you want to add or divide? Then you should use EXPRESSION:

<Scoped record>
EXTSALES, 2011.JAN, ACTUAL, 10000

*REC(EXPRESSION=%VALUE% + 5000)

<Generated record>
EXTSALES, 2011.JAN, ACTUAL, 15000

Now we have got the basics of the *REC statement, but you may ask the questions below:

"There are some scoped data and I need to do different calculations based on each specific dimension member."
"I need to copy a value to multiple destinations!"
"How can I get the value from the other application?"
"I want to use some value from other records to calculate the result."
"Can I use a property value to calculate the result?"

Script logic can handle all of the above requirements. I will answer the first question in this post and the others in the next posts.

"There are some scoped data and I need to do some calculations based on each specific dimension member." To do this, you MUST use the *REC statement with the *WHEN ~ *IS ~ *ELSE ~ *ENDWHEN statement.

Let's assume the ENTITY dimension has country information, and you want to create forecast values of salary and put them into the FORECAST category, based on each country's actual salary values of January. We need to increase by 10% for the US, 5% for Canada and 3% for all other countries. First, you need to scope the data:

*XDIM_MEMBERSET ACCT = SALARY
*XDIM_MEMBERSET TIME = 2011.JAN
*XDIM_MEMBERSET CATEGORY = ACTUAL

Now you need to write the *REC statements.
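As a preview of the conditional logic we are about to construct step by step, here is a plain-Python simulation (not BPC code) of applying a different factor per ENTITY member; the salary amounts are hypothetical.

```python
# Preview simulation of *WHEN ENTITY / *IS / *ELSE with *REC:
# 10% for USA, 5% for CANADA, 3% for everyone else, posted into FORECAST.

FACTORS = {"USA": 1.1, "CANADA": 1.05}  # explicit *IS branches

def forecast_salaries(scoped):
    out = []
    for r in scoped:
        factor = FACTORS.get(r["ENTITY"], 1.03)  # *ELSE branch -> 3%
        out.append(dict(r, CATEGORY="FORECAST",
                        VALUE=round(r["VALUE"] * factor, 2)))
    return out

scoped = [  # hypothetical scoped SALARY/ACTUAL records
    {"ACCT": "SALARY", "ENTITY": "USA",    "CATEGORY": "ACTUAL", "VALUE": 1000},
    {"ACCT": "SALARY", "ENTITY": "CANADA", "CATEGORY": "ACTUAL", "VALUE": 1000},
    {"ACCT": "SALARY", "ENTITY": "FRANCE", "CATEGORY": "ACTUAL", "VALUE": 1000},
]

for r in forecast_salaries(scoped):
    print(r["ENTITY"], r["VALUE"])  # USA 1100.0, CANADA 1050.0, FRANCE 1030.0
```

The dictionary lookup with a default plays the role of the *IS branches plus *ELSE.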
In this example, you should specify a condition for each *REC statement, because there are three different factors:

*REC(FACTOR = 1.1,  CATEGORY="FORECAST") // 10%
*REC(FACTOR = 1.05, CATEGORY="FORECAST") // 5%
*REC(FACTOR = 1.03, CATEGORY="FORECAST") // 3%

First, write down the dimension name that you want to compare next to *WHEN; in this example, it will be the ENTITY dimension. Write *WHEN and *ENDWHEN outside of the *REC statements:

*WHEN ENTITY
    *REC(FACTOR = 1.1,  CATEGORY="FORECAST") // 10%
    *REC(FACTOR = 1.05, CATEGORY="FORECAST") // 5%
    *REC(FACTOR = 1.03, CATEGORY="FORECAST") // 3%
*ENDWHEN

Second, put an *IS statement on top of each *REC statement and an *ELSE statement on top of the last *REC statement. We need two *IS statements and one *ELSE statement, because there are two explicit conditions and all the others are handled as one condition:

*WHEN ENTITY
    *IS
    *REC(FACTOR = 1.1,  CATEGORY="FORECAST") // 10%
    *IS
    *REC(FACTOR = 1.05, CATEGORY="FORECAST") // 5%
    *ELSE
    *REC(FACTOR = 1.03, CATEGORY="FORECAST") // 3%
*ENDWHEN

Third, put each condition value next to *IS:

*WHEN ENTITY
    *IS USA
    *REC(FACTOR = 1.1,  CATEGORY="FORECAST") // 10%
    *IS CANADA
    *REC(FACTOR = 1.05, CATEGORY="FORECAST") // 5%
    *ELSE
    *REC(FACTOR = 1.03, CATEGORY="FORECAST") // 3%
*ENDWHEN

As a last step, put *COMMIT at the end of the script so that the logic engine can post the data to the database. The final version should be the same as the code below:

*XDIM_MEMBERSET ACCT = SALARY
*XDIM_MEMBERSET TIME = 2011.JAN
*XDIM_MEMBERSET CATEGORY = ACTUAL

*WHEN ENTITY
    *IS USA
    *REC(FACTOR = 1.1,  CATEGORY="FORECAST") // 10%
    *IS CANADA
    *REC(FACTOR = 1.05, CATEGORY="FORECAST") // 5%
    *ELSE
    *REC(FACTOR = 1.03, CATEGORY="FORECAST") // 3%
*ENDWHEN

*COMMIT

NOTE: You don't need to indent the code in script logic, but I would recommend it for better readability.

Note 1: You can use multiple condition values, like *IS VALUE_A, VALUE_B.
Note 2: You can use > and <= with numeric values in an *IS statement, e.g. *IS > 4. By default it is equals (=), so it is fine even though you don't specify an operator.
Note 3: You can't use AND, OR and NOT with *IS.
Note 4: " (double quotation) is not mandatory for comparing string values with *IS.
Note 5: *WHEN statements can be nested:

*WHEN xxx
    *IS "A"
    *REC(…)
    *REC(…)
    *IS "B"
    *REC(…)
    *WHEN yyy
        *IS "C","D","E"
        *REC(…)
        *ELSE
        *REC(…)
    *ENDWHEN
*ENDWHEN

Note 6: You can use a property value with an *IS statement, e.g. *IS Intco.Entity.

Now we have finished learning the 3 basic parts of script logic. I hope you feel script logic is not too complex. I will post about a couple of advanced features like LOOKUP in the next post, answering the other questions.

BPC Script logic for Dummies? (Part 4)
Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Aug 12, 2011 11:11:21 AM

OK, let's start to find out the answer to one of the questions that we had in the last post: "How can I get the value from the other application?" The simple answer is: USE *LOOKUP/*ENDLOOKUP! The simplest example is currency conversion, because you need to read rate values from the rate application to convert currency values in the finance application.
(NOTE: *LOOKUP/*ENDLOOKUP can also be used for reading values of the current application.)

Here is the syntax of *LOOKUP/*ENDLOOKUP:

*LOOKUP {Application}
*DIM [{LookupID}:] {DimName}="Value" | {CallingDimensionName}[,{Property}]
[*DIM …]
*ENDLOOKUP

{Application} is the name of the application from which you will retrieve values.
{DimensionName} is a dimension in the lookup application.
{CallingDimensionName} is a dimension in the current application.
{LookupID} is an optional variable that will hold the retrieved value so that you can use it in the script. This is only required when multiple values must be retrieved.

Here are our requirements for the currency conversion:
1. We need to get the rate values from the rate application for currency conversion (LC to USD and EUR).
2. The member ID of the RATE dimension in the rate application should be the same as the RATETYPE property of the account dimension in the finance application.
3. The member ID of the RATEENTITY dimension in the rate application should be "DEFAULT".
4. The rule of currency conversion is 'DESTINATION CURRENCY / CURRENT CURRENCY'.

Now, let's do it step by step.

First, you need to define *LOOKUP with the application name:

*LOOKUP RATE
*ENDLOOKUP

Second, specify the dimensions of the RATE application with *DIM statements. (Let's assume the rate application has the RATEENTITY, INPUTCURRENCY, RATE, CATEGORY and TIME dimensions.)

*LOOKUP RATE
*DIM RATEENTITY
*DIM INPUTCURRENCY
*DIM RATE
*DIM CATEGORY
*DIM TIME
*ENDLOOKUP

Third, assign to each dimension either a member ID value from the current application (Finance) or a fixed value. If you need to retrieve multiple values according to different member IDs of a specific dimension, make copies of that dimension and assign the different values:

*LOOKUP RATE
*DIM RATEENTITY="DEFAULT"        // Fixed value
*DIM INPUTCURRENCY="USD"         // Fixed value
*DIM INPUTCURRENCY="EUR"         // Fixed value; copy of the same dimension for another value
*DIM INPUTCURRENCY=ENTITY.CURR   // added one more for the currency conversion, as a variable value
*DIM RATE=ACCOUNT.RATETYPE       // Variable value based on the current application
*DIM CATEGORY
*DIM TIME
*ENDLOOKUP

Fourth, put variables for the multiple retrieved values in front of each duplicated dimension name:

*LOOKUP RATE
*DIM RATEENTITY="DEFAULT"
*DIM DESTCURR1:INPUTCURRENCY="USD"
*DIM DESTCURR2:INPUTCURRENCY="EUR"
*DIM SOURCECUR:INPUTCURRENCY=ENTITY.CURR
*DIM RATE=ACCOUNT.RATETYPE
*DIM CATEGORY
*DIM TIME
*ENDLOOKUP
Last, remove the dimension names (TIME and CATEGORY) that don't have any fixed value or variable value, because they will be passed through as the current values automatically:

*LOOKUP RATE
*DIM RATEENTITY="DEFAULT"
*DIM DESTCURR1:INPUTCURRENCY="USD"
*DIM DESTCURR2:INPUTCURRENCY="EUR"
*DIM SOURCECUR:INPUTCURRENCY=ENTITY.CURR
*DIM RATE=ACCOUNT.RATETYPE
*ENDLOOKUP

---------------------------------------------------------------------------
Note: If you want to get some value based on two or more dimensions, you should use the same variable name when you map the dimensions. Here is an example:

*LOOKUP OWNERSHIP
*DIM INTCO="IC_NONE"
*DIM PARENT="MYPARENT"
*DIM PCON:ACCOUNTOWN="PCON"       // PCON is used for ACCOUNTOWN
*DIM PCON:ENTITY=ENTITY           // PCON is used for ENTITY
*DIM IC_PCON:ACCOUNTOWN="PCON"    // IC_PCON is used even though it searches the same "PCON"
*DIM IC_PCON:ENTITY=INTCO.ENTITY  // IC_PCON is used for INTCO.ENTITY
*ENDLOOKUP

Even though the member ID of the ACCOUNTOWN dimension is the same ("PCON"), the variables should be defined as different variables, because the member ID of the ENTITY dimension is different in each combination. If the 'ENTITY' property of the INTCO dimension has the value I_FRANCE, the *LOOKUP above will select the two records below, and each variable will have a different value:

MYPARENT.FRANCE.PCON.IC_NONE.100   => PCON
MYPARENT.I_FRANCE.PCON.IC_NONE.80  => IC_PCON
---------------------------------------------------------------------------

Now we get the values - so how can we use them? You can use them via LOOKUP(variable) in your *REC statement, as below:

*WHEN ACCOUNT.RATETYPE
*IS "AVG","END"
    *REC(FACTOR=LOOKUP(DESTCURR1)/LOOKUP(SOURCECUR), CURRENCY="USD")
    *REC(FACTOR=LOOKUP(DESTCURR2)/LOOKUP(SOURCECUR), CURRENCY="EUR")
*ENDWHEN

NOTE: You can also use LOOKUP(variable) with the *WHEN and *IS statements, e.g.:

*WHEN LOOKUP(PCON)        //as a condition value of WHEN
*IS <= LOOKUP(IC_PCON)    //as a value of IS
    *REC(FACTOR=-1, DATASRC="ELIM")
*ENDWHEN
Let's assume the records below are in the rate application, and see what will happen during execution of the script logic.

RATEENTITY, CATEGORY, INPUTCURRENCY, RATE, TIME, SIGNEDDATA
DEFAULT, ACTUAL, CHF, AVG, 2011.JAN, 0.91
DEFAULT, ACTUAL, CHF, END, 2011.JAN, 0.93
DEFAULT, ACTUAL, EUR, AVG, 2011.JAN, 1.22
DEFAULT, ACTUAL, EUR, END, 2011.JAN, 1.24
DEFAULT, ACTUAL, USD, AVG, 2011.JAN, 1
DEFAULT, ACTUAL, USD, END, 2011.JAN, 1
RATCALC, ACTUAL, USD, AVG, 2011.JAN, 1
RATCALC, ACTUAL, USD, END, 2011.JAN, 1

Here are your current finance application records that need to be processed.

ACCOUNT, CATEGORY, CURRENCY, ENTITY, TIME, SIGNEDDATA
INVENTORY, ACTUAL, LC, SWITZERLAND, 2011.JAN, 5000
REVENUE, ACTUAL, LC, SWITZERLAND, 2011.JAN, 1000

As you can see, there is no relationship between the finance application and the rate application. We know the Swiss currency is CHF, but there is no such information in the fact table records; they only carry the LC (local currency) value. Then how can the script logic find the rate value and calculate the conversion? The key point is 'ENTITY.CURR', which we used for mapping the dimension as below.

*DIM SOURCECUR:INPUTCURRENCY=ENTITY.CURR

ENTITY.CURR means the 'CURR' property value of the ENTITY dimension. Therefore SWITZERLAND, which is one of the ENTITY dimension members, should have the value 'CHF' in its 'CURR' property. The same applies to the mapping of the RATE dimension of the rate application:

*DIM RATE=ACCOUNT.RATETYPE

So the 'RATETYPE' property of the INVENTORY and REVENUE accounts should have the value 'AVG' or 'END'.

The script engine will perform the following steps to process the first record of the fact table.

1. Select RATEENTITY = "DEFAULT"
2. Select INPUTCURRENCY = "CHF" (because the current entity member's 'CURR' property value is 'CHF') OR INPUTCURRENCY = "USD" OR INPUTCURRENCY = "EUR"
3. Select RATE = "END" (because the current account member's 'RATETYPE' property value is 'END')
4. Select CATEGORY = "ACTUAL" (there is no statement, so it is the same as the current application CATEGORY value)
5. Select TIME = "2011.JAN" (there is no statement, so it is the same as the current application TIME value)

All the selections above are combined with an 'AND' condition. So the 3 records below will be selected, and their SIGNEDDATA values will be assigned to the variables.

DEFAULT, ACTUAL, CHF, END, 2011.JAN, 0.93 => SOURCECUR will be 0.93
DEFAULT, ACTUAL, USD, END, 2011.JAN, 1 => DESTCURR1 will be 1
DEFAULT, ACTUAL, EUR, END, 2011.JAN, 1.24 => DESTCURR2 will be 1.24

After the script logic engine executes the statements below, it will generate 2 records.

*WHEN ACCOUNT.RATETYPE
*IS "AVG","END"
*REC(FACTOR=LOOKUP(DESTCURR1)/LOOKUP(SOURCECUR),CURRENCY="USD")
*REC(FACTOR=LOOKUP(DESTCURR2)/LOOKUP(SOURCECUR),CURRENCY="EUR")
*ENDWHEN

INVENTORY, ACTUAL, USD, SWITZERLAND, 2011.JAN, 5376.34 // 5000 * (1/0.93)
INVENTORY, ACTUAL, EUR, SWITZERLAND, 2011.JAN, 6666.67 // 5000 * (1.24/0.93)

For the 2nd record in the fact table, the 3 records below will be selected from the rate application fact table, because the REVENUE account has the 'AVG' RATETYPE.

DEFAULT, ACTUAL, CHF, AVG, 2011.JAN, 0.91 => SOURCECUR will be 0.91
DEFAULT, ACTUAL, USD, AVG, 2011.JAN, 1 => DESTCURR1 will be 1
DEFAULT, ACTUAL, EUR, AVG, 2011.JAN, 1.22 => DESTCURR2 will be 1.22

After it processes the 'REVENUE' record, 2 more records are generated (4 records will be generated in total), so there will be 6 records in the fact table as below.

ACCOUNT, CATEGORY, CURRENCY, ENTITY, TIME, SIGNEDDATA
INVENTORY, ACTUAL, LC, SWITZERLAND, 2011.JAN, 5000
REVENUE, ACTUAL, LC, SWITZERLAND, 2011.JAN, 1000
INVENTORY, ACTUAL, USD, SWITZERLAND, 2011.JAN, 5376.34 // 5000 * (1/0.93)
INVENTORY, ACTUAL, EUR, SWITZERLAND, 2011.JAN, 6666.67 // 5000 * (1.24/0.93)
REVENUE, ACTUAL, USD, SWITZERLAND, 2011.JAN, 1098.90 // 1000 * (1/0.91)
REVENUE, ACTUAL, EUR, SWITZERLAND, 2011.JAN, 1340.66 // 1000 * (1.22/0.91)

We finished learning how to use the *LOOKUP/*ENDLOOKUP statement.
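The factor arithmetic behind the generated records is just FACTOR = LOOKUP(DESTCURR)/LOOKUP(SOURCECUR) applied to the stored value. A few lines of Python confirm the numbers in the walkthrough (a sanity check, not BPC code):

```python
# Sanity-check the converted amounts from the walkthrough above.
# new value = old value * LOOKUP(DESTCURR) / LOOKUP(SOURCECUR)

def convert(amount, dest_rate, source_rate):
    return round(amount * dest_rate / source_rate, 2)

# INVENTORY uses END rates: SOURCECUR=0.93, USD=1, EUR=1.24
print(convert(5000, 1.0, 0.93))   # 5376.34
print(convert(5000, 1.24, 0.93))  # 6666.67

# REVENUE uses AVG rates: SOURCECUR=0.91, USD=1, EUR=1.22
print(convert(1000, 1.0, 0.91))   # 1098.9
print(convert(1000, 1.22, 0.91))  # 1340.66
```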
Here are some questions and answers that I get frequently.

Question 1: What if the rate application doesn't have the value?
Then the currency conversion will not happen.

Question 2: I don't have any records in the fact table of the current application. What will happen?
The script logic always reads records from the current application. Therefore, if there are no records in the fact table of the current application, nothing will happen.

Question 3: Can we look up a parent member value instead of a base member (leaf member)?
The MS version can do it with the *OLAPLOOKUP statement instead of *LOOKUP, but the NW version doesn't have it yet.

I will explain *FOR/*NEXT in the next post.

BPC Script logic for Dummies? (Part 5)
Posted by James Lim in SAP Planning and Consolidations, version for SAP NetWeaver on Oct 3, 2011 7:39:07 AM

Like other programming languages and scripts, the logic script also supports a loop statement, *FOR - *NEXT. Let's see how it works. Here is the syntax of the *FOR - *NEXT statement.

*FOR {variable1} = {set1} [ AND {variable2} = {set2} ]
{other statements}
*NEXT

And here is an example.

*FOR %CURR%=USD,EURO
*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=%CURR%)
*NEXT

So what is the meaning of the example above?

1. We set a variable as %CURR%.
2. The %CURR% variable will be replaced with the two values USD and EURO.
3. The *REC statement includes the %CURR% variable.
4. Therefore, it will be translated into the two statements below, because the *REC statement exists inside the *FOR - *NEXT statement.

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=USD)
*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=EURO)

Let's assume the %CURR% variable has USD, EURO, CHF and KRW; then it will be translated as below.

*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=USD)
*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=EURO)
*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=CHF)
*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=KRW)

Someone may say "we can use multiple lines of *REC statements". Of course, that is correct for simple cases, but we need the *FOR - *NEXT statement because it can be used in a nested form. For example:

*FOR %YEAR%=2003,2004,2005
*FOR %MONTH%=JAN,FEB,MAR,APR,MAY,JUN,JUL,AUG,SEP,OCT,NOV,DEC
*REC(FACTOR=GET(ACCOUNT="TOT.OVRHEAD",TIME="TOT.INP")/100,TIME="%YEAR%.%MONTH%")
*NEXT
*NEXT
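The expansion described above is purely textual: each pass through the loop substitutes one value into the template line, and nested loops multiply out. A toy Python model of that substitution (an illustration, not BPC's actual parser):

```python
# Toy expansion of *FOR/*NEXT templates (an illustration, not BPC's parser).

def expand(template, var, values):
    """Replace `var` in the template once per value, like *FOR ... *NEXT."""
    return [template.replace(var, v) for v in values]

# Single loop: *FOR %CURR%=USD,EURO around one *REC line
for line in expand("*REC(FACTOR=1.1,TIME=2011.OCT,CURRENCY=%CURR%)",
                   "%CURR%", ["USD", "EURO"]):
    print(line)

# Nested loops expand multiplicatively: 3 years x 12 months = 36 *REC lines
months = ["JAN", "FEB", "MAR", "APR", "MAY", "JUN",
          "JUL", "AUG", "SEP", "OCT", "NOV", "DEC"]
lines = []
for year_line in expand('*REC(...,TIME="%YEAR%.%MONTH%")', "%YEAR%",
                        ["2003", "2004", "2005"]):
    lines.extend(expand(year_line, "%MONTH%", months))
print(len(lines))  # 36
```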
If the user wrote plain *REC statements, the user would have to write 36 statements instead of the simple *FOR - *NEXT statement above. (NOTE: BPC NW supports nested FOR - NEXT as of version 7.5.)

In addition, the user may use two variable sets, as in the example below.

*FOR %X%=1,2,3 AND %Y%=A,B,C
*REC(FACTOR=1.5,TIME=%X%,CURRENCY=%Y%)
*NEXT

It will be replaced as below.

*REC(FACTOR=1.5,TIME=1,CURRENCY=A)
*REC(FACTOR=1.5,TIME=2,CURRENCY=B)
*REC(FACTOR=1.5,TIME=3,CURRENCY=C)

So the first variable set and the second variable set are matched 1 to 1. What if the number of values is not matched between the first and second variables? If the first variable has fewer values than the second variable, the extra values of the second variable will be ignored. If the first variable has more values than the second variable, the missing values of the second variable will be assumed to be null, so please be careful to match the number of values.

The last thing about *FOR - *NEXT is using a data set variable as the values. Users can use a data set like %TIME_SET% instead of specifying all the time members. We can use *FOR %MYTIME%=%TIME_SET% instead of *FOR %MYTIME%=2003.JAN,2004.JAN,2005.JAN. Therefore, users can execute script logic dynamically based on the passed data sets. This is very useful when we use script logic with a dynamic data set.

We will see how to use the GET and FLD functions and memory variables in the next post, as the last topic.

Loading transactional data from any infocube to BPC Application in BPC7NW
Posted by Pravin Datar in pravin.datar on Apr 16, 2009 6:54:44 PM
Business Planning and Consolidation version for Netweaver successfully leverages the Netweaver infrastructure, and we can use the data stored in infocubes in the Enterprise Data Warehouse in BPC7NW. In this blog we will discuss what tools and techniques we can use to get the transactional data stored in any infocube in BW into a BPC Application, using the Data Manager in BPC7NW.

Options available for cube to cube data load: There are many options available to the users of BPC7NW to get transactional data from a BW cube to a BPC Application. Here are some of them:

1. We can export the transactional data from any infocube in the form of a flat file and then use the Data Manager in BPC7NW to load that data from the flat file to the BPC Application. If we have to load data from any other non-BW source, we can use exactly the same approach: export data from that external data source to a flat file and then import that flat file into BPC. The advantage of this option is that it is relatively simple to administer and very flexible to adapt, since we are dealing with a flat file when it comes to importing data into the BPC Application. The limitation of this option is that we are not really leveraging the BW platform; in essence, we are treating the BW infocube as any other external data source. There is another ostensible limitation that this option may portray: that this process cannot be automated. If we have to get a flat file export from an infocube and then import that flat file into BPC, it may appear that there has to be a handoff from the flat file export to the import of the flat file. However, we can overcome this limitation if we want to, by creating a custom process chain that in turn executes the process chain that imports the flat file data into the BPC Application. (You can read more about that at Running BPC process chains within non-BPC process chains.)

2. We can leverage the BW Extraction, Transformation and Loading (ETL) tools to transfer transactional data from any infocube to the infocube used by the BPC Application. A BPC Application in BPC7NW generates its own infocube in its namespace. (You can read more about the BPC namespace at A reservation of a different kind – why, what and how of BPC namespace.) We can use standard BW ETL techniques to transform and load the data to the cube used by the BPC Application. This option is also a valid one: it indeed leverages the BW platform, it can very well be automated if we desire to do so, and using the ETL toolset is a proven, scalable and very robust technique to handle data within BW. Further, this option can handle deltas very efficiently, since we can use the delta handling mechanisms in BW. However, this option is also beset with several limitations.

Firstly, the infocube for the BPC Application will need to be switched to loading mode while loading it using ETL, and then switched back to planning mode after loading is complete. This means that during that time it will not be available for planning through the BPC front end. So the ETL load has to be coordinated with the BPC users to avoid potential mishaps, like the automated process switching the BPC cube to loading mode when a planner is in the middle of updating his/her plan.

Secondly, in BPC7NW we can maintain validations to validate the data written to the BPC Application, so that invalid records are not written and data integrity is maintained. If we use BW ETL, it will bypass this validation mechanism completely, and there is a risk of invalid records being written to the BPC application. The validation mechanism will not check records that have already been written.

Thirdly, the BPC audit logs will not be updated if we use BW ETL, since it won't invoke any BPC audit functionality.

Fourthly, if and when the BPC user executes a 'full optimize' for the BPC Application, BPC may end up generating a totally new cube which is more optimized than the previous one. In that case, all the ETL work that was done for the previous cube will be dropped from the new cube. The building blocks for the ETL, like the datasource, infosource etc., will still be there in BW, but they would have to be linked again to the new cube. So in this case the 'full optimize' option does not automatically inherit the ETL configuration done on the earlier BPC cube.

Lastly, the data will always be additive: for example, if we have 10 in the cube and then write 100, the result will always be 110. This is just a consideration rather than a limitation.

3. This brings us to the third option, which we will discuss in much more detail. This option is to use the Data Manager in BPC7NW to load transactional data from any Infoprovider in BW to the BPC Application. This option overcomes almost all the limitations enumerated above, since it is executed from within BPC itself. It can very well be scheduled from within Data Manager, or by invoking the Data Manager process chain from a custom process chain.
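The additive behavior noted under option 2 (10 already in the cube plus a written 100 yields 110) follows from the fact table being append-only, with reports aggregating over all matching rows. A toy model of that point (an illustration only, not actual BW internals):

```python
# Toy model of additive key-figure behavior in an append-only fact table
# (illustrates the 10 + 100 = 110 point; not actual BW internals).

fact_table = []

def write(dims, signeddata):
    fact_table.append((dims, signeddata))  # loads append, never overwrite

def report(dims):
    return sum(v for d, v in fact_table if d == dims)  # reports aggregate

key = ("INVENTORY", "ACTUAL", "2011.JAN")
write(key, 10)    # value already in the cube
write(key, 100)   # an ETL load writes 100 on top of it
print(report(key))  # 110
```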
So let us see how exactly we should go about realizing this.

Creating the transformation file: Before we run the data manager package to load data, we should prepare the transformation and conversion files. An example of a transformation file that we can use to load data from other infoproviders is given below. Please note the mapping section: the dimension names in the BPC Application are mapped to the corresponding characteristics from the Infoprovider from which we want to load the data. For example, the TIME dimension in BPC is mapped to the 0FISCPER characteristic in BW. Here TIME is the BPC dimension name, whereas 0FISCPER is the technical name of the BW characteristic in the source Infoprovider. Please note that as far as the BPC dimension names are concerned, we use the dimension names (and not the technical names of the BW characteristics corresponding to those dimensions), whereas when we map them to the Infoprovider characteristics, we must use the technical names of the BW characteristics. Also please note the mapping for AMOUNT: AMOUNT is mapped to the technical name of the key figure in the source Infoprovider (here ZMAOUNT01 is the technical name of the key figure in the source Infoprovider).

In this regard, please note that the source BW Infoprovider can have multiple key figures; however, we can choose only one key figure in a transformation file, since our BPC cube has only one key figure. If for any reason you have a situation where you have to get data from two or more key figures, you can use multiple transformation files and run the data manager package multiple times, once with each transformation file. However, in that case, please note that the data from all those key figures will end up in the same single key figure in the BPC cube.

What if there is no corresponding characteristic in the source Infoprovider for a dimension in BPC? What if we want to have a single default value in the data load regardless of what exists in the source Infoprovider? Well, that is OK: we can always pass a default value through our transformation file, by using the keyword *NEWCOL in the transformation file for the corresponding BPC dimension, even if there is no corresponding BW characteristic in the source Infoprovider. For example, for the CATEGORY dimension we can force the default value FORECAST in the data load, and similarly for the dimension P_DATASRC.

What if we want to load only a subset of the data from the source Infoprovider? In that case, we can enter a selection in the transformation file. In the OPTIONS section of the example transformation file, we have entered a selection to select data for only two characteristic values, C2000 and C1000, of the characteristic ZACCT01; hence we are selecting only a subset of the source Infoprovider. You can enter selections for multiple characteristics here and thus load data from a specific data slice of the source Infoprovider.

Creating the conversion file: In addition to the transformation file, we can have conversion files and refer to them in the transformation file. This is necessary if the master data values of the BPC dimension in the BPC Application (dimension members) and of the characteristic in the source Infoprovider (characteristic values) are different. For example, the conversions for the TIME dimension are read from the file ZBICGR01TIME.XLS, which holds the mapping between internal (BPC) time dimension members and external (0FISCPER) values. In that file, ? is used as a wildcard so that we don't have to write a conversion for each year; the conversion file can work for any year. A conversion file can likewise be written for the account dimension.

In addition to writing such conversion files for BPC dimensions, we can write a conversion for AMOUNT if necessary. For example, if we want to change the data coming over for a particular dimension member during the load, we can write a formula in the conversion file for AMOUNT; with such a formula, the data for account CE0004220 can be increased by 10% during the load.

Validating the transformation file: After we prepare the transformation and conversion files, the next step is to get the transformation validated against the source Infoprovider. BPC7NW gives additional options for this validation: we should select the option 'transaction data from infocube' and enter the technical name of the Infoprovider. Please note that though the option reads 'transaction data from infocube', it works with a DSO in the same way. The validation then checks our transformation and conversion files against the data in the source Infoprovider and gives us the result.

Running the data manager package: Once we have our transformation files validated, we are in good shape to run our data manager package to load the data from the source Infoprovider. There is a delivered process chain and data manager package to load data from an Infoprovider. If we run this package, we can enter the technical name of the source Infoprovider and the transformation file, and schedule the load. Upon completion, we can see the success message in the data manager package status.

Loading data from other appsets: So far we have seen how we can use the data manager features to load data from any Infoprovider in BW to a BPC Application. This raises another question: can we use the same technique to load data from one BPC application to another BPC application in the same Appset, or for that matter from any application in any other Appset within BPC7NW? The answer is an emphatic 'yes'. We can treat the BPC application in the same or another appset as just another BW cube with characteristics and a key figure. The only consideration in using this approach to load data from applications in other appsets is that we have to maintain the /CPMB namespace technical names of the characteristics of those dimensions in the transformation file against the dimension names of the target application, and, while running the package, we have to enter the /CPMB namespace technical name of the infocube pertinent to the source BPC application.
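Mechanically, the transformation and conversion files described above amount to a field mapping plus per-value rewrites. A rough Python model of applying them to one source record; the record layout, the external time format (2011.001) and the 10% formula are assumptions for illustration, not the actual file syntax:

```python
# Rough model of applying a transformation + conversion file to one source
# record (field layout and external time format are illustrative assumptions).

MAPPING = {          # BPC dimension -> source characteristic (technical name)
    "TIME": "0FISCPER",
    "ACCOUNT": "ZACCT01",
    "CATEGORY": "*NEWCOL(FORECAST)",  # no source field: force a default value
    "AMOUNT": "ZMAOUNT01",            # the single key figure being loaded
}

def convert_time(external):
    """Conversion-file style rewrite; the '?' wildcard means 'any year'."""
    year, period = external.split(".")
    return f"{year}.JAN" if period == "001" else external

def transform(source_record):
    target = {}
    for bpc_dim, source_field in MAPPING.items():
        if source_field.startswith("*NEWCOL("):
            target[bpc_dim] = source_field[8:-1]      # constant default value
        else:
            target[bpc_dim] = source_record[source_field]
    target["TIME"] = convert_time(target["TIME"])
    if target["ACCOUNT"] == "CE0004220":              # AMOUNT formula: +10%
        target["AMOUNT"] = round(target["AMOUNT"] * 1.1, 2)
    return target

row = {"0FISCPER": "2011.001", "ZACCT01": "CE0004220", "ZMAOUNT01": 100.0}
print(transform(row))
# {'TIME': '2011.JAN', 'ACCOUNT': 'CE0004220', 'CATEGORY': 'FORECAST', 'AMOUNT': 110.0}
```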
Thus we can see that we can effectively leverage the Netweaver platform to transfer transactional data from one infocube to another using the Data Manager package.