Informatica FAQs

Differences between Active Transformation and Passive Transformation

Active Transformation
An active transformation can change the number of rows that pass through it, such as a Filter transformation that removes rows that do not meet the filter condition. The active transformations are:
• Advanced External Procedures
• Aggregator
• Application Source Qualifier
• Filter Transformation
• Joiner Transformation
• Normalizer Transformation
• Rank Transformation
• Router Transformation
• Sorter Transformation
• Source Qualifier
• Update Strategy

Passive Transformation
A passive transformation does not change the number of rows that pass through it, such as an Expression transformation that performs a calculation on data and passes all rows through the transformation. The passive transformations are:
• Expression
• External Procedures
• Input Transformation
• Lookup Transformation
• Output Transformation
• Sequence Generator
• Stored Procedure
• XML Source Qualifier

Differences between Parameter and Variable

Parameter
A mapping parameter represents a constant value that you can define before running a session. A mapping parameter retains the same value throughout the entire session. When you use a mapping parameter, you declare and use the parameter in a mapping or mapplet, then define the value of the parameter in a parameter file. During the session, the Informatica Server evaluates all references to the parameter to that value.

Variable
A mapping variable represents a value that can change through the session. The Informatica Server saves the value of a mapping variable to the repository at the end of each successful session run and uses that value the next time you run the session. When you use a mapping variable, you declare the variable in the mapping or mapplet, and then use a variable function in the mapping to automatically change the value of the variable. At the beginning of a session, the Informatica Server evaluates references to a variable to its start value.
At the end of a successful session, the Informatica Server saves the final value of the variable to the repository. The next time you run the session, the Informatica Server evaluates references to the variable to the saved value. You can override the saved value by defining the start value of the variable in a parameter file.

Use mapping variables to perform automatic incremental reads of a source. For example, suppose the customer accounts in the mapping parameter example are numbered from 001 to 065, incremented by one. Instead of creating a mapping parameter, you can create a mapping variable with an initial value of 001. In the mapping, use a variable function to increase the variable value by one. The first time the Informatica Server runs the session, it extracts the records for customer account 001. At the end of the session, it increments the variable by one and saves that value to the repository. The next time the Informatica Server runs the session, it automatically extracts the records for the next customer account, 002.

With a mapping parameter, to reuse the same mapping to extract records for other customer accounts, you can enter a new value for the parameter in the parameter file and run the session. Or you can create a parameter file for each customer account and start the session with a different parameter file each time using pmcmd. By using a parameter file, you reduce the overhead of creating multiple mappings and sessions to extract transaction records for different customer accounts.

Page 1 of 18

Differences Between Active Mapplets and Passive Mapplets
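The parameter-file approach described above can be sketched with a small example. The folder, workflow, session, and parameter names below are hypothetical, and the exact pmcmd option syntax varies by PowerCenter version, so treat this as an illustration rather than a literal recipe:

```
[MyFolder.WF:wf_extract_accounts.ST:s_m_extract_accounts]
$$CustomerAccount=003
```

Saving one such file per customer account (for example account_001.txt, account_002.txt, and so on) and passing the appropriate file to pmcmd with its parameter-file option lets the same session extract a different account on each run.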
• A mapplet can be active or passive depending on the transformations in the mapplet.
• Active mapplets contain one or more active transformations.
• Passive mapplets contain only passive transformations.
• As with an active transformation, you cannot concatenate data from an active mapplet with a different pipeline.

Differences Between Standard Validation and Extended Validation

Standard Validation
Use standard validation to validate task instances and expressions in the workflow without validating nested worklets and worklet objects. When you use standard validation, the Workflow Manager does not validate nested worklets or nonreusable worklets and sessions you have not edited, and it does not validate reusable worklet objects used in the workflow. The Workflow Manager validates nonreusable worklet objects and reusable session instances if you have viewed or edited the session or worklet.

Extended Validation
Use extended validation to validate reusable worklet instances, worklet objects, and all other nested worklets in the workflow. When you use extended validation, the Workflow Manager validates all task instances (including sessions) and nested worklets, regardless of whether you have edited them. The Workflow Manager validates a worklet instance by verifying attributes in the Parameter tab of the worklet instance, and validates the worklet object using the same validation rules used for workflows. To use extended validation, choose Workflow-Extended Validate. If the workflow contains nested worklets, you can select a worklet to validate it and all other worklets nested under it: right-click the worklet and choose Extended Validate.

How to Increase Performance by Improving Network Speed?
The performance of the Informatica Server is related to network connections. A local disk can move data five to twenty times faster than a network. Consider the following options to minimize network activity and to improve Informatica Server performance:
• If you use flat files as a source or target in your session, move the files onto the Informatica Server system. When you store flat files on a machine other than the Informatica Server, session performance becomes dependent on the performance of your network connections. Moving the files onto the Informatica Server system and adding disk space might improve performance.
• If you use relational source or target databases, try to minimize the number of network hops between the source and target databases and the Informatica Server. Moving the target database onto a server system might improve Informatica Server performance. The native database drivers also improve session performance.
• You can run multiple PowerCenter Servers on separate systems against the same repository. Distributing the session load to separate PowerCenter Server systems increases performance.
• When you run sessions that contain multiple partitions, have your network administrator analyze the network and make sure it has enough bandwidth to handle the data moving across the network from all partitions.
• Configure your system to use additional CPUs to improve performance. Additional CPUs allow the system to run multiple sessions in parallel as well as multiple pipeline partitions in parallel.
• When all character data processed by the Informatica Server is 7-bit ASCII or EBCDIC, configure the Informatica Server to run in the ASCII data movement mode. In ASCII mode, the Informatica Server uses one byte to store each character. When you run the Informatica Server in Unicode mode, it uses two bytes for each character, which can slow session performance.

What are the transformations that cannot be used inside a mapplet?
You cannot include the following objects in a mapplet:
o Normalizer transformations
o COBOL sources
o XML Source Qualifier transformations
o XML sources
o Target definitions
o Other mapplets

Which transformation cannot be made reusable?
Sequence Generator transformations used in mapplets must be reusable; you cannot demote a reusable Sequence Generator transformation to standard in a mapplet.

If the source is a Flat File, how can I improve performance?
If you use a flat file as a source or target in your session, you can move the files onto the Informatica Server system to improve performance. When you store flat files on a machine other than the Informatica Server, session performance becomes dependent on the performance of your network connections. Moving the files onto the Informatica Server system and adding disk space might improve performance.

You might want to increase system memory in the following circumstances:
1. You run a session that uses large cached lookups.
2. You run a session with many partitions.
If you cannot free up memory, you might want to add memory to the system.

However, additional CPUs might cause disk bottlenecks. To prevent disk bottlenecks, minimize the number of processes accessing the disk. Processes that access the disk include database functions and operating system functions. Parallel sessions or pipeline partitions also require disk access.

What is Forwarding Rejected Rows?
You can configure the Update Strategy transformation to either pass rejected rows to the next transformation or drop them. By default, the Informatica Server forwards rejected rows to the next transformation: it flags the rows for reject and writes them to the session reject file. If you do not select Forward Rejected Rows, the Informatica Server drops rejected rows and writes them to the session log file.

Can we create a Lookup based on more than one table?
You can import a lookup table from the mapping source or target database, or you can import a lookup table from any database that both the Informatica Server and Client machine can connect to. If your mapping includes multiple sources or targets, you can use any of the mapping sources or mapping targets as the lookup table. Connect to the database to import the lookup table definition. The lookup table can be a single table, or you can join multiple tables in the same database using a lookup SQL override. The Informatica Server queries the lookup table or an in-memory cache of the table for all incoming rows into the Lookup transformation. The Informatica Server can connect to a lookup table using a native database driver or an ODBC driver.

What is Business Component?
• Business components allow you to organize, group, and display sources and mapplets in a single location in your repository folder. Business components let you access data from all operational systems within your organization through source and mapplet groupings representing business entities.
• You can think of business components as tools that let you view your sources and mapplets in a meaningful way using hierarchies and directories. For example, you can create groups of source tables that you call Purchase Orders and Payment Vouchers. You can then organize the appropriate source definitions into logical groups and add descriptive names for them.
• You create business components in the Designer. The Designer creates a business component when you drag any source or mapplet into any directory of the business component tree. You can use the same source or mapplet multiple times in the business component tree.
• Since business components are references to another object, you can edit the object from its original location or from the business components directory.

What are the various Date functions?

ADD_TO_DATE (date, format, amount)
Adds a specified amount to one part of a date/time value, and returns a date in the same format as the date you pass to the function. ADD_TO_DATE accepts positive and negative integer values.
Return Value
• Date in the same format as the date you pass to this function.
• NULL if a null value is passed as an argument to the function.

DATE_COMPARE (date1, date2)
Returns an integer indicating which of two dates is earlier. Note that DATE_COMPARE returns an integer value rather than a date value.
Return Value
• -1 if the first date is earlier.
• 0 if the two dates are equal.
• 1 if the second date is earlier.
• NULL if one of the date values is NULL.

DATE_DIFF (date1, date2, format)
Returns the length of time, measured in the increment you specify (years, months, days, hours, minutes, or seconds), between two dates. The Informatica Server subtracts the second date from the first date and returns the difference. If date1 is earlier than date2, the return value is a negative number. If date1 is later than date2, the return value is a positive number.
Return Value
• Double value.
• Zero if the dates are the same.
• NULL if one (or both) of the date values is NULL.

GET_DATE_PART (date, format)
Returns the specified part of a date as an integer value. For example, if you create an expression that returns the month portion of the date, and pass a date such as Apr 1 1997 00:00:00, GET_DATE_PART returns 4.
Return Value
• Integer representing the specified part of the date.
• NULL if a value passed to the function is NULL.

LAST_DAY (date)
Returns the date of the last day of the month for each date in a port.
Return Value
• Date. The last day of the month for the date value you pass to this function.
• NULL if a value passed to the function is NULL.
Nulls: If a value is NULL, LAST_DAY ignores the row. However, if all values passed from the port are NULL, LAST_DAY returns NULL.
Group By: LAST_DAY groups values based on group by ports you define in the transformation, returning one result for each group. If there is no group by port, LAST_DAY treats all rows as one group, returning one value.

MAX (date [, filter_condition])
Returns the latest date found within a port or group. You can apply a filter to limit the rows in the search. You can also use MAX to return the largest numeric value in a port or group. MAX is one of several aggregate functions; you use aggregate functions in Aggregator transformations only. You can nest only one other aggregate function within MAX, and the nested function must return a date datatype.
Return Value
• Date.
• NULL if all values passed to the function are NULL, or if no rows are selected (for example, the filter condition evaluates to FALSE or NULL for all rows).

MIN (date [, filter_condition])
Returns the oldest date found in a port or group. You can apply a filter to limit the rows in the search. You can also use MIN to return the minimum numeric value in a port or group. MIN is one of several aggregate functions; you use aggregate functions in Aggregator transformations only. You can nest only one other aggregate function within MIN.
Return Value
• Date.
• NULL if all values passed to the function are NULL, or if no rows are selected (for example, the filter condition evaluates to FALSE or NULL for all rows).
Nulls: If a single value is NULL, MIN ignores it. However, if all values passed from the port are NULL, MIN returns NULL.
Group By: MIN groups values based on group by ports you define in the transformation, returning one result for each group. If there is no group by port, MIN treats all rows as one group, returning one value.

ROUND (date [, format])
Rounds one part of a date. You can also use ROUND to round numbers. ROUND returns a date in the same format as the source date. You can link the results of this function to any port with a Date/Time datatype.
Return Value
• Date with the specified part rounded.
• NULL if a value passed to the function is NULL.

SET_DATE_PART (date, format, value)
Sets one part of a date/time value to a value you specify.
Return Value
• Date in the same format as the source date with the specified part changed.
• NULL if a value passed to the function is NULL.

TRUNC (date [, format])
Return Value
• Date.
• NULL if you pass a null value to the function.

What are the types of Workflow variables?
• Pre-defined workflow variables
• User-defined workflow variables
The Informatica Server creates pre-defined workflow variables each time you create a new task. You create user-defined workflow variables when you create a workflow.
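As a rough illustration of the semantics above, here is a Python sketch of DATE_COMPARE, DATE_DIFF (in days), and LAST_DAY, with None standing in for NULL. This is an analogy only, not Informatica code:

```python
from datetime import datetime
import calendar

# Python analogues of DATE_COMPARE, DATE_DIFF, and LAST_DAY.
# NULL is modeled as None, matching the return-value rules above.

def date_compare(d1, d2):
    if d1 is None or d2 is None:
        return None                      # NULL if one of the date values is NULL
    return -1 if d1 < d2 else (1 if d1 > d2 else 0)

def date_diff_days(d1, d2):
    if d1 is None or d2 is None:
        return None                      # NULL if one (or both) dates is NULL
    # date1 minus date2: negative if date1 is earlier, positive if later
    return (d1 - d2).total_seconds() / 86400.0

def last_day(d):
    if d is None:
        return None
    last = calendar.monthrange(d.year, d.month)[1]
    return d.replace(day=last)

apr1 = datetime(1997, 4, 1)
may1 = datetime(1997, 5, 1)
print(date_compare(apr1, may1))    # -1: the first date is earlier
print(date_diff_days(apr1, may1))  # -30.0: date1 earlier than date2, so negative
print(last_day(apr1).day)          # 30: last day of April
print(apr1.month)                  # 4, like GET_DATE_PART(..., 'MM')
```

The None checks mirror the NULL rules stated for each function; the aggregate functions (MAX, MIN) and the group-by behavior have no simple scalar analogue and are omitted.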
You can use workflow variables when you configure the following types of tasks:

Assignment tasks. Use an Assignment task to assign a value to a user-defined workflow variable. For example, you can increment a user-defined counter variable by setting the variable to its current value plus 1.

Decision tasks. Decision tasks determine how the Informatica Server executes a workflow. For example, you can use the Status variable to run a second session only if the first session completes successfully.

Links. Links connect each workflow task. You can specify conditions with links to create branches in the workflow. The Workflow Manager does not allow you to use links to create loops in the workflow, and each link in the workflow can execute only once. You can use workflow variables in links to create branches: for example, after a Decision task, you can create one link to follow when the decision condition evaluates to true, and another link to follow when the decision condition evaluates to false.

Timer tasks. Timer tasks specify when the Informatica Server begins to execute the next task in the workflow. You can use a user-defined date/time variable to specify the exact time the Informatica Server starts to execute the next task.

Pre-Defined Workflow Variables
The Workflow Manager creates a set of pre-defined variables for every workflow. You can use pre-defined variables within a workflow, but you cannot modify or delete them. There are two types:
• Task-specific variables. The Workflow Manager creates a set of task-specific variables for each task in the workflow. You can use task-specific variables to represent information such as the time a task ended, the number of rows written to a target in a session, or the result of a Decision task. The Workflow Manager lists task-specific variables under the task name in the Expression Editor.
• System variables. You can use the SYSDATE and WORKFLOWSTARTTIME system variables within a workflow. The Workflow Manager lists system variables under the Built-in node in the Expression Editor.

The task-specific workflow variables are:
• Condition (Integer). Evaluation result of the decision condition expression.
• EndTime (Date/Time). Date and time the associated task ended.
• ErrorCode (Integer). Last error code for the associated task. If there is no error, the Informatica Server sets ErrorCode to 0 when the task completes.
• ErrorMsg (Nstring). Last error message for the associated task. If there is no error, the Informatica Server sets ErrorMsg to an empty string when the task completes.
• FirstErrorCode (Integer). Error code for the first error message in the session. If there is no error, the Informatica Server sets FirstErrorCode to 0 when the session completes.
• FirstErrorMsg (Nstring). The first error message in the session. If there is no error, the Informatica Server sets FirstErrorMsg to an empty string when the task completes.
• PrevTaskStatus (Integer). Status of the task that the Workflow Manager executes immediately before the current task. If the previous task succeeded, the Workflow Manager sets PrevTaskStatus to SUCCEEDED; otherwise, it sets PrevTaskStatus to FAILED.
• StartTime (Date/Time). Date and time the associated task started.
• Status (Integer). Execution status of the task. Task statuses include ABORTED, DISABLED, FAILED, NOTSTARTED, STARTED, STOPPED, and SUCCEEDED.
• SrcFailedRows (Integer). Total number of rows read from the sources that failed.
• SrcSuccessRows (Integer). Total number of rows successfully read from the sources.
• TgtFailedRows (Integer). Total number of rows that the targets rejected.
• TgtSuccessRows (Integer). Total number of rows successfully written to the targets.
• TotalTransErrors (Integer). Total number of transformation errors.
Note: Nstring can have a maximum length of 600 characters.

You cannot assign values to pre-defined workflow variables. You must create a user-defined variable before you can assign values to it.

Assignment
The Assignment task allows you to assign a value to a user-defined workflow variable. To use an Assignment task in the workflow, first create and add the Assignment task to the workflow, then configure it to assign values or expressions to user-defined variables. After you assign a value to a variable using the Assignment task, the Informatica Server uses the assigned value for the variable during the remainder of the workflow.

Decision
The Decision task allows you to enter a condition that determines the execution of the workflow, similar to a link condition. The Decision task has a pre-defined variable called $Decision_task_name.Condition that represents the result of the decision condition. The Informatica Server evaluates the condition in the Decision task and sets the pre-defined condition variable to True (1) or False (0). You can specify one decision condition per Decision task. If you do not specify a condition in the Decision task, the Informatica Server evaluates the Decision task to true. After the Informatica Server evaluates the Decision task, you can use the pre-defined condition variable in other expressions in the workflow to help you develop the workflow. Depending on the workflow, you might use link conditions instead of a Decision task.

Timer
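As an illustration, link conditions that use the Decision task variable and the task-specific variables described above might look like the following. The task names (Decision_CheckLoad, s_LoadOrders) are invented for the example, and the exact expression syntax should be checked against your PowerCenter version:

```
-- Link followed when the decision condition evaluated to true:
$Decision_CheckLoad.Condition = 1

-- Link followed when the decision condition evaluated to false:
$Decision_CheckLoad.Condition = 0

-- A link condition built from task-specific variables of a session:
$s_LoadOrders.Status = SUCCEEDED AND $s_LoadOrders.TgtFailedRows = 0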
The Timer task allows you to specify the period of time to wait before the Informatica Server executes the next task in the workflow. You can choose to start the next task in the workflow at an exact time and date, or you can choose to wait a period of time after the start time of another task, workflow, or worklet before starting the next task. The Timer task has two types of settings:
• Absolute time. You specify the exact time that the Informatica Server starts executing the next task in the workflow. You may specify the exact date and time, or you can choose a user-defined workflow variable to specify the exact time.
• Relative time. You instruct the Informatica Server to wait for a specified period of time after the Timer task, the parent workflow, or the top-level workflow starts.

For example, you may have two sessions in the workflow, and you want the Informatica Server to wait ten minutes after the first session completes before it executes the second session. Use a Timer task after the first session. In the Relative Time setting of the Timer task, specify ten minutes from the start time of the Timer task.

Control
You can use the Control task to stop, abort, or fail the top-level workflow or the parent workflow based on an input link condition. A parent workflow or worklet is the workflow or worklet that contains the Control task.
• Fail Me. Marks the Control task as "Failed." The Informatica Server fails the Control task if you choose this option. If you choose Fail Me in the Properties tab and choose Fail Parent If This Task Fails in the General tab, the Informatica Server fails the parent workflow.
• Fail Parent. Marks the status of the workflow or worklet that contains the Control task as Failed after the workflow or worklet completes.
• Stop Parent. Stops the workflow or worklet that contains the Control task.
• Abort Parent. Aborts the workflow or worklet that contains the Control task.
• Fail Top-Level Workflow. Fails the workflow that is running.
• Stop Top-Level Workflow. Stops the workflow that is running.
• Abort Top-Level Workflow. Aborts the workflow that is running.

Event-Raise Task and Event-Wait Task
You can define events in the workflow to specify the sequence of task execution. To coordinate the execution of the workflow, you may specify the following types of events for the Event-Wait and Event-Raise tasks:
• Pre-defined event. A pre-defined event is a file-watch event. For pre-defined events, use an Event-Wait task to instruct the Informatica Server to wait for the specified indicator file to appear before continuing with the rest of the workflow. When the Informatica Server locates the indicator file, it starts the next task in the workflow.
• User-defined event. A user-defined event is a sequence of tasks in the workflow. The event is triggered based on the completion of the sequence of tasks. Once the event triggers, the Informatica Server continues executing the rest of the workflow.

Use the following tasks to help you use events in the workflow:

Event-Raise Task
The Event-Raise task represents a user-defined event. When the Informatica Server executes the Event-Raise task, the Event-Raise task triggers the event. Use the Event-Raise task with the Event-Wait task to define events. Use an Event-Raise task to specify the location of the user-defined event in the workflow: a user-defined event is the sequence of tasks in the branch from the Start task leading to the Event-Raise task, and when all the tasks in that branch complete, the Event-Raise task triggers the event. To use an Event-Raise task, you must first declare the user-defined event. Then, create an Event-Raise task in the workflow to represent the location of the user-defined event you just declared. In the Event-Raise task properties, specify the name of the user-defined event.

Event-Wait Task
The Event-Wait task waits for a pre-defined event or a user-defined event to occur. Once the event triggers, the Informatica Server continues executing tasks after the Event-Wait task. To use the Event-Wait task for a user-defined event, you specify the name of the user-defined event in the Event-Wait task properties. The Event-Wait task then waits for the Event-Raise task to trigger the user-defined event before continuing with the rest of the tasks in its branch.

Waiting for Pre-Defined Events
To use a pre-defined event, you need a shell command, script, or batch file to create an indicator file. The file must be created or sent to a directory local to the Informatica Server, and the file can be any format recognized by the Informatica Server operating system. When you use the Event-Wait task to wait for a pre-defined event, you specify an indicator file for the Informatica Server to watch. The Informatica Server waits for the indicator file to appear; once the indicator file appears, the Informatica Server continues executing tasks after the Event-Wait task. When you specify the indicator file in the Event-Wait task, enter the directory in which the file will appear and the name of the indicator file. You must provide the absolute path for the file, and the directory must be local to the Informatica Server. If you specify only the file name and not the directory, the Workflow Manager looks for the indicator file in the system directory. The Informatica Server writes the time the file appears in the workflow log. You can choose to have the Informatica Server delete the indicator file after it detects the file, or you can manually delete the indicator file. The Informatica Server marks the status of the Event-Wait task as failed if it cannot delete the indicator file. Do not use the Event-Raise task to trigger the event when you wait for a pre-defined event.
Note: Do not use a source or target file name as the indicator file name.
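The indicator-file mechanism can be sketched in Python as the kind of script a shell command might run to raise a file-watch event. The path below is hypothetical; in practice it must be an absolute path local to the Informatica Server and must not match a source or target file name:

```python
from pathlib import Path

# Sketch: create an indicator file for a pre-defined (file-watch) event.
# The directory and file name here are invented for the example.
INDICATOR = Path("/tmp/informatica_events/orders_extract.done")

def raise_file_watch_event(indicator: Path) -> None:
    # An empty file is enough; any format the server OS recognizes works.
    indicator.parent.mkdir(parents=True, exist_ok=True)
    indicator.touch()

def clear_file_watch_event(indicator: Path) -> None:
    # Equivalent of deleting the indicator file after the Event-Wait
    # task detects it (the server can also do this automatically).
    indicator.unlink(missing_ok=True)

raise_file_watch_event(INDICATOR)
print(INDICATOR.exists())   # True once the event has been "raised"
clear_file_watch_event(INDICATOR)
```

An upstream process would run a script like this as its last step; the Event-Wait task watching the same absolute path then lets the workflow continue.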
Enable Test Load in Session Property
You can configure the Informatica Server to perform a test load. With a test load, the Informatica Server reads and transforms data without writing to targets. The Informatica Server generates all session files and performs all pre- and post-session functions, as if running the full session. The Informatica Server writes data to relational targets, but rolls back the data when the session completes. For all other target types, such as flat file and SAP BW, the Informatica Server does not write data to the targets. Enter the number of source rows you want to test in the Number of Rows to Test field. You cannot perform a test load on sessions using XML sources.
Note: You can perform a test load when you configure a session for normal mode. If you configure the session for bulk mode, the session fails.

Incremental Aggregation
Select the Incremental Aggregation option if you want the Informatica Server to perform incremental aggregation.

Informatica Server Processing for Incremental Aggregation
The first time you run a session with incremental aggregation enabled, the Informatica Server processes the entire source. At the end of the session, the Informatica Server stores aggregate data from that session run in two files, the index file and the data file, which it creates in a local directory. Each subsequent time you run the session with incremental aggregation, you use only the incremental source changes in the session: instead of using historical data, use only changes in the source as source data for the session. The Informatica Server then performs the following actions:
• For each input record, the Informatica Server checks historical information in the index file for a corresponding group, then:
o If it finds a corresponding group, the Informatica Server performs the aggregate operation incrementally, using the aggregate data for that group, and saves the incremental change.
o If it does not find a corresponding group, the Informatica Server creates a new group and saves the record data.
• When writing to the target, the Informatica Server applies the changes to the existing target:
o Updates modified aggregate groups in the target.
o Inserts new aggregate data.
o Deletes removed aggregate data.
o Ignores unchanged aggregate data.
o Saves modified aggregate data in the index and data files to be used as historical data the next time you run the session.

If the source changes significantly, and you want the Informatica Server to continue saving aggregate data for future incremental changes, configure the Informatica Server to overwrite existing aggregate data with new aggregate data. The Informatica Server creates new aggregate data, instead of using historical data, when you perform one of the following tasks:
• Save a new version of the mapping.
• Select Reinitialize Aggregate Cache in the session property sheet.
• Move the aggregate files without correcting the configured path or directory for the files in the session property sheet.
• Change the configured path or directory for the aggregate files in the session property sheet without moving the files to the new location.

When you partition a session that uses incremental aggregation, the Informatica Server creates one set of cache files for each partition. If you change the partitioning information after you run an incremental aggregation session, the Informatica Server realigns the cache files the next time you run the incremental aggregation session.
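The group-lookup logic described above can be sketched in Python. This is a simplified illustration of the bookkeeping only, with a dictionary standing in for the index and data files, not a description of the server's actual cache formats:

```python
# Minimal sketch of incremental-aggregation bookkeeping: a cache keyed by
# group, updated from incremental source changes only.

def incremental_aggregate(cache, incremental_rows):
    """cache maps a group key to a running SUM; returns (cache, actions)."""
    actions = []                          # what would be applied to the target
    for group, amount in incremental_rows:
        if group in cache:                # corresponding group found in "index"
            cache[group] += amount        # aggregate incrementally
            actions.append(("update", group, cache[group]))
        else:                             # no corresponding group
            cache[group] = amount         # create a new group
            actions.append(("insert", group, cache[group]))
    return cache, actions

# First run processes the entire source; later runs see only the changes.
cache, _ = incremental_aggregate({}, [("acct_001", 100), ("acct_002", 50)])
cache, actions = incremental_aggregate(cache, [("acct_001", 25), ("acct_003", 10)])
print(actions)   # [('update', 'acct_001', 125), ('insert', 'acct_003', 10)]
```

Persisting `cache` between runs plays the role of saving the index and data files as historical data for the next session run.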
instead of using the captured changes in source tables. you must reinitialize the cache. even though these code pages are compatible. Normal. Informatica Server logs initialization information as well as error messages and notification of rejected data. Then enter the appropriate directory for the server variable. Creating File Directory When you run multiple sessions with incremental aggregation. Your mapping contains percentile or median functions. or Verbose Data. Informatica Server logs initialization and status information. However. Override Tracing Overrides tracing levels set on a transformation level. by using the server variable for all sessions using incremental aggregation.Informatica FAQs Re-initializing the Aggregate Files Reinitializing the aggregate cache overwrites historical aggregate data with new aggregate data. Do not enable incremental aggregation in the following circumstances: • • • You cannot capture new source data. in the Workflow Manager. You can enter session-specific directories for the index and data files. edit the session properties to disable the Reinitialize Aggregate Cache option. Increase this setting from the default of 1024 bytes per line only if source flat file records are larger than 1024 bytes. you typically need to use the use the entire source table. decide where you want the files stored. Capturing Incremental Changes Before enabling incremental aggregation. errors encountered. After you run a session that reinitializes the aggregate cache. You may be able to remove pre-existing source data at the source database with a pre-load stored procedure. Terse. You may be able to remove pre-existing source data during a session with a filter. You might do this by: • • Using a filter in the mapping. the Informatica Server overwrites the aggregate cache each time you run the session. you can easily change the cache directory when necessary by changing $PMCacheDir. 
None Terse Normal The Informatica Server uses the tracing level set in the mapping. Line Sequential Buffer Length Affects the way the Informatica Server reads flat files. Using a stored procedure. $PMCacheDir. Note: When you move from Windows to UNIX. If you do not clear Reinitialize Aggregate Cache. you cannot change from a Latin1 code page to an MSLatin1 code page. Processing the incrementally changed source significantly changes the target. and Page 12 of 18 . Note: Changing the cache directory without moving the files causes the Informatica Server to reinitialize the aggregate cache and gather new aggregate data. Verbose Initialization. Selecting this option enables a menu from which you choose a tracing level: None. Therefore. you must capture changes in source data. you can set the order in which the Informatica Server sends rows to different target definitions in a mapping. When you enter a tracing level in the session properties.Informatica FAQs skipped rows due to transformation row errors. Constraintbased loading establishes the order in which the Informatica Server loads individual targets within a set of targets receiving data from a single source qualifier. Target Load Type Note: Constraint-based loading does not affect the target load ordering of the mapping. Page 13 of 18 . you then determine the order in which each Source Qualifier sends data to connected targets in the mapping. Summarizes session results. or updating records in tables that have the primary key and foreign key constraints. The Informatica Server writes data to all the targets connected to the same Source Qualifier or Normalizer simultaneously to maximize performance. Verbose Initialization In addition to normal tracing. create one Source Qualifier or Normalizer transformation for each target within a mapping. you override tracing levels configured for transformations in the mapping. This feature is crucial if you want to maintain referential integrity when inserting. 
Target load ordering defines the order the Informatica Server reads each source qualifier in the mapping. names of index and data files used. the Informatica Server sends all rows to targets connected to that Joiner at the same time. Informatica Server logs additional initialization details. Informatica Server logs each row that passes into the mapping. To set the target load order. In the Designer. Also notes where the Informatica Server truncates string data to fit the precision of a column and provides detailed transformation statistics. and detailed transformation statistics. Verbose Data You can also enter tracing levels for individual transformations in the mapping. To specify the order in which the Informatica Server sends data to targets. deleting. but not at the level of individual rows. regardless of the target load order. When a mapping includes a Joiner transformation. In addition to verbose initialization tracing. or any valid DOS or batch file for Windows servers.or post-session shell command.or post-session shell commands. Use the following guidelines to call a shell command: • • Use any valid UNIX command or shell script for UNIX servers. The Workflow Manager allows you to choose from the following options when you configure shell commands: • • Create non-reusable shell commands. or to archive target files before the session begins. Create a non-reusable set of shell commands for the session.or post-session shell commands. You can use pre. The Workflow Manager provides the following types of shell commands for each Session task: • • • Pre-session command. for example. The Informatica Server performs post-session success commands only if the session completed successfully. The Informatica Server performs post-session failure commands only if the session failed to complete.Informatica FAQs Pre-Session and Post-Session Shell Commands The Informatica Server can perform shell commands at the beginning of the session or at the end of the session. 
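The guidelines only constrain where a pre- or post-session command runs, not what it does. As an illustrative sketch (the directory layout, file names, and the $PMTargetFileDir fallback below are assumptions made for this example, not product behavior), a UNIX pre-session shell command might archive earlier target files and remove a stale reject file:

```shell
#!/bin/sh
# Hypothetical pre-session cleanup script. In a real session the
# Informatica Server would expand $PMTargetFileDir; here we fall back
# to a demo directory so the script can run standalone.
TGT_DIR="${PMTargetFileDir:-/tmp/pmdemo/TgtFiles}"
ARCHIVE_DIR="$TGT_DIR/archive"

mkdir -p "$ARCHIVE_DIR"

# Archive target files left over from the previous run.
for f in "$TGT_DIR"/*.out; do
    if [ -e "$f" ]; then
        mv "$f" "$ARCHIVE_DIR/"
    fi
done

# Remove the previous reject file, if any.
rm -f "$TGT_DIR/session.bad"
```

A non-zero exit status from a script like this is what the stop-or-continue setting for pre-session shell command errors reacts to.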
Configure pre- and post-session shell commands in the Components tab of the session properties. Shell commands are operating system commands. The Workflow Manager allows you to choose from the following options when you configure shell commands:
• Create non-reusable shell commands: Create a non-reusable set of shell commands for the session. Other sessions in the folder cannot use this set of shell commands.
• Use an existing reusable Command task: Select an existing Command task to run as the pre- or post-session shell command.

The Workflow Manager also provides a task called the Command task that allows you to specify shell commands anywhere in the workflow.

Configuring Non-Reusable Shell Commands
When you create non-reusable pre- or post-session shell commands, the commands are only visible in the session properties. The Workflow Manager does not create a Command task from non-reusable pre- or post-session shell commands. You have the option to make a non-reusable shell command into a reusable Command task. Or, you can choose a reusable Command task for the pre- or post-session shell commands.

Using Server and Session Variables
You can include any server variable, such as $PMTargetFileDir, or session variables in commands in pre-session and post-session commands. When you use a server variable instead of entering a specific directory, you can run the same workflow on different Informatica Servers without changing session properties. You cannot use server variables or session variables in standalone Command tasks in the workflow. The Informatica Server does not expand server variables or session variables used in standalone Command tasks.

Creating a Reusable Command Task from Pre- or Post-Session Commands
If you create non-reusable pre- or post-session shell commands, you have the option to make them into a reusable Command task. Once you make the pre- or post-session shell commands into a reusable Command task, you cannot revert back.

Pre-Session Shell Command Errors
You can configure the session to stop or continue if a pre-session shell command fails, in the Error Handling settings on the Config Object tab. By default, the Informatica Server stops the session upon shell command errors. If you select Stop, the Informatica Server stops the session, but continues with the rest of the workflow. If you select Continue, the Informatica Server ignores the errors and continues the session.

Constraint-Based Load Ordering in Sessions
When you select constraint-based loading, the Informatica Server orders the target load on a row-by-row basis. For every row generated by an active source, the Informatica Server loads the corresponding transformed row first to the primary key table, then to any foreign key tables. Note: Constraint-based loading does not affect the target load ordering of the mapping. Target load ordering defines the order the Informatica Server reads each source qualifier in the mapping.

Constraint-based loading depends on the following requirements:
• Active source: Related target tables must have the same active source.
• Key relationships: Target tables must have key relationships.
• Target connection groups: Targets must be in one target connection group.
• Treat rows as insert: Use this option when you insert into the target.

Active Source
The following transformations can be an active source within a mapping:
• Source Qualifier
• Normalizer (COBOL or flat file)
• Advanced External Procedure
• Aggregator
• Joiner
• Rank
• Sorter
• Mapplet, if it contains one of the above transformations

Key Relationships
When target tables have no key relationships, the Informatica Server does not perform constraint-based loading. Similarly, when target tables have circular key relationships, the Informatica Server reverts to a normal load. For example, you have one target containing a primary key and a foreign key related to the primary key in a second target. The second target also contains a foreign key that references the primary key in the first target. The Informatica Server cannot enforce constraint-based loading for these tables. It reverts to a normal load.

Target Connection Groups
The Informatica Server enforces constraint-based loading for targets in the same target connection group. If you want to specify constraint-based loading for multiple targets that receive data from the same active source, you must verify the tables are in the same target connection group. If the tables with the primary key-foreign key relationship are in different target connection groups, the Informatica Server cannot enforce constraint-based loading when you run the workflow. To verify that all targets are in the same target connection group, perform the following tasks:
• Verify all targets are in the same target load order group and receive data from the same active source.
• Use the default partition properties and do not add partitions or partition points.
• Define the same target type for all targets in the session properties.
• Define the same database connection name for all targets in the session properties.
• Choose normal mode for the target load type for all targets in the session properties.

Treat Rows as Insert
Use constraint-based loading only when the session option Treat Source Rows As is set to Insert. You might get inconsistent data if you select a different Treat Source Rows As option and you configure the session for constraint-based loading. You cannot use updates with constraint-based loading: do not use constraint-based loading when the mapping used in the session contains Update Strategy transformations, because you must then set the session option Treat Source Rows As to Data Driven. When the mapping contains Update Strategy transformations and you need to load data to a primary key table first, split the mapping to load the primary key table first and the dependent tables second.

To enable constraint-based loading:
1. In the General Options settings of the Properties tab, choose Insert for the Treat Source Rows As property.
2. Click the Config Object tab. On the Advanced settings, select Constraint-Based Load Ordering.

Workspace File Directory
The directory for workspace files created by the Workflow Manager. Workspace files maintain the last task or workflow you saved. By default, the Workflow Manager creates files in the Informatica Client installation directory. This directory should be local to the Informatica Client to prevent file corruption or overwrites by multiple users.

Cache Lookup() Function Property in Sessions
If selected, the Informatica Server caches PowerMart 3.5 LOOKUP functions in the mapping, overriding mapping-level LOOKUP configurations. If not selected, the Informatica Server performs lookups on a row-by-row basis, unless otherwise specified in the mapping.

What are the factors to be considered before configuring the repository environment? <Need to Refer>

Apart from using the ‘Abort’ function to stop a session, what is the other way to stop a session? <Need to Refer>
Difference Between Workflow and Worklet

Worklets
• A worklet is an object that represents a set of tasks. It can contain any task available in the Workflow Manager. You can run worklets inside a workflow. You can also nest a worklet in another worklet.
• The workflow that contains the worklet is called the parent workflow. The worklet executes on the Informatica Server you choose for the workflow.
• The worklet does not contain any scheduling or server information. The Workflow Manager does not provide a parameter file or log file for worklets. The Informatica Server writes information about worklet execution in the workflow log.
• Use the Worklet Designer to create and edit worklets. Create reusable worklets in the Worklet Designer. You can view a list of reusable worklets in the Navigator Worklets node.
• You can create non-reusable worklets in the Workflow Designer as you develop the workflow. Non-reusable worklets only exist in the workflow; you cannot use a non-reusable worklet in another workflow. After you create the worklet in the Workflow Designer, open the worklet to edit it in the Worklet Designer.
• You can promote non-reusable worklets to reusable worklets by selecting the Reusable option in the worklet properties.
• Create a worklet when you want to reuse a set of workflow logic in several workflows. To execute a worklet, include the worklet in a workflow.

When you choose Suspend On Error for the parent workflow, the Informatica Server also suspends the worklet if a task in the worklet fails. When a task in the worklet fails, the Informatica Server stops executing the failed task and other tasks in its path. If no other task is running in the worklet, the worklet status is “Suspended.” If one or more tasks are still running in the worklet, the worklet status is “Suspending.” The Informatica Server suspends the parent workflow when the status of the worklet is “Suspended” or “Suspending.”

Configuring Worklet Properties
When you use a worklet in a workflow, you can configure the same set of general task settings on the General tab as any other task. In addition to general task settings, you can configure the following worklet properties:
• Worklet variables. Use worklet variables to reference values and record information. You use worklet variables the same way you use workflow variables. You can assign a workflow variable to a worklet variable to override its initial value.
• Events. To use the Event-Wait and Event-Raise tasks in the worklet, you must first declare an event in the worklet properties.
• Metadata extension. Extend the metadata stored in the repository by associating information with repository objects.

Declaring Events in Worklets
Similar to workflows, you can use Event-Wait and Event-Raise tasks in a worklet. To use the Event-Raise task, you first declare a user-defined event in the worklet. Events in one instance of a worklet do not affect events in other instances of the worklet. You cannot specify worklet events in the Event tasks in the parent workflow.

Using Worklet Variables
Worklet variables are similar to workflow variables. A worklet has the same set of pre-defined variables as any task. You can also create user-defined worklet variables. Like user-defined workflow variables, user-defined worklet variables can be persistent or non-persistent.

You can use pre-defined worklet variables in the parent workflow, just as you can use pre-defined variables for other tasks in the workflow. However, you cannot use user-defined worklet variables in the parent workflow. Similarly, you cannot use variables from the parent workflow in the worklet.

Persistent Worklet Variables
To create a persistent worklet variable, select Persistent when you create the variable. When you create a persistent worklet variable, the worklet variable retains its value the next time the Informatica Server executes the worklet instance in the parent workflow. Worklet variables only persist when you run the same workflow; a worklet variable does not retain its value when you use instances of the worklet in different workflows.
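The Informatica Server keeps persistent variable values in the repository between runs. As a loose analogy only (the state file and messages below are invented for illustration; this is not how the product stores variables), the read-at-start, save-at-end cycle of a persistent variable looks like this in plain shell:

```shell
#!/bin/sh
# Analogy: a "persistent variable" read at the start of a run,
# used during the run, and saved at the end of a successful run.
STATE_FILE="/tmp/pmdemo_worklet_var.txt"
rm -f "$STATE_FILE"   # pretend this is the first ever run

run_session() {
    if [ -f "$STATE_FILE" ]; then
        VALUE=$(cat "$STATE_FILE")   # saved value from the last run
    else
        VALUE=1                      # declared initial value
    fi
    echo "processing account $VALUE"
    echo $((VALUE + 1)) > "$STATE_FILE"   # save final value for next run
}

run_session   # first run uses the initial value
run_session   # second run picks up where the first left off
```

Overriding the start value in a parameter file corresponds here to writing a chosen number into the state file before the run.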