Informatica Pushdown Optimization Option increases performance by providing the flexibility to push transformation processing to the most appropriate processing resource. With Pushdown Optimization, data transformation logic can be pushed to the source database, to the target database, or left on the PowerCenter server. This lets the ETL architect choose the best of the available resources for data processing.
What is Pushdown Optimization
Pushdown Optimization Option enables data transformation processing to be pushed down into any relational database to make the best use of database processing power. It converts the transformation logic into SQL statements that execute directly on the database. This minimizes the need to move data between servers and leverages the power of the database engine.
How Pushdown Optimization Works
When you run a session configured for pushdown optimization, the Integration Service analyzes the mapping and transformations to determine the transformation logic it can push to the database. If the mapping contains a mapplet, the Integration Service expands the mapplet and treats the transformations in the mapplet as part of the parent mapping. The Integration Service converts the transformation logic into SQL statements and sends them to the source or target database to perform the data transformation. The amount of transformation logic you can push to the database depends on the database, the transformation logic, and the mapping and session configuration.
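As a rough sketch of this idea, the snippet below contrasts the two strategies. The `orders` table and its filter-plus-expression logic are hypothetical, and SQLite stands in for the source database engine; the point is only that the pushed-down SQL produces the same rows the Integration Service would otherwise compute itself.

```python
import sqlite3

# Hypothetical "orders" source table; SQLite stands in for the database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 50), (2, 200), (3, 120)])

# Without pushdown: the Integration Service fetches every source row and
# applies the transformation logic (a filter and an expression) itself.
rows = conn.execute(
    "SELECT order_id, amount FROM orders ORDER BY order_id").fetchall()
no_pushdown = [(oid, amt * 2) for oid, amt in rows if amt > 100]

# With pushdown: the same logic is converted into one SQL statement that
# the database engine executes, so only the result set moves between servers.
pushdown = conn.execute(
    "SELECT order_id, amount * 2 FROM orders WHERE amount > 100 "
    "ORDER BY order_id").fetchall()
```

Both strategies yield the same rows; the difference is where the work happens and how much data has to travel.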
Different Types of Pushdown Optimization
You can configure pushdown optimization in the following ways.
- Source-side pushdown optimization
- Target-side pushdown optimization
- Full pushdown optimization
Source-side pushdown optimization
When you run a session configured for source-side pushdown optimization, the Integration Service analyzes the mapping from the source to the target or until it reaches a downstream transformation it cannot push to the database.
The Integration Service generates a SELECT statement based on the transformation logic for each transformation it can push to the database. When you run the session, the Integration Service pushes all transformation logic that is valid to push to the database by executing the generated SQL statement. Then, it reads the results of this SQL statement and continues to run the session.
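To make the source-side behavior concrete, here is a hedged sketch using SQLite and an invented `sales` table: a hypothetical Filter transformation followed by an Aggregator, both valid to push, collapse into one generated SELECT that runs on the source database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EAST", 100), ("EAST", 300), ("WEST", 50), ("WEST", 400)])

# Hypothetical mapping: Filter (amount > 75) followed by an Aggregator
# (SUM of amount per region). Both are valid to push, so source-side
# pushdown collapses them into a single generated SELECT statement.
generated_sql = """
    SELECT region, SUM(amount) AS total_amount
    FROM sales
    WHERE amount > 75
    GROUP BY region
    ORDER BY region
"""
results = conn.execute(generated_sql).fetchall()
# The Integration Service reads this result set and continues the session.
```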
If you run a session that contains an SQL override or lookup override, the Integration Service generates a view based on the override. It then generates a SELECT statement and runs the SELECT statement against this view. When the session completes, the Integration Service drops the view from the database.
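A minimal sketch of the view mechanism, again with SQLite and invented names (`pm_override_view` is not the actual name PowerCenter generates): the override SQL becomes a view, the generated SELECT runs against that view, and the view is dropped when the session completes.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, active INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "ann", 1), (2, "bob", 0), (3, "eve", 1)])

# Hypothetical lookup override: the override SQL becomes a view ...
conn.execute("""
    CREATE VIEW pm_override_view AS
    SELECT id, UPPER(name) AS name FROM customers WHERE active = 1
""")

# ... and the generated SELECT runs against the view, not the base table.
lookup_rows = conn.execute(
    "SELECT id, name FROM pm_override_view ORDER BY id").fetchall()

# When the session completes, the view is dropped from the database again.
conn.execute("DROP VIEW pm_override_view")
views_left = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE type = 'view'").fetchone()[0]
```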
Target-side pushdown optimization
When you run a session configured for target-side pushdown optimization, the Integration Service analyzes the mapping from the target to the source or until it reaches an upstream transformation it cannot push to the database.
The Integration Service generates an INSERT, DELETE, or UPDATE statement based on the transformation logic for each transformation it can push to the database, starting with the first transformation in the pipeline it can push to the database. The Integration Service processes the transformation logic up to the point that it can push the transformation logic to the target database. Then, it executes the generated SQL.
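The sketch below illustrates the target-side idea with SQLite; the `target` table and the Expression logic are assumptions, not actual PowerCenter output. The Integration Service supplies the rows it has processed upstream, while the pushable logic is folded into the generated INSERT so the target database applies it during the load.

```python
import sqlite3

# SQLite stands in for the target database; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, amount INTEGER)")

# Rows the Integration Service has already processed upstream and now
# sends to the target database.
rows = [(1, 10), (2, 20)]

# Target-side pushdown folds the pushable logic (here, an Expression
# transformation computing amount * 100) into the generated INSERT, so
# the database applies it while loading the rows.
conn.executemany("INSERT INTO target (id, amount) VALUES (?, ? * 100)", rows)

loaded = conn.execute("SELECT id, amount FROM target ORDER BY id").fetchall()
```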
Full pushdown optimization
The Integration Service pushes as much transformation logic as possible to both source and target databases. If you configure a session for full pushdown optimization, and the Integration Service cannot push all the transformation logic to the database, it performs partial pushdown optimization instead.
To use full pushdown optimization, the source and target must be on the same database. When you run a session configured for full pushdown optimization, the Integration Service analyzes the mapping starting with the source and analyzes each transformation in the pipeline until it analyzes the target. It generates SQL statements that are executed against the source and target database based on the transformation logic it can push to the database. If the session contains an SQL override or lookup override, the Integration Service generates a view and runs a SELECT statement against this view.
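Sketched with SQLite and hypothetical `src_orders`/`tgt_orders` tables, full pushdown reduces the whole mapping to a single statement that runs entirely inside the database, since source and target share it.

```python
import sqlite3

# Full pushdown requires source and target in the same database, so one
# statement can move data end to end. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (order_id INTEGER, amount INTEGER)")
conn.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?)",
                 [(1, 40), (2, 90), (3, 150)])

# The whole mapping (source qualifier -> filter -> target) collapses into
# one INSERT ... SELECT executed inside the database; no rows flow
# through the Integration Service at all.
conn.execute("""
    INSERT INTO tgt_orders (order_id, amount)
    SELECT order_id, amount FROM src_orders WHERE amount >= 90
""")
loaded = conn.execute(
    "SELECT order_id, amount FROM tgt_orders ORDER BY order_id").fetchall()
```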
Configuring Session for Pushdown Optimization
A session can be configured to use pushdown optimization from the Informatica PowerCenter Workflow Manager. Open the session and choose Source, Target, or Full pushdown optimization, as shown in the image below.
You can additionally choose a few options to control how the Integration Service converts the data transformation logic into SQL statements. The screenshot below shows the available options.
- Allow Temporary View for Pushdown. Allows the Integration Service to create temporary view objects in the database when it pushes the session to the database.
- Allow Temporary Sequence for Pushdown. Allows the Integration Service to create temporary sequence objects in the database.
- Allow Pushdown for User Incompatible Connections. Indicates that the database user of the active database has read permission on the idle databases.
Using Pushdown Optimization Viewer
Use the Pushdown Optimization Viewer to examine the transformations that can be pushed to the database. Select a pushdown option or pushdown group in the Pushdown Optimization Viewer to view the corresponding SQL statement that is generated for the specified selections.
You can invoke the viewer from the highlighted 'Pushdown Optimization' option, as shown in the image below.
The Pushdown Optimization Viewer pops up in a new window and shows how the Integration Service converts the data transformation logic into SQL statements for a particular mapping. When you select a pushdown option or pushdown group in the viewer, you do not change the pushdown configuration. To change the configuration, you must update the pushdown option in the session properties.
Things to Consider before Using Pushdown Optimization
When you run a session configured for full pushdown optimization and the session processes a large quantity of data, the database must run a long transaction. Consider the following database performance issues when you generate a long transaction.
- A long transaction uses more database resources.
- A long transaction locks the database for longer periods of time, which reduces database concurrency and increases the likelihood of deadlock.
- A long transaction can increase the likelihood that an unexpected event may occur.