Getting an "Invalid hexadecimal constant" error when querying a BYTE column with 'value'XB? Make sure there are no dashes in the value - switch them off in Options > Data Format > Display dashes between bytes for Binary data values 👍
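A minimal sketch of the failing pattern, assuming Teradata byte-literal syntax and a hypothetical table/column:

```sql
-- 'hexdigits'XB is a Teradata byte literal; it must contain hex digits only.
-- A value copied with display dashes, e.g. '0A-2B'XB, raises the error above.
SELECT *
FROM my_table
WHERE byte_col = '0A2B'XB;
```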

Target Load Type is important: Bulk cannot work if the target object has indexes (the database fails with "ORA-26002: Table has index defined upon it"), but it is faster because no logging occurs - at the cost of limited recoverability. Normal mode works with indexed target objects.
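One common workaround, sketched with hypothetical index/table names: drop the index before the bulk session and recreate it afterwards.

```sql
-- Bulk (direct-path) load fails with ORA-26002 while this index exists:
DROP INDEX target_table_idx;

-- ... run the session with Target Load Type = Bulk ...

-- Recreate the index once the load completes:
CREATE INDEX target_table_idx ON target_table (key_col);
```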

Adding derived ports via a Source Qualifier SQL query override fails with "The Source Qualifier contains an unbound field" - SQ ports must always be connected to a Source Definition. Workaround: create a dummy source table with corresponding data types (important!) and connect its columns to the SQ ports that the SQL override derives. Works smoothly 👍

Load multiple files (same structure) at once by providing a file list: 1) develop the mapping with one of the files as the source definition; 2) create workflow variable $$LoadFileName (do not make it a mapping parameter!); 3) put the list of files to load into indirect.txt (can be automated with Bash); 4) add $$LoadFileName=indirect.txt to the parameter file; 5) at session level set Source filetype=Indirect and Source filename=$$LoadFileName. All files listed in indirect.txt will be loaded 👍
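Steps 3 and 4 can be automated with a few lines of Bash - a sketch with example paths and file names:

```shell
# Example source files sharing one structure (paths are hypothetical).
mkdir -p /tmp/pc_incoming
touch /tmp/pc_incoming/sales_01.csv /tmp/pc_incoming/sales_02.csv

# indirect.txt: one absolute file path per line -- the list the session reads.
ls /tmp/pc_incoming/sales_*.csv > /tmp/indirect.txt

# Point the workflow variable at the list in the parameter file.
echo '$$LoadFileName=/tmp/indirect.txt' > /tmp/wf_params.prm
```

Single quotes around the echo argument keep the shell from expanding `$$` (the shell's own PID) so the literal variable name lands in the parameter file.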

If you get an "ORA-28040: No matching authentication protocol" error when importing a v12 table into metadata using the DataDirect 7.1 Oracle Wire Protocol driver, make sure SQLNET.ALLOWED_LOGON_VERSION_SERVER=11 and SQLNET.ALLOWED_LOGON_VERSION_CLIENT=11 are set in the database config. kb.informatica.com/solution/23
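The settings go into sqlnet.ora on the database server:

```
# sqlnet.ora -- allow 11g-style authentication so older
# client drivers (e.g. DataDirect 7.1 Wire Protocol) can connect
SQLNET.ALLOWED_LOGON_VERSION_SERVER=11
SQLNET.ALLOWED_LOGON_VERSION_CLIENT=11
```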

Pay special attention to ampersands in database queries - even inside a string you need to escape them: either split and concatenate, 'string with &' || ' ampersand', or substitute & with chr(38).
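Both variants, sketched in Oracle syntax (DUAL used only for illustration):

```sql
-- Split the literal so '&' is not followed by an identifier:
SELECT 'string with &' || ' ampersand' FROM dual;

-- Or build the ampersand from its character code, chr(38):
SELECT 'string with ' || chr(38) || ' ampersand' FROM dual;
```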

Even though creating and dropping views in Teradata is a bit expensive, they are mandatory when doing a pushdown UPDATE - otherwise the following error is displayed: "Full pushdown optimization stops at the target because update on Teradata requires a view, and view creation is not enabled." Enable the session option that allows temporary views for pushdown.

To load large files into MySQL use LOAD DATA INFILE. Best case is to have the file uploaded onto the filesystem of the MySQL server and use plain LOAD DATA INFILE; if the file sits on the client machine, use LOAD DATA LOCAL INFILE instead. Make sure to check the field and line termination characters in the file in advance.
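A sketch with example paths and names, for a server-side file (hence no LOCAL keyword):

```sql
-- Check the file's terminators first, then match them here.
LOAD DATA INFILE '/var/lib/mysql-files/data.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip a header row if the file has one
```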

When developing a mapping that has several targets, it is important to select the session setting "Treat source rows as: Data driven" so that each target gets the corresponding operation (DD_INSERT, DD_UPDATE, DD_DELETE) applied as set in its Update Strategy transformation. When Data driven is selected, the mapping must contain Update Strategy transformations for pushdown optimization to work.
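With Data driven, each row's operation comes from an Update Strategy expression - a sketch with hypothetical port names, where lkp_key is returned by a Lookup on the target:

```
-- Update Strategy expression: new rows insert, existing rows update
IIF(ISNULL(lkp_key), DD_INSERT, DD_UPDATE)
```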

Efficient Slowly Changing Dimension type 1 in Informatica: separate the INSERT detection port (record doesn't exist in the target) from the UPDATE port (check all columns for changes) and evaluate them separately in Filter transformations. Benefit: the generated INSERT query will not carry the complex WHERE condition that the UPDATE uses.
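A pseudocode sketch of the two detection ports (all port names hypothetical; lkp_* come from a Lookup on the target):

```
-- Expression transformation: two separate flag ports
v_insert_flag = IIF(ISNULL(lkp_key), 1, 0)
v_update_flag = IIF(NOT ISNULL(lkp_key)
                    AND (src_col1 != lkp_col1 OR src_col2 != lkp_col2), 1, 0)

-- Insert branch: Filter on v_insert_flag = 1 -> Update Strategy DD_INSERT
-- Update branch: Filter on v_update_flag = 1 -> Update Strategy DD_UPDATE
```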

Sadly: "Full pushdown optimization stops at the target [target] because update is not supported in the presence of the joiner or lookup." Informatica's solution: replace Joiner transformations with SQL overrides in Source Qualifiers and it will work. Some say Update Strategy doesn't work with pushdown optimization, but actually it does: both DD_INSERT and DD_UPDATE 👍 When pushdown is done on the target side you can use a Joiner, though.
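The Joiner-to-override replacement means joining directly in the Source Qualifier's SQL query - a sketch with example table and column names:

```sql
-- SQL override in the Source Qualifier, replacing a Joiner transformation:
SELECT o.order_id, o.amount, c.customer_name
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
```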

How to convert DECIMAL into VARCHAR in Teradata when the part after the decimal separator is not needed and you don't want a trailing dot at the end of the string? Use CAST(CAST(9.00 AS FORMAT 'Z(I)') AS VARCHAR(50))

Mastodon.lt

Mastodon node of Lithuanian software engineer Gytis Repečka.