As far as I understand it, the data in a temporary table is bound to a single Oracle session. You cannot access its contents from a different Oracle session.
Each SQ, each LKP, each SQL transformation, and each SP gets its own DB session when a PowerCenter session runs. So indeed you cannot access data from a different Oracle session than the one in which it was inserted, e.g. via a SP.
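To illustrate the session binding: here is a minimal sketch of the behavior, using a hypothetical GTT named TMP_DEMO (the name and columns are examples, not from this thread).

```sql
-- Hypothetical table for illustration only.
CREATE GLOBAL TEMPORARY TABLE TMP_DEMO (
    id  NUMBER,
    txt VARCHAR2(100)
) ON COMMIT PRESERVE ROWS;

-- Session 1 (e.g. one PowerCenter connection):
INSERT INTO TMP_DEMO VALUES (1, 'visible only here');
COMMIT;
SELECT COUNT(*) FROM TMP_DEMO;   -- session 1 sees its own row

-- Session 2 (e.g. SQL*Plus, or another PowerCenter connection):
SELECT COUNT(*) FROM TMP_DEMO;   -- returns 0: GTT data is session-private
```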
Why is it necessary to load the data into a temp table first and only later write it to the "real" target table? To ensure that data sets are complete?
If that's the reason, there are better ways to ensure that. These other ways MAY look more complex than your idea (which might run fine under normal circumstances), but in the end they will probably make maintenance easier.
Thanks Nico for the explanation. As you mentioned, each SQ, each LKP, each SQL transformation, and each SP gets its own DB session when running a PowerCenter session.
I am calling the proc and inserting the data into the target table from the temp tables, and all of this is done in a SQL transformation (script mode): I put the 2 statements in a single SQL file and passed it to the script name column of the SQL transformation. I guess it is the same DB session that PowerCenter uses to run the proc and to insert the data from the temp table into the target table. Please correct me if I am wrong.
Your SQL file must look like this:
insert into TABLE_TEMP ...;
insert into TABLE_TGT select ... from TABLE_TEMP ... ;
If you put an additional commit between the inserts, then your DDL for TABLE_TEMP must contain
ON COMMIT PRESERVE ROWS;
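For reference, here is a sketch of what such DDL would look like; the real column list of TABLE_TEMP is not shown in this thread, so the columns below are placeholders.

```sql
-- Sketch only; the actual columns of TABLE_TEMP are not known here.
CREATE GLOBAL TEMPORARY TABLE TABLE_TEMP (
    col1 NUMBER,
    col2 VARCHAR2(50)
) ON COMMIT PRESERVE ROWS;
-- The default is ON COMMIT DELETE ROWS, which empties the table
-- at every COMMIT; PRESERVE ROWS keeps the data until the session ends.
```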
Correct; if you execute the statements from a SQL Transformation, they are executed via the same Oracle session.
Please be aware of two points:
First, as mentioned by Vlad, the temp table must be created "correctly".
Second, you have to make sure that the undo / redo logs are large enough for the whole load (and trust me, quite a few DBAs will want to kill you for such a design).
Here are the SQL statements that get executed from the SQL Transformation:
call ESP_CR_ALTERED_ROWS(i_TableNm => table_nm, i_StartDate => start_date,i_EndDate => end_date, i_NoDuplicates => true);
insert into TABLE_TGT select * from TABLE_TEMP;
The first statement invokes the procedure ESP_CR_ALTERED_ROWS, which in turn inserts the data into the global temp table TABLE_TEMP (ON COMMIT PRESERVE ROWS).
The second statement reads the data from TABLE_TEMP and loads it into TABLE_TGT.
When I verified the data of TABLE_TEMP in the database after running the workflow, it was empty.
When I ran the proc directly in the database, the temp table had data. I don't have any control over these global temp tables as they are owned by a different team, and I can't change them to normal tables. Please advise.
You can never view data that a workflow generated in a GTT (global temporary table) from another DB tool (SQL*Plus, SQL Developer, etc.). That is one of the main features of a GTT: each session sees only its own rows.
If TABLE_TGT has no new records, add some debug rows to the procedure (or a clone of it) to be sure that it works correctly without errors or exceptions. Maybe the reason lies in the datatypes of the input parameters, maybe in privileges, and so on.
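One hedged way to add such debug rows: write into a permanent log table that you create yourself (DEBUG_LOG below is a hypothetical name), since a normal heap table survives the session and can be inspected after the workflow run.

```sql
-- Hypothetical helper table: permanent, so it is visible from any session.
CREATE TABLE DEBUG_LOG (
    logged_at TIMESTAMP DEFAULT SYSTIMESTAMP,
    msg       VARCHAR2(4000)
);

-- Inside the procedure (or its clone), right after the INSERT into TABLE_TEMP:
--
--   INSERT INTO DEBUG_LOG (msg)
--   VALUES ('rows inserted into TABLE_TEMP: ' || TO_CHAR(SQL%ROWCOUNT));
--
-- SQL%ROWCOUNT here is the standard PL/SQL attribute holding the row count
-- of the most recent DML statement.
```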
We have the same problem. Can you let me know how you solved this issue?
With all due respect, isn't it all available in this thread already?
What information are you missing?
There is no "resolution", you have to redesign your approach if you want to do it with PowerCenter and you insist on using a temp table. Vlad has explained how to do it.