1 Reply Latest reply on Feb 15, 2019 6:44 AM by Christian Brudevoll

    How can I avoid bulk load failing after 100 rows?

    Christian Brudevoll Active Member

      We use Powercenter 10.2.0 to load data between Sybase ASE databases.

      A mapping I've created spends almost 100% of its run time writing to a target table.

      To improve performance I want to change from normal to bulk load.

      But the bulk load fails after inserting 100 rows in one of the tables.

      The table definitions are identical in the database and in PowerCenter, and all indexes have been removed.

      For testing I've put 2000 rows in a temp table, and I load from it into the real table in another simple mapping.

      There I've tested several different reading sort orders, but the load always fails after 100 rows.

      Therefore I assume the data itself is not the source of the problem.
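      For reference, the isolation test described above is essentially the following copy, here sketched in Sybase T-SQL (all object and column names are placeholders, not the real schema):

      ```sql
      -- Hypothetical names; adjust to the real schema.
      -- Stage 2000 rows into a temp table.
      set rowcount 2000
      select *
      into tempdb..tmp_load_test
      from source_table
      set rowcount 0

      -- The second, simple mapping amounts to this insert,
      -- with the read order varied between test runs.
      insert into target_table
      select * from tempdb..tmp_load_test
      order by some_col
      ```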

      Does anybody have any tips?


      From a log-file:

      Message: Start loading table [XXXX] at: Wed Feb 13 14:01:59 2019

      Message: Database errors occurred:

      Wed Feb 13 14:01:59 2019  SQL Server Message 5701 : Changed database context to 'YYYY'.

      Database driver error...Row transfer failed

      Database driver error...

      Function Name : Execute Multiple

      SQL Stmt : XXXX

      Message: ERROR: Writer execution failed.
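
      A note on the log: SQL Server Message 5701 ("Changed database context") is purely informational, so the real failure is the driver-level "Row transfer failed". One thing that may be worth checking, purely as an assumption since the log does not show the cause: Sybase ASE only permits fast bulk copy into a database when the "select into/bulkcopy/pllsort" option is enabled, and triggers on the target table also prevent fast bcp. A quick check from isql could look like this ('YYYY' and 'XXXX' being the placeholders from the log above):

      ```sql
      -- Run in master: list the database's options. For fast bcp,
      -- "select into/bulkcopy/pllsort" should appear in the status column.
      sp_helpdb YYYY

      -- If it is missing, enable it (requires sa/dbo) and checkpoint:
      sp_dboption YYYY, 'select into/bulkcopy/pllsort', true
      use YYYY
      checkpoint

      -- Objects depending on the target table (including any triggers):
      sp_depends XXXX
      ```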