My suspicion is that the target table in Oracle has been created with the parameter NLS_LENGTH_SEMANTICS set to BYTE instead of CHAR. This means that a VARCHAR2(4000) column can hold at most 4000 bytes, not 4000 characters.
Now consider a string which contains more than 4000 characters. The SUBSTR() function will cut off everything after character #4000, but if any of the first 4000 characters needs more than one byte to store (e.g. one of the German umlauts ä, ö, ü, or ß, or any Cyrillic, Greek, or accented French character), then those 4000 characters will need more than 4000 bytes of storage.
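You can see the difference directly with Oracle's built-in functions: LENGTH() counts characters while LENGTHB() counts bytes. A minimal sketch, assuming an AL32UTF8 database character set (where umlauts take two bytes each):

```sql
-- In an AL32UTF8 database, each umlaut occupies 2 bytes.
SELECT LENGTH('Grüße')  AS char_count,   -- 5 characters
       LENGTHB('Grüße') AS byte_count    -- 7 bytes: ü and ß take 2 bytes each
FROM   dual;
```

With BYTE semantics it is the byte count that is checked against the column limit, which is why a string of exactly 4000 characters can still be rejected.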
How can you remedy that?
The easiest remedy usually is to recreate the table with the definition showing that you want to store up to 4000 characters, not 4000 bytes, such as in this example:
CREATE TABLE foobla
( id    NUMBER              NOT NULL,
  key   VARCHAR2(  40 CHAR) NOT NULL,
  value VARCHAR2(4000 CHAR) NOT NULL
);
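If the table already contains data, you do not have to drop and recreate it. A sketch, using the table and column names from the CREATE TABLE above, that switches the semantics in place:

```sql
-- Change the existing columns from BYTE to CHAR semantics.
ALTER TABLE foobla MODIFY (key   VARCHAR2(  40 CHAR),
                           value VARCHAR2(4000 CHAR));
```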
Thank you for your reply.
I was under the same impression, so I checked it two or three times and confirmed that the field is defined as 4000 CHAR, not 4000 BYTE.
Thanks & Regards,
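For the record, the semantics actually in effect can be read from the data dictionary: USER_TAB_COLUMNS has a CHAR_USED column that shows 'C' for character semantics and 'B' for byte semantics (the table name FOOBLA below is just a placeholder for the real target table):

```sql
-- CHAR_USED = 'C' means character semantics, 'B' means byte semantics.
SELECT column_name, data_length, char_length, char_used
FROM   user_tab_columns
WHERE  table_name = 'FOOBLA';
```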
Could you please check which records are rejected for being too long?
In particular, which character set (code page) does your database run in, and what data is in those records?
What I have in mind is that your data may be e.g. in a Shift-JIS encoding (where characters may consume more than one byte) and that the DBMS can't handle this particular encoding in all cases.
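Both questions can be answered with two quick queries, assuming you can run them against the target database and the data is staged in some table first (stage_table and value are placeholder names here):

```sql
-- Which character set does the database use?
SELECT value
FROM   nls_database_parameters
WHERE  parameter = 'NLS_CHARACTERSET';

-- Which rows would exceed 4000 bytes, even if they are <= 4000 characters?
SELECT *
FROM   stage_table
WHERE  LENGTHB(value) > 4000;
```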
It would be interesting to see whether an INSERT statement issued manually (for example, via SQL*Plus, TOAD, SQL Developer, or some other tool) with the same data that was rejected in your session would work.
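Such a manual test could look like this, with one of the rejected strings pasted in as the value (table and column names are taken from the earlier CREATE TABLE example):

```sql
-- Try the same data by hand; if this fails with ORA-12899
-- ("value too large for column"), the column really cannot hold
-- the value, regardless of the load tool used.
INSERT INTO foobla (id, key, value)
VALUES (1, 'test', '...one of the rejected strings...');
COMMIT;
```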
Other than that I only have a weak suspicion which I can't check myself:
Does Oracle handle VARCHAR2(4000 CHAR) correctly in all cases? Or could those 4000 bytes be a kind of hard-coded limit implemented in Oracle? Meaning this particular behaviour MAY be a bug in Oracle or, in your case, a "feature" whose limit cannot be stretched the way you are trying to.
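For what it is worth, that limit does exist: up to Oracle 11g, VARCHAR2 is capped at 4000 bytes even when the column is declared with CHAR semantics. From 12c onwards the cap can be raised to 32767 bytes via extended data types. Whether that is enabled can be checked like this (requires access to V$PARAMETER):

```sql
-- STANDARD = 4000-byte cap, EXTENDED = 32767-byte cap (12c and later).
SELECT value
FROM   v$parameter
WHERE  name = 'max_string_size';
```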