
    Redshift JDBC Connection

    Premkumar S Seasoned Veteran

      Hi Gurus,

       

      I'm trying to write data from Hive tables to Redshift. When I use the native connection, I have to create a data object write operation for every target before I can use any of them in my dynamic mapping.

       

      So I thought of creating a JDBC connection instead. I created one, and the dynamic mapping worked fine when I ran it in native mode. But when I run the same mapping on the Spark engine, I get the error "Caused by: com.amazon.support.exceptions.ErrorException: [Amazon](500310) Invalid operation: permission denied for schema public;".
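
      My guess is that the database user the Spark job connects as doesn't have privileges on the public schema. A minimal sketch of the grants I'd expect to be needed, run as a superuser over plain JDBC (the endpoint, credentials, and spark_user are all placeholders; the Redshift JDBC driver jar must be on the classpath):

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.Statement;

      public class GrantPublicSchema {
          public static void main(String[] args) throws Exception {
              // Placeholder endpoint and credentials -- replace with real values.
              String url = "jdbc:redshift://example.cluster.redshift.amazonaws.com:5439/dev";
              try (Connection conn = DriverManager.getConnection(url, "admin_user", "admin_password");
                   Statement stmt = conn.createStatement()) {
                  // Let the user the Spark/Sqoop job connects as (placeholder name)
                  // use, create in, and write to the public schema.
                  stmt.executeUpdate("GRANT USAGE ON SCHEMA public TO spark_user");
                  stmt.executeUpdate("GRANT CREATE ON SCHEMA public TO spark_user");
                  stmt.executeUpdate(
                      "GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO spark_user");
              }
          }
      }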

       

      Can anyone help me here?

       

      Thanks

      Prem

        • 1. Re: Redshift JDBC Connection
          Pierre Martignon Seasoned Veteran

          Hi Prem,

           

          With 10.2.2, Spark supports a (simple) JDBC connection only with an Update Strategy; anything else requires Sqoop JDBC.

          When a dedicated connector exists (as it does for Redshift), that connector is the supported way to access the source/target, and the generic JDBC connection is not certified.

           

          Regards,

          Pierre

          • 2. Re: Redshift JDBC Connection
            Premkumar S Seasoned Veteran

            Hi Pierre,

             

            Thanks for the input. I used "RedshiftJDBC4-1.2.16.1027.jar" and created a JDBC connection with Sqoop. That worked as well, but below are the issues I'm facing:

             

            1. When the Redshift target has column names that must be enclosed in double quotes (e.g. "primary"), execution fails with an "Invalid relation" error (see the sketch after this list).

            2. While loading a table with date columns into Redshift, I get a date value error (the same sketch covers the date binding).
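
            For reference, a minimal plain-JDBC sketch of both cases as I understand them; the table name, column names, endpoint, and credentials are placeholders. "primary" is a reserved word in Redshift, so any SQL referencing that column has to double-quote it, and binding dates as java.sql.Date sidesteps literal-format problems:

            import java.sql.Connection;
            import java.sql.Date;
            import java.sql.DriverManager;
            import java.sql.PreparedStatement;

            public class QuotedColumnAndDateInsert {
                public static void main(String[] args) throws Exception {
                    // Placeholder endpoint and credentials -- replace with real values.
                    String url = "jdbc:redshift://example.cluster.redshift.amazonaws.com:5439/dev";
                    try (Connection conn = DriverManager.getConnection(url, "db_user", "db_password")) {
                        // "primary" is reserved, so the generated SQL must quote it;
                        // left unquoted, Redshift rejects the statement.
                        String sql = "INSERT INTO my_table (\"primary\", load_date) VALUES (?, ?)";
                        try (PreparedStatement ps = conn.prepareStatement(sql)) {
                            ps.setString(1, "some value");
                            // Binding a java.sql.Date avoids hand-built string literals,
                            // which must match Redshift's expected format (YYYY-MM-DD by default).
                            ps.setDate(2, Date.valueOf("2019-10-24"));
                            ps.executeUpdate();
                        }
                    }
                }
            }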

             

            Any input on the above errors?

             

            Thanks

            Prem

            • 3. Re: Redshift JDBC Connection
              Pierre Martignon Seasoned Veteran

              Hi Prem,

               

              As explained, this is not the supported connector for this source/target, so issues are to be expected.

              If you can get a Sqoop job of some sort to run from the Sqoop command line but cannot get the same job to run through Informatica, please raise a ticket with support, including the details, and we'll endeavour to assist.
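
              For example, a standalone export along these lines (the endpoint, credentials, table, and HDFS path are placeholders; the flags shown are standard Sqoop export options) would exercise the same driver outside Informatica:

              sqoop export \
                  --driver com.amazon.redshift.jdbc4.Driver \
                  --connect jdbc:redshift://example.cluster.redshift.amazonaws.com:5439/dev \
                  --username db_user --password db_password \
                  --table my_table \
                  --export-dir /user/hive/warehouse/my_table

              If that command succeeds and the Informatica run of the equivalent job does not, include both outcomes in the ticket.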

               

              Regards,

              Pierre