5 Replies · Latest reply on Mar 23, 2021 7:51 AM by Dan Kampschroer

    Salesforce Analytics to .CSV via IICS

    Chaitanya Polisetty New Member

      Hi Folks,

       

      I am trying to load data from a Salesforce Einstein Analytics dataset into a .CSV file using IICS.

      I am therefore using the properties below.

       

      Source: Salesforce Analytics (Informatica Cloud)

      and placed xxx_schema.JSON in the running agent path (this schema file is the same file that was generated earlier, when I was loading data from a DB into the SF Analytics dataset)

       

      Target: xxxxxx.CSV under an accessible path.

       

      The same connection worked absolutely fine when I used it as a target to load the dataset, but it is not working as a source to read the dataset data into a CSV.

      All my connections, security tokens, URLs, and passwords are correct. Even then, the synchronization task fails with the error below.

       

      [ERROR] java.lang.NullPointerException

      How can I fix this, or is there any possible solution?

       

      Regards,

      Chaitanya

        • 1. Re: Salesforce Analytics to .CSV via IICS
          Jeffline Jenisha W Active Member

          Hi Chaitanya,

           

          The error message alone would not be sufficient to identify the cause of the issue. Could you please share the session log generated for the task?

           

          Regards,

          Jeffline

          • 2. Re: Salesforce Analytics to .CSV via IICS
            Chaitanya Polisetty New Member

            ========================PFB Session Log===============================

             

            Task Name: NAImpIDCardPrints_Extract

            Agent Group Id: 000Q1J2500000000000B

            Agent Group Name: IP-16AE6C11

            Agent Id: 000Q1J08000000000007

            Agent Name: IP-16AE6C11

            02/24/2021 11:54:26 **** Importing Connection: SrcConn_000Q1J0B00000000007I ...

            02/24/2021 11:54:26 **** Importing Connection: TargConn_000Q1J0B00000000007C ...

            02/24/2021 11:54:26 **** Importing Source Definition: NAImpIDCardPrints ...

            02/24/2021 11:54:26 **** Importing Target Definition: NAImpIDCardPrints ...

            02/24/2021 11:54:26 **** Importing SessionConfig: cfg_s_dss_000Q1J0I0000000001QW ...

                <Warning> :  The Error Log DB Connection value should have Relational: as the prefix.

                <Warning> :  Invalid value  for attribute Error Log DB Connection. Will use the default value

                Validating Source Definition  NAImpIDCardPrints...

                Validating Target Definition  NAImpIDCardPrints...

            02/24/2021 11:54:26 **** Importing Mapping: m_dss_000Q1J0I0000000001QW ...

            Validating transformations of mapping m_dss_000Q1J0I0000000001QW...

            Validating mapping variable(s).

                <Warning> :  Invalid table attribute: Exec SQL

            02/24/2021 11:54:26 **** Importing Workflow: wf_dss_000Q1J0I0000000001QW ...

                <Warning> :  The value entered is not a valid integer.

                <Warning> :  Invalid value NO for attribute Fail task after wait time. Will use the default value

            [Summary for Application Source Qualifier SQ_NAImpIDCardPrints]

                 Is a new repartition point.

                 Partition type changed from None to Pass Through.

            Successfully extracted session instance [s_dss_000Q1J0I0000000001QW].  Starting repository sequence id is [1300586911]

            ======================================================================

            DIRECTOR> VAR_27085 [2021-02-24 11:54:26.824] Parameter file [C:\Program Files\Informatica Cloud Secure Agent\apps\Data_Integration_Server\data\parameters\s_dss_000Q1J0I0000000001QW.param] is opened for [session [wf_dss_000Q1J0I0000000001QW.s_dss_000Q1J0I0000000001QW]].

            DIRECTOR> VAR_27062 [2021-02-24 11:54:26.824] Warning! Cannot find section for worklet [wf_dss_000Q1J0I0000000001QW] and folder [] in parameter file [C:\Program Files\Informatica Cloud Secure Agent\apps\Data_Integration_Server\data\parameters\s_dss_000Q1J0I0000000001QW.param].

            DIRECTOR> VAR_27028 [2021-02-24 11:54:26.824] Use override value [s_dss_000Q1J0I0000000001QW_6_error.csv] for user-defined workflow/worklet variable:[$ErrorFileName].

            DIRECTOR> VAR_27028 [2021-02-24 11:54:26.824] Use override value ['1970-01-01'] for user-defined workflow/worklet variable:[$LastRunDate].

            DIRECTOR> VAR_27028 [2021-02-24 11:54:26.824] Use override value ['1970-01-01 00:00:00'] for user-defined workflow/worklet variable:[$LastRunTime].

            DIRECTOR> VAR_27028 [2021-02-24 11:54:26.824] Use override value [s_dss_000Q1J0I0000000001QW_2_24_2021_11_54_success.csv] for user-defined workflow/worklet variable:[$SuccessFileName].

            DIRECTOR> TM_6014 [2021-02-24 11:54:26.824] Initializing session [s_dss_000Q1J0I0000000001QW] at [Wed Feb 24 11:54:26 2021].

            DIRECTOR> TM_6683 [2021-02-24 11:54:26.824] Repository Name: [XMLRepository]

            DIRECTOR> TM_6684 [2021-02-24 11:54:26.824] Server Name: [rDTM]

            DIRECTOR> TM_6686 [2021-02-24 11:54:26.824] Folder: []

            DIRECTOR> TM_6685 [2021-02-24 11:54:26.824] Workflow: [wf_dss_000Q1J0I0000000001QW] Run Instance Name: [] Run Id: [0]

            DIRECTOR> TM_6101 [2021-02-24 11:54:26.824] Mapping name: m_dss_000Q1J0I0000000001QW.

            DIRECTOR> TM_6964 [2021-02-24 11:54:26.824] Date format for the Session is [MM/DD/YYYY HH24:MI:SS]

            DIRECTOR> TM_6703 [2021-02-24 11:54:26.824] Session [s_dss_000Q1J0I0000000001QW] is run by 64-bit Integration Service  [], version [9.1.0 HotFix1], build [0627].

            MANAGER> PETL_24058 [2021-02-24 11:54:28.355] Running Partition Group [1].

            MANAGER> PETL_24000 [2021-02-24 11:54:28.355] Parallel Pipeline Engine initializing.

            MANAGER> PETL_24001 [2021-02-24 11:54:28.355] Parallel Pipeline Engine running.

            MANAGER> PETL_24003 [2021-02-24 11:54:28.355] Initializing session run.

            MAPPING> CMN_1569 [2021-02-24 11:54:28.355] Server Mode: [UNICODE]

            MAPPING> CMN_1570 [2021-02-24 11:54:28.355] Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]

            MAPPING> TM_6151 [2021-02-24 11:54:28.355] The session sort order is [Binary].

            MAPPING> TM_6185 [2021-02-24 11:54:28.355] Warning. Code page validation is disabled in this session.

            MAPPING> CMN_65083 [2021-02-24 11:54:28.355] Current Timezone:[UTC +0:0]

            MAPPING> CMN_65085 [2021-02-24 11:54:28.355] Current Process ID:[5556]

            MAPPING> TM_6155 [2021-02-24 11:54:28.355] Using HIGH precision processing.

            MAPPING> TM_6180 [2021-02-24 11:54:28.355] Deadlock retry logic will not be implemented.

            MAPPING> TM_6181 [2021-02-24 11:54:28.355] Session source-based commit interval is [5000].

            MAPPING> PMJVM_42020 [2021-02-24 11:54:28.371] [INFO] Loaded library : C:\PROGRA~1\INFORM~1\apps\jdk\1.8.0_252_SA\jre\bin\server\jvm.dll.

            MAPPING> PMJVM_42014 [2021-02-24 11:54:28.371] [DEBUG] The value specified for option JVMOption1 is not valid and will be ignored.

            MAPPING> PMJVM_42014 [2021-02-24 11:54:28.371] [DEBUG] The value specified for option JVMOption2 is not valid and will be ignored.

            MAPPING> PMJVM_42014 [2021-02-24 11:54:28.371] [DEBUG] The value specified for option JVMOption3 is not valid and will be ignored.

            MAPPING> PMJVM_42014 [2021-02-24 11:54:28.371] [DEBUG] The value specified for option JVMOption4 is not valid and will be ignored.

            MAPPING> PMJVM_42014 [2021-02-24 11:54:28.371] [DEBUG] The value specified for option JVMOption5 is not valid and will be ignored.

            MAPPING> PMJVM_42009 [2021-02-24 11:54:28.888] [INFO] Created Java VM successfully.

            MAPPING> SDKS_38029 [2021-02-24 11:54:29.462] Loaded plug-in 507900: [plugin description].

            MAPPING> SDKS_38024 [2021-02-24 11:54:29.556] Plug-in 507900 initialization complete.

            MAPPING> SDKS_38016 [2021-02-24 11:54:29.556] Reader SDK plug-in intialization complete.

            MAPPING> TM_6308 [2021-02-24 11:54:29.556] DTM error log using code page UTF-8.

            MAPPING> TM_6310 [2021-02-24 11:54:29.556] DTM error log enabled at file [D:\Program Files\Informatica\error\s_dss_000Q1J0I0000000001QW_6_error.csv].

            MAPPING> TM_6682 [2021-02-24 11:54:29.556] Warning: Since DTM Error Logging is enabled, row level errors will not be written to the session log. Please check the DTM Error Log for any row level errors.

            MAPPING> TM_6701 [2021-02-24 11:54:29.556] Warning: The data column delimiter for DTM Error logging and the flat file delimiter are the same [','].

            MAPPING> TE_7022 [2021-02-24 11:54:29.556] TShmWriter: Initialized

            MAPPING> DBG_21491 [2021-02-24 11:54:29.649] Source-based commit interval based on rows from Source Qualifier [SQ_NAImpIDCardPrints]

            MAPPING> DBG_21244 [2021-02-24 11:54:29.649] Initializing Load Order Group [1]. Source Qualifier is [SQ_NAImpIDCardPrints]

            MAPPING> TM_6007 [2021-02-24 11:54:29.649] DTM initialized successfully for session [s_dss_000Q1J0I0000000001QW]

            DIRECTOR> PETL_24033 [2021-02-24 11:54:29.649] All DTM Connection Info: [<NONE>].

            MANAGER> PETL_24004 [2021-02-24 11:54:29.649] Starting pre-session tasks. : (Wed Feb 24 11:54:29 2021)

            MANAGER> PETL_24027 [2021-02-24 11:54:29.743] Pre-session task completed successfully. : (Wed Feb 24 11:54:29 2021)

            DIRECTOR> PETL_24006 [2021-02-24 11:54:29.743] Starting data movement.

            MAPPING> TM_6660 [2021-02-24 11:54:29.743] Total Buffer Pool size is 7767040 bytes and Block size is 429280 bytes.

            READER_1_1_1> AnalyticsConnection_10057 [2021-02-24 11:54:30.379] [INFO] Proxy information is not set.

            READER_1_1_1> AnalyticsConnection_10048 [2021-02-24 11:54:33.755] [INFO] The account [ah16616@legatohealth.com.eadev] logged on to salesforce.com at the following time: [https://commercial--eadev.cs60.my.salesforce.com/services/Soap/u/42.0].

            READER_1_1_1> CMN_1761 [2021-02-24 11:54:33.755] Timestamp Event: [Wed Feb 24 11:54:33 2021]

            READER_1_1_1> JAVA PLUGIN_1762 [2021-02-24 11:54:33.755] [ERROR] java.lang.NullPointerException

            READER_1_1_1> CMN_1761 [2021-02-24 11:54:33.755] Timestamp Event: [Wed Feb 24 11:54:33 2021]

            READER_1_1_1> JAVA PLUGIN_1762 [2021-02-24 11:54:33.755] [ERROR] at com.informatica.cloud.api.adapter.reader.runtime.GenericRdrPartitionDriver.init(Unknown Source)

            READER_1_1_1> CMN_1761 [2021-02-24 11:54:33.755] Timestamp Event: [Wed Feb 24 11:54:33 2021]

            READER_1_1_1> SDKS_38200 [2021-02-24 11:54:33.755] Partition-level [SQ_NAImpIDCardPrints]: Plug-in #507900 failed in init().

            READER_1_1_1> GENERIC_READ_10031 [2021-02-24 11:54:33.755] [WARNING] Call to deinit in 'READER' failed due to : [java.lang.NullPointerException

             

             

            at com.informatica.cloud.api.adapter.utils.Utils.findDeinitMethod(Unknown Source)

             

             

            at com.informatica.cloud.api.adapter.reader.runtime.GenericRdrPartitionDriver.deinit(Unknown Source)

             

             

            ]

            MANAGER> PETL_24031 [2021-02-24 11:54:33.755]

            ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****

            Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_NAImpIDCardPrints] has completed. The total run time was insufficient for any meaningful statistics.

             

             

            MANAGER> PETL_24005 [2021-02-24 11:54:33.834] Starting post-session tasks. : (Wed Feb 24 11:54:33 2021)

            MANAGER> PETL_24029 [2021-02-24 11:54:33.912] Post-session task completed successfully. : (Wed Feb 24 11:54:33 2021)

            MAPPING> SDKS_38025 [2021-02-24 11:54:33.912] Plug-in 507900 deinitialized and unloaded with status [-1].

            MAPPING> SDKS_38018 [2021-02-24 11:54:33.912] Reader SDK plug-ins deinitialized with status [-1].

            MAPPING> TM_6315 [2021-02-24 11:54:33.912] DTM error log: file log [D:\Program Files\Informatica\error\s_dss_000Q1J0I0000000001QW_6_error.csv], rows inserted [0].

            MAPPING> TM_6018 [2021-02-24 11:54:33.912] The session completed with [0] row transformation errors.

            MANAGER> PETL_24002 [2021-02-24 11:54:34.490] Parallel Pipeline Engine finished.

            DIRECTOR> PETL_24013 [2021-02-24 11:54:34.552] Session run completed with failure.

            DIRECTOR> TM_6022 [2021-02-24 11:54:34.552]

             

             

            SESSION LOAD SUMMARY

            ================================================

             

             

            DIRECTOR> TM_6252 [2021-02-24 11:54:34.552] Source Load Summary.

            DIRECTOR> CMN_1740 [2021-02-24 11:54:34.552] Table: [SQ_NAImpIDCardPrints] (Instance Name: [SQ_NAImpIDCardPrints])

            Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]

            DIRECTOR> TM_6253 [2021-02-24 11:54:34.552] Target Load Summary.

            DIRECTOR> CMN_1740 [2021-02-24 11:54:34.552] Table: [NAImpIDCardPrints] (Instance Name: [NAImpIDCardPrints])

            Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]

            DIRECTOR> TM_6023 [2021-02-24 11:54:34.552]

            ===================================================

             

             

            DIRECTOR> TM_6020 [2021-02-24 11:54:34.552] Session [s_dss_000Q1J0I0000000001QW] completed at [Wed Feb 24 11:54:34 2021].

            • 3. Re: Salesforce Analytics to .CSV via IICS
              Akshaye Shreenithi Kirupa Guru

              Hi Chaitanya

               

              One possible reason for a NullPointerException is a mismatch between the source definition in the source instance and the definition available in the mapping.

               

              Can you try reimporting the source/target object and see if that helps? If the issue persists, please share the session log for further analysis.

              • 4. Re: Salesforce Analytics to .CSV via IICS
                Sathyanath N Seasoned Veteran

                Note that the Salesforce Analytics connector can be used as a target only; you cannot read from it.

                • 5. Re: Salesforce Analytics to .CSV via IICS
                  Dan Kampschroer New Member

                  Chaitanya, this functionality does not yet exist for this connector. A feature request (CCON-28014) is pending with R&D to get it addressed. In the meantime, please look into using Salesforce Analytics API calls to retrieve the needed data; perhaps that could be a workaround until the feature is released. Here: Salesforce Developers
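
                  For anyone considering the workaround Dan mentions: below is a minimal sketch (Python, using the requests library) that queries an Einstein Analytics dataset with SAQL through the /wave/query REST endpoint and writes the rows to a CSV. The instance URL, access token, API version, dataset API name, and projected field names are placeholders for illustration only, not values confirmed in this thread; see the Salesforce Developers documentation for the exact API details.

                      # Sketch of a REST-based workaround; all credentials, URLs, dataset and
                      # field names below are placeholders and must be replaced for your org.
                      import csv
                      import requests

                      INSTANCE = "https://yourInstance.my.salesforce.com"   # placeholder org URL
                      TOKEN = "ACCESS_TOKEN"                                # placeholder OAuth/session token
                      API = f"{INSTANCE}/services/data/v52.0"
                      HEADERS = {"Authorization": f"Bearer {TOKEN}"}

                      # 1. Look up the dataset to get its id and current version id
                      #    (assuming "NAImpIDCardPrints" is also the dataset's API name).
                      ds = requests.get(f"{API}/wave/datasets/NAImpIDCardPrints", headers=HEADERS).json()
                      dataset_ref = f'{ds["id"]}/{ds["currentVersionId"]}'

                      # 2. Run a SAQL query against the dataset; the projected fields are illustrative.
                      saql = (
                          f'q = load "{dataset_ref}"; '
                          'q = foreach q generate Id, Name; '
                          'q = limit q 10000;'
                      )
                      resp = requests.post(f"{API}/wave/query", headers=HEADERS, json={"query": saql})
                      resp.raise_for_status()
                      records = resp.json()["results"]["records"]

                      # 3. Write the returned rows to a local CSV file.
                      with open("NAImpIDCardPrints.csv", "w", newline="") as fh:
                          writer = csv.DictWriter(fh, fieldnames=list(records[0].keys()) if records else [])
                          writer.writeheader()
                          writer.writerows(records)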