Authors: gcook

Cloud Data Integration

19 Posts authored by: gcook

Webinar: Meet the Experts: Deep-Dive, Demo, Roadmap - Informatica Cloud App/API Integration

 


Join Informatica product experts as they dive deep into the API and application integration capabilities for accelerating your digital transformation. You will learn:

 

  1. How to develop processes and APIs, and connect to any API, without coding
  2. What Intelligent APIs are, and why Informatica is uniquely positioned to offer them
  3. About management of integration artifacts and APIs
  4. The “ilities” (performance, scalability, reliability) of our platform
  5. The IaaS, SaaS, and on-prem partners we integrate with

ALERT: For customers using Informatica Cloud

(Updated: added MemSQL and Eloqua on 6/6/2019; added Ariba on 7/31/2019)

 

To better serve our customers, we plan to place older and unused connectors in End-of-Life (EOL) or Maintenance Mode in mid-2019. If you need a new connector enabled, per the Customer Action guidance below, please create a shipping case and request that we add the new connector to your org(s). The differences between Maintenance Mode and EOL are summarized in the table below:

 

End-of-Life (EOL)

Description: The connector is at end-of-life. Informatica will no longer support it: no bug fixes and no enhancements. There will be no automatic migrations or upgrades for existing work.
Bug fixes: No. Enhancements: No. Connector continues to work: No.
Customer action: The connector will no longer work after the next release, planned for July, and will no longer be available in your org. Please verify that you are not using the connector, or move your mappings to an alternative connector, if available.

Maintenance Mode

Description: The connector is in maintenance mode. Informatica will no longer enhance it; bug fixes may be considered. There will be no automatic migrations or upgrades for existing work. You will need to apply the latest recommended connector and migrate your jobs to it.
Bug fixes: Yes. Enhancements: No. Connector continues to work: Yes.
Customer action: The connector will continue to work post-R32. Customers should consider moving to an alternative connector, if available; the alternative connector will continue to be enhanced as necessary.

 

Am I impacted?

Refer to the list below to determine whether you are using one of these connectors. A separate email will be sent to all "Subscription"-only customers for these connectors.

 

How do I address the issue?

Please reference the Customer Action guidance in the table above, and the Notes to Customers and Alternative Connector columns in the table below.

 

The following table lists the connectors planned for end-of-life (EOL) or Maintenance Mode in mid-2019:

 

| Nr | Data Source | Connector Name | EOL or Maintenance Mode | Alternative Connector | Notes to Customers |
| --- | --- | --- | --- | --- | --- |
| 1 | Amazon QuickSight | Amazon QuickSight | EOL | None | |
| 2 | Arc GIS | Arc GIS | EOL | None | |
| 3 | Attensity Discovery Now | Attensity Discovery Now | EOL | None | |
| 4 | Avature | Avature | EOL | Generic REST V2 / WS Connector | |
| 5 | Birst | Birst | EOL | Birst Cloud Connect | |
| 6 | Cloud File Transfer | Cloud File Transfer | EOL | None | |
| 7 | Club Assistant | ClubAssistant | EOL | None | |
| 8 | DataSift | DataSift | EOL | None | |
| 9 | EPIC | EPIC | EOL | None | |
| 10 | IDVExpress | IDVExpress | EOL | None | |
| 11 | Informatica Data Prep | Informatica Data Prep | EOL | None | |
| 12 | Informatica Rev | Informatica Rev | EOL | None | |
| 13 | Intuit QuickBooks | Intuit QuickBooks | EOL | QuickBooks V2 | |
| 14 | Intuit QuickBooks Online | Intuit QuickBooks Online | EOL | QuickBooks V2 | |
| 15 | Magento | Magento | EOL | None | |
| 16 | Marketo | Marketo 2 | EOL | Marketo V3 | |
| 17 | Microsoft Dynamics AX | Microsoft Dynamics AX 2009 | EOL | None | |
| 18 | Microsoft Dynamics GP | Microsoft Dynamics GP 2010 | EOL | None | New connector on roadmap. |
| 19 | Oracle NetSuite | NetSuite (Restlet) Write only | EOL | NetSuite | |
| 20 | Oracle PeopleSoft | Oracle PeopleSoft 9.x | EOL | Generic REST V2 / WS Connector | |
| 21 | Oracle Taleo Business Edition | Oracle Taleo Business Edition | EOL | Generic REST V2 / WS Connector | |
| 22 | Oracle Taleo Enterprise Edition | Oracle Taleo Enterprise Edition | EOL | Generic REST V2 / WS Connector | |
| 23 | Rapnet | Rapnet | EOL | None | |
| 24 | Rave | Rave | EOL | None | |
| 25 | Reltio | Reltio | EOL | None | |
| 26 | Rev | Rev | EOL | None | |
| 27 | Saaggita | Saaggita | EOL | None | |
| 28 | Salesforce Insights | Salesforce Insights | EOL | None | |
| 29 | Snowflake | Snowflake V1 Connector | EOL | Snowflake Cloud Data Warehouse | |
| 30 | Snowflake | Snowflake Big Data Warehouse | EOL | Snowflake Cloud Data Warehouse | |
| 31 | Sugar CRM | Sugar CRM | EOL | Sugar CRM REST | |
| 32 | Tableau (Server) | Tableau V1 | EOL | Tableau V3 | |
| 33 | Trackwise | Trackwise | EOL | None | |
| 34 | Vindicia | Vindicia | EOL | None | |
| 35 | Zoho | Zoho | EOL | Generic REST V2 / WS Connector | |
| 36 | Amazon DynamoDB | Amazon DynamoDB | Maintenance Mode | None | New connector on roadmap. |
| 37 | Anaplan | Anaplan | Maintenance Mode | Anaplan V2 | |
| 38 | Apache Hive | Hadoop | Maintenance Mode | Hive Connector | |
| 39 | Box | Box | Maintenance Mode | None | New connector on roadmap. |
| 40 | Box | Box API | Maintenance Mode | None | New connector on roadmap. |
| 41 | Chatter | Chatter | Maintenance Mode | None | |
| 42 | Coupa | Coupa | Maintenance Mode | Coupa V2 | |
| 43 | Dropbox | Dropbox | Maintenance Mode | None | New connector on roadmap. |
| 44 | Eloqua | Eloqua (SOAP) | Maintenance Mode | Eloqua Bulk, Eloqua REST | |
| 45 | Google API | Google API | Maintenance Mode | Google Analytics | |
| 46 | LinkedIn | LinkedIn | Maintenance Mode | None | New connector on roadmap. |
| 47 | Marketo | Marketo | Maintenance Mode | Marketo V3 | |
| 48 | Marketo | Marketo REST | Maintenance Mode | Marketo V3 | |
| 49 | MemSQL | MemSQL | Maintenance Mode | MemSQL V2 | Work with MemSQL for connector access. |
| 50 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V1 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new mappings, and updating existing mappings, to use the Blob Storage V3 connector. The V3 connector does not support DSS yet; equivalent DSS functionality is planned for 1H 2020. |
| 51 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V2 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new mappings, and updating existing mappings, to use the Blob Storage V3 connector. The V3 connector does not support DSS yet; equivalent DSS functionality is planned for 1H 2020. |
| 52 | Microsoft Azure Cosmos DB SQL API | Microsoft Azure Document DB | Maintenance Mode | Microsoft Azure Cosmos DB SQL API | Consider building new mappings, and updating existing mappings, to use the Cosmos DB SQL API connector. The Cosmos DB SQL API connector does not support DSS yet; equivalent DSS functionality is planned for 1H 2020. |
| 53 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V1 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new mappings, and updating existing mappings, to use the ADLS V3 connector. The V3 connector does not support DSS yet; equivalent DSS functionality is planned for 1H 2020. |
| 54 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V2 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new mappings, and updating existing mappings, to use the ADLS V3 connector. The V3 connector does not support DSS yet; equivalent DSS functionality is planned for 1H 2020. |
| 55 | Microsoft Azure SQL DW | Microsoft Azure SQL DW V1 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new mappings, and updating existing mappings, to use the SQL DW V3 connector. The V3 connector does not support DSS yet; equivalent DSS functionality is planned for 1H 2020. |
| 56 | Microsoft Azure SQL DW | Microsoft Azure SQL Data Warehouse V2 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new mappings, and updating existing mappings, to use the SQL DW V3 connector. The V3 connector does not support DSS yet; equivalent DSS functionality is planned for 1H 2020. |
| 57 | Microsoft Dynamics AX | Microsoft Dynamics AX 2012 | Maintenance Mode | Microsoft Dynamics AX 2012 V3 | |
| 58 | Microsoft Excel | Microsoft Excel v1 | Maintenance Mode | Intelligent Structure Discovery | |
| 59 | Oracle EBS | Oracle EBS 12.x (Cloud only) | Maintenance Mode | Generic REST V2 / WS Connector | |
| 60 | Oracle EBS | Oracle InterfaceTable | Maintenance Mode | Generic REST V2 / WS Connector | |
| 61 | SAP Ariba | Ariba Hier | Maintenance Mode | Ariba V2 | |
| 62 | SAP Concur | SAP Concur | Maintenance Mode | Concur V2 | |
| 63 | SAP SuccessFactors | SAP SuccessFactors SOAP | Maintenance Mode | SAP SuccessFactors OData | |
| 64 | TFS | TFS | Maintenance Mode | Generic REST V2 / WS Connector | |
| 65 | TM2 | TM2 | Maintenance Mode | None | |
| 66 | Twitter | Twitter | Maintenance Mode | None | New connector on roadmap. |
| 67 | WebServices - REST | REST | Maintenance Mode | REST V2 | |
| 68 | WebServices - SOAP | SOAP WebServices | Maintenance Mode | Webservices Consumer Transform | |
| 69 | Webservices V2 | Webservices V2 | Maintenance Mode | Webservices Consumer Transform | |
| 70 | Workday | Workday | Maintenance Mode | Workday V2 | |
| 71 | Zendesk | Zendesk | Maintenance Mode | Zendesk V2 | |
| 72 | Zuora | Zuora (SOAP) | Maintenance Mode | Zuora REST V2, Zuora AQuA | |

The Informatica Intelligent Cloud Services (IICS) Winter 2019 release offers several new capabilities that address key data challenges that businesses are facing today. Highlights are listed below.

 

Data Integration

  • Data discovery in Cloud Data Integration with Enterprise Data Catalog (EDC) integration - Customers can now search and discover enterprise-wide metadata from within Data Integration, import connection & object metadata, and use that information to more easily create new or enrich existing mappings and tasks by connecting with an existing EDC installation.
  • “Smart match” recommendations for field mappings increases the frequency of field matches in mappings and tasks. Expanding on the existing automatch, smart match looks for common patterns in field names (prefixes, suffixes, abbreviations, etc.) based on six additional matchers and fuzzy match techniques for recommending field mappings.
  • Taskflows can be invoked via APIs for externalized scheduling and execution. With this enhancement, customers now can invoke taskflows on-demand via an API call and provide input parameters for the tasks it orchestrates, allowing customers to fully leverage Data Integration’s parameterization capabilities. Please refer to the Taskflow as a Service FAQ.
  • Taskflows have also been enhanced to allow them to embed other taskflows to promote reuse.
  • Change data capture has been expanded to include additional sources for DB2 on Linux, Unix, Windows, and iSeries (also known as AS400, i5/OS) platforms, which further enables near real-time changed data propagation capabilities.
  • Mass ingestion is extending connector support, adding Google Storage & Google Big Query as targets and HDFS as both a source and target. Additional enhancements expose CRUD-focused APIs.
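The on-demand taskflow invocation described above is an HTTP call carrying the input parameters for the tasks the taskflow orchestrates. The sketch below builds such a request; the endpoint path, pod URL, taskflow name, and parameter names are illustrative assumptions, not the documented API contract — consult the Taskflow as a Service FAQ for the actual details.

```python
import json
from urllib import request

def build_taskflow_invocation(base_url, taskflow_name, input_params):
    """Build (but do not send) an HTTP request that would invoke a
    published taskflow with the given input parameters.

    The /active-bpel/rt/<name> path used here is an assumption for
    illustration; verify the real endpoint in your org.
    """
    url = f"{base_url}/active-bpel/rt/{taskflow_name}"
    body = json.dumps(input_params).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical pod URL, taskflow name, and parameters:
req = build_taskflow_invocation(
    "https://na1.dm-us.informaticacloud.com",
    "Nightly_Load",
    {"RunDate": "2019-01-31", "Region": "EMEA"},
)
print(req.get_full_url())
```

An external scheduler (cron, an enterprise job scheduler, or another application) would authenticate and submit this request to run the taskflow on demand.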

 

API and Application Integration

  • Support for Kafka Messaging – Messaging is at the core of many publish-subscribe (Pub/Sub) based applications as a means of decoupling producers and consumers of data.  The addition of Kafka for application integration significantly increases current message-based Pub/Sub interactions between data and applications that today are fulfilled using JMS, AMQP, Amazon SNS/SQS, and Azure Service Bus based “topics.” The ability to bridge these message-based events with the Cloud Integration Hub Pub/Sub style of data propagation provides additional integration pattern options making Informatica unique in the flexibility and capabilities it provides for its customers.
  • JSON Web Token (JWT) based authentication – The API and Application Integration services now support JSON Web Token (JWT) based authentication, an open standard (RFC 7519) that defines a compact and self-contained way for securely transmitting information between API consumers and REST web services. This provides IICS users that use API Manager with another and more secure means of API authentication.
  • API grouping – To better manage the use of JWT tokens and associate their use to multiple API endpoints, a new API “application” grouping capability is being introduced in API Manager. This capability will provide API consumers with fewer tokens to deal with, and API producers will now more easily manage or revoke a consumer’s access to multiple APIs.
  • Japanese language support for the API and Application Integration services – In addition to Japanese language support for the Data Integration service, Japanese customers now have access to the API and Application Integration services user interface and documentation in Japanese.
  • REST and SOAP Service API-based “service connectors” – distributed via GitHub.

Today, 55% of Application Integration customers' connectivity needs are met using service connectors. A service connector allows customers to define REST (XML/JSON), JSON/RPC, or SOAP service integration using a simple web form, with no coding required; the Application Integration service takes care of the rest. If the service offers a WSDL or Swagger interface document, the service connector can be generated automatically by importing the interface document. By creating service connectors, customers can import and configure pre-built business and data service definitions as reusable assets that they can share and/or move from one environment to another.

This capability, unique to Informatica, provides customers with unparalleled value. Service connectors avoid lock-in and let you make updates as you need them, taking advantage of new capabilities or the built-in extensibility that an API platform can offer.

To provide this flexibility to customers and encourage community contribution by customers, partners, and other practitioners, Informatica is establishing a GitHub repository where it will publish the service connectors it has created and which it will share with its customers and partners. Customers and partners are free to use these definitions without restriction, including the rights to use, copy, modify, merge, publish, and distribute these under an MIT license. Informatica will also encourage contributions back to the community. Our goal is simple: drive innovation and reduce perceived barriers to adoption.
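As an illustration of the JWT mechanism mentioned in the highlights above: a token is three base64url-encoded, dot-separated segments (header, claims, signature), where the signature binds the first two segments to a key. The following minimal sketch uses only the Python standard library and an HMAC-SHA256 signature; the claim names and secret are made up for the example and have nothing to do with API Manager's actual configuration.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding (RFC 7515)
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    """Assemble a compact HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

# Hypothetical consumer identity and expiry claim:
token = make_jwt({"sub": "api-consumer-42", "exp": 1735689600}, b"shared-secret")
print(token.count("."))  # → 2: three dot-separated segments
```

Because the token is self-contained, the receiving service can validate the signature and expiry without a database lookup, which is what makes JWT attractive for authenticating calls to managed APIs.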

 

 

Integration Hub

  • Informatica has improved the search experience for Hub events and support for the CLOB data type on topics.

 

B2B Gateway

  • ICS B2B Gateway customers will be migrated to the IICS platform as part of the release upgrade and will benefit from all IICS platform capabilities.

 

Intelligent Structure Discovery

  • Intelligent Structure Discovery expanded its parsing capabilities to handle ORC format files and Excel multiple sheet files. The user can now design the structure model based on multiple sheet structures and then use the model at run time to parse Excel files in their entirety.
  • With R31, a Structure Parser transformation can be positioned mid-stream to enable a more flexible mapping usage and chaining. In addition, the Intelligent Structure Model detected datatypes are now propagated to the Structure Parser output ports.
  • The ISD design time user interface is enhanced with a "find" functionality which allows the user to search for a specific string in the discovered tree fields and get a list of results showing the path and visually correlated with the model representation. The user can also perform actions on multiple elements chosen from the result list such as include, exclude, replace, and even change of element type. The ability to perform actions on multiple elements significantly improves the usability and productivity.
  • A new vertical folder view mode will be available in R31 for handling complex hierarchy files.

 

IICS Platform

  • Common Explore productivity enhancements – Improved copy functionality with overwrite & rename conflict resolution options to copy assets within and across folders. “Created by” and “last updated by” attributes as columns for all asset types in the common Explore page.
  • Export/import capability for sub-organizations which enables asset migration across environments that use an organization hierarchy. More control and flexibility with enable/disable checksum validation options during export and import.
  • Improved export/import error logging along with the ability to access and download export/import logs through the UI and the API.
  • API to search, list, and filter assets in projects and folders using a variety of conditions such as timestamp, location, “last updated by,” and tags. This API can also be leveraged along with export APIs to export objects.
  • Improvements to the RunAJob utility – Support for projects and folders to invoke tasks by task name.
  • Usability improvements – Ability to copy table cell data in the common Explore page, Monitor service, and Administrator service for use in other user interfaces like search boxes and filter conditions for easier task completion.
  • Search capability for connections and job monitoring to quickly and easily find needed information.
  • Ability to enable and disable a service for agents in a Secure Agent group to effectively deploy workloads and efficiently utilize computing resources.
  • Secure Agent registration using tokens (instead of passwords) for increased security and enabling SAML single sign-on.
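The asset search/list API described above filters on conditions such as location, timestamp, and tags. The sketch below composes such a query URL; the `/public/core/v3/objects` path, the filter syntax, and the pod URL are illustrative assumptions — check the IICS REST API reference for the exact contract before relying on them.

```python
from urllib.parse import urlencode

def build_asset_query(base_url, location=None, updated_since=None, tag=None):
    """Compose a query URL for the asset-listing REST API.

    The endpoint path and the q= filter expressions are assumptions
    for illustration, not the documented API.
    """
    filters = []
    if location:
        filters.append(f"location=='{location}'")
    if updated_since:
        filters.append(f"updateTime>='{updated_since}'")
    if tag:
        filters.append(f"tag=='{tag}'")
    query = urlencode({"q": " and ".join(filters), "limit": 100})
    return f"{base_url}/public/core/v3/objects?{query}"

# Hypothetical pod URL and project folder:
url = build_asset_query(
    "https://na1.dm-us.informaticacloud.com/saas",
    location="Default/Sales",
    updated_since="2019-01-01T00:00:00Z",
)
```

A script could feed the IDs returned by such a query into the export API to migrate just the assets that changed since a given date.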

 

Operational Insights 

  • Operational Insights extends support to the on-premises Data Quality product, in addition to BDM and PowerCenter, with capabilities such as domain health, job run analytics, resource utilization, and alerts.
  • Click-through analytics journey from cross-domain to the individual job level and enhancements to job run analytics for on-premises products (PowerCenter, BDM). Plus, enhancements to job run analytics to report on data loaded and data processed.
  • Power Center & BDM infrastructure e-mail alert enhancements such as Secure Agent unavailability and Operational Insights data collection failures.

 

Connectivity Enhancements

New AWS Quickstart for Cloud Analytics Modernization, an end-to-end solution for self-service cloud analytics with Informatica (IICS, Enterprise Data Catalog), Tableau Server, and AWS Services.

Several new connectors and enhancements to existing connectors across ecosystems as listed below. New connectors introduced are highlighted in bold:

  • Azure: ADLS Gen 2 Preview, Azure DW V3, Azure Data Lake Store V3, Azure Blob V3 
  • Google: Google Cloud Storage V2, Google Analytics, Google Big Query V2, Google Big Query 
  • Amazon: Amazon S3 V2, Amazon Aurora, Amazon Redshift V2 
  • Salesforce: Salesforce Marketing Cloud (SFMC), SFDC (Sales and Service) 
  • SAP: SAP Connector, SAP HANA Cloud Platform (DB) 
  • Adobe: Adobe Cloud Platform 
  • Analytics: CDM Folders connector preview, Tableau V3, Tableau V2 
  • Databases: MySQL Relational, Hive, Greenplum, DashDB, Snowflake 
  • Tech: REST V2, WSconsumer, Complex File Processor 
  • Microsoft Apps: Microsoft SharePoint Online 
  • Oracle: Oracle Netsuite V1, Oracle Relational 

 

A summary of some of the connectivity enhancements follows:

AWS

  • Supporting file names longer than 250 characters with S3
  • Support for custom JDBC URL for Redshift
  • Support for ORC files with S3

Snowflake

  • Custom Query metadata fetching without having to run the query

Google

  • Custom Query Support for Google Big Query V2 connector
  • Pushdown support for Google Big Query through ODBC
  • Google Analytics - Enhancement to fetch fields based on multiple Views IDs from GA
  • Google Big Query Mass Ingestion - Direct load Cloud Storage->Big Query

Azure

  • Preview of ADLS Gen2 connector: support create target, configurable escape character and text qualifier in R/W scenarios, create and rename directory, rename file, header-less files, support RBAC for all types of AAD Authentication, append data to an existing file, support parameterization
  • Azure DW V3: support TARGET NAME OVERRIDE and TARGET SCHEMA NAME OVERRIDE in the writer, support SOURCE NAME OVERRIDE and SOURCE SCHEMA OVERRIDE with the reader, support custom query and multiple objects in CMD and MCT

 

Microsoft Apps

  • Microsoft SharePoint Online: support for agents running on Linux

 

Analytics

  • Preview of CDM Folders connector: new connector, with the ability to write to ADLS Gen 2 in CDM format, and then access the data from Power BI as a dataflow
  • Tableau V2: upgrade connector to the latest Tableau SDK

Databases & DW

  • MySQL Relational: array insert support
  • Greenplum: native reader and writer

 

NetSuite V1: address 2-factor authentication

 

Salesforce

  • SFDC Sales, Service connector: support latest SFDC API
  • Salesforce Marketing Cloud SFMC: insert/update operation for “Non Contact Linked Data Extensions”

Here is an orientation video for the upcoming Informatica Cloud to Informatica Intelligent Cloud Services (IICS) migration process. The video provides an introduction to IICS for customers that are migrating from Informatica Cloud.

 

We will be migrating organizations between April and August. You will receive notification when YOUR organization(s) is going to be migrated and when your sandbox environment (pre-release) is available.

 

Video link (moved to YouTube so that it doesn't have to be downloaded):

 

Introducing Informatica Intelligent Cloud Services - YouTube

 

Full FAQ:

 

Informatica Intelligent Cloud Services Migration FAQ

Informatica Intelligent Cloud Services Migration FAQ
(Updated: June 27, 2019)

 

The following questions and answers will help you better understand the Informatica Intelligent Cloud Services migration. This is a live document, and we will be adding questions and answers as they arise.

 

This document is organized in sections by Service: common questions, Data Integration questions, API and Application Integration questions, and Data Integration Hub questions.

 

 

Common Questions

 

1. When will the Informatica Cloud to Informatica Intelligent Cloud Services migration begin?

Migration to the new Informatica Intelligent Cloud Services platform will begin in early Q2 2018.

 

2. What order of services will the migration follow?

As a first step, Informatica Cloud Services (ICS) customers that do not use the Informatica Cloud Real Time (ICRT) services will be migrated. ICRT customer migration, including customers that have licensed Cloud Premium services, is commencing in July 2018. ICRT customers are grouped by functional usage. Migration of customers is planned to be completed before the end of 2018.

 

3. Where can I find more information about the new features and behavior changes in Informatica Intelligent Cloud Services?

See the IICS Navigation Overview for ICS and ICRT Users video for a quick tour of IICS. This video provides an overview of Informatica Intelligent Cloud Services for users that are already familiar with Informatica Cloud.

 

The Migration folder of the Cloud Application Integration Community Site provides links to a number of resources that can help you with this. These include:

 

For ICS Users

 

For ICRT Users

 

4. Will all customers be migrated at the same time?

Migration will be performed in multiple batches to ensure maximum flexibility and a minimal amount of disruption to customers. These batches may include ICS and ICRT customers.

 

5. When will customers be notified about the migration?

Customers will be notified 6 weeks prior to the migration. The communication will go out to all users of the Informatica Cloud organization being migrated.

 

6. What is the migration process?

All customers will be migrated to a Sandbox environment before they are migrated to the production environment. The Sandbox environment will be available to you for at least three weeks to allow you to perform testing. Your Informatica Cloud assets will be migrated to the Sandbox environment, but assets that you create in the Sandbox environment will not be migrated to the production environment. If you create assets in Informatica Cloud during the three weeks of preview testing, those assets will be migrated to the production environment. After the three weeks of preview testing have elapsed, your organization will be migrated to the production environment. If you have concerns or need more time, please contact Informatica Global Customer Support.

 

7. Should all customers test the migration?

Yes, all customers are expected to test the migration in the Sandbox environment. It is critical for all customers to participate in the testing to ensure a smooth transition from Informatica Cloud to Informatica Intelligent Cloud Services. At a minimum, Informatica recommends that you test at least one organization with a good mix of use cases.

 

8. Are there any special instructions for using the Sandbox environment?

Yes. When you are notified of the upcoming migration you will be able to access your Sandbox environment. The following instructions apply to the Sandbox environment:

  • You can access the Sandbox environment after Informatica notifies you that your organization has been migrated to the Sandbox environment. To access the Sandbox environment, open the Sandbox environment URL (https://dm-sandbox.informaticacloud.com/identity-service/home) using your Informatica Cloud credentials.
  • Customers that have licensed Data Integration (previously ICS) should verify that they can access the Data Integration Service
  • Customers that have licensed Application Integration (previously ICRT or Cloud Premium customers) should verify that they can access the Application Integration Service
  • If you are unable to access either service, contact Informatica Global Customer Support.
  • If your company uses a firewall, you may need to update your whitelist of IP addresses. The Secure Agent IP address ranges differ among Informatica Cloud, the IICS Sandbox environment, and the IICS production environment. The approved Secure Agent IP address ranges for the production and Sandbox environments are listed in KB article 524982.
  • Download a Secure Agent for the Sandbox environment.
    Existing preview agents will not be upgraded. Uninstall any previous preview agent, and then download and install a new preview agent from the Sandbox environment. The new Secure Agent will point to the Sandbox environment and allow you to run jobs.
    Note: On Windows, you must install the Secure Agent on its own machine. You cannot run multiple agents on the same Windows machine. On Linux, you can install multiple agents on the same machine if you install them under a different ID and folder.
  • Update your tasks and connections to use the new Secure Agent that you downloaded from Informatica Intelligent Cloud Services.
    Tip: If you have the Secure Agent Cluster license, you can add the preview agent to an existing Secure Agent group so that you won't have to update tasks and connections individually.
  • If you want tasks in the Sandbox environment to run on the schedules that you defined in Informatica Cloud, edit the tasks and save them. Schedules are migrated to the Sandbox environment, but they are not activated. When you save a task, the schedule will be re-activated.
  • Clear your browser cache if you see a blank screen or the message, “Internal Server Error. Please contact Support.” This issue is caused by a browser caching problem. Clear your browsing history, including all cookies and all cached images and files, and then close the browser. Re-open the browser and log in again.

 

ICRT service users should also review the ICRT Service Migration to the Cloud API and Application Integration Service guide.

 

9. Which web browsers can customers use with IICS?

IICS supports the following browsers: Google Chrome, Microsoft Internet Explorer 11, and Mozilla Firefox. For more information, see the PAM for Informatica Intelligent Cloud Services (IICS) on Informatica Network.

If you use IE11, note the following:

  • You must enable cross-origin support (CORS) in the browser. For information about enabling CORS in IE11, see the "Enabling CORS in Internet Explorer 11" topic in the Data Integration online help.
  • The time stamps displayed in the Monitor service and on the My Jobs page in Data Integration appear in Coordinated Universal Time (UTC).

 

10. What will happen to Secure Agents during the production migration?

During the production migration, all of your Informatica Cloud Secure Agents will be upgraded to the latest version. Secure Agents that you downloaded from the IICS Sandbox environment will not be upgraded.

The migration process retains the following items:

  • Connection properties that you stored with a local Secure Agent.
  • Secure Agent configuration property changes.
  • All files that you copy to <Secure Agent installation directory>/apps/Data_Integration_Server/ext/

The migration process does not retain manual changes that you made to configuration files in the Secure Agent installation directory or its subdirectories.

Note: As a best practice, Informatica recommends that you back up your Informatica Cloud Secure Agent directories before the migration so that you can restore them easily in the unlikely event of a rollback.

 

11. How much disk space is required to upgrade the Secure Agent?

To calculate the free space required for upgrade, use the following formula:

Minimum required free space = 3 * (size of current Secure Agent installation directory - space used for logs directory) + 1 GB
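The formula above can be applied directly. For example, with a 4 GB Secure Agent installation directory of which 1.5 GB is the logs directory:

```python
def min_free_space_gb(agent_dir_gb: float, logs_dir_gb: float) -> float:
    """Apply the documented formula:
    3 * (install dir size - logs dir size) + 1 GB."""
    return 3 * (agent_dir_gb - logs_dir_gb) + 1

print(min_free_space_gb(4.0, 1.5))  # → 8.5 (GB of free space required)
```

The logs directory is excluded because it is not copied during the upgrade; only the remaining installation footprint is tripled (current install, staged upgrade, and backup), plus 1 GB of working headroom.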

 

12. For customers that use a firewall, what are the Informatica Intelligent Cloud Services URLs that need to be included in the whitelist of approved IP addresses?

The approved Secure Agent IP address ranges for production and Sandbox environments are listed in KB article 524982.

 

13. Will there be any downtime for migration, and if yes, what is the expected downtime?

The migration will affect your service’s availability. The exact duration of the downtime will be communicated to each customer as part of the migration notification. The exact downtime depends upon the number of Informatica Cloud assets and organizations that a customer has. Informatica estimates the downtime to be in the range of 1-4 hours.

 

14. What will happen to the old Informatica Cloud organization after the migration is completed?

The Informatica Cloud organization will be deactivated, but its metadata will be retained for 30 days post-migration to ensure that Informatica has a copy for comparison and for roll-back in case of unforeseen issues.

 

15. Will my organization ID change after migration?

Yes. You will get a temporary organization ID in the Sandbox environment. During production migration, your organization will get a new, permanent organization ID.

 

16. Can customers choose the migration schedule?

Informatica will build the migration batches and communicate the migration schedule to each customer. If the published schedule does not meet your needs, customers are requested to contact support to reschedule to a different migration batch.

 

17. If a customer has more than two organizations, can they be migrated in separate batches?

While this is possible, Informatica doesn’t recommend this. Customers should consider the impact of having the organizations in different platforms for even a short duration. Customers should work with their customer success manager and Informatica Global Customer Support to ensure that organizations are scheduled in the appropriate batches.

 

18. Are there any security changes in Informatica Intelligent Cloud Services?

We have introduced user-defined roles in Informatica Intelligent Cloud Services; user roles are automatically created for the corresponding user groups in Informatica Cloud. If asset-level custom permissions in Informatica Cloud granted a user higher permissions than those granted via the user's group, those asset permissions are not honored for the user after migration. Customers should review this and manually adjust asset-level permissions as needed.

 

19. When should we do a rollback?

Post-migration, if the customer raises a P1 ticket that can’t be resolved within 24 hours, Informatica will consider the rollback option. Rollback should be done only after all other avenues to resolve the issue have been exhausted. Rollback requires an approval from the project management team.

 

20. Is the rollback automated?

Informatica has a rollback script that deactivates the Informatica Intelligent Cloud Services organization, reactivates the Informatica Cloud organization, and downgrades the Secure Agent back to the Informatica Cloud version. If any jobs have been run in Informatica Intelligent Cloud Services either partially or successfully prior to the rollback, the state of those jobs and their job logs will not be rolled back, nor will they be ported back to Informatica Cloud.

 

21. I created a new organization in Informatica Intelligent Cloud Services using my Informatica Cloud username. Can my Informatica Cloud user account be migrated if there is already an IICS user account with the same name?

Usernames in Informatica Intelligent Cloud Services must be unique. If there is already an IICS user account that has your Informatica Cloud username, then your IICS username will be appended with some extra characters to form a unique name. For example, if your Informatica Cloud username is infaclouduser, your IICS username might be changed to infaclouduser.IICS. (Your Informatica Cloud username will not change.) Informatica will send you an email with the new IICS username, and you will be able to log in to IICS using the new name.

If you use SAML or Salesforce single sign-on and there is already an IICS account with your username, the IICS username that appears in your user profile will be appended with a string such as “.SAML” or “.Salesforce” to ensure that the username is unique. You will be able to log in to IICS using single sign-on as you did with Informatica Cloud.

 

22. Do I need to change the API endpoints that I am using on Informatica Cloud?

After migration, login API requests will be automatically redirected to Informatica Intelligent Cloud Services. This redirection service will be available through February 28, 2019. As before, you must construct subsequent API requests based on the <serverUrl> and <icSessionId> that were received in the login response. Ensure that you have not hard-coded the base URL for any API endpoint other than the login API endpoint.

After February 28, you must replace your current Informatica Cloud domain URLs with the Informatica Intelligent Cloud Services (IICS) URLs mentioned in KB article 524982 to use the APIs in IICS. (For example, if your POD is located in North America, the new IICS domain URL is https://dm-us.informaticacloud.com, and the V2 login IICS API endpoint to use is https://dm-us.informaticacloud.com/ma/api/v2/user/login.)
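As a rough sketch of the V2 flow described above — the request body (@type/username/password) and the serverUrl/icSessionId response fields are assumptions to verify against the REST API Reference for your POD:

```python
import json
import urllib.request

# V2 login endpoint for a North America POD, per the example above.
LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def build_login_request(username: str, password: str) -> urllib.request.Request:
    # Build (but do not send) the POST request for the V2 login call.
    body = json.dumps({"@type": "login", "username": username, "password": password})
    return urllib.request.Request(
        LOGIN_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
    )

def session_from_login(response: dict):
    # Subsequent V2 requests must be constructed from serverUrl, passing
    # icSessionId in the icSessionId request header.
    return response["serverUrl"], response["icSessionId"]
```

The key point is that only the login URL should ever be hard-coded; everything after that is derived from the login response.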

 

23. How do I leverage new features such as export/import through APIs in Informatica Intelligent Cloud Services?

New features such as export/import are currently only available through the V3 APIs. To leverage these APIs, use the V3 endpoints described in the REST API Reference. (For example, use the V3 login API with the following endpoint: https://dm-us.informaticacloud.com/saas/public/core/v3/login.)
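A minimal sketch of consuming the V3 login response; the response layout (products[].baseApiUrl, userInfo.sessionId) and the INFA-SESSION-ID header are assumptions to confirm in the REST API Reference:

```python
# V3 login endpoint from the example above.
V3_LOGIN_URL = "https://dm-us.informaticacloud.com/saas/public/core/v3/login"

def v3_session(login_response: dict):
    # Later V3 calls (e.g. export/import) are sent to baseApiUrl, with the
    # session id carried in a header rather than in the URL.
    base_api_url = login_response["products"][0]["baseApiUrl"]
    session_id = login_response["userInfo"]["sessionId"]
    return base_api_url, session_id

def v3_headers(session_id: str) -> dict:
    # Assumed header name for V3 session authentication.
    return {"INFA-SESSION-ID": session_id, "Accept": "application/json"}
```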

 

24. What are the Cloud Application Integration IP Address ranges that you need to add to your list of approved IP addresses?

Please review KB article 535281 (https://kb.informatica.com/faq/7/Pages/21/535281.aspx) for the approved domain names and IP address ranges.

 

 

Data Integration Questions

 

1. Does the customer have to change mappings, tasks, connections, processes, or other assets after migration to make them work?

The production migration process is seamless, and existing assets will continue to work after migration without manual changes. (Please see question 2 for potential post-migration tasks.)

 

2. What are customers expected to do before, during, and after migration?

Before:

1.     Ensure that no metadata is modified during the migration window.

2.     As a best practice, back up your Secure Agent directories so that you can restore them easily in the unlikely event of a rollback.

3.     Set up an appropriate blackout window.

4.     Ensure that your Informatica Cloud Secure Agents are up and running.

5.     Make sure that you have completed any tasks provided by Informatica Global Customer Support for the migration.

6.     Prepare a checklist to validate post-migration.

During:

Monitor your email for any communication from Informatica Global Customer Support.

After:

1.     Log into the new Informatica Intelligent Cloud Services organization and do a quick validation to ensure that all metadata counts are in line.

2.     Verify that jobs are running as expected.

3.     Reset the blackout window.

4.     Review the activity logs and audit logs.

5.     If you see any discrepancies, log a support ticket as soon as possible.

 

3. Are there any manual steps involved in the migration?

Informatica is making every effort to automate the migration end to end. However, certain areas need attention:

  • If you use the REST API and have enabled two-factor authentication for your organization, add the following IP addresses to the list of “Trusted IP Ranges” in Informatica Cloud before the migration:
    APP and APP3 PODs: 206.80.52.0/24, 206.80.61.0/24, 209.34.91.0/24, 209.34.80.0/24
    APP2 POD: 206.80.52.0/24, 168.128.27.61/32, 168.128.27.47/32, 168.128.29.12/32, 168.128.29.92/32
  • Outbound message links will change and must be updated in Salesforce after the migration. Informatica will redirect the old links for 4 weeks post-migration, but the new links must be updated in Salesforce before the redirection ends.
  • If you use SAML single sign-on, you must download the IICS service provider metadata after the migration and deliver the metadata and the IICS single sign-on URL for your organization to your SAML identity provider administrator. Additionally, ensure that you update the IICS single sign-on URL and app in your identity provider application.
  • Contact Validation tasks will not be migrated. You need to convert them to an Address Doctor-based web service to cleanse the addresses.
  • Data Assessment tasks will not be migrated. You need to convert them to DQ Radar before the migration.
  • Any task flow that references a Contact Validation or Data Assessment task will not be migrated. If you want the task flow to be migrated, adjust the task flow logic and remove the Contact Validation and Data Assessment tasks before migration.
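If you automate updates to the Trusted IP Ranges, the CIDR blocks listed above can be checked programmatically; a small illustrative helper:

```python
import ipaddress

# Trusted IP ranges for the APP and APP3 PODs, from the list above.
APP_APP3_RANGES = [
    "206.80.52.0/24",
    "206.80.61.0/24",
    "209.34.91.0/24",
    "209.34.80.0/24",
]

def in_trusted_ranges(ip: str, ranges) -> bool:
    # Return True when the given address falls inside any of the CIDR blocks.
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in ranges)
```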

 

4. What will happen to views after the migration?

Public views in Informatica Cloud are replaced with tags in Informatica Intelligent Cloud Services. All assets in a view will be labeled with a tag in IICS Data Integration that has the same name as the view. For example, if you created a custom view called SalesObjects that contained 30 mappings in Informatica Cloud, all 30 mappings will be labeled with the tag SalesObjects in IICS Data Integration.

If the same view name was used for different asset types, the tag names will have different suffixes in IICS Data Integration. For example, if you created the SalesObjects view for mappings and also for mapping tasks, mappings might be labeled with the tag SalesObjects and mapping tasks with the tag SalesObjects_1.

You will be able to browse tagged assets and view all assets with a specific tag. Private views, views that are associated with connectors, and activity log views are not migrated.
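The renaming scheme above can be sketched in Python; the suffix numbering beyond the SalesObjects/SalesObjects_1 example is an assumption:

```python
def views_to_tags(views):
    """views: iterable of (view_name, asset_type) pairs -> unique tag names.

    The first asset type keeps the plain view name; later collisions get a
    numeric suffix (assumed scheme, inferred from the SalesObjects example).
    """
    counts = {}
    tags = {}
    for name, asset_type in views:
        n = counts.get(name, 0)
        tags[(name, asset_type)] = name if n == 0 else f"{name}_{n}"
        counts[name] = n + 1
    return tags
```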

 

5. Which versions of the REST API can I use with Informatica Intelligent Cloud Services?

REST API version 1 is no longer supported. For IICS, use REST API version 2 and version 3. The Informatica Cloud Data Integration REST API Reference explains the two REST API versions in detail and how to use each of them to interact with Data Integration using REST API calls.

 

6. Can I use the runAJobCli utility to run tasks in Informatica Intelligent Cloud Services?

Yes. To use the utility in Informatica Intelligent Cloud Services, update the restenv.properties file to use the new Informatica Intelligent Cloud Services URL:
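For example, the file might look like the following sketch; the property names shown (baseUrl, username, password) are assumptions based on a typical runAJobCli configuration, so confirm them against the utility's documentation for your version:

```properties
# restenv.properties (sketch; confirm property names for your version)
# Use the IICS base URL for your POD, e.g. dm-us for North America.
baseUrl=https://dm-us.informaticacloud.com
username=your_iics_username
password=your_iics_password
```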

Note that if you run the utility with the task name (-n) option, and you have multiple tasks with the same name in different folders, the utility runs the task in the Default folder. To run a task in a different folder, use the task ID (-i) option instead of the task name option.

 

7. The ODBC and JDBC drivers are missing from my MYSQL connector. How do I fix this?

In Informatica Intelligent Cloud Services, Informatica no longer includes the MySQL ODBC and JDBC drivers with the MySQL connector. Before you use the MySQL connector, download and install the drivers. For information about installing and configuring the drivers, see the following article or videos:

 

API & Application Integration Questions

 

1. ICRT Service (including Cloud Premium) customers should be aware of the following:

 

Preview/Sandbox Migration:

Service URLs

Create and invoke processes in the sandbox account. However, do not use sandbox service URLs in any production activity. Sandbox service URLs are not permanent and are only for testing.

Be careful when you invoke processes in the sandbox environment. Verify that the execution of a process does not affect production. For example, if you execute a "Create Order" process in the sandbox, an order will be created.

Scheduled Processes

Your schedules are migrated to the sandbox in the 'Not Enabled' state. This ensures that there are no duplicate process invocations, because legacy Informatica Cloud Real Time continues to function during the sandbox testing period. To test schedules, create new processes in the sandbox and assign schedules to them.

Invoked Processes

Processes invoked before migration do not appear in the sandbox Application Integration Console service. Use the legacy Process Console to view these processes.

Processes that you invoke using sandbox service URLs will appear in the sandbox Application Integration Console service.

 

Production Migration:

 

 

Service URLs

 

After the migration to IICS/CAI Production, clients will still be able to send requests to the older ICRT service URLs, which will be automatically redirected to the equivalent CAI service URLs. Note, however, that this redirection will be available only for a short period, until the end of September 2019. Plan to update your clients to send requests to the new CAI service URLs as soon as possible to reduce the number of network hops and thereby improve performance.

 

Invoked Processes

 

We suggest that you turn off client requests at the time of migration (which has been communicated over email), although this is not mandatory. Please plan ahead: any requests instantiated during the production migration will likely not complete successfully. You might receive an HTTP 500 or HTTP 503 response, and the runtime state of any instance 'attempted' in ICS/RT will not be migrated to the IICS/CAI Production server.

 

 

2. How do I learn about migration?

The Migration folder of the Cloud Application Integration Community Site provides links to a number of resources that can help you with migration.

 

 

Most ICRT service customers use ICS data integration services. To see what's available to you as you migrate to IICS, see the Essential Cloud Data Integration Demos set of videos.

 

3. How do I prepare for migration?

To prepare for the Sandbox migration, and before and after migration to your Production environment, review the ICRT Service Migration to the Cloud API and Application Integration Service document.

 

4. Where are all my processes and other assets?

All your assets have been migrated to the Default folder on the Explore page.

 

5. I moved a Mapping Task out of the Default folder. Now, a process that calls the Mapping Task throws an error. What do I do?

If a process uses the Run Cloud Task service, and you have moved the cloud asset from the Default project to another project or folder, you must go to the process and reselect the cloud task. Save and publish the process.

 

6. Will the Schedules that I created in Informatica Cloud Real Time be migrated?

Yes, all schedules will be migrated to your sandbox account. However, to avoid multiple invocations of the same process, they will be migrated in the 'Not Enabled' state. Informatica suggests that you verify that no processes are scheduled during the migration window.

 

7. Will the Informatica Cloud Real-Time Service URLs still be valid?

Yes, old service URLs will still be valid post sandbox migration. Continue to use these URLs until Informatica migrates your organization to a production account. Do not embed sandbox service URLs anywhere.

 

8. Do I need to republish assets after migration?

No, you do not need to republish assets after migration. All published assets will be migrated in the published state.

 

9. Will my Secure Agent be migrated? Do I need to download an agent again?

Your Secure Agent will be migrated, as will all assets that are published to the agent. You do not need to download an agent again.

 

10. What are the Cloud Application Integration IP Address ranges that you need to add to your list of approved IP addresses?

Please review KB article 535281 (https://kb.informatica.com/faq/7/Pages/21/535281.aspx) for the approved domain names and IP address ranges.

 

11. What will happen to the existing managed APIs after the migration?

Managed APIs will not be migrated. If an API Manager customer needs to migrate the existing managed APIs, then the customer should contact the customer success manager or support manager.

 

12. Will the managed API URLs still be valid?

No, old managed API URLs will not be valid post-production migration unless you arrange with your customer success manager to migrate the existing managed APIs. Instead, create a new managed API for each service that you want to manage, and use the new URL.

If you do request migration of existing managed APIs, the old URLs will resolve after production migration and DNS resolution. However, it is recommended that you use the new URLs.

 

13. Can I still use the old API Manager?

After production migration, do not use the old API Manager. Instead, use Informatica Intelligent Cloud Services API Manager to create and perform all operations with managed APIs.

 

14. Can I use sandbox API Manager URLs after production migration?

No, do not embed sandbox URLs anywhere.

 

15. Do I need to manually update the Salesforce guide setup URLs after migration?

Yes, you must manually update the Salesforce guide setup URLs after migration. The guide setup URLs will not be automatically redirected after migration.

 

You must log in to Salesforce and manually update the Informatica Cloud Real Time Host URL specified under advanced settings in the Guide Setup tab. If you had embedded guide URLs, you must manually update them with the new URLs after migration. See the next question for more information.

 

16. What are the post-migration tasks that I must perform for the Salesforce managed package?

If you use the Salesforce managed package, you must perform the following tasks after migration:

 

Log in to Salesforce and verify your guides.

Log in to Salesforce and verify that your guides are visible on the relevant Salesforce object pages. If you do not see your guides, log out of Salesforce, clear the browser cache, and then log in to Salesforce.

 

Log in to Salesforce and verify the Informatica Cloud Real Time Host URL.

Log in to Salesforce and verify that the Informatica Cloud Real Time Host URL specified under advanced settings in the Guide Setup tab is correct.

The format of the Host URL must be as follows: <Cloud Application Integration URL>,<Informatica Organization ID>

 

For example: https://na1.ai.dm-us.informaticacloud.com,afdq9RWEA4vjIQWQcE88XB

To view your Cloud Application Integration URL, log in to Informatica Intelligent Cloud Services and select the Application Integration service. From the browser address bar, copy the URL from https through .com, as shown in the following image:

To view your Informatica Organization ID, log in to Informatica Intelligent Cloud Services, select the Administrator service and then click Organization. Copy the Informatica Organization ID displayed in the ID field as shown in the following image:
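Putting the two values together, the Host URL is simply the CAI URL and the organization ID joined by a comma; a purely illustrative helper:

```python
def guide_setup_host_url(cai_url: str, org_id: str) -> str:
    # Format: <Cloud Application Integration URL>,<Informatica Organization ID>
    return f"{cai_url},{org_id}"

# Using the example values from above:
host_url = guide_setup_host_url(
    "https://na1.ai.dm-us.informaticacloud.com", "afdq9RWEA4vjIQWQcE88XB"
)
```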

17. Can I use Guide Designer in Salesforce to create a new guide?

No. Informatica does not support Guide Designer in Salesforce. To create a guide, log in to Informatica Intelligent Cloud Services and use the Application Integration Guide Designer.

 

18. What are the post-migration tasks that I must perform for Data Integration tasks that use custom permissions?

If you had assigned custom permissions to a Data Integration task and are invoking the Data Integration task through an Application Integration process or a guide, after migration, you must complete either of the following tasks:

  • Give the Application Integration anonymous user permission to run the associated Data Integration asset.
  • Add the Application Integration anonymous user to a user group that has permission to run the associated Data Integration asset.

The following image shows an Application Integration anonymous user account that is authorized to run a Data Integration mapping:

 

More Information:

If you have licensed Application Integration, Informatica Intelligent Cloud Services creates a system user called CAI_Anonymous_<Organization_ID>. Application Integration needs this user when you invoke an anonymous process that calls a Data Integration task.

 

Important: Do not edit or delete the Application Integration anonymous user if you need to invoke an anonymous process that calls a Data Integration task.

 

The following image shows an Application Integration anonymous user account named CAI_Anonymous_6gPInky12gwbSxPUcH8v0H:

 

19. After migration, why does a process fail if it connects to a service that uses TLS version 1.1 or earlier?

By default, Application Integration uses TLS version 1.2 to connect to third-party services. TLS version 1.1 has been deprecated.

If a process connects to a service that uses TLS version 1.1 or earlier, you must manually edit the server ssl-enabled-protocols property to point to TLS version 1.1.

 

Perform the following steps after migration:

  1. In the Data Integration home page, click Administrator.
  2. Click Runtime Environments.
  3. Click the Secure Agent for which you want to configure TLS.
  4. Click Edit.
  5. Under the System Configuration Details section, select the service as Process Server, and select the type as server.
  6. Click the Edit (pencil) icon next to the server ssl-enabled-protocols property and set the value to 'TLSv1.1'.
  7. Restart the Secure Agent for the changes to take effect.

 

20. When I start the Process Server on a UNIX operating system, why do I see the following errors:
Cannot write to temp location [/tmp]
"java.io.FileNotFoundException: ... (Too many open files)".

These errors occur because UNIX limits the number of files that a single process can create. The NOFILE parameter defines this maximum number of open files; the default is 1024.

 

Edit the UNIX security configuration file to allow a larger number of files to be opened. Raise the NOFILE parameter from its default value of 1024; a value of 10240 should suffice.

 

Open the file /etc/security/limits.conf and add the following line, where <user> is the operating system user that runs the Process Server (the '-' sets both the soft and hard limits):

<user> - nofile 10240

 

If you are unsure of the value, you can set the value to unlimited.
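After editing limits.conf and starting a fresh login session, you can verify the new limit from the shell that launches the Process Server:

```shell
# Show the current per-process open-file soft limit for this shell.
ulimit -n

# Show the hard limit, up to which the soft limit may be raised.
ulimit -Hn
```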

 

21. After migration, why does a process still show the status as running even though it successfully completed earlier in Informatica Cloud Real Time?

If you had published a process on the agent and the status still shows as running even though the process successfully completed in Informatica Cloud Real Time, you must manually apply a patch to fix the issue. For more information, see the following Informatica Knowledge Base article: 566279.

 

Integration Hub Questions

 

1. What will happen to the existing Integration Hub artifacts after the migration?

Integration Hub artifacts will not be migrated. Informatica Cloud publication and subscription mappings and tasks will be migrated and can be used to create Integration Hub artifacts in Informatica Intelligent Cloud Services. If an Integration Hub customer needs to use the existing Integration Hub artifacts in Informatica Intelligent Cloud Services, then the customer should contact the customer success manager or support manager prior to the migration.

 

B2B Gateway Questions

 

1. What will happen to organizations that are defined in B2B Gateway with invalid Informatica Cloud user credentials?

Organizations with invalid Informatica Cloud user credentials will not be migrated. Before you start the migration, verify that the Informatica Cloud user credentials in the B2B Gateway Settings page are valid.

 

2. What will happen to existing B2B Gateway artifacts after the migration?

All of your B2B Gateway artifacts, including customers, suppliers, and monitoring rules, will be migrated. Customer and supplier schedules will be created in Administrator.

 

3. What will happen to intelligent structure models that are used in B2B Gateway partner definition after the migration?

Intelligent structure models that are used in B2B Gateway partner definitions will be created in Data Integration.

 

4. What will happen to B2B Gateway events after the migration?

B2B Gateway events will not be migrated.

 

5. Will the URLs of the Run Partner API and Event Status API still be valid?

No, Run Partner API and Event Status API URLs will not be valid after the migration. You must update the API requests with the new URLs, as follows:

  • Run Partner API request: https://<pod>-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v1/partner/run
  • Event Status request: https://<pod>-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v2/event/status/<eventId>

...where <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access Cloud B2B Gateway. For example: usw1.
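If you construct these endpoints in code, a small helper keeps the POD and event ID substitution in one place (illustrative sketch built from the URL templates above):

```python
def run_partner_url(pod: str) -> str:
    # Run Partner API endpoint for the given IICS PoD (e.g. "usw1").
    return f"https://{pod}-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v1/partner/run"

def event_status_url(pod: str, event_id: str) -> str:
    # Event Status API endpoint for a specific event ID.
    return f"https://{pod}-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v2/event/status/{event_id}"
```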

Cloud Integration Hub Technical Deep Dive Webinar and Demo

Scott Hedrick, Director Product Marketing, Etty Afriat, Director Product Management, Amit Vaswani, Product Specialist Leader
Sep 6, 2017 8:00 AM PT     Click here to Register

 

Are you starting to build up difficult-to-manage data integration spaghetti as your cloud data integrations proliferate? Organize and simplify multi-point cloud data integrations with Cloud Integration Hub, the modern publish/subscribe solution for efficiency and productivity. Reduce data transfer and API charges by removing redundant cloud synchronizations with Informatica's Hub for data. Jumpstart your use of Cloud Integration Hub with the new Salesforce Accelerator. Existing Informatica Intelligent Cloud Services Premium and Advanced customers can get started right away with the one year of basic Cloud Integration Hub included with their subscriptions.

In this webinar, we will:

 

  • Provide a technical overview of the key capabilities and value of Cloud Integration Hub by Informatica experts
  • Demo the latest version of Cloud Integration Hub including the new Salesforce Accelerator
  • Show how existing Informatica Intelligent Cloud Services customers can get started with the one year of basic Cloud Integration Hub included in their subscriptions.


We will be moving the data center for App2. The migration should be seamless to you, and there will be no product changes. If you currently whitelist Informatica Cloud IP addresses, you will need to whitelist the new IP addresses before the data center move. To validate your whitelisting, go to https://app20.informaticacloud.com/saas/icsalive.html from any browser within your network and verify that you can connect and see the message "ICS NODE IS ALIVE".
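A scripted version of that browser check, for running from inside your network (a hedged sketch; the marker text is taken from the instruction above):

```python
import urllib.request

ALIVE_URL = "https://app20.informaticacloud.com/saas/icsalive.html"
EXPECTED_MARKER = "ICS NODE IS ALIVE"

def page_is_alive(body: str) -> bool:
    # The check passes when the health page contains the marker text.
    return EXPECTED_MARKER in body

def check_connectivity(url: str = ALIVE_URL, timeout: int = 10) -> bool:
    # A connection failure here usually means the new IP addresses have not
    # been whitelisted yet.
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return page_is_alive(resp.read().decode("utf-8", errors="replace"))
```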

 

What IP Addresses need to be whitelisted?

  1. 168.128.27.61/32
  2. 168.128.27.47/32
  3. 168.128.29.12/32
  4. 168.128.29.92/32

 

Can I remove the old IP addresses?

No, you cannot. Please retain the old IP addresses; we use them for disaster recovery (DR) and as an entry point for app.informaticaondemand.com.

 

Why are we moving data centers?

The new data center has the latest infrastructure and can provide room for growth, higher availability and reliability.

 

Will there be any product changes?

No, we will not be making any changes to the product during this move. This is not a release of any product, and there will be no product enhancements or changes during the move.

 

Is there an update of the agent?

The agent will not be changed in any way. There are no application, patch, package or agent updates during this move.

 

Do I have to whitelist the IPs to run jobs after the migration?

You only need to whitelist the IPs if you had to do so for the current data center. We strongly recommend using the pre-release agent and testing connectivity to verify and validate your organization's readiness.

 


If you have any questions, please contact us through Global Customer Support at 1-877-INFAHELP from the US, or click here for worldwide numbers. You may also email pre-release@informatica.com with any questions or concerns regarding this notice.

 

For more information on IP address ranges in Informatica Cloud, including previous IP ranges, see KB article 112401.

ROOT CAUSE ANALYSIS REPORT

 

 

OVERVIEW

Applicable to

Informatica Customers on APP2 Only

PRODUCTS AFFECTED

ICS, ICRT, DQ RADAR, B2B

   INCIDENT START/END: June 20, 2017 22:35 PDT – June 21, 2017 15:15 PDT

RCA COMPLETED:  6/22/2017

INCIDENT DETAILS

INCIDENT SUMMARY

The ICS product hosted on "APP2" became unavailable to handle customer traffic via Secure Agents because of a "channel issue due to insufficient memory". Service was intermittent on Tuesday and escalated into a full outage on Wednesday. This caused other dependent services that rely on ICS, including ICRT, DQ Radar, and API Gateway, to be unavailable. The customer impact was the inability to process any jobs through Informatica Cloud Services. The root cause of this incident was server memory/swap-space exhaustion on the channel server nodes, which made the service response intermittent or unavailable because agent-to-host communication was impacted.

ROOT CAUSE ANALYSIS

The memory footprint on the channel server nodes had increased after the recent Informatica Cloud upgrades as part of new functionality, but remained within the limits of tolerance. An errant backup process started consuming 50% of available memory, 90% of CPU time, and swap space, which caused the machine to hang and become unavailable. This resulted in a communication failure between the agents and the host and caused ICS tasks to hang or fail. Real-time messages were also not processed during this disruption.

 

The initial symptom was observed Tuesday night (June 20th) at 22:35 PDT, when task and process failures were detected for a few orgs. Initial remediation addressed the problem; however, it reappeared Wednesday morning (June 21st) at approximately 07:30 PDT and caused the channel server nodes to go down, making ICS, ICRT, and related services unavailable for all customers on APP2.

 

Note: A Disaster Recovery (DR) process was started the morning of June 21st, but was later stopped once we addressed the issue on the APP2 host.

 

Once the problem was addressed on the channel server nodes and the nodes were restarted, they operated normally. ICS tasks were manually restarted or automatically executed based on schedules.

 

Applicable to ICRT customers: after an agent restart, ICRT processes operated normally.

Applicable to ICRT outbound messaging customers: The ICRT Salesforce outbound listener was resumed to process the incoming messages from Salesforce.

 

 

 

RISK-REDUCTION REMEDIATION ACTIONS TAKEN

Actions that have already been taken to reduce the risk of a future occurrence.

DATE

 

ACTION TAKEN and Planned for this incident

6/21/2017

1. Audit system parameter monitoring and alerts. Swap-space monitoring for this channel server was missing; we have now added swap-space monitoring to the monitoring/alerting system, and it has been tested and is working.

2. Introducing an additional channel server to distribute the load. Adding this additional server will not impact customers.

3. Re-evaluate the emergency notification and escalation process, including more frequent simulations.

4. An RCA with the vendor is in progress to determine the backup process's impact on the system, which triggered a spike in swap-space utilization. The backup service is optional and has been turned off on the channel server nodes.

 

Long Term Remediation Plan

Roadmap to a quicker recovery to reduce the downtime duration

DATE

Plans

7/14/2017

Review the current DR process and adjust criteria for when to start the DR process.

7/14/2017

Work with hosting provider and gain full visibility of all processes running on the nodes.

7/14/2017

Add additional monitoring for relevant system parameters and define thresholds on which we can get alerted.

8/15/2017

Additional emergency communication & notification plans (other than trust site updates).

TBD

An enhancement to the Trust Site to subscribe for notifications.

Preface/Overview:

 

The main goal of this feature is to encourage and enable existing PowerCenter customers to leverage the wide connectivity options available in Informatica Cloud.

Informatica Cloud has over 150 connectors, a much higher number than is available in PowerCenter.

 

Using this feature and architecture, PowerCenter customers can run jobs where the source or target is an Informatica Cloud-based connector.

 

CaaS - Features

 

·        The CaaS architecture is designed such that minimal Informatica Cloud knowledge/skill is required by  a PowerCenter Developer/Administrator

·        Tasks like developing a CaaS based mapping and running it, follows the regular PowerCenter clients and development flow.

·        The only additional steps required are:

o   Login to the Informatica Cloud Server (web portal)

o   Download and install the Informatica Cloud Secure Agent.

o   Create connections in the ICS Org, for the desired end-points.

               (All these steps need to be performed only once)

·        Currently, several ICS connectors are supported by CaaS architecture. More connectors will be added to this list soon.

·        Pros :

o   One can leverage ICS Cloud Connectors from within PowerCenter

o   No learning curve i.e. no additional skill required for PowerCenter developer.

o   Seamless integration between PowerCenter and Informatica Cloud

o   Informatica internal advantage – If a connector is already available on Cloud need not develop a PowerCenter connector for the same.

·        Cons :

o   Requires Secure Agent installation in the customer’s network.

o   Performance can be slower compared to a regular task as this architecture involves data transfer between 2 DTMs (Described below).

o   Firewall modifications are required to enable PowerCenter Server communication with the ICS Server.

 

Mapping Internal Conversion:

 

A CaaS-based PowerCenter mapping is internally converted into two mappings, as described in the diagram below:

pc1.jpg

                     (Runs on Secure Agent machine)                         (Runs on PowerCenter Server machine)

 

 

Run-time behavior of a CaaS based PC Mapping:

 

The figure below shows the activities that occur when a CaaS-based job is triggered from PowerCenter.

In the diagram, the scenario is a PC mapping with a CaaS-based source and a regular target.

 

pc2.jpg

 

 

1)     PowerCenter Integration Service spawns the pmdtm process.

2)     The pmdtm process detects that the source is a CaaS based source (not a regular source) and sends a request to the ICS Server.

The request contains the required mapping metadata (ICS connection, ICS source object, etc.).

3)     ICS Server generates a Cloud Mapping on the fly and sends an execute request to the Secure Agent.

4)     The Secure Agent spawns a DTM process (Cloud DTM) and executes the generated Cloud Mapping.

5)     The Cloud DTM process reads data from the Source end-point.

6)     The Cloud DTM feeds this data to the PowerCenter pmdtm.

7)     PowerCenter pmdtm executes any other transformation in the PowerCenter mapping and finally loads data into the Target.

Are you facing issues after the Informatica Cloud Spring '17 upgrade? We are here to help. We have identified some of the common issues encountered by users and pulled together a list of solutions to help you address them. Read on to know more.

 

Note: If you have any concerns please open a case with Informatica Support by clicking on Contact Support option within Informatica Cloud.

 

SUMMARY

 

At the bottom of this post, we have a running list of known issues in Spring '17 upgrade and their solutions. We will continue to update this list so check back later for more updates. If you already have a network.informatica.com login, we recommend following  this post so that you can automatically receive updates for any new issues.

 

You can also follow us on Twitter @INFAsupport for all the latest updates on Informatica Cloud Spring '17

 

ISSUES AND SOLUTIONS (APP2/3) April 22, 2017

Spring '17 Documentation

Informatica Cloud Spring 2017 - Helping customers accelerate their journey to the cloud

Issue: All tasks are failing post the upgrade and the session log is not generated.

Internal error. The DTM process terminated unexpectedly

 

Solution:

 

Check if the session log directory is a non-default value. If so, revert it to the default value.

Steps:

Log in to the ICS UI and navigate to Configure > Secure Agent.

Under System Configuration Details, choose Service = Data Integration Server and Type = DTM.

Check if the value of setting ‘$PMSessionLogDir’ is the default value '$PMRootDir\..\logs'.

If not, edit and revert back to the default value.

 

Wait for the Data Integration Service to restart automatically and then start the task.

Issue: The upgrade was not successful. [================================  02/06/2016  1:36:27.41  copy upgrades\fullupgrade\FileLockDetect.exe FileLockDetect.exe        1 file(s) copied. Waiting for file handles to clean up ERROR: Input redirection is not supported, exiting the process immediately. JAVA_HOME: .\jre agentCoreMajorUpgrMarker exists  Do major upgrade.  Interim agent exists before upgrade, so remove it first.  main2\tomcat\work\Tomcat - The directory is not empty. main2 cannot be removed.  ]


 

Solution: Please visit the discussion link: https://network.informatica.com/message/199683

Issue: “[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified Database driver error” with Oracle/SQL Server/MySQL tasks after Informatica Cloud Spring 2017 upgrade on Windows Secure Agent

 

Solution:

This error may happen if the Registry entries that are sent as part of this upgrade were not updated during the upgrade.

Please follow the steps in this KB to resolve the issue: https://kb.informatica.com/solution/23/Pages/61/511902.aspx

Issue: The test connection for $SAP_CNX failed. Could not initialize class com.sap.conn.

jco.rt.JCo.JCoRuntimeFactory

The test connection for $SAP_CNX failed. Error getting the version of the native layer: Java.lang.UnsatisifiedLinkError: no sapjco3 in java.library.

 

Solution: You need the correct JCo libraries for the 64-bit agent in the environment. If you have moved from a 32-bit to a 64-bit agent, you need to download and set up the JCo files before connectivity to SAP can be established.

 

See the Solution in this KB 504336

Issue: Tasks with Hierarchy Builder transformation and REST V2 connector not processing data after Spring 2017 upgrade.

 

Solution:

This can happen if hierarchical data is sent to a string field in the REST V2 connector. The Request payload seen in the session log would show that there are “\” escape characters added to the request.

 

This issue can be resolved by adding a JVM Option to the Secure Agent configuration.

-      Navigate to Configure > Runtime Environments > Secure Agent.

-      Select “Data Integration Server” under Service and select Type as “DTM”

-      Edit JVMOption1/2/3, etc. (whichever is available) and add the parameter '-DPromoteToArray=false'

-      Click on “Ok” to save the setting and wait for the Data Integration service to restart.

-      Run the task after the new Data Integration service is up.

Issue: Tasks that use Business Services will fail if the Business Services are edited. Existing unmodified tasks and newly created tasks will work fine.

 

Details:

The Spring 2017 release includes the following enhancement for REST V2 Connector:

 

A fault group to process fault response is enabled when you create a new business service or edit an existing business service.

 

After you upgrade to Spring 2017 release, if you edit a business service that is used in a REST V2 midstream transformation, the respective mapping will fail.


Solution:

 

To avoid this issue, do not edit the business service. If you edit the business service, recreate the midstream transformation and map all the required fields.

Issue: CA Certs Post Upgrade Task

 

 

Solution: This applies ONLY if, after past releases, you had to copy the security JARs from the jre2 backup directory back to the jre directory. Because the directory structure changes with this release, you will need to copy these files to the new location:

 

1.      Copy the local_policy.jar and the US_export_policy.jar files from the following directory:

<Secure Agent installation directory>\jre2\lib\security

2.      Paste the jar files to the following directory:

<Secure Agent installation directory>\jre\lib\security

3.      Restart the agent


This applies to connectors that use security certificates, such as Amazon (Redshift, S3) and Microsoft Dynamics CRM.
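The two copy steps above can be sketched as a small shell function. This is a convenience sketch only, assuming a Linux-style agent install where the jre2 backup and jre directories sit under the agent install directory as in the paths above; it is not an official Informatica utility.

```shell
# Sketch only: restore the JCE policy jars after an upgrade.
# Assumes <agent install dir>/jre2/lib/security holds the backup and
# <agent install dir>/jre/lib/security is the live directory, per the
# steps above.
copy_policy_jars() {
  agent_home="$1"
  src="$agent_home/jre2/lib/security"
  dst="$agent_home/jre/lib/security"
  mkdir -p "$dst"
  for jar in local_policy.jar US_export_policy.jar; do
    if [ -f "$src/$jar" ]; then
      cp "$src/$jar" "$dst/$jar"
    fi
  done
}
```

Call it as `copy_policy_jars /path/to/agent`, then restart the agent as described in step 3.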

Issue: Jobs on Windows using ODBC fail with the following error: Data source name not found and no default driver specified Database driver error.


Solution: The agent should run as an Administrator.

During the upgrade, the agent updates the registry to add the needed ODBC entries. If your agent does not have permission to update the registry, you may see the following error:

WRT_8001 Error connecting to database... WRT_8001 [Session s_dss_000GSG0I0000000000xx Username YOURUSER Error -1 [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified Database driver error... Function Name : Connect Database driver error... Function Name : Connect Database Error: Failed to connect to database using username and connection string [odbc://dbtype=<oracle or SQLServer>;host=,host.;port=<port>;database=<database>].]

This occurs in Windows for Oracle and SQL Server type connections.

 

Cause: The upgrade includes a new ODBC driver for Oracle and SQL Server, so the installer updates the registry with entries for these two drivers. If the user running the agent does not have permission to modify the registry, the entries are not added, and the task fails when it looks up the registry entry to load the driver.

 

Solution: If your agent is not running as an Administrator, we recommend running it as an Administrator before the upgrade. KB149494 has the steps to run as an Administrator.

 

What if I face the above error after the upgrade?

 

1) Trigger the upgrade by modifying the infaagentversion file (in a text editor) in {agent}/main and change 26.* - to 25.*

2) Restart the agent as admin (steps same as in KB149494)
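The re-trigger step above edits the infaagentversion file by hand; a sketch of the same edit in shell is below. The assumption that the file holds a single dotted version string (e.g. 26.0.1) is mine; verify the file's contents before scripting against it, and leave the `26.* - to 25.*` guidance above as the authoritative instruction.

```shell
# Sketch only: lower the recorded agent version so the agent re-runs
# the upgrade on restart. Assumes the file contains one dotted
# version string; the exact format may differ by release.
retrigger_upgrade() {
  version_file="$1"          # e.g. {agent}/main/infaagentversion
  ver=$(cat "$version_file")
  major=${ver%%.*}           # number before the first dot
  rest=${ver#*.}             # everything after the first dot
  echo "$((major - 1)).$rest" > "$version_file"
}
```

After running it, restart the agent as an Administrator (KB149494), as in step 2 above.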

 

Issue: Upgrade failure. [Cannot download upgrade file:https://icoss1.informaticacloud.com/saas/download/upgrade-22.3.2.1.0.0.0/fullupgrade/linux32/upgrade.zip: [Download failed due to HTTP 403 response.]]

 

Solution: Please visit the document  link: Secure Agent IP Address KB:112401

Issue: Upgrade failure. [Cannot download upgrade file:https://icsdownloadsecure.informatica.com/ICS/r23/mirror2/download/upgrade-22.3.2.1.0.0.0/fullupgrade/win32/upgrade.zip: [Download failed due to HTTP 502 response.]]

 

Solution: This is because our mirror sites have not been whitelisted in your network. Please contact Informatica Cloud Support with your orgid, so that we can push the download via a secure site.

Issue: ODBC DSN not found on Linux. After the upgrade, you will encounter the following error:

The connection test failed because of the following error: [unixODBC][Driver Manager]Data source name not found, and no default driver specified (0)


Solution:


This happens because the odbc.ini file was in the main directory and was backed up to the main2 folder during the upgrade.

To fix the issue, copy the odbc.ini file from infaagent/main2 back to infaagent/main.

Going forward, store odbc.ini outside of main, or keep a copy in the rdtm-extra directory so it is copied back automatically after future upgrades.
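The copy-back fix above can be sketched as a shell function. Note that the exact location of the rdtm-extra directory is an assumption on my part (placed under main here); check the correct path for your agent version before relying on the spare-copy step.

```shell
# Sketch only: restore odbc.ini from the main2 backup after an upgrade.
# The rdtm-extra location below is an assumed path, not confirmed.
restore_odbc_ini() {
  agent="$1"                 # path to the infaagent directory
  if [ -f "$agent/main2/odbc.ini" ]; then
    # put the backed-up file back where the agent expects it
    cp "$agent/main2/odbc.ini" "$agent/main/odbc.ini"
    # keep a spare copy so future upgrades can restore it automatically
    mkdir -p "$agent/main/rdtm-extra"
    cp "$agent/main2/odbc.ini" "$agent/main/rdtm-extra/odbc.ini"
  fi
}
```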

Issue: Copy customized connector configuration files (if any)

If any of your licensed connectors has custom-configured configuration files in your Informatica Cloud Secure Agent, we recommend that you copy these files

From

<Secure Agent installation directory>\main2\bin\rdtm\javalib\<plugin ID>\

 

To

<Secure Agent installation directory>\downloads\<latest connector zip package>\package\rdtm\javalib\<Plugin ID>

 

And

 

From:

<Secure Agent installation directory>\main2\tomcat\plugins\<plugin ID>\

 

To

<Secure Agent installation>\downloads\<latest connector zip package>\package\plugins\<Plugin ID>

 

Connector: Config file(s)
Avature: customFields.ini
Birst: birstconfiguration.ini
Box API: config.properties
Coupa: coupa.ini, read.xsd
Dropbox: config.ini
Eloqua Bulk API: ActivityConfig.json
Google API: config.properties
Hadoop: setHadoopConnectorClassPath.sh
JDBC: jdbc.ini
Jira: config.ini, jirafields.ini
JSON Target: config.ini
Marketo: activityattributes.csv
Open Air: OpenAirCodes.properties
Quickbooks V2: connectionparameters.ini
Workday: fields.ini
XML Source: config.ini

Issue: Will the DiscoveryIQ package be available in the Informatica Spring 2017 release?

 

During the R27 upgrade, the DiscoveryIQ package is automatically removed, and you may want to uninstall the DiscoveryIQ agent (especially in Windows agent environments) from your environment.

Please refer to the blog post for details on supported DiscoveryIQ features and the steps to uninstall the DiscoveryIQ agent.

Issue: Tasks that use an ODBC connection configured with the DataDirect SequeLink driver fail with the error:

Specified driver could not be loaded due to system error  182:  (DataDirect SequeLink 6.0, C:\Program Files\DataDirect\slodbc60\dwslk22.dll).

 

Solution:

Copy the following files from the SequeLink/main2 folder

icuuc34.dll

libeay32.dll

ssleay32.dll

to

%AGENT_HOME%\downloads\package-ICSAgent_Rxx\package\ICS\main\bin\rdtm folder.

(Choose option ‘Copy and Replace’ when prompted)

 

This change will cause tasks that use Salesforce Connection with API version 32 or less to fail with:

Couldn't load the library [pmsfdcXXX.dll] for plug-in #310600.  Error msg: [Database driver event...Error occurred loading library [pmsfdcXXX.dll]. System error encountered is 182. Error text is The operating system cannot run %1.].

 

To resolve the problem with the Salesforce connection, change the Service URL in Salesforce to use API version 33 or higher.

 

ISSUES AND SOLUTIONS (APP) April 15, 2017

Spring '17 Documentation

Informatica Cloud Spring 2017 - Helping customers accelerate their journey to the cloud

Issue: The upgrade was not successful. [================================  02/06/2016  1:36:27.41  copy upgrades\fullupgrade\FileLockDetect.exe FileLockDetect.exe        1 file(s) copied. Waiting for file handles to clean up ERROR: Input redirection is not supported, exiting the process immediately. JAVA_HOME: .\jre agentCoreMajorUpgrMarker exists  Do major upgrade.  Interim agent exists before upgrade, so remove it first.  main2\tomcat\work\Tomcat - The directory is not empty. main2 cannot be removed.  ]


 

Solution: Please visit the discussion link: https://network.informatica.com/message/199683

Issue: The test connection for $SAP_CNX failed. Could not initialize class com.sap.conn.

jco.rt.JCo.JCoRuntimeFactory

The test connection for $SAP_CNX failed. Error getting the version of the native layer: Java.lang.UnsatisifiedLinkError: no sapjco3 in java.library.

 

Solution: You need the correct JCo libraries for the 64-bit agent in the environment. If you have moved from a 32-bit to a 64-bit agent, you need to download and set up the JCo files before connectivity to SAP can be established.

 

See the Solution in this KB 504336

Issue: CA Certs Post Upgrade Task

 

 

Solution: This applies ONLY if, after past releases, you had to copy the security JARs from the jre2 backup directory back to the jre directory. Because the directory structure changes with this release, you will need to copy these files to the new location:

 

1.      Copy the local_policy.jar and the US_export_policy.jar files from the following directory:

<Secure Agent installation directory>\jre2\lib\security

2.      Paste the jar files to the following directory:

<Secure Agent installation directory>\jre\lib\security

3.      Restart the agent


This applies to connectors that use security certificates, such as Amazon (Redshift, S3) and Microsoft Dynamics CRM.

Issue: Jobs on Windows using ODBC fail with the following error: Data source name not found and no default driver specified Database driver error.


Solution: The agent should run as an Administrator.

During the upgrade, the agent updates the registry to add the needed ODBC entries. If your agent does not have permission to update the registry, you may see the following error:

WRT_8001 Error connecting to database... WRT_8001 [Session s_dss_000GSG0I0000000000xx Username YOURUSER Error -1 [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified Database driver error... Function Name : Connect Database driver error... Function Name : Connect Database Error: Failed to connect to database using username and connection string [odbc://dbtype=<oracle or SQLServer>;host=,host.;port=<port>;database=<database>].]

This occurs in Windows for Oracle and SQL Server type connections.

 

Cause: The upgrade includes a new ODBC driver for Oracle and SQL Server, so the installer updates the registry with entries for these two drivers. If the user running the agent does not have permission to modify the registry, the entries are not added, and the task fails when it looks up the registry entry to load the driver.

 

Solution: If your agent is not running as an Administrator, we recommend running it as an Administrator before the upgrade. KB149494 has the steps to run as an Administrator.

 

What if I face the above error after the upgrade?

 

1) Trigger the upgrade by modifying the infaagentversion file (in a text editor) in {agent}/main and change 26.* - to 25.*

2) Restart the agent as admin (steps same as in KB149494)

 

Issue: Upgrade failure. [Cannot download upgrade file:https://icoss1.informaticacloud.com/saas/download/upgrade-22.3.2.1.0.0.0/fullupgrade/linux32/upgrade.zip: [Download failed due to HTTP 403 response.]]

 

Solution: Please visit the document  link: Secure Agent IP Address KB:112401

Issue: Upgrade failure. [Cannot download upgrade file:https://icsdownloadsecure.informatica.com/ICS/r23/mirror2/download/upgrade-22.3.2.1.0.0.0/fullupgrade/win32/upgrade.zip: [Download failed due to HTTP 502 response.]]

 

Solution: This is because our mirror sites have not been whitelisted in your network. Please contact Informatica Cloud Support with your orgid, so that we can push the download via a secure site.

Issue: ODBC DSN not found on Linux. After the upgrade, you will encounter the following error:

The connection test failed because of the following error: [unixODBC][Driver Manager]Data source name not found, and no default driver specified (0)


Solution:


This happens because the odbc.ini file was in the main directory and was backed up to the main2 folder during the upgrade.

To fix the issue, copy the odbc.ini file from infaagent/main2 back to infaagent/main.

Going forward, store odbc.ini outside of main, or keep a copy in the rdtm-extra directory so it is copied back automatically after future upgrades.

Issue: Copy customized connector configuration files (if any)

If any of your licensed connectors has custom-configured configuration files in your Informatica Cloud Secure Agent, we recommend that you copy these files

From

<Secure Agent installation directory>\main2\bin\rdtm\javalib\<plugin ID>\

 

To

<Secure Agent installation directory>\downloads\<latest connector zip package>\package\rdtm\javalib\<Plugin ID>

 

And

 

From:

<Secure Agent installation directory>\main2\tomcat\plugins\<plugin ID>\

 

To

<Secure Agent installation>\downloads\<latest connector zip package>\package\plugins\<Plugin ID>

 

Connector: Config file(s)
Avature: customFields.ini
Birst: birstconfiguration.ini
Box API: config.properties
Coupa: coupa.ini, read.xsd
Dropbox: config.ini
Eloqua Bulk API: ActivityConfig.json
Google API: config.properties
Hadoop: setHadoopConnectorClassPath.sh
JDBC: jdbc.ini
Jira: config.ini, jirafields.ini
JSON Target: config.ini
Marketo: activityattributes.csv
Open Air: OpenAirCodes.properties
Quickbooks V2: connectionparameters.ini
Workday: fields.ini
XML Source: config.ini

Issue: Will the DiscoveryIQ package be available in the Informatica Spring 2017 release?

 

During the R27 upgrade, the DiscoveryIQ package is automatically removed, and you may want to uninstall the DiscoveryIQ agent (especially in Windows agent environments) from your environment.

Please refer to the blog post for details on supported DiscoveryIQ features and the steps to uninstall the DiscoveryIQ agent.

Issue: Tasks that use Business Services will fail if the Business Services are edited. Existing unmodified tasks and newly created tasks will work fine.

 

Details:

The Spring 2017 release includes the following enhancement for REST V2 Connector:

 

A fault group to process fault response is enabled when you create a new business service or edit an existing business service.

 

After you upgrade to Spring 2017 release, if you edit a business service that is used in a REST V2 midstream transformation, the respective mapping will fail.


Solution:

 

To avoid this issue, do not edit the business service. If you edit the business service, recreate the midstream transformation and map all the required fields.

Issue: Tasks that use an ODBC connection configured with the DataDirect SequeLink driver fail with the error:

Specified driver could not be loaded due to system error  182:  (DataDirect SequeLink 6.0, C:\Program Files\DataDirect\slodbc60\dwslk22.dll).

 

Solution:

Copy the following files from the SequeLink/main2 folder

 

libeay32.dll

ssleay32.dll

to

%AGENT_HOME%\downloads\package-ICSAgent_Rxx\package\ICS\main\bin\rdtm folder.

(Choose option ‘Copy and Replace’ when prompted)

 

This change will cause tasks that use Salesforce Connection with API version 32 or less to fail with:

Couldn't load the library [pmsfdcXXX.dll] for plug-in #310600.  Error msg: [Database driver event...Error occurred loading library [pmsfdcXXX.dll]. System error encountered is 182. Error text is The operating system cannot run %1.].

 

To resolve the problem with the Salesforce connection, change the Service URL in Salesforce to use API version 33 or higher.

Are you facing issues after the Informatica Cloud Fall '16 upgrade? We are here to help. We have identified some of the common issues encountered by users and pulled together a list of solutions to help you address them. Read on to know more.

 

Note: If you have any concerns please open a case with Informatica Support by clicking on Contact Support option within Informatica Cloud.

 

SUMMARY

 

At the bottom of this post, we have a running list of known issues in Fall '16 upgrade and their solutions. We will continue to update this list so check back later for more updates. If you already have a network.informatica.com login, we recommend following  this post so that you can automatically receive updates for any new issues.

 

You can also follow us on Twitter @INFAsupport for all the latest updates on Informatica Cloud Fall '16

 

ISSUES AND SOLUTIONS (APP2 and APP3) Nov 19, 2016 -

 

Issue: Upgrade failure. [Cannot download upgrade file:https://icsdownloadsecure.informatica.com/ICS/r23/mirror1/download/upgrade-24.1.2.1.0.0.0/fullupgrade/win32/upgrade.zip: [There is not enough space on the disk]]

 

Solution: The minimum free space required for the upgrade is equal to 2X size of the current <SecureAgent>\Main directory plus 1 GB. Please ensure that enough disk space is available.

You may remove old upgrade backups such as Main2, Main3, Main4, etc. You can also check the cache directory and remove any files there, and remove tomcat or session logs that have grown large or accumulated over time.
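The space requirement above (twice the size of the Main directory plus 1 GB) can be checked with a small shell sketch. This is an illustrative helper, not an Informatica tool; pass it the path to your agent's Main directory and compare the result against the free space reported by `df -k` for that volume.

```shell
# Sketch only: estimate the free space (in KB) needed for the upgrade,
# per the guidance above: 2x the size of the Main directory + 1 GB.
required_upgrade_kb() {
  main_dir="$1"
  main_kb=$(du -sk "$main_dir" | awk '{print $1}')
  echo $(( main_kb * 2 + 1048576 ))    # 1 GB = 1048576 KB
}
```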

ISSUE 2: The upgrade was not successful. [================================  02/06/2016  1:36:27.41  copy upgrades\fullupgrade\FileLockDetect.exe FileLockDetect.exe        1 file(s) copied. Waiting for file handles to clean up ERROR: Input redirection is not supported, exiting the process immediately. JAVA_HOME: .\jre agentCoreMajorUpgrMarker exists  Do major upgrade.  Interim agent exists before upgrade, so remove it first.  main2\tomcat\work\Tomcat - The directory is not empty. main2 cannot be removed.  ]

 

SOLUTION 2: Please visit the discussion link: https://network.informatica.com/message/199683

Problem: Jobs on Windows using ODBC fail with the following error: Data source name not found and no default driver specified Database driver error.


Solution: The agent should run as an Administrator.

During the upgrade, the agent updates the registry to add the needed ODBC entries. If your agent does not have permission to update the registry, you may see the following error:

WRT_8001 Error connecting to database... WRT_8001 [Session s_dss_000GSG0I0000000000xx Username YOURUSER Error -1 [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified Database driver error... Function Name : Connect Database driver error... Function Name : Connect Database Error: Failed to connect to database using user [xxx] and connection string [odbc://dbtype=<oracle or SQLServer>;host=,host.;port=<port>;database=<database>].]

This occurs in Windows for Oracle and SQL Server type connections.

 

Cause: The upgrade includes a new ODBC driver for Oracle and SQL Server, so the installer updates the registry with entries for these two drivers. If the user running the agent does not have permission to modify the registry, the entries are not added, and the task fails when it looks up the registry entry to load the driver.

 

Solution: If your agent is not running as an Administrator, we recommend running it as an Administrator before the upgrade. KB149494 has the steps to run as an Administrator.

 

What if I face the above error after the upgrade?

 

1) Trigger the upgrade by modifying the infaagentversion file (in a text editor) in {agent}/main and change 26.* - to 25.*

2) Restart the agent as admin (steps same as in KB149494)

 

Issue: Upgrade failure. [Cannot download upgrade file:https://icoss1.informaticacloud.com/saas/download/upgrade-22.3.2.1.0.0.0/fullupgrade/linux32/upgrade.zip: [Download failed due to HTTP 403 response.]]

 

Solution: Please visit the document  link: Secure Agent IP Address KB:112401

Issue: Upgrade failure. [Cannot download upgrade file:https://icsdownloadsecure.informatica.com/ICS/r23/mirror2/download/upgrade-22.3.2.1.0.0.0/fullupgrade/win32/upgrade.zip: [Download failed due to HTTP 502 response.]]

 

Solution: This is because our mirror sites have not been whitelisted in your network. Please contact Informatica Cloud Support with your orgid, so that we can push the download via a secure site.

Issue: ODBC DSN not found on Linux. After the upgrade, you will encounter the following error:

The connection test failed because of the following error: [unixODBC][Driver Manager]Data source name not found, and no default driver specified (0)


Solution:


This happens because the odbc.ini file was in the main directory and was backed up to the main2 folder during the upgrade.

To fix the issue, copy the odbc.ini file from infaagent/main2 back to infaagent/main.

Going forward, store odbc.ini outside of main, or keep a copy in the rdtm-extra directory so it is copied back automatically after future upgrades.

Issue: Problems reading data with the Microsoft Excel connector:

You might notice issues when using the Microsoft Excel connector on ICS. The test connection succeeds, but when you select the connection as a source in a DSS or mapping task, you receive an 'error occurred while fetching objects' error.


Solution:

We have a solution for this issue and customers can reach out to Informatica Support to get the patch.

Issue: Data source name not found and no default driver specified Database driver error


Solution:

The agent should run as an Administrator.

During the upgrade, the agent updates the registry to add the needed ODBC entries. If your agent does not have permission to update the registry, you may see the following error:

WRT_8001 Error connecting to database... WRT_8001 [Session s_dss_000GSG0I0000000000xx Username YOURUSER Error -1 [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified Database driver error... Function Name : Connect Database driver error... Function Name : Connect Database Error: Failed to connect to database using user [xxx] and connection string [odbc://dbtype=<oracle or SQLServer>;host=,host.;port=<port>;database=<database>].]

This occurs in Windows for Oracle and SQL Server type connections.

 

Cause: The upgrade includes a new ODBC driver for Oracle and SQL Server, so the installer updates the registry with entries for these two drivers. If the user running the agent does not have permission to modify the registry, the entries are not added, and the task fails when it looks up the registry entry to load the driver.

 

Solution: If your agent is not running as an Administrator, we recommend running it as an Administrator before the upgrade. KB149494 has the steps to run as an Administrator.

 

What if I face the above error after the upgrade?

 

1) Trigger the upgrade by modifying the infaagentversion file (in a text editor) in {agent}/main and change 26.* to 25.*
2) Restart the agent as admin (steps same as in KB149494)

 

ISSUES AND SOLUTIONS (APP only) Nov 12, 2016 -

 

Issue: Upgrade failure. [Cannot download upgrade file:https://icsdownloadsecure.informatica.com/ICS/r23/mirror1/download/upgrade-24.1.2.1.0.0.0/fullupgrade/win32/upgrade.zip: [There is not enough space on the disk]]

 

Solution: The minimum free space required for the upgrade is equal to 2X size of the current <SecureAgent>\Main directory plus 1 GB. Please ensure that enough disk space is available.

You may remove old upgrade backups such as Main2, Main3, Main4, etc. You can also check the cache directory and remove any files there, and remove tomcat or session logs that have grown large or accumulated over time.

 

Issue: Upgrade failure. [Cannot download upgrade file:https://icoss1.informaticacloud.com/saas/download/upgrade-22.3.2.1.0.0.0/fullupgrade/linux32/upgrade.zip: [Download failed due to HTTP 403 response.]]

 

Solution: Please visit the document  link: Secure Agent IP Address KB:112401

 

Issue: Upgrade failure. [Cannot download upgrade file:https://icsdownloadsecure.informatica.com/ICS/r23/mirror2/download/upgrade-22.3.2.1.0.0.0/fullupgrade/win32/upgrade.zip: [Download failed due to HTTP 502 response.]]

 

Solution: This is because our mirror sites have not been whitelisted in your network. Please contact Informatica Cloud Support with your orgid, so that we can push the download via a secure site.


New features and Documentation - Fall '16

Informatica Cloud Fall 2016

ISSUE: Log File Name Change

Starting from the next release, Fall '16, we provide an option to create the tomcat and infaagent logs with rolling log appenders after a certain size, for example 100 MB. This helps limit the file size and avoid disk space issues. The infaagent.log will still be created and will contain output from the Windows service (infaagent.exe) and scripts (agent_start, runAgentCore, etc.). A separate file called agentcore.log, produced by the agentcore process, will now contain the content that previously went to infaagent.log. The log rolling configuration is applied to tomcat.log as well.


Solution:

How will this impact me? If you currently parse the infaagent.log file, you will need to start parsing agentcore.log after the upgrade to Fall '16. If you simply read the logs when there is an issue, there is no impact; the only impact is on processes that automatically read or parse the log files. For details on the default log size and on configuring or tuning it to your requirements, please refer to the following KB.

Issue: ODBC DSN not found on Linux. After the upgrade, you will encounter the following error:

The connection test failed because of the following error: [unixODBC][Driver Manager]Data source name not found, and no default driver specified (0)


Solution:


This happens because the odbc.ini file was in the main directory and was backed up to the main2 folder during the upgrade.

To fix the issue, copy the odbc.ini file from infaagent/main2 back to infaagent/main.

Going forward, store odbc.ini outside of main, or keep a copy in the rdtm-extra directory so it is copied back automatically after future upgrades.

Issue: Customers on Linux with PowerCenter and Cloud using the same user to start both services will encounter the following error: Internal error. The DTM process terminated unexpectedly. Contact Informatica Global Customer Support.

 

Solution: Add the line 'unset INFA_HOME' at the top of the agent_start.sh file, located under <agent install Dir>, and restart your agent.
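The one-line fix above can also be applied with a small script, for example when you manage many agents. This is a sketch under the assumption that agent_start.sh does not begin with a shebang line; if it does, place the unset after the shebang instead of prepending it.

```shell
# Sketch only: prepend 'unset INFA_HOME' to agent_start.sh so the
# Cloud agent does not pick up the PowerCenter INFA_HOME setting.
# Idempotent: does nothing if the line is already present.
add_unset_infa_home() {
  script="$1"                # path to agent_start.sh
  if ! grep -q '^unset INFA_HOME' "$script"; then
    tmp_copy=$(mktemp)
    printf 'unset INFA_HOME\n' > "$tmp_copy"
    cat "$script" >> "$tmp_copy"
    mv "$tmp_copy" "$script"
  fi
}
```

Restart the agent after applying the change, as the solution above notes.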

Issue: Task fails at runtime with below Error message on windows for Oracle or SQL Server type connections: WRT_8001 Error connecting to database... WRT_8001 [Session s_dss_000XXX0I00000xxxxxxx Username User_Name DB Error -1 [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified Database driver error... Function Name : Connect Database driver error... Function Name : Connect Database Error: Failed to connect to database using user [User_Name] and connection string [odbc://dbtype=<oracle or SQLServer>;host=,host.;port=<port>;database=<database>].]

 

Solution:

Please reach out to Informatica Support with the infaagentversion file and we shall help you resolve the issue.



Informatica Cloud now supports Java 8. To get the Java 8 version of the agent, uninstall your current agent and download the latest agent from the 'agent download' link in Informatica Cloud. Agents upgraded in place remain on the Java 7 version.

 

If you are interested, or have a requirement to use Java 8, please refer to the following Yonyx guide with step-by-step instructions: 498429

 

The default agent available before the Summer '16 upgrade uses Java 7 and was not updated to Java 8 during the upgrade; the migration to Java 8 must be performed by the user. If you download a new agent from Informatica Cloud, it uses Java 8.

 

Java 8 agents offer better performance for metadata fetch by avoiding the legacy ODBC-JDBC bridge. Java 8 also supports TLS encryption by default.