
Cloud Data Integration


The Informatica Global Customer Support Team is excited to announce an all-new technical webinar and demo series – Meet the Experts, in partnership with our technical product experts and Product management. These technical sessions are designed to encourage interaction and knowledge gathering around some of our latest innovations and capabilities across Data Integration, Data Quality, Big Data and so on. In these sessions, we will strive to provide you with as many technical details as possible, including new features and functionalities, and where relevant, show you a demo or product walk-through as well.

 

Topic and Agenda

 

 

Multi-cloud and hybrid architectures make data management complicated. You need a comprehensive next-generation integration platform as a service (iPaaS) to handle new use cases as they emerge.

 

Join us for a Meet the Experts webinar about how to keep up with your evolving data management needs with Informatica Intelligent Cloud Services (IICS):

  • Discover some of our new iPaaS services and capabilities
  • See a demo of IICS Integration at Scale, which uses serverless technology for data integration to help you process data in the cloud
  • Learn how AI-driven capabilities and a microservices-based architecture support use cases such as B2B integration, data quality, master data management, database and streaming ingestion, and more.

 

Investing in SaaS, PaaS, and IaaS means managing more data in more ways across increasingly complex architectures. Register for this webinar to explore how IICS can make it easier.

Webinar: Meet the Experts: Deep-Dive, Demo, Roadmap - Informatica Cloud App/API Integration

 


Join Informatica product experts as they dive deep into the API and application integration capabilities for accelerating your digital transformation. You will learn:

 

  1. How to develop processes and APIs, and connect to any API, without coding
  2. What Intelligent APIs are and why Informatica is uniquely qualified to offer them
  3. About management of integration artifacts and APIs
  4. The “ilities” (performance, scalability, reliability) of our platform
  5. The IaaS, SaaS, and on-prem partners we integrate with

The IICS Summer 2019 release offers several new capabilities that include integration patterns, expanded connectivity, and platform functions. Highlights are listed below.

 

Data Integration

  • Machine learning-based transformation recommendations allow you to get suggestions for the next best transformation based on the mapping you are working on.
  • Improved support for custom, local parameter file locations; support for fully parameterized SQL queries; and the ability to override connections at runtime using parameter files.
  • Support for invoking taskflows using the RunAJob command line utility and using file listeners. These can be passed as parameters to data tasks for listener-driven events.
  • New and improved taskflow canvas with a more compact layout that improves the design experience.
  • The SQL transformation now supports ad hoc SQL queries (in addition to Stored Procedures).
  • Intelligent Structure Discovery:
    • Enhancements to existing ISD models with data from additional sample files (JSON & XML) to quickly adapt to data drift.
    • The Structure Parser transformation can now update or change the associated intelligent structure model without breaking mappings for unchanged ports, which enhances mapping flexibility and allows for fast adaptation to changes in incoming files.
    • Simplified design time experience with the ability to add a prefix or a suffix to existing node names using the bulk action functionality.
    • Enhanced parsing engine support with the ability to discover and parse AVRO files.
  • Mass Ingestion:  
    • Additional sources and targets:  ADLS GEN1 as source, ADLS GEN2 as source and target, Snowflake as source and target.
    • Support for custom actions during file transfer:  compress/decompress, encrypt/decrypt
    • Integration with Enterprise Data Catalog (EDC): select Mass Ingestion source from EDC.
    • Filter files to transfer by file size and date.
    • Optimization for large file handling – resume from point of failure.

 

Application Integration

  • Developer productivity enhancements: new compact canvas, improved diagnostics and debugging 
    • A more compact Process and Guide canvas layout improves the design experience.
    • New validation panels help diagnose service connectors and connection errors.
    • Fault details are now available with every step to diagnose runtime errors and improve debugging. 
  • Guides: Improved page layout and Salesforce Lightning theme
    • Guide Screen step designer has been revamped to provide more real estate to design guides.
    • Salesforce Lightning theme support is now available for guides and the Guide Launcher.
  • Source control and deployment automation 
    • A new CLI utility automates the export and extraction of design assets to any source control system. The package, import, and publish capabilities of the CLI provide the ability to deploy full deployments, packages, and patches across environments.
    • Sample Jenkins pipeline to help developers adopt continuous integration practices for Application and Data Integration, and help operations staff with the automation of continuous deployments.
  • Expanded pub/sub and messaging capabilities with Salesforce Platform Events and RabbitMQ to complement the current support for AMQP, JMS, AWS SNS/SQS, Azure Service Bus/Event Hub, and Kafka.
    • New support for Salesforce Platform Event and Push Topic events enabling customers to deliver secure, scalable and customizable event notifications within Salesforce or from external sources. 
    • New RabbitMQ native connector providing support for durable queues. 

 

API Management

  • CLAIRE-enabled privacy data leak identification: identifies privacy data in API content and masks or blocks sensitive personally identifiable information. (This feature is in preview.)
  • Ability to customize the structure of API URLs.
  • Support for the internationalization of API, group, and organization names.

 

Integration Hub  

  • Governance enhancement: Support for predefined and custom roles with granular privileges that promote governance.

 

B2B Gateway

  • Exchange files with your partners using Cloud AS2.
  • Get insights into your daily gateway activities and statuses with the new dashboard.

 

Platform

  • Asset Dependency feature providing “used by” and “uses” relationship information across all asset types and services for impact analysis.
  • Common Explore enhancements: Changes made by the user to the Explore column settings are now automatically persisted for that user for various filtered views such as “all assets,” “projects & folders,” etc.
  • More control and flexibility in managing users: grant or deny service-level access to a user through the assign services capability; disable users to prevent user logins for both native and SAML authentication mechanisms; REST APIs for user management – users, groups, and roles (an illustrative sketch follows this list).
  • Ability to start/stop Secure Agent services to optimize computing resources and isolate the need to restart a single service from others.
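
As a rough illustration of what scripting user management might look like, the Python sketch below lists users over REST. The endpoint path, session header name, and response shape are assumptions made for illustration only; consult the IICS REST API Reference for the documented user, group, and role APIs.

    # Hedged sketch: list users via a REST call. The endpoint path and the
    # "INFA-SESSION-ID" header below are assumptions, not documented values.
    import requests

    base_url = "https://dm-us.informaticacloud.com"        # example North America POD URL
    session_id = "<session ID obtained from a login call>"

    resp = requests.get(
        f"{base_url}/saas/public/core/v3/users",           # assumed users endpoint
        headers={"INFA-SESSION-ID": session_id},           # assumed session header
    )
    resp.raise_for_status()
    for user in resp.json():                               # assumed: response is a JSON list
        print(user.get("userName"))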

 

Connectivity Enhancements  

  • New native connectors to connect to CDM/Power BI, Snowflake (V2), MongoDB, Adobe Experience Platform, Ariba (V2), and Cassandra.
  • Updates to existing connectors: Microsoft (MSD 365 Operations, Cosmos DB, ADLS Gen2 and Gen 1, SQL DW), Google (Google Big Query, Google Storage), Amazon (S3, Redshift), Oracle NetSuite, Salesforce, SAP (Concur V2, S/4 HANA) and Greenplum

 

New IICS Services

The IICS Summer 2019 release offers the following new services:

 

Streaming Ingestion (included in Data Ingestion Service)

  • Ingestion of streaming data from logs, Apache Kafka and IoT sources.
  • Supports AWS & Azure ecosystems as targets of Ingestion: Amazon Kinesis Streams, Kinesis Firehose, Amazon S3 and Azure EventHub
  • Supports lightweight transformations at the edge.
  • Real-time monitoring of streaming ingestion jobs.

 

Integration at Scale (included in Data Integration Service)

  • Run data integration jobs at scale on a fully managed, serverless Spark cluster.
  • Support for AWS ecosystem: S3 & Redshift
  • Auto-tune and scale fully-managed Kubernetes clusters.

 

Operational Insights

  • Operational Insights is now available for IICS cloud services to help with efficient infrastructure monitoring and alerting. IICS cloud services users can monitor the health of a) Secure Agents, b) services running on each Secure Agent, c) Runtime environments and d) subscribed cloud services.
  • Email Alerts can be set on services and Secure Agents for unavailability or excessive resource consumption.

 

Reference 360  

  • Enables business users to manage enterprise reference data in an easy-to-use, configurable, business-friendly user interface.
  • Provides versioning, collaboration and complete life-cycle management of reference data.
  • REST APIs for managing data and meta-data, enabling automation and system integration.
  • Improved stability and performance.

 

 

Data Quality

  • Enable business users to create rule specifications to perform a variety of Data Quality actions including completeness, validation and standardization.
  • Create and manage value lists for data quality operations using dictionaries. Dictionaries can be used to identify, validate and standardize data as part of a rule specification.
  • Embed Data Quality rules in Data Integration mappings to support Data Quality activities in typical data integration scenarios such as data warehousing, data migrations and more.

 

Thank you, 

The IICS Team

This export zip package includes assets that will help you get started with Cloud Data Integration. This zip includes the following assets:

  • Mapping and mapping task to perform a simple join across multiple sources.
  • Parameterized mapping to show reuse of mapping logic across multiple objects.
  • Taskflows with decision tasks.

ALERT:  for customers using Informatica Cloud

(Updated: added MemSQL and Eloqua - 6/6/2019; added Ariba - 7/31/2019)

 

To better serve our customers, we are planning to place older and unused connectors in End-of-Life (EOL) or Maintenance Mode in mid-2019. If you need a new connector enabled, per the customer action below, please create a support case and request that we add the new connector to your org(s). The differences between Maintenance Mode and EOL are summarized in the table below:

 

End-of-Life (EOL)

  • Description: Connector at end-of-life. Informatica will no longer support it; no bug fixes; no enhancements. There will be no automatic migrations or upgrades for existing work.
  • Bug Fixes: No
  • Enhancements: No
  • Connector continues to work: No
  • Customer Action: The connector will no longer work after the next release, planned for July, and will no longer be available in your org. Please verify that you are not using the connector, or move mappings to an alternative connector, if available.

Maintenance Mode

  • Description: Connector in maintenance mode. Informatica will no longer enhance it; bug fixes may be considered. There will be no automatic migrations or upgrades for existing work. You will need to apply the latest recommended connector and migrate your jobs to it.
  • Bug Fixes: Yes
  • Enhancements: No
  • Connector continues to work: Yes
  • Customer Action: The connector will continue to work post R32. Customers should consider moving to the alternative connector, if available; the alternative connector will continue to be enhanced as necessary.

 

Am I impacted?

Refer to the list below to determine if you are using one of these connectors. A separate email will be sent to all "Subscription" only customers for these connectors.

 

How do I address the issue?

Please refer to the Customer Action column in the table above, and to the Alternative Connector and Notes to Customers columns in the table below.

 

The following table shows the connectors planned for end-of-life (EOL) or Maintenance Mode in mid-2019.

 

Nr | Data Source | Connector Name | EOL or Maintenance Mode | Alternative Connector | Notes to Customers
1 | Amazon QuickSight | Amazon QuickSight | EOL | None |
2 | Arc GIS | Arc GIS | EOL | None |
3 | Attensity Discovery Now | Attensity Discovery Now | EOL | None |
4 | Avature | Avature | EOL | Generic REST V2 / WS Connector |
5 | Birst | Birst | EOL | Birst Cloud Connect |
6 | Cloud File Transfer | Cloud File Transfer | EOL | None |
7 | Club Assistant | ClubAssistant | EOL | None |
8 | DataSift | DataSift | EOL | None |
9 | EPIC | EPIC | EOL | None |
10 | IDVExpress | IDVExpress | EOL | None |
11 | Informatica Data Prep | Informatica Data Prep | EOL | None |
12 | Informatica Rev | Informatica Rev | EOL | None |
13 | Intuit QuickBooks | Intuit QuickBooks | EOL | QuickBooks V2 |
14 | Intuit QuickBooks Online | Intuit QuickBooks Online | EOL | QuickBooks V2 |
15 | Magento | Magento | EOL | None |
16 | Marketo | Marketo 2 | EOL | Marketo V3 |
17 | Microsoft Dynamics AX | Microsoft Dynamics AX 2009 | EOL | None |
18 | Microsoft Dynamics GP | Microsoft Dynamics GP 2010 | EOL | None | New connector on roadmap.
19 | Oracle NetSuite | NetSuite (Restlet) Write only | EOL | NetSuite |
20 | Oracle PeopleSoft | Oracle PeopleSoft 9.x | EOL | Use generic REST V2 or WS Connector |
21 | Oracle Taleo Business Edition | Oracle Taleo Business Edition | EOL | Generic REST V2 / WS Connector |
22 | Oracle Taleo Enterprise Edition | Oracle Taleo Enterprise Edition | EOL | Generic REST V2 / WS Connector |
23 | Rapnet | Rapnet | EOL | None |
24 | Rave | Rave | EOL | None |
25 | Reltio | Reltio | EOL | None |
26 | Rev | Rev | EOL | None |
27 | Saaggita | Saaggita | EOL | None |
28 | Salesforce Insights | Salesforce Insights | EOL | None |
29 | Snowflake | Snowflake V1 Connector | EOL | Snowflake Cloud Data Warehouse |
30 | Snowflake | Snowflake Big Data Warehouse | EOL | Snowflake Cloud Data Warehouse |
31 | Sugar CRM | Sugar CRM | EOL | Sugar CRM REST |
32 | Tableau (Server) | Tableau V1 | EOL | Tableau V3 |
33 | Trackwise | Trackwise | EOL | None |
34 | Vindicia | Vindicia | EOL | None |
35 | Zoho | Zoho | EOL | Generic REST V2 / WS Connector |
36 | Amazon Dynamo DB | Amazon Dynamo DB | Maintenance Mode | None | New connector on roadmap.
37 | Anaplan | Anaplan | Maintenance Mode | Anaplan V2 |
38 | Apache Hive | Hadoop | Maintenance Mode | Hive Connector |
39 | Box | Box | Maintenance Mode | None | New connector on roadmap.
40 | Box | Box API | Maintenance Mode | None | New connector on roadmap.
41 | Chatter | Chatter | Maintenance Mode | None |
42 | Coupa | Coupa | Maintenance Mode | Coupa V2 |
43 | Dropbox | Dropbox | Maintenance Mode | None | New connector on roadmap.
44 | Eloqua | Eloqua (SOAP) | Maintenance Mode | Eloqua Bulk, Eloqua REST |
45 | Google API | Google API | Maintenance Mode | Google Analytics |
46 | LinkedIn | LinkedIn | Maintenance Mode | None | New connector on roadmap.
47 | Marketo | Marketo | Maintenance Mode | Marketo V3 |
48 | Marketo | Marketo REST | Maintenance Mode | Marketo V3 |
49 | MemSQL | MemSQL | Maintenance Mode | MemSQL V2 | Work with MemSQL for connector access.
50 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V1 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new and updating existing mappings to use the Blob Storage V3 connector. Note that the V3 connector does not support DSS yet; DSS-equivalent functionality with the V3 connector is planned for 1H 2020.
51 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V2 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new and updating existing mappings to use the Blob Storage V3 connector. Note that the V3 connector does not support DSS yet; DSS-equivalent functionality with the V3 connector is planned for 1H 2020.
52 | Microsoft Azure Cosmos DB SQL API | Microsoft Azure Document DB | Maintenance Mode | Microsoft Azure Cosmos DB SQL API | Consider building new and updating existing mappings to use the Cosmos DB SQL API connector. Note that the Cosmos DB SQL API connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020.
53 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V1 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new and updating existing mappings to use the ADLS V3 connector. Note that the V3 connector does not support DSS yet; DSS-equivalent functionality with the V3 connector is planned for 1H 2020.
54 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V2 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new and updating existing mappings to use the ADLS V3 connector. Note that the V3 connector does not support DSS yet; DSS-equivalent functionality with the V3 connector is planned for 1H 2020.
55 | Microsoft Azure SQL DW | Microsoft Azure SQL DW V1 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new and updating existing mappings to use the SQL DW V3 connector. Note that the V3 connector does not support DSS yet; DSS-equivalent functionality with the V3 connector is planned for 1H 2020.
56 | Microsoft Azure SQL DW | Microsoft Azure SQL Data Warehouse V2 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new and updating existing mappings to use the SQL DW V3 connector. Note that the V3 connector does not support DSS yet; DSS-equivalent functionality with the V3 connector is planned for 1H 2020.
57 | Microsoft Dynamics AX | Microsoft Dynamics AX 2012 | Maintenance Mode | Microsoft Dynamics AX 2012 V3 |
58 | Microsoft Excel | Microsoft Excel v1 | Maintenance Mode | Intelligent Structure Discovery |
59 | Oracle EBS | Oracle EBS 12.x (Cloud only) | Maintenance Mode | Use generic REST V2 or WS Connector |
60 | Oracle EBS | Oracle InterfaceTable | Maintenance Mode | Use generic REST V2 or WS Connector |
61 | SAP Ariba | Ariba Hier | Maintenance Mode | Ariba V2 |
62 | SAP Concur | SAP Concur | Maintenance Mode | Concur V2 |
63 | SAP SuccessFactors | SAP SuccessFactors SOAP | Maintenance Mode | SAP SuccessFactors OData |
64 | TFS | TFS | Maintenance Mode | Generic REST V2 / WS Connector |
65 | TM2 | TM2 | Maintenance Mode | None |
66 | Twitter | Twitter | Maintenance Mode | None | New connector on roadmap.
67 | WebServices - REST | REST | Maintenance Mode | REST V2 |
68 | WebServices - SOAP | SOAP WebServices | Maintenance Mode | Webservices Consumer Transform |
69 | Webservices V2 | Webservices V2 | Maintenance Mode | Webservices Consumer Transform |
70 | Workday | Workday | Maintenance Mode | Workday V2 |
71 | Zendesk | Zendesk | Maintenance Mode | Zendesk V2 |
72 | Zuora | Zuora (SOAP) | Maintenance Mode | Zuora REST V2, Zuora AQuA |


The Informatica Intelligent Cloud Services (IICS) Winter 2019 release offers several new capabilities that address key data challenges that businesses are facing today. Highlights are listed below.

 

Data Integration

  • Data discovery in Cloud Data Integration with Enterprise Data Catalog (EDC) integration - Customers can now search and discover enterprise-wide metadata from within Data Integration, import connection & object metadata, and use that information to more easily create new or enrich existing mappings and tasks by connecting with an existing EDC installation.
  • “Smart match” recommendations for field mappings increases the frequency of field matches in mappings and tasks. Expanding on the existing automatch, smart match looks for common patterns in field names (prefixes, suffixes, abbreviations, etc.) based on six additional matchers and fuzzy match techniques for recommending field mappings.
  • Taskflows can be invoked via APIs for externalized scheduling and execution. With this enhancement, customers can now invoke taskflows on demand via an API call and provide input parameters for the tasks the taskflow orchestrates, allowing them to fully leverage Data Integration’s parameterization capabilities (see the sketch after this list). Please refer to the Taskflow as a Service FAQ.
  • Taskflows have also been enhanced to allow them to embed other taskflows to promote reuse.
  • Change data capture has been expanded to include additional sources for DB2 on Linux, Unix, Windows, and iSeries (also known as AS400, i5/OS) platforms, which further enables near real-time changed data propagation capabilities.
  • Mass ingestion is extending connector support, adding Google Storage & Google Big Query as targets and HDFS as both a source and target. Additional enhancements expose CRUD-focused APIs.
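
As a sketch of what an on-demand taskflow invocation might look like from a scheduler or script, the Python snippet below posts input parameters to a published taskflow. The server URL, session handling, taskflow name, run-URL pattern, and parameter names are all illustrative placeholders; the Taskflow as a Service FAQ describes the actual endpoint.

    # Illustrative sketch only: invoke a published taskflow with input parameters.
    # The run URL pattern, taskflow name, and parameter names are placeholders;
    # see the Taskflow as a Service FAQ for the documented endpoint.
    import requests

    server_url = "https://usw3.dm-us.informaticacloud.com/saas"   # placeholder, from your login response
    session_id = "<icSessionId returned by the login API>"

    resp = requests.post(
        f"{server_url}/tf/OrderLoadTaskflow",                     # hypothetical taskflow run URL
        headers={"icSessionId": session_id},
        json={"load_date": "2019-01-31", "region": "EMEA"},       # hypothetical input parameters
    )
    resp.raise_for_status()
    print(resp.status_code, resp.text)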

 

API and Application Integration

  • Support for Kafka Messaging – Messaging is at the core of many publish-subscribe (Pub/Sub) based applications as a means of decoupling producers and consumers of data. The addition of Kafka for application integration significantly expands the message-based Pub/Sub interactions between data and applications that today are fulfilled using JMS, AMQP, Amazon SNS/SQS, and Azure Service Bus based “topics.” The ability to bridge these message-based events with the Cloud Integration Hub Pub/Sub style of data propagation provides additional integration pattern options, making Informatica unique in the flexibility and capabilities it provides for its customers.
  • JSON Web Token (JWT) based authentication – The API and Application Integration services now support JSON Web Token (JWT) based authentication, an open standard (RFC 7519) that defines a compact and self-contained way of securely transmitting information between API consumers and REST web services. This gives IICS users who use API Manager another, more secure means of API authentication (an illustrative JWT sketch appears later in this section).
  • API grouping – To better manage the use of JWT tokens and associate their use with multiple API endpoints, a new API “application” grouping capability is being introduced in API Manager. This capability gives API consumers fewer tokens to deal with, and API producers can now more easily manage or revoke a consumer’s access to multiple APIs.
  • Japanese language support for the API and Application Integration services – In addition to Japanese language support for the Data Integration service, Japanese customers now have access to the API and Application Integration services user interface and documentation in Japanese.
  • REST and SOAP Service API-based “service connectors” – distributed via GitHub.

Today, 55% of Application Integration customers’ connectivity needs are met using service connectors. A service connector allows customers to define REST (XML/JSON), JSON/RPC, or SOAP service integration using a simple web form, with no coding required. The Application Integration service takes care of the rest. If the service offers a WSDL or Swagger interface document, the service connector can be generated automatically by importing the interface document. By creating service connectors, customers can import and configure pre-built business and data service definitions as reusable assets that they can share and/or move from one environment to another.

This capability, unique to Informatica, provides customers with unparalleled value. Service connectors help you avoid lock-in and let you make updates as you need them, so you can take advantage of new capabilities or the built-in extensibility that an API platform can offer.

To provide this flexibility to customers and encourage community contribution by customers, partners, and other practitioners, Informatica is establishing a GitHub repository where it will publish the service connectors it has created and which it will share with its customers and partners. Customers and partners are free to use these definitions without restriction, including the rights to use, copy, modify, merge, publish, and distribute these under an MIT license. Informatica will also encourage contributions back to the community. Our goal is simple: drive innovation and reduce perceived barriers to adoption.
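
For readers unfamiliar with the JWT standard mentioned above, the short Python sketch below uses the PyJWT library to create and verify a signed token. It illustrates the RFC 7519 format in general; the claim names and shared secret are examples only and do not describe how API Manager issues or validates its tokens.

    # General illustration of the JSON Web Token format (RFC 7519) using PyJWT.
    # The claims and the signing secret are examples only.
    import time
    import jwt  # pip install PyJWT

    secret = "shared-signing-secret"            # example HMAC key
    claims = {
        "sub": "api-consumer-42",               # example subject claim
        "iss": "example-issuer",                # example issuer claim
        "exp": int(time.time()) + 3600,         # token expires in one hour
    }

    # The token is a compact, self-contained string that can travel in an HTTP header.
    token = jwt.encode(claims, secret, algorithm="HS256")
    print(token)

    # A receiving service verifies the signature and expiry before trusting the claims.
    decoded = jwt.decode(token, secret, algorithms=["HS256"])
    print(decoded["sub"])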

 

 

Integration Hub

  • Informatica has improved the search experience for Hub events and added support for the CLOB data type on topics.

 

B2B Gateway

  • ICS B2B Gateway customers will be migrated to the IICS platform as part of the release upgrade and will benefit from all IICS platform capabilities.

 

Intelligent Structure Discovery

  • Intelligent Structure Discovery expanded its parsing capabilities to handle ORC format files and Excel multiple sheet files. The user can now design the structure model based on multiple sheet structures and then use the model at run time to parse Excel files in their entirety.
  • With R31, a Structure Parser transformation can be positioned mid-stream to enable a more flexible mapping usage and chaining. In addition, the Intelligent Structure Model detected datatypes are now propagated to the Structure Parser output ports.
  • The ISD design-time user interface is enhanced with a "find" function that allows the user to search for a specific string in the discovered tree fields and get a list of results showing the path, visually correlated with the model representation. The user can also perform actions on multiple elements chosen from the result list, such as include, exclude, replace, and even change of element type. The ability to perform actions on multiple elements significantly improves usability and productivity.
  • A new vertical folder view mode will be available in R31 for handling complex hierarchy files.

 

IICS Platform

  • Common Explore productivity enhancements – Improved copy functionality with overwrite & rename conflict resolution options to copy assets within and across folders. “Created by” and “last updated by” attributes as columns for all asset types in the common Explore page.
  • Export/import capability for sub-organizations which enables asset migration across environments that use an organization hierarchy. More control and flexibility with enable/disable checksum validation options during export and import.
  • Improved export/import error logging along with the ability to access and download export/import logs through the UI and the API.
  • API to search, list, and filter assets in projects and folders using a variety of conditions such as timestamp, location, “last updated by,” and tags. This API can also be leveraged along with export APIs to export objects.
  • Improvements to the RunAJob utility – Support for projects and folders to invoke tasks by task name.
  • Usability improvements – Ability to copy table cell data in the common Explore page, Monitor service, and Administrator service for use in other user interfaces like search boxes and filter conditions for easier task completion.
  • Search capability for connections and job monitoring to quickly and easily find needed information.
  • Ability to enable and disable a service for agents in a Secure Agent group to effectively deploy workloads and efficiently utilize computing resources.
  • Secure Agent registration using tokens (instead of passwords) for increased security and enabling SAML single sign-on.

 

Operational Insights 

  • Operational Insights extends support to the on-premises Data Quality product, in addition to BDM and PowerCenter, with capabilities such as domain health, job run analytics, resource utilization, and alerts.
  • Click-through analytics journey from cross-domain to the individual job level and enhancements to job run analytics for on-premises products (PowerCenter, BDM). Plus, enhancements to job run analytics to report on data loaded and data processed.
  • Power Center & BDM infrastructure e-mail alert enhancements such as Secure Agent unavailability and Operational Insights data collection failures.

 

Connectivity Enhancements

New AWS Quickstart for Cloud Analytics Modernization, an end-to-end solution for self-service cloud analytics with Informatica (IICS, Enterprise Data Catalog), Tableau Server, and AWS Services.

Several new connectors and enhancements to existing connectors across ecosystems as listed below. New connectors introduced are highlighted in bold:

  • Azure: ADLS Gen 2 Preview, Azure DW V3, Azure Data Lake Store V3, Azure Blob V3 
  • Google: Google Cloud Storage V2, Google Analytics, Google Big Query V2, Google Big Query 
  • Amazon: Amazon S3 V2, Amazon Aurora, Amazon Redshift V2 
  • Salesforce: Salesforce Marketing Cloud (SFMC), SFDC (Sales and Service) 
  • SAP: SAP Connector, SAP HANA Cloud Platform (DB) 
  • Adobe: Adobe Cloud Platform 
  • Analytics: CDM Folders connector preview, Tableau V3, Tableau V2 
  • Databases: MySQL Relational, Hive, Greenplum, DashDB, Snowflake 
  • Tech: REST V2, WSconsumer, Complex File Processor 
  • Microsoft Apps: Microsoft SharePoint Online 
  • Oracle: Oracle Netsuite V1, Oracle Relational 

 

A summary of some of the connectivity enhancements follows:

AWS

  • Supporting file names longer than 250 characters with S3
  • Support for custom JDBC URL for Redshift
  • Support for ORC files with S3

Snowflake

  • Custom Query metadata fetching without having to run the query

Google

  • Custom Query Support for Google Big Query V2 connector
  • Pushdown support for Google Big Query through ODBC
  • Google Analytics - Enhancement to fetch fields based on multiple Views IDs from GA
  • Google Big Query Mass Ingestion - Direct load Cloud Storage->Big Query

Azure

  • Preview of ADLS Gen2 connector: support create target, configurable escape character and text qualifier in R/W scenarios, create and rename directory, rename file, header-less files, support RBAC for all types of AAD Authentication, append data to an existing file, support parameterization
  • Azure DW V3: support TARGET NAME OVERRIDE and TARGET SCHEMA NAME OVERRIDE in the writer, support SOURCE NAME OVERRIDE and SOURCE SCHEMA OVERRIDE with the reader, support custom query and multiple objects in CMD and MCT

 

Microsoft Apps

  • Microsoft SharePoint Online: support for agents running on Linux

 

Analytics

  • Preview of CDM Folders connector: new connector, with the ability to write to ADLS Gen 2 in CDM format, and then access the data from Power BI as a dataflow
  • Tableau V2: upgrade connector to the latest Tableau SDK

Databases & DW

  • MySQL Relational: array insert support
  • Greenplum: native reader and writer

 

NetSuite V1: address 2-factor authentication

 

Salesforce

  • SFDC Sales, Service connector: support latest SFDC API
  • Salesforce Marketing Cloud SFMC: insert/update operation for “Non Contact Linked Data Extensions”

Dear Customer,

 

Today, we are excited to announce our acquisition of all assets of SynQ. SynQ, as you may know, has been a boutique Informatica partner for many years and offers a data integration platform with connectors to ServiceNow on Informatica Intelligent Cloud Services that your organization currently uses. This acquisition means that Informatica will now directly sell and support the connector. We believe this acquisition is positive for all our customers, allowing Informatica to offer superior support and innovate faster in the ServiceNow cloud ecosystem.

 

If you have any questions about this development, please contact our Global Support team or your sales representative. For additional information, please refer to the attached FAQ.

The Informatica Global Customer Support Team is excited to announce an all-new technical webinar and demo series – Meet the Experts, in partnership with our technical product experts and Product Management. These technical sessions are designed to encourage interaction and knowledge gathering around some of our latest innovations and capabilities across Data Integration, Data Quality, Big Data, and so on. In these sessions, we will strive to provide you with as many technical details as possible, including new features and functionalities, and where relevant, show you a demo or product walk-through as well.

 

Topic and Agenda

 

Topic: Cloud Webinar Series:  3 paths to revolutionize data integration

Date: Thursday, 11 October 2018

Time: 11:00 AM PDT/ 2:00 PM EDT

Duration: 1 Hour

 

About this Webinar:

 

Three focused, high-impact sessions with presentation and deep-dive demo debut on Oct. 11 at 11 am PDT:

  • Modernize your data warehouse, enable analytics, with next-gen cloud integration: This conversation and deep-dive demo explore how Informatica Cloud Data Integration lets you easily connect to any data source, quickly integrate and load trusted data to a cloud data warehouse, or deliver insights via self-service BI tools.
  • Integrate cloud and on-premises applications, no coding required: You’ve got Salesforce, Workday, and NetSuite in the cloud—but you also have SAP and Oracle on-premises. Learn how Informatica's next-generation Integration Platform as a Service (iPaaS) lets you quickly, easily synchronize data in real time in a hybrid environment.
  • Modernize legacy applications with API and data integration: Every new cloud application has to work with traditional on-premises applications. Learn how to use Informatica’s iPaaS solution to modernize your legacy environments for today’s cloud-first strategy.

 

Speakers:

 

Andrew Comstock, Director of Product Management, Informatica

Luc Clement, Senior Director of Product Management, Informatica

 

 

Join the webinar that addresses your needs, or sign up for all three sessions and take full advantage of your cloud potential—and the next generation of cloud integration.

 

-------------------------------------------------------

To register for this meeting

-------------------------------------------------------

 

Please find the details here: Cloud Webinar series: 3 paths to fast and agile cloud integration | Informatica US

 

You can also contact us at:

network@informatica.com

 

Regards,

MeetTheExperts Team

In this series of videos you will learn some essentials of Cloud Data Integration.

 

Getting Started

Introducing Cloud Data Integration

 

This video is a quick introduction to Informatica Intelligent Cloud Data Integration.

Tasks and Mappings

 

Overview of the integration tasks and mappings to transform data using Cloud Data Integration

File-based Mass Ingestion

 

Using file-based mass ingestion to move files from on-premises to cloud storage such as Amazon S3 or Azure Blob

Advanced Taskflow

 

Using taskflows to orchestrate data integration tasks

How to Download and Install a Secure Agent for IICS

 

This video shows you how to download and install a Secure Agent to use with Informatica Intelligent Cloud Services (IICS).



Here is an orientation video for the upcoming Informatica Cloud to Informatica Intelligent Cloud Services (IICS) migration process. The video provides an introduction to IICS for customers that are migrating from Informatica Cloud.

 

We will be migrating organizations between April and August. You will receive notification when YOUR organization(s) is going to be migrated and when your sandbox environment (pre-release) is available.

 

Video link (moved to YouTube so that it doesn't have to be downloaded):

 

Introducing Informatica Intelligent Cloud Services - YouTube

 

Full FAQ:

 

Informatica Intelligent Cloud Services Migration FAQ

Dear Customer,

The Informatica Global Customer Support Team is excited to announce an all-new technical webinar and demo series – Meet the Experts, in partnership with our technical product experts and Product Management. These technical sessions are designed to encourage interaction and knowledge gathering around some of our latest innovations and capabilities across Data Integration, Data Quality, Big Data, etc. In these sessions, we will strive to provide you with as many technical details as possible, including new features and functionalities, and where relevant, show you a demo or product walk-through as well.

Topic and Agenda

 

  • Meet the Experts –  Introduction to Informatica Intelligent Cloud Services
  • Date: March 13th 2018
  • Time: 8:00 AM PST
  • Duration: 1 Hour
  • What’s new in Cloud Data Integration
  • Increase your productivity with pre-defined templates
  • Build advanced orchestrations with new Taskflows
  • Leverage the SDLC capabilities of the platform for agile development.
  • Demo
  • The team will also cover everything that you need to know about migrating your current ICS orgs to IICS.

 

 

  Speakers:

 

  • Meenakshi Vasudevan
  • Dilip Poluru
  • Vivin Nath
  • Anand Peri

 

 

-----------------------------------------------------
To register for this meeting
-----------------------------------------------------
1. Go to
https://informatica-events.webex.com/informatica-events/j.php?RGID=rda4d5dab5df15af2228c1f2a3b50d88a
2. Register for the meeting.

Once the host approves your request, you will receive a confirmation email with instructions for joining the meeting.

To view in other time zones or languages, please click the link:
https://informatica-events.webex.com/informatica-events/j.php?RGID=r44607166cb690cc9522b298c0ce42491

-----------------------------------------------------
For assistance
-----------------------------------------------------
1. Go to
https://informatica-events.webex.com/informatica-events/mc
2. On the left navigation bar, click "Support".
You can contact me at:
meettheexperts@informatica.com

 

 

Informatica Intelligent Cloud Services Migration FAQ
(Updated: June 27, 2019)

 

The following questions and answers will help you better understand the Informatica Intelligent Cloud Services migration. This is a live document, and we will be adding questions and answers as they arise.

 

This document is organized in sections by Service: common questions, Data Integration questions, API and Application Integration questions, and Data Integration Hub questions.

 

 

Common Questions

 

1. When will the Informatica Cloud to Informatica Intelligent Cloud Services migration begin?

Migration to the new Informatica Intelligent Cloud Services platform will begin in early Q2 2018.

 

2. What order of services will the migration follow?

As a first step, Informatica Cloud Services (ICS) customers that do not use the Informatica Cloud Real Time (ICRT) services will be migrated. ICRT customer migration, including customers that have licensed Cloud Premium services, is commencing in July 2018. ICRT customers are grouped by functional usage. Migration of customers is planned to be completed before the end of 2018.

 

3. Where can I find more information about the new features and behavior changes in Informatica Intelligent Cloud Services?

See the IICS Navigation Overview for ICS and ICRT Users video for a quick tour of IICS. This video provides an overview of Informatica Intelligent Cloud Services for users who are already familiar with Informatica Cloud.

 

The Migration folder of the Cloud Application Integration Community Site provides links to a number of resources that can help you with this. These include:

 

For ICS Users

 

For ICRT Users

 

4. Will all customers be migrated at the same time?

Migration will be performed in multiple batches to ensure maximum flexibility and a minimal amount of disruption to customers. These batches may include ICS and ICRT customers.

 

5. When will customers be notified about the migration?

Customers will be notified 6 weeks prior to the migration. The communication will go out to all users of the Informatica Cloud organization being migrated.

 

6. What is the migration process?

All customers will be migrated to a Sandbox environment before they are migrated to the production environment. The Sandbox environment will be available to you for at least three weeks to allow you to perform testing. Your Informatica Cloud assets will be migrated to the Sandbox environment, but any asset that you create in the Sandbox environment will not be migrated to the production environment. If you create assets in Informatica Cloud during the three weeks of preview testing, these assets will be migrated to the production environment. After the three weeks of preview testing have elapsed, your organization will be migrated to the production environment. If you have concerns or need more time, please contact Informatica Global Customer Support.

 

7. Should all customers test the migration?

Yes, all customers are expected to test the migration in the Sandbox environment. It is critical for all customers to participate in the testing to ensure a smooth transition from Informatica Cloud to Informatica Intelligent Cloud Services. At a minimum, Informatica recommends that you test at least one organization with a good mix of use cases.

 

8. Are there any special instructions for using the Sandbox environment?

Yes. When you are notified of the upcoming migration you will be able to access your Sandbox environment. The following instructions apply to the Sandbox environment:

  • You can access the Sandbox environment after Informatica notifies you that your organization has been migrated to the Sandbox environment. To access the Sandbox environment, open the Sandbox environment URL (https://dm-sandbox.informaticacloud.com/identity-service/home) using your Informatica Cloud credentials.
  • Customers that have licensed Data Integration (previously ICS) should verify that they can access the Data Integration Service
  • Customers that have licensed Application Integration (previously ICRT or Cloud Premium customers) should verify that they can access the Application Integration Service
  • If you are unable to access either service, contact Informatica Global Customer Support.
  • If your company uses a firewall, you may need to update your whitelist of IP addresses. The Secure Agent IP address ranges differ among Informatica Cloud, the IICS Sandbox environment, and the IICS production environment. The approved Secure Agent IP address ranges for the production and Sandbox environments are listed in KB article 524982.
  • Download a Secure Agent for the Sandbox environment.
    Existing preview agents will not be upgraded. Uninstall any previous preview agent, and then download and install a new preview agent from the Sandbox environment. The new Secure Agent will point to the Sandbox environment and allow you to run jobs.
    Note: On Windows, you must install the Secure Agent on its own machine. You cannot run multiple agents on the same Windows machine. On Linux, you can install multiple agents on the same machine if you install them under a different ID and folder.
  • Update your tasks and connections to use the new Secure Agent that you downloaded from Informatica Intelligent Cloud Services.
    Tip: If you have the Secure Agent Cluster license, you can add the preview agent to an existing Secure Agent group so that you won't have to update tasks and connections individually.
  • If you want tasks in the Sandbox environment to run on the schedules that you defined in Informatica Cloud, edit the tasks and save them. Schedules are migrated to the Sandbox environment, but they are not activated. When you save a task, the schedule will be re-activated.
  • Clear your browser cache if you see a blank screen or the message, “Internal Server Error. Please contact Support.” This issue is caused by a browser caching problem. Clear your browsing history, including all cookies and all cached images and files, and then close the browser. Re-open the browser and log in again.

 

ICRT service users should also review the ICRT Service Migration to the Cloud API and Application Integration Service guide.

 

9. Which web browsers can customers use with IICS?

IICS supports the following browsers: Google Chrome, Microsoft Internet Explorer 11, and Mozilla Firefox. For more information, see the PAM for Informatica Intelligent Cloud Services (IICS) on Informatica Network.

If you use IE11, note the following:

  • You must enable cross-origin support (CORS) in the browser. For information about enabling CORS in IE11, see the "Enabling CORS in Internet Explorer 11" topic in the Data Integration online help.
  • The time stamps displayed in the Monitor service and on the My Jobs page in Data Integration appear in Coordinated Universal Time (UTC).

 

10. What will happen to Secure Agents during the production migration?

During the production migration, all of your Informatica Cloud Secure Agents will be upgraded to the latest version. Secure Agents that you downloaded from the IICS Sandbox environment will not be upgraded.

The migration process retains the following items:

  • Connection properties that you stored with a local Secure Agent.
  • Secure Agent configuration property changes.
  • All files that you copy to <Secure Agent installation directory>/apps/Data_Integration_Server/ext/

The migration process does not retain manual changes that you made to configuration files in the Secure Agent installation directory or its subdirectories.

Note: As a best practice, Informatica recommends that you back up your Informatica Cloud Secure Agent directories before the migration so that you can restore them easily in the unlikely event of a rollback.

 

11. How much disk space is required to upgrade the Secure Agent?

To calculate the free space required for upgrade, use the following formula:

Minimum required free space = 3 * (size of current Secure Agent installation directory - space used for logs directory) + 1 GB
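
As a quick worked example with made-up sizes, the snippet below applies the formula to an agent installation directory of 6 GB that contains a 2 GB logs directory:

    # Worked example of the free-space formula with illustrative sizes (in GB).
    agent_install_dir_gb = 6.0   # assumed size of the Secure Agent installation directory
    logs_dir_gb = 2.0            # assumed space used by the logs directory

    min_free_space_gb = 3 * (agent_install_dir_gb - logs_dir_gb) + 1
    print(min_free_space_gb)     # 13.0 GB of free space required in this example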

 

12. For customers that use a firewall, what are the Informatica Intelligent Cloud Services URLs that need to be included in the whitelist of approved IP addresses?

The approved Secure Agent IP address ranges for production and Sandbox environments are listed in KB article 524982.

 

13. Will there be any downtime for migration, and if yes, what is the expected downtime?

The migration will affect your service’s availability. The exact duration of the downtime will be communicated to each customer as part of the migration notification. The exact downtime depends upon the number of Informatica Cloud assets and organizations that a customer has. Informatica estimates the downtime to be in the range of 1-4 hours.

 

14. What will happen to the old Informatica Cloud organization after the migration is completed?

The Informatica Cloud organization will be deactivated, but its metadata will be retained for 30 days post-migration to ensure that Informatica has a copy for comparison and for roll-back in case of unforeseen issues.

 

15. Will my organization ID change after migration?

Yes. You will get a temporary organization ID in the Sandbox environment. During production migration, your organization will get a new, permanent organization ID.

 

16. Can customers choose the migration schedule?

Informatica will build the migration batches and communicate the migration schedule to each customer. If the published schedule does not meet your needs, please contact support to reschedule to a different migration batch.

 

17. If a customer has more than two organizations, can they be migrated in separate batches?

While this is possible, Informatica doesn’t recommend it. Customers should consider the impact of having their organizations on different platforms for even a short duration. Customers should work with their customer success manager and Informatica Global Customer Support to ensure that organizations are scheduled in the appropriate batches.

 

18. Are there any security changes in Informatica Intelligent Cloud Services?

We have introduced user-defined roles in Informatica Intelligent Cloud Services. User roles are automatically created for corresponding user groups in Informatica Cloud. If there are any asset-level custom permissions in Informatica Cloud in which asset permissions granted to a user are higher than the permissions granted to the user via the user group, then these asset permissions are not honored for the user. Customers need to pay attention to this and manually adjust asset-level permissions as needed.

 

19. When should we do a rollback?

Post-migration, if the customer raises a P1 ticket that can’t be resolved within 24 hours, Informatica will consider the rollback option. Rollback should be done only after all other avenues to resolve the issue have been exhausted. Rollback requires an approval from the project management team.

 

20. Is the rollback automated?

Informatica has a rollback script that deactivates the Informatica Intelligent Cloud Services organization, reactivates the Informatica Cloud organization, and downgrades the Secure Agent back to the Informatica Cloud version. If any jobs have been run in Informatica Intelligent Cloud Services either partially or successfully prior to the rollback, the state of those jobs and their job logs will not be rolled back, nor will they be ported back to Informatica Cloud.

 

21. I created a new organization in Informatica Intelligent Cloud Services using my Informatica Cloud username. Can my Informatica Cloud user account be migrated if there is already an IICS user account with the same name?

Usernames in Informatica Intelligent Cloud Services must be unique. If there is already an IICS user account that has your Informatica Cloud username, then your IICS username will be appended with some extra characters to form a unique name. For example, if your Informatica Cloud username is infaclouduser, your IICS username might be changed to infaclouduser.IICS. (Your Informatica Cloud username will not change.) Informatica will send you an email with the new IICS username, and you will be able to log in to IICS using the new name.

If you use SAML or Salesforce single sign-on and there is already an IICS account with your username, the IICS username that appears in your user profile will be appended with a string such as “.SAML” or “.Salesforce” to ensure that the username is unique. You will be able to log in to IICS using single sign-on as you did with Informatica Cloud.

 

22. Do I need to change the API endpoints that I am using on Informatica Cloud?

After migration, login API requests will be automatically redirected to Informatica Intelligent Cloud Services. This redirection service will be available through February 28, 2019. As before, you must construct subsequent API requests based on the <serverUrl> and <icSessionId> that were received in the login response. Ensure that you have not hard-coded the base URL for any API endpoints other than the login API endpoint.

After February 28, you must replace your current Informatica Cloud domain URLs with the Informatica Intelligent Cloud Services (IICS) URLs mentioned in KB article 524982 to use the APIs in IICS. (For example, if your POD is located in North America, the new IICS domain URL is https://dm-us.informaticacloud.com, and the V2 login IICS API endpoint to use is https://dm-us.informaticacloud.com/ma/api/v2/user/login.)
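
A minimal Python sketch of that V2 login flow is shown below: it posts credentials to the login endpoint quoted above and reads <serverUrl> and <icSessionId> from the response for use in subsequent requests. The exact JSON body shape and the icSessionId header name are stated here as assumptions; confirm them against the REST API Reference.

    # Minimal sketch of the V2 login flow against the North America POD.
    # The JSON body shape and the "icSessionId" header name are assumptions;
    # confirm them against the Informatica Cloud Data Integration REST API Reference.
    import requests

    login = requests.post(
        "https://dm-us.informaticacloud.com/ma/api/v2/user/login",
        json={"@type": "login", "username": "user@example.com", "password": "********"},
    )
    login.raise_for_status()
    body = login.json()

    server_url = body["serverUrl"]      # base URL for subsequent V2 requests (per this FAQ)
    session_id = body["icSessionId"]    # session token returned by the login call

    # Subsequent V2 requests are built from serverUrl and carry the session ID.
    headers = {"icSessionId": session_id, "Accept": "application/json"}
    # e.g. requests.get(f"{server_url}/api/v2/...", headers=headers)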

 

23. How do I leverage new features such as export/import through APIs in Informatica Intelligent Cloud Services?

New features such as export/import are currently only available through the V3 APIs. To leverage these APIs, use the V3 endpoints described in the REST API Reference. (For example, use the V3 login API with the following endpoint: https://dm-us.informaticacloud.com/saas/public/core/v3/login.)
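
A comparable sketch for the V3 login endpoint quoted above follows; the request body field names, the response shape, and the INFA-SESSION-ID header are assumptions to be confirmed against the REST API Reference.

    # Minimal sketch of a V3 login. Body fields, response shape, and the
    # "INFA-SESSION-ID" header are assumptions; confirm them against the
    # REST API Reference before use.
    import requests

    login = requests.post(
        "https://dm-us.informaticacloud.com/saas/public/core/v3/login",
        json={"username": "user@example.com", "password": "********"},
    )
    login.raise_for_status()
    session_id = login.json()["userInfo"]["sessionId"]   # assumed response shape

    # V3 calls such as export/import then pass the session ID in a request header.
    headers = {"INFA-SESSION-ID": session_id}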

 

24. What are the Cloud Application Integration IP Address ranges that you need to add to your list of approved IP addresses?

Please review KB article 535281 (https://kb.informatica.com/faq/7/Pages/21/535281.aspx) for the whitelist of domain names or IP address ranges.

 

 

Data Integration Questions

 

1. Does the customer have to change mappings, tasks, connections, processes, or other assets after migration to make them work?

The production migration process is seamless, and existing assets will continue to work after migration without manual changes. (Please see question 2 for potential post-migration tasks.)

 

2. What are customers expected to do before, during, and after migration?

Before:

1. Ensure that no metadata is modified during the migration window.
2. As a best practice, back up your Secure Agent directories so that you can restore them easily in the unlikely event of a rollback.
3. Set up an appropriate blackout window.
4. Ensure that your Informatica Cloud Secure Agents are up and running.
5. Make sure that you have completed any task provided by Informatica Global Customer Support for migration.
6. Prepare a checklist to validate post-migration.

During:

Monitor your email for any communication from Informatica Global Customer Support.

After:

1. Log into the new Informatica Intelligent Cloud Services organization and do a quick validation to ensure that all metadata counts are in line.
2. Verify that jobs are running as expected.
3. Reset the blackout window.
4. Review the activity logs and audit logs.
5. If you see any discrepancies, log a support ticket as soon as possible.

 

3. Are there any manual steps involved in the migration?

Informatica is making every effort to automate the migration from end-to-end. However, there are certain areas that need attention:

  • If you use the REST API and have enabled two-factor authentication for your organization, add the following IP addresses to the list of “Trusted IP Ranges” in Informatica Cloud before the migration:
    APP and APP3 PODs: 206.80.52.0/24, 206.80.61.0/24, 209.34.91.0/24, 209.34.80.0/24
    APP2 POD: 206.80.52.0/24, 168.128.27.61/32, 168.128.27.47/32, 168.128.29.12/32, 168.128.29.92/32
  • Outbound message links will change and must be updated in Salesforce after the migration. Informatica will redirect the old links for 4 weeks after the migration, but the new links need to be updated in Salesforce.
  • If you use SAML single sign-on, you must download the IICS service provider metadata after the migration and deliver the metadata and the IICS single sign-on URL for your organization to your SAML identity provider administrator. Additionally, ensure that you update the IICS single sign-on URL and app in your identity provider application.
  • Contact Validation tasks will not be migrated. You need to convert to an Address Doctor based web service to cleanse the addresses.
  • Data Assessment tasks will not be migrated. You need to convert to DQ Radar before the migration.
  • Any task flow that references a Contact Validation or Data Assessment task will not be migrated. If you want the task flow to be migrated, adjust the task flow logic and remove the Contact Validation and Data Assessment tasks before migration.

 

4. What will happen to views after the migration?

Public views in Informatica Cloud are replaced with tags in Informatica Intelligent Cloud Services. All assets in the view will be labeled with a tag in IICS Data Integration that has the same name as the view. For example, if you created a custom view called SalesObjects that contained 30 mappings in Informatica Cloud, all 30 mappings will be labeled with the tag SalesObjects in IICS Data Integration. If the same view name was used for different asset types, the tag names will have different suffixes in IICS Data Integration. For example, if you created the SalesObjects view for mappings and also for mapping tasks, mappings might be labeled with the tag SalesObjects and mapping tasks with the tag SalesObjects_1. You will be able to browse tagged assets and view all assets with a specific tag. Private views, views that are associated with connectors, and activity log views are not migrated.

 

5. Which versions of the REST API can I use with Informatica Intelligent Cloud Services?

REST API version 1 is no longer supported. For IICS, use REST API version 2 and version 3. The Informatica Cloud Data Integration REST API Reference explains the two REST API versions in detail and how to use each of them to interact with Data Integration using REST API calls.

 

6. Can I use the runAJobCli utility to run tasks in Informatica Intelligent Cloud Services?

Yes. To use the utility in Informatica Intelligent Cloud Services, update the restenv.properties file to use the new Informatica Intelligent Cloud Services URL.

Note that if you run the utility with the task name (-n) option, and you have multiple tasks with the same name in different folders, the utility runs the task in the Default folder. To run a task in a different folder, use the task ID (-i) option instead of the task name option.
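
To make the task-ID recommendation above concrete, the small Python wrapper below shells out to the utility with the -i option. The installation path and script name are assumptions; only the -n and -i options come from this answer.

    # Sketch of invoking the RunAJob utility by task ID (-i) so that tasks sharing
    # a name in different folders are unambiguous. The install path and script
    # name are assumptions; -n and -i are the options described in this answer.
    import subprocess

    RUNAJOB = "/opt/infa/runAJobCli/cli.sh"   # assumed install location of the utility

    def run_task_by_id(task_id: str) -> int:
        """Start a task by its ID (preferred over -n when names are duplicated)."""
        result = subprocess.run([RUNAJOB, "-i", task_id])
        return result.returncode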

 

7. The ODBC and JDBC drivers are missing from my MYSQL connector. How do I fix this?

In Informatica Intelligent Cloud Services, Informatica no longer includes the MySQL ODBC and JDBC drivers with the MySQL connector. Before you use the MySQL connector, download and install the drivers. For information about installing and configuring the drivers, see the following article or videos:

 

API & Application Integration Questions

 

1. ICRT Service (including Cloud Premium) customers should be aware of the following:

 

Preview/Sandbox Migration:

Service URLs

You can create and invoke processes in the sandbox account. However, do not use sandbox service URLs in any production activity. Sandbox service URLs are not permanent and are intended only for testing.

Be careful when you invoke processes in the sandbox environment. Verify that the execution of a process does not affect production. For example, if you execute a "Create Order" process in the sandbox, an order will be created.

Scheduled Processes

Your schedules are migrated to the sandbox in the 'Not Enabled' state. This ensures that there are no duplicate process invocations, because legacy Informatica Cloud Real Time continues to function during the sandbox testing period. To test schedules, create new processes in the sandbox and assign schedules to them.

Invoked Processes

Processes invoked before migration do not appear in the sandbox Application Integration Console service. Use the legacy Process Console to work with these processes.

Processes that you invoke using sandbox service URLs will appear in the sandbox Application Integration Console service.

 

Production Migration:

 

 

Service URLs

 

After the migration to IICS/CAI Production, clients can still send requests to the old ICRT service URL, which is automatically redirected to the equivalent CAI service URL. Note, however, that this redirection is available only for a short period, until the end of September 2019. Plan to update your clients to send requests to the new CAI service URL as soon as possible to reduce the number of network hops and improve performance.

 

Invoked Processes

 

We suggest that you stop client requests during the migration window (which is communicated over email), although this is not mandatory. Please plan ahead: any requests instantiated during the production migration will likely not complete successfully. You might receive an HTTP 500 or HTTP 503 response, and the runtime state of any instance that was attempted in ICS/RT will not be migrated to the IICS/CAI production server.

 

 

2. How do I learn about migration?

The Migration folder of the Cloud Application Integration Community Site provides links to a number of resources that can help you with migration.

 

 

Most ICRT service customers use ICS data integration services. To see what's available to you as you migrate to IICS, see the Essential Cloud Data Integration Demos set of videos.

 

3. How do I prepare for migration?

To prepare for the sandbox migration, and both before and after the migration to your production environment, review the ICRT Service Migration to the Cloud API and Application Integration Service document.

 

4. Where are all my processes and other assets?

All your assets have been migrated to the Default folder on the Explore page.

 

5. I moved a Mapping Task out of the Default folder. Now, a process that calls the Mapping Task throws an error. What do I do?

If a process uses the Run Cloud Task service, and you have moved the cloud asset from the Default project to another project or folder, you must go to the process and reselect the cloud task. Save and publish the process.

 

6. Will the Schedules that I created in Informatica Cloud Real Time be migrated?

Yes, all schedules will be migrated to your sandbox account. However, to avoid multiple invocations of the same process, they will be migrated in the 'Not Enabled' state. Informatica suggests that you verify that no processes are scheduled during the migration window.

 

7. Will the Informatica Cloud Real-Time Service URLs still be valid?

Yes, old service URLs will still be valid post sandbox migration. Continue to use these URLs until Informatica migrates your organization to a production account. Do not embed sandbox service URLs anywhere.

 

8. Do I need to republish assets after migration?

No, you do not need to republish assets after migration. All published assets will be migrated in the published state.

 

9. Will my Secure Agent be migrated? Do I need to download an agent again?

Your Secure Agent will be migrated, as will all assets that are published to the agent. You do not need to download an agent again.

 

10. What are the Cloud Application Integration IP Address ranges that you need to add to your list of approved IP addresses?

Please review KB article https://kb.informatica.com/faq/7/Pages/21/535281.aspx for the domain names and IP address ranges to add to your list of approved addresses.

 

11. What will happen to the existing managed APIs after the migration?

Managed APIs will not be migrated. If an API Manager customer needs to migrate the existing managed APIs, then the customer should contact the customer success manager or support manager.

 

12. Will the managed API URLs still be valid?

No, old managed API URLs will not be valid post-production migration if you do not arrange with your customer success manager to migrate the existing managed APIs. Instead, create a new managed API for each service that you want to manage, and use the new URL.

If you do request to migrate existing managed APIs, the old URLs will resolve after production migration and DNS resolution. However, we recommend that you use the new URLs.

 

13. Can I still use the old API Manager?

After production migration, do not use the old API Manager. Instead, use Informatica Intelligent Cloud Services API Manager to create and perform all operations with managed APIs.

 

14. Can I use sandbox API Manager URLs after production migration?

No, do not embed sandbox URLs anywhere.

 

15. Do I need to manually update the Salesforce guide setup URLs after migration?

Yes, you must manually update the Salesforce guide setup URLs after migration. The guide setup URLs will not be automatically redirected after migration.

 

You must log in to Salesforce and manually update the Informatica Cloud Real Time Host URL specified under advanced settings in the Guide Setup tab. If you had embedded guide URLs, you must manually update them with the new URLs after migration. See the next question for more information.

 

16. What are the post-migration tasks that I must perform for the Salesforce managed package?

If you use the Salesforce managed package, you must perform the following tasks after migration:

 

Log in to Salesforce and verify your guides.

Log in to Salesforce and verify that your guides are visible on the relevant Salesforce object pages. If you do not see your guides, log out of Salesforce, clear the browser cache, and then log in to Salesforce.

 

Log in to Salesforce and verify the Informatica Cloud Real Time Host URL.

Log in to Salesforce and verify that the Informatica Cloud Real Time Host URL specified under advanced settings in the Guide Setup tab is correct.

The format of the Host URL must be as follows: <Cloud Application Integration URL>,<Informatica Organization ID>

 

For example: https://na1.ai.dm-us.informaticacloud.com,afdq9RWEA4vjIQWQcE88XB

To view your Cloud Application Integration URL, log in to Informatica Intelligent Cloud Services and select the Application Integration service. In the browser address bar, copy the URL from https through .com.

To view your Informatica Organization ID, log in to Informatica Intelligent Cloud Services, select the Administrator service, and then click Organization. Copy the value displayed in the ID field.

17. Can I use Guide Designer in Salesforce to create a new guide?

No. Informatica does not support Guide Designer in Salesforce. To create a guide, log in to Informatica Intelligent Cloud Services and use the Application Integration Guide Designer.

 

18. What are the post-migration tasks that I must perform for Data Integration tasks that use custom permissions?

If you assigned custom permissions to a Data Integration task and you invoke that task through an Application Integration process or a guide, you must complete one of the following tasks after migration:

  • Give the Application Integration anonymous user permission to run the associated Data Integration asset.
  • Add the Application Integration anonymous user to a user group that has permission to run the associated Data Integration asset.

For example, you can grant the Application Integration anonymous user account permission to run a specific Data Integration mapping.

 

More Information:

If you have licensed Application Integration, Informatica Intelligent Cloud Services creates a system user called CAI_Anonymous_<Organization_ID>. Application Integration needs this user when you invoke an anonymous process that calls a Data Integration task.

 

Important: Do not edit or delete the Application Integration anonymous user if you need to invoke an anonymous process that calls a Data Integration task.

 

For example, an Application Integration anonymous user account might be named CAI_Anonymous_6gPInky12gwbSxPUcH8v0H.

 

19. After migration, why does a process fail if it connects to a service that uses TLS version 1.1 or earlier?

By default, Application Integration uses TLS version 1.2 to connect to third-party services. TLS version 1.1 has been deprecated.

If a process connects to a service that uses TLS version 1.1 or earlier, you must manually edit the server ssl-enabled-protocols property to point to TLS version 1.1.

 

Perform the following steps after migration:

  1. In the Data Integration home page, click Administrator.
  2. Click Runtime Environments.
  3. Click the Secure Agent for which you want to configure TLS.
  4. Click Edit.
  5. Under the System Configuration Details section, select the service as Process Server, and select the type as server.
  6. Click the Edit pencil icon against the server ssl-enabled-protocols property and set the value to 'TLSv1.1'.
  7. Restart the Secure Agent for the changes to take effect.

 

20. When I start the Process Server on a UNIX operating system, why do I see the following errors:
Cannot write to temp location [/tmp]
"java.io.FileNotFoundException: ... (Too many open files)".

These errors occur because UNIX limits the number of files that a single process can open. The NOFILE parameter defines this limit; the default value is 1024.

 

Edit the UNIX security configuration file to allow a larger number of open files. Raise the NOFILE limit from its default value of 1024; a value of 10240 should suffice.

 

Open the /etc/security/limits.conf file and add the following line, where <user> is the operating system user that runs the Secure Agent (use * to apply the limit to all users):

<user>    -    nofile    10240

 

If you are unsure of the value, you can set the value to unlimited.
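To check the limit that currently applies, you can log in as the user that runs the Secure Agent and run the standard shell command ulimit -n, which reports the open file limit for that session; the new limit takes effect the next time the user logs in.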

 

21. After migration, why does a process still show the status as running even though it successfully completed earlier in Informatica Cloud Real Time?

If you had published a process on the agent and the status still shows as running even though the process successfully completed in Informatica Cloud Real Time, you must manually apply a patch to fix the issue. For more information, see the following Informatica Knowledge Base article: 566279.

 

Integration Hub Questions

 

1. What will happen to the existing Integration Hub artifacts after the migration?

Integration Hub artifacts will not be migrated. Informatica Cloud publication and subscription mappings and tasks will be migrated and can be used to create Integration Hub artifacts in Informatica Intelligent Cloud Services. If an Integration Hub customer needs to use the existing Integration Hub artifacts in Informatica Intelligent Cloud Services, then the customer should contact the customer success manager or support manager prior to the migration.

 

B2B Gateway Questions

 

1. What will happen to organizations that are defined in B2B Gateway with invalid Informatica Cloud user credentials?

Organizations with invalid Informatica Cloud user credentials will not be migrated. Before you start the migration, verify that the Informatica Cloud user credentials in the B2B Gateway Settings page are valid.

 

2. What will happen to existing B2B Gateway artifacts after the migration?

All of your B2B Gateway artifacts, including customers, suppliers, and monitoring rules, will be migrated. Customer and supplier schedules will be created in Administrator.

 

3. What will happen to intelligent structure models that are used in B2B Gateway partner definition after the migration?

Intelligent structure models that are used in B2B Gateway partner definitions will be created in Data Integration.

 

4. What will happen to B2B Gateway events after the migration?

B2B Gateway events will not be migrated.

 

5. Will the URLs of the Run Partner API and Event Status API still be valid?

No, Run Partner API and Event Status API URLs will not be valid after the migration. You must update the API requests with the new URLs, as follows:

  • Run Partner API request: https://<pod>-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v1/partner/run
  • Event Status request: https://<pod>-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v2/event/status/<eventId>

...where <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access Cloud B2B Gateway. For example: usw1.
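As a rough illustration of calling the new endpoint, the Python sketch below assumes that you authenticate through the platform v3 login API and pass the resulting session ID in the INFA-SESSION-ID header, and it uses a hypothetical partnerName request body; verify the authentication method and request format in the Cloud B2B Gateway API documentation before relying on anything like this.

import requests

POD = "usw1"  # replace with your IICS point of delivery

# Assumption: platform v3 login; the response carries a session ID for later calls.
login = requests.post(
    "https://dm-us.informaticacloud.com/saas/public/core/v3/login",
    json={"username": "user@example.com", "password": "********"},
)
login.raise_for_status()
session_id = login.json()["userInfo"]["sessionId"]

# Hypothetical request body; check the Run Partner API reference for the real schema.
resp = requests.post(
    f"https://{POD}-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v1/partner/run",
    headers={"INFA-SESSION-ID": session_id},
    json={"partnerName": "Acme"},
)
print(resp.status_code, resp.text)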

As of Feb 5, 2018, Salesforce will be replacing Symantec-issued certificates with new DigiCert-issued certificates.

This will impact Informatica Cloud jobs that use a Salesforce connection with API version 31 or below and have the Bulk API enabled.

For instance, a connection that uses the service URL https://login.salesforce.com/services/Soap/u/31.0 with the “Use Bulk API” option set to true in the task is affected.

Affected jobs will fail with the error message “[ERROR] Bulk API cURL error received. Error message [SSL certificate problem: unable to get local issuer certificate]”.

 

To avoid the impact, follow one of these suggestions:

  • Add the new certificate to the ‘ca-bundle.crt’ file in the agent folder by following KB article 523972. The certificate is attached to the article. (A hedged scripting sketch for this step follows below.)
  • Change the service URL in the Salesforce connection from version 31 or below to the latest supported API version, 39.0. Changing the service URL to 32 or above might have some limitations:
    • PowerCenter service tasks may fail with API 32.0 and above.
    • Tasks with Salesforce.com lookups that return multiple matches and multiple lookup return fields may run into errors.

For the above situations, update the certificates as described in KB 523972.

The certificate file update can be done immediately; you do not have to wait for the day of the change.
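If you prefer to script the certificate update, the following minimal Python sketch appends a downloaded PEM certificate to the agent's ca-bundle.crt. Both file paths are placeholders, so point them at the certificate you downloaded from KB 523972 and at the ca-bundle.crt location in your own Secure Agent installation.

# Placeholder paths: adjust to your Secure Agent installation and downloaded certificate.
CA_BUNDLE = "/path/to/secure_agent/.../ca-bundle.crt"
NEW_CERT = "/path/to/digicert_root.pem"

with open(NEW_CERT) as cert, open(CA_BUNDLE, "a") as bundle:
    # Append the new certificate after the existing ones in the bundle.
    bundle.write("\n" + cert.read())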

 

P.S.: This is not an Informatica Cloud change. It is a Salesforce.com change that requires us to make changes to the Secure Agent to ensure integrations run without issues. Please refer to the Salesforce.com Help article describing this change.

Today, we are incredibly excited to announce the launch of Informatica Intelligent Cloud Services - Winter 2017 December Release. At Informatica, our mission is to accelerate the data-driven digital transformation of businesses. At the heart of this mission is innovation in the enterprise cloud data management space to meet the growing needs of enterprises. To realize this goal, we built Informatica Intelligent Cloud Services (IICS) from the ground up, leveraging cloud-native frameworks.

 

Based on a microservices architecture and a modern user interface, the IICS platform is built for the future to provide complete end-to-end data management in a uniform, non-siloed approach. IICS unifies existing Informatica cloud service offerings and will expand into a full suite of cloud data management services over time. IICS will comprise four clouds: Integration Cloud, Data Quality & Governance Cloud, Master Data Management Cloud, and Data Security Cloud, powered by a common core platform and the Informatica CLAIRE metadata intelligence engine.

 

The Winter 2017 December Release is the initial offering of Informatica Intelligent Cloud Services and covers Cloud Data Integration within Integration Cloud. This release offers a rich set of new capabilities to our customers.

 

Below are some highlights of this release:

 

New & Modern User Interface Experience

First, the new user interface design provides a consistent look and feel across all intelligent cloud services, with experiences tailored to the Designer, Operator, and Administrator user roles through a common user interface shell and service switcher.

 

Next, the new design introduces the concept of workspaces which allows users to keep multiple tabs open within a cloud service. With this capability, users don’t lose the state of assets they are working on when they navigate from one screen to another or open multiple assets for additional context to complete their workflow.

 

Finally, the Home Page provides actionable insights like runtime/job health, recent organization-wide activity as well as easy access to Marketplace and Community.

 

Template Driven Development

Productivity is a key theme for this release. As part of this theme, we added template driven development capability. Customers can now create a new integration asset either from scratch or by choosing a template from dozens of templates packaged as part of the Data Integration service. Templates include pre-built logic and cover data integration areas like data preparation, cleansing and data warehousing.

 

In addition to providing templates to our customers, we want to continue to empower customers to create their own templates and solutions, share those within their organization, and even publish them to Informatica Marketplace. To that end, we expanded the bundles capability to support mapping tasks in addition to mappings, Visio templates, and mapplets.

 

Templates and bundles together provide reusability and promote best practice design, thus boosting productivity and simplifying the development experience for organizations.

 

Enterprise Orchestration

The advanced taskflow capability of this release enables users to design complex taskflows by orchestrating mapping tasks and synchronization tasks in a non-linear fashion. Users can define custom logic that involves actions like parallel tasks, loops, conditions, decisions, wait time, and exception and error handling, thus achieving much more complex orchestrations than before. The advanced taskflow designer has a user interface similar to the mapping designer's, and it integrates seamlessly with the Monitor service.

 

File Mass Ingestion

The file mass ingestion capability of this release enables customers to transfer enterprise data assets in flat file format from on-premises systems to Amazon S3 data stores and Amazon Redshift data warehouses in the cloud using standard protocols: FTP, SFTP, and FTPS. Developers can easily author mass ingestion tasks using a wizard-based approach and monitor task executions at both job and file granularity using the same Monitor service used for other data integration jobs. File mass ingestion is robust and is designed to handle thousands of files a day in a single process.

 

Integrated Asset Management

As data complexity and the scope of integration work grows, asset management becomes a challenge for customers. Projects and folders in the Explore user interface help address this challenge and enable customers to organize their integration assets using taxonomy suitable for their business needs, applying proper security controls.

 

Administrators can now manage project and folder structures and grant access to users through role-based security. Administrators also have the flexibility to set fine-grained access control at the individual project, folder, or asset level on top of role-based security.

Once security is granted, developers can easily store, discover, search, and manage their assets through actions exposed via Explore. As business requirements or team structures change, developers can also reorganize their assets using copy/move operations that take care of asset dependencies.

 

Design-time assets organized in projects and folders can continue to be easily attached to runtime assets like schedules, agents, and connections. Through proper design of role-based and instance-based security for both design-time and runtime assets, enterprises can realize the right level of isolation for their organizations.

 

APIs that enable Continuous Delivery

With the IICS platform, we are promoting a DevOps practice that includes iterative development throughout the life cycle of a project, with close collaboration between developers and IT operations. To enable DevOps practices, we are surfacing project, folder, and asset export and import APIs that facilitate continuous delivery through automation with external version control systems and release and deployment pipelines. In addition to these capabilities that enable continuous delivery, IICS provides automated monitoring capabilities that complete the agile, team-based development model between development and operations.
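To make this concrete, here is a minimal Python sketch of pulling an export package that a pipeline could commit to version control. The endpoint paths follow the public IICS v3 REST API (login, export, and export package resources), but treat them, the status values, and the placeholder project ID as assumptions and confirm them against the REST API reference for your release.

import time
import requests

BASE = "https://dm-us.informaticacloud.com/saas"  # assumption: dm-us POD

login = requests.post(
    BASE + "/public/core/v3/login",
    json={"username": "user@example.com", "password": "********"},
)
login.raise_for_status()
headers = {"INFA-SESSION-ID": login.json()["userInfo"]["sessionId"]}

# Start an export job for a project; "<project-id>" is a placeholder object ID.
job = requests.post(
    BASE + "/public/core/v3/export",
    headers=headers,
    json={"name": "nightly_export",
          "objects": [{"id": "<project-id>", "includeDependencies": True}]},
).json()

# Poll the job, then download the package as a zip file for the version control system.
while requests.get(BASE + "/public/core/v3/export/" + job["id"],
                   headers=headers).json()["status"]["state"] == "IN_PROGRESS":
    time.sleep(5)
package = requests.get(BASE + "/public/core/v3/export/" + job["id"] + "/package",
                       headers=headers)
with open("export.zip", "wb") as f:
    f.write(package.content)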

 

Connectivity Enhancements

Informatica provides extensive connectivity to on-premises and cloud applications and services. We have extended connectivity options in this release by adding several new connectors, most of which are co-engineered with our ecosystem partners. New connectors in this release include: Snowflake, AWS Redshift Spectrum, Microsoft Azure Blob Storage (version 2), Oracle HCM, Microsoft Dynamics 365 for Sales, SAP SuccessFactors, SAP Ariba, MemSQL, CallidusCloud Badgeville, CallidusCloud Litmos. To see a complete list of available connectors, click here

 

Cloud data warehouse connectors such as Snowflake, AWS Redshift, and Microsoft Azure Data Warehouse are optimized for efficient data loads and include features such as pushdown optimization and partitioning.

 

We have also made significant enhancements to more than 40 connectors, including application and SaaS connectors such as NetSuite, Microsoft Dynamics, and Salesforce; analytics connectors such as Tableau; and cloud platform connectors such as Google, Amazon AWS, and Microsoft Azure.

 

Support for popular data formats such as Avro, Parquet, and JSON has been added to cloud object store connectors. Performance enhancements include leveraging the latest APIs, optimizing for bulk loads, and adding additional partitioning options. Technology connector enhancements include a new connector for OData, support for additional Hadoop distributions, and enhancements to REST and file connectors.

 

Informatica is committed to offering universal connectivity across the IICS platform. This release enables Informatica to deliver new and enhanced connectors at a more rapid pace by introducing a connectivity microservice in the platform. The connectivity microservice also enables Informatica to address any issues more seamlessly, providing an overall superior experience.

 

Conclusion

These are just some of the highlights of this release. There are also other useful capabilities in this release, such as custom roles support, self-service non-administrator user registration, in-service notifications, improved job monitoring to isolate job issues, a PowerCenter-to-cloud conversion utility, and more.

 

Informatica Intelligent Cloud Services – Winter 2017 December Release is a significant milestone. At Informatica, we are laser-focused on delivering the next generation of iPaaS and data management, and this is just the beginning of a new journey for us. We can’t wait to see how customers around the world use Informatica Intelligent Cloud Services to unleash the power of data!

 

Learn More

iPaaS Reimagined

Informatica is pleased to announce the launch of our new status page, http://status.informatica.com!

 

Status.informatica.com displays the production status of all publicly-hosted Informatica cloud products. All planned maintenance updates are posted to this status page, and during an unscheduled outage, it will have the most current information. To ensure you are notified of updates and outages in real time, you can subscribe to all cloud products, individual cloud products, specific incidents/outages, or to any changes to the site. Subscribing to the site is the best way to be certain you never miss an update!

 

To subscribe, simply go to http://status.informatica.com and click the blue ‘SUBSCRIBE TO UPDATES’ button.

 

 

  

You can then choose to receive notifications sent as emails, SMS text messages, webhooks, RSS feeds, or any combination.

 

The old trust site, trust.informaticacloud.com/status, is still up and is still broadcasting usage metrics, but going forward, all product updates will be posted only on the new status page. The old site will be phased out over the next couple of months as we migrate the metrics to status.informatica.com.

 

 

- Informatica Cloud Support Team