Authors: gcook

Cloud Data Integration


IICS no longer supports Windows Server 2008 as of the Spring 2020 release.

 

The Windows Server 2008 operating system is already out of support by Microsoft.

 

To avoid issues while using the IICS product, you must upgrade your agent machine's operating system to one of the supported versions defined in the Product Availability Matrix (PAM) for Informatica Intelligent Cloud Services (IICS).

 

We will not support issues on Windows Server 2008 after the next release, in the October/November 2020 timeframe.

 

For any questions or concerns, please open a case with Informatica Technical Support.

The Success Portal Tech Tuesdays technical sessions are designed to encourage interaction and knowledge sharing around some of our latest innovations and capabilities across Informatica products. In these sessions, we focus on new features, the latest releases, performance tuning, best practices, and more, and, where relevant, show a demo or product walk-through as well. For June, all webinars have been planned especially for Informatica Cloud customers.

Upcoming Webinars

 

What’s New in Cloud Application Integration 2020 releases?

Date and Time: June 2, 2020, 8:00 AM Pacific Time
This presentation will cover the features and functionality that have been or will be released as part of the Cloud Application Integration releases in 2020. It will also showcase some of these features through a demo that simulates a real-life use case.

What’s New in Cloud Data Integration?

Date and Time: June 9, 2020, 8:00 AM Pacific Time
Watch this webinar to learn about the new capabilities of Cloud Data Integration service as part of Spring 2020 launch and understand how to leverage them in your integration flows.

Secure Agent in IICS

Date and Time: June 16, 2020, 8:00 AM Pacific Time
This webinar is intended to help users understand best practices for the Secure Agent in IICS. After the webinar, you will be able to manage memory requirements on the agent machine, understand agent-level logging, and manage disk space on the agent machine.

IICS secure agent 101

Date and Time: June 23, 2020, 8:00 AM Pacific Time
This webinar is intended for anyone who works on IICS. It will most benefit IICS admins and ETL architects who procure, maintain, and manage IICS agent infrastructure. The webinar covers the IICS Secure Agent architecture and the operations the agent performs. At the end of this webinar, you will have an end-to-end picture of how the IICS Secure Agent orchestrates its various services, and of those services' requirements. This will also enable users to better troubleshoot agent services and gauge sizing for an agent host.

Integrated Monitoring with Operational Insights

Date and Time: June 30, 2020, 8:00 AM Pacific Time
This webinar is intended for users who are looking for integrated monitoring of Informatica Cloud and on-premises products such as DEI (formerly BDM), IDQ, and PowerCenter. At the end of this session, you will have insights into operational analytics, monitoring services, and infrastructure alerting, along with best practices.


IICS Runtime Continuity FAQ

Posted by gcook Mar 30, 2020

The following frequently asked questions (FAQ) and answers are intended to help customers better understand the runtime continuity capabilities being made available as of the Informatica Intelligent Cloud Services (IICS) Spring 2020 April release.

 

This functionality will allow current jobs to continue running during the upgrade process.

The Spring 2020 release of Informatica Intelligent Cloud Services includes many changes. Please review the attached document for more details. We will provide individual POD upgrade dates within the next week.


Cloud Data Integration

Continued investment in the current leader in integration for data warehousing to make it more scalable, flexible, and dynamic.

  • Data warehousing support for transaction control transform and dynamic file names to enhance the control over the writing of target data.
  • Highly requested mid-stream data preview for a better debugging experience.
  • Expansion of the user, task, and sessions variables for enhanced expression creation.
  • Enhancements to parameterization capabilities to access parameter files from cloud stores like Amazon S3, Microsoft Azure Data Lake Storage Gen2 (ADLS Gen2), or Google Cloud Storage.
  • Improvements to the Union transformation to support more than two input groups.
  • Ability to specify the data flow run order to load targets in a mapping in a particular order.
  • Extending Change Data Capture (CDC) sourcing capabilities to include z/OS VSAM file capture.
  • Databricks Delta connector as a source and target for mass ingestion tasks.
  • Roll-out of Operational Insights for Cloud Data Integration and improved visualizations for new operational analytics.

 

Cloud Data Integration Elastic

  • Support for hierarchical datatypes. While applications are collecting massive amounts of data, including IoT data, storage costs become a concern and we begin looking at columnar databases and binary formats such as JSON, Avro, Parquet, and ORC. To enable BI applications to use the binary data, developers can pass the data into an elastic mapping and convert it into a structured format.
  • Serverless. A new modern cloud architecture that removes the need to manage servers, virtual machines (VMs), and containers. Developers can run mappings in response to events without weighing down administrators to maintain infrastructure or install a Secure Agent. Auto-scaling, high availability, and recovery are built in and enabled automatically. This feature is in preview.
  • Auto-tuning. Traditional approaches to manually tuning the Spark engine are inefficient and time-consuming, too often characterized by trial and error. Tuning attempts are error-prone because one parameter generally impacts another. CLAIRE, Informatica's AI engine, now automatically tunes mapping tasks based on a number of heuristics, including the size of the data, the transformation logic, the size of the cluster, and the size of the cluster nodes.
  • Node bootstrapping. Cluster nodes can run initialization scripts to perform bootstrap actions that install additional software or customize cluster instances.
  • Support for the Sequence Generator transformation.
  • Auto-scaling enhancements for an elastic cluster through a custom scheduler.
  • Support for high availability. You can configure an elastic cluster to become highly available so that jobs continue running smoothly without interruption.
  • New platform capabilities: Continuous availability through upgrades, job recovery, high availability, and disaster recovery.

 

Taskflows for Cloud Data Integration

Continued investments of Cloud Data Integration taskflow support of ingestion use cases. Taskflows now support:

  • Mass ingestion tasks: Provides the means to apply pre- or post-process business logic before or after ingesting data to or from a data lake.
  • Inline file listener tasks: Makes it possible to wait for the arrival of a file before proceeding to further processing steps.
  • Unique API name override: Customers can now override the unique API name assigned to taskflows. This makes it easier to put new versions in production without the need to update consumers.

Note: You can find a summary of taskflow features introduced in 2019 in the Taskflow folder of the Cloud Data Integration community and specifically in this article: Cloud Data Integration - Taskflows - 2019 Highlights
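The inline file listener idea above can be sketched as a simple polling loop. This is only an illustration of the "wait for a file before proceeding" concept, not the IICS implementation; the file name and timeout are hypothetical:

```python
import time
from pathlib import Path

def wait_for_file(path: str, timeout_s: float = 60.0, poll_s: float = 1.0) -> bool:
    """Poll until `path` exists or the timeout elapses.

    Illustrates the concept behind an inline file listener task: a
    downstream step runs only after the expected file has arrived.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if Path(path).exists():
            return True
        time.sleep(poll_s)
    return False
```

In a taskflow, the next processing step would run only when the listener reports that the file has arrived.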

 

Cloud Data Quality

Cloud Data Quality continues its evolution of data quality services. New with the Spring 2020 release are:

  • Deduplicate asset:
    • Identity matching (single-source) functionality to compare records on a field-by-field basis and generate a numerical score that indicates the degree of similarity between the records.
    • Optional consolidation of the records that the deduplication process identifies as duplicates. Consolidation evaluates each set of duplicate records and creates a preferred version of the records in the set according to criteria that the user defines.
  • Parse asset: Parse discrete words or strings from an input field using regular expressions and/or dictionaries.
  • Rule specification asset: Additional function expressions are available in rule statements: Convert to Date, Convert to Float, Convert to Integer, Is Spaces, Substring, and To Char.
  • Data Profiling:
    • Perform What-If scenarios by profiling outputs of Data Quality assets: rule specification, verifier, and cleanse.
    • Extension of source support for Salesforce, Microsoft Common Data Model, and Google Big Query.
    • Auto-Assignment of Data Quality assets to source columns of Microsoft Common Data Model.
    • Ability to modify connection and source object from existing profiling tasks.
    • Option to export records that have data quality issues to a delimited file.
    • Integration of Data Profiling with the Operational Insights service.
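The field-by-field matching behind the Deduplicate asset can be illustrated with a toy sketch: compare two records field by field and average a per-field similarity into one numerical score. This is an illustration of the scoring concept only, not Informatica's identity-matching algorithm; the field names are hypothetical:

```python
from difflib import SequenceMatcher

def match_score(a: dict, b: dict, fields: list) -> float:
    """Average per-field string similarity (0.0..1.0) across `fields`.

    A score near 1.0 suggests the two records describe the same entity.
    """
    scores = []
    for f in fields:
        x = str(a.get(f, "")).lower()
        y = str(b.get(f, "")).lower()
        scores.append(SequenceMatcher(None, x, y).ratio())
    return sum(scores) / len(scores) if scores else 0.0

r1 = {"name": "Jon Smith", "city": "Boston"}
r2 = {"name": "John Smith", "city": "Boston"}
score = match_score(r1, r2, ["name", "city"])  # high score suggests a duplicate
```

A consolidation step would then take each set of records scoring above a threshold and build one preferred "survivor" record from them.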

 

Cloud Integration Hub

  • Ability to subscribe using the API to partially published data, allowing smaller amounts of data to be consumed per API subscription call.
  • Subscription API does not have a size limit when using a private publication repository.
  • Ability to create a synchronization task-based subscription from the Cloud Integration Hub subscription menu for better self-service.
  • Ability to create a topic based on an existing connection schema.
  • Explore functionality now supports folders and tags.
  • Usability enhancements for topics with topic-related publications and subscription tables and visual alignment across all pages.
  • Performance enhancements when using a private publication repository.
  • Cloud Integration Hub onboarded to the Azure POD.

 

Intelligent Structure Discovery (ISD)

  • Ability to use the Structure Parser transformation in Data Integration with real-time connectors.
  • Ability to use the Structure Parser transformation in Data Integration in passthrough mode.
  • Ability to configure the Structure Parser transformation in Data Integration for non-relational output (serialize to JSON, XML, Parquet, Avro, and ORC).

 

B2B Gateway

  • B2B Partners portal: Takes partner community management to the next level by empowering business partners. Partners can use the portal to track their file exchanges with the organization, and to send and receive files to and from the organization over the secure HTTPS protocol.
  • SFTP server: The new SFTP service provides customers with the ability to manage and use SFTP servers to exchange files with partners.

 

API Gateway

  • OAuth 2.0 support: The API Gateway and the Informatica Intelligent Cloud Services platform deliver a new authorization option for API access. Initially available to Cloud Application Service API consumers, OAuth 2.0, used in conjunction with a client ID and secret, extends the authorization choices, which today include basic authentication and JSON Web Token (JWT)-based authentication.
  • Personal Identifiable Information (PII) privacy policies have been extended. The PII policy not only reports on incoming and outgoing PII transfer, but now also provides the ability to block requests and responses that contain sensitive PII data.
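As a generic illustration of how client ID/secret authorization works, the sketch below builds a standard OAuth 2.0 client-credentials token request per RFC 6749. The gateway URL is hypothetical; the actual token endpoint for the API Gateway should be taken from the product documentation:

```python
import base64
from urllib.parse import urlencode
from urllib.request import Request

def build_token_request(token_url: str, client_id: str, client_secret: str) -> Request:
    """Build a standard OAuth 2.0 client-credentials token request.

    The client ID and secret go into an HTTP Basic Authorization header,
    and the grant type into a form-encoded body (RFC 6749, section 4.4).
    """
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    body = urlencode({"grant_type": "client_credentials"}).encode()
    return Request(
        token_url,  # hypothetical gateway token endpoint
        data=body,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )

req = build_token_request("https://gateway.example.com/oauth/token", "my-id", "my-secret")
```

The access token in the response would then be sent as `Authorization: Bearer <token>` on subsequent API calls.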

 

Cloud Application Integration

Spring 2020 April

This release includes the following updates:

  • Support for an Operational Insights Dashboard for Application Integration depicting API, process, and connector execution metrics.
  • Application Integration API endpoint authorization.
  • Application Integration can now make use of the API Gateway's OAuth service. This extends support for OAuth 2.0-based authorization in addition to the current basic authentication and JSON Web Token (JWT)-based authentication capabilities.
  • Application Integration can now restrict incoming requests to those routed through the API Gateway. This ensures that monitoring and policy enforcement are applied consistently at the gateway.

 

January 2020

The January 2020 release represents a major release of Application Integration. See the following link for information about the new features: Announcing the January 2020 release of Cloud Application Integration

Summarizing:

  • Making it easier to implement by creating process object schemas simply by importing WSDL, XSD, or Swagger interface documents from a file, a zip, or a URL.
  • Making it easier to debug:
    • To help you debug processes, the Assignment step now gives you access to field values controlled by a process’s tracing level. To turn up tracing without redeploying a process, a new X-Debug HTTP header has been introduced.
    • Tools such as Postman, SoapUI, or RESTClient are great but require you to leave Process Designer to initiate a process. You can now create one or more JSON or XML process inputs, associate them with a process, and run the process with one or all of them. You can then use the new process instance debugging capabilities to better analyze and identify the root cause of errors.
  • Making it easier to consume databases as fully fledged REST-enabled APIs. Not only can you enable your database with OData v4 with a single click, you can now expose it as a fully fledged REST API. Just download the auto-generated Swagger interface and you're good to go.
  • Making it easier for developers to consume JSON and XML content-type responses and work with attachments and multipart responses.
  • Unique API name override for processes and taskflows.
  • Making it possible for operators to restart processes from the Application Integration Console to recover from unhandled errors communicating with the end system.
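The X-Debug header mentioned above can be illustrated with a plain HTTP request. The process endpoint URL and header value below are hypothetical; check the Application Integration documentation for the accepted values:

```python
from urllib.request import Request

# Hypothetical endpoint for a deployed Application Integration process.
PROCESS_URL = "https://pod.example.informaticacloud.com/active-bpel/rt/MyProcess"

def build_debug_invoke(payload: bytes, tracing_level: str = "verbose") -> Request:
    """Invoke a process with the X-Debug header set, raising the tracing
    level for this one request without redeploying the process."""
    return Request(
        PROCESS_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "X-Debug": tracing_level,  # hypothetical value; see product docs
        },
        method="POST",
    )

req = build_debug_invoke(b'{"orderId": 42}')
```

Because the header travels with the request, you can turn tracing up for a single problematic invocation while normal traffic keeps the process's configured tracing level.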

 

Cloud Mass Ingestion Service

Enhanced capabilities for data ingestion from a variety of sources, using a simple and unified user experience with extensive connectivity, to address mass ingestion use cases for Cloud data warehouses and Cloud data lakes.

 

Mass Ingestion Databases

Continuing to build usability and resilience into the service while adding new functionality. New features include:

  • Schema drift support, which enables running database ingestion tasks to recognize when the schema of source tables within the task changes (column changes only) and to dynamically process the changes through to the CDC-supported target.
  • Asset import and export functionality for database ingestion tasks.
  • GitHub source control for database ingestion tasks.
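The schema drift idea can be sketched conceptually: compare the column set last seen for a source table with the current one and report the difference. This toy sketch shows only the detection step, not how the Mass Ingestion service propagates changes to the CDC target:

```python
def detect_column_drift(previous: dict, current: dict) -> dict:
    """Compare two {column: type} schemas and report column-level drift."""
    added = {c: t for c, t in current.items() if c not in previous}
    dropped = {c: t for c, t in previous.items() if c not in current}
    retyped = {c: (previous[c], t) for c, t in current.items()
               if c in previous and previous[c] != t}
    return {"added": added, "dropped": dropped, "retyped": retyped}

old = {"id": "int", "name": "varchar"}
new = {"id": "int", "name": "varchar", "email": "varchar"}
drift = detect_column_drift(old, new)  # reports the new 'email' column
```

A running ingestion task would perform this kind of comparison on each sync and apply the additions to the target rather than failing the job.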

 

Mass Ingestion Streaming

  • Continuing to enable ingestion from a variety of streaming sources, with real-time monitoring and lifecycle management.
  • New streaming connectivity and enhancements, new streaming sources and targets:
    • New connectivity: Amazon Kinesis Streams source and Microsoft Azure Data Lake Storage Gen2 target.
    • Connectivity enhancements: Flat file source.
    • Test connection for Mass Ingestion Streaming connectors.
  • New transformations during ingestion:
    • Python transformation support
    • Splitter transformation support (technical preview).
  • Real-time monitoring and lifecycle management:
    • Real-time refresh of Mass Ingestion Streaming job statistics.
    • Stop and Resume support for Mass Ingestion Streaming tasks.
  • Enterprise readiness:
    • Sub-organization support to represent different business environments.
    • Integration with GitHub for source control on Mass Ingestion Streaming tasks.
    • Deployment of the Mass Ingestion Streaming service in Kubernetes for autoscaling and high availability.

 

Mass Ingestion Files

  • Continuing to enable mass ingestion of files from a variety of sources to cloud repositories, with real-time monitoring and flexible scheduling capabilities.
  • New connectivity: Databricks Delta Lake as source and target.
  • Taskflow integration for Mass Ingestion Files tasks to support complex file ingestion flows that require orchestration of multiple tasks.

 

MDM - Reference 360

  • Delta export:
    • Retrieve values that have changed in a given time period.
  • Validation:
    • Configure attribute-level validation rules to be executed when creating or editing code values.
    • Receive validation errors on create and update of individual code values.
  • Improved loading of larger data sets:
    • Reduce the loading time for data sets and hierarchies by loading a subset of code values at a time.
  • Export enhancements:
    • Choose which attributes to export both from UI and API.
    • Export data in JSON format with the REST API.
  • Workflow email notifications:
    • Receive email confirmations at each step of an approval workflow.
    • Navigate to your task in Reference 360 through a link in the notification email.

 

IICS Platform

Runtime Continuity (more details here)

Runtime continuity (zero downtime for runtime) to enable uninterrupted execution of scheduled jobs and processes at all times including during Informatica Intelligent Cloud Services upgrade windows.

 

GitHub Integration

  • Undo checkout capability for administrators on behalf of any user and any asset.
  • Bulk unlink capability.
  • Multi-project pull capability to enable pull across multiple projects with cross-project dependencies.
  • Automatically deploy upstream dependent objects upon save to be consistent with the behavior of non-source-controlled Cloud Data Integration assets.
  • Inline Git actions in dependency view.

 

Export/import Environment Properties and Schedules

  • Ability to export/import Secure Agent configurations to automate agent provisioning with the option to either restore or tune agent configuration settings through export/import APIs. This enables users to tune runtime environment properties such as agent configurations.
  • Ability to export/import schedule objects across environments through export/import APIs and in the Informatica Intelligent Cloud Services user interface.

 

Asset Management

  • Ability to reassign asset ownership to enable uninterrupted job execution when the asset owners are disabled in the organization.

 

New PODs

  • Informatica Intelligent Cloud Services availability on the Japan POD (Azure).
  • Informatica Intelligent Cloud Services availability on the Google Cloud Platform (GCP) POD and Marketplace.
  • Informatica Intelligent Cloud Services availability in Canada region.

 

Operational Insights

  • Global availability of Operational Insights across all Amazon Web Services (AWS) PODs.
  • Generate alerts based on resource usage by individual agent services.
  • Generate alerts on disk utilization of Secure Agents.
  • Take preventive actions using custom scripts based on generated alerts.
  • In-app notification of infrastructure alert messages.

 

Ecosystems and Connectivity

  • Expanded coverage in terms of functional depth and breadth for cloud ecosystems:
    • Azure:
      • SQL Data Warehouse V3: Unconnected lookup, ADLS Gen2 as optional staging area for Polybase, improved exception handling, source/target parameterization overrides with parameter files, parameterization overrides (for schema, database, and table) in PRE SQL/POST SQL/SQL Override, and performance enhancements.
      • ADLS Gen2: Source/target parameterization overrides with parameter files,  Azure Gov Cloud, FileName port, New data type (Date/Time) support for Parquet files, Parquet Decimal, Date/Time data type support, User authenticated proxy, and performance enhancements.
      • Blob Storage: Source/target parameterization overrides with parameter files.
      • CDM Folders (available for preview only): Support for the new CDM schema (v0.9) and Unicode character support.
    • AWS:
      • S3 V2: Hierarchical data types, ISD, multi-region, Parquet Decimal, Date/Time datatype support, KMS (other accounts), source/target parameterization.
      • RedShift: Ecosystem pushdown optimization (S3 to RedShift), JDBC driver update, KMS (other accounts), multi-region, source/target parameterization.
    • Snowflake DW:
      • Database pushdown optimization enhancements.
      • Unconnected lookups with pushdown optimization.
      • Snowflake on Google Cloud Platform.
      • OAuth2.
      • gzip compression.
    • Google:
      • IICS on Google Cloud Platform.
      • BigQuery: CDC support, lookup.
      • Google Cloud Storage: Read from directory.
    • Microsoft Business Apps:
      • Common Data Model (CDM) folders: Schema update (0.9).
      • Dynamics 365 Operations:  Certificate, update retry.
      • CRM:  Client secrets.
    • SAP:
      • SAP BW Reader: Supports dates before the year 1753.
      • SAP HANA (new private connector): Read from tables and modelling views (analytical, attribute, and calculation views).
    • Salesforce:
      • Sales and Service Cloud: API updates,  DRS enhancements to support deleted records.
      • Commerce Cloud: Cycle dependencies.
    • Oracle Apps:
      • Oracle HCM:  Writer updates.
      • NetSuite: API update (2019.2).
    • Adobe:  XDM connector enhancements.
  • Support for new patterns and use cases:
    • Cloud Data Integration Elastic: 
      • Amazon S3 V2:  Hierarchical data type and ISD support.
      • Azure SQL Data Warehouse V3: Elastic mappings.
      • ADLS Gen2: Elastic mappings (available for preview only).
      • JDBC V2: Scala v2, partitioning, AWS and Azure runtime support.
    • Cloud Mass Ingestion:
      • Databricks Delta
      • ADLS Gen2: Direct Polybase load from ADLS Gen2 and performance improvements.
    • Cloud Data Quality: CDM
    • Kafka Pub/Sub Connector in Cloud Data Integration.
  • Improved connectivity across horizontal technology, systems, and applications:
    • Technology:
      • OAuth 2.0 JWT support.
      • REST V2:  Real-time transaction support, minor enhancements,  Hadoop 3.1 for Cloudera, 4.1 for HDInsights.
    • Database and EDW:
      • ODBC:  Unconnected lookup.
      • Oracle:  Blob/Clob datatype.
      • MySQL:  Advanced runtime properties.
      • SQL Server:  Advanced runtime properties.
      • Azure SQL DB: Bulk, Upsert.
      • PostgreSQL:  Schema name enhancements.
    • MongoDB:  BSON and JSON document support, partitioning, create target with schema-less support.
    • Workday:  Hierarchy parameterization support.
    • ServiceNow:  Upsert.
    • Cloud Apps:
      • Cvent:  Child object
      • Coupa: Proxy
  • New Add-on Connectors page to access add-on connectors on Marketplace.

Want to deploy code every 11.7 seconds? Transform your integration with Agile, DevOps, CI/CD

 

Deploy Faster, Reduce Outages, and Increase Revenue & Productivity

Tuesday, February 4, 2020 8:30am PST

Within a year of Amazon's move to AWS, engineers were deploying code every 11.7 seconds, on average. The agile approach also reduced both the number and duration of outages, resulting in increased revenue. Many organizations have moved from clunky legacy software and systems to an agile-based DevOps approach that has yielded a significant increase in development productivity.

Agile, DevOps, and CI/CD are three distinct and critical tools for today's development organization. When all three are used for their intended purposes, the results are transformational. But what do these tools mean in the context of application and data integration?

Join this webinar to learn about leveraging Informatica’s cloud data integration and application integration services to support your CI/CD pipeline. You will learn how to:

 

  • Enable version control of application and data integration assets with any third-party version control system
  • Perform continuous integration and continuous delivery for application and data integrations
  • Promote DevOps for application and data integration

 

Featured Speakers:

  • Vivin Nath, Principal Product Manager, Informatica
  • Sorabh Agarwal, Principal Product Manager, Informatica

 



Click here to register

We are hosting a webinar with Kelly Services on September 4, 2019. Ravi Ginjupalli reviews the process of mastering four critical domains, starting with customer, and connecting Informatica MDM with Salesforce through IICS. The webinar is co-presented with Capgemini, and also covers a Teradata-to-Azure migration at Kelly Services. Hear about the story, challenges, and lessons learned at Kelly Services as Ravi Ginjupalli and Venkat Gupta of Capgemini discuss the approach and technology behind their effort.

 

You can register for the webinar on Sept 4, 2019, at 11 AM PDT here: https://www.informatica.com/about-us/webinars.html?commid=367064

 

Kelly Services realized:

  • 99.99% accuracy in identifying duplicates using MDM match rules
  • 1,000 accounts added per week on average (includes both prospects and new clients)
  • 500K customer records loaded during the initial load
  • 18% of duplicate customer records in Salesforce eliminated
  • 6-month time to deliver: Salesforce + D&B integration + hierarchies; 180 tables; 5,218 attributes; 288 mappings

Webinar: Meet the Experts: Deep-Dive, Demo, Roadmap - Informatica Cloud App/API Integration

 


Join Informatica product experts as they dive deep into the API and application integration capabilities for accelerating your digital transformation. You will learn:

 

  1. How to develop processes, APIs, and connect to any APIs without any coding
  2. What Intelligent APIs are and why Informatica is uniquely qualified to offer them
  3. About management of integration artifacts and APIs
  4. The “ilities” (performance, scalability, reliability) of our platform
  5. The IaaS, SaaS, and on-prem partners we integrate with

ALERT: For customers using Informatica Cloud

 

To better serve our customers, we place older and unused connectors in End-of-Life (EOL) or Maintenance Mode. If you need a new connector enabled, per the customer action below, please create a shipping case and request that we add the new connector to your org(s). The differences between Maintenance Mode and EOL are summarized in the table below:

 

| Term | Description | Bug Fixes | Enhancements | Connector Continues to Work | Customer Action |
| --- | --- | --- | --- | --- | --- |
| End-of-Life (EOL) | Connector at end of life. Informatica will no longer support it: no bug fixes, no enhancements. There will be no automatic migrations or upgrades for existing work. | No | No | No | The connector will no longer work after the next release (~6 months from the initial announcement) and will no longer be available in your org. Please verify that you are not using the connector, or move mappings to an alternative connector, if available. |
| Maintenance Mode | Connector in maintenance mode. Informatica will no longer enhance it; bug fixes may be considered. There will be no automatic migrations or upgrades for existing work. You will need to apply the latest recommended connector and migrate your jobs to it. | Yes | No | Yes | Customers should consider moving to the alternative connector, if available; the alternative connector will continue to be enhanced as necessary. |

 

Am I impacted?

Refer to the list below to determine whether you are using one of these connectors. As necessary, emails are sent to Subscription customers for these connectors in advance of EOL. For EOL connectors, customers are given 6 months' notice to address their existing mappings; Maintenance Mode connectors will move to EOL as appropriate.

 

How do I address the issue?

Please refer to the Customer Action column in the table above, and to the Alternative Connector and Notes to Customers columns in the table below.

 

The following table shows the connectors in end-of-life (EOL) or maintenance mode. The table is continually updated.

 

| Nr | Data Source | Connector Name | EOL or Maintenance Mode | Alternative Connector | Notes to Customers |
| --- | --- | --- | --- | --- | --- |
| 1 | Amazon QuickSight | Amazon QuickSight | EOL | None | |
| 2 | Arc GIS | Arc GIS | EOL | None | |
| 3 | Attensity Discovery Now | Attensity Discovery Now | EOL | None | |
| 4 | Avature | Avature | EOL | Generic REST V2 / WS Connector | |
| 5 | Birst | Birst | EOL | Birst Cloud Connect | |
| 6 | Cloud File Transfer | Cloud File Transfer | EOL | None | |
| 7 | Club Assistant | ClubAssistant | EOL | None | |
| 8 | DataSift | DataSift | EOL | None | |
| 9 | EPIC | EPIC | EOL | None | |
| 10 | IDVExpress | IDVExpress | EOL | None | |
| 11 | Informatica Data Prep | Informatica Data Prep | EOL | None | |
| 12 | Informatica Rev | Informatica Rev | EOL | None | |
| 13 | Intuit QuickBooks | Intuit Quickbooks | EOL | QuickBooks V2 | |
| 14 | Intuit Quickbooks Online | Intuit Quickbooks Online | EOL | None | |
| 15 | Magento | Magento | EOL | None | |
| 16 | Marketo | Marketo 2 | EOL | Marketo V3 | |
| 17 | Microsoft Dynamics AX | Microsoft Dynamics AX 2009 | EOL | None | |
| 18 | Microsoft Dynamics GP | Microsoft Dynamics GP 2010 | EOL | None | New connector on roadmap. |
| 19 | Oracle Netsuite | NetSuite (Restlet) Write only | EOL | NetSuite | |
| 20 | Oracle Peoplesoft | Oracle Peoplesoft 9.x | EOL | Use generic REST V2 or WS Connector | |
| 21 | Oracle Taleo Business Edition | Oracle Taleo Business Edition | EOL | Generic REST V2 / WS Connector | |
| 22 | Oracle Taleo Enterprise Edition | Oracle Taleo Enterprise Edition | EOL | Generic REST V2 / WS Connector | |
| 23 | Rapnet | Rapnet | EOL | None | |
| 24 | Rave | Rave | EOL | None | |
| 25 | Reltio | Reltio | EOL | None | |
| 26 | Rev | Rev | EOL | None | |
| 27 | Saaggita | Saaggita | EOL | None | |
| 28 | Salesforce Insights | Salesforce Insights | EOL | None | |
| 29 | Snowflake | Snowflake V1 Connector | EOL | Snowflake Cloud Data Warehouse | |
| 30 | Snowflake | Snowflake Big Data Warehouse | EOL | Snowflake Cloud Data Warehouse | |
| 31 | Sugar CRM | Sugar CRM | EOL | Sugar CRM REST | |
| 32 | Tableau (Server) | Tableau V1 | EOL | Tableau V3 | |
| 33 | Trackwise | Trackwise | EOL | None | |
| 34 | Vindicia | Vindicia | EOL | None | |
| 35 | Zoho | Zoho | EOL | Generic REST V2 / WS Connector | |
| 36 | Amazon Aurora | Amazon Aurora | Maintenance Mode | MySQL | |
| 37 | Amazon Dynamo DB | Amazon Dynamo DB | Maintenance Mode | None | New connector on roadmap. |
| 38 | Anaplan | Anaplan | Maintenance Mode | Anaplan V2 | |
| 39 | Apache HDFS | Hadoop Files | Maintenance Mode | Hadoop Files V2 | |
| 40 | Apache Hive | Hadoop | Maintenance Mode | Hive Connector | |
| 41 | Box | Box | Maintenance Mode | None | New connector on roadmap. |
| 42 | Box | Box API | Maintenance Mode | None | New connector on roadmap. |
| 43 | Chatter | Chatter | Maintenance Mode | None | |
| 44 | Coupa | Coupa | Maintenance Mode | Coupa V2 | |
| 45 | Dropbox | Dropbox | Maintenance Mode | None | New connector on roadmap. |
| 46 | Eloqua | Eloqua (Soap) | Maintenance Mode | Eloqua Bulk, Eloqua REST | |
| 47 | FileList | FileList | Maintenance Mode | Flat File | |
| 48 | FileProcessor | FileProcessor | Maintenance Mode | File Mass Ingestion Services | |
| 49 | Google API | Google API | Maintenance Mode | Google Analytics | |
| 50 | JDBC | JDBC | Maintenance Mode | JDBC V2, JDBC IC | |
| 51 | LinkedIn | LinkedIn | Maintenance Mode | None | New connector on roadmap. |
| 52 | Marketo | Marketo | Maintenance Mode | Marketo V3 | |
| 53 | Marketo | Marketo REST | Maintenance Mode | Marketo V3 | |
| 54 | MemSQL | MemSQL | Maintenance Mode | MemSQL V2 | Work with MemSQL for connector access. |
| 55 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V1 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new and updating existing mappings to use the Blob Storage V3 connector. The V3 connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020. |
| 56 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V2 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new and updating existing mappings to use the Blob Storage V3 connector. The V3 connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020. |
| 57 | Microsoft Azure Cosmos DB SQL API | Microsoft Azure Document DB | Maintenance Mode | Microsoft Azure Cosmos DB SQL API | Consider building new and updating existing mappings to use the Cosmos DB SQL API connector. The Cosmos DB SQL API connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020. |
| 58 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V1 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new and updating existing mappings to use the ADLS V3 connector. The V3 connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020. |
| 59 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V2 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new and updating existing mappings to use the ADLS V3 connector. The V3 connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020. |
| 60 | Microsoft Azure SQL DW | Microsoft Azure SQL DW V1 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new and updating existing mappings to use the SQL DW V3 connector. The V3 connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020. |
| 61 | Microsoft Azure SQL DW | Microsoft Azure SQL Data Warehouse V2 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new and updating existing mappings to use the SQL DW V3 connector. The V3 connector does not support DSS yet; DSS-equivalent functionality is planned for 1H 2020. |
| 62 | Microsoft Dynamics AX | Microsoft Dynamics AX 2012 | Maintenance Mode | Microsoft Dynamics AX 2012 V3 | |
| 63 | Microsoft Dynamics AX | Microsoft Dynamics AX v3 (supports AX 2012) | Maintenance Mode | None | |
| 64 | Microsoft Dynamics NAV | Microsoft Dynamics NAV 2009 - 2013 | Maintenance Mode | None | |
| 65 | Microsoft Excel | Microsoft Excel v1 | Maintenance Mode | Intelligent Structure Discovery | |
| 66 | Oracle CPQ (BigMachines) | Oracle CPQ (BigMachines) | Maintenance Mode | REST V2 | |
| 67 | Oracle EBS | Oracle EBS 12.x (Cloud only) | Maintenance Mode | Use generic REST V2 or WS Connector | |
| 68 | Oracle EBS | Oracle InterfaceTable | Maintenance Mode | | |

Use generic REST V2 or WS Connector

 

69SAP AribaAriba HierMaintenance ModeAriba V2

70

SAP Concur

SAP Concur

Maintenance Mode

Concur V2

 

71

SAP SuccessFactors

SAP SuccessFactors SOAP

Maintenance Mode

SAP SuccessFactors Odata

 

72

TFS

TFS

Maintenance Mode

Generic REST V2 / WS Connector

 

73

TM2

TM2

Maintenance Mode

None

 

74

Twitter

Twitter

Maintenance Mode

None

New connector on roadmap.

75

WebServices - REST

REST

Maintenance Mode

REST V2

 

76

WebServices - SOAP

SOAP WebServices

Maintenance Mode

Webservices Consumer Transform

 

77

Webservices V2

Webservices V2

Maintenance Mode

Webservices Consumer Transform

 

78

Workday

Workday

Maintenance Mode

Workday V2

 

79

Zendesk

Zendesk

Maintenance Mode

Zendesk V2

 

80

Zuora

Zoura (SOAP)

Maintenance Mode

Zuora REST V2, Zuora AQuA

 

 

Updates

  • 6/2019: added MemSQL and Eloqua
  • 7/2019: added Ariba
  • 11/2019: added BigMachines
  • 2/2020: added MSD AX v3, MSD NAV 2009-13
  • 5/2020: added Amazon Aurora, Hadoop Files, FileList, FileProcessor, JDBC
  • 6/2020: revised alternative connector for Intuit QuickBooks Online

The Informatica Intelligent Cloud Services (IICS) Winter 2019 release offers several new capabilities that address key data challenges that businesses are facing today. Highlights are listed below.

 

Data Integration

  • Data discovery in Cloud Data Integration with Enterprise Data Catalog (EDC) integration - Customers can now search and discover enterprise-wide metadata from within Data Integration, import connection & object metadata, and use that information to more easily create new or enrich existing mappings and tasks by connecting with an existing EDC installation.
  • “Smart match” recommendations for field mappings increases the frequency of field matches in mappings and tasks. Expanding on the existing automatch, smart match looks for common patterns in field names (prefixes, suffixes, abbreviations, etc.) based on six additional matchers and fuzzy match techniques for recommending field mappings.
  • Taskflows can be invoked via APIs for externalized scheduling and execution. With this enhancement, customers can now invoke taskflows on demand via an API call and provide input parameters for the tasks they orchestrate, allowing customers to fully leverage Data Integration’s parameterization capabilities. Please refer to the Taskflow as a Service FAQ.
  • Taskflows have also been enhanced to allow them to embed other taskflows to promote reuse.
  • Change data capture has been expanded to include additional sources for DB2 on Linux, Unix, Windows, and iSeries (also known as AS400, i5/OS) platforms, which further enables near real-time changed data propagation capabilities.
  • Mass ingestion is extending connector support, adding Google Storage & Google Big Query as targets and HDFS as both a source and target. Additional enhancements expose CRUD-focused APIs.
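The taskflow invocation described above can be sketched as a plain REST call. The host, taskflow name ("DailyLoad"), parameter, and endpoint path below are illustrative assumptions, not documented values; consult the Taskflow as a Service FAQ for the published URL of your own taskflow.

```shell
# Hedged sketch: invoking a published taskflow on demand via its REST endpoint.
# Host, taskflow name, and input parameter are hypothetical placeholders.
POD_URL="https://na1.dm-us.informaticacloud.com"
TASKFLOW_NAME="DailyLoad"
ENDPOINT="$POD_URL/active-bpel/rt/$TASKFLOW_NAME"
SESSION_ID="<icSessionId-from-login-response>"

# Input parameters for the orchestrated tasks are passed on the request.
# Shown as an echo so the command can be reviewed; drop the echo to run it.
echo curl -s -X POST "$ENDPOINT?run_date=2020-06-01" -H "icSessionId: $SESSION_ID"
```

The session ID comes from the login API response, as described in the migration FAQ later in this post.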

 

API and Application Integration

  • Support for Kafka Messaging – Messaging is at the core of many publish-subscribe (Pub/Sub) based applications as a means of decoupling producers and consumers of data. The addition of Kafka support for application integration significantly expands the message-based Pub/Sub interactions between data and applications that today are fulfilled using JMS, AMQP, Amazon SNS/SQS, and Azure Service Bus based “topics.” The ability to bridge these message-based events with the Cloud Integration Hub Pub/Sub style of data propagation provides additional integration pattern options, making Informatica unique in the flexibility and capabilities it provides for its customers.
  • JSON Web Token (JWT) based authentication – The API and Application Integration services now support JSON Web Token (JWT) based authentication, an open standard (RFC 7519) that defines a compact and self-contained way to securely transmit information between API consumers and REST web services. This provides IICS users of API Manager with an additional, more secure means of API authentication.
  • API grouping – To better manage the use of JWT tokens and associate them with multiple API endpoints, a new API “application” grouping capability is being introduced in API Manager. This capability gives API consumers fewer tokens to deal with, and lets API producers more easily manage or revoke a consumer’s access to multiple APIs.
  • Japanese language support for the API and Application Integration services – In addition to Japanese language support for the Data Integration service, Japanese customers now have access to the API and Application Integration services user interface and documentation in Japanese.
  • REST and SOAP Service API-based “service connectors” – distributed via GitHub.
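For illustration, a JWT-authenticated call to a managed API follows the standard bearer-token pattern. The endpoint URL and token below are placeholders, not documented IICS endpoints; API Manager issues the real values when an API "application" is registered.

```shell
# Hedged sketch: calling an API Manager managed endpoint with a JWT.
# URL and token are hypothetical placeholders.
API_URL="https://api.example.informaticacloud.com/orders/v1"
JWT="<header>.<payload>.<signature>"   # compact JWT form per RFC 7519

AUTH_HEADER="Authorization: Bearer $JWT"

# Shown as an echo so the request can be reviewed; drop the echo to send it.
echo curl -s "$API_URL" -H "$AUTH_HEADER"
```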

Today, 55% of Application Integration customers’ connectivity needs are met using service connectors. A service connector allows customers to define REST (XML/JSON), JSON/RPC, or SOAP service integration using a simple web form, with no coding required. The Application Integration service takes care of the rest. If the service offers a WSDL or Swagger interface document, the service connector can be generated automatically by importing the interface document. By creating service connectors, customers can import and configure pre-built business and data service definitions as reusable assets that they can share and/or move from one environment to another.

This capability, unique to Informatica, provides customers with unparalleled value. Service connectors avoid lock-in and the inability to make updates when you need them, so you can take advantage of new capabilities or the built-in extensibility that an API platform can offer.

To provide this flexibility to customers and encourage community contribution by customers, partners, and other practitioners, Informatica is establishing a GitHub repository where it will publish the service connectors it has created and which it will share with its customers and partners. Customers and partners are free to use these definitions without restriction, including the rights to use, copy, modify, merge, publish, and distribute these under an MIT license. Informatica will also encourage contributions back to the community. Our goal is simple: drive innovation and reduce perceived barriers to adoption.

 

 

Integration Hub

  • Informatica has improved the search experience for Hub events and added support for the CLOB data type on topics.

 

B2B Gateway

  • ICS B2B Gateway customers will be migrated to the IICS platform as part of the release upgrade and will benefit from all IICS platform capabilities.

 

Intelligent Structure Discovery

  • Intelligent Structure Discovery expanded its parsing capabilities to handle ORC format files and Excel multiple sheet files. The user can now design the structure model based on multiple sheet structures and then use the model at run time to parse Excel files in their entirety.
  • With R31, a Structure Parser transformation can be positioned mid-stream to enable more flexible mapping usage and chaining. In addition, the datatypes detected by the Intelligent Structure Model are now propagated to the Structure Parser output ports.
  • The ISD design-time user interface is enhanced with a "find" capability that allows the user to search for a specific string in the discovered tree fields and get a list of results showing the path, visually correlated with the model representation. The user can also perform actions on multiple elements chosen from the result list, such as include, exclude, replace, and even change of element type. The ability to act on multiple elements at once significantly improves usability and productivity.
  • A new vertical folder view mode will be available in R31 for handling complex hierarchy files.

 

IICS Platform

  • Common Explore productivity enhancements – Improved copy functionality with overwrite & rename conflict resolution options to copy assets within and across folders. “Created by” and “last updated by” attributes as columns for all asset types in the common Explore page.
  • Export/import capability for sub-organizations which enables asset migration across environments that use an organization hierarchy. More control and flexibility with enable/disable checksum validation options during export and import.
  • Improved export/import error logging along with the ability to access and download export/import logs through the UI and the API.
  • API to search, list, and filter assets in projects and folders using a variety of conditions such as timestamp, location, “last updated by,” and tags. This API can also be leveraged along with export APIs to export objects.
  • Improvements to the RunAJob utility – Support for projects and folders to invoke tasks by task name.
  • Usability improvements – Ability to copy table cell data in the common Explore page, Monitor service, and Administrator service for use in other user interfaces like search boxes and filter conditions for easier task completion.
  • Search capability for connections and job monitoring to quickly and easily find needed information.
  • Ability to enable and disable a service for agents in a Secure Agent group to effectively deploy workloads and efficiently utilize computing resources.
  • Secure Agent registration using tokens (instead of passwords) for increased security and enabling SAML single sign-on.
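The asset search API mentioned above can be sketched as follows. The /objects path and query syntax are assumptions patterned on the V3 REST API endpoints referenced later in this post; check the REST API Reference for the exact parameters.

```shell
# Hedged sketch of the V3 asset query API; path and query syntax are assumptions.
BASE_URL="https://dm-us.informaticacloud.com/saas/public/core/v3"
SESSION_ID="<INFA-SESSION-ID-from-v3-login>"
QUERY="location=='Default/Sales' and tag=='SalesObjects'"   # illustrative filter

# Shown as an echo so the request can be reviewed; drop the echo to execute it.
echo curl -s -G "$BASE_URL/objects" --data-urlencode "q=$QUERY" -H "INFA-SESSION-ID: $SESSION_ID"
```

The returned asset list can then be fed to the export API to migrate matching objects.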

 

Operational Insights 

  • Operational Insights extends support to the on-premises Data Quality product, in addition to BDM and PowerCenter, with capabilities such as domain health, job run analytics, resource utilization, and alerts.
  • Click-through analytics journey from cross-domain to the individual job level and enhancements to job run analytics for on-premises products (PowerCenter, BDM). Plus, enhancements to job run analytics to report on data loaded and data processed.
  • PowerCenter & BDM infrastructure email alert enhancements such as Secure Agent unavailability and Operational Insights data collection failures.

 

Connectivity Enhancements

New AWS Quickstart for Cloud Analytics Modernization, an end-to-end solution for self-service cloud analytics with Informatica (IICS, Enterprise Data Catalog), Tableau Server, and AWS Services.

Several new connectors and enhancements to existing connectors across ecosystems as listed below. New connectors introduced are highlighted in bold:

  • Azure: ADLS Gen 2 Preview, Azure DW V3, Azure Data Lake Store V3, Azure Blob V3 
  • Google: Google Cloud Storage V2, Google Analytics, Google Big Query V2, Google Big Query 
  • Amazon: Amazon S3 V2, Amazon Aurora, Amazon Redshift V2 
  • Salesforce: Salesforce Marketing Cloud (SFMC), SFDC (Sales and Service) 
  • SAP: SAP Connector, SAP HANA Cloud Platform (DB) 
  • Adobe: Adobe Cloud Platform 
  • Analytics: CDM Folders connector preview, Tableau V3, Tableau V2 
  • Databases: MySQL Relational, Hive, Greenplum, DashDB, Snowflake 
  • Tech: REST V2, WSconsumer, Complex File Processor 
  • Microsoft Apps: Microsoft SharePoint Online 
  • Oracle: Oracle Netsuite V1, Oracle Relational 

 

A summary of some of the connectivity enhancements follows:

AWS

  • Supporting file names longer than 250 characters with S3
  • Support for custom JDBC URL for Redshift
  • Support for ORC files with S3

Snowflake

  • Custom Query metadata fetching without having to run the query

Google

  • Custom Query Support for Google Big Query V2 connector
  • Pushdown support for Google Big Query through ODBC
  • Google Analytics - Enhancement to fetch fields based on multiple View IDs from GA
  • Google Big Query Mass Ingestion - Direct load Cloud Storage->Big Query

Azure

  • Preview of ADLS Gen2 connector: support create target, configurable escape character and text qualifier in R/W scenarios, create and rename directory, rename file, header-less files, support RBAC for all types of AAD Authentication, append data to an existing file, support parameterization
  • Azure DW V3: support TARGET NAME OVERRIDE and TARGET SCHEMA NAME OVERRIDE in the writer, support SOURCE NAME OVERRIDE and SOURCE SCHEMA OVERRIDE with the reader, support custom query and multiple objects in CMD and MCT

 

Microsoft Apps

  • Microsoft SharePoint Online: support for agents running on Linux

 

Analytics

  • Preview of CDM Folders connector: new connector, with the ability to write to ADLS Gen 2 in CDM format, and then access the data from Power BI as a dataflow
  • Tableau V2: upgrade connector to the latest Tableau SDK

Databases & DW

  • MySQL Relational: array insert support
  • Greenplum: native reader and writer

 

NetSuite V1: address 2-factor authentication

 

Salesforce

  • SFDC Sales, Service connector: support latest SFDC API
  • Salesforce Marketing Cloud SFMC: insert/update operation for “Non Contact Linked Data Extensions”

Here is an orientation video for the upcoming Informatica Cloud to Informatica Intelligent Cloud Services (IICS) migration process. The video provides an introduction to IICS for customers that are migrating from Informatica Cloud.

 

We will be migrating organizations between April and August. You will receive notification when YOUR organization(s) is going to be migrated and when your sandbox environment (pre-release) is available.

 

Video link (moved to YouTube so that it doesn't have to be downloaded):

 

Introducing Informatica Intelligent Cloud Services - YouTube

 

Full FAQ:

 

Informatica Intelligent Cloud Services Migration FAQ

Informatica Intelligent Cloud Services Migration FAQ
(Updated: June 27, 2019)

 

The following questions and answers will help you better understand the Informatica Intelligent Cloud Services migration. This is a live document, and we will be adding questions and answers as they arise.

 

This document is organized in sections by Service: common questions, Data Integration questions, API and Application Integration questions, and Data Integration Hub questions.

 

 

Common Questions

 

1. When will the Informatica Cloud to Informatica Intelligent Cloud Services migration begin?

Migration to the new Informatica Intelligent Cloud Services platform will begin in early Q2 2018.

 

2. What order of services will the migration follow?

As a first step, Informatica Cloud Services (ICS) customers that do not use the Informatica Cloud Real Time (ICRT) services will be migrated. ICRT customer migration, including customers that have licensed Cloud Premium services, is commencing in July 2018. ICRT customers are grouped by functional usage. Migration of customers is planned to be completed before the end of 2018.

 

3. Where can I find more information about the new features and behavior changes in Informatica Intelligent Cloud Services?

See the IICS Navigation Overview for ICS and ICRT Users video for a quick tour of IICS. This video provides an overview of Informatica Intelligent Cloud Services for users that are already familiar with Informatica Cloud.

 

The Migration folder of the Cloud Application Integration Community Site provides links to a number of resources that can help you with this. These include:

 

For ICS Users

 

For ICRT Users

 

4. Will all customers be migrated at the same time?

Migration will be performed in multiple batches to ensure maximum flexibility and a minimal amount of disruption to customers. These batches may include ICS and ICRT customers.

 

5. When will customers be notified about the migration?

Customers will be notified 6 weeks prior to the migration. The communication will go out to all users of the Informatica Cloud organization being migrated.

 

6. What is the migration process?

All customers will be migrated to a Sandbox environment before they are migrated to the production environment. The Sandbox environment will be available to you for at least three weeks, to allow you to perform testing. Your Informatica Cloud assets will be migrated to the Sandbox environment, but an asset that you create in the Sandbox environment will not be migrated to the production environment. If you create assets in Informatica Cloud during the three weeks of preview testing, these assets will be migrated to the production environment. After the three weeks of preview testing have elapsed, your organization will be migrated to the production environment. If you have concerns or need more time, please contact Informatica Global Customer Support.

 

7. Should all customers test the migration?

Yes, all customers are expected to test the migration in the Sandbox environment. It is critical for all customers to participate in the testing to ensure a smooth transition from Informatica Cloud to Informatica Intelligent Cloud Services. At a minimum, Informatica recommends that you test at least one organization with a good mix of use cases.

 

8. Are there any special instructions for using the Sandbox environment?

Yes. When you are notified of the upcoming migration you will be able to access your Sandbox environment. The following instructions apply to the Sandbox environment:

  • You can access the Sandbox environment after Informatica notifies you that your organization has been migrated to the Sandbox environment. To access the Sandbox environment, open the Sandbox environment URL (https://dm-sandbox.informaticacloud.com/identity-service/home) using your Informatica Cloud credentials.
  • Customers that have licensed Data Integration (previously ICS) should verify that they can access the Data Integration Service
  • Customers that have licensed Application Integration (previously ICRT or Cloud Premium customers) should verify that they can access the Application Integration Service
  • If you are unable to access either service, contact Informatica Global Customer Support.
  • If your company uses a firewall, you may need to update your whitelist of IP addresses. The Secure Agent IP address ranges differ among Informatica Cloud, the IICS Sandbox environment, and the IICS production environment. The approved Secure Agent IP address ranges for the production and Sandbox environments are listed in KB article 524982.
  • Download a Secure Agent for the Sandbox environment.
    Existing preview agents will not be upgraded. Uninstall any previous preview agent, and then download and install a new preview agent from the Sandbox environment. The new Secure Agent will point to the Sandbox environment and allow you to run jobs.
    Note: On Windows, you must install the Secure Agent on its own machine. You cannot run multiple agents on the same Windows machine. On Linux, you can install multiple agents on the same machine if you install them under a different ID and folder.
  • Update your tasks and connections to use the new Secure Agent that you downloaded from Informatica Intelligent Cloud Services.
    Tip: If you have the Secure Agent Cluster license, you can add the preview agent to an existing Secure Agent group so that you won't have to update tasks and connections individually.
  • If you want tasks in the Sandbox environment to run on the schedules that you defined in Informatica Cloud, edit the tasks and save them. Schedules are migrated to the Sandbox environment, but they are not activated. When you save a task, the schedule will be re-activated.
  • Clear your browser cache if you see a blank screen or the message, “Internal Server Error. Please contact Support.” This issue is caused by a browser caching problem. Clear your browsing history, including all cookies and all cached images and files, and then close the browser. Re-open the browser and log in again.

 

ICRT service users should also review the ICRT Service Migration to the Cloud API and Application Integration Service guide.

 

9. Which web browsers can customers use with IICS?

IICS supports the following browsers: Google Chrome, Microsoft Internet Explorer 11, and Mozilla Firefox. For more information, see the PAM for Informatica Intelligent Cloud Services (IICS) on Informatica Network.

If you use IE11, note the following:

  • You must enable cross-origin support (CORS) in the browser. For information about enabling CORS in IE11, see the "Enabling CORS in Internet Explorer 11" topic in the Data Integration online help.
  • The time stamps displayed in the Monitor service and on the My Jobs page in Data Integration appear in Coordinated Universal Time (UTC).

 

10. What will happen to Secure Agents during the production migration?

During the production migration, all of your Informatica Cloud Secure Agents will be upgraded to the latest version. Secure Agents that you downloaded from the IICS Sandbox environment will not be upgraded.

The migration process retains the following items:

  • Connection properties that you stored with a local Secure Agent.
  • Secure Agent configuration property changes.
  • All files that you copy to <Secure Agent installation directory>/apps/Data_Integration_Server/ext/

The migration process does not retain manual changes that you made to configuration files in the Secure Agent installation directory or its subdirectories.

Note: As a best practice, Informatica recommends that you back up your Informatica Cloud Secure Agent directories before the migration so that you can restore them easily in the unlikely event of a rollback.

 

11. How much disk space is required to upgrade the Secure Agent?

To calculate the free space required for upgrade, use the following formula:

Minimum required free space = 3 * (size of current Secure Agent installation directory - space used for logs directory) + 1 GB
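As a worked example of the formula, assume a 2.5 GB agent installation directory of which 0.5 GB is the logs directory (both sizes are assumed values for illustration):

```shell
# Worked example of the free-space formula, using assumed sizes:
# a 2560 MB install directory containing 512 MB of logs.
install_mb=2560
logs_mb=512
required_mb=$(( 3 * (install_mb - logs_mb) + 1024 ))   # 1 GB = 1024 MB
echo "Minimum required free space: ${required_mb} MB"  # 3*2048 + 1024 = 7168 MB
```

On a real machine, `du -s` on the agent installation directory and its logs subdirectory supplies the two inputs.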

 

12. For customers that use a firewall, what are the Informatica Intelligent Cloud Services URLs that need to be included in the whitelist of approved IP addresses?

The approved Secure Agent IP address ranges for production and Sandbox environments are listed in KB article 524982.

 

13. Will there be any downtime for migration, and if yes, what is the expected downtime?

The migration will affect your service’s availability. The exact duration of the downtime will be communicated to each customer as part of the migration notification. The exact downtime depends upon the number of Informatica Cloud assets and organizations that a customer has. Informatica estimates the downtime to be in the range of 1-4 hours.

 

14. What will happen to the old Informatica Cloud organization after the migration is completed?

The Informatica Cloud organization will be deactivated, but its metadata will be retained for 30 days post-migration to ensure that Informatica has a copy for comparison and for roll-back in case of unforeseen issues.

 

15. Will my organization ID change after migration?

Yes. You will get a temporary organization ID in the Sandbox environment. During production migration, your organization will get a new, permanent organization ID.

 

16. Can customers choose the migration schedule?

Informatica will build the migration batches and communicate the migration schedule to each customer. If the published schedule does not meet your needs, contact Informatica Global Customer Support to reschedule to a different migration batch.

 

17. If a customer has more than two organizations, can they be migrated in separate batches?

While this is possible, Informatica doesn’t recommend it. Customers should consider the impact of having the organizations on different platforms for even a short duration. Customers should work with their customer success manager and Informatica Global Customer Support to ensure that organizations are scheduled in the appropriate batches.

 

18. Are there any security changes in Informatica Intelligent Cloud Services?

We have introduced user-defined roles in Informatica Intelligent Cloud Services. User roles are automatically created for the corresponding user groups in Informatica Cloud. If asset-level custom permissions in Informatica Cloud granted a user higher permissions than those granted via the user’s group, those asset permissions are not honored for the user. Customers need to pay attention to this and manually adjust asset-level permissions as needed.

 

19. When should we do a rollback?

Post-migration, if the customer raises a P1 ticket that can’t be resolved within 24 hours, Informatica will consider the rollback option. Rollback should be done only after all other avenues to resolve the issue have been exhausted. Rollback requires an approval from the project management team.

 

20. Is the rollback automated?

Informatica has a rollback script that deactivates the Informatica Intelligent Cloud Services organization, reactivates the Informatica Cloud organization, and downgrades the Secure Agent back to the Informatica Cloud version. If any jobs have been run in Informatica Intelligent Cloud Services either partially or successfully prior to the rollback, the state of those jobs and their job logs will not be rolled back, nor will they be ported back to Informatica Cloud.

 

21. I created a new organization in Informatica Intelligent Cloud Services using my Informatica Cloud username. Can my Informatica Cloud user account be migrated if there is already an IICS user account with the same name?

Usernames in Informatica Intelligent Cloud Services must be unique. If there is already an IICS user account that has your Informatica Cloud username, then your IICS username will be appended with some extra characters to form a unique name. For example, if your Informatica Cloud username is infaclouduser, your IICS username might be changed to infaclouduser.IICS. (Your Informatica Cloud username will not change.) Informatica will send you an email with the new IICS username, and you will be able to log in to IICS using the new name.

If you use SAML or Salesforce single sign-on and there is already an IICS account with your username, the IICS username that appears in your user profile will be appended with a string such as “.SAML” or “.Salesforce” to ensure that the username is unique. You will be able to log in to IICS using single sign-on as you did with Informatica Cloud.

 

22. Do I need to change the API endpoints that I am using on Informatica Cloud?

After migration, login API requests will be automatically redirected to Informatica Intelligent Cloud Services. This redirection service will be available through February 28, 2019. As before, you must construct subsequent API requests based on the <serverUrl> and <icSessionId> that was received in the login response. Ensure that you have not hard-coded the base URL for any other API endpoints other than the login API endpoint.

After February 28, you must replace your current Informatica Cloud domain URLs with the Informatica Intelligent Cloud Services (IICS) URLs mentioned in KB article 524982 to use the APIs in IICS. (For example, if your POD is located in North America, the new IICS domain URL is https://dm-us.informaticacloud.com, and the V2 login IICS API endpoint to use is https://dm-us.informaticacloud.com/ma/api/v2/user/login.)
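The V2 login call above can be sketched with curl. The credentials are placeholders, and the request body shape follows the V2 REST API convention; confirm the exact fields against the REST API Reference. The dm-us domain is the North America example given in the answer.

```shell
# Hedged sketch of the V2 login request against the new IICS domain.
# Username and password are placeholders.
LOGIN_ENDPOINT="https://dm-us.informaticacloud.com/ma/api/v2/user/login"
PAYLOAD='{"@type": "login", "username": "user@example.com", "password": "********"}'

# Shown as an echo so the request can be reviewed; drop the echo to send it.
# The response carries the <serverUrl> and <icSessionId> used to construct
# all subsequent V2 API requests, as noted above.
echo curl -s -X POST "$LOGIN_ENDPOINT" -H "Content-Type: application/json" -d "$PAYLOAD"
```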

 

23. How do I leverage new features such as export/import through APIs in Informatica Intelligent Cloud Services?

New features such as export/import are currently only available through the V3 APIs. To leverage these APIs, use the V3 endpoints described in the REST API Reference. (For example, use the V3 login API with the following endpoint: https://dm-us.informaticacloud.com/saas/public/core/v3/login.)

 

24. What are the Cloud Application Integration IP Address ranges that you need to add to your list of approved IP addresses?

Please review KB article 535281 (https://kb.informatica.com/faq/7/Pages/21/535281.aspx) for the whitelist of domain names and IP address ranges.

 

 

Data Integration Questions

 

1. Does the customer have to change mappings, tasks, connections, processes, or other assets after migration to make them work?

The production migration process is seamless, and existing assets will continue to work after migration without manual changes. (Please see question 2 for potential post-migration tasks.)

 

2. What are customers expected to do before, during, and after migration?

Before:

  1. Ensure that no metadata is modified during the migration window.
  2. As a best practice, back up your Secure Agent directories so that you can restore them easily in the unlikely event of a rollback.
  3. Set up an appropriate blackout window.
  4. Ensure that your Informatica Cloud Secure Agents are up and running.
  5. Make sure that you have completed any tasks provided by Informatica Global Customer Support for migration.
  6. Prepare a checklist to validate post-migration.

During:

Monitor your email for any communication from Informatica Global Customer Support.

After:

  1. Log in to the new Informatica Intelligent Cloud Services organization and do a quick validation to ensure that all metadata counts are in line.
  2. Verify that jobs are running as expected.
  3. Reset the blackout window.
  4. Review the activity logs and audit logs.
  5. If you see any discrepancies, log a support ticket as soon as possible.

 

3. Are there any manual steps involved in the migration?

Informatica is making every effort to automate the migration from end-to-end. However, there are certain areas that need attention:

  • If you use the REST API and have enabled two-factor authentication for your organization, add the following IP addresses to the list of “Trusted IP Ranges” in Informatica Cloud before the migration:
    APP and APP3 PODs: 206.80.52.0/24, 206.80.61.0/24, 209.34.91.0/24, 209.34.80.0/24
    APP2 POD: 206.80.52.0/24, 168.128.27.61/32, 168.128.27.47/32, 168.128.29.12/32, 168.128.29.92/32
  • Outbound message links will change and must be updated post migration in Salesforce. Informatica will redirect for 4 weeks after the migration, but the new links need to be updated in Salesforce.
  • If you use SAML single sign-on, you must download the IICS service provider metadata after the migration and deliver the metadata and the IICS single sign-on URL for your organization to your SAML identity provider administrator. Additionally, ensure that you update the IICS single sign-on URL and app in your identity provider application.
  • Contact Validation tasks will not be migrated. You need to convert to an Address Doctor-based web service to cleanse addresses.
  • Data Assessment tasks will not be migrated. You need to convert to DQ Radar before the migration.
  • Any task flow that references a Contact Validation or Data Assessment task will not be migrated. If you want the task flow to be migrated, adjust the task flow logic and remove the Contact Validation and Data Assessment tasks before migration.
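If you script against the REST API, you can sanity-check that a caller's address falls inside the trusted ranges listed above with Python's standard ipaddress module. This is a quick verification helper only; the CIDR ranges themselves come from the list above:

```python
import ipaddress

# Trusted ranges for the APP and APP3 PODs, as listed above.
APP_RANGES = ["206.80.52.0/24", "206.80.61.0/24", "209.34.91.0/24", "209.34.80.0/24"]

def is_trusted(ip: str, ranges: list[str]) -> bool:
    """Return True if the IP address falls inside any of the given CIDR ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in ranges)
```

For the APP2 POD, substitute the APP2 ranges listed above.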

 

4. What will happen to views after the migration?

Public views in Informatica Cloud are replaced with tags in Informatica Intelligent Cloud Services. All assets in a view will be labeled with a tag in IICS Data Integration that has the same name as the view. For example, if you created a custom view called SalesObjects that contained 30 mappings in Informatica Cloud, all 30 mappings will be labeled with the tag SalesObjects in IICS Data Integration.

If the same view name was used for different asset types, the tag names will have different suffixes in IICS Data Integration. For example, if you created the SalesObjects view for mappings and also for mapping tasks, mappings might be labeled with the tag SalesObjects and mapping tasks with the tag SalesObjects_1.

You will be able to browse tagged assets and view all assets with a specific tag. Private views, views that are associated with connectors, and activity log views are not migrated.

 

5. Which versions of the REST API can I use with Informatica Intelligent Cloud Services?

REST API version 1 is no longer supported. For IICS, use REST API versions 2 and 3. The Informatica Cloud Data Integration REST API Reference explains both versions in detail and describes how to use each of them to interact with Data Integration using REST API calls.
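As a minimal illustration, the v2 and v3 login calls differ in both path and session header. The endpoint paths, header names, and base URL below are assumptions drawn from the public REST API reference, not values from this article; check the reference for the correct login URL for your POD:

```python
# Sketch of building v2 and v3 login requests (no network call is made here).
# BASE is a placeholder; substitute the login URL documented for your POD.
BASE = "https://dm-us.informaticacloud.com"

def v2_login_request(username: str, password: str) -> dict:
    # v2 login returns an icSessionId that later calls pass in the icSessionId header.
    return {
        "url": f"{BASE}/ma/api/v2/user/login",
        "json": {"@type": "login", "username": username, "password": password},
    }

def v3_login_request(username: str, password: str) -> dict:
    # v3 login returns a session ID that later calls pass in the INFA-SESSION-ID header.
    return {
        "url": f"{BASE}/saas/public/core/v3/login",
        "json": {"username": username, "password": password},
    }
```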

 

6. Can I use the runAJobCli utility to run tasks in Informatica Intelligent Cloud Services?

Yes. To use the utility in Informatica Intelligent Cloud Services, update the restenv.properties file to use the new Informatica Intelligent Cloud Services URL.

Note that if you run the utility with the task name (-n) option, and you have multiple tasks with the same name in different folders, the utility runs the task in the Default folder. To run a task in a different folder, use the task ID (-i) option instead of the task name option.
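For example, the updated restenv.properties might look like the sketch below. The property names and the URL shown are illustrative assumptions; use the values documented for your runAJobCli version and POD:

```
# restenv.properties (illustrative values only)
baseUrl=https://dm-us.informaticacloud.com/ma
username=your_iics_username
password=your_iics_password
```

With the file updated, a task in a non-Default folder can then be started with the task ID (-i) option instead of the task name (-n) option, as described above.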

 

7. The ODBC and JDBC drivers are missing from my MySQL connector. How do I fix this?

In Informatica Intelligent Cloud Services, Informatica no longer includes the MySQL ODBC and JDBC drivers with the MySQL connector. Before you use the MySQL connector, download and install the drivers. For information about installing and configuring the drivers, see the related Knowledge Base article and videos.

 

API & Application Integration Questions

 

1. ICRT Service (including Cloud Premium) customers should be aware of the following:

 

Preview/Sandbox Migration:

Service URLs

Create and invoke processes in the sandbox account. However, do not use sandbox service URLs in any production activity. Sandbox service URLs are not permanent and are only for testing.

Be careful when you invoke processes in the sandbox environment. Verify that the execution of a process does not affect production. For example, if you execute a "Create Order" process in the sandbox, an order will be created.

Scheduled Processes

Your schedules are migrated to the sandbox in the 'Not Enabled' state. This ensures that there are no duplicate process invocations, because the legacy Informatica Cloud Real Time service continues to function during the sandbox testing period. To test schedules, create new processes in the sandbox and assign schedules to them.

Invoked Processes

Processes invoked before migration do not appear in the sandbox Application Integration Console service. Use the legacy Process Console to view these processes.

Processes that you invoke using sandbox service URLs will appear in the sandbox Application Integration Console service.

 

Production Migration:

 

 

Service URLs

 

After the migration to IICS/CAI Production, clients can still send requests to the old ICRT service URL, which will be automatically redirected to the equivalent CAI service URL. However, this redirection will be available only for a short period, until the end of September 2019. Plan to update your clients to send requests to the new CAI service URL as soon as possible to reduce the number of network hops and thereby improve performance.

 

Invoked Processes

 

Informatica suggests that you turn off client requests during the migration window (which has been communicated over email), although this is not mandatory. Please plan ahead. Be aware that any requests instantiated during the production migration will likely not complete successfully. You might receive an HTTP 500 or HTTP 503 response, and the runtime state of any instance that was 'attempted' in ICS/RT will not be migrated to the IICS/CAI Production server.

 

 

2. How do I learn about migration?

The Migration folder of the Cloud Application Integration Community Site provides links to a number of resources that can help you with migration.

 

 

Most ICRT service customers use ICS data integration services. To see what's available to you as you migrate to IICS, see the Essential Cloud Data Integration Demos set of videos.

 

3. How do I prepare for migration?

To prepare for sandbox migration, or before and after migration to your production environment, review the ICRT Service Migration to the Cloud API and Application Integration Service document.

 

4. Where are all my processes and other assets?

All your assets have been migrated to the Default folder on the Explore page.

 

5. I moved a Mapping Task out of the Default folder. Now, a process that calls the Mapping Task throws an error. What do I do?

If a process uses the Run Cloud Task service, and you have moved the cloud asset from the Default project to another project or folder, you must go to the process and reselect the cloud task. Save and publish the process.

 

6. Will the Schedules that I created in Informatica Cloud Real Time be migrated?

Yes, all schedules will be migrated to your sandbox account. However, to avoid multiple invocations of the same process, they will be migrated in the 'Not Enabled' state. Informatica suggests that you verify that no processes are scheduled during the migration window.

 

7. Will the Informatica Cloud Real-Time Service URLs still be valid?

Yes, old service URLs will still be valid post sandbox migration. Continue to use these URLs until Informatica migrates your organization to a production account. Do not embed sandbox service URLs anywhere.

 

8. Do I need to republish assets after migration?

No, you do not need to republish assets after migration. All published assets will be migrated in the published state.

 

9. Will my Secure Agent be migrated? Do I need to download an agent again?

Your Secure Agent will be migrated, as will all assets that are published to the agent. You do not need to download an agent again.

 

10. What are the Cloud Application Integration IP Address ranges that you need to add to your list of approved IP addresses?

Please review the Knowledge Base article https://kb.informatica.com/faq/7/Pages/21/535281.aspx for the list of domain names and IP address ranges to add to your allowlist.

 

11. What will happen to the existing managed APIs after the migration?

Managed APIs will not be migrated. If an API Manager customer needs to migrate the existing managed APIs, then the customer should contact the customer success manager or support manager.

 

12. Will the managed API URLs still be valid?

No, old managed API URLs will not be valid post-production migration if you do not arrange with your customer success manager to migrate the existing managed APIs. Instead, create a new managed API for each service that you want to manage, and use the new URL.

If you do arrange to migrate existing managed APIs, the old URLs will continue to resolve after production migration and DNS propagation. However, Informatica recommends that you use the new URLs.

 

13. Can I still use the old API Manager?

After production migration, do not use the old API Manager. Instead, use Informatica Intelligent Cloud Services API Manager to create and perform all operations with managed APIs.

 

14. Can I use sandbox API Manager URLs after production migration?

No, do not embed sandbox URLs anywhere.

 

15. Do I need to manually update the Salesforce guide setup URLs after migration?

Yes, you must manually update the Salesforce guide setup URLs after migration. The guide setup URLs will not be automatically redirected after migration.

 

You must log in to Salesforce and manually update the Informatica Cloud Real Time Host URL specified under advanced settings in the Guide Setup tab. If you had embedded guide URLs, you must manually update them with the new URLs after migration. See the next question for more information.

 

16. What are the post-migration tasks that I must perform for the Salesforce managed package?

If you use the Salesforce managed package, you must perform the following tasks after migration:

 

Log in to Salesforce and verify your guides.

Log in to Salesforce and verify that your guides are visible on the relevant Salesforce object pages. If you do not see your guides, log out of Salesforce, clear the browser cache, and then log in to Salesforce.

 

Log in to Salesforce and verify the Informatica Cloud Real Time Host URL.

Log in to Salesforce and verify that the Informatica Cloud Real Time Host URL specified under advanced settings in the Guide Setup tab is correct.

The format of the Host URL must be as follows: <Cloud Application Integration URL>,<Informatica Organization ID>

 

For example: https://na1.ai.dm-us.informaticacloud.com,afdq9RWEA4vjIQWQcE88XB

To view your Cloud Application Integration URL, log in to Informatica Intelligent Cloud Services and select the Application Integration service. From the browser address bar, copy the URL from https through .com.

To view your Informatica Organization ID, log in to Informatica Intelligent Cloud Services, select the Administrator service, and then click Organization. Copy the Informatica Organization ID displayed in the ID field.
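The two values are simply joined with a comma, as the format above shows. A trivial helper makes the expected string explicit; the URL and organization ID in the test repeat the example from this answer:

```python
def guide_host_url(cai_url: str, org_id: str) -> str:
    """Build the Host URL for the Salesforce Guide Setup tab:
    <Cloud Application Integration URL>,<Informatica Organization ID>."""
    return f"{cai_url},{org_id}"
```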

17. Can I use Guide Designer in Salesforce to create a new guide?

No. Informatica does not support Guide Designer in Salesforce. To create a guide, log in to Informatica Intelligent Cloud Services and use the Application Integration Guide Designer.

 

18. What are the post-migration tasks that I must perform for Data Integration tasks that use custom permissions?

If you had assigned custom permissions to a Data Integration task and are invoking the Data Integration task through an Application Integration process or a guide, after migration, you must complete either of the following tasks:

  • Give the Application Integration anonymous user permission to run the associated Data Integration asset.
  • Add the Application Integration anonymous user to a user group that has permission to run the associated Data Integration asset.


 

More Information:

If you have licensed Application Integration, Informatica Intelligent Cloud Services creates a system user called CAI_Anonymous_<Organization_ID>. Application Integration needs this user when you invoke an anonymous process that calls a Data Integration task.
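The naming convention described above can be expressed directly; the organization ID in the test repeats the example used later in this answer:

```python
def cai_anonymous_user(org_id: str) -> str:
    """Return the name of the system user that IICS creates for
    Application Integration: CAI_Anonymous_<Organization_ID>."""
    return f"CAI_Anonymous_{org_id}"
```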

 

Important: Do not edit or delete the Application Integration anonymous user if you need to invoke an anonymous process that calls a Data Integration task.

 

For example, the Application Integration anonymous user account for an organization might be named CAI_Anonymous_6gPInky12gwbSxPUcH8v0H.

 

19. After migration, why does a process fail if it connects to a service that uses TLS version 1.1 or earlier?

By default, Application Integration uses TLS version 1.2 to connect to third-party services. TLS version 1.1 has been deprecated.

If a process connects to a service that uses TLS version 1.1 or earlier, you must manually edit the server ssl-enabled-protocols property to point to TLS version 1.1.

 

Perform the following steps after migration:

  1. In the Data Integration home page, click Administrator.
  2. Click Runtime Environments.
  3. Click the Secure Agent for which you want to configure TLS.
  4. Click Edit.
  5. Under the System Configuration Details section, select the service as Process Server, and select the type as server.
  6. Click the Edit pencil icon against the server ssl-enabled-protocols property and set the value to 'TLSv1.1'.
  7. Restart the Secure Agent for the changes to take effect.
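After step 6, the custom property on the Process Server would look roughly like the sketch below. The field labels are illustrative; the property name comes from the steps above:

```
Service: Process Server
Type:    server
Name:    ssl-enabled-protocols
Value:   TLSv1.1
```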

 

20. When I start the Process Server on a UNIX operating system, why do I see the following errors:
Cannot write to temp location [/tmp]
"java.io.FileNotFoundException: ... (Too many open files)".

These errors occur because UNIX limits the number of files that a single process can open. The NOFILE parameter defines the maximum number of open file descriptors for a single process; the default value is 1024.

 

Edit the UNIX security configuration file to allow a larger number of open files. Raise the NOFILE limit from the default value of 1024; a value of 10240 should suffice.

 

Open the file /etc/security/limits.conf and add the following line, where <user> is the account that runs the Process Server:

<user> - nofile 10240

 

If you are unsure of the value, you can set the value to unlimited.
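You can verify the effective limit from a new login shell for the account that runs the Process Server. This is a quick check assuming a Linux-style ulimit; the 10240 value mirrors the recommendation above:

```shell
# Show the current per-process open-file soft limit
ulimit -n

# After editing /etc/security/limits.conf and logging in again,
# this should report the raised value, e.g. 10240
ulimit -Hn   # hard limit
```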

 

21. After migration, why does a process still show the status as running even though it successfully completed earlier in Informatica Cloud Real Time?

If you had published a process on the agent and the status still shows as running even though the process successfully completed in Informatica Cloud Real Time, you must manually apply a patch to fix the issue. For more information, see the following Informatica Knowledge Base article: 566279.

 

Integration Hub Questions

 

1. What will happen to the existing Integration Hub artifacts after the migration?

Integration Hub artifacts will not be migrated. Informatica Cloud publication and subscription mappings and tasks will be migrated and can be used to create Integration Hub artifacts in Informatica Intelligent Cloud Services. If an Integration Hub customer needs to use the existing Integration Hub artifacts in Informatica Intelligent Cloud Services, then the customer should contact the customer success manager or support manager prior to the migration.

 

B2B Gateway Questions

 

1. What will happen to organizations that are defined in B2B Gateway with invalid Informatica Cloud user credentials?

Organizations with invalid Informatica Cloud user credentials will not be migrated. Before you start the migration, verify that the Informatica Cloud user credentials in the B2B Gateway Settings page are valid.

 

2. What will happen to existing B2B Gateway artifacts after the migration?

All of your B2B Gateway artifacts, including customers, suppliers, and monitoring rules, will be migrated. Customer and supplier schedules will be created in Administrator.

 

3. What will happen to intelligent structure models that are used in B2B Gateway partner definition after the migration?

Intelligent structure models that are used in B2B Gateway partner definitions will be created in Data Integration.

 

4. What will happen to B2B Gateway events after the migration?

B2B Gateway events will not be migrated.

 

5. Will the URLs of the Run Partner API and Event Status API still be valid?

No, Run Partner API and Event Status API URLs will not be valid after the migration. You must update the API requests with the new URLs, as follows:

  • Run Partner API request: https://<pod>-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v1/partner/run
  • Event Status request: https://<pod>-b2bgw.dm-us.informaticacloud.com/b2b-gw/api/v2/event/status/<eventId>

...where <pod> is the name of the Informatica Intelligent Cloud Services point of delivery (PoD) where you access Cloud B2B Gateway. For example: usw1.
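The new URLs vary only by PoD name and, for the event status request, the event ID. A small helper mirroring the patterns above can keep API clients consistent:

```python
BASE_FMT = "https://{pod}-b2bgw.dm-us.informaticacloud.com/b2b-gw/api"

def run_partner_url(pod: str) -> str:
    """URL for the Run Partner API on the given PoD (for example, usw1)."""
    return f"{BASE_FMT.format(pod=pod)}/v1/partner/run"

def event_status_url(pod: str, event_id: str) -> str:
    """URL for the Event Status API for a specific event on the given PoD."""
    return f"{BASE_FMT.format(pod=pod)}/v2/event/status/{event_id}"
```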

Cloud Integration Hub Technical Deep Dive Webinar and Demo

Scott Hedrick, Director Product Marketing, Etty Afriat, Director Product Management, Amit Vaswani, Product Specialist Leader
Sep 6, 2017 8:00 AM PT

 

Are you starting to build up difficult-to-manage data integration spaghetti as your cloud data integrations proliferate? Organize and simplify multi-point cloud data integrations with Cloud Integration Hub, the modern publish/subscribe solution for efficiency and productivity. Reduce data transfer and API charges by removing redundant cloud synchronizations with Informatica's hub for data. Jumpstart your use of Cloud Integration Hub with the new Salesforce Accelerator. Existing Informatica Intelligent Cloud Services Premium and Advanced customers can get started right away with the one year of basic Cloud Integration Hub included with their subscriptions.

In this webinar, we will:

 

  • Provide a technical overview of the key capabilities and value of Cloud Integration Hub by Informatica experts
  • Demo the latest version of Cloud Integration Hub including the new Salesforce Accelerator
  • Show how existing Informatica Intelligent Cloud Services customers can get started with the one year of basic Cloud Integration Hub included in their subscriptions.
