
ALERT: For customers that have deployed the IICS Secure Agent on SUSE 11.


Informatica recently added SUSE 12 to the set of certified operating systems in the PAM for the IICS Secure Agent.


Informatica recommends that customers with Secure Agent instances deployed on SUSE 11 upgrade those instances to SUSE 12 SP3 (or a higher service pack) immediately. SUSE 11 will no longer be a supported agent OS version.


Please refer to the PAM for Informatica Intelligent Cloud Services (IICS) for the supported agent operating systems.

The Success Portal - Tech Tuesdays technical sessions are designed to encourage interaction and knowledge sharing around some of our latest innovations and capabilities across Informatica products. In these sessions, we will focus on new features, the latest releases, performance tuning, best practices, and more, and, as relevant, show you a demo or product walk-through as well. For June, all webinars have been planned especially for Informatica Cloud customers.

Upcoming Webinars


What’s New in Cloud Application Integration 2020 releases?

Date and Time: June 2, 2020, 8:00 AM Pacific Time
This presentation covers the features and functionalities that have been or will be released as part of the 2020 Cloud Application Integration releases. It also showcases some of these features through a demo that simulates a real-life use case.

What’s New in Cloud Data Integration?

Date and Time: June 9, 2020, 8:00 AM Pacific Time
Watch this webinar to learn about the new capabilities of Cloud Data Integration service as part of Spring 2020 launch and understand how to leverage them in your integration flows.

Secure Agent in IICS

Date and Time: June 16, 2020, 8:00 AM Pacific Time
This webinar is intended to help users understand best practices for the Secure Agent in IICS. After the webinar, you will be able to manage memory requirements on the agent machine, understand agent-level logging, and manage disk space on the agent machine.

IICS Secure Agent 101

Date and Time: June 23, 2020, 8:00 AM Pacific Time
This webinar is intended for anyone who works with IICS. It will most benefit IICS admins and ETL architects who procure, maintain, and manage IICS agent infrastructure. The webinar covers the IICS Secure Agent architecture and the operations the agent performs. At the end of this webinar, you will have an end-to-end picture of how the IICS Secure Agent orchestrates its various services and what it requires. This will also enable users to better troubleshoot agent services and gauge sizing for an agent host.

Integrated Monitoring with Operational Insights

Date and Time: June 30, 2020, 8:00 AM Pacific Time
This webinar is intended for users who are looking for integrated monitoring of Informatica Cloud and on-premises products such as DEI (formerly BDM), IDQ, and PowerCenter. At the end of this session, you will have insights into Operational Analytics, Monitoring Services, and Infrastructure Alerting, along with best practices.


IICS Runtime Continuity FAQ

Posted by gcook Mar 30, 2020

(Doc updated July 7, 2021: Cloud Integration Hub now supports runtime continuity)


The following frequently asked questions (FAQ) and answers are intended to help customers better understand the runtime continuity capabilities being made available as of the Informatica Intelligent Cloud Services (IICS) Spring 2020 April release.


This functionality will allow current jobs to continue running during the upgrade process.

The Spring 2020 release of Informatica Intelligent Cloud Services includes many changes. Please review the attached document for more details. We will provide individual POD upgrade dates within the next week.


Cloud Data Integration

Continued investments in the current leader in integration for data warehousing to make it more scalable, flexible, and dynamic.

  • Data warehousing support for transaction control transform and dynamic file names to enhance the control over the writing of target data.
  • Highly requested capabilities of mid-stream data preview for a better debugging experience.
  • Expansion of the user, task, and sessions variables for enhanced expression creation.
  • Enhancements to parameterization capabilities to access parameter files from cloud stores like Amazon S3, Microsoft Azure Data Lake Storage Gen2 (ADLS Gen2), or Google Cloud Storage.
  • Improvements to the Union transformation to support more than two input groups.
  • Ability to specify the data flow run order to load targets in a mapping in a particular order.
  • Extending Change Data Capture (CDC) sourcing capabilities to include z/OS VSAM file capture.
  • Databricks Delta connector as a source and target for mass ingestion tasks.
  • Roll-out of Operational Insights for Cloud Data Integration and improved visualizations for new operational analytics.


Cloud Data Integration Elastic

  • Support for hierarchical datatypes. While applications are collecting massive amounts of data, including IoT data, storage costs become a concern, and we begin looking at columnar databases and formats such as JSON, Avro, Parquet, and ORC. To enable BI applications to use this data, developers can pass it into an elastic mapping and convert it into a structured format.
  • Serverless. A new, modern cloud architecture that removes the need to manage servers, virtual machines (VMs), and containers. Developers can run mappings in response to events without burdening administrators with maintaining infrastructure or installing a Secure Agent. Auto-scaling, high availability, and recovery are built in and enabled automatically. This feature is in preview.
  • Auto-tuning. Manually tuning the Spark engine is an inefficient and time-consuming process, too often characterized by trial and error. Tuning attempts are error-prone because one parameter generally impacts another. CLAIRE, Informatica’s AI engine, now automatically tunes mapping tasks based on a number of heuristics, including the size of the data, the transformation logic, the size of the cluster, and the size of the cluster nodes.
  • Node bootstrapping. Cluster nodes can run initialization scripts to perform bootstrap actions that install additional software or customize cluster instances.
  • Support for the Sequence Generator transformation.
  • Auto-scaling enhancements for an elastic cluster through a custom scheduler.
  • Support for high availability. You can configure an elastic cluster to become highly available so that jobs continue running smoothly without interruption.
  • New platform capabilities: Continuous availability through upgrades, job recovery, high availability, and disaster recovery.


Taskflows for Cloud Data Integration

Continued investments of Cloud Data Integration taskflow support of ingestion use cases. Taskflows now support:

  • Mass ingestion tasks: Provides the means to apply any pre- or post-process business logic before or after ingesting data to or from a data lake.
  • Inline file listener tasks: Makes it possible to wait for the arrival of a file before proceeding to further processing steps.
  • Unique API name override: Customers can now override the unique API name assigned to taskflows. This makes it easier to put new versions in production without the need to update consumers.

Note: You can find a summary of taskflow features introduced in 2019 in the Taskflow folder of the Cloud Data Integration community and specifically in this article: Cloud Data Integration - Taskflows - 2019 Highlights


Cloud Data Quality

Cloud Data Quality continues its evolution of data quality services. New with the Spring 2020 release are:

  • Deduplicate asset:
    • Identity matching (single-source) functionality to compare records on a field-by-field basis and generate a numerical score that indicates the degree of similarity between the records.
    • Optional consolidation of the records that the deduplication process identifies as duplicates. Consolidation evaluates each set of duplicate records and creates a preferred version of the records in the set according to criteria that the user defines.
  • Parse asset: Parse discrete words or strings from an input field using regular expressions and/or dictionaries.
  • Rule specification asset: Additional function expressions are available in rule statements: Convert to Date, Convert to Float, Convert to Integer, Is Spaces, Substring, and To Char.
  • Data Profiling:
    • Perform What-If scenarios by profiling outputs of Data Quality assets: rule specification, verifier, and cleanse.
    • Extension of source support for Salesforce, Microsoft Common Data Model, and Google BigQuery.
    • Auto-Assignment of Data Quality assets to source columns of Microsoft Common Data Model.
    • Ability to modify connection and source object from existing profiling tasks.
    • Option to export records that have data quality issues to a delimited file.
    • Integration of Data Profiling with the Operational Insights service.


Cloud Integration Hub

  • Ability to subscribe via API to partially published data, allowing smaller amounts of data to be consumed per API subscription call.
  • Subscription API does not have a size limit when using a private publication repository.
  • Ability to create a synchronization task-based subscription from the Cloud Integration Hub subscription menu for better self-service.
  • Ability to create a topic based on an existing connection schema.
  • Explore functionality now supports folders and tags.
  • Usability enhancements for topics with topic-related publications and subscription tables and visual alignment across all pages.
  • Performance enhancements when using a private publication repository.
  • Onboarding of Cloud Integration Hub to the Azure POD.


Intelligent Structure Discovery (ISD)

  • Ability to use the Structure Parser transformation in Data Integration with real time connectors.
  • Ability to use the Structure Parser transformation in Data Integration in passthrough mode.
  • Ability to set the Structure Parser transformation in Data Integration for non-relational output (serialize to JSON, XML, Parquet, Avro, and ORC).


B2B Gateway

  • B2B Partners portal: Takes partner community management to the next level by empowering business partners. Partners can use the portal to track their file exchanges with the organization and to send and receive files to and from the organization over the secure HTTPS protocol.
  • SFTP server: The new SFTP service provides customers with the ability to manage and use SFTP servers to exchange files with partners.


API Gateway

  • OAuth 2.0 support: The API Gateway and the Informatica Intelligent Cloud Services platform are delivering a new authorization option for API access. Initially available to Cloud Application Service API consumers, OAuth 2.0 used in conjunction with a client ID and secret extends the authorization choices that today include basic authentication and JSON Web Token (JWT)-based authentication.
  • Personally Identifiable Information (PII) privacy policies have been extended. The PII policy not only reports on incoming and outgoing PII transfer, but now also provides the ability to block requests and responses that contain sensitive PII data.


Cloud Application Integration

Spring 2020 April

This release includes the following updates:

  • Support for an Operational Insights Dashboard for Application Integration depicting API, process, and connector execution metrics.
  • Application Integration API endpoint authorization.
  • Application Integration can now make use of the API Gateway’s OAuth service. This extends support for OAuth 2.0-based authorization in addition to the current basic authentication and JSON Web Token (JWT)-based authentication capabilities.
  • Application Integration can now restrict incoming requests to those coming through the API Gateway. This ensures that monitoring and policy enforcement are applied consistently at the gateway.


January 2020

The January 2020 release represents a major release of Application Integration. See the following link for information about the new features: Announcing the January 2020 release of Cloud Application Integration


  • Making it easier to implement by creating process object schemas simply by importing WSDL, XSD, or Swagger interface documents from a file, a zip, or a URL.
  • Making it easier to debug:
    • To help you debug processes, the Assignment step now gives you access to field values controlled by a process’s tracing level. To turn up tracing without redeploying a process, a new X-Debug HTTP header has been introduced.
    • Tools such as Postman, SOAPUI, or RESTclient are great but require you to leave Process Designer to initiate a process. You can now create and associate with a process one or more JSON or XML process inputs and run a process with one or all inputs. You can then use the new process instance debugging capabilities to better analyze and identify the root cause of errors.
  • Making it easier to consume databases as fully-fledged REST-enabled APIs. Not only can you enable your database with OData v4 with a single click, you can now expose it as a fully-fledged REST API-enabled database. Just download the auto-generated Swagger interface and you’re good to go.
  • Making it easier for developers to consume JSON and XML content-type responses and work with attachments and multipart responses.
  • Unique API name override for processes and taskflows.
  • Making it possible for operators to restart processes from the Application Integration Console to recover from unhandled errors communicating with the end system.
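As a rough sketch of how the X-Debug header described above might be used, the snippet below attaches it to a process invocation to raise tracing for that one request without redeploying the process. The endpoint URL and the "verbose" header value are illustrative assumptions; consult the Application Integration documentation for the accepted values.

```python
import urllib.request

def build_debug_request(url, payload, debug_level="verbose"):
    """Build a process invocation that raises tracing for this request only.

    The URL and the "verbose" value are assumptions for illustration.
    """
    return urllib.request.Request(
        url,
        data=payload.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-Debug": debug_level,  # per-request tracing override
        },
        method="POST",
    )

# Hypothetical process endpoint; sending this request would invoke the
# process with elevated tracing for this call only.
req = build_debug_request(
    "https://example.invalid/process/OrderIntake", '{"orderId": 1}')
```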


Cloud Mass Ingestion Service

Enhanced capabilities for data ingestion from a variety of sources, using a simple and unified user experience with extensive connectivity, to address mass ingestion use cases for Cloud data warehouses and Cloud data lakes.


Mass Ingestion Databases

Continuing to build usability and resilience into the service while adding new functionality. New features include:

  • Schema drift support, which enables running database ingestion tasks to recognize when the schema of source tables within the task changes (column changes only) and to dynamically process the changes through to the CDC-supported target.
  • Asset import and export functionality for database ingestion tasks.
  • GitHub source control for database ingestion tasks.


Mass Ingestion Streaming

  • Continuing to enable ingestion from a variety of streaming sources with real-time monitoring and lifecycle management.
  • New streaming connectivity and enhancements, new streaming sources and targets:
    • New connectivity: Amazon Kinesis Streams source and Microsoft Azure Data Lake Storage Gen2 target.
    • Connectivity enhancements: Flat file source.
    • Test connection for Mass Ingestion Streaming connectors.
  • New transformations during ingestion:
    • Python transformation support
    • Splitter transformation support (technical preview).
  • Real-time monitoring and lifecycle management:
    • Real-time refresh of Mass Ingestion Streaming job statistics.
    • Stop and Resume support for Mass Ingestion Streaming tasks.
  • Enterprise readiness:
    • Sub-organization support to represent different business environments.
    • Integration with GitHub for source control on Mass Ingestion Streaming tasks.
    • Deployment of the Mass Ingestion Streaming service in Kubernetes for autoscaling and high availability.


Mass Ingestion Files

  • Continuing to enable mass ingestion of files from a variety of sources to cloud repositories with real-time monitoring and a range of scheduling capabilities.
  • New connectivity: Databricks Delta Lake as source and target.
  • Taskflow integration for Mass Ingestion Files tasks to support complex file ingestion flows that require orchestration of multiple tasks.


MDM - Reference 360

  • Delta export:
    • Retrieve values that have changed in a given time period.
  • Validation:
    • Configure attribute-level validation rules to be executed when creating or editing code values.
    • Receive validation errors on create and update of individual code values.
  • Improved loading of larger data sets:
    • Reduce the loading time for data sets and hierarchies by loading a subset of code values at a time.
  • Export enhancements:
    • Choose which attributes to export both from UI and API.
    • Export data in JSON format with the REST API.
  • Workflow email notifications:
    • Receive email confirmations at each step of an approval workflow.
    • Navigate to your task in Reference 360 through a link in the notification email.


IICS Platform

Runtime Continuity (more details here)

Runtime continuity (zero downtime for runtime) to enable uninterrupted execution of scheduled jobs and processes at all times including during Informatica Intelligent Cloud Services upgrade windows.


GitHub Integration

  • Undo checkout capability for administrators on behalf of any user and any asset.
  • Bulk unlink capability.
  • Multi-project pull capability to enable pull across multiple projects with cross-project dependencies.
  • Automatically deploy upstream dependent objects upon save to be consistent with the behavior of non-source-controlled Cloud Data Integration assets.
  • Inline Git actions in dependency view.


Export/import Environment Properties and Schedules

  • Ability to export/import Secure Agent configurations to automate agent provisioning with the option to either restore or tune agent configuration settings through export/import APIs. This enables users to tune runtime environment properties such as agent configurations.
  • Ability to export/import schedule objects across environments through export/import APIs and in the Informatica Intelligent Cloud Services user interface.


Asset Management

  • Ability to reassign asset ownership to enable uninterrupted job execution when the asset owners are disabled in the organization.


New PODs

  • Informatica Intelligent Cloud Services availability on the Japan POD (Azure).
  • Informatica Intelligent Cloud Services availability on the Google Cloud Platform (GCP) POD and Marketplace.
  • Informatica Intelligent Cloud Services availability in Canada region.


Operational Insights

  • Global availability of Operational Insights across all Amazon Web Services (AWS) PODs.
  • Generate alerts based on resource usage by individual agent services.
  • Generate alerts on disk utilization of Secure Agents.
  • Take preventive actions using custom scripts based on generated alerts.
  • In-app notification of infrastructure alert messages.


Ecosystems and Connectivity

  • Expanded coverage in terms of functional depth and breadth for cloud ecosystems:
    • Azure:
      • SQL Data Warehouse V3: Unconnected lookup, ADLS Gen2 as optional staging area for Polybase, improved exception handling, source/target parameterization overrides with parameter files, parameterization overrides (for schema, database, and table) in PRE SQL/POST SQL/SQL Override, and performance enhancements.
      • ADLS Gen2: Source/target parameterization overrides with parameter files, Azure Gov Cloud, FileName port, new Date/Time data type support for Parquet files, Parquet Decimal and Date/Time data type support, user-authenticated proxy, and performance enhancements.
      • Blob Storage: Source/target parameterization overrides with parameter files.
      • CDM Folders (available for preview only): Support for the new CDM schema (v0.9) and Unicode character support.
    • AWS:
      • S3 V2: Hierarchical data types, ISD, multi-region, Parquet Decimal, Date/Time datatype support, KMS (other accounts), source/target parameterization.
      • RedShift: Ecosystem pushdown optimization (S3 to RedShift), JDBC driver update, KMS (other accounts), multi-region, source/target parameterization.
    • Snowflake DW:
      • Database pushdown optimization enhancements.
      • Unconnected lookups with pushdown optimization.
      • Snowflake on Google Cloud Platform.
      • OAuth2.
      • gzip compression.
    • Google:
      • IICS on Google Cloud Platform.
      • BigQuery: CDC support, lookup.
      • Google Cloud Storage: Read from directory.
    • Microsoft Business Apps:
      • Common Data Model (CDM) folders: Schema update (0.9).
      • Dynamics 365 Operations:  Certificate, update retry.
      • CRM:  Client secrets.
    • SAP:
      • SAP BW Reader: Supports dates before the year 1753.
      • SAP HANA (new private connector): Read from tables and modelling views (analytical, attribute, and calculation views).
    • Salesforce:
      • Sales and Service Cloud: API updates,  DRS enhancements to support deleted records.
      • Commerce Cloud: Cycle dependencies.
    • Oracle Apps:
      • Oracle HCM:  Writer updates.
      • NetSuite: API update (2019.2).
    • Adobe:  XDM connector enhancements.
  • Support for new patterns and use cases:
    • Cloud Data Integration Elastic: 
      • Amazon S3 V2:  Hierarchical data type and ISD support.
      • Azure SQL Data Warehouse V3: Elastic mappings.
      • ADLS Gen2: Elastic mappings (available for preview only).
      • JDBC V2: Scala v2, partitioning, AWS and Azure runtime support.
    • Cloud Mass Ingestion:
      • Databricks Delta
      • ADLS Gen2: Direct Polybase load from ADLS Gen2 and performance improvements.
    • Cloud Data Quality: CDM
    • Kafka Pub/Sub Connector in Cloud Data Integration.
  • Improved connectivity across horizontal technology, systems, and applications:
    • Technology:
      • OAuth 2.0 JWT support.
      • REST V2:  Real-time transaction support, minor enhancements,  Hadoop 3.1 for Cloudera, 4.1 for HDInsights.
    • Database and EDW:
      • ODBC:  Unconnected lookup.
      • Oracle:  Blob/Clob datatype.
      • MySQL:  Advanced runtime properties.
      • SQL Server:  Advanced runtime properties.
      • Azure SQL DB: Bulk, Upsert.
      • PostgreSQL:  Schema name enhancements.
    • MongoDB:  BSON and JSON document support, partitioning, create target with schema-less support.
    • Workday:  Hierarchy parameterization support.
    • ServiceNow:  Upsert.
    • Cloud Apps:
      • Cvent:  Child object
      • Coupa: Proxy
  • New Add-on Connectors page to access add-on connectors on Marketplace.

Choosing the Right Path to Cloud Data Warehousing

Thursday, February 27, 2020 | 10:00am PST

Data is too important to your business to risk making the wrong decision about implementing cloud data warehousing.
Join our live webinar “Cloud Data Warehouse Drivers: Migrate, Modernize, or Both?” as we explore:

  • Key business and technical drivers for cloud migration
  • The cloud data warehouse ecosystem
  • Challenges with data modernization
  • Best practices for data migration

Featured speakers:

  • Sam Tawfik, Product Marketing Manager, Informatica
  • Andrew Comstock, Senior Director, Product Management


Click here to register.

This article showcases how Informatica Cloud Data Integration supports implementation of 'Data Vault' models.


What is Data Vault Modeling?


Data Vault is a modeling method used to design data warehouses. A Data Vault (DV) mainly consists of three types of tables:

  • Hubs – Hub tables contain a unique list of business keys
  • Links – Link tables capture the relationships between two or more business keys
  • Satellites – Satellite tables include descriptive data that changes over time
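The three table types above can be sketched as a minimal relational schema. This is an illustrative sketch only; the table and column names (and the use of SQLite) are assumptions, not a prescribed layout:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hub: a unique list of business keys.
cur.execute("""
CREATE TABLE Hub_Customer (
    Customer_HK   TEXT PRIMARY KEY,   -- hash key (DV 2.0) or surrogate key
    Customer_BK   TEXT UNIQUE,        -- business key
    Load_Date     TEXT,
    Record_Source TEXT
)""")

# Link: a relationship between two or more business keys.
cur.execute("""
CREATE TABLE Link_Customer_Location (
    Cust_Loc_HK   TEXT PRIMARY KEY,
    Customer_HK   TEXT REFERENCES Hub_Customer (Customer_HK),
    Location_HK   TEXT,
    Load_Date     TEXT,
    Record_Source TEXT
)""")

# Satellite: descriptive attributes that change over time,
# keyed by the hub key plus the load date.
cur.execute("""
CREATE TABLE Sat_Customer (
    Customer_HK   TEXT REFERENCES Hub_Customer (Customer_HK),
    Load_Date     TEXT,
    Name          TEXT,
    Email         TEXT,
    PRIMARY KEY (Customer_HK, Load_Date)
)""")

tables = [r[0] for r in cur.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
```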


Is Data Vault the same as Dimensional Modeling?


No, Data Vault and Dimensional Modeling (Star and Snowflake Schema) are different modeling methods used to design Data Warehouses.


What is Data Vault 2.0 or DV 2.0?


Data Vault 2.0 is the next iteration of Data Vault modeling, with enhancements to the modeling technique such as the use of hash keys to support parallel loads.


Why choose Data Vault over Dimensional Modeling?


Data Vaults are known for flexibility and scalability. The DV design provides for long-term historical data, and it is easy to add new sources to a DV model. Data Vaults are also known to perform well and to be usable by business users, since the model reflects the business domain.


Informatica Cloud Data Integration and Data Vault model


Informatica Intelligent Cloud Services makes it easy to develop data warehouses using the Data Vault model, for the following reasons:


  • Rich Transformation Support – Informatica Cloud Data Integration provides a rich set of transformations that support a variety of integration use cases, including Data Vault implementation. Pre-built functions such as MD5 make it straightforward to perform the hashing that is central to a DV implementation.


  • Comprehensive Connectivity – Whether you build your data warehouse on-premises or in the cloud, you can easily move data using the hundreds of out-of-the-box connectors we offer.


  • Reusability – Parameterization support in Informatica Cloud Data Integration allows a data flow to be reused for multiple table loads.
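As a rough illustration of the hashing that the MD5 function performs inside a mapping expression, the sketch below computes DV 2.0-style hash keys in Python. The normalization rules (trimming, upper-casing, the "||" delimiter) are assumptions that a real implementation would standardize across all loads:

```python
import hashlib

def hash_key(*business_keys):
    """Build a deterministic DV 2.0-style hash key from business key parts.

    The trim/upper-case/"||"-delimiter rules here are illustrative
    assumptions; a real implementation fixes these conventions up front.
    """
    normalized = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Hub key from a single business key; link key from the related keys.
cust_hk = hash_key("CUST-1001")
link_hk = hash_key("CUST-1001", "LOC-42")

# Because the key is computed from the business key alone (no sequence
# lookup), hub, link, and satellite loads running in parallel all
# derive the same key independently.
assert cust_hk == hash_key("  cust-1001 ")
```

This determinism is what lets DV 2.0 drop sequence-generated surrogate keys in favor of hash keys for parallel loading.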


Sample DV Implementation in IICS


Below is a simple Data Vault implementation built purely to illustrate how easily DV can be implemented using Informatica Cloud Data Integration. The example involves customer and location information.




Step 1 – Loading Hub tables


Customer Hub



Location Hub



Step 2 – Loading Customer-Location Link table



Step 3 – Loading Satellite tables


Customer Satellite Table



Location Satellite Table



Optional - Orchestration of the loads in a Taskflow



Preview of data loaded by the mappings


select * from [dbo].[DV_Cust_Hub]




select * from [dbo].[DV_Cust_Loc_Link]




In summary, IICS continues to be the leader in the iPaaS market and, as demonstrated in this article, offers rich features that make loading your Data Vault easy. To try Informatica Cloud Data Integration for your data warehouse implementation, you can start a free 30-day trial from here.

This document summarizes the key capabilities that were introduced in the Taskflows module of Cloud Data Integration in 2019.

Want to deploy code every 11.7 seconds? Transform your integration with Agile, DevOps, CI/CD


Deploy Faster, Reduce Outages, and Increase Revenue & Productivity

Tuesday, February 4, 2020 8:30am PST

Within a year of Amazon's move to AWS, engineers were deploying code every 11.7 seconds, on average. The agile approach also reduced both the number and duration of outages, resulting in increased revenue. Many organizations have moved from clunky, legacy software and systems to an agile-based DevOps approach that has yielded a significant increase in development productivity.

Agile, DevOps, and CI/CD are three distinct and critical tools for today’s development organization. When all three are used for their intended purposes, the results are transformational. But what do these tools mean in the context of application and data integration?

Join this webinar to learn about leveraging Informatica’s cloud data integration and application integration services to support your CI/CD pipeline. You will learn how to:


  • Enable version control of application and data integration assets with any third-party version control system
  • Perform continuous integration and continuous delivery for application and data integrations
  • Promote DevOps for application and data integration


Featured Speakers:

  • Vivin Nath, Principal Product Manager, Informatica
  • Sorabh Agarwal, Principal Product Manager, Informatica


Click here to register

The Informatica Global Customer Support Team is excited to announce an all-new technical webinar and demo series – Meet the Experts, in partnership with our technical product experts and Product management. These technical sessions are designed to encourage interaction and knowledge gathering around some of our latest innovations and capabilities across Data Integration, Data Quality, Big Data and so on. In these sessions, we will strive to provide you with as many technical details as possible, including new features and functionalities, and where relevant, show you a demo or product walk-through as well.


Topic and Agenda



Despite your cloud investments, you still have on-premises systems you can't do without. If that's not likely to change in the foreseeable future, your business needs tight integration and efficient data management to continue operating effectively.


Join Informatica for "Hybrid Data Management with Informatica Operational Insights," a complimentary webinar on monitoring and managing data in complex hybrid environments. It will include:

  • Best practices for integrating and managing a growing number of systems spread across the cloud and on-premises
  • Using Informatica Operational Insights to monitor and manage Informatica investments
  • A live demo of Operational Insights


Whether you're a line-of-business application owner or an IT operations manager, you won't want to miss this opportunity to ask our experts about enhancing your hybrid integration platform. Reserve your spot now.



Topic and Agenda



Data integration is a core capability for delivering digital transformation and competitive analytics. Next-gen data integration enables you to combine data from various sources with ease of use and scalability. 


You may be already using many of the existing IICS Cloud Data Integration (CDI) features but you may not be aware of other existing or recently added features.

  • Did you know that you can access several hundred connectors to databases, cloud data lakes, and on-premises/SaaS applications?
  • Did you know that your IICS CDI allows you to scale your integrations as your business grows with support for high-performance pushdown optimization?
  • Did you know that you can quickly build codeless advanced integrations with wizards and out-of-the-box pre-defined templates?


Join Andrew Comstock for a deep dive into IICS Cloud Data Integration with a specific focus on going beyond the traditional use cases:

  • Connect to any of several hundred sources with flexibility and out-of-the-box pre-defined templates
  • Trusted connections to AWS Redshift, Microsoft Azure SQL Data Warehouse, Google BigQuery, and Snowflake cloud databases
  • Going beyond application integration with data integration
  • Learn how to scale your integration processes using serverless deployments


Andrew will also demo the IICS CDI Mapping Designer to illustrate the power and flexibility of building simple and complex data integration loads with out-of-the-box advanced data integration transformations.



Topic and Agenda


  • Topic: Meet the Experts Webinar - "Ingesting Data to Your Cloud Data Lake or Cloud Data Warehouse: Deep Dive & Demo"
  • Date: 31 October 2019
  • Time: 9:00 AM Pacific Daylight Time (PDT)
  • Duration: 1 Hour
  • Webinar Registration Link:
  • Speakers:
    • Etty Afriat, Director, Product Management, Informatica
    • Vishwa Belur, Director, Product Management, Informatica


When building a new data warehouse or data lake in the cloud, you need to ingest large volumes of data into the cloud; the same applies when migrating existing data warehouses or data lakes. This could mean tens of thousands of relational tables, or data from IoT devices. You need different types of ingestion (file, database, or streaming) depending on the data you are bringing into the cloud. Join this technical webinar, with a demo and roadmap, to learn more about the mass ingestion capability of Informatica Intelligent Cloud Services (IICS).


Join the experts from Informatica as they deep dive into different types of data ingestion capabilities for:


  • Exchanging large data files between on-premises and cloud repositories
  • Data warehouse modernization and migration
  • Ingestion of database content into a cloud data lake
  • Streaming and IoT data ingestion into data lakes and messaging hubs

We are hosting a webinar with Kelly Services on September 4, 2019. Ravi Ginjupalli will review the process of mastering four critical domains, starting with customer, and of connecting Informatica MDM with Salesforce through IICS. The webinar is co-presented with Capgemini and also covers a Teradata-to-Azure migration at Kelly Services. Hear about the story, challenges, and lessons learned as Ravi Ginjupalli and Venkat Gupta of Capgemini discuss the approach and technology behind their effort.


You can register for the webinar on Sept 4, 2019, at 11 AM PDT here:


Results at Kelly Services include:

  • 99.99% accuracy in identifying duplicates using MDM match rules
  • An average of 1,000 accounts processed weekly (including both prospects and new clients)
  • 500K customer records loaded during the initial load
  • 18% of duplicate customer records in Salesforce eliminated
  • Six-month time to deliver, covering Salesforce + D&B integration + hierarchies: 180 tables, 5,218 attributes, and 288 mappings



Topic and Agenda



Multi-cloud and hybrid architectures make data management complicated. You need a comprehensive next-generation integration platform as a service (iPaaS) to handle new use cases as they emerge.


Join us for a Meet the Experts webinar about how to keep up with your evolving data management needs with Informatica Intelligent Cloud Services (IICS):

  • Discover some of our new iPaaS services and capabilities
  • See a demo of IICS Integration at Scale, which uses serverless technology for data integration to help you process data in the cloud
  • Learn how AI-driven capabilities and a microservices-based architecture support use cases such as B2B integration, data quality, master data management, database and streaming ingestion, and more


Investing in SaaS, PaaS, and IaaS means managing more data in more ways across increasingly complex architectures. Register for this webinar to explore how IICS can make it easier.

Webinar: Meet the Experts: Deep-Dive, Demo, Roadmap - Informatica Cloud App/API Integration


Join Informatica product experts as they dive deep into the API and application integration capabilities for accelerating your digital transformation. You will learn:


  1. How to develop processes and APIs, and connect to any API, without coding
  2. What Intelligent APIs are, and why Informatica is uniquely qualified to offer them
  3. How to manage integration artifacts and APIs
  4. The “ilities” (performance, scalability, reliability) of our platform
  5. The IaaS, SaaS, and on-premises partners we integrate with

The IICS Summer 2019 release offers several new capabilities that include integration patterns, expanded connectivity, and platform functions. Highlights are listed below.


Data Integration

  • Machine learning-based transformation recommendations allow you to get suggestions for the next best transformation based on the mapping you are working on.
  • Improved support for custom, local parameter file locations; support for fully parameterized SQL queries; and the ability to override connections at runtime using parameter files.
  • Support for invoking taskflows using the RunAJob command-line utility and file listeners. File events can be passed as parameters to data tasks for listener-driven processing.
  • New and improved taskflow canvas with a more compact layout, improving the design experience.
  • The SQL transformation now supports ad hoc SQL queries (in addition to stored procedures).
  • Intelligent Structure Discovery:
    • Enrich existing ISD models with data from additional sample files (JSON and XML) to quickly adapt to data drift.
    • The Structure Parser transformation can now update or change the associated intelligent structure model without breaking mappings for unchanged ports, which enhances mapping flexibility and allows fast adaptation to changes in incoming files.
    • Simplified design-time experience with the ability to add a prefix or a suffix to existing node names using the bulk action functionality.
    • Enhanced parsing engine support with the ability to discover and parse Avro files.
  • Mass Ingestion:  
    • Additional sources and targets: ADLS Gen1 as source; ADLS Gen2 as source and target; Snowflake as source and target.
    • Support for custom actions during file transfer: compress/decompress and encrypt/decrypt.
    • Integration with Enterprise Data Catalog (EDC): select Mass Ingestion source from EDC.
    • Filter files to transfer by file size and date.
    • Optimization for large file handling – resume from point of failure.
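The parameter-file capabilities listed above can be illustrated with a small example. All project, folder, task, and parameter names below are hypothetical, and the exact syntax can vary by release, so treat this as a sketch and confirm against the IICS parameter file documentation:

```
#USE_SECTIONS

[Global]
$$LastRunDate=2019-06-01

[MyProject].[SalesFolder].[mt_LoadOrders]
$$LastRunDate=2019-06-15
$$SourceFilter=REGION = 'EMEA'
```

At runtime a task reads values from its own section, falling back to [Global] for parameters it does not override; storing such a file in a custom, local location on the Secure Agent machine is one of the capabilities called out in this release.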


Application Integration

  • Developer productivity enhancements: new compact canvas, improved diagnostics and debugging 
    • Improved Process and Guide canvas with a more compact layout, improving the design experience.
    • New validation panels help diagnose service connectors and connection errors.
    • Fault details are now available with every step to diagnose runtime errors and improve debugging. 
  • Guides: Improved page layout and Salesforce Lightning theme
    • Guide Screen step designer has been revamped to provide more real estate to design guides.
    • Salesforce Lightning theme support is now available for guides and the Guide Launcher.
  • Source control and deployment automation 
    • New CLI utility automates the export and extraction of design assets to any source control system. The CLI's package, import, and publish capabilities make it possible to deploy full deployments, packages, and patches across environments.
    • Sample Jenkins pipeline to help developers adopt continuous integration practices for Application and Data Integration, and to help operations staff automate continuous deployments.
  • Expanded pub/sub and messaging capabilities with Salesforce Platform Events and RabbitMQ to complement the current support for AMQP, JMS, AWS SNS/SQS, Azure Service Bus/Event Hubs, and Kafka.
    • New support for Salesforce Platform Events and PushTopic events, enabling customers to deliver secure, scalable, and customizable event notifications within Salesforce or from external sources.
    • New RabbitMQ native connector providing support for durable queues.


API Management

  • CLAIRE-enabled privacy data leak identification: identification of privacy data in API content, with masking or blocking of sensitive personally identifiable information. (This feature is in preview.)
  • Ability to customize the structure of API URLs.
  • Support for the internationalization of API, group, and organization names.


Integration Hub  

  • Governance enhancement: Support for predefined and custom roles with granular privileges that promote governance.


B2B Gateway

  • Exchange files with your partners using Cloud AS2.
  • Get insights into your daily gateway activities and statuses with the new dashboard.


Platform

  • Asset Dependency feature providing “used by” and “uses” relationship information across all asset types and services for impact analysis.
  • Common Explore enhancements: Changes made by the user to the Explore column settings are now automatically persisted for that user for various filtered views such as “all assets,” “projects & folders,” etc.
  • More control and flexibility in managing users: grant or deny service-level access to a user through the assign services capability. Disable users to prevent logins for both native and SAML authentication mechanisms. REST APIs for user management – users, groups, and roles.
  • Ability to start/stop Secure Agent services to optimize computing resources and isolate the need to restart a single service from others.
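The user-management REST APIs noted above can be driven from any HTTP client. The Python sketch below only builds the requests (nothing is sent); the login endpoint, the icSessionId header, and the /api/v2/user path are assumptions based on the public IICS REST API v2, so verify them against the current API reference for your pod.

```python
# Sketch: exercising the user-management REST APIs.
# Endpoint paths, header names, and payload fields are assumptions
# based on the IICS REST API v2; confirm against the documentation.
import json

LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def build_login_request(username, password):
    """Build the login call; the response carries an icSessionId and a
    serverUrl to use on subsequent requests."""
    headers = {"Content-Type": "application/json", "Accept": "application/json"}
    body = json.dumps({"@type": "login", "username": username, "password": password})
    return LOGIN_URL, headers, body

def build_list_users_request(server_url, session_id):
    """Build a GET /api/v2/user call to list the users in the org."""
    headers = {"Accept": "application/json", "icSessionId": session_id}
    return server_url.rstrip("/") + "/api/v2/user", headers

# Example (no request is actually sent here):
url, headers, body = build_login_request("admin@example.com", "secret")
```

The same session header pattern applies to the group and role endpoints mentioned in the release notes.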


Connectivity Enhancements  

  • New native connectors to connect to CDM/Power BI, Snowflake (V2), MongoDB, Adobe Experience Platform, Ariba (V2), and Cassandra.
  • Updates to existing connectors: Microsoft (MSD 365 Operations, Cosmos DB, ADLS Gen2 and Gen1, SQL DW), Google (BigQuery, Cloud Storage), Amazon (S3, Redshift), Oracle NetSuite, Salesforce, SAP (Concur V2, S/4 HANA), and Greenplum.


New IICS Services

The IICS Summer 2019 release offers the following new services:


Streaming Ingestion (included in Data Ingestion Service)

  • Ingestion of streaming data from logs, Apache Kafka and IoT sources.
  • Supports AWS and Azure ecosystems as ingestion targets: Amazon Kinesis Streams, Kinesis Firehose, Amazon S3, and Azure Event Hubs.
  • Supports lightweight transformations at the edge.
  • Real-time monitoring of streaming ingestion jobs.


Integration at Scale (included in Data Integration Service)

  • Run data integration jobs at scale on a fully managed, serverless Spark cluster.
  • Support for the AWS ecosystem: S3 and Redshift.
  • Auto-tune and scale fully managed Kubernetes clusters.


Operational Insights

  • Operational Insights is now available for IICS cloud services to help with efficient infrastructure monitoring and alerting. IICS cloud services users can monitor the health of a) Secure Agents, b) services running on each Secure Agent, c) Runtime environments and d) subscribed cloud services.
  • Email Alerts can be set on services and Secure Agents for unavailability or excessive resource consumption.


Reference 360  

  • Enables business users to manage enterprise reference data in an easy-to-use, configurable, business-friendly user interface.
  • Provides versioning, collaboration and complete life-cycle management of reference data.
  • REST APIs for managing data and metadata, enabling automation and system integration.
  • Improved stability and performance.



Data Quality

  • Enable business users to create rule specifications that perform a variety of Data Quality actions, including completeness checks, validation, and standardization.
  • Create and manage value lists for data quality operations using dictionaries. Dictionaries can be used to identify, validate and standardize data as part of a rule specification.
  • Embed Data Quality rules in Data Integration mappings to support Data Quality activities in typical data integration scenarios such as data warehousing, data migrations and more.


Thank you, 

The IICS Team
