
Cloud Data Integration


Notification of an upcoming change of behavior in Amazon Redshift V2 Connector

Product: Informatica Intelligent Cloud Services

Connector: Amazon Redshift V2

Description of the change

With the upcoming July 2021 release, when you migrate a mapping that uses an Amazon Redshift V2 connection, you can run the imported mapping with the target organization's connection without changing any mapping attributes.

For example, you develop a mapping in the IICS development organization and you then migrate the mapping to the IICS production organization. When the connection endpoint or object path in the production organization differs, you can update the Redshift V2 connection to reflect the production endpoint. You do not have to modify the mapping again. Previously, you had to manually update the mapping and create the object in the production organization.

 

Impact of this change

After the July 2021 major release, an Amazon Redshift V2 mapping will fail when the Target transformation includes all of the following configurations:

  • The target operation selected is update, upsert, or delete.
  • No column is selected in the Update Columns field.
  • The target table name specified as an override in the target advanced properties does not have a primary key.

 

The following image shows the configurations in the Target transformation that you need to consider:

Error Scenario

If the target table specified in the Redshift V2 mapping at the object level has a primary key, but the table name specified as an override in the target advanced properties does not have a primary key defined, Data Integration considers the table name specified in the advanced properties. The mapping fails with the following error message:

[ERROR] <Target_table_name> No key is selected for Operation Type

Solution

When the mapping includes an update, upsert, or delete operation in the Target transformation, perform one of the following tasks before you move to the July 2021 release and migrate mappings from one environment to another:

  • In the Update Column field, select the column that you want to use as the primary key.
  • If you override Target Table Name in the target advanced properties, ensure that a primary key is defined on the Redshift target table (see the sketch below).
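For reference, here is a minimal sketch of how you might verify, and if needed define, a primary key on the override target table before the upgrade. It assumes Python with the psycopg2 driver; the cluster endpoint, credentials, schema, table, and key column are hypothetical placeholders, so adapt them to your environment.

    import psycopg2

    # Hypothetical connection details -- replace with your own cluster endpoint and credentials.
    conn = psycopg2.connect(
        host="my-cluster.abc123.us-west-2.redshift.amazonaws.com",
        port=5439,
        dbname="dev",
        user="awsuser",
        password="********",
    )

    schema, table = "public", "sales_override"  # table named in the Target Table Name override

    with conn, conn.cursor() as cur:
        # Check whether the override table already has a primary key defined.
        cur.execute(
            """
            SELECT constraint_name
            FROM information_schema.table_constraints
            WHERE table_schema = %s AND table_name = %s
              AND constraint_type = 'PRIMARY KEY'
            """,
            (schema, table),
        )
        if cur.fetchone() is None:
            # No primary key yet: add one on the column used for the update/upsert/delete.
            # Redshift treats the constraint as informational, but the mapping can rely on it.
            cur.execute(f'ALTER TABLE "{schema}"."{table}" ADD PRIMARY KEY (order_id)')

    conn.close()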

We have the following dates set for the July 2021 major release. For all future upgrades and maintenance schedules, please follow our calendar here: Link to Calendar


See our What's New Guide and list of new features here: What's New in the Informatica Intelligent Cloud Services October 2021 Release

 

PODs | Date | Day | Time
Sandbox (Pre-Release) | 21 Jun 2021 | Mon | 6 AM - 9 AM PST
USWest1, USEast2 | 11 Jul 2021 | Sun | 6 AM - 9 AM PST
USWest1 Azure, APNorthEast1 Azure | 11 Jul 2021 | Sun | 9 AM - 12 PM PST
USWest3, USEast4, APSouthEast1 | 18 Jul 2021 | Sun | 6 AM - 9 AM PST
USWest3 Azure, EMWest1 | 18 Jul 2021 | Sun | 9 AM - 12 PM PST
USWest1GCP, CANADACenteral1 | 24 Jul 2021 | Sat | 6 PM - 9 PM PST
USWest5, USEast6 | 24 Jul 2021 | Sat | 9 PM - 12 AM PST

 

*Private PODs are not listed. Please contact support for those dates.

As a result of the recent separation of the United Kingdom from the European Union, Informatica has made the decision to relocate our disaster recovery site for our EMEA IICS cloud services. You are receiving this notice as an administrator of an Informatica service located in our primary EMEA site, Dublin Ireland (“eu-west-1"). We will move the Disaster Recovery site to Paris, France (“eu-west-3") on January 30, 2021 to help our customers keep our European Union (EU) instances in the EU. All Informatica services currently running in the primary EMEA site (“eu-west-1") will remain unchanged.

 

As a result of this change, we request that you verify and, if necessary, update the allowed domains and IP address ranges in your firewalls (as you did previously for our existing site), as documented in the following KB articles.

 

  • Customers using Cloud Data Integration (CDI): - 628338
  • Customers using API Manager (APIM): - 94569
  • Customers using Cloud Integration Hub (CIH): - 570388
  • Customers using Cloud Application Integration (CAI): - 535281

 

We do not expect any downtime, impact, or risk from this change. The disaster recovery IP addresses are for disaster recovery situations only and would be used only if a DR situation were to occur.

November 17, 2020

 

Upcoming dropped support of Internet Explorer (IE) 11 and transition to Chromium-based Microsoft Edge

Microsoft announced that they will drop support for IE 11 on August 17, 2021. In preparation for this change, Informatica will start testing IICS products on the new Chromium-based Microsoft Edge browser. We aim to have all IICS products supported on Microsoft Edge by July or August 2021, when we will drop support for IE 11. We will deprecate IE 11 ahead of that date, giving customers time to make a smooth transition to Microsoft Edge before support is dropped. Please plan for this coming transition.

With the coming Informatica Fall '20 release, the File Mass Ingestion task is moving from the Data Integration service to the unified Mass Ingestion service, introducing a common and unified user experience across all ingestion types: Streaming, Database, and File.

 

Please read the attached short deck to get some early insight into the coming change and learn what’s new in the coming release for File Mass Ingestion.

 

 

For any questions you can contact the product management team:

 

Belur, Vishwanath  (vbelur@informatica.com)

Afriat, Etty  (eafriat@Informatica.com )

IICS no longer supports Windows Server 2008 as of the Spring '20 release.

 

The Windows Server 2008 operating system is already out of support by Microsoft.

 

To avoid any kind of issues while using the IICS product, you must upgrade your agent machine's Operating System to one of the supported OS versions defined in the PAM for Informatica Intelligent Cloud Services (IICS).

 

We will not support any issues on Windows Server 2008 after the next release, in the October/November 2020 timeframe.

 

For any questions or concerns please open a case with Informatica technical support.

The Success Portal - Tech Tuesdays technical sessions are designed to encourage interaction and knowledge sharing around some of our latest innovations and capabilities across Informatica Products. In these sessions, we will focus on new features, latest releases, performance tuning, best practices, etc. and as relevant, show you a demo or product walk-through as well. For June, all webinars have been planned especially for Informatica Cloud customers.

Upcoming Webinars

 

What’s New in Cloud Application Integration 2020 releases?

Date and Time: June 2, 2020, 8:00 AM Pacific Time
This presentation will cover the features and functionalities that have been or will be released as part of Cloud Application Integration releases in 2020. It will also showcase some of these features through a demo that would simulate a real-life use-case.

What’s New in Cloud Data Integration?

Date and Time: June 9, 2020, 8:00 AM Pacific Time
Watch this webinar to learn about the new capabilities of Cloud Data Integration service as part of Spring 2020 launch and understand how to leverage them in your integration flows.

Secure Agent in IICS

Date and Time: June 16, 2020, 8:00 AM Pacific Time
This webinar is intended to help users understand the best practices for the Secure Agent in IICS. After the webinar, you will be able to manage memory requirements on the agent machine, understand agent-level logging, and manage space on the agent machine.

IICS secure agent 101

Date and Time: June 23, 2020, 8:00 AM Pacific Time
This webinar is intended for anyone who works on IICS. It will most benefit IICS admins and ETL architects who procure, maintain, and manage IICS agent infrastructure. The webinar covers the IICS Secure Agent architecture and the operations the agent performs. At the end of this webinar, you will have an end-to-end picture of how the IICS Secure Agent orchestrates its various services and what it requires. This will also enable users to better troubleshoot agent services and gauge sizing for an agent host.

Integrated Monitoring with Operational Insights

Date and Time: June 30, 2020, 8:00 AM Pacific Time
This webinar is intended for users who are looking for integrated monitoring of Informatica Cloud and on-premises products such as DEI (formerly BDM), IDQ, and PowerCenter. At the end of this session, you will have insights into operational analytics, monitoring services, and infrastructure alerting, along with best practices.


IICS Runtime Continuity FAQ

Posted by gcook Mar 30, 2020

(Doc updated July 7, 2021: CIH now supports runtime continuity)

 

The following frequently asked questions (FAQ) and answers are intended to help customers better understand the runtime continuity capabilities being made available as of the Informatica Intelligent Cloud Services (IICS) Spring 2020 April release.

 

This functionality will allow current jobs to continue running during the upgrade process.

The Spring 2020 release of Informatica Intelligent Cloud Services includes many changes. Please review the attached document for more details. We will provide individual pod upgrade dates within another week.


Cloud Data Integration

Continued investment in the current leader in integration for data warehousing to make it more scalable, flexible, and dynamic.

  • Data warehousing support for the Transaction Control transformation and dynamic file names to enhance control over how target data is written.
  • The highly requested capability of mid-stream data preview for a better debugging experience.
  • Expansion of the user, task, and sessions variables for enhanced expression creation.
  • Enhancements to parameterization capabilities to access parameter files from cloud stores like Amazon S3, Microsoft Azure Data Lake Storage Gen2 (ADLS Gen2), or Google Cloud Storage.
  • Improvements to the Union transformation to support more than two input groups.
  • Ability to specify the data flow run order so that targets in a mapping are loaded in a particular order.
  • Extending Change Data Capture (CDC) sourcing capabilities to include z/OS VSAM file capture.
  • Databricks Delta connector as a source and target for mass ingestion tasks.
  • Roll-out of Operational Insights for Cloud Data Integration and improved visualizations for new operational analytics.

 

Cloud Data Integration Elastic

  • Support for hierarchical datatypes. While applications are collecting massive amounts of data, including IoT data, storage costs become a concern and we begin looking at columnar databases and binary formats such as JSON, Avro, Parquet, and ORC. To enable BI applications to use the binary data, developers can pass the data into an elastic mapping and convert it into a structured format.
  • Serverless. A new modern cloud architecture that removes the need to manage servers, virtual machines (VMs), and containers. Developers can run mappings in response to events without weighing down administrators to maintain infrastructure or install a Secure Agent. Auto-scaling, high availability, and recovery are built in and enabled automatically. This feature is in preview.
  • Auto-tuning. Traditional approaches to manually tune the Spark engine are an inefficient and time-consuming process too often characterized by trial and error. Attempts to tune are error-prone because one parameter generally impacts another. CLAIRE, Informatica’s AI engine, now automatically tunes mapping tasks based on a number of heuristics including the size of the data, transformation logic, the size of the cluster, and the size of cluster nodes.
  • Node bootstrapping. Cluster nodes can run initialization scripts to perform bootstrap actions that install additional software or customize cluster instances.
  • Support for the Sequence Generator transformation.
  • Auto-scaling enhancements for an elastic cluster through a custom scheduler.
  • Support for high availability. You can configure an elastic cluster to become highly available so that jobs continue running smoothly without interruption.
  • New platform capabilities: Continuous availability through upgrades, job recovery, high availability, and disaster recovery.

 

Taskflows for Cloud Data Integration

Continued investments of Cloud Data Integration taskflow support of ingestion use cases. Taskflows now support:

  • Mass ingestion tasks: Provides the means to apply pre- or post-process business logic before or after ingesting data to or from a data lake.
  • Inline file listener tasks: Makes it possible to wait for the arrival of a file before proceeding to further processing steps.
  • Unique API name override: Customers can now override the unique API name assigned to taskflows. This makes it easier to put new versions in production without the need to update consumers.

Note: You can find a summary of taskflow features introduced in 2019 in the Taskflow folder of the Cloud Data Integration community and specifically in this article: Cloud Data Integration - Taskflows - 2019 Highlights

 

Cloud Data Quality

Cloud Data Quality continues its evolution of data quality services. New with the Spring 2020 release are:

  • Deduplicate asset:
    • Identity matching (single-source) functionality to compare records on a field-by-field basis and generate a numerical score that indicates the degree of similarity between the records.
    • Optional consolidation of the records that the deduplication process identifies as duplicates. Consolidation evaluates each set of duplicate records and creates a preferred version of the records in the set according to criteria that the user defines.
  • Parse asset: Parse discrete words or strings from an input field using regular expressions and/or dictionaries.
  • Rule specification asset: Additional function expressions are available in rule statements: Convert to Date, Convert to Float, Convert to Integer, Is Spaces, Substring, and To Char.
  • Data Profiling:
    • Perform What-If scenarios by profiling outputs of Data Quality assets: rule specification, verifier, and cleanse.
    • Extension of source support for Salesforce, Microsoft Common Data Model, and Google Big Query.
    • Auto-Assignment of Data Quality assets to source columns of Microsoft Common Data Model.
    • Ability to modify connection and source object from existing profiling tasks.
    • Option to export records that have data quality issues to a delimited file.
    • Integration of Data Profiling with the Operational Insights service.

 

Cloud Integration Hub

  • Ability to subscribe through the API to partially published data, allowing smaller amounts of data to be consumed per API subscription call.
  • Subscription API does not have a size limit when using a private publication repository.
  • Ability to create a synchronization task-based subscription from the Cloud Integration Hub subscription menu for better self-service.
  • Ability to create a topic based on an existing connection schema.
  • Explore functionality now supports folders and tags.
  • Usability enhancements for topics with topic-related publications and subscription tables and visual alignment across all pages.
  • Performance enhancements when using a private publication repository.
  • Onboard Cloud Integration Hub to Azure POD.

 

Intelligent Structure Discovery (ISD)

  • Ability to use the Structure Parser transformation in Data Integration with real-time connectors.
  • Ability to use the Structure Parser transformation in Data Integration in passthrough mode.
  • Ability to set the Structure Parser transformation in Data Integration for non-relational output (serialize to JSON, XML, Parquet, Avro, and ORC).

 

B2B Gateway

  • B2B Partners portal: Takes partner community management to the next level by empowering business partners. Partners can use the portal to track their file exchanges with the organization and to send and receive files to and from the organization over the secure HTTPS protocol.
  • SFTP server: The new SFTP service provides customers with the ability to manage and use SFTP servers to exchange files with partners.

 

API Gateway

  • OAuth 2.0 support: The API Gateway and the Informatica Intelligent Cloud Services platform are delivering a new authorization option for API access. Initially available to Cloud Application Service API consumers, OAuth 2.0 used in conjunction with client IDs and secrets extends the authorization choices that today include basic authentication and JSON Web Token (JWT)-based authentication.
  • Personal Identifiable Information (PII) privacy policies have been extended. The PII policy not only reports on incoming and outgoing PII transfer, but now also provides the ability to block requests and responses that contain sensitive PII data.

 

Cloud Application Integration

Spring 2020 April

This release includes the following updates:

  • Support for an Operational Insights Dashboard for Application Integration depicting API, process, and connector execution metrics.
  • Application Integration API endpoint authorization:
    • Application Integration can now make use of the API Gateway’s OAuth service. This extends support for OAuth 2.0-based authorization in addition to the current basic authentication and JSON Web Token (JWT) based authentication capabilities.
    • Application Integration can now restrict incoming requests to those routed through the API Gateway. This ensures that monitoring and policy enforcement are applied consistently at the gateway.

 

January 2020

The January 2020 release represents a major release of Application Integration. See the following link for information about the new features: Announcing the January 2020 release of Cloud Application Integration

Summarizing:

  • Making it easier to implement by creating process object schemas simply by importing WSDL, XSD, or Swagger interface documents from a file, a zip, or a URL.
  • Making it easier to debug:
    • To help you debug processes, the Assignment step now gives you access to field values, controlled by a process’s tracing level. To turn up tracing without redeploying a process, a new X-Debug HTTP header has been introduced (see the sketch after this list).
    • Tools such as Postman, SOAPUI, or RESTclient are great but require you to leave Process Designer to initiate a process. You can now create and associate with a process one or more JSON or XML process inputs and run a process with one or all inputs. You can then use the new process instance debugging capabilities to better analyze and identify the root cause of errors.
  • Making it easier to consume databases as fully-fledged REST-enabled APIs. Not only can you enable your database with OData v4 with a single click, you can now expose it as a fully-fledged REST API-enabled database. Just download the auto-generated Swagger interface and you’re good to go.
  • Making it easier for developers to consume JSON and XML content-type responses and work with attachments and multipart responses.
  • Unique API name override for processes and taskflows.
  • Making it possible for operators to restart processes from the Application Integration Console to recover from unhandled errors communicating with the end system.
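As a rough illustration of the debugging options above, the following sketch calls a deployed process over REST and raises tracing for that single request with the X-Debug header. The process URL, payload, header value, and basic authentication are illustrative assumptions; check the Application Integration documentation for the exact values your org expects.

    import requests

    # Hypothetical endpoint of a deployed Application Integration process.
    url = "https://na1.ai.dm-us.informaticacloud.com/active-bpel/rt/OrderLookup"

    payload = {"orderId": "12345"}    # example process input
    headers = {"X-Debug": "verbose"}  # assumed value; turns up tracing for this request only

    # Basic authentication shown for simplicity; JWT or OAuth 2.0 can be used where configured.
    resp = requests.post(url, json=payload, headers=headers,
                         auth=("my_iics_user", "my_password"))
    resp.raise_for_status()
    print(resp.json())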

 

Cloud Mass Ingestion Service

Enhanced capabilities for data ingestion from a variety of sources, using a simple and unified user experience with extensive connectivity, to address mass ingestion use cases for Cloud data warehouses and Cloud data lakes.

 

Mass Ingestion Databases

Continuing to build usability and resilience into the service while adding new functionality. New features include:

  • Schema drift support, which enables running database ingestion tasks to recognize when the schema of source tables within the task changes (column changes only) and to dynamically process the changes through to the CDC-supported target.
  • Asset import and export functionality for database ingestion tasks.
  • GitHub source control for database ingestion tasks.

 

Mass Ingestion Streaming

  • Continuing to enable ingestion from a variety of streaming sources with real-time monitoring and lifecycle management.
  • New streaming connectivity and enhancements, new streaming sources and targets:
    • New connectivity: Amazon Kinesis Streams source and Microsoft Azure Data Lake Storage Gen2 target.
    • Connectivity enhancements: Flat file source.
    • Test connection for Mass Ingestion Streaming connectors.
  • New transformations during ingestion:
    • Python transformation support
    • Splitter transformation support (technical preview).
  • Real-time monitoring and lifecycle management:
    • Real-time refresh of Mass Ingestion Streaming job statistics.
    • Stop and Resume support for Mass Ingestion Streaming tasks.
  • Enterprise readiness:
    • Sub-organization support to represent different business environments.
    • Integration with GitHub for source control on Mass Ingestion Streaming tasks.
    • Deployment of the Mass Ingestion Streaming service in Kubernetes for autoscaling and high availability.

 

Mass Ingestion Files

  • Continuing to enable mass ingestion of files from a variety of sources to cloud repositories with real-time monitoring and different scheduling capabilities.
  • New connectivity: Databricks Delta Lake as source and target.
  • Taskflow integration for Mass Ingestion Files tasks to support complex file ingestion flows that require orchestration of multiple tasks.

 

MDM - Reference 360

  • Delta export:
    • Retrieve values that have changed in a given time period.
  • Validation:
    • Configure attribute-level validation rules to be executed when creating or editing code values.
    • Receive validation errors on create and update of individual code values.
  • Improved loading of larger data sets:
    • Reduce the loading time for data sets and hierarchies by loading a subset of code values at a time.
  • Export enhancements:
    • Choose which attributes to export both from UI and API.
    • Export data in JSON format with the REST API.
  • Workflow email notifications:
    • Receive email confirmations at each step of an approval workflow.
    • Navigate to your task in Reference 360 through a link in the notification email.

 

IICS Platform

Runtime Continuity (more details here)

Runtime continuity (zero downtime for runtime) to enable uninterrupted execution of scheduled jobs and processes at all times including during Informatica Intelligent Cloud Services upgrade windows.

 

GitHub Integration

  • Undo checkout capability for administrators on behalf of any user and any asset.
  • Bulk unlink capability.
  • Multi-project pull capability to enable pull across multiple projects with cross-project dependencies.
  • Automatically deploy upstream dependent objects upon save to be consistent with the behavior of non-source-controlled Cloud Data Integration assets.
  • Inline Git actions in dependency view.

 

Export/import Environment Properties and Schedules

  • Ability to export/import Secure Agent configurations to automate agent provisioning with the option to either restore or tune agent configuration settings through export/import APIs. This enables users to tune runtime environment properties such as agent configurations.
  • Ability to export/import schedule objects across environments through export/import APIs and in the Informatica Intelligent Cloud Services user interface.

 

Asset Management

  • Ability to reassign asset ownership to enable uninterrupted job execution when the asset owners are disabled in the organization.

 

New PODs

  • Informatica Intelligent Cloud Services availability on the Japan POD (Azure).
  • Informatica Intelligent Cloud Services availability on the Google Cloud Platform (GCP) POD and Marketplace.
  • Informatica Intelligent Cloud Services availability in Canada region.

 

Operational Insights

  • Global availability of Operational Insights across all Amazon Web Services (AWS) PODs.
  • Generate alerts based on resource usage by individual agent services.
  • Generate alerts on disk utilization of Secure Agents.
  • Take preventive actions using custom scripts based on generated alerts.
  • In-app notification of infrastructure alert messages.

 

Ecosystems and Connectivity

  • Expanded coverage in terms of functional depth and breadth for cloud ecosystems:
    • Azure:
      • SQL Data Warehouse V3: Unconnected lookup, ADLS Gen2 as optional staging area for Polybase, improved exception handling, source/target parameterization overrides with parameter files, parameterization overrides (for schema, database, and table) in PRE SQL/POST SQL/SQL Override, and performance enhancements.
      • ADLS Gen2: Source/target parameterization overrides with parameter files,  Azure Gov Cloud, FileName port, New data type (Date/Time) support for Parquet files, Parquet Decimal, Date/Time data type support, User authenticated proxy, and performance enhancements.
      • Blob Storage: Source/target parameterization overrides with parameter files.
      • CDM Folders (available for preview only): Support for new CDM schema (v0.9) and unicode character support.
    • AWS:
      • S3 V2: Hierarchical data types, ISD, multi-region, Parquet Decimal, Date/Time datatype support, KMS (other accounts), source/target parameterization.
      • RedShift: Ecosystem pushdown optimization (S3 to RedShift), JDBC driver update, KMS (other accounts), multi-region, source/target parameterization.
    • Snowflake DW:
      • Database pushdown optimization enhancements.
      • Unconnected lookups with pushdown optimization.
      • Snowflake on Google Cloud Platform.
      • OAuth2.
      • gzip compression.
    • Google:
      • IICS on Google Cloud Platform.
      • BigQuery: CDC support, lookup.
      • Google Cloud Storage: Read from directory.
    • Microsoft Business Apps:
      • Common Data Model (CDM) folders: Schema update (0.9).
      • Dynamics 365 Operations:  Certificate, update retry.
      • CRM:  Client secrets.
    • SAP:
      • SAP BW Reader: Supports dates before the year 1753.
      • SAP HANA (new private connector): Read from tables and modelling views (analytical, attribute, and calculation views).
    • Salesforce:
      • Sales and Service Cloud: API updates,  DRS enhancements to support deleted records.
      • Commerce Cloud: Cycle dependencies.
    • Oracle Apps:
      • Oracle HCM:  Writer updates.
      • NetSuite: API update (2019.2).
    • Adobe:  XDM connector enhancements.
  • Support for new patterns and use cases:
    • Cloud Data Integration Elastic: 
      • Amazon S3 V2:  Hierarchical data type and ISD support.
      • Azure SQL Data Warehouse V3: Elastic mappings.
      • ADLS Gen2: Elastic mappings (available for preview only).
      • JDBC V2: Scala v2, partitioning, AWS and Azure runtime support.
    • Cloud Mass Ingestion:
      • Databricks Delta
      • ADLS Gen2: Direct Polybase load from ADLS Gen2 and performance improvements.
    • Cloud Data Quality: CDM
    • Kafka Pub/Sub Connector in Cloud Data Integration.
  • Improved connectivity across horizontal technology, systems, and applications:
    • Technology:
      • OAuth 2.0 JWT support.
      • REST V2:  Real-time transaction support, minor enhancements,  Hadoop 3.1 for Cloudera, 4.1 for HDInsights.
    • Database and EDW:
      • ODBC:  Unconnected lookup.
      • Oracle:  Blob/Clob datatype.
      • MySQL:  Advanced runtime properties.
      • SQL Server:  Advanced runtime properties.
      • Azure SQL DB: Bulk, Upsert.
      • PostgreSQL:  Schema name enhancements.
    • MongoDB:  BSON and JSON document support, partitioning, create target with schema-less support.
    • Workday:  Hierarchy parameterization support.
    • ServiceNow:  Upsert.
    • Cloud Apps:
      • Cvent:  Child object
      • Coupa: Proxy
  • New Add-on Connectors page to access add-on connectors on Marketplace.

Want to deploy code every 11.7 seconds? Transform your integration with Agile, DevOps, CI/CD

 

Deploy Faster, Reduce Outages, and Increase Revenue & Productivity

Tuesday, February 4, 2020 8:30am PST

Within a year of Amazon's move to AWS, engineers were deploying code every 11.7 seconds, on average. The agile approach also reduced both the number and duration of outages, resulting in increased revenue. Many organizations have moved from clunky, legacy software and systems to an agile-based DevOps approach that has yielded significant increase in development productivity.

Agile, DevOps, and CI/CD are three distinct and critical tools for today’s development organization. When all three are used for their intended purposes, the results are transformational. But what do these tools mean in the context of application and data integration?

Join this webinar to learn about leveraging Informatica’s cloud data integration and application integration services to support your CI/CD pipeline. You will learn how to:

 

  • Enable version control of application and data integration assets with any third-party version control system
  • Perform continuous integration and continuous delivery for application and data integrations
  • Promote DevOps for application and data integration

 

Featured Speakers:

  • Vivin Nath, Principal Product Manager, Informatica
  • Sorabh Agarwal, Principal Product Manager, Informatica

 



Click here to register

We are hosting a webinar with Kelly Services on September 4, 2019. Ravi Ginjupalli reviews the process of mastering four critical domains, starting with customer, and connecting Informatica MDM with Salesforce through IICS. The webinar is co-presented with Capgemini, and it also covers a Teradata to Azure migration at Kelly Services. Hear about the story, challenges, and lessons learned at Kelly Services as Ravi Ginjupalli and Venkat Gupta of Capgemini discuss the approach and technology behind their effort.

 

You can register for the webinar on Sept 4, 2019, at 11 AM PDT here: https://www.informatica.com/about-us/webinars.html?commid=367064

 

Kelly Services realized:

  • 99.99% accuracy in identifying duplicates using MDM match rules
  • 1,000 accounts processed per week on average (including both prospects and new clients)
  • 500K customer records loaded during the initial load
  • 18% of duplicate customer records in Salesforce eliminated
  • 6 months to deliver: Salesforce + D&B integration + hierarchies; 180 tables; 5,218 attributes; 288 mappings

Webinar: Meet the Experts: Deep-Dive, Demo, Roadmap - Informatica Cloud App/API Integration

 


Join Informatica product experts as they dive deep into the API and application integration capabilities for accelerating your digital transformation. You will learn:

 

  1. How to develop processes, APIs, and connect to any APIs without any coding
  2. What Intelligent APIs are and why Informatica is uniquely qualified to offer these?
  3. About management of integration artifacts and APIs
  4. The “ilities” (performance, scalability, reliability) of our platform
  5. The IaaS, SaaS, and on-prem partners we integrate with

ALERT: For customers using Informatica Cloud

 

To better serve our customers, we place older connectors and unused connectors in End-of-Life (EOL) or Maintenance Mode. If you need a new connector enabled, per the customer action below, please create a shipping case and request that we add the new connector to your org(s). The differences between Maintenance Mode and EOL are summarized in the table below:

 

End-of-Life (EOL)
  • Description: Connector is at end-of-life. Informatica will no longer support it: no bug fixes, no enhancements. There will be no automatic migrations or upgrades for existing work.
  • Bug fixes: No
  • Enhancements: No
  • Connector continues to work: No
  • Customer action: The connector will no longer work after the next release (~6 months from the initial announcement) and will no longer be available in your org. Please verify that you are not using the connector, or move mappings to an alternative connector, if available.

Maintenance Mode
  • Description: Connector is in maintenance mode. Informatica will no longer enhance it; bug fixes may be considered. There will be no automatic migrations or upgrades for existing work. You will need to apply the latest recommended connector and migrate your jobs to it.
  • Bug fixes: Yes
  • Enhancements: No
  • Connector continues to work: Yes
  • Customer action: Consider moving to an alternative connector, if available; the alternative connector will continue to be enhanced as necessary.

 

Am I impacted?

Refer to the list below to determine if you are using one of these connectors. As necessary, emails are sent to "Subscription" customers for these connectors in advance of EOL. For EOL connectors, customers are given 6 months' notice to address their existing mappings; maintenance mode connectors will move to EOL as appropriate.

 

How do I address the issue?

Please reference the Customer Action column in the table above, and the Notes to Customers and Alternative Connector columns in the table below.

 

The following table shows the connectors in end-of-life (EOL) or maintenance mode.

The table below is continually updated.

 

Nr | Data Source | Connector Name | EOL or Maintenance Mode | Alternative Connector | Notes to Customers
1 | Amazon QuickSight | Amazon QuickSight | EOL | None |
2 | Arc GIS | Arc GIS | EOL | None |
3 | Attensity Discovery Now | Attensity Discovery Now | EOL | None |
4 | Avature | Avature | EOL | Generic REST V2 / WS Connector |
5 | Birst | Birst | EOL | Birst Cloud Connect |
6 | Cloud File Transfer | Cloud File Transfer | EOL | None |
7 | Club Assistant | ClubAssistant | EOL | None |
8 | DataSift | DataSift | EOL | None |
9 | EPIC | EPIC | EOL | None |
10 | IDVExpress | IDVExpress | EOL | None |
11 | Informatica Data Prep | Informatica Data Prep | EOL | None |
12 | Informatica Rev | Informatica Rev | EOL | None |
13 | Intuit QuickBooks | Intuit Quickbooks | EOL | QuickBooks V2 |
14 | Intuit Quickbooks Online | Intuit Quickbooks Online | EOL | None |
15 | Magento | Magento | EOL | None |
16 | Marketo | Marketo 2 | EOL | Marketo V3 |
17 | Microsoft Dynamics AX | Microsoft Dynamics AX 2009 | EOL | None |
18 | Microsoft Dynamics GP | Microsoft Dynamics GP 2010 | EOL | None | New connector on roadmap.
19 | Oracle Netsuite | NetSuite (Restlet) Write only | EOL | NetSuite |
20 | Oracle Peoplesoft | Oracle Peoplesoft 9.x | EOL | Use generic REST V2 or WS Connector |
21 | Oracle Taleo Business Edition | Oracle Taleo Business Edition | EOL | Generic REST V2 / WS Connector |
22 | Oracle Taleo Enterprise Edition | Oracle Taleo Enterprise Edition | EOL | Generic REST V2 / WS Connector |
23 | Rapnet | Rapnet | EOL | None |
24 | Rave | Rave | EOL | None |
25 | Reltio | Reltio | EOL | None |
26 | Rev | Rev | EOL | None |
27 | Saaggita | Saaggita | EOL | None |
28 | Salesforce Insights | Salesforce Insights | EOL | None |
29 | Snowflake | Snowflake V1 Connector | EOL | Snowflake Cloud Data Warehouse |
30 | Snowflake | Snowflake Big Data Warehouse | EOL | Snowflake Cloud Data Warehouse |
31 | Sugar CRM | Sugar CRM | EOL | Sugar CRM REST |
32 | Tableau (Server) | Tableau V1 | EOL | Tableau V3 |
33 | Trackwise | Trackwise | EOL | None |
34 | Vindicia | Vindicia | EOL | None |
35 | Zoho | Zoho | EOL | Generic REST V2 / WS Connector |
36 | Amazon Aurora | Amazon Aurora | Maintenance Mode | MySQL |
37 | Amazon Dynamo DB | Amazon Dynamo DB | Maintenance Mode | None | New connector on roadmap.
38 | Anaplan | Anaplan | Maintenance Mode | Anaplan V2 |
39 | Apache HDFS | Hadoop Files | Maintenance Mode | Hadoop Files V2 |
40 | Apache Hive | Hadoop | Maintenance Mode | Hive Connector |
41 | Box | Box | Maintenance Mode | None | New connector on roadmap.
42 | Box | Box API | Maintenance Mode | None | New connector on roadmap.
43 | Chatter | Chatter | Maintenance Mode | None |
44 | Coupa | Coupa | Maintenance Mode | Coupa V2 |
45 | Dropbox | Dropbox | Maintenance Mode | None | New connector on roadmap.
46 | Eloqua | Eloqua (Soap) | Maintenance Mode | Eloqua Bulk, Eloqua REST |
47 | FileList | FileList | Maintenance Mode | Flat File |
48 | FileProcessor | FileProcessor | Maintenance Mode | File Mass Ingestion Services | This is an Active Connector
49 | Google API | Google API | Maintenance Mode | Google analytics |
50 | JDBC | JDBC | Maintenance Mode | JDBC V2, JDBC IC |
51 | LinkedIn | LinkedIn | Maintenance Mode | None | New connector on roadmap.
52 | Marketo | Marketo | Maintenance Mode | Marketo V3 |
53 | Marketo | Marketo REST | Maintenance Mode | Marketo V3 |
54 | MemSQL | MemSQL | Maintenance Mode | MemSQL V2 | Work with MemSQL for connector access
55 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V1 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new and updating existing mappings to use Blob Storage V3 connector. Note that the V3 connector does not support DSS yet. Support for DSS equivalent functionality with the V3 connector is planned to be available in 1H 2020
56 | Microsoft Azure Blob Storage | Microsoft Azure Blob Storage V2 | Maintenance Mode | Microsoft Azure Blob Storage V3 | Consider building new and updating existing mappings to use Blob Storage V3 connector. Note that the V3 connector does not support DSS yet. Support for DSS equivalent functionality with the V3 connector is planned to be available in 1H 2020
57 | Microsoft Azure Cosmos DB SQL API | Microsoft Azure Document DB | Maintenance Mode | Microsoft Azure Cosmos DB SQL API | Consider building new and updating existing mappings to use Cosmos DB SQL API connector. Note that the Cosmos DB SQL API connector does not support DSS yet. Support for DSS equivalent functionality with the Cosmos DB SQL API connector is planned to be available in 1H 2020
58 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V1 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new and updating existing mappings to use ADLS V3 connector. Note that the V3 connector does not support DSS yet. Support for DSS equivalent functionality with the V3 connector is planned to be available in 1H 2020
59 | Microsoft Azure Data Lake Store Gen1 | Microsoft Azure Data Lake Store V2 | Maintenance Mode | Microsoft Azure Data Lake Store V3 | Consider building new and updating existing mappings to use ADLS V3 connector. Note that the V3 connector does not support DSS yet. Support for DSS equivalent functionality with the V3 connector is planned to be available in 1H 2020
60 | Microsoft Azure SQL DW | Microsoft Azure SQL DW V1 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new and updating existing mappings to use SQL DW V3 connector. Note that the V3 connector does not support DSS yet. Support for DSS equivalent functionality with the V3 connector is planned to be available in 1H 2020
61 | Microsoft Azure SQL DW | Microsoft Azure SQL Data Warehouse V2 | Maintenance Mode | Microsoft Azure SQL Data Warehouse V3 | Consider building new and updating existing mappings to use SQL DW V3 connector. Note that the V3 connector does not support DSS yet. Support for DSS equivalent functionality with the V3 connector is planned to be available in 1H 2020
62 | Microsoft Dynamics AX | Microsoft Dynamics AX 2012 | Maintenance Mode | Microsoft Dynamics AX 2012 V3 |
63 | Microsoft Dynamics AX | Microsoft Dynamics AX v3 (supports AX 2012) | Maintenance Mode | None |
64 | Microsoft Dynamics NAV | Microsoft Dynamics NAV 2009 - 2013 | Maintenance Mode | None |
65 | Microsoft Excel | Excel | Maintenance Mode | Microsoft Excel |
66 | Oracle CPQ (BigMachines) | Oracle CPQ (BigMachines) | Maintenance Mode | REST V2 |
67 | Oracle EBS | Oracle EBS 12.x (Cloud only) | Maintenance Mode | Use generic REST V2 or WS Connector |
68 | Oracle EBS | Oracle InterfaceTable | Maintenance Mode | Use generic REST V2 or WS Connector |
69 | SAP Ariba | Ariba Hier | Maintenance Mode | Ariba V2 |
70 | SAP Concur | SAP Concur | Maintenance Mode | Concur V2 |
71 | SAP SuccessFactors | SAP SuccessFactors SOAP | Maintenance Mode | SAP SuccessFactors Odata |
72 | TFS | TFS | Maintenance Mode | Generic REST V2 / WS Connector |
73 | TM2 | TM2 | Maintenance Mode | None |
74 | Twitter | Twitter | Maintenance Mode | None | New connector on roadmap.
75 | WebServices - REST | REST | Maintenance Mode | REST V2 |
76 | WebServices - SOAP | SOAP WebServices | Maintenance Mode | Webservices Consumer Transform |
77 | Webservices V2 | Webservices V2 | Maintenance Mode | Webservices Consumer Transform |
78 | Workday | Workday | Maintenance Mode | Workday V2 |
79 | Zendesk | Zendesk | Maintenance Mode | Zendesk V2 |
80 | Zuora | Zoura (SOAP) | Maintenance Mode | Zuora REST V2, Zuora AQuA |
81 | XML Files | XML Source | EOL | Hierarchy Parser/Intelligent Structure Discovery Transformation |
82 | XML Files | XML Target | EOL | Hierarchy Builder transformation |
83 | JSON files | JSON target | EOL | Hierarchy Parser/Intelligent Structure Discovery Transformation |

 

Updates

  • 6/2019: added MemSQL and Eloqua
  • 7/2019: added Ariba
  • 11/2019: added BigMachines
  • 2/2020: added MSD AX v3, MSD NAV 2009-13
  • 5/2020: added Amazon Aurora, Hadoop Files, FileList, FileProcessor, JDBC
  • 6/2020: revised alternative connector for Intuit Quickbooks Online

The Informatica Intelligent Cloud Services (IICS) Winter 2019 release offers several new capabilities that address key data challenges that businesses are facing today. Highlights are listed below.

 

Data Integration

  • Data discovery in Cloud Data Integration with Enterprise Data Catalog (EDC) integration - Customers can now search and discover enterprise-wide metadata from within Data Integration, import connection & object metadata, and use that information to more easily create new or enrich existing mappings and tasks by connecting with an existing EDC installation.
  • “Smart match” recommendations for field mappings increases the frequency of field matches in mappings and tasks. Expanding on the existing automatch, smart match looks for common patterns in field names (prefixes, suffixes, abbreviations, etc.) based on six additional matchers and fuzzy match techniques for recommending field mappings.
  • Taskflows can be invoked via APIs for externalized scheduling and execution. With this enhancement, customers can now invoke taskflows on demand via an API call and provide input parameters for the tasks it orchestrates, allowing customers to fully leverage Data Integration’s parameterization capabilities (see the sketch after this list). Please refer to the Taskflow as a Service FAQ.
  • Taskflows have also been enhanced to allow them to embed other taskflows to promote reuse.
  • Change data capture has been expanded to include additional sources for DB2 on Linux, Unix, Windows, and iSeries (also known as AS400, i5/OS) platforms, which further enables near real-time changed data propagation capabilities.
  • Mass ingestion is extending connector support, adding Google Storage & Google Big Query as targets and HDFS as both a source and target. Additional enhancements expose CRUD-focused APIs.
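As an illustration of on-demand taskflow invocation, the sketch below posts input parameters to a published taskflow's service URL. The base URL, API name, parameter names, and basic authentication are assumptions for illustration; the Taskflow as a Service FAQ referenced above documents the exact endpoint and payload for your org.

    import requests

    # Hypothetical service URL of a taskflow published with the API name "DailyLoad".
    url = "https://na1.dm-us.informaticacloud.com/active-bpel/rt/DailyLoad"

    # Example input parameters defined on the taskflow (names are illustrative).
    params = {"LoadDate": "2019-02-01", "Region": "EMEA"}

    resp = requests.post(url, json=params, auth=("my_iics_user", "my_password"))
    resp.raise_for_status()
    print("Taskflow started:", resp.status_code, resp.text)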

 

API and Application Integration

  • Support for Kafka Messaging – Messaging is at the core of many publish-subscribe (Pub/Sub) based applications as a means of decoupling producers and consumers of data.  The addition of Kafka for application integration significantly increases current message-based Pub/Sub interactions between data and applications that today are fulfilled using JMS, AMQP, Amazon SNS/SQS, and Azure Service Bus based “topics.” The ability to bridge these message-based events with the Cloud Integration Hub Pub/Sub style of data propagation provides additional integration pattern options making Informatica unique in the flexibility and capabilities it provides for its customers.
  • JSON Web Token (JWT) based authentication – The API and Application Integration services now support JSON Web Token (JWT) based authentication, an open standard (RFC 7519) that defines a compact and self-contained way for securely transmitting information between API consumers and REST web services. This provides IICS users that use API Manager with another and more secure means of API authentication.
  • API grouping – To better manage the use of JWT tokens and associate their use to multiple API endpoints, a new API “application” grouping capability is being introduced in API Manager. This capability will provide API consumers with fewer tokens to deal with, and API producers will now more easily manage or revoke a consumer’s access to multiple APIs.
  • Japanese language support for the API and Application Integration services – In addition to Japanese language support for the Data Integration service, Japanese customers now have access to the API and Application Integration services user interface and documentation in Japanese.
  • REST and SOAP Service API-based “service connectors” – distributed via GitHub.

Today, 55% of Application Integration customers' connectivity needs are met using service connectors. A service connector allows customers to define REST (XML/JSON), JSON/RPC, or SOAP service integration using a simple web form, with no coding required. The Application Integration service takes care of the rest. If the service offers a WSDL or Swagger interface document, the service connector can be automatically generated by importing the interface document. By creating service connectors, customers can import and configure pre-built business and data service definitions as reusable assets that they can share and/or move from one environment to another.

This capability, unique to Informatica, provides customers with unparalleled value. Service connectors avoid lock-in and the inability to make updates when you need them, so you can take advantage of new capabilities or the built-in extensibility that an API platform can offer.

To provide this flexibility to customers and encourage community contribution by customers, partners, and other practitioners, Informatica is establishing a GitHub repository where it will publish the service connectors it has created and which it will share with its customers and partners. Customers and partners are free to use these definitions without restriction, including the rights to use, copy, modify, merge, publish, and distribute these under an MIT license. Informatica will also encourage contributions back to the community. Our goal is simple: drive innovation and reduce perceived barriers to adoption.

 

 

Integration Hub

  • Informatica has improved the search experience for Hub events and support for the CLOB data type on topics.

 

B2B Gateway

  • ICS B2B Gateway customers will be migrated to the IICS platform as part of the release upgrade and will benefit from all IICS platform capabilities.

 

Intelligent Structure Discovery

  • Intelligent Structure Discovery expanded its parsing capabilities to handle ORC format files and Excel multiple sheet files. The user can now design the structure model based on multiple sheet structures and then use the model at run time to parse Excel files in their entirety.
  • With R31, a Structure Parser transformation can be positioned mid-stream to enable a more flexible mapping usage and chaining. In addition, the Intelligent Structure Model detected datatypes are now propagated to the Structure Parser output ports.
  • The ISD design time user interface is enhanced with a "find" functionality which allows the user to search for a specific string in the discovered tree fields and get a list of results showing the path and visually correlated with the model representation. The user can also perform actions on multiple elements chosen from the result list such as include, exclude, replace, and even change of element type. The ability to perform actions on multiple elements significantly improves the usability and productivity.
  • A new vertical folder view mode will be available in R31 for handling complex hierarchy files.

 

IICS Platform

  • Common Explore productivity enhancements – Improved copy functionality with overwrite & rename conflict resolution options to copy assets within and across folders. “Created by” and “last updated by” attributes as columns for all asset types in the common Explore page.
  • Export/import capability for sub-organizations which enables asset migration across environments that use an organization hierarchy. More control and flexibility with enable/disable checksum validation options during export and import.
  • Improved export/import error logging along with the ability to access and download export/import logs through the UI and the API.
  • API to search, list, and filter assets in projects and folders using a variety of conditions such as timestamp, location, “last updated by,” and tags. This API can also be leveraged along with export APIs to export objects (see the sketch after this list).
  • Improvements to the RunAJob utility – Support for projects and folders to invoke tasks by task name.
  • Usability improvements – Ability to copy table cell data in the common Explore page, Monitor service, and Administrator service for use in other user interfaces like search boxes and filter conditions for easier task completion.
  • Search capability for connections and job monitoring to quickly and easily find needed information.
  • Ability to enable and disable a service for agents in a Secure Agent group to effectively deploy workloads and efficiently utilize computing resources.
  • Secure Agent registration using tokens (instead of passwords) for increased security and enabling SAML single sign-on.
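Below is a minimal sketch of the asset listing API mentioned above, assuming the IICS REST API v3 login endpoint and session-id header. The base URL, response field names, and query syntax are assumptions for illustration, so verify them against the REST API reference for your POD.

    import requests

    LOGIN_URL = "https://dm-us.informaticacloud.com/saas/public/core/v3/login"

    # 1. Log in to obtain a session id and the base API URL for your org.
    login = requests.post(LOGIN_URL, json={"username": "my_iics_user",
                                           "password": "my_password"})
    login.raise_for_status()
    body = login.json()
    session_id = body["userInfo"]["sessionId"]      # assumed response shape
    base_url = body["products"][0]["baseApiUrl"]    # assumed response shape

    headers = {"INFA-SESSION-ID": session_id}

    # 2. List assets in a folder, filtered by tag (query expression is illustrative).
    query = "location=='MyProject/MyFolder' and tag=='release-2019'"
    resp = requests.get(f"{base_url}/public/core/v3/objects",
                        params={"q": query, "limit": 50}, headers=headers)
    resp.raise_for_status()
    for obj in resp.json().get("objects", []):
        print(obj.get("path"), obj.get("type"), obj.get("updateTime"))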

 

Operational Insights 

  • Operational Insights extends support to the on-premises Data Quality product, in addition to BDM and PowerCenter, with capabilities such as domain health, job run analytics, resource utilization, and alerts.
  • Click-through analytics journey from cross-domain to the individual job level and enhancements to job run analytics for on-premises products (PowerCenter, BDM). Plus, enhancements to job run analytics to report on data loaded and data processed.
  • PowerCenter & BDM infrastructure email alert enhancements, such as alerts for Secure Agent unavailability and Operational Insights data collection failures.

 

Connectivity Enhancements

New AWS Quickstart for Cloud Analytics Modernization, an end-to-end solution for self-service cloud analytics with Informatica (IICS, Enterprise Data Catalog), Tableau Server, and AWS Services.

Several new connectors and enhancements to existing connectors across ecosystems as listed below. New connectors introduced are highlighted in bold:

  • Azure: ADLS Gen 2 Preview, Azure DW V3, Azure Data Lake Store V3, Azure Blob V3 
  • Google: Google Cloud Storage V2, Google Analytics, Google Big Query V2, Google Big Query 
  • Amazon: Amazon S3 V2, Amazon Aurora, Amazon Redshift V2 
  • Salesforce: Salesforce Marketing Cloud (SFMC), SFDC (Sales and Service) 
  • SAP: SAP Connector, SAP HANA Cloud Platform (DB) 
  • Adobe: Adobe Cloud Platform 
  • Analytics: CDM Folders connector preview, Tableau V3, Tableau V2 
  • Databases: MySQL Relational, Hive, Greenplum, DashDB, Snowflake 
  • Tech: REST V2, WSconsumer, Complex File Processor 
  • Microsoft Apps: Microsoft SharePoint Online 
  • Oracle: Oracle Netsuite V1, Oracle Relational 

 

Summary of some of the connectivity enhancements are as follows:

AWS

  • Supporting file names longer than 250 characters with S3
  • Support for custom JDBC URL for Redshift
  • Support for ORC files with S3

Snowflake

  • Custom Query metadata fetching without having to run the query

Google

  • Custom Query Support for Google Big Query V2 connector
  • Pushdown support for Google Big Query through ODBC
  • Google Analytics - Enhancement to fetch fields based on multiple Views IDs from GA
  • Google Big Query Mass Ingestion - Direct load Cloud Storage->Big Query

Azure

  • Preview of ADLS Gen2 connector: support create target, configurable escape character and text qualifier in R/W scenarios, create and rename directory, rename file, header-less files, support RBAC for all types of AAD Authentication, append data to an existing file, support parameterization
  • Azure DW V3: support TARGET NAME OVERRIDE and TARGET SCHEMA NAME OVERRIDE in the writer, support SOURCE NAME OVERRIDE and SOURCE SCHEMA OVERRIDE with the reader, support custom query and multiple objects in CMD and MCT

 

Microsoft Apps

  • Microsoft SharePoint Online: support for agents running on Linux

 

Analytics

  • Preview of CDM Folders connector: new connector, with the ability to write to ADLS Gen 2 in CDM format, and then access the data from Power BI as a dataflow
  • Tableau V2: upgrade connector to the latest Tableau SDK

Databases & DW

  • MySQL Relational: array insert support
  • Greenplum: native reader and writer

 

NetSuite V1: address 2-factor authentication

 

Salesforce

  • SFDC Sales, Service connector: support latest SFDC API
  • Salesforce Marketing Cloud SFMC: insert/update operation for “Non Contact Linked Data Extensions”

Here is an orientation video for the upcoming Informatica Cloud to Informatica Intelligent Cloud Services (IICS) migration process. The video provides an introduction to IICS for customers that are migrating from Informatica Cloud.

 

We will be migrating organizations between April and August. You will receive notification when YOUR organization(s) is going to be migrated and when your sandbox environment (pre-release) is available.

 

Video link (moved to YouTube so that it doesn't have to be downloaded):

 

Introducing Informatica Intelligent Cloud Services - YouTube

 

Full FAQ:

 

Informatica Intelligent Cloud Services Migration FAQ
