Authors: EC63039

Cloud Data Integration

7 Posts authored by: EC63039

Guest Blog Written by Richard Seroter:

CRM growth – especially in the cloud – shows no signs of slowing down. Salesforce continues to register 20%+ annual growth, and Microsoft Dynamics CRM recently crossed $1 billion in annual revenue. Now more than ever, integration of cloud systems is critical to business success. You simply cannot afford to have mission-critical, isolated systems that don’t share data with the rest of the enterprise environment.


This topic is the focus of an upcoming event held at the Microsoft office in London. The Hybrid Organization is a one-day event for architects and developers who want to learn about the best techniques for integrating systems through the cloud. While this is a Microsoft-oriented event, Informatica Cloud will be part of the showcase demonstrations. My first talk is focused on integrating cloud CRM platforms, and I’ll show the audience how to use Informatica Cloud to easily connect Salesforce and Dynamics CRM Online to on-premises systems. And for fun, I’ll also show off how easy it is for Salesforce and Dynamics CRM Online to communicate with each other! In my second talk, I’ll walk through the patterns, technologies and trends in cloud integration and cover how products like Informatica Cloud are a critical part of the cloud architect’s toolbelt. Microsoft has a fantastic set of cloud integration tools, but nothing quite like what Informatica has to offer. The pragmatic architect must look at technologies across the vendor landscape and make sure to use the best tool for the given situation.


If you are in the UK on September 11th and interested in hearing about the most exciting technologies in cloud integration, join me at the Hybrid Organization event!


Richard Seroter is a senior product manager for cloud software company Tier 3, a trainer for Pluralsight, Microsoft MVP, editor, blogger, author, and speaker. You can find Richard on Twitter as @rseroter.

Since 2000, Informatica has presented Innovation Awards to data integration professionals who have demonstrated leadership in using Informatica solutions to drive innovation and business success.

Informatica announced the winners in each category, and the overall best-of-the-best winner as part of Informatica’s 13th annual user conference, Informatica World 2013. Drawn from more than 5,000 Informatica customers, Innovation Award finalists are companies and organizations that are using the Informatica Platform to drive exemplary data integration/data management results and business outcomes. These include delivering new revenues at lower cost, forging stronger and more profitable customer relationships, and ensuring competitive differentiation for the business.


The 2013 Innovation Awards’ categories and finalists include:

  • Application Information Lifecycle Management


- Avago Technologies, Cox Communications*, Ochsner Medical Center


  • B2B Data Exchange


- HMS*, LexisNexis*, US Transportation Command with Lockheed Martin, Vale


  • Cloud Integration


- Intuit, Ultimate Software*


  • Enterprise Data Integration


- Liberty Mutual, Pacific Life, Quintiles*


  • Real Time Data Integration


- BC Hydro, UMass Memorial Health Care*


  • Test Data Management/Data Masking


- Cognizant*, EVERTEC with Stk Puerto Rico, The Hartford


  • Data Quality


- Datasus, Ruukki, Tani*, State of Colorado


  • Master Data Management (MDM)


- Logitech, Ruukki*, State of Colorado


  • Informatica Marketplace


- Kelley Blue Book – Lunexa Web Analytics Integration Package from Marketplace, Partner Lunexa; Global Financial Services organization – GenInfa 2 from Marketplace Partner The ETL Factory*


  • Megatrend: Hybrid IT


- Blue Shield CA with Model Metrics*, Canon Australia, Actelion


  • Megatrend: Big Data


- Kelley Blue Book, LexisNexis, PayPal*, State of Colorado



Congratulations to the Informatica Innovation Award 2013 nominees and winners!

Guest Blog Written by Ivan Belal

Informatica World is fast approaching, and I am excited to present in the Hybrid IT track session: Hybrid Integration: Interoperability of Informatica Cloud & PowerCenter.


When we first started our journey around finding an integration solution for Salesforce, we didn’t set out to find a hybrid solution; we wanted to find the best solution for our needs. We evaluated a variety of solutions – all on-premises, all cloud, hand-coded, etc. – but chose Informatica’s hybrid integration solution of Informatica Cloud & PowerCenter for four reasons:


  • Total Cost of Ownership (TCO)
  • Security and Compliance – this is huge since we are in the Insurance industry
  • Breadth of Functionality
  • Resource Constraints


Learn more about how we implemented Cloud & PowerCenter and what use cases we are addressing at Informatica World. My session will also feature two other Informatica Cloud customers, from Actelion Pharmaceuticals and Brocade.


I look forward to interacting with my peers and the Informatica team in June. If you haven’t already, register for Informatica World here – you don’t want to miss out!


About Ivan Belal: Developer Analyst at Legg Mason, Inc., a diversified global asset management firm serving individual and institutional investors for over a century. I have been with asset management firms for over 15 years, primarily supporting technology needs for distribution. My most recent endeavors have been migrating legacy CRM systems to SaaS applications like Salesforce and developing solutions to integrate between the two. I try to stay fit; however, going to the gym is a significant commitment.

Guest Blog Written by Daniel Sisson


There are several best practices associated with integration of Salesforce with other applications. The following tips will help you to have a successful implementation project:

  • Consider your key data elements that are necessary for your Salesforce application. These elements might include your customer master, product master, order or transaction information, target market segments or prospect profile types.
  • Determine the source of the data elements. Make sure you consider what source is the “one truth” in your company. For example, should your customer master be your accounts receivable database, or should it be your sales or marketing database? This can be determined by level of accuracy, frequency of updates, or a combination.
  • Review the data management requirements that should be in place with the integration. This could include data replication, business intelligence, analytics, data cleansing or validation practices, and the overall process flow. For example, if someone using Salesforce discovers an error in a customer phone number, how should it be corrected?
  • Use standard integration applications. Whenever possible, avoid writing customized code for your integration. Instead look for powerful products such as Informatica Cloud to build a solid foundation that maintains data integrity while optimizing the integration process.
  • Initiate quality assurance (QA) as a critical component of your integration. Sometimes the source data systems are built around decades of legacy policies and procedures. These older processes could have left gaps in your data integrity. With a good QA process in place, issues can be identified and resolved before they become a problem in your Salesforce integration.
  • Audits and logging are crucial. Maintain a very detailed and robust logging process that includes error notification capabilities and auditing reports. This includes logging for automated jobs as well as user-executed transactions.
  • Review your security policies with all integrated systems to make sure there is consistency and compliance in all areas.
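The QA tip above can start small: a pre-load script that scans the outbound file for obvious problems before it ever reaches Salesforce. Here is a minimal sketch in Python – the Email/Phone column names and the deliberately loose phone pattern are illustrative assumptions, not part of any particular implementation:

```python
import csv
import io
import re

def qa_check_contacts(csv_text):
    """Run simple pre-load QA checks on contact rows before a Salesforce sync.

    Returns a list of (row_number, message) issues; an empty list means
    the file is clean enough to load.
    """
    issues = []
    reader = csv.DictReader(io.StringIO(csv_text))
    for line_no, row in enumerate(reader, start=2):  # line 1 is the header
        if not row.get("Email") or "@" not in row["Email"]:
            issues.append((line_no, "missing or malformed Email"))
        phone = row.get("Phone", "")
        if phone and not re.fullmatch(r"[\d\-\+\(\) ]{7,20}", phone):
            issues.append((line_no, "suspicious Phone value"))
    return issues

sample = (
    "Email,Phone\n"
    "ann@example.com,555-0100\n"
    ",555-0101\n"
    "bob@example.com,bad-phone!!\n"
)
print(qa_check_contacts(sample))
# → [(3, 'missing or malformed Email'), (4, 'suspicious Phone value')]
```

In a real pipeline the issue list would feed the logging and error-notification process described in the next tip, rather than just being printed.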


Your Salesforce implementation is a worthwhile investment. Take it to an even higher level of performance with integration to your back office systems while using industry best practices.


Daniel Sisson is Marketing and Business Development Director at Mansa Systems, an SI and ISV that specializes in implementing, integrating, and developing for Salesforce. Follow Mansa: @mansasys

Last week, the Informatica Cloud Team was at Salesforce Customer Company Tour Chicago. Although it was pouring and there were 30 mph winds, there was a good turnout at the event - this just shows you how loyal Salesforce customers are! We had a lively reception with our partner, Cloud Sherpas, and CipherCloud.


Next week, the Informatica Cloud Team will be at the Salesforce Customer Company Tour London. We are at Booth #120 - swing by to learn about our Salesforce Cloud Data Management solutions. Since Chicago was such a blast, we will have an encore reception with Cloud Sherpas and CipherCloud. RSVP here.


If you haven't already registered for Salesforce Customer Company Tour London, do so here.


To learn more about Informatica Cloud @ Salesforce Customer Company Tour, visit this page.


See you at an upcoming Salesforce event!

Guest Blog Written By Terry Luschen

Originally posted here:

Informatica Cloud gives you a great platform for building integrations between platforms, databases and applications.  I currently work mostly with Salesforce, and Informatica Cloud has a built-in connector to Salesforce that makes it a great choice for me.  This connector can be used to extract, update or delete Salesforce data.  The data format used is your typical csv file.  So, if you are used to interacting with the Apex Data Loader, the concepts will seem familiar.


With Informatica Cloud you set up ‘Connections’ to your Salesforce environment and a directory where your csv file will live.  From there you create a ‘Data Synchronization Task’ under the ‘Applications’, ‘Data Synchronization’ menu item.  If you have a group of tasks that need to run one after the other, you can group them into a ‘Task Flow’, found under the ‘Applications’, ‘Task Flows’ menu item.


This is already an amazing amount of functionality out of the box.  But what if you need some special functionality, like doing an FTP transfer after you have downloaded some data from Salesforce?  Informatica Cloud does not have that built in, but it has a very easy way to hook in added functionality.  All you need to do is build an exe and place it in a path that can be accessed by the ‘Secure Agent’, the local application that does the work for Informatica.  All configuration is done in the ‘cloud’, but the actual work is done by the ‘Secure Agent’ running on your local computer.




Now that we have our exe, we simply need to add it to the ‘Postprocessing Commands’ text box on the ‘6. Schedule’ tab, under the ‘Advanced Options’ section.  An example of a post-processing command may look like this…


C:\InformaticaExes\FileProcessing\FTP_Me.exe C:\InformaticaExes\FTPCredentials.txt C:\InformaticaExes\FTP_File.csv


So in this example we have FTP_Me.exe, which will run and accept two parameters…

1) The first parameter is C:\InformaticaExes\FTPCredentials.txt, which contains the credentials and connection information for the FTP site.  You can even encrypt this file or the string within it so it is not plainly viewable on the Windows box.

2) The second parameter is C:\InformaticaExes\FTP_File.csv, which is the name of the file to FTP.
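To make the two-parameter contract concrete, here is a minimal Python sketch of an FTP_Me-style script (the author's actual exe is .NET; the key=value layout of the credentials file is my assumption, not his format):

```python
import sys
from ftplib import FTP

def parse_credentials(text):
    """Parse a simple key=value credentials file (assumed format:
    host=..., user=..., password=..., one entry per line)."""
    creds = {}
    for line in text.splitlines():
        line = line.strip()
        if line and "=" in line:
            key, _, value = line.partition("=")
            creds[key.strip()] = value.strip()
    return creds

def ftp_upload(creds, file_path):
    """Upload one file to the FTP site described by creds."""
    with FTP(creds["host"]) as ftp:
        ftp.login(creds["user"], creds["password"])
        with open(file_path, "rb") as f:
            # store under the bare file name, stripping any Windows path
            ftp.storbinary("STOR " + file_path.split("\\")[-1], f)

if __name__ == "__main__" and len(sys.argv) >= 3:
    # Mirrors the contract above: argv[1] = credentials file,
    # argv[2] = file to upload.
    with open(sys.argv[1]) as f:
        creds = parse_credentials(f.read())
    ftp_upload(creds, sys.argv[2])
```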


Now we are all set!  After our file is downloaded, the exe will be kicked off and our file will be FTPed as needed.


This can easily be extended in so many ways.  Maybe before the file is FTPed you need to modify the contents of the csv file by removing the headers and deleting the ID column (the first column).  Those types of things are easy to do with a .NET exe, and it is just as easy to add that logic to your existing exe – or to another one in a separate post-processing command.  I would suggest keeping each exe to a very specific task; that will make them easier to test and manage.
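That header-and-ID cleanup step only takes a few lines; here is a sketch in Python rather than .NET, with illustrative column names:

```python
import csv
import io

def strip_header_and_id(csv_text):
    """Remove the header row and drop the first (ID) column from CSV text,
    mimicking the pre-FTP cleanup step described above."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in rows[1:]:          # skip the header row
        writer.writerow(row[1:])  # drop the leading ID column
    return out.getvalue()

sample = "Id,Name,Phone\n001A,Acme,555-0100\n001B,Globex,555-0101\n"
print(strip_header_and_id(sample))
# → Acme,555-0100
#   Globex,555-0101
```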


How have you extended Informatica Cloud?  I would love to hear about it!


Let’s get back to integrating our Cloud!


Terry Luschen: Systems Architect / Senior Software Engineer at Sundog. Specializes in Cloud integrations with Salesforce and other technologies. Salesforce Certifications: Advanced Developer, Sales Cloud and Service Cloud. Find me @TerryLuschen

Guest Blog Written by Richard Seroter

Last week I had the pleasure of visiting the Salesforce Developer User Group in San Francisco, where I delivered a presentation called Using the Salesforce Integration APIs. This group meets regularly to discuss timely development topics related to Salesforce applications and the platform. In my presentation, I reviewed the comprehensive set of APIs offered by Salesforce, when to use each, and how software providers like Informatica are using these APIs to create powerful software products.


What kind of APIs does Salesforce provide to developers? First, there are three options for interacting with Salesforce from outside systems.


•    SOAP services. SOAP is the classic web services protocol for exchanging XML messages between systems. The SOAP API is a great fit for quick development or integration with existing tools that know how to invoke SOAP-based services. The SOAP API has a rich set of operations for querying, creating, and modifying data or application settings.


•    REST services. RESTful services are all about using meaningful URIs and a common interface (e.g. HTTP POST, GET, PUT, DELETE) to interact with representations of data resources. The REST API gives developers the ability to interact with standard or custom objects represented as XML or JSON objects. This is a great fit for mobile applications or modern web applications that embrace JSON and prefer the REST-style of web services.
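As a rough illustration of the REST style, the sketch below composes a query request URL and consumes the JSON response shape; the instance URL, API version, and sample response are placeholders, and a real call would also send the session token in an Authorization header:

```python
import json

# Hypothetical instance and API version for illustration only.
INSTANCE = "https://na1.salesforce.com"
QUERY = "SELECT Id, Name FROM Account LIMIT 2"
url = INSTANCE + "/services/data/v28.0/query?q=" + QUERY.replace(" ", "+")

# Shape of the JSON the query endpoint returns (a sample, not a live response):
sample_response = """{
  "totalSize": 2,
  "done": true,
  "records": [
    {"Id": "001A", "Name": "Acme"},
    {"Id": "001B", "Name": "Globex"}
  ]
}"""

payload = json.loads(sample_response)
names = [r["Name"] for r in payload["records"]]
print(names)
# → ['Acme', 'Globex']
```

The same records-plus-done envelope makes paging straightforward: when `done` is false, the response carries a URL for the next batch of records.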


•    Batch processing. Both the SOAP and REST APIs work best with bite-sized messages as opposed to massive data sets. However –  as I’m sure the folks at Informatica can tell you – there are plenty of reasons to transfer large blocks of data between systems! The Batch API lets you upload XML or CSV data files as “jobs” that are processed in parallel on Salesforce’s servers. Choose this API when performing data migration activities or irregular data synchronizations between systems.
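For a feel of the batch flow, the sketch below builds the XML body that opens a CSV insert job and the plain CSV that would then be uploaded as a batch against it; treat this as an illustration of the payload shapes, not a working client:

```python
import xml.etree.ElementTree as ET

# XML body that opens a Bulk API "job" (CSV insert into Account).
# Element names follow the async-API namespace; illustrative only.
job_request = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Account</object>
  <contentType>CSV</contentType>
</jobInfo>"""

# The batch uploaded against that job is just plain CSV row data:
batch_csv = "Name,Phone\nAcme,555-0100\nGlobex,555-0101\n"

# Sanity-check the job request is well-formed and says what we think:
root = ET.fromstring(job_request)
ns = "{http://www.force.com/2009/06/asyncapi/dataload}"
print(root.find(ns + "operation").text, root.find(ns + "object").text)
# → insert Account
```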





In addition to the host of inbound integration options above, Salesforce gives developers outbound interfaces for connecting to remote endpoints.


•    Message streaming. One way to detect changes in a system is to constantly poll it and try and find out what’s new. The Streaming API is a push-based channel for sending data events out of Salesforce in real time. Once a developer defines a “PushTopic” that specifies the condition they are interested in, the consuming application can connect to that PushTopic and listen for changes. As soon as that condition is met in the Salesforce system (e.g. “New Account Added in California”), a message is instantly sent to all subscribers to that topic. This is an intriguing option for organizations who want a real-time feed of information from Salesforce and have applications behind the firewall that can connect to the Streaming API.
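Because a PushTopic is itself just a record, defining one amounts to creating a row with a name and a SOQL query. A sketch of the JSON body you might POST to the REST sobjects endpoint to define the “New Account Added in California” topic (field values are illustrative; field names follow the Streaming API docs):

```python
import json

# Assumed topic definition for the example condition above.
push_topic = {
    "Name": "NewCaliforniaAccounts",
    "Query": "SELECT Id, Name FROM Account WHERE BillingState = 'CA'",
    "ApiVersion": 28.0,
    "NotifyForOperations": "All",     # fire on creates and updates
    "NotifyForFields": "Referenced",  # only when queried fields change
}
body = json.dumps(push_topic)
print(json.loads(body)["Name"])
# → NewCaliforniaAccounts
```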


•    Outbound Messaging. Much like the Streaming API, Outbound Messaging is a powerful way to do event-driven integration with Salesforce. Whenever a user-created workflow condition is met, a SOAP message is sent to a listening web service. Unlike the Streaming API, which only delivers messages once to online subscribers, the Outbound Messaging infrastructure can queue failed messages and try to deliver them later when the listener is available.
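On the receiving side, the listener’s main obligation is to return a SOAP acknowledgement so Salesforce marks the message delivered rather than queuing it for retry. A sketch of that ack envelope, based on the documented Outbound Messaging response format:

```python
import xml.etree.ElementTree as ET

def outbound_message_ack():
    """Build the SOAP acknowledgement a listener returns so Salesforce
    considers the outbound message delivered (without it, the message
    stays queued and is retried later)."""
    return (
        '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">'
        "<soapenv:Body>"
        '<notificationsResponse xmlns="http://soap.sforce.com/2005/09/outbound">'
        "<Ack>true</Ack>"
        "</notificationsResponse>"
        "</soapenv:Body>"
        "</soapenv:Envelope>"
    )

# Quick well-formedness check: Envelope → Body → notificationsResponse → Ack
root = ET.fromstring(outbound_message_ack())
print(root[0][0][0].text)
# → true
```

In practice this string would be the HTTP response body of whatever web framework hosts the listener endpoint.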




All of these diverse integration endpoints make it possible for software companies like Informatica to create compelling products that interface with Salesforce applications. In fact, Informatica provides multiple ways to integrate with Salesforce. First, it can act as a listener for Outbound Messages. When using Salesforce as a source in an Informatica Cloud data synchronization task, you can choose to “run this task in real-time” as a response to receiving an Outbound Message. Imagine building an Informatica flow where, as soon as a record changed in Salesforce, the data was instantly transferred to an internal database for processing.


The demonstration of Informatica Cloud that I presented to the User Group was taking Salesforce data and regularly transferring it to another cloud CRM system, Microsoft Dynamics CRM Online. Because of the handy library of Informatica cloud connectors, it’s a snap to quickly build enterprise-class ETL operations between SaaS systems. In this demonstration, I showed how easy it was to use the Informatica Cloud Data Synchronization Task Wizard to connect to Salesforce, connect to Dynamics CRM Online, filter the data, map the fields between the two systems, and schedule the ETL. After executing the ETL and showing how simple this really was, one User Group attendee jumped on Twitter to proclaim his readiness to try out the platform!




This was a fun group to present to, as they regularly face the challenge of connecting Salesforce applications to other systems used by their customers. While the APIs provided by Salesforce give developers plenty of choices for integration, using them requires a fair amount of knowledge and custom code. However, tools like Informatica Cloud simplify the process by offering an easy-to-use web interface for creating robust and reliable data integration flows. I’ve been a fan of this platform for a while and even taught a video-on-demand course that highlights its role in a cloud architect’s tool belt. It’s critical to understand the marketplace so that we can make the best possible build vs. buy decision when integrating the cloud!


Richard Seroter is a senior product manager for cloud software company Tier 3, a trainer for Pluralsight, Microsoft MVP, editor, blogger, author, and speaker. You can find Richard on Twitter as @rseroter.
