
We are excited to announce the Informatica Cloud Winter ’16 Pre-Release.

 

Before the release, we would like our customers to try some of the new features and share their feedback. The pre-release began TODAY, Monday, January 11, and will end on Friday, January 29. You can submit any questions or concerns to pre-release@informatica.com. For any technical issues, please create a support case through the normal channels.

 

For more information on the pre-release process, login URL, and best practices, please read this community post: Informatica Cloud Winter '16 Pre-Release Program

 



We strongly encourage our customers to participate in the pre-release and provide feedback.

Informatica Network FAQs

Posted by Vidya Nov 27, 2015

The new Informatica Network will be available starting December 7, 2015. Following is a list of FAQs to get you started.

 

GENERAL


What is Informatica Network?
Informatica Network is a unified community that provides a healthy ecosystem for our customers to connect with their peers, Informatica experts, and the broader community, accelerating their learning, deployment, and adoption of Informatica products through any of the interaction channels.


What has changed? And what happens to the current sites?
To cultivate a customer-centric approach, we have consolidated all the silos – the My Support customer portal, Informatica Communities, Informatica University, and User Groups. Using a single login and profile, you can access all of these under the umbrella of Informatica Network. Users from all of those sites will be redirected to Informatica Network.


What’s new for the migrated users?
All existing customers will now have access to the full range of Informatica resources available under the Network – product communities, the latest forum discussions, SupportTV, Expert Assistant, Velocity (Best Practices), and much more.


How is the search different on Informatica Network?
We now have a federated search on the Network that pulls results from the Knowledge Base and other critical sources such as End of Life announcements, the Product Availability Matrix, and customer discussions. In short, it is your one-stop shop for all things Informatica!


SIGN UP AND LOGIN


Which Informatica systems are unified under the Single Sign-On (SSO)?

Your doorway to Informatica now has a single sign-on, which means that with a single username and password you can access MySupport, Informatica Communities, User Groups, and Informatica University.


How is SSO different from Informatica passport?
SSO includes more systems under one umbrella than Informatica Passport did. You can now access the various Informatica resources with a common user profile, without maintaining multiple login credentials. It is now one login, one profile!


Can I log into Informatica Network using My Support credentials?
Yes, you can use your My Support credentials. However, upon logging in you will have to reset your password to comply with the new Informatica password policy.


Why do I need to reset my password when I first log into Informatica Network?
We take security seriously and want to ensure that all user accounts migrated to the Informatica Network are as secure as possible. Therefore, we ask you to reset your password so that it meets the new Informatica password policy.


I'm having trouble signing in - help me!
The scenarios below cover the various login cases and what to do in each:

Scenario: You have a MySupport (MS) login ID
Enter your MS credentials and do the following:

  • Correct password: Reset your password on the ‘Complete your Profile’ page.
  • Incorrect password: You will receive a verification email with steps to proceed.
  • Forgotten password: Click the ‘Forgot Password’ link to reset your password.

Scenario: You have only an Informatica Passport login (for Marketplace, Informatica Communities, etc.)
Use your Passport credentials to log in and access the Informatica Network home page.

Scenario: You had different login IDs for Informatica Passport and MySupport
You can use either ID to log into Informatica Network (following the steps above).

Scenario: You have not registered for Informatica Passport or MySupport
Click Sign-up and complete the registration process.


What will happen to all my User Profile details – including Profile pictures, avatars, bookmarks etc.?
Your personal details, such as profile pictures, avatars, and bookmarks, will be migrated as is.

 

How do I update my personal details?
You can update your personal details using the Edit Profile option in the drop-down next to your user name at the top right of the page.


Are my previously included project details available and secure?

Yes, all your project, case, and technical profile details remain available and secure, as no changes have been made to eSupport (online support).


PERSONA-BASED

 

Can I access content on Informatica Network without logging in? If yes, then what are the resources that will be available to me?
As an anonymous user, you will have access to most of the Informatica resources such as:

  • Federated Knowledge Base search
  • Product resources such as discussions, product documentation, and How-To Libraries for all Informatica products
  • Other resources such as SupportTV, Informatica Expert Assistant, Debugging Tools, etc.
  • Access to Informatica University and User Groups.


How is anonymous user view different from a logged in user view?

As a logged-in user on Informatica Network, in addition to everything available to anonymous users, you will also have the following privileges:

  • Option to personalize your home page view using the Manage Product Profile option
  • Federated KB search with access to Velocity, EOLs, and PAMs
  • Access to two-year archives of our newsletter SupportFlash
  • Access to Informatica Support Statements
  • Option to view discussions and other content types
  • Option to start a discussion and comment or reply to posts
  • Ability to create your own streams and follow spaces, people, discussions, blogs, etc. to get customized alerts
  • Ability to collaborate with experts, peers, Informatica Moderators, etc.
  • Option to customize your user profile with your skill sets, status updates, avatars, and profile pictures

 

What special privilege or access does a user with Support Contract have?
As a Support Contract user, you will have access to:

  • Personalized Support Cases widget that displays your top five cases and your Technical Profiles
  • Case Management system (eSupport)
  • CSM
  • HotFixes and Emergency Bug Fixes
  • Change Request Tracking system
  • GCS-exclusive Support Enablement Materials
  • GCS Policies and Procedures Guide, and so on


OTHER


Why are some of my browser bookmarks not working?
As part of the site revamp, some of the links/URLs had to be changed and therefore, you may notice a break in some of your bookmarks. Please use the search box or navigate through different product spaces to find what you are looking for.

 

Is my email address exposed to external search engines?
Not unless you choose to make it public. You have the option to specify who can view your email address and other details. Go to Edit Profile and visit the Privacy tab to set your preferences.

 

 

I have more questions. Who should I contact?

 

You can reach out to praman@informatica.com or vidyasm@informatica.com for further clarifications.

Informatica has just won the Technology Services Industry Association (TSIA) STAR award for Innovation in Enabling Customer Outcomes/Customer Success & Support – Enterprise Level.

 

Informatica received this recognition for its commitment to driving customer success through the deployment of new technology tools that maximize the customer experience through proactive features. It was our technology focus on driving product adoption through industry-first solutions like Informatica DiscoveryIQ, Predictive Escalations, and Interaction Data Hub that won us the award. Informatica DiscoveryIQ (IDIQ) is a cloud product offered as part of our support offerings and is available through the Informatica Cloud product. IDIQ offers proactive features such as operational and adoption reports, advanced log analysis and recommendations for proactive operational support, cloud upgrade impact reports for better insight during the upgrade process, and so on. To learn more about IDIQ, please visit:

https://community.informatica.com/community/products/informatica_cloud/discoveryiq/blog/2015/04/18/announcing-new-informatica-discoveryiq-in-cloud-spring-15

 

The TSIA’s STAR Award is one of the highest honors the technology services industry has to offer. The award for Innovation in Enabling Customer Success in Support Services recognizes the company that demonstrates the most innovative approach, through support services, to helping customers realize the maximum business value from its products.

 

You can read more about this award at the TSIA blog http://blog.tsia.com/blog/congratulations-2015-star-awards-winners

 

We are delighted to receive this award and would like to thank you for your continued support!

 

-Team Informatica

 


We are bursting at the seams, eager to share our news! This November, we will shutter our old support portal and unveil the Informatica Network.


Maintaining the highest level of service and support is our top priority. We’ve listened to your feedback and are redesigning the experience from the ground up, improving efficiency, accessibility, and integration.

 

With this in mind, look for:

  • Consolidated user communities
  • Single sign-on across all your Informatica accounts and applications
  • Public access to the knowledge base and documentation
  • More efficient search
  • A streamlined user experience
  • And more!

We understand how busy you are. Our aim remains, as always, to get you the information you need quickly and conveniently.

 

 

 

We are now proud to unveil more details of these efforts…

One Community. One Login. Unified User Experience

You will now have one home from which to access community discussions, the KnowledgeBase, blog posts, social groups, and all things data-centric.

 


 

 

  • Your doorway to Informatica now has a single sign-on for MySupport, Informatica Communities, and all other Informatica web properties
  • Join a User group to connect with people in a domain of your interest or geographical location. You can then follow and participate in calendar events hosted by those groups!
  • Use our Resources to enhance your effectiveness with Informatica products. Our troubleshooting wizards (Informatica Expert Assistant) walk you through the most popular product challenges and upgrade processes
  • Start a discussion on Curated Communities or browse through some featured or recent content. You can also view posts by Top Participants and collaborate with them on all things data-centric.
  • Up your Informatica ante with some learning at Informatica University
  • View SupportTV for some visual guides and webinars

 

Customized user experience
MySupport is now tailored to show you information about your cases, your subscribed products, and products that you are interested in. Bespoke all the way!



  • Track your cases and personalize your account page based on selected products
    • View discussions and product release information for your products
  • View content, contact associated people, or attend calendar invites related to your products

 

Integrated platforms in KnowledgeBase search

Your search will now fetch results from across all the unified communities, including discussions & blogs, PAM, and EOL information.

 


 

  • Access the KnowledgeBase without credentials. The Informatica KnowledgeBase is now a non-gated community, and all content will be search engine optimized.
  • End of Life (EOL) or Product Availability Matrix (PAM) information is now integrated in the KnowledgeBase.


Come November, you will log in to this exciting new experience! And yes, we have interactive guides to familiarize you with all the constructive changes in this space. We look forward to welcoming you into the new Informatica Network next month. Catch you there!

Written by: Sumit Jain

 

 

After working extensively with Informatica Cloud, I have collected some to-dos and guidelines for the Informatica Cloud product. You may already know them, but I thought I would compile a list and share it with a wider audience:

 


 

1. Create a naming conventions document detailing the standard naming conventions for the different types of connections, tasks, taskflows, and schedules within Informatica Cloud, and have all developers follow it rigorously. When multiple people work simultaneously, they tend to follow their own naming standards; once there are a lot of tasks it becomes very difficult to identify the right ones, and the administrator has to spend a good amount of time tracking down the correct tasks.
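As an illustration only (these prefixes are hypothetical, not an Informatica standard), such a scheme might look like:

  • CON_SFDC_PROD for the connection to the Salesforce production org
  • DS_SFDC_ORCL_ACCOUNTS for a Data Synchronization task that loads accounts from Salesforce to Oracle
  • TF_DAILY_ACCOUNT_LOAD for the taskflow that groups the daily account loads
  • SCH_DAILY_0200 for a schedule that runs daily at 2:00 AM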

 

 

 

2. Add meaningful descriptions to all your tasks, connections, taskflows, and schedules so that they convey their purpose and do not create confusion for other users.

 

 

 

3. The machine that the Informatica Cloud Secure Agent runs on must always be on; it must not go into sleep or idle mode. A symptom of this problem is the Agent status fluctuating between active and inactive. Make sure this computer stays on, or re-install the agent on a computer that stays on continuously.

 

 

 

4. If you are using CSV files as a source or target, make sure the date format in the associated flat file connection matches your data by choosing the matching format from the Date Format drop-down list.

 

If there isn’t a matching format in the drop-down list, you will need to explicitly convert the date in Step 5 of the Data Synchronization task using the TO_DATE transformation function.
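For example (the field name here is hypothetical, and the format string must match how the dates actually appear in your file), an expression such as TO_DATE(Order_Date, 'MM/DD/YYYY HH24:MI:SS') converts the incoming string into a date value the target can accept.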

 

 

 

5. If you need to perform a lookup on a Salesforce object, do not create a direct lookup. A direct lookup calls the Salesforce object for each record processed, which degrades performance considerably. Instead, write the Salesforce object's data to a flat file and then use the flat file for the lookup.

 

 

 

6. For incremental processing, use the “$LastRunTime” and “$LastRunDate” variables in conjunction with a source field of “date” type. Informatica Cloud supports and maintains these variables automatically. For example, if your source has a field called LASTMODIFIEDDATE, you could set up your filter as LASTMODIFIEDDATE > $LastRunTime. If a schedule then runs the task daily, each run only has to deal with the records that are new or changed since the previous run, instead of all records.

 

 

 

7. If the Informatica Cloud Secure Agent is running on a Linux or UNIX server, it will not support Microsoft SQL Server as a source or target.

 

 

 

8. In a multi-user environment where the number of tasks is very high, create views based on logical groups so that similar tasks appear together in a single task view. Similarly, you can create views for connections and taskflows.

 

 

 

9. Use SYSDATE in your expressions whenever you need the current date and time.

 

 

 

10. Use conditional functions such as IIF and DECODE to express conditional logic in the expressions in Step 5 of the Data Synchronization task.
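As a small illustration (the field names are hypothetical), IIF(Amount > 10000, 'HIGH', 'STANDARD') returns one of two values based on a single condition, while DECODE(Country_Code, 'US', 'Domestic', 'CA', 'Domestic', 'International') compares a value against several search values and falls back to a default.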

 

 

This has been posted on the community page as well. https://community.informatica.com/docs/DOC-3772

 

What are some of your best practices?

Please share them with us in the comments section below.

 

Thanks!

We kicked off the month of October by holding another session on cloud integration templates - this time building an entirely new one, as opposed to using a prebuilt template within Informatica Cloud. Our last session on templates focused on a specific customer use case - a Zuora to Great Plains template with some complex SQL Server stored procedures.

 

We went over some of the most important transformations (such as Joiners, Filters, Normalizers, and Routers) and what they're used for. We also went over certain "Template Stubs" (which consist of a combination of prebuilt transformations) that can help you be even more productive as you build your custom integration template. These template stubs broadly fell into the following categories:

 

1) Accessing Data Sources

2) Transforming, Filtering, & Aggregating Data

3) Routing & Joining Data

4) Looking up Data

5) Writing to Targets

 

You can view the slides here as well as the recording featuring Naresh Govindaraj from our product management team. Tomorrow's TechTuesdays session on Oct. 8 at 8:30am is on an exciting topic, REST Web Services and JSON, and represents a snapshot of what's going on in the evolution of the internet.

Our TechTuesdays session on September 24th focused on what happens during large deployments of Salesforce or Oracle database clusters. Typically, before any such deployment takes place, stringent testing of the integration environment needs to happen to ensure that the production environment functions as smoothly as possible.

 

The only way to do this is to create a copy of the production data in the testing environment, and test out all the integration mappings there before pushing into production. At present, when using Salesforce, developers have two options:

  1. Use developer or configuration-only sandboxes
  2. Use full sandbox copies

 

Either approach carries a lot of time, cost, and/or risk. With developer-only sandboxes, test data sets are not representative of the final production datasets and, as a result, can omit a lot of crucial integration scenarios. With full sandbox copies, costs skyrocket, which can lower the ROI of a Salesforce deployment.

 

An alternative is to copy over only a subset of the main production data and mask any personally identifiable or confidential information (such as credit card numbers, social security numbers, or dates of birth). The main demo component of TechTuesdays focused on this aspect.

 

Here are the slides for this session, as well as the recording. Tomorrow's TechTuesdays session focuses on building integration templates from scratch.

On our seventh Informatica Cloud #TechTuesdays session on September 17th, "Cloud Migration: Sandbox Best Practices", we talked about the importance of testing out your integration workflows before deploying them into production. Using an Informatica Cloud sandbox has three primary benefits, especially for companies focused on improving the productivity of their integration projects without compromising on security:

 

  • Security: The Informatica Cloud Secure Agent can be deployed into separate virtual LANs, each one separated from sensitive corporate IT resources
  • Streamlined Integration Lifecycle: With all the different stages in an integration lifecycle (development, testing, and production), having Informatica Cloud sandboxes makes it easy to move approved and tested integrations from the testing phase directly into production
  • Debugging: If any bugs are encountered in a production environment, it is very easy to isolate, trace, and resolve them in the testing environment within sandboxes

 

Krupa Natarajan from our cloud product management team ran through a demo showing how to move objects from a sandbox environment to production, and ran through the following demo scenarios:

  • Having the 'Migrate Objects' functionality enabled by your Informatica Cloud Administrator
  • Credentialing and access between sandbox and production accounts
  • Ensuring that users have the correct privileges to read, write, or update objects and tasks via Fine Grained Access Controls
  • The types of "objects" that would be migrated, be it an application's object, a connection name, a plugin, etc.
  • Keeping migration privileges separate from task-related privileges
  • Editing connections to point to a different endpoint, especially when you also have separate test accounts for your SaaS applications

 

You can download the recording to view the replay, as well as the slides. Tomorrow's TechTuesday session on 9/24 at 8:30am PDT will go one step further and focus on Data Masking - this is a very useful feature to have when you want to copy a subset of your production data into a testing environment in order to mimic real-life scenarios, but wish to keep confidential information private and secure.

During our sixth Informatica Cloud #TechTuesdays session, "Reusable Integration: The Power of Integration Templates" on September 10, we had a customer, Intralinks, show us their use case of integrating Zuora with Great Plains using Informatica Cloud's templates. The main integration scenario revolved around the need to access multiple invoice objects from Zuora and integrate them with the main financial and accounting components within Great Plains.

 

However, there was also the issue of certain legacy stored procedures from SQL Server that needed to be combined with the Zuora data before being integrated with Great Plains. Intralinks was able to use several joiner and filter conditions to make this happen via their template, as you can see below:

 

[Image: Intralinks' Zuora to Great Plains integration template]

 

Amol Dongre from our Engineering team also covered how to import templates from the shared master org using object migration. These templates are being made available to all current Informatica Cloud customers, so contact me at aviswanath@informatica.com if you'd like the password to this repository.

 

Here are the slides and the entire recording. Tomorrow's session on September 17 will focus on using sandboxes within Informatica Cloud and best practices for separating out your development, test, and production environments.

Guest Blog Written by Richard Seroter:

CRM growth – especially in the cloud – shows no signs of slowing down. Salesforce.com continues to register 20%+ annual growth, and Microsoft Dynamics CRM recently crossed $1 billion in annual revenue. Now more than ever, integration of cloud systems is critical to business success. You simply cannot afford to have mission-critical, isolated systems that don’t share data with the rest of the enterprise environment.

 

This topic is the focus of an upcoming event held at the Microsoft office in London. The Hybrid Organization is a one-day event for architects and developers who want to learn about the best techniques for integrating systems through the cloud. While this is a Microsoft-oriented event, Informatica Cloud will be part of the showcase demonstrations. My first talk is focused on integrating cloud CRM platforms, and I’ll show the audience how to use Informatica Cloud to easily connect Salesforce.com and Dynamics CRM Online to on-premises systems. And for fun, I’ll also show off how easy it is for Salesforce.com and Dynamics CRM Online to communicate with each other! In my second talk, I’ll walk through the patterns, technologies, and trends in cloud integration and cover how products like Informatica Cloud are a critical part of the cloud architect’s toolbelt. Microsoft has a fantastic set of cloud integration tools, but nothing quite like what Informatica has to offer. The pragmatic architect must look at technologies across the vendor landscape and make sure to use the best tool for the given situation.

 

If you are in the UK on September 11th and interested in hearing about the most exciting technologies in cloud integration, join me at the Hybrid Organization event!

 

Richard Seroter is a senior product manager for cloud software company Tier 3, a trainer for Pluralsight, a Microsoft MVP, an InfoQ.com editor, a blogger, author, and speaker. You can find Richard on Twitter as @rseroter.

During our fifth Informatica Cloud #TechTuesdays session, "SOAP Web Services Made Simple" on August 20, we focused on the fact that there are several legacy on-premise systems that need to communicate with modern SaaS applications such as Salesforce. The easiest way to establish such a communication path is to put a SOAP web service wrapper around the legacy system, along with a WSDL that appropriately describes all the desired methods and operations.

 

We also discussed the fact that SOAP web services come in two different kinds: procedural and document style. A document-style web service is a bit more complicated than a procedural one because it is not self-describing through the WSDL alone – you need the WSDL as well as the XSD schema to use the service.

 

During the demo component we showcased the following aspects of using Informatica Cloud for SOAP web services:

  • Connecting to the main WSDL URL
  • Looking through the WSDL URL's binding information to put the appropriate service into the endpoint URL
  • Using the 5-step wizard to map each web service to the appropriate fields within a Salesforce object
  • Applying filters to a web service to extract only relevant information
  • Using task flows to execute consecutive web service calls

 

Here are the slides and entire recording featuring Bryan Plaster from our Cloud Labs team. Next week's session on September 10 will focus on using integration templates, and will feature an actual customer taking us through a template use case.

Our fourth Informatica Cloud #TechTuesdays session, "Benefits of Using REST APIs in Integration", focused on the specific resources that Informatica Cloud's REST API provides, and the various scenarios in which they can be used. We first went over the most common kinds of requests (such as GET, POST, PUT, DELETE) that REST calls make, and then gave an overview of the resources that the Informatica Cloud REST API provides. These resources can be broadly grouped into the following categories:

  • Activities
  • Secure Agent Details
  • Jobs
  • Logins
  • Integration Templates
  • Workflow Details
  • Schedules

 

When using Informatica Cloud's REST API, it is very important to know what the header configuration looks like and to use only Version 2 of the REST API (a rough sketch of the v2 call flow follows the list below). The demo showcased the following scenarios:

 

  • Embedding integration into a SaaS Application
  • Job scheduling & monitoring
  • Administering new users
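For orientation, here is a minimal sketch of what a v2 session might look like. It assumes the classic Informatica Cloud v2 login resource (user/login), the icSessionId header, and the activity log resource used for job monitoring; the exact endpoint paths, payload fields, and header names should be verified against the Informatica Cloud REST API guide.

import requests

# Hypothetical sketch of an Informatica Cloud REST API v2 session.
LOGIN_URL = "https://app.informaticaondemand.com/ma/api/v2/user/login"  # assumed v2 login resource

# Step 1: log in with a JSON body; the v2 API identifies the payload type via "@type".
login = requests.post(
    LOGIN_URL,
    json={"@type": "login", "username": "user@example.com", "password": "********"},
    headers={"Accept": "application/json"},
)
login.raise_for_status()
session = login.json()
ic_session_id = session["icSessionId"]   # session token for subsequent calls
server_url = session["serverUrl"]        # base URL for subsequent calls

# Step 2: pass the session token in the icSessionId header on every later request,
# for example to read the activity log (job monitoring).
activity = requests.get(
    server_url + "/api/v2/activity/activityLog",
    headers={"icSessionId": ic_session_id, "Accept": "application/json"},
)
print(activity.json())

The same header pattern applies to the other resources covered in the session (schedules, jobs, users, and so on); only the resource path changes.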

 

Here are the slides and entire recording featuring Amol Dongre from our Cloud Engineering team. Tomorrow's session will focus on using SOAP web services within Informatica Cloud.

Our third Informatica Cloud #TechTuesdays session, "Simplifying SAP Connectivity with Cloud Integration", focused on how cloud applications were rapidly gaining traction, especially in the CRM space, and needed to connect to SAP R/3 to access crucial customer, pricing, and product master data. The reason integration with SAP is so important is that over 25% of ERP deployments in 2012 were based on SAP (Source: Forbes, Gartner).

 

http://b-i.forbesimg.com/louiscolumbus/files/2013/05/ERP-Market-Share-2012-Stats.jpg

 

Although SAP grew only 2.2% from 2011 to 2012 while cloud-based ERP vendors such as Workday and NetSuite grew 114.7% and 34% respectively over the same period, the fact remains that, as a category, SAP still holds the lion's share of the ERP market. Coupled with the fact that CRM is one of the fastest growing categories of cloud applications, it's not hard to see why integration between cloud CRM applications (such as Salesforce and Microsoft Dynamics CRM Online) and SAP is such a sought-after solution. We discussed three primary use cases for SAP integration: data warehousing, data migration, and data synchronization.

Some of the most exciting features of Informatica Cloud's SAP connector shown in the demo were the ability to toggle between technical and business names in R/3, and how easy it is to extract data from transparent, clustered, and pooled tables and views. The demo zeroed in on the following themes:

 

  • Loading SAP customer info into Salesforce
  • Bulk extraction from clustered tables to flat files
  • Using Salesforce outbound messaging
  • Performing lookups

 

Here are the slides from the session, and the entire recording featuring Anand Peri, who takes care of product management for our SAP connector.

 

Session 4, which focused on using the Informatica Cloud REST API for integration, just wrapped up this morning, and we'll have the slides and recording up soon.

Our second Informatica Cloud #TechTuesdays session, "Accelerating Big Data Initiatives through Cloud Integration", shone a light on the growing use of Big Data for data warehousing projects. We cleared up the confusion around the many flavors of Big Data out there, such as Hadoop, and focused the discussion on Big Data providers in the cloud, such as Amazon RedShift.

 

Big Data has several use cases, such as data warehousing, predictive analytics, machine data, and OLTP, and we decided to tackle the data warehousing use case. When looking at the industries with the fastest adoption of Big Data, we found, not surprisingly, that the banking, media, and government verticals led the way (Source: Forbes, Gartner).

 

http://blogs-images.forbes.com/louiscolumbus/files/2012/08/big-data-heat-map-by-industry.jpg

 

During the session, we discussed how the main driver behind moving to cloud-based Big Data for data warehousing projects is the speed with which you can provision multiple database nodes. Other benefits include saving on the cost of provisioning multiple on-premises databases, as well as the ability to start petabyte-scale data warehousing projects a lot sooner. The demo itself touched on the following aspects of using Amazon RedShift:

 

  • Configuring RedShift for first-time use by downloading SQL Workbench
  • Ensuring that security groups were set up correctly
  • Writing to Amazon RedShift using the Informatica Cloud connector
  • Reading from Amazon RedShift using ODBC

 

Here are the slides from the session, and the entire recording featuring Vijay Narayanan, who takes care of product management for a lot of our new connectors.

 

Session 3, which focused on SAP integration, just wrapped up this morning, and we'll have the slides and recordings up soon.

Last Tuesday we held our first ever Informatica Cloud #TechTuesdays session, "Better Business Intelligence: Blazing Fast Data Replication for Cloud Apps".

 

We focused on how cloud applications such as Salesforce were built from the ground up to allow rapid customization, and as a result, contain numerous custom objects and custom fields. As a consequence, when extracting data from Salesforce into a staging database, it is important to ensure that all the changed fields have propagated their way into the relevant mappings for the business intelligence process.

 

The demo focused on four critical elements of a data replication process:

 

  • Replicating Deleted & Archived Rows
  • Auto-creation of Database Tables
  • “Full Load” vs. “Incremental Load”
  • Using the Salesforce Bulk API

 

Here are the slides from the session, and the entire recording featuring Ron Lunasin who heads our Cloud Platform Adoption team.

 

Session 2 takes place tomorrow, Tuesday, July 30th at 8:30am PDT and will focus on Big Data, specifically how to perform data warehousing through the cloud with Amazon RedShift.
