
Running a mapping on Azure HDInsights Kerberos cluster with WASB storage

To read and process data from sources that use a Kerberos-enabled environment, you must configure the Kerberos configuration file, create user authentication artifacts, and configure Kerberos authentication properties for the Informatica domain.
To run a Hive Connector mapping on an Azure HDInsights Kerberos cluster with Windows Azure Storage Blob (WASB) storage, perform the following steps:
  1. Go to the /usr/lib/python2.7/dist-packages/hdinsight_common/ directory on the Hadoop cluster node.
  2. Run the following command to decrypt the account key:
     ./decrypt.sh ENCRYPTED ACCOUNT KEY
  3. Edit the core-site.xml file in the Secure Agent conf location.
  4. In the fs.azure.account.key.STORAGE_ACCOUNT_NAME.blob.core.windows.net property, replace the encrypted account key with the decrypted key obtained as the output of step 2.
  5. Comment out the following properties to disable encryption and decryption of the account key:
  6. Save the core-site.xml file.
  7. Copy the hdinsight_common folder from /usr/lib/python2.7/dist-packages/hdinsight_common/ to the Secure Agent location.
  8. Open the core-site.xml file in a browser to verify that the XML tags appear correctly and that there are no syntax issues.
  9. Restart the Secure Agent.
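As a reference for steps 4 and 5, the edited section of core-site.xml might look like the following sketch. The key-provider property names (fs.azure.account.keyprovider.* and fs.azure.shellkeyprovider.script) are the standard HDInsight properties for script-based key decryption and are shown here as an assumption; verify the actual property names and values in your cluster's core-site.xml before commenting them out.

```xml
<!-- Step 4: the account key property now holds the decrypted key. -->
<property>
  <name>fs.azure.account.key.STORAGE_ACCOUNT_NAME.blob.core.windows.net</name>
  <value>DECRYPTED_ACCOUNT_KEY</value>
</property>

<!-- Step 5: key-provider properties commented out so the account key is
     no longer decrypted through the cluster-side script. The names and
     values below are assumed HDInsight defaults; confirm against your
     own core-site.xml. -->
<!--
<property>
  <name>fs.azure.account.keyprovider.STORAGE_ACCOUNT_NAME.blob.core.windows.net</name>
  <value>org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider</value>
</property>
<property>
  <name>fs.azure.shellkeyprovider.script</name>
  <value>/usr/lib/python2.7/dist-packages/hdinsight_common/decrypt.sh</value>
</property>
-->
```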
Note: Azure HDInsights Kerberos cluster with WASB storage is not applicable for elastic mappings.
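As an alternative to opening core-site.xml in a browser (step 8), the XML syntax can be checked from the command line. The following is a minimal sketch, assuming Python 3 is available on the Secure Agent machine; the file path and sample contents are hypothetical, so point the script at the core-site.xml in your Secure Agent conf location.

```shell
# Validate that core-site.xml is well-formed XML (hypothetical path;
# replace with the file in your Secure Agent conf location).
CORE_SITE="core-site.xml"

# Create a small sample file so this sketch is self-contained.
cat > "$CORE_SITE" <<'EOF'
<configuration>
  <property>
    <name>fs.azure.account.key.STORAGE_ACCOUNT_NAME.blob.core.windows.net</name>
    <value>DECRYPTED_ACCOUNT_KEY</value>
  </property>
</configuration>
EOF

# xml.etree raises a ParseError (non-zero exit) on malformed XML.
if python3 -c "import xml.etree.ElementTree as ET, sys; ET.parse(sys.argv[1])" "$CORE_SITE"; then
  echo "core-site.xml: OK"
else
  echo "core-site.xml: syntax error" >&2
  exit 1
fi
```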