Configure Apache Hive policies in HDInsight with Enterprise Security Package
Learn how to configure Apache Ranger policies for Apache Hive. In this article, you create two Ranger policies to restrict access to hivesampletable, a sample table that comes with HDInsight clusters. After you have configured the policies, you use Excel and the ODBC driver to connect to Hive tables in HDInsight.
- An HDInsight cluster with Enterprise Security Package. See Configure HDInsight clusters with ESP.
- A workstation with Office 2016, Office 2013 Professional Plus, Office 365 Pro Plus, Excel 2013 Standalone, or Office 2010 Professional Plus.
Connect to Apache Ranger Admin UI
To connect to Ranger Admin UI
From a browser, connect to Ranger Admin UI. The URL is https://<ClusterName>.azurehdinsight.net/Ranger/.
Ranger uses different credentials than the Apache Hadoop cluster. To prevent the browser from using cached Hadoop credentials, use a new InPrivate browser window to connect to the Ranger Admin UI.
Log in using the cluster administrator domain user name and password:
Currently, Ranger works only with YARN and Hive.
Create Domain users
See Create an HDInsight cluster with ESP for information on how to create hiveuser1 and hiveuser2. You use these two user accounts in this tutorial.
Create Ranger policies
In this section, you create two Ranger policies for accessing hivesampletable, granting select permission on different sets of columns. Both users were created using Create an HDInsight cluster with ESP. In the next section, you test the two policies in Excel.
To create Ranger policies
Open Ranger Admin UI. See Connect to Apache Ranger Admin UI.
Click <ClusterName>_hive, under Hive. You will see two preconfigured policies.
Click Add New Policy, and then enter the following values:

- Policy name: read-hivesampletable-all
- Hive Database: default
- table: hivesampletable
- Hive column: *
- Select User: hiveuser1
- Permissions: select

If a domain user is not populated in Select User, wait a few moments for Ranger to sync with Azure Active Directory (AAD).

Click Add to save the policy.
Repeat the last two steps to create another policy with the following properties:
- Policy name: read-hivesampletable-devicemake
- Hive Database: default
- table: hivesampletable
- Hive column: clientid, devicemake
- Select User: hiveuser2
- Permissions: select
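Policies can also be created programmatically through Ranger's public REST API (POST to /service/public/v2/api/policy on the Ranger admin endpoint). The sketch below builds the JSON payload for the read-hivesampletable-devicemake policy. The service name and exact field layout are assumptions based on the Ranger v2 policy format, so verify them against your Ranger version before use.

```python
import json


def build_hive_policy(service, name, database, table, columns, users,
                      accesses=("select",)):
    """Build a Ranger v2 policy payload restricting Hive access.

    The field layout follows Ranger's public v2 REST API; verify it
    against your Ranger version before POSTing.
    """
    return {
        "service": service,  # e.g. "<ClusterName>_hive"
        "name": name,
        "resources": {
            "database": {"values": [database]},
            "table": {"values": [table]},
            "column": {"values": list(columns)},
        },
        "policyItems": [
            {
                "users": list(users),
                "accesses": [{"type": a, "isAllowed": True} for a in accesses],
            }
        ],
    }


policy = build_hive_policy(
    service="myclustername_hive",  # hypothetical service name
    name="read-hivesampletable-devicemake",
    database="default",
    table="hivesampletable",
    columns=["clientid", "devicemake"],
    users=["hiveuser2"],
)
print(json.dumps(policy, indent=2))
# POST this payload, with the Ranger admin credentials, to:
#   https://<ClusterName>.azurehdinsight.net/Ranger/service/public/v2/api/policy
```

This is equivalent to filling in the Add New Policy form above, which can be convenient when you manage many tables or clusters.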
Create Hive ODBC data source
For instructions, see Create Hive ODBC data source. Use the following values:
| Property | Description |
| --- | --- |
| Data Source Name | Give a name to your data source. |
| Host | Enter <HDInsightClusterName>.azurehdinsight.net. For example, myHDICluster.azurehdinsight.net. |
| Port | Use 443. (This port has been changed from 563 to 443.) |
| Hive Server Type | Select Hive Server 2. |
| Mechanism | Select Azure HDInsight Service. |
| HTTP Path | Leave it blank. |
| User Name | Enter email@example.com. Update the domain name if it is different. |
| Password | Enter the password for hiveuser1. |
Make sure to click Test before saving the data source.
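You can also exercise the saved DSN outside Excel. The sketch below assumes the Microsoft Hive ODBC driver and the pyodbc package are installed, and that the data source was named HiveSampleDSN; the DSN name, user, and password are placeholders to adjust for your setup.

```python
def hive_dsn_connection_string(dsn, user, password):
    """Build an ODBC connection string that reuses a saved Hive DSN."""
    return f"DSN={dsn};UID={user};PWD={password}"


def fetch_sample_rows(dsn, user, password, limit=5):
    """Run a small query through the DSN against a live cluster.

    Requires the Microsoft Hive ODBC driver and `pip install pyodbc`.
    """
    import pyodbc  # imported lazily so the module loads without the driver

    conn = pyodbc.connect(hive_dsn_connection_string(dsn, user, password),
                          autocommit=True)  # Hive does not support transactions
    cursor = conn.cursor()
    cursor.execute(f"SELECT * FROM hivesampletable LIMIT {int(limit)}")
    return cursor.fetchall()


# Example (against a live cluster; placeholder credentials):
#   rows = fetch_sample_rows("HiveSampleDSN", "hiveuser1@example.com", "<password>")
```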
Import data into Excel from HDInsight
In the last section, you configured two policies: hiveuser1 has select permission on all the columns, and hiveuser2 has select permission on two columns. In this section, you impersonate the two users to import data into Excel.
Open a new or existing workbook in Excel.
From the Data tab, click From Other Data Sources, and then click From Data Connection Wizard to launch the Data Connection Wizard.
![Open data connection wizard][img-hdi-simbahiveodbc.excel.dataconnection]
Select ODBC DSN as the data source, and then click Next.
From ODBC data sources, select the data source name that you created in the previous step, and then click Next.
Reenter the password for the cluster in the wizard, and then click OK. Wait for the Select Database and Table dialog to open. This can take a few seconds.
Select hivesampletable, and then click Next.
In the Import Data dialog, you can change or specify the query. To do so, click Properties. This can take a few seconds.
Click the Definition tab. The command text is:
SELECT * FROM "HIVE"."default"."hivesampletable"
Per the Ranger policies you defined, hiveuser1 has select permission on all the columns. So this query works with hiveuser1's credentials, but it does not work with hiveuser2's credentials.
Click OK to close the Connection Properties dialog.
Click OK to close the Import Data dialog.
Reenter the password for hiveuser1, and then click OK. It takes a few seconds before the data is imported into Excel. When it is done, you will see 11 columns of data.
To test the second policy (read-hivesampletable-devicemake) that you created in the last section
Add a new sheet in Excel.
Follow the last procedure to import the data. The only change is to use hiveuser2's credentials instead of hiveuser1's. This fails because hiveuser2 has permission to see only two columns. You will get the following error:
[Microsoft][HiveODBC] (35) Error from Hive: error code: '40000' error message: 'Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [hiveuser2] does not have [SELECT] privilege on [default/hivesampletable/clientid,country ...]'.
Follow the same procedure to import data. This time, use hiveuser2's credentials, and also modify the select statement from:

SELECT * FROM "HIVE"."default"."hivesampletable"

to:

SELECT clientid, devicemake FROM "HIVE"."default"."hivesampletable"
When it is done, you will see two columns of data imported.
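The same column-level restriction can be observed from any ODBC client, not just Excel. The sketch below (assuming pyodbc, the Hive ODBC driver, and a DSN named HiveSampleDSN, all hypothetical names) attempts both queries as hiveuser2 and reports which one Ranger allows.

```python
def classify_hive_error(message):
    """Ranger denials surface in the driver error text as a
    HiveAccessControlException; anything else is an ordinary error."""
    return "denied" if "HiveAccessControlException" in message else "error"


def check_column_policy(dsn="HiveSampleDSN",
                        user="hiveuser2@example.com",
                        password="<password>"):
    """Attempt both queries as hiveuser2 against a live cluster; only the
    two-column query should succeed under read-hivesampletable-devicemake.

    Requires the Hive ODBC driver and `pip install pyodbc`.
    """
    import pyodbc  # imported lazily; only needed against a live cluster

    conn = pyodbc.connect(f"DSN={dsn};UID={user};PWD={password}",
                          autocommit=True)
    cursor = conn.cursor()
    for sql in (
        "SELECT * FROM hivesampletable LIMIT 5",                     # expect: denied
        "SELECT clientid, devicemake FROM hivesampletable LIMIT 5",  # expect: allowed
    ):
        try:
            cursor.execute(sql)
            cursor.fetchmany(5)
            print(sql, "-> allowed")
        except pyodbc.Error as exc:
            print(sql, "->", classify_hive_error(str(exc)))
```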
- For configuring an HDInsight cluster with Enterprise Security Package, see Configure HDInsight clusters with ESP.
- For managing an HDInsight cluster with ESP, see Manage HDInsight clusters with ESP.
- For running Hive queries using SSH on HDInsight clusters with ESP, see Use SSH with HDInsight.
- For connecting to Hive using Hive JDBC, see Connect to Apache Hive on Azure HDInsight using the Hive JDBC driver.
- For connecting Excel to Hadoop using Hive ODBC, see Connect Excel to Apache Hadoop with the Microsoft Hive ODBC driver.
- For connecting Excel to Hadoop using Power Query, see Connect Excel to Apache Hadoop by using Power Query.