Use DataFu with Pig on HDInsight

Learn how to use DataFu with HDInsight. DataFu is a collection of open-source libraries for use with Pig on Hadoop.


Prerequisites

  • An Azure subscription.

  • An Azure HDInsight cluster (Linux- or Windows-based).


    Linux is the only operating system used on HDInsight version 3.4 or greater. For more information, see HDInsight retirement on Windows.

  • A basic familiarity with using Pig on HDInsight.

Install DataFu on Linux-based HDInsight


DataFu is installed by default on Linux-based clusters version 3.3 and higher, and on Windows-based clusters. It is not installed on Linux-based clusters earlier than version 3.3.

If you are using a Windows-based cluster, or a Linux-based cluster version 3.3 or higher, skip this section.

DataFu can be downloaded and installed from the Maven repository. Use the following steps to add DataFu to your HDInsight cluster:

  1. Connect to your Linux-based HDInsight cluster using SSH. For more information, see Use SSH with HDInsight.

  2. Download the DataFu jar file from the Maven repository, using either the wget utility or your browser.

  3. Next, upload the file to default storage for your HDInsight cluster. Placing the file in default storage makes it available to all nodes in the cluster.

    hdfs dfs -put datafu-1.2.0.jar /example/jars


    The previous command stores the jar in /example/jars since this directory already exists on the cluster storage. You can use any location you wish on HDInsight cluster storage.
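Step 2 above can be sketched as a single wget command. The exact Maven Central URL is an assumption based on the com.linkedin.datafu:datafu:1.2.0 coordinates; verify it against the repository before use.

```shell
# Download the DataFu 1.2.0 jar from Maven Central.
# NOTE: this URL is an assumption derived from the
# com.linkedin.datafu:datafu:1.2.0 Maven coordinates.
DATAFU_JAR=datafu-1.2.0.jar
wget "https://repo1.maven.org/maven2/com/linkedin/datafu/datafu/1.2.0/${DATAFU_JAR}"
```

After the download completes, upload the jar to cluster storage with the hdfs dfs -put command shown in step 3.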

Use DataFu with Pig

The steps in this section assume that you are familiar with using Pig on HDInsight. For more information on using Pig with HDInsight, see Use Pig with HDInsight.


If you manually installed DataFu using the steps in the previous section, you must register it before using it.

  • If your cluster uses Azure Storage, use a wasb:// path. For example, register wasb:///example/jars/datafu-1.2.0.jar.

  • If your cluster uses Azure Data Lake Store, use an adl:// path. For example, register adl://home/example/jars/datafu-1.2.0.jar.
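For example, the REGISTER statement for a jar uploaded to the location used in the install steps might look like the following sketch (the commented-out line shows the Azure Data Lake Store variant):

    -- Azure Storage (WASB):
    REGISTER wasb:///example/jars/datafu-1.2.0.jar;
    -- Azure Data Lake Store:
    -- REGISTER adl://home/example/jars/datafu-1.2.0.jar;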

It is common to define an alias for DataFu functions. The following example defines an alias of SHA:

DEFINE SHA datafu.pig.hash.SHA();
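DataFu's SHA UDF also accepts an algorithm as a constructor parameter. The following sketch, which assumes the SHA(String) constructor present in DataFu 1.2.0, defines a SHA-256 alias:

    -- Assumes datafu.pig.hash.SHA accepts an algorithm argument ('256', '512').
    DEFINE SHA256 datafu.pig.hash.SHA('256');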

You can then use this alias in a Pig Latin script to generate a hash for the input data. For example, the following code replaces the location in the input data with a hash value:

raw = LOAD '/HdiSamples/HdiSamples/SensorSampleData/building/building.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage(',', 'NO_MULTILINE', 'UNIX', 'SKIP_INPUT_HEADER') AS (int1:int, id1:chararray, int2:int, id2:chararray, location:chararray);
mask = FOREACH raw GENERATE int1, id1, int2, id2, SHA(location);
DUMP mask;

The DUMP statement displays the data, with the location column replaced by its hash value.

Next steps

For more information on DataFu or Pig, see the following documents: