Azure Data Lake Store – Service-to-Service Authentication

Purpose

This article walks through setting up service-to-service authentication for Azure Data Lake Store and Azure Data Lake Analytics.

Prerequisites

This article assumes you have:

  • An Azure subscription; if you don't have one, you can create a free trial on the Microsoft Azure site.
  • Provisioned an Azure Data Lake Store or Azure Data Lake Analytics account.
  • Created an Azure Data Factory project in Visual Studio 2015.

Implementation

Azure Data Lake Store and Data Lake Analytics use Azure Active Directory for authentication, and two options are available: user-level authentication and service-to-service authentication. This article covers the latter.

The whole process is divided into three main steps:
  • Create an Active Directory application
  • Give Azure Data Lake Store access to the app created in step 1
  • Update the Visual Studio Azure Data Factory solution with the new application ID and key
1 – Create an Active Directory Application

Below are the steps to complete app registration using the new Azure portal.

  • Open the Microsoft Azure Portal and sign in. Once signed in, in the navigation pane on the left, search for “App registrations” (listed under “Security + Identity”).

  • On the App registrations screen, click Add to open a new app blade.

  • Enter a unique name, select Web app/API under Application Type, and enter a valid URL in Sign-on URL.

  • Once the app is successfully registered, it will appear in the App registrations list.

  • Click the app name to open the application's main page, then click Settings. Copy the Application ID into a notepad; it will be needed in section 3 of this post.

  • Go to Keys on the Settings blade and generate a new key. Give it a name and select an expiration option (1 year, 2 years, or never expires).

Make sure to click Save at the top of the same blade to generate the key value.

Copy and save the key value locally in the same notepad along with the Application ID. Both will be used later in the Data Factory connection to Data Lake Store.
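The same registration can also be done with the Azure CLI instead of the portal. The sketch below assumes you have the CLI installed and have already run `az login`; the app name is the hypothetical one used in this post:

```shell
# App name used as an example throughout this post.
APP_NAME="azureadlsapp"

# Run only if the Azure CLI is available and a session is signed in.
if command -v az >/dev/null; then
  # Register the AD application and capture its Application (client) ID.
  APP_ID=$(az ad app create --display-name "$APP_NAME" --query appId -o tsv)

  # Create the service principal for the app.
  az ad sp create --id "$APP_ID"

  # Generate a client secret (the "key") valid for one year.
  az ad app credential reset --id "$APP_ID" --years 1
fi
```

The `appId` and the secret printed by the last command correspond to the Application ID and key you would otherwise copy out of the portal.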

2 – Give the newly created app access to Azure Data Lake Store

We will use an existing Azure Data Lake Store and grant access to the app created above.

  • Open an existing Azure Data Lake Store and click Data Explorer on the top ribbon to open a new blade.

  • Data Explorer displays a file-explorer-style view, with the folder hierarchy on the left and the files of the selected folder in the right pane.

  • Go to the Access option on the top ribbon to open the Access blade.

  • The Access blade lists which accounts have been assigned access to the Data Lake and what type of access they hold. There are three permissions – Read, Write, and Execute – grouped under Owners, Assigned Permissions, and Everyone Else.

  • Click Add to open a new blade for assigning permissions to the app we created in step 1. Click Select User or Group and type the app name azureadlsapp in the Select text box (type the full name to find the app). Click the app name, then click the Select button at the bottom of the blade to return to the Assign Permissions blade.

  • On the Assign Permissions blade, click the Select Permissions option and grant the appropriate rights – Read, Write, and/or Execute (you can grant multiple rights by selecting the checkbox next to each permission). Pick one of the options under both Add to and Add as a permission, then click Select at the bottom of the blade.

  • Click Select on the Assign Permissions blade to return to the Access blade; you should see a success message confirming the assignment.
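The portal steps above can also be scripted with the Azure CLI. This is a sketch, not the exact method used in this post: the account name is a placeholder, and the app's AAD object ID placeholder must be replaced with the real value before running.

```shell
# Hypothetical ADLS account name; replace with your own.
ADLS_ACCOUNT="mydatalakestore"

# Placeholder for the AAD object ID of the azureadlsapp service principal.
APP_OBJECT_ID="<object-id-of-azureadlsapp>"

# ACL spec granting Read, Write, and Execute to the service principal.
ACL_SPEC="user:${APP_OBJECT_ID}:rwx"

# Apply the ACL entry to the root folder (runs only if the CLI is present
# and a session is signed in).
if command -v az >/dev/null; then
  az dls fs access set-entry --account "$ADLS_ACCOUNT" --path / --acl-spec "$ACL_SPEC"
fi
```

Note this sets the ACL only on the root folder; child folders need their own entries (or a default ACL) if the app must reach them.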

3 – Update the Visual Studio Azure Data Factory solution with the new application ID and key

Open the Data Lake Store linked service JSON file in your Visual Studio Data Factory solution and add the servicePrincipalId and servicePrincipalKey properties, set to the Application ID and key saved in step 1.
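For reference, a minimal Data Factory (v1) linked service definition for Data Lake Store might look like the sketch below; the linked service name, account name, IDs, and key are placeholders to replace with your own values:

```json
{
  "name": "AzureDataLakeStoreLinkedService",
  "properties": {
    "type": "AzureDataLakeStore",
    "typeProperties": {
      "dataLakeStoreUri": "https://<adls-account-name>.azuredatalakestore.net/webhdfs/v1",
      "servicePrincipalId": "<application-id-from-step-1>",
      "servicePrincipalKey": "<key-from-step-1>",
      "tenant": "<aad-tenant-id>",
      "subscriptionId": "<subscription-id>",
      "resourceGroupName": "<resource-group-name>"
    }
  }
}
```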

Compile the updated project and publish the new linked service JSON to the Azure Data Factory instance in your Azure subscription.

Shabbir Mala

Shabbir is a Business Intelligence Evangelist at SPR Consulting