Learn how to access the data you need from your Azure Blob in a few easy steps. This guide shows you how to configure and connect to your data source and provides details about setting various permissions.
Data Source Basics
All data sources have a protocol and a label that you will use to reference your data. For instance, azureblob is the protocol we’ll use in this guide. The label is automatically assigned to your data connection as a unique identifier, but you may change it later if you wish.
Configure a New Data Connection to your Azure Blob
To create a new data connection, first navigate to Algorithmia’s Data Portal, where you’ll see a ‘New Data Source’ drop-down listing the available options.
Select ‘Azure Blob’ and a form will open to configure a connection. Here you will need to enter the Azure credentials for the Blob Container you will be using.
For programmatic access to Azure Storage Blobs you will need a service-level Shared Access Signature (SAS) for the Blob Container you wish to access. The Azure documentation includes detailed information about creating a service SAS, as well as best practices when using SAS.
In addition to generating a SAS with code, you can obtain a SAS URI for your Blob Container through the Azure Portal as follows:
- Go to https://portal.azure.com and navigate to Storage Accounts. Click on the Storage account you want to manage, then “Storage Explorer”.
- Under “Blob Containers”, right-click the container you want and pick “Get Shared Access Signature”; make sure “Read” permission is checked and hit the “Create” button.
- Copy the URL value from the resulting pane into the SAS URI input.
NOTE: While an algorithm never sees the credentials used to access data in Azure, it is recommended that you provide access that:
- Can only list, get, and put objects to Azure (i.e. cannot perform other operations on your account)
- Can only access the paths in Azure that you want Algorithmia to access
Setting Labels For Data Connections
You will need to provide a unique label for your data connector, editable in the “Label” field.
We require these unique labels because you may want to add multiple connections to the same Azure account, and each will need a unique label for later reference in your algorithm. The reason you might want multiple connections to the same source is so you can set different access permissions on each connection, such as reading from one folder and writing to a different one.
NOTE: The unique label is combined with the protocol in the URI scheme: ‘azureblob+unique_label://’
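As a rough illustration, the label composes with the protocol like this (the label ‘mylabel’ and the path are hypothetical examples):

```python
# Illustrative sketch of how a connection label composes into a
# data source URI; 'mylabel' and the path are hypothetical.
def data_source_uri(label, path):
    """Return a URI like azureblob+mylabel://some/path."""
    return f"azureblob+{label}://{path}"

print(data_source_uri("mylabel", "Algorithmia/team.jpg"))
# azureblob+mylabel://Algorithmia/team.jpg
```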
Setting Path Restrictions for Azure Folder and File Access
The default path restrictions are set to allow access to all paths in your Azure account; however, you may want to restrict your algorithm’s access to specific folders or files:
- Access to a single file: ‘team.jpg’
- Access to everything in a specific subfolder: ‘somefolder/*’
NOTE: ‘somefolder*’ might match more than you’d like, so if you want to match a directory exactly end with a ‘/’.
Here we are setting our path restrictions to everything in the subfolder ‘Algorithmia’, so we’d only be able to get files below azureblob://Algorithmia/* :
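The wildcard semantics above can be pictured with standard glob matching. This is only a sketch using Python’s `fnmatch`; Algorithmia’s own matcher may differ in details, but it illustrates why ‘somefolder*’ is broader than ‘somefolder/*’:

```python
from fnmatch import fnmatch

# 'somefolder*' also matches sibling folders like 'somefolder2',
# while 'somefolder/*' only matches contents of 'somefolder' itself.
print(fnmatch("somefolder/a.csv", "somefolder*"))    # True
print(fnmatch("somefolder2/b.csv", "somefolder*"))   # True -- broader than intended
print(fnmatch("somefolder2/b.csv", "somefolder/*"))  # False
```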
Setting Read and Write Access
The default access for your data source is set to read only, but you can change this to read and write access by checking the ‘Write Access’ box.
NOTE: Write access also means you can delete anything in the path you’ve specified in the previous step, so be certain you want read-write-delete access to the path you set in ‘Path Restriction’. Also, if your data source has read/write privileges, then any algorithm you call also has read/write privileges to your data source.
Accessing your Data
To access your data, first create an Algorithmia client with your API key:

```python
import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")
```
For example, to retrieve and print a file’s contents in Python:
The above examples work when accessing data from a local script or app code. If you’re writing an algorithm and accessing a data source from inside the algorithm, create the client without an API key parameter:

```python
client = Algorithmia.client()
```
If you’re calling an algorithm that takes a file or directory as input from the Data API, you can also provide it a file or directory from one of your data sources:
NOTE: If you call an algorithm, it can only access your own data sources. This means that it is NOT possible for an algorithm to read data from your Azure Blob and write that data to an account controlled by another algorithm author. Algorithms do NOT have direct access to any credentials associated with your data sources, and can only access data from a data source using the Algorithmia API.
Data Source Routes and Data API Routes
Once a data source connection has been created and configured, all of the Algorithmia client code for interacting with the Data API (file and directory creation, deletion, and listing) functions identically with a data source route and a Data API route, with the following exceptions:
- We do not support generic ACLs for data sources; the only way to update permissions for a data source is through the Data Portal where you created your data source connection.
If you’re implementing a new client or using cURL, prefer the following URL structure:
We have tested to ensure that data source paths function in all of our Algorithmia clients; however, note the minimum client versions:
- Python support was added in version 1.0.4
If you have any questions about Algorithmia please get in touch!