I recently needed to provide an interface to our Azure Blob Storage for some of our users.  Rather than have them download Microsoft Azure Storage Explorer and grant them access to the storage account, I decided to build a web-based storage explorer.  The two main requirements were:

  1. The users needed to be able to navigate the blob containers.
  2. The users needed to be able to download archived files.

On the surface this seemed like a simple task: I would find a jQuery file tree and append each file URI with a Shared Access Signature token to allow the download.  Unfortunately, this simple task turned into a week of banging my head against the wall and chasing my tail.  This was partially due to my chosen library not being targeted at use with Azure, and partially due to some quirks of the .NET Framework.  In the end I got my file explorer working, to the delight of my users.  After all the struggles, I decided to share what I learned and the code I came up with, in the hopes that at the very least it will help someone else who is struggling to use jsTree with Azure Blob Storage.  If you want to skip straight to the code, you can download the project from GitHub:
https://github.com/agentKnipe/AzureBlobStorageExplorer


So, at the beginning of this saga I went searching for a jQuery library that would let me display a folder structure.  I found many libraries out there, but they all seemed to target local file storage or were overkill for what I needed.  I eventually stumbled on https://www.jstree.com/.  The library had most of what I needed: it takes in JSON and displays the resulting structure.  It looked perfect.  The next step was to create a service I could call via AJAX to return that JSON to jsTree.  Easy peasy... or so I thought.

Microsoft provides a library for accessing Azure Blob Storage, but I wanted a simple, reusable class that would eventually support uploads and downloads as well as the file browsing functions.  I created the interface and class below.

public interface IStorageService {        
     Task<string> Upload(string Container, string Directory, string FileName, byte[] FileContents);        
     Task Delete(string Container, string Directory, string fileName);        
     List<string> ContainerList();        
     Task<List<BlobDirectory>> DirectoryListAsync(string Container);        
     List<BlobDirectory> DirectoryList(string Container);        
     Task<List<BlobDirectory>> DirectoryListAsync(string Container, string Directory);        
     List<BlobDirectory> DirectoryList(string Container, string Directory);        
     Task<List<BlobFile>> FileListAsync(string Container, string Directory);        
     List<BlobFile> FileList(string Container, string Directory);    
}
// In the project this lives in the Troop603.Models namespace.
public class AzureBlobStorageService : IStorageService {
    private readonly IConfiguration _Config;
    private CloudStorageAccount _CSA;
    private CloudBlobClient _CBC;

    public AzureBlobStorageService(IConfiguration config) {
        _Config = config;
        var storageConnString = _Config.GetValue<string>("AzureStorageConfig:StorageConnectionString"); 

        _CSA = CloudStorageAccount.Parse(storageConnString); 
        _CBC = _CSA.CreateCloudBlobClient();
    }

    public async Task<string> Upload(string Container, string Directory, string FileName, byte[] FileContents) { 
        var cloudBlobContainer = _CBC.GetContainerReference(Container); 
        
        await cloudBlobContainer.CreateIfNotExistsAsync(); 
        var blobFileFullName = FileName; 
        
        if (!string.IsNullOrEmpty(Directory)) { 
            blobFileFullName = $"{Directory}/{FileName}"; 
        } 
        
        var blob = cloudBlobContainer.GetBlockBlobReference(blobFileFullName); 
        
        if (!await blob.ExistsAsync()) { 
            await blob.UploadFromByteArrayAsync(FileContents, 0, FileContents.Length); 
        } 
        return blob.Uri.ToString(); 
    }

    public Task Delete(string Container, string Directory, string fileName) { 
        throw new NotImplementedException(); 
    }

    public List<string> ContainerList() { 
        var returnList = new List<string>(); 
        var containers = _CBC.ListContainers(); 
        
        foreach (var container in containers) { 
            returnList.Add(container.Name); 
        } 
        return returnList; 
    }

    public async Task<List<BlobDirectory>> DirectoryListAsync(string Container) { 
        BlobContinuationToken blobContinuationToken = null; 
        var returnList = new List<BlobDirectory>(); 
        var blobContainer = _CBC.GetContainerReference(Container); 
        
        do { 
            var directories = await blobContainer.ListBlobsSegmentedAsync(null, blobContinuationToken); 
            // Capture the continuation token so the loop can page through large result sets. 
            blobContinuationToken = directories.ContinuationToken; 
            
            foreach (var directory in directories.Results.Where(w => w is CloudBlobDirectory).ToList()) { 
                var newDirectory = new BlobDirectory() { 
                    DirectoryURI = directory.Uri.ToString(), 
                    ContainerName = directory.Container.Name 
                }; 
                
                returnList.Add(newDirectory); 
            } 
        } 
        while (blobContinuationToken != null); 
        
        return returnList; 
    }
    
    public List<BlobDirectory> DirectoryList(string Container) {
        BlobContinuationToken blobContinuationToken = null; 
        var returnList = new List<BlobDirectory>(); 
        var blobContainer = _CBC.GetContainerReference(Container); 
        
        do { 
            var directories = blobContainer.ListBlobsSegmented(null, blobContinuationToken); 
            // Capture the continuation token so the loop can page through large result sets. 
            blobContinuationToken = directories.ContinuationToken; 
            
            foreach (var directory in directories.Results.Where(w => w is CloudBlobDirectory).ToList()) { 
                var newDirectory = new BlobDirectory() { 
                    DirectoryURI = directory.Uri.ToString(), 
                    ContainerName = directory.Container.Name 
                }; 
                
                returnList.Add(newDirectory); 
            } 
        } 
        while (blobContinuationToken != null);
        
        return returnList; 
    }
    
    public async Task<List<BlobDirectory>> DirectoryListAsync(string Container, string Directory) { 
        BlobContinuationToken blobContinuationToken = null; 
        var returnList = new List<BlobDirectory>(); 
        var blobContainer = _CBC.GetContainerReference(Container); 
        var subDirectory = blobContainer.GetDirectoryReference(Directory);
        
        do { 
            var directories = await subDirectory.ListBlobsSegmentedAsync(blobContinuationToken); 
            // Capture the continuation token so the loop can page through large result sets. 
            blobContinuationToken = directories.ContinuationToken; 
            
            foreach (var directory in directories.Results.Where(w => w is CloudBlobDirectory).ToList()) { 
                var newDirectory = new BlobDirectory() { 
                    DirectoryURI = directory.Uri.ToString(), 
                    DirectoryParents = Directory, 
                    ContainerName = directory.Container.Name 
                }; 
                
                returnList.Add(newDirectory); 
            } 
        } 
        while (blobContinuationToken != null); 
        
        return returnList; 
    }
    
    public List<BlobDirectory> DirectoryList(string Container, string Directory) { 
        BlobContinuationToken blobContinuationToken = null; 
        var returnList = new List<BlobDirectory>(); 
        var blobContainer = _CBC.GetContainerReference(Container); 
        var subDirectory = blobContainer.GetDirectoryReference(Directory); 
        
        do { 
            var directories = subDirectory.ListBlobsSegmented(blobContinuationToken); 
            // Capture the continuation token so the loop can page through large result sets. 
            blobContinuationToken = directories.ContinuationToken; 
            
            foreach (var directory in directories.Results.Where(w => w is CloudBlobDirectory).ToList()) { 
                var newDirectory = new BlobDirectory() { 
                    DirectoryURI = directory.Uri.ToString(), 
                    DirectoryParents = Directory, 
                    ContainerName = directory.Container.Name 
                }; 
                
                returnList.Add(newDirectory); 
            } 
        } 
        while (blobContinuationToken != null); 

        return returnList; 
    }
    
    public async Task<List<BlobFile>> FileListAsync(string Container, string Directory) { 
        BlobContinuationToken blobContinuousToken = null; 
        var returnList = new List<BlobFile>(); 
        var cloudBlobContainer = _CBC.GetContainerReference(Container); 
        var directory = cloudBlobContainer.GetDirectoryReference(Directory); 
        var sasToken = GenerateSharedAccessSignature(cloudBlobContainer); 
        
        do { 
            var results = await directory.ListBlobsSegmentedAsync(blobContinuousToken); 
            blobContinuousToken = results.ContinuationToken; 
            
            foreach (var blob in results.Results.Where(w => !(w is CloudBlobDirectory)).ToList()) { 
                var newBlob = new BlobFile(); 
                newBlob.Container = Container; 
                newBlob.Directory = Directory; 
                newBlob.DownloadURL = blob.Uri.ToString(); 
                newBlob.sasToken = sasToken; 
                returnList.Add(newBlob); 
            } 
        } while (blobContinuousToken != null); 

        return returnList; 
    }
    
    public List<BlobFile> FileList(string Container, string Directory) { 
        BlobContinuationToken blobContinuousToken = null; 
        var returnList = new List<BlobFile>(); 
        var cloudBlobContainer = _CBC.GetContainerReference(Container); 
        var directory = cloudBlobContainer.GetDirectoryReference(Directory); 
        var sasToken = GenerateSharedAccessSignature(cloudBlobContainer); 
        do { 
            var results = directory.ListBlobsSegmented(blobContinuousToken); 
            blobContinuousToken = results.ContinuationToken; 
            
            foreach (var blob in results.Results.Where(w => !(w is CloudBlobDirectory)).ToList()) { 
                var newBlob = new BlobFile(); 
                newBlob.Container = Container; 
                newBlob.Directory = Directory; 
                newBlob.DownloadURL = blob.Uri.ToString(); 
                newBlob.sasToken = sasToken; 
                returnList.Add(newBlob); 
            } 
        } while (blobContinuousToken != null); 

        return returnList; 
    }
    
    private string GenerateSharedAccessSignature(CloudBlobContainer Container) { 
        var sasConstraint = new SharedAccessBlobPolicy(); 
        sasConstraint.SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(5); 
        sasConstraint.Permissions = SharedAccessBlobPermissions.Read; 
        var sasToken = Container.GetSharedAccessSignature(sasConstraint); 
        return sasToken; 
    }
}

 

After looking at the code above, you may be wondering why I have dependency injection built into my storage service if I had written this for the .NET Framework.  To this I must confess: I have since rewritten my solution for .NET Core so that I could use it in other projects.  To really simplify my life, I create the storage account and blob client references as class-wide fields.  This allows me to reuse them for multiple operations if necessary rather than re-instantiating them in each method, which is how most of the documentation online structures things.  As for the connection string, you will want to pull it from your storage account in Azure; I am assuming you know how to do this and won't go through that detail.  My AppSettings contains more information than is truly needed, but I like having the individual portions of the connection string readily available.

{
	"AzureStorageConfig": {
		"AccountName": "<storage account name here>",
		"AccountKey": "<storage account access key here>",
		"DefaultEndpointsProtocol": "https",
		"EndpointSuffix": "core.windows.net",
		"StorageConnectionString": "<connection string here>"
	},
	"Logging": {
		"LogLevel": {
			"Default": "Warning"
		}
	},
	"AllowedHosts": "*"
}
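Since the service takes IConfiguration through its constructor, it also has to be registered with the .NET Core dependency injection container before a controller can receive it.  This registration isn't in the snippets above, so here is a minimal sketch of what it might look like in Startup.ConfigureServices; the singleton lifetime is my assumption, chosen because the class only holds the storage account and blob client references.

public void ConfigureServices(IServiceCollection services) {
    // Register the storage service so controllers can take
    // IStorageService as a constructor parameter.
    services.AddSingleton<IStorageService, AzureBlobStorageService>();

    services.AddMvc();
}

With this in place, the StorageController shown below gets its IStorageService instance automatically.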

Now that I had a class that could access Azure Blob Storage, I needed a controller that jsTree could call via AJAX to return a list of containers, each of which could have child directories and, potentially, files.  For those not familiar with Azure Blob Storage, directories are a logical construct; they don't actually exist within the blob container.  When you place a file in blob storage inside a "directory", the blob is actually stored with a name of the form directory/fileName.ext.  Fortunately, the Microsoft library provides a CloudBlobDirectory class that can be used to cast your blob listing results and identify whether each item is a directory or not.
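To make the naming concrete, here is a hypothetical sketch (the container and blob names are made up): a single upload with slashes in the blob name is what creates the illusion of nested folders.

// There is no "create directory" call - the folder structure comes
// entirely from the slashes in the blob name.
var container = _CBC.GetContainerReference("archives");
var blob = container.GetBlockBlobReference("reports/2019/summary.pdf");

// A listing of this container is presented by the library as:
//   archives            (container)
//     reports           (CloudBlobDirectory - virtual)
//       2019            (CloudBlobDirectory - virtual)
//         summary.pdf   (CloudBlockBlob - the only real object)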

Next, I created a Storage controller to act as my API.  There are two methods in this controller: one to get the list of folders for a specific node, and one to get all the contents of a specific node.  I separated these into two actions because the expanding and collapsing of nodes in jsTree is handled differently than selecting a folder (node).

[Route("[controller]/[action]")] 
public class StorageController : Controller { 
    private readonly IStorageService _Storage; 
    
    public StorageController(IStorageService storage) { 
        _Storage = storage; 
    } 
    
    [HttpPost] 
    public JsonResult GetNode([FromBody] Node Node) { 
        var containers = new List<JsTreeFolder>(); 
        
        if (Node.NodeID == "#") { 
            var containerList = _Storage.ContainerList(); 
            
            for (int i = 0; i < containerList.Count(); i++) { 
                var newJSTree = new JsTreeFolder() { 
                    id = containerList[i], 
                    text = containerList[i], parent = "#" 
                }; 
                
                containers.Add(newJSTree); 
            } 
        } 
        else { 
            var nodeParts = Node.NodeID.Split('/'); 
            var container = nodeParts[0]; 
            var directory = string.Join("/", nodeParts.Skip(1).ToArray()); 
            var directoryList = new List<BlobDirectory>(); 
            var parent = container; 
            
            if (!string.IsNullOrEmpty(directory)) { 
                parent = $"{container}/{directory}"; 
            } 
            
            directoryList = _Storage.DirectoryList(container, directory); 
            
            for (int i = 0; i < directoryList.Count(); i++) { 
                var newJSTree = new JsTreeFolder() { 
                    id = $"{parent}/{directoryList[i].DirectoryName}", 
                    text = directoryList[i].DirectoryName, parent = parent 
                }; 
                
                containers.Add(newJSTree); 
            } 
        }

        return Json(containers);
    }

    [HttpPost]
    public JsonResult GetNodeContents([FromBody] Node Node) {
        var nodeObjects = new List<JsTreeFile>();
        var nodeParts = Node.NodeID.Split('/');
        var container = nodeParts[0];
        var directory = string.Join("/", nodeParts.Skip(1).ToArray());
        var directoryList = new List<BlobDirectory>();
        var fileList = new List<BlobFile>();
        var parent = container;
        if (!string.IsNullOrEmpty(directory)) {
            parent = $"{container}/{directory}";
        }
        directoryList = _Storage.DirectoryList(container, directory);
        fileList = _Storage.FileList(container, directory).OrderByDescending(o => o.FileName).ToList();
        for (int i = 0; i < directoryList.Count(); i++) {
            var newJSTreeContent = new JsTreeFile() {
                id = $"{parent}/{directoryList[i].DirectoryName}",
                text = directoryList[i].DirectoryName,
                parent = parent,
                cssClass = "directory"
            };
            nodeObjects.Add(newJSTreeContent);
        }
        for (int i = 0; i < fileList.Count(); i++) {
            var newJSTreeContent = new JsTreeFile() {
                id = $"{parent}/{fileList[i].FileNameNoExt}",
                text = fileList[i].FileName,
                parent = parent,
                cssClass = "file",
                uri = fileList[i].DownloadURL + fileList[i].sasToken
            };
            nodeObjects.Add(newJSTreeContent);
        }
        return Json(nodeObjects);
    }
}

Now the hard part.  While jsTree can make an AJAX call, the examples and documentation I found didn't seem to work correctly.  Further, the JSON structure the AJAX call expects doesn't seem to match anything I found on the website.  After scouring the web and combining several different approaches, I came upon the following structure that works:

public class JsTreeFolder { 
    public string id { get; set; } 
    public string parent { get; set; } 
    public string text { get; set; } 
    
    public string state { 
        get { return "closed"; } 
    }
    
    public bool children { 
        get { 
            return true; 
        } 
    } 
}

Which when serialized looks like:

[
	{
		"id": "blogimages",
		"parent": "#",
		"text": "blogimages",
		"state": "closed",
		"children": true
	}
]

The big keys here are the state and children elements.  jsTree requires these elements or it simply doesn't work.  For my implementation, I created a couple of divs to hold the results of the AJAX calls: one holds the jsTree, and the other, next to it, shows the contents of the selected folder.  To create the jsTree with AJAX, you simply need to point it at your controller:

$('#jsTree').jstree({ 
    'plugins': ['themes', 'json_data', 'ui', 'types'],                    
    'types': { 
        "default": { 
            "icon": "fa fa-folder"                        
        } 
    },                    
    'core': { 
        'data': { 
            'type': 'POST',                            
            'url': 'Storage/GetNode',                            
            'contentType': "application/json; charset=utf-8",                            
            'dataType': "json",                            
            'data': function(node) { 
                return '{NodeID:"' + node.id + '"}'                            
            },                            
            "success": function(new_data) { 
                return new_data; 
            } 
        } 
    } 
});

There is some magic wrapped up inside jsTree that allows this to work.  When you expand a node, the AJAX call fires again, with the expanded node available to pass as data to your method.  Since this happens when the node is expanded, a nested directory structure becomes very easy to display.  The next thing I wanted was to display the list of files in the selected folder, similar to what you would see in a Windows file explorer.  You can capture the node-selection event and use it to call a function or make your own AJAX call.  In my case, I simply call a function that builds out the file view.

$('#jsTree').on('select_node.jstree', function(e, data) { 
    GetDirectoryContents(data.node.id); 
}).jstree({ 
    'plugins': ['themes', 'json_data', 'ui', 'types'],                    
    'types': { 
        "default": { 
            "icon": "fa fa-folder"                        
        } 
    },                    
    'core': { 
        'data': { 
            'type': 'POST',                            
            'url': 'Storage/GetNode',                            
            'contentType': "application/json; charset=utf-8",                            
            'dataType': "json",                            
            'data': function(node) { 
                return '{NodeID:"' + node.id + '"}'                            
            },                            
            "success": function(new_data) { 
                return new_data; 
            } 
        } 
    } 
}); 

function GetDirectoryContents(NodeID) {            
    $.ajax({ 
        type: "POST",                
        url: "Storage/GetNodeContents",                
        contentType: "application/json; charset=utf-8",                
        dataType: "json",                
        data: '{NodeID:"' + NodeID + '"}',                
        success: function(msg) {                    
            $('#files').empty(); 
            var html = '<div class="row">'; 
            
            for (var obj in msg) { 
                var newObj = msg[obj]; 
                
                if (newObj.cssClass == "directory") { 
                    html += '<div class="col-md-3 ' + newObj.cssClass + '" id="' + newObj.id + '" data-parent="' + newObj.parent + '" style="text-align:center;"><div style="font-size:30px;"><i class="fa fa-folder"></i></div><div>' + newObj.text + '</div></div>';                        
                }
                else { 
                    html += '<div class="col-md-3 ' + newObj.cssClass + '" style="text-align:center;"><a href="' + newObj.uri + '"><img src="' + newObj.uri + '" width="150px"><div>' + newObj.text + '</div></a></div>';                        
                }
            } 
            
            html += '</div>';                    
            
            $('#files').append(html); 
        } 
    });        
}

 

It should be noted that you can display these files however you like.  For my actual users, I simply display a file icon from Font Awesome; for this demo, I display the image that is in the folder.  Each image is a link to the blob with a Shared Access Signature token appended.  I have set the token timeout to five minutes, but that can be configured in your application.  This allows your users to access and download your Azure blobs without needing to actually have access to the storage account.

 

The last thing I wanted was pure icing.  The containers my users access have a number of directories in them, and I wanted my users to be able to drill down into a folder from the file list shown above.  To accomplish this, I trigger a jsTree event that causes jsTree to expand and select the folder, and then update the list of files.  The jsTree events can take callbacks, which makes chaining anonymous functions super easy.

$("#files").on('click', '.directory', function () {
    var parentNodeID = $(this).data('parent');
    var selectedNodeID = this.id;
    
    $.jstree.reference('#jsTree').open_node(parentNodeID, function (parentNodeID) {
        $.jstree.reference('#jsTree').deselect_node(parentNodeID, selectNode(selectedNodeID));
    });
});

function selectNode(SelectedNodeID) {
    $.jstree.reference('#jsTree').select_node(SelectedNodeID);
}

 

And that's it, you now have a fully functional Azure Blob Storage file explorer.