A few months ago I thought it would be nice to have a vSphere content library on blob store. I did some googling and found Trevor Davis had posted this: https://avs.ms/centralized-avs-content-library-on-azure-blob/. I read it and thought there had to be a better way than building the library on my machine with William Lam’s script and uploading everything to a blob store. Sure, you could do that, but why not over-complicate it with more code?
So I forked William’s repo (https://github.com/lamw/vmware-scripts) and got to work. I created a new version of his make_vcsp_2018.py script and added support for making a content library out of an Azure blob container. Until the PR gets approved to merge back into William’s repo you can access the script here: https://github.com/khensler/vmware-scripts/blob/AZBlob/python/make_vcsp_2022.py. (EDIT: PR was approved https://github.com/lamw/vmware-scripts/blob/master/python/make_vcsp_2022.py) The script requires one more module than the last iteration: azure.storage.blob. Once that is installed along with all the previous requirements you are ready to go. Just set two environment variables for your blob store connection string and container name:

AZURE_STORAGE_CONNECTION_STRING
AZURE_BLOB_STORE_CONTAINER
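For reference, here is a minimal sketch of how those two variables get picked up with azure.storage.blob (the listing loop is just a quick way to confirm the connection works; it isn’t part of the script itself):

```python
import os
from azure.storage.blob import BlobServiceClient

# The script reads its configuration from these two environment variables.
conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
container_name = os.environ["AZURE_BLOB_STORE_CONTAINER"]

# Connect to the storage account and list the container's contents
# to confirm everything is wired up.
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client(container_name)
for blob in container.list_blobs():
    print(blob.name)
```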
The connection string is available on the storage account under Access Keys.

Click the Show keys button and copy the connection string. The container name is in the same interface under Containers.

The container should contain one directory per item, whether that item is an ISO, OVF, or OVA. For ISOs, each directory should contain a single file. For OVFs, all of the component files (.ovf, .vmdk, .mf, …) should be in the same directory, and for OVAs, there should be only one OVA per directory. The directories should not have subdirectories under them. The root of the container should have a structure similar to this:
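For example (the item names here are just placeholders):

```
container-root/
├── Windows2022-ISO/
├── PhotonOS-OVF/
└── NestedESXi-OVA/
```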

A subdirectory for an OVF should look similar to this (the item.json file is generated by the script):
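For example, with placeholder file names:

```
PhotonOS-OVF/
├── photon.ovf
├── photon.vmdk
├── photon.mf
└── item.json    <- generated by the script
```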

When the script runs it will parse all of the files and create the appropriate lib.json and an item.json for each item. To connect this to vCenter, just make a new subscribed content library with the URL of the lib.json file in the root. To find the URL, click on the lib.json file in the Azure portal and copy it. The URL must be reachable from your vCenter server, so you may need to adjust the network access under the Networking section of the storage account. Unfortunately, if the library is available to the internet there is no way to limit access to specific IPs, and there is no way to enable a username and password on the blob store directly. In case you ever wondered what the enable authentication checkbox on the subscribed content library in vCenter does: it enables basic authentication on the HTTP(S) request with the username “vcsp” and the password specified in the interface. You could build something that enforces authentication, but I’ll leave that up to you.
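To see what that checkbox actually changes on the wire, here is a quick illustrative check with the requests module (the URL and password are placeholders):

```python
import requests

# Placeholder URL of the lib.json in the container root.
lib_url = "https://mystorageacct.blob.core.windows.net/mycontainer/lib.json"

# Without the checkbox, vCenter fetches the library anonymously.
print(requests.get(lib_url).status_code)

# With "enable authentication" checked, vCenter sends HTTP basic auth
# using the fixed username "vcsp" and the password from the interface.
print(requests.get(lib_url, auth=("vcsp", "my-library-password")).status_code)
```

Note that against a plain blob store both requests succeed, since nothing on the Azure side validates those credentials; that is exactly why enforcement would have to be built separately.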
Good, so now you have a working content library on blob store. What about updates to that library? Should you have to run the script each time you upload a new file? Well, you could, but again, why not write a bunch more code to make it automatic?
Enter Azure Functions. Azure Functions can be triggered by a blob store action. You can follow the guide here on a basic function. I’ve created a template repo here: https://github.com/khensler/avs-blob-content-library-function. You will need to update function.json to reference your path and connection string variable name. On the function app in Azure there are two additional Application Settings required:

AZURE_STORAGE_CONNECTION_STRING
AZURE_BLOB_STORE_CONTAINER
Look familiar? Same as the ones from above, same values. Once this is configured you can deploy your function app however you want. I use Deployment Center to build and publish on commits to main.
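For a rough idea of what the blob trigger wiring looks like, here is a sketch of a function.json binding and a matching Python entry point (the container path and names are assumptions; check the template repo for the real values):

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "mycontainer/{name}",
      "connection": "AZURE_STORAGE_CONNECTION_STRING"
    }
  ]
}
```

```python
import logging
import azure.functions as func

def main(myblob: func.InputStream):
    # Fires on any blob create/update in the watched container; this is
    # where the lib.json/item.json regeneration logic would run.
    logging.info("Blob changed: %s", myblob.name)
```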
Another fun option is to enable a CDN on the blob store. Since I build clouds around the world for different kinds of testing, a global CDN on my content library was a neat option. Just navigate to the Azure CDN option on the storage account and enable it. I’m not sure if this is cost-effective or not, since it’s all just internal consumption funny money to me.

And there you go. A fully functional, automatically updating, globally replicated vSphere Content Library.