How to use Azure Storage account with CAP (Solution without Destinations)
SAP Cloud Foundry is a fairly new field for me, so I thought I would take you along on my journey through the SAP world. Everything described here is hosted within the BTP (Business Technology Platform).
The requirement
The following requirements were set:
- The files will be managed within the Application (hosted in BTP)
- The files are stored in a dedicated storage
- The storage must be configurable by a business admin
- The files must be accessible to external suppliers (using Windows tools)
- The files can only be accessed securely
So I thought about using BTP's document store. The problem with that is that it is not accessible externally, and it becomes a hard dependency on the system, because some kind of endpoint (in Cloud Foundry) must be written for it.
The decision was then to use an external store (I am an MS guy and prefer the Azure tools ;)), so we used an Azure Storage account for that.
So, how do I connect to that? I asked some SAP dev guys, and they told me about a thing called "destinations" in SAP. These are ... yeah ... let's say ... proxies with benefits that can be configured centrally.
The problem is that a business admin cannot maintain this setting, and it can get a bit complicated for them.
So I suggested accessing the storage account directly via the Blob Storage SDK. For that, we need to store the settings within our own system. We already have an administration section for our business administrators, so the required settings, especially the access keys (client ID / secret), are stored there.
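To give you an idea, here is a minimal sketch of what such a settings record could look like. The shape and names are my own assumptions for illustration, not a fixed schema:

/**
 * Hypothetical shape of the connection settings a business admin maintains.
 * Every name here is an illustrative assumption.
 */
interface AzureStorageSettings {
  tenantId: string;       // Entra ID (Azure AD) tenant of the service principal
  clientId: string;       // application (client) ID of the service principal
  clientSecret: string;   // client secret (store it encrypted!)
  storageAccount: string; // name of the Azure Storage account
  containerName: string;  // target blob container
}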
So let's do a recap and check whether we have every requirement covered:
Requirement Checklist
- ✅ The files will be managed within the Application (hosted in BTP)
- ✅ The files are stored in a dedicated storage
- ✅ The storage must be configurable by a business admin
- ✅ The files must be accessible to external suppliers (using Windows tools)
- ✅ The files can only be accessed securely
Looks good! So next, let's do some implementation. But first, we need…
The setup
First, before we do any coding, we need a way to access the storage account. For that, we need a service principal in Azure. Here is a small PowerShell script that creates a new service principal:
# Install required module
Install-Module -Name Az -AllowClobber -Scope CurrentUser
# Connect to Azure Account
Connect-AzAccount
# Set variables
$principalName="StorageAccountPrincipal"
# Create Service Principal
$sp= New-AzADServicePrincipal -DisplayName $principalName
$credential = New-AzADAppCredential -ObjectId $sp.Id -Password (New-Guid).Guid -DisplayName "TheSecret"
#Display ClientID and Secret
$clientId = $sp.AppId
$clientSecret = $credential.SecretText
Write-Output "Application (Client) ID: $clientId"
Write-Output "Client Secret: $clientSecret"
Next, the service principal must be granted access to the storage account itself:
# Variables
$resourceGroupName = "MyResourceGroup"
$storageAccountName = "MyStorageAccount"
# Get storage account scope
$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName
$scope = $storageAccount.Id
# Get service principal ID
$principal = Get-AzADServicePrincipal -DisplayName $principalName
$principalId = $principal.Id
# Assign the role
New-AzRoleAssignment -ObjectId $principalId -RoleDefinitionName "Storage Blob Data Contributor" -Scope $scope
New-AzRoleAssignment -ObjectId $principalId -RoleDefinitionName "Storage Account Contributor" -Scope $scope
# Verify the assignment
Get-AzRoleAssignment -Scope $scope -ObjectId $principalId
After that, the service principal has access to the storage account and can create containers and files within them.
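If you want to double-check this from the application side, a quick sanity check could look like the following (a minimal sketch; the placeholder values are assumptions you have to replace):

import { ClientSecretCredential } from "@azure/identity";
import { BlobServiceClient } from "@azure/storage-blob";

// Hypothetical placeholder values - replace them with your own tenant, principal, and account.
const credential = new ClientSecretCredential("YOUR_TENANT_ID", "YOUR_CLIENT_ID", "YOUR_CLIENT_SECRET");
const serviceClient = new BlobServiceClient("https://YOUR_STORAGE_ACCOUNT.blob.core.windows.net/", credential);

// If the role assignments work, listing the containers succeeds.
for await (const container of serviceClient.listContainers()) {
  console.log(container.name);
}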
Working with the files
Now that everything is set up, we can start coding. Let's assume you have already created a CAP service in Cloud Foundry.
Within that, I created a small class that handles reading from and writing to the file storage. It implements the following interface:
/**
 * Interface representing a cloud file repository.
 */
export interface ICloudFileRepo {
  /**
   * Uploads a file to the cloud storage.
   *
   * @param content - The binary content of the file to be uploaded.
   * @param bucket - The name of the bucket where the file will be stored.
   * @param directoryName - The name of the directory within the bucket where the file will be stored.
   * @param fileName - The name of the file to be uploaded.
   * @param referenceId - The reference ID associated with the file.
   * @returns A promise that resolves to the path of the uploaded file.
   */
  doFileUpload(content: Buffer, bucket: string, directoryName: string, fileName: string, referenceId: string): Promise<string>;

  /**
   * Retrieves the binary content of a file from the cloud storage.
   *
   * @param bucket - The name of the bucket where the file is stored.
   * @param directoryName - The name of the directory within the bucket where the file is stored.
   * @param fileName - The name of the file to be retrieved.
   * @returns A promise that resolves to the binary content of the file.
   */
  getFileBinary(bucket: string, directoryName: string, fileName: string): Promise<Buffer>;

  /**
   * Checks if a file exists in the cloud storage.
   *
   * @param bucket - The name of the bucket where the file is stored.
   * @param directoryName - The name of the directory within the bucket where the file is stored.
   * @param fileName - The name of the file to check.
   * @returns A promise that resolves to a boolean indicating whether the file exists.
   */
  fileExists(bucket: string, directoryName: string, fileName: string): Promise<boolean>;

  /**
   * Retrieves the name of the container used for cloud storage.
   *
   * @returns The name of the container.
   */
  getContainerName(): string;
}
You can upload, download, and check whether a file exists. Very simple. For completeness, here is the class implementation:
import { ClientSecretCredential } from "@azure/identity";
import { BlobClient, BlobServiceClient, BlobUploadCommonResponse } from "@azure/storage-blob";

/**
 * CloudFileRepo implements the ICloudFileRepo interface and provides methods to
 * upload, download, and check files in an Azure Blob Storage account.
 */
export class CloudFileRepo implements ICloudFileRepo {
  /** Client ID for authentication. */
  private clientId: string;
  /** Secret key for authentication. */
  private secret: string;
  /** Storage account name. */
  private storageAccount: string;
  /** Blob container name. */
  private containerName: string;
  /** Entra ID (Azure AD) tenant ID. */
  private tenantId: string;

  /**
   * Initializes a new instance of the CloudFileRepo class.
   */
  constructor(tenantId: string, clientId: string, secret: string, storageAccount: string, containerName: string) {
    this.tenantId = tenantId;
    this.clientId = clientId;
    this.secret = secret;
    this.storageAccount = storageAccount;
    this.containerName = containerName;
  }

  /**
   * Creates a client credential from the tenant ID, client ID, and secret.
   *
   * @returns {ClientSecretCredential} The client secret credential.
   */
  private getClientCredential(): ClientSecretCredential {
    return new ClientSecretCredential(this.tenantId, this.clientId, this.secret);
  }

  public getContainerName(): string {
    return this.containerName;
  }

  /**
   * Uploads a file to the specified cloud storage bucket and directory.
   *
   * @param content - The binary content of the file to upload.
   * @param bucket - The name of the storage bucket.
   * @param directoryName - The name of the directory within the bucket.
   * @param fileName - The name of the file to upload.
   * @param referenceId - The reference ID stored as blob metadata.
   * @returns A promise that resolves to the blob path of the uploaded file.
   * @throws An error if the file upload fails.
   */
  public async doFileUpload(content: Buffer, bucket: string, directoryName: string, fileName: string, referenceId: string): Promise<string> {
    try {
      if (!content)
        throw new Error("File content is missing");
      const blobServiceClient = new BlobServiceClient(`https://${this.storageAccount}.blob.core.windows.net/`, this.getClientCredential());
      // Get a reference to the container client and create the container if needed
      const containerClient = blobServiceClient.getContainerClient(this.containerName);
      if (!(await containerClient.exists())) {
        await containerClient.create();
      }
      const containerPath: string = `${bucket}/${directoryName}/${fileName}`;
      // Get a reference to the block blob client
      const blockBlobClient = containerClient.getBlockBlobClient(containerPath);
      // Perform upload
      const uploadResponse: BlobUploadCommonResponse = await blockBlobClient.upload(content, content.byteLength);
      if (uploadResponse._response.status !== 201) {
        throw new Error("File upload failed");
      }
      // Set the reference ID and type as blob metadata
      await blockBlobClient.setMetadata({ referenceId: referenceId, type: bucket });
      return containerPath;
    } catch (error) {
      throw new Error(`File upload failed: ${(error as Error).message}`);
    }
  }

  /**
   * Retrieves the binary content of a file from Azure Blob Storage.
   *
   * @param bucket - The name of the bucket where the file is stored.
   * @param directoryName - The name of the directory within the bucket.
   * @param fileName - The name of the file to retrieve.
   * @returns A promise that resolves to a Buffer containing the binary content of the file.
   * @throws Will throw an error if the file is not found.
   */
  public async getFileBinary(bucket: string, directoryName: string, fileName: string): Promise<Buffer> {
    const blobClient: BlobClient = await this.getBlobClient(bucket, directoryName, fileName);
    if (await blobClient.exists()) {
      const content: Buffer = await blobClient.downloadToBuffer();
      return content;
    }
    throw new Error("File not found");
  }

  /**
   * Checks if a file exists in Azure Blob Storage.
   *
   * @param bucket - The name of the bucket where the file is stored.
   * @param directoryName - The name of the directory within the bucket.
   * @param fileName - The name of the file to check.
   * @returns A promise that resolves to a boolean indicating whether the file exists.
   */
  public async fileExists(bucket: string, directoryName: string, fileName: string): Promise<boolean> {
    const blobClient: BlobClient = await this.getBlobClient(bucket, directoryName, fileName);
    return await blobClient.exists();
  }

  /**
   * Retrieves a BlobClient for the given bucket, directory, and file name.
   *
   * @param bucket - The name of the bucket where the file is stored.
   * @param directoryName - The name of the directory within the bucket.
   * @param fileName - The name of the file.
   * @returns A promise that resolves to a BlobClient object.
   */
  private async getBlobClient(bucket: string, directoryName: string, fileName: string): Promise<BlobClient> {
    const blobServiceClient = new BlobServiceClient(`https://${this.storageAccount}.blob.core.windows.net/`, this.getClientCredential());
    const containerClient = blobServiceClient.getContainerClient(this.containerName);
    if (!(await containerClient.exists())) {
      await containerClient.create();
    }
    const containerPath: string = `${bucket}/${directoryName}/${fileName}`;
    return containerClient.getBlobClient(containerPath);
  }
}
So, for downloading or uploading a file within a CAP service, you can use it like this:
const repo: ICloudFileRepo = new CloudFileRepo(
  "YOUR_TENANT_ID",
  "YOUR_CLIENT_ID",
  "YOUR_CLIENT_SECRET",
  "YOUR_STORAGE_ACCOUNT_NAME",
  "TARGET_CONTAINER_NAME");
const content = Buffer.from("test");
const uploadedPath: string = await repo.doFileUpload(content, "Rootdirectory", "Subdirectory", "File.txt", "YourReferenceID");
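Downloading and checking for existence work the same way. Here is a small sketch using the same repo instance:

// Check whether the uploaded file exists and read it back
const exists: boolean = await repo.fileExists("Rootdirectory", "Subdirectory", "File.txt");
if (exists) {
  const downloaded: Buffer = await repo.getFileBinary("Rootdirectory", "Subdirectory", "File.txt");
  console.log(downloaded.toString()); // "test"
}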
Isn't that easy? You now have a nice, small toolkit for interacting with Azure Blob Storage, without using any destinations.
Advantages / Disadvantages
So you may ask me: why not just use the destination variant?
With destinations, the problem is that you cannot directly grant external users access to the storage account.
Let's say you have a tester who wants to check whether your generated document is ready. You would have to provide an extra download endpoint just for this specific test case.
In my opinion, this creates a small data leak, because everyone who has access to this endpoint can download these documents. With direct access, you can instead provide limited access through a SAS token or a dedicated credential. You also stay independent of the provider: you can easily switch via some sort of environment variables, or set the region of the blob storage (maybe due to a governance policy).
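To illustrate the SAS token idea, here is a minimal sketch of how a time-limited, read-only download URL could be generated with the service principal credential via a user delegation key. The placeholder values are assumptions, and the principal needs a role that permits requesting user delegation keys (e.g., Storage Blob Data Contributor):

import { ClientSecretCredential } from "@azure/identity";
import { BlobSASPermissions, BlobServiceClient, generateBlobSASQueryParameters } from "@azure/storage-blob";

// Hypothetical placeholder values - replace them with your own settings.
const account = "YOUR_STORAGE_ACCOUNT";
const credential = new ClientSecretCredential("YOUR_TENANT_ID", "YOUR_CLIENT_ID", "YOUR_CLIENT_SECRET");
const serviceClient = new BlobServiceClient(`https://${account}.blob.core.windows.net/`, credential);

// A user delegation key lets us sign a SAS with Entra ID credentials instead of the account key
const startsOn = new Date();
const expiresOn = new Date(startsOn.getTime() + 60 * 60 * 1000); // valid for one hour
const delegationKey = await serviceClient.getUserDelegationKey(startsOn, expiresOn);

// Read-only SAS scoped to a single blob
const sas = generateBlobSASQueryParameters({
  containerName: "TARGET_CONTAINER_NAME",
  blobName: "Rootdirectory/Subdirectory/File.txt",
  permissions: BlobSASPermissions.parse("r"),
  startsOn,
  expiresOn
}, delegationKey, account);

const downloadUrl = `https://${account}.blob.core.windows.net/TARGET_CONTAINER_NAME/Rootdirectory/Subdirectory/File.txt?${sas.toString()}`;
console.log(downloadUrl); // hand this URL to the tester; it expires automatically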
The next advantage is that you do not depend on any other IT department. You stay very flexible about changing or modifying the settings because you maintain them in your own application.
Final Word
Yes, I am not a professional in SAP CAP development, but I can write some sort of TypeScript. I come from a world that is open-minded and not bound to a specific provider, so I think I bring my very own spirit, new thoughts, and fresh impulses to this SAP world. ;) What do you think? How can I improve my solution? What are your thoughts about this approach? Do you have any recommendations?