[[Blog/Home|Home]]
One of the tasks I've been working on recently in my job is standing up an exported Azure Resource Manager (ARM) template in other subscriptions. It's been a bit of a learning curve, but a fun one at that.
## ARM Templates
ARM templates are Azure's built-in export format for resources as they're built. They're available for every deployed resource, and with some tweaking of the files they can be made easily repeatable for recurring deployments of things like storage accounts.
This works well for templates up to 4 MB in size, but once you hit that magic number you run into a hard limit where it stops functioning. This is where Linked Templates come in.
Linked Templates are helpfully provided by the ARM template export from Azure. Using the templates requires somewhere Azure can dynamically resolve URIs from, after you provide just the master template and maybe a parameters file. According to the documentation, the recommended place is a publicly facing storage account if the data isn't sensitive, or one accessed by SAS token if your account sits behind a private endpoint.
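For context, a minimal sketch of what that deployment call can look like. The storage account, container, and resource group names here are hypothetical placeholders, and the `az` call is guarded so it only runs where the CLI is available and logged in:

```sh
# Hypothetical names - substitute your own.
STORAGE_ACCOUNT="mytemplatestore"
CONTAINER="arm-templates"
RESOURCE_GROUP="my-rg"

# Linked templates resolve relative to the master template's URI,
# so the exported folder needs to be uploaded as-is to the same container.
TEMPLATE_URI="https://${STORAGE_ACCOUNT}.blob.core.windows.net/${CONTAINER}/azuredeploy.json"

# Only attempt the deployment when the Azure CLI is present.
if command -v az >/dev/null 2>&1; then
  az deployment group create \
    --resource-group "$RESOURCE_GROUP" \
    --template-uri "$TEMPLATE_URI" \
    --parameters @azuredeploy.parameters.json
fi
```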
Now, what I've found is that the SAS token generates fine, but its scope depends on the command you're using: against the storage account with the storage account key, against the container itself, or against the blob itself. The permissions each SAS token provides differ accordingly, and access at the container level seemingly requires the use of a storage account key. Not a big issue if you're storing the key securely, in an Azure Key Vault for example, but even after all the assembly required to build out the template URL, a URL that returns the template fine via curl still refuses to pass successfully through the Azure CLI to Azure Resource Manager for deployment.
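As a sketch of the container-level SAS flow I'm describing (all names and the expiry are placeholders; the `az` calls need an authenticated session, so they're guarded):

```sh
STORAGE_ACCOUNT="mytemplatestore"
CONTAINER="arm-templates"
EXPIRY="2030-01-01T00:00:00Z"   # placeholder expiry

if command -v az >/dev/null 2>&1; then
  # Container-level SAS generation needs the account key...
  ACCOUNT_KEY=$(az storage account keys list \
    --account-name "$STORAGE_ACCOUNT" \
    --query "[0].value" -o tsv)

  # ...which ideally comes out of a Key Vault rather than a script.
  SAS_TOKEN=$(az storage container generate-sas \
    --account-name "$STORAGE_ACCOUNT" \
    --account-key "$ACCOUNT_KEY" \
    --name "$CONTAINER" \
    --permissions r \
    --expiry "$EXPIRY" \
    -o tsv)
fi

# The token is appended to the blob URI as a query string.
TEMPLATE_URI="https://${STORAGE_ACCOUNT}.blob.core.windows.net/${CONTAINER}/azuredeploy.json?${SAS_TOKEN}"
```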
This has been a challenging exercise all in all, and I've spent quite a few hours, both inside and outside of work, trying to crack the problem.
## Azure DevOps
Having used Jenkins as the DevOps pipeline for some time now, I found the configuration of Azure DevOps fairly painless. Setting up deployments to specific environments, building out different strategies for different operating systems (I've only read about this, haven't experimented yet), and triggering automatically on git push were all very easy to configure, and everything about the interface felt very responsive.
One of the things I've just learned is how to pass variables between tasks in the YAML file. This is achieved with a logging command:
```sh
echo "##vso[task.setvariable variable=ARMTemplateMasterURI]$INSERTYOURCODEORVARIABLEHERE"
```
This then sets the variable for use in the next task, where it can be referenced with the macro syntax
```sh
$(ARMTemplateMasterURI)
```
As with all things environment variables, the change only takes effect when the shell's environment is reloaded or forcibly reset, so the variable set with the above command isn't available within the current task. You are, however, still able to set and use ordinary shell variables immediately.
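Putting the two together, a sketch of a single script step (the URI value here is just a placeholder):

```sh
# Placeholder value - in the real pipeline this comes from the SAS assembly.
TEMPLATE_URI="https://example.blob.core.windows.net/templates/azuredeploy.json"

# Make the value available to *subsequent* tasks via the logging command.
echo "##vso[task.setvariable variable=ARMTemplateMasterURI]$TEMPLATE_URI"

# Within this same step $(ARMTemplateMasterURI) would not expand yet,
# but the plain shell variable is usable immediately.
echo "Template staged at $TEMPLATE_URI"
```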
Now, I still haven't solved this problem, but I believe the next step for my URI task is to use this function to set the variable, and then call the ARMTemplateDeployment task from the Azure pipeline. The URL, when called with curl, returns the full template no worries, but when passed to the Azure CLI it fails saying there's no blob at the address, indicating a problem with the template URL. I'll poke at it some more.
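For what it's worth, the comparison I've been running looks roughly like this (the URI is a placeholder, and the calls are guarded behind a CLI check). One thing worth double-checking in this situation is that the URI, SAS query string and all, stays quoted, since an unquoted `&` makes the shell split the URL:

```sh
# Placeholder - a full blob URI including the SAS query string.
FULL_URI="https://mytemplatestore.blob.core.windows.net/arm-templates/azuredeploy.json?sv=...&sig=..."

if command -v az >/dev/null 2>&1; then
  # curl happily returns the full template...
  curl -fsSL "$FULL_URI" >/dev/null

  # ...while the same URI handed to the CLI reports no blob at the address.
  # Keep the quotes: unquoted, the shell splits the SAS token on '&'.
  az deployment group create \
    --resource-group "my-rg" \
    --template-uri "$FULL_URI"
fi
```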
## Bicep
ARM templates seem fine for anything where you're happy with a straight export, but in some instances the template isn't ready to go; it's simply too much. So in parallel I've been working with Azure Bicep, which is an extension of the ARM functionality in Azure. The really wonderful thing here is that, because it's built on top of ARM, it can actually decompile ARM templates into *almost* usable Bicep code.
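The decompile step itself is a one-liner against the exported JSON (the file name is a placeholder, and the call is guarded so it only runs where the CLI exists):

```sh
EXPORTED_TEMPLATE="azuredeploy.json"   # the ARM export from the portal

if command -v az >/dev/null 2>&1; then
  # Produces azuredeploy.bicep alongside the JSON - the "almost usable" output.
  az bicep decompile --file "$EXPORTED_TEMPLATE"
fi
```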
Bicep code is so close to Terraform that I was able to slide into it easily. The project by Microsoft gives you idempotent code that isn't reliant on a state file, but rather on Azure Resource Manager itself. It checks for the existence of each resource named in your Bicep files and will modify it as required, or, if you switch it out of incremental mode, will ensure that every resource in the resource group matches what is written in the Bicep file (I haven't experimented with this function yet).
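That mode switch is a deployment-time flag rather than something in the Bicep file itself. A sketch with placeholder names (the `az` call is guarded):

```sh
RESOURCE_GROUP="my-rg"
MODE="Incremental"   # the default; "Complete" also removes resources not in the file

if command -v az >/dev/null 2>&1; then
  az deployment group create \
    --resource-group "$RESOURCE_GROUP" \
    --template-file main.bicep \
    --mode "$MODE"
fi
```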
With the ARM templates decompiled, I was able to begin refactoring the code with environment-specific variables, built from an environment prefix passed in at the start of each pipeline run. This meant each environment could reuse the same code but create uniquely named (environment-prefixed) resources. It enabled the creation of a key vault, the import (using the `existing` keyword) of secrets from that key vault, and the use of those secrets as part of the environment configuration.
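A sketch of how that prefixing can be driven from the pipeline side. The parameter and resource names here are hypothetical; the idea is that the Bicep file builds names like `'${envPrefix}-kv'` from a parameter the pipeline supplies:

```sh
ENV_PREFIX="dev"   # passed in at the start of each pipeline run

# The same code produces uniquely named resources per environment,
# e.g. a key vault named "dev-kv" in dev and "prod-kv" in prod.
KEYVAULT_NAME="${ENV_PREFIX}-kv"

if command -v az >/dev/null 2>&1; then
  az deployment group create \
    --resource-group "${ENV_PREFIX}-rg" \
    --template-file main.bicep \
    --parameters envPrefix="$ENV_PREFIX"
fi
```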
The way it all hangs together is very cool. I would say, and again I'm still fairly new to it, that my first impression is all super positive; however, the single-file, resource-group focus made it feel a little impractical when you're trying to build across multiple subscriptions. Where Terraform lets you structure a whole folder into multiple .tf files, a Bicep deployment is a single .bicep file, with some parameters maybe specified in another file. It makes for some massive files that take some reading through to find what you're looking for. For something like an Azure Data Factory, it can be quite a lot.
## Closing thoughts
So, in conclusion: if you don't have the infrastructure set up to support a big Terraform deployment, with state files kept in a remote location like Azure Blob Storage, I can wholeheartedly recommend Azure Bicep for smaller environments. It allows quick reproduction of environments, and when combined with Azure DevOps pipelines it feels great. I think ARM templates certainly have their place, but maybe more as a vehicle for extracting information from Azure, to be decompiled into much more usable Bicep code.
Hey, thanks for sticking with me through this. If you're maybe interested in receiving a newsletter from me at some point in the future, I'm gauging interest at the link below. In the newsletter, I'll go more in depth than in my blog articles, with links to git repositories with actual code, and practical security tips to help protect you, your family, and your business from security threats!
## [>>REGISTER YOUR INTEREST IN A NEWSLETTER<<](https://mcsec.ck.page/ecc5642a3a)