In many environments, it isn’t really feasible to take build-time or even run-time dependencies on public repositories such as npmjs.com, NuGet, or Maven Central, generally due to concerns about security, performance, availability, or all of the above.
Historically, the story for Go has been slightly different from that of these other platforms: the ecosystem has preferred to fetch dependencies directly from their source, often GitHub, rather than from a centralised registry of Go packages.
This has come up a couple of times in client work, so I thought I’d write down how I go about doing it. This technique can be useful when you have environment-specific lists of resources, such as users, that need to be created.
Consider a file users.txt:
```
Luke
Bo
Daisy
Jesse
```

This file contains a list of objects we wish to create. In this example they will be of type `aws_iam_user`, but the same technique can be applied to any kind of resource or provider.
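Reading the file and creating one resource per line might look something like the following sketch, which assumes Terraform 0.12.6 or later (the version that introduced `for_each` on resources); the file path and resource name are illustrative:

```hcl
locals {
  # Split users.txt into a list of names, dropping any blank lines
  users = [for u in split("\n", file("${path.module}/users.txt")) : u if u != ""]
}

resource "aws_iam_user" "this" {
  # for_each requires a set or map, so convert the list to a set
  for_each = toset(local.users)
  name     = each.key
}
```

Using `for_each` rather than `count` means that adding or removing a name from the middle of the file affects only that one user, rather than re-indexing everything after it.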
Installing Prerequisites

I’m not a frequent user of Windows, but I understand that getting dependencies installed for local development can sometimes be a bit of a pain. I’m using an Azure VM, but these instructions should work on a regular Windows 10 installation. Since I’m not a “Windows Insider”, I followed the manual steps here to get WSL installed, then upgraded to WSL2. The steps are reproduced here for convenience:
Setting up WSL2

Enable WSL
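Per Microsoft’s manual WSL installation docs, the commands are roughly as follows (run from an elevated PowerShell prompt; a restart is needed before the final step):

```powershell
# Enable the Windows Subsystem for Linux feature
dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart

# Enable the Virtual Machine Platform feature (required for WSL2)
dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart

# After restarting, make WSL2 the default version for new distros
wsl --set-default-version 2
```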
One of the principles of the twelve-factor app methodology is strict separation between code and config, where config means everything that is likely to vary between deployments, and code is everything that doesn’t. Historically, this was not a good fit for .NET Framework applications, which relied on tools such as web.config transformation and Slow Cheetah to apply build-time transformations to application configuration files. These transformations are based on environment-specific config files stored alongside the application code in the source repo.
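For readers unfamiliar with the technique, a web.config transform file (e.g. `Web.Release.config`) uses XDT attributes to patch the base `Web.config` at build time; the key and value below are illustrative:

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- Overwrite the value of the matching appSettings key for this build configuration -->
    <add key="ApiUrl" value="https://api.example.com"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>
```

The point is that the environment-specific value is baked in at build time, which is exactly the coupling of code and config that twelve-factor warns against.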
There are many tutorials on the web that describe more or less automated ways of deploying packaged software to Azure Web Apps. For this particular one I’ve decided to use the popular Content Management System WordPress.
What many such applications have in common is that there is an initial installation procedure, consisting of copying files to a web host and configuring database access, followed by a web-based graphical installer that sets up all of the application configuration.
There are a couple of different ways to store secret variables in an Azure Pipeline. Secrets that are only needed by one pipeline can be created at that scope using the web UI:
Creating a pipeline-scoped secret variable
Secrets that are used by more than one pipeline can be added to a variable group:

Creating a secret variable in a variable group
Variable groups can also be linked to an Azure Key Vault.
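Once defined, a variable group is referenced from the pipeline YAML. Note that secret variables are not exposed to scripts automatically; they have to be mapped explicitly into a step’s environment. The group and variable names below are illustrative:

```yaml
variables:
- group: my-variable-group   # assumed to contain a secret variable 'mySecret'

steps:
- script: echo "The secret has length ${#MY_SECRET}"
  env:
    MY_SECRET: $(mySecret)   # secrets must be mapped in explicitly
```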
A short post which might be of use to some, as it took me a while to figure it out.
I’ve been making a few changes to this site lately, one of which was to move from having the images remotely hosted in AWS S3 to having them locally in the repo. This was prompted by the availability of the Hugo page bundles feature, which I think was introduced several years ago without me noticing.
For users migrating from the “Classic” VSTS/Azure DevOps release experience, it is not entirely obvious how to set up what used to be known as Pre-deployment approvals as part of a multi-stage YAML pipeline.
Pre-deployment approvals in a classic release pipeline
The documentation about this is rather unclear, not least because it mixes together concepts from the “Classic” Release Management experience with concepts from the multi-stage YAML experience.
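The short version is that approvals now belong to Environments rather than to the pipeline itself: you create an approval check on the environment in the web UI, and any deployment job that targets that environment will pause and wait for approval. A sketch, with illustrative stage and environment names:

```yaml
stages:
- stage: Production
  jobs:
  - deployment: DeployWeb
    # Approval checks are configured on this environment in the web UI,
    # not anywhere in the YAML itself
    environment: production
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploying..."
```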
In the context of Azure Network Security Groups, it’s often useful to be able to specify security rules that only apply in certain environments. For example, we might have some kind of load testing tool that should only be permitted to connect to our testing environment, or we might want to restrict our public-facing load balancer so that it is only able to connect to our production environment.
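One way to express such an environment-specific rule is with the `condition` property on an ARM template resource, so that the rule is only deployed at all in the matching environment. The parameter name, addresses, and priorities below are illustrative:

```json
{
  "type": "Microsoft.Network/networkSecurityGroups/securityRules",
  "apiVersion": "2020-05-01",
  "name": "my-nsg/allow-load-test",
  "condition": "[equals(parameters('environment'), 'test')]",
  "properties": {
    "priority": 200,
    "direction": "Inbound",
    "access": "Allow",
    "protocol": "Tcp",
    "sourceAddressPrefix": "10.0.100.0/24",
    "sourcePortRange": "*",
    "destinationAddressPrefix": "*",
    "destinationPortRange": "443"
  }
}
```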
I’ve long been of the opinion that, when faced with complicated code of uncertain semantics (and ARM templates for networking certainly tick both of those boxes), a good way to understand the behaviour of the code is to write tests.
Prompted by some discussion on the SQL Community Slack, I thought I’d revisit this old post on the SSDT Team Blog which outlines how to filter specific objects from a dacpac deployment using the Schema Compare API.
In the past, I’ve used Ed Elliott’s filtering deployment contributor for this kind of thing, but in the interest of experimentation I thought I’d have a look at what comes “in the box”, not least because deployment contributors can, ironically, be a bit of a pain to deploy.