Recent Projects

Here are more details about some of the projects I've completed recently.

Defined Azure policies for governance and compliance

Building a presence in Azure can quickly become messy. Resources get created and forgotten about, projects are abandoned part-way through, multiple people do things in their own way with no standardisation. If left unchecked, you'll end up with a whole lot of random resources that you don't need but are paying for.

By defining a set of governance and compliance policies, the mess and cost can be limited.

Some common policies might include:

Naming standards

Define naming standards for your resources. For example, vm-exchange-prod-uks-01 would represent a VM used to run Exchange in the Production environment in the UK South region. The number on the end indicates there are multiple VMs used for this so the number would increment for each VM.
For resource groups, maybe rg-CustomerWebsite-DevTest-weu could represent a resource group for all resources related to the dev/test version of your customer website based in West Europe.

Different resource types have different restrictions on names, so a one-size-fits-all approach isn't possible. For example, storage account names must be lowercase with no hyphens, so they will require a different convention.
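As an illustration, a convention like the one above can be checked with a regular expression. This is a minimal sketch: the resource-type prefixes, environments, and region codes below are assumptions for the example, and real conventions (including the hyphen-free storage account pattern) would each need their own rule.

```python
import re

# Hypothetical convention: <type>-<workload>-<env>-<region>-<nn>
# e.g. vm-exchange-prod-uks-01
NAME_PATTERN = re.compile(
    r"^(vm|sql|rg|app)"      # resource type prefix (assumed set)
    r"-[a-z0-9]+"            # workload name
    r"-(prod|dev|test)"      # environment (assumed set)
    r"-(uks|ukw|weu|neu)"    # region short code (assumed set)
    r"-\d{2}$"               # two-digit instance number
)

def is_valid_name(name: str) -> bool:
    """Return True if the resource name matches the convention."""
    return NAME_PATTERN.match(name.lower()) is not None

print(is_valid_name("vm-exchange-prod-uks-01"))
print(is_valid_name("MyTestVM"))
```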

Restricting by Policy

It's really easy to spin up a virtual machine in Azure. It takes minutes. But what if someone creates a hugely powerful VM without realising how expensive it will be? Using Azure Policies, you can restrict which VM sizes can be deployed, to avoid the big ones.

Some resources might be more expensive in certain regions. Latency might be higher. Or you might need to keep all of your data within the UK for data sovereignty reasons. You can restrict deployments to a set of locations using Azure Policies.
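For illustration, a simplified Azure Policy definition restricting deployments to UK regions might look like the fragment below (Azure Policy definitions are JSON; the built-in "Allowed locations" policy is parameterised and more complete than this sketch):

```json
{
  "properties": {
    "displayName": "Allowed locations (simplified example)",
    "policyRule": {
      "if": {
        "not": {
          "field": "location",
          "in": ["uksouth", "ukwest"]
        }
      },
      "then": {
        "effect": "deny"
      }
    }
  }
}
```

A similar rule against `Microsoft.Compute/virtualMachines/sku.name` is how the "big VM" restriction mentioned above is typically expressed.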

Backup Policies

In general, anything that can be backed up, should be backed up. Especially if it can't easily be recreated.
Virtual machines, databases, files, app services, etc. should be backed up to either a Recovery Services Vault or a Backup Vault, depending upon the type of resource. Retention policies for these backups need to be defined.

Some resources cannot yet be backed up natively in Azure. For these resources, it might be recommended to deploy them using ARM templates/Infrastructure as Code, if possible, so they can be redeployed if accidentally deleted. However, this won't prevent data loss or configuration changes.

Tagging of resources

As your Azure footprint grows, you might lose track of the purpose of some of the resources, especially if multiple people/teams are deploying them. Tagging can be used to keep track of it all. For example, you could use the following tags at the resource group level:

  • Owner: who uses the resource
  • Owner email: the owner's email address, used for notifications
  • Cost centre: who is paying for the resource
  • Review date: when the resources should be reviewed to make sure they're still needed (resources often get created and abandoned, so need tidying up)
  • Project/application: which project or application uses these resources
  • Environment: are they test or production resources?
  • Created by: who created the resources
  • Created on: when the resources were created
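The mandatory-tag check this enables can be sketched as a simple set difference. This is an illustrative Python sketch, not the production compliance check; the tag names match the list above and the sample values are made up:

```python
# Hypothetical set of mandatory tags, matching the list above
REQUIRED_TAGS = {
    "Owner", "Owner email", "Cost centre", "Review date",
    "Project/application", "Environment", "Created by", "Created on",
}

def missing_tags(resource_tags: dict) -> set:
    """Return the mandatory tags not present on a resource group."""
    return REQUIRED_TAGS - resource_tags.keys()

# Example resource group with only some tags applied
rg_tags = {
    "Owner": "Jane Smith",
    "Owner email": "jane.smith@example.com",
    "Environment": "Production",
}
print(sorted(missing_tags(rg_tags)))
```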

I use a PowerShell script that runs once a month, finds the resources due to be reviewed by the end of the month, and sends an email to the resource owner (using the value of the 'Owner email' tag) asking whether the resources are still needed. Without this, you end up paying for resources unnecessarily - people tend to forget to tell you they're finished with something, especially when it's not their money!
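The selection logic in that script amounts to comparing each 'Review date' tag against the current month. The real script is PowerShell; this is a Python sketch of the same idea, with made-up resource group names and dates:

```python
from datetime import date

# Sample resource-group tag data (names, dates and emails are made up)
resource_groups = [
    {"name": "rg-customerwebsite-dev-weu",
     "tags": {"Review date": "2024-06-30", "Owner email": "jane@example.com"}},
    {"name": "rg-exchange-prod-uks",
     "tags": {"Review date": "2024-09-30", "Owner email": "bob@example.com"}},
]

def due_for_review(groups, today):
    """Return (name, owner email) pairs for groups whose review date
    falls in the current month or has already passed."""
    due = []
    for rg in groups:
        review = date.fromisoformat(rg["tags"]["Review date"])
        if (review.year, review.month) <= (today.year, today.month):
            due.append((rg["name"], rg["tags"]["Owner email"]))
    return due

# The real script emails each owner; here we just print the list.
for name, email in due_for_review(resource_groups, date(2024, 6, 1)):
    print(f"{name} -> {email}")
```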

Compliance Reports

Azure comes with a set of compliance reports that just need to be configured/tweaked. I configured some to check that all VMs are included in backup policies and that all resource groups and VMs have the required set of mandatory tags assigned.


Migrated VM patching from Azure Automation to Azure Update Manager

Azure allows virtual machines to be automatically patched on a schedule of your choosing. Originally, this was done with Azure Automation Accounts which came with PowerShell scripts to automatically power on any VMs that are due to be patched but are powered off, allowing them to be patched. These VMs were then automatically powered off again after patching.

In 2024, Microsoft replaced this method with Azure Update Manager. While this came with new pre- and post-event handlers, it did not include scripts or a way to power the VMs on and off. I needed to create new PowerShell scripts to handle this, as the previous scripts worked differently and couldn't be reused. Once I'd done that, I could migrate the VMs to the new Update Manager method for patching.
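The core of the pre/post-event logic is: start any VM in the patch group that is powered off, remember which ones were started, and power off only those afterwards. The real scripts are PowerShell using the Az cmdlets; this Python sketch just shows the bookkeeping, with made-up VM names:

```python
def pre_event(vms: dict) -> list:
    """vms maps VM name -> power state. Start any VM that isn't
    running and return the names this run powered on, so the
    post-event step knows which to power off again."""
    started = []
    for name, state in vms.items():
        if state != "running":
            vms[name] = "running"    # real script: Start-AzVM
            started.append(name)
    return started

def post_event(vms: dict, started: list) -> None:
    """Power off only the VMs the pre-event step started, leaving
    VMs that were already running untouched."""
    for name in started:
        vms[name] = "deallocated"    # real script: Stop-AzVM

fleet = {"vm-exchange-prod-uks-01": "running",
         "vm-app-test-uks-01": "deallocated"}
started = pre_event(fleet)
print(started)
post_event(fleet, started)
print(fleet)
```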


Set up an Azure Data Factory to perform nightly data extracts for web queries

An external company needed to query data held in an internal database. Querying directly was not an option.

I set up an Azure SQL database to hold an extract of the data needed. I set up an Azure Data Factory to pull a subset of data from the internal production database on a nightly basis and save it into the Azure SQL database. I then created a simple .Net web form that could be used by the external company to query the data in the Azure SQL database. This form was locked down by IP address and AD authentication.


Developed an Azure App Service to perform "mail merge" type activities

A system used for customer communications would produce letters. Sometimes these letters needed to be modified before being sent out but the system couldn't do that. A solution was needed.

I developed a .Net application in Azure App Services. The customer system would send a POST web request to the .Net app containing a series of key/value pairs. One of these pairs was a letter template ID. The app connected to a SharePoint document library and retrieved a Word document matching the template ID.

The app then opened the document and replaced tags in it with the corresponding key/value pairs. It then saved the amended document to another SharePoint library. The app then returned the URL of the saved document back to the customer system where it was added to the customer record.
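The tag-replacement step is essentially a templating pass over the document text. This is an illustrative Python sketch of that step only; the `{{Key}}` tag syntax is an assumption for the example (the real app works on Word documents via .Net):

```python
import re

def fill_template(template: str, values: dict) -> str:
    """Replace {{Key}} tags in the letter text with the matching
    key/value pairs from the request; unmatched tags are left in
    place so they can be spotted in the output."""
    def replace(match):
        key = match.group(1)
        return str(values.get(key, match.group(0)))
    return re.sub(r"\{\{(\w+)\}\}", replace, template)

letter = "Dear {{FirstName}}, your reference is {{Reference}}."
print(fill_template(letter, {"FirstName": "Sam", "Reference": "AB123"}))
```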


Created a Power Automate workflow to strip and unzip attachments from emails

Automated emails containing attachments were received daily into a shared mailbox.

I created a Power Automate workflow to monitor the mailbox for emails from a specific address and extract the attachments. The workflow then uploaded each attachment to an SFTP server. If an attachment was a ZIP file, its contents were extracted and uploaded instead of the ZIP file itself.
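The unzip-or-pass-through decision can be sketched like this. The real workflow is built from Power Automate actions; this Python version just demonstrates the branching logic on an in-memory ZIP:

```python
import io
import zipfile

def files_to_upload(filename: str, data: bytes):
    """Yield (name, bytes) pairs to send to the SFTP server: ZIP
    attachments are expanded into their members, anything else is
    passed through unchanged."""
    if filename.lower().endswith(".zip"):
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            for member in zf.namelist():
                yield member, zf.read(member)
    else:
        yield filename, data

# Build a small in-memory ZIP to demonstrate the extraction branch
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("report.csv", "a,b\n1,2\n")
print([name for name, _ in files_to_upload("daily.zip", buf.getvalue())])
```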


Converted classic ASP forms to Power Apps/Power Automate

As part of the Cloud Migration Project, one of the systems I migrated and modernised was an access request and approval system. This system was written in Classic ASP, with its data stored in an Oracle database.

I migrated this system to a set of Power App forms and Power Automate workflows with SharePoint as the data store. As well as modernising the technology and allowing servers running unsupported operating systems to be decommissioned, it also allowed more non-developer people to be able to support the system going forward.


Created PowerShell scripts to automate user account creation/deletion

User accounts created manually by a Service Desk can be created inconsistently - something I've seen at several companies. Accounts for leavers often remain active and forgotten about.

To help standardise account creation, I developed a set of forms for managers to complete. Data from the forms was stored in SharePoint lists. A PowerShell script checked the lists every 15 minutes for new starters. It created the user accounts and mailboxes in a consistent manner.

A second script checked once a day for completed leaver forms and processed those, archiving user files where required and granting managers access to the archive.

Accounts were created with expiry dates. On a monthly basis, another script would look for accounts approaching their expiry date and email the line manager asking them to raise a request to extend or disable the account. Yet another script would run monthly to catch any accounts where the manager had taken no action and the expiry date had passed.
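The two monthly checks boil down to comparing expiry dates against today and against a warning window. The real scripts are PowerShell against AD; this Python sketch of the date logic uses made-up accounts and a 30-day window (the actual window is an assumption):

```python
from datetime import date, timedelta

# Sample accounts; usernames, dates and addresses are made up.
accounts = [
    {"user": "jdoe", "expires": date(2024, 7, 10),
     "manager": "asmith@example.com"},
    {"user": "mjones", "expires": date(2024, 12, 1),
     "manager": "bbrown@example.com"},
]

def expiring_soon(accounts, today, window_days=30):
    """Accounts whose expiry date falls within the next window_days;
    the real script emails each line manager about these."""
    cutoff = today + timedelta(days=window_days)
    return [a["user"] for a in accounts if today <= a["expires"] <= cutoff]

def already_expired(accounts, today):
    """Accounts past their expiry date with no action taken; the
    real script handles these in its monthly catch-up run."""
    return [a["user"] for a in accounts if a["expires"] < today]

print(expiring_soon(accounts, date(2024, 6, 15)))
print(already_expired(accounts, date(2024, 8, 1)))
```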

In 2024, the process was extended. New users, leavers and changes to user details now came from the HR system for the majority of users. An additional PowerShell script was created to download an HR extract of changes from an SFTP server and process that file as well as the SharePoint lists.
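Processing the HR extract amounts to parsing the file and routing each row to the matching starter/leaver/change process. This Python sketch assumes a CSV extract with an `action` column; the column names and sample rows are made up for the example:

```python
import csv
import io

# A cut-down HR extract; column names and rows are assumptions.
hr_extract = """action,employee_id,first_name,last_name
starter,1001,Amy,Taylor
leaver,1002,Ben,Clark
change,1003,Cara,Davis
"""

def parse_changes(text: str) -> dict:
    """Group HR rows by action so each batch can be routed to the
    matching starter/leaver/change process."""
    grouped = {"starter": [], "leaver": [], "change": []}
    for row in csv.DictReader(io.StringIO(text)):
        grouped[row["action"]].append(row)
    return grouped

changes = parse_changes(hr_extract)
print({action: len(rows) for action, rows in changes.items()})
```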


Developed a .Net app to move files from on-premise locations to SharePoint Online

Various systems produced reports and CSV files. Some of these legacy systems could only output the files to network shares or FTP servers. The modern way of working was to put files into SharePoint. A solution was needed to cater for systems that couldn't save files to SharePoint.

I wrote a .Net console app to upload files from the network shares to SharePoint. The app queried a config list in SharePoint that contained entries for each set of files being uploaded. Each entry included:

  • Source network share
  • Destination SharePoint site URL and library name
  • Optional filename prefix
  • Optional file extension
  • Whether the source files should be deleted or archived (and, if archived, to where and for how long)
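One entry in that config list can be modelled as a small record plus a matching rule. This Python sketch is illustrative only; the field names, share path, and SharePoint URL are assumptions, not the app's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TransferEntry:
    """One row of the (hypothetical) config list."""
    source_share: str
    site_url: str
    library: str
    filename_prefix: Optional[str] = None
    file_extension: Optional[str] = None
    archive_path: Optional[str] = None   # None means delete the source
    archive_days: int = 0

    def matches(self, filename: str) -> bool:
        """Decide whether a file found in the share belongs to this entry."""
        if self.filename_prefix and not filename.startswith(self.filename_prefix):
            return False
        if self.file_extension and not filename.endswith(self.file_extension):
            return False
        return True

entry = TransferEntry(
    source_share=r"\\fileserver\reports",                      # made-up path
    site_url="https://contoso.sharepoint.com/sites/Finance",   # made-up URL
    library="Reports",
    filename_prefix="daily_",
    file_extension=".csv",
)
print(entry.matches("daily_sales.csv"), entry.matches("summary.xlsx"))
```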

This allowed users to self-serve without needing me to set up the transfer.