No one likes waste and yet with traditional web projects we are doing just that. Our infrastructures are often designed to not only serve up our sites but also to run all manner of functions in the background. This demand on the processing power of the infrastructure is something we have had to put up with. However, those clever bods at Microsoft have a solution that can help remove those functions from our infrastructures to make them more efficient.
Azure Functions is a Function as a Service (FaaS) offering, a type of serverless architecture. More specifically, it is an example of serverless computing, which enables developers to create and run small pieces of code without having to build and host a full application. It’s important to note here that this doesn’t mean there is no computer or server involved. What it does mean is that this model sits a level above Platform as a Service (PaaS): the application framework in which your functions run is provisioned and managed by the cloud provider.
There are, of course, a number of different FaaS providers, but for this post I wanted to delve into our recent experience at MMT Digital with Azure Functions and Azure Table Storage when implementing a serverless API for one of our Kentico projects.
To give you some context, Azure Functions can be triggered in several ways:
- Via Blob/Queue Storage Event
- Via Azure Cosmos DB
- Via SaaS Event Processing, e.g. by saving a file in OneDrive
- Via HTTP Webhook URLs
- Via Timer-based processing
- Via Real-time Stream / Real-time Bot Messaging
The solution for this requirement was originally pitched as an API but, on closer inspection, the size of the logic meant it was better suited to the HTTP/webhook approach of an Azure Function. The immediate benefit was a reduction in the required infrastructure and running costs, a huge advantage for the client.
(For reference, Azure Functions pricing follows a per-call model and the first million calls per month are free!)
To make best use of Azure Functions, our approach for this solution was to separate out the logic and data access into class libraries and have the Azure Function code just make the initiating call. This allowed us to introduce unit tests against the logic code and gave us the flexibility to reuse the logic, or deliver it by an alternative mechanism, if we decided we no longer needed the Azure Function.
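As a rough sketch of that structure (the function name, route, and `ReferenceDataService` are hypothetical, not our actual project code), the HTTP-triggered function body does little more than delegate to the class library:

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class GetReferenceData
{
    [FunctionName("GetReferenceData")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "reference/{key}")]
        HttpRequest req,
        string key,
        ILogger log)
    {
        log.LogInformation("Reference data requested for {Key}", key);

        // All of the real work lives in the class library, not in the
        // function, so it can be unit tested and reused elsewhere.
        var result = new ReferenceDataService().GetByKey(key);

        return result == null
            ? (IActionResult)new NotFoundResult()
            : new OkObjectResult(result);
    }
}
```

Because `ReferenceDataService` sits in a separate class library, it can be exercised by unit tests without spinning up the Functions runtime at all.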
The data for this solution was a very simple, structured set of reference data. We needed to find somewhere to store this data so that it could be accessed in the optimal way.
Like most Azure Functions, ours is lightweight. The best practice is to ensure that if it needs access to a database then that access should either be restricted as much as possible or the data it requires should be segregated from the main database. This enables us to tighten up security around the function and reduce the impact of any potential breaches (always a bonus in this new GDPR age!).
The ideal solution for this was Azure Table Storage, a NoSQL key-value store that facilitates rapid development with semi-structured data. We could easily access the data from our C# code and review the reference data through Azure Storage Explorer.
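A minimal sketch of reading one row of reference data, assuming the WindowsAzure.Storage SDK that was current at the time; the table name, entity shape, and helper class are illustrative rather than our actual code:

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// An entity in Table Storage is identified by its PartitionKey and
// RowKey (both inherited from TableEntity); other properties are free-form.
public class ReferenceEntity : TableEntity
{
    public string Value { get; set; }
}

public static class ReferenceStore
{
    public static async Task<ReferenceEntity> GetAsync(
        string connectionString, string partitionKey, string rowKey)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var table = account.CreateCloudTableClient()
                           .GetTableReference("ReferenceData");

        // A point lookup on PartitionKey + RowKey is the cheapest and
        // fastest query Table Storage can serve.
        var result = await table.ExecuteAsync(
            TableOperation.Retrieve<ReferenceEntity>(partitionKey, rowKey));

        return (ReferenceEntity)result.Result; // null when the row is missing
    }
}
```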
Why would we recommend Azure Functions?
The experience overall has been very positive and there are some real benefits to using Azure Functions.
For a start, having smaller, manageable, focused pieces of code triggered from the required points is very efficient from both a development and a performance perspective. Specifically, we can use Azure Functions alongside Kentico to augment or enhance a site by moving repetitive or small pieces of work into functions, thereby allowing Kentico to focus on serving user content requests or processing e-commerce transactions.
For example, we can use an Azure Function to trigger an image resizing process when a file is uploaded to Blob Storage (if Blob Storage is holding our Kentico Media Library files). The benefit here is that the processing is distributed across the cloud resource, allowing the consuming application to focus on serving visitor requests.
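The image-resizing idea can be sketched with the WebJobs blob bindings; the container names, thumbnail width, and the choice of the ImageSharp library are assumptions for illustration, not a prescribed implementation:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Formats.Jpeg;
using SixLabors.ImageSharp.Processing;

public static class ResizeMediaImage
{
    [FunctionName("ResizeMediaImage")]
    public static void Run(
        // Fires whenever a new blob lands in the "media" container...
        [BlobTrigger("media/{name}")] Stream input,
        // ...and writes the result to a parallel thumbnails container.
        [Blob("media-thumbnails/{name}", FileAccess.Write)] Stream output)
    {
        using (var image = Image.Load(input))
        {
            image.Mutate(x => x.Resize(200, 0)); // height 0 preserves aspect ratio
            image.Save(output, new JpegEncoder());
        }
    }
}
```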
As a result, for those larger or legacy sites, we can use Azure Functions to off-load processing into the cloud to give us great boosts in both performance and scalability. In the case of Kentico, we can leave the application to focus on the core functionality whilst the visitor's browser interacts with our Azure Functions directly for the necessary functionality.
And finally, the running costs for an Azure Function are very low and they are highly scalable – all of which is managed for you by Azure. On top of this, moving this code out to Azure Functions has a positive impact on our Kentico websites and applications as fewer resources are required to scale them, meaning lower costs!
What should you be aware of?
At the time of writing, we were not able to find a way to IP-restrict an Azure Function, something that is very useful for pre-production environments. However, it is possible to restrict access with a function key so that the endpoint remains secure.
This application currently uses two Azure Functions, which are easy to maintain and manage. If we increased that number, the co-ordination effort would also increase and a solid process would be required to co-ordinate the suite of functions. Azure API Management could help here if the functions are HTTP triggered. Azure Functions Proxies are a recent feature that allows proxy endpoints to be created, putting multiple HTTP-triggered functions behind a single endpoint so that consuming them is much simpler. This is something we are going to delve into further.
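A proxies.json along these lines (the host name and routes are hypothetical) is enough to surface HTTP-triggered functions behind a single, friendlier endpoint:

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "referenceData": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/reference/{key}"
      },
      "backendUri": "https://my-functions.azurewebsites.net/api/reference/{key}"
    }
  }
}
```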
Azure Functions could create a hidden-functionality situation. Several years ago, database triggers fell out of favour because an insert into a table could set off lots of hidden logic. Azure Functions may create the same situation over time if not managed carefully.
How could you use them?
There are many different possible use cases for Azure Functions, especially when you consider the different methods for triggering a function. Here are a few ideas to get you started:
- Timer functions to trigger scheduled jobs, e.g. making an API call, deleting old logs, etc.
- Simple API replacements with HTTP-triggered functions; a microservice architecture is possible via a single gateway if using Azure Functions Proxies
- Image resizing on blob storage upload
- Staging record processing with Azure Cosmos DB, validating or transforming data being imported
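To illustrate the first idea in the list, a timer-triggered function takes a six-field CRON expression; the schedule, retention window, and `LogMaintenanceService` below are illustrative assumptions:

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class NightlyLogCleanup
{
    [FunctionName("NightlyLogCleanup")]
    public static void Run(
        // Six-field CRON: second minute hour day month day-of-week,
        // so this fires every day at 02:00.
        [TimerTrigger("0 0 2 * * *")] TimerInfo timer,
        ILogger log)
    {
        var cutoff = DateTime.UtcNow.AddDays(-30);
        log.LogInformation("Purging logs older than {Cutoff}", cutoff);

        // As with the HTTP example, the real work lives in a class library.
        new LogMaintenanceService().PurgeLogsOlderThan(cutoff);
    }
}
```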
Want to know more?