Azure Infrastructure Optimization

DevOps & Cloud
Azure, .NET, Python, SQL Server, ElasticSearch

Our client is one of the world’s leading manufacturers of professional photographic equipment. Its product line includes lighting solutions, lenses, photo accessories, and more, alongside several dedicated desktop and mobile apps and the firmware it maintains.

Business Challenge

Our client picked Software Country as its new partner to harmonize various software development, operational, and data analysis tasks. One of the challenges was to analyse and improve the existing Azure infrastructure. Among other things, we were asked to audit the current infrastructure and suggest ways to improve it and reduce costs.


Software Country’s team analysed the existing infrastructure and found the following issues:

  • Lack of documentation for the multiple parts of the infrastructure built by various third-party companies.
  • Lack of proper backups for all critical components, including some VMs hosted outside Azure.
  • The data analysis subsystem, built with Azure Data Factory and Azure Databricks, accounted for 50% of overall monthly Azure costs.
  • Large volumes of abandoned data in various Azure Storage accounts.
  • Inefficient configuration of the App Services in use.

We then focused on these issues one by one.

First, we created proper documentation and architecture diagrams on the customer’s Confluence portal.

Then, we consolidated backups for multiple components in an Azure Recovery Services vault and added backup coverage for critical VMs and for MS SQL and MySQL databases.
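For illustration, enrolling a VM into a Recovery Services vault can be done with a few Azure CLI commands. This is a minimal sketch, not the client’s actual setup; the resource group, vault, VM, and policy names below are placeholders:

```shell
# Create a Recovery Services vault to hold the backups.
az backup vault create \
  --resource-group photo-rg \
  --name photo-backup-vault \
  --location westeurope

# Enroll an existing VM under a backup policy defined in the vault.
az backup protection enable-for-vm \
  --resource-group photo-rg \
  --vault-name photo-backup-vault \
  --vm prod-sql-vm \
  --policy-name DefaultPolicy
```

Database workloads (such as SQL Server running inside a VM) have their own `az backup` workload commands and policies, so each critical component can be covered from the same vault.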

Removing the abandoned data and tuning the App Service configuration allowed us to reduce monthly costs by around 5%.
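Beyond one-off deletion, Azure Storage lifecycle management policies can keep abandoned data from accumulating again. The following is a hypothetical policy sketch (the container prefix and retention periods are illustrative, not the client’s values), which tiers stale block blobs to cool storage after 30 days and deletes them after 180:

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "expire-stale-blobs",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": ["blockBlob"],
          "prefixMatch": ["raw-exports/"]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "delete": { "daysAfterModificationGreaterThan": 180 }
          }
        }
      }
    }
  ]
}
```

Such a policy is applied once per storage account and then enforced automatically, so the savings persist without manual cleanup.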

The inefficiency issue chiefly concerned the data analysis subsystem. Azure Databricks was used together with Azure Data Factory under default configuration settings, so even quite simple jobs spawned powerful, and expensive, processing clusters. By sizing the Azure Databricks clusters to the customer’s actual needs and ensuring Azure Data Factory used them correctly, we significantly reduced costs here as well.
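As a sketch of what such tuning can look like (the workspace URL, runtime version, and node sizes below are placeholders, not the client’s configuration), the Data Factory linked service for Databricks can request a modest, autoscaling job cluster instead of accepting the defaults:

```json
{
  "name": "AzureDatabricksLinkedService",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
      "authentication": "MSI",
      "newClusterVersion": "13.3.x-scala2.12",
      "newClusterNodeType": "Standard_DS3_v2",
      "newClusterNumOfWorker": "2:4"
    }
  }
}
```

The `"2:4"` worker setting enables autoscaling between two and four workers, so simple jobs run on a small cluster while heavier ones can still scale up.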

Finally, we carefully analysed all the Python/Spark data processing scripts used by Databricks. Our team was able to improve processing speed for some scripts by up to ten times.
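The specific scripts are out of scope here, but a generic Python example illustrates the kind of rewrite that yields order-of-magnitude gains in data processing code: replacing a row-by-row nested scan with a keyed lookup. (This is a hypothetical sketch, not the client’s code.)

```python
def join_slow(orders, customers):
    # O(n*m): scans the whole customer list for every order.
    result = []
    for order_id, customer_id in orders:
        for cid, name in customers:
            if cid == customer_id:
                result.append((order_id, name))
                break
    return result


def join_fast(orders, customers):
    # O(n + m): build the lookup table once, then probe it per order.
    by_id = {cid: name for cid, name in customers}
    return [(oid, by_id[cid]) for oid, cid in orders if cid in by_id]


orders = [(1, "A"), (2, "B"), (3, "A")]
customers = [("A", "Alice"), ("B", "Bob")]
assert join_fast(orders, customers) == join_slow(orders, customers)
```

In Spark scripts the analogous wins come from pushing work into built-in column expressions and joins rather than per-row Python code, but the principle is the same: do the expensive lookup once, not once per row.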


We analysed, documented, and improved the existing infrastructure and implemented proper backup solutions for all critical infrastructure components. The most cost-inefficient components were improved. As a result, monthly Azure costs dropped by over 50%, which confirmed our client’s decision to partner with us.

Related Cases


Online Robotics Simulation Application

An educational robotics kit—a browser app simulating the whole process of building, programming and testing a robot.

Implementing LTI 1.3 for LMS

Implementation of the latest version of the standard, LTI 1.3, and in particular LTI Advantage.

OneRoster 1.2 Integration for LMS

A solution for passing grade information from the LMS to a student information system (SIS).