Deploy ARM Templates using Key Vault

I’ve been deploying a lot of resources to Azure, and one of my favorite practices is to create templates that I can reuse in other situations. But sometimes you need to increase security. That is where I started to leverage Key Vault to store my secrets.

Over the years, I have been using linked templates, where the main template passes parameters and Key Vault references to the linked template. The linked template itself is uploaded to a blob container in a storage account.

Let me give an example. Imagine the following scenario: you want to use the Azure portal to deploy ARM templates, and you want to use Key Vault to store the secrets. How can you do it? And your manager’s concern is: since the URI is public, how can you protect the storage container?

The way I have been handling this situation is by using a private blob container in conjunction with a shared access signature (SAS).
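To illustrate the Key Vault part, here is a minimal sketch of a deployment parameters file that pulls a secret from Key Vault instead of embedding it in clear text (the subscription ID, resource group, vault, and secret names below are hypothetical placeholders):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "adminPassword": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/<subscription-id>/resourceGroups/example-rg/providers/Microsoft.KeyVault/vaults/example-vault"
        },
        "secretName": "vmAdminPassword"
      }
    }
  }
}
```

Note that the vault must have the enabledForTemplateDeployment property set to true; otherwise Azure Resource Manager is not allowed to read the secret during deployment.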

Step 1 – Use the Azure File Copy task to copy your template to the storage account. Azure File Copy will give you a SAS token; you then use that token to deploy.

Step 2 – In the Azure Resource Group Deployment task, you have the option to override the template parameters.

Step 3 – You only need to append the SAS token to the final URL.
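Putting the steps together, a minimal sketch of the linked-template reference in the main template could look like this (templateBaseUrl and sasToken are hypothetical parameter names, and the sasToken value is assumed to include the leading “?”):

```json
{
  "type": "Microsoft.Resources/deployments",
  "apiVersion": "2017-05-10",
  "name": "linkedDeployment",
  "properties": {
    "mode": "Incremental",
    "templateLink": {
      "uri": "[concat(parameters('templateBaseUrl'), '/linked-template.json', parameters('sasToken'))]",
      "contentVersion": "1.0.0.0"
    }
  }
}
```

Because the container is private, the URI alone is useless to anyone without the SAS token, which is exactly the point.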

Cheers,

Marcos Nogueira
Azure MVP
azurecentric.com
Twitter: @mdnoga

Managed disk on Azure

You have probably already seen this when creating a virtual machine on Azure. After you enter basic information such as the name of the VM and choose its size, it is time to define and configure the VM’s settings. One of the first options is Use managed disks.

But what are managed disks? How do they work? What are the implications of using them?

First of all, managed disks abstract away the storage accounts that your virtual machine’s disks would otherwise use (see pictures below). When you select managed disks, you don’t have to set up or choose the storage account where those disks will be stored.

If you don’t want to use managed disks, you have to select the storage account yourself.

With managed disks, you only have to specify the size of the disk, and Azure manages it for you. You don’t have to worry about storage account limits, and you gain higher scalability: you can create up to 10,000 disks per region per subscription.
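As a rough sketch, a standalone managed disk can be declared as its own ARM resource — notice there is no storage account anywhere. The disk name and size here are just example values:

```json
{
  "type": "Microsoft.Compute/disks",
  "apiVersion": "2017-03-30",
  "name": "example-data-disk",
  "location": "[resourceGroup().location]",
  "sku": {
    "name": "Premium_LRS"
  },
  "properties": {
    "creationData": {
      "createOption": "Empty"
    },
    "diskSizeGB": 128
  }
}
```

You pick a size and a performance tier (Standard_LRS or Premium_LRS), and Azure takes care of the underlying storage placement.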

Managed disks also increase the resilience of your availability sets, by making sure that the disks of different VMs belong to storage units on different fault domains. In my experience, when you create storage accounts yourself, there is no guarantee that they will land on different fault domains. In that scenario, even if you use availability sets, you don’t avoid a single point of failure.

If you prefer storage accounts because they let you control access to the VHDs, note that with managed disks you can use role-based access control (RBAC) as well, assigning permissions on a managed disk to one or more users. In this scenario, you assign permissions disk by disk, not on the entire storage account, which means more granular access control. You can, for example, prevent a user from copying a VHD while still letting them use the virtual machine.

The integration with Azure Backup is great: you can use the Azure Backup service with managed disks to create a backup job that eases VM restoration. Note, however, that managed disks only support locally redundant storage (LRS) as a replication option, meaning three copies of the data are kept within the region.

To sum up, here are the benefits of managed disks:

  • Simple and scalable VM deployment
  • Better reliability for Availability Sets
  • Granular access control
  • Azure Backup service support

Cheers,

Marcos Nogueira
Azure MVP

azurecentric.com
Twitter: @mdnoga

Bigger disks on Azure Storage

If you followed the announcements from the Microsoft Build 2017 conference at the beginning of the month, one of them was the increase of disk sizes in Azure. Azure had a hard limit of 1 TB per disk, but those days are almost over. Oh yeah, baby!

During a Build 2017 session about big data workloads with Azure Blob Storage, Microsoft announced the increase of those disk limits. Today, Microsoft announced the preview of those disks.

So, what does that mean? Besides the increase in size, they are increasing the performance of the disks as well. Being able to have more space and IOPS is always nice.

New Disk Sizes Details

This table provides more details on the exact capabilities of the new disk sizes in Azure:

Disk Type       P40 (Premium)   P50 (Premium)   S40 (Standard)   S50 (Standard)
Disk Size       2048 GB         4095 GB         2048 GB          4095 GB
Disk IOPS       7,500 IOPS      7,500 IOPS      Up to 500 IOPS   Up to 500 IOPS
Disk Bandwidth  250 MBps        250 MBps        Up to 60 MBps    Up to 60 MBps

See the session for further details.

 

Cheers,

Marcos Nogueira
azurecentric.com
Twitter: @mdnoga

Azure Backup – Part 3 – Backup Virtual Machines

In the first post (see here), I explained how Azure Backup works. In this post, I explain how to back up virtual machines with Azure Backup.

If the systems that you want to protect are running the Windows or Linux operating systems on Azure virtual machines, then in addition to running Azure Site Recovery agent–based backups (as explained on the previous posts), you also have the option to perform a VM-level backup.

This process uses the Azure Backup VM extension and offers some additional benefits, including application consistency for Windows virtual machines, support for Linux, and a higher limit for the number of protected systems per vault, which is 200 Azure VMs versus 50 protected systems with the Azure Site Recovery agent. On the other hand, the backup frequency in this case is limited to once per day.

You should also keep in mind that the restore process creates a new virtual machine. As a result, an Azure VM–level backup does not provide a convenient option for restoring individual files or folders from a backup. In addition, the restore does not take into account such VM-level settings as network configuration, which means that you must recreate them after the restore. However, you can automate the restore process, by using Azure PowerShell or Azure CLI, for example. You must use scripting when recovering Azure virtual machines that host Active Directory domain controllers or that have complicated network configuration, including such characteristics as load balancing, multiple reserved IP addresses, or multiple network adapters.

Setting up an Azure IaaS VM-level backup by using the Azure portal involves the following steps:

  1. If you do not already have an available Recovery Services vault, create a new one.
    Note that the vault must reside in the same Azure region as the Azure IaaS virtual machines.
  2. Specify the vault’s storage replication type.
  3. Specify Backup goal settings, including the:
    – Location of the workload: Azure
    – Workload type: Virtual machine
  4. Choose the backup policy. The policy determines backup frequency and retention range. The default, predefined policy triggers the backup daily at 7:00 PM and has a 30-day retention period. You can create a custom policy to modify these values, scheduling backups on specific days and setting the retention period on a daily, weekly, monthly, and yearly basis.
  5. Specify the virtual machines to back up. The Azure portal automatically detects the Azure virtual machines that satisfy the Azure VM–level backup requirements. When you click Items to backup on the Getting started with backup blade, the Azure portal displays these virtual machines on the Select virtual machines blade. This automatically deploys the Azure VM backup extension to the virtual machines that you select and registers them with the vault.
  6. At this point, you can identify the Azure virtual machines that are backed up to the vault by viewing the content of the Backup Items blade.
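As a sketch of step 4, a custom backup policy can also be declared in an ARM template. The fragment below mirrors the default policy (daily at 7:00 PM, 30-day retention); the vaultName parameter and the policy name are hypothetical, and the apiVersion and schema should be checked against the current template reference:

```json
{
  "type": "Microsoft.RecoveryServices/vaults/backupPolicies",
  "apiVersion": "2016-06-01",
  "name": "[concat(parameters('vaultName'), '/DailyPolicy')]",
  "properties": {
    "backupManagementType": "AzureIaasVM",
    "schedulePolicy": {
      "schedulePolicyType": "SimpleSchedulePolicy",
      "scheduleRunFrequency": "Daily",
      "scheduleRunTimes": [ "2017-01-01T19:00:00Z" ]
    },
    "retentionPolicy": {
      "retentionPolicyType": "LongTermRetentionPolicy",
      "dailySchedule": {
        "retentionTimes": [ "2017-01-01T19:00:00Z" ],
        "retentionDuration": { "count": 30, "durationType": "Days" }
      }
    },
    "timeZone": "UTC"
  }
}
```

Only the time-of-day portion of scheduleRunTimes matters for a daily schedule; the date is ignored.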

Cheers,

Marcos Nogueira
azurecentric.com
Twitter: @mdnoga

 

Azure Backup – Part 2 – Azure Backup Agent

In the first post (see here), I explained how Azure Backup works. In this post, I explain how files and folders are backed up with the Azure Backup agent.

Azure Backup’s most basic functionality allows you to protect folders and files on 64-bit Windows Server and client operating systems, both on-premises and in Azure. This functionality relies on the Azure Site Recovery agent, which is available for download on the Azure Recovery Services vault interface in the Azure portal. You must install the agent on every system that you want to protect, and you must register it with the target vault.

To set up Azure Site Recovery agent–based protection from the Azure portal, perform the following steps:

  1. Create a Recovery Services vault.
  2. Configure the Backup Infrastructure storage replication type, by choosing either the Locally-redundant option or the Geo-redundant option on the Backup Configuration blade.
  3. Specify Backup goal settings, including the:
    – Location of the workload: On-premises
    – Workload type: Files and folders
  4. Download the vault credentials from the Prepare infrastructure blade of the Azure Recovery Services vault. The Azure Site Recovery agent uses vault credentials to register with the vault during the installation process.
  5. Download and install the Azure Site Recovery agent from the Prepare infrastructure blade. Choose the appropriate option for the system that you want to protect. In this case, select the Download Agent for Windows Server or Windows Client option. When registering the local computer with the vault, you designate a passphrase for encrypting backups.
  6. Use the Azure Backup console to configure and schedule backups. After installing the agent, the new console, whose interface closely matches the native Windows backup console, becomes available. This allows you to select files and folders to back up and to schedule a backup directly to the Azure Recovery Services vault. You can also use Azure PowerShell to configure and initiate backup operations. After you schedule a backup, you also have the option to run an on-demand backup.

Note: If the computer that you want to protect contains a large amount of data and you have limited bandwidth in your Internet connection to Azure, consider using the Azure Import/Export service to perform the initial backup. In this approach, you copy the data to back up locally to a physical disk, encrypt it, and then ship the disk to the Azure datacenter where the vault is located. Azure then restores the content directly to the vault, which allows you to perform an incremental rather than full backup following the registration.
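For reference, the Recovery Services vault from step 1 can itself be created from an ARM template. A minimal sketch (the vault name here is hypothetical):

```json
{
  "type": "Microsoft.RecoveryServices/vaults",
  "apiVersion": "2016-06-01",
  "name": "example-vault",
  "location": "[resourceGroup().location]",
  "sku": {
    "name": "RS0",
    "tier": "Standard"
  },
  "properties": {}
}
```

The storage replication type from step 2 is configured on the vault after creation, which is why it does not appear in this resource definition.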

Cheers,

Marcos Nogueira
azurecentric.com
Twitter: @mdnoga