Azure Toolbox – Part One

Today I was working with someone new to Azure and they asked what tools they should use to manage it. It was a fair question, so I figured I'd write a two-part post to share the tools in my toolbox.

#1 Azure Cloud Shell

Azure Cloud Shell Screenshot

This one almost seems like cheating since it's built into the Azure Portal. However, I'm amazed at how often I use it. Azure Cloud Shell provides a PowerShell or Bash shell right in the Azure Portal and works with any web browser – no software to install! All of the commands you'd expect, like the PowerShell Az cmdlets and the Azure CLI, are at your fingertips. In addition, common tools like Ansible, Terraform, and even OpenSSL come pre-installed.

Take the Azure Cloud Shell PowerShell Quickstart for a spin.
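To get a feel for it, here's the kind of thing you can run the moment the shell loads – Cloud Shell handles authentication for you, so no login commands are needed (output depends on your subscription, of course):

```powershell
# PowerShell flavor: list your subscriptions and resource groups
Get-AzSubscription
Get-AzResourceGroup | Select-Object ResourceGroupName, Location

# The Azure CLI is available from the very same shell
az group list --output table
```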

#2 Azure PowerShell AZ Module and Azure CLI

Azure CLI Screenshot

Now if the Azure Cloud Shell doesn't work for you, you're going to need to install the Azure PowerShell Az module and/or the Azure CLI. Both of these tools let you interact with Azure from the comfort of your computer. I'm often asked whether you should choose one over the other, and the answer is: it depends. I'd say get really familiar with one, but know when the other may be a better choice. My preference is the PowerShell module, but there are honestly many things that are way easier to do with the Azure CLI.
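As a rough sketch of what day-to-day use looks like with each – the install and login steps below are the common ones as of this writing, so check the official docs for current instructions:

```powershell
# Az PowerShell module: install once, sign in, then query away
Install-Module -Name Az -Scope CurrentUser
Connect-AzAccount
Get-AzVM | Select-Object Name, ResourceGroupName

# The same task with the Azure CLI
az login
az vm list --output table
```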

#3 Azure Storage Explorer

Azure Storage Explorer Screenshot

Once you start using Azure more, you're going to need to interact with storage accounts frequently. The Azure Portal lets you do a lot – but some things are so much easier with a client. Azure Storage Explorer is a cross-platform tool that makes it a breeze to interact with Azure Storage resources, including Blobs, File Shares, Queues, and Tables. In addition, developers can interact with a local storage emulator.

#4 FileZilla

FileZilla Screenshot

OK, I know we just covered Azure Storage Explorer, so you're probably asking why you'd need FileZilla. Well, if you do anything with App Services and have developers using older methods of deployment (cough cough, FTP/FTPS), you'll find FileZilla handy.

#5 PuTTY

PuTTY Screenshot

If you need to work with Linux VMs and you're on Windows, you're probably going to need to SSH into them. For this, I use PuTTY. If you download the bundle you'll also get the PuTTYgen tool for generating and working with SSH keys.

#6 Azure Mobile App

When on the go and all you have is your phone, you may be able to do what you need from the Azure Mobile app. Available for iOS and Android, you can access a ton of Azure features, including the Azure Cloud Shell, which basically means you can do anything – as long as you know the commands.

#7 Azure Portal App

Azure Portal App Screenshot

The Azure Portal app is currently in preview and gives you full access to the Azure Portal via an app. I like it because it’s one less browser window to keep up with – and if you’re like me and have 3+ accounts you have to swap between, it gives you one more place to stay logged in. The Azure Cloud Shell is also at your fingertips. The only downside is it’s Windows only. Give it a try!

That's a few of the tools I find myself using on a daily basis to interact with Azure resources. In part two I'll cover a bunch of more obscure, but super helpful, tools.

PowerShell Where() Method vs. Where-Object Cmdlet

In my last post I mentioned attempting to speed up a slow PowerShell script by swapping out Out-Null for > $null. Well, I’m back at it and trying to speed up the script a bit more.

I'm using the Where-Object cmdlet in several places to find items that match a string using -like. The collections are rather large (40,000+ items), so it takes a while to find matching items.

PowerShell version 4 introduced a new Where() method that operates on arrays. The Where() method can be quite a bit faster compared to the Where-Object cmdlet. It can, however, use quite a bit more memory and in some cases might be slower, so as usual – your mileage may vary.

Example:

$files = Get-ChildItem -Path C:\ -Recurse -ErrorAction Ignore

# Cmdlet: streams each object through the pipeline
$logs = $files | Where-Object { $_.Name -like '*.log' }

# Method: filters the whole array in memory
$logs = $files.Where({ $_.Name -like '*.log' })

In this case, Where() shaved 10 seconds off compared to Where-Object.

PowerShell Out-Null vs. $null

It’s always the little things. I was recently troubleshooting a slow PowerShell script which required piping the output of a long-running command to Out-Null. Now there may have been other ways to avoid this, but that’s another story.

A quick Google search led to *tons* of articles and posts on why never to use Out-Null (OK, there are reasons, but my particular use was not one of them) and instead redirect to $null.

So I swapped out | Out-Null for > $null and sure enough problem solved.

Here’s a quick snippet that shows the speed difference:
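The snippet below is a minimal sketch of that comparison using Measure-Command; the absolute numbers will vary by machine, but the ordering should hold:

```powershell
# Pipe to Out-Null: every object travels through the pipeline
Measure-Command { 1..100000 | Out-Null }

# Redirect to $null: output is discarded without the pipeline overhead
Measure-Command { 1..100000 > $null }

# Assigning to $null is another fast alternative
Measure-Command { $null = 1..100000 }
```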

Part 2: Azure Storage Static Websites with CloudFlare

In part one we covered the basic setup for Azure static websites. In part two we're going to cover configuring the Storage Account for a custom domain and the CloudFlare setup to act as the CDN (proxy) for the Azure static website.

The first step is to create a free CloudFlare account. I'm not going to cover this in detail since the steps vary based on your domain registrar. In my case, I changed my domain's nameservers to CloudFlare's. It may take a while for this step to fully complete.

Here’s what we want to accomplish:

  • Create/change a CNAME for www that points to the static website endpoint URL
  • Tell the Azure Storage Account about our custom domain
  • Set up a Page Rule in CloudFlare to redirect domainname.com to www.domainname.com
  • Require SSL

If you recall, in part one you were given a new static website endpoint URL when you enabled the feature; you'll need it when creating the CloudFlare DNS CNAME.

From the CloudFlare DNS page, create a new CNAME (or modify an existing one) pointed at the Azure static website endpoint URL.

Important: One thing I failed to capture in the screenshot – click the orange cloud icon and turn it gray when adding the CNAME (you can also click it after adding the CNAME). We'll turn this back on later, but it needs to be gray in order to add the custom domain to the Azure Storage Account. Essentially this tells CloudFlare we're only interested in using it for DNS, not as a CDN. There are other ways around this, but this is the simplest.

At this point go get a cup of coffee and then we’ll set up the Storage Account custom domain.

Head over to the Azure Portal and the Custom domain blade of your Storage Account.

Enter the custom domain you set up in CloudFlare – in my case www.domainname.com – and click Save. You'll receive a confirmation message that the save was successful. If you receive an error, double-check your spelling; if it all looks good, wait another 10 minutes or so and try again.
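If you'd rather script this step, the Az PowerShell module can set the custom domain too – the resource group and account names below are placeholders for your own:

```powershell
# Map the CNAME you created in CloudFlare to the Storage Account
Set-AzStorageAccount -ResourceGroupName 'my-rg' `
    -Name 'mystorageaccount' `
    -CustomDomainName 'www.domainname.com'
```

This performs the same validation as the portal, so the CNAME needs to resolve (gray cloud) before the command will succeed.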

Assuming you were able to successfully save your custom domain, head back over to the CloudFlare portal and the DNS page. Click the gray cloud icon next to your www CNAME and turn it orange. This will tell CloudFlare you want to use it as a CDN (proxy).

Next, click Page Rules.

Click Create Page Rule and add a new rule that matches the URL domainname.com/*. Next, add a setting that uses the Forwarding URL action with a 301 status code. The URL you want to forward to is https://www.domainname.com/$1.

This rule will forward requests for http://domainname.com/ and https://domainname.com/ to https://www.domainname.com/ which will be served by your Azure static website.

OK, one last step: SSL! Head over to the Crypto section of CloudFlare. Ensure SSL is set to Full (strict) and Always Use HTTPS is On.

If you made it this far your static website should be accessible at https://www.domainname.com/. If not, wait 20 minutes and try again. In some cases CloudFlare takes a lot longer to create the SSL certificate for your site. They advertise it can take up to 24 hours but it usually takes less than 30 minutes.

Part 1: Azure Storage Static Websites with CloudFlare

Azure Storage is a great multipurpose storage solution and is utilized by many Azure services to store everything from log files to VM disks. It also offers many services on its own; the latest, fresh out of preview, is Azure Storage Static Websites.

AWS has long offered static website hosting in S3 buckets, so it’s nice to see Azure catch up on this front.

For the heck of it I decided to take this new feature for a spin, with one catch – use CloudFlare as the CDN. CloudFlare offers tons of features, even with the free account, such as DDoS protection and free SSL. Now it’s possible to use an Azure CDN to do something similar, but sometimes it’s nice to see how everything can play together.

If you’re following along at home, you’ll need an Azure subscription, a domain name and a free CloudFlare account.

First step, create a new Azure Gen 2 Storage Account:
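If you'd rather script it, this is roughly the equivalent with the Az module – names and region are placeholders:

```powershell
# Static websites require a general-purpose v2 (StorageV2) account
New-AzStorageAccount -ResourceGroupName 'my-rg' `
    -Name 'mystorageaccount' `
    -Location 'eastus' `
    -SkuName 'Standard_LRS' `
    -Kind 'StorageV2'
```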

Once created, head over to the Static website blade for the Storage Account. Enable the Static website feature and click Save. Once saved, you'll be presented with a new Primary endpoint URL specifically for the static website.

While you’re here, go ahead and provide an Index document name and Error document name.
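These two steps can also be done in one shot with the Az.Storage cmdlets – the resource group and account names are placeholders:

```powershell
# Grab a storage context for the account, then enable the feature
$account = Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'mystorageaccount'
Enable-AzStorageStaticWebsite -Context $account.Context `
    -IndexDocument 'index.html' `
    -ErrorDocument404Path 'error.html'
```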

Now that the feature is enabled, head over to the Blobs blade. Notice a new container has been created named $web. Azure has already set the permissions so that it’s accessible by everyone.

At this point go ahead and use the Azure Portal to upload a sample index.html file to the $web container. I highly recommend downloading Azure Storage Explorer if you plan to interact with Storage Accounts on a regular basis.
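Scripted, the upload looks something like this – note the single quotes around $web so PowerShell doesn't try to expand it as a variable (account names are placeholders):

```powershell
# Grab the account context
$ctx = (Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'mystorageaccount').Context

# Upload index.html and set the content type so browsers render it as HTML
Set-AzStorageBlobContent -Context $ctx -Container '$web' `
    -File '.\index.html' -Blob 'index.html' `
    -Properties @{ ContentType = 'text/html' }
```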

At this point fire up a web browser and navigate to the endpoint URL. You should be presented with your index.html document.

OK, if that worked it's time to move on to configuring CloudFlare and the Storage Account for a custom domain. I'll cover this in part two.

Microsoft Storage Spaces Direct — Let’s go!

In enterprise IT, every six or seven years hardware nears its end of life and it's time to research a replacement. OK, this may not be the case for everything, but for the most part, with servers from Dell, HPE, and Lenovo, seven years marks the end of support and you have to turn to third parties.

In my case, it's decision time for a big part of our infrastructure. For the past 10+ years we have taken the traditional storage approach of a big SAN to support our storage needs. Over the years we have relied on NetApp, EMC, and Dell solutions to serve block and file storage. We have also made a big investment in VMware, have taken the hard line of virtual-first, and even offer co-location for others within the business.

Back to the ticking time bomb. Our VMware cluster runs in an HPE blade chassis and gets its storage from a NetApp SAN. The HPE hardware is nearing seven years old, and the NetApp will be five years old next year – we purchased five years of maintenance up front, so I'm sure the renewal is going to be a small fortune.

Our NetApp serves several purposes:

  • Block storage over fiber channel for VMware
  • Block storage over fiber channel for physical Windows and Linux hosts
  • NFS file storage for an Oracle Virtual environment (Oracle Financials)
  • CIFS/SMB storage for client file shares

With so much going on in the cloud space – and more specifically Azure, for me – we decided to rethink our need for traditional storage. And since we're rethinking storage, why not rethink virtualization platforms?

After some debate, we decided to move from VMware to Hyper-V and build out a hyper-converged infrastructure leveraging Microsoft Storage Spaces Direct (S2D). We're still working on sizing and selecting a hardware vendor, although Dell will most likely be our choice with their Ready Nodes.

While this takes care of one of the NetApp workloads, I still have three more to deal with! Fortunately, there are other projects in motion to shorten the list to one. We'll probably end up with a new, smaller traditional array to serve block storage to the few physical Windows and Linux hosts — and I'm OK with that.

New Azure Certifications – Changing Paths

Last week Microsoft announced a new certification: Microsoft Certified Azure Administrator. There are two exams in beta you’ll need to take to obtain this certification:

The cert and exams are designed to eventually take the place of the 70-533 Implementing Microsoft Azure Infrastructure Solutions exam.

The new certification and exams are supposed to be more real-world and cover topics you’re more likely to use on the job. I recently started studying for 70-533 so I’ll start shifting my focus to the topics covered in AZ-100 so I can take it once it’s available in a few months.

One last thing. If you have already taken 70-533, there’s a new transition exam to get you the Microsoft Certified Azure Administrator certification:

Azure Functions

Serverless computing has been all the rage recently. I’ve spent a little time playing around with AWS Lambda and found it interesting, but never really used it for more than tinkering.

Microsoft recently released Azure Serverless Computing Cookbook, a free ebook that walks you through creating a sample application using Azure Functions. The 300+ page book does a nice job of covering the basics and starts you down the path of using Azure Functions for more advanced uses.

If you don't know C#, no problem. For the most part you're copying and pasting code from the book. While that doesn't help you learn C#, it does get you familiar with Azure Functions' capabilities, including integrating third-party apps/APIs like SendGrid and Twilio.