Aug 24, 2016

This post is one of those brief ones I put out here as a reminder to myself and as a potential help to anyone else experiencing similar grief.

Recently, out of nowhere, I started getting the error “No valid combination of account information found” when parsing a connection string for an Azure Storage account.  Specifically, the failure occurred on

CloudStorageAccount.Parse(storageSetting);

Come to find out, in the midst of a huge code merge, someone had inadvertently changed the connection string in the Web.config file of my ASP.NET site.  The string changed from

DefaultEndpointsProtocol=https;AccountName=**************;AccountKey=*********

to

DefaultEndpointsProtocol=https;Accountname=**************;AccountKey=*********

With this library, case matters: “AccountName” had been changed to “Accountname”, and that alone was enough to cause the error.
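If you want the failure to surface with a clearer message, the library also offers TryParse.  Here is a minimal sketch, assuming a hypothetical "StorageConnectionString" entry in Web.config:

using System.Configuration;
using Microsoft.WindowsAzure.Storage;

// "StorageConnectionString" is a hypothetical name; use whatever your Web.config defines.
var storageSetting = ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString;

// Key names are case-sensitive: "AccountName" parses, "Accountname" does not.
CloudStorageAccount account;
if (!CloudStorageAccount.TryParse(storageSetting, out account))
    throw new ConfigurationErrorsException("Malformed storage connection string; check the key casing.");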

So beware, Microsoft.WindowsAzure.Storage is picky when it comes to casing.

Mar 01, 2016

Data stored in Azure Storage accounts is very securely protected by Microsoft.  There is very little (if any) reason to worry about catastrophic equipment failure causing your data to be lost.  It’s the beauty of the cloud! However, Microsoft can’t protect our data from the inadvertent errors, made by us or by our software, that could corrupt or destroy that data.  Because of this, backups are still a very necessary part of life in the cloud.

Unfortunately, there really isn’t an easy or cost-effective way to back up table data from an Azure Storage account (as of 3/1/2016).  Microsoft doesn’t offer a backup app on Azure for tables and, while there are a few backup players, their solutions are pretty costly.  Through searches on the interwebs I have found that I am not the only person with this dilemma, so I set out to figure out how to do this quickly, effectively, reliably, and cheaply.

My first stop was at the new and cool Azure Storage Data Movement Library (DML).  My thought was that I could use the DML in an Azure Web Job.  Everything would be contained, everything would be in the cloud, everything would be tight. However, I was disappointed to find that the DML does not yet support tables.  Further, community offers to add table support to the open-source DML were met with delay, as Microsoft’s reps said to wait because table support would be forthcoming.  That’s great, but that doesn’t help me right now.

So, with some poking around, I put together a solution that doesn’t cause too much pain and gets the job done effectively.  My solution uses the command-line AZCopy.exe tool from Microsoft and a batch file with a few tweaks to back up a list of tables and blob containers.  To make my backup work, I spun up a virtual machine in Azure, using the cheapest available configuration (A0), and loaded AZCopy.  I also copied the backup.bat file onto the machine.  I then used the Task Scheduler to call my backup.bat file at a given interval.  When the scheduler hits the bat file (in my case once a day at midnight), it pulls all of the table and blob data to my virtual machine and then pushes the data back out to a backup storage account for safekeeping.
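For a flavor of what backup.bat does, here is a rough sketch using the classic AzCopy syntax of the era; the account names, keys, and paths are placeholders, and the real batch file (linked below) covers a whole list of tables and containers:

@echo off
rem Placeholder accounts, keys, and paths; AzCopy 5.x-era syntax.
set AZCOPY="C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"

rem Export a table to local disk (writes a manifest plus data files)...
%AZCOPY% /Source:https://sourceaccount.table.core.windows.net/MyTable/ /Dest:D:\backup\tables\ /SourceKey:<source-key> /Manifest:MyTable.manifest

rem ...then push the exported files out to the backup storage account.
%AZCOPY% /Source:D:\backup\tables\ /Dest:https://backupaccount.blob.core.windows.net/tablebackups /DestKey:<backup-key> /S

rem Blob containers can be copied between accounts directly.
%AZCOPY% /Source:https://sourceaccount.blob.core.windows.net/mycontainer /Dest:https://backupaccount.blob.core.windows.net/mycontainer /SourceKey:<source-key> /DestKey:<backup-key> /S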

Later, if I experience a catastrophic screw-up, whether of my own doing or of an infinite loop’s proportions, I can restore the data to a new storage account, do some testing, and then cut over my web app to the new storage.

You can check out the backup and restore batch files here.  Bear in mind that I would like to improve the restore batch file at some point so that it is not necessary to spell out every table manifest file to restore.  If you have ideas or solutions, please contribute back.
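For reference, restoring a single table from its manifest looks roughly like this (placeholder names again), which is exactly why every manifest currently has to be spelled out:

%AZCOPY% /Source:https://backupaccount.blob.core.windows.net/tablebackups /Dest:https://newaccount.table.core.windows.net/MyTable/ /SourceKey:<backup-key> /DestKey:<new-key> /Manifest:MyTable.manifest /EntityOperation:InsertOrReplace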

Photo credit: M i x y via VisualHunt.com / CC BY-NC-SA

Feb 29, 2016

Microsoft’s Azure Search service is an incredible way to provide your users with a very powerful data navigation tool.  With its REST API and .NET library, it is platform-agnostic, meaning that you can utilize it in your web app, mobile client, or desktop app.
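For a taste of the .NET library, querying an index takes only a few lines; this sketch assumes a hypothetical service name, index name, and query key:

using System;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

// Hypothetical service name, index name, and query key.
var client = new SearchIndexClient("my-search-service", "products", new SearchCredentials("<query-key>"));

// Full-text search over the index, returning the top 10 hits.
var results = client.Documents.Search("red shoes", new SearchParameters { Top = 10 });
foreach (var result in results.Results)
    Console.WriteLine(result.Document["name"]);  // each hit is a property bag of the index fields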

Check out my series of posts on the East Five site for an introduction:  http://www.eastfive.com/2016/02/24/microsoft-azure-search-a-practical-introduction/

Dec 31, 2015

Microsoft has provided a great amount of tooling for access to and modification of resources on their Azure platform. There is the Azure Portal, and there is also a comprehensive set of cmdlets for Azure PowerShell. The Azure CLI is yet another way to script changes to your Azure resources.

There comes a time, though, when you need tooling to modify Azure resources in a specific way. This has led to the creation of the BlackBarLabs.Tools.Azure library.

Currently, the library contains a single tool, the AzureTableAndBlobCopier. This tool is a wrapper around some great work by Alexandre Brisebois. With this tool, you can specify a source and target Azure Storage account and the tool will copy the tables and blobs from source to target. This is a great tool to use when moving data from, say, a staging environment to a user acceptance testing environment.

To use the tool, pull the code, update the app.config to have the connection string for your source and target storage accounts, then run the exe. That’s all there is to it.
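The relevant app.config section looks something like this sketch; the entry names here are illustrative, so match them to whatever the code in the repository expects:

<connectionStrings>
  <!-- Illustrative entry names and placeholder keys. -->
  <add name="SourceStorage" connectionString="DefaultEndpointsProtocol=https;AccountName=sourceaccount;AccountKey=..." />
  <add name="TargetStorage" connectionString="DefaultEndpointsProtocol=https;AccountName=targetaccount;AccountKey=..." />
</connectionStrings>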

Take a look at the library, enjoy, and extend!

Nov 19, 2015

Microsoft Azure PowerShell is a very powerful way to script deployments and tweaks to your environment in Azure. There are cmdlets for creating web apps, storage, and other resources. You can also use PowerShell to automate modifications to app settings on your sites. The hardest part is getting started, so here are a few steps to take to give it a shot:

  1. Download Microsoft Azure PowerShell – You download the tool through the Microsoft Web Platform Installer.
  2. Launch the PowerShell and run the cmdlet Get-AzurePublishSettingsFile.  This will launch your default browser and take you to a page to download your Azure subscription file.  You need to do this step so that the PowerShell will have your credentials.
  3. Select your subscription and submit.  This will download your subscription file.
  4. Follow the instructions by running Import-AzurePublishSettingsFile with the path to your settings file.  This will return the names and Ids of your environments.  You can now use this to run the many cmdlets available via Azure.  (The whole sequence is sketched just below.)
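Strung together, the bootstrap is just a few commands; the path to the .publishsettings file is a placeholder:

# Opens your default browser to download the subscription file.
Get-AzurePublishSettingsFile

# Import the downloaded file so PowerShell has your credentials.
Import-AzurePublishSettingsFile "C:\Downloads\MySubscription.publishsettings"

# List the subscriptions PowerShell now knows about.
Get-AzureSubscription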

One other important thing to keep in mind for later, when you may be managing multiple Azure subscriptions, is that you need to make sure that Azure PowerShell has the subscription you want to work in set as the default.  To find out which subscription is set as the default, run the command “Get-AzureSubscription”.  In my example, PowerShell knows about two subscriptions that I have been working with, and the Free Trial subscription is set as the default.

Get-AzureSubscription

To switch to the other account so that my scripting would run against it instead of the Free Trial subscription, I had to run “Select-AzureSubscription”.  This allowed me to select my Pay-As-You-Go subscription by name.  Running “Get-AzureSubscription” after this showed that the Pay-As-You-Go subscription was selected.  All subsequent commands acted upon this account.
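In command form, with the subscription name from my example:

# Make the Pay-As-You-Go subscription the one that subsequent cmdlets act upon.
Select-AzureSubscription -SubscriptionName "Pay-As-You-Go"

# Confirm which subscription is now selected.
Get-AzureSubscription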

This is by no means meant to be a complete post on all of the powerful things available via Azure PowerShell, but rather a quick start to get your settings file and get moving.  For more information, read up at the Azure PowerShell page.

Nov 17, 2015

I recently worked with an endpoint in an MVC project that intentionally returns a 409 Conflict HTTP status code when a user posts a model with an Id that already exists in the database.  Upon encountering this conflict, the server is supposed to return the 409 status, but the client also expects to receive the conflicting record in the response body.  When running this code in the debugger and hitting the endpoint on localhost via Postman, the 409 status is returned and the existing record is passed to Postman in the body.  However, this was not the behavior I encountered when hitting the same endpoint on a site deployed to Azure.
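The controller action in question looks something like this sketch; the model and repository names are hypothetical:

using System;
using System.Net;
using System.Web.Http;

public class MyModel { public Guid Id { get; set; } }

public class MyModelsController : ApiController
{
    private readonly IMyModelRepository repository;  // hypothetical data access

    public IHttpActionResult Post(MyModel model)
    {
        var existing = repository.FindById(model.Id);
        if (existing != null)
            return Content(HttpStatusCode.Conflict, existing);  // 409, with the conflicting record in the body
        repository.Add(model);
        return Created($"/api/mymodels/{model.Id}", model);
    }
}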

I deployed the same code that I ran in debug to a WebApp in Azure.  When I did a POST to this same endpoint on the Azure site, the 409 status was returned.  However, instead of seeing the conflicting record in the response body, I received this error:

The page cannot be displayed because an internal server error has occurred.

This was strange, considering the fact that I had just seen the conflicting record returned when posting to the same endpoint on a server running in debug on my machine.  After some research, I learned that Azure suppresses the body on purpose to keep error details (e.g. stack traces) from being shown to consumers of the API.

I found a way to ensure that the body is returned from the site on Azure when a 400- or 500-series status is encountered.  It is to be used with caution because you could expose more information than you want on 500-series statuses, such as stack trace information.  To ensure that the body is returned in these cases, add these lines to your web.config:
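<system.webServer>
  <!-- PassThrough tells IIS to leave your 4xx/5xx response bodies alone. -->
  <httpErrors existingResponse="PassThrough" />
</system.webServer>

With existingResponse set to PassThrough, IIS serves whatever body your application produced instead of substituting its own error page.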