Wednesday, 23 November 2016

Getting Web Deploy working

When web deploy works from Visual Studio, you get lazy and forget how much hassle it is to set up!

We have a new staging server and want to be able to web deploy so here is what you need to do!

Do these in the order given. There is at least one scenario where attempting to install Web Deploy before running the management service will not work!


  1. Make sure the Web Management Service is running. If it's not installed, you have to add the feature to the server: assuming you have IIS installed, the feature you need is called Management Service.
  2. Open IIS, click on the server in the left-hand pane and double-click Management Service under Management. If the page is disabled, click Stop on the right, tick Enable remote connections and then click Start. Optionally, you can lock down the remote IP addresses for the service. (You get a 403 if this is not set up.)
  3. Open firewall port 8172 for TCP. You can lock this down to specific IP addresses if required.
  4. Install Web Deploy (current version is 3.5) by downloading direct from Microsoft. The Web Platform Installer might not install the handler needed! (You get a 404 if this isn't installed.)
  5. Create a site in IIS, if it does not already exist, pointing to a new folder that will be the destination for your site. The name will need to match what you specify for Site name in the Web Deploy dialog. You can set up HTTPS bindings at this point if required.
  6. You have two ways of setting up user access: either an IIS user, which you create under IIS Manager Users at the server level, or a Windows user. If you want to use IIS users, you need to enable this under the Management Service page on the server. (I couldn't get the Windows user to work!)
  7. Click on the web site you want to set permissions for and double-click IIS Manager Permissions. Click Allow User on the right and choose either a Windows or an IIS user to grant permissions to.
  8. If you have used an IIS user, you need to add a delegation rule to allow the user to create an application and content. Double-click Management Service Delegation at the server level and click Add Rule. Choose Deploy Application and Content and, once it is added, ensure the rule has contentPath, createApp, iisApp and setAcl ticked. Then add the IIS user you created to the rule.
  9. Make sure the user you are using has Full Control permission to the web root folder on the server so it can create the directories and files (it needs Full Control, even for subsequent deployments, which is sad but true!). For IIS users, you need to grant these permissions to whatever account is running the Web Management Service (Local Service by default). If using a Windows user, that user also needs Modify permission.
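
Once all of this is in place, you can sanity-check the endpoint from a client machine before involving Visual Studio. This is a minimal sketch using msdeploy.exe directly; the server name, site name and credentials are placeholders for whatever you configured above:

"C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:dump -source:iisApp="MySite",computerName="https://staging.example.com:8172/msdeploy.axd?site=MySite",userName="DeployUser",password="secret",authType="Basic" -allowUntrusted

The dump verb just lists the contents of the remote application, so it proves the service, the firewall rule and the user permissions are working without changing anything. Drop -allowUntrusted once you have a trusted certificate on the binding.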

Tuesday, 22 November 2016

Automated Builds for Visual Studio Project/Solution with custom NuGet packages and Bamboo

Atlassian Bamboo

We've started using Atlassian Bamboo as a build server for our mainly .NET projects. Apart from the fact that it plays nicely with Jira and Bitbucket and seems very nice to use, it also supports a useful feature: it can build every branch that is pushed, rather than just "master". This allows a build for every defect fix, each living in its own branch, without having to set them up each time.

Bamboo Concepts

Anyway, the concepts are a little confusing in Bamboo because it is capable of very complex builds and workflows, but you basically create a:

  1. Project - This is basically what you expect and will likely map onto a single solution in source control. It might, however, be a "super" project that integrates several solutions into a single product.
  2. Plan - Each plan is like a "build type", so you might have one for Continuous Integration, one for each type of deployment, etc. CI is likely to be automatically triggered, whereas deployments are likely to be manually triggered only.
  3. Stage - Each plan has 1 or more stages. Stages run one after the other and can share artifacts, i.e. the output of one stage can be fed to the next.
  4. Job - Each stage can have 1 or more jobs. Jobs are designed to run in parallel on multiple agents, so if you want something simple, just use a single job. Bamboo licensing is based around jobs, so adding multiple jobs can push you into a bigger licence.
  5. Task - Each job runs 1 or more tasks. This is the lowest-level entity. Tasks run one after the other, but you can mark a task to run even when an earlier task fails; these are called "Final tasks" and are useful for things like cleanup.

Tasks for Building .NET

Source Code Checkout

Firstly, you will need a Source Code Checkout task. Note that the default properties for Bamboo will check everything out under Bamboo Home, which is in C:\Users\ by default and which might easily fill your C drive. I changed these paths and restarted Bamboo to make it use another drive for builds and artifacts.

Most of this should be easy to set up, but you might need to set up SSH keys etc. for Bamboo to access Git, Subversion and so on. I did this from the command line using git-scm, but it looks like Bamboo can do its own thing by creating an application link to Bitbucket and generating its own key.

NuGet Package Restore

In .NET, you are likely to need to restore NuGet packages. You shouldn't store these in your code repository because they are a waste of space and are easy to pull down onto the build server on build.

You will need NuGet.exe, which you can download from nuget.org as a single executable, which might as well be copied into C:\Program Files (x86)\NuGet. You will probably already have this directory if you have installed Visual Studio.

Now is the fun part! If you are only using public NuGet packages, then you should be able to skip the next part and go to "Create the Task" in this section.

Custom Feeds

If you are getting packages from any custom feeds, there are various ways for NuGet to find these. Depending on your version of NuGet, the following config files are used when NuGet attempts a package restore:
  1. Project-specific NuGet.Config files located in any folder from the solution folder up to the drive root.
  2. A solution-specific NuGet.Config file located within a .nuget folder in the solution.
  3. The global config file located in %APPDATA%\NuGet\NuGet.Config
  4. Additional machine-wide config files (NuGet 2.6 and later) located in %ProgramData%\NuGet\Config[\{IDE}[\{Version}[\{SKU}]]]\NuGet.Config
  5. (NuGet 2.7 and later) The "defaults" file located at %PROGRAMDATA%\NuGet\NuGetDefaults.config
So which should we use? As with anything, changing the fewest files is the best option. Option 5, above, is a way to share NuGet config between separate users by having a potentially network-wide config. As long as security isn't an issue, this file could be pushed by group policy (or some other way) and avoids any additional setup. Option 4 is for locking it to the build machine only, option 3 for the build user only, and options 2 and 1 for when the settings need to be project-specific.

BE CAREFUL that you are not accidentally pulling a config file into the user's Roaming profile. This happened to me and caused a feed to be disabled! You can check by opening C:\Users\<username>\AppData\Roaming\NuGet and seeing if anything unexpected is there.

Once you've decided on the appropriate config file to edit, add your feed endpoint in the normal NuGet format. If you are using VSTS feeds, you can copy the URL from the "package source URL" box in the "Connect to feed" dialog in VSTS. If the feed does not require authentication, it is simply a key and value added under packageSources, as in the sketch below.
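
For example, a minimal NuGet.Config with one custom feed might look like this (the feed name and URL here are made up; substitute your own):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- key is a display name of your choosing; value is the feed endpoint URL -->
    <add key="MyCompanyFeed" value="https://mycompany.pkgs.visualstudio.com/_packaging/MyFeed/nuget/v3/index.json" />
  </packageSources>
</configuration>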

If you need to use a VSTS feed, the easiest way to authenticate is to add the CredentialProvider from VSTS, which is designed to bring up Internet Explorer to allow you to authenticate using your VSTS account and allow the provider to download and secure the creds needed to access the package. The instructions for this are linked in the "Connect to feed" dialog and are rubbish!

Download the CredentialProvider.zip onto the build machine from the "Download NuGet + VSTS Credential Provider" link on the "Connect to feed" dialog in VSTS. Unzip it and copy it into a directory named "CredentialProviders" under C:\Users\<username>\AppData\Local\NuGet. NuGet looks in all subdirectories of this directory, so you can use a subfolder if you want it kept separate. If you cannot easily log in as the build user on the build machine, you might also want to copy the credential provider into another user's AppData directory for testing. The credentials are cached per user, so you will need to run NuGet as the build user at some point.

You MUST disable Internet Explorer Enhanced Security (also known as "no functionality" mode) if using the VSTS credential provider. NuGet won't fall back to another default browser, possibly due to the file type it is opening.

Log into the build machine somewhere that you can test package restore. I ran a Bamboo plan that just did a source code checkout so I had a starting point. Open a command prompt (it doesn't need to be admin), navigate to the directory containing the solution (unless you want to use full paths) and run "C:\Program Files (x86)\NuGet\NuGet.exe" restore SolutionName.sln. It should realise that it needs to authenticate via the MS credential provider and open the Visual Studio login window for you to enter your VSTS credentials. Once that is done, they will be downloaded and stored (I believe in the TPM of the machine) so it won't need to be done again.

Remember that it will need to work as the build user since Bamboo will not be running the same thing as you (hopefully)!

Create the Task

Create a second task of type "Script". Set it to use cmd.exe as the interpreter, choose inline as the script location and paste something like the following into "Script Body": "C:\Program Files (x86)\NuGet\nuget.exe" restore "${bamboo.build.working.directory}\MySolution.sln"

Building .NET

You can theoretically build .NET projects with just MSBuild, which is usually installed into the .NET framework directories in C:\Windows\Microsoft.NET, but in practice there are various dependencies that can make this tricky to set up. You will also need to install Visual Studio Build Tools to ensure you get a batch file that Bamboo uses.

By FAR the easiest way to do this is to install Visual Studio Community edition for free and ensure you also install Visual C++, which installs the required batch file.

Add a new task of type Visual Studio (or you can try to get an MSBuild task working; see the sketch below). Add a new executable if required and note that it wants the path to devenv.exe only, not the file name! Options needs something, e.g. /build Debug, and the platform is likely to be amd64.
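
If you do try the MSBuild route instead, the invocation is roughly as follows, run from a Script task; the solution name and configuration are placeholders, and the MSBuild path varies with the framework version installed:

"C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe" "${bamboo.build.working.directory}\MySolution.sln" /t:Rebuild /p:Configuration=Debug

This avoids needing devenv.exe at all but, as noted above, you may hit missing dependencies that a full Visual Studio install would have provided.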

This should now be enough to manually run the plan and see any errors that are produced by the build in the various logs. A common problem is that the build server rarely has as many libraries installed as the development machine so it will potentially cause dependency errors but that is for another day!

Thursday, 20 October 2016

Azure App Services not ready for the big time yet?

I really like the idea of App Services for web apps. It basically looks like cloud services but with shared tenancy, so a lower cost per "server" than a cloud service.

The other really big winner for me is the much quicker scalability when scaling out. Cloud services understandably take a long time to scale out and cannot handle short traffic bursts for that reason.

App Services on the other hand can scale out in seconds. I think they do this by sharing the code and just increasing the number of virtual servers.

It looks great on paper and deployment is as easy as Right-Click->Publish from Visual Studio, using web deploy and therefore taking seconds. Cloud service deployment is still cryingly slow!

So that's the good news, what's the bad news?

It doesn't seem to work properly!

We set up an app service plan (Standard) and deployed a web application that calls a web service, which accesses a database. We also use Azure Redis Cache.

We fired a load test at the system to see what sort of load we could handle and set up auto-scale at 80% CPU to allow from 2 (minimum) to 6 (maximum) instances.

So what happens when we run the test? Loads of Redis timeout errors. Hmm, the requests per second are only about 100-150 and the response times are less than 1 second, so why is the Redis 1.5-second timeout occurring?

We had a few pains where re-deployments didn't seem to fix the problem, and even though we could see the remote files easily using Server Explorer in Visual Studio, the presence of the machineKey element didn't seem to remove the errors related to "could not decrypt anti-forgery token". Hmmm.

To be fair, it did expose a problem related to the default number of worker threads. In .NET, the thread pool minimum is set to the number of cores (1 in our case) and, beyond that minimum, new threads can each take 500 ms or more to create, easily tipping a request over the 1.5 seconds. So I increased the minimum number of threads available, but we were still getting timeouts even when the number of threads in use was way below the threshold for delays.
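
For reference, raising the minimum looks something like this, called once at startup (the numbers are purely illustrative, not a recommendation):

protected void Application_Start()
{
    // Threads up to these minimums are created on demand without the
    // ~500 ms throttling delay; tune the values for your own workload.
    System.Threading.ThreadPool.SetMinThreads(workerThreads: 50, completionPortThreads: 50);
}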

Maybe we were pegging CPU or memory causing the delays? No idea. The shared tenancy of App Service hosting does not currently allow the real time metrics to display CPU or memory for the individual virtual server so that's no help. It also wasn't scaling up so I guessed it wasn't a problem with resources and why should it be at such a low user load?

I finally got fed up and created a cloud service project, set the instances to the same size as the app service instances, using the SAME redis connection and ran the same test. Only a single error and this pointed to an issue with a non-thread safe method we were using (which is now fixed). No redis timeouts and a CPU and memory barely breaking a sweat.

We ended up in a weird situation where we couldn't seem to get much more than about 100 requests per second from our AWS load test but that is another problem.

Why does a cloud service running the same project work where the same thing on App Services doesn't? I couldn't quite work out what was wrong. A problem with Kudu deployment? Some kind of low-level bug in App Services? A setting in my project that was not playing nice with App Services?

No idea but there is NO way I am going to use App Services in production for another few months until I can properly understand what is wrong with it.

These PaaS tools are great but when they go wrong, there is almost no way to debug them usefully. The problem might have been me (most likely!) but how can I work that out?

Cannot answer incoming calls on Android

Weird problem: Phone rings and/or vibrates but the screen doesn't change. All you can see is the home screen. How can you answer the call? You can't.

Well you can but you need to know what you did!

In my case, I am on the Three mobile network, which is, basically, terrible around where I live and work. They have an app that uses the Internet for calls and texts if you are not in mobile range. If you are, all kinds of pain occurs because texts sometimes go to the app and not the phone etc.

Anyway, like many annoying apps, it decides that it can fill up your notification area with a permanent message saying that Wi-Fi calling is ready. Amazing!

Anyway, I had disabled notifications for the Three app to get rid of that annoying message, but what I hadn't realised is that this also disables the incoming call notification. For some reason, the phone still rings and vibrates but you can't do anything.

I had to re-enable notifications and then, of course, I can answer the phone! I also have to live with the annoying permanent notification.

The same has also occurred with the Google Dialer according to forums, but I haven't seen it myself.

Friday, 14 October 2016

Azure App Services (Websites) - Cannot find the X.509 certificate using the following search criteria

Deployed a brand new WCF .NET web service to Azure App Services and, when trying to load the .svc file in the browser, got the error above.

Here's what you need to know:


  1. You need to upload the required certificates to the Azure portal.
  2. You need to make sure that you are referencing the certs from the CurrentUser store, NOT the LocalMachine store. App Services uses shared hardware, so you can only access the CurrentUser location (see the sketch after this list).
  3. You need to add an App Setting to tell the service which certs to make available to the web service. You only need to do this for certs referenced in the app, not for certs you are only using for an SSL endpoint. The key is WEBSITE_LOAD_CERTIFICATES and the value is 1 or more comma-delimited thumbprints for the certs you want to load.
  4. You CANNOT add this setting in the web.config file alone; despite Azure merging portal and web.config values, it MUST be added to the Application Settings tab in the portal.
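
To illustrate point 2, reading a loaded cert in code looks roughly like this; the thumbprint is a placeholder for one of the values in WEBSITE_LOAD_CERTIFICATES:

using System.Security.Cryptography.X509Certificates;

// Note CurrentUser: searching LocalMachine finds nothing on App Services.
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var matches = store.Certificates.Find(
    X509FindType.FindByThumbprint,
    "YOURCERTTHUMBPRINTHERE", // placeholder - use a real thumbprint
    validOnly: false);
store.Close();
var cert = matches.Count > 0 ? matches[0] : null;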

Thursday, 13 October 2016

Encrypting/Decrypting Web.config sections with PKCS12

What is it?

I'm not sure why I haven't blogged this before, since it can be quite confusing, but this is a cool way of protecting web.config secrets from being seen casually by developers.

The basic idea is that you use an SSL certificate as the key, so the protected sections in web.config are not readable. They will, however, decrypt automatically when used by IIS if the certificate is installed locally.

It is worth mentioning that if a developer can run the code locally, they will still be able to find out the secrets; it is more of a protection against people who can see code in a repository but not run it.

You can use any SSL/PKCS12 certificate for encryption/decryption, but I recommend a self-signed certificate that is in-date (some services, e.g. Azure App Services, will not allow upload of expired certificates). If you use a self-signed certificate, you can control its lifetime more easily if you are worried about it being compromised. If you share one of your public certs, you will probably have more work to do when the cert expires.
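
If you need to create one, something like the following PowerShell (Windows 8.1/Server 2012 R2 or later) should work; the DNS name is arbitrary since the cert is only used as key material:

New-SelfSignedCertificate -DnsName "config-encryption.local" -CertStoreLocation "Cert:\LocalMachine\My"

You can then export it (with its private key) as a .pfx from the certificates MMC snap-in to install wherever decryption needs to happen.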

In theory you should be able to encrypt any web.config section, but in practice certain sections cannot be encrypted because of when the system accesses them. I can't find a useful resource listing which, so you will have to try it and see. It will be obvious if a section fails to decrypt.

So how do you do it?

First, you should reference or install Pkcs12ProtectedConfigurationProvider, which does exist as a NuGet package.

You then use aspnet_regiis.exe, which lives in the .NET framework directory(ies) in C:\Windows\Microsoft.NET.

So how does it know which key to use to encrypt? You need to create a section in your web config that references the thumbprint of the certificate you want to use. It looks like this:

<configProtectedData>
    <providers>
        <add name="CustomProvider"
             thumbprint="d776576a90b5c345f8a9d94e732c86c1076eff78"
             type="Pkcs12ProtectedConfigurationProvider.Pkcs12ProtectedConfigurationProvider, PKCS12ProtectedConfigurationProvider, Version=1.0.0.0, Culture=neutral, PublicKeyToken=34da007ac91f901d" />
    </providers>
</configProtectedData>

aspnet_regiis performs various other functions as well but the two options we are interested in are:

-pe to encrypt a section
-pd to decrypt it.

If you use the -f option as well (giving -pef and -pdf), you can specify a physical location for a web.config, which I have found is easier than using the virtual path (and in some cases you won't have a virtual path).

The -f should point to the directory containing the web.config, not the web.config file itself. You will end up with a command like this:

aspnet_regiis.exe -pef "connectionStrings" "C:\Users\luke\Documents\Visual Studio 2015\Projects\SolutionFolder" -prov "CustomProvider"

Note that -prov specifies the name of the provider entry (from configProtectedData above) that you want to use to encrypt the section.
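
Decryption is the mirror image; to reverse the example above, you would run:

aspnet_regiis.exe -pdf "connectionStrings" "C:\Users\luke\Documents\Visual Studio 2015\Projects\SolutionFolder"

No -prov is needed for decryption, because the encrypted section records which provider was used.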

If you are encrypting a custom section, you might have some fun and games trying to get aspnet_regiis to find and load the DLL that defines the config section. You can use gacutil to add these DLLs to the Global Assembly Cache (GAC) and then use the correct version and public key token (which you can find with a tool like ILSpy) so that aspnet_regiis knows to look for the DLL in the GAC.
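
Installing the assembly is a one-liner (the path here is a placeholder); note that gacutil.exe ships with the Windows SDK rather than the framework itself, so you may need to install that first:

gacutil.exe /i "C:\path\to\MyCustomConfigSections.dll"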

If your section is not at the top level, you need to specify it like system.web\mysection

Sometimes it pretends to work but doesn't. Run it again; it sometimes catches up! Google any errors; they are fairly easy to work out.

You will then end up with a section of the same name, but containing a reference to the custom provider you used and an encrypted XML packet.

Remember to upload the certificate you are using to any systems that will need to read the section otherwise you will quickly get an error!

And as always, start with a really simple example and make sure it works before going crazy and trying to encrypt everything.

Wednesday, 28 September 2016

.Net MVC5 - RedirectToAction not Redirecting in the browser

I had a funny one on a test site I was using. Very simple MVC app, a page that requires a login and then a logout link that does the following:

public ActionResult Logout()
{
    Request.GetOwinContext().Authentication.SignOut();
    return RedirectToAction("Index");
}

However, after calling this code, a 302 is returned to the browser for the redirect but it doesn't include the Location HTTP header pointing to the new location, so the browser does not redirect. Instead, the user has to click the link on the displayed page.

I thought this was very strange and dug into the code that produces the redirect inside System.Web.HttpResponse, and found that the Location header was always added with the same URL that was correctly inserted into the response body. Clearly something was removing the header after it was added and before the response reached the browser.

After I commented out SignOut(), the redirect worked correctly so somehow, for some reason, IAuthenticationManager.SignOut() is removing the Location header, but AFTER it has been added in the next line.

I haven't found a reason on Stack Overflow, but I might dig a bit deeper into SignOut to find out what is happening.
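
In the meantime, one thing worth trying (this is an assumption on my part, not a confirmed fix) is to sign out of the specific authentication type rather than all of them, using the constant from Microsoft.AspNet.Identity:

using Microsoft.AspNet.Identity;

public ActionResult Logout()
{
    // Sign out of just the application cookie instead of every
    // authentication middleware; whether this preserves the Location
    // header in this scenario is untested.
    Request.GetOwinContext().Authentication
        .SignOut(DefaultAuthenticationTypes.ApplicationCookie);
    return RedirectToAction("Index");
}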