Tuesday, 18 July 2017

.Net Web API Validation

So I'm writing a .Net Web API service to call from some mobile apps. Before you ask, I haven't used .Net Core since it requires all the support libraries to be portable, and that is not a 5 minute job!

Anyway, it basically works, but I found a couple of funnies that have been reported elsewhere. They are not things that are obviously broken - thank goodness for unit tests!

1) I have an attribute that validates the model required by the API action and then sets BadRequest if the model doesn't validate - this saves calling if (ModelState.IsValid) everywhere. It didn't seem to work: IsValid was true when I called an action with no parameters. The reason? If the model is null, it passes validation! Terrible but true. I had to add an additional check that the model was not null before checking whether it was valid.
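For illustration, here is a minimal sketch of the kind of filter I mean (Web API 2 style; the attribute name and messages are made up for the example, not my exact code):

using System.Net;
using System.Web.Http;
using System.Web.Http.Controllers;
using System.Web.Http.Filters;

public class ValidateModelAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        // A null model sails straight through ModelState validation, so reject it explicitly
        if (actionContext.ActionArguments.ContainsValue(null))
        {
            actionContext.Response = actionContext.Request.CreateErrorResponse(
                HttpStatusCode.BadRequest, "Request body is missing or could not be deserialised");
            return;
        }

        // Otherwise fall back to the normal model validation result
        if (!actionContext.ModelState.IsValid)
        {
            actionContext.Response = actionContext.Request.CreateErrorResponse(
                HttpStatusCode.BadRequest, actionContext.ModelState);
        }
    }
}

You can then decorate individual actions (or register the filter globally) instead of sprinkling if (!ModelState.IsValid) checks around every controller.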

2) The RegularExpressionAttribute does not validate empty strings against the regex. It would be nice if this was controlled by a property on the attribute, but it isn't - it just doesn't. Again, I had to subclass RegularExpressionAttribute, override IsValid to ensure the value is not empty and then call the base class IsValid. I then subclassed this into my specific attributes so that they all work as expected.
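Something along these lines (again a sketch - the derived attribute and its pattern are purely illustrative):

using System.ComponentModel.DataAnnotations;

public class RequiredRegularExpressionAttribute : RegularExpressionAttribute
{
    public RequiredRegularExpressionAttribute(string pattern) : base(pattern) { }

    public override bool IsValid(object value)
    {
        // The base class treats null/empty values as valid, which is not what we want here
        if (string.IsNullOrEmpty(value as string))
        {
            return false;
        }
        return base.IsValid(value);
    }
}

// A specific attribute then just supplies its own pattern (this one is made up)
public class AccountCodeAttribute : RequiredRegularExpressionAttribute
{
    public AccountCodeAttribute() : base("^[A-Z]{2}[0-9]{6}$") { }
}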

Tuesday, 20 June 2017

Client Certificate does not appear in Windows Credential Manager

This is one of those jobs I have done several times, but each time round I couldn't remember why it didn't work.

You add a client certificate to your personal store under Current User; it is in date and it chains OK, but when using Windows Credential Manager to add a connection, it doesn't offer you this certificate to choose.

As pointed out here, you have to edit the properties of the certificate and untick "Smart Card Logon" and "Any Purpose", otherwise Windows will ask for a smart card to access the client cert!

Wednesday, 14 June 2017

OutOfRangeInput One of the request inputs is out of range on Azure CDN

I was setting up a new environment that was (theoretically) the same as an existing system. I created a new CDN on Azure, pointed it at blob storage and tried to access it, and Azure gave me a rather esoteric (and apparently catch-all) error.

Most answers that I found related to invalid naming, i.e. requesting a table with upper-case letters when table names are not allowed to contain upper-case letters (which matches the error message).

The issue here is that the CDN is hiding an error that is actually a storage error and, surprise surprise, it has nothing to do with the request but is related to a permission error.

I had set up the blob container with "Private" access but it actually needs "Blob" access, which allows anonymous clients to read blobs but not write them.
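For reference, the same access level can be set in code rather than the portal; this is a rough sketch assuming the classic WindowsAzure.Storage package (the connection string and container name are placeholders):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class CdnContainerSetup
{
    public static void AllowAnonymousBlobReads()
    {
        // Placeholder connection string and container name
        var account = CloudStorageAccount.Parse("<storage connection string>");
        var container = account.CreateCloudBlobClient().GetContainerReference("cdn-content");

        // "Blob" access: anonymous clients can read individual blobs,
        // but cannot write them or list the container contents.
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });
    }
}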

I updated it to use the correct permission but it still didn't work because... it's a CDN and everything takes ages to propagate. I waited a while and it worked.

Wednesday, 24 May 2017

Build sqlproj projects with MSBuild on a Build Server

God bless Microsoft: each time a new Visual Studio comes out, they make an improvement, like making the install directories more logical and allowing better side-by-side installations. The problem? Most of these changes are not backwards compatible, which creates a whole load of compatibility problems.

Install VS 2017 and create a database project (sqlproj) in a solution. Open up the sqlproj file and you will see some really poorly thought-out targets:

<PropertyGroup>
    <VisualStudioVersion Condition="'$(VisualStudioVersion)' == ''">11.0</VisualStudioVersion>
    <!-- Default to the v11.0 targets path if the targets file for the current VS version is not found -->
    <SSDTExists Condition="Exists('$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v$(VisualStudioVersion)\SSDT\Microsoft.Data.Tools.Schema.SqlTasks.targets')">True</SSDTExists>
    <VisualStudioVersion Condition="'$(SSDTExists)' == ''">11.0</VisualStudioVersion>
</PropertyGroup>

Basically, what this says is: "if I don't know what the Visual Studio version is when I build, I will assume v11 (VS2012), look for its directories and fail if I don't find them" - rather than what most people would do, which would be either to fail if the version is not passed in, or to hard-code the version that was current when the project was added.

Run this on a build server with MSBuild instead of Visual Studio and you might see the following error:

The imported project "C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\MSBuild\Microsoft\VisualStudio\v11.0\SSDT\Microsoft.Data.Tools.Schema.SqlTasks.targets" was not found

Which makes sense because I don't have VS2012 installed on the build server at all.

I eventually realised that the issue is that Visual Studio injects the VisualStudioVersion property into the build whereas MSBuild does not. A simple parameter passed to MSBuild (/p:VisualStudioVersion=15.0) sorts that problem and tells it to use VS2017, which I have installed on the server, although only the Build Tools.

I then get a different error:

The imported project "C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\MSBuild\Microsoft\VisualStudio\v15.0\SSDT\Microsoft.Data.Tools.Schema.SqlTasks.targets" was not found

Well, it looks the same but this time it should work, since I have v15 installed. I had a look and, sure enough, the SQL tools were not installed. Installations have changed in VS2017, and although I tried to install the Data Processing workload for the Build Tools, the option was not there. I installed the workload using the VS2017 Community edition and checked for the targets file, which was now there, but the build failed again.

Looking closer, I noticed that the path was almost correct: MSBuild uses the BuildTools subdirectory of 2017, whereas proper Visual Studio uses Community (in my case). Basically, there is no obvious way to install SSDT into the Build Tools area, which is where MSBuild looks. Instead, I copied the MSBuild\Microsoft\VisualStudio\v15.0\SSDT folder from Community into BuildTools (keeping its directory structure) and also copied over the Common7\IDE\Extensions\Microsoft\SQL* directories, which are used by the sqlproj targets, and the build worked!

Weird errors deploying new MVC site to IIS with Web Deploy

This is a brand new MVC .Net site which has old functionality ported into it, and it works fine on my local machine. Deploy it to an internal web server using Web Deploy and I get some strange errors:

The first is obvious: .Net 4.5.2 is not recognised. Using the Web Platform Installer, I downloaded and installed it.

Then I get a weird compiler error: "The compiler failed with error code -2146232576". This was simply because the site was trying to compile but the App Pool identity did not have modify access to the web site folder, so I added that permission.

Then I get another weird error: "%1 is not a valid Win32 application". This basically means that something 32-bit is being loaded by a 64-bit-only app pool. I tried enabling 32-bit applications in the app pool but that didn't fix it. Then I found that there is an issue running the Roslyn compiler (I don't know why) and the workaround is to untick "Allow precompiled site to be updatable" in the publish settings. This means everything is compiled during deployment and nothing needs to compile in place on the server.

Not sure why these things are alive in the wild but at least the site works now.

If you do need to be able to update the site, you might be stuck for now....

Tuesday, 9 May 2017

New to MongoDB and starting out

When you first try something new, you don't know what you don't know. Unfortunately with MongoDB, there is a large mixture of old and new tutorials. Some of them are still linked from the official site even though they are not relevant any more.

So there are three things I wanted to point out when using the instructions from MongoDB and doing your first operations on a database.

Firstly, the instructions about setting up auth and creating an admin user are incomplete. You try to connect to a test database and it doesn't work. Why? Because the official docs only tell you to give the admin user the userAdminAnyDatabase role, which is exactly what it sounds like. If you are just playing around and don't want to start creating per-database users, you will also need the dbAdminAnyDatabase and readWriteAnyDatabase roles. If you have already set the user up, you will need to use the console and run db.updateUser().

Secondly, you should know that the operations on the SDK are lazily invoked. For instance, if you call GetDatabase(), it will return a meta-object whether or not the client can reach the server. It is only when you actually need to query or write to the database that the connection is attempted, and at that point the operation might fail for several reasons. This means that you cannot, for instance, call GetCollection() and test for null to see if the collection exists, because it will never be null even if the collection doesn't exist (you'll just find out later!). Instead, in that example, you would use something like await db.ListCollectionsAsync(), which does block and call onto the database.
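A small sketch of what I mean, using the C# driver (the method and parameter names here are just placeholders):

using System.Linq;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

public static class MongoChecks
{
    public static async Task<bool> CollectionExistsAsync(
        string connectionString, string databaseName, string collectionName)
    {
        var client = new MongoClient(connectionString);

        // Both of these return immediately, even if the server is unreachable or the
        // database/collection don't exist - neither result will ever be null.
        var db = client.GetDatabase(databaseName);
        var collection = db.GetCollection<BsonDocument>(collectionName);

        // Only here does the driver actually talk to the server, so this is where
        // connection and authentication failures will surface.
        var cursor = await db.ListCollectionsAsync();
        var collections = await cursor.ToListAsync();
        return collections.Any(c => c["name"].AsString == collectionName);
    }
}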

Thirdly, you should know that users are usually added to individual databases, so you would need to use the database name as part of the credential. HOWEVER, if you need to access several databases with the same user, you should instead create a single user in admin (which is the name you would pass as the database in the credential) and add roles to this user that specify the databases - see the example here and the large list of built-in roles here. Please don't deploy production systems with super-user connections!
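To show what I mean about the credential: with the C# driver, the connection string points at admin as the authSource even though you then work with other databases (all names here are placeholders):

using MongoDB.Driver;

public static class MongoConnection
{
    public static IMongoDatabase GetSalesDatabase()
    {
        // The user was created in the admin database, so admin is the authSource,
        // even though the code then uses a different database.
        var client = new MongoClient(
            "mongodb://appUser:appPassword@localhost:27017/?authSource=admin");
        return client.GetDatabase("sales");
    }
}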

Tuesday, 2 May 2017

Removing google secondary email address

Just when you thought Google couldn't make their interface any more confusing, I got tripped up, couldn't find anything useful by searching Google and had to work it out myself. Not the pinnacle of usability!

I wanted to delete someone's secondary email address, which was actually an alias that had been added to the user so she would continue to receive an ex-employee's emails. She didn't want them any more.

I opened up the user's details in the Admin screens and pressed Edit next to the secondary email address in the contact information. I deleted the email address, pressed Update User, and it all looked happy, but behold, the email was still listed as a secondary email address.

The problem? You first have to delete the alias for the user and press Save. Then you can edit the contact information, remove the email address and it stays removed!