
Saturday 25 October 2014

Testing WebApi with Owin Hosting

Lately I have been working with the OWIN specification, specifically OWIN hosting via Katana, Microsoft's implementation of OWIN. During this time I have come across some tools that have helped me build my project. One in particular stood out, so I thought I would share it.

Brief introduction to Owin hosting

Here is a brief example of an OWIN hosting Startup class initializing WebApi (full solution here).

using Owin;
using System.Web.Http;

namespace WebApi
{
    public class Startup
    {
        public void Configuration(IAppBuilder builder)
        {
            HttpConfiguration config = new HttpConfiguration();
            config.Routes.MapHttpRoute("Default", "{controller}/{customerID}", new { controller = "Customer", customerID = RouteParameter.Optional });

            builder.UseWebApi(config);
        }
    }
}
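
For context, here is a minimal sketch of a controller that the "Default" route above could map to. This CustomerController is illustrative only; it is not taken from the linked solution.

using System.Collections.Generic;
using System.Web.Http;

namespace WebApi
{
    // Hypothetical controller matched by the "Default" route above.
    public class CustomerController : ApiController
    {
        // GET /customer/42 binds 42 to customerID.
        public string Get(int customerID)
        {
            return "Customer " + customerID;
        }

        // GET /customer also works, because customerID is optional in the route.
        public IEnumerable<string> Get()
        {
            return new[] { "all", "customers" };
        }
    }
}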

This is the entry point to your API, regardless of whether it is hosted in IIS, a Windows Service or even a command window. It is the first part of the code that is touched in the solution, meaning it's the first place you could break your site unless you have a test covering it.
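
To illustrate the command window case, here is a minimal self-host sketch. It assumes the Microsoft.Owin.Hosting and Microsoft.Owin.Host.HttpListener packages, and the URL is just an example.

using System;
using Microsoft.Owin.Hosting;

namespace WebApi
{
    class Program
    {
        static void Main()
        {
            // Self-host the same Startup class from a console application.
            using (WebApp.Start<Startup>("http://localhost:8080"))
            {
                Console.WriteLine("Listening on http://localhost:8080 - press Enter to exit");
                Console.ReadLine();
            }
        }
    }
}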

To learn more about OWIN Katana, see these examples http://www.asp.net/aspnet/samples/owin-katana.

Why is this important?

This pattern of hosting is the direction ASP.NET is heading; the latest vNext builds use the same approach.

Testing boundaries

Current trends in testing seem to be in a state of flux. Many people have different views on the best way to test, an argument I'm not trying to take part in here. However, one of these views is to test from the boundaries rather than unit testing. In our API's case, the boundaries are Startup.cs and likely a database or some other external resource: the first and last entry points of our solution.

If you want to know more about boundaries, check out Ian Cooper's presentation on Hexagonal Architecture; I saw it at DDD East Anglia and it clears up a few points of confusion.

Why does the OWIN host make this easy?

One of the NuGet packages that came out with the introduction of OWIN hosting is the Microsoft.Owin.Testing library. With it we can call our API from the outside, from the first boundary point of our code.

When using the Microsoft.Owin.Testing library, the tests run against an in-memory host. Any calls to the host are routed via your Startup.Configuration, which in our example passes them on to the WebApi framework.

This means no setting up a host (IIS, Windows Service...) before you run your test. The tests are completely self-contained.

Creating a test

  1. Create your API using OWIN hosting, as shown earlier.
  2. Create a Unit Test project.
  3. Add a reference to your API project. This allows us to reference the Startup class.
  4. Add the "Microsoft.Owin.Testing" NuGet package.
  5. Add the following code to your test (you will need using statements for Microsoft.Owin.Testing, System.Net.Http, System.Collections.Generic and System.Linq), changing the URL to match your API.

// TestServer is IDisposable, so wrap it in a using block.
using (var server = TestServer.Create<Startup>())
{
    var response = server.HttpClient.GetAsync("/hello/world").Result;
    var result = response.Content.ReadAsAsync<IEnumerable<string>>().Result;

    Assert.AreEqual("helloworld", result.First());
}

Now you can run your test knowing that all your routing and configuration is correct, without setting up IIS or a service to host your solution.

For some other examples, see https://github.com/justeat/openrasta-hosting-owin/tree/master/src/OpenRasta.Owin.Test

Not just WebApi

This doesn't just work with WebApi! Any framework that supports OWIN hosting can run within the Microsoft.Owin.Testing in-memory host; Nancy and SignalR, for example, both have OWIN support. A Nancy sketch follows below.
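
As a rough illustration of how another framework slots in, here is what a Nancy Startup might look like; this is a minimal sketch assuming the Nancy.Owin NuGet package. TestServer.Create<NancyStartup>() then works exactly as it did for WebApi.

using Owin;

namespace NancyApi
{
    public class NancyStartup
    {
        public void Configuration(IAppBuilder app)
        {
            // Nancy plugs into the same OWIN pipeline that WebApi used above.
            app.UseNancy();
        }
    }
}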

Conclusion

Whether you want to do integration testing, boundary testing or feature testing, I believe the OWIN in-memory host can get you there. Removing any prior server configuration before running the tests also makes collaboration across different machines easier. With Microsoft following the same patterns in vNext, it seems like a promising road map to be on.

**UPDATE**

Since releasing this post I have been working with JustFakeIt. This is a nice wrapper around an in-process host, using similar methods to those above. The benefit is that we can use it like many mocking frameworks, with expected results and ignored parameters. Check it out: https://github.com/justeat/JustFakeIt.
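
From memory, a JustFakeIt test looks roughly like the sketch below. Treat the exact API as an assumption and check the README for the current version; the port, path and response here are made up.

using (var fakeServer = new FakeServer(12345))
{
    // Expectation: a GET to /customer/123 returns this canned body (illustrative values).
    fakeServer.Expect.Get("/customer/123").Returns("{ \"name\": \"Tom\" }");
    fakeServer.Start();

    // Point the code under test at http://localhost:12345 and assert as normal.
}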


Sunday 31 August 2014

Finally moved to the cloud

I have been with GoDaddy for a while now, but I have noticed, as my site progresses and my Mini Projects become more complicated (and heavier), that GoDaddy was getting harder to manage and was struggling with the load. I cannot really complain, though; I bought this hosting package for peanuts.

Luckily for me I have an MSDN subscription, and with that comes £35 of free Azure credit per month... It was time to move the site. I kept putting it off, assuming it would take a while and that I would struggle with the GoDaddy DNS, but thanks to YouTube I found a simple walkthrough, with an amazing bit of background music :).

How To Forward a GoDaddy Domain name to Windows Azure Website



In the end it took about 20 minutes, including some reading, to get up and running.

Just watch out when adding your CNAMEs: you will need to add "awverify" and "awverify.www" and change "www". It was just a little hard to see in the video due to the resolution.
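
As a rough guide, the records end up looking something like the list below, assuming your Azure site is yoursite.azurewebsites.net (check the Azure portal for the exact verification values):
  • CNAME "awverify" pointing to awverify.yoursite.azurewebsites.net
  • CNAME "awverify.www" pointing to awverify.yoursite.azurewebsites.net
  • CNAME "www" pointing to yoursite.azurewebsites.net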

So now when you visit tomharris.net you are in the cloud.


Tuesday 15 April 2014

Agile development - Remote/Distributed teams


Applying Agile methodologies can be difficult at the best of times; getting the whole business to buy into the concept is a tough sell. I see the key strength of Agile as regular communication via demos, scrums, kanban… all of these are about communicating progress. However, what do you do when your teams are spread globally or sit on customer sites? In that situation the Kanban whiteboard and post-its don't work.
I currently work in a company where the development team is spread across multiple regions, from Vietnam to the UK with many stops in between, and people work from home on a regular basis; all of this makes communication hard. To overcome it we use a couple of techniques to bridge the gap and make it feel like we are all in one office.


Daily Scrum

This goes without saying, but we use Skype/Lync; both tools allow our whole team to do a stand-up (sat down with headsets). Ideally get people to turn their video cameras on where possible, and keep the call short and sweet.


Digital Team Board

Dashboards

We are traditionally a Microsoft house, so we opted for TFS as our project tracking system. The benefit of TFS 2012 is the introduction of Web Access and its dashboards. This new Web Access brings a wealth of tools with it which, in most cases, just don't exist in the Visual Studio version.

Team Capacity

In the TFS dashboard you can also configure and view your team capacity vs. requirements. This is a really powerful live view of each individual's ability to deliver, and it allows you to take holidays into consideration per sprint/iteration.


Digital Kanban

In TFS you also have several Kanban-style boards that allow you to share progress at multiple levels.
  1. Work>Board>Requirements
  2. Work>Board>Team
  3. Work>Backlog>Board

All are pretty similar but obviously depict different information. One great feature of the Backlog board is the ability to customise the columns, which allows you to add more steps to your process very easily.

The key is to share these boards during your daily calls with the team, and to share them outside the team as well. The point is to replicate the whiteboard on the wall.

Live Burndown

Using the information gathered about requirements and capacity, TFS can also provide a burndown; this is a great way to share the estimated completion date with non-technical teams.
A burndown is rarely a perfectly straight line; in any process you will have times when development is more productive than others. Be careful when sharing it that people do not get scared that you are behind, or add work because you are ahead: the line is a prediction, not a fact.

Custom Counters

On the home page of TFS 2012 you also have the ability to configure TFS query counters, ideal for showing bug counts etc. You can also add the build status to this screen; for continuous integration this is key, as it will show the whole team that the build broke before they go home for the day.

Alternatives to TFS Dashboard

There are plenty of add-ons for TFS which can achieve something similar and more; below is a list, though I have not tested them all.
http://scrumdashboard.codeplex.com/
http://www.telerik.com/teampulse/tfs
If you do not use TFS you still have alternative options; with TeamCity, for example, you can use
http://www.sonarqube.org/

But where is my whiteboard

So, all the above is great, but how do we make sure people look at it? "When we weren't digital we could make sure the team saw it every day by having it on the whiteboard in the dev area" is the objection I can hear. The simple answer is that you cannot make sure people look at it unless you keep a board, so let's have a digital board.

Display

We opted for a 42-inch screen in each location, local and remote (obviously we could not enforce this on home workers). We also needed something to run the display, so we repurposed an old laptop; most companies have one, and it really doesn't have to be anything special.


Sharing content

Now we had our screen we needed some content. I wanted a scrolling display showing everything from burndowns and capacities to the company Twitter feed; the main goal is to make it something worth looking at daily.
Things I have on our screen:
  • Company Twitter feed
  • Each Project Dashboard
  • Burndown chart (note: for this you need the TFS team chart API; press F12, click on the chart and it will have an image URL, and that URL is your API).
  • Team Capacities
  • Popular Blogs
  • Microsoft Training sessions
  • Company Logo
  • Team Social Pictures
One of the key things is to pick content that will change naturally; we need to make the screen engaging.

Adding live web pages (TFS Dashboards)

To achieve this I came across a tool called LiveWeb, which is an add-in for PowerPoint; you can follow the guide to setting up a presentation containing web pages here. The key benefit LiveWeb brings is that your web pages refresh each time you load a slide, showing only the latest information.

Looping your PowerPoint

In PowerPoint 2013 you have the ability to loop your slide show based on timings, as below.
  • Click on “Set up Slide Show”

  • Select "Loop continuously until 'Esc'" and "Using timings, if present"


  • For each slide, set the timings. You do this by clicking on "Start Recording from Current Slide…", waiting for the length of time you want the slide shown, and then pressing Esc. Do this for every slide until all of them have a timing associated; this process also works well for adding new slides, as you can change the timing for the current slide only rather than all of them.

  • Press F5 and away you go.
You also have the option to broadcast the presentation; with this you could potentially let other remote locations view the exact same presentation, although I have never tried it.
More to come as we continue our journey of multiple-location development…

Tuesday 8 April 2014

Hadoop for Windows 7 64bit


I recently got a new job. Yay for me! Part of this new job involves working with Hadoop and MapReduce; I had previously read the tutorials and understood how it all works, but had never got into the actual code. As a historically Microsoft developer it was a scary thought: stepping into Unix and Java seemed like a challenge. Despite the Java fear I could see the benefits of starting to step away from Microsoft and into the new open-source world of Big Data. However, baby steps are required, and stepping away from Windows was one step too far for me right now; I have so much other development to do and mostly use Visual Studio.
I started by following some tutorials for setting up Hadoop on Windows. There is a lot of documentation out there, but sadly it is not up to the standard you might expect, mainly because Windows plus Hadoop is not so common. I struggled with this for a couple of days and felt it was my duty to try to help anyone in my position. So, please find below a list of tips and tricks to get you up and running. I doubt it is foolproof, but I'm sure it will save you some time.

SSH service setup

I started by following this tutorial (https://gist.github.com/tariqmislam/2159173), but found what seemed to be a better SSHD guide (http://evalumation.com/blog/86-cygwin-windows7-sshd) and then a potentially even better one (http://venuktan.wordpress.com/2012/11/14/setting-up-hadoop-0-20-2-single-node-cluster-on-ubuntu/).
It is worth pointing out that I followed a combination of these to get to my goal.

Cygwin

First things first, you need an introduction to Cygwin. The site mentions what it is and is not, but the best explanation I have seen is from Wikipedia: "Cygwin provides native integration of Windows-based applications, data, and other system resources with applications, software tools, and data of the Unix-like environment".

Problems with installing Cygwin

The install is fairly easy, but I did come across some problems, one of which is picking the right mirror. You pick the mirror in the installer itself. It is almost impossible to know which one is right for you, and I found some of them resulted in a bad install, so you have to go back and pick another; a simple resolution, but annoying to figure out.

Problems with Setting up SSH

1) Running the SSH install (privilege separation issue)
No matter how many guides I followed, I always came up against an issue with privilege separation. If you cannot get your service to start then you likely have the same issue; you can test this by running ssh.exe in the Cygwin tool to see the full error. I found two solutions: one felt like a hack, the second changed my life (maybe a bit dramatic, but it helped).
  • Add config: add "sshd:x:71:65:SSH daemon:/var/lib/sshd:/bin/false" to the "etc/passwd" file in the Cygwin directory. You need to change the user and group IDs (71 and 65 above) to real IDs; to obtain these numbers just type "id" at the Cygwin prompt. See here for the details that led me to this.
  • Install param: rather than following all the steps to install, I found you can just run "ssh-host-config -y"; it was by far the easier option.
2) Uninstall and start again
A tip: if it is going wrong, uninstall and start again. If you already have the service installed, use the command line below.
  • C:>sc delete [service name]
3) IPV6 Error
When accessing the SSH service I kept getting an error, as seen here. I found that if you run the command below after updating the ssh config as described in the link, the error goes away.
  • $ ssh localhost -oAddressFamily=inet
However, I never really solved this, as the issue came back when I later ran Hadoop's start-all. Despite that, it didn't cause any major problems.
Hopefully you now have a service up and running.

Hadoop setup

The tutorials I followed all work against an old version of Hadoop, which you can find here. If the link is broken then you need to find version 0.20.2 to run this. I found that newer versions have little or no support for Windows; if you find details of a newer version working on Windows, let me know.
1) Directory
For me it was not clear where to place Hadoop; I in fact tried several places, but in the end you need to put it in the "Home/{UserName}" folder. Before starting the install you need to run "cd hadoop"; that wasn't obvious to me, and I kept struggling because I was in the wrong directory.
2) Bug Alert
Perhaps this is fixed in newer versions, but at the point of formatting you will be asked to confirm; make sure you use an uppercase "Y". See here for details of the bug.
3) Settings
You will need to configure some settings outside of Cygwin, one of which is JAVA_HOME. If you are not familiar with environment variables then look up how to set one; see here for details.
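For example, from a Windows command prompt you can set it persistently with setx; the JDK path below is an assumption, so use your own install location (ideally one without spaces, for the reason in the next tip).
  • C:>setx JAVA_HOME "C:\Java\jdk1.6.0_45"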
4) Bug Alert 2
"bin/hadoop: line 320 : C:\Program: Command not found" in the console. The route cause of this is the fact we have a space between "Program" and "Files". You can easily resolve this with the fix detailed in the link below http://stackoverflow.com/questions/12378199/hadoop-configuration-on-windows-through-Cygwin.
5) Deprecated commands
It appears Hadoop, or Cygwin, has moved on since the tutorial I followed; as a result you will need to amend your core-site.xml to include "hdfs://localhost:9100".
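For Hadoop 0.20.2 that means setting fs.default.name in conf/core-site.xml, roughly as below (the port 9100 is taken from above; adjust if yours differs):

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9100</value>
  </property>
</configuration>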

Looking at the HDFS

Technically, if you have followed the tutorials and worked through the issues above, you should have Hadoop up and running. You can validate this by looking at the Hadoop web UIs (by default the NameNode at http://localhost:50070 and the JobTracker at http://localhost:50030).
You can also make sure that the Hadoop HDFS system is in place by following the details in this link (http://stackoverflow.com/questions/8209616/how-to-read-a-file-from-hdfs-through-browser)

You can also try running the MapReduce examples, which live in "{drive}:\cygwin64\home\{username}\hadoop\src\examples\org\apache\hadoop\examples". You run these with the following command in Cygwin, or similar.
  • $ bin/hadoop jar hadoop/examples.jar wordcount /user/t1/tharris/input output2

Update

Although this might seem like a great way for a Windows developer to get started, the reality is that you can achieve all of this in minutes using AWS or Azure. Basically, save your time on setup and invest it in features.