Java still has interesting projects

Despite its archaic syntax… here are some of the more interesting Java projects that have crossed my RSS feed.

Big Data ingestion framework – 

Rx-like framework –



Toastr – Non Blocking JS Notifications

Nice notifications demo. Hook it up with SignalR for real-time notifications.

ApplicationCache Douchebaggery

Very entertaining look at the HTML5 Application cache for building offline apps… and why it may not be your saviour.

I am building an offline app presently and had been thinking I would just turn on Application Cache when I was ready. Clearly some more thought is required.
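
For anyone experimenting with it anyway, a minimal cache manifest looks like this (file names are illustrative):

```
CACHE MANIFEST
# v1 - change this comment to force clients to re-download the cache

CACHE:
app.js
app.css

NETWORK:
*

FALLBACK:
/ /offline.html
```

The gotchas in the talk mostly stem from the CACHE section: once a file is listed there, the browser serves the cached copy even when online, until the manifest itself changes.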

Async Execution Queue in TypeScript

I’m in the process of creating a fully client-side (with server-side sync) JavaScript application.
One thing I needed was the ability to queue and execute commands asynchronously but serially.
With jQuery Promises, this is remarkably easy to achieve.

The Gist below is in TypeScript.
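
The original Gist isn’t embedded here; as a rough equivalent, here is a sketch using native Promises instead of jQuery Deferreds (the class and method names are my own):

```typescript
// A queue that runs async commands one at a time, in the order added.
// Sketch using native Promises rather than jQuery Deferreds.
class ExecutionQueue {
  private tail: Promise<void> = Promise.resolve();

  // Chain the command onto the tail so it only starts once every
  // previously enqueued command has finished (or failed).
  enqueue(command: () => Promise<void>): Promise<void> {
    this.tail = this.tail.then(command, command);
    return this.tail;
  }
}

// Usage: commands complete serially even though each is async
// and the first takes longer than the second.
const queue = new ExecutionQueue();
const order: number[] = [];
queue.enqueue(() => new Promise<void>(resolve => setTimeout(() => { order.push(1); resolve(); }, 20)));
queue.enqueue(() => new Promise<void>(resolve => setTimeout(() => { order.push(2); resolve(); }, 5)));
queue.enqueue(async () => { order.push(3); });
```

The key design point is that each enqueue chains onto the previous promise rather than starting the work immediately, which is exactly what the jQuery Deferred `.then` chain gives you.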

Refactoring can only be called refactoring if the *same
tests* can be used to exercise *different implementations* of the
*same behavior*.

Really liked this definition of Refactoring, as opposed to Refuctoring, from the DDD/CQRS group.
It speaks to the scope of what a Unit Test should test… and how!

Refactoring Definition

Jingo: Module Management for Javascript

My new role is around creating the next generation of Digiterre’s customised CRM solution for the financial industry. This has meant dusting off my JavaScript skills, really relearning them, as JavaScript has come a long way since we last danced. One of the things I knew I wanted off the bat was a way to modularise and re-use JavaScript classes; I knew from painful experience how quickly “scripting” in JavaScript can get out of control.

We wanted a framework that would allow us to decompose our functionality into logical units which declare their dependencies. If a developer required one of those modules, they wouldn’t have to worry about manually referencing all the required components in the correct order (and maintaining them).

After a look at the major players, I decided on Jingo and haven’t looked back. I really can’t sing its praises highly enough. Go over and read about it from the horse’s mouth rather than me trying to paraphrase it. It requires a little more work if you want to “jingo-ise” your references like jQuery etc. (I don’t bother with jQuery but do with the others).

Here is an example of how it works from their wiki.
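
The wiki example isn’t reproduced here; as a stand-in, here is a toy sketch of the general idea: a registry where modules declare their dependencies by name, so resolution order is automatic. Note the `declareModule`/`use` API below is invented for illustration and is not Jingo’s actual syntax.

```typescript
// Toy module registry illustrating declared dependencies.
// NOTE: declareModule/use are hypothetical names for illustration,
// not Jingo's actual API.
type Factory = (...deps: unknown[]) => unknown;

const factories = new Map<string, { deps: string[]; factory: Factory }>();
const instances = new Map<string, unknown>();

// A module declares its name, the modules it depends on, and a
// factory that receives those dependencies already resolved.
function declareModule(name: string, deps: string[], factory: Factory): void {
  factories.set(name, { deps, factory });
}

// Resolve a module, recursively resolving its dependencies first,
// so callers never worry about script ordering or manual references.
function use(name: string): unknown {
  if (instances.has(name)) return instances.get(name);
  const entry = factories.get(name);
  if (!entry) throw new Error("Unknown module: " + name);
  const resolved = entry.deps.map(use);
  const instance = entry.factory(...resolved);
  instances.set(name, instance);
  return instance;
}

// Usage: "app" declares it needs "logger"; the registry wires it up.
declareModule("logger", [], () => (msg: string) => "[log] " + msg);
declareModule("app", ["logger"], (log) => ({
  run: () => (log as (m: string) => string)("started"),
}));
```

The point is the inversion: the module states what it needs, and the loader (Jingo, in our case) works out the ordering.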

Tropical Software Development

It has been almost 8 months since I have posted! All my time has been focused on moving to the Caribbean, a lifelong dream of mine. We have been in the Virgin Islands now for 6 months, working remotely for a great software house, Digiterre.

Working remotely in the tropics has been awesome thus far but does come with some pitfalls:

  • Power: We picked accommodation that has a backup generator and have been thankful since day one, as we regularly lose power during the day.
  • Internet: This was what I was most worried about; however, besides being slow (1 Mbps) it has been reliable. I use my iPhone as a backup over 3G.
  • Social: All the sunshine, bikini clad beach babes, sailing and scuba diving in the world can’t replace the geek interaction you get in the office. As sad as that is to admit.

Now that the move is behind us and we are settled in, I’ll try and chronicle some of the things I’m currently working on.

NuGet Developer WorkFlow

During the proof of concept of migrating our internal dependency resolution from SVN Externals to NuGet, I forgot an important stakeholder: developers. 🙂

When the build updates the packages; how does a developer ensure their environment is up to date?

Edit: There is now a NuGet PowerTools package that will add a Package Console command to restore all packages.
I still like the Tortoise hook and still wish I could specify the packages location, but we’re getting closer to ‘there’.

Check In Packages

The obvious answer is to check in the packages. After all, that’s how the SVN Externals solution worked, and it ensures the repeatability tenet of continuous builds. Sometimes, however, it’s not an option, for any number of reasons: company policy, ruthless team leads or dictating architects; take your pick.

Check In Packages to a “temp” repository.

The way our SVN Externals solution worked was to keep a separate repository entirely for “packages”. This repository could be blown away when it got bloated and replaced by a new version. Because it was used for multiple solutions/company projects, it also enabled remote workers to keep their own copy of the repository locally (using DNS or hosts file entries).

This method can still be used with NuGet and means that an SVN update updates everything.

Pre-Build Step

A pre-build step can be added for each project to run NuGet Install. I don’t really like this as it is an unnecessary build step. This task is an “Update” task not a build task and the speed of my build is sacrosanct to me.

Manually with Batch File

After updating the local copy, the Developer can run a batch file manually to update packages.

for /r %cd% %%X in (*.csproj) do (nuget install %%~pX\packages.config -OutputDirectory packages)

This batch file command runs nuget install for every packages.config in a solution.

Automate the Batch File with Tortoise Hooks

Obviously an automated solution is preferred to adding a secondary step to the developer’s workflow. If your VCS client supports hooks, such as TortoiseSVN client hooks, you can check in the above batch file and get developers to hook it into their VCS update.

Configure NuGet to use shared packages repository

The true Holy Grail for people who don’t want to check in packages is the ability to tell NuGet where packages live. It could then be configured to use a shared network drive to find the packages. This would be the same repository used by the build, so by default it is always up to date. Unfortunately this is not yet a feature of NuGet. It is being discussed as a new feature, so if it’s something you are keen on, go vote for it at

Diamonds Aren’t Forever – The Build

This example will use TeamCity as the build platform, as well as the alpha NuGet trigger and Update/Install Packages build step found here.
It will also use my preferred package update strategy defined here.
The build targets are specified in the build directory of the solution (here).
The build targets are linked into the solutions via the Named MSBuild Hook.

The Build

The automated build for each component has 5 main build steps:

  1. Update and Install packages
    This is done by adding the Alpha NuGet Build Step and selecting the options relevant to your update strategy. I make liberal use of Build Parameters here.
  2. Increment Version Number
    This is done via the TeamCity.targets.
    It takes the TeamCity build number, makes it the revision number in the AssemblyInfo. The pack command later uses this as the package version.
  3. Build
  4. Check In
    This is not implemented in the example and will depend on your VCS. However updating packages will change the packages.config and the project files of your solution. These will need to be checked in. If you also check in packages, these will need to be added as well.
  5. Create, Lock Versions, Publish Package
    Creating and publishing a package are trivial with NuGet. Package.Targets contains targets that Exec NuGet to create a package. I use a shared drive for my repository, so publishing is just a matter of pointing the output directory of pack at the repository UNC path. However, there is also a publish command in NuGet if you want to push to the official feed. The interesting part of this step is my need to ensure all dependencies read from my project file are locked to the version used to build the current version. This involves changing all dependency elements in the nuspec file inside the package. The target unzips the package, runs the LockDependencyVersions target found here and then re-zips the package.
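
To make the version-locking part of step 5 concrete, here is a sketch of the rewrite applied to the nuspec’s dependency elements. The real work is done by MSBuild targets; the function name, regex, and versions below are illustrative only. NuGet’s `[1.0.0.42]` bracket notation means “exactly this version”.

```typescript
// Sketch of the dependency-locking idea: rewrite every <dependency>
// element in a nuspec so its version is pinned to the exact version
// present at build time. Illustrative only; not the real targets file.
function lockDependencyVersions(
  nuspec: string,
  lockedVersions: Record<string, string>,
): string {
  return nuspec.replace(
    /<dependency id="([^"]+)" version="[^"]*"/g,
    (match, id: string) =>
      id in lockedVersions
        // [x.y.z] is NuGet's "exact version" range syntax.
        ? `<dependency id="${id}" version="[${lockedVersions[id]}]"`
        : match,
  );
}

// Usage: the internal Common dependency gets locked to the version
// that was just built; external dependencies are left untouched.
const nuspec = `<dependencies>
  <dependency id="Common" version="1.0" />
  <dependency id="log4net" version="1.2" />
</dependencies>`;
const locked = lockDependencyVersions(nuspec, { Common: "1.0.0.42" });
```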


The last piece of the TeamCity puzzle is triggers; we want a build to be triggered if there is a new dependent package. At the moment it is a little clumsy as you must create a trigger per package you wish to monitor. I only monitor internal dependencies and then only direct internal dependencies.

So, for example, whilst the UI project uses Common, Common is also a dependency of Service A and Service B, so I only need to add triggers for those two packages: when Common changes, Service A and B will be triggered, and when they publish their new versions the UI build will be triggered.
