Despite its archaic syntax… some more interesting projects that have crossed my RSS feed.
An Rx-like framework – https://github.com/reactor/reactor
A very entertaining look at the HTML5 Application Cache for building offline apps… and why it may not be your saviour.
I am building an offline app presently and had been thinking I would just turn on Application Cache when I was ready. Clearly some more thought is required.
One thing I needed was the ability to queue and execute commands asynchronously but serially.
With jQuery Promises, this is remarkably easy to achieve.
The Gist below is in TypeScript.
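As a minimal sketch of the idea (shown here with native Promises rather than the jQuery Deferreds the Gist used, purely for illustration, and with hypothetical names like `CommandQueue`): each queued command is chained onto the tail of a single promise chain, so it only starts once the previous command has settled.

```typescript
// A command is any function that returns a promise of its completion.
type Command = () => Promise<void>;

class CommandQueue {
  // The tail of the chain; every new command is chained onto it.
  private tail: Promise<void> = Promise.resolve();

  // Queue a command to run after all previously queued commands.
  enqueue(command: Command): Promise<void> {
    // The catch keeps the queue alive if a command fails,
    // so later commands still run.
    this.tail = this.tail.then(command).catch(() => { /* swallow */ });
    return this.tail;
  }
}
```

The returned promise resolves when that particular command has finished, so callers can still react to individual completions while the queue guarantees serial execution.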
Refactoring can only be called refactoring if the *same tests* can be used to exercise *different implementations* of the
Really liked this definition of Refactoring, as opposed to Refuctoring, from the DDD/CQRS group.
It speaks to the scope of what a Unit Test should test… and how!
We wanted a framework that would allow us to decompose our functionality into logical units which declare their dependencies. If a developer required one of those modules, they wouldn’t have to worry about manually referencing all the required components in the correct order (and maintaining them)
After a look at the major players, I decided on Jingo and haven’t looked back. I really can’t praise it highly enough. Go over and read about it from the horse’s mouth rather than me trying to paraphrase it. It requires a little more work if you want to “jingo-ise” your references like jQuery etc. (I don’t bother with jQuery but do with the others)
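The core idea, modules that declare their dependencies so developers never maintain reference order by hand, can be sketched with a toy loader. This is not Jingo's actual API (see its docs for that); `defineModule`/`loadModule` are hypothetical names for illustration only.

```typescript
type Factory = (...deps: unknown[]) => unknown;

// Registered module definitions and a cache of resolved instances.
const registry = new Map<string, { deps: string[]; factory: Factory }>();
const cache = new Map<string, unknown>();

// A module names its dependencies; it never states load order.
function defineModule(name: string, deps: string[], factory: Factory): void {
  registry.set(name, { deps, factory });
}

// The loader resolves dependencies recursively, in the correct order,
// instantiating each module at most once.
function loadModule(name: string): unknown {
  if (cache.has(name)) return cache.get(name);
  const mod = registry.get(name);
  if (!mod) throw new Error(`Unknown module: ${name}`);
  const resolved = mod.deps.map(loadModule);
  const value = mod.factory(...resolved);
  cache.set(name, value);
  return value;
}
```

The point of the pattern is that adding a new dependency to one module is a one-line change in that module's declaration; nothing else in the application needs touching.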
It has been almost 8 months since I last posted! All my time has been focused on moving to the Caribbean; a lifelong dream of mine. We have been in the Virgin Islands now for 6 months, working remotely for a great software house, Digiterre.
Working remotely in the tropics has been awesome thus far but does come with some pitfalls:
Now that the move is behind us and we are settled in, I’ll try to chronicle some of the things I’m currently working on.
During the proof of concept of migrating our internal dependency resolution from SVN Externals to NuGet, I forgot an important stakeholder: developers. 🙂
When the build updates the packages; how does a developer ensure their environment is up to date?
Edit: There is now a NuGet PowerTools Package that will add a PackageConsole command to restore all packages.
I still like the TortoiseHook and still wish I could specify the packages location, but we’re getting closer to ‘there’.
The obvious answer is to check in the packages. After all, that’s how the SVN Externals solution worked, and it ensures the repeatability tenet of continuous builds. Sometimes however it’s not an option for a number of reasons; company policy, ruthless team leads or dictating architects, take your pick.
The way our SVN externals solution worked was to keep a separate repository entirely for “packages”. This repository could be blown away when it got bloated and replaced by a new version. Because it was used for multiple solutions/Company projects, it also enabled remote workers to keep their own copy of the repository locally (using DNS or host file entries).
This method can still be used with NuGet and means that an SVN update updates everything.
A pre-build step can be added for each project to run NuGet Install. I don’t really like this as it is an unnecessary build step. This task is an “Update” task not a build task and the speed of my build is sacrosanct to me.
After updating the local copy, the Developer can run a batch file manually to update packages.
for /r "%cd%" %%X in (*.csproj) do (nuget install "%%~dpXpackages.config" -OutputDirectory packages)
This batch file command finds every .csproj under the current directory and runs nuget install against the packages.config that sits alongside it (%%~dpX expands to the drive and path of the matched project file).
Obviously an automated solution is preferred to adding a secondary step to the developer’s workflow. If your VCS client supports hooks such as TortoiseSVN client hooks you can check-in the above batch file and get developers to hook it in to their VCS Update.
The true Holy Grail for people who don’t want to check in packages is the ability to tell NuGet where packages live. It could then be configured to use a shared network drive to find the packages. This would be the same repository used by the build, so by default it is always up to date. Unfortunately this is not yet a feature of NuGet. It is being discussed as a new feature, so if it’s something you are keen on, go vote for it at http://nuget.codeplex.com/workitem/215
This example will use TeamCity as the build platform as well as the Alpha Nuget Trigger and Update/Install Packages Build step found here.
It will also use my preferred Package Update Strategy defined here.
The build targets are specified in the build directory of the solution (here)
The build targets are linked into the solutions via the Named MsBuild Hook.
The automated build for each component has 4 main build steps
The last piece of the TeamCity puzzle is triggers; we want a build to be triggered if there is a new dependent package. At the moment it is a little clumsy as you must create a trigger per package you wish to monitor. I only monitor internal dependencies and then only direct internal dependencies.
For example, although the UI project uses Common, Common is also a dependency of Service A and Service B, so I only need to add triggers for those two packages: when Common changes, Service A and B are rebuilt, and when they publish their new versions the UI build is triggered in turn.