New NPackage binaries

May 16, 2010 at 06:57 PM | categories: NPackage

I've shaved two yaks this weekend:

  • Ported the NPackage install command to F#. There shouldn't be any visible change in the application: rather, F# lets me write more succinct code for handling HTTP and package dependencies. (Specifically, I was able to replace a lot of my boilerplate C# code with a computation workflow that takes care of deciding which files need to be downloaded again and which ones can be left alone. There's a minimal sketch of the idea after this list.)

  • Set up an instance of TeamCity to take the source code from GitHub and build binaries for me. This isn't a version 1.0 release yet, but you're welcome to download the binaries and help me test them.
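
Here's a minimal sketch of the idea behind that workflow, written as a plain function rather than the computation expression itself (the helper name is made up for illustration, it's not the actual NPackage source):

open System
open System.IO
open System.Net

// Fetch a URL unless a local copy already exists, in which case skip the
// HTTP round trip. The real workflow also has to consider staleness and
// dependencies; this only shows the basic decision.
let downloadIfMissing (url : Uri) (localPath : string) =
    if not (File.Exists localPath) then
        use client = new WebClient()
        client.DownloadFile(url, localPath)
    localPath

// Usage: the first call downloads; later calls reuse the cached file.
// downloadIfMissing (Uri "http://np.partario.com/nunit.np") "nunit.np"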

Download the most recent NPackage binaries

PS: last week I promised I'd implement package dependencies. I haven't done that yet.

PPS: here's another .NET packaging system to look out for: Sebastien Lambla's OpenWrap.


This week on NPackage

May 09, 2010 at 08:19 PM | categories: NPackage

Browse the NPackage source on GitHub

I implemented the local package database idea that I mentioned last weekend. The NPackage client now downloads a packages.js file that describes every version of every package, so you no longer have to specify full package file URLs with version numbers. I've also switched to JSON syntax for the package files, instead of using my hand-made Yaml-ish parser.
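
As a rough illustration of the kind of thing packages.js holds (the field names here are made up for the example, not the real schema), an entry might look like this:

{
  "packages": [
    { "name": "nunit",       "version": "2.5.5.10112", "url": "nunit-2.5.5.10112/nunit.np" },
    { "name": "sharpziplib", "version": "0.85.5",      "url": "sharpziplib-0.85.5/sharpziplib.np" }
  ]
}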

I want to do at least two more things before putting together an NPackage version 1.0 release:

  • Packages should be able to depend on other packages. These dependencies should consist of a package name and, optionally, the range of version numbers that are acceptable. NPackage will pick the latest version of each package that satisfies all the version number constraints. (There's a sketch of this rule after the list.)
  • Developers should be able to set up a local package repository that takes precedence over the web site.
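
As a sketch of that resolution rule (hypothetical types; NPackage doesn't contain this code yet):

open System

// A dependency names a package and, optionally, bounds on its version.
type Dependency =
    { Name : string
      MinVersion : Version option
      MaxVersion : Version option }

let satisfies (d : Dependency) (v : Version) =
    (match d.MinVersion with Some min -> v >= min | None -> true) &&
    (match d.MaxVersion with Some max -> v <= max | None -> true)

// Pick the latest available version that satisfies every constraint,
// or None if the constraints rule everything out.
let resolve (constraints : Dependency list) (available : Version list) =
    match available |> List.filter (fun v -> constraints |> List.forall (fun d -> satisfies d v)) with
    | [] -> None
    | candidates -> Some (List.max candidates)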

Hopefully I'll at least have dependencies working by next weekend.


First six NPackage packages

May 03, 2010 at 03:49 PM | categories: NPackage

Browse the NPackage source on GitHub

This weekend I've had a couple of productive sessions on NPackage and I'm pretty happy with how it's working out.

I've set up package files on Amazon S3 for the following libraries:

  • NUnit
  • SharpZipLib
  • Rhino Mocks
  • Cecil
  • NHibernate
  • log4net

These six packages test a few different scenarios:

  • NUnit and SharpZipLib are needed to build NPackage itself
  • Rhino Mocks is a good test because the download URL doesn't resemble the name of the file that gets downloaded; I had to write code to parse the HTTP Content-Disposition header to make sense of it (see the sketch after this list)
  • Cecil is an example of a library that's distributed with the rest of Mono, which is supplied as a large .tar.gz archive (the other libraries on the list above are all .zip files)
  • NHibernate is an example of a library that has its own dependencies, i.e. log4net
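
For the curious, here's a sketch of that header parsing using the BCL's ContentDisposition class (the real NPackage code is hand-rolled and looks different):

open System.Net
open System.Net.Mime

// Read the Content-Disposition header off the response and let the BCL
// parse out the real filename; None if the server didn't send the header.
let filenameFromResponse (response : WebResponse) =
    match response.Headers.["Content-Disposition"] with
    | null   -> None
    | header -> Some (ContentDisposition(header).FileName)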

Although NPackage is working nicely for its own development, I need to put in more work before the installation process is simple enough. Right now the command to install NPackage's own dependencies looks like this:

NPackage http://np.partario.com/nunit-2.5.5.10112/nunit.np \
         http://np.partario.com/sharpziplib-0.85.5/sharpziplib.np

I'd like to simplify it to this:

np install nunit sharpziplib

To do this I'll need to handle dependencies properly. I expect I'll need to drop the approach of putting package descriptions in their own standalone files: the client will need enough intelligence to put together a dependency graph and install the right versions of the right packages. That will be easier if the client can download a central package database and make its decisions based on the descriptions in there.

I envisage having local databases that can contain a few local packages and delegate the rest to the central list on the web. I'd also like the client to be self-contained enough to carry on working even if this central package server falls over: the client should cache the most recent version of the list and work from there.
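
A sketch of that fall-back behaviour (a hypothetical function, not code that exists yet):

open System
open System.IO
open System.Net

// Refresh the cached copy of the central list while the server is up; if
// it's unreachable, fall back to whatever we fetched last time. (Assumes
// at least one fetch has succeeded before, so the cache file exists.)
let fetchPackageList (url : Uri) (cachePath : string) =
    try
        use client = new WebClient()
        File.WriteAllText(cachePath, client.DownloadString url)
    with :? WebException -> ()
    File.ReadAllText cachePath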


NPackage news

April 27, 2010 at 01:00 PM | categories: NPackage

I've had some time over the last couple of evenings to do some coding, so I've made a start on my package manager. So far I'm calling it NPackage.

I defined a Yaml-format package file, containing the package name, description, version number; the name and email address of the maintainer; and an optional field for the URL of the download site. The package file then contains a list of libraries (i.e. .NET assemblies) contributed by the package, with download URLs specified relative to the download site.

The idea behind making the download URL optional is that these package files can point to an existing download site for established libraries like NUnit. If the download URL is omitted then the NPackage client will look for binaries on the NPackage server, in the same location as the package file itself. A lot of libraries are going to be distributed in .zip files, so I was planning on having the NPackage client download these .zip files and unpack them into the layout dictated by the package file.
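
As an illustration, a hand-written package file might look something like this; the field names are guesses for the sake of the example, not a real schema:

# hypothetical nunit.np -- real field names and layout may differ
name: nunit
description: Unit testing framework for .NET
version: 2.5.5.10112
maintainer: Jane Doe <jane@example.org>
download: http://example.org/downloads/nunit/
libraries:
  - bin/nunit.framework.dll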

I'm using NPackage to develop NPackage, which means I had to hack together the parsing and download code myself without any unit testing framework or parsing library. Now that I've done that, I've hand-crafted a package file for NUnit (nunit.np) that'll let me start writing unit tests.

There are a few areas I'm not sure about:

  • I've started writing it in C#, but I'm tempted to switch to F#, at least for the core functionality. I'm expecting to need some strong dependency graph logic (for dependencies between packages and between files within a package), which will be easier to code in F#. However, I'd like to be able to build the sources on Mono, and I'm not aware of a standard way of invoking the F# compiler from a Mono build process.
  • I'm only dealing with binary distribution for now (and xcopy deployment at that). Building .NET libraries from source in a standard way could be tricky.
  • I've picked a Yaml-based file format over XML because I expect these package files to be created by hand. As a result, it's going to be harder to generate or parse these files as part of an automated build system.

Here are the notes I made before I got started:

  1. Find the package
       • Package files like in cabal
       • A couple of standard locations: a Hackage-like web server, or an internal source control repository
       • Package identified by name and version
       • Deal with variants (like 2.0 vs 3.5 vs 4.0, or 32-bit vs 64-bit) by having separate packages released at the same time
  2. Install dependencies
       • Package files declare their own dependencies, cabal style
       • Recursively fetch and install dependencies
  3. Download the code
       • Package file specifies the location of the source code (default is the src/ directory relative to the package file)
       • Packages can release binaries only, e.g. NUnit, log4net etc. get downloaded from their normal locations
       • Support fetching from source control as well as HTTP? May make sense for internal deployments, but what about mixing and matching source control systems?
  4. Build the code
       • Skip this if the package just has binaries
  5. Reference the binaries
       • Update VS solution and project files

.NET package manager feedback

April 19, 2010 at 06:19 PM | categories: NPackage

I don't seem to be the first one having problems keeping track of his .NET binaries:

Finally, Terry Spitz had some fairly enthusiastic feedback:

hell yes! we've got various vbscripts to do this. shouldn't it be 'easy' in say MSI (if too heavyweight), or powershell. additional points if it can handle multi-level caching, i.e. cross-region or internet code is cached on a team share as well as locally.

Windows Installer occurred to me when I started thinking about this. However, I think such a tool should be limited to deploying assemblies to a particular project's source tree -- deploying them via MSIs suggests putting them into a central location on each machine, and I predict that individual projects will start interfering with each other this way, particularly on a build server. On the other hand, Windows Installer does have the concept of merge modules: mini MSIs for software components that get merged into the final application installer.

Terry's multi-level caching idea is nice. There should definitely be local team and Internet repositories. Additionally, geographically distributed teams probably want local caches to keep overhead to a minimum. And I noticed that my Amazon-based web server cleverly goes to a special Ubuntu package repository hosted on S3, which keeps things quick and hopefully reduces my bandwidth costs.

