NPackage news

April 27, 2010 at 01:00 PM | categories: NPackage | View Comments

I've had some time over the last couple of evenings to do some coding, so I've made a start on my package manager. So far I'm calling it NPackage.

I defined a Yaml-format package file, containing the package name, description and version number; the name and email address of the maintainer; and an optional field for the URL of the download site. The package file then contains a list of libraries (i.e. .NET assemblies) contributed by the package, with download URLs specified relative to the download site.

The idea behind making the download URL optional is that these package files can point to an existing download site for established libraries like NUnit. If the download URL is omitted then the NPackage client will look for binaries on the NPackage server, in the same location as the package file itself. A lot of libraries are going to be distributed in .zip files, so I was planning on having the NPackage client download these .zip files and unpack them into the layout dictated by the package file.
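To make that concrete, here's the sort of hand-written package file I have in mind. The field names and layout are illustrative guesses, not a finalised format:

```yaml
# Hypothetical NPackage package file -- field names are illustrative only
name: NUnit
version: 2.5.5
description: Unit testing framework for .NET
maintainer: Someone <someone@example.com>
download-site: http://downloads.example.com/nunit/
libraries:
  # Paths are relative to download-site; leave download-site out
  # and the client looks for binaries on the NPackage server instead
  - nunit.framework.dll
```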

I'm using NPackage to develop NPackage, which means I had to hack together the parsing and download code myself, without any unit testing framework or parsing library. Now that I've done that, I've hand-crafted a package file for NUnit, which will let me start writing unit tests.
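The hand-rolled parsing doesn't need to be much more than splitting key: value lines. Here's a toy version of the idea, sketched in Python for brevity (NPackage itself is C#, and this is not its actual code):

```python
def parse_package_file(text):
    """Parse a minimal 'key: value' package file into a dict.
    Blank lines and lines starting with '#' are ignored."""
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        fields[key.strip()] = value.strip()
    return fields
```

The real thing also has to cope with nested lists (the libraries section), which is where a proper Yaml library would start to pay off.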

There are a few areas I'm not sure about:

  • I've started writing it in C#, but I'm tempted to switch to F#, at least for the core functionality. I'm expecting to need some strong dependency graph logic (for dependencies between packages and between files within a package), which will be easier to code in F#. However, I'd like to be able to build the sources on Mono, and I'm not aware of a standard way of invoking the F# compiler from a Mono build process.
  • I'm only dealing with binary distribution for now (and xcopy deployment at that). Building .NET libraries from source in a standard way could be tricky.
  • I've picked a Yaml-based file format over XML because I expect these package files to be created by hand. As a result, it's going to be harder to generate or parse these files as part of an automated build system.
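Whichever language it ends up in, the dependency graph logic amounts to a topological sort with cycle detection. A minimal sketch, in Python for brevity, with invented names:

```python
def install_order(packages, dependencies):
    """Return an order in which packages can be installed so that every
    package appears after the packages it depends on.

    packages:     iterable of package names to install
    dependencies: dict mapping a package to the packages it depends on
    """
    order = []
    visiting = set()   # packages on the current path -- detects cycles
    done = set()

    def visit(pkg):
        if pkg in done:
            return
        if pkg in visiting:
            raise ValueError("dependency cycle involving " + pkg)
        visiting.add(pkg)
        for dep in dependencies.get(pkg, []):
            visit(dep)
        visiting.remove(pkg)
        done.add(pkg)
        order.append(pkg)

    for pkg in packages:
        visit(pkg)
    return order
```

The same shape works for files within a package as well as for the packages themselves, which is why I expect F#'s recursion and pattern matching to be a good fit.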

Here are the notes I made before I got started:

  1. Find the package
     • Package files like in Cabal
     • A couple of standard locations: a Hackage-like web server, or an internal source control repository
     • Packages identified by name and version
     • Deal with variants (e.g. .NET 2.0 vs 3.5 vs 4.0; 32-bit vs 64-bit) by having separate packages released at the same time
  2. Install dependencies
     • Package files declare their own dependencies, Cabal style
     • Recursively fetch and install dependencies
  3. Download the code
     • Package file specifies the location of the source code (default is the src/ directory relative to the package file)
     • Packages can release binaries only, e.g. NUnit, log4net etc. get downloaded from their normal locations
     • Support fetching from source control as well as HTTP? May make sense for internal deployments, but what about mixing and matching source control systems?
  4. Build the code
     • Skip this step if the package just has binaries
  5. Reference the binaries
     • Update VS solution and project files
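Put together, those notes describe a pipeline something like this sketch (hypothetical names; nothing here exists in NPackage yet, and the recording of actions is just to make the flow visible):

```python
def install(name, repo, actions=None):
    """Walk the steps above for one package and its dependencies.

    repo maps package name -> {"deps": [...], "binaries_only": bool}.
    Returns the list of actions taken, in order.
    """
    if actions is None:
        actions = []
    pkg = repo[name]
    actions.append("find " + name)          # 1. find the package
    for dep in pkg["deps"]:                 # 2. install dependencies,
        install(dep, repo, actions)         #    recursively
    actions.append("download " + name)      # 3. download the code
    if not pkg["binaries_only"]:
        actions.append("build " + name)     # 4. build (skipped for binaries)
    actions.append("reference " + name)     # 5. reference the binaries
    return actions
```

For example, installing an application that depends on a binary-only NUnit package would download and reference NUnit before building the application itself, with no build step for NUnit.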