What I omitted to mention was the purpose of the Horn DSL. I now run the risk of losing whatever interest the readers (if there are indeed any) may have by still not describing the DSL itself, but instead describing the purpose it is supposed to serve. I feel I must deviate further from the technical details in order to give proper context to the DSL's reason to exist.
As I mentioned in the introduction, horn is based on the Gentoo Portage package manager. At the heart of both Portage and indeed horn is the metaphor of the package tree. The package tree is conceptually a tree whose leaves are package build instructions. In reality it is a directory structure containing DSL instance files of package build instructions. Below is how the horn package tree looks at the time of writing.
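In case the image does not render, the layout is roughly as follows. The nhibernate entry is the real example discussed in this article; the loggers/log4net entry is purely an illustrative guess at what another category and package might look like:

```
package_tree/
├── loggers/
│   └── log4net/           (illustrative example only)
│       └── log4net.boo
└── orm/
    └── nhibernate/
        └── nhibernate.boo
```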
If you examine the image above, you can see that we have one top-level node that is rather unimaginatively named package_tree. Below the root package_tree node are child nodes that represent package categories such as ORMs, IoC containers, loggers etc. The rationale is that we can use these second-level nodes as search criteria, for example to list all the object-relational mappers that horn could install. Contained in each of the category nodes are the individual package nodes such as NHibernate. In the example above you can see that NHibernate is rightly positioned under a category parent node named orm. Contained within the individual NHibernate package node is one build file named nhibernate.boo, which contains the build metadata needed to install not only NHibernate but also any dependencies that are specified in the build file. The observant amongst you will notice that the package tree is persisted to and retrieved from a Subversion repository.
The package tree can be thought of as a database of packages and their build instructions. When a user enters a command into the command prompt like the one below:
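The exact command-line syntax may have changed since this was written, but based on the -install switch described next, the invocation looks something like this (a sketch, not guaranteed verbatim):

```
horn -install:Nhibernate
```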
the horn software system will search through the package tree to find a node with the same name as the value of the -install switch, which in this case is Nhibernate. Upon finding the correct node, horn will then parse the contents of the build file, in this case nhibernate.boo, into a semantic model or domain model that will be held in memory for the duration of the installation and, more importantly, contains all the information required to build and install the requested package. Martin Fowler calls the domain model in this context the semantic model. In this article he states that, from the domain model's point of view, the DSL is just a fancy alternative way of creating its objects and hooking them together.
The DSL describes a BuildMetaData object that contains a SourceControl object and a BuildEngine object.
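As a rough illustration of that shape (in Python rather than horn's actual C#; everything here other than the names BuildMetaData, SourceControl and BuildEngine is assumed for the sketch):

```python
# Illustrative sketch of horn's semantic model, not its real code.
class SourceControl:
    """Knows where to fetch a package's source code from."""
    def __init__(self, url):
        self.url = url

class BuildEngine:
    """Knows which tool and script to build the exported source with."""
    def __init__(self, build_tool, build_file):
        self.build_tool = build_tool
        self.build_file = build_file

class BuildMetaData:
    """Root of the model parsed from a package's .boo build file."""
    def __init__(self, source_control, build_engine):
        self.source_control = source_control
        self.build_engine = build_engine

# Parsing a build file would, in effect, construct something like this:
meta = BuildMetaData(
    SourceControl("https://example.org/svn/nhibernate/trunk"),  # hypothetical URL
    BuildEngine("nant", "default.build"),
)
```

This is exactly Fowler's point quoted above: the DSL is just a convenient way of creating these objects and wiring them together.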
Typically, the source code for a particular package will be retrieved from a source control management system. This has been modelled as an abstract SourceControl class, with concrete implementations supplying their own provider-specific details. We currently support Subversion only.
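In Python terms the idea looks something like the sketch below (a sketch only, not horn's actual code; the class and method names beyond SourceControl are assumptions):

```python
from abc import ABC, abstractmethod

class SourceControl(ABC):
    """Abstract retrieval of package source; subclasses add provider details."""
    def __init__(self, url):
        self.url = url

    @abstractmethod
    def export(self, destination):
        """Export the source tree to the client's file system."""

class SVNSourceControl(SourceControl):
    """Subversion-specific implementation (the only provider so far)."""
    def export(self, destination):
        # A real implementation would shell out to, or wrap, `svn export`.
        return f"svn export {self.url} {destination}"

svn = SVNSourceControl("https://example.org/svn/nhibernate/trunk")
print(svn.export("C:/horn/nhibernate"))
```

Adding support for another provider would then be a matter of deriving a new subclass rather than touching the rest of the workflow.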
Once we have exported the source code to the client's file system, the next logical step in the workflow is to build it. Horn has the concept of a BuildEngine object, which is charged with building the source code. The DSL has as part of its definition a directive that tells horn which BuildTool to build the source with. Currently we have three BuildTool implementations: NAnt, MSBuild and Rake. In most cases, the build of an open source system is complex and convoluted. Typically the source code will contain a build script which orchestrates this complicated build process. We need to utilise these existing build scripts, which in the .NET space are more often than not defined in NAnt. The MSBuild build tool will generally work with single components.
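One plausible way to picture the BuildTool directive is a simple dispatch from the name in the DSL to a tool-specific command line. Again this is a Python sketch under assumed names, not horn's real implementation, and the command lines are only indicative:

```python
class BuildTool:
    """One implementation per supported tool: NAnt, MSBuild, Rake."""
    def command(self, build_file):
        raise NotImplementedError

class NAnt(BuildTool):
    # Drives the package's existing NAnt build script.
    def command(self, build_file):
        return f"nant -buildfile:{build_file}"

class MSBuild(BuildTool):
    # Generally suits single components/solutions.
    def command(self, build_file):
        return f"msbuild {build_file}"

class Rake(BuildTool):
    def command(self, build_file):
        return f"rake -f {build_file}"

BUILD_TOOLS = {"nant": NAnt(), "msbuild": MSBuild(), "rake": Rake()}

class BuildEngine:
    """Charged with building the exported source via the chosen BuildTool."""
    def __init__(self, tool_name, build_file):
        self.tool = BUILD_TOOLS[tool_name]
        self.build_file = build_file

    def build(self):
        return self.tool.command(self.build_file)

engine = BuildEngine("nant", "default.build")
print(engine.build())
```

The point of the indirection is that the DSL only has to name a tool; the engine neither knows nor cares how that tool assembles its command line.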
Once we have constructed this semantic model, populated with the values from the DSL, horn can build and install the requested package.
In the next article I will finally get to the Boo DSL. I will explain how horn loads specific instances of the DSL into memory and parses their contents into the semantic model.
If any of this is of interest to you then please join the Horn user group for updates or check out the source here.