Ken Muse
ALM | DevOps Ranger & Azure MVP

DevOps and Documentation


Just because your software iterations are fast and lean doesn't mean that you shouldn't have good documentation. In fact, with shorter release cycles documentation becomes even more important. Consequently, it's important to have a process that makes rapid updates possible.

I was faced with just such a situation the other day. I had to assist with some documents that were authored in Markdown and stored in source control. The documents were manually published by converting them to HTML using a Python 3 package called Grip. This formatted the documents and gave them an appearance based on GitHub's look and feel. The results were packaged in a ZIP file for distribution. A repetitive process like this seems like a great excuse to automate!

There's a lot of ways to render Markdown, but for this example I'm going to show an approach using Grip and PowerShell to create a cross-platform build pipeline. I'm also going to assume that the build host (or local developer box) has Python 3.6+ and PowerShell or PowerShell Core installed.

Why PowerShell? For creating build scripts, it is an invaluable language. With the arrival of PowerShell Core, it's possible to use PowerShell on any platform, including macOS and Linux. Having a single cross-platform language means that my build will work in any development environment. While I could do the same thing with Python, I also tend to use PowerShell DSC frequently for infrastructure-as-code. Using the same language to define the machine and the build process makes maintenance easier.

The Build Script

Let's start by creating a build script in our source code repository, build.ps1. This will be the entry point to our build process. Because a build should be a repeatable process, taking you from a clean machine state to the final output, we'll begin by installing Grip:

pip install grip
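If you want to sanity-check Grip locally before scripting it, it can render a single file from the command line. This is just a quick smoke test; the file names here are placeholders, not part of the build:

```shell
# Render one Markdown file to HTML with GitHub styling
# (README.md / README.html are placeholder names)
grip README.md --export README.html
```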

Next, let's set up some variables for the outputs we will be creating. We'll take advantage of the fact that Azure DevOps (formerly VSTS) gives us environment variables for staging build binaries and output artifacts:

$buildFolder = $env:BUILD_BINARIESDIRECTORY
$zip = Join-Path $env:BUILD_STAGINGDIRECTORY 'lab.zip'

This sets up a staging folder for our built files, which will then be ZIP'd and published.
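These variables are only populated on the build agent. Since we also want developers to be able to run build.ps1 locally (a goal I'll come back to at the end), a small guard can provide sensible defaults. This is a sketch; the out and dist folder names are my own assumptions, not part of the original script:

```powershell
# Default the Azure DevOps staging variables for local runs
# ('out' and 'dist' are illustrative folder names)
if (-not $env:BUILD_BINARIESDIRECTORY) {
    $env:BUILD_BINARIESDIRECTORY = Join-Path $PWD 'out'
}
if (-not $env:BUILD_STAGINGDIRECTORY) {
    $env:BUILD_STAGINGDIRECTORY = Join-Path $PWD 'dist'
}
```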

Now, for the actual processing:

$markdownFiles = Get-ChildItem -Recurse -Include *.md

foreach ($file in $markdownFiles) {
  # Build the output folder path
  $relative = Resolve-Path -Relative $file
  $html = Join-Path $buildFolder $relative

  # Change the output file extension from .md to .html
  $html = [IO.Path]::ChangeExtension($html, "html")
  $html = [IO.Path]::GetFullPath($html)

  # Ensure the directory exists for the output content
  $targetFolder = [IO.Path]::GetDirectoryName($html)
  New-Item -ItemType Directory -Force -Path $targetFolder | Out-Null

  # Run Grip to convert the file
  grip "$file" --export "$html"
}

This finds all of the Markdown files in the source tree and creates a path to a corresponding HTML file in the $buildFolder. To prevent an error from Grip, the code uses New-Item to ensure that the target directory exists. The script then invokes Grip to read the Markdown file and convert it to HTML.
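To make the path handling concrete, here is how a hypothetical file flows through those steps (the drive letters and folder names are illustrative only):

```powershell
# Assuming the script runs from C:\src\docs and $buildFolder is C:\b:
#   $file     = C:\src\docs\labs\setup.md
#   $relative = .\labs\setup.md            # Resolve-Path -Relative
#   $html     = C:\b\.\labs\setup.md       # Join-Path
#   $html     = C:\b\.\labs\setup.html     # ChangeExtension
#   $html     = C:\b\labs\setup.html       # GetFullPath normalizes the path
```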

Finally, we need to compress the generated files in our build folder:

Compress-Archive -Path (Join-Path $buildFolder '*') -CompressionLevel Optimal -DestinationPath $zip

The ZIP file is created in the Artifact staging folder, where it is published as a pipeline artifact for later use. Remember that a build process should not generally also deploy the code. Leave that step to the Release pipeline. This allows you to add staging environments, automated tests, release gates, and rollbacks to your delivery process.
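If you want to spot-check the archive locally before wiring up the release pipeline, you can list its entries to confirm the HTML files landed where expected. This is a sketch for interactive use, assuming $zip still holds the path from the build script:

```powershell
# List the entries in the generated ZIP to verify its contents
Add-Type -AssemblyName System.IO.Compression.FileSystem
$archive = [IO.Compression.ZipFile]::OpenRead($zip)
$archive.Entries | ForEach-Object { $_.FullName }
$archive.Dispose()
```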

Automatic Builds

To ensure that every checkin automatically triggers a new build, I want to include one more change. Azure DevOps supports YAML-based build definitions. These definitions can be used to automatically create a continuous integration build pipeline. To take advantage of this, we create a file called .vsts-ci.yml. When this file exists in your repository, a build pipeline is automatically created (and updated), with CI enabled.

resources:
- repo: self

queue:
  name: Hosted Ubuntu 1604

steps:
# Ensure we are running the correct version of Python
- task: UsePythonVersion@0
  displayName: 'Use Python 3.x'
  inputs:
    versionSpec: '3.x'

# Execute the build script
- powershell: ./build.ps1 
  displayName: Build

# Upload the ZIP file as a pipeline artifact
- task: PublishPipelineArtifact@0
  displayName: 'Publish Pipeline Artifact'
  inputs:
    artifactName: Labs
    targetPath: '$(Build.StagingDirectory)'

This default CI build pipeline is configured to use the Hosted Ubuntu 1604 build agent (although Hosted VS 2017 also works). It has three steps. First, ensure that Python 3 is the active version on the agent. Next, execute our build script. Finally, publish the output of the build script as a Pipeline Artifact so that we can use it in a Release Pipeline.

And that's it. With two files, we've enabled continuous integration for our documentation and prepared ourselves to support continuous delivery (or continuous deployment) of it. There's certainly more we could add to the process, but that's the joy of iterative development -- we start small and grow as needed. In the meantime, we've ensured that everyone on the team can generate the latest version of the documentation locally and that the documentation is potentially publishable after every check-in.