In a previous post I described how you could build NuGet packages in Docker. One of the advantages of building NuGet packages in Docker is that you don't need any dependencies installed on the build server itself; you can install all the required dependencies in the Docker container instead. One of the disadvantages of this approach is that getting at the NuGet packages after they've been built is trickier - you have to run the image to get at the files.
Given that constraint, it's likely that if you're building your apps in Docker, you'll also want to push your NuGet packages to a feed (e.g. nuget.org or myget.org) from Docker.
In this post I show how to create a Dockerfile for building your NuGet packages which you can then run as a container to push them to a NuGet feed.
Previous posts in this series:
- Exploring the .NET Core Docker files: dotnet vs aspnetcore vs aspnetcore-build
- Building ASP.NET Core apps using Cake in Docker
- Optimising ASP.NET Core apps in Docker - avoiding manually copying csproj files
- Optimising ASP.NET Core apps in Docker - avoiding manually copying csproj files (Part 2)
- Creating a generalised Docker image for building ASP.NET Core apps using ONBUILD
- Creating NuGet packages in Docker using the .NET Core CLI
- Setting ASP.NET Core version numbers for a Docker ONBUILD builder image
Building your NuGet packages in Docker
I've had a couple of questions since my post on building NuGet packages in Docker, asking why you would want to do this. Given that Docker is for packaging and distributing apps, isn't it the wrong place for building NuGet packages?
While Docker images are a great way for distributing an app, one of their biggest selling points is the ability to isolate the dependencies of the app it contains from the host operating system which runs the container. For example, I can install a specific version of Node in the Docker container, without having to install Node on the build server.
That separation doesn't just apply when you're running your application, but also when building your application. To take an example from the .NET world - if I want to play with some pre-release version of the .NET SDK, I can install it into a Docker image and use that to build my app. If I wasn't using Docker, I would have to install it directly on the build server, which would affect everything it built, not just my test app. If there was a bug in the preview SDK it could potentially compromise the build-process for production apps too.
I could also use a global.json file to control the version of the SDK used to build each application.
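As a rough sketch of what that looks like (the SDK version number here is purely illustrative - use whichever version you actually have installed), a global.json at the root of a repository pins the SDK used for just that repository:

```bash
# Minimal sketch: pin the SDK version for this repository only.
# The version number is illustrative, not a recommendation.
cat > global.json <<'EOF'
{
  "sdk": {
    "version": "2.1.300-rc1-008673"
  }
}
EOF
```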
The same argument applies to building NuGet packages in Docker as well as apps. By doing so, you isolate the dependencies required to package your libraries from those installed directly on the server.
For example, consider this simple Dockerfile. It uses the .NET Core 2.1 release candidate SDK (as it uses the 2.1.300-rc1-sdk base image), but you don't need to have that installed on your machine to be able to build and produce the required NuGet packages.
```dockerfile
FROM microsoft/dotnet:2.1.300-rc1-sdk AS builder
ARG Version
WORKDIR /sln
COPY . .
RUN dotnet restore
RUN dotnet build /p:Version=$Version -c Release --no-restore
RUN dotnet pack /p:Version=$Version -c Release --no-restore --no-build -o /sln/artifacts
```
This Dockerfile doesn't have any optimisations, but it will restore and build a .NET solution in the root directory. It will then create NuGet packages and output them to the /sln/artifacts directory. You can set the version of the package by providing the Version as a build argument, for example:
```bash
docker build --build-arg Version=0.1.0 -t andrewlock/test-app .
```
If the solution builds successfully, you'll have a Docker image that contains the NuGet .nupkg files, but they're not much good sat there. Instead, you'll typically want to push them to a NuGet feed. There's a couple of ways you could do that, but in the following example I show how to configure your Dockerfile so that it pushes the files when you docker run the image.
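For comparison, a rough sketch of one alternative - copying the packages out of the image onto the host and pushing them from there - might look something like this, assuming the andrewlock/test-app image built above:

```bash
# Create (but don't start) a container from the image, copy the artifacts
# folder out to the host, then remove the container again.
docker create --name nuget-tmp andrewlock/test-app
docker cp nuget-tmp:/sln/artifacts ./artifacts
docker rm nuget-tmp
```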
Pushing NuGet packages when a container is run
Before I show the code, a quick reminder on terminology:
- An image is essentially a static file that is built from a Dockerfile. You can think of it as a mini hard-drive, containing all the files necessary to run an application. But nothing is actually running; it's just a file.
- A container is what you get if you run an image.
The following Dockerfile expands on the previous one, so that when you run the image, it pushes the .nupkgs built in the previous stage to the nuget.org feed.
```dockerfile
FROM microsoft/dotnet:2.1.300-rc1-sdk AS builder
ARG Version
WORKDIR /sln
COPY . .
RUN dotnet restore
RUN dotnet build /p:Version=$Version -c Release --no-restore
RUN dotnet pack /p:Version=$Version -c Release --no-restore --no-build -o /sln/artifacts
ENTRYPOINT ["dotnet", "nuget", "push", "/sln/artifacts/*.nupkg"]
CMD ["--source", "https://api.nuget.org/v3/index.json"]
```
This Dockerfile makes use of both the ENTRYPOINT and CMD commands. For an excellent description of the differences between them, and when to use one over the other, see this article. In summary, I've used ENTRYPOINT to define the executable command to run and its constant arguments, and CMD to specify the optional arguments. When you run the image built using this Dockerfile (andrewlock/test-app for example) it will combine ENTRYPOINT and CMD to give the final command to run.
For example, if you run:
```bash
docker run --rm --name push-packages andrewlock/test-app
```
then Docker will execute the following command in the container:
```bash
dotnet nuget push /sln/artifacts/*.nupkg --source https://api.nuget.org/v3/index.json
```
When pushing files to nuget.org, you will typically need to provide an API key using the --api-key argument, so running the container as it is will give a 401 Unauthorized response. To provide the extra arguments to the dotnet nuget push command, add them at the end of your docker run statement:

```bash
docker run --rm --name push-packages andrewlock/test-app --source https://api.nuget.org/v3/index.json --api-key MY-SECRET-KEY
```
When you pass additional arguments to the docker run command, they replace any arguments embedded in the image with CMD, and are appended to the ENTRYPOINT, to give the final command:

```bash
dotnet nuget push /sln/artifacts/*.nupkg --source https://api.nuget.org/v3/index.json --api-key MY-SECRET-KEY
```
Note that I had to duplicate the --source argument in order to add the additional --api-key argument. When you provide additional arguments to the docker run command, they completely override the CMD arguments, so if you need them, you must repeat them when you call docker run.
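For example, pushing to a different feed means supplying the source explicitly as well, since the default from CMD is discarded entirely (the MyGet feed URL below is purely illustrative):

```bash
# Both arguments must be repeated - none of the CMD defaults survive
docker run --rm andrewlock/test-app \
  --source "https://www.myget.org/F/my-feed/api/v3/index.json" --api-key MY-SECRET-KEY
```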
Why push NuGet packages on run instead of on build?
The example I've shown here, using docker run to push NuGet packages to a NuGet feed, is only one way to achieve this goal. Another valid approach would be to call dotnet nuget push inside the Dockerfile itself, as part of the build process. For example, you could use the following Dockerfile:
```dockerfile
FROM microsoft/dotnet:2.1.300-rc1-sdk AS builder
ARG Version
ARG NUGET_KEY
ARG NUGET_URL=https://api.nuget.org/v3/index.json
WORKDIR /sln
COPY . .
RUN dotnet restore
RUN dotnet build /p:Version=$Version -c Release --no-restore
RUN dotnet pack /p:Version=$Version -c Release --no-restore --no-build -o /sln/artifacts
RUN dotnet nuget push /sln/artifacts/*.nupkg --source $NUGET_URL --api-key $NUGET_KEY
```
In this example, building the image itself would push the artifacts to your NuGet feed:
```bash
docker build --build-arg Version=0.1.0 --build-arg NUGET_KEY=MY-SECRET-KEY .
```
So why choose one approach over the other? It's a matter of preference really.
Oftentimes I have a solution that consists of both libraries to push to NuGet and applications to package and deploy as Docker images. In those cases, my build scripts tend to look like the following (a rough sketch in shell follows the list):
- Restore, build and test the whole solution in a shared Dockerfile
- Publish each of the apps to their own images
- Pack the libraries in an image
- Test the app images
- Push the app Docker images to the Docker repository
- Push the NuGet packages to the NuGet feed by running the Docker image
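Sketched out as a shell script, those steps might look something like the following. All of the image names, tags, and Dockerfile paths here are illustrative placeholders, not taken from a real project:

```bash
# 1. Restore, build and test the whole solution in a shared builder image
docker build -t my-sln-builder -f Dockerfile.build .

# 2. Publish each of the apps to their own images
docker build -t my-app -f src/MyApp/Dockerfile .

# 3. Pack the libraries in an image (which also defines the push ENTRYPOINT)
docker build --build-arg Version=0.1.0 -t my-packages -f Dockerfile.pack .

# 4. Test the app images (smoke tests, health checks, etc.)
docker run --rm my-app

# 5. Push the app Docker images to the Docker registry
docker push my-app

# 6. Push the NuGet packages to the NuGet feed by running the image
docker run --rm my-packages --source "https://api.nuget.org/v3/index.json" --api-key "$NUGET_KEY"
```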
Moving the dotnet nuget push out of docker build and into docker run feels conceptually closer to the two-step approach taken for the app images. We don't build and push Docker images all in one step; there's a build phase and a push phase. The setup with NuGet adopts a similar approach. If I wanted to run some checks on the NuGet packages produced (testing that they have been built with the required attributes, for example), then I could easily do that before they're pushed to NuGet.
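For example, one quick way to inspect the packages baked into the image built earlier, without pushing anything, is to override the ENTRYPOINT at run time:

```bash
# List the .nupkg files inside the image by overriding the
# ENTRYPOINT defined in the Dockerfile; nothing gets pushed.
docker run --rm --entrypoint ls andrewlock/test-app /sln/artifacts
```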
Whichever approach you take, there are definite benefits to building your NuGet packages in Docker.
Summary
In this post I showed how you can build NuGet packages in Docker, and then push them to your NuGet feed when you run the container. By using ENTRYPOINT and CMD you can provide default arguments to make it easier to run the container. You don't have to use this two-stage approach - you could push your NuGet packages as part of the docker build call. I prefer to separate the two processes to more closely mirror the process of building and publishing app Docker images.