128 points by shcheklein
3 days ago
Disclosure: I work at Microsoft / Azure.
I am a dev container addict. I work on tons of different projects across languages, and it's so nice for the dev environment to just work (mostly; when it doesn't, it's because of M1 issues). I also love GitHub Codespaces (which uses dev containers); I've been using it for all my workshops lately and attendees tend to prefer Codespaces to local dev.
If you're looking for example dev containers, I've got a bunch of repos linked from my README that have support:
The trickiest ones have involved databases, but I was happy to be able to get PostgreSQL running inside the container, along with an admin for it!
We did a YouTube stream showing that here:
Yesterday I got docker-in-docker working as I'm running a Docker workshop today and want Codespaces to be an option. So cool. That's at
Dev containers support Docker Compose, is running the database inside the dev container preferable to using that? Or is that what you meant?
Yep, sorry for being unclear, I used docker-compose:
There are a few config options there that took some tweaking.
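For anyone curious, a minimal sketch of that kind of setup (service names, image tag, and credentials here are illustrative, not the actual config from the demo):

```yaml
# docker-compose.yml, referenced from devcontainer.json via "dockerComposeFile"
services:
  app:
    build: .
    volumes:
      - ..:/workspace:cached   # mount the repo into the container
    command: sleep infinity    # keep the container alive so the editor can attach
    depends_on:
      - db
  db:
    image: postgres:15
    restart: unless-stopped
    environment:
      POSTGRES_PASSWORD: postgres
    volumes:
      - pg-data:/var/lib/postgresql/data   # persist data across rebuilds

volumes:
  pg-data:
```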
Thank you, your demo is very cool and comprehensive
You only need a PostgreSQL server installed on a different machine. Maintaining a container will cost more in the long term.
The way I see it, it's a really nice way to make reusable modules out of your Dockerfiles (e.g. we need to install a tool, and we need to run apt-get plus some config, etc.). And a simpler JSON syntax to apply those modules on top of a base image.
I love the experience so far, we've done a few features (modules) - e.g. this one to install `nvtop` to see GPU utilization 
The whole CUDA + nvtop + (some other tools) setup for a project to be run on a remote machine (e.g. via VS Code) becomes like this 
And that's enough to run ML training on GH Codespaces with GPU support. Super cool experience.
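As a rough sketch, a devcontainer.json for that kind of setup could look like the following (the feature path and base image are placeholders, not the actual published feature):

```jsonc
{
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  "features": {
    // hypothetical path — point this at your published nvtop feature
    "ghcr.io/your-org/features/nvtop:1": {}
  },
  // ask Codespaces for a GPU-backed machine type
  "hostRequirements": { "gpu": true },
  // expose the host GPU when running locally with Docker
  "runArgs": ["--gpus", "all"]
}
```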
I’m a huge fan of devcontainers. If you use VS Code or Codespaces, it makes it really easy to do a remote environment exactly the same for everyone. I use them in workshops and it’s great.
It’s an open standard and we’re starting to see more members of the community adopt it or integrate support into their stuff too (including some larger Nix players like devenv and Jetpack.io) and I really hope others adopt it too. For me, for however many years I’ve been using these (it’s at least 2 but it could be longer, time is a circle), it has definitely improved my QoL.
Disclosure: I work at GitHub which uses devcontainers in Codespaces and I used to work at Microsoft which developed this and used it with VS Code and the amazing VS Code remote extension.
Not seeing the value prop over something open like Nix, is there one?
Hi! Welcome to this <whatever> workshop. First thing we're going to need to do is install a new os!
It's always weird to me how the response to 'it's great for showing new users stuff' is 'WHAT!? Why aren't you having them install this new OS/whatever instead?'. I feel like Hacker News commenters completely lack perspective sometimes.
It’s possible to install Nix (the package manager) without NixOS (the OS). It runs on macOS and Linux.
I mean, if I’m giving a workshop, being able to point everyone to the same GitHub repo and then have everyone load the same environment with either Codespaces or the VS Code remote extension is a huge time saver. I’ll be honest, I wouldn’t touch Nix in a workshop. If someone knows how to use it already and wants to configure everything themselves, fine. But getting people to get Docker and VS Code installed can be challenging enough. If I’ve got four hours (or two), I really want to spend as little time as possible on dev environment setup.
Codespaces makes this really ideal, but just using the VS Code extension (after getting everyone to get Docker installed), saves a lot of time and ensures everyone is using the same version of everything.
Not sure why it's downvoted but I'd be also interested to understand how it compares to a real local machine in terms of ... many things
- installing dependencies
I read an unrelated comment that installing PostgreSQL "was among the trickiest issue" ... Cool, but no matter the machine, getting a PostgreSQL DB up and running is usually rather straightforward. So, not sure what to exactly think of this but would be happy to hear from someone with hands-on experience.
What’s not open about this specification and the open source utilities built to support it?
The tools are not open source. You cannot use dev containers with e.g. vscodium
Sure you can. You technically can’t use the VS Code remote extension in Codium because of licensing reasons (tho people have packaged it for that marketplace anyway - and I’m not suggesting anyone do that, I am saying it has been done), but there is nothing stopping anyone from building a remote extension that uses the dev container spec.
I’m not denying that it’s a lot of extra work, but the spec itself is absolutely open for anyone else to integrate. I’d posit that the reason we haven’t seen alternatives to the remote extension (to my knowledge) is how much work is involved, when most people (maybe not you, but most people) are comfortable just turning off the telemetry flag in VS Code rather than using Codium.
But if you’d rather use something like Nix, using something like devbox or devenv will achieve something similar to dev containers, and that’s fine too.
Oh I see, yeah that’s definitely not clear from the site. I don’t use vscode, it had me fooled... classic Microsoft
Devcontainers aren’t fundamentally better than Nix, but people can get running with them in an hour and be productive, whereas Nix is a week-long effort for newbies.
This is very true. There needs to be a Nix guy on the team to bootstrap things and be the evangelist. My experience is that within one week everyone gets very familiar with the workflow, and within 3 months everyone on my team became an experienced Nix user.
Great... How do you do that inside a 4-hour workshop?
I really dislike asdf. It is supposed to be the Hail Mary, but it doesn't support .nvmrc with content like lts/hydrogen. Also, they don't have a pyenv venv plugin. So I'm still stuck with rbenv, nvm, jenv, pyenv and pyenv venv...
Have you tried nix's or guix's shells? They can be used as a venv for anything.
Have you tried this? https://github.com/asdf-vm/asdf-nodejs#nvmrc-and-node-versio...
Also, I can see that lts, lts-hydrogen, etc. are available to install when running `asdf list all nodejs`.
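For reference, asdf's support for tool-specific version files like .nvmrc is opt-in, via a one-line config file (this is a sketch of the relevant setting, see the asdf docs for the rest):

```ini
# ~/.asdfrc — opt in to reading tool-specific version files like .nvmrc
legacy_version_file = yes
```

With that set, the asdf-nodejs plugin should pick up the version named in a project's .nvmrc.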
I don't get why I need this rather than just a plain container/dockerfile. That is how I work today and that is my "devcontainer", no separate markup needed.
If you need to...
- Mount volumes in your dev container
- Run commands after the container is built
- Pass environment variables from the host to containers
- Specify which containers will be run in a docker-compose definition and which one will host the VS Code server
- Use arbitrary flags in docker run / docker build / docker-compose
You will end up with a Makefile or bash script that roughly looks like a .devcontainer.json file.
Devcontainer files are also neat from a UX point of view because new devs can just click "open remote" and wait. I've been using this feature since 2019 and it has done wonders for project onboarding.
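To illustrate, most of the bullets above map directly to top-level keys in devcontainer.json (the paths and commands here are made up for the example):

```jsonc
{
  "build": { "dockerfile": "Dockerfile" },
  // mount volumes in your dev container
  "mounts": [
    "source=${localEnv:HOME}/.ssh,target=/home/vscode/.ssh,type=bind"
  ],
  // pass environment variables from the host through to the container
  "containerEnv": { "API_TOKEN": "${localEnv:API_TOKEN}" },
  // run commands after the container is built
  "postCreateCommand": "npm install"
}
```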
> - Mount volumes in your dev container
> - Pass environment variables from the host to containers
> - Specify which containers will be run in a docker-compose definition and which one will host the VS Code server
Docker compose yaml files are pretty cool to define these.
Whereas these are just asking for trouble/a non-reproducible Docker environment, and could instead be part of the container definitions:
> - Run commands after the container is built
> - Use arbitrary flags in docker run / docker build / docker-compose
The UX that you mention on the other hand is pretty cool:
Same, but I use podman, and a simple k8s compatible manifest that I start with "podman play kube manifest.yml"
It supports named volumes and configmaps, and that is enough for local dev environments, and it's editor agnostic.
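A sketch of such a manifest, assuming a Node image and a named volume (all names here are illustrative):

```yaml
# manifest.yml — start it with: podman play kube manifest.yml
apiVersion: v1
kind: Pod
metadata:
  name: dev-env
spec:
  containers:
    - name: app
      image: docker.io/library/node:20
      command: ["sleep", "infinity"]   # keep the pod alive for exec/attach
      volumeMounts:
        - name: src
          mountPath: /workspace
  volumes:
    - name: src
      persistentVolumeClaim:
        claimName: dev-src   # podman maps this claim to a named volume
```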
Not used Codespaces, but gitpod.io.
What you end up cobbling together to create a fully fledged environment is what the YAML does for you, in a consistent manner.
Also, getting off of Docker, and specifically Docker Desktop and its magic, is beneficial.
containers.dev is a spec created and, so far, only implemented by Microsoft (in Visual Studio [Code] and GitHub Codespaces). IDE vendors decided to go to war instead of collaborating, it seems. XKCD's "standards" comic seems to apply here.
As I've blogged about here, there are other specs such as Devfile.io or GitPod's proprietary gitpod.yml file.
To be fair, GitPod claims they will support devcontainer.json: https://github.com/gitpod-io/gitpod/issues/7721
Do they? It reads more like a request from the community, but without any statement from Gitpod itself.
Yeah, I really hope JetBrains would adopt this spec as well. So far they seem to be going in another direction.
Sometimes you need to name something and give it a spec for others to implement it.
What I find interesting is:
json... with comments
If only JSON had supported comments right from the start, our lives would be a bit better.
But JSON wasn't intended for configuration or for much meddling around. It's just incredibly useful to have a file format parsable with vanilla JS, so it became the de facto configuration syntax, without comments!
It did, but people went straight to abusing them, so they were removed.
I think the most popular implementation of JSON with comments is now VS Code's `settings.json`, so maybe that will get developers to slowly adopt such an extension.
The bigger deal about JSONC isn't comments, it's trailing commas.
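Both are easy to see in a VS Code-style settings file (the settings themselves are just examples):

```jsonc
{
  // comments are legal in JSONC
  "editor.formatOnSave": true,
  "editor.tabSize": 2, // ...and so is this trailing comma
}
```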
Aside from the comments about the containers spec, I'd like to share that codespaces-like environments are huge for me.
Being able to open an open source project on the go just by launching a codespace is no small feat.
I have also recently been programming in quite heavy environments which my computer could not handle properly (especially when screen sharing), so launching the environment on a remote cloud instance with 8 cores and 16 GB of RAM was useful.
All my personal projects are developed in codespaces too, to minimize the time I spend setting up or syncing them and to lower the friction as much as possible. Going from desktop to tablet to my laptop, all I need to do is launch the codespace.
I hope other vendors will contribute to the spec; there's nothing to gain from every cloud implementing its own spec, as they will eventually lose to Microsoft rather than leveraging the shared one.
Your choice of dev environment should be driven by product requirements, period. Dev environments should dogfood the production supply chain as much as possible, meaning use the same base container/OS/toolchain/etc that will be used in production.
Otherwise you are going to miss bugs/vulnerabilities and spend a lot of time keeping dev/prod provisioning in sync.
Deploying to containers/k8s? Develop in a container.
Deploying to VM/bare metal? Develop in a VM (Vagrant).
Developing a portable library? Nix may be a good fit.
Is this basically Nix derivations, but with a more familiar syntax? (Dockerfiles)
I don't believe they use Nix at all.
Dev Containers are basically Docker Containers (either pulled from a repo or built from a Dockerfile) with some additional metadata attached so they can work as a Development Environment. Metadata includes stuff to configure debuggers, IDEs, etc.
I think the comment meant it does the same thing as Nix derivations. Not that it uses Nix.
While you can use nix derivations to build development containers, you still would need a Makefile / bash script specifying how to run them.
The devcontainer spec combines 3 things:
- How to build containers for a development environment (using Docker or docker-compose)
- How to run containers for a development environment (volumes to mount, environment variables, commands to execute after containers are running, where the project will be mounted...)
- Which VS Code configuration to install and use (including extensions) within the development environment
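Those three concerns show up as distinct keys in a compose-based devcontainer.json; a sketch (the service name, extension ID, and commands are illustrative):

```jsonc
{
  // how to build/run: delegate to docker-compose
  "dockerComposeFile": "docker-compose.yml",
  "service": "app",                 // the compose service VS Code attaches to
  "workspaceFolder": "/workspace",
  // what to run once the environment is up
  "postCreateCommand": "make setup",
  // which editor configuration to apply inside the environment
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python"],
      "settings": { "editor.formatOnSave": true }
    }
  }
}
```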
Yeah, I don't really like all this. I mean, if people are used to Dockerfiles or docker-compose files, then just stick with that and change the underlying tech to Nix, firejail, Firecracker or whatever. It is quite annoying to have a megacorp popping up with these different container techs that pretty much do the same thing.
Nix derivations: Microsoft E.E.E. edition.
I love the standard; in fact, we have implemented it in the new beta version of Codeanywhere.
What is the difference from docker-compose here?
It's supported by your IDE, in this case VS Code.
So it's a docker-compose setup that spins up in your IDE.
Then you have all your tools ready to go, e.g. Postgres.
Most importantly you can share it with your team.
There's an additional devcontainer.json file that specifies editor configuration, like extensions, startup commands, extension settings, etc.
Tbh I feel that all this everything-version-pinned-and-containerized approach is ultimately inducing a lot of fragility in the ecosystem, and makes it more and more difficult to work with a wide range of tools. It reminds me of Smalltalk images in one way, and in another way of one enterprise gig I did where using some ancient IBM IDE and runtime was required because the project setup was too crusty to trust it to work any other way.
I long for the opportunity where the software I develop would work with plain vanilla Debian stable and would require nothing else.
I agree. Dev containers solve the wrong problem ("works on my machine" dev environments). It's always fun and easy to sell someone on the idea that you need to develop a layer that will solve all problems, but the underlying problem is the one we actually need to solve.
I wouldn't recommend. If a dev environment needs such containers you probably work in crap
No, I simply do not want my workstation to be littered with dev dependencies that are used on a single project, or have to install nvm, rvm, or asdf to manage conflicting language runtimes. Nor do I want to use Nix; not my jam, sorry.
All I need is Emacs, and podman.
Or you juggle multiple projects that may not be on the same versions of everything. Or you sometimes need to git-bisect back to an older version of your project, which may have older dependencies, including e.g. Ruby or Python or Node versions, or uses a different and incompatible version of CMake or whatever, and it's much nicer to have an isolated environment for that, scripted such that it's used automatically, than having to fiddle with the programs actually installed on your dev machine.
And it's nice to have all that work across an entire team, no "works on my machine" weirdness.
Maintaining containers will cost more than keeping the package.json or the Ruby equivalent updated.
Also you need some hacky things to get VirtualBox running on macOS, so it becomes totally a "works on my machine".
> Maintaining containers will cost more than keeping the package.json or the Ruby equivalent updated.
Fine. What about compilers? System libraries? PostgreSQL version & config? Does package.json manage your node version? Oh and we switched from npm to some hipster new thing a while back, and older package.json files of ours will fail to install properly with the new thing. Have fun git-bisecting.
> Also you need some hacky things to get VirtualBox running on macOS, so it becomes totally a "works on my machine".
Been doing this for like a decade and it's not much more than "brew install docker" and a line or two in your shell rc file. It used to be slightly worse because you had to manage docker-machine separately, but it's never been hard or complicated.
How do you install brew? You copy paste a line from a website?
Yeah, of course.
I'm more of a Nix guy for solving this particular issue, but even assuming you were right (I don't believe so), sometimes you do have to work on crap. And it's better if you have a more reproducible dev environment when doing it.
Working on crap and working in crap are different things.
Working in crap: doing someone else's work
Working on crap: doing some work on legacy
And what would you recommend?
Also, if you fail to see the value in this, then I'd ask what you would use:
- When you need to work on multiple products with different environments.
- When you want to temporarily spin up development environments on powerful machines for parts of your work and rest of the time develop locally?
- When you want to pass an identical environment to someone else and have it just work.
You interact with APIs. You don't have to install a database or another daemon to your local computer.
This has never happened.
All dev stacks have something like package.json.
> You interact with APIs. You don't have to install a database or another daemon to your local computer.
How about the people writing the APIs? Or the ones writing the servers, or the OS, or the database, etc.?
> This has never happened
In your case, use a bit of imagination.
> All dev stacks have something like package.json.
That is a very broad statement and, in my experience, no.
So when you introduce or change a dependency to your project that affects what your dev environment must look like, how do you tie that change in with the main changes and make it available and consistent across all of your devs?
Email them? "Hey, everybody should upgrade to OpenCV version 4.1.5"? Garbage.
Devs interact with APIs, not OS-level libraries. If you do scientific research, you probably use a Python notebook or something like that. If you need an OS-level solution integrated into your dev environment, you put it on Lambda or some other Amazon instance, or a dedicated server.
Who makes the code that powers the APIs?! Who writes the OS-level libraries? What about people who write software for automotive, IoT, robotics, machinery? Is it your understanding that Python notebooks never need to call out to OS-level libraries? (What is conda for then??)
Web developers always just assume the entire world is other web developers, and confidently dispense advice without any caveat re. its limited purview. The world is bigger than python notebooks and shitty ruby apps.
Well, we are on the web.