I am experimenting with dev containers at the moment. I am finding them very useful for setting up environments and they work as advertised. Where I am struggling is the poor support in the JetBrains suite and having to accommodate different editors. We need more tooling and editor plurality for dev containers to take off.
Have you considered the possibility that JetBrains users are being left behind while devcontainers have already taken off?
Has it taken off though? It reeks of a semi open-source Microsoft product whose only goal is to lock you into VS Code.
Have you tried Gitpod or DevPod for JetBrains?
I wonder if this could be adapted to run vscode, with a devcontainer, but in a sandbox? I want to see and interact with the vscode GUI, and I want vscode to be able to interact with the container, but I do not want vscode to have any access to anything else that I don't explicitly grant. This includes my home directory and even my network.
One approach might be to try to put vscode into the devcontainer or into another container. But that needs a hole poked for the GUI, or for someone to do the plumbing to get the GUI to run in a sandboxed browser context, and I don't think MS makes this easy.
(Note that vscode has no security model. If you connect vscode's normal frontend to a malicious backend, you are pwned, and this isn't even considered a bug.)
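One rough sketch of the browser route, assuming the gitpod/openvscode-server image and that its documented flags still apply: publish the port only on localhost and mount nothing but the project. This does not restrict the container's outbound network, so treat it as a starting point, not a real sandbox.

    # Sketch: browser-based VS Code with only the current project mounted.
    docker run -it --init \
      -p 127.0.0.1:3000:3000 \
      -v "$(pwd):/home/workspace:cached" \
      gitpod/openvscode-server
    # Open http://localhost:3000 in a (sandboxed) browser profile.
    # Cutting off the container's own network would still need a separate
    # internal Docker network or host firewall rules.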
If you've already standardized on Docker Compose for development, is there an advantage to migrating to devcontainers?
DevContainers allow for setting up your IDE with extensions, rules, and other configuration. They also support Docker Compose, so migration shouldn't be that bad.
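A rough sketch of what that migration can look like (file paths and the "app" service name are assumptions): the devcontainer.json just points at the compose file you already have.

    // .devcontainer/devcontainer.json (sketch; paths and service name assumed)
    {
      "name": "my-app",
      "dockerComposeFile": "../docker-compose.yml",
      "service": "app",
      "workspaceFolder": "/workspace",
      "shutdownAction": "stopCompose"
    }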
> DevContainers allow for setting up your IDE with extensions, rules, and other configuration.
Are people sharing their editor configs with this? I thought it was a way of getting a development environment set up, but those shouldn't have editor extensions and configuration.
Yes, certain extensions and settings (formatting) go along with setting up the dev environment:
https://containers.dev/supporting
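For example, a minimal devcontainer.json sketch (the image tag, extension ID, and setting are chosen purely for illustration) that ships an extension and a formatting setting along with the environment:

    // Sketch only; image, extension and setting are illustrative.
    {
      "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
      "customizations": {
        "vscode": {
          "extensions": ["esbenp.prettier-vscode"],
          "settings": { "editor.formatOnSave": true }
        }
      }
    }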
Author's blog post talking about this project:
https://blog.lohr.dev/launching-dev-containers
Can someone describe what it's like to use devcontainers?
Is it a last resort, like if your arch is so different from your project environment that you have to go so far as to develop inside a container?
I hate even running local docker containers the second I want to do anything inside them. And that's the vibe I get when I read about devcontainers despite wanting to earnestly try it out at one point (I frankly couldn't figure it out back when I tried).
It’s like the entirety of the project is completely self-contained. Your host system can be running anything and you don’t need to worry about installing tools on your system to get going. You also don’t need to worry about different dev environments for different projects.
Is this a Go project and you don’t have the right Go tools installed locally? Not a problem, as it’s in the container.
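For instance, a sketch of a per-project devcontainer.json for that Go case (the image tag and post-create command are assumptions): the toolchain is pinned to the project instead of the host.

    // Sketch: the Go toolchain lives in the container, not on the host.
    {
      "image": "mcr.microsoft.com/devcontainers/go:1.22",
      "postCreateCommand": "go mod download"
    }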
Do you have multiple Node.js projects and you don’t want modules or tools to overlap? Each project has its own container.
Do you work on Mac, but want to deploy on Linux? The container makes this work too. (This one is my primary use case.) With a devcontainer, I can work on my Mac without having to install dozens of Homebrew packages. Better yet, I can work with others and it doesn’t matter what computer they use, as the dev environment is standardized.
If you ever used Vagrant to work in a dev VM, it’s like that… but much faster to get started.
I’ve been using dev containers every day for the last two years, so I’m a big fan.
Working across different code bases that use variations of tools and versions, I no longer have to think about them clashing, or using Homebrew.
Very useful for rapidly onboarding new team members, they are of course free to roll their own dev environment, but we have a known good system to use if that doesn’t work or breaks.
I wouldn’t use the CLI from this post; instead, I launch them using a Raycast quicklink (e.g. vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/ministryofjustice/modernisation-platform-...)
The team I work on publishes some internal guidance for using them: https://github.com/ministryofjustice/.devcontainer
I'm currently experimenting with toolbox [0] on fedora, and it's great. There's not much isolation going on, but if I wanted that, I'd go with a virtual machine (which I do on macOS). It builds on base images, so you can create one that has all your tools installed. I prefer installing core production dependencies via the package manager instead of using tools like NVM.
The main benefit is actual containerization of the project dependencies (apart from tools that love to download stuff in $HOME). So I can wipe them and recreate them in seconds. And you can use different distros for different workflows.
[0]: https://containertoolbx.org/
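For reference, a sketch of the toolbox flow (the distro and release values are just examples):

    # Create a toolbox from a Fedora base image and enter it.
    toolbox create --distro fedora --release 40 dev
    toolbox enter dev
    # Wipe and recreate in seconds:
    toolbox rm --force dev && toolbox create --distro fedora --release 40 dev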
I use them as a last resort. They're nice in theory but you'll lose your extensions (which you can configure but if you check it in you're imposing on everyone else...), bash history, and most debugging tools you have installed locally. Unfortunately, those are really "features" of being in a container and if you work around them it's kind of defeating the point of a container.
They're helpful for cross-architecture debugging and serving as a source of truth that the build actually works on a dev machine but definitely the ergonomics leave a bit to be desired.
If you don't get to pick your own extensions, does that mean you also don't get to pick your own keyboard shortcuts, font and colour scheme?! That would be a wider interpretation of "cattle, not pets" than I am used to...
Those are configured on the client so luckily you get to keep them. Extensions in VS Code are a bit different because some (many?) run on the host and as a result devcontainers wipe them out.
I do agree though, I think devcontainers are in an awkward spot where they're cattle but my IDE/environment is typically a pet...
Nix with direnv is better for something like this but it obviously won't work on Windows.
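A minimal sketch of that setup, assuming nix-direnv and flakes are enabled (the package choices are illustrative):

    # .envrc
    use flake

    # flake.nix
    {
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
      outputs = { self, nixpkgs }:
        let pkgs = nixpkgs.legacyPackages.x86_64-linux;
        in {
          devShells.x86_64-linux.default = pkgs.mkShell {
            packages = [ pkgs.go pkgs.nodejs ];
          };
        };
    }

After a one-time `direnv allow`, the declared tools land on your PATH whenever you cd into the project and disappear when you leave it.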
They also allow you to package up and share your entire development environment and all the tooling you use with it.
That means that someone coming into a project has nothing to set up on their local machine - runtimes, compilers, etc.
You can use them for this reason without doing remote dev. But yes, for remote dev you wouldn't even need to set up Docker locally, and you're able to develop on a different architecture.