aktau 2 days ago

I have a bunch, but one that I rarely see mentioned but use all the time is memo(1) (https://github.com/aktau/dotfiles/blob/master/bin/memo).

It memoizes the command passed to it.

  $ memo curl https://some-expensive.com/api/call | jq . | awk '...'
Manually clearing it (for example, if I know the underlying data has changed):

  $ memo -c curl https://some-expensive.com/api/call
In-pipeline memoization (includes the input in the hash of the lookup):

  $ cat input.txt | memo -s expensive-processor | awk '...'
This allows me to rapidly iterate on shell pipelines. The main goal is to minimize my development latency, but it also has positive effects on dependencies (avoiding redundant RPC calls). The classic way of doing this is storing something in temporary files:

  $ curl https://some-expensive.com/api/call > tmpfile
  $ cat tmpfile | jq . | awk '...'
But I find this awkward, and it makes it harder than necessary to experiment with the expensive command itself.

  $ memo curl https://some-expensive.com/api/call | jq . | awk '...'
  $ memo curl --data "param1=value1" https://some-expensive.com/api/call | jq . | awk '...'
Each of those will run curl only once, no matter how many times I rerun the pipeline.

NOTE: Currently environment variables are not taken into account when hashing.

  • aabdelhafez 2 days ago

    You're gonna absolutely love up (https://github.com/akavel/up).

    If you pipe curl's output to it, you'll get a live playground where you can finesse the rest of your pipeline.

      $ curl https://some-expensive.com/api/call | up
    • aktau 2 days ago

      up(1) looks really cool, I think I'll add it to my toolbox.

      It looks like up(1) and memo(1) have similar use cases (or goals). I'll give it a try to see if I can appreciate its ergonomics. I suspect memo(1) will remain my mainstay:

        1. After executing a pipeline, I like to press the up arrow (heh) and edit. Surprisingly often I need to edit something that's *not* the last part, but somewhere in the middle. I find this cumbersome in default line editing mode, so I will often drop into my editor (^X^E) to edit the command.
        2. Up seems to create a shell script file after completion. Avoiding the creation of extra files was one of my goals for memo(1). I'm sure some smart zsh/bash integration could be made that just returns the completed command to the prompt instead.
  • aktau 2 days ago

    Another thing I built into memo(1) which I forgot to mention: automatic compression. memo(1) will use available (de)compressors (in order of preference: zstd, lz4, xz, gzip) to (de)compress stored contents. It's surprising how much disk space and IOPS can be saved this way due to redundancy.
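
    A minimal sketch of that compressor selection (illustrative; not the actual memo(1) source):

      # pick the first (de)compressor available on $PATH, in order of preference
      for c in zstd lz4 xz gzip; do
        command -v "$c" >/dev/null 2>&1 && { compress="$c"; break; }
      done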

    I currently only have two memoized commands:

      $ for f in /tmp/memo/aktau/* ; do 
          ls -lh "$f" =(zstd -d < $f) 
        done
      -rw-r----- 1 aktau aktau  33K /tmp/memo/aktau/0742a9d8a34c37c0b5659f7a876833b6dad9ec689f8f5c6065d05f8a27d993c7bbcbfdc3a7337c3dba17886d6f6002e95a434e4629.zst
      -rw------- 1 aktau aktau 335K /tmp/zshSQRwR9
    
      -rw-r----- 1 aktau aktau  827 /tmp/memo/aktau/8373b3af893222f928447acd410779182882087c6f4e7a19605f5308174f523f8b3feecbc14e1295447f45b49d3f06da5da7e8d7a6.zst
      -rw------- 1 aktau aktau 7.4K /tmp/zshlpMMdo
    
    That's roughly a 10x compression ratio.
  • 1vuio0pswjnm7 2 days ago

    .

       #!/usr/bin/env bash
       #
       # memo(1), memoizes the output of your command-line, so you can do:
       #
       #  $ memo <some long running command> | ...
       #
       # Instead of
       #
       #  $ <some long running command> > tmpfile
       #  $ cat tmpfile | ...
       #  $ rm tmpfile
       
       to save output, sed can be used in the pipeline instead of tee
       for example,
       
       x=$(mktemp -u);
       test -p $x||mkfifo $x;
       zstd -19 < $x > tmpfile.zst &
       <long running command>|sed w$x|<rest of pipeline>;
       
       # You can even use it in the middle of a pipe if you know that the input is not
       # extremely long. Just supply the -s switch:
       #
       #  $ cat sitelist | memo -s parallel curl | grep "server:"
       
       grep can be replaced with sed and search results sent to stderr
       
       < sitelist curl ...|sed '/server:/w/dev/stderr'|zstd -19 >tmpfile.zst;
       
       or send search results to stderr and to some other file
       sed can save output to multiple files at a time
       
       < sitelist curl ...|sed -e '/server:/w/dev/stderr' -e "/server:/wresults.txt"|zstd -19 >tmpfile.zst;
    • aktau a day ago

      Those commands are (1) harder to grok and (2) do not actually use the memoized result (tmpfile.zst) to speed up a subsequent run.

      Can you give a more complete example of how you would use this to speed up developing a pipeline?

    • 1vuio0pswjnm7 a day ago

      If provided a sample showing (a) the input format of the text and (b) the desired output format, then perhaps can provide an example of how to do the text processing

  • dotancohen 2 days ago

    This is terrific! I curl to files and then pipe them, all the time. This will be a great help.

    I wonder if we have gotten to the point where we can feed an LLM our bash history and it could suggest improvements to our workflow.

    • edanm a day ago

      Interesting idea. And pretty easy to try.

      If you do it, I'd love to hear your results.

      In general, I wonder if we're at the point where an LLM watching you interact with your computer for twenty minutes can improve your workflow, suggest tools, etc. I imagine so, because when I think to ask how to do something, I often get an answer that is very useful, so I've automated/fixed far more things than in the past.

  • news_hacker 2 days ago

    I've been using bkt (https://github.com/dimo414/bkt) for subprocess caching. It has some nice features, like providing a TTL for cache expiration. In-pipeline memoization looks nice; I'm not sure it supports that.
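
    (From memory, an invocation looks something like the following; treat the exact --ttl flag syntax as unverified and check the README:)

      $ bkt --ttl=10m -- curl https://some-expensive.com/api/call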

    • aktau a day ago

      I was not aware of bkt. Thanks for the link. It seems very similar to memo, and has more features:

        - Explicit TTL
        - Ability to include working directory et al. as context for the cache key.
      
      There do appear to be downsides (from my PoV) as well:

        - It's a rust program, so it needs to be compiled (memo is a bash/zsh script and runs as-is).
        - There's no mention of transparent compression, either in the README or through simple source code search. I did find https://github.com/dimo414/bkt/issues/62 which mentions swappable backends. The fact that it uses some type of database instead of just the filesystem is not a positive for me, I prefer the state to be easy to introspect with common tools. I will often memo commands that output gigabytes of data, which is usually highly compressible. Transparent compression fixes that up. One could argue this could be avoided with a filesystem-level feature, like ZFS transparent compression. But I don't know how to detect that in a cross-FS fashion.
      
      I opened https://github.com/dimo414/bkt/discussions/63 so the author of bkt can perhaps also participate.
  • gavinray 2 days ago

    15 years of Linux and I learn something new all the time...

    • mlrtime 2 days ago

      It's why I keep coming back. Now how do I remember to use this and not go back to using tmpfiles :)

      • divan 2 days ago

        I've used the Warp terminal for a couple of years, and recently they embedded AI into it. At first I was irritated and disabled it, but the AI agent is built in as an optional mode (Cmd-I to toggle). And I found myself using it more and more often for commands that I have no capacity or will to remember or to dig through the man pages for (from "figure out my IP address on the wifi interface" to "make ffmpeg do this or that"). It's fast, it can iterate on its own errors, and now I can't resist using it regularly. It removes the need for "tools to memorize commands" entirely.

  • Perepiska 2 days ago

    Caching some API call because it is expensive, then using the cached data many months later because of a bash suggestion :(

    • aktau 2 days ago

      The default storage location for memo(1) output is /tmp/memo/${USER}. Most distributions have some automatic periodic cleanup and/or wipe it on restart.

      Separately from that:

        - The invocation contains *memo* right in there, so you (the user) know that it might memoize.
        - One uses memo(1) for commands that are generally slow. Rerunning your command that has a slow part and having it return in a millisecond while you weren't expecting it should make the spider-sense tingle.
      
      In practice, this has never been a problem for me, and I've used this hacked together command for years.
  • naikrovek 2 days ago

    I see no way to name the memos in your examples, so how do you refer to them later?

    Also, this seems a lot like an automated way to write shell scripts that you can pipe to and from. So why not use a shell script that won't surprise anyone, instead of this, which might?

    • aktau 2 days ago

      The name of the memo is the command that comes after it:

        $ memo my-complex-command --some-flag my-positional-arg-1
      
      In this invocation, a hash (sha512) is taken of "my-complex-command --some-flag my-positional-arg-1", which is then stored in /tmp/memo/${USER}/{sha512hash}.zst (if you've got zstd installed, other compression extensions otherwise).
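
      A rough sketch of that lookup, assuming zstd is installed (the real memo(1) also handles -s/-c, stdin, and compressor fallback):

        # memo sketch: run "$@" once, replay the compressed output thereafter
        key="$(printf '%s ' "$@" | sha512sum | awk '{print $1}')"
        cache="/tmp/memo/${USER}/${key}.zst"
        mkdir -p "${cache%/*}"
        if [[ ! -r "$cache" ]]; then
          "$@" | zstd -q > "$cache"   # miss: run the command, store compressed
        fi
        zstd -dc "$cache"             # replay the stored output
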
  • sgarland 2 days ago

    Dude, this is _awesome_. Thank you for sharing!

    • aktau 2 days ago

      Glad you like it. Hope you get as much use out of it as I do.

  • cryptonector 2 days ago

    > `curl ... | jq . | awk '...'`

    Uhm, jq _is_ as powerful as awk (more, even). You can use jq directly and skip awk.

    (I know, old habits die hard, and learning functional programming languages is not easy.)

    • aktau 2 days ago

      Yes, I know. I should've picked a different example, but it's also realistic in a way. When I'm doing one-offs, I will sometimes take shortcuts like this. I know awk fairly well, and I know enough jq to know that invoking jq . pretty-prints the inbound JSON on multiple lines. While I know I could write a proper jq expression, the combo gets me there quicker. Similarly, I'll sometimes do:

        $ awk '...' | grep | ...
      
      Because I'm too lazy to go back to the start of the awk invocation and add a match condition there. If I'm going to save it to a script, I'll clean it up. (And for jq, I gotta be honest that my starting point these days would probably be to show my contraption to an LLM and build from its answer; I don't use jq nearly enough to know its language by heart.)
latexr 3 days ago

> trash a.txt b.png moves `a.txt` and `b.png` to the trash. Supports macOS and Linux.

The way you’re doing it trashes files sequentially, meaning you hear the trashing sound once per file and ⌘Z in the Finder will only restore the last one. You can improve that (I did it for years), but consider just using the `trash` command which ships with macOS. It doesn’t use the Finder, so no sound and no ⌘Z, but it’s fast, official, and still allows “Put Back”.

> jsonformat takes JSON at stdin and pretty-prints it to stdout.

Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.

> uuid prints a v4 UUID. I use this about once a month.

Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?

https://www.man7.org/linux/man-pages/man1/uuidgen.1.html

  • tester457 3 days ago

    I am not the author, but my bet is that he didn't know of its existence.

    The best part about sharing your config or knowledge is that someone will always light up your blind spots.

    • t_mahmood 2 days ago

      > The best part about sharing your config or knowledge is that someone will always light up your blind spots.

      Yes! I will take this as a chance to thank every people who shared their knowledge on the Internet. You guys are so freaking awesome! You are always appreciated.

      A big chunk of my whole life's learning came from all the forums that I used to scour through, hour after hour! Because these awesome people were always sharing their knowledge, and someone was always adding more. That's what made the Internet, the Internet. And all of it is now almost on the brink of being lost, because of greedy corporations.

      This habit also helped me with doom-scrolling. I sometimes do doomscroll, but I can catch it quickly and snap out of it. Because, my whole life, I have always jumped into rabbit holes and actually read those big blog posts, the ones with the `A-ha` moments: "Oohh, I can use that", "Ahh, that's clever!"

      When browsing doesn't give me that, my brain actually triggers: "What are you doing?"

      Later, I got lazy, which I am still paying for. But I am going to get out of it.

      Never stop jumping into those rabbit holes!! Well, obviously, not every rabbit hole is a good one, but you'll probably come out wiser.

    • _kb 3 days ago

      Or more abstractly: post anything to the internet and people will always detail how you’re wrong. Sometimes that can be useful.

      • byryan 3 days ago

        That seems to be especially true on HN. Other forums have some of that as well, but on HN it seems nearly every single comment section is like 75% (random number) pointing out faults in the posted article.

        • gaudystead 3 days ago

          Although I normally loathe pedantic assholes, I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).

          I've found that pedantic conversations here seem to actually have a greater potential for me to learn something from them than other forums/social platforms. On other platforms, I see someone providing a pedantic response and I'll just keep moving on, but on HN, I get curious to not only see who wins the nerd fight, but also that I might learn at least one thing along the way. I like that it's had an effect on how I engage with comment sections.

          • password4321 3 days ago

            And the worst of it gets flagged and even dead-ed so most skip it after a bit, as I assumed would happen recently

            https://news.ycombinator.com/item?id=45649771

            • imcritic 2 days ago

              Yes, flagging mechanism on HN is evil.

              • MyOutfitIsVague 2 days ago

                I have showdead on, and almost every single flagged post I've seen definitely deserves it. Every time it wasn't "deserved", the person simply took an overly aggressive tone for no real reason.

                In short, I've never seen somebody flagged simply for having the wrong opinion. Even controversial opinions tend to stay unflagged, unless they're incredibly dangerous or unhinged.

                • lolc 2 days ago

                  I've seen a few dead posts where there was an innocent misunderstanding or wrong assumption. In those cases it would have been beneficial to keep the post visible and post a response, so that readers with similarly mistaken assumptions could have seen a correction. Small minority of dead posts though. They can be vouched for actually but of course this is unlikely to happen.

                  I agree that most dead posts would be a distraction and good to have been kept out.

              • kergonath 2 days ago

                It’s a blunt tool, but quite useful for posts. I read most dead posts I come across and I don’t think I ever saw one that was not obviously in violation of several guidelines.

                OTOH I don’t like flagging stories because good ones get buried regularly. But then HN is not a great place for peaceful, nuanced discussion and these threads often descend into mindless flame wars, which would bury the stories even without flagging.

                So, meh. I think flagging is a moderately good thing overall but it really lacks in subtlety.

                • freedomben 2 days ago

                  Agreed, flagging for comments seems to function pretty well for the most part, and the vouch option provided a recourse for those that shouldn't have been killed.

                  On stories however, I think the flag system is pretty broken. I've seen so many stories that get flagged because people find them annoying (especially AI-related things) or people assume it will turn into a flame war, but it ends up burying important tech news. Even if the flags are reversed, the damage is usually done because the story fell off the front page (or further) and gets very little traction after that.

                • imcritic a day ago

                  Just imagine this comment of yours would get flagged. Was it something very valuable and now the discussion is lacking something important? Surely not, but how would you feel? So what that you have some not so mild and not so "pleasant" opinion on something - why flag the comment? Just let people downvote it!

          • nosianu 2 days ago

            > I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).

            Can you back this up with data? ;-)

            I see citations and links to sources about as little as on reddit around here.

            The difference I see is in the top 1% comments, which exist in the first place, and are better on average (but that depends on what other forums or subreddits you compare it to, /r/AskHistorians is pretty good for serious history answers for example), but not in the rest of the comments. Also, less distractions, more staying on topic, the joke replies are punished more often and are less frequent.

        • bdangubic 3 days ago

          I find that endearing for two reasons:

          - either critique is solid and I learn something

          - or commenter is clueless which makes it entertaining

          there is very seldom a “middle”

          • byryan 3 days ago

            Yea I don't particularly mind it, just an interesting thing about HN compared to many other forums.

        • Mawr 2 days ago

          That's a sampling bias. You're not seeing the opinions of every single person who has viewed an article, just the opinions of those who have bothered to comment.

          People who agree with an article will most likely just upvote. Hardly anyone ever bothers to comment to offer praise, so most comments that you end up seeing are criticisms.

      • mlrtime 2 days ago

        True true, one of my favorite things is watching the shorts on home improvement or 'hacks', and sure enough there are always multiple comments saying why it won't work and why it's not the right way. Just as entertaining as the video.

    • gigatexal 3 days ago

      Exactly! I didn’t know macOS ships jq or the uuidgen tool. Very cool

    • dylan604 3 days ago

      also possible (even though I've seen the author's response to not knowing) is that the scripts were written before the native tools were included. at that point, the muscle memory is just there. I know I have a few scripts like that myself

  • rbonvall 3 days ago

    Python also pretty-prints out of the box:

        $ echo '{ "hello": "world" }' | python3 -m json.tool
        {
            "hello": "world"
        }
  • idoubtit 3 days ago

    Other examples where native features are better than these self-made scripts...

    > vim [...] I select a region and then run :'<,'>!markdownquote

    Just select the first column with ctrl-v, then "I> ", then escape. That's 4 keys after the selection, instead of 20.

    > u+ 2025 returns ñ, LATIN SMALL LETTER N WITH TILDE

    `unicode` is widely available, has a good default search, and many options. BTW, I wonder why "2025" matched "ñ".

         unicode ñ
        U+00F1 LATIN SMALL LETTER N WITH TILDE
        UTF-8: c3 b1 UTF-16BE: 00f1 Decimal: &#241; Octal: \0361
    
    > catbin foo is basically cat "$(which foo)"

    Since the author is using zsh, `cat =foo` is shorter and more powerful. It's also much less error-prone with long commands, since zsh can smartly complete after =.

    I use it often, e.g. `file =firefox` or `vim =myscript.sh`.

    • oneeyedpigeon 2 days ago

      > `unicode` is widely available

      It's not installed by default on macOS or Ubuntu, for me.

      • pmontra 2 days ago

        You are right but

          $ unicode
          Command 'unicode' not found, but can be installed with:
          sudo apt install unicode
        
        and it did. So it really was available. That's Debian 11.
  • shortrounddev2 3 days ago

    > Why prioritise node instead of jq?

    In powershell I just do

        > echo '{"foo": "bar"}' | ConvertFrom-Json | ConvertTo-Json
        {
            "foo": "bar"
        }
    
    But as a function
  • mmmm2 3 days ago

    `trash` is good to know, thanks! I'd been doing: "tell app \"Finder\" to move {%s} to trash" where %s is a comma separated list of "the POSIX file <path-to-file>".

    • gcanyon 3 days ago

      Oooh, I just suggested in another comment that using applescript would be possible. I didn't think it would be this easy though.

  • lkbm 2 days ago

    > Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.

    That was my thought. I use jq to pretty print json.

    What I have found useful is j2p and p2j to convert to/from python dict format to json format (and pretty print the output). I also have j2p_clip and p2j_clip, which read from and then write to the system clipboard so I don't have to manually pipe in and out.
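
    For anyone who wants to replicate them, a minimal sketch of p2j as a shell function wrapping Python's ast module (j2p is the reverse, via repr):

      p2j() {
        python3 -c 'import ast, json, sys; print(json.dumps(ast.literal_eval(sys.stdin.read()), indent=2))'
      }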

    > Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?

    I also made a uuid, which just runs uuidgen, but then trims the \n. (And maybe copied to clipboard? It was at my old job, and I don't seem to have saved it to my personal computer.)

  • frumplestlatz 3 days ago

    For trash on macOS, I recommend https://github.com/ali-rantakari/trash

    Does all the right things and works great.

    There’s a similar tool that works well on Linux/BSDs that I’ve used for years, but I don’t have my FreeBSD desktop handy to check.

  • YouAreWRONGtoo 3 days ago

    Instead of trash, reimplementing rm (to only really delete after some time or depending on resource usage, or to shred if you are paranoid and the goal is to really delete something) or using zfs makes much more sense.

    • orhmeh09 3 days ago

      I can't imagine a scenario where I would want to reimplement rm just for this.

      • YouAreWRONGtoo 2 days ago

        [flagged]

        • latexr 2 days ago

          https://news.ycombinator.com/newsguidelines.html

          > Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

          Instead of being rude to a fellow human making an inoffensive remark, you could’ve spent your words being kind and describing the scenario you claim exists. For all you know, maybe they did ask ChatGPT and were unconvinced by the answer.

          As a side note, I don’t even understand how your swipe would make sense. If anything, needing ChatGPT is what demonstrates a lack of imagination (having the latter you don’t need the former).

          • YouAreWRONGtoo 2 days ago

            What makes you think I need ChatGPT, since I just wondered whether ChatGPT was as stupid, since obviously I do know why that would be useful?

    • thiht 2 days ago

      How is this better?

  • gcanyon 3 days ago

    I believe it would be possible to execute an applescript to tell the finder to delete the files in one go. It would theoretically be possible to construct/run the applescript directly in a shell script. It would be easier (but still not trivial) to write an applescript file to take the file list as an argument to then delete when calling from the shell.

    • latexr 2 days ago

      It’s not theoretical, and it is trivial. Like I said, I did exactly that for years. Specifically, I had a function in my `.zshrc` to expand all inputs to their full paths, verify and exclude invalid arguments, trash the rest in one swoop, then show me an error with the invalid arguments, if any.
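
      For reference, the single Finder call can look roughly like this (a sketch; it expects absolute paths, and my real function did more validation):

        osascript -e 'on run argv
          set fs to {}
          repeat with f in argv
            set end of fs to (POSIX file (contents of f) as alias)
          end repeat
          tell application "Finder" to move fs to trash
        end run' "$PWD/a.txt" "$PWD/b.png"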

  • derintegrative 2 days ago

    On Linux you also have

        % cat /proc/sys/kernel/random/uuid
        464a4e91-5ce4-47b6-bb09-8a60fde572fb
  • true_religion 3 days ago

    The trash command first appeared in macOS 14.0, released in 2023.

  • energy123 2 days ago

    I do `mv a.txt /tmp` instead of `rm`.

  • sedatk 3 days ago

    and it's `New-Guid` in PowerShell.

soiltype 3 days ago

This is exactly the kind of stuff I'm most interested in finding on HN. How do other developers work, and how can I get better at my work from it?

What's always interesting to me is how many of these I'll see and initially think, "I don't really need that." Because I'm well aware of the effect (which I'm sure has a name - I suppose it's similar to induced demand) of "make $uncommon_task much cheaper" -> "$uncommon_task becomes the basis of an entirely new workflow/skill". So I'm going to try out most of them and see what sticks!

Also: really love the style of the post. It's very clear but also includes super valuable information about how often the author actually uses each script, to get a sense ahead of time for which ones are more likely to trigger the effect described above.

A final aside about my own workflows, which betrays my origins... for some of these operations, and for others I occasionally need, I'll just open a browser dev tools window and use JS to do it, for example lowercasing a string :)

  • fragmede 3 days ago
    • yellowapple a day ago

      One of the very few things I like about macOS is that it rebinds the CUA key from Ctrl to Cmd, freeing up Ctrl for these Emacs-style text navigation keybinds. It's odd to me that seemingly zero Linux distros/DEs do this by default.

    • mlrtime 2 days ago

      Wow thanks, I'm tattooing this on my right hand now. :)

      • kevinrineer 2 days ago

        No one tell him about `set editing-mode vi` or `info readline`

  • klaussilveira 2 days ago

    This is one of the things I miss the most about hacker conferences. The sharing of tools, scripts, tips and tricks. It was, and still is, just as fun as trading cards.

    • bawis a day ago

      What's a hacker conference?

      • yellowapple a day ago

        A conference in which hackers congregate.

        My favorite recent one was Handmade Seattle, but that one's kaput as of this year, and it seems everything else along similar lines is overseas and/or invite-only.

  • chipsrafferty 3 days ago

    I'd love to see a cost benefit analysis of the author's approach vs yours, which includes the time it took the author to create the scripts, remember/learn to use them/reference them when forgetting syntax, plus time spent migrating whenever changing systems.

    • taejavu 3 days ago

      Not all time is created equal. I’ll happily invest more time than I’ll ever get back in refining a script or vim config or whatever, so that later, when I’m busy and don’t have time to muck around, I can stay in the flow and not be annoyed by distractions.

    • karczex 3 days ago

      Sometimes it's a matter of sanity rather than time management. I once created a systemd service which goes to a company web page and downloads some files which I sometimes need. The script was pretty hacky, and writing it took me a lot of time - probably more than clicking through the page manually would have in the long run. But clicking was so annoying that I feel it was totally worth it.

    • latexr 2 days ago

      > reference them when forgetting syntax

      If you have to do that, the script needs improvement. Always add a `--help` which explains what it does and what arguments it takes.

      • tom_ 2 days ago

        If you write these sorts of things in Python, argparse is worth investigating: https://docs.python.org/3/library/argparse.html - it's pretty easy to use, makes it easy to separate the command line handling from the rest of the code, and, importantly, will generate a --help page for you. And if you want something it can't do, you can still always write the code yourself!

        • latexr 2 days ago

          I don’t like Python in general, but even so I’ll say that argparse is indeed very nice. When I was writing ruby, I always felt that OptionParser¹ wasn’t as good. Swift has Argument Parser², officially from Apple, which is quite featureful. For shell, I have a couple of bespoke patterns I have been reusing in every script for many years.

          ¹ https://github.com/ruby/optparse

          ² https://github.com/apple/swift-argument-parser

          • tom_ a day ago

            Regarding other ports, I've also been pretty happy with https://github.com/nodeca/argparse, which works nicely from Typescript. Looks like it hasn't been updated for a while, but it's not like there's a great deal wrong with it.

            https://github.com/p-ranav/argparse is a single-file argparse for Modern C++, which means it's typically straightforward, if baffling in places and a bit annoying to step through in the debugger.

            The nice thing about the argparse ports is that provided they take their job seriously, your programs will all end up with a vaguely consistent command line UX in terms of longopt syntax, and, importantly, a --help page.

          • yellowapple a day ago

            Many years ago I wrote a library I called “Ruby on Bales”¹ specifically due to my frustrations with the state of command-line argument parsing for Ruby scripts. I haven't touched it in a long while; maybe I should revisit it.

            ----

            ¹ https://github.com/YellowApple/bales

    • te_cima 3 days ago

      why is this interesting to you? the whole point of doing all of this is to be more efficient in the long run. of course there is an initial setup cost and learning curve, after which you will hopefully feel quite efficient with your development environment. you are making it sound like it is not worth the effort because you have to potentially spend time learning "it"? i do not believe that it takes long to "learn" it, but of course it can differ a lot from person to person. your remarks seem like non-issues to me.

      • akersten 3 days ago

        It's interesting because there's a significant chance one wastes more time tinkering around with custom scripts than saving in the long run. See https://xkcd.com/1205/

        For example. The "saves 5 seconds task that I do once a month" from the blog post. Hopefully the author did not spend more than 5 minutes writing said script and maintaining it, or they're losing time in the long run.

        • duskdozer 3 days ago

          Maybe, but

          1. even if it costs more time, it could also save more annoyance which could be a benefit

          2. by publishing the scripts, anyone else who comes across them can use them and save time without the initial cost. similarly, making and sharing these can encourage others to share their own scripts, some of which the author could save time with

          • chipsrafferty 2 days ago

            In my experience, it's not "maybe" but "almost certainly", which is why I stopped doing this. Every time I got a new system I would have to set everything up again; it's not cross-platform, doesn't work when using someone else's computer, suddenly breaks for some reason or another, or you forget it exists...

            The annoyance of all these factors far outweighs the benefits, in my experience. It's just that the scripts feel good at first and the annoyance doesn't come until later, and eventually you abandon them.

            • yellowapple a day ago

              > Every time I get a new system I would have to set everything up again

              Sounds like something you could automate with a script :)

        • skydhash 3 days ago

          Sometimes, you explore to have ideas. By fixing a few problems like these, you learn about technologies that can help you in another situation.

        • janalsncm 3 days ago

          Not all time is created equally though, so I disagree with that xkcd.

          If something is time sensitive it is worth spending a disproportionate amount of time to speed things up at some later time. For example if you’re debugging something live, in a live presentation, working on something with a tight deadline etc.

          Also you don’t necessarily know how often you’ll do something anyways.

          • normie3000 2 days ago

            > I disagree with that xkcd

            The xkcd doesn't seem to be pushing an agenda, just providing a lookup table. Time spent vs time saved is factual.

            • janalsncm 2 days ago

              The title of the comic is “ Is It Worth the Time?”.

              To take a concrete example, if I spend 30 minutes on a task every six months, over 5 years that’s 5 hours of “work” hours. So the implication is that it’s not worth automating if it takes more than 5 hours to automate.

              But if those are 5 hours of application downtime, it’s pretty clearly worth it even if I have to spend way more than 5 hours to reduce downtime.

              • yellowapple a day ago

                Time saved also ain't the only factor here. I'll often automate something not because it actually saves a lot of time, but rather because it codifies an error-prone process and having it scripted out reduces the risk of human error by enough of a degree to be worth spending more time on it than I'd save.

        • kelvinjps10 3 days ago

          I find that now with AI, you can make scripts very quickly, reducing the time to write them by a lot. There is still some time needed for prompting and testing but still.

        • latexr 2 days ago

          One thing which is often ignored in these discussions is the experience you gain. The time you “wasted” on your previous scripts by taking longer to write them compounds in time saved in the future because you can now write more complex tasks faster.

          • dbalatero 2 days ago

            The problem is, to really internalize that benefit, one would need to have an open mind to trying things out, and many folks seem to resist that. Oh well, more brain connections for me I suppose.

        • r4tionalistmind 2 days ago

              >YOU DON'T UNDERSTAND. I NEED TO BE CONSTANTLY OPTIMIZING MY UPTIME. THE SCIENCE DEMANDS IT. TIMEMAXXING. I CAN'T FREELY EXPLORE OR BRAINSTORM, IT'S NOT XKCD 1205 COMPLIANT. I MUST EVALUATE EVERY PROPOSED ACTIVITY AGAINST THE TIME-OPTIMIZATION-PIVOT-TABLE.
      • chipsrafferty 2 days ago

        Because some of them OP said they use a few times a year. This means they'll probably use it like 150 times in their life. If it saves a minute each time, but it takes 5 hours to create it and 5 hours to maintain it over the years, then it's not really a win.

  • freedomben 2 days ago

    I love this kind of stuff too, but too many times over the years I've found myself in environments without some of these higher level and more niche tools (including my own dot files), or the tool ends up abandoned, and I struggle to remember how to use the basics/builtins. I've gotten a lot more conservative about adopting them because of that.

    • bityard 2 days ago

      Pretty much my take as well. I imagine spending a few hours a month customizing your shell and text editor (hello vim/Emacs folks) to be more efficient and powerful is _great_ for developers who rarely leave their own workstation. But I spend much of my day logging into random hosts that don't have my custom shell scripts and aliases, so I'm actively careful not to fill my muscle memory with custom shortcuts and the like.

      Of course, I _do_ have some custom shell scripts and aliases, but these are only for things I will ever do locally.

  • djtriptych 2 days ago

    Loved it too. Made me want to write a schema for other developers to add (tool, frequency_of_use, category, description) tuples.

oceanplexian 3 days ago

It's weird how the circle of life progresses for a developer or whatever.

- When I was a fresh engineer I used a pretty vanilla shell environment

- When I got a year or two of experience, I wrote tons of scripts and bash aliases and had a 1k+ line .bashrc, same as OP

- Now, as a more tenured engineer (15 years of experience), I basically just want a vanilla shell with zero distractions, aliases or scripts and use native UNIX implementations. If it's more complicated than that, I'll code it in Python or Go.

  • chis 3 days ago

    I think it's more accurate to say that this comes from a place of laziness than some enlightened peak. (I say this as someone who does the same, and is lazy.)

    When I watch the work of coworkers or friends who have gone down these rabbit holes of customization, I always learn some interesting new tools to use - lately I've added atuin, fzf, and a few others to my Linux install.

    • heyitsguay 3 days ago

      I went through a similar cycle. Going back to simplicity wasn't about laziness for me, it was because i started working across a bunch more systems and didn't want to do my whole custom setup on all of them, especially ephemeral stuff like containers allocated on a cluster for a single job. So rather than using my fancy setup sometimes and fumbling through the defaults at other times, i just got used to operating more efficiently with the defaults.

      • nijaru 3 days ago

        You can apply your dotfiles to servers you SSH into rather easily. I'm not sure what your workflow is like but frameworks like zsh4humans have this built in, and there are tools like sshrc that handle it as well. Just automate the sync on SSH connection. This also applies to containers if you ssh into them.

        • theshrike79 3 days ago

          I'm guessing you haven't worked in Someone Else's environment?

          The amount of shit you'll get for "applying your dotfiles" on a client machine or a production server is going to be legendary.

          Same with containers, please don't install random dotfiles inside them. The whole point of a container is to be predictable.

          • nijaru 3 days ago

            Do you have experience with these tools? Some such as sshrc only apply temporarily per session and don't persist or affect other users. I keep plain 'ssh' separate from shell functions that apply dotfiles and use each where appropriate. You can also set up temporary application yourself pretty easily.

          • fragmede 3 days ago

            If, in the year 2025, you are still using a shared account called "root" (password: "password"), and it's not a hardware switch or something (and even they support user accounts these days), I'm sorry, but you need to do better. If you're the vendor, you need to do better, if you're the client, you need to make it an issue with the vendor and tell them they need to do better. I know, it's easy for me to say from the safety of my armchair at 127.0.0.1. I've got some friends in IT doing support that have some truly horrifying stories. But holy shit why does some stuff suck so fucking much still. Sorry, I'm not mad at you or calling you names, it's the state of the industry. If there were more pushback on broken busted ass shit where this would be a problem, I could sleep better at night, knowing that there's somebody else that isn't being tortured.

            • theshrike79 3 days ago

              It’s 2025. I don’t even have the login password to any server, they’re not unicorns, they’re cattle.

              If something is wrong with a server, we terminate it and spin up a new one. No need for anyone to log in.

              In very rare cases it might be relevant to log in to a running server, but I haven’t done that in years.

          • LinXitoW 2 days ago

            In other replies you explicitly state how rare it is that you log in to other systems.

            Aren't you therefore optimizing for 1% of the cases, but sabotaging the 99%?

          • YouAreWRONGtoo 3 days ago

            Someone else's environment? That should never happen. You should get your own user account and that's it.

            • mlrtime 2 days ago

              Sometimes we need to use service accounts, so while you do have your own account, all the interesting things happen in svc_foo, to which you cannot add your dotfiles.

            • theshrike79 3 days ago

              I don’t even get an account on someone else’s server. There’s no need for me to log in anywhere unless it’s an exceptional situation.

              • YouAreWRONGtoo 2 days ago

                This doesn't make sense.

                You said you were already using someone else's environment.

                You can't later say that you don't.

                Whether or not shell access makes sense depends on what you are doing, but a well written application server running in a cloud environment doesn't need any remote shell account.

                It's just that approximately zero typical monolithic web applications meet that level of quality and given that 90% of "developers" are clueless, often they can convince management that being stupid is OK.

                • 1718627440 2 days ago

                  They do get to work on someone else's server, but they do not get a separate account on that server. Their client would not be happy to have them mess around with the environment.

                  • YouAreWRONGtoo 2 days ago

                    By definition, if the client Alice gives contractor Mallory access to a user account called alice, that's worse than giving them an account called mallory.

                    Accounts are basically free. Not having accounts; that's expensive.

                    • nomel 2 days ago

                      They specifically mentioned service accounts. If they’re given a user account to log in as, they still might have to get into and use the service account, and its environment, from there. If the whole purpose was to get into the service account, and the service account is already set up for remote debugging, then the client might prefer to skip the creation of the practically useless user account.

                      • YouAreWRONGtoo 2 days ago

                        That's still not professional, but then again 99.9% of companies aren't.

                        • nomel 2 days ago

                          Could you help me understand what assumptions about the access method you have in place that make this seem unprofessional?

                          Let's assume they need access to the full service account environment for the work, which means they need to login or run commands as the service account.

                          This is a bit outside my domain, so this is a genuine question. I've worked on single user and embedded systems where this isn't possible, so I find the "unprofessional" statement very naive.

      • tester457 3 days ago

        The defaults are unbearable. I prefer using chezmoi to feel at home anywhere. There's no reason I can't at least have my aliases.

        I'd rather take the pain of writing scripts to automate this for multiple environments than suffer the death by a thousand cuts which are the defaults.

        • fragmede 3 days ago

          chezmoi is the right direction, but I don't want to have to install something on the other server; I should just be able to ssh to a new place and have everything already set up, via LocalCommand and Host * in my ~/.ssh/config.
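
          The relevant ~/.ssh/config knobs look something like this (a sketch; the rsync invocation is illustrative and untested):

            Host *
                PermitLocalCommand yes
                LocalCommand rsync -a --ignore-existing ~/.dotfiles/ %r@%h:~/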

    • bigwheels 3 days ago

      Atuin is new to me!

      https://github.com/atuinsh/atuin

      Discussed 4 months ago:

      Atuin – Magical Shell History https://news.ycombinator.com/item?id=44364186 - June 2025, 71 comments

      • auraham 3 days ago

        I gave it a try a few months ago, but it did not work for me. My main issue is that atuin broke my workflow with fzf (if I remember correctly, pressing ctrl + r to look up my shell history did not work well after installing atuin).

        • bigwheels 3 days ago

          I'm sympathetic, also a longtime fzf user here. I install it reflexively on any system I use for more than a day or two.

        • TsiCClawOfLight 2 days ago

          This is configurable! I use atuin, but fzf with ctrl-r.

      • tacker2000 3 days ago

        I like atuin but why is it so slow when first opening (hitting up) in the shell?

        • johntash 3 days ago

          I'd recommend disabling atuin when hitting up and just leave it on ctrl+r instead

        • YouAreWRONGtoo 2 days ago

          Either it wasn't a design goal or they are stupid. Why don't you tell us?

          The right way this would work is via a systemd service and then it should be instant.

  • trenchpilgrim 3 days ago

    When I had one nix computer, I wanted to customize it heavily.

    Now I have many nix computers and I want them consistent and with only the most necessary packages installed.

    • sestep 3 days ago

      For anyone else reading this comment who was confused because this seems like the opposite of what you'd expect about Nix: Hacker News ate the asterisks and turned them into italics.

      • fragmede 3 days ago

        use a backslash. \*

        (had to use a double backslash to render that correctly)

        • latexr 3 days ago

          Or two consecutive asterisks: ** becomes *

    • ozim 3 days ago

      Besides many nix computers I also have a wife, a dog, children, chores, and shopping to be done. Unlike when I was a young engineer, I can no longer stay up all night fiddling with bash scripts and environments.

      • soraminazuki 3 days ago

        What do your wife, dog, children, chores, and shopping have to do with custom configuration and scripts? Just set up a Git repo online, put your files there, and take a couple of minutes to improve it incrementally when you encounter inconveniences. And just like that, you've made your life easier for marginal effort.

        • 1718627440 2 days ago

          They compete for time.

          • mlrtime 2 days ago

            Don't even try to explain the scripts to wife*, try the dog. At least he'll understand it just as much and be enthusiastic to hear it!

            *may not be applicable to all wives, ymmv.

            • TsiCClawOfLight 2 days ago

              I taught my wife LaTeX, she loves me for it :D

          • soraminazuki 2 days ago

            I'm saying that makes no sense, as I wrote in the comment you're replying to.

            • 1718627440 2 days ago

              Having a wife increases the opportunity cost of the time you spend maintaining the scripts, and also increases the cost of writing them (when the wife is nagging).

    • Ferret7446 2 days ago

      I don't get why this is a problem. Just stick all your configs in a git repo and clone it wherever you need it.

      • trenchpilgrim a day ago

        I run two OSes. So two variants.

        Some are desktops, some laptops, some servers. Different packages installed, different hardware. Three more variants.

        Yes, I do have a script to set up my environment, but it already has a lot of conditional behavior to handle these five total variants. And I don't want to have to re-test the scripts and re-sync often.

  • imiric 3 days ago

    I've heard this often, but I'm going on ~25 years of using Linux, and I would be lost without my dotfiles. They represent years of carefully crafting my environment to suit my preferences, and without them it would be like working on someone else's machine. Not impossible, just very cumbersome.

    Admittedly, I've toned down the configs of some programs, as my usage of them has evolved or diminished, but many are still highly tailored to my preferences. For example, you can't really use Emacs without a considerable amount of tweaking. I mean, you technically could, but such programs are a blank slate made to be configured (and Emacs is awful OOB...). Similarly for zsh, which is my main shell, although I keep bash more vanilla. Practically the entire command-line environment and the choices you make about which programs to use can be considered configuration. If you use NixOS or Guix, then that extends to the entire system.

    If you're willing to allow someone else to tell you how you should use your computer, then you might as well use macOS or Windows. :)

  • D13Fd 3 days ago

    I would still call my Python scripts “scripts.” I don’t think the term “scripts” is limited to shell scripts.

  • planb 3 days ago

Yeah - been there, done that, too. I feel like the time I gain from having a shortcut is often less than what I would need to maintain it or to remember the real syntax when I'm on a machine where it's not available (which happens quite often in my case). I try to go with system defaults as much as possible nowadays.

  • jamesbelchamber 3 days ago

    I am going through a phase of working with younger engineers who have many dotfiles, and I just think "Oh, yeh, I remember having lots of dotfiles. What a hassle that was."

    Nowadays I just try to be quite selective with my tooling and learn to change with it - "like water", so to speak.

    (I say this with no shade to those who like maintaining their dotfiles - it takes all sorts :))

    • dbalatero 2 days ago

      I've been programming 30 years and I really don't find it a hassle:

      - if you commit them to git, they last your entire career

      - improving your setup is basically compound interest

      - with a new laptop, my setup script might cause me 15 minutes of fixing a few things

      - the more you do it, the less any individual hassle becomes, and the easier it looks to make changes – no more "i don't have time" mindset

  • grimgrin 3 days ago

    this is how it works for you

    as a person who loves their computer, my ~/bin is full. i definitely (not that you said this) do not think "everything i do has to be possible on every computer i am ever shelled into"

    being a person on a computer for decades, i have tuned how i want to do things that are incredibly common for me

    though perhaps you're referring to work and not hobby/life

  • subsection1h 3 days ago

    > When I was a fresh engineer I used a pretty vanilla shell environment. When I got a year or two of experience, I wrote tons of scripts

    Does this mean that you learned to code to earn a paycheck? I'm asking because I had written hundreds of scripts and Emacs Lisp functions to optimize my PC before I got my first job.

  • heap_perms 3 days ago

    I can't say I relate at all (5 years of experience). They'll have to pry my 1000-line .zshrc from my cold, dead hands. For example, zsh-autosuggestions improves my quality of life so ridiculously much it's not even funny.

    • cvak 3 days ago

      I moved away from a 1000-line .zshrc when I had to do stuff on Linux VMs/containers and was constantly lost. But yeah, zsh-autosuggestions and fzf-tab are not going anywhere.

  • eikenberry 3 days ago

    Prepare to swing back again. With nearly 30 years experience I find the shell to be the best integration point for so many things due to its ability to adapt to whatever is needed and its universal availability. My use of a vanilla shell has been reduced to scripting cases only.

  • nonethewiser 3 days ago

    On the other hand, the author seems to have a lot of experience as well.

    Personally I tend to agree... there is a very small subset of things I find worth aliasing. I have a very small amount and probably only use half of them regularly. Frankly I wonder how my use case is so different.

    edit: In the case of the author, I guess he probably wants to live in the terminal full time. And perhaps offline. There is a lot of static data he's stored, like HTTP status codes: https://codeberg.org/EvanHahn/dotfiles/src/commit/843b9ee13d...

    In my case I'd start typing it in my browser and just click something I've visited 100 times before. There is something to be said for reducing that redundant network call, but I don't think it makes much practical difference, and the mental mapping/discoverability of aliases isn't nothing.

  • russellbeattie 3 days ago

    The moment of true enlightenment is when you finally decide to once and for all memorize all the arguments and their order for those command line utilities that you use at an interval that's just at the edge of your memory: xargs, find, curl, rsync, etc.

    That, plus knowing how to parse a man page to actually understand how to use a command (a skill that takes years to master), pretty much removes the need for most aliases and scripts.

    • whatevertrevor 3 days ago

      I already have limited space for long term memory, bash commands are very far down the list of things I'd want to append to my long term storage.

      I use ctrl-R with a fuzzy matching program, and let my terminal remember it for me.

      And before it's asked: yes that means I'd have more trouble working in a different/someone else's environment. But as it barely ever happens for me, it's hardly an important enough scenario to optimize for.

    • npodbielski 2 days ago

      Why would I even attempt to do that? Life is too short to try to remember something like that. Maybe 20 years ago, when internet access was not that common. Or maybe if you are a hacker, hacking other people's machines. Me? Just some dev trying to make some money to feed my family? I prefer to take a walk in the woods.

  • dylan604 3 days ago

    man, i couldn't live without alias ..='cd ..' or alias ...='cd ../..'

    to this day, i still get tripped up when using a shell for the first time without those as they're muscle memory now.

    • fiddlerwoaroof 3 days ago

      I just use the autocd zsh shell option for this. And I also use `hash -d` to define shortcuts for common directories. Then just “executing” something like `~gh/apache/kafka` will cd to the right place.
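
      For anyone unfamiliar, the .zshrc lines look like this (paths illustrative):

        setopt autocd
        hash -d gh=~/src/github.com
        # typing `~gh/apache/kafka` alone now cd's into that directory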

    • 1718627440 2 days ago

      Thanks. I hadn't considered these aliases, but they seem useful, so I just added them for my user. :-)

    • 400thecat 2 days ago

      you can configure Alt+Left to go up a level

  • shermanyo 3 days ago

    I use a dotfile with aliases and functions, mostly to document / remember commands I find useful. It's been a handy way to build a living document of the utils I use regularly, and is easy to migrate to each new workstation.

  • Mikhail_Edoshin 3 days ago

    Given the nature of current operating systems and applications, do you think the idea of “one tool doing one job well” has been abandoned? If so, do you think a return to this model would help bring some innovation back to software development?

    Rob Pike: Those days are dead and gone and the eulogy was delivered by Perl.

    • president_zippy 3 days ago

      But was the eulogy written in Perl poetry? I see it everywhere, but I don't know who this JAPH guy is. It's a strange way of spelling Jeff, and it's odd that he types his name in all caps, but he has published a remarkable quantity of works and he's even more famous than the anonymous hacker known as 4chan.

    • npodbielski 2 days ago

      Oh, I hate that paradigm. Well, maybe chmod, ls, rsync, and curl all do their OWN thing very well, but every time I am using one of those tools I have to remember whether, e.g., more detailed output is -v or maybe -vvv or --verbose or -x, for no reason other than that the maintainer felt like it at 2:32 in the morning 17 years ago... Some consistency would help, but it is probably impossible; the flame war over -R being recursive or read-only would never end.

  • mlrtime 2 days ago

    For the Infra Engineers out there who still manage fleets of pets, this is doubly true. You may not have access to, or be able to use, all your shortcut scripts, so you'd better know the raw commands on that unsupported RHEL6 host.

  • denimnerd42 3 days ago

    I prefer using kubectl than any other method so i have plenty of functions to help with that. I'd never consider using python or go for this although I do have plenty of python and go "scripts" on my path too.

  • fragmede 3 days ago

    If you come through the other side, you set up LocalCommand in your .ssh/config which copies your config to every server you ssh to, and get your setup everywhere.

  • apprentice7 3 days ago

    It's the bell curve meme all along.

  • stronglikedan 3 days ago

    Different strokes for different folks - tenured engineers just settle into whatever works best for them.

  • bdangubic 3 days ago

    or just ask claude etc to do it for ya

elric 2 days ago

I've written on this before, but I have an extensive collection of "at" scripts. This started 25+ years ago when I dragged a PC tower running BSD to a friend's house, and their network differed from mine. So I wrote an @friend script which did a bunch of ifconfig foo.

Over time that's grown to an @foo script for every project I work on, every place I frequent that has some kind of specific setup. They are prefixed with an @ because that only rarely conflicts with anything, and tab-complete helps me remember the less frequently used ones.

The @project scripts set up the whole environment, alias the appropriate build tools and versions of those tools, prepare the correct IDE config if needed, drop me in the project's directory, etc. Some start a VPN connection because some of my clients only have git access over VPN etc.

Because I've worked on many things over many years, most of these scripts also print some "help" text so I can remember how shit works for a given project.

Here's an example:

    # @foo
    
    PROJECT FOO
    -----------
    
    VPN Connection: active, split tunnel
    
    Commands: 
    tests: mvn clean verify -P local_tests
    build all components: buildall
    
    Tools:
    java version: 17.0.16-tem
    maven version: 3.9.11
Edit: a word on aliases, I frequently alias tools like maven or ansible to include config files that are specific to that project. That way I can have a .m2 folder for every project that doesn't get polluted by other projects, I don't have to remember to tell ansible which inventory file to use, etc. I'm lazy and my memory is for shit.
  • blixt 2 days ago

    Slightly related but mise, a tool you can use instead of eg make, has “on enter directory” hooks that can reconfigure your system quite a bit whenever you enter the project directory in the terminal. Initially I was horrified by this idea but I have to admit it’s been quite nice to enter into a directory and everything is set up just right, also for new people joining. It has built in version management of just about every command line tool you could imagine, so that an entire team can be on a consistent setup of Python, Node, Go, etc.

    • blixt 2 days ago

      I see other people mentioning env and mise does this too, with additional support to add on extra env overrides with a dedicated file such as for example .mise.testing.toml config and running something like:

      MISE_ENV=testing bun run test

      (“testing” in this example can be whatever you like)

    • nullwarp 2 days ago

      This is very useful to me and I had no idea, thanks for pointing that feature out!

  • mlrtime 2 days ago

    I'm stealing the top comment here because you probably know what I'm asking.

    I've always wanted a Linux directory hook that runs some action. Say I have a scripts dir filled with 10 different shell scripts. I could easily have a readme or something to remember what they all do.

    What I want is some hook in a dir such that every time I cd into that dir it runs the hook. Most of the time it would be a simple 'cat usage.txt' but sometimes it may be 'source .venv/bin/activate'.

    I know I can alias the cd and the hook together, but I don't want that.

    • eadmund 2 days ago

      I recommend direnv for that: https://direnv.net/

      Its intended use case is loading environment variables (you could use this to load your virtualenv), but it works by sourcing a script — and that script can be ‘cat usage.txt.’

      Great tool.

      If you use Emacs (and you should!), there’s a direnv mode. Emacs also has its own way to set configuration items within a directory (directory-local variables), and is smart enough to support two files, so that there can be one file checked into source control for all members of a project and another ignored for one’s personal config.
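
      A minimal .envrc along those lines (direnv must be hooked into the shell first, and each file is approved once with `direnv allow`):

        # .envrc -- runs whenever you cd into this directory
        cat usage.txt
        source .venv/bin/activate
        export MY_PROJECT_ENV=dev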

    • hellcow 2 days ago

      direnv does exactly what you describe (and a lot more) using flake.nix. cd into the directory and it automatically runs. I use it in every single project/repository to set environment variables and install project-specific dependencies locked to specific versions.

      • eadmund 2 days ago

        > direnv does exactly what you describe (and a lot more) using flake.nix

        Direnv is awesome! Note, though, that it does not depend on Nix, just a Unix-like OS and a supported shell: https://direnv.net/#prerequisites

    • oulipo2 2 days ago

      As other comments say, direnv does that, but honestly you should look into mise-en-place (mise) which is really great, and also includes a "mini-direnv"

southwindcg 3 days ago

Regarding the `line` script, just a note that sed can print an arbitrary line from a file, no need to invoke a pipeline of cat, head, and tail:

    sed -n 2p file
prints the second line of file. The advantage sed has over this line script is it can also print more than one line, should you need to:

    sed -n 2,4p file
prints lines 2 through 4, inclusive.
  • tonmoy 3 days ago

    It is often useful to chain multiple sed commands and sometimes shuffle them around. In those cases I would need to keep changing the first sed. Sometimes I need to grep before I sed. Using cat, tail and head makes things more modular in the long run, I feel. It's the ethos of each command doing one small thing.

    • 1-more 3 days ago

      yeah I almost always start with `cat` but I still pipe it into `sed -n 1,4p`

    • southwindcg 3 days ago

      True, everything depends on what one is trying to do at the time.

alberand 3 days ago

My fav script to unpack anything, found a few years ago somewhere

      # ex - archive extractor
      # usage: ex <file>
      function ex() {
          if [ -f "$1" ]; then
              case "$1" in
                  *.tar.bz2) tar xjf "$1" ;;
                  *.tar.gz)  tar xzf "$1" ;;
                  *.tar.xz)  tar xf "$1" ;;
                  *.bz2)     bunzip2 "$1" ;;
                  *.rar)     unrar x "$1" ;;
                  *.gz)      gunzip "$1" ;;
                  *.tar)     tar xf "$1" ;;
                  *.tbz2)    tar xjf "$1" ;;
                  *.tgz)     tar xzf "$1" ;;
                  *.zip)     unzip "$1" ;;
                  *.Z)       uncompress "$1" ;;
                  *.7z)      7z x "$1" ;;
                  *)         echo "'$1' cannot be extracted via ex()" ;;
              esac
          else
              echo "'$1' is not a valid file"
          fi
      }
  • _whiteCaps_ 3 days ago

    `tar xf` autodetects compressed files now. You can replace any of your instances of tar with that.

    • alberand 2 days ago

      Honestly, it doesn't need any updates, it works so great without any pain, I'm just happy with it

    • soraminazuki 3 days ago

      Yes, but only bsdtar has support for zip, rar, and 7z.

  • rbonvall 3 days ago

    I use dtrx, which also ensures that all files are extracted into a folder.

  • juancroldan 3 days ago

    That's brilliant. Now I need its compressing counterpart.

    • alberand 2 days ago

      For compression, I have one for .tar.gz. But it's not that popular in my system. I need something a bit easier than 'pack file file file archive.tar.gz'

  • junkblocker 2 days ago

    I had found a zsh version somewhere which I've updated a few times over the years though I don't get a chance to use it much. :'D

        un () {
     unsetopt extendedglob
     local old_dirs current_dirs lower do_cd
     if [ -z "$1" ]
     then
      print "Must supply an archive argument."
      return 1
     fi
     if [ -d "$1" ]
     then
      print "Can't do much with directory arguments."
      return 1
     fi
     if [ ! -e "$1" -a ! -h "$1" ]
     then
      print "$1 does not exist."
      return 1
     fi
     if [ ! -r "$1" ]
     then
      print "$1 is not readable."
      return 1
     fi
     do_cd=1 
     lower="${(L)1}" 
     old_dirs=(*(N/)) 
     undone=false 
     if which unar > /dev/null 2>&1 && unar "$1"
     then
      undone=true 
     fi
     if ! $undone
     then
      INFO="$(file "$1")" 
      INFO="${INFO##*: }" 
      if command grep -a --line-buffered --color=auto -E "Zstandard compressed data" > /dev/null <<< "$INFO"
      then
       zstd -T0 -d "$1"
      elif command grep -a --line-buffered --color=auto -E "bzip2 compressed" > /dev/null <<< "$INFO"
      then
       bunzip2 -kv "$1"
      elif command grep -a --line-buffered --color=auto -E "Zip archive" > /dev/null <<< "$INFO"
      then
       unzip "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E "RAR archive" > /dev/null
      then
       unrar e "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E 'xar archive' > /dev/null
      then
       xar -xvf "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "tar archive" > /dev/null
      then
       if which gtar > /dev/null 2>&1
       then
        gtar xvf "$1"
       else
        tar xvf "$1"
       fi
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "LHa" > /dev/null
      then
       lha e "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "LHa" > /dev/null
      then
       lha e "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E "compress'd" > /dev/null
      then
       uncompress -c "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E "xz compressed" > /dev/null
      then
       unxz -k "$1"
       do_cd=0 
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E "7-zip" > /dev/null
      then
       7z x "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E "RPM " > /dev/null
      then
       if [ "$osname" = "Darwin" ]
       then
        rpm2cpio "$1" | cpio -i -d --quiet
       else
        rpm2cpio "$1" | cpio -i --no-absolute-filenames -d --quiet
       fi
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E "cpio archive" > /dev/null
      then
       cpio -i --no-absolute-filenames -d --quiet < "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E "Debian .* package" > /dev/null
      then
       dpkg-deb -x "$1" .
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i " ar archive" > /dev/null
      then
       ar x "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "ACE archive" > /dev/null
      then
       unace e "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "ARJ archive" > /dev/null
      then
       arj e "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "xar archive" > /dev/null
      then
       xar -xvf "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "ZOO archive" > /dev/null
      then
       zoo x "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -Ei "(tnef|Transport Neutral Encapsulation Format)" > /dev/null
      then
       tnef "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "InstallShield CAB" > /dev/null
      then
       unshield x "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -Ei "(mail|news)" > /dev/null
      then
       formail -s munpack < "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "uuencode" > /dev/null
      then
       uudecode "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "cab" > /dev/null
      then
       cabextract "$1"
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E -i "PPMD archive" > /dev/null
      then
       ln -s "$1" . && ppmd d "$1" && rm `basename "$1"`
      elif [[ $lower == *.zst ]]
      then
       zstd -T0 -d "$1"
      elif [[ $lower == *.bz2 ]]
      then
       bunzip2 -kv "$1"
      elif [[ $lower == *.zip ]]
      then
       unzip "$1"
      elif [[ $lower == *.jar ]]
      then
       unzip "$1"
      elif [[ $lower == *.xpi ]]
      then
       unzip "$1"
      elif [[ $lower == *.rar ]]
      then
       unrar e "$1"
      elif [[ $lower == *.xar ]]
      then
       xar -xvf "$1"
      elif [[ $lower == *.pkg ]]
      then
       xar -xvf "$1"
      elif [[ $lower == *.tar ]]
      then
       if which gtar > /dev/null 2>&1
       then
        gtar xvf "$1"
       else
        tar xvf "$1"
       fi
      elif [[ $lower == *.tar.zst || $lower == *.tzst ]]
      then
       which gtar > /dev/null 2>&1
       if [[ $? == 0 ]]
       then
        gtar -xv -I 'zstd -T0 -v' -f "$1"
       elif [[ ${OSTYPE:l} == linux* ]]
       then
        tar -xv -I 'zstd -T0 -v' -f "$1"
       else
        zstd -d -v -T0 -c "$1" | tar xvf -
       fi
      elif [[ $lower == *.tar.gz || $lower == *.tgz ]]
      then
       which gtar > /dev/null 2>&1
       if [[ $? == 0 ]]
       then
        gtar zxfv "$1"
       elif [[ ${OSTYPE:l} == linux* ]]
       then
        tar zxfv "$1"
       else
        gunzip -c "$1" | tar xvf -
       fi
      elif [[ $lower == *.tar.z ]]
      then
       uncompress -c "$1" | tar xvf -
      elif [[ $lower == *.tar.xz || $lower == *.txz ]]
      then
       which gtar > /dev/null 2>&1
       if [[ $? == 0 ]]
       then
        xzcat "$1" | gtar xvf -
       else
        xzcat "$1" | tar xvf -
       fi
      elif echo "$INFO" | command grep -a --line-buffered --color=auto -E 'gzip compressed' > /dev/null || [[ $lower == *.gz ]]
      then
       if [[ $lower == *.gz ]]
       then
        gzcat -d "$1" > "${1%.gz}"
       else
        cat "$1" | gunzip -
       fi
       do_cd=0 
      elif [[ $lower == *.tar.bz2 || $lower == *.tbz ]]
      then
       bunzip2 -kc "$1" | tar xfv -
      elif [[ $lower == *.tar.lz4 ]]
      then
       local mytar
       if [[ -n "$(command -v gtar)" ]]
       then
        mytar=gtar 
       else
        mytar=tar 
       fi
       if [[ -n "$(command -v lz4)" ]]
       then
        $mytar -xv -I lz4 -f "$1"
       elif [[ -n "$(command -v lz4cat)" ]]
       then
        lz4cat -kd "$1" | $mytar xfv -
       else
        print "Unknown archive type: $1"
        return 1
       fi
      elif [[ $lower == *.lz4 ]]
      then
       lz4 -d "$1"
      elif [[ $lower == *.epub ]]
      then
       unzip "$1"
      elif [[ $lower == *.lha ]]
      then
       lha e "$1"
      elif which aunpack > /dev/null 2>&1
      then
       aunpack "$@"
       return $?
      else
       print "Unknown archive type: $1"
       return 1
      fi
     fi
     if [[ $do_cd == 1 ]]
     then
      current_dirs=(*(N/)) 
      for i in {1..${#current_dirs}}
      do
       if [[ $current_dirs[$i] != "$old_dirs[$i]" ]]
       then
        cd "$current_dirs[$i]"
        ls
        break
       fi
      done
     fi
        }
  • YouAreWRONGtoo 3 days ago

    Now, add inotify and a systemd user service and you would be getting somewhere. Also packaged versions of that exist already.

    So, you created a square wheel, instead of a NASA wheel.

Noumenon72 3 days ago

While you're creating and testing aliases, it's handy to source your ~/.zshrc whenever you edit it:

    alias vz="vim ~/.zshrc && . ~/.zshrc"
I alias mdfind to grep my .docx files on my Mac:

    docgrep() {
      mdfind "\"$@\"" -onlyin /Users/xxxx/Notes 2> >(grep --invert-match ' \[UserQueryParser\] ' >&2) | grep -v -e '/Inactive/' | sort
    }
I use an `anon` function to anonymize my Mac clipboard when I want to paste something to the public ChatGPT, company Slack, private notes, etc. I ran it through itself before pasting it here, for example.

    anonymizeclipboard() {
      my_user_id=xxxx
      account_ids="1234567890|1234567890"  #regex
      corp_words="xxxx|xxxx|xxxx|xxxx|xxxx"  #regex
      project_names="xxxx|xxxx|xxxx|xxxx|xxxx"  # regex
      pii="xxxx|xxxx|xxxx|xxxx|xxxx|xxxx"  # regex
      hostnames="xxxx|xxxx|xxxx|xxxx|xxxx|xxxx|xxxx|xxxx|xxxx"  # regex
      # anonymize IPs
      pbpaste | sed -E -e 's/([0-9]{1,3})\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}/\1.x.x.x/g' \
      -e "s/(${corp_words}|${project_names}|${my_user_id}|${pii}|${hostnames})/xxxx/g" -e "s/(${account_ids})/1234567890/g" | pbcopy
      pbpaste
    }

    alias anon=anonymizeclipboard
It prints the new clipboard to stdout so you can inspect what you'll be pasting for anything it missed.
  • xwowsersx 3 days ago

    ha! alias vz="vim ~/.zshrc && . ~/.zshrc" is brilliant. Editing zshrc and sourcing is something I do pretty often. Never thought to alias it.

  • 1718627440 2 days ago

    What's the difference between 'source' and '.' ?

    • iguessthislldo 2 days ago

      I think they're the same except '.' is POSIX and 'source' is specific to bash and compatible shells. I personally just use source since it's easier to read and zsh and bash account for basically 100% of my shell usage.

    • Noumenon72 2 days ago

      If $SHELL is /bin/sh, the source command does not exist, but '.' still works.

  • banku_brougham 3 days ago

    brilliant! this happens all the time and I never found a convenient way to manage

dannyobrien 3 days ago

Historical note: getting hold of these scripts by chatting to various developers was the motivation for the original 2004 "lifehacks" talk[1][2]. If you ever get into an online argument over what is a "life hack" and what isn't, feel free to use short scripts like these as the canonical example.

Otherwise, I am happy to be pulled into your discussion, Marshall McLuhan style[3] to adjudicate, for a very reasonable fee.

[1] https://craphound.com/lifehacksetcon04.txt

[2] https://archive.org/details/Notcon2004DannyOBrienLifehacks

[3] https://www.openculture.com/2017/05/woody-allen-gets-marshal...

andai 3 days ago

I use these two all the time to encode and cut mp4s.

The flags are for maximum compatibility (e.g. without them, some MP4s don't play in WhatsApp, or Discord on mobile, or whatever.)

    ffmp4() {
        input_file="$1"
        output_file="${input_file%.*}_sd.mp4"

        ffmpeg -i "$input_file" -c:v libx264 -crf 33 -profile:v baseline -level 3.0 -pix_fmt yuv420p -movflags faststart "$output_file"

        echo "Compressed video saved as: $output_file"
    }
    
    
ffmp4 foo.webm

-> foo_sd.mp4

    fftime() {
        input_file="$1"
        output_file="${input_file%.*}_cut.mp4"
        ffmpeg -i "$input_file" -c copy -ss "$2" -to "$3" "$output_file"

        echo "Cut video saved as: $output_file"
    }

fftime foo.mp4 01:30 01:45

-> foo_cut.mp4

Note, fftime copies the audio and video data without re-encoding, which can be a little janky, but often works fine, and can be much (100x) faster on large files. To re-encode just remove "-c copy"

abetusk 3 days ago

I'm kicking myself for not thinking of the `nato` script.

I tend to try to not get too used to custom "helper" scripts because I become incapacitated when working in other systems. Nevertheless, I really appreciate all these scripts if nothing else than to see what patterns other programmers pick up.

My only addition is a small `tplate` script that creates HTML, C, C++, Makefile, etc. "template" files to start a project. Kind of like a "wizard setup". e.g.

  $ tplate c
  #include <stdio.h>
  #include <stdlib.h>
  int main(int argc, char **argv) {
  }
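
A minimal sketch of how such a script can work (the ~/.tplates directory and one-file-per-language layout are my assumptions):

  #!/bin/bash
  # tplate: print a starter template for the given language to stdout
  # usage: tplate c > main.c
  tdir="$HOME/.tplates"
  if [ -f "$tdir/$1" ]; then
      cat "$tdir/$1"
  else
      echo "tplate: no template for '$1'" >&2
      exit 1
  fi
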
And of course, three scripts `:q`, `:w` and `:wq` that get used surprisingly often:

  $ cat :q
  #!/bin/bash
  echo "you're not in vim"
SoftTalker 3 days ago

Some cool things here but in general I like to learn and use the standard utilities for most of this. Main reason is I hop in and out of a lot of different systems and my personal aliases and scripts are not on most of them.

sed, awk, grep, and xargs along with standard utilities get you a long long way.

  • scoodah 3 days ago

    Same. I interact with too many machines, many of which are ephemeral and will have been reprovisioned the next time I have to interact with it.

    I value out of the box stuff that works most everywhere. I have a fairly lightweight zsh config I use locally, but it's mostly just stuff like a status line that suits me, better history settings, etc. Stuff I won't miss if it's not there.

  • pinkmuffinere 3 days ago

    I totally agree with this, I end up working on many systems, and very few of them have all my creature comforts. At the same time, really good tools can stick around and become impactful enough to ship by default, or to be easily apt-get-able. I don't think a personal collection of scripts is the way, but maybe a well maintained package.

yipbub 3 days ago

I have mkcd exactly (I wonder how many of us do; it's so obvious).

I have almost the same, but differently named with scratch(day), copy(xc), markdown quote(blockquote), murder, waitfor, tryna, etc.

I used to use telegram-send with a custom notification sound a lot for notifications from long-running scripts if I walked away from the laptop.

I used to have one called timespeak that would speak the time to me every hour or half hour.

I have go_clone that clones a repo into GOPATH which I use for organising even non-go projects long after putting go projects in GOPATH stopped being needed.

I liked writing one-offs, and I don't think it's premature optimization because I kept getting faster at it.

  • justusthane 3 days ago

    Obviously that script is more convenient, but if you’re on a system where you don’t have it, you can do the following instead:

        mkdir /some/dir    
        cd !$   
        (or cd <alt+.>)
  • linsomniac 3 days ago

    >I have mkcd exactly ( I wonder how many of us do, it's so obvious)

    Mine is called "md" and it has "-p" on the mkdir. "mkdir -p $1 && cd $1"

  • mttpgn 3 days ago

    I too have a `mkcd` in my .zshrc, but I implemented it slightly differently:

      function mkcd {
        newdir=$1
        mkdir -p $newdir
        cd $newdir
      }
  • taejavu 3 days ago

    Doesn’t the built in `take` do exactly what `mkcd` does? Or is `take` a zsh/macos specific thing?

    Edit: looks like it’s a zsh thing

    • codesnik 3 days ago

      it's an .oh-my-zsh thing (~/.oh-my-zsh/lib/functions.zsh) but thanks, I didn't know about it.

  • aib 2 days ago

    One more from me:

      mkcd() {
        mkdir -p -- "$1" &&
        cd -- "$1"
      }
WA 3 days ago

One script I use quite often:

    function unix() {
      if [ $# -gt 0 ]; then
        echo "Arg: $(date -r "$1")"
      fi
      echo "Now: $(date) - $(date +%s)"
    }
Prints the current date as UNIX timestamp. If you provide a UNIX timestamp as arg, it prints the arg as human readable date.
  • derintegrative 2 days ago

    Similarly I have for Linux

        epoch () {
            if [[ -z "${1:-}" ]]
            then
                    date +'%s'
            else
                    date --date="@${1}"
            fi
        }
    
        % epoch
        1761245789
    
        % epoch 1761245789
        Thu Oct 23 11:56:29 PDT 2025
o11c 3 days ago

I keep meaning to generalize this (directory target, multiple sources, flags), but I get quite a bit of mileage out of this `unmv` script even as it is:

  #!/bin/sh
  if test "$#" != 2
  then
      echo 'Error: unmv must have exactly 2 arguments'
      exit 1
  fi
  exec mv "$2" "$1"
  • virgoerns a day ago

    How do you use it so it's more ergonomic than typing the arguments manually reversed? `unmv !$`?

  • Too 2 days ago

    Nice. This can be generalized to a ”mirror” command to swap the arguments of anything.
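
    A minimal sketch of that idea (the name and semantics are my assumption: run a command with its last two arguments swapped):

      #!/bin/sh
      # mirror CMD A B  ->  CMD B A
      cmd=$1 a=$2 b=$3
      exec "$cmd" "$b" "$a"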

Noumenon72 3 days ago

> ocr my_image.png extracts text from an image and prints it to stdout. It only works on macOS

The Mac Shortcut at https://github.com/e-kotov/macos-shortcuts lets you select a particular area of the screen (as with Cmd-Shift-4) and copies the text out of that, allowing you to copy exactly the text you need from anywhere on your screen with one keyboard shortcut. Great for popups with unselectable text, and copying error messages from coworkers' screenshares.

  • 0cf8612b2e1e 3 days ago

    I have a Linux equivalent that uses maim to select a region and then tesseract to do the OCR.

GNOMES 2 days ago

I use my "dc" command to reverse "cd" frequently https://gist.github.com/GNOMES/6bf65926648e260d8023aebb9ede9...

Ex:

    > echo $PWD
    /foo/bar/batz/abc/123

    > dc bar && echo $PWD
    /foo/bar
Useful for times when I don't want to type a long train of dot slashes (e.g. cd ../../..).

Also useful when using Zoxide, and I tab complete into a directory tree path where parent directories are not in Zoxide history.

Added tab complete for speed.

internet_points 3 days ago

With `xsel --clipboard` (put that in an alias like `clip`), you can use the same thing to replace both `copy` and `pasta`:

    # High level examples
    run_some_command | clip
    clip > file_from_my_clipboard.txt
    
    # Copy a file's contents
    clip < file.txt

    # indent for markdown:
    $ clip|sed 's/^/    /'|clip
dcassett 3 days ago

I find that I like working with the directory stack and having a shortened version of the directory stack in the title bar, e.g. by modifying the stock Debian .bashrc

  # If this is an xterm set the title to the directory stack
  case "$TERM" in
  xterm*|rxvt*)
      if [ -x ~/bin/shorten-ds.pl ]; then
    PS1="\[\e]0;\$(dirs -v | ~/bin/shorten-ds.pl)\a\]$PS1"
      else
    PS1="\[\e]0;${debian_chroot:+($debian_chroot)}\u@\h:   \w\a\]$PS1"
      fi
      ;;
  *)
      ;;
  esac
The script shorten-ds.pl takes e.g.

  0  /var/log/apt
  1  ~/Downloads
  2  ~
and shortens it to:

  0:apt 1:Downloads 2:~

  #!/usr/bin/perl -w
  use strict;
  my @lines;
  while (<>) {
    chomp;
    s%^ (\d+)  %$1:%;
    s%:.*/([^/]+)$%:$1%;
    push @lines, $_
  }
  print join ' ', @lines;

That coupled with functions that take 'u 2' as shorthand for 'pushd +2' and 'o 2' for 'popd +2' make for easy manipulation of the directory stack:

  u() {
    if [[ $1 =~ ^[0-9]+$ ]]; then
      pushd "+$1"
    else
      pushd "$@"
    fi
  }

  o() {
    if [[ $1 =~ ^[0-9]+$ ]]; then
      popd "+$1"
    else
      popd "$@" # lazy way to cause an error
    fi
  }
vilhelmen 2 days ago

Something I've long appreciated is a little Perl script to compute statistics on piped in numbers, I find it great for getting quick summaries from report CSVs.

    #!/usr/bin/perl
    # http://stackoverflow.com/a/9790056
    use List::Util qw(max min sum);
    @a=();
    while(<>){
        $sqsum+=$_*$_;
        push(@a,$_)
    };
    $n=@a;
    $s=sum(@a);
    $a=$s/@a;
    $m=max(@a);
    $mm=min(@a);
    $std=sqrt($sqsum/$n-($s/$n)*($s/$n));
    $mid=int @a/2;
    @srtd=sort @a;
    if(@a%2){
        $med=$srtd[$mid];
    }else{
        $med=($srtd[$mid-1]+$srtd[$mid])/2;
    };
    print "records:$n\nsum:$s\navg:$a\nstd:$std\nmed:$med\max:$m\nmin:$mm";
sdovan1 3 days ago

I have three different way to open file with vim: v: vim (or neovim, in my case) vv: search/preview and open file by filename vvv: search/preview and open file by its content

    alias v='nvim'
    alias vv='f=$(fzf --preview-window "right:50%" --preview "bat --color=always {1}"); test -n "$f" && v "$f"'
    alias vvv='f=$(rg --line-number --no-heading . | fzf -d: -n 2.. --preview-window "right:50%:+{2}" --preview "bat --color=always --highlight-line {2} {1}"); test -n "$(echo "$f" | cut -d: -f1)" && v "+$(echo "$f" | cut -d: -f2)" "$(echo "$f" | cut -d: -f1)"'
chasil 3 days ago

I like this one.

  $ cat /usr/local/bin/awkmail
  #!/bin/gawk -f

  BEGIN { smtp="/inet/tcp/0/smtp.yourco.com/25";
  ORS="\r\n"; r=ARGV[1]; s=ARGV[2]; sbj=ARGV[3]; # /bin/awkmail to from subj < in

  print "helo " ENVIRON["HOSTNAME"]        |& smtp;  smtp |& getline j; print j
  print "mail from:" s                     |& smtp;  smtp |& getline j; print j
  if(match(r, ","))
  {
   split(r, z, ",")
   for(y in z) { print "rcpt to:" z[y]     |& smtp;  smtp |& getline j; print j }
  }
  else { print "rcpt to:" r                |& smtp;  smtp |& getline j; print j }
  print "data"                             |& smtp;  smtp |& getline j; print j

  print "From:" s                          |& smtp;  ARGV[2] = ""   # not a file
  print "To:" r                            |& smtp;  ARGV[1] = ""   # not a file
  if(length(sbj)) { print "Subject: " sbj  |& smtp;  ARGV[3] = "" } # not a file
  print ""                                 |& smtp

  while(getline > 0) print                 |& smtp

  print "."                                |& smtp;  smtp |& getline j; print j
  print "quit"                             |& smtp;  smtp |& getline j; print j

  close(smtp) } # /inet/protocol/local-port/remote-host/remote-port
bemmu 2 days ago

The one I use the most is "cdn". It cds to the newest subdirectory.

So if you're in your projects folder and want to keep working on your latest project, I just type "cdn" to go there.
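
A minimal sketch of such a function (assumes directory names without newlines; `ls -td` sorts by modification time, newest first):

    cdn() { cd -- "$(ls -td -- */ | head -n 1)"; }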

jrm4 3 days ago

Broadly, I very much love this approach to things and wish it were more "acceptable". It's the opposite of things like the "useless use of cat" meme, which to me is one of the WORST meme-type-things in this space.

Like, it's okay -- even good -- for the tools to bend to the user and not the other way around.

arjie 3 days ago

A few I use are:

    #!/usr/bin/env bash
    # ~/bin/,dehex

    echo "$1" | xxd -r -p

and

    #!/usr/bin/env bash
    # ~/bin/,ht

    highlight() {
      # Foreground:
      # 30:black, 31:red, 32:green, 33:yellow, 34:blue, 35:magenta, 36:cyan

      # Background:
      # 40:black, 41:red, 42:green, 43:yellow, 44:blue, 45:magenta, 46:cyan
      escape=$(printf '\033')
      sed "s,$2,${escape}[$1m&${escape}[0m,g"
    }

    if [[ $# == 1 ]]; then
      highlight 31 $1
    elif [[ $# == 2 ]]; then
      highlight 31 $1 | highlight 32 $2
    elif [[ $# == 3 ]]; then
      highlight 31 $1 | highlight 32 $2 | highlight 35 $3
    elif [[ $# == 4 ]]; then
      highlight 31 $1 | highlight 32 $2 | highlight 35 $3 | highlight 36 $4
    fi
I also use the comma-command pattern where I prefix my personal scripts with a `,` which allows me to cycle between them fast etc.

One thing I have found that's worth it is periodically running an aggregation on one's history and purging old ones that I don't use.
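
For the aggregation, a one-liner along these lines works (counts command names in shell history):

    history | awk '{print $2}' | sort | uniq -c | sort -rn | head -20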

revicon 3 days ago

I have a bunch of little scripts and aliases I've written over the years, but none are used more than these...

alias ..='cd ..'

alias ...='cd ../..'

alias ....='cd ../../..'

alias .....='cd ../../../..'

alias ......='cd ../../../../..'

alias .......='cd ../../../../../..'

  • cb321 a day ago

    I used to do this, but unary kind of sucks after 3; so maybe others might like this better before their fingers get trained:

        ..() { # Usage: .. [N=1] -> cd up N levels
          local d="" i
          for ((i = 0; i < ${1:-"1"}; i++))
            d="$d/.."  # Build up a string & do 1 cd to preserve dirstack
          [[ -z $d ]] || cd ./$d
        }
    
    Of course, what I have actually been doing since the early 90s is to realize that a single "." with no args is normally illegal, and people "cd" soooo much more often than they source script definitions. So, I hijack that to save one "." in the first 3 cases and then take a number for the general case.

        # dash allows non-AlphaNumeric alias but not function names; POSIX is silent.
        cd1 () { if [ $# -eq 0 ]; then cd ..; else command . "$@"; fi; } # nice "cd .."
        alias .=cd1
        cdu() {           # Usage: cdu [N=2] -> cd up N levels
          local i=0 d=""  # "." already does 1 level
          while [ $i -lt ${1:-"2"} ]; do d=$d/..; i=$((i+1)); done
          [ -z "$d" ] || cd ./$d; }
        alias ..=cdu
        alias ...='cd ../../..' # so, "."=1up, ".."=2up, "..."=3up, ".. N"=Nup
    
    and as per the comment this even works in lowly dash, but needs a slight workaround. bash can just do a .() and ..() shell function as with the zsh.
  • jcgl 2 days ago

    In fish, I have an abbreviation that automatically expands double dots into ../ so that you can just spam double dots and visually see how far you're going.

      # Modified from
      # https://github.com/fish-shell/fish-shell/issues/1891#issuecomment-451961517
      function append-slash-to-double-dot -d 'expand .. to ../'
       # Get commandline up to cursor
       set -l cmd (commandline --cut-at-cursor)
      
       # Match last line
       switch $cmd[-1]
       case '*.'
        commandline --insert './'
       case '*'
        commandline --insert '.'
       end
      end
  • cosmos0072 3 days ago

    I need this *so* often that I programmed my shell to execute 'cd ..' every time I press KP/ i.e. '/' on the keypad, without having to hit Return.

    Other single-key bindings I use often are:

    KP* executes 'ls'

    KP- executes 'cd -'

    KP+ executes 'make -j `nproc`'
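
    In zsh, this kind of binding can be sketched with `bindkey -s` (the escape codes are what an xterm-like terminal sends in application-keypad mode; that is an assumption, and other terminals differ):

      bindkey -s '^[Oo' 'cd ..^M'   # KP/ -> cd ..
      bindkey -s '^[Oj' 'ls^M'      # KP* -> ls
      bindkey -s '^[Om' 'cd -^M'    # KP- -> cd -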

  • Bishonen88 3 days ago

    up() { local d=""; for ((i=1; i<=$1; i++)); do d="../$d"; done; cd "$d"; }

    up 2, up 3 etc.

  • tacone 3 days ago

    I have setup a shortcut: alt+. to run cd.., it's pretty cool.

    I also aliased - to run cd -

    • fragmede 3 days ago

      but alt-. in bash is used for pasting the last argument to the previous command into the current one.

      • tacone 2 days ago

        Good point, when working with keybindings, you'll inevitably end up overriding built-ins. I see it as a trade-off, between something I don't know of (and wouldn't use) and something I find useful. Works for me :)

        • fragmede 2 days ago

          absolutely. From back in the day, the annoying one was GNU screen, which took over ctrl-a by default. I overrode that to be ctrl-^, which in bash is transpose (making "zx" into "xz"), which was rare enough to be okay with losing.

          • fragmede 12 hours ago

            it was ctrl-t, not ctrl-^

  • vunderba 3 days ago

    Does zsh support this out-of-the-box? Because I definitely never had to set up any of these kinds of aliases but have been using this shorthand dot notation for years.

    • machomaster 3 days ago

      Yes it does.

      • Noumenon72 3 days ago

        Not on my Mac.

            zsh: permission denied: ..
            zsh: command not found: ...
  • Too 2 days ago

    alias cdtop=’cd $(git rev-parse --show-toplevel)’

nberkman 3 days ago

Nice! Tangentially related: I built a (MacOS only) tool called clippy to be a much better pbcopy. It was just added to homebrew core. Among other things, it auto-detects when you want files as references so they paste into GUI apps as uploads, not bytes.

  clippy image.png  # then paste into Slack, etc. as upload

  clippy -r         # copy most recent download

  pasty             # copy file in Finder, then paste actual file here
https://github.com/neilberkman/clippy / brew install clippy
  • Tempest1981 3 days ago

    Adding the word "then" to your first comment would have helped me: (lacking context, I thought the comments explained what the command does, as is common convention)

      clippy image.png   # then paste into Slack, etc. as upload
    
    Also:

      pasty              # paste actual file, after copying file in Finder
    • nberkman 3 days ago

      Updated, I appreciate it!

  • gigatexal 3 days ago

    Awesome. Gonna check this out.

a_e_k 2 days ago

I like the NATO one.

It occurred to me that it would be more useful to me in Emacs, and that might make a fun little exercise.

And that's how I discovered `M-x nato-region` was already a thing.

alentred 3 days ago

> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often (probably about once a month)

I genuinely wonder, why would anyone want to use this, often?

  • abetusk 2 days ago

    As a programmer, you sometimes want to make an alphabet lookup table. So, something like:

      var alpha_lu = "abcdefghijklmnopqrstuvwxyz";
    
    Typing it out by hand is error prone as it's not easy to see if you've swapped the order or missed a character.

    I've needed the alphabet string or lookup rarely, but I have needed it before. Some applications could include making your own UUID function, making a small random naming scheme, associating small categorical numbers to letters, etc.

    The author of article mentioned they do web development, so it's not hard to imagine they've had to create a URL shortener, maybe more than once. So, for example, creating a small name could look like:

      function small_name(len) {
        let a = "abcdefghijklmnopqrstuvwxyz",
            v = [];
        for (let i=0; i<len; i++) {
          v.push( a[ Math.floor( Math.random()*a.length ) ] );
        }
        return v.join("");
      }
      //...
      small_name(5); // e.g. "pfsor"
    
    Dealing with strings, dealing with hashes, random names, etc., one could imagine needing to do functions like this, or functions that are adjacent to these types of tasks, at least once a month.

    Just a guess on my part though.

    • cleartext412 2 days ago

      Personally I only ever needed it once. I was re-implementing a JavaScript function doing some strange string processing, using characters in the input string to calculate indexes into an alphabet array to replace them with. Since I was using Python I just imported string.ascii_lowercase instead of manually typing the sequence, and when I showed the code to someone more experienced than me, I was told it was base64, so all my efforts were replaced with a single base64.b64decode() call.

    • atiedebee 2 days ago

      In C I've personally always just done 'a' + x, no table needed

  • CGamesPlay 3 days ago

    If your native language uses a different alphabet, you might not have been taught "the alphabet song". For example, I speak/read passable Russian, but could not alphabetize a list in Russian.

  • dcassett 3 days ago

    For me it's when I call customer service or support on the phone, and either give them an account #, or confirm a temporary password that I have been verbally given.

    • Tempest1981 3 days ago

      Are you referring to the nato alphabet utility? Or the alphabet script that prints

        abcdefghijklmnopqrstuvwxyz
        ABCDEFGHIJKLMNOPQRSTUVWXYZ
      • johntash 2 days ago

        I imagine all of his passwords are abcdefghijklmnopqrstuvwxyz

      • dcassett 2 days ago

        Apologies, mistook for NATO. Maybe the alphabet is for fonts :)

briansm 3 days ago

Using 'copy' as a clipboard script tells me OP never lived through the DOS era I guess... Used to drive me mad switching between 'cp' in UNIX and 'copy' in DOS. (Same with the whole slash vs backslash mess.)

linsomniac 2 days ago

The Gen AI tooling is exceptionally good at doing these sorts of things, and way more than just "mkdir $1 && cd $1". For example:

I have used it to build an "escmd" tool for interacting with Elasticsearch. It makes the available commands much more discoverable, the output it formats in tables, and gets rid of sending JSON to a curl command.

A variety of small tools that interact with Jira (list my tickets, show tickets that are tagged as needing ops interaction in the current release).

A tool to interact with our docker registry to list available tags and to modify tags, including colorizing them based on the sha hash of the image so it's obvious which ones are the same. We manage docker container deploys based on tags so if we "cptag stg prod" on a project, that releases the staging artifact to production, but we also tag them by build date and git commit hash, so we're often working with 5-7 tags.

Script to send a "Software has successfully been released" message via gmail from the command-line.

A program to "waituntil" a certain time to run a command: "waituntil 20:00 && run_release", with nice display of a countdown.

I have a problem with working on too many things at once and then committing unrelated things tagged with a particular Jira case. So I had it write me a commit program that lists my tickets, shows the changed files, and lets me select which ones go with that ticket.

All these are things I could have built before, but would have taken me hours each. With the GenAI, they take 5-15 minutes of my attention to build something like this. And Gen AI seems really, really great at building these small, independent tools.
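
The "waituntil" idea above, as a minimal sketch (assumes GNU date, and without the countdown display):

    #!/bin/bash
    # waituntil HH:MM -- sleep until the given wall-clock time (today, or tomorrow if already past)
    target=$(date -d "$1" +%s)
    now=$(date +%s)
    [ "$target" -le "$now" ] && target=$(date -d "tomorrow $1" +%s)
    sleep $(( target - now ))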

interestica 3 days ago

Share yours!

I use this as a bookmarklet to grab the front page of the new york times (print edition). (You can also go back to any date up to like 2011)

I think they go out at like 4 am. So, day-of, note that it will fail if you're in that window before publishing.

    javascript:(()=>{let d=new Date(new Date().toLocaleString('en-US',{timeZone:'America/New_York'})),y=d.getFullYear(),m=('0'+(d.getMonth()+1)).slice(-2),g=('0'+d.getDate()).slice(-2);location.href=`https://static01.nyt.com/images/${y}/${m}/${g}/nytfrontpage/scan.pdf`})()
andai 3 days ago

    alias mpa='mpv --no-video'

    mpa [youtube_url]
I use this to listen to music / lectures in the terminal.

I think it needs yt-dlp installed — and reasonably up to date, since YouTube keeps breaking yt-dlp... but the updates keep fixing it :)

  • andai 3 days ago

    On the subject of yt-dlp, I use it to get (timestamped) transcripts from YouTube, to shove into LLMs for summaries.

        ytsub() {
            yt-dlp \
                --write-sub \
                --write-auto-sub \
                --sub-lang "en.*" \
                --skip-download \
                "$1" && vtt2txt
        }
    
        ytsub [youtube_url]
    
    Where vtt2txt is a python script — slightly too long to paste here — which strips out the subtitle formatting, leaving a (mostly) human readable transcript.
WhyNotHugo 3 days ago

> cpwd copies the current directory to the clipboard. Basically pwd | copy. I often use this when I’m in a directory and I want use that directory in another terminal tab; I copy it in one tab and cd to it in another. I use this once a day or so.

You can configure your shell to notify the terminal of directory changes, and then use your terminal’s “open new window” function (eg: ctrl+shift+n) to open a new window retaining the current directory.
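
The shell-side piece of that can be sketched like this (the OSC 7 escape sequence; whether a new window actually inherits the directory depends on the terminal):

    # zsh: report the working directory to the terminal after every cd
    chpwd() { printf '\e]7;file://%s%s\e\\' "$HOST" "$PWD"; }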

helicaltwine 3 days ago

As a bonus, I prepend my custom aliases or scripts with my user name and a hyphen (e.g. helicaltwine-). It helps me recall rarely used scripts when I need them and forget the names.

  • dunb 3 days ago

    I follow a similar but more terse pattern. I prepend them all with a comma, and I have yet to find any collisions. If you're using bash (and I assume posix sh as well), the comma character has no special meaning, so this is quite a nice use for it. I agree that it's nice to type ",<tab>" and see all my custom scripts appear.

Tempest1981 3 days ago

> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often

I'm curious to hear some examples (feel like I'm missing out)

chamomeal 3 days ago

I started writing way more utility scripts when I found babashka. Magic of clojure, instant startup, easy to shell out to any other command, tons of useful built in stuff, developing with the REPL. It’s just a good time!!

cool-RR 3 days ago

The most useful script I wrote is one I call `posh`. It shortens a file path by using environment variables. Example:

  $ posh /home/ramrachum/Dropbox/notes.txt
  $DX/notes.txt
Of course, it only becomes useful when you define a bunch of environment variables for the paths that you use often.

I use this a lot in all of my scripts. Basically whenever any of my scripts prints a path, it passes it through `posh`.

  • oneeyedpigeon 2 days ago

    I'd love to see this script. Does it use `env` and strip out things like PWD?

    • cool-RR 2 days ago

      I wrote it in a way that's too intertwined with my other shit to be shareable with people, but honestly you can copy-paste my comment to your friendly neighborhood LLM and you'll get something decent. Indeed it uses `env`.

      • oneeyedpigeon 2 days ago

        Understood. I'd rather write it myself from scratch than use an LLM; confirmation of the general process should be enough, I hope!
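
        For reference, the general process can be sketched like this (my reconstruction, not the original script):

            #!/bin/bash
            # posh: abbreviate a path using the longest matching environment variable value
            p=$1 best_var='' best_len=0
            while IFS='=' read -r var val; do
              case $val in /*) ;; *) continue ;; esac     # only absolute-path values
              if [[ $p == "$val"* && ${#val} -gt $best_len ]]; then
                best_var=$var best_len=${#val}
              fi
            done < <(env)
            if [[ -n $best_var ]]; then
              printf '$%s%s\n' "$best_var" "${p:$best_len}"
            else
              printf '%s\n' "$p"
            fi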

sedatk 3 days ago

> `rn` prints the current time and date using date and cal.

And you can type `rn -rf *` to see all timezones recursively. :)

lolive 2 days ago

My most important script has been to remap CapsLock as a kind of custom Meta key, that transforms (when pressed) the Space into Return, hjkl into arrows, io into PgUp/PgDn, and 1-9 into function keys. Now I have a 60% keyboard that takes 0 space on my desk. And I am reaaaally happy with this setup.

[that, plus LinkHint plugin for Firefox, and i3 for WM is my way to go for a better life]

cshores 2 days ago

I have a script called catfiles that I store in ~/.local/bin that recursively dumps every source file with an associated file header so I can paste the resulting blob into Gemini and ChatGPT in order to have a conversation about the changes I would like to make before I send off the resulting prompt to Gemini Code Assist.

Here's my script, if anyone is interested, as I find it to be incredibly useful.

    find . -type f \( -name "*.tf" -o -name "*.tfvars" -o -name "*.json" -o -name "*.hcl" -o -name "*.sh" -o -name "*.tpl" -o -name "*.yml" -o -name "*.yaml" -o -name "*.py" -o -name "*.md" \) -exec sh -c 'for f; do echo "### FILE: $f ###"; cat "$f"; echo; done' sh {} +

sorenjan 3 days ago

I got a ccurl python script that extracts the cookies from my Firefox profile and then passes those on to curl, that way I can get webpages where I'm logged in.
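
The core idea can be sketched in shell as well (the profile glob and the moz_cookies schema are assumptions; the db is copied first because Firefox keeps it locked):

    #!/bin/bash
    # ccurl URL [curl args...] -- curl with cookies from the default Firefox profile
    host=$(printf '%s' "$1" | sed -E 's#^[a-z]+://([^/]+).*#\1#')
    db=$(ls ~/.mozilla/firefox/*.default*/cookies.sqlite 2>/dev/null | head -n 1)
    tmp=$(mktemp) && cp "$db" "$tmp"
    cookies=$(sqlite3 "$tmp" "SELECT group_concat(name || '=' || value, '; ')
        FROM moz_cookies WHERE host LIKE '%$host%';")
    rm -f "$tmp"
    curl -H "Cookie: $cookies" "$@"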

hyperman1 2 days ago

I have a script/alias, named p, that allows me to task-switch. It takes an issue and maybe an argument, and does a bunch of things if they make sense in context. It has grown over the years.

So 'p ISSUE-123' :

* creates a folder issues/ISSUE-123 for work files, containing links to a backed-up folder and the project repository. The shell is cd'd to it

* The repo might get a new branch with the issue name.

* An IDE might start containing the project.

* The browser's home button brings you to a page with all kinds of relevant links: the issue tracker, the CI, all kinds of test pages, etc...

* The open/save dialogs for every program get a shortcut named 'issue'

* A note is made in a log that allows me to do time tracking at the end of the week.

* A commit message template with the issue is created.
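
A skeleton of that pattern as a shell function (every path and command below is illustrative, not the actual script):

    p() {
        local issue=$1
        mkdir -p ~/issues/"$issue" && cd ~/issues/"$issue"
        git -C ~/work/repo switch -c "$issue" 2>/dev/null \
            || git -C ~/work/repo switch "$issue"
        printf '%s %s\n' "$(date -Is)" "$issue" >> ~/issues/timelog    # for time tracking
        printf '%s: \n' "$issue" > commit-template.txt                 # commit message template
    }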

internet_points 2 days ago

on my ubuntu, `date -I` does iso dates

Also re: alphabet

    $ echo {a..z}
    a b c d e f g h i j k l m n o p q r s t u v w x y z
  • oneeyedpigeon 2 days ago

    date -I even works on macOS, which I was pleasantly surprised by!

    If you want the exact alphabet behaviour as the OP:

        $ echo {a..z} $'\n' {A..Z} | tr -d ' '
javier123454321 3 days ago

This is one area where I've found success in vibe coding: making scripts for repetitive tasks that are just above the complexity threshold where the math between automating and doing manually is not so clear. I have Copilot generate the code for me, and honestly I don't care too much about its quality or extensibility; the scripts are easy enough to read through that I don't feel like my job is AI PR reviewer.

yellowapple 2 days ago

Love it! I'll absolutely be borrowing some of these :)

On every machine of mine I tend to accumulate a bunch of random little scripts along these lines in my ~/.local/bin, but I never seem to get around to actually putting them anywhere. Trying to knock that habit by putting any new such scripts in a “snippets” repo (https://fsl.yellowapple.us/snippets); ain't a whole lot in there yet, but hopefully that starts to change over time.

fragmede 3 days ago

Where are the one letter aliases? My life got better after I alias k=kubectl

brainzap 2 days ago

My most used automation copies a file with rclone to backblaze blob storage, and puts the link into the clipboard. (for sharing memes)

and alias debian='docker run -it --rm -v $(pwd):/mnt/host -w /mnt/host --name debug-debian debian' (single quotes, so $(pwd) expands when the alias is used, not when it's defined)

spiffyk 2 days ago

> url "$my_url" parses a URL into its parts. I use this about once a month to pull data out of a URL, often because I don’t want to click a nasty tracking link.

This sounds pretty useful!

Coincidentally, I have recently learned that Daniel Stenberg et al (of cURL fame) wrote trurl[1], a libcurl-based CLI tool for URL parsing. Its `--json` option seems to yield similar results as TFA's url, if slightly less concise because of the JSON encoding. The advantage is that recent releases of common Linux distros seem to include trurl in their repos[2].

[1]: https://curl.se/trurl/

[2]: https://pkgs.org/search/?q=trurl

lillesvin 3 days ago

Obviously, to each their own, but to me this is an overwhelming number of commands to remember on top of all the ones they are composed of, which you will likely need to know anyway, regardless of whether the custom ones exist.

Like, I'd have to remember both `prettypath` and `sed`, and given that there's hardly any chance I'll not need `sed` in other situations, I now need to remember two commands instead of one.

On top of that `prettypath` only does s/:/\\n/ on my path, not on other strings, making its use extremely narrow. But generally doing search and replace in a string is incredibly useful, so I'd personally rather just use `sed` directly and become more comfortable with it. (Or `perl`, but the point is the same.)

As I said, that's obviously just my opinion, if loads of custom scripts/commands works for you, all the more power to you!

  • cleartext412 2 days ago

    For someone using sed often enough inventing prettypath won't make sense. However, if producing correct sed command, be it by remembering the options, reading manual or digging through the shell history, takes some amount of mental effort, your brain will happily stick "prettypath" into memory as long as doing so stays less mentally taxing than doing original task from scratch.

0xbadcafebee 3 days ago

The scripts from my junk drawer (https://github.com/peterwwillis/junkdrawer) I use every day are 'kd' and 'gw', which use the Unix dialog command to provide an easy terminal UI for Kubectl and Git Worktrees (respectively)... I probably save 15+ minutes a day just flitting around in those UIs. The rest of the scripts I use for random things; tasks in AWS/Git/etc I can never remember, Terraform module refactoring, Bitbucket/GitHub user management, Docker shortcuts, random password generation, mirroring websites with Wget, finding duplicate files, etc.

omnster a day ago

Regarding the `timer` script, it appears to block the shell. A way to avoid this would be to spawn a subshell for the sleep command like this: `( sleep "$1" && notify ... ) &`

nullgeo 2 days ago

One I use a lot is kp; it kills the process listening on a particular TCP port.

  kp () {
      if [ -z "$1" ]; then
          echo "Usage: kp <port>"
          return 1
      fi
      lsof -nP -iTCP:"$1" -sTCP:LISTEN | awk 'NR>1 {print $2}' | xargs kill -9
  }
rcarmo 3 days ago

As a fun game, I suggest feeding the entire piece to an LLM and asking it to create those scripts. The differences between Claude, GPT-5 and Gemini are very interesting.

some_guy_nobel 3 days ago

These are great, and I have a few matching myself.

Here are some super simple ones I didn't see that I use almost every day:

cl="clear"

g="git"

h="history"

ll="ls -al"

path='echo -e ${PATH//:/\\n}'

lv="live-server"

And for common navigation:

dl="cd ~/Downloads"

dt="cd ~/Desktop"

  • hackeraccount 2 days ago

    I'm terrible about remembering shortcuts (edit a bash line in an editor? Can never remember it), but clear (CTRL-l) is one that really stuck.

    That and exit (CTRL-d). A guy I used to work with just mentioned it casually and somehow it just seared itself into my brain.

    • virgoerns a day ago

      FYI, ctrl-d isn't a shortcut to exit the terminal. It sends the EOF (end of file) character which, when it reaches the shell, closes the shell's standard input. It generally closes any active interactive input, like all REPLs, interactive input to sed, etc. When an interactive shell can no longer get input, it exits as soon as possible, and then its parent, the terminal window, also closes. More or less :)

hiq 3 days ago

I've started using snippets for code reviews, where I find myself making the same comments (for different colleagues) regularly. I have a keyboard shortcut opening a fuzzy search to find the entry in a single text file. That saves a lot of time.

As an aside, I find most of these commands very long. I tend to use very short aliases, ideally 2 characters. I'm assuming the author uses tab completion most of the time; if the prefixes don't overlap beyond 3 characters it's not that bad, and maybe the history is more readable.

amterp 3 days ago

Love this, lots of great ideas I'll be stealing :)

Folks interested in scripting like this might like this tool I'm working on https://github.com/amterp/rad

Rad is built specifically for writing CLI scripts and is perfect for these sorts of small to medium scripts, takes a declarative approach to script arguments, and has first-class shell command integration. I basically don't write scripts in anything else anymore.

greenpizza13 2 days ago

For tempe, I recommend changing "cd" to "pushd" so you can "popd" as soon as you're done.

sitebolts 2 days ago

Here are some snippets that we've compiled over time:

https://snhps.com

They're not all necessarily the most efficient/proper way to accomplish a task, but they're nice to have on hand and be able to quickly share.

Admittedly, their usefulness has been diminished a bit since the rise of LLMs, but they still come in handy from time to time.

vunderba 3 days ago

Nice. I have a bash script similar to the one listed "removeexif" called prep_for_web which takes any image file (PNG, BMP, JPG, WebP), scrubs EXIF data, checks for transparency and then compresses it to either JPG using MozJPEG or to PNG using PNGQuant.

[1] https://github.com/mozilla/mozjpeg

[2] https://pngquant.org

haskellshill 3 days ago

> `nato bar` returns Bravo Alfa Romeo. I use this most often when talking to customer service and need to read out a long alphanumeric string, which has only happened a couple of times in my whole life. But it’s sometimes useful!

Even more useful is just learning the ICAO Spelling Alphabet (aka NATO Phonetic Alphabet, of which it is neither). It takes like an afternoon and is useful in many situations, even if the receiver does not know it.

  • shellfishgene 2 days ago

    Some time ago I tried to give my email address over the phone to someone in Japan who did not speak English very well. It turned out to be basically impossible. I realized later one could probably come up with a phonetic alphabet of English words most Japanese know!

teo_zero 3 days ago

Please note that 'each' is fundamentally different from 'xargs'.

  echo 1 2 3 | each "rm {}"
is the same as

  rm 1
  rm 2
  rm 3
while

  echo 1 2 3 | xargs rm
is the same as

  rm 1 2 3
I would rather say that 'each' replaces (certain uses of) 'for':

  for i in 1 2 3; do rm $i; done
  • jgtrosh 3 days ago

    It's equivalent to xargs -I {} rm {}

           -I replace-str
                  Replace occurrences of replace-str in the initial-arguments
                  with names read from standard input.  Also, unquoted blanks
                  do not terminate input items; instead the separator is the
                  newline character.  Implies -x and -L 1.
zeckalpha 3 days ago

A couple more standard approaches with fewer chars:

jsonformat -> jq

running -> pgrep

sid- 3 days ago

Why we don't have mkcd natively in Linux boggles my mind :)

  • marcuskaz 3 days ago

    Likewise, why doesn't git clone automatically cd into the repo?

    • andriamanitra 2 days ago

      A subprocess (git) can't modify the working directory of the parent process (the shell). This is a common annoyance with file managers like yazi and ranger as well—you need an extra (usually manual!) installation step to add a shell integration for whichever shell you're using so the shell itself can change directory.

      The best solution for automatically cd'ing into the repo is to wrap git clone in a shell function or alias. Unfortunately I don't think there's any way to make git clone print the path a repository was cloned to, so I had to do some hacky string processing that tries to handle the most common usage (ignore the "gh:" in the URL regex, my git config just expands it to "git@github.com:"):

      https://github.com/Andriamanitra/dotfiles/blob/d1aecb8c37f09...
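
      For the common case, a simpler (and less robust) wrapper can be sketched as a shell function; it assumes the repo lands in a directory named after the URL's last path component:

        gclone() {
          git clone "$@" || return
          local last; for last; do :; done   # the last argument is usually the URL
          cd "$(basename "${last%.git}")"
        }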

  • taejavu 3 days ago

    zsh has a `take` utility that is exactly this

exasperaited 3 days ago

The "scripts" I use the most that I am most happy with are a set of Vagrant tools that manage initialising different kinds of application environments with an apt cache on the host. Also .ssh/config includes to make it as easy as possible to work with them from VSCode.

I set this stuff up so long ago I sort of forgot that I did it at all; it's like a standard feature. I have to remember I did it.

jwsteigerwalt 2 days ago

17 years ago I wrote a short VBA macro that takes the highlighted range of cells, concatenates the values into a comma-separated list, then opens the list in Notepad for easy copying and further use. I can't begin to count the number of executions by myself and those I have shared it with.

jimmySixDOF 2 days ago

I had my hopes on this project, RawDog, using local smol-sized LLMs, but it hasn't been updated in a while. I feel like all this should be running easily in the background nowadays.

https://github.com/AbanteAI/rawdog

pmontra 3 days ago

I also have a radio script to play internet streams with mpv (I think). Other random stuff:

A password or token generator, simple or complicated random text.

Scripts to list, view and delete mail messages inside POP3 servers

n, to start Nautilus from terminal in the current directory.

lastpdf, to open the last file I printed as PDF.

lastdownload, to view the names of the n most recent files in the Downloads directory.

And many more but those are the ones that I use often and I remember without looking at ~/bin
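
For instance, lastdownload can be as small as this sketch (the path and default count are assumptions):

  #!/bin/sh
  # lastdownload: list the N (default 10) most recently modified files in ~/Downloads
  ls -t "$HOME/Downloads" | head -n "${1:-10}"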

xiphias2 3 days ago

An important advantage of aliases was not mentioned: I see everything in one place and can easily build aliases on top of other aliases without much thinking.

Anyways, my favourite alias that I use all the time is this:

    alias a='nvim ~/.zshrc && . ~/.zshrc'
It solves the "not loaded automatically" part, at least for the current terminal.

hughdbrown 2 days ago

This is really interesting, but I need the highlights reel. So I need a script to summarize Hacker News pages and/or arbitrary web pages. Maybe that's what I want for getting the juice out of Medium articles.

botverse 2 days ago

I did something similar with copy until I found this, which works across remote terminals too:

`alias clip="base64 | xargs -0 printf '\e]52;c;%s\007'"`

It just sends it to the client’s terminal clipboard.

`cat thing.txt | clip`
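
For reference, the escape sequence is OSC 52, which many terminals (and tmux with `set-clipboard on`) treat as "set the clipboard". A standalone sketch of the same idea that also strips base64's line wrapping:

  #!/bin/sh
  # clip: copy stdin to the terminal's clipboard via the OSC 52 escape sequence
  printf '\033]52;c;%s\007' "$(base64 | tr -d '\n')"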

frodo8sam 2 days ago

I have these kinds of scripts living as functions in my .bashrc. Kind of a noob here: is there a reason beyond convention to store them individually in bin?

liqilin1567 2 days ago

One of my biggest headaches is stripping a specific number of bytes from the head or tail of a binary file, and I couldn't find any built-in tool for that, so I wrote one in C++.

  • rkeene2 2 days ago

    Last X bytes: dd bs=1 skip=X

    First X bytes: dd bs=X count=1

    • liqilin1567 2 days ago

      Thanks, there were a few errors after testing. Corrected:

      1. stripping first X bytes: dd bs=1 skip=X

      2. stripping last X bytes: truncate -s -X
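
      For example, stripping 16 bytes from each end of a file (GNU coreutils; file names are illustrative, and bs=1 is slow on large files):

        dd bs=1 skip=16 if=input.bin of=trimmed.bin  # drop the first 16 bytes
        truncate -s -16 trimmed.bin                  # then drop the last 16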

apricot13 2 days ago

and here's me still ctrl+r-ing for my commonly used methods

  • dbalatero 2 days ago

    Hopefully with fzf and not with the built-in ctrl+r.

desireco42 3 days ago

I had youtube and serveit and some others, but pasta is really good, thanks!

  • janpmz 3 days ago

    Last month I saw a tweet about how to serve files using

    python3 -m http.server 1337

    Then I turned it into an alias, called it "serveit" and tweeted about it. And now I see it as a bash script, made a little bit more robust in case python is not installed :)
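
    A sketch of what such a script might look like (the Python 2 fallback is an assumption about how a missing python3 could be handled):

      #!/bin/sh
      # serveit: serve the current directory over HTTP
      port="${1:-1337}"
      if command -v python3 >/dev/null 2>&1; then
        exec python3 -m http.server "$port"
      elif command -v python >/dev/null 2>&1; then
        exec python -m SimpleHTTPServer "$port"
      else
        echo "serveit: no python found" >&2
        exit 1
      fi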

internet_points 2 days ago

`line 10` can be written as `sed -n 10p` (instead of head+tail)
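
For long files there's also a variant that quits right after printing, so sed doesn't read the rest (works in both GNU and BSD sed; the filename is illustrative):

  sed -n '10{p;q;}' big.log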

panki27 2 days ago

My most used function is probably the one I use to find the most recent files:

    lt () { ls --color=always -lt "${1:-.}" | head; }

encom 2 days ago

Interesting, but none of the links are working... codeberg.org isn't responding, it just spins forever.

headgasket 3 days ago

If you use x0vnc (useful if you use a Linux machine both from the attached screen and over VNC, and in a bunch of other scenarios), copy and paste to and from the VNC client is not implemented, which is quite frustrating. Here are two scripts that do that for you; I now use this all day. https://github.com/francoisp/clipshare

SuperHeavy256 3 days ago

I hope to see an operating system with these scripts built in, because they are so intuitive and helpful! Which OS will be the first to take this on?

CrimpCity 3 days ago

Lately I’ve been using caffeinate to run long-running scripts without interruption from sleep on Mac. Nothing crazy, but it could be useful to newer devs.
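
For example (the -i flag prevents idle sleep for as long as the wrapped command runs; the script name is illustrative):

  $ caffeinate -i ./long-backup.sh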

senderista 3 days ago

fish abbreviations >> bash aliases

akullpp a day ago

A lot can be done with just aliasing to the right tools:

  alias df='duf'
  alias ls='eza'
  alias ll='eza -l'
  alias cat='bat'
  alias cap='bat -p'
  alias man='tldr'
  alias top='glances'
  alias grep='rg'
  alias ps='procs'
  alias cd='z'
  alias g='gitui'
  alias gs='git st'
  alias gp='git pull'
  alias gu='git add . && git commit -m "Update" && git push'
  alias check='shellcheck'
  alias v='nvim'
  alias len='wc -l'
  alias uuid='uuidgen'
  alias src='source ~/.zshrc'

wiether 3 days ago

It's been a while since I've read something this useful!

There's also some very niche stuff that I won't use but found funny.

  • giraffe_lady 3 days ago

    The nato phonetic alphabet one cracked me up. My dude you don't need that, call center employees don't know it, just say S as in Sugar like ur grandma used to.

    • WCSTombs 3 days ago

      The NATO alphabet is made of commonly known words that are hard to misspell and hard to mishear (at least the initial phonemes). The person on the other end doesn't need to be able to recite the words, they just need to be able to hear "november" and recognize that it starts with N.

      • giraffe_lady 3 days ago

        Yes thank you that's a good description of what a phonetic alphabet is and how it's used.

    • ericyd 3 days ago

      The NATO phonetic alphabet is still useful even if the other party doesn't know it; I've used it a bunch of times on the phone to spell out my 10-letter last name. Saves quite a lot of time and energy for me vs. saying "letter as in word" for each letter.

      • vunderba 3 days ago

        Exactly. The listening party doesn't need to have knowledge of the NATO alphabet to still benefit from it since they are just regular English words.

        I once had someone sound out a serial number over a spotty phone connection years ago and they said "N as in NAIL". You know what sounds a lot like NAIL? MAIL.

        And that is why we don't just arbitrarily make up phonetic alphabets.

      • SoftTalker 3 days ago

        > saying "letter as in word" for each letter

        Which often just confuses things further.

        Me: My name is "Farb" F-A-R-B. B as in Baker.

        Them: Farb-Baker, got it.

      • giraffe_lady 3 days ago

        Right but it's not much more useful than any other phonetic alphabet the other party doesn't know, including the one you make up on the spot.

        • sfink 3 days ago

          If you're me, it's still useful because the ones I make up on the spot aren't great.

          "S-T-E-V-E @ gmail.com, S as in sun, T as in taste, ..." "Got it, fpeve."

        • dragonwriter 3 days ago

          I dunno, there's a pretty good chance that the one people spent time and effort designing, to replace earlier efforts, with the goal of reducing potential ambiguity over noisy connections where mistakes could cost lives, is better than what you improvise on the spot.

    • kelvinjps10 3 days ago

      When I worked in customer service, I asked a teammate what I could do to spell back something the customer said, and she taught me that system. It helped me a lot.

    • senkora 3 days ago

      I once had the customer service agent for Iberia (the Spanish airline) confirm my confirmation number with me using it.

      It worked with me, and I guess it must have worked for him in most of his customer interactions.

    • dcassett 3 days ago

      I've found the NATO alphabet fairly common at call centers, with globalization being a factor.

PUSH_AX 3 days ago

For mkdir then cd into it, I just use ‘take’. Maybe this isn’t available by default everywhere?

ufko_org 3 days ago

Thank you, I also stopped using aliases and have everything as scripts in my ~/bin

naikrovek 3 days ago

> wifi toggle

This fella doesn't know what "toggle" means. In this context, it means "turn off if it's currently on, or turn on if it's currently off."

This should be named `wifi cycle` instead. "Cycle" is a good word for turning something off then on again.

Naming things is hard, but it's not so hard that you can't use the right word. :)

  • codesnik 3 days ago

    or wifi toggle-toggle!

czertyaka 3 days ago

Some of these, especially the text processing ones, are already built into Nushell.

yegle 3 days ago

The markdownquote can be replaced by (at least in vim):

^ (jump to the beginning)

ctrl+v (block selection)

j (move cursor down)

shift+i (insert at the start of each selected line)

type > followed by a space

ESC

thibran 3 days ago

30% of the productivity hacks can be achieved in vanilla Nushell.

blackhaj7 3 days ago

Love this. Gunna use plenty of these

merksoftworks 3 days ago

in oh-my-zsh you can use `take` to do what mkcd does.

hshdhdhehd 3 days ago

These aren't bad, but they'd be much better if they were all flags to the cat command.

E.g. cat --copy

kwar13 3 days ago

that was beautiful to read. command line ftw!

ttflee 3 days ago

`perldoc perlrun`

progforlyfe 2 days ago

absolutely love these time savers!!

banku_brougham 3 days ago

This is really great. At some point I gave up on being more efficient in the terminal, but many pain points are solved by your work.

munchlax 3 days ago

mksh is already the MirBSD Korn SHell

  • rauli_ 2 days ago

    Which very very few people have actually installed on their system.

samtrack2019 3 days ago

No offense, but a lot of those scripts are pretty hacky. They may work for their author, but I would not use them without reviewing them and adapting them to my workflow.

  • ziotom78 2 days ago

    That's a fair point. I think the author intended the post to be a treasure trove of ideas for your own scripts, not as something to blindly include in your daily workflow.