For sure. I'd argue to write the "stupid" code to get started, get that momentum going. The sooner you are writing code, the sooner you are making your concept real, and finding the flaws in your mental model for what you're solving.
I used to try to think ahead, plan ahead and "architect", then I realized simply "getting something on paper" corrects many of the assumptions I had in my head. A colleague pushed me to "get something working" and iterate from there, and it completely changed how I build software. Even if that initial version is "stupid" and hack-ish!
I think this is mostly true, but also I’d highlight the necessity of having a mental model, and iterating.
I think it is common for a programmer to just start programming without coming up with any model, and just try to solve the problem by adding code on top of code.
There are also many programmers who go with their first “working” implementation, and never iterate.
These days, I think the pendulum has swung too far from thinking about the program, maybe mapping it out a bit on paper before writing code.
This puts the "just get it working" as the first priority. Don't care about quality, just make it. Then, and only once you have something working, do you care about quality first. This is about getting the code into something reasonable that would pass a review (e.g., architectually sound). Finally, do an optimization pass.
This is the process I follow for PRs and projects alike. Sometimes you can mix all the steps into a single commit, if you understand the problem&solution domain well. But if you don't, you'll likely have to split it up.
> Finally, do an optimization pass.
Depending on how low-level your code is, this... may not work out in those terms.
In other words, I’d say that if you actually want good software—and that includes making sure its speed falls within a reasonable factor of the napkin-math theoretical maximum achievable on the platform—your three steps can easily constitute three entire rewrites or at least substantial refactors. You might well need to rearchitect if the “working well” version has multiple small loops split by domain-level concern when the hardware really wants a single large one, or if you’re doing a lot of pointer-chasing and need to flatten the whole thing into a single buffer in preorder, or if your interface assumes per-byte ops where SIMD can be applied.
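To make the pointer-chasing example concrete, here is a minimal sketch (illustrative only; the types and the flattening are invented for this comment, not taken from any particular codebase):

    #include <vector>

    struct Node {                     // "working well" version: one heap node per element
        int value;
        Node* left = nullptr;
        Node* right = nullptr;
    };

    int sum_pointers(const Node* n) { // every step is a dependent pointer load
        if (!n) return 0;
        return n->value + sum_pointers(n->left) + sum_pointers(n->right);
    }

    // "working fast" version: the same data flattened into a single buffer in
    // preorder, so traversal becomes a linear scan over contiguous memory.
    struct FlatTree {
        std::vector<int> values;      // preorder
    };

    void flatten(const Node* n, FlatTree& out) {
        if (!n) return;
        out.values.push_back(n->value);   // parent before children: preorder
        flatten(n->left, out);
        flatten(n->right, out);
    }

    int sum_flat(const FlatTree& t) {
        int total = 0;
        for (int v : t.values) total += v;  // cache-friendly, easy to vectorize
        return total;
    }

The interface change is the whole point: you can't get there just by optimizing inside sum_pointers.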
This is not a condemnation of the strategy, mind you. Crap code is valuable and I wish I were better at it. I just disagree that the transition from step 2 to step 3 can be described as an optimization pass. If that’s what you limit yourself to, you’ll quite likely be forced to leave at least an order of magnitude’s worth of performance on the table.
And yes, most consumer software is very much not good by that definition.
(For instance, I’m expecting that the Ladybird devs will be able to get their browser to work well for daily tasks—which I would count a tremendous achievement—but I’m not optimistic about it then becoming any faster than the state of the art even ten or fifteen years ago.)
Some optimization problems require an entire PhD dissertation and research budget to actually solve, so some algorithms require far more effort on this step than is reasonable for most products. As mentioned, sometimes you can combine these all into one step -- when you know the domains well.
Sometimes, it might even be completely separate people working on each step... separated by time and space.
In any case, most software generally stops at (2) simply because the effort towards (3) isn't worth it -- for example, there's very little point in spending two weeks optimizing report generation that runs in the middle of the night, once a month. At some point, there may be, but usually not anytime soon.
This should be carved in stone on every campus computer science building.
https://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast
This somewhat depends on how big of a program/application you are making.
Again, this is something I see bite enterprise-style applications quite often, as they can be pushed out piecemeal: you get things like the datastore/input APIs/UI to the customer quickly, then over the next months things like reporting, auditing, and fine-grained access controls get put in, and suddenly you find yourself stuck working around major issues where a little bit of up-front thinking about the later steps would have saved you a lot of heartache.
This is where 'knowing the domain' lets you put a ton of stuff in all at once. If you have no clue what you're doing, you have to learn the lesson you're talking about. As long as you can avoid joining teams that haven't learned this lesson (and others like it), you'll be fine.
I once joined a team where they knew they were going to do translations at some point ... and the way they decided to "prepare" for it was absolutely nonsensical. It was clear none of them had ever done translations before, so when it came time to actually do the work, it was a disaster. They had told product/sales "it was ready" but it didn't actually work -- and couldn't ever actually work. It required redesigning half the architecture and months of effort across a whole team to get it working. Even then, some aspects were completely untranslatable, which took an additional 6-8 months of refactoring.
So, another lesson is to not try to engineer something in advance unless your goal is to "get it working". If you don't need it yet, it is probably better to wait until you actually need it.
This is fantastic, but how do you communicate this within your organization to peers, and not allow the pace of the organization to interfere? For example, I can see many teams stopping after step 1.
And in some cases, there isn't a reason to continue to step 2 or 3. Software generally has a shelf-life. Most businesses write code that should be rewritten every 5-10 years, but there's that kernel of code that _never_ changes... that's the code that really needs steps 2 and 3. The rest probably only runs occasionally and doesn't explicitly need to be extremely tested and fast.
Agreed. Having worked the range from boring backend systems to performance-critical embedded systems, only a few areas are worth optimizing, and we always let data inform where to invest additional time.
I much prefer a code base that is readable and straightforward (maybe at the expense of some missed perf gains) over code that is highly performant but hard to follow/too clever.
> These days, I think the pendulum has swung too far from thinking about the program, maybe mapping it out a bit on paper before writing code.
Sometimes I think about code structure like a sudoku where you have to eliminate one of two possibilities by following through what would happen. Writing the code is (to me) like placing the provisional numbers and finding where you have a conflict. I simply cannot do it by holding state in my head (i.e., without making marks on the paper).
It could definitely be a limitation of me rather than generally true.
Totally agree. Iteration is key. Mapping things out on paper after you've written the code can also be illuminating. Analysis and design doesn't imply one-and-done Architect -> Implement waterfall methods.
Knowing hard requirements up front can be critical to building the right thing. It's scary how many "temporary" things get built on top of and stuck in production. Obviously loose coupling / clear interfaces can help a lot with this.
But an easy example is "just build the single player version" (of an application) can be worse than just eating your vegetables. It can be very difficult to tack-on multiplayer, as opposed to building for this up front.
I once retrofitted a computer racing game/sim from single-player to multi-player.
I thought it was a masterpiece of abusing the C pre-processor: all variables used for player physics, game state, inputs, and position outputs to the graphics pipeline were guarded with macros, so that as the (overwhelmingly) single-player titles continued to be developed, the code would remain clean for the two titles we hoped to ship with split-screen support.
All the state was wrapped in ss_access() macros (“split screen access”) and compiled to plain variables for single-player titles, but with the variable name changed so writing plain access code wouldn’t compile.
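From memory it looked roughly like this (an illustrative sketch, not the actual shipped macros; all names here are made up):

    /* SPLIT_SCREEN is set by the build only for the split-screen titles */
    #if SPLIT_SCREEN
      typedef struct { float speed, steering; } PlayerState;
      PlayerState g_players[2];
      int g_current_player;
      /* every access goes through the active player's slot */
      #define ss_access(var) (g_players[g_current_player].var)
    #else
      /* single-player build: plain globals, but renamed so code that writes
         the bare names (speed, steering) no longer compiles */
      float ss_speed, ss_steering;
      #define ss_access(var) ss_##var
    #endif

    /* identical gameplay code for both build flavours */
    void integrate_player(float throttle, float dt) {
        ss_access(speed) += throttle * dt;
    }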
I was proud of the technical use/abuse of macros. I was not proud that I'd done a lot of work and imposed a tax on the other teams, all for a feature that producers wanted but that never shipped: in the end we didn't release a single split-screen title. One console title was cancelled (Saturn) and one (mine) shipped single-player only (PlayStation).
That's a great point, and I feel like it is relevant for a lot more than games.
We should definitely have a plan before we start, and sketch out the broad strokes both in design and in actual code as a starting point. For smaller things it's fine to just start hacking away, but when we're designing an entire application I think the right way to approach it is to plan it out and then solve the big problems first. Like multiplayer.
They don't have to be completely solved, it's an iterative process but they should be part of the process from the beginning.
An example from my own work: I took over an app two other developers had started. The plan was to synchronize data from a third party to our own db, but they hadn't done that. They had just used the third party api directly. I don't know why. So when they left and I took over, I ended up deleting/refactoring everything because everything was built around this third party api and there were a whole bunch of problems related to that and how they were just using the third party's data structure directly rather than shaping the data the way we wanted it. The frontend took 30-60+ seconds to load a page because it was making like 7 serialized requests, waiting for a response before sending the next one, and the backend did the same thing.
Now it's loading instantly, but it did require that I basically tear out everything they'd done and rewrite most of the system from scratch.
In many projects it's impossible to know the requirements up front, or they are very vague.
Business requirements != programming requirements/features.
Very often both the business requirements and the programming requirements change a lot, since unless you have already written this one thing, in the exact form that you are making it now, you will NEVER get it right the first time.
The problem is people don't adapt properly. If the business requirements change so much that it invalidates your previous work then you need to re-do the work. But in reality people just duct tape a bunch of workarounds together and you end up with a frankensystem that doesn't do anything right.
It is possible to build systems that can adapt to change, by decoupling and avoiding cross cutting concerns etc you can make a lot of big sweeping changes quite easily in a well designed system. It's just that most developers are bad at software development, they make a horrible mess and then they just keep making it worse while blaming deadlines and management etc.
This is why I hate software engineering as a profession.
You're going to write the "stupid code" to get things out the door, get promoted and move on to another job, and then some future engineer has to come along and fix the mess you made.
But management and the rest of the org won't understand why those future engineers are having such a hard time, why there's so much tech debt, and why any substantial improvements require major rework and refactoring.
So the people writing the stupid code get promoted and look good, but the people who have to deal with the mess end up looking bad.
Sure, that sucks. You know what else sucks? The engineer who does nothing but foundation building and then is surprised that reality doesn't align with their meticulously laid out assumptions.
An engineer that does nothing but foundations can still be a damn good geotechnical engineer.
A foundation that isn't useful to build atop is just a shitty foundation. Everyone is taking it for granted that building a good foundation is impossible if you haven't built a shitty foundation for the same building first, but that's not the only way to do things.
The analogy is strained. Software is closer to a food recipe than a building. Trying to make a 3-layer strawberry banana cake with pineapple frosting? You are going to have to bake something and taste it to see if your recipe is any good. Then make some adjustments and bake some more.
Is the argument here that a skilled chef has no better way to make good food than unguided trial and error? That's obviously not true, as the abundance of "random ingredient" cooking challenges will attest.
> I used to try to think ahead, plan ahead and "architect"
Depends on what you do. If you build a network protocol, you'd better architect it carefully before you start building upon it (and maybe others do, too).
The question is: "if I get this wrong, how much impact does it have?". Getting the API of a core service wrong will have a lot of impact, while writing a small mobile app won't affect anything other than itself.
But the thing is, if you think about that before you start iterating on your small app, then you've already taken an architectural decision :-).
or C): You cultivate a culture of continuous rewrite to match updated requirements and understandings as you code. So, so many people have never learned that, but once you do reach that state, it is very liberating as there will be no more sacred cows.
That said, it takes quite a bit of practice to become good enough at refactoring to actually practice that.
Yeah, I think it's actually a great skill to be comfortable with not getting attached to your code, and being open to refactoring/rearchitecting -- in fact, if you have this as a common expectation, you may get really good at writing easily-maintainable code. I have started putting less and less "clever optimizations" into my code, instead opting for ease of maintainability, and onboarding for new team members to join up and start contributing. Depends on the size of project/team (and the priorities therein), but it helps me later too when I have to change functionality in something I wrote anywhere from 6-48 months ago :)
You should always have an architecture in mind. But it should be appropriate for the scale and complexity of your application _right now_, as opposed to what you imagine it will be in five years. Let it evolve, but always have it.
I frequently do both. It takes longer but leads to great overall architecture. I write a functional thing from scratch to understand the requirements and constraints. It grows organically and the architecture is bad. Once I understand the product better, I think deeply about a better architecture before basically rewriting from scratch. I sometimes need several iterations on the most complex products.
This is where experience matters. The more experience you have, more often than not, the less stupid the code is. Not because you aren't testing your concepts as fast, but because your tooling has improved.
Basically: do you have a good foundation to build from? With more experience, you can build a better foundation.
>>> When I finished school in 2010 (yep, along time ago now), I wanted to go try and make it as a musician. I figured if punk bands could just learn on the job, I could too. But my mum insisted that I needed to do something, just in case.
Amusing coincidence. I also wanted to be a rock star, or at least a successful working musician. My mom also talked me out of it. Her argument was: If there's no way to learn it in school, then go to school anyway and learn something fun, like math. Then you can still be a rock star. Or a programmer, since I had already learned programming.
So I went to college as a math major, and eventually ended up with a physics degree.
I still play music, but not full time, and with the comfort of supporting myself with a day job.
> Amusing coincidence. I also wanted to be a rock star, or at least a successful working musician.
> I still play music, but not full time, and with the comfort of supporting myself with a day job.
Some people say;
Pursue your dream or you will regret it.
This is said by people who regret their own choices.
Other people say;
Don't make your dream a job, because all it will
be is a job and no longer special.
This is said by people who had misconceptions about what pursuing their dream actually entailed.
I say;
Happiness is found in neither a dream chased nor a chosen
profession. It is instead a choice we make each day in
what we do, in how we view same, and if we allow
ourselves to possess it.
What constitutes each day is immaterial.
I do think you need the dream to be happy. Money/career is a prerequisite of dreams.
Some want to live on the seas. They can be perfectly happy as a sailor, even if poor and single.
Some want a family, educated children, respect. They would likely need a nice house, enough resources to get a scholarship, a shot at retirement. This is obtainable working in public service, even without money.
But most have multiple dreams. That's what makes things complex. The man who wishes for a wife but also wishes to be on the seas will find much fewer paths available. Sailors also don't generally get respected by most in laws.
To mix the two, they try to find the dream-job. Perhaps work for a big oil company and be 'forced' to go offshore.
Eventually people learn that desire is suffering in some form and cut down on the number of dreams. They may even see this as mature and try to educate others that this is the way. Those who have kids often are forced to pick kids as the dream. So there's a selection bias as well.
"Don't put one foot in your job and the other in your dream, Ed. Go ahead and quit, or resign yourself to this life. It's just too much of a temptation for fate to split you right up the middle before you've made up your mind which way to go".
But that's a trap, the money you "need" is partly decided by how much you have available. Once you're used to the money from a 40-hour/week day job it's hard to do with less, but other people manage fine because they never got used to having a lot.
I think PP's point was .. that even if you spend your whole life pressed into laboring to produce a surplus to satisfy the excessive consumption of the elites of your heretical society, in ways that create existential risk for future generations, and are at odds with your own inner values and moral compass..
you can still see the 'immateriality' of all that in the grand scheme of things and choose to be happy.
There is no sociopolitical statement, no call-to-arms, no pontification as to the measure of one's life, no generational implications. There is an existential consideration, but not of the nature your post implies.
Happiness is an individual choice, available to us all at any time.
If anyone at any time can simply choose to be happy, why do we care whether (for instance) our arms get chopped off? We are equally capable of choosing to be happy with or without arms. Why do we avoid harm?
Thank you, your post cured my depression. (Sarcasm)
This is pure magical thinking. There are many reasons to be not happy. Being in pain, having lost a loved one, not having your physical needs met and well simply having depression or a myriad of other problems.
And people shouldn't be happy with all circumstances. It is not healthy to be happy all the time. Sometimes accepting the negative emotions is important for growth.
> Thank you, your post cured my depression. (Sarcasm)
You're welcome. (Sarcasm returned)
> This is pure magical thinking. There are many reasons to be not happy. Being in pain, having lost a loved one, not having your physical needs met and well simply having depression or a myriad of other problems.
Of course there are many life situations where "being happy" is not what a person can or needs to experience at that moment, where "moment" is defined as some period of time determined by each person. And there are medical conditions where trying to choose happiness is simply not possible, such as "having depression or a myriad of other problems."
> And people shouldn't be happy with all circumstances. It is not healthy to be happy all the time. Sometimes accepting the negative emotions is important for growth.
I never wrote anything to that effect. What I wrote was:
Happiness is an individual choice,
available to us all at any time.
Just because a choice is available does not mandate it must be chosen immediately and unconditionally.
But you go ahead and mischaracterize what I wrote to serve whatever agenda you have and I will reiterate what I posted earlier in this thread:
My key point is that happiness is a choice.
I hope everyone can find a way to choose it.
If you have your basic needs met, have no physical ailments etc. I would agree with you. Your statement applies to a certain subset of folks that are defeatist, pessimistic etc. but not everyone.
They’re not entirely wrong, and your comment is seemingly unhinged and unprovoked… but there’s a lot of literature on stuff like mindfulness, CBT, and the impact thoughts can have on one’s emotions, especially happiness.
CBT and mindfulness can help with SOME problems. It is great when it does but it also can work less well or even harmful for other problems. Especially people that are prone to rumination don't benefit much from it, they need the opposite of mindfulness.
The unhinged part was to imply that people can just choose to be happy under any circumstance which is obviously magical thinking.
>to imply that people can just choose to be happy under any circumstance which is obviously magical thinking
Worse, I'm afraid. It's ideological thinking of the basest sort.
Magical thinking at least lets a person see that their bullshit isn't working, potentially even walk it back, correct themselves.
In ideological thinking, you gotta act as if the impossible wish has already come true. Reality says otherwise? Well, wish harder - or else. That's ideological thinking for ya.
And those are only two of the cards in that deck. I've observed that with sufficient mental self-mutilation, people can in fact choose to be happy under any circumstance. Occasionally even at no cost to innocents. (Though rarely - who'd permit them a clean getaway?)
Woulda had a field day with figuring out what complexes are puppetting AdieuToLogic, if their most coherent argument wasn't "fuck off" - pardon, "full stop".
>> to imply that people can just choose to be happy under any circumstance which is obviously magical thinking
I never said nor implied that. It has only been the person with the account name "cardanome" who has applied absolutist determiners such as "all" and "any" to mischaracterize what I wrote.
> Worse, I'm afraid. It's ideological thinking of the basest sort.
Projection is a poor position to espouse and one easily identified such as the above, further supported by your previous assertion of "[w]hat you're promoting is a deeply narcissistic worldview."
Reread what I originally wrote in this thread objectively, if either you or "cardanome" can:
I say;
Happiness is found in neither a dream chased nor a chosen
profession. It is instead a choice we make each day in
what we do, in how we view same, and if we allow
ourselves to possess it.
What constitutes each day is immaterial.
But that's just me.
This is what is called a personal philosophy[0], specifically:
2 a : pursuit of wisdom
b : a search for a general understanding of values and
reality by chiefly speculative rather than
observational means
> Woulda had a field day with figuring out what complexes are puppetting AdieuToLogic, if their most coherent argument wasn't "fuck off" - pardon, "full stop".
I was directly replying to this[1] post, which contains phrases such as "I think PP's point was ..", "elites of your heretical society", and "at odds with your own inner values and moral compass".
If you and/or "cardanome" cannot comprehend why I finished with "full stop" in response to this post, then there is nothing I can do to help either or both of you understand.
Reread your own post some more. How you found it necessary to exhibit a couple of universally known pieces of horseshit so that your original piece would make sense as some sort of reaction to those. Outstanding.
There's a reason why you're finding it necessary to explain what a personal philosophy even is. Think about that before you go ooh wizdum (and if you have a spare dictionary, throw it up.)
>No, my point is happiness is a choice.
If happiness was a choice, there would be no point to happiness.
I could of course explain exactly what my words "hinge" on, and what "provokes" them. I've found that this does not create understanding where previously it was lacking. So instead let's talk about what you said.
Two of your words I consider harmful and insulting:
>is seemingly
What the hell?!
...oh, right:
- If you say "X is Y", you gotta back it up. Scary!
- If you say "X seems to me Y", you gotta justify your perceptions. Nasty!
- But saying "X is, seemingly, Y", that's totally safe! Because it's bullshit. It posits a statement as true knowledge and elidies the need for justification outright, on the syntactic level.
What's worse, you probably didn't even notice you were doing this. You just picked up the pattern from people who looked like they had what you wanted.
That cognitive habits like yours are so widely accepted as "normal", is exactly why I'm guessing that CBT (or, for that matter, parent poster's wireheading suggestions) are probably going to be super effective on you, not kidding.
If you were to give those a shot, anyway. Instead of, you know, just stating existences of literatures at people. Also unless your current state of mind wasn't already achieved by similar methods. In any case, do report back!
Easy to say it's immaterial when you're probably an American or European with plenty of material comfort. When was the last time you didn't eat for lack of food, for instance? Adieu to logic, indeed.
Luckily you can still pursue being a musician without all the pressure of having to be successful. On this road, one day you are free to declare your own success to yourself
Indeed. On the other hand, I also know my limitations, since roughly half the people I play with are pro's with music degrees. And I'm still trying to improve.
I'm inspired by the quote from Pablo Casals when he was in his 90s. They asked him why he still needed to practice, and he said: "Because I'm finally beginning to see some improvement."
Maybe if the internet and piracy hadn't fucked artists over, they could have made decent money as a musician selling their work without having to be a major-label superstar. Alas, we do not live in that timeline.
Yes. Mostly, until relatively recently on a historical time scale. In the middle ages, musicians were employed by towns, and had a guild. They also worked for princes, the church, etc. I read an article saying that they often did double duty as cops on market days.
There was perhaps less of a distinction between "arts" and trades. People did all kinds of work on paintings, sculptures, etc., and expected to get paid for it. They rarely put their names on their works.
I've read a bit about Bach's life, and he was always concerned about making money.
One music history textbook I read identified the invention of printed music as the start of the "music industry." Before the recording era, people composed and published sheet music. There were pianos in middle class parlors, and people bought sheet music to play. Two names that come to mind were Scott Joplin and Jelly Roll Morton. Movie theaters hired pianists or organists, though that employment vanished when talking movies came out. The musicians of the jazz era were making their livings from music. One familiar name is Miles Davis. His father was a dentist, and his parents considered music to be a respectable middle class career for their son. People did make a living from recordings before the Internet era. Today, lucrative work in the arts still exists for things like advertising and sports broadcasting.
(Revealing my bias, I'm a jazz musician).
In fact the expectation that an artist should not earn a decent living is kind of a new thing.
Piracy didn't fuck artists over I think (anecdotal), because it was the precursor to Spotify which has been great for artist discovery. Until the industry / artists caught on and started pushing shit. And the payment model for Spotify is bad, a million streams earns about $3-5K according to a quick google and few actually get that far.
But it's good for discovery, and artists generally don't make much off album sales either; concerts and merchandise is where it's at.
Really still kicking myself for not majoring in robotics in school. I wanted to program, so I studied computer engineering but hadn't really absorbed that much in classes. But I will likely never have access to all the robotics stuff my school had, nor the guided learnings.
Never too late to try stuff out of course, but very little beats structured higher-ed education in relatively small classes (I think there were only about 24 people in the robotics major?)
Nothing beats concentrated work. You can do that without formal education. It might even be easier: you can probably afford pretty good arduinos and raspberries and H-bridges and sensors and actuators and...
It shouldn't be hard to go beyond what almost all universities provide.
On the other hand, the one robotics course I took involved getting access to computers at 3am and doing horrific matrix multiplications by hand that took hours. Of course, this was a long time ago.
While I generally agree with the conclusion of that, I think it might be a bit too naive.
The quantity group has a trivial way to "hack" the metric. I can just sit there snapping photos of everything. I could just set up a camera to automatically snap photos all day and night. To be honest, as long as I'm not pointing it at a stationary wall, there's probably a good chance I get a good photo, since even a tiny probability becomes an expected result given enough samples.
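To put rough, made-up numbers on that intuition: say a blind snapshot has a 0.1% chance of being a keeper and you rattle off 2000 of them over the term.

    P(at least one keeper) = 1 - (1 - p)^n
                           = 1 - 0.999^2000
                           ≈ 1 - e^(-2) ≈ 0.86

So the metric is very hackable unless the grading accounts for how the photos were taken.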
But I think the real magic ingredient comes from the explanation
> The group never worried about the quality of their work, so they spent time experimenting with lighting, composition, and such.
The way I read this is "The quantity group felt assured in their grade, so used the time to be creative and without pressure." But I think if you modified the experiment so that you'd grade students on a curve and in proportion to the number of photos they took then the results might differ. Their grade might not feel secure as it could take just one person to communicate that they're just snapping photos all day as fast as they can. In this setting I think you'd have even less ability to explore and experiment than the quality group.
I do think the message is right though and I think this is the right strategy in any creative or primarily mental endeavor (including coding). The more the process depends on creativity the more time needs to be allocated to this type of exploration and freedom. I think in jobs like research that this should be the basis for how they are structured and effectively you should mostly remove evaluation metrics and embrace the fundamentally ad hoc nature. In things like coding I think you need more of a mix and the right mix depends highly on the actual objectives. But I wanted to make the above distinction because I think it is important if we're trying to figure out what those objectives are.
This is an interesting take. It maps decently to the "first to market" mentality of a lot of programming these days. For sure, we'd get better programmers if they were judged by the quantity of code they produced in their personal projects, but if "quantity learners" are simply forced to churn out bad code without any reflection or the time to experiment, then I would agree that it seems pretty naive to think they'll ever improve... except at a handful of patterns that make bad code sort of work.
Also think about software recently. Is it actually better? IME I face more bugs than ever. Trivial ones too, that are clearly being deprioritized but are relatively easy to solve.[0]
Do an experiment for me. Write down every bug you face today. Even small. I think you'll be surprised at how many there are and even more at how many are likely simple to solve.
I know it's not just me as so many around me are getting increasingly frustrated with their devices. It's not the big things, it's a thousand paper cuts. But if you just look at a paper cut in isolation, it isn't meaningful. That's the problem and why they get ignored. But they get created because we care more about speed than direction. I'd argue that's a good way to end up over a cliff.
I mean, let's leave phones out of this for a moment and look at PCs...
When was the last time your PC operating system crashed?
When was the last time your applications you use on your PC crashed?
When was the last time you could not find an application for your PC that did what you needed to accomplish?
The early days of software absolutely sucked. Crashes, data loss, and limitations were the name of the game. The big things like data corruption were constant problems. You didn't notice the small problems because you were fighting the big ones. The problem spaces software was solving for people were also relatively small. Now it's easy to find small bugs everywhere because software ate the world. It's harder to name something software has not expanded into than what it has, and yet we are still not near the boundaries of exploration of what software can do. When a system has not discovered its boundaries, speed will almost always win over direction.
> When was the last time your PC operating system crashed?
I daily drive both Linux and Mac. Last year I had a Windows laptop from work.
For Linux? I'm not sure, but definitely longer than 6 mo. The last crash I remember was my fault and quite a while ago.
For OSX? Last week. The most common one is when I close my laptop, go "oops, need to do X real quick", and when I open it up the screen doesn't come up. Not sure if it's a backlight issue but I don't think it is. It doesn't seem to be logging in and I can't get keys like volume to respond where I'd have feedback. Flashlight trick doesn't work. If I wait, the laptop reboots itself and half the time I do get a crash report (I suspect this happens more than I think, too, since occasionally I'll come to my laptop and it was rebooted. Like back from lunch or a quick break, not overnight. Less frequently I find crash reports). Happened since day 1 and even on the last MacBook I had. I get this error at least twice a month. Also can happen when disconnecting my monitor.
Windows? Jesus Christ, how do people live like that? Arch was more stable a decade ago.
> When was the last time your applications you use on your PC crashed?
On Linux, two weeks ago I had a crash while playing cyberpunk (the most optimized game there ever was... except maybe StarField). Last week Silksong had a soft error where the joystick stopped responding when my wireless controller issued the low battery warning. Outside that, I can't think of anything other than when I accidentally run a sim calling too many resources, but that's not common and I'm not sure it counts.
On Mac, at least every two weeks. I feel like 6mo ago it was more like once a month though. I've been interviewing lately and Teams is definitely a bigger problem than Zoom. I think Firefox has crashed twice in the last 6 months? I'm also a tab hoarder. But mail crashed way more early on so I switched back to Thunderbird and while it doesn't crash I'm sure there is a small memory leak. I'll restart like once every 2 weeks because it'll start pushing a gig of ram. I know, I'm picky, but my email client shouldn't ever use a gig of ram. And I was writing my PhD thesis a few months back and preview would occasionally crash. Zathura had no issues. Slack definitely more frequently than FF.
Windows? Just about every day. I was able to reduce errors once the IT guy informed me that Windows Hello being used for the login was why Outlook was constantly crashing. Switching to a typed password did a lot. But after that it still reminded me of the days where I was learning Linux and distro hopping.
> When was the last time you could not find an application for your PC that did what you needed to accomplish?
Daily? Okay, but probably weekly?
Less of a problem on Linux. Often it's small things so I can write a quick shell script. I can find most things I want on the AUR but often I'll build from source.
OSX? That depends. Do I have to pay? If so, IDK. I'm not willing to pay a subscription fee for what's a glorified shell script.
Windows? Work laptop so wasn't really looking.
I generally agree, things are getting better but the problem wasn't that big 10-20 years ago ime. I can only think of 2 instances where I had data losses and both were on Linux laptops while distro hopping over 15 years ago. In both cases I was able to recover too. Yeah maybe the issue was bigger in the pre Windows 95 days but that's way back and hardware has also made significant strides. Don't give software the credit for better hardware.
But I think you're missing an important question:
How often do programs have unintended or unexpected behavior?
That is not a crash but is a bug. The calendar issue I mentioned above is not unique to my phone. Later on I also mention how I had essentially no means to merge contacts despite identical names, nicknames, phone numbers, and birthdays (differing only on email and notes). WHO THE FUCK THINKS "FIND CONTACTS" IS A BETTER SOLUTION THAN CLICKING TWO ENTRIES AND THEN CLICKING "MERGE"? And guess what, it couldn't find the duplicates. This is a trivial database problem! I was able to eventually find a way to merge after some googling. But I discovered the duplication because my gf had 3 entries on my calendar for her birthday. When I merged, she ended up with 4! Again, that is a trivial database problem.
This category of problem is greatly increasing in frequency. I know you might think it's a comparison bias, and I think that's a reasonable guess, but it isn't. I've definitely "infected" my friends and family so they're more "sensitive" to bugs but years after that they agree that their devices are becoming harder to use for the same tasks they did before. Either through little bugs like this piling up or having some new features being pushed on them that they don't want and disrupts their workflow. That last one is really common.
So I don't want to dismiss you, because I don't think you're entirely wrong. But also I think you're being too quick to dismiss me. Your justification isn't a complete explanation of my experience. Nor does it account for how programs are getting slower and heavier. I mean God damn, how many seconds does it take for Microsoft Word to open (cached? Not cached?) and how many gigabytes of RAM does it use? Those shouldn't even be the units of measurement we should be discussing! Both are at least a magnitude too large. I'm absolutely certain programs are bloating. And I think it should be unsurprising that with bloat comes more opportunities for errors.
This is also the type of problem I expect average people to not notice as it happens more slowly and if you don't understand computers it's easy to believe it's just the way it is. But we're on HN, we do. We know better. We also know how the sausage is made and we've experienced the increasing pressure for speed and seen the change in how programs and programmers are evaluated. I do not think you can discount these effects.
It shouldn't be a surprise that moving faster correlates strongly with making more mistakes. You don't deny the optimization for speed, but do you really think you can accelerate for free? There's a balance and I'm confident we've sped through that equilibrium.
I both agree and disagree with this post, but I might be misunderstanding it.
Near the end, it states:
“Enjoy writing it, it doesn’t have to be nice or pretty if it’s for you. Have fun, try out that new runtime or language.”
It doesn’t have to be nice or pretty EVEN if it’s NOT for you.
The value in prototyping has always been there and it’s been very concrete: to refine mental models, validate assumptions, uncover gaps in your own thinking (or your team’s), you name it.
Unfortunately it feels that the pendulum has swung in the completely opposite direction. There’s a lot of “theatre” in planning, writing endless tickets and refining them for WEEKS before actually starting to write code, in a way that’s actively harmful for building software.
When you get stuck in planning mode you let wrong assumptions grow and get baked into the design, so the sunk cost keeps rising.
Simply have a BASIC and SHARED mental model of the end goal with your team and start prototyping. LLMs have made this RIDICULOUSLY CHEAP. But, the industry is still stuck in all the wrong ways.
> It doesn’t have to be nice or pretty EVEN if it’s NOT for you.
> There’s a lot of “theatre” in planning, writing endless tickets and refining them for WEEKS before actually starting to write code, in a way that’s actively harmful for building software.
I'd love to have a "high paying job" where I am allowed to start prototyping and modelling the problem and then iteratively keep on improving it into fully functional solution.
I won't deny that the snowballing of improvements and functional completeness manifests as acceleration of "delivery speed" and as a code-producing experience is extremely enjoyable. Depth-first traversal into curiosity driven problem solving is a very pleasurable activity.
However, IME in the real world, someone up the chain is going to ask "when will you deliver this". I have only ever once been in a privileged enough position in a job to say "I am on it and I will finish it when I finish it... and it will be really cool".
Planning and task breakdown, as a developer, is pretty much like my insurance policy. Because when someone up the chain (all the way down to my direct manager) comes asking "How much progress you have made ?" I can say (OR "present the data" as it is called in a certain company ?) "as per the agreed plan, out of the N things, I have done k (< N) things so far. However at this (k+1)th thing I am slowing down or blocked because during planning that-other-thing never got uncovered and we have scope-creep/external-dependency/cattle-in-the-middle-of-the-road issue". At which point a certain type of person will also go all the way to push the blame to another colleague to make themselves appear better hence eligible for promotion.
I would highly encourage everyone to participate in the "planning theatre" and play your "role".
OR, if possible start something of your own and do it the way you always wanted to do it.
I feel like this is the time to mention "How Big Things Get Done", by Bent Flyvbjerg. "Long planning vs. start prototyping" is a false dichotomy. Prototyping IS planning.
Put another way, refining tickets for weeks isn't the problem; the problem is when you do this without prototyping, chances are you aren't actually refining the tickets.
Planning stops when you take steps that cannot be reverted, and there IS value in delaying those steps as much as possible, because once you take them your project becomes vulnerable to outside risk. Long planning is valuable because of this; it's just that many who advocate for long planning would just take a long time and not actually use that time for planning.
For my money, certain types of software shouldn't have tests, too much planning, or any maintenance whatsoever
Prototypes (start ups) rarely have the luxury of "getting it right", their actual goal is "getting it out there FAST to capture the market (and have it working enough to keep the market)"
(Some) game devs - apologies, but I'm not enough of a game dev to say what types this applies to - more or less build it, ship it, and are done with it; players tend to be forgiving of most bugs, and they move on to the next shiny thing long before it's time to fix all the things.
Once the product has traction in the market, and you have paying customers, then it's time to deal with the load (scale up) and bugs, I recall reading somewhere that it's probably best to drop the start up team, they did their job (and are now free to move on to the next brilliant idea), and replace them with a scale up team, who will do the planning, architecting, and preparation for the long term life of the software.
I think that approach would have worked for Facebook (for example): they had their PHP prototype that captured the market very quickly, and (IMO) they should have moved on to a scale-up team, who could have used the original code as a facade, strangling it and replacing it with something funky (Java/C++ would have been what was available at the time, but Go would be what I would suggest now).
Yesterday I spent the entire day working on a lib to create repos in Github from inside Emacs.
It was the first time in 3y that I had touched it.
When googling, I saw potential candidates that were much better than my simple one.
But I kept going, for the pleasure of making my own thing.
I learned a lot, and felt very accomplished, even if, at the end, it was messy, and I'll have to go back and reorganize it.
It feels like making _my_ thing, even if it is like drawing my own copy of the Mona Lisa.
The graphic design version of this is "Get it done, make it beautiful," but it works for code too.
I used to get hung up on things like doing a loop when a ternary operator would work. "Somebody is going to see this and be rude about it." But sometimes you write code how you're thinking about the problem at the time. And if you think of it as a loop, or a series of if statements, or whatever, do it that way.
If it makes you feel better, note it in a comment to revisit later. And if somebody is rude about it, so what. It's not theirs, it's yours.
I agree, but there are certain types of unnecessary stupidity which feel easier at first, but hurt more than they help very quickly (measured in amount of code):
The first one that comes to mind relates closely to naming. If we think about a program in terms of its user facing domain, then we might start to name and structure our data, functions, types too specifically for that domain. But it's almost always better to separate computational, generic data manipulation from domain language.
You only need a _little bit_ more time to move much of the domain specific stuff into your data model. Think of domain language as values rather than field names or types. This makes code easier to work with _very quickly_.
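A minimal sketch of the difference, using a made-up mini-domain (invoices) purely for illustration:

    #include <map>
    #include <string>

    // generic, computational part: knows nothing about the domain
    using LineItems = std::map<std::string, double>;

    double total(const LineItems& items) {
        double sum = 0.0;
        for (const auto& kv : items) sum += kv.second;
        return sum;
    }

    // the domain words ("net", "vat", "shipping") live in the data as values,
    // not as field names or types (contrast: struct Invoice { double net, vat, shipping; })
    const LineItems invoice = {{"net", 100.0}, {"vat", 19.0}, {"shipping", 4.90}};

The totalling code never has to change when the domain grows a new kind of amount.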
Another stupidity is to default to local state. Moving state up requires a little bit of planning and sometimes refactoring and one has to consider the overall data model in order to understand each part. But it goes a long way, because you don't end up with entangled, implicit coordination. This is very much true for anything UI related. I almost never regret doing this, but I have regretted not doing this very often.
A third thing that is unnecessarily stupid is to spread around logic. Harder to explain, but everyone knows the easy feeling of putting an if statement (or any kind of branching, filtering etc.) that adds a bunch of variables somewhere, where it doesn't belong. If you feel pressed to do this, re-consider whether your data is rich enough (can it express the thing that I need here) and consistent enough.
> we might start to name and structure our data, functions, types too specifically for that domain.
I once worked on a Perl script that had to send an email to "Harry". (Name changed to protect the innocent). I stored Harry's email address in a variable called "$HARRY".
Later on a second person (with a different name) wanted to get the emails as well. No problem, just turn the scalar into an array, "@HARRIES".
> It’s small, it’s dumb, and there were probably plenty of options out there.
Oh, this sort of "dumb" code. That is just exercise. It bothers me that in this field we don't think we should rehearse and exercise and instead use production projects for that.
Actual dumb code is one that disregards edge cases or bets on things being guaranteed when they're not.
One thing I have found to be a very valuable habit is to first think about what your software has to do on paper and draw some shitty flow charts, lists and other things, without too much care about whether you will do it (especially if it isn't software that you strictly need to do for some reason).
Whether an idea is good or not can often only be judged when it becomes more concrete. The actual finished project is as concrete as it gets, but it takes time and work to get there. So the next best thing is to flesh it out as much as possible ahead and decide based on that whether it is worth doing it that way.
Most people have the bad habit of being too attached to their own ideas. Kill your darlings. Ideas are meant to be either done, shelved or thrown into the bin. It doesn't do any good to roll them around in your head forever.
@author the blog scales poorly on smaller devices. The header doesn't fit the screen, margins are too big and lines are too crammed (line height needs a bit more love).
I do not believe that the real struggle is "starting" nowadays, since AI gives the impression it can complete 90% of a task. We struggle in architecting the whole thing we want to start.
You should write stupid code, but you should write good code too.
Writing stupid code is like walking to the shop. You're not going to improve your marathon time, but that's not the point. It's just using an existing skill to do something you need to do.
But you should also study and get better at things. If you learnt to cycle you could get to the shop in a third of the time. Similarly, if you learn new languages, paradigms, features etc. you will become a more powerful programmer.
Is LLM output the kind of clever we're talking about here? I always thought the quote was about abstraction astronautics, not large amounts of dumb just-do-it code.
It applies to LLM code, but if you take the law at face value, it's a very damaging one. Cleverness should be used to make your code easier to verify, not harder.
He said it with a very specific idea in mind, and like most of software engineering "laws", if you know enough to know when to apply it, you don't need the law.
No, it just means you'll be spending extra time debugging it. The most clever code is often cleverness which isn't from you, but derived from the field over time.
I believe "stupid code" is useful for sticking concepts or quick prototypes together.
But for strategic decisions, having a well-researched document (a PRD or similar) helps as a starting point for iteration, and the approach you take will be influenced by your team's culture.
I like this philosophy. It's interesting to me that the author writes about trying deno, specifically out of curiosity for compiling binaries with it, because that is something that's been specifically tickling the back of my mind for a while now, but I've had no real reason to try it. I think this gave me the motivation to write some "stupid" code just to play with it.
It's like riding a bike. You need to start in a low gear and get some momentum - even if you are going in circles. Starting from zero in the highest gear is difficult and hard to balance. Once you have some speed, everything gets easier.
Even if the code is for yourself, for a collaborative team on a project, or for a company, the quality matters. Also, software replicability, reproducibility and reliability are significant indicators of viable code and guaranteed results.
I get more done by writing the stupid code, and fixing it, than by junking the old code... but every now and then I can see clearly a structure for a rewrite, and then I rewrite, but it's rare.
I feel like people should be writing stupid code, and in the case where it's a compiled language, we should ask the compiler or the language for better optimization. The other day, I was writing a check that a struct has certain values (protobuf probably has something like this):
    struct S { int a; int b; int c; int d; int e; /* about 15 more members */ };

so I wrote

    const auto match_a = s.a == 10;
    const auto match_b = s.c == 20;
    const auto match_c = s.e == 30;
    /* about 15 more of these */

    if (match_a && match_b && match_c) { return -1; }
Turns out compilers (I think because of the language) totally shit the bed at this. They generate a chain of 20 if-elses instead of a mask using SIMD or whatever. I KNOW this is possible, so I asked an LLM, and it was able to produce said code that uses SIMD.
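For what it's worth, one portable nudge that sometimes helps here, without hand-written intrinsics (a hedged sketch building on the struct S above; I haven't benchmarked it): replace the short-circuiting && with bitwise &, which removes the ordering requirement and lets the compiler evaluate all the comparisons up front.

    int check(const S& s) {
        // bitwise & on the comparison results: no short-circuit, so the
        // compiler is free to compute every comparison and combine them
        // branchlessly (sometimes with SIMD) instead of emitting a branch chain
        const bool match =
            (s.a == 10) &
            (s.c == 20) &
            (s.e == 30);   /* & about 15 more of these */
        if (match) { return -1; }
        return 0;
    }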
I'm working on in-kernel ext3/4fs journalling support for NetBSD. The code is hot garbage but I love it because of the learning journey it's taken me on: about working in a kernel, about filesystems, etc. I'm gonna clean it up massively once I've figured out how to make the support complete, and even then I expect to be raked over the coals by the NetBSD devs for my code quality. On top of that, there's the fact that real ones use ZFS or btrfs these days, and ext4 is a toy like FAT by comparison, so this may not even be that useful. But it's fun and lets me say: hey Ma, I'm a kernel hacker now!
Ext4 is most certainly still in use and not a toy. It's trusted. It takes a lot for folks to adopt a new file system.
I worked on a research topic in grad school and learned about holes in files, and how data isn’t removed until the last fd is closed. I use that systems knowledge in my job weekly.
A tip. Kernel development can be lonely, share what you are working on and find others.
Partially true, in that on Safari on iOS, you can use that to enlarge the text. But that doesn't change what's really broken about the layout, which is that it forces the column width to allow only a small number of words-per-line, which is what makes for uncomfortable reading. Another Safari-on-iOS option would be to use the built-in "Reader" function, which re-flows the text into a cleaner layout.
Maybe you don't read much, but it's obvious they weren't making some universal statement about code. They are referring to the code you write when you are just experimenting by yourself, for yourself. The point is to not let irrelevant things like usefulness, quality, conventions, etc. limit just tinkering and learning.
I think the people who think there is no stupid code don't actually ever witness truly bad code. The worst code that they come across is, at worst, below average. And since that's the worst they see, it gets mentally defined as bad.
I think that's basically an impossibility, unless the only code they look at is from people who have 5 minutes of coding experience and attempt to get working code from vibes (without the LLM). Even suggesting this makes me think you haven't even seen truly stupid code.
I'm talking code from people with no programming experience, trying to contribute to open-source mod projects by pattern matching words they see in the file. They see the keyword static a lot, so they just put static on random things.
And reading the Linux kernel mailing list would allow him to... do what exactly? And by when? Compared to writing simple, working, usable apps in TypeScript, immediately after reading about how Deno/TypeScript/etc. work?
Linux still works by email-submitted patches, the workflow for which git was originally designed.
And if an unacceptable patch made it to Linus's desk, someone downstream hasn't been doing their damn job. The submaintainers are supposed to filter the stupid out, perhaps by more gentle guidance toward noob coders. The reason why Linus gets so angry is because the people who let it through should know better.
For sure. I'd argue to write the "stupid" code to get started, get that momentum going. The sooner you are writing code, the sooner you are making your concept real, and finding the flaws in your mental model for what you're solving.
I used to try to think ahead, plan ahead and "architect", then I realized simply "getting something on paper" corrects many of the assumptions I had in my head. A colleague pushed me to "get something working" and iterate from there, and it completely changed how I build software. Even if that initial version is "stupid" and hack-ish!
I think this is mostly true, but also I’d highlight the necessity of having a mental model, and iterating.
I think it is common for a programmer to just start programming without coming up with any model, and just try to solve the problem by adding code on top of code.
There are also many programmers who go with their first “working” implementation, and never iterate.
These days, I think the pendulum has swung too far from thinking about the program, maybe mapping it out a bit on paper before writing code.
My philosophy:
1. Get it working.
2. Get it working well.
3. Get it working fast.
This puts the "just get it working" as the first priority. Don't care about quality, just make it. Then, and only once you have something working, do you care about quality first. This is about getting the code into something reasonable that would pass a review (e.g., architectually sound). Finally, do an optimization pass.
This is the process I follow for PRs and projects alike. Sometimes you can mix all the steps into a single commit, if you understand the problem&solution domain well. But if you don't, you'll likely have to split it up.
> Finally, do an optimization pass.
Depending on how low-level your code is, this... may not work out in those terms.
In other words, I’d say that if you actually want good software—and that includes making sure its speed falls within a reasonable factor of the napkin-math theoretical maximum achievable on the platform—your three steps can easily constitute three entire rewrites or at least substantial refactors. You might well need to rearchitect if the “working well” version has multiple small loops split by domain-level concern when the hardware really wants a single large one, or if you’re doing a lot of pointer-chasing and need to flatten the whole thing into a single buffer in preorder, or if your interface assumes per-byte ops where SIMD can be applied.
This is not a condemnation of the strategy, mind you. Crap code is valuable and I wish I were better at it. I just disagree that the transition from step 2 to step 3 can be described as an optimization pass. If that’s what you limit yourself to, you’ll quite likely be forced to leave at least an order of magnitude’s worth of performance on the table.
And yes, most consumer software is very much not good by that definition.
(For instance, I’m expecting that the Ladybird devs will be able to get their browser to work well for daily tasks—which I would count a tremendous achievement—but I’m not optimistic about it then becoming any faster than the state of the art even ten or fifteen years ago.)
Some optimization problems require an entire PHD dissertation and research budget to actually optimize, so some algorithms require far more effort applied to this than is reasonable for most products. As mentioned, sometimes you can combine these all into one step -- when you know the domains well.
Sometimes, it might even be completely separate people working on each step... separated by time and space.
In any case, most software generally stops at (2) simply due to the fact that any effort towards (3) isn't worth the effort -- for example, there's very little point in spending two weeks optimizing a report generation that runs in the middle of the night, once a month. At some point, there may be, but usually not anytime soon.
This should be carved in stone on every campus computer science building.
https://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast
This somewhat depends on how big of a program/application you are making.
Again, this is something I see bite enterprise-style applications quite often, as they can be pushed out piecemeal: you get things like the datastore/input APIs/UI to the customer quickly, then over the next months things like reporting, auditing, and fine-grained access controls get put in, and suddenly you find yourself stuck working around major issues where a little bit of up-front thinking about the later steps would have saved you a lot of heartache.
This is where 'knowing the domain' lets you put a ton of stuff in all at once. If you have no clue what you're doing, you have to learn the lesson you're talking about. As long as you can avoid joining teams that haven't learned this lesson (and others like it), you'll be fine.
I once joined a team where they knew they were going to do translations at some point ... and the way they decided to "prepare" for it was absolutely nonsensical. It was clear none of them had ever done translations before, so when it came time to actually do the work, it was a disaster. They had told product/sales "it was ready" but it didn't actually work -- and couldn't ever actually work. It required redesigning half the architecture and months of effort across a whole team to get it working. Even then, some aspects were completely untranslatable, which took an additional 6-8 months of refactoring to fix.
So, another lesson is to not try to engineer something unless your goal is to "get it working". If you don't need it, it is probably still better to actually wait until you need it.
Slight variation:
- Make it work
- Make it right
- Make it fast
This is fantastic, but how do you communicate this within your organization to peers, and not allow the pace of the organization to interfere? For example, I can see many teams stopping after step 1.
You write a tech debt ticket and move on.
I've used a similar mantra of "make it work, make it pretty, make it fast" for two decades.
I think I've had to get to step 3 once and that was because the specs went from "one device" to "20 devices and two factories" after step 1 :D
And in some cases, there isn't a reason to continue to step 2 or 3. Software generally has a shelf-life. Most businesses write code that should be rewritten every 5-10 years, but there's that kernel of code that _never_ changes... that's the code that really needs steps 2 and 3. The rest probably only runs occasionally and doesn't explicitly need to be extensively tested and fast.
Agreed. Having worked the range of boring backend systems to performance critical embedded systems, only few areas are worth optimizing for, and we always let data inform where to invest additional time.
I much prefer a code base that is readable and straightforward (maybe at the expense of some missed perf gains) over code that is highly performant but hard to follow/too clever.
> These days, I think the pendulum has swung too far from thinking about the program, maybe mapping it out a bit on paper before writing code.
Sometimes I think about code structure like a sudoku where you have to eliminate two possibilities by following through what would happen. Writing the code is (to me) like placing the provisional numbers and finding where you have a conflict. I simply cannot do it by holding state in my head (ie without making marks on the paper).
It could definitely be a limitation of me rather than generally true.
Totally agree. Iteration is key. Mapping things out on paper after you've written the code can also be illuminating. Analysis and design don't imply one-and-done Architect -> Implement waterfall methods.
Knowing hard requirements up front can be critical to building the right thing. It's scary how many "temporary" things get built on top of and stuck in production. Obviously loose coupling / clear interfaces can help a lot with this.
But an easy example is that "just build the single-player version" (of an application) can be worse than just eating your vegetables. It can be very difficult to tack on multiplayer, as opposed to building for it up front.
I once retrofitted a computer racing game/sim from single-player to multi-player.
I thought it was a masterpiece of abusing the C pre-processor: all variables used for player physics, game state, inputs, and position outputs to the graphics pipeline were guarded with macros, so that as the (overwhelmingly) single-player titles continued to be developed, the code would remain clean for the two titles we hoped to ship with split-screen support.
All the state was wrapped in ss_access() macros (“split screen access”) and compiled to plain variables for single-player titles, but with the variable name changed so writing plain access code wouldn’t compile.
I was proud of the technical use/abuse of macros. I was not proud that I'd done a lot of work and imposed a tax on the other teams, all for a feature that producers wanted but that never shipped: we never released a single split-screen title. One console title was cancelled (Saturn) and one (mine) shipped single-player only (PlayStation).
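For anyone curious what that pattern looks like, here's a hypothetical reconstruction (all names invented; this is only a sketch of the scheme described above, not the original code):

    /* Sketch of the split-screen wrapping scheme, as I understand it. */
    #ifdef SPLIT_SCREEN
        static int ss_current_player = 0;                  /* set by the game loop */
        #define ss_declare(type, var)  type ss_##var[2]    /* one slot per player  */
        #define ss_access(var)         ss_##var[ss_current_player]
    #else
        /* Single-player build: compiles to a plain variable, but under a mangled
           name so that raw accesses like `speed = 0;` fail to compile. */
        #define ss_declare(type, var)  type ss_##var
        #define ss_access(var)         ss_##var
    #endif

    ss_declare(int, speed);

    void reset_player(void) {
        ss_access(speed) = 0;   /* works in both builds */
        /* speed = 0;              would not compile in either build */
    }

The mangled name is what gives the single-player builds their safety net: the macro costs nothing at runtime, but forgetting it is a compile error rather than a latent split-screen bug.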
Pain aside, this sounds like an absolute blast.
Even more of a blast was this anecdote from the same effort:
https://news.ycombinator.com/item?id=33963859
That's a great point, and I feel like it is relevant for a lot more than games.
We should definitely have a plan before we start, and sketch out the broad strokes both in design and in actual code as a starting point. For smaller things it's fine to just start hacking away, but when we're designing an entire application I think the right way to approach it is to plan it out and then solve the big problems first. Like multiplayer.
They don't have to be completely solved; it's an iterative process, but they should be part of the process from the beginning.
An example from my own work: I took over an app two other developers had started. The plan was to synchronize data from a third party to our own db, but they hadn't done that. They had just used the third-party api directly. I don't know why. So when they left and I took over, I ended up deleting/refactoring everything, because everything was built around this third-party api and there were a whole bunch of problems related to that and to how they were using the third party's data structure directly rather than shaping the data the way we wanted it. The frontend took 30-60+ seconds to load a page because it was making like 7 serialized requests, waiting for a response before sending the next one, and the backend did the same thing.
Now it's loading instantly, but it did require that I basically tear out everything they'd done and rewrite most of the system from scratch.
In many projects it's impossible to know the requirements up front, or they are very vague.
Business requirements != programming requirements/features.
Very often both the business requirements and the programming requirements change a lot, since unless you have already written this exact thing before, in the exact form you are making it now, you will NEVER get it right the first time.
The problem is people don't adapt properly. If the business requirements change so much that it invalidates your previous work then you need to re-do the work. But in reality people just duct tape a bunch of workarounds together and you end up with a frankensystem that doesn't do anything right.
It is possible to build systems that can adapt to change, by decoupling and avoiding cross cutting concerns etc you can make a lot of big sweeping changes quite easily in a well designed system. It's just that most developers are bad at software development, they make a horrible mess and then they just keep making it worse while blaming deadlines and management etc.
You are both right and that's why so many projects are over budget or even fail miserably.
This is why I hate software engineering as a profession.
You're going to write the "stupid code" to get things out the door, get promoted and move on to another job, and then some future engineer has to come along and fix the mess you made.
But management and the rest of the org won't understand why those future engineers are having such a hard time, why there's so much tech debt, and why any substantial improvements require major rework and refactoring.
So the people writing the stupid code get promoted and look good, but the people who have to deal with the mess end up looking bad.
Sure, that sucks. You know what else sucks? The engineer who does nothing but foundation building and then is surprised that reality doesn't align with their meticulously laid out assumptions.
An engineer that does nothing but foundations can still be a damn good geotechnical engineer.
A foundation that isn't useful to build atop is just a shitty foundation. Everyone is taking it for granted that building a good foundation is impossible if you haven't built a shitty foundation for the same building first, but that's not the only way to do things.
The analogy is strained. Software is closer to a food recipe than a building. Trying to make a 3-layer strawberry banana cake with pineapple frosting? You are going to have to bake something and taste it to see if your recipe is any good. Then make some adjustments and bake some more.
Is the argument here that a skilled chef has no better way to make good food than unguided trial and error? That's obviously not true, as the abundance of "random ingredient" cooking challenges will attest.
I mean, you can write the stupid code to get something working, and not submit the code until you've iterated on it.
> I used to try to think ahead, plan ahead and "architect"
Depends on what you do. If you build a network protocol, you'd better architect it carefully before you start building upon it (and maybe others do, too).
The question is: "if I get this wrong, how much impact does it have?". Getting the API of a core service wrong will have a lot of impact, while writing a small mobile app won't affect anything other than itself.
But the thing is, if you think about that before you start iterating on your small app, then you've already taken an architectural decision :-).
> I'd argue to write the "stupid" code to get started, get that momentum going.
Yes and no, depending on how dependent you become on that first iteration, you might drown an entire project or startup in technical debt.
You should only ever just jump in if:
A) it's a one off for some quick results or a demo or whatever
B) it's easy enough to throw away and nobody will try to ship it and make you maintain it
That said, having so much friction and analysis paralysis that you never ship is also no good.
or C): You cultivate a culture of continuous rewrite to match updated requirements and understandings as you code. So, so many people have never learned that, but once you do reach that state, it is very liberating as there will be no more sacred ducks.
That said, it takes quite a bit of practice to become good enough at refactoring to actually practice that.
Yeah, I think it's actually a great skill to be comfortable with not getting attached to your code, and being open to refactoring/rearchitecting -- in fact, if you have this as a common expectation, you may get really good at writing easily-maintainable code. I have started putting less and less "clever optimizations" into my code, instead opting for ease of maintainability, and onboarding for new team members to join up and start contributing. Depends on the size of project/team (and the priorities therein), but it helps me later too when I have to change functionality in something I wrote anywhere from 6-48 months ago :)
You should always have an architecture in mind. But it should be appropriate for the scale and complexity of your application _right now_, as opposed to what you imagine it will be in five years. Let it evolve, but always have it.
I frequently do both. It takes longer but leads to great overall architecture. I write a functional thing from scratch to understand the requirements and constraints. It grows organically and the architecture is bad. Once I understand better the product, I think deeply on a better architecture first before basically rewriting from scratch. I sometimes need several iterations on the most complex products.
This is where experience matters. The more experience you have, more often than not, the less stupid the code is. Not because you aren't testing your concepts as fast, but that your tooling is improved.
Basically: do you have a good foundation to build from? With more experience, you can build a better foundation.
With a working prototype you get to test the specification and the users, not just the code itself.
This is also why I'm not a fan of the "software architect" that doesn't write code, or at least not the code that they've architected.
There's a phrase that's become a bit of a mantra in adjacent circles:
"Just make it exist first. You can make it good later."
>>> When I finished school in 2010 (yep, a long time ago now), I wanted to go try and make it as a musician. I figured if punk bands could just learn on the job, I could too. But my mum insisted that I needed to do something, just in case.
Amusing coincidence. I also wanted to be a rock star, or at least a successful working musician. My mom also talked me out of it. Her argument was: If there's no way to learn it in school, then go to school anyway and learn something fun, like math. Then you can still be a rock star. Or a programmer, since I had already learned programming.
So I went to college as a math major, and eventually ended up with a physics degree.
I still play music, but not full time, and with the comfort of supporting myself with a day job.
> Amusing coincidence. I also wanted to be a rock star, or at least a successful working musician.
> I still play music, but not full time, and with the comfort of supporting myself with a day job.
Some people say;
This is said by people who regret their own choices.
Other people say;
This is said by people who had misconceptions about what pursuing their dream actually entailed.
I say;
But that's just me.
I do think you need the dream to be happy. Money/career is a prerequisite of dreams.
Some want to live on the seas. They can be perfectly happy as a sailor, even if poor and single.
Some want a family, educated children, respect. They would likely need a nice house, enough resources to get a scholarship, a shot at retirement. This is obtainable working in public service, even without money.
But most have multiple dreams. That's what makes things complex. The man who wishes for a wife but also wishes to be on the seas will find much fewer paths available. Sailors also don't generally get respected by most in laws.
To mix the two, they try to find the dream-job. Perhaps work for a big oil company and be 'forced' to go offshore.
Eventually people learn that desire is suffering in some form and cut down on the number of dreams. They may even see this as mature and try to educate others that this is the way. Those who have kids often are forced to pick kids as the dream. So there's a selection bias as well.
vonnegut said:
"Don't put one foot in your job and the other in your dream, Ed. Go ahead and quit, or resign yourself to this life. It's just too much of a temptation for fate to split you right up the middle before you've made up your mind which way to go".
I say if you have money to do whatever you want everyday, there’s an overwhelming chance you’ll be happy.
The rest of those sayings are just for us plebs that have to rationalize working 40-60 hours a week.
But that's a trap: the money you "need" is partly decided by how much you have available. Once you're used to the money from a 40-hour-a-week job it's hard to do with less, but other people manage fine because they never got used to having a lot.
My key point is that happiness is a choice. I hope everyone can find a way to choose it.
That's wrong. Plenty of people have "found a way to choose happiness" already, and we've all seen what exactly they've wrought.
I think PP's point was .. that even if you spend your whole life pressed into laboring to produce a surplus to satisfy the excessive consumption of the elites of your heretical society, in ways that create existential risk for future generations, and are at odds with your own inner values and moral compass.. you can still see the 'immateriality' of all that in the grand scheme of things and choose to be happy.
No, my point is happiness is a choice.
There is no sociopolitical statement, no call-to-arms, no pontification as to the measure of one's life, no generational implications. There is an existential consideration, but not of the nature your post implies.
Happiness is an individual choice, available to us all at any time.
Full stop.
If anyone at any time can simply choose to be happy, why do we care whether (for instance) our arms get chopped off? We are equally capable of choosing to be happy with or without arms. Why do we avoid harm?
Thank you, your post cured my depression. (Sarcasm)
This is pure magical thinking. There are many reasons to be not happy. Being in pain, having lost a loved one, not having your physical needs met and well simply having depression or a myriad of other problems.
And people shouldn't be happy with all circumstances. It is not healthy to be happy all the time. Sometimes accepting the negative emotions is important for growth.
> Thank you, your post cured my depression. (Sarcasm)
You're welcome. (Sarcasm returned)
> This is pure magical thinking. There are many reasons to be not happy. Being in pain, having lost a loved one, not having your physical needs met and well simply having depression or a myriad of other problems.
Of course there are many life situations where "being happy" is not what a person can or needs to experience at that moment, where "moment" is defined as some period of time determined by each person. And there are medical conditions where trying to choose happiness is simply not possible, such as "having depression or a myriad of other problems."
> And people shouldn't be happy with all circumstances. It is not healthy to be happy all the time. Sometimes accepting the negative emotions is important for growth.
I never wrote anything to that effect. What I wrote was:
Just because a choice is available does not mandate it must be chosen immediately and unconditionally.

But you go ahead and mischaracterize what I wrote to serve whatever agenda you have, and I will reiterate what I posted earlier in this thread:
> Thank you, your post cured my depression. (Sarcasm)
I understand what both of you are saying, but I think it's disingenuous to assume that happiness is simply the opposite of depression.
If you have your basic needs met, have no physical ailments etc. I would agree with you. Your statement applies to a certain subset of folks that are defeatist, pessimistic etc. but not everyone.
Yeah, no.
What you're promoting is a deeply narcissistic worldview, and I hope either the cure or the consequences reach you soon.
Though maybe those are going to present as the same thing.
They’re not entirely wrong, and your comment is seemingly unhinged and unprovoked… but there’s a lot of literature on stuff like mindfulness, CBT, and the impact thoughts can have on one’s emotions, especially happiness.
CBT and mindfulness can help with SOME problems. It is great when it does but it also can work less well or even harmful for other problems. Especially people that are prone to rumination don't benefit much from it, they need the opposite of mindfulness.
The unhinged part was to imply that people can just choose to be happy under any circumstance which is obviously magical thinking.
>to imply that people can just choose to be happy under any circumstance which is obviously magical thinking
Worse, I'm afraid. It's ideological thinking of the basest sort.
Magical thinking at least lets a person see that their bullshit isn't working, potentially even walk it back, correct themselves.
In ideological thinking, you gotta act as if the impossible wish has already come true. Reality says otherwise? Well, wish harder - or else. That's ideological thinking for ya.
And those are only two of the cards in that deck. I've observed that with sufficient mental self-mutilation, people can in fact choose to be happy under any circumstance. Occasionally even at no cost to innocents. (Though rarely - who'd permit them a clean getaway?)
Woulda had a field day with figuring out what complexes are puppetting AdieuToLogic, if their most coherent argument wasn't "fuck off" - pardon, "full stop".
>> to imply that people can just choose to be happy under any circumstance which is obviously magical thinking
I never said nor implied that. It has only been the person with the account name "cardanome" who has applied absolutist determiners such as "all" and "any" to mischaracterize what I wrote.
> Worse, I'm afraid. It's ideological thinking of the basest sort.
Projection is a poor position to espouse and one easily identified such as the above, further supported by your previous assertion of "[w]hat you're promoting is a deeply narcissistic worldview."
Reread what I originally wrote in this thread objectively, if either you or "cardanome" can:
This is what is called a personal philosophy[0], specifically:

> Woulda had a field day with figuring out what complexes are puppetting AdieuToLogic, if their most coherent argument wasn't "fuck off" - pardon, "full stop".

I was directly replying to this[1] post, which contains phrases such as "I think PP's point was ..", "elites of your heretical society", and "at odds with your own inner values and moral compass".
If you and/or "cardanome" cannot comprehend why I finished with "full stop" in response to this post, then there is nothing I can do to help either or both of you understand.
0 - https://www.merriam-webster.com/dictionary/philosophy
1 - https://news.ycombinator.com/item?id=45410223
Reread your own post some more. How you found it necessary to exhibit a couple of universally known pieces of horseshit so that your original piece would make sense as some sort of reaction to those. Outstanding.
There's a reason why you're finding it necessary to explain what a personal philosophy even is. Think about that before you go ooh wizdum (and if you have a spare dictionary, throw it up.)
>No, my point is happiness is a choice.
If happiness was a choice, there would be no point to happiness.
>unhinged and unprovoked
I could of course explain exactly what my words "hinge" on, and what "provokes" them. I've found that this does not create understanding where previously it was lacking. So instead let's talk about what you said.
Two of your words I consider harmful and insulting:
>is seemingly
What the hell?!
...oh, right:
- If you say "X is Y", you gotta back it up. Scary!
- If you say "X seems to me Y", you gotta justify your perceptions. Nasty!
- But saying "X is, seemingly, Y", that's totally safe! Because it's bullshit. It posits a statement as true knowledge and elidies the need for justification outright, on the syntactic level.
What's worse, you probably didn't even notice you were doing this. You just picked up the pattern from people who looked like they had what you wanted.
That cognitive habits like yours are so widely accepted as "normal", is exactly why I'm guessing that CBT (or, for that matter, parent poster's wireheading suggestions) are probably going to be super effective on you, not kidding.
If you were to give those a shot, anyway. Instead of, you know, just stating existences of literatures at people. Also unless your current state of mind wasn't already achieved by similar methods. In any case, do report back!
Put aside anything I have written elsewhere in this thread.
Look at the multiple ad hominem attacks you put forth in the above comment alone. Imagine you are the person to whom you replied when reading it.
All because someone disagreed with a post you authored by writing in part:
I truly hope you reflect on this and can find a way to talk about it with someone.

Yeah, been that person, with reason and without. Anyway, your turn now.
*elides #noprocrast
I really don't see what your comment has to do with his at all. You're both talking about completely different things, I think.
OK, just to make sure we're on the same page here: you've already tried trying to make the connection, right?
Easy to say it's immaterial when you're probably an american or european with plenty of material comfort. When was the last time you didn't eat for lack of food, for instance? Adieu to logic, indeed.
Please step right into this here experience machine, it won't hurt you one bit
Luckily you can still pursue being a musician without all the pressure of having to be successful. On this road, one day you are free to declare your own success to yourself
Indeed. On the other hand, I also know my limitations, since roughly half the people I play with are pro's with music degrees. And I'm still trying to improve.
I'm inspired by the quote from Pablo Casals when he was in his 90s. They asked him why he still needed to practice, and he said: "Because I'm finally beginning to see some improvement."
Maybe if the internet and piracy hadn't fucked artists over, they could have made decent money as a musician selling their work without having to be a major-label superstar. Alas, we do not live in that timeline.
Did most artists (in whatever form) ever have a good living in and of itself?
No. The best/luckiest did, at least some of the time. Most didn't.
Yes. Mostly, until relatively recently on a historical time scale. In the middle ages, musicians were employed by towns, and had a guild. They also worked for princes, the church, etc. I read an article saying that they often did double duty as cops on market days.
There was perhaps less of a distinction between "arts" and trades. People did all kinds of work on paintings, sculptures, etc., and expected to get paid for it. They rarely put their names on their works.
I've read a bit about Bach's life, and he was always concerned about making money.
One music history textbook I read identified the invention of printed music as the start of the "music industry." Before the recording era, people composed and published sheet music. There were pianos in middle class parlors, and people bought sheet music to play. Two names that come to mind were Scott Joplin and Jelly Roll Morton. Movie theaters hired pianists or organists, though that employment vanished when talking movies came out. The musicians of the jazz era were making their livings from music. One familiar name is Miles Davis. His father was a dentist, and his parents considered music to be a respectable middle class career for their son. People did make a living from recordings before the Internet era. Today, lucrative work in the arts still exists for things like advertising and sports broadcasting.
(Revealing my bias, I'm a jazz musician).
In fact the expectation that an artist should not earn a decent living is kind of a new thing.
Ah, one of the luxuries of not having to support myself from music... I have no interest in making recordings. That audience means nothing to me.
Right, before the internet the labels were great to artists!...
Piracy didn't fuck artists over I think (anecdotal), because it was the precursor to Spotify which has been great for artist discovery. Until the industry / artists caught on and started pushing shit. And the payment model for Spotify is bad, a million streams earns about $3-5K according to a quick google and few actually get that far.
But it's good for discovery, and artists generally don't make much off album sales either; concerts and merchandise is where it's at.
Really still kicking myself for not majoring in robotics in school. I wanted to program, so I studied computer engineering but hadn't really absorbed that much in classes. But I will likely never have access to all the robotics stuff my school had, nor the guided learnings.
Never too late to try stuff out of course, but very little beats structured higher-ed education in relatively small classes (I think there were only about 24 people in the robotics major?)
I worked with robotics engineers, their code and development methods were poor even though software is essential. You need both sides.
Nothing beats concentrated work. You can do that without formal education. It might even be easier: you can probably afford pretty good arduinos and raspberries and H-bridges and sensors and actuators and...
It shouldn't be hard to go beyond what almost all universities provide.
On the other hand, the one robotics course I took involved getting access to computers at 3am and doing horrific matrix multiplications by hand that took hours. Of course, this was a long time ago.
I'm reminded of the quantity vs. quality groups in a photography class:
https://sebastianhetman.com/why-quantity-matters/
Do stuff, and you learn stuff. Go play.
While I generally agree with the conclusion of that, I think it might be a bit too naive.
The quantity group has a trivial way to "hack" the metric. I can just sit there snapping photos of everything. I could just set up a camera to automatically snap photos all day and night. To be honest, if I'm not doing this at a stationary wall there's probably a good chance I get a good photo since even a tiny probability can be an expected result given enough samples.
But I think the real magic ingredient comes from the explanation
The way I read this is "The quantity group felt assured in their grade, so used the time to be creative and without pressure." But I think if you modified the experiment so that you'd grade students on a curve and in proportion to the number of photos they took, then the results might differ. Their grade might not feel secure, as it could take just one person to communicate that they're just snapping photos all day as fast as they can. In this setting I think you'd have even less ability to explore and experiment than the quality group.

I do think the message is right though, and I think this is the right strategy in any creative or primarily mental endeavor (including coding). The more the process depends on creativity, the more time needs to be allocated to this type of exploration and freedom. I think in jobs like research this should be the basis for how they are structured: effectively you should mostly remove evaluation metrics and embrace the fundamentally ad hoc nature. In things like coding I think you need more of a mix, and the right mix depends highly on the actual objectives. But I wanted to make the above distinction because I think it is important if we're trying to figure out what those objectives are.
This is an interesting take. It maps decently to the "first to market" mentality of a lot of programming these days. For sure, we'd get better programmers if they were judged by the quantity of code they produced in their personal projects, but if "quantity learners" are simply forced to churn out bad code without any reflection or the time to experiment, then I would agree that it seems pretty naive to think they'll ever improve... except at a handful of patterns that make bad code sort of work.
Also think about software recently. Is it actually better? IME I face more bugs than ever. Trivial ones, too, that are clearly being deprioritized but are relatively easy to solve.[0]
Do an experiment for me. Write down every bug you face today. Even small. I think you'll be surprised at how many there are and even more at how many are likely simple to solve.
I know it's not just me as so many around me are getting increasingly frustrated with their devices. It's not the big things, it's a thousand paper cuts. But if you just you look at a paper cut in isolation, it isn't meaningful. That's the problem and why they get ignored. But they get created because we care more about speed than direction. I'd argue that's a good way to end up over a cliff
[0] https://news.ycombinator.com/item?id=45363210
>I face more bugs than ever
I mean, lets leave phones out of this for a moment and look at PCs...
When was the last time your PC operating system crashed?
When was the last time your applications you use on your PC crashed?
When was the last time you could not find an application for your PC that did what you needed to accomplish?
The early days of software absolutely sucked. Crashes, data loss, and limitations were the name of the game. The big things like data corruption were constant problems. You didn't notice the small problems because you were fighting the big ones. The problem spaces software was solving for people were also relatively small. Now it's easy to find small bugs everywhere because software ate the world. It's harder to name something software has not expanded into than what it has, and yet we are still not near the boundaries of exploration of what software can do. When a system has not discovered its boundaries, speed will almost always win over direction.
Yeah. The BSoD wasn't dreaded because it was an infrequent surprise visitor.
For Linux? I'm not sure, but definitely longer than 6 mo. The last crash I remember was my fault and quite a while ago.
For OSX? Last week. The most common one is when I close my laptop, go "oops, need to do X real quick", and when I open it up the screen doesn't come up. Not sure if it's a backlight issue, but I don't think it is. It doesn't seem to be logging in and I can't get keys like volume to respond where I'd have feedback. The flashlight trick doesn't work. If I wait, the laptop reboots itself, and half the time I do get a crash report (I suspect this happens more than I think too, since occasionally I'll come back to my laptop and it has rebooted. Like back from lunch or a quick break, not overnight. Less frequently I find crash reports). It's happened since day 1 and even on the last MacBook I had. I get this error at least twice a month. It can also happen when disconnecting my monitor.
Windows? Jesus Christ, how do people live like that? Arch was more stable a decade ago.
On Linux, two weeks ago I had a crash while playing cyberpunk (the most optimized game there ever was... except maybe StarField). Last week Silksong had a soft error where the joystick stopped responding when my wireless controller issued the low battery warning. Outside that, I can't think of anything other than when I accidentally run a sim calling too many resources, but that's not common and I'm not sure it counts.On Mac, at least every two weeks. I feel like 6mo ago it was more like once a month though. I've been interviewing lately and Teams is definitely a bigger problem than Zoom. I think Firefox has crashed twice in the last 6 months? I'm also a tab hoarder. But mail crashed way more early on so I switched back to Thunderbird and while it doesn't crash I'm sure there is a small memory leak. I'll restart like once every 2 weeks because it'll start pushing a gig of ram. I know, I'm picky, but my email client shouldn't ever use a gig of ram. And I was writing my PhD thesis a few months back and preview would occasionally crash. Zathura had no issues. Slack definitely more frequently than FF.
Windows? Just about every day. I was able to reduce errors once the IT guy informed me that Windows Hello being used for the login was why Outlook was constantly crashing. Switching to a typed password did a lot. But after that it still reminded me of the days where I was learning Linux and distro hopping.
Daily? Okay, but probably weekly?
Less of a problem on Linux. Often it's small things, so I can write a quick shell script. I can find most things I want on the AUR, but often I'll build from source.
OSX? That depends. Do I have to pay? If so, IDK. I'm not willing to pay a subscription fee for what's a glorified shell script.
Windows? Work laptop so wasn't really looking.
I generally agree, things are getting better but the problem wasn't that big 10-20 years ago ime. I can only think of 2 instances where I had data losses and both were on Linux laptops while distro hopping over 15 years ago. In both cases I was able to recover too. Yeah maybe the issue was bigger in the pre Windows 95 days but that's way back and hardware has also made significant strides. Don't give software the credit for better hardware.
But I think you're missing an important question:
That is not a crash but is a bug. The calendar issue I mentioned above is not unique to my phone. Later on I also mention how I had essentially no means to merge contacts despite identical names, nicknames, phone numbers, and birthdays (differing only on email and notes). WHO THE FUCK THINKS "FIND CONTACTS" IS A BETTER SOLUTION THAN CLICKING TWO ENTRIES AND THEN CLICKING "MERGE"? And guess what, it couldn't find the duplicates. This is a trivial database problem! I was able to eventually find a way to merge after some googling. But I discovered the duplication because my gf had 3 entries on my calendar for her birthday. When I merged, she ended up with 4! Again, that is a trivial database problem.

This category of problem is greatly increasing in frequency. I know you might think it's a comparison bias, and I think that's a reasonable guess, but it isn't. I've definitely "infected" my friends and family so they're more "sensitive" to bugs, but years after that they agree that their devices are becoming harder to use for the same tasks they did before. Either through little bugs like this piling up or having some new features pushed on them that they don't want and that disrupt their workflow. That last one is really common.
So I don't want to dismiss you, because I don't think you're entirely wrong. But also I think you're being too quick to dismiss me. Your justification isn't a complete explanation of my experience. Nor does it account for how programs are getting slower and heavier. I mean God damn, how many seconds does it take for Microsoft Word to open (cached? Not cached?) and how many gigabytes of ram does it use? Those shouldn't even be the units of measurement we should be discussing! Both are at least a magnitude too large. I'm absolutely certain programs are bloating. And I think it should be unsurprising that with bloat comes more opportunity for errors.
This is also the type of problem I expect average people to not notice as it happens more slowly and if you don't understand computers it's easy to believe it's just the way it is. But we're on HN, we do. We know better. We also know how the sausage is made and we've experienced the increasing pressure for speed and seen the change in how programs and programmers are evaluated. I do not think you can discount these effects.
It shouldn't be a surprise that moving faster correlates strongly with making more mistakes. You don't deny the optimization for speed, but do you really think you can accelerate for free? There's a balance and I'm confident we've sped through that equilibrium.
Quantity has a quality all its own.
It worked for Garry Winogrand, for one.
I both agree and disagree with this post, but I might be misunderstanding it. Near the end, it states:
“Enjoy writing it, it doesn’t have to be nice or pretty if it’s for you. Have fun, try out that new runtime or language.”
It doesn’t have to be nice or pretty EVEN if it’s NOT for you. The value in prototyping has always been there and it’s been very concrete: to refine mental models, validate assumptions, uncover gaps in your own thinking (or your team’s), you name it.
Unfortunately it feels that the pendulum has swung in the completely opposite direction. There’s a lot of “theatre” in planning, writing endless tickets and refining them for WEEKS before actually starting to write code, in a way that’s actively harmful for building software. When you get stuck in planning mode you let wrong assumptions grow and get baked in into the design so the sunken cost keeps rising.
Simply have a BASIC and SHARED mental model of the end goal with your team and start prototyping. LLMs have made this RIDICULOUSLY CHEAP. But, the industry is still stuck in all the wrong ways.
> It doesn’t have to be nice or pretty EVEN if it’s NOT for you.
> There’s a lot of “theatre” in planning, writing endless tickets and refining them for WEEKS before actually starting to write code, in a way that’s actively harmful for building software.
I'd love to have a "high paying job" where I am allowed to start prototyping and modelling the problem and then iteratively keep on improving it into fully functional solution.
I won't deny that the snowballing of improvements and functional completeness manifests as acceleration of "delivery speed" and as a code-producing experience is extremely enjoyable. Depth-first traversal into curiosity driven problem solving is a very pleasurable activity.
However, IME in the real world, someone up the chain is going to ask "when will you deliver this". I have only ever once been in a privileged enough position in a job to say "I am on it and I will finish it when I finish it... and it will be really cool".
Planning and task breakdown, as a developer, is pretty much my insurance policy. Because when someone up the chain (all the way down to my direct manager) comes asking "How much progress have you made?" I can say (or "present the data", as it is called in a certain company) "as per the agreed plan, out of the N things, I have done k (< N) things so far. However, at this (k+1)th thing I am slowing down or blocked, because during planning that-other-thing never got uncovered and we have a scope-creep/external-dependency/cattle-in-the-middle-of-the-road issue". At which point a certain type of person will also go all the way to push the blame onto another colleague to make themselves appear better and hence eligible for promotion.
I would highly encourage everyone to participate in the "planning theatre" and play your "role".
OR, if possible start something of your own and do it the way you always wanted to do it.
I feel like this is the time to mention "How Big Things Get Done", by Bent Flyvbjerg. "Long planning vs. start prototyping" is a false dichotomy. Prototyping IS planning.
Put another way, refining tickets for weeks isn't the problem; the problem is when you do this without prototyping, chances are you aren't actually refining the tickets.
Planning stops when you take steps that cannot be reverted, and there IS value in delaying those steps as much as possible, because once you take them your project becomes vulnerable to outside risk. Long planning is valuable because of this; it's just that many who advocate for long planning would just take a long time and not actually use that time for planning.
For my money, certain types of software shouldn't have tests, too much planning, or any maintenance whatsoever
Prototypes (start ups) rarely have the luxury of "getting it right", their actual goal is "getting it out there FAST to capture the market (and have it working enough to keep the market)"
(Some) game devs - apologies, but I'm not enough of a game dev to be able to say what types this applies to - are more or less build it, ship it, and be done with it. Players tend to be forgiving of most bugs, and they move on to the next shiny thing long before it's time to fix all the things.
Once the product has traction in the market, and you have paying customers, then it's time to deal with the load (scale up) and the bugs. I recall reading somewhere that it's probably best to drop the start-up team (they did their job, and are now free to move on to the next brilliant idea) and replace them with a scale-up team, who will do the planning, architecting, and preparation for the long-term life of the software.
I think that approach would have worked for Facebook (for example): they had their PHP prototype that captured the market very quickly, and (IMO) they should have moved on to a scale-up team, who could have used the original code as a facade, strangling it to replace it with something funky (Java/C++ would have been what was available at the time, but Go would be what I would suggest now).
> ... for WEEKS before actually starting to write code...
I'm curious who is in these kinds of jobs. Because I've never seen this in practice.
Yesterday I spent the entire day working on a lib to create repos on GitHub from inside Emacs. It was the first time in 3y that I had touched it. When googling, I saw potential candidates that were much better than my simple one. But I kept going, for the pleasure of making my own thing. I learned a lot, and felt very accomplished, even if, at the end, it was messy and I'll have to go back and reorganize it. It feels like making _my_ thing, even if it is drawing my own copy of the Mona Lisa.
I would say that much of my code starts out stupid and, hopefully, becomes better with refinement.
First do it, then do it right, then do it better.
The graphic design version of this is "Get it done, make it beautiful," but it works for code too.
I used to get hung up on things like doing a loop when a ternary operator would work. "Somebody is going to see this and be rude about it." But sometimes you write code how you're thinking about the problem at the time. And if you think of it as a loop, or a series of if statements, or whatever, do it that way.
If it makes you feel better, note it in a comment to revisit later. And if somebody is rude about it, so what. It's not theirs, it's yours.
Yeah! That's great, thank you! I will share this quote with a couple people with whom I've discussed this topic recently :)
The more popular quote ime is "make it work, make it right, make it fast" (1983!)
https://wiki.c2.com/?MakeItWorkMakeItRightMakeItFast
Or just ship it, always an option
I agree, but there are certain types of unnecessary stupidity which feel easier at first, but hurt more than they help very quickly (measured in amount of code):
The first one that comes to mind relates closely to naming. If we think about a program in terms of its user facing domain, then we might start to name and structure our data, functions, types too specifically for that domain. But it's almost always better to separate computational, generic data manipulation from domain language.
You only need a _little bit_ more time to move much of the domain specific stuff into your data model. Think of domain language as values rather than field names or types. This makes code easier to work with _very quickly_.
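To make "domain language as values" concrete, here's a minimal sketch in C++ (the notification/invoice names are my own invention, just to illustrate the parent's point):

    #include <map>
    #include <string>

    // Domain-specific shape: every new concept grows new types and fields,
    // and every helper (queueing, filtering, logging) gets rewritten per concept.
    struct InvoiceEmail { std::string invoice_id; std::string customer_email; };
    struct ShippingSms  { std::string order_id;   std::string phone; };

    // Generic shape: the domain terms live in the data as values, so the
    // generic machinery is written once against a single type.
    struct Notification {
        std::string kind;                           // "invoice-email", "shipping-sms", ...
        std::map<std::string, std::string> fields;  // domain data as plain key/value pairs
    };

The trade-off is less static checking on the payload, but the plumbing code stops knowing anything about invoices or shipping, which is roughly the separation being described.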
Another stupidity is to default to local state. Moving state up requires a little bit of planning and sometimes refactoring and one has to consider the overall data model in order to understand each part. But it goes a long way, because you don't end up with entangled, implicit coordination. This is very much true for anything UI related. I almost never regret doing this, but I have regretted not doing this very often.
A third thing that is unnecessarily stupid is to spread around logic. Harder to explain, but everyone knows the easy feeling of putting an if statement (or any kind of branching, filtering etc.) that adds a bunch of variables somewhere, where it doesn't belong. If you feel pressed to do this, re-consider whether your data is rich enough (can it express the thing that I need here) and consistent enough.
> we might start to name and structure our data, functions, types too specifically for that domain.
I once worked on a Perl script that had to send an email to "Harry". (Name changed to protect the innocent). I stored Harry's email address in a variable called "$HARRY".
Later on a second person (with a different name) wanted to get the emails as well. No problem, just turn the scalar into an array, "@HARRIES".
I thought it was very funny but nobody else did.
> It’s small, it’s dumb, and there were probably plenty of options out there.
Oh, this sort of "dumb" code. That is just exercise. It bothers me that in this field we don't think we should rehearse and exercise and instead use production projects for that.
Actual dumb code is one that disregards edge cases or bets on things being guaranteed when they're not.
One thing I have found to be a very valuable habit is to first think about what your software has to do on paper and draw some shitty flow charts, lists and other things, without too much care about whether you will actually do it (especially if it isn't software that you strictly need to write for some reason).
Whether an idea is good or not can often only be judged when it becomes more concrete. The actual finished project is as concrete as it gets, but it takes time and work to get there. So the next best thing is to flesh it out as much as possible ahead and decide based on that whether it is worth doing it that way.
Most people have the bad habit of being too attached to their own ideas. Kill your darlings. Ideas are meant to be either done, shelved or thrown into the bin. It doesn't do any good to roll them around in your head forever.
@author the blog scales poorly on smaller devices. The header doesn't fit the screen, the margins are too big, and the lines are too crammed (the line height needs a bit more love).
https://i.imgur.com/Ev6Ea1b.png
This reminds me of the (excellent!) book by Jamis Buck: https://pragprog.com/titles/jbmaze/mazes-for-programmers/
They write a maze algo in any new language they learn just to learn bits of the language.
A modern variant would be to do a year of Advent of Code in the new language.
I do not believe that the real struggle is "starting" nowadays, since AI gives the impression that it can complete 90% of a task. We struggle with architecting the whole thing we want to start.
You should write stupid code, but you should write good code too.
Writing stupid code is like walking to the shop. You're not going to improve your marathon time, but that's not the point. It's just using an existing skill to do something you need to do.
But you should also study and get better at things. If you learnt to cycle you could get to the shop in a third of the time. Similarly, if you learn new languages, paradigms, features etc. you will become a more powerful programmer.
Kernighan's law says debugging code is twice as hard as creating it.
Therefore, if you push yourself to the limit of your abilities to create the most clever code you can, you won't be able to debug it.
> The Kernighan law says debugging code is twice as hard as creating it.
> Therefore, if you push yourself to the limit of your abilities to create the most clever code you can, you won't be able to debug it.
If only advocates of LLM-based code generation understood this lemma.
Is LLM output the kind of clever we're talking about here? I always thought the quote was about abstraction astronautics, not large amounts of dumb just-do-it code.
It applies to LLM code, but if you take the law at face value, it's a very damaging one. Cleverness should be used to make your code easier to verify, not harder.
He said it with a very specific idea in mind, and like most of software engineering "laws", if you know enough to know when to apply it, you don't need the law.
No, it just means you'll be spending extra time debugging it. The most clever code is often cleverness which isn't from you, but derived from the field over time.
Also read "stupid" code :)
I didn't know about Deno and streams, but this looks fine
Looks like straight out of Dart.
I believe "stupid code" is useful for sticking concepts or quick prototypes together.
But for strategic decisions, having a well-researched document (a PRD or similar) helps as a starting point for iteration, and the approach you take will be influenced by your team's culture.
I like this philosophy. It's interesting to me that the author writes about trying deno, specifically out of curiosity for compiling binaries with it, because that is something that's been specifically tickling the back of my mind for awhile now, but I've had no real reason to try it. I think this gave me the motivation to write some "stupid" code just to play with it.
Bun's faster and the binaries are a bit smaller, last I checked.
It's like riding a bike. You need to start in a low gear and get some momentum - even if you are going in circles. Starting from zero in the highest gear is difficult and hard to balance. Once you have some speed, everything gets easier.
Even if the code is for yourself, for a collaborative team on a project, or for a company, the quality matters. Software replicability, reproducibility, and reliability are also significant indicators of viable code and guaranteed results.
Shame that the writer didn't tie the initial story, about wanting to be a musician without knowing anything about it, back in at the end.
Also, 2010 was just yesterday my young friend :)
I get more done by writing the stupid code and fixing it than by junking the old code... but every now and then I can clearly see a structure for a rewrite, and then I rewrite. But it's rare.
Is this one of those shortened titles?
Like the original was: Go ahead, write the "stupid" code, I dare ya!
Where did you study games? Seems like we have similar trajectories.
I feel like people should be writing stupid code, and when it's a compiled language, we should ask the compiler or the language for better optimization. The other day, I was writing a check that a struct has certain values (protobuf probably has something like this):
struct S {
    int a;
    int b;
    int c;
    int d;
    int e;
    /* about 15 more members */
};
so I wrote
const auto match_a = s.a == 10;
const auto match_b = s.c == 20;
const auto match_c = s.e == 30;
/* about 15 more of these */
if (match_a && match_b && match_c) {
    return -1;
}
Turns out compilers (I think because of the language) totally shit the bed at this. They generate a chain of 20 if-elses instead of a mask using SIMD or whatever. I KNOW this is possible, so I asked an LLM, and it was able to produce code that uses SIMD.
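One well-known trick here (a minimal sketch with made-up field names and constants, not the LLM's actual output) is to combine the comparisons with bitwise & instead of short-circuiting &&. Evaluating every comparison unconditionally removes the branch chain and often lets the optimizer fold the whole test into branchless code, sometimes a SIMD compare plus mask check:

    struct S { int a; int b; int c; int d; int e; /* more members */ };

    int check(const S& s) {
        const bool match_a = (s.a == 10);
        const bool match_b = (s.c == 20);
        const bool match_c = (s.e == 30);
        // '&' has no short-circuit, so no per-comparison branch is required
        // and the compiler is free to vectorize the combined test.
        if (match_a & match_b & match_c) {
            return -1;
        }
        return 0;
    }

Whether you actually get SIMD still depends on the compiler and the field layout, so it's worth checking the generated assembly.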
Why is this a struct and not an array of ints?
I'm working on in-kernel ext3/4fs journalling support for NetBSD. The code is hot garbage but I love it because of the learning journey it's taken me on: about working in a kernel, about filesystems, etc. I'm gonna clean it up massively once I've figured out how to make the support complete, and even then I expect to be raked over the coals by the NetBSD devs for my code quality. On top of that there's the fact that real ones use ZFS or btrfs these days, and ext4 is, like FAT, a toy by comparison, so this may not even be that useful. But it's fun and lets me say: hey Ma, I'm a kernel hacker now!
Ext4 is most certainly still in use and not a toy. It's trusted. It takes a lot for folks to adopt a new file system.
I worked on a research topic in grad school and learned about holes in files, and how data isn’t removed until the last fd is closed. I use that systems knowledge in my job weekly.
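For anyone who hasn't bumped into those two behaviors, here is a minimal POSIX sketch (illustrative only, not taken from that research) showing both a hole and the unlink-while-open case:

    #include <fcntl.h>
    #include <unistd.h>

    int main() {
        int fd = open("scratch.tmp", O_CREAT | O_RDWR | O_TRUNC, 0600);

        // Seeking far past EOF before writing leaves a hole: the file reports
        // a large size, but the skipped range consumes no disk blocks.
        lseek(fd, 1 << 20, SEEK_SET);
        write(fd, "x", 1);

        // Removing the name does not remove the data; the inode survives as
        // long as at least one open descriptor still refers to it.
        unlink("scratch.tmp");
        char c;
        pread(fd, &c, 1, 1 << 20);   // still readable after unlink

        close(fd);                   // last fd closed: only now is the space freed
        return 0;
    }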
A tip. Kernel development can be lonely, share what you are working on and find others.
He said NetBSD - there I would expect ext4 to be considered a toy, even though it is used a lot in Linux land. Different worlds.
Today it's vibe-coded stupid code.
"In the beginning you always want the results. In the end all you want is control."
This is a particularly bad mobile layout. Fix your margins.
At least on iOS, if you double-tap the text it'll fill the viewport just nicely.
Partially true, in that on Safari on iOS, you can use that to enlarge the text. But that doesn't change what's really broken about the layout, which is that it forces the column width to allow only a small number of words-per-line, which is what makes for uncomfortable reading. Another Safari-on-iOS option would be to use the built-in "Reader" function, which re-flows the text into a cleaner layout.
I appreciate the sentiment, but "There is no stupid code" is the dumbest sentence I've ever read.
Maybe you don't read much, but it's obvious they weren't making some universal statement about code. They are referring to the code you write when you are just experimenting by yourself, for yourself. The point is to not let irrelevant things like usefulness, quality, conventions, etc. limit just tinkering and learning.
Stupid code is fine. Make it work/exist first, you can make it good later.
Yeah, I think he’s trying to equate it to something like “there are no stupid questions.” That’s a pretty silly analogy, but you get the idea.
When you're paralyzed into not putting anything on the page, it's important to just get the dumb idea into the IDE and refactor from there.
He will counter with "There are no stupid sentences"!
I think the people who think there is no stupid code don't actually ever witness truly bad code. The worst code that they come across is, at worst, below average. And since that's the worst they see, it gets mentally defined as bad.
That’s a charitable interpretation. The other more pessimistic one is that they only see stupid code, which cannot be made any stupider.
I think that's basically an impossibility, unless the only code they look at is from people who have 5 minutes of coding experience and attempt to get working code from vibes (without the LLM). Even suggesting this makes me think you haven't even seen truly stupid code.
I'm talking code from people with no programming experience, trying to contribute to open-source mod projects by pattern matching words they see in the file. They see the keyword static a lot, so they just put static on random things.
> Fast forward to today. I’ve been doing a dive on JavaScript/TypeScript and different runtimes like NodeJS and Deno,
That's why. If all the code in a project is stupid, then relatively speaking there's indeed no stupid code.
Go read Linux kernel mailing list.
And reading the Linux kernel mailing list would allow him to... do what exactly? And by when? Compared to writing simple, working, usable apps in TypeScript, immediately after reading about how Deno/TypeScript/etc. work?
It would allow him to brutally roast anyone who submits a sub-optimal merge request.
Linux still works by email-submitted patches, the workflow for which git was originally designed.
And if an unacceptable patch made it to Linus's desk, someone downstream hasn't been doing their damn job. The submaintainers are supposed to filter the stupid out, perhaps by more gentle guidance toward noob coders. The reason why Linus gets so angry is because the people who let it through should know better.
To avoid writing stupid code, since they would see qualified people explain why some code is "garbage" (their words, not mine).