This is sweet news. I'm over 40. I enrolled at my local university in January and I'm studying (literally right now) for my linear algebra midterm [0] which is in 45 minutes! I'm on HN to calm my nerves.
I graduated high school in the early 2000s and graduated college with a major in computer science and a minor in math. My goal is 5-8 more classes for a second degree in math (major).
I went to the University of Texas but took summer courses at Houston Community College (calculus II, physics II, and more -- those classes were SO bad at UT).
It was insane how much better the courses were in the community college. Tiny class of 15. $300 or something. Amazing professor that you could ask questions to like you could in high school. Normal 20-30 question textbook homework where you just work basic problems and build confidence that you know the material.
Meanwhile UT was the opposite. I think I paid $1400/class/semester (and that's a bargain). Lecture halls where you couldn't possibly ask a question. Weird math/physics homework that was like 3-5 super hard questions that I often couldn't figure out, demoralizing. Often a TA who could barely speak English. It's actually quite insulting.
I sometimes think about enrolling in a local college for fun; the experience was that good.
> Weird math/physics homework that was like 3-5 super hard questions that I often couldn't figure out, demoralizing
Had this experience at an elite uni as well for math courses. At the time I felt like it pushed me to really grow, and it was absolutely necessary to do well in that specific course (tests often had questions that ~required you to know how to do all the uber-hard homework problems), but I wonder what the research actually says about this sort of homework vs your more standard variety.
> I wonder what the research actually says about this sort of homework vs your more standard variety.
I have a vivid memory of one of the questions on a final being basically “sketch the outline of this important thing we studied”. I couldn’t do it. I took the class but didn’t see the forest for the trees.
Later I met people who talked about things with each other, including the big picture. That’s the community I was missing when I took the class solo.
In retrospect, I could have gotten something more out of those problems that I thought were so hard.
I think it also depends on what the professor's and the student's goals are, and whether they're aligned.
Is the course about learning the material at hand, or laying the foundation for graduate-level courses in the same subject? About teaching the most efficient method, or getting a student used to deriving equations when there's no plug-and-play formula?
I'm sure we can draw similar parallels between csci college courses, big tech interviews, and professional software development. Even though it's all the same pipeline, each stage/stakeholder has different goals, motivations, etc. If you're having a discussion about the pros and cons of an approach, you have to make sure the goals are aligned, or else you'll just be talking past each other.
I had classes with take-home tests of three impossible questions, and standard tests of disguised regurgitation. The impossible questions are the ones that will really test your understanding of the fundamentals. It's the difference between "add two numbers together" and "what does adding mean?"
I found out I can't stretch my brain to truly understand the fundamentals, so I stopped after a bachelor's and don't use my degree at all. I don't mind. It takes truly special people to push the limits, and a lot of not-so-special people to keep the world running for them.
I have wondered this too as a person who has attended a regular (non-honors) Calculus II course at a fairly top-rank private university and then again at a community college.
From what I remember, the university course also had some rote exercises for homework so it isn’t like everyone is only focusing on working the trickier exercises.
This also reminds me of the story Donald Knuth tells about working every exercise in the book for his calculus class.
Large universities are focused on research, and they incur a lot of expenses due to administrators' egos (build, build, build), the sheer number of administrators, and the range of microstate-like services offered, such as their own health care system and mental health counseling (a major thing at universities now). Community colleges are focused on teaching.
I had a similar experience -- took physics at a community college when I was in high school. The 'upside' of the overproduction of PhDs is that many people from elite backgrounds end up teaching at community colleges.
The only negative for me was that the students were pretty checked out.
> The only negative for me was that the students were pretty checked out.
I didn't put a lot of thought into where I went to school but if I could do it over again this is something I would have considered when I applied. The school I ended up at did not have many serious students. It was a night and day difference taking courses with even one or two students who were similarly engaged with the material, but most of those students ended up transferring to better schools after a year or two.
You also run into the issue later on that the people you went to school with wash out of industry (or never work in it to begin with) at much higher rates in comparison to those who went to more serious schools.
Exactly the same experience. My AP Physics teacher in high school was far better than my university professors.
UT is research focused. Depending on the department, they make the professors teach classes, which is often not aligned with their interests at all. Sometimes I think they are actively trying for bad reviews from students to discourage the university from making them take on a course load.
community colleges are... like... there for the community. and you feel that community.
a lot of big universities have people there for research. there is money to be made, grants to be given, and degrees to be minted. and you can feel that too.
source: got out of the military and went to one, then the other.
I went to a somewhat highly regarded (not MIT or CalTech tier) tech school, and then to a state university.
The tech school considered it a boast that it had more graduate students than undergrad. It was clear where the professors' emphasis was. I recognize the lecture halls where you couldn't ask questions, and the barely-anglophone instructors. (Everyone in the EE department, in particular, seemed to come "fresh off the boat" from China bringing precious little English knowledge with them. The prof for my introductory EE course mumbled on top of it.)
Then I went to state school. Ho-lee shit. Complete difference. The bad profs were incompetent chucklefucks who couldn't cut it in real academia. The good profs actually cared about teaching undergrads.
I learned a lot about choosing a college -- a few years and a few tens of thousands of dollars too late.
I hope it went well! I am in my fifties and enrolled in a master's degree program in pure mathematics about 2 years ago (I don't need the degree, so I'm just taking all the classes they offer; I'm in no hurry to graduate). It definitely took some time to get my brain sharper, but I am better each semester.
I hope people don't take away only the negative side of the article (brain slows down) but also the positive side: the brain gets better with use. It's uncomfortable. I can churn out programs as complex as programs I've already written and go to review meetings and planning meetings without much effort. But solving PDEs reasonably quickly and accurately, I cannot, or have not without a great deal of practice. It's uncomfortable in some weird mental-but-almost-physical sense. But I'm sharper in everything else I do.
One interesting thing about software as a career followed by math classes is that there's no compiler - you can type any janky thought into LaTeX and if you don't detect that it's bogus, nothing will, until you show it to a professor.
Also, the information density of maths notation is way higher than (good) code. We want code to be readable by someone who doesn't already know it; a lot of math seems to be readable only when you're already about 80% familiar with all the prereqs. So there's no skimming and then hitting compile/test/run (whatever validation you do). It's going letter by letter and making the mental effort to actually see and decipher each symbol (at least, for me at my current stage; I'm trying to do novel research, but my demonstrated understanding of the details of the previous research is embarrassingly low).
Also, weirdly, I still have the same fear of professors that I did as a young person. I manage it better with my decades of maturity (really) but it is still a part of my social interactions.
> One interesting thing about software as a career followed by math classes is that there's no compiler - you can type any janky thought into LaTeX and if you don't detect that it's bogus, nothing will, until you show it to a professor.
The formal proof community is very interested in exactly this problem! It's not my specialty, but I believe that Lean (https://en.wikipedia.org/wiki/Lean_(proof_assistant)) is one of the very active communities.
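To give a taste (a minimal Lean 4 sketch of my own, not something from the project docs): the point is that the checker rejects anything it can't verify, which is exactly the missing "compiler" for proofs.

    -- Lean accepts this only because the proof term really proves the claim:
    theorem my_add_comm (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b

    -- Swap in a bogus or incomplete proof (e.g. `sorry`) and compilation
    -- fails, which is the feedback LaTeX never gives you.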
I've done some intro-to-Lean things, but no one in the maths program is into Lean, so I'm just focusing on the math side. Terry Tao is big on the idea of Lean tho, and on combining LLMs with Lean.
The information density is incredible. A 2x2 matrix (Jordan constants) containing enough information to produce a slice of a hyperbolic paraboloid. Leaves me mesmerized...
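(A concrete instance, under my own reading, since "Jordan constants" isn't standard terminology; I take it to mean the entries of the matrix in canonical form. In LaTeX:)

    % A symmetric 2x2 matrix determines a whole quadratic form:
    \[
      A = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, \qquad
      q(x, y) = \begin{pmatrix} x & y \end{pmatrix} A
                \begin{pmatrix} x \\ y \end{pmatrix} = x^2 - y^2,
    \]
    % and the graph z = x^2 - y^2 is a hyperbolic paraboloid (a saddle).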
It's funny, at the end of each lecture I just want to yell... "NO! Don't stop! I must see how this ends!"
Very similar to when I stop our children's movie and tell them to go take a bath.
I guess nobody knows all of math and it is constantly a learning process, but things like "Jordan constants" (math assigns proper nouns to thousands upon thousands of concepts, symbols, theorems, and approaches) just make it even harder to memorize the whole shebang. Fascinating, but sometimes overly complex.
I like your point about feedback. That's how I describe my difficulties with proofs, too.
There is no way of knowing a proof is right without knowing it's right. (Or maybe I am just missing the point)
Good luck! You can do it! I started doing statistics classes three years ago when I was 45, continued doing a MSc degree, which I finished successfully a few months ago. I am now looking into doing a PhD. This is more fun than I ever imagined (fair enough: I was a teenager when imagining it).
Good luck! You should check out Math Academy; it's more effective/efficient/cheaper, and since it's accredited it also makes a good supplement.
I recently turned 40 myself and I'm working through their Foundations courses (made to help adults catch up) before tackling the Machine Learning and other uni courses.
I'll tell you my experience as someone who's been using Math Academy for the past 6 months.
Math Academy does what every good application or service does: make things convenient. That's it. No juggling heavy books or multiple tabs of PDFs. Each problem comes with a detailed solution, so getting one wrong doesn't mean scouring the internet for a hint about your mistake (in the pre-ChatGPT era, of course, not getting something correct meant writing up MathJax on Stack Exchange).
> better than just prompting ChatGPT/Claude/etc
The convenience means you are doing the most important part of learning maths with the most ease: problem solving and practice. That is something an LLM will not be able to help you with. For me, solving problems is pretty much the only way to mostly wrap my head around a topic.
I say mostly because LLMs are amazing at complementing Math Academy. Any time I hit a conceptual snag, I run off to ChatGPT to get more clarity. And it works great.
So in my opinion, Math Academy alone is pretty good. Even great for school level maths I'd say. Coupled with ChatGPT the package becomes a pretty solid teaching medium.
Yes, much better. ChatGPT/Claude/etc. are useful the times I want extra explanation to help connect the dots, but Math Academy incorporates spaced repetition, interleaving, etc. the way a dedicated tutor would, but in a better structured environment/UI.
Their marketing website leaves a lot to be desired (a perk since they are all math nerds focused on the product), but here are two references on their site that explain their approach:
They also did a really good interview last week with Dr. Alex Smith (Director of Curriculum) and Justin Skycak (Director of Analytics) from Math Academy that goes in depth on their process: https://chalkandtalkpodcast.podbean.com/e/math-academy-optim...
The second link really impressed me, I'm tentatively sold on (and excited for) their approach. Does anyone know of any other accredited programs similar to Math Academy, but for other subjects?
Anything in the soft sciences, or biology/organic chemistry, or comp sci. I know there are a lot of courses for the latter especially, but I'm looking for accredited ones.
I used an early e-learning platform, not because I wanted to but because I was one of its developers. I didn't develop the course content, just the technical implementation.
What I didn't like about the content is that I often had questions about it but there was no one to ask them. Whoever wrote that material was no longer around. It's a frustrating feeling when you can't really trust that what you're studying is factually correct and not misleading.
I assume AI will bring a huge improvement in this respect.
Not OP, but I have found MathAcademy to be infinitely better. I really liked the assessment portion, which levels you and gives you an idea of where you are at present. As someone who graduated with an engineering degree a while ago, there were things I realized I didn’t know as well as I thought I did, and I probably would not have prompted an LLM to review them.
Math is something that should be taught in an opinionated way with an eye toward pedagogy. Self study with GPT is an excellent tool in math, but only for those who have enough perspective to know which directions to set out on. I don’t think anybody who doesn’t know linear algebra should be guiding their studies themselves.
Given my ChatGPT and friends experience has been one of overwhelming frustration due to incorrect information, I would say Math Academy is in an entirely different galaxy. ChatGPT is great if you want to learn that pi is equal to 4.
b-b-b-but the next supercalifragilistic ChatGPT version will be able to tell you that pi is between 3.1 and 3.2. that will be a Quantum improvement, asymptotically close to AGI.
at least, i think i heard alt samman say so.
you plebs and proles better shell out the $50 a month, increasing by $10 per day, to keep dis honest billionaires able to keep on buying deir multi-million dollar yachts and personal jets.
be grateful for the valuable crumbs we toss to you, serfs.
Keep making those pushes! I was a non-traditional graduate student because around 10 years ago I got very serious about going for my doctorate. I literally scheduled times with my friends to watch Khan Academy videos on upper-level maths and spent time practicing those skills. Then grad school was just one intensive learning session.
Years of martial arts ingrained that sense of being a life-long learner. I was taught the mantra of "Progress comes to those who train" and "Practice makes permanent" and even though those phrases were focused on learning to beat someone up, I've carried them on into other parts of my life.
Congrats! It is never too late to be doing this type of study and work.
I'm doing something similar: I just turned 50 and have been taking graduate ML classes where I work (at Carnegie Mellon). When I finish the graduate certificate program in generative AI and LLMs that I am enrolled in, I will be only two semesters away from earning a full master's degree.
> graduated college with a major in computer science and a minor in math.
Me too. High five!
> My goal is 5-8 more classes for a second degree in math (major).
But why? Wouldn't it make more sense to go for a master's in computer science? Are you going to use it for work? Otherwise, aren't you going to "lose it" anyway? Also, is your job paying for the degree or are you paying out of pocket?
It’s been twenty years, so my opinion is skewed and my memory is quite faded; however, I’ve got opinions on the guide and the class in general.
The main thing is there are no surprises or tricks. The exams are straightforward and EXHAUSTIVE. I do all the assigned homework twice: once when we cover the material and again before the exam. Let’s hope that strategy pays off again.
This college requires taking something like 30 credits at the institution before it will award a degree. That's somewhere between 7-10 classes (a mix of 3- and 4-credit courses).
Yes, through admissions. Getting a degree in math, maybe... depends on how much stress this adds to my life. If I were retired I'd just take a full load, but raising a family and running my business I can only take it one class at a time.
I didn't ace it, but I knew immediately what I had done wrong as I rode my bicycle home. I kept checking my linear transformation matrix and the eigenvalues didn't compute... Looked again at the TI-89 when I got home and realized I had swapped the orientation of the Jordan constants. I wrote all the equations out, so maybe my professor will have mercy on me. Oh well, another case of elevator wit - https://en.wikipedia.org/wiki/L%27esprit_de_l%27escalier
I've used an LLM for tutoring, but it doesn't replace the experience of biking across campus, ordering a coffee, unpacking my TI-89/iPad, cracking jokes with the professor and other students, paying attention, taking notes, and having to remember the material until exam day. This process is culture, and it's honing my mental blade. Then, as a solo-entrepreneur, I go home and use Cline+Sonnet to hack on a few side projects. These two processes complement each other, greatly. Like I've mentioned in other replies, this is for "#2 fun" and to see if the "old guy (me) has still got it."
I have padlocks that I use to lock up my tools, or my bike, etc. The problem is, I often go several months without using some of them and forget the combinations. So, I decided to write down their combinations, but then I always lose the sheet. Being the math geek that I am, I decided on the following solution. I choose a 3 × 3 matrix and multiply this matrix by the combination and write the result on the back of the lock. For example, on the back of one lock is written “2688 − 3055 − 2750 : Birthdays,” indicating that the 3 × 3 matrix that I chose for that particular lock is the matrix whose rows consist of the birthdays of my brothers and me (from youngest to oldest). My brother Rod was born on 7/3/69, I was born on 7/28/66, and my older brother was born on 7/29/57. What is the combination of the lock?
Now, technically the LLM didn't quite know how to parse "2688 − 3055 − 2750" and ran the calculation with "[2688;-3055;2750]" and produced a response of, "These values are clearly not typical lock combinations, which suggests a potential issue with the encoding process."
Smart, kind-of. I reran with a more explicit prompt and it calculated the correct combination.
Overall though, I'm impressed with using ChatGPT as a linear algebra tutor. I wouldn't hesitate to use it in the future.
I just tried your prompt: o1, gpt4.5, gemini 2 pro solved it correctly (21-19-36), sonnet3.7 and grok3 failed because of the parsing error you described.
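For anyone who wants to check it without an LLM, here's a quick numpy sketch (the only inputs are the three birthdays and the numbers from the puzzle itself):

    import numpy as np

    # Rows are the birthdays, youngest to oldest: 7/3/69, 7/28/66, 7/29/57.
    A = np.array([[7, 3, 69],
                  [7, 28, 66],
                  [7, 29, 57]], dtype=float)

    # The numbers written on the back of the lock.
    b = np.array([2688, 3055, 2750], dtype=float)

    # The lock encodes A @ combination = b, so invert the system.
    combination = np.linalg.solve(A, b)
    print(np.rint(combination).astype(int))  # [21 19 36]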
I'd like to see a study on how the acute stress of living in survival mode for a lifetime affects the brain by using it too much for the wrong tasks.
The last 25 years have been particularly painful for people like me who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel. When I look around at the sheer computing power available to us, I'm saddened that people with wealth, power and influence tend to point to their own success as reason to perpetuate the status quo. When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation. So that we could focus on getting real work done in the sciences for example, instead of just making rent.
I've been living like someone from movies like In Time and The Pursuit of Happyness for so many decades without a win that my subconscious no longer believes that the future will be better. I have to overcome tremendous spidey sense warning signs from my gut in order to begin working each day. The starting friction is intense. To the point where I'm not sure how much longer I can continue doing this to myself, and I'm "only" in my mid-40s. After a lifetime of negative reinforcement, I'm not sure that I can adopt new innovations like AI into my workflows.
It's a hollow feeling to have so much experience in solving any problem, when problem solving itself will soon be solved/marginalized to the point that nobody wants to pay for it because AI can do it. I feel rather strongly that within 3 years, mass-layoffs will start sweeping the world with no help coming from our elected officials or private industry. Nobody will be safe from being rendered obsolete, not even you the reader.
So I have my faculties, I have potential, but I've never felt dumber or more ineffectual than I do right now.
>I'd like to see a study on how the acute stress of living in survival mode for a lifetime affects the brain by using it too much for the wrong tasks. The last 25 years have been particularly painful for people like me who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel.
I suspected something very different based on the first sentence. Like someone living in a high crime area and trying not to get dragged into it. Or constantly struggling with poverty, food insecurity, etc.
The other comments are great but I wanted to touch on why I seem to be struggling in a reality that generally provides enough today.
It's because as hard as it is to believe, especially for young people: life these days is decent despite the status quo, not because of it.
In other words, had we continued on the trajectory we were on before loosely 1980 and trickle-down economics, we could have had moonshots to solve each of humanity's problems in the order of need rather than profitability. We could have consulted academics to invent 25% efficient solar panels for under $1 per watt and had them installed on over 50% of homes by 1990. We could have invented lithium iron phosphate batteries at that same time and had $10,000 electric cars, because they simply aren't that complicated. We could have had blue LEDs, and WiFi, and flatscreens, and everything else we enjoy today, decades earlier. Stuff that doesn't even exist right now but should, like affordable public buffets, mass transit in small cities and single-payer/public healthcare. Robotic hydroponic greenhouses. Living closer to work (I know, inconceivable).
Instead, I had to watch everything roll out at a glacial pace under a risk-averse private system that allowed the Dot Bomb to happen around 2000. That defunded nearly all pure research and outsourced the jobs that provided a healthy work/life balance. That marginalized eBay businesses and online advertising and the resale market so that influencers and the ultra-wealthy could capture all of that low-hanging fruit while the rest of us have to work. And boy did I have to work, at jobs that sapped every bit of my passion, motivation and self-determination, leaving me too exhausted to pursue my side hustles fast enough to get to market before someone else beat me to it or a deregulated recession wiped me out again.
When you've watched progress flounder for as long as I have, it becomes obvious that sabotage is where the money's at. The powers that be denied innovation at every turn, in order to prop up aging industries centered around a 20th century fossil fuel economy that still dominates our lives today.
And now suddenly AI falls in our lap because a billionaire finally decided to fund it. Now you see what happens with a moonshot. Things change so rapidly that we're left reeling with their implications. The luddites come out. Politics devolves. Time runs backwards to the 1950s, the 1940s, the conditions that fanned the flames that turned into world wars.
Now they gleefully say "see! we should have kept stifling innovation! ignorance is strength!"
It's.just.so.exhausting.
I find that people fall very strongly into 2 camps, which could be loosely mapped to left and right: those who suffer knowing what could be, and those who defend what is to deny their own suffering.
You're not the only one that has had those kinds of feelings, and I really relate to the movies you referenced.
Try to remember, AI is a tool, not a solution, and there will always be new problems. There's a strong case that, unlike every other time people said technology would kill all the jobs, this time it actually will. But a helpful framework comes from Clayton Christensen's Innovator's Solution (not the much more famous Innovator's Dilemma): whereas a business has well-defined needs that can be satisfied by improving products, customers (i.e. people) have ever-evolving needs that will never be met. So while specific skills may lose value, there will always be demand for the ability to recognize and provide value and solutions.
What makes a labor market for agents that recognize problems and provide solutions special or different from markets for other kinds of labor? If AIs get to a point where they dramatically outperform humans in other forms of labor, why not in this one?
I think some humans will be doing it well enough to keep themselves afloat the rest of our lifetime, and some will get fabulously rich building products as a one-man operation leveraging AIs. But there will be far more people failing at it. It will be like Youtube creators or Instagram influencers where there are few winners who take virtually all the rewards.
compared to the broadcast era aren't there way more winners -- with smaller pieces of the pie -- nowadays?
it's still a Pareto distribution, I'm sure, but mega-stardom kinda died and was replaced by all these mini-stars, as far as I can tell. I'm not sure it supports your hypothesis.
I'm not really in touch with other genres, but I like to watch chess videos/streams on Youtube and Twitch. The vast, vast majority of views and revenue are captured by about ten people.
I like those people too, but I've also watched a lot of smaller acts, even some amateur players not much stronger than me. So I get those recommendations, and I see their view counts. They aren't making anything at all.
There are other people who have some followers, but even 50,000 followers would be a dream for most people doing it and they will make next to nothing from that. I'd guess there are at least 30x the number of strong, titled players in the 50k group as there are in the 1MM+ group. These are all people who were chess prodigies as kids, won every scholastic tournament in their state, took gap years or went to colleges that let them basically major in chess, travelled the world for tournaments, with awe-inspiring skills, and they are not making anywhere close enough to live on.
And the thing is, I think software might even be tougher in twenty years. It's hard to get people to change from a system they use to another thing, much harder than recommending a new face on YouTube.
Maybe someday they will. But the current run of LLMs are fantastic at regurgitating and synthesizing existing knowledge, and getting better all the time, but not so good at coming up with new ideas. As long as you keep to the realm of what is known, they can seem incredibly intelligent, but as soon as you cross that boundary there's a clear change - often to just meaningless bullshit. So, I personally don't think we're going to be outsourcing idea generation to LLMs (or AI in general) anytime soon. Though to be fair, I'm only about 75% confident in that, and even so, it doesn't mean they won't be hugely transformative anyway.
Pessimistically but realistically, it doesn't matter if AI performs better; it just matters if it's cheaper. A historical example is all the offshoring of mission-critical code starting in the late 90s and early 00s. The code that came back was often sub-par, particularly from the cheaper shops, but the executives got their bonuses for saving money and bailed out. The new executives are now in charge of fixing the disaster of a codebase they were left with.
I think history will rhyme with the offshoring trend but with AI this time.
> When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation.
I was inspired to get into programming by Star Trek in the early 2000s because I thought I could contribute to automation that would lead towards that kind of society; much like you've stated here. Some will say we're naive and unrealistic, but all the ingredients for having society function in this way are attainable with a bit of a cultural shift. I was fine with the idea that society could take baby steps towards it, but it seems the last 25 years have been a mixture of regressing and small incremental improvements to things that don't contribute towards that goal. Just like you, my expectations have been utterly destroyed and my outlook for the future is grim.
> but all the ingredients for having society function in this way are attainable with a bit of a cultural shift
It's awfully naive to think that you can solve the information problem with a "small cultural shift". Statements like this strike me as deeply ignorant of economics and the history of attempts to plan society. People are messy and their needs are hard to predict in any meaningful and responsive way that respects their preferences.
Imagine answering the question of how many washing machines we should make. Assuming you could figure this out, you need to consider the different kinds of washing machines people may want and need. Apartment dwellers need small, efficient ones, and people with a lot of kids want big ones. This in turn has bearing on the number of motors you have to make, the feet of copper wire you need to produce, plastics, rubber, and on and on. And don't forget that's just washing machines.
Now you need to figure out how to get these washing machines to people.
You just can't plan and automate everything; it's far too complicated.
People came up with the information problem at a time when our ability to collect information and our computing power were several orders of magnitude inferior to what we have today. I don't think it's as big a problem as people think. Sure, it was true when every single person didn't have a device capable of instantly sending any kind of information to and from any location on earth, and when we didn't have the computing power to process that amount of information coming in 24/7. Now we do, so I believe a well-functioning planned economy would be possible today, although it'd be a massive project to build such a thing. Even with limited technology the Soviet Union functioned for multiple decades and was one of the most developed nations on earth.
It’s the same thing. Anything that could be automated but isn’t automated is because it isn’t cost effective to do so or there is insufficient capital to invest. These are resource allocation issues. You can’t just wave that away.
Hah, I wish that were the case. A whole lot more things would be automated if that were true.
Automation requires resources, but it also requires vision, cooperation among affected parties, a workable regulatory framework, maturity and availability of required solutions, and availability of competent integrators. There are all kinds of reasons something remains manual besides mere resource availability. And all those things change over time.
There's not much you can do about most of those things, but becoming a programmer and working to develop better solutions is one way to make a difference. Even if you don't work directly in automation, your work can trickle down to people like me who do concern themselves with automated sewing and strawberry harvesting.
What I mean by resources is the things you mentioned inclusive of vision.
I picked those two examples because you can literally build a robot to do it, but it is either unworkable in the case of the shirt or financially not viable like the strawberry robot.
Using your model, no technological development would ever occur, because the fact that something had not happened yet would indicate that it could not possibly happen due to a lack of resources. This is the anecdote about two economists walking down the street and refusing to pick up a $100 bill because everyone knows that in an efficient market, someone would have already picked it up.
At some point the resources necessary for development are there but the technology itself has not actuated. This invalidates your original claim that: "Anything that could be automated but isn’t automated is because it isn’t cost effective to do so or there is insufficient capital to invest."
No it isn't. Using a script to automate a process frees me from having to carry out the process manually. It has nothing to do with central economic planning.
Most things in the world require physical processes. Automating those is quite a different task, and requires resources. How would you go about automating sewing a shirt? How about picking strawberries?
Again this is tech hubris and a lack of understanding of economics and history.
> Most things in the world require physical processes.
No one ever disputed that. The principle still holds if we apply this logic to physical processes; by automating or reducing the labor necessary to conduct a physical process, I can enjoy the benefits of the process without having to engage in the labor of the process.
> How would you go about automating sewing a shirt?
> Anything that could be automated but isn’t automated is because it isn’t cost effective to do so or there is insufficient capital to invest. These are resource allocation issues. You can’t just wave that away.
This is true in the long run and I suppose the argument you are making is that any attempt to interfere with the present system of resource allocation will constitute a centralization that would be less effective than free market capitalism, so the notion that we could redistribute the surpluses generated by labor saving devices to the average person is inherently a call to economic centralization. This might be true, but I would propose an alternative reading:
The surplus of labor-saving devices has primarily accrued to the owners of these devices. You might then claim that these owners are owners because they have found a means of servicing a market demand. Each dollar they possess is a vote from the market that these guys really know what they are doing, and that the world wants more of it. If we were talking about spherical billionaires in a vacuum, I'd agree with you - but this issue is complicated by the compounding impacts of inheritance and its correlation with access to credit, as well as with the existence of competitive moats (e.g. network effects, intellectual property, sunk costs, natural monopolies, etc).
The optimistic read of the technology sector in the 2010s was that businesses would compete with one another to provide services that would ultimately improve people's lives. Instead, we got Windows 11. That wasn't a consequence of users voting with their dollars, it was a consequence of Microsoft entrenching itself into workflows that cannot be economically altered in the immediate future. There are lots of examples of the market not being particularly effective at economic allocation if we step outside of the logic that any purchase is a revealed preference which indicates approval of the good or service being purchased. Apply this logic to the purchases of gamblers, alcoholics, drug addicts, or murder-for-hire plots and the limitations of the logic become obvious.
No my argument is that trying at a broad systemic level to make specific outcomes happen is susceptible to the information problem. Trying like the op suggested to automate away work is utopian and improbable at best. If you squeeze labor out of one kind of drudgery you have no way of predicting the results, and you’re certainly not going to end up with Star Trek.
To my minor aside, look at that shirt. You have to essentially glue the fabric to a board, and all the robot can do is a rudimentary set of side seams and sleeves on the t-shirt. There's no finishing work on the collar or hem, so it's useless. That robot exists as a demo and is used precisely nowhere. You could in theory do this, but it makes no sense economically.
And yes I’m aware of Japanese strawberry picking robots. You’ve clearly misunderstood what I’m saying. These thing may be technically possible but they remain infeasible for other reasons.
> No my argument is that trying at a broad systemic level to make specific outcomes happen is susceptible to the information problem.
This is exactly what I said you would say:
> I suppose the argument you are making is that any attempt to interfere with the present system of resource allocation will constitute a centralization that would be less effective than free market capitalism
Further:
> Trying like the op suggested to automate away work is utopian and improbable at best.
We are a long way off from the self-replicating systems that could feasibly make work effectively optional, but you haven't made a convincing argument as to why it is improbable that automation could reach that point.
> And yes I’m aware of Japanese strawberry picking robots.
You clearly were not aware of them or you would have picked better examples. Your original comment consisted solely of the statement: "It's the same thing." and now you're continuing with that flippant attitude by pretending that I'm misunderstanding your argument when I anticipated it in its entirety.
I clearly was aware of them. Do you think I just rattled off the bit about needing special glue to hold the fabric and only certain seam types being possible? There was a whole thing about these in the Economist last year and it was discussed on HN. While it's technically possible, you can't deploy it. It turns out gluing and then applying solvents to fabrics doesn't result in a product people want.
This Star Trek stuff is improbable because everything has to be coordinated somehow and waving your hand and saying magical future ai is the only proposal anyone ever has. So yeah, maybe super advanced AGI could do it, but probably not. We don't even have good models now of how large economies work down to a granular level. People are, like I said, messy and respond in weird ways to their environments. The best we can do right now is working with prices as signals for the amount of effort other people are willing to put into something. And while that's imperfect, it's just improbable that we can do much better. Which is not to say that narrow objectives aren't possible, only that the bigger and broader you aim, the more impossible it becomes.
You cited them as examples of tasks that would be difficult to automate. The pickers have been commercially deployed for the last four years.
> This Star Trek stuff is improbable because everything has to be coordinated somehow and waving your hand and saying magical future ai is the only proposal anyone ever has.
Redistribution already occurs without the use of an AI.
> You cited them as examples of tasks that would be difficult to automate.
Yes because they are. I specifically gave an example where a machine exists but it's impossible to use for the real world, and an example where economics generally prevent adoption. That gets to my whole point.
> The pickers have been commercially deployed for the last four years.
Yes narrowly, and in only a few places where there are extreme labor shortages.
You are clearly misunderstanding me.
> Redistribution already occurs without the use of an AI.
I didn't make the claim that it didn't happen.
I feel like you're willfully ignoring what I'm saying. These things are hard and rolling them out universally often doesn't work because it is either impractical or economically infeasible to automate things or you run up against regulatory/cultural/material issues. The best we can do is piecemeal progress where incentives align.
It should be obvious. There are plenty of things we can build robots to do, but we don't because it's wildly more expensive than we can sell the resulting product for. We can mostly automate construction, but it turns out land acquisition dominates costs, installation still ends up being sloppy and human, building codes are different everywhere, and people want a different kind of dwelling than what prefab is suited for at the moment.
If it should be obvious then the evidence should be equally obvious.
Or perhaps the world is a bit more nuanced, and it may very well be that we're stuck in some local maxima that our current methodologies don't allow us to escape, even though escaping them would be relatively easy if we chose to commit a meagre amount of resources to the purpose. Which is something we don't do, because we're stuck in that local maximum, and so on and so forth.
Another way of looking at what you're saying is that we're doing things optimally and that there's no room for improvement when that very obviously is not the case.
There are many gross inefficiencies in our system as it currently is -- look at food production for example. How much of the food produced globally is outright wasted? 30%? 50%?
If we made a conscious effort to tighten that up we could reallocate those resources to solving the problems of automation issues that you're describing.
The true hole-in-one for automation is a durable machine that can make a copy of itself as well as useful economic goods. Bonus points if this machine can be in a humanoid form to integrate into our existing economic infrastructure.
Once you have a self replicator you can have it make as many copies as needed to solve any problem you need with minimal human effort.
But a self-replicating machine isn't on anyone's radar. Have you ever seen a politician or policy person discuss this?
I’m sorry, any talk about self replicating machines is just science fiction at this point. It’s not a serious thing to discuss as a near future possibility.
"Look at this lead pencil. There’s not a single person in the world who could make this pencil. Remarkable statement? Not at all. The wood from which it is made, for all I know, comes from a tree that was cut down in the state of Washington. To cut down that tree, it took a saw. To make the saw, it took steel. To make steel, it took iron ore. This black center—we call it lead but it’s really graphite, compressed graphite—I’m not sure where it comes from, but I think it comes from some mines in South America. This red top up here, this eraser, a bit of rubber, probably comes from Malaya, where the rubber tree isn’t even native! It was imported from South America by some businessmen with the help of the British government. This brass ferrule? [Self-effacing laughter.] I haven’t the slightest idea where it came from. Or the yellow paint! Or the paint that made the black lines. Or the glue that holds it together. Literally thousands of people co-operated to make this pencil. People who don’t speak the same language, who practice different religions, who might hate one another if they ever met! When you go down to the store and buy this pencil, you are in effect trading a few minutes of your time for a few seconds of the time of all those thousands of people. What brought them together and induced them to cooperate to make this pencil? There was no commissar sending … out orders from some central office. It was the magic of the price system: the impersonal operation of prices that brought them together and got them to cooperate, to make this pencil, so you could have it for a trifling sum.
That is why the operation of the free market is so essential. Not only to promote productive efficiency, but even more to foster harmony and peace among the peoples of the world."
The Star Trek future does seem out of reach. On the other hand, canonically they only got to fully automated luxury space communism after fighting a global nuclear war against eugenicists.
In Star Trek money exists but there isn't much use for it because technology has made material abundance cost approximately nothing.
Star Trek doesn't show the 50 billion landwhales watching Netflix all day, because it makes for bad television. It shows the 1% who still work even when they don't have to, who work because they want to.
People very much have to work in Star Trek. If they don't, unsuspecting enemies like Q, the Romulans, or the Borg would eradicate the Federation's existence. They also have their own forms of scarcity. Replicators are somewhat common but Holodecks are not, and members of the crew must schedule time to use them. Resource allocation is determined through the decisions of leaders in the bureaucratic hierarchy, and the longer you watch, the more you see the ways in which the Federation falls short.
AI didn't really mesh seamlessly with my work until I used Claude, I highly recommend it. If your current workflow involves googling, reading documentation and examples on github until you can put together a solution then AI should slot into your work nicely. It just does all those things but faster and can often surface what I want in 30 seconds instead of 30 minutes of research.
I wouldn't worry though, if the last 4 years are any indicator, we will continue to see LLMs refined as better and better tools at a logarithmic rate, but I don't really see them making the jump to replacing engineers entirely unless some monumental leap happens. If AI ever gets that good it will have replaced vast swathes of white collar workers before us.
I am somewhat optimistic, tech adoption is only going to go up, and the number of students pouring into CS programs is cooling off now that there aren't $100k jobs waiting for anyone who can open up an IDE. My ideal future is people who really love tech are still here in 10 years, and we will have crazy output because the tooling is so good, and all the opportunistic money seekers will have been shaken out.
> What’s wrong with people working for rent or groceries?
There's nothing wrong with people who have the ability to work for groceries being compelled to work for groceries. The rent issue is complicated by the fact that land ownership prioritizes those who have already had time to accumulate wealth over those who have not. There are some issues with abandoning prices on land entirely (e.g. if land has no cost, how do we decide who gets to live in the most desirable locations?), but there's a compelling case to be made that the contemporary system of real estate financialization is similar to the enclosure movement both in terms of its structure and impact. It becomes a question of those with good credit (typically the rich and old) being able to (in aggregate) buy up all of the desirable land and thus to set monthly claims on the income of those with bad credit over and above the level of claim that would be possible if the property purchases could not be financed by loans.
There is a legitimate cost to constructing a building and renting it out, but there is no real cost to land except the cost the market assigns to it. This might not be the worst thing (recall our example of allocating land in desirable locations), but when prospective landlords can take out loans against the property, the property's value is driven up beyond what any reasonable person would be willing to pay for the property's use. If you couldn't derive rental income from property, it would not make economic sense to finance these purchases beyond what you needed for your own use. This would (in theory) lead to lower prices.
I'd travel the world, taking in diverse centers of culture, history, and nature. I'd try to learn new languages. I'd do more track days, karting, and Ultimate. I'd buy a shell and try to get back into rowing. I'd play more computer games. I'd play ping-pong, foosball, and board games with my kids. I'd coach kids' sports. I'd go to more plays and concerts. Even movies. I'd volunteer.
Of course I wouldn't do ALL of that, since even without work there are only so many hours in the day. But I certainly wouldn't want for things to do!
Some people do all that and still work, you probably just need better time management. You could study a language before work in the morning, and then go row for a bit. Then go to work. Then you could play computer games from 5 to 6, play ping pong with kids from 6 to 6:30, eat a dinner, coach kids soccer from 7 to 8, volunteer open source from 8:30 to 9:30, catch a movie at 10.
But even without a job, you still need energy and motivation. The tax of switching between tasks (or hobbies) doesn’t magically disappear. Neither does the time suck of social media.
Only if you're wealthy and healthy, and even then only some of that timeline _may_ be possible; most of it is just unrealistic.
>You could study a language before work in the morning, and then go row for a bit.
Ok, gotta be in by 9am, 30-60 minute commute, 30 minutes learning a language, gotta eat, shower, coffee, get my rowboat mounted and to the lake 20 minutes away, prep, do a 20-minute row, and back again; so realistically you'd need to be up at 6am. Not unreasonable.
> Then go to work. Then you could play computer games from 5 to 6
Did you end work at 4pm, or work from home? Either way that is likely a short day, but ok. A lot of people are forced to have commutes or work in a job that can't be remote, not to mention work much longer days. Hell, isn't "60 hours is the sweet spot" for a work week now? (quoting Google's founder's recent comments).
> play ping pong with kids from 6 to 6:30,
Have enough room to have a ping pong table at home, that must be nice, but yeah doable.
> eat a dinner, coach kids soccer from 7 to 8,
Who cooked dinner? Who cleaned up? That shit doesn't just happen by itself. So you prepped, cooked, ate and cleaned up, wrangled kids into car for soccer, and got the game field ready to play all in 30 minutes? Nope.
> volunteer open source from 8:30 to 9:30,
Game ended on time, kids didn't hang around to talk to team mates, straight in the car, no issues, and less than 30 minutes transport. Nope.
> catch a movie at 10.
30 minutes to get kids to bed, baby sitter on time (and you can afford one), doable at some ages sure. Movies are regularly 90-180 minutes so you're in bed at like 1am? For a 6am start? Again transport not taken into account.
The reason people think you can work 60 hours a week, every week, is that they don't do all the everyday things that need to get done; they have other people to do them. They also rarely leave enough gaps in their schedule for other people's priorities.
Assume you WFH, 9 to 5. Commute time is zero. You have a middle class suburban house with a lake in the back. Your partner is a stay at home parent, does not work, just does household tasks and takes care of kids.
You wake up at 7. Quick 15 minute breakfast then push your kayak out to the lake and row 45 minutes on the water.
From 8 to 9, you can study a foreign language (same duration as a university course)
At 5 you can game for an hour and decompress. Then ping pong at 6.
By the time you finish ping pong with the kids at 6:30, you’ve spent 90 minutes just playing around. Time for dinner, prepared by your partner. Kids have 25 minutes to get dressed for soccer and eat dinner. The soccer field should be no more than a 5-minute drive from your home.
After the game ends at 8:30, you could schedule an additional 20 minutes for your children’s frivolity if you like. Once you drive home you can cut down to 30 minutes working on open source stuff. A small sacrifice for their joy.
Send kids to their rooms by 9:30. Let them sleep whenever they feel like as long as they are quiet and in their room. Spend time with your partner and prepare yourselves for the night out.
By 9:45 the babysitter arrives and you two head out for the movies. A babysitter can be very cheap if your kids are older; often they are just a high school student doing homework or watching TV while your kids sleep or play. Don’t need a PhD.
You could be home by 1 AM depending on movie length. 6 hours of sleep is good enough, you can do it all again the next day.
It’s very doable, especially if you decide you don’t actually want to follow the same schedule everyday.
This schedule, even as a theory, assumes you work from home and have a partner who does not work, plus a babysitter? I don't actually know what percentage of families that describes, but my guess is it's pretty low.
Okay but at some point you have to make choices to work toward the life you want, it’s not just going to happen by accident with you chasing whatever you can, and that’s what people don’t understand.
If you want this schedule, prioritize a WFH career and find a partner who wants to stay home and earn enough money to hire a babysitter. If you don’t then this won’t be available to you and it’s your own fault.
I want effort, lots of it, but let's not nitpick ...
Off the top of my head: Nobel Prize winning, world-beneficial research; lots of loving, open, deeply connected relationships; grow rapidly; be someone people turn to for support (because I help them), ...
I think if you let your imagination wander and you end up seeing the scale of potential we have and what we could really achieve, stuff like paying for rent and groceries starts to feel archaic and wasteful, or as some kind of artificial constraint holding us back as a species.
I think (from personal experience) talking with a good mental health professional would really help with your current state of mind and the pressure you’re feeling.
That's the toxic stuff you get from society, which leads to you hiring mental health professionals who can teach you healthy, effective ways of dealing with stress.
Cognitive Behavior Therapy can help with a wide range of issues. If there are worries that are not productive for you, that you can't get out of your head, a therapist can teach you how to use some basic tools to control that. And you'll probably only need a few visits. You can also read books, but given what you've stated I think you should start with a human.
My son went to a few sessions and completely got his OCD under control. He doesn't have to go anymore. I used a similar technique to quit smoking 30 years ago, after at least a half-dozen serious tries by other means had failed. Still off them. It applies to all kinds of issues though; it's also very effective for depression. According to the literature I studied twenty years ago, it was the only technique other than medication that showed sustained benefit for depression.
My depression comes from super severe learned helplessness. I have been extremely stupid with money and career choices, and nowadays things have gotten hard. I have several chronic health conditions, and the difficulty went up not by 2x, more like 20x. I just can't muster the will to do even one job interview, financial reserves are dwindling fast, and... you get the picture.
I have zero faith any therapist can help me. They'll likely start with "but it's for your own good!" and I'll just say "yeah yeah, like 200 other things I have been told and zero of them turned out to be true". That's how I imagine it.
I am not against paying professionals. Obviously. I just don't believe in therapy at all.
What would you do to start with, with a guy like me? (I am aware you are not a therapist yourself.)
I am also not a therapist but I am a former tech founder turned executive coach so I do talk to people who are facing what feels like overwhelming challenges, risk, and uncertainty.
Even in the language you used "severe learned helplessness" and "extremely stupid", you are revealing a state of mind (cynicism, self-flagellation) that is not oriented to improving your condition.
You know you have a strong bias against therapists—given your seeming lack of knowledge about them, where do you think that bias came from? Fundamentally, we are a social species and evolved to live with strong connections to small groups.
Our society is no longer set up like that. So professionals like therapists and coaches provide the essential value of a caring, supportive, and helpful relationship that we lack. Like getting an essential nutrient that your diet lacks.
Do you have health insurance? Many plans cover mental health, and the site Headway can help you find a therapist who takes your insurance. Try a few and gather some first-hand data before writing them off fully. The downside is a few hundred dollars. The upside is a much brighter and materially better future.
To try to complement what other replies already said...
I think an important result of successful intervention is to awaken (or reawaken) the mind to the idea that thoughts and perceptions are internal and not always accurate representations of an objective, external world. Much psychological stress comes from these internal experiences, and subtle shifts in your mental posture can change this environment.
That's not to say that real stressors and stimuli don't exist. It's just that oftentimes a person can spiral in a way that makes their internal reactions counterproductive and harmful to well-being.
Another important result is learning better coping and adaptation strategies, so you can start to shift your mental posture or even change lifestyle and environment to reduce chronic stress.
It's not always easy, not magic, and not perfect. But, it can help...
The worst thing here is that, from the beginner's perspective, it seems like simply reframing something bad in a positive way, when the bad was almost completely in their mind and barely existed. Only after the results can you see how twisted you were. I had my moments when I looked at the scheme of my mind on a whiteboard and had to admit how delusional I was, with zero pressure to do so.
I think it's important to understand that CBT is a system, a set of tools for managing your thought patterns. Therapists who specialize in it are largely in the business of educating their clients, not having them lie on a couch and talk to the ceiling about their childhood. I'm not saying you won't have generic "talk-therapy" kind of conversations - those are still necessary for them to understand the specific issues you need to work on - but it's not just someone helping you find insights that don't change anything.
If you are completely against meeting with a therapist though, you can start with books. I wish I could recommend one that I've used, but this is an example of one that looks really promising to me, with a practical approach: https://www.amazon.com/Retrain-Your-Brain-Behavioral-Depress...
This is not how therapy works. Although, tbf, it’s not hard to find a pseudotherapist who practices stereotypical bs.
> What would you do to start with, with a guy like me
IANAT either, but mine would start with asking how I feel and then why. Then we'd talk about my vision of practical ways to stay afloat, the ways I maybe don't see due to my focus, and what exactly makes it hard to push through, in both known and never-tried situations. There would be some belief, avoidance, anxiety, or algorithm at work, or a set of these. In CBT there's a clear formalized method for each, which you can pick and work with for the next week or two. Examples are: logging your emotional responses, compiling a list of "musts", starting to do unusual things, asking what exactly is wrong with something that seems bad.
That is, if my depression was on low. If on high, we'd address that first. Last time I pushed through it by following a physical regimen, a few supplements, and lots of anger against it (depression can't turn off my anger; ymmv, and so may the methods).
Trust me, us Europeans are not exempt from the "everyone should see a psychologist" trope blasting social media the last decade. We are not blind to every Hollywood actor having a personal therapist either.
I think the main difference (speaking as a northern European) is that when you Americans speak of therapy you seem to mean the stereotypical "talk therapy", whereas basically every therapy here is cognitive behavioral therapy.
Can cognitive behavioral therapy help someone who has a bit of existential dread about his tech job? Maybe. I don't think it's silly on its face though to say "really?" if the poster's life is in order otherwise.
Perhaps your life is on the easy setting? Hungry people work really hard. The fear of destroying an entire family by losing my job lets me find strength and courage.
As a researcher who changed career paths to teaching at a community college, I empathize. Twenty years ago when I graduated from high school, I was inspired by the stories I had read about Bell Labs, Xerox PARC, and early Apple and Microsoft. I wanted to be a researcher, and I wanted to do interesting, impactful work.
Over the years I’ve become disappointed and disillusioned. We have nothing like the Bell Labs and Xerox PARC of old, where researchers were given the freedom to pursue their interests without having to worry about short-term results. Industrial research these days is not curiosity-driven, instead driven by finding immediate solutions to business problems. Life at research universities isn’t much better, with the constant “publish-or-perish” and fundraising pressures. Since the latter half of January this year, the funding situation for US scientists has gotten much worse, with disruptions to the NIH and NSF. If these disruptions are permanent, who is going to fund medium- and long-term research that cannot be monetized immediately?
I have resigned myself to the situation, and I now pursue research as a hobby instead of as a paid profession. My role is strictly a teaching one, with no research obligations. I do research during the summer months and whenever else I find spare time.
> I'm saddened that people with wealth, power and influence tend to point to their own success as reason to perpetuate the status quo. When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation
What you stated is true, but my disappointing observation is that the people with wealth/power are only marginally smarter than the rest of us on the topic you mentioned. And I suspect that even with a rich benefactor, pulling that off is not easy. It takes a threshold number of people who have a holistic view of things to pull off what you mention, i.e. nearly free basics of life. Check my profile etc. - some of what I wrote may strike a chord with you.
Also, the proponents of Technocracy (Hubbert etc.) about 100 years back essentially touched on the subject you state. Note: the word technocracy today has a different connotation.
I'm very sympathetic to your experience and agree with most of what you say, but as someone who has spent half his life in academia and half outside, regarding "who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel": I must say that 'reinventing the wheel' is at least as prevalent in academia as it is in business.
> acute stress of living in survival mode for a lifetime
For some perspective, bone evidence of pre-Columbian Indians showed that they regularly suffered from famine. There was also the constant threat of warfare from neighboring tribes.
The American colonists didn't fare much better; their bone evidence is one of extreme overwork and malnutrition.
If I may be so bold as to refer to you as "my friend" (having never met you)...
My friend, I think I understand what you mean. I am about the same age too.
I would like to propose an idea to you - and it is something I have been exploring very deeply myself lately... maybe the thing we need to start spending our time on is exactly this meta problem now. The meta problem is something like (not perfectly stated): we as humans have to decide what we value such that we can continue to give our existence purpose in the future.
I don't think AI is going to be the be-all-end-all, but it is clearly a major shift that will keep transforming work and life.
I can't point yet at a specific job, or task - but I am spending real time on this meta problem and starting to come up with some ideas. Maybe we can be part of what gets the world, and humans, ready for the future - applying our problem solving skills to that next problem?
I mean all of the above in 100% seriousness and I am willing to chat sometime if interested to compare notes.
Maybe it's time for me (40+) to go back to college. I want to pick up Mathematics and Physics up to the point of General Relativity. Since it's "use it or lose it", I better start reading now.
But I don't really have any time. There are so many things to do, to learn. Younger people who happen to stumble upon this reply, please please prioritize financial freedom if you don't have a clear objective in mind -- and from my observation many people don't have a clear objective when they are in their 20s! If you can retire around 35-40, you have ample time to pursue any project you want for the rest of your life.
Putting in a plug for MIT OCW 8.962 [1]. I also had this itch, and was able to find time during the pandemic to work through the course (at about 1/2 speed). But true to what others are saying, life intruded for the last few lectures, so I still have some items on my todo list. I thought Scott Hughes laid out the math with terrific clarity, with just the right amount of joviality. It is not for everyone, but if you have a suitable background it may turn "scratching an itch" into an obsession, as it did for me.
And to make the obligatory on-topic comment: I'm 61yo. Now get off my lawn.
Thanks! Yeah I planned to use MIT OCW for my education, at least the first 3-4 pre-requisite courses, before I even consider registering in an independent program in some University.
BTW I hope you are going to get more free time in a few years so that you can come back and enjoy the education again.
I've always toyed with the idea of studying Computer Science since I taught myself how to code.
Hell of a lot more difficult now when I need to work and don't really have the same amount of time to dedicate to studying. Hell of a lot easier when you're younger, your whole life basically revolves around the education, and any job you have generally fits around your school life rather than the other way round.
Yeah, it was really a surprise to me when I realized that my energy had declined to the point that I couldn't work on my side projects on the down days. Then I counted how many days I have for the rest of my life (up to 75) and that filled me with dread.
And it got worse after my son was born a few years ago. Now I count the number of weeks available, not the days, because there have been whole weeks where I couldn't do anything. After all, those are two full-time jobs.
As for your CS education, I'd recommend getting into some side projects and exploring from there. If you go to a school, it's going to take too many courses.
I'm in my late 40s and I've found that my desire for working on side projects after work is affected by how engaged I am mentally at work. When I'm building new features/products from scratch and I'm having to figure out architecture and learn more about whatever language I'm coding in, I get more amped to do side projects at home. When I'm bored and just bug fixing and dealing with more mundane things, I have no desire to do any more coding after work. Something about being more engaged gets my brain in a state where I can keep going for the rest of the day, until I need to pull myself away from the computer because it's 2am and I should have been asleep hours ago. I should note that I don't have children, so the only "obligation" I have is to spend time with my partner and eat dinner, which I enjoy doing, of course. She usually starts getting ready for bed around 10pm and that's when I start coding. I do have some bad sleep patterns though, whether I'm coding or not, which is probably not healthy. I have that revenge bedtime procrastination thing real bad.
Have you considered doing your side projects before work?
It takes one call in the morning, with me saying for the hundredth time in the past 8 months that the integration is still missing data, to get me off the rails for the day. I know at 10AM that I won't touch anything else after work.
Been contemplating starting early and dedicating "the best hours" to myself.
That absolutely works for me! I play multiple instruments and have found that the early morning is the best time mentally for me to devote uninterrupted time to practice and playing. I'm also fortunate enough to have a basement with another floor between my cacophony and my sleeping family.
That was something I had considered for a while, but then I figured out it was unrealistic because I have a kid. The original replier probably can do it if he/she doesn't have one.
I realized that frustration from work usually spills over to other parts of my life, not surprising as work is usually the first big thing we do during a day. I'm exactly like you -- when I have a lot of frustration from work, then I wouldn't want to work on side projects. It has nothing to do with how many hours I have.
I also have some bad sleep patterns as I only sleep about 5-6 hours every night most of the time.
I think it might be useful to learn some mental skills to compartmentalize one's mental state. If I could somehow put the frustration from work into a separate space without it spilling all over the ship, it would definitely help a lot. But so far I don't know how to do it -- plus I have a kid, so I can't wind down after work until late at night.
Plugging Georgia Tech's online masters program - I did it over the course of 4 years while working - you can take 1 class a semester - and it's very cheap for a high-quality masters.
The only way I've been able to get things done is to first allocate time for the things I want to study right when I wake up, then do as much as I can to learn them. The only requirement is to try to understand the material. Putting deadlines and/or milestones in the beginning can sometimes discourage people from starting.
Any time there's a question of "what is the expected value of the max eigenvalue for a random matrix from such and such a distribution", I can answer it in 5 minutes with matlab and ChatGPT (and I know I can answer it). Making an animated GIF with successive approximations of a function by its Fourier series, no problem. Integrating by parts with hyperbolic functions in it? So slow that I'm googling the quadratic formula while the kids are yelling out the integration steps in real time.
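For concreteness, here is a minimal sketch of that kind of five-minute check, in Python/NumPy rather than matlab, and assuming a Gaussian symmetric (GOE-style) ensemble since the question doesn't pin down a distribution:

    import numpy as np

    def expected_max_eigenvalue(n=100, trials=500, seed=0):
        # Monte Carlo estimate of E[lambda_max] for random symmetric n x n
        # matrices built from i.i.d. N(0,1) entries (a GOE-style ensemble).
        rng = np.random.default_rng(seed)
        maxima = []
        for _ in range(trials):
            a = rng.standard_normal((n, n))
            sym = (a + a.T) / np.sqrt(2)  # symmetrize so eigenvalues are real
            maxima.append(np.linalg.eigvalsh(sym).max())
        return float(np.mean(maxima))

    # For this ensemble the largest eigenvalue concentrates near 2*sqrt(n),
    # so n=100 should print something close to 20.
    print(expected_max_eigenvalue(n=100))

The 2*sqrt(n) edge of the semicircle law makes a handy sanity check that the loop is doing what it claims.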
I'm over 40 and even though I mostly manage/lead now I have time to do programming and plenty of math. I still see improvement mentally (not so much physically anymore), but also a lot of improvement in skills I neglected when I was younger like interpersonal skills and sales. I'm also learning a new language and read more than ever. Sometimes I feel like I'm less sharp, but I wonder if that's because I'm doing so much more.
My tricks, which I don't always follow: work out every day, get enough sleep, and stay off most short-form social media. I realized that when I was on short-form social it would zap a lot of time and kill any focus I had.
Achieving financial independence and early retirement does not mean one no longer needs any advice about life. Indeed, because those people have a longer retirement, they might ponder things like the meaning of life much more than someone who's living paycheck to paycheck and has to devote all brain cycles to survival. And there are so many options for those who retire at 40 that they genuinely need advice about what to do, how to find what matters most to them, and how to go about doing the things they had always wanted to do but couldn't.
These people have succeeded in making money and that's all. But life is so much more than just making money.
People's financial goals vary wildly, which also impacts when they can retire. Some people don't care about some combination of having a car, or expensive living quarters, or fancy food, or a family.
This advice could really backfire badly if taken literally by young people.
Optimizing for financial reward early in your career could be the surest way to end up in a dead end from a mission/purpose/domain/skills perspective.
20 years later, you realize you burned two precious decades accumulating money that, honestly, does not help you at all make sense or use of the next two.
> 20 years later, you realize you burned two precious decades accumulating money that, honestly, does not help you at all make sense or use of the next two.
Ah, but it does. Speaking as someone approaching fifty, you feel every penny. Everything about your financial situation weighs into your decision-making, makes different options possible or impossible. It changes which jobs you can take, and which jobs you can turn down. It affects how much time you can take between jobs. It affects how much energy you pour into keeping your job or chasing a promotion versus investing your energy in education or other things you find satisfying.
People worry that they will accidentally pursue money with such single-minded focus that they turn off every other part of their soul, and miss out on what they "really" want to do. But I don't think that's possible. Replace money with anything else: fame, family, intellectual achievement, hedonism. If you try to dedicate yourself 100% to one thing when something else is important to you, you'll hear the voice in the back of your head. You'll feel what it is, and if you ignore it then, that's on you.
If you don't hear that voice yet, lay down the foundation that will give you the freedom to follow it when you finally do.
Yea, there is a huge distance between "saving enough money to retire comfortably" and "letting wealth accumulation dominate every decision you make." And, honestly, most people don't even get to the first one.
You're absolutely right. I realize my comment could be understood in various ways.
My point was that, at some point, money has a negative effect on your career. Shooting for the top percentile of revenue can take you off track for life.
But you are saying that having a few hundred thousand bucks when you hit 40-50 is a life-changer, and you are absolutely right as well.
Our points of view are not incompatible; my first comment just didn't capture both.
It's easy to think you have too much money in your twenties. I used to save every other paycheck. But wait until you have a family and a mortgage. The money goes very quickly.
It might backfire for sure, but being financially independent gives you the freedom to figure that out for the rest of your life.
IMO it's a lot better than the situation I myself am in right now, when I can clearly see myself working my ass off for the next 20-25 years in domains I totally hate, and then hopefully I can start working on interesting things when I'm ... 65?
I'd further argue that the only downside of my strategy is if you already have a clear non-monetary objective but decide to go with the money for 20 years anyway. That's definitely a bad thing, and that's why in my original reply I ruled this case out -- if you already have an objective, go for it.
Only if you get a rest of your life. While most do, I've known more than one person who didn't make it to 40. Worse, those that do all start reporting that their bodies are starting to fail. If you have not done some things by 40, it may be too late to ever do them.
Won't expound on my life story, but this is massively overlooked. You can't just prioritize money without taking into account the massive sacrifices it will require in your life. I spent a long, long time becoming successful in careers that I hated, only to burn out and do the career I knew I wanted to do since I was old enough to think and remember. Except now I have wasted decades of my life that I will never get back.
The majority of your life is spent working so you absolutely MUST find it fulfilling or you will burn out (at best) or destroy your body and mind as a sacrifice to the insatiable Mammon.
There is nothing wrong with doing a job you don't like. However you need to ensure that it doesn't burn you out. Work enough to get the money you need to live. Then do things you like. Many people do this: there are a lot of jobs people don't enjoy but they have to be done so someone does them.
Even people who find a job they love often find after 10-15 years they are sick of doing it all the time. This is likely to happen to you unless you are careful not to let your job alone be what defines you. This is normal.
Don't get yourself into a job you hate (part of this is not being so picky that you hate everything!), but liking - much less loving - your job is optional. Then go home and do something else for fun.
You can't forego savings because "I might die at 40". That's really not a sensible plan. It's a balancing act, but I'd rather have saved too much and die a little early, than not save enough and somehow live to 100.
I agree 100%. There is a balance: you want enough savings for 'a rainy day' while also enjoying the rest of your life. Retirement is only part of enjoying your life.
The important point is: don't get so lost in saving money that you don't enjoy the present.
But accumulating and saving always come at a cost. Is it worth it to earn more money but spend less time with your family, for example? It's a delicate balancing act, and you never know where the right balance is.
It depends a lot on where you came from. If you are coming from a poor background, without any prospect of occasional help from parents or a possible inheritance, I'd say prioritize financial security. Of course, you can accept the occasional lower salary with better career prospects here and there, but sometimes this is a mirage, and a lot of the time better pay comes with better career prospects anyway.
If you didn't come from a somewhat privileged background, chances are you started your career with more debt and without a rich contact network; you're probably a bit too humble to negotiate wages, and even narratives like "when I started my business I came from a working-class family and had to scrape together 80k from my relatives" are out of your reality. So, prioritize being financially secure first.
This angst about a sense of purpose is basically a privileged-class malady; if you are poor, our friend Maslow will ensure you have more pressing issues to care about first.
> you realize you burned two precious decades accumulating money that, honestly, does not help you at all make sense or use of the next two
You are describing an extreme case of money chasing and/or complete ignorance of everything else. Having the "luxury" of being covered financially for the rest of your life allows you to pursue whatever goals you have in mind at mid-life. If you are susceptible to not knowing what you want, having less money won't help you find out, but having more money might.
Is it any better to know what you want to do for the next two decades and never be able to afford to do it? From a practical perspective you are still missing the opportunities you want or dream of, except you're also doing it with little or no financial buffer for the things you need.
I made a lot of sacrifices and experienced some serious personal pain to achieve modest financial independence by age 35 (not "FAT" by any means, but well beyond the average American), and it was worth it. I'm still working, but only because my former career momentum has carried me into a position where I'm paid a small fortune. I would never do any kind of normal engineering job for a normal income these days.
My attitude and the way my brain processes things is completely different. Getting laid off or fired goes from something you might fear or see as a bad thing to a neutral or even positive event that just encourages you to go spend your time in a different way for as long as you want.
20 years of bad habits facilitated by a given lifestyle can also be very hard to break. Not many can manage to dutifully accumulate the savings while completely isolating themselves from what they work on, who they work with, and how all of that impacts their worldview.
And that's not even considering health. 20 years of being in a bad mental place (stress is bad, but a perceived lack of purpose and agency might well be worse) will leave its marks.
There's also a non-zero chance you'll die before year 20. I agree with the premise that seeking financial independence should be a significant factor in career/life decisions, but if you would be filled with regret by finding out it will be cut short at year 18, you're too singularly focused.
Having the money is far, far better than _not_ having money to help you "make sense or use of the next two decades". If you don't, both the sense and use are narrowed to being chained to a job indefinitely into the future.
The thing is, being poor doesn't buy one a sense of purpose either. Money for sure doesn't solve the issue, but it gives you all the freedom to solve it for the rest of your life.
Damn, I wish I had a million so that I could just drop my job and stream my coding and gaming on Twitch 12/7. I can't do that.
I believe most people who got that million wouldn't spend it on finding fulfillment in life, but would end up inflating their lifestyle: slacking off, drugs, expensive cars and items, etc. The million would be gone in months, and you would be left with just bad habits, a dopamine hangover, and no idea of a further direction in life.
Thanks. I did read that but found it to be too broad. I set a very narrow target and hopefully everything can be wrapped up in 8-10 Math/Physics courses.
I got excited to do this a couple years ago (early 30s). Time and energy were a real killer.
Physics and math in a formal setting like school are rigorous, not fun. I found it really hard to stay motivated. I don't know how I would practically use that knowledge; I would never contribute anything scientific. It would take years of grinding through foundational math and physics to get there.
Oftentimes they're not even rigorous. At least in math, many professors blame their students' lack of comprehension on the complexity of the material when it's really bad teaching. Some attempts at explaining proofs are laughable, and although their conclusions may be correct, because the covered theorems were proven by those much smarter than them, the steps taken to get there rarely follow the tight logical sequencing needed by those learning the material.
I often ponder whether I have the energy to go back to school. I am employed by MIT at one of the labs, where I do research on embedded security. As a consequence, they offer free classes you can pick up. I have yet to actually take advantage of that, but your comment has me thinking the same thing. I turn 36 in a couple days!
Yay, do it! I'm in linear algebra right now (midterm in 40 minutes) and I'm over 40. I went back because I always regretted not taking more higher level math. It's been a lot of work, but very rewarding. My kids (age 7 and 5) think it's pretty cool to see dad working on his TI-89 and Notability on iPad.
I was running into the same issue. I wanted to get into deep learning but my math skills had atrophied. Go check out mathacademy.com. It's nowhere near the level of time investment that going back to college is, and you will learn a lot!
Thanks. I definitely will teach myself some of the pre-requisites before registering in University. I need to prove to myself that I can sit down, take some course, complete the coursework + assignments + exams on MIT courseware, before committing anything that costs $$.
You would still very likely need human input and help. LLMs will hallucinate badly on problems just a bit more difficult than the very standard ones (first-hand experience with math).
More proof that old boomers don't get what it's like to be a modern, young adult. I was just texting with friends about this at the coffee shop this morning while making plans for this weekend. Boss is interrupting my goat-yoga mindfulness session, asking me to come into the office an hour this month. Who has time for this?
I wish I had all the money and all the time! I don't, alas...
I know it sounds stupid, but I started to buy lottery tickets, not to win, because statistically that's essentially impossible, but just to give me hope, because the lottery is the only thing in the world that can land a mountain of cash in one shot, with a very small investment. Nothing else can do that.
That's why humans have purchased lottery tickets throughout history. It's to cheer themselves up.
That needs too many correct bets or too much $$ to get a few million in returns. You can win a lottery of 50 million with just a few dollars! But I do think this is an interesting strategy. I might try it out just for fun.
Anyway, I'm half joking. I do buy lottery tickets, but it is just to improve the mood of the day. Oh, a good mood for a few hours is so important for staying sane.
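For what it's worth, the "not to win" framing matches the arithmetic. A rough expected-value check, with illustrative numbers since no specific lottery is named here:

    # Rough expected value of one ticket, using made-up but realistic numbers:
    # a $2 ticket, a $50M jackpot, and roughly 1-in-300M odds of winning it.
    ticket_price = 2.00
    jackpot = 50_000_000
    p_win = 1 / 300_000_000

    ev = jackpot * p_win - ticket_price
    print(f"expected value per ticket: ${ev:.2f}")  # about -$1.83

So each ticket loses you nearly its full price in expectation; what it actually buys is the hope, exactly as described above.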
I strongly doubt this. It's rare, and we have all sorts of credible theories about why it's rare, but the apparent decline of so-called fluid intelligence is mostly the Flynn Effect: younger cohorts keep getting better at taking tests, which makes older cohorts look like they've declined.
It is a possibility I actually agree with, because a true understanding probably requires a lot more than taking some classes. It probably needs a PhD in cosmology or something similar.
But let's say a shallow understanding is good enough... then even just completing a graduate General Relativity course with a good mark would do.
There is absolutely zero evidence that 35 is some mystical cut off for "understanding." That poster has NO clue what they are talking about. Seriously, feel free to ignore that comment.
As for practical advice for learning, you should look into learning how to learn and then spend about 1-2 years habituating to the proper way to acquire knowledge. The science says your (and not just yours; practically everyone's) current intuitions and habits are incorrect, as evidenced by almost everyone in this post. Youtuber Justin Sung is pretty much second to none in terms of a practical program for acquiring these skills.
Note: Simply reading that article and "understanding" what it is saying is not equivalent to having a study program that implements these things, and having a program that implements these things is not the same thing as actually executing on and habituating to said program. This process takes many months to years.
Echoing the sentiments of others here, this is why I firmly believe that public college should be free, for all, for life. Formal education just works better for some of us than video tutorials or self-paced learning, and ensuring everyone is able to learn new things and practice their skills in a consequence-free environment benefits society as a whole.
Think about the tech nerds (me) who never learned how to cook, and are in their thirties. Or lawyers and doctors who are sick and tired of feeling like they don't understand how computers work, and want to learn. Or an accountant who loves maths and wants to get into the scientific side of the field. Or the homemaker who wants to re-enter the workforce now that their kids are grown, and wants to pick up carpentry and welding to become a tradesperson.
If cognitive decline comes from failing to practice it regularly, then the cheapest solution is free education for life to encourage as many people as possible to keep learning new skills and remain cognitively engaged.
> I firmly believe that public college should be free, for all, for life
I just don't understand these statements that "this or that should be free". Do you plan to enslave the people who would provide this education? Do you not subscribe to the saying "You get what you pay for"? Public education through High School (in the US) has been free for many generations. Ever wonder what would happen if you made the next 4 years "free"? (Hint: you're not going to pop out of those 4 years with any skills differentiated enough from everyone else who took up the "free" education, and you'll be right back in the same position you are now.)
If you don't have the motivation to prevent your own cognitive decline by taking advantage of a plethora of already free (high quality) education (e.g. https://ocw.mit.edu), then taxing the rest of us so you can be spoon-fed all the free "formal education" you want for life isn't the answer either.
> Do you plan to enslave the people who would provide this education? Do you not subscribe to the saying "You get what you pay for?". Public education through High School (in the US) has been free for many generations.
Do you believe that the people who provide public education through High School are enslaved? If yes, how? If not, why do you assume providing free public college education requires enslavement?
> Public education through High School (in the US) has been free for many generations. Ever wonder what would happen if you make the next 4 years "free"?
No need to wonder. Tuition for bachelor's degrees is free in multiple countries, for instance Germany, Finland, Sweden, Scotland and Norway. What happened there?
I assume the “enslave” question is meant to allude to the fact that, since nothing is actually free, when demand outstrips supply, at some point “making it free” can only mean forcing people to provide it (financially or personally).
If “x should be free” was a solution to anything, why stop at education? Let’s make everything free!
Are people being forced to provide free public education up to High School in the U.S.? Are they being forced to provide free University education in the countries where it is free?
If something being free implies forcing people to provide it, to the point that "enslaving" them is a reasonable analogy, why have anything free whatsoever? Let's have nothing free!
In Germany it is common to get kicked out of free university on a technicality and you have to sue to get back in. I am told they do this to keep class sizes manageable and to filter out those who are just there because it’s free and they have nothing else to do.
That sounds like a 'Germany' problem, not a 'free university' problem. There are trivial ways to keep class sizes manageable.
One way is for universities to limit the available places in each degree for first year enrolment, and assign these places based on entrance exam results (as in Spain, where tuition isn't free but is very cheap compared to the U.S.). Another way is to have unlimited first year places, but restrict places from second year onwards to a given number n, allowing only the top n students from the first year to progress (as in France, where tuition is not technically free but averages to < 200€ a year).
> Ever wonder what would happen if you make the next 4 years "free"?
A high school diploma used to mean something because it was a filter. Once graduation rate became the goal, standards were lowered, and just showing up became enough to graduate.
Higher education does some filtering. Either they filter aggressively at admissions and graduate everybody (Ivies), filter with weed-out classes and lesser degrees (respected public universities), both (other public universities), or offer a middling education and are ranked accordingly. So the degree means something.
I agree that degrees can be filters, but I question what "filter" they represent in modern contexts. From my experiences, the modern degree is little more than a gatekeeping credential to demonstrate you either took on substantial student debt (and thus likely to take lower pay or more precarious employment) or come from a wealthy background (stronger social networks for other rich folks/Capital types; a "pedigree", in other words, a la a caste system).
You're 100% right that a modern American High School Diploma does not reflect any degree of basic competency, because standards were constantly refined downward to promote graduation at all costs; I argue college degrees (and many technology certifications) are much the same, providing little more than a demonstration of taking on debt and rote memorization capabilities, rather than being a functional worker.
So if that's the case, and they're not of practical value as credentials anymore, it could be argued there's no harm in opening fundamental/foundational courses in skills to the entire populace, paid for through taxpayer money and restricted to State/Public non-profit Institutions. If we're really concerned about costs, we could implement caps on consumption unless part of a degree program to ensure those taking the advanced courses for employment prospects are given priority over those seeking non-professional growth. There's a lot of wiggle room to be had, if we're serious about opening this up.
I don't understand this sentiment. You have no problem spending $800 billion in tax payer money on military in a country that hasn't fought a defensive war in 200 years but as soon as the same concept is applied to education or healthcare it's somehow wrong?
This is false equivalence. Most of us that share his ideology aren’t fine with either.
Why would we be in the “foreign forever wars should be free” camp?
When approaching these sorts of situations, it is best to steelman your discussion partner’s argument. It will help in your understanding. People who disagree with you aren’t all stupid.
> Do you plan to enslave the people who would provide this education?
There's a concept called public money which can build roads, dams and other cute concrete things. Why can't you use that for payroll in higher education? Not everybody can learn the same way, not everybody has a separate and chill space in their homes to study without interruption.
Roads serve the needs of now, knowledge builds roads to the future.
> I just don't understand these statements that "this or that should be free".
Because you're focusing on the accumulation of a finite resource (currency, land, etc) as the sole barometer for success, and then conflating "freedom for use" with "freedom from cost". Obviously salaries have to be paid, buildings maintained, and improvements paid for. Obviously this all costs money, which is a finite resource. Obviously that money has to come from somewhere. Taxation enables everyone to contribute a fraction of the cost regardless of use, and an effective social program (like free education) distributes that cost effectively over time since there's zero chance 100% of the population will consume that resource at the same time, or even in the same year.
It's basic societal maths. If we accept forgoing a profit on the consumption of the resource (healthcare, roads, mail service, education, defense), we can lower the cost substantially and concentrate on its effective utilization. If we do that, we can carve up the cost across the widest possible demographic (taxpayers), and assign a percentage of it as taxation relative to income and wealth. It's how governments work.
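As a toy illustration of that carving-up (every number below is invented: the program cost, taxpayer counts, and bracket weights are all hypothetical):

    # Spread a hypothetical $80B/year program across ~160M taxpayers,
    # weighting each (invented) income bracket's share differently.
    program_cost = 80e9
    brackets = {"low": (80e6, 0.5), "middle": (60e6, 1.0), "high": (20e6, 3.0)}  # (count, weight)

    total_weight = sum(count * weight for count, weight in brackets.values())
    for name, (count, weight) in brackets.items():
        per_taxpayer = program_cost * weight / total_weight
        print(f"{name}: ${per_taxpayer:,.0f} per taxpayer per year")

With these made-up weights it works out to roughly $250, $500, and $1,500 per taxpayer per year, which is the whole point: no individual bears anything close to the sticker price.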
> Do you not subscribe to the saying "You get what you pay for?"
Does anyone subscribe to this in the current economy? Everything has record high prices, yet still bombards you with advertisements, sells your data, and requires replacement in a matter of years instead of being repairable indefinitely. University education has boiled down to little more than gargantuan debt loads to acquire a credential for potential employment, a credential that often has no relevancy to the field you actually find work in.
So no, I don't subscribe to that, and I haven't for a decade. My $15,000 used beater car is literally more reliable than a six-figure SUV, and it doesn't keep mugging me for more value to the manufacturer through surveillance technology and forced-advertising.
> Ever wonder what would happen if you make the next 4 years "free"?
Yes. I imagine much of the populace would be better educated and informed about how modern, complex systems work. More people would be fiercely resistant to the low-wage, high-labor jobs that flood the market, forcing a reconciliation of societal priorities. I figure we'd have more engineers, and artists, and accountants, and tradespersons. We'd have more perspectives to existing problems from a broader swath of the economic strata, instead of the same old nepobabies from a lineage of college graduates making the same short-sighted mistakes.
The question is, have you considered what might happen if we made a four-year degree more economically accessible?
> If you don't have the motivation to prevent your own cognitive decline by taking advantage of a plethora of already free (high quality) education (e.g. https://ocw.mit.edu), then taxing the rest of us so you can be spoon-fed all the free "formal education" you want for life isn't the answer either.
Now you're just insulting people because they lack means, and conflating it with lack of motivation. I've lived with people whose sole education was reading books in Public Libraries because they never had public education, with Section 8 housing recipients hammering online learning courses from shared computers to try and find a way upward and out of poverty. None of that gets them a foot in the door, because they don't have the physical piece of paper that says "University Graduate" and the social networks you build from physically attending school - which adults cannot do without money or taking on substantial debt, that in turn jeopardizes their ability to survive.
If you want a society where only those of monied means have the ability to succeed, well present-day America is certainly an excellent demonstration of that. I'd rather build a society where all of us contribute a part of the proceeds of our labor to build a more equitable society for all, so everyone has an opportunity to found that new business, make those social connections, or try new ideas, without worrying about losing their home or paying for healthcare treatments.
> Does anyone subscribe to this in the current economy?
Not anyone whose net worth is under -say- fifty- or a hundred-million dollars and is older than their mid-thirties, that's for sure.
If you're not rich enough to routinely afford very well-made things, and you're old enough to know that very many things legitimately used to be far, far higher quality for not that much more inflation-adjusted money [0], then you sure as shit don't subscribe to that saying anymore.
[0] And sometimes, far less... especially when you factor in the cost of continually replacing the garbage that's all that you can afford.
ChatGPT is already free to a very generous extent, and covers 80% (if not more) of the learning resources you could need for almost any topic, theory-wise. I'd risk saying it can adapt to most people's needs.
For practical knowledge you just need to do it over and over. A good mentor/teacher would help a lot, but the very basics, I'd say, are learnable by yourself. It's as simple as doing it over and over and keeping a critical eye on what went well and what didn't.
As a result, I don't think free public colleges would enable more people to -actually- learn compared to what we have today. However, I find it would be a great place to build community and find people with similar interests, which is quite rare to do without an app these days.
See, I'm worried about relying on LLMs for learning given their penchant for hallucinations and the early studies showing they're actually bad for learning or cognitive improvement, since they remove the "research" and "critical thinking" phases of problem solving for entry-level stuff - fundamental skills that are necessary to put something into practice independently and learn from mistakes. Sure, teachers/professors can also make stuff up (and often with more damage given their position as a "reliable authority"), but in a classroom setting it feels like that'd be found out faster than using a ChatGPT that's spitting out bad results.
> However, I find it would be a great place to build community and find people with similar interests to you, which is quite rare to do without an app these days.
This is what a lot of detractors seem to miss about the benefits of in-person learning. Team projects force you to interact with strangers and cooperate for the benefit of the whole. Campuses increase the likelihood of chance encounters. They get you out of your home and into the community, which helps you feel connected to your actions and their outcomes.
The knock-on effects are often greater than the immediate benefits.
You're not alone! Nobody knows everything, and what's important or necessary to our thriving changes constantly throughout our lives. Learning to cook wasn't high on that list when tech salaries were great, delivery was cheap, and housing wasn't (completely) unaffordable; now that I'm nearing my 40s and have to stretch even a six-figure salary further than before, suddenly learning to cook is a necessity.
Good people are always changing in some way. Making public education free encourages lifelong learning and builds a more adaptable human for times of crisis. It's a good survival strategy that also happens to create a more fulfilled human being.
I don't think I should be paying for others to study simply because they prefer a different modality of learning, especially when it has been found that learning modality selection has nearly zero impact on actual learning outcomes.
Now, if this was structured as a negative tax system, where, e.g., everyone after graduating high school starts with -$10k in taxable income for a handful of years, perhaps that could avoid punishing those who choose to self-study.
This line of reasoning can be used, unmodified, to argue against essentially all of public education.
An educated populace is an inherent good. There’s nothing magic about the particular choice of K-12, and one could very convincingly argue that with the increasing complexity of modern life and increasing expectations from employers that ongoing adult education is also a net good, even when you’re not the recipient.
Ongoing education can also be vocational for those who aren’t inclined towards typical academia.
Cynically, one can also point to the current political administration of the U.S. (and the comparative education rates for its voters) as a case in point for why education is important.
Completely anecdotal, current student, but some people learn differently from in person class vs online/remote. Also, imo not every degree/course is best done online (e.g., trades, arts, performing arts). Right now I am taking classes which cannot be done online.
I do however understand where you are coming from. MIT courseware is abundant, youtube, library resources, github `awesome` lists...
If there weren't so much bureaucracy/capitalism surrounding higher education, I wouldn't mind it coming from tax dollars, since it would be another log added to the fire, so to speak. Plus it helps to create a stronger workforce (theoretically, assuming people graduate). Without the right safeguards, free college edu wouldn't work. Would be nice tho.
Very well said. Education, at its core, is about adapting the species to better survive the increasingly complex world it creates and inhabits. Failing to educate the whole means exposing it to fracture and exploitation from within.
It’s inoculation against exploitation, a mental vaccine that, when done right, promotes cooperation over self-interest.
Which is exactly why those who are threatened by it, seek to restrict or destroy it.
“i don’t think i should be paying” - if it could benefit your community or the country, then why not? It gives people options.
Not everything is a zero sum game. That’s just a fact of living in a society. Some people pay into the system much more than you, and you benefit from that. And vice versa: there’s someone paying less, and they benefit from your contributions (taxes, etc). That’s what society is about. A system that allows citizens to thrive. It’s not supposed to be about ME ME ME.
Just my 2 cents…as an american that’s tired of this attitude. Capitalism with small guardrails is garbage in my opinion.
On a somewhat related note - many americans think free healthcare is not worthwhile because it’s a net negative for them PERSONALLY. I struggle to understand that as well. Like “oh i don’t want to pay for that”. Meanwhile most of your fellow americans can’t afford basic care.
What’s the end game??? You’re entitled to your opinion of course but i don’t _understand_ it.
> Formal education just works better for some of us than video tutorials or self-paced learning
I don’t agree with this at all. Anecdotally, the autodidacts I’ve met are way more knowledgeable about subjects they’re passionate about than those who received a formal education in them. This applies to computer science, but also to the psychology majors I’ve met who can’t even tell me the difference between Freud and Jung.
Are you actually saying that nobody exists who learns better when taught in the best ways we currently know how to teach, and in the way all formal education currently works? That everyone is better off teaching themselves with no help?
You are disagreeing if and only if this is what you are saying.
I mean, you can disagree with it based on your anecdata, but mine backs up my assertion which is why I made (and qualified) it the way I did. I specifically thrive in live sessions with an instructor knowledgeable on the material who can provide direct feedback, and I am not the only one. "Works better" is a qualifier on the effectiveness of the education on an individual, not the effectiveness of it on all individuals.
The key to learning accessibility is flexibility. Some thrive on self-study, some thrive on video tutorials, some thrive on audio lectures and others in live exercises. Heck, I wouldn't be surprised if this also applied to specific topics: fundamentals of cooking might be better via live instruction, while iterating on a recipe is often fine with self-study or video tutorials.
The point is the flexibility, to allow people to learn in a way that's best for them, so they're more likely to continue learning throughout their lives.
Over the past 40 years I've become aware of a LOT of people who had difficulty staying engaged in self-paced learning sessions, especially pre-recorded ones. Without the dynamics -- questions and interactions -- that other students can pose (or you can pose), it's tough to maintain your attention for a solid 50 or 90 minutes. Not all courses must be in-person, but I'd like there to be a mix, with more in-person opportunities for course material that needs Q&A, interaction, and examples, like courses heavy in math or theory, or recitation sections.
She still got Alzheimer's and died a couple of years later.
She had multiple incidents that she hid because she was too scared to find out, and too stubborn to lose her ability to drive. She could have had some treatment if she'd approached a doctor earlier.
Alzheimer's is utterly evil. Robbing people of their unique spark, killing the person before the body dies.
Alzheimer's is a disease, you can get it in your 40s. If somebody recommends exercise to keep your legs healthy, they don't mean that if you have a staph infection in your legs that exercise will make it go away.
My grandfather had vascular dementia, and keeping him thinking and using his brain absolutely helped. Makes sense for a problem of blood flow that thinking new, hard stuff might direct some more blood supply to the brain.
Also, 1) you don't know for sure if you have Alzheimer's until you're gone, and 2) it seems that vascular dementia co-occurs with Alzheimer's a lot. So I can't imagine that it would ever be a good idea to stop using your mind if you felt it slipping.
Yep, first thing I thought, too. I'm terrified of age-related degeneration, so I try to stay active and mentally alert, just like my father did. He got out and played golf every chance he had, used Duolingo to try to learn Spanish, played bass in his church band, kept working even though he didn't need the money... and still got Alzheimer's. Now he can't drive, can't be trusted to go out and take a walk by himself, can't even work the TV, so all he can do is sit and watch DVDs that my mom changes for him - at least while she still can.
I'm still going to try to fight it for myself, though.
I hope we have the compassion as a society to get to the point where I can say, "If I am unable to recognize my children, please kill me." At that point I would have died regardless of the condition of my body.
I don't want to wait that long. If I get diagnosed with Alzheimer's, I am taking a quick farewell tour of family and friends and then I'm done. I don't want to wait so long that I need someone else to off me. I wish that all wasn't necessary but this country (US) isn't going to get smarter anytime soon.
I hope we have the compassion as individuals not to ask others to kill us. That's a heavy weight to put on someone else. It's not abstract "society" conducting the euthanasia: individual healthcare providers would have to decide that you met the criteria and then administer the drugs.
> I hope we have the compassion as individuals not to ask others to kill us.
When I've had to kill my pets, I didn't do it myself. I called in a professional to do it.
Surely you don't believe that OP is asking their friends to knife them in the chest if they're too far gone to ask to be euthanized? Surely you believe that OP is asking their friends to have a doctor or nurse come in and do it, if OP is no longer capable of asking for it to be done?
Instead of posting snarky, low-effort comments you should spend some time learning the basics of medical ethics. This is not a religious issue. Medically assisted suicide for a terminal patient is one thing, but directly killing a patient with severe dementia who is mentally unable to give informed consent is quite another. This would put healthcare providers in an impossible situation. There are good reasons why no civilized country allows this.
Sure, I'll follow those goalposts as they walk down the field.
At the time it becomes relevant, a person with a DNR is usually (always?) in no state to give informed consent to being killed by their doctor's inaction. Same thing for someone in an irrecoverable coma who's being kept alive by machines when a family member or friend instructs the doctor to pull the plug on them.
Relatedly, angels of mercy have been releasing suffering folks who are at the end of their life from that suffering for ages.
You might find these things unpalatable, but they do happen, will continue to happen, and we're better off because they do happen.
I sincerely hope that through to the end of your life you remain lucid and able to clearly and convincingly express your preferences. I very much hope that you're not locked in a metaphorical hell of suffering, but unable to express to (let alone convince) anyone that you're ready to end it early.
Stop lying, I haven't moved any goalposts. In medical ethics there is a clear line between withholding care versus actively killing someone who is unable to give informed consent to the procedure.
You absolutely have moved them (and you also refuse to talk about pre-registered requests to die, which the request that kicked off this subthread totally is). You started by saying
> I hope we have the compassion as individuals not to ask others to kill us. That's a heavy weight to put on someone else. It's not abstract "society" conducting the euthanasia: individual healthcare providers would have to decide that you met the criteria and then administer the drugs.
(while ignoring that asking a doctor or nurse to kill you is also asking another to kill us) and now you've moved to talking about specific situations that can be tricky, depending on the particulars.
Isn't that still less awful than having to administer other kind of drugs again and again for suffering and slowly dying patients that want to die?
The situation is just bad regardless.
So long as people can freely choose whether they will do it or not I don't see a moral problem. There would be a very big problem if healthcare providers were mandated to provide such a service. And note that while the evaluation certainly needs to be by a doctor a nurse is quite capable of doing it. Look at the Canadian method--for the most part it's something that's actually done quite routinely in emergency rooms across the world. Sedation followed by a paralytic. Usually that's a prelude to intubation but if you walk away in the middle it kills. Canada then pushes potassium chloride just in case as the paralytics wear off pretty fast.
And we are better off as individuals if we have the option of having external providers do it, as that removes any dependency on actually being able to do things ourselves. There is also the benefit that it brings an external evaluation into the system, one that can recognize that maybe the original evaluation was wrong. (I'm thinking of a case I heard about--a woman thought she had lung cancer, chose not to treat it, and simply worked until she dropped. The autopsy said TB, not cancer.)
You completely missed the point. This discussion is about dementia. The assisted suicide laws in Canada and other countries generally require the patient to be of sound mind, as evaluated by a qualified clinician. The laws don't apply to patients with severe dementia.
In the comment above, @thinkingtoilet apparently wants someone to kill them if they ever have severe dementia. Presumably that desire would be expressed in some sort of "living will" type document. If the patient meets the criteria, should a healthcare provider then strap them down and kill them, even if in the moment the patient says they don't want to die? That seems ethically dubious. It essentially puts providers in the position of being serial killers.
Canada has also had some serious abuses and ethically questionable situations. They are not necessarily a model to emulate.
Of course there are edge cases. Reality is continuous, not disjoint, and thus any attempt to impose a line will inherently create edge cases. What you are missing is that the case of inaction (not permitting it) also creates bad things. You can't make a situation without bad; all you can do is attempt to minimize the bad. Note that the "ideal" (as in maximum social benefit) amount of bad things happening is not zero. Preventing bad things always comes with a cost, and there will always come a point where preventing more of them causes net harm.
Consider, for example, nuclear power. It has basically been regulated out of existence in the US because of the standard that radiation exposure must be as low as reasonably achievable. The problem with this is that it doesn't result in safer nuclear plants, it results in plants that run on different power sources. Natural gas? Approximately 10x the risk (and that's not counting climate effects.) Oil? Approximately 10x the risk of gas, thus 100x the risk of nuclear. Coal? Approximately 10x the risk of oil, thus 1000x the risk of nuclear. The expected (and observed) safety benefit of the regulations is negative.
And to preempt the inevitable "Fukushima!", that was political. The expected death toll of staying put was approximately zero. The city was evacuated, killing hundreds, for no good reason.
I'm not missing anything. Medical ethics as commonly understood in modern western civilization imposes a clear line between withholding care versus actively killing someone who is unable to give informed consent to the procedure. The topic under discussion isn't even close to being an edge case. Minimizing the bad is not the goal. Nuclear power has zero relevance here and bringing it up is just an attempt to confuse the issue.
It used to be that medical "ethics" precluded actively killing someone. That's a religious legacy that more and more countries are coming to recognize is wrong. Fundamentally, this reduces to whether quality of life can be negative.
If quality of life can be negative then there will be cases where the humane act is to provide someone with a comfortable death.
And my point about nuclear power is that excessive regulation actually is counterproductive at maximizing human benefit.
Forcing someone to live through a disease when they have already lived a full life is simply cruel. Why should someone have to suffer on their way out?
Who is doing the forcing here? Are you personally volunteering to kill anyone who decided that they wanted to be killed if diagnosed with severe dementia? What if they change their mind (even if no longer of sound mind) and say they no longer want to die? Would you go ahead and kill them anyway?
> Who is doing the forcing here? Are you personally volunteering to kill anyone who decided that they wanted to be killed if diagnosed with severe dementia?
I cannot do that because I am not a medical professional and even if I was I wouldn't be the only one making that decision. I do have a lot of respect for the people whose job it is to perform euthanasia. It's not an act of cruelty, but of kindness.
> What if they change their mind (even if no longer of sound mind) and say they no longer want to die? Would you go ahead and kill them anyway?
No euthanasia program is going to kill someone who says they do not wish to die. The moral hazard mainly comes from when they are no longer able to express their wish. Then the decision is based on the wish expressed when they were still able and the wish of family members.
This is not all too different from someone who has suffered severe brain damage and is kept alive on life support. Would you keep them alive until they die of old age or would you respect the family's wish to stop treatment? People with severe dementia may not be on a breathing apparatus, but they also cannot survive without the constant support of hospice care.
But cases where the person can no longer express their wish are exceptional. Often their wish to end their suffering is so strong that they stop eating to hasten their demise. What would you do in that situation? Would you forcibly feed them through a tube because you do not believe they are allowed to determine the manner of their death? Or would you simply ignore their suffering as they die a slow and agonizing death from malnutrition? This is what I mean when I say that you would be forcing someone to suffer.
In my opinion, there is a line that needs to be crossed and that line is extremely hard to define. To be safe, you have to go past the line so any blurriness is removed. I would ask the people I love the most to shoulder this burden and I would offer to shoulder the same burden for them. This is how love works.
No, you don't have the option. To have the option you must have the ability. Consider the hypothetical that started this: "if I don't recognize my children". At that point the ability to do it yourself is gone.
Another possibility is that millions of people have said that they don't want to live to grow so old that they lose their wits. But when the day comes, they don't really want to die.
My neighbor passed away from dementia recently. We first moved in maybe a year after his diagnosis and had to watch it progress. Horrible.
Now a friend of mine who is the best programmer I know has an early onset diagnosis. I have noticed him starting to pick fights regularly with people on LinkedIn over programming topics.
It's a really, really hard thing to watch someone go through.
Hopefully a cure comes as a form of vaccine so some folks can be totally against that.
I don't think mental stimulation correlates with the development of Alzheimer's anyway. The papers I've read on the subject seem to suggest a mechanical failure in proteins essentially choking off and killing brain structure. Although the lucidity period shortly before death is interesting.
With 25 years of experience in software development, I’ve noticed that long coding sessions leave me feeling more fatigued than they used to. However, I’ve also become significantly more productive, as I spend far less time grappling with problems I’ve already solved. I’ve only just begun to explore AI-assisted coding, so that isn’t what’s driving my efficiency. Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
It depends on the task, but overall, for the work I do as a software developer, yes.
I would say I have less energy, but I need less energy, and I produce better results in the end. I'm better at anticipating where a line of work will go, and I'm quicker and better at adjusting course. There are a lot of multi-hour and multi-day mistakes that I made ten and twenty years ago that I don't make now.
The raw mental energy I had when I was younger allowed me to write things I couldn't write now, but everything I write now is something that other people can read and maintain, unlike twenty years ago. It's very rare that writing a large, clever, intricate mass of code is the right answer to anything. That used to frustrate me, because I was good at it. I used to fantasize about situations where other people would notice and appreciate my ability to do it. Now I'm glad it's not important, because my ability to do it has noticeably declined. In the rare cases where it's needed, there are always people around who can do it.
Another thing that is probably not normal, but not rare either, is that the energy I had when I was young supercharged my anxiety and caused me to avoid a lot of things that would have led to better outcomes, like talking to other people. I'm still not great (as in, not even average for an average human, maybe average for a software developer) but I'm a lot better than I used to be.
What I find most draining is the non-coding work I now do for work. I love the org I work for and it's really fulfilling but I do a lot of senior stuff now and I feel like the years slip away without always getting to build and invent as much stuff as I'd like to. There's so much to do and learn, it's amazing, we live in this difficult world but with amazing opportunities, and I wish I had an extra 12 hours a day (of energy) just to learn and build.
I was young once, 25 years ago I started programming, and I feel as though I have at least another 25 in me, if not more.
I've been coding for over 40 years at this point. I'm definitely a better programmer than I was - not necessarily faster at pumping out lines of code, but I get the right approach first time more often than I used to. Whole classes of bugs are just easy when you've seen them before, but I'm also better at avoiding them in the first place because I know my weaknesses and where to spend time thinking more carefully.
At the same time, I can't context-switch like I used to. Once I get into the zone, no problem, but interruptions affect me much more than when I was 20 (or even 40). I can almost feel the tape changer in the back of my head switching tapes and slowly streaming the new context into RAM (likely because all the staging disks have been full for years).
As for long coding sessions - I relish them when I get the chance, which isn't as often as I'd like. Once the tapes have finished loading and I'm in the zone, I can stay there half the night. So that hasn't changed with age.
It could be something similar that we see happening in seasoned weightlifters/bodybuilders:
As your absolute strength gets stronger, the same exercises and workouts get proportionally more fatiguing.
5 sets of bench press at 80% of max load, taken within a rep or two of failure, done by a first-year lifter, is incredibly different from that same scheme done by somebody who's lifted for 10 years. So more advanced lifters tend to do things like lighten the load and use variations of lifts that have more favorable stimulus-to-fatigue ratios.
Anyways, I thought maybe as an advanced programmer, something here could be analogous. You've already done all the coding and thinking to figure out easier and lower-level problems. So what you're left with are the more cognitively challenging parts of coding, which should be more mentally exhausting per unit time. Whatever is '80% difficulty' for you is probably way more advanced than what you were looking at 10 or 20 years ago.
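To put rough numbers on the lifting analogy (all figures here are hypothetical, purely to show the scaling; a toy sketch, not sports science):

    # Same relative scheme, very different absolute workload.
    # All numbers are invented, chosen only to show the scaling.
    def session_tonnage(one_rep_max_kg, intensity, sets, reps):
        return one_rep_max_kg * intensity * sets * reps   # total weight moved

    print(session_tonnage(80, 0.8, 5, 5))    # first-year lifter: 1600.0 kg
    print(session_tonnage(160, 0.8, 5, 5))   # ten-year lifter:   3200.0 kg

Same percentages, double the total work done, and the fatigue tracks the absolute work.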
Magnus Carlsen (the multiple-time chess world champion) talked about this in his recent Joe Rogan podcast appearance. He said that, now at 34, he has already passed his chess peak. He knows more now, but when he was younger he could win via brute mental power.
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
So according to Carlsen, for chess the answer is no.
I personally also suspect the answer for programming is the same. Most, if not all, of the hotshot programmers we know became famous in their early 20s. Torvalds started writing Linux at 21. Carmack was 22 when Doom was released. Many of the most famous AI researchers were in their early 20s when doing the most groundbreaking work. Einstein's miracle year by the way was also when he was 26.
> He said that, now at 34, he has already passed his chess peak. He knows more now, but when he was younger he could win via brute mental power.
The famous anti-case for this is that J.R.R. Tolkien started writing The Lord of the Rings when he was about 45.
Writing is not programming but they are not that dissimilar. Especially in this context.
What I've learned over the years is that life is actually not fair and everyone is different. You can be razor sharp and reasonably healthy at 83, or be in great shape and die of a brain aneurysm at 12 with no warning.
Basically don't let studies or other people's results persuade you into not starting or giving up.
> Creative writing is tremendously different from coding, imo.
I've had a different experience.
IMO there's a huge overlap in skills when writing, coding, making videos and playing guitar.
They all boil down to the idea of getting something out of your head and then refining it until you know when to stop refining based on whatever criteria you're optimizing for at the time.
This is based on writing over a million words and making hundreds of videos over 10 years on my blog and programming for ~20 years while casually playing the guitar for about as long.
People in their early 20s are also much less likely to have other responsibilities "intruding" into their headspace. It's a lot easier to be monomaniacal when you don't (for example) have kids yet.
I know. That's the common argument, but I don't think that's it. See the argument I made in the previous comment: Magnus thinks his brain was better when he was younger. It probably doesn't help to have responsibilities like children, but I don't think that explains everything. There are also many people without children, for example. And if you don't have children, then studying full time should take as much time as a simple job, if not more.
Also, Hans Albert Einstein was born during Einstein's miracle year.
> Also, Hans Albert Einstein was born during Einstein's miracle year.
This was in an era when fathers had little to do with childcare. I don’t know about Einstein’s specific situation, but even 40 years ago almost half of fathers had never changed a diaper.
I listened to Magnus and I took it quite differently.
My takeaway was that others have caught up and he is just not motivated to do the type of studying needed to improve even further at this point.
There is a process we don't really have a name for that was best summed up by the boxer Marvin Hagler:
“It's tough to get out of bed to do roadwork at 5am when you've been sleeping in silk pajamas”
The demotivation of success. Of course, that is also going to correlate with age and be very hard to disentangle. At the same time testosterone levels will be past peak, adding another variable in the mix. Plus actual mental acuity past peak.
In other words, as someone pushing 50: getting old kind of sucks, systemically.
Very little of my work needs breakthroughs or inventions. Nothing new under the sun, as Ecclesiastes says. So this mental peak is less important for me than being focused and efficient.
I have 10+ more years than that, but I don't notice such fatigue; 2-5+ hour sessions are no problem (even with fingers typing wrong keys, or in the wrong order, much more often). What I do notice, though, and not only in coding, is a kind of creeping boredom: growing tired of certain things going the way they go, too quickly. You know, the deja-vu feeling when you see something developing a certain way, and then seeing it go exactly there. A thousand times.
But I haven't stopped learning things apart from the software-making-related ones; 2 years ago I went into e-foiling, and some half-related, more technical adventures. So maybe that is keeping the dementia at bay.
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
Maybe up to a point. Most of the tools and languages I use daily are fairly recent, or at least new to me. I don't have much of an advantage, if any, compared to my younger colleagues.
There are certainly things I do better now than 10 years ago, but I think I'm slowly declining. Fortunately, there's more than one way to be productive professionally, so I hope I can keep up for a few more years.
I noticed that I can still do long sessions if I have to crack open a problem (I started coding around 35 and now I'm 40+), but the burnout may keep me from coding for a few days afterward.
I do think it has more to do with daily chores (work, family) than my age. Despite getting frustrated more easily nowadays (because I'm exposed to more sources of frustration) than in my 30s, I'm actually more perseverant than I was 10 years ago. I managed to get very close to wrapping up a side project, the first time in my coding life. Of course the scope is smaller than my previous projects, but I'm surprised I didn't back down easily, considering how many times I banged my head during the first few weeks.
I guess being exposed to more frustration does improve one's resistance to it. To be precise, I get agitated easily, but that agitation doesn't seem to burn me out in the medium term -- while in my 30s I didn't get agitated very often, but every time it burned me out to the point that I abandoned my side projects.
This is something that can be gamed out mathematically, for example time to goal minus time to refactor.
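For instance, a minimal sketch of that kind of model (the function name and every estimate below are hypothetical, not from the article or the thread):

    # Toy version of "time to goal minus time to refactor":
    # refactor first only if it is expected to reach the goal sooner.
    def should_refactor(hours_to_goal_as_is, hours_to_refactor, hours_to_goal_after):
        return hours_to_refactor + hours_to_goal_after < hours_to_goal_as_is

    # e.g. 40h straight through vs. 10h refactor + 25h afterwards
    print(should_refactor(40, 10, 25))  # True: 35h total beats 40h

The real difficulty, of course, is that all the inputs are guesses.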
As someone who has been writing software and/or managing operations for 20 years here is what I have noticed:
* The more experienced people get the more cognizant they become of fatigue in that they know when to take a step back.
* The more experienced people get the faster they get in that they know how to approach repeated problems.
* People do not necessarily get better with experience. Some developers never fully embrace automation, especially if they are reliant on certain tools versus original solution discovery.
Based on that, it's natural that some older developers decline with age while others continue to grow in capability and endurance. The challenge is to distinguish those who keep growing from those who mask their decline.
I wouldn't say, "decline," to be charitable. I tend to lean more on mathematics and writing. That often makes up for the lack of stamina.
When I look back on code I wrote 15, 20 years or more ago... it's fine but it lacks the sophistication I have now. I didn't know what I didn't know back then and had to learn. I can see in my code where I encountered a problem and instead of solving it I added more code until it, "worked."
I wasn't university educated, so that explains a bit of it. I didn't start picking up pure functional programming and formal methods until my mid-thirties (gosh, has it been a decade already?). I worked through Harvard's Abstract Algebra at 38. I'm learning more about writing proofs and proof engineering in my spare time while continuing to stream work in Haskell on various libraries and projects. And I'm in my 40s -- I'm doing more programming and mathematics now than ever.
I'm also playing in a band, practice calisthenics and skateboarding, and have been improving my illustration skills with ink.
It seems like the discovery of the article is that if you don't use your skills they start to decline as early as your late 20s. All it takes is practice to maintain and improve them!
I might get a little tired every now and then and can't keep every library I've used in my head all at once. But I tend to rely more on mathematics and specifications and writing. I write less code now. I remove code. And I keep programs and systems fast and correct.
I suspect you are better at architecture now than you were 15-20 years ago, such that you don't have to struggle over how to solve many complex problems. The solutions and their planning are likely fluid now and quickly envisioned. That is something that comes from years of practiced problem solving.
Not everyone has that though, even among people who claim to be well experienced. If those among us are aging and never fully developed the skills to save on manual effort, they will likely appear to be in decline. Others who continue to find new ways to deliver higher quality at ever decreasing cost will continue to demonstrate superior value.
> All it takes is practice to maintain and improve them!
That is largely true for anything: maintenance costs less than recovery, but maintenance costs more than original solution delivery does for someone well practiced at delivering original solutions. Not everyone invests in the practice to get there, though.
Some developers never really learn to actually program. This is largely due to chasing fashions. In the past this has been around things like Java Spring or JavaScript React. Instead of learning to write original software they get really good at using a tool. Now the new fashion is expecting AI to do it for them. When people build their careers around this it’s all they can do and never dig deeper. This works well for seeking employment, but doesn’t allow for practical skills growth.
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
It depends on what you're doing.
The more raw cognitive strength a task needs, the less it can be replaced with experience.
Some chess grandmasters are teenagers. Maybe maths intensive ML research could be a bit comparable. But that's... Maths. Or distributed software algorithm optimizations?
In the vast majority of software work (as in > 99% ?), experience is more important, though, if you're bright enough when young. Or so I think
(But when closer to 80 or 90 or 100 years, that's different of course.)
Likewise. I can easily work for 10 or 12 hours. It's fitting in things like friends and family, according to their schedules, that makes it difficult. I don't mean to say this in a resentful way at all; it's definitely a me problem, not them.
Productivity doesn't correlate super closely with fatigue in my experience. The worse sessions are when I'm banging my head against something and getting nowhere. When I'm flowing, I can go for hours.
I'm not sure it makes sense to differentiate between energy spent while being "productive" and energy spent troubleshooting and problem solving.
After all, troubleshooting can be viewed as a productive thing.
> skills decline at older ages only for those with below-average skill usage. White-collar and higher-educated workers with above-average usage show increasing skills even beyond their forties.
> Individuals with above-average skill usage at work and home on average never face a skill decline (at least until the limit of our data at age 65).
Literally the two most important things from the article.
Get better at things so you don't have to worry about decline. That simple.
It's like a muscle - develop it early on and then you can easily keep it in shape without much effort until the day you die, without any noticeable decline (at least until like 70).
This is my biggest fear of retiring from programming and doing something else. At 55, I feel like programming keeps my brain elastic. I fear leaving that and going into slow decline.
I'm worried about this with my dad. He's recently retired from a career of hardcore engineering in the optical physics industry. Now in his mid 60s he's inside all the time playing baseball games on his iPad and watching TV shows with my mom. I've been trying to figure out ways to spark his curiosity again. Thought LLMs would blow his mind, and they would have 15 years ago... but it was just passing interest.
Yeah my mom plays solitaire all the time on her iPad and just gets fuzzier and fuzzier. She was doing sudoku and some thing that at least seemed to be a little challenging. But she seems to have stopped. At least she still gets regular exercise with my stepdad.
I wonder how much of the "age-related" decline is due to the brain functioning on autopilot. After over 5 decades, I have experienced most of the issues I'm going to experience in life. More often than not, I'm addressing issues with mental playbooks based on past experience.
As I get older (now in my 50s), I find myself reflecting on how many aspects of my life and decisions are operating on autopilot. I figure it's worse now with social media where people are constantly bombarded with dopamine hits, while boredom and idle thoughts have largely become things of the past.
Perhaps counterintuitively, I am trying to break this pattern and consciously engage with my experiences by asking a few basic questions, such as:
- What am I seeing here?
- What's going on?
- What am I missing?
- How can I approach this differently to achieve the same or better outcomes?
Additionally, I am making a concerted effort to notice more new details during routine tasks, like commuting or shopping. I can't count how many times I've discovered something new and interesting on my work commutes. Actually, I can: it's every time.
Edit: Also spending more time with long-form content over short-form, be it reading or watching videos. It forces me to consider a topic for a much longer period. Short-form knowledge is a trap, unless you have some system that hits you with high rates of repetition (e.g. Anki).
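On the repetition point: the scheduling idea behind tools like Anki is easy to sketch. This is a deliberately simplified toy in the spirit of the SM-2 family that Anki descends from, not Anki's actual algorithm:

    # Grow the review interval on each success, reset it on failure.
    def next_interval(prev_interval_days, ease, remembered):
        if not remembered:
            return 1                              # forgot: review again tomorrow
        return max(1, round(prev_interval_days * ease))

    interval = 1
    for review, ok in enumerate([True, True, True, False, True]):
        interval = next_interval(interval, ease=2.5, remembered=ok)
        print(f"review {review}: next in {interval} day(s)")

The exponential growth of the intervals is what keeps the repetition rate sustainable.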
In my humblest of opinions, you are probably spot on about the autopilot vs. actually experiencing things.
As a concrete example, someone in this thread mentioned their older relative spending a lot of time with puzzles daily. I too watched my grandpa doing sudokus and crosswords, but in the end if there’s nothing much else, those too will quickly become uninspiring routine.
I really believe truly experiencing life does require some introspection so that you have agency.
And agreed, at one time I really got into Sudoku and Minesweeper, but my nerd mind quickly turned them into brainless pattern matching routines that required effectively no thinking. Don't get me wrong. I appreciate those abilities, but there's a time and place.
I'm still in my 30s, but I wonder how much mental decline is actually due to physical decline. I notice I feel more sluggish, sleepy, less sharp and motivated during periods when I'm more sedentary. And while exercise is tiring, I feel it gradually improves not only physical stamina, but mental stamina as well. Clearly a large part of our brain power is spent controlling our bodies: when a stroke happens, part of the body can become paralyzed. And clearly our body caters to our brain's needs (e.g. nutrition), so if the body declines, it shouldn't be surprising to see mental capabilities decline as well.
>Additionally, I am making a concerted effort to notice more new details during routine tasks, like commuting or shopping. I can't count how many times I've discovered something new and interesting on my work commutes. Actually, I can: it's every time.
This is one of the underrated pleasures of commuting by bicycle. You aren't abstracted away from the world in a bubble of steel and glass. You see, hear, feel, countless little details, and you can reach out and touch them if you want. Potholes, pedestrians, birds, the wind and rain and sun, smells of food and flowers and weird chemicals, street music and overheard fragments of conversation. Millions of faces.
I bet you can even accomplish some of this retroactively with the right group of friends. The question "What did you do this weekend?" can be answered in so many levels of detail.
For me, I started making enough money that all my old routines stopped being relevant. I started to drift into comforts and lost touch with my surroundings.
Are there any guidelines for what exactly this would entail?
My short term memory is falling off a cliff. What do I need to do to prevent that from getting worse? Are there any other bases I need to cover that I don't know that I'm missing?
Are you sure? I thought this was happening to me too, and then I realized when looking back 10 years ago that I have way more responsibilities now both in and out of work: I am not only getting more done at work, but also for more people. I am now picking and choosing which meetings to even hold, much less attend, because I have a higher throughput. My children's needs are much more complicated now than when they were younger. I have a side business.
I can't fathom how I would have even gotten this all done when I was younger simply due to how much leisure time I spent, much less kept all of this in short term memory back then.
> I thought this was happening to me too, and then I realized when looking back 10 years ago that I have way more responsibilities now both in and out of work
This so much. When I was in my 20s I never forgot things, but I didn't have anything that I really needed to remember lol.
It's easy to forget about how many more responsibilities we take on as we age, simply by nature of how those responsibilities slip into our lives one at a time, bit by bit, gradually shifting our window of normalcy.
My phone is now full of Notes, Alarms, and timers. I can barely leave the house to run an errand without writing down what I need to do.
As far as actually improving memory: I try to expose my mind to as much raw material as I can. The mind is a muscle; it has to be exercised, and as you get old you need to focus on its core strength rather than physique and raw strength.
Rehearsal and repetition. Read constantly, get out in the environment and really try to observe all the things that are going on. Write down all the things you want to do this year, and when you’ve done them, write that down, too. Every so often, review the list. It will prompt your recall to a wonderful degree.
Write down your little milestones - ‘in March we found a clutch of tadpoles in a tire track puddle and we watered and fed them there for six weeks”
Regarding memory, I have made a habit of assuming I have a faulty memory, and trying to write down anything I think I may want to remember in the future using a wiki style tool that supports back linking. The tool I use is Org Roam in Emacs, but there are lots of options. I have found that by doing this, I have offloaded a lot onto my computer, and made space in my mind to remember a lot of new things.
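The backlinking idea itself is tiny. Here's a sketch of the reverse index that powers "what links here" (this is not Org Roam's implementation; the .txt layout and [[link]] syntax are assumptions for illustration):

    import re
    from pathlib import Path
    from collections import defaultdict

    def backlinks(notes_dir):
        links = defaultdict(set)        # target note -> notes that mention it
        for note in Path(notes_dir).glob("*.txt"):
            for target in re.findall(r"\[\[(.+?)\]\]", note.read_text()):
                links[target].add(note.stem)
        return links

    for target, sources in sorted(backlinks("notes").items()):
        print(f"{target} <- {sorted(sources)}")

The appeal is that you only ever write forward links; the tool derives the backlinks for you.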
Contrary to the other comments saying to carry a pocket computer: I use my brain. Hence the improved memory. I offload my thoughts into my notes when I can, but if it wasn't important enough to remember until I can find a seat at my desk, it wasn't important enough to write a note on.
You are not supposed to store things in the brain; that only causes stress.
The brain is for thinking work. You are better off writing and tracking things on paper: use the brain to think, and paper for planning, scheduling, tracking, and so on.
I was in the same boat, but I started noticing that if I force myself not to do silly multitasking (like not paying attention to what I am doing because my mind is on irrelevant other things), it gets better. Since I stopped the infinite doom-scrolling it has improved a bit.
Stress and lack of sleep also affect me a lot. Both are omnipresent, since I am a parent of young special-needs kids.
Both of my grandfathers in their 90s have insanely sharp memories. I feel that theirs is a lot better than mine at instantly recalling details. I have noticed this in other older people from that generation too.
The only 'exercise' I've heard of that offers measurable improvement is "N-Back", kind of like the old TV game "Concentration". The app is available on most smartphones.
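For anyone curious, the core mechanic of n-back is simple to state: does the current item match the one shown n steps earlier? A sketch of that mechanic (not any particular app's implementation):

    import random

    def play_n_back(n=2, trials=10, alphabet="ABC"):
        seq, score = [], 0
        for t in range(trials):
            seq.append(random.choice(alphabet))
            print(seq[t])
            if t >= n:
                target = seq[t] == seq[t - n]                  # ground truth
                answer = input("match? (y/n) ").strip() == "y"
                score += (answer == target)
        print(f"{score}/{trials - n} correct")

    play_n_back()

The difficulty comes from holding the sliding window in your head, not from the rules.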
Emotions can have a large impact on memory, as far as I know. They provide the catalyst, in a way, in the process that forms memories. If you are depressed or otherwise not emotionally engaged, it can become much harder to form memories.
Solve emotional problems and memory may improve. (I have no idea if that applies to you, of course.)
> short term memory
Which sort of memory do you mean? Short term memory is remembering a name while you write it down, not remembering it the next day or week.
Avoid weed if you don't already. Might seem out of left field but a programmer friend of mine is absolutely convinced their memory is shot because of long covid and it's like, well, maybe, and the trauma of the pandemic certainly put a dent in everyone's cognitive ability, but also the dabs can't be helping.
For those who don't feel like taking math courses in a formal setting, making games from scratch is a fun way to learn and apply linear algebra and calculus.
I never really needed determinants in my life until I tried moving a spaceship towards another object. Trying to render realistic computer graphics gets you into some deep topics like FFTs and the physics of light and materials, with some scary-looking math, but I can feel my mind sharpening with each turn of the page in the book.
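As a concrete example of where the determinant earns its keep: in 2D, the cross product of your ship's forward vector with the direction to the target is the determinant of a 2x2 matrix, and its sign tells you which way to turn. A sketch (toy code, not engine code):

    # det([[fx, fy], [tx, ty]]) = fx*ty - fy*tx, the z-component of the
    # 2D cross product; its sign says which side of "forward" the target is on.
    def turn_direction(forward, to_target):
        fx, fy = forward
        tx, ty = to_target
        det = fx * ty - fy * tx
        if det > 0:
            return "turn left"
        if det < 0:
            return "turn right"
        return "dead ahead (or dead astern)"

    print(turn_direction(forward=(1, 0), to_target=(3, 4)))   # turn left
    print(turn_direction(forward=(1, 0), to_target=(3, -4)))  # turn right

(This assumes the usual math convention of y pointing up; with screen coordinates the signs flip.)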
I have seen it with my own parents and my wife's parents first hand. Frankly, I think the lack of social interaction is a big part of it.
When they're working, they're regularly talking to people outside their comfort zone about potentially challenging questions. That gets largely shut down once you retire.
Both my parents were in a huge rush to retire early, and now they just sit at home and scroll Facebook. I don't see the appeal.
I didn’t appreciate this until covid and wfh. I’m an introvert and am in my happy place sitting in front of a computer or with a book. But I was losing my mind and had to be actively social for the first time in my life. I can see a decade of living like it’s Covid turning my relatively healthy, relatively young brain into soup.
Leaning into stereotypes, the older women in my family did just fine in retirement because they just started doing social activities full time. If anything they retired and got busier. The older men sometimes did ok but usually did worse.
That is why volunteering when you are in retirement is a win-win. Very few others have the time for what is an absolutely necessary part of society, and it is great to keep your mind and heart active while you recall your life and use its lessons to give back to others. Any sort of volunteering will lend itself to that. For example, Jimmy Carter built houses, and it seems to have done him wonders.
Social interaction must be important, but so is the fact that work doesn't ask you how tired you are: you have a set of tasks, and you go. As master of my own time, I can imagine I would veer towards more fun activities, which may not have that forceful aspect and would mostly be done alone.
And that's super true of those parents. My goal is to travel as much as my budget and health will allow: backpacking all around Southeast Asia is what keeps me pushing to work towards earlier retirement. Sitting at home unless forced? No thank you, that's a downward spiral.
>>Both my parents were in a huge rush to retire early, and now they just sit at home and scroll Facebook. I don't see the appeal.
My retirement plans look somewhat similar to how Knuth spends his time: long hours of deep, intellectually challenging work, driving long distances, and eating tasty food somewhere far away.
Most of my retirement motivation comes from wanting to feel the sun on weekdays. There is little point in sitting at home all day.
And from what I've heard on the grapevine, life expectancy drops among those who retire relative to those that don't. This makes sense: many people don't seem to know what to do with themselves if they're not "officially employed", so when they retire, they become aimless, and they sort of decay and disengage from living.
Though is that causation or correlation? I can imagine that people with all kinds of illnesses would also retire sooner than people who are still in peak health.
I write code, pretty much every single day, and also, solve problems, every single day (7 days a week).
I think solving problems is important. Not just rote coding, but being presented with a bug, or a need to achieve an outcome, without knowing the solution, up front, is what I like.
Basically, every single day, I'm presented with a dilemma, which, if not solved, will scrap the entire project that I'm working on.
I solve every one (but sometimes, by realizing it's a red herring, and trying alternate approaches).
> ”Cross-sectional age-skill profiles suggest that cognitive skills start declining by age 30 if not earlier.”
and
> ”Two main results emerge. First, average skills increase strongly into the forties before decreasing slightly in literacy and more strongly in numeracy”
Does this mean that this study contradicts the common understanding that cognitive skills decline after 30? Or am I missing something?
For me, personally, it feels like a more impactful finding than the "use it or lose it" one.
If you are older, I think the trick is to watch (or remember!) what younger people do and follow (or revert to) that behavior, as much as you can.
Comparing cognitive abilities between older and younger people fails to control for the inputs - behavior, experience, etc. Try the same inputs (using some big generalities):
* Exploration: Younger people love to explore, even just for exploration's sake, and are also compelled to try things - and they also fail. Exploration is their mode, because so much of the world is new to them, because doing something new and innovative is socially admired, and especially because so many major changes happen - leave home, serious romantic relationships, first job, etc. A lot of that happens, ready or not.
* Learning: Similarly, younger people are compelled to learn lots of very challenging things, whether they want to or not; they are compelled to use cognitive skills that they are uncomfortable with. Their job is to learn, daily, for 12-16+ years. Remember school? Remember your early years at work, when you had little choice of what you did? Remember struggling with all those things?
* Playing: Young people love to play and are socially admired for playing better and more creatively.
What, you're past all that? Nobody is going to make you study things you're not interested in? Don't want to make any big changes? Dignity too big to play? Ego too big to explore and to fail? When you're older, you can say no, and 99.99% of people (I think that's about accurate) take advantage of that and refuse to do, or even talk about, things they aren't already comfortable with. Does all this sound too hard? Then don't complain about losing those skills.
I think a big part of the problem is the same that affects CEOs - there is nobody to hold them to account.
This makes so much sense. I've been programming every day since I was in my twenties and there are definitely some concepts that seem much easier for me to get my head around now (I'm in my 50's) than earlier.
Right now I'm reading through a college textbook on the biology of learning and memory with ease and good retention. Never got this deep into any subject in my school years.
> I've been programming every day since I was in my twenties and there are definitely some concepts that seem much easier for me to get my head around now (I'm in my 50's) than earlier.
Same same.
I figured this is because I have less energy, but a little more wisdom: I have a much broader understanding of related concepts, so things click a lot faster.
I'd like to remind everyone that learning to play music, learning to dance, or learning sports that require complex coordination also count.
Oh, and a question at the back of my mind: wouldn't using AI to minimize the time we all spend in the struggling-to-figure-something-out zone lead to earlier decline on a massive scale?
As someone who plays a lot of board games — particularly heavier board games — and hopes to do even more of that in retirement, I’m wondering if/how that is helping/will help fight cognitive decline.
I can imagine at the very least it won’t hurt, and intuitively it makes sense. But I’m not sure studies have been done specifically to understand how board gaming — or the problems being solved with board gaming - helps with cognitive skills.
Curious if others that are closer to this field have thoughts.
I love me some board games, but I prefer depth and decision space to complexity -- and the industry is dominated by stupendously complex beasts full of unnecessary mechanics that slow things down or extend setup without adding too much. A perfect example is TI4's expansion Prophecy of Kings, nearly all of which I despise for bloating a beautiful base game. I'm also always flabbergasted by how starved and railroaded I feel in games like Dune Imperium or Cole Wehrle's collection. Despite a wealth of mechanics, my choices are few and far between.
Complexity has its place, especially for engine builders like Terraforming Mars where complex interactions are the point. Many designers seem to be throwing in the kitchen sink arbitrarily. We're in a "bigger is better" paradigm.
Older coders/technical folks tend to have more wisdom than raw compute (relative to younger coders who may have more raw compute than distilled wisdom.) Wisdom takes a more reliable and more efficient path than raw compute.
Both raw compute and wisdom will be eventually replaced by AI, but "deep wisdom" is largely held in the body, in the way we react viscerally to things, which AI as it is envisioned today does not factor in at all. So we still have a refuge in the wisdom stored in our body memory.
As an older developer who lately has been pairing with early career developers, I've been noticing lately how often wisdom comes into play. It feels like close to a daily occurrence where I suggest something is the cause, then later I'm asked how I knew that was the right thing to investigate, and the only response I have is that it's almost always the culprit.
Recreational travel is the only thing that routinely works for me in terms of slowing down time and fully engaging my brain. It's something I can incorporate into my life multiple times per year and it guarantees a massive amount of new stimulation (assuming travel to new and interesting places). Even the most rudimentary trip to Europe will have you grappling all day long with a different language and culture and environment in ways that are completely taken for granted in our day to day lives.
There's lots of things that can make an even bigger impact, like moving to a new place or starting a new career or school, or a new relationship. But those are things that sometimes only happen a handful of times in our entire lives.
Everything else I find eventually becomes routine, no matter how stimulating it can be at the start. Not that we shouldn't try! Some stimulation is a whole lot better than none, and I have a terrible feeling that many people get little-to-no stimulation for weeks and months at a time (beyond a new TV show or podcast or political drama).
I think there's a valid concern about cognitive fatigue. It could be mentally exhausting to constantly "exercise" our brains just to maintain cognitive abilities as we age!
Maybe AI could be our mental gym buddy here - not replacing our thinking but offering just the right level of mental challenge to keep us sharp without burning us out. Picture an AI that knows when to push your intellectual boundaries and when to back off based on your energy levels.
And Neuralink-style brain interfaces? They could be like cognitive training wheels - gently supporting neural pathways while letting us do the actual pedaling. Instead of "downloading knowledge" (which sounds exhausting in its own way), they might subtly enhance natural learning processes or help maintain neural connections that would otherwise weaken with age.
The goal shouldn't be turning our golden years into endless mental marathons, but rather finding that sweet spot where cognitive maintenance feels engaging and enjoyable rather than like another chore on the to-do list!
> Two main results emerge. First, average skills increase strongly into the forties before decreasing slightly in literacy and more strongly in numeracy. Second, skills decline at older ages only for those with below-average skill usage. White-collar and higher-educated workers with above-average usage show increasing skills even beyond their forties. Women have larger skill losses at older age, particularly in numeracy. [emphasis mine]
So, it seems like workers with above-average usage of literacy and numeracy continue to increase their ability, while those in fields that don’t emphasize those would need some kind of mental “exercise”.
(I also note that some commenters here are rushing to add more cognitive work to their daily routine through additional studies, but I wonder if they’d be better off focusing on commonly neglected areas like physical activity.)
I'm skeptical. I was in my 40's before I graduated from college. Before that, I did some serious electronics with a background from my local community college, worked production lines, and taught myself assembly language when I was engineering my first microprocessor-based design at work. Then I couldn't get re-employed years later, a lot of potential employers simply not believing my resume content because my 'formal education' was lacking, so I went back to school, got my BS in 1999 and my MS in 2006, then continued working on personal projects and learning new coding languages on my own, since by then nobody wanted to take a chance on hiring an 'old' man. Their loss.
> I couldn't get re-employed years later, a lot of potential employers simply not believing my resume content because my 'formal education' was lacking..
I am 53 years old. I don't have a college degree. I have never been unemployed and have had good software development jobs all my adult life, including now.
It is possible and likely that your lack of a degree was not the issue.
Idk, at 29, 30 and 31 I became significantly smarter - it just had to do with things I was intensely interested in. Things that could hold my focus just no longer matter. Fortunately I'm interested in engaging things that are hard.
For myself, while the learning curve is "longer" as I've gotten older, it also shoots sharply upwards as the time spent on the skill acquisition increases. Age has a magnification effect at the tail end.
I'm in my late 40s and I do pick up new technical skills a bit slower than younger folks. But because I have a lot of experience, I'm able to more quickly grasp various contextual aspects of those skills: how/why they are useful, how they compare to previous skills that tried to solve the same problem, the hidden costs and implications, etc. These matter a lot in the practical, everyday application of skills.
I find that younger people have a really hard time with those contextual aspects, or they don't think they're that important... until they discover they are.
I'd like to see how much of the decline is correlated not with age but with Parent Brain.
The mental energy occupied by and spent with parenting is palpable, not to mention long-term continued stress, physical, mental and emotional exhaustion. I wouldn't be surprised if having kids (which is of course correlated with age) is much more of a factor than age itself.
Like many I preferred the "internet" before it was this. The War Games-like setups, mystical and empowering, and the wonder of the future, how information would free us, things we could never imagine. All to wind up with people staring at Instagram while driving and running into me and my dog, and the world/companies like Apple and Samsung shoving AI elements down our throats.
So I plan my retirement, and it involves unplugging from all this. Then what? Live in a small cabin in the remote woods? Not sure that would go well.
And yet the ranks of successful founders are dominated by people in their mid-forties. Perhaps there is something more involved with social function than pure cognitive skill?
Does that imply that it is also partly a matter of character traits? As in: use empathy, become empathetic; stay in a non-empathetic environment, and your brain degrades you into a sociopath?
I hung out with a friend recently I had not seen in close to a decade. He was at one time my closest friend and seeing him was kind of uncomfortable and enlightening. I saw sooo much of how I used to talk and act still in him that it really had me wondering how much of that I'd gotten from him versus the reverse.
I think it has more to do with getting desensitized to things the more you're exposed to them. With age you get more and more exposure to everything emotional and lose the strong reactions.
Add to that some frustration from not being able to keep up with things, health issues, no one "young" having time to hang out and your friends dying all the time and I'd be grumpy too. You were once a stallion taking care of everyone and now you worry about falling in the shower because you occasionally lose balance for whatever reason. And you know it'll hurt like a bitch, you'll break something and it won't heal for a year. It's quite humiliating.
I am definitely grumpy. What makes me grumpy is the fact that society keeps banging its head against a wall for no good reason.
There is everything there for growth, and yet I see none. I get very tired of knowing well what the boring, selfish reaction of the person I encounter is going to be. I am sure I do the same thing - and don't change much compared to what is available to me to make changes. I do not lead by example at all the way I would like.
Nonetheless, what makes me grumpy is lack of change, not the superficial appearance of change with which technology distracts us. Moral growth would be so refreshing to see, but I see none of it - despite virtue signalling as a veneer from all parts of society.
Said more colloquially, a lot of older people just grow tired of all our bulls*t.
As with most research around our scientific understanding of intelligence, I assume this only scratches the surface.
There may be something to your comment.
There are trajectories of personality traits over the life span, though I would hesitate to extrapolate them from the trajectory of cognitive abilities. One of the known life-span emotional/personality trajectories is positivity bias: older people tend to be more positive. It is sometimes framed as negativity avoidance; that is, older people tend to ignore negative things more often.
I have wondered this too as a person who has attended a regular (non-honors) Calculus II course at a fairly top-rank private university and then again at a community college.
From what I remember, the university course also had some rote exercises for homework so it isn’t like everyone is only focusing on working the trickier exercises.
This also reminds me of the story Donald Knuth has around working every exercise in the book for a calculus class.
Large universities are focused on research, and they incur a lot of expenses due to administrators' egos (build build build), the number of administrators, and the range of microstate services offered, like their own health care system and mental health counseling (a major thing in universities now). Community colleges are focused on teaching.
> their own health care system and mental health counseling (a major thing in universities now)
Which would be a non-issue if the US simply had single-payer or universal healthcare.
the US loves externalities, or not understanding them
I had a similar experience -- took physics at a community college when I was in high school. The 'up-side' of the overproduction of PhDs is that many people from elite backgrounds end up teaching at community colleges.
The only negative for me was that the students were pretty checked out.
> The only negative for me was that the students were pretty checked out.
I didn't put a lot of thought into where I went to school but if I could do it over again this is something I would have considered when I applied. The school I ended up at did not have many serious students. It was a night and day difference taking courses with even one or two students who were similarly engaged with the material, but most of those students ended up transferring to better schools after a year or two.
You also run into the issue later on that the people you went to school with wash out of industry (or never work in it to begin with) at much higher rates in comparison to those who went to more serious schools.
Exactly the same experience. My AP Physics teacher in high school was far better than my university professors.
UT is research focused. Depending on the department, they make the professors teach classes, which is often not aligned with their interests at all. Sometimes I think they are actively trying for bad reviews from students, to discourage the university from assigning them a course load.
I had the same experience where community college teachers were vastly superior to my university teachers. Vastly.
It's possible that people who end up as teachers in community colleges actually like teaching and see it as a challenge they are willing to tackle.
Whereas in big universities the professors really don't much want to teach; they want to accomplish scientific breakthroughs themselves.
One possible reason is that the community college teachers -- at least in my state -- are unionized. This makes teaching as a career possible.
community colleges are... like... there for the community. and you feel that community.
a lot of big universities have people there for research. there is money to be made, grants to be given, and degrees to be minted. and you can feel that too.
source: got out of the military and went to one, then the other.
I went to a somewhat highly regarded (not MIT or CalTech tier) tech school, and then to a state university.
The tech school considered it a boast that it had more graduate students than undergrad. It was clear where the professors' emphasis was. I recognize the lecture halls where you couldn't ask questions, and the barely-anglophone instructors. (Everyone in the EE department, in particular, seemed to come "fresh off the boat" from China bringing precious little English knowledge with them. The prof for my introductory EE course mumbled on top of it.)
Then I went to state school. Ho-lee shit. Complete difference. The bad profs were incompetent chucklefucks who couldn't cut it in real academia. The good profs actually cared about teaching undergrads.
I learned a lot about choosing a college -- a few years and a few tens of thousands of dollars too late.
I had this experience too! My math professors in community college were much better than at my significantly more expensive university.
Community Colleges are the gems of US undergraduate education
I hope it went well! I am in my fifties and enrolled in a master's degree program for pure mathematics about 2 years ago (I don't need the degree, so I'm just taking all the classes they offer and am in no hurry to graduate). It definitely took some time to get my brain sharper, but I am better each semester.
I hope people take away not the negative side of the article (the brain slows down) but the positive side: the brain gets better with use. It's uncomfortable. I can churn out programs as complex as ones I've already written, and sit through review meetings and planning meetings, without much effort. But solving PDEs reasonably quickly and accurately I cannot do, or have not done without a great deal of practice. It's uncomfortable in some weird mental-yet-almost-physical sense. But I'm sharper in everything else I do.
One interesting thing about software as career followed by math classes is that there's no compiler - you can type any janky thought into LaTeX and if you don't detect that it's bogus, nothing will, until you show it to a professor.
Also, the information density of maths notation is way higher than (good) code. We want code to be readable by someone who doesn't know it; a lot of math seems to be readable only when you're already about 80% familiar with all the prereqs. So there's no skimming and then hitting compile/test/run (or whatever validation you do). It's typing letter by letter and making the mental effort to actually see and decipher each symbol (at least for me at my current stage; I'm trying to do novel research, but my demonstrated understanding of the details of the previous research is embarrassingly low).
Also, weirdly, I still have the same fear of professors that I did as a young person. I manage it better with my decades of maturity (really) but it is still a part of my social interactions.
> But being able to solve PDEs reasonably quickly and accurately, I cannot, or have not without a great deal of practise.
No one - young or old - does well in math without a great deal of practice :-)
A related ancient XKCD with a slightly different take: https://xkcd.com/447/
> One interesting thing about software as career followed by math classes is that there's no compiler - you can type any janky thought into LaTeX and if you don't detect that it's bogus, nothing will, until you show it to a professor.
The formal proof community is very interested in exactly this problem! It's not my specialty, but I believe that Lean (https://en.wikipedia.org/wiki/Lean_(proof_assistant)) is one of the very active communities.
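To make the "compiler for math" idea concrete, here is a minimal Lean 4 sketch. This is a toy example of my own, not from the thread; Nat.add_comm is a real lemma in Lean's core library:

    -- Lean type-checks the proof term: if the reasoning is bogus,
    -- it fails to compile, unlike janky LaTeX that nothing validates.
    theorem swap_ok (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b

    -- A false claim is rejected at "compile time":
    -- theorem swap_bad (a b : Nat) : a + b = a :=
    --   Nat.add_comm a b  -- type error: the statement does not hold

The point is exactly the feedback loop described above: the proof checker plays the role the compiler plays for code.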
I've done some intro to Lean things, but no one in the maths program is into Lean, so I'm just focusing on the math side. Terry Tao is big on the idea of Lean tho, and on combining LLMs with Lean.
The information density is incredible. A 2x2 matrix (Jordan constants) containing enough information to produce a slice of a hyperbolic paraboloid. Leaves me mesmerized...
It's funny, at the end of each lecture I just want to yell... "NO! Don't stop! I must see how this ends!"
Very similar to when I stop our children's movie and tell them to go take a bath.
I guess nobody knows all of math and it is constantly a learning process, but things like "Jordan constants", where proper nouns get attached to thousands upon thousands of math concepts, symbols, theorems, and approaches, just make it even harder to memorize the whole shebang. Fascinating but sometimes overly complex.
I like your point about feedback. That's how I describe my difficulties with proofs, too. There is no way of knowing a proof is right without knowing it's right. (Or maybe I am just missing the point)
I will look into Lean, which is mentioned here.
Good luck! You can do it! I started doing statistics classes three years ago when I was 45, continued doing a MSc degree, which I finished successfully a few months ago. I am now looking into doing a PhD. This is more fun than I ever imagined (fair enough: I was a teenager when imagining it).
Good luck! You should check out Math Academy, it's more effective/efficient/cheaper but also a good supplement since it's accredited.
I recently turned 40 myself and I'm working through their Foundations courses (made to help adults catch up) before tackling the Machine Learning and other uni courses.
Have you found Math Academy better than just prompting ChatGPT/Claude/etc. to be a math tutor?
I'll tell you my experience as someone who's been using Math Academy for the past 6 months.
Math Academy does what every good application or service does: make things convenient. That's it. No juggling heavy books or multiple tabs of PDFs. Each problem comes with a detailed solution, so getting one wrong doesn't mean hunting around the internet for a hint about your mistake (in the pre-ChatGPT era, of course, not getting something correct meant writing out MathJax on Stack Exchange).
> better than just prompting ChatGPT/Claude/etc
The convenience means you are doing the most important part of learning maths with the least friction: problem solving and practice. That is something an LLM will not be able to help you with. For me, solving problems is pretty much the only way to mostly wrap my head around a topic.
I say mostly because LLMs are amazing at complementing Math Academy. Any time I hit a conceptual snag, I run off to ChatGPT to get more clarity. And it works great.
So in my opinion, Math Academy alone is pretty good. Even great for school level maths I'd say. Coupled with ChatGPT the package becomes a pretty solid teaching medium.
Yes, much better. ChatGPT/Claude/etc. are useful when I want extra explanation to help connect the dots, but Math Academy incorporates spaced repetition, interleaving, etc. the way a dedicated tutor would, in a better-structured environment/UI (a toy sketch of the scheduling idea follows after the links below).
Their marketing website leaves a lot to be desired (a perk since they are all math nerds focused on the product), but here are two references on their site that explain their approach:
- https://mathacademy.com/how-it-works
- https://mathacademy.com/pedagogy
They also did a really good interview last week that goes in depth about their process with Dr. Alex Smith (Director of Curriculum) and Justin Skycak (Director of Analytics) from Math Academy: https://chalkandtalkpodcast.podbean.com/e/math-academy-optim...
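To give a flavor of what spaced repetition means mechanically (the sketch promised above), here is a toy SM-2-style scheduler in Python. This illustrates the general technique only; Math Academy's actual model is its own and surely more elaborate:

    from dataclasses import dataclass

    @dataclass
    class Card:
        interval: float = 1.0  # days until the next review
        ease: float = 2.5      # growth factor for the interval

    def review(card: Card, quality: int) -> Card:
        """quality: 0 (total blank) through 5 (perfect recall)."""
        if quality < 3:
            card.interval = 1.0          # forgotten: relearn from scratch
        else:
            card.interval *= card.ease   # remembered: space reviews out
            # nudge the ease factor by how hard the recall felt
            card.ease = max(1.3, card.ease + 0.1 - (5 - quality) * 0.08)
        return card

Items you keep getting right drift toward rare reviews; items you miss snap back to daily practice.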
The second link really impressed me, I'm tentatively sold on (and excited for) their approach. Does anyone know of any other accredited programs similar to Math Academy, but for other subjects?
Anything in the soft sciences, or biology/organic chemistry, or comp sci. I know there are a lot of courses for the latter especially, but I'm looking for accredited ones.
I used an early e-learning platform, not because I wanted to but because I was one of its developers. I didn't develop the course content, just the technical implementation.
What I didn't like about the content is that I often had questions about it but there was no one to ask them of. Whoever wrote the material was no longer around. It's a frustrating feeling when you can't really trust that what you're studying is factually correct and not misleading.
I assume AI will bring a huge improvement in this respect.
Not OP, but I have found MathAcademy to be infinitely better. I really liked the assessment portion, which levels you and gives you an idea of where you are at present. As someone who graduated with an engineering degree a while ago, there were things I realized I didn't know as well as I thought I did and that I probably would not have prompted an LLM to review.
Math is something that should be taught in an opinionated way with an eye toward pedagogy. Self study with GPT is an excellent tool in math, but only for those who have enough perspective to know which directions to set out on. I don’t think anybody who doesn’t know linear algebra should be guiding their studies themselves.
Given my ChatGPT and friends experience has been one of overwhelming frustration due to incorrect information, I would say Math Academy is in an entirely different galaxy. ChatGPT is great if you want to learn that pi is equal to 4.
b-b-b-but the next supercalifragilistic ChatGPT version will be able to tell you that pi is between 3.1 and 3.2. that will be a Quantum improvement, asymptotically close to AGI.
at least, i think i heard alt samman say so.
you plebs and proles better shell out the $50 a month, increasing by $10 per day, to keep dis honest billionaires able to keep on buying deir multi-million dollar yachts and personal jets.
be grateful for the valuable crumbs we toss to you, serfs.
I haven't used it, but there was a big thread about it yesterday: https://news.ycombinator.com/item?id=43241499
Keep making those pushes! I was a non-traditional graduate student because around 10 years ago I got very serious about going for my doctorate. I literally scheduled times with my friends to watch Khan Academy videos on upper-level maths and spent time practicing those skills. Then grad school is just one intensive learning session.
Years of martial arts ingrained that sense of being a life-long learner. I was taught the mantra of "Progress comes to those who train" and "Practice makes permanent" and even though those phrases were focused on learning to beat someone up, I've carried them on into other parts of my life.
Good luck! I'm over 40 and just had my midterm for General Linear Models (statistics + linear algebra).
This YouTube playlist was invaluable for me: https://youtube.com/playlist?list=PLmM_3MA2HWpYYo7QExaRvor_u...
Appreciating the link!
Congrats! It is never too late to be doing this type of study and work.
I'm doing something similar: I just turned 50 and have been taking graduate ML classes where I work (at Carnegie Mellon). When I finish the graduate certificate program in generative AI and LLMs that I am enrolled in, I will be only two semesters away from earning a full masters degree.
Good luck! I'm 36 and still hoping to master Digital Signal Processing at some point even though I find the math extremely difficult.
Great for you; that's really fantastic, and by posting about it, I hope you make a lot of other middle-aged people comfortable with pursuing education.
> My goal is 5-8 more classes for a second degree in math (major).
Why not get a masters degree?
Edit: answered here: https://news.ycombinator.com/item?id=43282629
> Wish me luck!
You don't need it. :)
A masters in CS doesn’t interest me as much as pure math. Maybe when I’m 50…
> graduated college with major in computer science and a minor in math.
Me too. High five!
> My goal is 5-8 more classes for a second degree in math (major).
But why? Wouldn't it make more sense to go for a master's in computer science? Are you going to use it for work? Otherwise, aren't you going to "lose it" anyways? Also, is your job paying for the degree or are you paying out of pocket?
I’m self employed and this is “for fun.” My wife is a professor in another department and I’ve got a tuition waiver.
An academic pursuit can be done for sake of knowledge. Forcing your mind to constantly flex is never a bad idea.
It could be both, though.
Have one or more kids. Then you get to keep your edge teaching them linear algebra 20 years later...
Off-topic but this is a pretty interesting study guide format.
Maybe it's standard in lots of places, but I've mostly seen study guides where they just list a ton of topics and that's it.
It's been twenty years, so my opinion is skewed and my memory is quite faded; however, I've got opinions on the guide and the class in general.
The main thing is there are no surprises or tricks. The exams are straightforward and EXHAUSTIVE. I do all the assigned homework twice. Once when we cover the material and again before the exam. Let’s hope that strategy pays off again.
I'm pretty sure it will. Sounds like you are putting in some real effort, so I don't see why you won't do just fine.
Good luck!
Good on you! Of course, even after 40, it's still not the end of the world if you don't get what you're hoping for the first time, but I hope it goes well.
Good luck! Are you in Winona, too? I live near the campus and have been considering taking some classes there, this was a nice surprise to see. :-)
Yes! I’ll send you an email.
No doubt, a lot of us are greatly relieved to read this.
> My goal is 5-8 more classes for a second degree in math (major)
Do colleges usually let you do this when you're adding to a degree you earned 20 years ago?
This college requires something like 30 credits taken at the institution to award a degree. That's somewhere between 7 and 10 classes (a mix of 3/4 credits each).
I will be 40 in 2 years and I also plan on going back for a master's in something :). Maybe CS, maybe business, who knows.
Best of luck on your pursuit.
Kudos! Curious how you got back into classes? If you are getting another degree, sounds like you went back through admissions?
Yes, through admissions. Getting a degree in math, maybe... depends on how much stress this adds to my life. If I were retired I'd just take a full load, but raising a family and running my business I can only take it one class at a time.
Your school allows you to add a degree retroactively like that?
We will see. A degree is just the ends to justify taking math classes. My goal is to learn, if they give me a degree that’ll be a bonus!
How did it go?
I didn't ace it, but I knew immediately what I had done wrong as I rode my bicycle home. I kept checking my linear transformation matrix and the eigenvalues didn't compute... Looked again at the TI-89 when I got home and realized I had swapped the orientation on the Jordan constants. I wrote all the equations out, so maybe my professor will have mercy on me. Oh well, another case of elevator wit - https://en.wikipedia.org/wiki/L%27esprit_de_l%27escalier
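If you want to run the same sanity check away from the TI-89, numpy makes it quick. A toy 2x2 matrix with a repeated eigenvalue (a single Jordan block), not the matrix from the exam:

    import numpy as np

    # One 2x2 Jordan block: repeated eigenvalue 2
    A = np.array([[2.0, 1.0],
                  [0.0, 2.0]])

    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)  # -> [2. 2.]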
Is calculus included in your classes?
This sounds like a line from a Silicon Valley type TV series used to establish "character am smart"... and I love it.
Ha! I was so nervous writing it because I am not qualified to use the smart math words!
Why take classes if you can learn everything with the latest LLMs? Unless you actually need a formal degree in math?
I've used an LLM for tutoring, but it doesn't replace the experience of biking across campus, ordering a coffee, unpacking my TI-89/iPad, cracking jokes with the professor and other students, paying attention, taking notes, and having to remember the material until exam day. This process is culture, and it's honing my mental blade. Then, as a solo-entrepreneur, I go home and use Cline+Sonnet to hack on a few side projects. These two processes complement each other greatly. Like I've mentioned in other replies, this is for "#2 fun" and to see if the "old guy (me) has still got it."
Don’t you feel like you don’t belong there with people half your age?
Can you provide an example where a top llm (sonnet 3.7, grok-3, o1, gpt-4.5) hallucinated a linear algebra answer?
Oh, I did... but it was more of a parsing error.
Prompt:
I have padlocks that I use to lock up my tools, or my bike, etc. The problem is, I often go several months without using some of them and forget the combinations. So, I decided to write down their combinations, but then I always lose the sheet. Being the math geek that I am, I decided on the following solution. I choose a 3 × 3 matrix and multiply this matrix by the combination and write the result on the back of the lock. For example, on the back of one lock is written “2688 − 3055 − 2750 : Birthdays,” indicating that the 3 × 3 matrix that I chose for that particular lock is the matrix whose rows consist of the birthdays of my brothers and me (from youngest to oldest). My brother Rod was born on 7/3/69, I was born on 7/28/66, and my older brother was born on 7/29/57. What is the combination of the lock?
Now, technically the LLM didn't quite know how to parse "2688 − 3055 − 2750" and ran the calculation with "[2688;-3055;2750]" and produced a response of, "These values are clearly not typical lock combinations, which suggests a potential issue with the encoding process."
Smart, kind-of. I reran with a more explicit prompt and it calculated the correct combination.
Overall though, I'm impressed with using ChatGPT as a linear algebra tutor. I wouldn't hesitate to use it in the future.
I just tried your prompt: o1, gpt4.5, gemini 2 pro solved it correctly (21-19-36), sonnet3.7 and grok3 failed because of the parsing error you described.
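For the curious, the arithmetic checks out. A quick numpy verification, assuming the dashes in "2688 − 3055 − 2750" are separators rather than minus signs:

    import numpy as np

    # Rows are the birthdays (month, day, two-digit year), youngest to oldest
    M = np.array([[7, 3, 69],
                  [7, 28, 66],
                  [7, 29, 57]])

    encoded = np.array([2688, 3055, 2750])  # the numbers written on the lock

    combo = np.linalg.solve(M, encoded)     # solve M @ combo == encoded
    print(np.round(combo).astype(int))      # -> [21 19 36]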
Break a leg and Good luck!
Cool, good luck!
Good luck!
Nice, good luck
I'd like to see a study on how the acute stress of living in survival mode for a lifetime affects the brain by using it too much for the wrong tasks.
The last 25 years have been particularly painful for people like me who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel. When I look around at the sheer computing power available to us, I'm saddened that people with wealth, power and influence tend to point to their own success as reason to perpetuate the status quo. When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation. So that we could focus on getting real work done in the sciences for example, instead of just making rent.
I've been living like someone from movies like In Time and The Pursuit of Happyness for so many decades without a win that my subconscious no longer believes that the future will be better. I have to overcome tremendous spidey sense warning signs from my gut in order to begin working each day. The starting friction is intense. To the point where I'm not sure how much longer I can continue doing this to myself, and I'm "only" in my mid-40s. After a lifetime of negative reinforcement, I'm not sure that I can adopt new innovations like AI into my workflows.
It's a hollow feeling to have so much experience in solving any problem, when problem solving itself will soon be solved/marginalized to the point that nobody wants to pay for it because AI can do it. I feel rather strongly that within 3 years, mass-layoffs will start sweeping the world with no help coming from our elected officials or private industry. Nobody will be safe from being rendered obsolete, not even you the reader.
So I have my faculties, I have potential, but I've never felt dumber or more ineffectual than I do right now.
>I'd like to see a study on how the acute stress of living in survival mode for a lifetime affects the brain by using it too much for the wrong tasks. The last 25 years have been particularly painful for people like me who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel.
I suspected something very different based off the first sentence. Like someone living in a high crime area and trying not to get dragged into it. Or constantly struggling with poverty, food insecurity, etc.
I remember seeing a Ted talk on poverty impacting cognitive abilities.
A quick search returned some articles. Here’s one: https://pmc.ncbi.nlm.nih.gov/articles/PMC7525587/
A Ted talk: https://www.ted.com/talks/rutger_bregman_poverty_isn_t_a_lac...
The other comments are great but I wanted to touch on why I seem to be struggling in a reality that generally provides enough today.
It's because, as hard as it is to believe (especially for young people), life these days is decent despite the status quo, not because of it.
In other words, had we continued on the trajectory we were on before loosely 1980 and trickle-down economics, we could have had moonshots to solve each of humanity's problems in the order of need rather than profitability. We could have consulted academics to invent 25% efficient solar panels for under $1 per watt and had them installed on over 50% of homes by 1990. We could have invented lithium iron phosphate batteries at that same time and had $10,000 electric cars, because they simply aren't that complicated. We could have had blue LEDs, and WiFi, and flatscreens, and everything else we enjoy today, decades earlier. Stuff that doesn't even exist right now but should, like affordable public buffets, mass transit in small cities and single-payer/public healthcare. Robotic hydroponic greenhouses. Living closer to work (I know, inconceivable).
Instead, I had to watch everything roll out at a glacial pace under a risk-averse private system that allowed the Dot Bomb to happen around 2000. That defunded nearly all pure research and outsourced the jobs that provided a healthy work/life balance. That marginalized eBay businesses and online advertising and the resale market so that influencers and the ultra-wealthy could capture all of that low-hanging fruit while the rest of us have to work. And boy did I have to work, at jobs that sapped every bit of my passion, motivation and self-determination, leaving me too exhausted to pursue my side hustles fast enough to get to market before someone else beat me to it or a deregulated recession wiped me out again.
When you've watched progress flounder for as long as I have, it becomes obvious that sabotage is where the money's at. The powers that be denied innovation at every turn, in order to prop up aging industries centered around a 20th century fossil fuel economy that still dominates our lives today.
And now suddenly AI falls in our lap because a billionaire finally decided to fund it. Now you see what happens with a moonshot. Things change so rapidly that we're left reeling with their implications. The luddites come out. Politics devolves. Time runs backwards to the 1950s, the 1940s, the conditions that fanned the flames that turned into world wars.
Now they gleefully say "see! we should have kept stifling innovation! ignorance is strength!"
It's.just.so.exhausting.
I find that people fall very strongly into 2 camps, which could be loosely mapped to left and right: those who suffer knowing what could be, and those who defend what is to deny their own suffering.
You're not the only one that has had those kind of feelings, and I really relate to the movies you referenced.
Try to remember, AI is a tool, not a solution, and there will always be new problems. There's a strong case that unlike every other time people said that technology will kill all the jobs, this time it actually will. But a helpful framework comes from Clayton Christensen's Innovator's Solution (not the much more famous Innovator's Dilemma) - whereas a business has well defined needs that can be satisfied by improving products, customers (i.e. people) have ever evolving needs that will never be met. So while specific skills may lose value, there will always be a demand for the ability to recognize and provide value and solutions.
What makes a labor market for agents that recognize problems and provide solutions special or different from markets for other kinds of labor? If AIs get to a point where they dramatically outperform humans in other forms of labor, why not in this one?
I think some humans will be doing it well enough to keep themselves afloat the rest of our lifetime, and some will get fabulously rich building products as a one-man operation leveraging AIs. But there will be far more people failing at it. It will be like Youtube creators or Instagram influencers where there are few winners who take virtually all the rewards.
compared to the broadcast era aren't there way more winners -- with smaller pieces of the pie -- nowadays?
it's still a Pareto distribution, I'm sure, but mega-stardom kinda died and was replaced by all these mini-stars, as far as I can tell. I'm not sure it supports your hypothesis.
Sure, there is some truth to this.
I'm not really in touch with other genres, but I like to watch chess videos/streams on Youtube and Twitch. The vast, vast majority of views and revenue are captured by about ten people.
I like those people too, but I've also watched a lot of smaller acts, even some amateur players not much stronger than me. So I get those recommendations, and I see their view counts. They aren't making anything at all.
There are other people who have some followers, but even 50,000 followers would be a dream for most people doing it and they will make next to nothing from that. I'd guess there are at least 30x the number of strong, titled players in the 50k group as there are in the 1MM+ group. These are all people who were chess prodigies as kids, won every scholastic tournament in their state, took gap years or went to colleges that let them basically major in chess, travelled the world for tournaments, with awe-inspiring skills, and they are not making anywhere close enough to live on.
And the thing is, I think software might be even tougher in twenty years. It's hard to get people to change from a system they use to another thing, much harder than recommending a new face on YouTube.
Maybe someday they will. But the current run of LLMs are fantastic at regurgitating and synthesizing existing knowledge, and getting better all the time, but not so good at coming up with new ideas. As long as you keep to the realm of what is known, they can seem incredibly intelligent, but as soon as you cross that boundary there's a clear change - often to just meaningless bullshit. So, I personally don't think we're going to be outsourcing idea generation to LLMs (or AI in general) anytime soon. Though to be fair, I'm only about 75% confident in that, and even so, it doesn't mean they won't be hugely transformative anyway.
Pessimistically but realistically, it doesn't matter if AI will perform better; it just matters if it's cheaper. A historical example is all the offshoring of mission-critical code starting in the late 90s and early 00s. The code that came back was sub-par much of the time, particularly from the cheaper shops, but the executives got their bonus for saving money and bailed out. The new executives are now in charge of fixing the disaster of a codebase that they were left with.
I think history will rhyme with the offshoring trend but with AI this time.
> When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation.
I was inspired to get into programming by Star Trek in the early 2000s because I thought I could contribute to automation that would lead towards that kind of society; much like you've stated here. Some will say we're naive and unrealistic, but all the ingredients for having society function in this way are attainable with a bit of a cultural shift. I was fine with the idea that society could take baby steps towards it, but it seems the last 25 years have been a mixture of regressing and small incremental improvements to things that don't contribute towards that goal. Just like you, my expectations have been utterly destroyed and my outlook for the future is grim.
> but all the ingredients for having society function in this way are attainable with a bit of a cultural shift
It's awfully naive to think that you can solve the information problem with a "small cultural shift". Statements like this strike me as deeply ignorant of economics and the history of attempts to plan society. People are messy and their needs are hard to predict in any meaningful and responsive way that respects their preferences.
Imagine answering the question of how many washing machines we should make. Assuming you could figure this out, you need to consider the different kinds of washing machines people may want and need. Apartment dwellers need small, efficient ones, and people with a lot of kids want big ones. This in turn has bearing on the number of motors you have to make, the feet of copper wire you need to produce, plastics, rubber, and on and on. And don't forget that's just washing machines.
Now you need to figure out how to get these washing machines to people.
You just can't plan and automate everything; it's far too complicated.
People came up with the information problem at a time when our ability to collect information and our computing power were several orders of magnitude inferior to what we have today. I don't think it's as big a problem as people think. Sure, it was true when every single person didn't have a device capable of instantly sending any kind of information to and from any location on earth, and when we didn't have the computing power to process that amount of information coming in 24/7. Now we do, so I believe a well-functioning planned economy would be a possibility today, although it'd be a massive project to build such a thing. Even with limited technology the Soviet Union functioned for multiple decades and was one of the most developed nations on earth.
He's talking about automating labor, not economic planning.
It's the same thing. Anything that could be automated but isn't is that way because it isn't cost effective or because there is insufficient capital to invest. These are resource allocation issues. You can't just wave that away.
Hah, I wish that were the case. A whole lot more things would be automated if that were true.
Automation requires resources, but it also requires vision, cooperation among affected parties, a workable regulatory framework, maturity and availability of required solutions, and availability of competent integrators. There are all kinds of reasons something remains manual besides mere resource availability. And all those things change over time.
There's not much you can do about most of those things, but becoming a programmer and working to develop better solutions is one way to make a difference. Even if you don't work directly in automation, your work can trickle down to people like me who do concern themselves with automated sewing and strawberry harvesting.
What I mean by resources is the things you mentioned inclusive of vision.
I picked those two examples because you can literally build a robot to do it, but it is either unworkable in the case of the shirt or financially not viable like the strawberry robot.
Using your model, no technological development would ever occur, because the fact that something had not happened yet would indicate that it could not possibly happen due to a lack of resources. This is the anecdote about two economists walking down the street and refusing to pick up a $100 bill, because everyone knows that in an efficient market, someone would have already picked it up.
At some point the resources necessary for development are there but the technology itself has not actuated. This invalidates your original claim that: "Anything that could be automated but isn’t automated is because it isn’t cost effective to do so or there is insufficient capital to invest."
No it isn't. Using a script to automate a process frees me from having to carry out the process manually. It has nothing to do with central economic planning.
Most things in the world require physical processes. Automating those is quite a different task, and requires resources. How would you go about automating sewing a shirt? How about picking strawberries?
Again this is tech hubris and a lack of understanding of economics and history.
> Most things in the world require physical processes.
No one ever disputed that. The principle still holds if we apply this logic to physical processes; by automating or reducing the labor necessary to conduct a physical process, I can enjoy the benefits of the process without having to engage in the labor of the process.
> How would you go about automating sewing a shirt?
https://www.youtube.com/watch?v=oeSu9Vcu0DU
> How about picking strawberries?
https://www.youtube.com/watch?v=H2gL6KC_W44
https://www.youtube.com/watch?v=M3SGScaShhw
https://www.youtube.com/watch?v=OyA9XnW6BV4
To respond to the edit you made to your comment:
> Anything that could be automated but isn’t automated is because it isn’t cost effective to do so or there is insufficient capital to invest. These are resource allocation issues. You can’t just wave that away.
This is true in the long run and I suppose the argument you are making is that any attempt to interfere with the present system of resource allocation will constitute a centralization that would be less effective than free market capitalism, so the notion that we could redistribute the surpluses generated by labor saving devices to the average person is inherently a call to economic centralization. This might be true, but I would propose an alternative reading:
The surplus of labor-saving devices has primarily accrued to the owners of these devices. You might then claim that these owners are owners because they have found a means of servicing a market demand. Each dollar they possess is a vote from the market that these guys really know what they are doing, and that the world wants more of it. If we were talking about spherical billionaires in a vacuum, I'd agree with you - but this issue is complicated by the compounding impacts of inheritance and its correlation with access to credit, as well as with the existence of competitive moats (e.g. network effects, intellectual property, sunk costs, natural monopolies, etc).
The optimistic read of the technology sector in the 2010s was that businesses would compete with one another to provide services that would ultimately improve people's lives. Instead, we got Windows 11. That wasn't a consequence of users voting with their dollars, it was a consequence of Microsoft entrenching itself into workflows that cannot be economically altered in the immediate future. There are lots of examples of the market not being particularly effective at economic allocation if we step outside of the logic that any purchase is a revealed preference which indicates approval of the good or service being purchased. Apply this logic to the purchases of gamblers, alcoholics, drug addicts, or murder-for-hire plots and the limitations of the logic become obvious.
No my argument is that trying at a broad systemic level to make specific outcomes happen is susceptible to the information problem. Trying like the op suggested to automate away work is utopian and improbable at best. If you squeeze labor out of one kind of drudgery you have no way of predicting the results, and you’re certainly not going to end up with Star Trek.
To my minor aside, look at that shirt. You have to essentially glue the fabric into a board and all the robot can do is a rudimentary set of side seams and sleeves on the tshirt. There’s no finishing work on the collar or hem so it’s useless. That robot exists as a demo and is used precisely nowhere. You could in theory do this, but it makes no sense economically.
And yes I’m aware of Japanese strawberry picking robots. You’ve clearly misunderstood what I’m saying. These thing may be technically possible but they remain infeasible for other reasons.
> No my argument is that trying at a broad systemic level to make specific outcomes happen is susceptible to the information problem.
This is exactly what I said you would say:
> I suppose the argument you are making is that any attempt to interfere with the present system of resource allocation will constitute a centralization that would be less effective than free market capitalism
Further:
> Trying like the op suggested to automate away work is utopian and improbable at best.
We are a long way off from the self-replicating systems that could feasibly make work effectively optional, but you haven't made a convincing argument as to why it is improbable that automation could reach that point.
> And yes I’m aware of Japanese strawberry picking robots.
You clearly were not aware of them or you would have picked better examples. Your original comment consisted solely of the statement: "It's the same thing." and now you're continuing with that flippant attitude by pretending that I'm misunderstanding your argument when I anticipated it in its entirety.
I clearly was aware of them. Do you think I just rattled off the bit about needing special glue to hold the fabric and only certain seam types being possible? There was a whole thing about these in the Economist last year and it was discussed on HN. While it's technically possible, you can't deploy it. It turns out gluing and then applying solvents to fabrics doesn't result in a product people want.
This Star Trek stuff is improbable because everything has to be coordinated somehow, and waving your hand and saying "magical future AI" is the only proposal anyone ever has. So yeah, maybe super-advanced AGI could do it, but probably not. We don't even have good models now of how large economies work down to a granular level. People are, like I said, messy, and respond in weird ways to their environments. The best we can do right now is work with prices as signals for the amount of effort other people are willing to put into something. And while that's imperfect, it's just improbable that we can do much better. Which is not to say that narrow objectives aren't possible, only that the bigger and broader you aim, the more impossible it becomes.
> I clearly was aware of them.
You cited them as examples of tasks that would be difficult to automate. The pickers have been commercially deployed for the last four years.
> This Star Trek stuff is improbable because everything has to be coordinated somehow and waving your hand and saying magical future ai is the only proposal anyone ever has.
Redistribution already occurs without the use of an AI.
> You cited them as examples of tasks that would be difficult to automate.
Yes because they are. I specifically gave an example where a machine exists but it's impossible to use for the real world, and an example where economics generally prevent adoption. That gets to my whole point.
> The pickers have been commercially deployed for the last four years.
Yes narrowly, and in only a few places where there are extreme labor shortages.
You are clearly misunderstanding me.
> Redistribution already occurs without the use of an AI.
I didn't make the claim that it didn't happen.
I feel like you're willfully ignoring what I'm saying. These things are hard and rolling them out universally often doesn't work because it is either impractical or economically infeasible to automate things or you run up against regulatory/cultural/material issues. The best we can do is piecemeal progress where incentives align.
> We've already established that you were wrong about that as these machines are in commercial deployment.
No there are literally no companies using that sewing robot, you can't buy that shirt.
> No, you're wrong. You clearly know nothing about this issue
You're being very rude, this isn't twitter.
> Anything that could be automated but isn’t automated is because it isn’t cost effective to do so or there is insufficient capital to invest.
Is there any evidence to back up this wildly huge assertion?
It should be obvious. There are plenty of things we can build robots to do, but we don't because it's wildly more expensive than we can sell the resulting product for. We can mostly automate construction, but it turns out land acquisition dominates costs, installation still ends up being sloppy and human, building codes are different everywhere, and people want a different kind of dwelling than what prefab is suited for at the moment.
If it should be obvious then the evidence should be equally obvious.
Or perhaps the world is a bit more nuanced. It may very well be that we're stuck in some local maxima that our current methodologies don't allow us to escape, even though escaping them would be relatively easy if we chose to allocate a meagre amount of resources for that purpose, which is something we don't do because we're stuck in that local maximum, and so on and so forth.
Another way of looking at what you're saying is that we're doing things optimally and that there's no room for improvement when that very obviously is not the case.
There are many gross inefficiencies in our system as it currently is -- look at food production for example. How much of the food produced globally is outright wasted? 30%? 50%?
If we made a conscious effort to tighten that up we could reallocate those resources to solving the problems of automation issues that you're describing.
The true hole in one for automation is a durable machine that can make a copy of itself as well as useful economic goods. Bonus points if this machine can be in a humanoid form to integrate into our existing economic infrastructure.
Once you have a self replicator you can have it make as many copies as needed to solve any problem you need with minimal human effort.
But a self-replicating machine isn't on anyone's radar. Have you ever seen a politician or policy person discuss this?
I’m sorry, any talk about self replicating machines is just science fiction at this point. It’s not a serious thing to discuss as a near future possibility.
Milton Friedman's essay on making a pencil:
"Look at this lead pencil. There’s not a single person in the world who could make this pencil. Remarkable statement? Not at all. The wood from which it is made, for all I know, comes from a tree that was cut down in the state of Washington. To cut down that tree, it took a saw. To make the saw, it took steel. To make steel, it took iron ore. This black center—we call it lead but it’s really graphite, compressed graphite—I’m not sure where it comes from, but I think it comes from some mines in South America. This red top up here, this eraser, a bit of rubber, probably comes from Malaya, where the rubber tree isn’t even native! It was imported from South America by some businessmen with the help of the British government. This brass ferrule? [Self-effacing laughter.] I haven’t the slightest idea where it came from. Or the yellow paint! Or the paint that made the black lines. Or the glue that holds it together. Literally thousands of people co-operated to make this pencil. People who don’t speak the same language, who practice different religions, who might hate one another if they ever met! When you go down to the store and buy this pencil, you are in effect trading a few minutes of your time for a few seconds of the time of all those thousands of people. What brought them together and induced them to cooperate to make this pencil? There was no commissar sending … out orders from some central office. It was the magic of the price system: the impersonal operation of prices that brought them together and got them to cooperate, to make this pencil, so you could have it for a trifling sum.
That is why the operation of the free market is so essential. Not only to promote productive efficiency, but even more to foster harmony and peace among the peoples of the world."
https://thenewinquiry.com/milton-friedmans-pencil/
Thank you for that recommendation. Added this 10-hour documentary featuring Milton Friedman to the top of my to-watch list: https://www.freetochoosenetwork.org/programs/free_to_choose/...
Thanks. I'd wanted to search for this, but forgot who'd written it...
The Star Trek future does seem out of reach. On the other hand, canonically they only got to fully automated luxury space communism after fighting a global nuclear war against eugenicists.
In Star Trek, people do actually work and have responsibilities with little (or no) leisure time or say over how they spend their days.
In Star Trek money exists but there isn't much use for it because technology has made material abundance cost approximately nothing.
Star Trek doesn't show the 50 billion landwhales watching Netflix all day, because it makes for bad television. It shows the 1% who still work even when they don't have to, who work because they want to.
People very much have to work in Star Trek. If they didn't, enemies like Q, the Romulans, or the Borg would end the Federation's existence. They also have their own forms of scarcity. Replicators are somewhat common, but holodecks are not, and members of the crew must schedule time to use them. Resource allocation is determined through the decisions of leaders in the bureaucratic hierarchy, and the longer you watch, the more you see the ways in which the Federation falls short.
Most of the characters in Star Trek are in the military. That's pretty much par for the course.
AI didn't really mesh seamlessly with my work until I used Claude, I highly recommend it. If your current workflow involves googling, reading documentation and examples on github until you can put together a solution then AI should slot into your work nicely. It just does all those things but faster and can often surface what I want in 30 seconds instead of 30 minutes of research.
I wouldn't worry, though. If the last 4 years are any indicator, we will continue to see LLMs refined into better and better tools at a logarithmic rate, but I don't really see them making the jump to replacing engineers entirely unless some monumental leap happens. If AI ever gets that good, it will have replaced vast swathes of white-collar workers before us.
I am somewhat optimistic, tech adoption is only going to go up, and the number of students pouring into CS programs is cooling off now that there aren't $100k jobs waiting for anyone who can open up an IDE. My ideal future is people who really love tech are still here in 10 years, and we will have crazy output because the tooling is so good, and all the opportunistic money seekers will have been shaken out.
This is a rather simplistic view of life IMHO. What’s wrong with people working for rent or groceries? What do you expect everyone to work on?
> What’s wrong with people working for rent or groceries?
There's nothing wrong with people who have the ability to work for groceries being compelled to work for groceries. The rent issue is complicated by the fact that land ownership prioritizes those who have already had time to accumulate wealth over those who have not. There are some issues with abandoning prices on land entirely (e.g. if land has no cost, how do we decide who gets to live in the most desirable locations?), but there's a compelling case to be made that the contemporary system of real estate financialization is similar to the enclosure movement both in terms of its structure and impact. It becomes a question of those with good credit (typically the rich and old) being able to (in aggregate) buy up all of the desirable land and thus to set monthly claims on the income of those with bad credit over and above the level of claim that would be possible if the property purchases could not be financed by loans.
There is a legitimate cost to constructing a building and renting it out, but there is no real cost to land except the cost the market assigns to it. This might not be the worst thing (recall our example of allocating land in desirable locations), but when prospective landlords can take out loans against the property, the property's value is driven up beyond what any reasonable person would be willing to pay for the property's use. If you couldn't derive rental income from property, it would not make economical sense to finance these purchases beyond what you needed for your own use. This would (in theory) lead to lower prices.
Henry George is the figure to look at here.
> What’s wrong with people working for rent or groceries?
Many people are compelled to do that, but almost everyone wants more out of life. Strong evidence is that they take more whenever they can get it.
If you could get whatever you want out of life, right now, with no effort, what would you want exactly?
I'd travel the world, taking in diverse centers of culture, history, and nature. I'd try to learn new languages. I'd do more track days, karting, and Ultimate. I'd buy a shell and try to get back into rowing. I'd play more computer games. I'd play ping-pong, foosball, and board games with my kids. I'd coach kids' sports. I'd go to more plays and concerts. Even movies. I'd volunteer.
Of course I wouldn't do ALL of that, since even without work there are only so many hours in the day. But I certainly wouldn't want for things to do!
Some people do all that and still work, you probably just need better time management. You could study a language before work in the morning, and then go row for a bit. Then go to work. Then you could play computer games from 5 to 6, play ping pong with kids from 6 to 6:30, eat a dinner, coach kids soccer from 7 to 8, volunteer open source from 8:30 to 9:30, catch a movie at 10.
So simple! Just as easy to do it as saying it right?
Exactly, saying it’s the easy part.
But even without a job, you still need energy and motivation. The tax of switching between tasks (or hobbies) doesn’t magically disappear. Neither does the time suck of social media.
If you're wealthy and healthy, and even so only some of that timeline _may_ be possible, most just unrealistic.
>You could study a language before work in the morning, and then go row for a bit.
Ok, gotta be in by 9am, 30-60 minutes commute, 30 minutes learning a language, gotta eat, shower, coffee, get my row boat mounted and at the lake 20 minutes away, prep, do a 20 minute row, back again so realistically you'd need to be up at 6am, not unreasonable.
> Then go to work. Then you could play computer games from 5 to 6
Did you end work at 4pm, or work from home? Either way that is likely a short day, but ok. A lot of people are forced to have commutes or work in a job that can't be remote, not to mention work much longer days. Hell, isn't "60 hours the sweet spot" for a work week now? (quoting Google's founder's recent comments)
> play ping pong with kids from 6 to 6:30,
Have enough room to have a ping pong table at home, that must be nice, but yeah doable.
> eat a dinner, coach kids soccer from 7 to 8,
Who cooked dinner? Who cleaned up? That shit doesn't just happen by itself. So you prepped, cooked, ate and cleaned up, wrangled kids into car for soccer, and got the game field ready to play all in 30 minutes? Nope.
> volunteer open source from 8:30 to 9:30,
Game ended on time, kids didn't hang around to talk to team mates, straight in the car, no issues, and less than 30 minutes transport. Nope.
> catch a movie at 10.
30 minutes to get kids to bed, baby sitter on time (and you can afford one), doable at some ages sure. Movies are regularly 90-180 minutes so you're in bed at like 1am? For a 6am start? Again transport not taken into account.
The reason people think you can work 60 hours a week, every week, is because they don't do all the everyday things that need to get done, they have other people to do it. Also rarely do they leave enough gaps in their schedule for other peoples priorities.
Assume you WFH, 9 to 5. Commute time is zero. You have a middle class suburban house with a lake in the back. Your partner is a stay at home parent, does not work, just does household tasks and takes care of kids.
You wake up at 7. Quick 15 minute breakfast then push your kayak out to the lake and row 45 minutes on the water.
From 8 to 9, you can study a foreign language (same duration as a university course)
At 5 you can game for an hour and decompress. Then ping pong at 6.
By the time you finish ping pong with the kids at 6:30, you've spent 90 minutes just playing around. Time for dinner, prepared by your partner. Kids have 25 minutes to get dressed for soccer and eat dinner. The soccer field should be no more than a 5 minute drive from your home.
After the game ends at 8:30, you could schedule an additional 20 minutes for your children’s frivolity if you like. Once you drive home you can cut down to 30 minutes working on open source stuff. A small sacrifice for their joy.
Send kids to their rooms by 9:30. Let them sleep whenever they feel like as long as they are quiet and in their room. Spend time with your partner and prepare yourselves for the night out.
By 9:45 the baby sitter arrives and you two head out for the movies. A baby sitter can be very cheap if your kids are older; often they are just a high school student doing homework or watching TV while your kids sleep or play. Don't need a PhD.
You could be home by 1 AM depending on movie length. 6 hours of sleep is good enough, you can do it all again the next day.
It’s very doable, especially if you decide you don’t actually want to follow the same schedule everyday.
This schedule, even as a theory, assumes you work from home and have a partner who does not work and a babysitter? I don't actually know what percent of families that describes, but my guess is it's pretty low
Okay but at some point you have to make choices to work toward the life you want, it’s not just going to happen by accident with you chasing whatever you can, and that’s what people don’t understand.
If you want this schedule, prioritize a WFH career and find a partner who wants to stay home and earn enough money to hire a babysitter. If you don’t then this won’t be available to you and it’s your own fault.
It is really beside the point, because of the way the dopaminergic system works.
What was good enough yesterday is expected today and won't be good enough tomorrow.
That is practically what makes us human.
Whatever you get today with no effort won't be enough tomorrow.
The ideal modern life is really one that is challenging enough that you don't get everything at once but not too hard that you can't make progress.
> with no effort
I want effort, lots of it, but let's not nitpick ...
Off the top of my head: Nobel Prize winning, world-beneficial research; lots of loving, open, deeply connected relationships; grow rapidly; be someone people turn to for support (because I help them), ...
I already do at least one of those things. :)
I think if you let your imagination wander and you end up seeing the scale of potential we have and what we could really achieve, stuff like paying for rent and groceries starts to feel archaic and wasteful, or as some kind of artificial constraint holding us back as a species.
> What’s wrong with people working for rent or groceries?
Why should people have to work to be able to afford rent and groceries?
Poverty is difficult enough to escape--not having to worry about rent and groceries would sure help.
There is a reason why school meal programs are such a success.
Well, even if we agree that's the best we can aim for as a species (how sad), soon we won't even have that luxury.
I think (from personal experience) talking with a good mental health professional would really help with your current state of mind and the pressure you’re feeling.
Really? And how exactly?
"Just man up", maybe?
Sorry for the snark but I don't think they can just magically make you feel better. An example or two could change my mind.
> "Just man up", maybe?
That's the toxic stuff you get from society, which leads to you hiring mental health professionals that can teach you healthy, effective ways of dealing with stress.
This much I know and have heard. Still curious about some examples though.
Responding to your earlier comment: it's not magic, and they don't do it - you do it. They help you learn how, but it's up to you.
Cognitive Behavior Therapy can help with a wide range of issues. If there are worries that are not productive for you, that you can't get out of your head, a therapist can teach you how to use some basic tools to control that. And you'll probably only need a few visits. You can also read books, but given what you've stated I think you should start with a human.
My son went to a few sessions and completely got his OCD under control. He doesn't have to go anymore. I used a similar technique to quit smoking 30 years ago, after at least a half-dozen serious tries by other means had failed. Still off them. It applies to all kinds of issues, though; it's also very effective for depression. According to the literature review I did twenty years ago, it was the only technique other than medication that showed sustained benefit for depression.
My depression comes from severe learned helplessness. I have been extremely stupid with money and career choices, and nowadays things have gotten hard; I have several chronic health conditions, and the difficulty went up not 2x but more like 20x. I just can't muster the will to do even one job interview, financial reserves are dwindling fast, and... you get the picture.
I have zero faith any therapist can help me. They'll likely start with "but it's for your own good!" and I'll just say "yeah yeah, like 200 other things I have been told and zero of them turned out to be true". That's how I imagine it.
I am not against paying professionals. Obviously. I just don't believe in therapy at all.
What would you do to start with, with a guy like me? (I am aware you are not a therapist yourself.)
I am also not a therapist but I am a former tech founder turned executive coach so I do talk to people who are facing what feels like overwhelming challenges, risk, and uncertainty.
Even in the language you used "severe learned helplessness" and "extremely stupid", you are revealing a state of mind (cynicism, self-flagellation) that is not oriented to improving your condition.
You know you have a strong bias against therapists—given your seeming lack of knowledge about them, where do you think that bias came from? Fundamentally, we are a social species and evolved to live with strong connections to small groups.
Our society is no longer set up like that. So professionals like therapists and coaches provide the essential value of a caring, supportive, and helpful relationship that we lack. Like getting an essential nutrient that your diet lacks.
Do you have health insurance? Many of them cover mental health—the site Headway can help you find one that takes insurance. Try a few and gather some first-party data before writing them off fully. The downside is a few hundred dollars. The upside is a much brighter and materially better future.
To try to complement what other replies already said...
I think an important result of successful intervention is to awaken (or reawaken) the mind to the idea that thoughts and perceptions are internal and not always accurate representations of an objective, external world. Much psychological stress comes from these internal experiences, and subtle shifts in your mental posture can change this environment.
That's not to say that real stressors and stimuli don't exist. It's just that often a person can spiral in a way that makes their internal reactions counterproductive and harmful to well-being.
Another important result is learning better coping and adaptation strategies, so you can start to shift your mental posture or even change lifestyle and environment to reduce chronic stress.
It's not always easy, not magic, and not perfect. But, it can help...
The worst thing here is that, from the beginner's perspective, it seems like simply reframing the bad in a positive way, when in fact the bad was almost completely in your mind and barely existed. After the results you can see how twisted you were. I had my moments when I looked at the scheme of my mind on a whiteboard and had to admit how delusional I was, with zero pressure to do so.
That's most likely what's going on with me as well. Working hard to undo it but to say it's hard would be a laughably weak understatement.
I think it's important to understand that CBT is a system, a set of tools for managing your thought patterns. Therapists who specialize in it are largely in the business of educating their clients, not having them lie on a couch and talk to the ceiling about their childhood. I'm not saying you won't have generic "talk-therapy" kinds of conversations - those are still necessary for them to understand the specific issues you need to work on - but it's not just someone helping you find insights that don't change anything.
If you are completely against meeting with a therapist though, you can start with books. I wish I could recommend one that I've used, but this is an example of one that looks really promising to me, with a practical approach: https://www.amazon.com/Retrain-Your-Brain-Behavioral-Depress...
The canonical book recommendation for CBT is Burns:
https://en.m.wikipedia.org/wiki/Feeling_Good:_The_New_Mood_T...
This is not how therapy works. Although, tbf, it’s not hard to find a pseudotherapist who practices stereotypical bs.
> What would you do to start with, with a guy like me
IANAT either, but mine would start with asking how I feel and then why. Then we’d talk about my vision of practical ways to stay afloat, the ways I maybe don’t see due to my focus, and what exactly makes it hard to push through, in both known and never-tried situations. There would be some belief, avoidance, anxiety, or algorithm at work, or a set of these. In CBT there’s a clear, formalized method for each, which you can pick and work with until the next week or two. Examples: logging your emotional responses, compiling a list of “musts”, starting to do unusual things, asking what exactly is wrong with something that seems bad.
That is, if my depression were on low. If on high, we’d address that first. Last time I pushed through it by following a physical regimen, a few supplements, and lots of anger against it (depression can’t turn off my anger; ymmv, as may the methods).
Every case is different. Therapists are brain debuggers. We don’t know what the bug is yet.
Look up "how to reduce salt" on YouTube. And remember, you can lead a horse to water, but you can't make him drink.
How do you know someone is a European? They think therapy and mental health issues are a silly American "trendy" fixation.
Trust me, we Europeans are not exempt from the "everyone should see a psychologist" trope that has blasted across social media over the last decade. We are not blind to every Hollywood actor having a personal therapist either.
I think the main difference (speaking as a northern European) is that when you Americans speak of therapy you seem to mean the stereotypical "talk therapy", whereas basically every therapy here is cognitive behavioral therapy.
Can cognitive behavioral therapy help someone who has a bit of existential dread about his tech job? Maybe. I don't think it's silly on its face, though, to say "really?" if the poster's life is otherwise in order.
Perhaps your life is on the easy setting? Hungry people work really hard. The fear of destroying an entire family by losing my job lets me find strength and courage.
As a researcher who changed career paths to teaching at a community college, I empathize. Twenty years ago, when I graduated from high school, I was inspired by the stories I read about Bell Labs, Xerox PARC, and early Apple and Microsoft. I wanted to be a researcher, and I wanted to do interesting, impactful work.
Over the years I’ve become disappointed and disillusioned. We have nothing like the Bell Labs and Xerox PARC of old, where researchers were given the freedom to pursue their interests without having to worry about short-term results. Industrial research these days is not curiosity-driven, instead driven by finding immediate solutions to business problems. Life at research universities isn’t much better, with the constant “publish-or-perish” and fundraising pressures. Since the latter half of January this year, the funding situation for US scientists has gotten much worse, with disruptions to the NIH and NSF. If these disruptions are permanent, who is going to fund medium- and long-term research that cannot be monetized immediately?
I have resigned myself to the situation, and I now pursue research as a hobby instead of as a paid profession. My role is strictly a teaching one, with no research obligations. I do research during the summer months and whenever else I find spare time.
> I'm saddened that people with wealth, power and influence tend to point to their own success as reason to perpetuate the status quo. When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation
What you stated is true, but my disappointing observation is that the people with wealth/power are only marginally smarter than the rest of us on the topic you mentioned. And I suspect that even with a rich benefactor, pulling that off is not easy. It takes a threshold number of people with a holistic view of things to pull off what you mention, i.e., nearly free basics of life. Check my profile etc. - some of what I wrote may strike a chord with you.
Also, the proponents of Technocracy (Hubbert etc.) about 100 years back essentially touched on the subject you raise. Note: the word technocracy today has a different connotation.
I'm very sympathetic to your experience and agree with most of what you say, but as someone who has spent half his life in academia and half outside, regarding "who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel": I must say that 'reinventing the wheel' is at least as prevalent in academia as it is in business.
> acute stress of living in survival mode for a lifetime
For some perspective, bone evidence from pre-Columbian Indians shows that they regularly suffered from famine. There was also the constant threat of warfare from neighboring tribes.
The American colonists didn't fare much better; their bone evidence is one of extreme overwork and malnutrition.
I agree; you said some things I have been feeling for some time too.
Hi zackmorris,
If I may be so bold as to refer to you as "my friend" (having never met you)...
My friend, I think I understand what you mean. I am about the same age too.
I would like to propose an idea to you - something I have been exploring very deeply myself lately. Maybe the thing we need to start spending our time on is exactly this meta-problem. The meta-problem is something like (not perfectly stated): we as humans have to decide what we value, such that we can continue to give our existence purpose in the future.
I don't think AI is going to be the be-all-end-all, but it is clearly a major shift that will keep transforming work and life.
I can't point yet at a specific job or task, but I am spending real time on this meta-problem and starting to come up with some ideas. Maybe we can be part of what gets the world, and humans, ready for the future - applying our problem-solving skills to that next problem?
I mean all of the above in 100% seriousness, and I am willing to chat sometime if you're interested in comparing notes.
Maybe it's time for me (40+) to go back to college. I want to pick up Mathematics and Physics up to the point of General Relativity. Since it's "use it or lose it", I better start reading now.
But I don't really have any time. There are so many things to do, to learn. Younger people who happen to stumble upon this reply: please, please prioritize financial freedom if you don't have a clear objective in mind -- and from my observation many people don't have a clear objective in their 20s! If you can retire around 35-40, you have ample time to pursue any project you want for the rest of your life.
> up to the point of General Relativity.
Putting in a plug for MIT OCW 8.962 [1]. I also had this itch, and was able to find time during the pandemic to work through the course (at about 1/2 speed). But true to what others are saying, life intruded for the last few lectures, so I still have some items on my todo list. I thought Scott Hughes laid out the math with terrific clarity, with just the right amount of joviality. It is not for everyone, but if you have a suitable background it may turn "scratch an itch" into the obsession it has become for me.
And to make the obligatory on-topic comment: I'm 61yo. Now get off my lawn.
[1] https://ocw.mit.edu/courses/8-962-general-relativity-spring-...
Thanks! Yeah, I planned to use MIT OCW for my education, at least for the first 3-4 prerequisite courses, before I even consider registering in an independent program at some university.
BTW I hope you get more free time in a few years so that you can come back and enjoy the education again.
I've always toyed with the idea of studying Computer Science since I taught myself how to code.
It's a hell of a lot more difficult now, when I need to work and don't really have the same amount of time to dedicate to studying. It's a hell of a lot easier when you're younger: your whole life basically revolves around education, and any job you have generally fits around your school life rather than the other way round.
Yeah, it was really a surprise to me when I realized that my energy had declined to the point that I couldn't work on my side projects on the down days. Then I counted how many days I have left (up to 75) and it filled me with dread.
And it got worse after my son was born a few years ago. Now I count the number of weeks available, not the days, because there have been whole weeks when I couldn't do anything. After all, those are two full-time jobs.
As for your CS education, I'd recommend getting into some side projects and exploring from there. If you go to a school, it's going to require too many courses.
I'm in my late 40s and I've found that my desire to work on side projects after work is affected by how engaged I am mentally at work. When I'm building new features/products from scratch and I'm having to figure out architecture and learn more about whatever language I'm coding in, I get more amped to do side projects at home. When I'm bored and just bug fixing and dealing with more mundane things, I have no desire to do any more coding after work. Something about being more engaged gets my brain in a state where I can keep going for the rest of the day, until I need to pull myself away from the computer because it's 2am and I should have been asleep hours ago.
I should note that I don't have children, so the only "obligation" I have is to spend time with my partner and eat dinner, which I enjoy doing, of course. She usually starts getting ready for bed around 10pm and that's when I start coding. I do have some bad sleep patterns though, whether I'm coding or not, which is probably not healthy. I have that revenge nighttime procrastination thing real bad.
Have you considered doing your side projects before work?
It takes one call in the morning, of me saying for the hundredth time in the past 8 months that the integration is still missing data, to throw me off the rails for the day. I know by 10AM that I won't touch anything else after work.
Been contemplating starting early and dedicating "the best hours" to myself.
That absolutely works for me! I play multiple instruments and have found that the early morning is the best time, mentally, for me to devote uninterrupted time to practice and playing. I’m also fortunate enough to have a basement with another floor between my cacophony and my sleeping family.
(Not the original replier.)
That was something I considered for a while, but then figured out it was unrealistic because I have a kid. The original replier can probably do it if they don't have one.
I'm in the same boat.
I realized that frustration from work usually spills over into other parts of my life, which is not surprising, as work is usually the first big thing we do during a day. I'm exactly like you -- when I have a lot of frustration from work, I don't want to work on side projects. It has nothing to do with how many hours I have.
I also have some bad sleep patterns; I only sleep about 5-6 hours most nights.
I think it might be useful to learn some mental skills to compartmentalize one's mental state. If I could somehow put that frustration from work into a separate space without it spilling all over the ship, it would definitely help a lot. But so far I don't know how to do it -- plus I have a kid, so I can't wind down after work until late at night.
Exercise goes a long way to keep up energy levels after work.
Yeah, been doing that for a while. It doesn't work as expected, since the primary sources of frustration cannot be removed.
Plugging Georgia Tech's online master's program - I did it over the course of 4 years while working - you can take 1 class a semester - and it's very cheap for a high-quality master's.
I'm going to second ativzzz. It's a great program. I did it the same way: 1 class a semester over 4 years.
I've been going through various MIT OCW lecture series as I work on a personal programming project.
Esp. good were:
https://ocw.mit.edu/courses/6-001-structure-and-interpretati...
and
https://ocw.mit.edu/courses/6-042j-mathematics-for-computer-...
The only way I've been able to get things done is to first allocate time for the things I want to study right when I wake up, then do as much as I can to learn them. The only requirement is to try to understand the material. Putting deadlines and/or milestones in the beginning can sometimes discourage people from starting.
> Hell of a lot more difficult now when I need to work and don't really have the same amount of time to dedicate to studying.
Not really. You'll find that as an experienced programmer, you have a massive advantage at times in your classes.
Any time there's a question like "what is the expected value of the max eigenvalue for a random matrix from such and such a distribution", I can answer it in 5 minutes with MATLAB and ChatGPT (and I know I can answer it). Making an animated GIF of the successive approximation of a function by its Fourier series? No problem. Integrating by parts with hyperbolic functions in it? So slow -- I'm googling the quadratic formula while the kids are yelling out the integration steps in real time.
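For what it's worth, that eigenvalue check really is just a few lines of Monte Carlo. A minimal sketch in Python/NumPy, assuming a symmetric Gaussian ensemble (the distribution and the parameter values here are illustrative assumptions, not from the comment above):

    import numpy as np

    def mean_max_eigenvalue(n=100, trials=500, seed=0):
        """Estimate E[lambda_max] over random symmetric Gaussian matrices."""
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(trials):
            a = rng.standard_normal((n, n))
            sym = (a + a.T) / 2  # symmetrize so the eigenvalues are real
            total += np.linalg.eigvalsh(sym).max()
        return total / trials

    # For this normalization the max eigenvalue grows roughly like sqrt(2n),
    # so n=100 should print something near 14.
    print(mean_max_eigenvalue())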
I'm over 40 and even though I mostly manage/lead now I have time to do programming and plenty of math. I still see improvement mentally (not so much physically anymore), but also a lot of improvement in skills I neglected when I was younger like interpersonal skills and sales. I'm also learning a new language and read more than ever. Sometimes I feel like I'm less sharp, but I wonder if that's because I'm doing so much more.
My tricks, which I don't always follow: work out every day, get enough sleep, and stay off most short-form social media. I realized that when I was on short-form social media it would zap a lot of time and kill any focus I had.
"If you can retire around 35-40" really? If you can retire that young, you probably don't need any advice.
Achieving financial independence and early retirement does not mean one no longer needs any advice about life. Indeed, because those people have a longer retirement, they might ponder things like the meaning of life much more than someone who's living paycheck to paycheck and has to devote all brain cycles to survival. And there are so many options for those who retire at 40 that they genuinely need advice about what to do, how to find what matters most to them, and how to go about doing the things they had always wanted to do but couldn't.
These people have succeeded in making money and that's all. But life is so much more than just making money.
People's financial goals range wildly, which also affects when they can retire. Some people don't care about some combination of having a car, expensive living quarters, fancy food, or a family.
You can't blame an old man for trying to give advice to people who don't need it...
> please please prioritize financial freedom
This advice could really backfire badly if taken literally by young people.
Optimizing for financial reward early in your career could be the surest way to end up in a dead end from a mission/purpose/domain/skills perspective.
20 years later, you realize you burned two precious decades accumulating money that, honestly, does not help you at all make sense or use of the next two.
> 20 years later, you realize you burned two precious decades accumulating money that, honestly, does not help you at all make sense or use of the next two.
Ah, but it does. Speaking as someone approaching fifty, you feel every penny. Everything about your financial situation weighs into your decision-making, makes different options possible or impossible. It changes which jobs you can take, and which jobs you can turn down. It affects how much time you can take between jobs. It affects how much energy you pour into keeping your job or chasing a promotion versus investing your energy in education or other things you find satisfying.
People worry that they will accidentally pursue money with such single-minded focus that they turn off every other part of their soul, and miss out on what they "really" want to do. But I don't think that's possible. Replace money with anything else: fame, family, intellectual achievement, hedonism. If you try to dedicate yourself 100% to one thing when something else is important to you, you'll hear the voice in the back of your head. You'll feel what it is, and if you ignore it then, that's on you.
If you don't hear that voice yet, lay down the foundation that will give you the freedom to follow it when you finally do.
Yea, there is a huge distance between "saving enough money to retire comfortably" and "letting wealth accumulation dominate every decision you make." And, honestly, most people don't even get to the first one.
You're absolutely right. I realize my comment could be understood in various ways.
My point was that, beyond a certain point, chasing money has a negative effect on your career. Shooting for the top percentile of income can take you off track for life.
But you are saying that having a few hundred thousand bucks when you hit 40-50 is a life-changer, and you are absolutely right as well.
Our points of view are not incompatible; my first comment just didn't capture both.
It's easy to think you have too much money in your twenties. I used to save every other paycheck. But wait until you have a family and a mortgage. The money goes very quickly.
It might backfire for sure, but being financially independent gives you the freedom to figure that out for the rest of your life.
IMO it's a lot better than the situation I'm in right now, where I can clearly see myself working my ass off for the next 20-25 years in domains I totally hate, and then hopefully starting to work on interesting things when I'm ... 65?
I'd further argue that the only downside of my strategy is if you already have a clear non-monetary objective but decide to go for the money for 20 years anyway. That's definitely a bad thing, and that's why in my original reply I ruled it out -- if you already have an objective, go for it.
Only if you get a rest of your life. While most do, I've known more than one person who didn't make it to 40. Worse, those who do make it all start reporting that their bodies are beginning to fail. If you haven't done some things by 40, it may be too late to ever do them.
Won't expound on my life story, but this is massively overlooked. You can't just prioritize money without taking into account the massive sacrifices it will require in your life. I spent a long, long time becoming successful in careers that I hated, only to burn out and do the career I knew I wanted to do since I was old enough to think and remember. Except now I have wasted decades of my life that I will never get back.
The majority of your life is spent working so you absolutely MUST find it fulfilling or you will burn out (at best) or destroy your body and mind as a sacrifice to the insatiable Mammon.
There is nothing wrong with doing a job you don't like. However, you need to ensure that it doesn't burn you out. Work enough to get the money you need to live, then do the things you like. Many people do this: there are a lot of jobs people don't enjoy, but they have to be done, so someone does them.
Even people who find a job they love often find after 10-15 years they are sick of doing it all the time. This is likely to happen to you unless you are careful not to let your job alone be what defines you. This is normal.
Don't get yourself into a job you hate (part of this is not being so picky that you hate everything!). However, liking - much less loving - your job is optional. Go home and do something else for fun.
You can't forego savings because "I might die at 40". That's really not a sensible plan. It's a balancing act, but I'd rather have saved too much and die a little early, than not save enough and somehow live to 100.
I agree 100%. There is a balance: you want enough savings for a rainy day while also enjoying the rest of your life. Retirement is only part of enjoying your life.
The important point is: don't get so lost in saving money that you don't enjoy the present.
But accumulating and saving always comes at a cost. Is it worth earning more money if it means spending less time with your family, for example? It's a delicate balancing act, and you never know where the right balance is.
It depends a lot on where you came from. If you come from a poor background, without any prospect of occasional help from parents or a possible inheritance, I'd say prioritize financial security. Of course, you can accept the occasional lower salary with better career prospects here and there, but sometimes that's a mirage, and a lot of the time, better pay comes with better career prospects anyway.
If you didn't come from a somewhat privileged background, chances are you started your career with more professional debt and without a rich contact network; you're probably a bit too humble to negotiate wages, and even narratives like "when I started my business I came from a working-class family and had to scrape together 80k from my relatives" are out of your reality. So prioritize being financially secure first.
This angst about a sense of purpose is basically a privileged-class malady; if you are poor, our friend Maslow will ensure you have more pressing issues to care about first.
I mean school debt where I said professional debt.
> you realize you burned two precious decades accumulating money that, honestly, does not help you at all make sense or use of the next two
You are describing some extreme case of money-chasing and/or complete ignorance of everything else. Having the "luxury" of being covered financially for the rest of your life allows you to pursue whatever goals you have in mind at mid-life. If you are susceptible to not knowing what you want, having less money won't help you find out, but having more money might.
Is it any better to know what you want to do for the next two decades and never be able to afford to do it? From a practical perspective you are still missing the opportunities you want or dream of, except you're also doing it with little or no financial buffer for the things you need.
I made a lot of sacrifices and experienced some serious personal pain to achieve modest financial independence by age 35 (not "FAT" by any means, but well beyond the average American), and it was worth it. I'm still working, but only because my former career momentum has carried me into a position where I'm paid a small fortune. I would never do any kind of normal engineering job for a normal income these days.
My attitude and the way my brain processes things is completely different. Getting laid off or fired goes from something you might fear or see as a bad thing to a neutral or even positive event that just encourages you to go spend your time in a different way for as long as you want.
20 years of bad habits facilitated by a given lifestyle can also be very hard to break. Not many can manage duly accumulating the savings while completely isolating themselves from what they work on, who they work with, and how all of that impacts their worldview.
And that's not even considering health. 20 years of being in a bad mental place (stress is bad, but a perceived lack of purpose and agency might well be worse) will leave its marks.
There's also a non-zero chance you'll die before year 20. I agree with the premise that seeking financial independence should be a significant factor in career/life decisions, but if you would be filled with regret by finding out it will be cut short at year 18, you're too singularly focused.
Having the money is far, far better than _not_ having money to help you "make sense or use of the next two decades". If you don't, both the sense and use are narrowed to being chained to a job indefinitely into the future.
I see this argument a lot; I think no one is right. You just need to pick a way to approach life and deal with it.
> honestly, does not help you at all make sense or use of the next two.
Why?
Money can't buy you a sense of purpose, especially if that purpose would largely involve non-monetary aspects.
The thing is, being poor doesn't buy one a sense of purpose either. Money for sure doesn't solve the issue, but it gives you all the freedom to solve it for the rest of your life.
Damn, I wish I had a million so that I could just drop my job and stream my coding and gaming on Twitch 12/7. I can't do that.
Absolutely. Money (at least some amount, at least for most people) is necessary, but not sufficient for a fulfilled life.
I believe most people who got that million wouldn't spend it finding fulfillment in life, but would end up inflating their lifestyle: slacking off, drugs, expensive cars and items, etc. The million would be gone in months, and you'd be left with just bad habits, a dopamine hangover, and no idea of your further direction in life.
I found 'So You Want to Learn Physics...' helpful: https://www.susanrigetti.com/physics
Agreed on prioritizing financial freedom.
Thanks. I did read that, but found it to be too broad. I've set a very narrow target, and hopefully everything can be wrapped up in 8-10 math/physics courses.
If you’re going to try this route, I’d also recommend prioritizing your family and/or life partner with all your remaining energy.
Grinding is soul-sucking, and having someone at home was the only way I made it through the roughest patches.
I semi-retired in the 35-40 range, but if my choices were being retired and single or working but with my family, I’d 100% take the latter.
I got excited to do this a couple years ago (early 30s). Time and energy were a real killer.
Physics and math in a formal setting like school are rigorous, not fun. I found it really hard to stay motivated. I don't know how I would practically use that knowledge; I would never contribute anything scientific. It would take years of grinding through foundational math and physics to get there.
Oftentimes they're not even rigorous. At least in math, many professors blame their students' lack of comprehension on the complexity of the material when it's really bad teaching. Some attempts at explaining proofs are laughable; although the conclusions may be correct, because the covered theorems were proven by people much smarter than the lecturer, the steps taken to get there rarely follow the tight logical sequencing needed by those learning the material.
I often ponder whether I have the energy to go back to school. I am employed by MIT at one of the labs, where I do research on embedded security. As a consequence, they offer free classes you can pick up. I have yet to actually take advantage of that, but your comment has me thinking the same thing. I turn 36 in a couple days!
Yay, do it! I'm in linear algebra right now (midterm in 40 minutes) and I'm over 40. I went back because I always regretted not taking more higher level math. It's been a lot of work, but very rewarding. My kids (age 7 and 5) think it's pretty cool to see dad working on his TI-89 and Notability on iPad.
I was running into the same issue. I wanted to get into deep learning but my math skills had atrophied. Go check out mathacademy.com. It's nowhere near the level of time investment that going back to college is, and you will learn a lot!
>>please please prioritize financial freedom if you don't have a clear objective in mind
This. To Infinity.
Please prioritise financial freedom. I missed a few steps, but as I get older, I realise this is the biggest blocker to almost anything.
Money == Free time.
If you have the discipline, you can create a lesson plan with an LLM without spending an arm and a leg.
Thanks. I definitely will teach myself some of the prerequisites before registering at a university. I need to prove to myself that I can sit down, take a course, and complete the coursework + assignments + exams on MIT OpenCourseWare before committing to anything that costs $$.
Math Academy is really all you need. 30 minutes per day is enough, though more time will mean faster progress.
You would still very likely need human input and help. LLMs will hallucinate badly on problems just a bit more difficult than the very standard ones (first-hand experience with math).
More proof that old boomers don't get what it's like to be a modern young adult. I was just texting with friends about this at the coffee shop this morning while making plans for the weekend. The boss is interrupting my goat-yoga mindfulness session, asking me to come into the office for an hour this month. Who has time for this?
You olds have all the money, all the time.
I wish I had all the money and all the time! I don't, alas...
I know it sounds stupid, but I started to buy lottery tickets -- not to win, because statistically that's essentially impossible, but just to give me hope. The lottery is the only thing in the world that can land a mountain of cash in one shot with a very small investment. Nothing else can do that.
That's why humans have purchased lottery tickets throughout history: to cheer themselves up.
I found a cheaper way. I walk around the city for a couple hours every weekend with the hopes of randomly finding a winning lottery ticket.
Why not buy out-of-the-money options in companies you know, with spare money? Better odds and good payouts if you hit it right.
It needs too many correct bets or too much $$ to get a few million in returns. You can win a lottery of 50 million with just a few dollars! But I do think this is an interesting strategy. I might try it out just for fun.
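To put rough numbers on the "too many correct bets" problem, here's a back-of-the-envelope sketch (all figures are made up for illustration; these are not real odds, prices, or market data):

    import math

    ticket_cost = 5                            # dollars, hypothetical
    jackpot = 50_000_000                       # dollars, hypothetical
    lottery_multiple = jackpot / ticket_cost   # 10,000,000x in one shot

    option_multiple = 100                      # a very lucky OTM option win, assumed
    wins_needed = math.log(lottery_multiple) / math.log(option_multiple)

    print(f"Lottery payoff multiple: {lottery_multiple:,.0f}x")
    print(f"Consecutive {option_multiple}x option wins to match it: {wins_needed:.1f}")

Even granting a 100x payoff on each winning option bet, you'd need three or four of them back to back, reinvesting everything, to match one jackpot ticket.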
Anyway, I'm half joking. I do buy lottery tickets, but it's just to improve the mood of the day. Oh, a good mood for a few hours is so important for staying sane.
It is impossible to truly understand General Relativity after the age of 35.
I strongly doubt this. It’s rare, and we have all sorts of credible theories about why it’s rare, but the measured decline of so-called fluid intelligence is mostly the Flynn effect: younger cohorts keep getting better at taking tests.
Is the joke because Einstein published his findings on General Relativity at 36?
Devil's advocate: how many years did it take him to get to publishing this work?
It is a possibility I actually agree with, because a true understanding probably requires a lot more than taking some classes. It probably needs a PhD in cosmology or something similar.
But let's say a shallow understanding is good enough... even just completing a graduate General Relativity course with a good mark would do.
You should not have entertained that comment.
There is absolutely zero evidence that 35 is some mystical cutoff for "understanding." That poster has NO clue what they are talking about. Seriously, feel free to ignore that comment.
As for practical advice for learning: look into learning how to learn, and then spend about 1-2 years habituating yourself to the proper way to acquire knowledge. The science says that current intuitions and habits (not just yours; practically everyone's) are incorrect, as evidenced by almost everyone in this post. The YouTuber Justin Sung is pretty much second to none in terms of a practical program for acquiring these skills.
If you want general guidelines to follow to determine who's telling you the truth and who isn't use the following wikipedia article to guide you: https://en.wikipedia.org/wiki/Active_learning#The_principles...
Note: simply reading that article and "understanding" what it says is not equivalent to having a study program that implements these things, and having a program that implements them is not the same as actually executing on and habituating to said program. This process takes many months to years.
Best of luck.
Echoing the sentiments of others here, this is why I firmly believe that public college should be free, for all, for life. Formal education just works better for some of us than video tutorials or self-paced learning, and ensuring everyone is able to learn new things and practice their skills in a consequence-free environment benefits society as a whole.
Think about the tech nerds (me) who never learned how to cook, and are in their thirties. Or lawyers and doctors who are sick and tired of feeling like they don’t understand how computers work, and want to learn. Or an accountant who loves maths, and wants to get into the scientific side of the field. Or the homemaker who wants to re-enter the workforce now that their kids are grown, and wants to pick up carpentry and welding to become a tradesperson.
If cognitive decline comes from failing to exercise the mind regularly, then the cheapest solution is free education for life, to encourage as many people as possible to keep learning new skills and remain cognitively engaged.
> I firmly believe that public college should be free, for all, for life
I just don't understand these statements that "this or that should be free". Do you plan to enslave the people who would provide this education? Do you not subscribe to the saying "you get what you pay for"? Public education through high school (in the US) has been free for many generations. Ever wonder what would happen if you made the next 4 years "free"? (Hint: you're not going to pop out of those 4 years with skills differentiated enough from everyone else who took up the "free" education, and you'll be right back in the same position you are now.)
If you don't have the motivation to prevent your own cognitive decline by taking advantage of a plethora of already free (high quality) education (e.g. https://ocw.mit.edu), then taxing the rest of us so you can be spoon-fed all the free "formal education" you want for life isn't the answer either.
> Do you plan to enslave the people who would provide this education? Do you not subscribe to the saying "You get what you pay for?". Public education through High School (in the US) has been free for many generations.
Do you believe that the people who provide public education through High School are enslaved? If yes, how? If not, why do you assume providing free public college education requires enslavement?
> Public education through High School (in the US) has been free for many generations. Ever wonder what would happen if you make the next 4 years "free"?
No need to wonder. Tuition for bachelor's degrees is free in multiple countries, for instance Germany, Finland, Sweden, Scotland and Norway. What happened there?
I assume the “enslave” question is meant to allude to the fact that, since nothing is actually free, when demand outstrips supply, at some point “making it free” can only mean forcing people to provide it (financially or personally).
If “x should be free” was a solution to anything, why stop at education? Let’s make everything free!
Are people being forced to provide free public education up to High School in the U.S.? Are they being forced to provide free University education in the countries where it is free?
If something being free implies forcing people to provide it, to the point that "enslaving" them is a reasonable analogy, why have anything free whatsoever? Let's have nothing free!
> Tuition for bachelor's degrees is free in multiple countries...
For the longest time, it used to be free for state residents attending State colleges in California.
In Germany it is common to get kicked out of free university on a technicality and you have to sue to get back in. I am told they do this to keep class sizes manageable and to filter out those who are just there because it’s free and they have nothing else to do.
That sounds like a 'Germany' problem, not a 'free university' problem. There are trivial ways to keep class sizes manageable.
One way is for universities to limit the available places in each degree for first year enrolment, and assign these places based on entrance exam results (as in Spain, where tuition isn't free but is very cheap compared to the U.S.). Another way is to have unlimited first year places, but restrict places from second year onwards to a given number n, allowing only the top n students from the first year to progress (as in France, where tuition is not technically free but averages to < 200€ a year).
> Ever wonder what would happen if you make the next 4 years "free"?
A high school diploma used to mean something because it was a filter. Once graduation rate became the goal, standards were lowered, and just showing up became enough to graduate.
Higher education does some filtering. Either they filter aggressively at admissions and graduate everybody (Ivies), filter with weed-out classes and lesser degrees (respected public universities), both (other public universities), or offer a middling education and are ranked accordingly. So the degree means something.
I agree that degrees can be filters, but I question what "filter" they represent in modern contexts. From my experiences, the modern degree is little more than a gatekeeping credential to demonstrate you either took on substantial student debt (and thus likely to take lower pay or more precarious employment) or come from a wealthy background (stronger social networks for other rich folks/Capital types; a "pedigree", in other words, a la a caste system).
You're 100% right that a modern American High School Diploma does not reflect any degree of basic competency, because standards were constantly refined downward to promote graduation at all costs; I argue college degrees (and many technology certifications) are much the same, providing little more than a demonstration of taking on debt and rote memorization capabilities, rather than being a functional worker.
So if that's the case, and they're not of practical value as credentials anymore, it could be argued there's no harm in opening fundamental/foundational courses in skills to the entire populace, paid for through taxpayer money and restricted to State/Public non-profit Institutions. If we're really concerned about costs, we could implement caps on consumption unless part of a degree program to ensure those taking the advanced courses for employment prospects are given priority over those seeking non-professional growth. There's a lot of wiggle room to be had, if we're serious about opening this up.
I don't understand this sentiment. You have no problem spending $800 billion in tax payer money on military in a country that hasn't fought a defensive war in 200 years but as soon as the same concept is applied to education or healthcare it's somehow wrong?
This is false equivalence. Most of us that share his ideology aren’t fine with either.
Why would we be in the “foreign forever wars should be free” camp?
When approaching these sorts of situations, it is best to steelman your discussion partner’s argument. It will help in your understanding. People who disagree with you aren’t all stupid.
> Do you plan to enslave the people who would provide this education?
There's a concept called public money which can build roads, dams and other cute concrete things. Why can't you use that for payroll in higher education? Not everybody can learn the same way, not everybody has a separate and chill space in their homes to study without interruption.
Roads serve the needs of now, knowledge builds roads to the future.
> I just don't understand these statements that "this or that should be free".
Because you're focusing on the accumulation of a finite resource (currency, land, etc) as the sole barometer for success, and then conflating "freedom for use" with "freedom from cost". Obviously salaries have to be paid, buildings maintained, and improvements paid for. Obviously this all costs money, which is a finite resource. Obviously that money has to come from somewhere. Taxation enables everyone to contribute a fraction of the cost regardless of use, and an effective social program (like free education) distributes that cost effectively over time since there's zero chance 100% of the population will consume that resource at the same time, or even in the same year.
It's basic societal maths. If we accept forgoing a profit on the consumption of the resource (healthcare, roads, mail service, education, defense), we can lower the cost substantially and concentrate on its effective utilization. If we do that, we can carve up the cost across the widest possible demographic (taxpayers), and assign a percentage of it as taxation relative to income and wealth. It's how governments work.
> Do you not subscribe to the saying "You get what you pay for?"
Does anyone subscribe to this in the current economy? Everything has record high prices, yet still bombards you with advertisements, sells your data, and requires replacement in a matter of years instead of being repairable indefinitely. University education has boiled down to little more than gargantuan debt loads to acquire a credential for potential employment, a credential that often has no relevancy to the field you actually find work in.
So no, I don't subscribe to that, and I haven't for a decade. My $15,000 used beater car is literally more reliable than a six-figure SUV, and it doesn't keep mugging me for more value for the manufacturer through surveillance technology and forced advertising.
> Ever wonder what would happen if you make the next 4 years "free"?
Yes. I imagine much of the populace would be better educated and informed about how modern, complex systems work. More people would be fiercely resistant to the low-wage, high-labor jobs that flood the market, forcing a reconciliation of societal priorities. I figure we'd have more engineers, and artists, and accountants, and tradespersons. We'd have more perspectives to existing problems from a broader swath of the economic strata, instead of the same old nepobabies from a lineage of college graduates making the same short-sighted mistakes.
The question is, have you considered what might happen if we made a four-year degree more economically accessible?
> If you don't have the motivation to prevent your own cognitive decline by taking advantage of a plethora of already free (high quality) education (e.g. https://ocw.mit.edu), then taxing the rest of us so you can be spoon-fed all the free "formal education" you want for life isn't the answer either.
Now you're just insulting people because they lack means, and conflating it with lack of motivation. I've lived with people whose sole education was reading books in Public Libraries because they never had public education, with Section 8 housing recipients hammering online learning courses from shared computers to try and find a way upward and out of poverty. None of that gets them a foot in the door, because they don't have the physical piece of paper that says "University Graduate" and the social networks you build from physically attending school - which adults cannot do without money or taking on substantial debt, that in turn jeopardizes their ability to survive.
If you want a society where only those of monied means have the ability to succeed, well present-day America is certainly an excellent demonstration of that. I'd rather build a society where all of us contribute a part of the proceeds of our labor to build a more equitable society for all, so everyone has an opportunity to found that new business, make those social connections, or try new ideas, without worrying about losing their home or paying for healthcare treatments.
> Does anyone subscribe to this in the current economy?
Not anyone whose net worth is under, say, fifty or a hundred million dollars and who is older than their mid-thirties, that's for sure.
If you're not rich enough to routinely afford very well-made things, and you're old enough to know that very many things legitimately used to be far, far higher quality for not that much more inflation-adjusted money [0], then you sure as shit don't subscribe to that saying anymore.
[0] And sometimes, far less... especially when you factor in the cost of continually replacing the garbage that's all that you can afford.
ChatGPT is already free to a very generous extent, and covers 80% (if not more) of the learning resources you could need for almost any topic, theory-wise. I'd risk saying it can adapt to most people's needs.
For practical knowledge you just need to do the thing over and over. A good mentor/teacher would help a lot, but the very basics, I'd say, are learnable by yourself. It's as simple as doing it over and over and keeping a critical eye on what went well and what didn't.
As a result, I don't think free public colleges would enable more people to actually learn compared to what we have today. However, I think they would be a great place to build community and find people with interests similar to yours, which is quite rare to do without an app these days.
See, I'm worried about relying on LLMs for learning, given their penchant for hallucinations and the early studies suggesting they're actually bad for learning and cognitive improvement, since they remove the "research" and "critical thinking" phases of problem solving for entry-level material - fundamental skills that are necessary to put something into practice independently and learn from mistakes. Sure, teachers and professors can also make things up (often with more damage, given their position as a "reliable authority"), but in a classroom setting it feels like that would be found out faster than a ChatGPT spitting out bad results.
> However, I find it would be a great place to build community and find people with similar interests to you, which is quite rare to do without an app these days.
This is what a lot of detractors seem to miss about the benefits of in-person learning. Team projects force you to interact with strangers and cooperate for the benefit of the whole. Campuses increase the likelihood of chance encounters. They get you out of your home and into the community, which helps you feel connected to your actions and their outcomes.
The knock-on effects are often greater than the immediate benefits.
I am 100% with you. I am great engineering-wise... but have no clue how to eat and live healthily!
You're not alone! Nobody knows everything, and what's important or necessary to our thriving changes constantly throughout our lives. Learning to cook wasn't high on that list when tech salaries were great, delivery was cheap, and housing wasn't (completely) unaffordable; now that I'm nearing my 40s and have to stretch even a six-figure salary further than before, suddenly learning to cook is a necessity.
Good people are always changing in some way. Making public education free encourages lifelong learning and builds more adaptable humans for times of crisis. It's a good survival strategy that also happens to create more fulfilled human beings.
I don't think I should be paying for others to study simply because they prefer a different modality of learning, especially when it has been found that learning modality selection has nearly zero impact on actual learning outcomes.
Now, if this were structured as a negative tax system, where, e.g., everyone after graduating high school starts with -$10k in taxable income for a handful of years, perhaps that could avoid punishing those who choose to self-study.
This line of reasoning can be used, unmodified, to argue against essentially all of public education.
An educated populace is an inherent good. There’s nothing magic about the particular choice of K-12, and one could very convincingly argue that with the increasing complexity of modern life and increasing expectations from employers that ongoing adult education is also a net good, even when you’re not the recipient.
Ongoing education can also be vocational for those who aren’t inclined towards typical academia.
Cynically, one can also point to the current political administration of the U.S. (and the comparative education rates for its voters) as a case in point for why education is important.
You can already take classes for free on youtube from the best instructors in the world at the best universities.
What more do you want? People just like to complain about political issues as a type of entertainment.
Completely anecdotal (current student), but some people learn differently in an in-person class vs. online/remote. Also, IMO not every degree/course is best done online (e.g., trades, arts, performing arts). Right now I am taking classes which cannot be done online.
I do however understand where you are coming from. MIT courseware is abundant, youtube, library resources, github `awesome` lists...
If there weren't so much bureaucracy/capitalism surrounding higher education, I wouldn't mind it coming from tax dollars, since it would be another log added to the fire, so to speak. Plus it helps to create a stronger workforce (theoretically, assuming people graduate). Without the right safeguards, free college education wouldn't work. Would be nice, though.
There are hundreds of channels on YouTube providing everything a child needs to learn arithmetic and reading skills. Why should we even provide K-12?
I consider this a matter of extreme importance. Your gutter-tier hot take makes me think you may be the one confusing this with entertainment.
Very well said. Education, at its core, is about adapting the species to better survive the increasingly complex world it creates and inhabits. Failing to educate the whole means exposing it to fracture and exploitation from within.
It’s inoculation against exploitation, a mental vaccine that, when done right, promotes cooperation over self-interest.
Which is exactly why those who are threatened by it, seek to restrict or destroy it.
“I don’t think I should be paying” - if it could benefit your community or the country, then why not? It gives people options.
Not everything is a zero-sum game. That’s just a fact of living in a society. Some people pay into the system much more than you, and you benefit from that. And vice versa: there’s someone paying less, and they benefit from your contributions (taxes, etc). That’s what society is about - a system that allows citizens to thrive. It’s not supposed to be about ME ME ME.
Just my 2 cents… as an American who’s tired of this attitude. Capitalism with small guardrails is garbage, in my opinion.
On a somewhat related note - many Americans think free healthcare is not worthwhile because it’s a net negative for them PERSONALLY. I struggle to understand that as well. Like, “oh, I don’t want to pay for that”, while most of their fellow Americans can’t afford basic care.
What’s the end game??? You’re entitled to your opinion of course but i don’t _understand_ it.
> Formal education just works better for some of us than video tutorials or self-paced learning
I don’t agree with this at all. Anecdotally, the autodidacts I’ve met are way more knowledgeable about subjects they’re passionate about than those who received a formal education in them. This applies to computer science, but also to the psychology majors I’ve met who can’t even tell me the difference between Freud and Jung.
> I don’t agree with this at all.
Are you actually saying that nobody exists who learns better when taught in the best ways we currently know how to teach, and in the way all formal education currently works? That everyone is better off teaching themselves with no help?
You are disagreeing if and only if this is what you are saying.
I mean, you can disagree with it based on your anecdata, but mine backs up my assertion which is why I made (and qualified) it the way I did. I specifically thrive in live sessions with an instructor knowledgeable on the material who can provide direct feedback, and I am not the only one. "Works better" is a qualifier on the effectiveness of the education on an individual, not the effectiveness of it on all individuals.
The key to learning accessibility is flexibility. Some thrive on self-study, some thrive on video tutorials, some thrive on audio lectures and others in live exercises. Heck, I wouldn't be surprised if this also applied to specific topics: fundamentals of cooking might be better via live instruction, while iterating on a recipe is often fine with self-study or video tutorials.
The point is the flexibility, to allow people to learn in a way that's best for them, so they're more likely to continue learning throughout their lives.
Over the past 40 years I've become aware of a LOT of people who had difficulty staying engaged in self-paced learning sessions, especially pre-recorded ones. Without the dynamics -- questions and interactions -- that other students can pose (or you can pose), it's tough to maintain your attention for a solid 50 or 90 minutes. Not that all courses must be in-person, but I'd like there to be a mix, with more in-person opportunities for course material that needs Q&A and interaction and examples, like courses heavy in math or theory, or recitation sections.
You're saying you don't agree with it, but then go on to talk about something entirely unrelated.
OP isn't saying self-paced learning doesn't work for anyone, so it's irrelevant if you know some whizz autodidacts.
My mother in law did many mind puzzles every day.
She still got Alzheimer's and died a couple of years later.
She had multiple incidents that she hid because she was too scared to find out, and too stubborn to lose her ability to drive. She could have had some treatment if she'd approached a doctor earlier.
Alzheimer's is utterly evil. Robbing people of their unique spark, killing the person before the body dies.
Sorry for the rant
Alzheimer's is a disease, you can get it in your 40s. If somebody recommends exercise to keep your legs healthy, they don't mean that if you have a staph infection in your legs that exercise will make it go away.
My grandfather had vascular dementia, and keeping him thinking and using his brain absolutely helped. Makes sense for a problem of blood flow that thinking new, hard stuff might direct some more blood supply to the brain.
Also, 1) you don't know for sure if you have Alzheimer's until you're gone, and 2) it seems that vascular dementia co-occurs with Alzheimer's a lot. So I can't imagine that it would ever be a good idea to stop using your mind if you felt it slipping.
Yep, first thing I thought, too. I'm terrified of age-related degeneration, so I try to stay active and mentally alert, just like my father did. He got out and played golf every chance he had, did Duolingo to try to learn to speak Spanish, played bass in his church band, kept working even though he didn't need the money... and still got Alzheimer's. Now he can't drive, can't be trusted to go out and take a walk by himself, can't even work the TV, so all he can do is sit and watch DVDs that my mom changes for him - at least while she still can.
I'm still going to try to fight it for myself, though.
I hope we have the compassion as a society to get to the point where I can say, "If I am unable to recognize my children, please kill me." At that point I would have died regardless of the condition of my body.
I don't want to wait that long. If I get diagnosed with Alzheimer's, I am taking a quick farewell tour of family and friends and then I'm done. I don't want to wait so long that I need someone else to off me. I wish that all wasn't necessary but this country (US) isn't going to get smarter anytime soon.
There are lots of reasons why that might not play out the way you plan.
I hope we have the compassion as individuals not to ask others to kill us. That's a heavy weight to put on someone else. It's not abstract "society" conducting the euthanasia: individual healthcare providers would have to decide that you met the criteria and then administer the drugs.
> I hope we have the compassion as individuals not to ask others to kill us.
When I've had to kill my pets, I didn't do it myself. I called in a professional to do it.
Surely you don't believe that OP is asking their friends to knife them in the chest if they're too far gone to ask to be euthanized? Surely you believe that OP is asking their friends to have a doctor or nurse come in and do it, if OP is no longer capable of asking for it to be done?
Did you even read my comment? Healthcare professionals are people, too.
A human is not a pet.
> Healthcare professionals are people, too.
Yep. Strong agreement there.
> A human is not a pet.
Outside of some fairly fringe consensual relationships, I agree that humans are not pets.
> Did you even read my comment?
I did. I was giving you the benefit of the doubt.
Well, the consequences of your stated philosophy are pretty thoroughly explored in this article: <https://theonion.com/no-one-should>.
Instead of posting snarky, low-effort comments you should spend some time learning the basics of medical ethics. This is not a religious issue. Medically assisted suicide for a terminal patient is one thing, but directly killing a patient with severe dementia who is mentally unable to give informed consent is quite another. This would put healthcare providers in an impossible situation. There are good reasons why no civilized country allows this.
Sure, I'll follow those goalposts as they walk down the field.
At the time it becomes relevant, a person with a DNR is usually (always?) in no state to give informed consent to being killed by their doctor's inaction. Same thing for someone in an irrecoverable coma who's being kept alive by machines when a family member or friend instructs the doctor to pull the plug on them.
Relatedly, angels of mercy have been releasing suffering folks who are at the end of their life from that suffering for ages.
You might find these things unpalatable, but they do happen, will continue to happen, and we're better off because they do happen.
I sincerely hope that through to the end of your life you remain lucid and able to clearly and convincingly express your preferences. I very much hope that you're not locked in a metaphorical hell of suffering, but unable to express to (let alone convince) anyone that you're ready to end it early.
Stop lying, I haven't moved any goalposts. In medical ethics there is a clear line between withholding care versus actively killing someone who is unable to give informed consent to the procedure.
You absolutely have moved them (and you also refuse to talk about pre-registered requests to die, which the request that kicked off this subthread certainly was). You started by saying
> I hope we have the compassion as individuals not to ask others to kill us. That's a heavy weight to put on someone else. It's not abstract "society" conducting the euthanasia: individual healthcare providers would have to decide that you met the criteria and then administer the drugs.
(while ignoring that asking a doctor or nurse to kill you is also asking another person to kill you) and now you've moved on to talking about specific situations that can be tricky, depending on the particulars.
Isn't that still less awful than having to administer other kinds of drugs again and again for suffering and slowly dying patients who want to die? The situation is just bad regardless.
Exactly. The way we treat terminal cancer looks an awful lot like sadism.
So long as people can freely choose whether they will do it or not I don't see a moral problem. There would be a very big problem if healthcare providers were mandated to provide such a service. And note that while the evaluation certainly needs to be by a doctor a nurse is quite capable of doing it. Look at the Canadian method--for the most part it's something that's actually done quite routinely in emergency rooms across the world. Sedation followed by a paralytic. Usually that's a prelude to intubation but if you walk away in the middle it kills. Canada then pushes potassium chloride just in case as the paralytics wear off pretty fast.
And we are better off as individuals if we have the option of having external providers do it, as that removes any dependency on actually being able to do things ourselves. There is also the benefit that it brings an external evaluation into the system that can recognize that maybe the original evaluation was wrong. (I'm thinking of a case I heard about -- a woman thought she had lung cancer, chose not to treat it, and simply worked until she dropped. The autopsy said TB, not cancer.)
You completely missed the point. This discussion is about dementia. The assisted suicide laws in Canada and other countries generally require the patient to be of sound mind, as evaluated by a qualified clinician. The laws don't apply to patients with severe dementia.
In the comment above, @thinkingtoilet apparently wants someone to kill them if they ever have severe dementia. Presumably that desire would be expressed in some sort of "living will" type document. If the patient meets the criteria, then should a healthcare provider strap them down and kill them, even if in the moment the patient says they don't want to die? That seems ethically dubious. It essentially puts providers in the position of being serial killers.
Canada has also had some serious abuses and ethically questionable situations. They are not necessarily a model to emulate.
https://www.pbs.org/newshour/world/some-health-care-workers-...
Of course there are edge cases. Reality is continuous, not disjoint, and thus any attempt to impose a line will inherently create edge cases. What you are missing is that the case of inaction (not permitting it) also creates bad things. You can't make a situation without bad, all you can do is attempt to minimize the bad. Note that the "ideal" (as in maximum social benefit) amount of bad things happening is not zero. Preventing bad things always comes with a cost, there will always come a point where additional preventing of bad things causes net harm.
Consider, for example, nuclear power. It has basically been regulated out of existence in the US because of the standard that radiation exposure must be as low as reasonably achievable. The problem with this is that it doesn't result in safer nuclear plants, it results in plants that run on different power sources. Natural gas? Approximately 10x the risk (and that's not counting climate effects.) Oil? Approximately 10x the risk of gas, thus 100x the risk of nuclear. Coal? Approximately 10x the risk of oil, thus 1000x the risk of nuclear. The expected (and observed) safety benefit of the regulations is negative.
And to preempt the inevitable "Fukushima!", that was political. The expected death toll of staying put was approximately zero. The city was evacuated, killing hundreds, for no good reason.
I'm not missing anything. Medical ethics as commonly understood in modern western civilization imposes a clear line between withholding care versus actively killing someone who is unable to give informed consent to the procedure. The topic under discussion isn't even close to being an edge case. Minimizing the bad is not the goal. Nuclear power has zero relevance here and bringing it up is just an attempt to confuse the issue.
It used to be that medical "ethics" precluded actively killing someone. That's a religious legacy that more and more countries are coming to recognize is wrong. Fundamentally, this reduces to whether quality of life can be negative.
If quality of life can be negative then there will be cases where the humane act is to provide someone with a comfortable death.
And my point about nuclear power is that excessive regulation actually is counterproductive at maximizing human benefit.
I can only assume the severe dementia clause is an exemption to the sound-mind requirement, but not to the willingness to undergo the procedure.
Forcing someone to live through a disease when they have already lived a full life is simply cruel. Why should someone have to suffer on their way out?
Who is doing the forcing here? Are you personally volunteering to kill anyone who decided that they wanted to be killed if diagnosed with severe dementia? What if they change their mind (even if no longer of sound mind) and say they no longer want to die? Would you go ahead and kill them anyway?
> Who is doing the forcing here? Are you personally volunteering to kill anyone who decided that they wanted to be killed if diagnosed with severe dementia?
I cannot do that because I am not a medical professional and even if I was I wouldn't be the only one making that decision. I do have a lot of respect for the people whose job it is to perform euthanasia. It's not an act of cruelty, but of kindness.
> What if they change their mind (even if no longer of sound mind) and say they no longer want to die? Would you go ahead and kill them anyway?
No euthanasia program is going to kill someone who says they do not wish to die. The moral hazard mainly arises when they are no longer able to express their wish. Then the decision is based on the wish expressed while they were still able, and the wishes of family members.
This is not all too different from someone who has suffered severe brain damage and is kept alive on life support. Would you keep them alive until they die of old age or would you respect the family's wish to stop treatment? People with severe dementia may not be on a breathing apparatus, but they also cannot survive without the constant support of hospice care.
But cases where the person can no longer express their wish are exceptional. More often, their wish to end their suffering is so strong that they will stop eating to hasten their demise. What would you do in that situation? Would you forcibly feed them through a tube because you do not believe they are allowed to determine the manner of their death? Or would you simply ignore their suffering as they die a slow and agonizing death from malnutrition? This is what I mean when I say that you would be forcing someone to suffer.
You always have the option to die whenever you want to. Why would you put the burden of ending your life upon other people? That's utter cowardice.
In my opinion, there is a line that needs to be crossed and that line is extremely hard to define. To be safe, you have to go past the line so any blurriness is removed. I would ask the people I love the most to shoulder this burden and I would offer to shoulder the same burden for them. This is how love works.
No, you don't have the option. To have the option you must have the ability. Consider the hypothetical that started this: "if I don't recognize my children". At that point the ability to do it yourself is gone.
Another possibility is that millions of people have said that they don't want to live to grow so old that they don't have their wits. But when the days come, they don't really want to die.
Sorry to hear that. My FIL was just diagnosed with dementia and it’s heartbreaking to watch it progress.
My neighbor passed away from dementia recently. We first moved in maybe a year after his diagnosis and had to watch it progress. Horrible.
Now a friend of mine who is the best programmer I know has an early onset diagnosis. I have noticed him starting to pick fights regularly with people on LinkedIn over programming topics.
It's a really, really hard thing to watch someone go through.
What kind of incidents, if you don't mind?
Hallucinating while driving, seeing people or animals that weren't there. That happened multiple times over several years before the diagnosis.
Unfortunately she didn't share what other incidents she had; I really wish she had.
Hopefully a cure comes as a form of vaccine so some folks can be totally against that.
I don't think mental stimulation correlates with the development of Alzheimer's anyway. The papers I've read on the subject seem to suggest a mechanical failure in proteins essentially choking off and killing brain structure. Although the lucidity period shortly before death is interesting.
With 25 years of experience in software development, I’ve noticed that long coding sessions leave me feeling more fatigued than they used to. However, I’ve also become significantly more productive, as I spend far less time grappling with problems I’ve already solved. I’ve only just begun to explore AI-assisted coding, so that isn’t what’s driving my efficiency. Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
It depends on the task, but overall, for the work I do as a software developer, yes.
I would say I have less energy, but I need less energy, and I produce better results in the end. I'm better at anticipating where a line of work will go, and I'm quicker and better at adjusting course. There are a lot of multi-hour and multi-day mistakes that I made ten and twenty years ago that I don't make now.
The raw mental energy I had when I was younger allowed me to write things I couldn't write now, but everything I write now is something that other people can read and maintain, unlike twenty years ago. It's very rare that writing a large, clever, intricate mass of code is the right answer to anything. That used to frustrate me, because I was good at it. I used to fantasize about situations where other people would notice and appreciate my ability to do it. Now I'm glad it's not important, because my ability to do it has noticeably declined. In the rare cases where it's needed, there are always people around who can do it.
Another thing that is probably not normal, but not rare either, is that the energy I had when I was young supercharged my anxiety and caused me to avoid a lot of things that would have led to better outcomes, like talking to other people. I'm still not great (as in, not even average for an average human, maybe average for a software developer) but I'm a lot better than I used to be.
What I find most draining is the non-coding work I now do for work. I love the org I work for and it's really fulfilling but I do a lot of senior stuff now and I feel like the years slip away without always getting to build and invent as much stuff as I'd like to. There's so much to do and learn, it's amazing, we live in this difficult world but with amazing opportunities, and I wish I had an extra 12 hours a day (of energy) just to learn and build.
I was young once; 25 years ago I started programming, and I feel as though I have at least another 25 in me, if not more.
I find that I have "less horsepower, but smarter gears", so it kind of evens out.
I'm less likely to code until midnight, but more likely to have the problem solved before clocking out at 6pm ;)
>>I’ve noticed that long coding sessions leave me feeling more fatigued than they used to.
As we age, the learning-vs-getting-paid curve first flattens, then either grows very slowly or not at all.
I'm guessing that's where the fatigue comes from: you're not growing much anymore after a while of working hard.
In fact, reducing hours worked might correlate more with happiness, as you can allocate the free time to other rewarding tasks.
I've been coding for over 40 years at this point. I'm definitely a better programmer than I was - not necessarily faster at pumping out lines of code, but I get the right approach first time more often than I used to. Whole classes of bugs are just easy when you've seen them before, but I'm also better at avoiding them in the first place because I know my weaknesses and where to spend time thinking more carefully.
At the same time, I can't context-switch like I used to. Once I get into the zone, no problem, but interruptions affect me much more than when I was 20 (or even 40). I can almost feel the tape changer in the back of my head switching tapes and slowly streaming the new context into RAM (likely because all the staging disks have been full for years).
As for long coding sessions - I relish them when I get the chance, which isn't as often as I'd like. Once the tapes have finished loading and I'm in the zone, I can stay there half the night. So that hasn't changed with age.
It could be something similar that we see happening in seasoned weightlifters/bodybuilders:
As your absolute strength increases, the same exercises and workouts get proportionally more fatiguing.
5 sets of bench press at 80% of max load, taken within a rep or two of failure, done by a first-year lifter, is incredibly different from that same scheme being done by somebody who's lifted for 10 years. So more advanced lifters tend to do things like lighten the load and use variations of lifts that have more favorable stimulus-to-fatigue ratios.
Anyways, I thought maybe as an advanced programmer, something here could be analogous. You've already done all the coding and thinking to figure out easier and lower-level problems. So what you're left with are the more cognitively challenging parts of coding, which should be more mentally exhausting per unit time. Whatever is '80% difficulty' for you is probably way more advanced than what you were looking at 10 or 20 years ago.
That is the conventional wisdom: decreasing stamina/energy is compensated by having more experience/expertise.
Magnus Carlsen (the multiple-time chess world champion) talked about this on a recent Joe Rogan podcast. He said he has already passed his chess peak, now at 34. He knows more now, but when he was younger he could win via brute mental power.
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
So according to Carlsen, for chess the answer is no.
I personally also suspect the answer for programming is the same. Most, if not all, of the hotshot programmers we know became famous in their early 20s. Torvalds started writing Linux at 21. Carmack was 22 when Doom was released. Many of the most famous AI researchers were in their early 20s when doing the most groundbreaking work. Einstein's miracle year by the way was also when he was 26.
> He said he passed his chess peak already now at 34. He now knows more, but when he was younger he could win via brute mental power.
The famous counterexample to this is that J.R.R. Tolkien started writing The Lord of the Rings when he was about 45.
Writing is not programming but they are not that dissimilar. Especially in this context.
What I've learned over the years is that life is actually not fair and everyone is different. You can be razor-sharp and reasonably healthy at 83, or be in great shape and die of a brain aneurysm at 12 with no warning.
Basically don't let studies or other people's results persuade you into not starting or giving up.
Creative writing is tremendously different from coding, imo.
> Creative writing is tremendously different from coding, imo.
I've had a different experience.
IMO there's a huge overlap in skills when writing, coding, making videos and playing guitar.
They all boil down to the idea of getting something out of your head and then refining it until you know when to stop refining based on whatever criteria you're optimizing for at the time.
This is based on writing over a million words and making hundreds of videos over 10 years on my blog and programming for ~20 years while casually playing the guitar for about as long.
What aspects make them feel different for you?
People in their early 20s are also much less likely to have other responsibilities "intruding" into their headspace. It's a lot easier to be monomaniacal when you don't (for example) have kids yet.
I know. That's the common argument, but I don't think that's it. See the argument I made in the previous comment: Magnus thinks his brain was simply better when he was younger. It probably doesn't help to have responsibilities like children, but I don't think that explains everything. There are also many people without children, for example. And if you don't have children, then studying full time should take as much time as a simple job, if not more.
Also, Hans Albert Einstein was born during Einstein's miracle year.
> Also, Hans Albert Einstein was born during Einstein's miracle year.
This was in an era when fathers had little to do with childcare. I don’t know about Einstein’s specific situation, but even 40 years ago almost half of fathers had never changed a diaper.
I listened to Magnus and took that quite differently.
My take was that others have caught up, and he is just not motivated to do the type of studying it would take to improve even further at this point.
There is a process we don't really have a name for that was best summed up by the boxer Marvin Hagler:
“It's tough to get out of bed to do roadwork at 5am when you've been sleeping in silk pajamas”
The demotivation of success. Of course, that is also going to correlate with age and be very hard to disentangle. At the same time testosterone levels will be past peak, adding another variable in the mix. Plus actual mental acuity past peak.
In other words, as someone pushing 50: getting old kind of sucks, systemically.
Very little of my work needs breakthroughs or inventions. Nothing new under the sun, as Ecclesiastes put it. So, for me, this mental peak is less important than being focused and efficient.
I have 10+ more years than that, but I don't notice such... fatigue. 2-5+ hours, no problem (even with fingers typing wrong keys/order much more often). What I do notice though, and not only in coding, is a kind of creeping boredom. Growing tired of certain things going the way they go, too quickly. You know, the déjà-vu feeling when you see something developing a certain way, and then seeing it go exactly there. A thousand times.
But I haven't stopped learning things apart from the software-making-related ones; 2 years ago I went into e-foiling, and some half-related, more-technical adventures. So maybe that is keeping the dementia at bay.
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
Maybe up to a point. Most of the tools and languages I use daily are fairly recent, or at least new to me. I don't have much of an advantage, if any, compared to my younger colleagues.
There are certainly things I do better now than 10 years ago, but I think I'm slowly declining. Fortunately, there's more than one way to be productive professionally, so I hope I can keep up for a few more years.
State of the art tools and technologies today are implementing the features of cutting edge languages and technologies from decades ago.
There are very few capabilities in mainstream languages today, if any, that weren’t available in Common Lisp back in the 1980s or 90s.
I noticed that I can still do long sessions if I have to crack open a problem (I started coding around 35 and now I'm 40+), but the burnout may prevent me from coding for a few days afterwards.
I do think it has more to do with daily chores (work, family) than my age. Despite getting frustrated more easily nowadays (because I'm exposed to more sources of frustration) than in my 30s, I'm actually more perseverant than I was 10 years ago. I managed to get very close to wrapping up a side project, for the first time in my coding life. Of course the scope is smaller than my previous projects, but I'm surprised that I didn't back down easily, considering how many times I banged my head during the first few weeks.
I guess being exposed to more frustration does improve one's resistance to it. To be precise, I get agitated easily, but that agitation doesn't seem to burn me out in the medium term -- while in my 30s I didn't get agitated very often, but every time it burned me out to the point that I left my side projects.
This is something that can be gamed out mathematically; for example, time to goal versus time to refactor.
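As a toy illustration of that kind of break-even arithmetic (all names and numbers here are hypothetical, not taken from any study or methodology):

    def worth_refactoring(refactor_hours, tasks_remaining,
                          hours_per_task_now, hours_per_task_after):
        # Refactoring pays off if the hours saved on the remaining
        # tasks exceed the hours the refactor itself costs.
        savings = tasks_remaining * (hours_per_task_now - hours_per_task_after)
        return savings > refactor_hours

    # Hypothetical numbers: a 20-hour refactor, 30 tasks left,
    # each task dropping from 3 hours to 2 hours of work.
    print(worth_refactoring(20, 30, 3, 2))  # True: saves 30h for 20h spent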
As someone who has been writing software and/or managing operations for 20 years here is what I have noticed:
* The more experienced people get the more cognizant they become of fatigue in that they know when to take a step back.
* The more experienced people get the faster they get in that they know how to approach repeated problems.
* People do not necessarily get better with experience. Some developers never fully embrace automation, especially if they are reliant on certain tools versus original solution discovery.
Based on that it’s natural that some older developers tend to decline with age while others continue to grow in capability and endurance. The challenge is to identify those who keep growing versus those who mask their decline.
Software developer of more than 20 years.
I wouldn't say "decline," to be charitable. I tend to lean more on mathematics and writing. That often makes up for the lack of stamina.
When I look back on code I wrote 15, 20 years or more ago... it's fine, but it lacks the sophistication I have now. I didn't know what I didn't know back then and had to learn. I can see in my code where I encountered a problem and, instead of solving it, I added more code until it "worked."
I wasn't university educated, so that explains a bit of it. I didn't start picking up pure functional programming and formal methods until my mid-thirties (gosh, has it been a decade already?). I worked through Harvard's Abstract Algebra at 38. I'm learning more about writing proofs and proof engineering in my spare time while continuing to stream work in Haskell on various libraries and projects. And I'm in my 40s -- I'm doing more programming and mathematics now than ever.
I'm also playing in a band, practice calisthenics and skateboarding, and have been improving my illustration skills with ink.
It seems like the discovery of the article is that if you don't use your skills they start to decline as early as your late 20s. All it takes is practice to maintain and improve them!
I might get a little tired every now and then and can't keep every library I've used in my head all at once. But I tend to rely more on mathematics and specifications and writing. I write less code now. I remove code. And I keep programs and systems fast and correct.
Nothing declining here!
I suspect you are better at architecture now than you were 15-20 years ago, such that you don’t have to struggle over how to solve many complex problems. The solutions and their planning are likely fluid now and quickly envisioned. That is something that comes from years of practiced problem solving.
Not everyone has that though, even among people who claim to be well experienced. Those among us who are aging and never fully developed the skills to save on manual effort will likely appear to be in decline. Others who continue to find new ways to deliver higher quality at ever-decreasing cost will continue to demonstrate superior value.
> All it takes is practice to maintain and improve them!
That is largely true for anything: maintenance costs less than recovery, and for someone well practiced at delivering original solutions, maintenance costs more than simply delivering original solutions. Not everyone invests in the practice to do this, though.
> Some developers never fully embrace automation, especially if they are reliant on certain tools versus original solution discovery.
Can you expand on that for clarity?
Some developers never really learn to actually program. This is largely due to chasing fashions. In the past this has been around things like Java Spring or JavaScript React. Instead of learning to write original software they get really good at using a tool. Now the new fashion is expecting AI to do it for them. When people build their careers around this it’s all they can do and never dig deeper. This works well for seeking employment, but doesn’t allow for practical skills growth.
They don't embrace AI is how I read that.
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
It depends on what you're doing.
The more raw cognitive strength is needed, the less it can be replaced with experience.
Some chess grandmasters are teenagers. Maybe maths-intensive ML research could be a bit comparable. But that's... maths. Or distributed software algorithm optimizations?
In the vast majority of software work (as in > 99%?), experience is more important, though, if you're bright enough when young. Or so I think.
(But when closer to 80 or 90 or 100 years, that's different of course.)
That's pretty much current state of knowledge.
Terms you want to check for more detailed info are 'fluid intelligence' and 'crystallized intelligence', but you basically nailed it.
(You'll occasionally see 'liquid' used in place of 'fluid', but 'fluid' and 'crystallized' are the terms the scientific community seems to use.)
If your time is spent in higher productivity work, wouldn't that - irrespective of age - leave you feeling more exhausted?
I am old and I can easily code / design all day. 10-16 hours if it’s something I am in to. It’s dealing with people / social issues that exhaust me.
Likewise. I can easily work for 10 or 12 hours. It's fitting in things like friends and family, according to their schedules, that makes it difficult. I don't mean to say this in a resentful way at all; it's definitely a me problem, not them.
Productivity doesn't correlate super closely with fatigue in my experience. The worse sessions are when I'm banging my head against something and getting nowhere. When I'm flowing, I can go for hours.
I’m not sure it makes sense to differentiate between energy spent while being “productive” and energy spent while troubleshooting and problem solving.
After all, troubleshooting can be viewed as a productive thing.
Interesting idea though.
Someone who is more efficient expends less energy to accomplish the same thing relative to someone who is less efficient.
A lot of it depends on how good your tools are.
> skills decline at older ages only for those with below-average skill usage. White-collar and higher-educated workers with above-average usage show increasing skills even beyond their forties.
> Individuals with above-average skill usage at work and home on average never face a skill decline (at least until the limit of our data at age 65).
A lot of hiring managers need to read this.
There's another factor here: the older the worker the less you can abuse them.
Right? Ageism maybe should work in reverse!
I get that but growing older does mean less energy at the very least.
Does "energy" here mean "willing to work many more hours for no extra pay"?
Citation needed.
Don't say "it's obvious", because people would have said the same (mistaken) thing about cognitive decline before the submitted article.
Literally the two most important things from the article.
Get better at things so you don't have to worry about decline. That simple.
It's like a muscle - develop it early on and then you can easily keep it in shape without much effort until the day you die, without any noticeable decline (at least until around 70).
This is my biggest fear about retiring from programming and doing something else. At 55, I feel like programming keeps my brain elastic. I fear leaving that behind and going into slow decline.
I'm worried about this with my dad. He's recently retired from a career of hardcore engineering in the optical physics industry. Now in his mid 60s he's inside all the time playing baseball games on his iPad and watching TV shows with my mom. I've been trying to figure out ways to spark his curiosity again. Thought LLMs would blow his mind, and they would have 15 years ago... but it was just passing interest.
Yeah, my mom plays solitaire all the time on her iPad and just gets fuzzier and fuzzier. She was doing sudoku and something that at least seemed to be a little challenging. But she seems to have stopped. At least she still gets regular exercise with my stepdad.
I wonder how much of the "age-related" decline is due to the brain functioning on autopilot. After over 5 decades, I have experienced most of the issues I'm going to experience in life. More often than not, I'm addressing issues with mental playbooks based on past experience.
As I get older (now in my 50s), I find myself reflecting on how many aspects of my life and decisions are operating on autopilot. I figure it's worse now with social media where people are constantly bombarded with dopamine hits, while boredom and idle thoughts have largely become things of the past.
Perhaps counterintuitively, I am trying to break this pattern and consciously engage with my experiences by asking a few basic questions, such as:
- What am I seeing here?
- What's going on?
- What am I missing?
- How can I approach this differently to achieve the same or better outcomes?
Additionally, I am making a concerted effort to notice more new details during routine tasks, like commuting or shopping. I can't count how many times I've discovered something new and interesting on my work commutes. Actually, I can: it's every time.
Edit: Also, spending more time with long-form content over short-form, be it reading or watching videos. It forces me to consider a topic for a much longer period. Short-form knowledge is a trap, unless you have some system that hits you with high rates of repetition (e.g., Anki).
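(For the curious: the core of such a repetition system is just an expanding review schedule. A toy sketch -- the doubling rule below is my own simplification, not Anki's actual SM-2 algorithm:)

    def next_interval(days, remembered):
        # Toy spaced-repetition rule: double the gap after a
        # successful recall, reset to 1 day after a lapse.
        return days * 2 if remembered else 1

    interval = 1
    for remembered in [True, True, True, False, True]:
        interval = next_interval(interval, remembered)
        print(f"next review in {interval} day(s)")  # 2, 4, 8, 1, 2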
In my humblest of opinions, you are probably spot on about the autopilot vs. actually experiencing things.
As a concrete example, someone in this thread mentioned their older relative spending a lot of time with puzzles daily. I too watched my grandpa doing sudokus and crosswords, but in the end if there’s nothing much else, those too will quickly become uninspiring routine.
I really believe truly experiencing life does require some introspection so that you have agency.
Interesting points.
And agreed, at one time I really got into Sudoku and Minesweeper, but my nerd mind quickly turned them into brainless pattern matching routines that required effectively no thinking. Don't get me wrong. I appreciate those abilities, but there's a time and place.
I'm still in my 30s, but I wonder how much mental decline is actually due to physical decline. I notice I feel more sluggish, sleepy, less sharp and motivated during periods when I'm more sedentary. And while exercise is tiring, I feel it gradually improves not only physical stamina, but mental stamina as well. Clearly a large part of our brain power is spent controlling our bodies, for when a stroke happens, becoming paralyzed in an area of your body can result. And clearly our body caters to our brains needs (e.g. nutrition), so if the body declines, then it shouldn't be surprising to see mental capabilities decline as well.
>Additionally, I am making a concerted effort to notice more new details during routine tasks, like commuting or shopping. I can't count how many times I've discovered something new and interesting on my work commutes. Actually, I can: it's every time.
This is one of the underrated pleasures of commuting by bicycle. You aren't abstracted away from the world in a bubble of steel and glass. You see, hear, feel, countless little details, and you can reach out and touch them if you want. Potholes, pedestrians, birds, the wind and rain and sun, smells of food and flowers and weird chemicals, street music and overheard fragments of conversation. Millions of faces.
You might also be able to avoid the subjective acceleration of time that happens to many of us as we age.
This is another thing I've been exploring, but I haven't had a whole lot of luck in actually slowing down time.
The "fix" seems to be:
- Add more activities to your day, every day.
- Try to break up routines. Eg. you may run every day, but you don't have to take the same route.
- Be actually present during those activities. Engage in conscious thought about those activities.
- Take photos, videos, recordings to recall those activities and jog the brain.
I bet you can even accomplish some of this retroactively with the right group of friends. The question "What did you do this weekend?" can be answered in so many levels of detail.
For me, it was that I started making enough money that all my old routines stopped being relevant. I drifted into comforts and lost touch with my surroundings.
Are there any guidelines for what exactly this would entail?
My short term memory is falling off a cliff. What do I need to do to prevent that from getting worse? Are there any other bases I need to cover that I don't know I'm missing?
> My short term memory is falling off a cliff
Are you sure? I thought this was happening to me too, and then I realized when looking back 10 years ago that I have way more responsibilities now both in and out of work: I am not only getting more done at work, but also for more people. I am now picking and choosing which meetings to even hold, much less attend, because I have a higher throughput. My children's needs are much more complicated now than when they were younger. I have a side business.
I can't fathom how I would have even gotten this all done when I was younger simply due to how much leisure time I spent, much less kept all of this in short term memory back then.
> I thought this was happening to me too, and then I realized when looking back 10 years ago that I have way more responsibilities now both in and out of work
This so much. When I was in my 20s I never forgot things, but I didn't have anything that I really needed to remember lol.
It's easy to forget about how many more responsibilities we take on as we age, simply by nature of how those responsibilities slip into our lives one at a time, bit by bit, gradually shifting our window of normalcy.
My phone is now full of Notes, Alarms, and timers. I can barely leave the house to run an errand without writing down what I need to do.
As far as actually improving memory, I try to expose my mind to as much raw material as I can. The mind is a muscle, it has to be exercised, and as you get old you need to focus on its core strength rather than physique and raw strength.
Rehearsal and repetition. Read constantly, get out in the environment and really try to observe all the things that are going on. Write down all the things you want to do this year, and when you’ve done them, write that down, too. Every so often, review the list. It will prompt your recall to a wonderful degree.
Write down your little milestones - ‘in March we found a clutch of tadpoles in a tire track puddle and we watered and fed them there for six weeks”
Regarding memory, I have made a habit of assuming I have a faulty memory and trying to write down anything I think I may want to remember in the future, using a wiki-style tool that supports backlinking. The tool I use is Org Roam in Emacs, but there are lots of options. I have found that by doing this, I have offloaded a lot onto my computer and made space in my mind to remember a lot of new things.
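(Mechanically, backlinking amounts to inverting the link graph. A minimal sketch; the [[WikiLink]] syntax and the notes/ directory layout are assumptions for illustration, not Org Roam's actual storage format:)

    import re
    from collections import defaultdict
    from pathlib import Path

    # For each note, collect which other notes link to it.
    backlinks = defaultdict(list)
    for note in Path("notes").glob("*.org"):
        for target in re.findall(r"\[\[(.+?)\]\]", note.read_text()):
            backlinks[target].append(note.stem)

    for target, sources in sorted(backlinks.items()):
        print(f"{target} <- {', '.join(sources)}")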
And when you’re not in front of a computer?
Contrary to the other comments saying to carry a pocket computer: my brain. Hence the improved memory. I offload my thoughts into my notes when I can. If it wasn't important enough to remember until I can find a seat at my desk, it wasn't important enough to write a note on.
We have computers we can carry around in our pockets now!
Lots of approaches exist, mine is Obsidian + Syncthing and just jotting down notes on my phone that I go flesh out when I'm back at my PC.
I’ve found these work well:
https://www.ataglance.com/p/planners-calendars/journals-diar...
No batteries or Internet required.
One option is using voice assistants to send a message to your todo inbox.
Use your phone
or a notebook which you later rewrite into your knowledge base.
>>My short term memory is falling off a cliff.
Read the book GTD by David Allen.
You are not supposed to store things in the brain; that only causes stress.
The brain is for doing thinking work; you are better off writing and tracking things on paper. Use the brain to think, and paper for planning, scheduling, tracking, etc.
I wonder how much of that is due to age and how much due to electronic distractions.
I was in the same boat, but I started noticing that if I force myself not to do silly multitasking (like not paying attention to what I am doing because my mind is thinking about irrelevant other things), it gets better. Since I stopped the infinite doom-scrolling it has improved a bit.
Stress and lack of sleep also affect me a lot. Both are omnipresent, since I am a parent of young special-needs kids.
Both of my grandfathers in their 90s have insanely sharp memories. I feel that theirs is a lot better than mine at instantly recalling details. I have noticed this in other older people from that generation too.
The only 'exercise' I've heard of that offers measurable improvement is "n-back", kind of like the old TV game "Concentration". Apps for it are available on most smartphones.
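(To give a sense of the task, here's a bare-bones single n-back as a console toy -- my own minimal sketch, not any particular app:)

    import random

    def n_back(n=2, rounds=15):
        history, score = [], 0
        for _ in range(rounds):
            letter = random.choice("ABC")  # small alphabet so matches occur
            ans = input(f"{letter} -- same as {n} back? (y/n) ").strip().lower()
            is_match = len(history) >= n and history[-n] == letter
            if (ans == "y") == is_match:
                score += 1
            history.append(letter)
        print(f"score: {score}/{rounds}")

    n_back()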
Emotions can have a large impact on memory, as far as I know. They provide the catalyst, in a way, in the process that forms memories. If you are depressed or otherwise not emotionally engaged, it can become much harder to form memories.
Solve emotional problems and memory may improve. (I have no idea if that applies to you, of course.)
> short term memory
Which sort of memory do you mean? Short term memory is remembering a name while you write it down, not remembering it the next day or week.
Avoid weed if you don't already. Might seem out of left field but a programmer friend of mine is absolutely convinced their memory is shot because of long covid and it's like, well, maybe, and the trauma of the pandemic certainly put a dent in everyone's cognitive ability, but also the dabs can't be helping.
I've found poor sleep really affected my memory. Maybe start tracking your sleep.
I feel so much dumber since having a kid :(
For those who don't feel like taking math courses in a formal setting, making games from scratch is a fun way to learn and apply linear algebra and calculus.
I never really needed determinants in my life until I tried moving a spaceship towards another object. Trying to render realistic computer graphics gets you into some deep topics like FFTs and the physics of light and materials, with some scary-looking math, but I can feel my mind sharpening with each turn of the page in the book.
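(The spaceship version of that determinant is delightfully small. A minimal 2D sketch with hypothetical names and coordinates: the sign of the 2x2 determinant formed by your heading and the vector to the target, i.e., the 2D cross product, tells you which way to turn.)

    def turn_direction(heading, to_target):
        # Sign of det([[hx, tx], [hy, ty]]) -- the 2D cross product --
        # says which side of our heading the target lies on.
        hx, hy = heading
        tx, ty = to_target
        cross = hx * ty - hy * tx
        if cross > 0:
            return "turn left"
        if cross < 0:
            return "turn right"
        return "dead ahead (or directly behind)"

    # Ship facing along +x, target up and to the right:
    print(turn_direction((1.0, 0.0), (3.0, 2.0)))  # turn left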
Falling off the cognitive cliff after retirement is something I think a lot of people are familiar with in their own lives.
I have seen it with my own parents and my wife's parents first hand. Frankly, I think the lack of social interaction is a big part of it.
When they're working, they're regularly talking to people outside their comfort zone about potentially challenging questions. That largely gets shut down once you retire.
Both my parents were in a huge rush to retire early, and now they just sit at home and scroll Facebook. I don't see the appeal.
I didn’t appreciate this until covid and wfh. I’m an introvert and am in my happy place sitting in front of a computer or with a book. But I was losing my mind and had to be actively social for the first time in my life. I can see a decade of living like it’s Covid turning my relatively healthy, relatively young brain into soup.
Leaning into stereotypes, the older women in my family did just fine in retirement because they just started doing social activities full time. If anything they retired and got busier. The older men sometimes did ok but usually did worse.
That is why volunteering when you are in retirement is a win-win. Very few others have the time for what is an absolutely necessary part of society, and it is great to keep your mind and heart active while you recall your life and use its lessons to give back to others. Any sort of volunteering will lend itself to that. For example, Jimmy Carter built houses, and it seems to have done him wonders.
Social interaction must be important, but so is the fact that work doesn't ask you how tired you are: you have a set of tasks, and off you go. As master of my own time, I can imagine I would veer towards more fun activities, which may lack that forceful aspect and would be done mostly alone.
And it's super true for those parents. My goal is to travel as much as my budget and health will allow. Backpacking all around Southeast Asia -- that's what keeps me pushing to work towards earlier retirement. Sitting at home unless forced? No thank you, that's a downward spiral.
>>Both my parents were in a huge rush to retire early, and now they just sit at home and scroll Facebook. I don't see the appeal.
My retirement plans look somewhat similar to how Knuth spends his time. Long hours of deep, intellectually challenging work. Driving long distances and eating tasty food somewhere far away.
Most of my retirement motivation comes from feeling the sun on weekdays. There is little point in sitting at home the whole day.
And from what I've heard on the grapevine, life expectancy drops among those who retire relative to those that don't. This makes sense: many people don't seem to know what to do with themselves if they're not "officially employed", so when they retire, they become aimless, and they sort of decay and disengage from living.
This is characteristic of acedia.
Though is that causation or correlation? I can imagine that people with all kinds of illnesses would also retire sooner than people who are still in peak health.
I (unfortunately) find myself doing this far too much after work, and am worried about what retirement might accidentally look like.
I'm 63, and still learn new stuff, every day.
I write code, pretty much every single day, and also, solve problems, every single day (7 days a week).
I think solving problems is important. Not just rote coding, but being presented with a bug, or a need to achieve an outcome, without knowing the solution, up front, is what I like.
Basically, every single day, I'm presented with a dilemma, which, if not solved, will scrap the entire project that I'm working on.
I solve every one (but sometimes, by realizing it's a red herring, and trying alternate approaches).
The abstract has these two statements:
> ”Cross-sectional age-skill profiles suggest that cognitive skills start declining by age 30 if not earlier.”
and
> ”Two main results emerge. First, average skills increase strongly into the forties before decreasing slightly in literacy and more strongly in numeracy”
Does this mean that this study contradicts the popular common understanding that cognitive skills decline after 30? Or am I missing something?
For me, personally, it feels like a more impactful finding than the “use it or lose it” one.
Yes, the study contradicts the existing understanding.
If you are older, I think the trick is to watch (or remember!) what younger people do and follow (or revert to) that behavior, as much as you can.
Comparing cognitive abilities between older and younger people fails to control for the inputs - behavior, experience, etc. Try the same inputs (using some big generalities):
* Exploration: Younger people love to explore, even just for exploration sake, and are also compelled to try things - and they also fail. Exploration is their mode, because so much of the world is new to them, because doing something new and innovative is socially admired, and especially because so many major changes happen - leave home, serious romantic relationships, first job, etc. A lot of that happens, ready or not.
* Learning: Similarly, younger people are compelled to learn lots of very challenging things, whether they want to or not; they are compelled to use cognitive skills that they are uncomfortable with. Their job is to learn, daily, for 12-16+ years. Remember school? Remember your early years at work, when you had little choice of what you did? Remember struggling with all those things?
* Playing: Young people love to play and are socially admired for playing better and more creatively.
What, you're past all that? Nobody is going to make you study things you're not interested in? Don't want to make any big changes? Dignity too big to play? Ego too big to explore and to fail? When you're older, you can say no, and 99.99% of people (I think that's about accurate) take advantage of that and refuse to do or even talk about things they aren't already comfortable with. Does all this sound too hard? Then don't complain about losing those skills.
I think a big part of the problem is the same that affects CEOs - there is nobody to hold them to account.
This makes so much sense. I've been programming every day since I was in my twenties and there are definitely some concepts that seem much easier for me to get my head around now (I'm in my 50's) than earlier.
Right now I'm reading through a college textbook on the biology of learning and memory with ease and good retention. Never got this deep into any subject in my school years.
> I've been programming every day since I was in my twenties and there are definitely some concepts that seem much easier for me to get my head around now (I'm in my 50's) than earlier.
Same same.
I figured this is because I have less energy, but a little more wisdom. I have much broader understanding of related concepts. So, things click a lot faster.
I’d like to remind everyone that learning to play music, learning dancing or sports requiring complex coordination also count.
Oh, and a question at the back of my mind: wouldn’t using AI to minimize the time all of us spend in the struggling-to-figure-something-out zone lead to earlier decline on a massive scale?
This study needs to capture the effects of sleep deficiency. I'm in my mid-forties and don't sleep enough anymore (6-7 hours at best).
What matters is the quantity of deep sleep and REM sleep.
REM sleep seems to be related to the archiving of events (memory formation), while lack of deep sleep affects the brain itself.
Pick up a smartwatch and track the sleep stages; the Apple Watch is reportedly among the most accurate.
As someone who plays a lot of board games — particularly heavier board games — and hopes to do even more of that in retirement, I’m wondering if/how that is helping/will help fight cognitive decline.
I can imagine at the very least it won’t hurt, and intuitively it makes sense. But I’m not sure studies have been done specifically to understand how board gaming — or the problems being solved with board gaming - helps with cognitive skills.
Curious if others that are closer to this field have thoughts.
I love me some board games, but I prefer depth and decision space to complexity -- and the industry is dominated by stupendously complex beasts full of unnecessary mechanics that slow things down or extend setup without adding much. A perfect example is TI4's expansion Prophecy of Kings, nearly all of which I despise for bloating a beautiful base game. I'm also always flabbergasted by how starved and railroaded I feel in games like Dune: Imperium or Cole Wehrle's collection. Despite a wealth of mechanics, my choices are few and far between.
Complexity has its place, especially for engine builders like Terraforming Mars where complex interactions are the point. Many designers seem to be throwing in the kitchen sink arbitrarily. We're in a "bigger is better" paradigm.
Older coders/technical folks tend to have more wisdom than raw compute (relative to younger coders who may have more raw compute than distilled wisdom.) Wisdom takes a more reliable and more efficient path than raw compute.
Both raw compute and wisdom will be eventually replaced by AI, but "deep wisdom" is largely held in the body, in the way we react viscerally to things, which AI as it is envisioned today does not factor in at all. So we still have a refuge in the wisdom stored in our body memory.
As an older developer who lately has been pairing with early career developers, I've been noticing lately how often wisdom comes into play. It feels like close to a daily occurrence where I suggest something is the cause, then later I'm asked how I knew that was the right thing to investigate, and the only response I have is that it's almost always the culprit.
That's why I don't do vibe coding and try not to use LLMs to generate code.
It literally speeds up your cognitive decline, as your brain shuts off and offloads all the heavy lifting.
Recreational travel is the only thing that routinely works for me in terms of slowing down time and fully engaging my brain. It's something I can incorporate into my life multiple times per year and it guarantees a massive amount of new stimulation (assuming travel to new and interesting places). Even the most rudimentary trip to Europe will have you grappling all day long with a different language and culture and environment in ways that are completely taken for granted in our day to day lives.
There's lots of things that can make an even bigger impact, like moving to a new place or starting a new career or school, or a new relationship. But those are things that sometimes only happen a handful of times in our entire lives.
Everything else, I find, eventually becomes routine, no matter how stimulating it is at the start. Not that we shouldn't try! Some stimulation is a whole lot better than none, and I have a terrible feeling that many people get little-to-no stimulation for weeks and months at a time (beyond a new TV show or podcast or political drama).
I think there's a valid concern about cognitive fatigue. It could be mentally exhausting to constantly "exercise" our brains just to maintain cognitive abilities as we age!
Maybe AI could be our mental gym buddy here - not replacing our thinking but offering just the right level of mental challenge to keep us sharp without burning us out. Picture an AI that knows when to push your intellectual boundaries and when to back off based on your energy levels.
And Neuralink-style brain interfaces? They could be like cognitive training wheels - gently supporting neural pathways while letting us do the actual pedaling. Instead of "downloading knowledge" (which sounds exhausting in its own way), they might subtly enhance natural learning processes or help maintain neural connections that would otherwise weaken with age.
The goal shouldn't be turning our golden years into endless mental marathons, but rather finding that sweet spot where cognitive maintenance feels engaging and enjoyable rather than like another chore on the to-do list!
> Two main results emerge. First, average skills increase strongly into the forties before decreasing slightly in literacy and more strongly in numeracy. Second, skills decline at older ages only for those with below-average skill usage. White-collar and higher-educated workers with above-average usage show increasing skills even beyond their forties. Women have larger skill losses at older age, particularly in numeracy. [emphasis mine]
So, it seems like workers with above-average usage of literacy and numeracy continue to increase their ability, while those in fields that don’t emphasize those would need some kind of mental “exercise”.
(I also note that some commenters here are rushing to add more cognitive work to their daily routine through additional studies, but I wonder if they’d be better off focusing on commonly neglected areas like physical activity.)
I'm skeptical. I was in my 40s before I graduated from college. Before that, I did some serious electronics work with a background from my local community college, worked production lines, and taught myself assembly language when I was engineering my first microprocessor-based design at work. Years later I couldn't get re-employed, a lot of potential employers simply not believing my resume content because my 'formal education' was lacking, so I went back to school, got my BS in 1999 and my MS in 2006, then continued working on personal projects and learning new coding languages on my own, since by then nobody wanted to take a chance on hiring an 'old' man. Their loss.
> I couldn't get re-employed years later, a lot of potential employers simply not believing my resume content because my 'formal education' was lacking..
I am 53 years old. I don't have a college degree. I have never been unemployed and have had good software development jobs all my adult life, including now.
It is possible, even likely, that your lack of a degree was not the issue.
Is self-employment better able to cope with brain changes?
This is a reason public college should be free.
It has never been easier to pick up a subject and start learning on YouTube or similar streaming platforms. Just Do It, folks! You can do it!
On nights and weekends, I have been learning home improvement, home automation, piano, Korean, and LLM tooling. All from streaming platforms.
Idk, at 29, 30, and 31 I became significantly smarter; it just had to do with things I was intensely interested in. Things that couldn't hold my focus simply no longer mattered. Fortunately I'm interested in engaging things that are hard.
Love it, maybe it is time to seriously pick up programming!
My uncle went back to university when he was 70 to get a degree in vulcanology.
For myself, while the learning curve is "longer" as I've gotten older, it also shoots sharply upwards as the time spent on the skill acquisition increases. Age has a magnification effect at the tail end.
I'm in my late 40s and I do pick up new technical skills a bit slower than younger folks. But because I have a lot of experience, I'm able to more quickly grasp various contextual aspects of those skills: how/why they are useful, how they compare to previous skills that tried to solve the same problem, the hidden costs and implications, etc. These matter a lot in the practical, everyday application of skills.
I find that younger people have a really hard time with those contextual aspects, or they don't think it's that important... until they discover they do.
It's time for our geriatric political class to be retired.
What about using Adderall to get an edge in cognitive skills?
For this type of research, the data sample is too small.
I'd like to see how much of the decline is correlated not with age but with Parent Brain.
The mental energy occupied by and spent with parenting is palpable, not to mention long-term continued stress, physical, mental and emotional exhaustion. I wouldn't be surprised if having kids (which is of course correlated with age) is much more of a factor than age itself.
I for one feel dumber than pre-kids.
I'm starting to lose it, I believe, thanks to AI.
Like many, I preferred the "internet" before it was this. The WarGames-like setups, mystical and empowering, and the wonder of the future: how information would free us, things we could never imagine. All to wind up with people staring at Instagram while driving and running into me and my dog, and with the world and companies like Apple and Samsung shoving AI elements down our throats.
So I plan my retirement, and it involves unplugging from all this. Then what? Live in a small cabin in the remote woods? Not sure that would go well.
So, this explains the average aging Trump voter's cognitive impairment? Well, at least it's not leaded gasoline or reality TV.
And yet the ranks of successful founders are dominated by people in their mid-forties. Perhaps there is something more involved with social function than pure cognitive skill?
Correlation/causation. People whose faculties are still intact are more likely to do and enjoy activities requiring those faculties.
Same as for muscles / physical skills, after all...
Does that imply that it also is part of character traits? As in use empathy, become emphatic, stay in a non-emphatic environment, your brain degrades you to a sociopath?
"You become the average of the 5 people you hang out with most" is a common phrase, there must be at least an ounce of truth to it.
I recently hung out with a friend I had not seen in close to a decade. He was at one time my closest friend, and seeing him was kind of uncomfortable and enlightening. I saw sooo much of how I used to talk and act still in him that it really had me wondering how much of that I'd gotten from him versus the reverse.
I've seen people age into the classic "grumpy old man" so there's something to it. But there's probably a lot more to it too I'd think.
I think it has more to do with getting desensitized to things the more you're exposed to them. With age you get more and more exposure to everything emotional and lose the strong reactions.
Add to that some frustration from not being able to keep up with things, health issues, no one "young" having time to hang out and your friends dying all the time and I'd be grumpy too. You were once a stallion taking care of everyone and now you worry about falling in the shower because you occasionally lose balance for whatever reason. And you know it'll hurt like a bitch, you'll break something and it won't heal for a year. It's quite humiliating.
I always assumed that it was something to do with people getting increasingly frustrated with the struggle of keeping up with stuff.
Chronic pain probably plays a part too. I know I get grumpy and miserable when I'm unwell or in pain.
I've attributed that to a decreased ability to deal with novel situations as we age. E.g. the world behaved differently than I was expecting.
Novelty is one thing it's definitely possible, and important, to intentionally keep exposing oneself to!
I am definitely grumpy. What makes me grumpy is the fact that society keeps banging its head against a wall for no good reason.
There is everything there for growth, and yet I see none. I get very tired of knowing exactly what the boring, selfish reaction of the person I encounter is going to be. I am sure I do the same thing, and I don't change much compared to the changes available to me. I do not lead by example at all the way I would like.
Nonetheless, what makes me grumpy is lack of change, not the superficial appearance of change with which technology distracts us. Moral growth would be so refreshing to see, but I see none of it - despite virtue signalling as a veneer from all parts of society.
Said more colloquially, a lot of older people just grow tired of all our bulls*t.
But all the objective bullshit still existed when we were younger! And it didn't bother us as much then.
I was much less aware of it when I was younger. Ignorance is bliss.
Spot on. People learn to recognize it from miles away and know what's up.
I think you mean empathetic, rather than emphatic.
As with most research around our scientific understanding of intelligence, I assume this only scratches the surface. There may be something to your comment.
There are trajectories of personality traits over the life span; I would hesitate to extrapolate them from the trajectory of cognitive abilities, though. One of the known life-span emotional/personality trajectories is the positivity bias: older people tend to be more positive. It is sometimes framed as negativity avoidance, that is, older people tend to ignore negative things more often.
Personally, yes. I absolutely have seen this in myself and moved to rectify it.
Personality disorders like BPD tend to attenuate with age, so you would be more likely to become less sociopathic.
Perhaps correlating with "ingesting more and more valuable training data"? Which is pretty much what CBT/DBT is supplying imho.