422 points by metadat
a month ago
Previous discussion: https://news.ycombinator.com/item?id=24537865
Academia is not the place to go to make useful novel contributions; there are too many confounding factors. First, the university owns whatever IP you come up with, and will happily exclusively license that IP to entities with no interest in developing it. Monopolistic corporate power centers aren't known for fostering novel competition to their established businesses. This is why it took so long - and serious capital - to get something like Tesla going, as the likes of Toyota, Ford, GM etc. were closely tied to Exxon, Chevron etc. via their investors, and had no interest in developing electric cars.
Any kind of applied research in the US academic sector these days is best understood as corporate R & D for established interests in pharmaceuticals, technology, industrial chemistry, etc. You're certainly not free to do blue-skies renewable energy research (budgets for that are still minuscule at best, and have been since the 1980s), for example. The job description has become indistinguishable from that of in-house corporate researcher - narrowly defined assigned problems are what you get to work on, and 'academic freedom to pursue the research wherever it leads' is a quaint myth.
The obvious fix is to eliminate exclusive licensing of academic IP (i.e. repeal the 1980s Bayh-Dole Act), which would force corporations to spend capital on their own private R & D divisions if they wanted exclusivity, and simply make any university-held patented discovery available to any entrepreneur who wanted to develop it for a small flat fee.
Until then the author is 100% correct about entrepreneurship being the only way out of the trap.
As I often say, open-source is the 7th wonder of the world. Never before have humans worked together on such a scale to create software that powers literally everything in the world, and even a helicopter on Mars.
And just to put it at scale, the engineers of Apollo were so early before our times that they had to code the timestamps in negative.
Isn’t it fair to say that most of the biggest open source projects and/or most useful projects are developed by people employed by either universities or companies?
> And just to put it at scale, the engineers of Apollo were so early before our times that they had to code the timestamps in negative.
I don’t understand how this relates to the scale of open source software or Apollo software engineers. I guess I know the Unix epoch is 01/01/1970, but how does that relate to Apollo and negative timestamps? Those timestamp definitions came after the first several Apollo missions, didn’t they?
> Isn’t it fair to say that most of the biggest open source projects and/or most useful projects are developed by people employed by either universities or companies?
Yes, except that those projects are largely tangential to the "real" research. Larry Wall, creator of Perl and other useful things, is a linguist who was "studying linguistics with the intention of finding an unwritten language, perhaps in Africa, and creating a writing system for it." Perl was just a side project.
> how does that relate to Apollo and negative timestamps?
I'm not certain, but perhaps the parent commenter wasn't being literal. Maybe just a way of saying that the Apollo engineers did their amazing work before what the rest of us consider to be the "beginning of time"?
Yes, GP here. It was a joke: Unix is so young that the moon landing is prehistoric to it; it literally belonged to another epoch.
I took the joke from this comic: https://www.commitstrip.com/2015/10/06/before-timestamp-0/
I work at a university and I only publish Open Source (GPL, MIT, or CC BY). It would be hard for my university to claim IP rights after I published something with these licenses.
At many universities (in the U.S. at least), on paper, it’s the technology commercialization department that makes the call about pursuing patent protection and the burden is on the PI to report all potentially patentable inventions before release so that the university has time to make the determination. In the case you describe, they’d in principle find you in violation of that policy, but in practice can’t because they don’t know about it unless you tell them.
Yes, I think you're right. It _depends_, as always. Mostly on the field, and in my case, on the country. I am in Germany, and universities here are mostly state-financed. There has been a pretty liberal trend at universities in Germany over the last 10 to 20 years, pushing towards open science for society (any society, not just Germans). We have such a department as you describe, too, but no one has ever heard of it and it's not really used; or let's say regulations exist but are not enforced. The university itself also pushes for Open Source publication and free sharing of results and technology, except in rare cases.
You do not nerdsnipe us! Not today, sir.
Wait a minute - is that true lol? I vaguely remember epoch time being a 32-bit signed integer, which seemed silly to me, but did they actually use that sign to represent dates prior to 1970?
Nobody would consider the present as negative. Maybe he meant that previous work had to be coded as negative, after the zero was set.
Or maybe it's a joke. We used to say that some guy has his ID number in Roman numerals... or used spears, for that matter.
I'm guessing that "Unix timestamps" were not used before Unix itself became widely used, which would be after the Apollo missions
Yeah, I don't know the history here, but in thinking it through, it occurred to me that computers might of course need to represent time before 1970 for some reason, hence the sign - though there is still the limitation of 32 bits. Time is one of those things that seems simple but always bites you somewhere - things like leap seconds, or whether Juneteenth counts as a business day, always seem to throw a wrench in.
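For anyone curious, here's a small Python sketch of both points (the landing time and the 32-bit boundaries are well-known figures; nothing here is specific to Apollo's actual flight software, which of course predates Unix time entirely):

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Apollo 11 lunar landing: 1969-07-20 20:17 UTC, a few months before the
# Unix epoch, so its Unix timestamp comes out negative.
landing = datetime(1969, 7, 20, 20, 17, tzinfo=timezone.utc)
print(int(landing.timestamp()))  # -14182980

# The same sign bit that reaches back before 1970 also caps the range of a
# signed 32-bit time_t: roughly 1901-12-13 through 2038-01-19 (the
# "Year 2038 problem"). Computed via timedelta to stay platform-independent.
print(EPOCH + timedelta(seconds=-2**31))     # 1901-12-13 20:45:52+00:00
print(EPOCH + timedelta(seconds=2**31 - 1))  # 2038-01-19 03:14:07+00:00
```

So a signed timestamp does let you express the moon landing, but the sign costs you half the range on the other end.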
Have you seen the quality of most open source software? It’s a joke. Most of it is about people pushing their own agendas and self promoting
There's so much good open source software that I'm not even annoyed at how wrong your statement really is.
Well, they're right in that all software is a mess, but OSS is certainly not worse than anything else.
> Have you seen the quality of most open source software? It’s a joke.
Sadly, yes, you're not wrong. However, have you seen the quality of most proprietary software? No, you haven't, no one has, because it's proprietary.
> Most of it is about people pushing their own agendas and self promoting
Perhaps, but some of it is not, and that is the difference from proprietary software.
> First, the university owns whatever IP you come up with, and will happily exclusively license that IP to entities with no interest in developing it
That's simply not true.
Academia has a lot of problems, we don't need to start making up new ones.
Universities want money. The #1 concern, by far, of any university technology licensing office is to get those patents working so that they can reap the benefits in terms of huge licensing fees. Literally, no one wants to license anything to anyone who won't develop it.
Also, university licensing offices always have deals where the researchers get the right of first refusal with their own IP.
> Any kind of applied research in the US academic sector these days is best understood as corporate ...
Academics do very little applied research. Overwhelmingly, we do what is called basic research. Finding cool new ideas. Then, we give them to those corporations that do the overwhelming amount of applied research. This is how the system is supposed to work!
> The job description has become indistinguishable from that of in-house corporate researcher
No idea what university you're at, but this is totally false for any university I've seen.
> The obvious fix is to eliminate exclusive licensing of academic IP (i.e. repeal the 1980s Bayh-Dole Act)
So... you want academics to do less impactful research that has fewer applications? Because that's the outcome of denying us the ability to patent our own work.
> Literally, no one wants to license anything to anyone who won't develop it.
Yes, but that doesn't prevent them from licensing things to organizations with a lot of money regardless of intent. If there's enough money involved, say a multi-million dollar endowment, the academic administration wouldn't give two shits about whether or not the licensee develops it.
> Yes, but that doesn't prevent them from licensing things to organizations with a lot of money regardless of intent
Why in God's name would a corporation with a lot of money pay for a patent they don't want to develop? Wallpaper is a lot cheaper at Home Depot.
Seriously. This doesn't pass the most basic "is this remotely plausible" test.
Anyway, universities tend to give licenses that make them money as the patent is used. There is no world under which they want them to be wasted.
If you don't believe that patents and corporate secrets are bought and kept under wraps to protect profits for existing business lines, I can't disabuse you of your naivete.
If you sell widgets for big bucks and someone invents a nearly free widget replacement, it could interfere with your primary income.
Point to one example of this happening. Since patent ownership is by definition public information, it should be easy to find an example. I'll wait.
The forerunners of ExxonMobil patented technologies for electric cars and low-emissions vehicles as early as 1963.
And as the post points out... the people reviewing your work may not even know what you're talking about. To quote:
> Reviewers at the Journal of Cryptology didn't understand why they were being asked to read a paper about CPU design, while reviewers at a computer hardware journal didn't understand why they were being asked to read about cryptography.
Why would someone be beholden to these reviewers if their expertise is not adequate or too constrained to review the submission? The author seems to be describing Spectre as well.
He was specifically going to be an academic mathematician, which would probably have allowed him to publish discoveries without patenting or licensing them.
MIT is an R&D corporation which has a college attached. MIT filed 358 US patents and had 435 issued in the year ending June 30, 2021.
The correct fix is to make the costs of developing such innovations within the personal reach of those who can imagine it.
So many of our scientific advancements, especially before 1800, came from the idle rich - those free to tinker and experiment.
Imagine what could be done if we lifted the majority of the population out of wage slavery with a UBI or similar - how many Newtons, Laplaces, or Fermats are currently toiling in obscurity?
> how many Newtons, Laplaces, or Fermats
. . . are currently playing the equivalent of Cookie Clicker?
Probably some of them! I would be willing to bet that a lot of the "number go up" reward in our brains is especially rewarding when we feel trapped in a job/life without notable accomplishments.
That is, if you don't have the time or energy to take on fulfilling creative projects, you can at least fake the sense of accomplishment with video games.
In Sweden academics own the IP they create. Makes it easier to be both an academic and an entrepreneur. But other than that it doesn’t help much. Academia still seems like a trap to me.
There's a big truth here that screams out to me. If you want to do great work, you need freedom first.
If you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible result being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want one kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it.
-- from the biography of J. J. Thomson
Amen. It's really a simple thing to understand. But most people involved in academia don't (want to).
we all know. we just can’t figure out how the fuck to get it funded.
That sort of blue-sky culture needs a whole lot of disinterested money, and that’s hard to come by. In previous generations, we’ve used monopolists, the gentry, and other ways…always a challenge.
Sadly there's plenty of money being spent in the world every second, and even 1% of it would be enough to fund most of the science on the planet for years into the future... :(
You wouldn't believe how much money top Twitch / YouTube streamers make in an hour. I understand that they provide entertainment / influence and many people need that and pay them -- it's fair business and I don't look down on it. On the other hand... aren't there like sooooo many other and much more pressing concerns in the world to fix with surplus money? Yes, there are.
So yeah I sympathize with the "we can't figure out how to fund proper free research". It's really a shame on humanity.
Do we really want to speed up the rate of research much more than what it's already at? Society can hardly keep up with the speed of change we have now. At least the Twitch streamers are a mostly neutral force in the world. Can't say the same about those researching facial recognition or new missile guidance systems.
Quantity is not the same as quality. I obviously don't have a detailed view on this but judging by HN and a few other aggregators, modern science is much more concerned with keeping the lights on and producing bullshit outcomes only serving to further one's career in the same area. Perverse incentives. Having free research funded will alleviate part of this problem.
We need a lot of good innovation right now and that's still missing -- starting with better (small) batteries.
The world doesn't really need to pay yet another batch of 20,000 students trying to invent better cat-photo classification NNs. Yet a lot of money is poured into such endeavors.
Part of the problem is that the way we conceptualize money leads us to treat it as if it's a real thing, as opposed to an accounting system the state creates for various purposes, some of which are desirable and some of which are not.
If it can be done, it can be afforded. Only if it is physically impossible can it not be "afforded". That is where we ought to start as a society. Where will we get the money for a universal basic income? There are an infinitude of options--the state literally made the stuff up. The question isn't why there is so little money for so many important things, and so much money wasted--the question is why, despite being ostensibly democratic, we are so bad at statecraft (and have been since the Boomers took charge).
Problem is, not everything that can be done can be afforded (at least, not all at once).
Or, to put it in Thomas Sowell's terms, economics is about the allocation of limited resources that have alternate uses.
You want the government to print a bunch of money? Great. Now we have a bunch of money. We don't have any more actual stuff to buy with the money, though. Are we any better off? No. Is science any better off? Maybe - depends on what we do with the new money. Are the poor and disadvantaged any better off? Maybe - it depends on how much money is created, and how that money is distributed.
Sounds like what you really want is to smooth out who has access to how much stuff, and also support science better. Printing a bunch of money (and distributing it) might do it, but it depends a great deal on the details.
What you're saying isn't wrong, but there are obvious solutions in front of us. We implement a universal basic income (indexed to inflation, so it absorbs any inflation it causes) to keep the minimal living standard high, we increase basic research funding by an order of magnitude or more, and we tax the hell out of the rich. The only reason this isn't happening is because so much of the putative "leadership", in every organ of society, is corrupted by the hyperconsumptive billionaire vampires that these institutions are supposed to regulate (or, better yet, prevent from existing in the first place).
We face, in 2022, a number of problems that are actually pretty easy to solve, but the people in charge won't let us. Well, history tells us exactly what to do when we face that particular problem...
> and we tax the hell out of the rich
If you are a developer in the US, you are the rich.
Whenever people say “tax the rich” what they invariably mean is tax those that are richer than they are.
If you are arguing from a moral standard, then shouldn’t most Americans be taxed and the actually poor ($2 per day) around the world receive those taxes?
> hyperconsumptive billionaire vampires
To the average observer outside the USA, the vampire is mostly the USA itself, using its economic might to effectively tax most other countries through many means.
Disclaimer: I'm not against the US, but definitely against some of the effects one of the richest countries in the world (per capita) has on other, poorer countries.
Edit: just noticed this quote: “In 1963, 20 percent of Americans lived in poverty. Today it’s 2.3 percent.” If you are not part of that 2.3%, then you are likely the rich of the world that should be taxed?
> Whenever people say “tax the rich” what they invariably mean is tax those that are richer than they are.
Not at all. It's usually meant as "people who have much more money than they objectively need for survival, owning an estate and having some leisure".
No need to resort to a straw man to win a discussion, please.
> If you are arguing from a moral standard, then shouldn’t most Americans be taxed and the actually poor ($2 per day) around the world receive those taxes?
Good idea, a lot of people would welcome it. Although me personally, I wouldn't want USA to become the hub of the world. History has shown us what extreme centralization leads to.
> Which to the average sample of some outside the USA, the vampire is mostly regarded as the USA, using its economic might to equivalently tax most other countries through many means.
Strange thing to say. Plenty of European billionaires are out there as well. Not to mention Chinese and Russian, and who knows how many from other Asian countries.
> Edit: just noticed this quote: “In 1963, 20 percent of Americans lived in poverty. Today it’s 2.3 percent.” If you are not part of that 2.3%, then you are likely the rich of the world that should be taxed?
The biggest problem with these discussions, every time, is that you don't consider that there are a lot of people out there with huge vested interest to misrepresent these percentages. The naive faith in institutions and democracy at large, while commendable, is a huge detractor from the quality of any politico-economical dialogue.
The observable facts -- if you don't deliberately close your eyes, which many do because it's easier to pretend that when you're well-off then everybody else is -- are that democracy has been co-opted and abused as a term while the world is in fact an oligarchy and a rich people club that invents laws to legitimize their behaviour. And that's been true for centuries. But I know I will not convince you. Your criteria for this is likely "well, I don't see armed men trying to storm my house, hence democracy exists" or something similar and thus our discussion is doomed from the start.
Back to "statistics", if you lived in a former Soviet satellite (like I did) you would see through these published percentages in five seconds. There are PLENTY of techniques to misrepresent information and they don't even have to lie -- that's the beauty of it (example: "What is poverty? Let's lower the minimum resources needed to define it. Boom, we have much less poor people officially now! Lol, good job boys, let's go to Thailand and spend money with funds from the fundraiser we have scheduled there.")
And so it goes to eternity.
Because the Soviet Union collapsed.
Right. We didn't actually "win" the Cold War. As soon as "communism" (though it was never actually achieved) was perceived as defeated, the capitalists no longer had to justify themselves, which gave them license to tear down the regulations that restrained their worst impulses.
This led to the dismantling of the "middle class", which is and always has been a state creation. Once the capitalists saw the end of an existential threat--the idea that socialism might work due to its ostensible prominence elsewhere in the world--they realized they no longer needed a middle class. So what is now the former middle class was left to twist in the wind.
The US contributed to the destruction of the Soviet Union by forcing it to spend the bulk of its resources on its military. Granted, the Soviet system was suboptimal in a number of ways, but the constant threat of imperialist/capitalist aggression didn't help... war benefits capitalists and arms dealers, but is historically bad for the common people. It didn't help that a number of important figures within the USSR's final years (e.g., Yeltsin) were corrupt (ideologically and personally) and sought to undermine the system from within for their own financial and political gain.
Capitalism's "winning" of the Cold War was one of the biggest humanitarian disasters of our time. It enabled the end of "nice guy" capitalism and the dismantling of the US middle class, but it also created widespread poverty while it stripped hope from the people of the former Soviet Union. (The reason Russia has done so much evil shit over the past 15 years is that hopelessness creates a market for hubristic imperialism of the kind Putin is selling.) It enabled an attempt to reconfigure the global economy into a for-profit police state that has cost millions of lives in the Middle East. The bipolar war-economy world was a pretty awful place, but the unipolar capitalist world is even worse, and the only thing that has kept us from seeing it is inertia--people who still believe in capitalism have already parted ways from the cliff, but haven't looked down yet.
Although, of course, the failure of the Soviet Union didn't actually prove that socialism can't work. It existed in a state of siege socialism from which it was never able to extricate itself. Given the harsh conditions from which it emerged, as well as the fervency of the far-right (capitalists, fascists) in their will to kill it, the accomplishments of the USSR are considerable. However, it wasn't a great place to live. It remained mostly poor (because of said initial conditions as well as necessary but massive war expenditures) and was authoritarian to a fault; its great sin was that it was a system of economic totalitarianism--the economy literally controlled everything, from where people could live to what they could say--but, then again, so is ours.
> Capitalism's "winning" of the Cold War was one of the biggest humanitarian disasters of our time.
Whoa. You're completely ignoring the fact that the USSR's collapse allowed a lot of nations, comprising easily over a hundred million people, to gain independence. A lot of them, since freed from Moscow's yoke, are thriving in an unprecedented manner. It's basically the opposite of a humanitarian disaster for them.
We can't be sure about alternate timelines, but it's a strong possibility that, had the USSR been left in peace, it would have achieved an even greater prosperity by now. The USSR was on a path to moderation: Khrushchev was not Stalin, and Gorbachev was quite moderate (to a fault, some would say, given the results of his tactical mistakes). They might have actually achieved something like communism by now. We'll never know; we didn't let them find out.
I do agree that Russian jingoism was a problem with the Union, especially for "buffer states" (e.g., Poland, Hungary) whose citizens were treated badly in the midcentury. This likely would have moderated over time under socialism. (Under capitalism, for countries still within Russia's reach, it has not.) In fact, even in the SSRs considered "lesser" than Russia, approval for the USSR and the desire to remain were well over 50% even up to the point of dissolution. The people wanted to fix the Soviet Union, not break it.
As for the prosperity in Eastern Europe, that tends to be the case in countries that remained mostly socialist, like Czechia. It's true that they have mostly escaped the horrors of the post-Soviet 1990s and 2000s. What is less clear is how stable their well-being is. I hope this forecast is wrong, but I'm afraid that the EU is not that far behind the US on the path to corrosion, corruption, and ruin. What is happening in the US, I'm afraid, is coming for everyone. Where do the worst people in the world--figurative reptilians, literal pedophiles--meet up every winter? Not in the US, but in Davos, Switzerland. This is a global problem.
Are the people of the FSU better off now than they would have been, had the USSR persisted? It's impossible to know for sure, but I think a strong case can be made that it would have developed a limited market economy, one that avoids our issues of extreme inequality but provides the benefits market systems can (e.g., increased consumer choice). Of course, the USSR did have a lot of problems, especially toward its end, though a number of those problems were caused by external forces (capitalist aggression). On that, and on the notion that capitalism's victory might itself be evidence of our economic model's superiority... I have strong doubts. History tells us that geographic and technological forces have more to do with which side wins a war than having the better economic system, and in the case of the Cold War, this was a matchup between a sea empire (capitalism) and a land empire (the USSR). A land empire has to try to assimilate people, which is hard; a sea empire can dominate them from afar (cf. Latin America). The Soviet Union started from behind, both because it had to integrate some very poor geographic regions, and because it bore the brunt of the casualties in World War II. It was never going to catch up, not unless we let it (which we didn't).
Although we can't know for sure what a 2022 Soviet Union would have looked like--Russia itself would probably be far less belligerent--we do have more than 30 years of data on the trajectory of capitalism, once this threat to it was vanquished... and those data are ugly. For at least two decades and arguably three, the reigning economic system has produced nothing but a devolving culture, absurd political polarization, and a collapsing standard of living.
> but I'm afraid that the EU is not that far behind the US on the path to corrosion, corruption, and ruin. What is happening in the US, I'm afraid, is coming for everyone.
If that's how you see the US, then I'm afraid our perceptions of reality just differ too much for us to be able to have a conversation. (My take on the US is that it's one of the strongest and most robust states in the world. Objectively, it's not doing great, but subjectively - the rest of the world is mostly an even greater mess, so the US comes out on top.)
So, off the top of my head:
- Regular school shootings;
- Teachers living in their cars;
- People being charged $8K if an ambulance picks them up after an accident;
- One of the highest proportions of the populace with cardiovascular problems and diabetic conditions;
- Incoming real estate bubble pop;
...does not give you pause and make you think that maybe, just maybe, the USA isn't as great as you think it is?
Empires don't fall overnight. The Roman Empire's history -- and the works of fiction based on it, one of which is Asimov's "Foundation" -- demonstrates that once certain positive forces vanish, the collapse becomes inevitable, but it happens quite slowly, like a tumbling giant. This makes it easy for people to keep pretending that things are going great and everything else is just blips on the news.
(Until one day the courier carrying their package gets robbed and the company shrugs it off and says you won't get a refund. Then it will hit close to home and you'll start believing it, I presume.)
Also sticking to the view that the rest of the countries are worse off than USA is severely mistaken -- but as you said, at this point our perceptions of reality just differ too much indeed.
Don't give the USSR a free pass because it faced 'capitalist aggression'. The USSR invaded Eastern Europe and explicitly annexed half of Poland while in a literal pact with the Nazis. It almost nuked China, and starved millions to death in Ukraine, all of its own volition. All of that was in living memory at the time of its collapse - it was a monstrous zombie that deserved to have a stake driven through its heart.
I say that as someone who actually has a pretty balanced perspective on 'capitalist' aggression too which has historically been severe and also with many unforgivable atrocities. But the USSR was big and powerful enough that its hand was not forced in the atrocities that it enacted.
> it's a strong possibility that, had the USSR been left in peace, it would have achieved an even greater prosperity by now
Late stage communist countries couldn't manage the manufacturing of toilet paper, there were shortages of everything. There is absolutely no reason to believe they could have achieved any prosperity, much less comparable or greater.
USSR didn't fall apart because of external aggression. It was completely inefficient due to its economic system.
Money is not an accounting system created by the state for various purposes. It’s a solution to certain frictions in a barter economy. It is very real.
> If it can be done, it can be afforded. Only if it is physically impossible can it not be "afforded". That is where we ought to start as a society.
Isn’t this just Marxism? Non-commodity money and all that. I’m not really familiar with his work.
> Money is not an accounting system created by the state for various purposes. It’s a solution to certain frictions in a barter economy.
That's a story that's often told, but it doesn't seem to actually be true. An examination of the historical and anthropological record shows that barter systems only exist in societies that formerly had money, but no longer do (for example, because the empire that was backing the currency collapsed). In societies that never had money, you see a variety of things, mostly formal gift economies, but sometimes things like tracking and cancellation of debts not denominated in a single unit.
If you'd like to learn more about this, read "Debt: The First 5,000 Years" by the late anthropologist David Graeber.
Can't we just invent some kind of Kickstarter for research? Certified researchers pitch something to millions and people donate small amounts to fund them? Some guy got $300k from the internet just for working at Burger King.
You could do that on kickstarter. An audience exists there already.
Although it's mostly small investors / backers there. You were thinking more of the bigger investors, weren't you? Research is expensive indeed...
Unfortunately democracy doesn't do great in areas where technocrats are needed.
Um, a lot of research is, in goal, substance, and subject, completely incomprehensible to non-specialists. So in practice I see a lot of obstructions to this method :/
To a first approximation, what you're proposing is that citizens pay their taxes, and then the state funds research at accredited non-profit institutes of higher learning.
It's the best we have right now; it is not without drawbacks. You're not going to escape wanting to reinvent the grant proposal system.
If you know, you'll find a way to do the research you want anyway. So, no problem here. But I guess the problem is finding out which research you really want to do. You might find other aspects of life more rewarding / interesting / important.
Something I've learned is that most people are only really good (that is, good enough to contribute notably and distinguish themselves) at one thing. A few of us are good at two things, but even there, those two things tend to be related more closely than it seems at first. This is part of why our society has such horrible leadership, especially in the private sector. People who are good at climbing bureaucratic hierarchies, most typically, are not good at anything else: not very smart, not very ethical, etc.
Academia was supposed to protect people from having to do the things they're bad at--such as sell their ideas to people who lack the ability to deeply understand them--and, unfortunately, it went derelict in that duty a long time ago. Salesmen, in any field other than sales, are like an invasive species who outcompete the natives with their superior social polish, but who struggle to actually do anything, which leads to systemic underperformance and the erosion of trust, which only makes the competitive necessity of salesmanship even greater, setting in motion a death spiral from which a private entity (or, in the long term, an entire industry) almost never recovers.
> Salesmen, in any field other than sales, are like an invasive species who outcompete the natives with their superior social polish, but who struggle to actually do anything, which leads to systemic underperformance and the erosion of trust, which only makes the competitive necessity of salesmanship even greater
As a graduate student it's hard not to internalize this framework even though it's soul destroying and leads all too predictably to poor science
If you're very lucky your career success might be orthogonal to or even aligned with your sense of curiosity and intellectual honesty, more often they're antipathetic and you will have to sacrifice one for the other
The problem is that a few people would use their leisure to do important basic research, while most would use their leisure to do... leisure.
“A room of one’s own” by Virginia Woolf is a great book about this topic. Financial freedom precedes creative thinking.
I recently wrote about and shared this: https://news.ycombinator.com/item?id=31956590
"page not found"
not the HN ofc, but the website you linked in the thread
That's unfortunate. I feel like surge.sh has been having issues as of late.
> Sure, I have customers to assist, servers to manage (not that they need much management), and business accounting to do; but professors equally have classes to teach, students to supervise, and committees to attend. When it comes to research, I can follow my interests without regard to the whims of granting agencies and tenure and promotion committees.
As a fellow developer/owner, this is the mantra I'm always repeating.
Is running a small software biz a great job? Meh. The job has higher highs and lower lows than when I was a dev for other companies.
At the end of the day, there's no better way to make sustained progress on groundbreaking, long-term dev projects than to run a company dedicated to building such projects. If one is fortunate enough to find a stable cash source (like Tarsnap), it serves as a perpetual license to work on the most interesting dev projects one can fathom succeeding at.
I have a tattoo on my arm of some (ostensibly) last recorded words of Évariste Galois. A guy who did more for human progress than some whole nation states.
@cperciva is in the k-equivalence class of people whose ability we lack the granularity to measure. He won the Putnam before making multiple unanticipated contributions to security in general and the security of FreeBSD in particular.
@pg has tagged him in particular as notable.
If he spends his time making information more durable in a probably secure way as opposed to folding proteins, I’m going to leave it to him to have decided between multiple highly compelling outlets for his genius.
In my observation he’s too humble to reply to the ankle-biting haters, but I have no such qualms: this guy is a fucking legend and if you disagree let’s have an argument.
Until a man is twenty-five he still thinks, every so often, that under the right circumstances he could be the baddest motherfucking programmer in the world. If I moved to an APT in North Korea and studied real hard for ten years. If my family was wiped out by Colombian drug dealers and I swore myself to revenge. If I got a fatal disease, had one year to live, devoted it to wiping out side channels. If I just dropped out and devoted my life to wizardry. I used to feel that way, too, but then I ran into cperciva. In a way, this is liberating. I no longer have to worry about being the baddest motherfucking programmer in the world. The position is taken.
For people who don't get the reference, this is an adaptation of a famous line from Snow Crash, by Neal Stephenson.
I liked the style of that book, fun story. Except for the 15 year old getting it on with an (estimated) 30+ year old.
But I can see how and why Mark thinks he should build the Metaverse ... (and why I'll never visit it)
> studied real hard for ten years
Somewhat related thought. I wonder about this with regard to AGI, I have this fantasy of a symbiote/friend thing. But it is not easy to make that. I don't know if I have the real/actual drive to do it. I'm also a mediocre programmer/developer. That kind of passion where you do it every free moment of your time/not monetary based. It is not for me. I tinker on CRUD stuff/low-end robotics on my free time. So yeah not sure of effort vs. talent/genius.
> That kind of passion where you do it every free moment of your time/not monetary based
I used to have that passion. Every waking moment devoted to figuring out cool things. Or at least what seemed cool to me.
Then one day I joined a rocketship company and all my side projects and ideas started to feel small, boring, and insignificant. The grandest idea I could come up with couldn’t hold a candle to even the most mundane problems a real business in the hockeystick part of the curve comes up with.
Sometimes I miss the motivation to tinker. Sometimes I cherish my newfound ability to relax. Perhaps one day I will tinker once more.
I had the opposite.
I used to work for one of the most famous imaging corporations in the world. My little team was part of an elite, worldwide, organization that created instruments that sold for tens of thousands of dollars, and were used to capture some of the most iconic images in history, and make some of the most astounding scientific discoveries.
Since leaving, I have been working; mostly alone, or with very little assistance from others, so my scope has been drastically reduced. I write little apps.
I love it. I am constantly working on code. My GH ID is solid green (no exaggeration), and is not "gamed," the way some folks do it.
I have no intentions of ever working for anyone again. I was constantly told that I was wrong, that my work was insignificant, that I should not cast my eyes heavenward, etc.
These days, I set my own agenda. I design my software the way that I always wanted (and was never allowed) to, and I am absolutely thrilled with the results.
Turns out, I was absolutely right, the whole time.
I just recently started on my journey. I'm currently building my own infrastructure stack/platform to suit my needs. https://www.adama-platform.com/
That looks pretty interesting.
I wrote something like that, a few years ago; mainly to practice. I am now using it as the backend of the app I'm writing.
Yeah having a full time job does limit you a lot with what you can do after the day is done.
When I was younger (10+ years ago) I had all kinds of crazy ideas/things to make. High traffic websites, get rich, that kind of thing. But they were all dumb... unvalidated, went nowhere. But now that I know how to build things (web) I don't have any ideas. Funny how that works.
When you look at indie projects seems like there's so many niches/ways to make money out there. But personally I found it hard to do hence 9-5er.
This is why, instead of actually executing my ideas, I at least define them meticulously. I have a book filled with (to me) brilliant ideas.
Who knows what will happen with them.
Just don't pay $99 (non-refundable) to sell your idea
You should not compare your own ideas to others. If an idea is extremely stupid but at the same time it gets you going in the morning you should stick to it.
Some of the most useful inventions that we use today every day have been made by very stubborn people who were told thousands of times that their idea was stupid.
I still have the motivation and desire to work on some very interesting and super hard stuff one day -- and I am 42. You should get back to that enthusiasm. Maybe that business' ideas and actual products are much more useful and interesting on a general socially-accepted level. That doesn't mean that your thing isn't the best in the world to work on for you.
So IMO get back to doing your own stuff when you have the time and energy for it.
>Some of the most useful inventions that we use today every day have been made by very stubborn people who were told thousands of times that their idea was stupid.
That's true; Zuckerberg's parents told him that the Facebook idea was stupid and that he should finish his Harvard degree. Larry and Sergey couldn't sell Google to Excite for $750k because Excite's managers thought Google was unnecessary.
I appreciate the sentiment. Truth is my ideas stopped getting me going in the morning because the dayjob feels like such a bigger better more interesting opportunity.
I’m sure exciting ideas will come again. Until then “Enjoying my dayjob too much” isn’t a bad place to be :)
I agree and I am happy for you that you have that going for you. Personally for 20.5 years of career I have only had 2-3 short-term contracts that truly interested me.
I’ve found myself in a similar position. I’ve often wondered whether the cause is one specific thing (eg my ideas seem small / less impactful than other projects I’ve worked on), or the culmination of several things.
I suspect in my case it’s a combination, but with a big dose of “been there, done that”. There’s only so many times you can get excited about a new tool or optimization.
The answer is, of course, to dive into something that isn’t tech. There’ll be a host of problems that seem/are fresh and interesting.
Recent real estate chaos and the vast increase in cost of living make me wonder how many people in the tech industry might right now be literally living in storage units, as described by Hiro Protagonist.
In 2007, I lived in my car and just had a storage locker. I mostly slept in the computer science building.
It took me roughly 15 years of dealing with this bullshit world to get back to the calm of that time.
The author writes: "Is there a hypothetical world where I would be an academic working on the Birch and Swinnerton-Dyer conjecture right now? Sure. It's probably a world where high-flying students are given, upon graduation, some sort of "mini-Genius Grant". If I had been awarded a five-year $62,500/year grant with the sole condition of "do research", I would almost certainly have persevered in academia and — despite working on the more interesting but longer-term questions — have had enough publications after those five years to obtain a continuing academic position. But that's not how granting agencies work; they give out one or two year awards, with the understanding that those who are successful will apply for more funding later."
These awards certainly exist; the Miller institute at Berkeley, the Simons fellows in NYC, and the Clay Fellows are a few examples. (Some not quite for 5 years, but more than 2. Also, the pay is more like 90k.) The NSF math postdoc is also close to this (2 years full funding if you elect to not teach those years, plus a third year where you have to teach.) Regular NSF grants and NSF CAREER awards are also longer, IIRC, though perhaps less relevant to fresh PhDs.
So, in fact, these "mini-Genius Grants" do exist.
It's worth emphasizing this part of the quote from the author:
> But that's not how granting agencies work; they give out one or two year awards, with the understanding that those who are successful will apply for more funding later.
I've worked for a funding agency. It's tough to give out five year grants with no deliverables other than "be a genius". When you are funding basic research you have to be prepared for the idea that lots will fail, and many people will decide to leave. How do you know who is a genius when they are finishing their PhD? Everyone finishing a PhD at Oxford/MIT/Stanford/etc has a very impressive resume and the reality is you can't fund them all.
Two year grants seems a reasonable compromise here.
> How do you know who is a genius when they are finishing their PhD? Everyone finishing a PhD at Oxford/MIT/Stanford/etc has a very impressive resume
Impressive resumes, sure. But my experience has been that even when students are starting their doctorates there's very clear tiers -- even out of the pool of doctoral students in Oxford.
(In case anyone is thinking this is hubris on my part: I would say I was towards the top but not at the top. The most impressive student I met in Oxford was a mathematician by the name of Lillian Pierce -- and she wasn't even doing her doctorate yet.)
Suppose you wanted to arrange to give the top 10% or so of math Ph.D.s who graduate from Oxford the sort of no-strings-attached five-year grant you were talking about. How would you set up the decision process for deciding which graduates to award the grants to? Would the divisional chair at Oxford assign the grants? Would the graduate students vote on who should get them? Would the Royal Society choose the recipients?
I don't want to claim that nobody was trying to influence your opinion about who the best students were, when you were at Oxford; of course it's common for people to tout their own achievements and those of their friends. But I suspect that if you were in the position of directing millions of dollars a year of no-strings-attached funding, and people knew that, the amount of effort that went into influencing your opinion would be an order of magnitude greater, and might be sufficient to confuse you.
(This pickle is presumably obvious to you but the solution to the pickle isn't obvious to me, so I'm hoping you can outline a solution.)
This kind of decision is already routinely made by committees awarding NSF postdocs, Clay fellowships, etc. What's wrong with the process they use (and independent expert panel)? Do you think they select the wrong mathematicians?
cperciva's criticism of this process is, as I understand it, that the grants they give are too short, and that this, together with the selection criteria, creates a too-strong incentive to pursue short-term, low-risk results.
> But my experience has been that even when students are starting their doctorates there's very clear tiers -- even out of the pool of doctoral students in Oxford.
This might be true, but it's really really hard for someone who doesn't work with them day-to-day to distinguish.
Your article pointed out the same problem in getting novel work reviewed - it was extremely hard to find people for that. I know in my case it was even worse - funding proposals generally come from research group leaders, and there at least you have a body of work to look at and can judge their likelihood to succeed.
But if you are trying to look at an individual's work how can you judge it when all you have to go on is peer reviews and maybe interviews with supervisors or peers?
Supervisors' motivations are already complicated - most of the time they would want an individual in their group funded because their work is generally aligned with that research group's field.
It's a pretty tricky problem.
> This might be true, but it's really really hard for someone who doesn't work with them day-to-day to distinguish.
It's really not (in mathematics, the field under discussion). It's not hard to look at the CV of a graduating PhD student in math and tell whether they're a semi-reasonable candidate for the Clay Fellowship, or the other fellowships listed above.
Among the small group who makes that cut, of course you need to rely on letters of recommendation, expert assessments, and so on. But the initial cut is fairly straightforward.
It's interesting. We were mostly funding at the PhD level, and in computer science.
By contrast, I read the links for all the current Clay Fellows and noticed that maybe 60-70% of them had one thing in common: They had proved some long standing (30 year+) conjecture.
I agree that's a pretty good sign.
Of course the program they did their PhD through is a pretty good initial screen too:
Imperial College London: 1
UT Austin: 1
I've had more of an urge to do some research on CS topics lately (beyond the usual you can find in textbooks), but as someone who was never in academia, and only got an undergraduate degree, I don't really know how to approach it as an outsider, like what sources I should use, how to find something to focus on, etc.
Does anyone else have any experience with this and have any recommendations for how to approach it? I imagine I'll probably have more success in areas that are game or recreational mathematics related (like Martin Gardner), at least to get started.
You could look for academic conferences in your area of interest and see what people are working on. It is not necessary to have a particular degree to submit your work to a conference. Since most review processes are double-blind, no one would know at submission time anyway. Though knowing the process and what a "typical" paper looks like - areas where people in academia have an advantage - certainly helps a lot.
 E.g., https://www.computer.org/publications/tech-news/events/2022-....
While I also only completed an undergraduate degree, I did spend a few semesters doing research with a professor and ended up being co-author on a paper with him. Doing a first paper without the guidance of a researcher in the field would be very hard, but doable, and certainly performing research is doable.
I'd start with figuring out roughly what you want to investigate. Try to narrow it down as much as you can, but I understand that it'll start broad and probably eventually narrow down after going down a few rabbit holes. Eventually you have a pretty specific field you know you want to do research in. Find the top conferences in the field, look through some of the papers that seem interesting, and look at what they cite. You should be able to compile a list of textbooks and foundational articles that crop up often - read all of these. Maybe even try to replicate the results of some of the foundational papers - a lot of the time new work is extensions of the foundations, and people writing papers started out replicating those initial works anyway. If you're struggling to narrow things down or figure out what's important, focus on papers from one researcher or research group. Maybe even send them an email if you want their advice on what to read.
Finally, you have a decent grasp of the underpinnings of this narrow field and have seen some examples of how it's been extended in modern times. You might now have some ideas of what to experiment with in your own research, or if not you can try replicating more modern papers and see if that gives you ideas of what more to investigate.
Hard question to answer without specifying topics. The usual way is to find an introductory textbook on the subject, find the papers and authors referenced for results and theorems and work forward through the literature. This will give you both historical perspective and develop your knowledge at a gradual pace.
It seems to be common knowledge that the systems behind publishing papers and getting grants are nearly completely broken. It feels like a weird hazing ritual, where the people who get through it fiercely defend their abuse.
> Is entrepreneurship a trap? No; right now, it's one of the only ways to avoid being trapped.
... Realistically, you need to be rather well off to have even one good shot at starting a company. Get sick at the wrong time and forget about it. Get less than very lucky, and forget about it. Get your ideas stolen by a megacorp; forget about it.
I'm so tired of seeing truly extraordinary people forced to do menial labour to survive, and starting a company in the hopes of getting lucky isn't all that much better.
> I'm so tired of seeing truly extraordinary people forced to do menial labour to survive

This * 10.

The teacher/mentor part of me relishes seeing success for others. Meeting many talented, creative and hard working people in my life has been a privilege. But I do feel a sense of injustice and awful waste. To see people who could change the world and solve real problems settle for less, that hurts. When graduation time comes around I sometimes feel a dark sense of helplessness, and maybe something of a hypocrite/imposter, because the one thing I cannot teach or offer is opportunity.

> and starting a company in the hopes of getting lucky isn't all that much better

Maybe not, but I encourage all of them to give it a shot once in life. At least failure is yours to own, not being thwarted by some manager prick whose decision to obstruct your dream was just a way to get by for one more day in a firm they couldn't care less for.
> I'm so tired of seeing truly extraordinary people forced to do menial labour to survive
On the bright side, it's probably never been easier for genius type people to become societally useful (and rewarded for it) than now! I was just reading Two Years Before The Mast (1835-ish), where the Harvard educated protagonist met a savant type person working on a ship, but that person was doomed to not rise above his position without a lifetime of struggle. It's not foolproof by any means, but our education system at least found @cperciva and gave him ample status just for his applied intellect to where he could pursue almost anything he wished.
Random question: would you be willing to do a paid audit of new cryptographic code?
Yes, if it's in my (fairly narrow) area of expertise. About 90% of the time when someone approaches me with audit work the answer ends up being "I'm not the right person to hire for this".
What is your narrow area of expertise? Passwords and memory-hard functions?
I figure scrypt is, but I'm not using that. I'm implementing OPAQUE with Argon2.
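As an aside for anyone weighing memory-hard KDFs: Python's standard library exposes scrypt directly, so it's easy to experiment with. This is just an illustrative sketch; the parameters below are the classic interactive-use settings from the scrypt paper, not a recommendation for any particular deployment:

```python
import hashlib
import os

# Derive a 32-byte key from a password using scrypt (RFC 7914).
# n is the CPU/memory cost (a power of two), r the block size,
# p the parallelization factor. n=2**14, r=8 needs ~16 MiB of RAM;
# maxmem gives OpenSSL headroom above that requirement.
def derive_key(password: bytes, salt: bytes) -> bytes:
    return hashlib.scrypt(password, salt=salt,
                          n=2**14, r=8, p=1,
                          maxmem=2**26, dklen=32)

salt = os.urandom(16)  # store the salt alongside the derived key
key = derive_key(b"correct horse battery staple", salt)
assert len(key) == 32
```

The same derivation with a random per-password salt is what makes brute-forcing each stolen hash cost the attacker the full memory-hard work.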
"If I wanted to succeed in academia I would need to churn out incremental research papers every year — at very least until I had tenure"
I sympathize with that track. It's essentially the long-term version of "show your work". He would have spent at least 5 years, perhaps many more, writing papers based on small bits of his overall findings on shared caches in multi-threaded CPUs as a cryptographic side channel. Each paper would focus on a tiny piece of the problem, eventually building up to a full foundation for the real work. All well and good, until some academic with tenure and a raft of papers to his name decides to go ahead and write the actual paper that needs to be written, and takes credit for the important results.
I hadn't heard of spiped, and I was very happy to discover it through this blog post. It's elegant, simple, and solves a problem I was over-engineering my own solution for. So thanks, Colin!
spiped is amazing. I remember looking for something to use Kivaloo for when TFA came out, but I don't have any need for what it offers
I can relate (long story, lots of tears. Get your hanky); except for the "able to buy a house with the proceeds" part.
The stuff that I'm most satisfied with, has never earned me a cent. In fact, it has cost me thousands.
I know that the software that I've written has actually helped save lives, so maybe it's not really a "waste"?
It's weird how people always think you're wasting your life, no matter what you do.
I've been pondering if there's really any "superior" way to spend one's time. If the concept of wasting time even exists.
Especially with hustle porn these days, if you aren't building million dollar project, then you're "wasting time".
In business you follow the market; in academia you follow the grants committee.
But in a start-up, you can follow what you think is important - and see if anyone needs it. Raise the flag, and see if anyone salutes. “If I had asked people what they wanted, they would have said faster horses.” - Ford
Of course, infrastructure and support etc makes it easier. Industry labs like PARC used to be like this. In addition to the genius part, not everyone has the other qualities needed to run the whole show.
Maybe this is the reason people can be so smart and not rich: they trust their own judgement over that of others, and fail to make either what the NSF wants or what VCs think other people want. At least in academia there is peer review to shorten the feedback loop.
Beyond the well-reasoned response, I object to the premise that it's "just backups". Quality backups are a big lever for ensuring that others' work can continue, and exist to be appreciated.
Yeah, but generally the reason people's backups don't work aren't that they're using backup software written by programmers who weren't smart enough to win the Putnam. Generally when people's backups don't work it's because they don't do the backups, or because they set up the backups in a way that can't work (for example, forgetting to back up the files they really care about, or backing up a live database tablespace with a filesystem backup tool without snapshots), or because they're in some kind of a dysfunctional relationship with a vendor like Google that doesn't give them programmatic access to their own files so they can't do backups.
So, while I don't think it's bad for cperciva to have spent a lot of time working on Tarsnap, which is clearly a useful piece of software, I also think things like scrypt actually matter more in the end.
However, I also think cperciva is a better judge of what to spend his life on than I am.
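The failure modes listed above (forgotten files, broken restores) are cheap to catch if you actually verify the backup. A minimal sketch using only the Python standard library - the function names and layout here are illustrative, not any particular backup tool's API:

```python
import hashlib
import pathlib

def manifest(root: pathlib.Path) -> dict[str, str]:
    """Map each file's path relative to root to its SHA-256 hex digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def verify_backup(source: pathlib.Path, restored: pathlib.Path) -> list[str]:
    """Return relative paths that are missing or differ in the restored copy."""
    src, dst = manifest(source), manifest(restored)
    return sorted(k for k in src if dst.get(k) != src[k])
```

A check like this catches the "forgot to back up the files you actually care about" failure; it cannot catch an inconsistent copy of a live database, which needs an application-level dump or a filesystem snapshot instead.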
In a sense, the difference between entrepreneurs and (idealized) academics is in making what people want right now versus making what they think people (in the future, might) want. So, trusting other people’s judgement over your own, including how to spend your life. Modulo Steve Jobs and lots of unearned capital.
I find myself loving how barebones the site is - there are no distractions, only the text of the article itself.
While capping the maximum width of the text paragraphs might be a good idea for readability (slightly less horizontal eye movement; though some might also disagree), it's as if I've stumbled into some alternate universe where web development is very minimalistic.
Great note on why to follow what you love in life. Things may turn out differently from what you and others thought you would do when you "grow up" and become an academic.
But following what you love and creating something useful, even when it doesn't get the attention it deserves, is the best one can do.
We should all do more of this, and the world would be a better place!
Thanks, title updated!
cperciva, if you're around, I'd be really curious to hear a bit more about the details behind your attempts to get "Cache Missing for Fun and Profit" published. I'm really surprised that it was so hard to find a home for it! Maybe a more general security venue like USENIX Sec, ACM CCS, or IEEE S&P would have been a better fit for something that was neither fish nor fowl? Just a couple years later in 2008 USENIX Sec published "Reverse-Engineering a Cryptographic RFID Tag" which is a personal favorite and similarly is a mix of low-level hardware architecture and cryptography.
There were two other shared CPU cache side channel attacks on cryptography in 2005 (DJB's "Cache-timing attacks on AES" and Osvik/Shamir/Tromer's "Cache Attacks and Countermeasures: the Case of AES"), so in some ways it feels like the time was right for the community to start finding these attacks -- which makes it all the more baffling to me that your work got rejected.
I'm also a bit dismayed at the advice not to work on anything too novel. I guess it's true in some ways (our Chaff Bugs paper and an associated grant proposal got rejected for four years straight with comments like "the panel was uniformly skeptical that anyone would ever be willing to use this"), but it feels from the inside that it just takes more work to convince people to take a chance on new ideas, and that the larger issue is just how much inherent randomness there is in the peer review system (my advisor used to say it wasn't worth paying attention to the first rejection, and not to worry until it was rejected a second time -- and I think it's only gotten worse since then). Good work gets rejected all the time, and it sucks :(
Anyway, you clearly seem pretty happy with how things ended up, so perhaps I'm just saying that I'm disappointed that we in academic security missed out on the opportunity to have you as a colleague ;)
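For readers new to the area: the cache attacks above require constant-time table access or hardware partitioning to defeat, but the most common *simple* timing leak - an early-exit comparison of a secret value - has a standard library fix. A small illustration (the MAC values here are made up):

```python
import hmac

# A naive == on secrets can return at the first mismatching byte,
# leaking how many leading bytes matched via execution time.
# hmac.compare_digest runs in time independent of where bytes differ.
expected_mac = bytes.fromhex("a1b2c3d4")  # illustrative value
supplied_mac = bytes.fromhex("a1b2c3d5")

assert not hmac.compare_digest(expected_mac, supplied_mac)
assert hmac.compare_digest(expected_mac, expected_mac)
```

This doesn't address cache-line granularity leaks like those in "Cache Missing for Fun and Profit", but it's the first mitigation most code reviews look for.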
 Possibly this is true, I admit.
 See e.g. the NeurIPS 2014 and 2021 consistency experiments: https://arxiv.org/abs/2109.09774, https://blog.neurips.cc/2021/12/08/the-neurips-2021-consiste...