54 points by Tomte
5 days ago
I’m very interested in the discipline of respectful enquiry but this passage rang some alarm bells:
> Street Epistemology started life as a method to discuss religious belief in the 2013 book A Manual for Creating Atheists.
Is it just me, or is there (or was there) an agenda here that’s in conflict with its claims about creating understanding? With apologies for the pun, were they “good faith” conversations?
Peter Boghossian was an old instructor of mine at PSU. He was a B- or C-tier "new atheist" who wasn't quite on tenure track, and it was interesting seeing how he tried to get attention. He was pretty solid in his lectures outside of politics and religion -- he was generally well-read and was very entertaining, but seeing him trying to "trigger" people on those topics made a lot of students groan.
He had a small group of students who would follow him around everywhere and he would ask the class and his clique to promote and astroturf him on Reddit, Wikipedia, et al. Which is valid... I guess it's better if his students are doing it than him?
A few times on campus, he was seen coaching students who were later identified by a local antifascist group as being linked to "the Daily Stormer" -- which I have mixed feelings about; I truly believe everyone deserves to have tools for better thinking, but you're right to suspect Boghossian as possibly having agendas.
In a direct response to your last question, he did get in some hot water for bad faith academic publishing that produced questionable benefits:
I'll say that people criticizing RMS doesn't make the libre software movement bad, and it doesn't invalidate his accomplishments.
I had him as well in 2012.
He was fun back then, but I think a little bit of fame has gotten to his head. I think he wants to get people to really think about their beliefs, why they have them, and if it's reasonable to have them, but his methods are indistinguishable from those of bad-faith trolls who are "just asking questions" (aka, JAQing off).
I took his book as disingenuous. "How to have difficult conversations" ... to change the minds of people who are wrong by definition.
Essentially "I am rational therefore I am right".
Another passage that rings a bell:
> Today the method has a life of its own, driven by a community of people interested in discussing difficult topics, seeking truth and reflecting on the methods we use to arrive at our deep convictions.
What is alarming in this passage? Seems innocuous to me.
>Essentially "I am rational therefore I am right".
Sounds like basically the entirety of philosophy
> > Essentially "I am rational therefore I am right".
> Sounds like basically the entirety of philosophy
That's the opposite of the experience I've observed.
The starting point for a lot of philosophy students is learning the teachings of Plato and Socrates, among which the most famous is likely "I know that I know nothing"
The spirit of searching interrogation is there, along with Socrates' daemon who, Socrates said, prevented him from saying anything untrue (I'm hamming it up here).
The philosophical prototype of Socrates also gives license to be a gad-fly, piss people off for a good cause, etc. Instead of offering to pay a modest fine when he was found guilty of corrupting the youth, Socrates was so certain of his rectitude he told the court he should instead be awarded free meals for life like the Olympic champions. Now there's someone with chutzpah.
There are plenty of schools of thought that start with "I am not likely rational" or "I can't know if I'm rational".
Not really. It is a specific school of philosophical thought - and the basis of a lot of the critiques of science and of positivism.
They've discussed the ethics of this a few times on the live show. Some practitioners think the attitude is wrong, other's don't... but now it's spread so much that background viewpoints are pretty diverse anyway with even Christians doing it, ironically due to their evangelism. On the other hand, I think Anthony Magnabosco doesn't hide it well sometimes in his interviews, which I think some of his interivewees pick up on and it harms the process.
In any case, I think this is the most important kind of activism there is, and nobody needs to be part of their original community to do it or do it for the same reasons.
If you see religion as something good, or even neutral, to humanity, then of course there is (I would probably say was) a mismatch between the claims and the objectives.
If you see religion as an overall bad influence - more specifically, as "the opposite of understanding", then both are (were) aligned.
Even if you are biased, striving for truly rigorous arguing can lead to much insight about arguing itself. Dissecting your own philosophy as well as that of your "opponent".
I grow bored with trying to convince other people to agree with me and am much more interested in how I can change my own mind to agree with others. True bravery is saying “I will change my mind about something important today.”
I got pretty good at this - it's not all it's cracked up to be.
If you're really open minded & empirical, every time you talk to a specialist or hobbyist, you'd change your mind - simply because they have more data available. You're not at the mercy of the best idea, just the best argument; you're simply outsourcing your opinions.
It's less "thoughtful" and more "spineless"
Your comment strikes at the core of the matter. Changing your mind is perceived as being spineless, or lacking in some essential moral component. It's unfortunate. Think of the millions who have died needlessly in large and small ways because humans were unable to admit the folly of their ways, even if they could clearly see it.
I stand by my original comment. True bravery is being willing to change your mind about important things.
I think there's a balance. Changing your mind when given sufficient reason to do so is bravery. Changing your mind when given inadequate reason to do so is spineless.
But that raises the question: what is adequate reason? curiousllama is saying that many of us will change our minds for inadequate reasons, because we run into someone who knows more than we do on that particular topic. Not someone who is right, just someone who knows more. And if we adopt their position, we don't learn anything; we just adopt their position.
So I think that maybe you have to know why they hold that position, and whether that "why" is valid. But if it is, sure, be brave enough to adopt their position. But if you don't know the why, and whether the why is valid, and you adopt their position anyway... that's kind of spineless.
> I stand by my original comment
Not unexpected. After all, I did try to change your mind by saying "you should not change your mind."
I feel like you’re implying “I am willing to change my mind” is equivalent to “I believe whatever the last thing I heard was”. But surely this is not so.
I feel like this little interaction should be written into a play. It'd fit perfectly in Waiting for Godot.
Is it wrong to follow the opinion of someone with far more data than you?
In the modern world, the vast majority of people (including me) don't have the mental capacity to examine everything. Whether we're talking about politics, science, art, whatever else. Most of us become narrow specialists who are pretty much incapable of forming a good, defensible opinion or concept for a great number of subjects -- both important and trivial ones.
Granted, I don't know how you'd pick which experts to listen to and believe in a given field. I don't, personally, pick a subject matter expert and decide to believe them on everything. But I actually wish I could do that; there's just no social mechanism for picking those people.
data != useful information
information != knowledge
knowledge != wisdom
E.g. there are mountains of astrology "data" daily, but my priors are strong enough to not let them affect my posteriors.
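The priors/posteriors remark above is Bayes' rule in action: a sufficiently strong prior means weak evidence barely moves your posterior, no matter how much of it piles up per day. A minimal sketch (all the probabilities and likelihood ratios here are made-up numbers purely for illustration):

```python
def posterior(prior, p_data_if_true, p_data_if_false):
    """One Bayes update: P(H|D) = P(D|H)P(H) / P(D)."""
    numerator = p_data_if_true * prior
    evidence = numerator + p_data_if_false * (1 - prior)
    return numerator / evidence

# Strong prior that astrology works: one in a million (illustrative).
p = 1e-6

# Ten "confirming" horoscopes, each only weak evidence
# (likelihood ratio 2:1 in favour, again made up).
for _ in range(10):
    p = posterior(p, 0.6, 0.3)

print(p)  # still well under 1%
```

Each weak confirmation multiplies the odds by 2, so ten of them multiply the prior odds by about a thousand -- which still leaves a one-in-a-million prior far below any threshold of belief.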
"The problem with having an open mind is that other people will insist on trying to shove things in it". ~Terry Pratchett
It's useful in so far as it makes people question themselves.
It's useless in so far as it doesn't teach any of the nitty-gritty of epistemic rigour, e.g. all the tricks scientists learn to avoid the various traps of their own biases while acquiring new knowledge.
I skimmed. Little meat but lots of references to meat elsewhere.
Can somebody give me a nutshell of their method beyond "practice good epistemological hygiene"?
I heard this conversation which used the method, and was a game-changer for me:
Somebody says they like a particular sports team because the coach is their favourite. They are asked, would you still like the team if it had a different coach? They answer yes, they would. So the reason cannot be the coach. Is there another reason you like the team? Well, my family followed the team since I was a kid.
The technique is NOT to argue the truthfulness of a belief, but to focus on the reason for the belief.
It's only a toy example, and it already has a problem: people aren't very good at explaining their motivation. There doesn't have to be any logic to it, so you basically are in no position to question any preference by appealing to logic. People are also tempted to simply reply when asked, even if they don't know the answer. And your conclusion isn't fully justified either. Suppose the true, underlying reason for liking that team is
coach == "my fav coach" || center_player == "my fav player" || won_last_match
This technique may very well set you on a path to misunderstanding when applied rigorously.
No, you misunderstand the goals and methodology. It might be better if you watch some full examples somewhere.
> people are also tempted to simply reply when asked, even if they don't know the answer
People are intentionally given as much time as possible and not pressured into answering. This is something they focus on consciously. Conversations are even ended with exchange of contact info in case they want to follow up.
The point is not to "win" but to get them to think about why they believe something and actually come to a conclusion (but not necessarily right now). It's acknowledged that this may even strengthen their original position, which is seen as a good thing. The point is not for the questioner to argue anything or come to any conclusion.
I'm not particularly well versed in SE, though I've watched many videos and listened to many podcasts. I feel like the example of a favorite sports team is ineffective because all practical conversations are around a belief that is substantive and meaningful to how one lives their life (with some detours to understand the nature of reality, such as the Tic Tac test).
So instead, maybe somebody says they don't trust vaccines because they know someone whose kid was diagnosed with autism after getting vaccinated. A potential follow-up question might be, "Do you know anyone who was diagnosed but who didn't get vaccinated?" Or a hypothetical, "What if you met someone who was diagnosed but who didn't get vaccinated? Would that change your belief?"
Getting them to work their own way through their beliefs - not necessarily to change said beliefs but to examine them - is the point. And given what I feel can be some REALLY far-out beliefs that people have, I think it's normal to see SE practitioners sometimes lose their cool a bit or have their opinions/beliefs show.
>The technique is NOT to argue the truthfulness of a belief, but to focus on the reason for the belief.
That would be where good epistemology comes in. We're concerned with the quality of the knowledge (rather than its truthiness).
They basically just do what's on Anthony's channel, i.e. asking how you know something, what would change your mind, etc. Very few are actually learning epistemology, as far as I can tell from having been in their Discord for a while.
It takes a certain level of rapport or even friendship I think to do this well. To me a difference of opinion is super interesting because I like to see if it reduces to a premise that is a new way of looking at the world for me.
Just last night I had an exchange where I asserted China was going to invade Taiwan near or around the US midterms, and a friend across the table said while it does seem like the best opportunity for them and the incentives are there, US air and space superiority mean it could manage full wars with both China and Russia without the need for a land war in either country.

My view was predicated on the view that the US congress was too unstable and divided to muster the political will for a draft, and that a draft would be necessary to get sufficient troops to be a deterrent to a Chinese invasion, and his view was the draft isn't necessary because the real deterrent is US air power and ultimately tactical nuclear weapons.

We are politically polar opposites as friends, and both views were indeed proxies for prior opinions of the US administration, but what was interesting to me was that the basic difference in opinions was not actually oppositional about the facts, but rather that the thing I thought was important (mustering a draft) was not important in his understanding (air superiority). The debate wasn't over truth or falsehood, or even that partisan, but rather the respective weights we assigned to facts we accepted at face value from one another. I've become very interested in whether his air superiority premise is accurate, as it's critical to the quality of the prediction of this outcome.
Codifying not so much how to argue, but how to learn from people in the context of a disagreement seems like a very useful tool.