I studied politics and history. In my experience, there is also a strong bias towards writing exceptionally partisan things. I used to write hedged papers that summarised the arguments on each side. Invariably, these scored significantly lower than papers that took a position (whether that position was accurate or not).
Of course, university isn't reality. But my point is that people do not find realistic opinions that interesting. Saying "I don't know" or "both sides have merit" is sometimes accurate but never interesting. Society selects for this.
I also worked in finance which exposes you to a reasonably fast feedback loop between opinion and reality. I met people who had multi-decade records of crushing it, and they usually had very anodyne takes. You come in expecting they have access to some higher form of truth...in reality, they are just better at seeing what is already there.
You see this particularly when a famed investor says the market is over-valued. Inevitably, they get lambasted publicly. But you realise that their view is usually one that works long-term, that is quantifiable, that uses reason...everyone else just wants emotion.
The difference isn't knowledge. You can have huge knowledge, you can be rewarded heavily for being right, you can be a media "expert"...but if you aren't interested in reality, it doesn't matter.
I don't think it is about avoiding strong views either, but they have to be strong views, weakly held. With time, I think a minority of the population realises that some people will hold the same opinion regardless of what facts are presented to them. What is frustrating is that society reproduces so much of this nonsense.
I am a fan of systems/tools which reward those with better models of objective reality and the likely future.
I am intrigued by prediction markets as a way to give feedback to people's models of reality and how accurate they are in extrapolating trends, and to reward those who are better able to predict the future. It's a potential way to introduce 'skin in the game', both from a reputational and financial perspective.
Is that 'media expert' claiming Country X will be a dictatorship in 2 years because their political opposition won an election? Define terms and ask them to make a bet, and let's see how that plays out.
After a while, you can start to see people's track records, and the skin-in-the-game pressure may incentivize better epistemics before making claims.
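That feedback can be made concrete. One standard way to score a probabilistic track record is the Brier score (mean squared error between stated probabilities and outcomes). A minimal sketch, with entirely hypothetical forecasters and numbers:

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes.
    0.0 is perfect; 0.25 is what always saying 50% earns; 1.0 is
    being confidently wrong every time."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Hypothetical track records: (stated probability, outcome 0 or 1)
pundit = [(0.95, 0), (0.9, 0), (0.8, 1)]      # loud, confident, mostly wrong
forecaster = [(0.6, 1), (0.3, 0), (0.7, 1)]   # hedged but calibrated

print(brier_score(pundit))      # high score: poor track record
print(brier_score(forecaster))  # low score: better track record
```

A lower score means better calibration, which is exactly the kind of reputational "skin in the game" being described: the loud pundit's record becomes a number anyone can check.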
I don't think prediction markets make sense. The information about whether a country will become a dictatorship is usually private, and impossible to obtain. It isn't like a betting market where the relevant information is public.
I probably agree with your aims. I think that systems should exist to make better decisions, but there is a limit, because no-one can make perfect decisions (and it is probably more important to consider robustness at that point). I think attempting to reward predictions makes no sense; everyone is prone to these biases, and I don't think rewarding a certain group of people makes that any better.
One thing people often get wrong about markets is that sometimes markets are totally wrong. With Enron, the information that the company was fraudulent was publicly available and relatively easy to find, but the price of the stock was set by people who didn't want to find it. Mayweather vs McGregor...crazy odds. In 2012, the prediction market Intrade had wildly inaccurate odds for the US Presidential election, and that was a market trading hundreds of thousands of dollars per day. Markets work most of the time; they don't work all the time. (I would say financial markets are the best evidence of this: you see stuff that makes no sense almost all the time because investors have such different aims...you think no-one is this stupid, then you see it happen in a megacap stock worth hundreds of billions, and your perspective on human reasoning changes significantly.)
For the record, I don't think prediction markets are perfect and solve all problems.
I also don't think solutions/tools should be discarded because they are not perfect.
From your examples, it seems like your knock against prediction markets is that the aggregated predictions don't end up being correct at times? Of course that's going to happen, and I don't necessarily see instances of a market's aggregated predictions being wrong as a bug, but as a feature. Human populations are wrong about things all the time in the aggregate. What better way to get better at decision-making than getting feedback on how we are wrong?
My argument for prediction markets was not about generating knowledge in the aggregate from all of the individual users' predictions. It was centered on the individual front, as a way to reward individuals who are better able to predict the future. The broader context in which I was commenting was around who should be considered an expert. Prediction markets are one potential tool (albeit not perfect) to measure individuals' conceptions of current reality and future reality.
Right, but the point is that prediction markets remove the subjectivity, and I am saying they don't. They are just wrong in different ways.
Rewarding individuals? So we swap one set of "experts" for another? The point of this is that no-one is an expert. The problems are created not because we don't have prediction markets, but because we treat "experts" with reverence.
The comment about randomness you made in another reply...with any event, there is some part of the outcome that is unpredictable. You could have perfect information and still be unable to predict it every time. Some events carry no information that can be used to predict them, so the best prediction is just the event's historical frequency or its last value.
It would depend on the underlying event. The reason we have financial markets isn't to predict things, it is to fund things that need funding, and the price mechanism guides that for better or worse.
Creating an artificial prediction market is more like gambling, where people are transferring risk for its own sake. Nothing wrong with that, but I am not sure what the purpose would be. As said, these markets are also unlikely to add any new information, because the kinds of events being talked about are largely random.
Would it be fair to say that there is some value, but the quantity is unknown?
> Creating an artificial prediction market is more like gambling where people are transferring risk for its own sake.
I suppose this is part of what is involved, but "transferring risk for its own sake" doesn't seem like a comprehensive description of what they are.
> Nothing wrong with that but I am not sure what the purpose would be.
I think it's useful to always consider that there may be more to see than what catches one's eye.
> As said, these markets are also unlikely to add any new information either because the kinds of events being talked about are largely random.
If I rephrase this to:
>> Because [the kind of events being talked about are largely random] it logically follows that [these markets are unlikely to add any new information].
...it seems to me that this is an imperfect way of thinking about it. Perhaps I have mischaracterized your intended meaning?
Yep, I have been there. Example (sorry to use another finance story): the market today is overvalued by any sensible measure. On valuation, it implies roughly 0% real returns over the next ten years; the equity risk premium is at most 2% real. But the contributions that get valued most heavily are complex and surprising ones. If you say the market is overvalued when it has gone up 15%/year for a decade, people are expecting something very big...not some nerd talking about equity risk premia.
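For what it's worth, the "nerd" arithmetic behind a claim like that is short: take the earnings yield (1/CAPE) as a crude proxy for the expected long-run real return, then subtract the real bond yield to get the equity risk premium. The inputs below are illustrative assumptions, not current market data:

```python
# Illustrative inputs, not live market data.
cape = 38.0              # cyclically adjusted P/E (assumed)
real_bond_yield = 0.006  # long-term real (TIPS) yield (assumed)

earnings_yield = 1.0 / cape           # crude proxy for expected real return
erp = earnings_yield - real_bond_yield

print(f"expected real return: {earnings_yield:.1%}")  # about 2.6%
print(f"equity risk premium:  {erp:.1%}")             # about 2.0%
```

The point is exactly the one made above: the reasoning is simple and quantifiable, but "stocks should return a couple of percent real" is not an emotionally satisfying take after a decade of 15%/year.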
I think this is the difference between how humans reason emotionally and how humans reason logically.
Finance is a bad example because most people lose it when money is involved. But you see this in other areas where it is very hard to make a contribution if that contribution isn't surprising, controversial, or shocking in some way. It has to have emotional valence. This isn't universal. Some areas of study don't have that culture, some organizations have clear decision-making processes that try to stop this...but everyone is prone to emotional reasoning.
Kipling's "If" seemed so idealistic and a bit silly when I was a kid, but now (especially now) ...
If you can keep your head when all about you
Are losing theirs and blaming it on you;
If you can trust yourself when all men doubt you,
But make allowance for their doubting too;
If you can wait and not be tired by waiting,
Or, being lied about, don’t deal in lies,
Or, being hated, don’t give way to hating,
And yet don’t look too good, nor talk too wise;
If you can dream—and not make dreams your master;
If you can think—and not make thoughts your aim;
If you can meet with triumph and disaster
And treat those two impostors just the same;
If you can bear to hear the truth you’ve spoken
Twisted by knaves to make a trap for fools,
Or watch the things you gave your life to broken,
And stoop and build ’em up with worn-out tools;
If you can make one heap of all your winnings
And risk it on one turn of pitch-and-toss,
And lose, and start again at your beginnings
And never breathe a word about your loss;
If you can force your heart and nerve and sinew
To serve your turn long after they are gone,
And so hold on when there is nothing in you
Except the Will which says to them: “Hold on”;
If you can talk with crowds and keep your virtue,
Or walk with kings—nor lose the common touch;
If neither foes nor loving friends can hurt you;
If all men count with you, but none too much;
If you can fill the unforgiving minute
With sixty seconds’ worth of distance run—
Yours is the Earth and everything that’s in it,
And—which is more—you’ll be a Man, my son!
> But my point is that people do not find realistic opinions that interesting. Saying "I don't know" or "both sides have merit" is sometimes accurate but never interesting. Society selects for this.
I completely agree with you, and "society selects for this" is a decent explanation (I would add culture as well) - but isn't it strange that there seem to be so few outliers? I'm not talking about disinterested centrists, but about people who are interested in the meta aspect: the fact that approximately no one is interested in what is actually true (that they do not know, that both sides typically do have some merit). Compare that to how people really are: they do not know that they do not know, and they commonly cannot realize or even consider the possibility that each side usually has some valid points. Rather, a very common phrase I hear nowadays is "I've had enough of this 'both sides!'"
Isn't it strange that this is the nature of the system that we live in, and isn't it even stranger that hardly anyone seems to find it odd?
> What is frustrating is that society reproduces so much of this nonsense.
It's frustrating for sure, but if you think about it: why wouldn't it produce nonsense? Like, we've been having these sorts of issues for decades - we see more of it (and I suspect more exists) because of the internet, but we haven't really changed much about the way we do things (if anything, any new tool we get is used to crank the insanity even higher), so does it make sense to expect it to improve over time?
> And I'm not talking about disinterested centrists, but more like people who are interested in the meta aspect, the fact that approximately no one is interested in what is actually true (that they do not know, that both sides typically do have some merit)
I think you can find plenty of that kind of discussion in the academic/professional sphere, though it's small in proportion to the greatly increased radicalized chatter. That's part of the point of radicalization and propaganda - to politicize everything, to the point where there is no truth, just partisan claims and suspicion. They are doing an effective job, but we don't have to go along with it - I simply refuse to read the partisan stuff; there is plenty to read, especially from the entirety of human history before the last several years.
> we've been having these sorts of issues for decades
Polarization and, IIRC, radicalization, have increased dramatically recently.
> I think you can find plenty of that kind of discussion in the academic/professional sphere, though it's small in proportion to the greatly increased radicalized chatter.
Agree. You can find "plenty", of a certain quality (but how high?). Compare this to the activity and quality of discussion in the history of philosophy, though - an academic undertaking that seems to have taken a seat at the kids' table in modern times.
> That's part of the point of radicalization and propaganda - to politicize everything, to the point where there is no truth, just partisan claims and suspicion.
I firmly believe that much of what you are referring to is to a large degree consciously ~engineered. But also at the same time, I think we should be extremely mindful of emergence.
> They are doing an effective job, but we don't have to go along with it - I simply refuse to read the partisan stuff; there is plenty to read, especially from the entirety of human history before the last several years.
I prefer reading some of it on an ongoing basis and studying its nature, the ~techniques, the effects, how it alters the perception of the public, etc. Study it as a system, understand how it works, why things are the way they are, how it can be influenced, etc.
> Polarization and, IIRC, radicalization, have increased dramatically recently.
Very true. The internet and mass media is powerful, much of it is controlled by a relatively small number of people, and is increasingly censored and propagandized. And yet, things are still quite open, many opportunities, channels, and attack vectors are available, for now.