I think “near perfect” is vague. If 99.99% is what you personally need to accept “near perfect”, and that’s when you would one box, I would say to you “cool, I would one box for a near perfect predictor too!”
I genuinely don’t care to argue where that number specifically is.
I don’t know why you’re harping on that point anyway. What’s the number at which you would one box? Would you one box even if the prediction rate was 100% over a million test cases?
But that person wouldn’t be cool with it if you’re mistaken in thinking 99% is enough.
I genuinely don’t care to argue where that number specifically is.
The point is that this might show that you didn’t arrive at your conclusion with good reasoning. If you’re sure 51% isn’t good but 99% is, yet you can’t tell me why, then it might just be all vibes.
You would be presupposing certain predispositions about people. I am not sure those are part of the problem. It could be that guessing the same for everyone would lead to a 50% success rate.
It could be, but it’s not. There’s an option that leads to a >51% success rate, and just from knowing people, I could guess it even if faced with the problem for the first time. There’s a reason the problem states “near perfect”.
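For what it’s worth, you can actually compute where the number sits. This sketch assumes the standard Newcomb payoffs ($1,000,000 in the opaque box, $1,000 in the transparent one), which the thread never states explicitly:

```python
# Expected value of each choice as a function of predictor accuracy p.
# Payoffs are the usual Newcomb ones (an assumption, not from the thread):
# opaque box holds $1,000,000 iff the predictor predicted one boxing,
# transparent box always holds $1,000.

MILLION = 1_000_000
THOUSAND = 1_000

def ev_one_box(p):
    # With probability p the predictor foresaw one boxing and filled the box.
    return p * MILLION

def ev_two_box(p):
    # With probability p the predictor foresaw two boxing (empty opaque box);
    # with probability 1 - p it mistakenly filled it anyway.
    return p * THOUSAND + (1 - p) * (MILLION + THOUSAND)

# Break-even: p * 1_000_000 = p * 1_000 + (1 - p) * 1_001_000
#          => p = 1_001_000 / 2_000_000 = 0.5005
for p in (0.50, 0.5005, 0.51, 0.99):
    print(p, ev_one_box(p), ev_two_box(p))
```

On these numbers the crossover is at 50.05% accuracy, so by raw expected value even a 51% predictor already favours one boxing — which is one way of cashing out why drawing the line at 99% rather than 51% looks like vibes.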
Not voting is two boxing behaviour. I don’t vote either. Maybe I’m a two boxer at heart, even though I’d rather be the kind of guy who leaves with a million.
Can you guess what two boxing and not voting have in common?
… Kinda, yeah. It’s at least the locally rational thing to do. I mean something by “locally” but I’m not entirely sure what I mean.
Two boxers say I can’t change what’s in the boxes so let’s get both. Makes perfect sense.
Non voters say “I can’t change the election result by voting”, and they’re (usually) right.
But they both lose out in a non local way.
See, if you’re a one boxer, as we know you generally leave the building with 1000x more money than a two boxer does, which is… a pretty attractive offer.
The case for voting isn’t quite as clear but in a way it’s similar.
You lose out in two boxing, not just because you’re a two boxer, but because you’re the kind of person who two boxes.
I think you can analyse voting the same way. Yes, my vote technically only counts as one vote, but if I reason my way out of voting, that makes me the kind of person who doesn’t vote. And if I’m the kind of person who doesn’t vote, that means people who think like me don’t vote. So in this weird roundabout way, me not voting isn’t just ME not voting; it’s me and people like me not voting, which can add up to a lot of votes. If me and people like me voted, the candidate I prefer would get a lot more votes than they would if me and people like me don’t vote.
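The “people like me” step can be made concrete with some made-up numbers. This treats your decision as evidence about what like-minded people decide, not as a cause of it:

```python
# Evidential-style accounting of a vote: your own decision is taken as
# evidence about the decisions of people who reason like you.
# All figures below are illustrative assumptions, not data.

def expected_extra_votes(n_like_minded, correlation):
    """Expected extra votes for your candidate conditional on you voting:
    yourself, plus the fraction of like-minded people whose decision
    your own decision is evidence for."""
    return 1 + n_like_minded * correlation

# Causally, you are worth one vote; evidentially, possibly far more.
print(expected_extra_votes(0, 0.0))        # no like-minded people: just you
print(expected_extra_votes(100_000, 0.3))  # you, plus evidence about 30,000 others
```

The `correlation` parameter is doing all the work here, and whether anything like it is legitimate is exactly what one boxers and two boxers disagree about.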
I’m not expecting that to make sense to anyone but me.
At the risk of trying to turn the Titanic at this stage, isn’t the problem unresolvable?
The entire utility calculation depends on whether our decision is evidence to inform a probability judgement, or an intervention to update a probability judgement.
If it’s evidence, then it’s evidence of what the intelligent being already knows and has already acted on - one box.
If it’s an update, then it’s in response purely to the situation as it’s laid out - two boxes.
But we can’t know which because we’re not told the means by which the intelligent being knows what it knows.
So, our Bayesian prior would be 50% that the intelligent being knows by some irrefutable means (in which case our decision counts as evidence and we one box), and 50% that it knows by some form of divination (in which case our decision is an intervention and we two box).
Since the expected utility of two boxing in this case would be higher, we two box.
But this depends on us considering those two options equally likely, which ends up undermining the paradox because neither is likely; they’re both fantasy.
We can totally accept that our decision is evidence of the kind of person we are, and that the predictor already knows that, but it doesn’t matter. Changing our mind (or not) here won’t affect what’s already in the boxes. We simply get $1000 more whatever the predictor did.
But I am not sure what you mean by “intervention to update a probability judgement”.
I can’t recall where I heard this one, so can’t credit it; suffice to say it’s not mine.
A guy pulls out a deck of cards and shuffles them. He asks you to pick a card and look at it, then shuffles it back into the deck. He then cuts the deck. What’s the probability the card on top is yours?
Obviously, it’s not 1 in 52. That’s the probability given the information you have about the cards, but it’s clearly a trick, which means the probability must somehow be much higher (roughly 1 minus the chance of messing up the trick).
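That intuition is just a mixture over two hypotheses. With some made-up numbers for how likely it is a trick and how often tricks come off:

```python
# Probability the top card is yours, mixing "it's a trick" with
# "it's an honest shuffle and cut". The 0.95 and 0.98 are assumptions
# for illustration, not anything given in the story.

p_trick = 0.95           # prior that he's performing a trick
p_success = 0.98         # chance a performer pulls the trick off
p_honest_match = 1 / 52  # honest shuffle: any card equally likely on top

p_top_is_yours = p_trick * p_success + (1 - p_trick) * p_honest_match
print(round(p_top_is_yours, 3))  # ~0.93, nowhere near 1/52
```

The 1-in-52 figure only survives if you refuse to condition on the obvious fact that you’re watching a card trick.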
If we’re really trying to decide what’s rational here then the answer has to be based on the likelihood of the intelligent being messing up his trick.
I’d say, given the aforementioned intelligence, that’s pretty low.
Indeed but if you one-box on vibes, you may not realize that.
Did you see my post responding?
To add a bit more, what I mean is whether you accept the following:
The event happens and you don’t know the outcome. But it’s clear that at this point, there is (even if you don’t know it or have no way to compute it) an x% chance of outcome 1, a y% chance of outcome 2, etc., and those won’t change whatever you do after the event has happened.
Well, I guess you could reason your way back into voting if you think voting would change the fact that people like you don’t vote. Otherwise, it’s still irrational, and voting doesn’t change anything.
Given that the chances of him messing up his trick are very low, and the trick relies on predicting what we’ll decide, the course of action we should take depends entirely on our prediction of the means by which we think he’s doing the trick.
If we think he’s doing the trick by knowing us well then we behave unpredictably. We do the opposite of whatever we think most rational.
If we think he’s doing the trick by means of some crystal ball, then we’re doomed. Nothing we can do will make any difference and we accept the booby prize for being predictable. Just like how you’re not getting your twenty back from the ‘watch the queen’ card hustler.
Edit - I suppose, on reflection, I ought to add the possibility of a benevolent being, rather than a trickster. I’m a natural cynic.