Discussion about this post

Mason Whitehead:

The way you’ve presented this scenario seems odd. Here Omega isn’t predicting whether you would take the money (that would be Omega simulating you before any light flashed and observing what you do when the light flashes blue). Instead, Omega is looking at what you would do if the light flashed blue, conditional on your knowing that the light flashed red.

I don’t know if this ruins your argument. I think it might? (Take the money if the light is blue; if it’s red, commit to not taking the money had it been blue?) But at the very least it is a far less simple scenario.
