Here's the problem: if you're a psychologist and you give someone a choice in an experiment, they know you're willing to lie about anything. People on the internet (wrongly) think it's illegal to shout fire in a crowded theater, but hearing someone be wrong about a fire isn't the worst-case scenario – psychologists will famously trick you into thinking you may imminently be burned alive, or that you are murdering someone else. Sometimes psychologists will just commit crimes at people and pretend they learned something. And if you think those things aren't known to the general public, consult The Onion's take.

They want something from you – a choice, some reaction, data. And they are complete psychopaths about how they treat you. Ethics review has not stopped this process – just the more extreme physical or psychological abuse (and, in the meantime, it has stopped plenty of behavior no one cares about at all – full employment for ethicists!). Lying to people, and then bragging about how much you tricked them, is still a staple of psychology research today – even when it takes layers upon layers of lies.

So when I hear a result like, "People hyperbolically discount, which is a time-inconsistent preference set! They're so irrational!", it gets completely buried under the massive pillar of salt I take with the evidence: "we know this because they're willing to take a big discount on future payments if we pay them today instead of tomorrow, but they don't care about the difference between 30 days from now and 31 days from now" (a very slight paraphrase of the results).
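For concreteness, the standard hyperbolic discount curve is V = A / (1 + kD), where D is the delay in days and k is a fitted constant. It really does produce that pattern: a steep drop between today and tomorrow, and an almost flat difference between day 30 and day 31. A minimal sketch (k = 0.1/day is an arbitrary illustrative value, not taken from any particular study):

```python
def hyperbolic_value(amount, delay_days, k=0.1):
    """Hyperbolically discounted value: falls off as 1 / (1 + k * delay)."""
    return amount / (1 + k * delay_days)

# Near-term: waiting a single day costs ~9% of the value.
print(hyperbolic_value(100, 0))   # 100.0
print(hyperbolic_value(100, 1))   # ~90.9

# Far out: waiting from day 30 to day 31 costs only ~2.4%.
print(hyperbolic_value(100, 30))  # 25.0
print(hyperbolic_value(100, 31))  # ~24.4
```

An exponential discounter, by contrast, loses the same fixed percentage every day, so the day-0-versus-1 and day-30-versus-31 comparisons would look identical – which is exactly what the subjects' answers fail to do.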

That analysis is, frankly, quite stupid. The result post-dates the very public experiments that gave psychologists a reputation for being lying sociopaths! Of course the experimental subjects are correct to prefer less money today over more money at any later date. They know that every additional day they wait, the more likely they are to get stiffed ('oh, sorry, research funding ran out'), and immediate payment is the best you can ask for when dealing with habitual liars. You could attempt to isolate this by positing that their uncertainty about the future grows exponentially and testing against that, but if someone on the street said they'd pay me $100 in 10 years or $1,000,000 in 50, I would of course know they were inventing bigger numbers to accomplish some bigger lie (everyone has a bullshit detector sensitive enough to catch stuff like that – plausibly without even noticing, or wanting to argue with a nosy 'qualitative researcher' about it). Making the scenarios hypothetical doesn't resolve this either – "this person is a liar" looks a lot like "this person is a stranger" ('trust is earned' isn't just a platitude), and subjects live in a world with strangers and liars. And even if they don't suspect anything, or have no awareness of their own suspicion, they'd be conditioned by the real-world environment to behave this way! No explicit understanding is needed, no actual promises they would worry about someone else breaking.
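On the 'test against exponentially growing uncertainty' point: a fixed, known per-day chance of being stiffed produces exactly exponential discounting, which is time-consistent. But if the subject doesn't know how flaky the payer is – if the hazard rate itself is uncertain – then averaging over that uncertainty yields exactly the hyperbolic curve. This is a known theoretical result (uncertain hazard rates rationalize hyperbolic discounting); here is a Monte Carlo sketch of it, with an arbitrary 0.1/day mean hazard:

```python
import math
import random

def expected_payment_prob(t_days, mean_hazard=0.1, n=200_000, seed=0):
    """Average exp(-hazard * t) over an uncertain per-day hazard rate,
    drawn from an exponential distribution with the given mean."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        hazard = rng.expovariate(1 / mean_hazard)  # how flaky is this payer?
        total += math.exp(-hazard * t_days)        # chance they still pay at t
    return total / n

# The mixture comes out hyperbolic: E[exp(-hazard * t)] = 1 / (1 + k*t).
for t in (1, 30, 31):
    print(t, round(expected_payment_prob(t), 3), round(1 / (1 + 0.1 * t), 3))
```

So the behavior the experimenters label irrational is the rational expectation for anyone treating the promised payment as coming from a stranger of unknown reliability.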

This isn't the only psychological study where researchers ridicule people for being stupid or irrational when the subjects are, in point of fact, acting nearly optimally. For example, psychologists disagree about what precisely procrastination is. But they all believe it's harmful, irrational, or something of the sort. That is a hard thing to square with finding examples of it in pigeons – literally the model organism for classical conditioning. If there were a better policy, the world would classically condition it into them! So it's better to spend much, much more time thinking of alternative causes for the same effects. For example, we might guess that procrastination is a useful way to avoid work that later turns out to be unnecessary, to make sure the most-competent version of yourself works on the task, and to force time-boxing, which is a now-well-known productivity technique; for pigeons, it keeps early-won rewards from being stolen before they're needed, etc. etc. Occasionally I see a study try to be very clever and isolate all these side effects and rationality back-channels, but on a foundation of zero trust that's quite challenging, and they congratulate themselves too quickly for even mild cleverness.

(For instance, some clever researchers think hyperbolic discounting is a component of drug addiction and compulsive gambling. But of course, 'hyperbolic discounting is optimal when dealing with liars' arguably explains those correlations just as well – though I think a more useful analysis is that obsessive thought disorders are not meaningfully described as having bad analysis, because the problems with the thought process come before any analysis, obviously.)

It seems insane that I have to say this, particularly to psychology researchers, but: humans are complex! Please invest at least a mild amount of curiosity in alternative explanations for your data before you ask for an office in the science buildings on your university campus.