I think there's an under-theorized element of technological change: how automation might actually impact us. The core question is this: if you had a cyborg, identical to a human in almost every way except that it needed neither sleep nor food, what would happen to humans?
There's an impulse to say this would be negative, but let's be a bit more careful.
Because as human production has risen, we haven't ended up in a world where most people live in poverty and only the necessities of life get produced or consumed. We have other goals. Food as necessity, meaning agriculture, employs only about 2% of the workforce, truly a rounding error. Almost all shelter is far nicer than what is literally required; a king of 500 years ago would find sleeping in a modern car a luxury. We can survive on much less than we normally consume.
So wouldn't the cyborg, which is, again, an identical copy of us minus the need for food or sleep, spend just as much money, only on slightly different things? It would still have goals, things worth spending money on, and we seem to have a deep bench of wants once our needs are taken care of.
The real question is: why suppose an arbitrary AI wouldn't also have goals? If anything, we should worry about it optimizing too hard, caring about a goal so much that it endangers the wider society, not about automation collapsing demand. Why not assume it would spend all of its resources on something? And isn't our current system set up not to care what your goals are, so long as they aren't harmful and everyone volunteers for their (sometimes bizarre) work?