Probably not, or at least that's a sane first guess. Few people seem to actually do moral reasoning directly, but you could credibly claim that institutions like religion and law enforcement have codified all the important conclusions, so while moral reasoning isn't very important to do, it's important to have done.

This seems reasonable, and in fact might be how most people arrive at answers to "how do you be a good person?": we learn from stories, from people who haven't treated us right, from our shame when we've failed. But we all (hopefully) also get a solid boost from parents and institutions that simply command us to be a way they're pretty sure is good.

But then I heard an interesting point: that moral reasoning (in the context of artificial intelligence theory) is much less important than having the right terminal goals. In other words, moral reasoning exists only in service of our values, and so its value is secondary to theirs.