Even in principled egalitarian societies, status hierarchies don't disappear completely. They become harder to see and therefore perhaps harder to satisfy, which may tend to increase rumination, not reduce it.
People perhaps underestimate how much meaning comes from being needed - from not being replaceable. Pre-industrial humans were needed constantly. Modern humans are optional to almost everyone.
Status hierarchies are inevitable in human groups.
The question is not whether they exist, but whether they are legible, stable, and tied to prosocial competence. When hierarchies feel arbitrary, it is likely that dysfunction increases across the entire group.
To almost any observer, it seems fairly clear that we didn't evolve for 9-to-5 desk jobs, hyper-palatable processed food, and 24/7 digital social comparison. A lot of what we currently label as "mental illness" is actually a highly predictable mismatch between our brains and the modern environment.
Consider human emotions as part of a rational, self-optimising system. Motivation emerges naturally when effort reliably produces legible, meaningful change in one's environment. Where this link is broken, motivation collapses. No amount of exhortation can fully substitute.
The modern workplace selects heavily for sustained attention to abstract tasks in static environments. This is evolutionarily unusual. Human cognition evolved for movement, social interaction, novelty, and immediate feedback.
Feeling guilty over doom scrolling? Stress responses are regulatory systems designed to allocate effort in response to perceived demands and uncertainty; chronic activation often reflects structural conditions, not purely personal responsibility.
Neurodiversity becomes more visible in highly standardised systems. The narrower the behavioural expectations, the more natural variation appears pathological. All may benefit if we consider alternatives!
Attention is not a fixed resource that some people “have” and others lack. It is a regulatory system that allocates effort based on expected reward and uncertainty. When outcomes are unclear or delayed, attention naturally destabilises. External structure often works better than pressure.
Many workplaces select for a narrow cognitive profile: sustained sedentary focus on abstract tasks. Human populations have historically evolved with broader specialisation.
Ancestrally, competence was publicly legible. There was strong coupling between environmental proficiency and social reputation - people needed competent hunters to survive! Today, many forms of competence are opaque, delayed, or politically mediated.
For most of human history, effort → visible result → calibration.
Now effort → abstraction → delayed, ambiguous feedback.
Many "motivation problems" are likely calibration problems within modern environments.
Reframing ADHD. Economists have a concept known as 'Time Preference' - in short, that people prefer immediate rewards to anticipated future rewards. To some degree, all humans prefer 'now' over some distant future. Could ADHD be seen as an extreme expression of a normal human trait?
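To make the time-preference idea concrete, here is a small illustrative sketch (my own toy example, not from any of the studies mentioned) comparing two standard discounting models from the behavioural-economics literature: classical exponential discounting and Mazur-style hyperbolic discounting, which is often used to describe steep, impulsive near-term preferences. The parameter values are arbitrary, chosen only for illustration.

```python
# Toy illustration of time preference: how a reward's subjective value
# decays with delay under two standard discounting models.
# Parameter values are arbitrary and for illustration only.

import math

def exponential_discount(amount, delay_days, daily_rate=0.01):
    """Classical economic model: value decays at a constant rate."""
    return amount * math.exp(-daily_rate * delay_days)

def hyperbolic_discount(amount, delay_days, k=0.05):
    """Mazur-style hyperbolic model: value = A / (1 + k * delay).
    Produces steep near-term discounting, often associated with
    impulsive choice."""
    return amount / (1 + k * delay_days)

reward = 100.0
for delay in (0, 1, 7, 30, 365):
    print(f"delay {delay:>3}d  "
          f"exp: {exponential_discount(reward, delay):6.1f}  "
          f"hyp: {hyperbolic_discount(reward, delay):6.1f}")
```

With these (illustrative) parameters, the hyperbolic curve drops faster in the near term but flattens out later - a qualitative signature of the steep "now" preference the post describes, taken to different degrees in different people.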
Preferences are individual - and should be celebrated as such - but not arbitrary. They track cues that historically predicted fertility, investment, protection, and social leverage.
Important: evolutionary mismatch is not a vague just-so story. It generates testable predictions about when traits should become impairing: rapid environmental change, decoupling of cues from outcomes, and loss of developmental calibration.
In hunter-gatherer groups, social interaction and feedback are continuous. Modern settings compress them into high-pressure intervals. The shift changes the dynamics - and leaves room for baseless rumination once we're alone again!
Long 'to-do' lists could not exist in ancestral environments. Foresight and planning were limited to the feasible and the fairly immediate.
A growing number of large-scale datasets now track how light, sleep timing and social rhythm influence mood. The strongest effects appear when multiple cues drift at once, supporting ecological rather than single-factor explanations.
Hunter–gatherer learning models suggest that people learn fastest when observing others in real contexts, not in abstract instruction. The apprenticeship model is more natural than rote learning!
A useful lens: anxiety often increases when feedback becomes opaque. Historically, social cues were immediate—facial expression, tone, presence. Today, performance signals are delayed, digital or ambiguous. Uncertainty itself drives stress.
buff.ly/lGfEPOz
"Humans are not well adapted to live in modern cities—and this may be having a big impact on our health and wellbeing.
This is the claim of evolutionary scientists...who say “rapid industrialization” has reshaped human habits so dramatically that our biology may no longer be able to keep up."
"Neurodiversity becomes not just a moral imperative but a competitive advantage. For too long we have approached workplace inclusion as something we should do because it's right. But, what our research reveals is that building truly inclusive, human-centred workplaces... it's about good business."
Our minds evolved in small‑group, high‑mobility, high‑feedback societies. When modern life or work imposes long stretches of isolation or asynchronous tasks, this mismatch in environments is, all else being equal, likely to result in unnecessary suffering.
New empirical work (Nov 11 2025) shows mixed attitudes among autistic, ADHD and dyslexic learners toward the term "neurodiversity".
Link: buff.ly/pRLq0FT
"Although aspects of computer-mediated communication may indeed represent such evolutionary mismatches...there is also...a failure to contextualize the negative mental health effects of CMC against broader societal factors which are plausible preexisting evolutionary mismatches themselves..."