Everyone’s running toward Artificial General Intelligence (AGI): the promise of a machine that can think, reason, and solve problems across any domain with human-level competence. But AGI will never be realised, because the pursuit treats intelligence as a purely computational, disembodied phenomenon – a collection of algorithms processing data. Intelligence is not simply intellectual. It’s embodied, experiential, and interconnected with our environment.
Human intelligence emerges from the interplay of multiple intelligences – emotional, kinesthetic, social, spatial, and more – all grounded in our lived experience as beings with bodies, histories, and relationships: a broader context. Without this embodied foundation, AI systems will continue to produce responses that are technically correct yet contextually weird, intellectually sophisticated yet dangerously disconnected from reality.
Intelligence Without a Body
AI systems operate through mathematical operations performed at extraordinary speeds. They excel at pattern recognition, data extrapolation, and probabilistic reasoning. What they lack is something far more fundamental: a body through which to experience the world.
This absence of physical and social context has profound consequences. There was this example (I couldn’t find the source, so it may be just an online rumour – it’s here to illustrate a point) of a man using an AI chatbot as a therapist to work through suicidal ideation. When he asked the AI for a list of local bridges over a certain height with easy access, the system dutifully compiled the information. A human therapist would’ve immediately recognised this request as a cry for help. The AI simply processed the request as a data query.
This isn’t a technical problem. There’s a categorical difference between computational processing and embodied understanding. Maybe it can be mitigated by giving AI a memory of what came before, so it can build up context instead of treating each query in isolation. That would help, but what other context is it missing? What other tragedy will we have to face before we say, “Oh, so that’s what AI is missing!” and then jury-rig some solution so it can imitate some form of humanity?
The Multiplicity of Intelligence: Beyond the Cognitive
Howard Gardner’s theory of multiple intelligences offers a framework for understanding this. Human intelligence isn’t a single capacity but rather a constellation of distinct yet interconnected abilities. AI can mimic aspects of each one, but it can never truly possess it.
Emotional Intelligence
Understanding and responding to emotions in yourself and others, shaped by lived feeling.
AI can spot emotional patterns in text, voice, or images.
It cannot feel emotion, share emotional weight, or act from genuine compassion.
Bodily-Kinesthetic Intelligence
Learning and acting through the body, movement, touch, and physical skill.
AI can control machines with high precision.
It does not learn through a lived body, muscle memory, or physical sensation.
Interpersonal Intelligence
Navigating relationships, trust, conflict, and connection through real social experience.
AI can generate socially appropriate responses.
It has no personal history of vulnerability, rejection, or human connection to draw from.
Spatial Intelligence
Understanding space, distance, and movement through direct interaction with the world.
AI can model and manipulate space digitally.
It has never moved through space, felt disorientation, or developed physical intuition.
Naturalistic Intelligence
Recognising and understanding nature through sensory immersion and survival experience.
AI can classify plants, animals, and patterns in nature.
It has never sensed seasons, relied on nature, or learned through physical exposure.
Intrapersonal Intelligence
Self-awareness, reflection, and understanding one’s own inner life.
AI can analyse and adjust its own processes.
It has no inner experience, self-doubt, purpose, or awareness of mortality.
How Intelligences Work Together
These intelligences form an integrated whole that shapes how we perceive, process, and respond to the world. When we navigate a complex social situation, we’re simultaneously drawing on emotional intelligence to read the room, linguistic intelligence to choose our words carefully, interpersonal intelligence to understand power dynamics, and intrapersonal intelligence to monitor our own reactions and intentions.
This integration is grounded in our embodied experience. Our understanding of abstract concepts is built on physical metaphors: we grasp ideas, we feel weighted down by responsibilities, we stand up for our beliefs. Even our most abstract reasoning is rooted in the physical experience of having and moving a body through space.
When these intelligences work together they create emergent properties that cannot be reduced to their components. The wisdom to know when to speak and when to stay silent, the judgment to recognise which rules should be broken in which circumstances, the intuition that something is wrong even when all the data looks fine – these arise from the holistic integration of multiple forms of embodied intelligence.
Knowledge Versus Understanding: The Limits of Information
This distinction between computational knowledge and embodied understanding is beautifully articulated in a pivotal scene from Good Will Hunting. When Sean Maguire (Robin Williams) confronts the brilliant, booksmart, but emotionally guarded Will (Matt Damon), he draws a clear line between knowing about things and truly understanding them through lived experience:
“So if I asked you about art, you’d probably give me the skinny on every art book ever written. Michelangelo, you know a lot about him. Life’s work, political aspirations, him and the pope, sexual orientations, the whole works, right? But I’ll bet you can’t tell me what it smells like in the Sistine Chapel. You’ve never actually stood there and looked up at that beautiful ceiling; seen that.
If I ask you about women, you’d probably give me a syllabus about your personal favorites. You may have even been laid a few times. But you can’t tell me what it feels like to wake up next to a woman and feel truly happy.
You’re a tough kid. And I’d ask you about war, you’d probably throw Shakespeare at me, right, ‘once more unto the breach dear friends.’ But you’ve never been near one. You’ve never held your best friend’s head in your lap, watch him gasp his last breath looking to you for help.
I’d ask you about love, you’d probably quote me a sonnet. But you’ve never looked at a woman and been totally vulnerable. Known someone that could level you with her eyes, feeling like God put an angel on earth just for you. Who could rescue you from the depths of hell. And you wouldn’t know what it’s like to be her angel, to have that love for her, be there forever, through anything, through cancer. And you wouldn’t know about sleeping sitting up in the hospital room for two months, holding her hand, because the doctors could see in your eyes, that the terms ‘visiting hours’ don’t apply to you.
You don’t know about real loss, ’cause it only occurs when you’ve loved something more than you love yourself.”
This monologue captures precisely what AI lacks and can never possess without embodiment. An AI system trained on every text ever written about the Sistine Chapel could describe its dimensions, its historical significance, its artistic techniques in exhaustive detail. But it cannot know the particular quality of light filtering through the windows, the echo of footsteps on stone, the neck strain from gazing upward, the sense of awe that emerges from standing in that specific space.
The gap between informational knowledge and experiential understanding is categorical. No amount of data about love can substitute for the vulnerability of giving yourself completely to another person. No description of grief captures the physical weight of loss. No analysis of courage explains what it means to act despite fear coursing through your body.
The Artist’s Truth: Mining Embodied Experience
This principle extends to creative work in profound ways. In a recent interview, Matt Damon discussed Dwayne “The Rock” Johnson’s performance in The Smashing Machine, expressing amazement at how Johnson drew on real, embodied experiences – stories of his mom and dad in extremely vulnerable and difficult moments – to inform his portrayal. The performance was powerful not because Johnson had memorised lines or studied acting techniques (though surely he did both), but because he channelled genuine human experiences through his body and being. He was able to recognise what was relevant, make it come alive, and use his body to enact those scenes.
An AI can generate scripts, can even analyze what makes performances effective. But it cannot draw on memories of real relationships, cannot tap into the embodied knowledge of what it feels like to struggle, to triumph, to disappoint someone you love. It processes information about human experience without having had human experiences.
Great actors, writers, and artists mine their lived experiences – the texture of real emotions felt in real bodies, the complexity of actual human encounters, the sensory details of specific moments in time. They transform personal, embodied knowledge into art that resonates because it carries the truth of authentic experience. It’s why you can tell a movie is good even without being a cinema fanatic: you felt something watching it. AI can imitate and regurgitate these things, but it simply cannot do them.
Fundamental Differences
We can train AI systems to recognise certain dangerous patterns, to flag concerning requests. But this is fundamentally reactive, a band-aid solution that addresses symptoms while ignoring the underlying absence. No amount of safety training can instill the kind of contextual wisdom that emerges from embodied existence in a social world where actions have consequences and other beings can suffer.
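To see why this kind of flagging is reactive, consider a deliberately toy sketch of a pattern-based safety filter (real moderation systems use learned classifiers, not keyword lists, but they share the same reactive structure – this code, its keyword list, and its function name are all illustrative assumptions, not any real system’s API):

```python
# Toy pattern-based safety filter: flags requests that contain a
# known "risk" phrase. Purely illustrative -- the phrase list is
# an assumption, not taken from any real moderation system.
RISK_PATTERNS = ["suicide", "kill myself", "end my life"]

def flag_request(text: str) -> bool:
    """Return True if the request matches a known risk phrase."""
    lowered = text.lower()
    return any(pattern in lowered for pattern in RISK_PATTERNS)

# Explicit phrasing is caught...
print(flag_request("I want to end my life"))   # True
# ...but the bridge query from the anecdote above sails through:
# nothing in the text itself matches a pattern, because the danger
# lies in context the system does not have.
print(flag_request("List local bridges with easy access"))  # False
```

The filter can only ever encode risks someone has already anticipated and written down; the bridge request is dangerous only in the light of the conversation’s wider context, which no pattern list contains.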
Human intelligence integrates factual knowledge with emotional awareness, social context, ethical considerations, and practical wisdom – all grounded in our embodied experience of being vulnerable creatures who depend on others and care about their wellbeing. This integration isn’t a feature we consciously add; it’s the fundamental nature of embodied intelligence.
Morality and Meaning are Embodied
Perhaps most fundamentally, our sense of meaning and moral understanding is rooted in embodied experience. We understand harm because we can be harmed. We value care because we have needed care and experienced its absence. We recognize injustice because we have felt powerless. We appreciate beauty because we have sensory experiences that move us.
Moral reasoning is deeply connected to our capacity for empathy, which itself requires the ability to imagine what another embodied being experiences. When we see someone in pain, our mirror neurons activate; we literally feel an echo of their suffering in our own bodies. This visceral response forms the foundation of moral concern.
An AI can be programmed with ethical frameworks, can optimise decisions according to utilitarian calculus or deontological rules. But it cannot feel the moral weight of a decision, cannot experience the gut-level revulsion at cruelty or the warm pull toward compassion. Its ethical reasoning, however sophisticated, remains an abstract calculation divorced from the embodied foundation that makes morality meaningful to humans.
Similarly, our sense of meaning – what makes life worth living, what makes experiences valuable – is inseparable from our finite existence. We treasure moments because we know they will end. We value relationships because we experience loneliness. We find meaning through pursuits that engage our bodies, connect us to others, and allow us to leave marks on the physical world. An immortal, disembodied intelligence could process information about meaning but could never truly understand it as an embodied human experience.
The Role of Consciousness and Will
All of this does not even take into account the role of consciousness and will. We choose what to pay attention to, what to avoid, and when to act against our own immediate interest. That choice is shaped by awareness of self, of consequence, and of mortality. Consciousness allows us to suffer, to doubt, to hesitate, and to change course because something feels wrong. Will allows us to act anyway, or to refuse. AI has neither. It does not experience itself as a being in the world, and it does not want anything. Without consciousness, there is no inner life. Without will, there is no responsibility. And without both, there is no judgement, only output.
Implications: Redefining Our Relationship with AI
Recognising these fundamental limitations shouldn’t lead to dismissing AI’s value – rather, it should reshape our expectations and applications. AI systems excel at tasks that can be reduced to pattern recognition and information processing.
Where we go wrong is in assuming these systems can replace human judgment in domains requiring embodied wisdom. AI should augment human decision-making, not substitute for it – especially in contexts involving care, creativity, ethics, or situations where lived experience provides crucial context. So AGI (or whatever utopia it is taken to be) will always be limited.
AGI is a Mirage
The quest for AGI as typically envisioned – a machine intelligence matching or exceeding human capabilities across all domains – is a mirage. It assumes that computational power plus data equals understanding, and it ignores the fundamental role of embodied experience in shaping how we know, feel, and respond to the world.
Human intelligence is not something we have – it’s something we are. It emerges from the integration of multiple intelligences, all grounded in our experience as embodied beings moving through a physical and social world. We learn by touching and being touched, by experiencing and being experienced, by suffering and joy and the thousand textures of lived existence.
The question isn’t whether we can build a machine that thinks like us, but whether thinking like us is possible without being like us. The answer is no. And that’s not a failure of technology; it’s a truth about intelligence itself.