
Frankenstein Rewired: Mary Shelley & Machine Consciousness
- Emma Burbidge
- Apr 19
In 1818, a young Mary Shelley imagined what might happen when human ingenuity gave birth to life. In Frankenstein; or, The Modern Prometheus, she created a world where ambition outpaced empathy: a warning that still echoes today as we stand on the cusp of an AI-powered era.
Two centuries later, we’re not stitching together limbs in candlelit laboratories. We’re training neural networks, refining generative models, and building machines capable of imitation, conversation, and creation. But the questions Shelley raised still matter:
- What are the consequences of building without accountability?
- How do we treat the creations we don’t understand?
- What makes something human?
The Creature as an Outsider
Shelley’s creature is not a villain. He begins as a blank slate: curious, observant, and capable of love. But society rejects him, not for what he’s done but for what he is. That rejection, and the loneliness that follows, transforms him into the very “monster” he is labelled.
Many neurodivergent people resonate with this narrative. Like the creature, they may feel excluded or misunderstood by systems that weren’t designed with them in mind. And now, as we build AI systems designed to “learn” like humans, we must ask: are we repeating history?
AI, Ethics & Empathy
Generative AI is no longer a distant concept—it’s shaping education, employment, communication, and creativity. But with every advance comes a question: how do we ensure inclusion at every step?
Empathy isn’t easy to code. Neither is accountability. But both are essential.
Mary Shelley’s work reminds us that innovation without empathy leads to destruction. Her story wasn’t about stopping progress—it was about guiding it with care, caution, and conscience.
AI & Neurodiversity
At the intersection of this conversation lies a powerful insight: neurodivergent minds—those who think, feel, and perceive differently—have historically driven technological advancement. From Alan Turing to Temple Grandin, diverse ways of thinking have shaped the world we live in.
AI should not just mirror the status quo—it should be trained to understand difference. And that starts by including neurodivergent voices in the room where AI is designed, trained, and tested.
Because if we are to build machines that “understand” us, we must first acknowledge all the ways there are to be human.
A Call to Action
Mary Shelley’s legacy is not just a gothic warning—it’s a philosophical blueprint. As we design and deploy AI, we need to prioritise:
- Inclusive data and development processes
- Representation of neurodivergent voices in AI design
- Ethical literacy in education and industry
- Empathy as a core principle, not an afterthought
Innovation is inevitable. But inclusive, human-centred innovation? That’s a choice.
And as Shelley knew—once something is created, it’s hard to take it back.
Further reading:
- Frankenstein by Mary Shelley
- AI Ethics (MIT Press Essential Knowledge series)
- Neurodiversity in Business (NiB) – www.neurodiversityinbusiness.org