November 30, 2022, is the day I embraced ChatGPT without hesitation as my colleague, editor, and assistant. "What can I help with?" has become the most reassuring sentence to welcome me to work for the past two years.
Many of my workflows began with prompts like "Give me ideas" or "Show me the steps on how I can," and often ended with "Find improvements" or "Revise this text for a final draft." Tasks that once demanded hours of deliberation and decision-making now flowed seamlessly, allowing me to reclaim precious time, which I promptly filled with more work.
I became the ultimate master of "getting things done," delivering over five growth campaigns within the first nine months of my first full-time job in the blockchain industry and authoring seven articles about quantum computing technology despite having no prior knowledge of the field.
For two years, not a single task crossed the finish line without an A.I.'s input. Now, as I build my blockchain marketing portfolio, I'm struck by how effective my AI-assisted work has been, consistently achieving results that were "good enough." However, I also notice what's missing: the bold uniqueness and true innovation that make work unforgettable.
Generative A.I., after all, is trained on existing data and generates responses by predicting and reproducing patterns in the most contextually familiar way.
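To see why that matters, here is a toy sketch in Python. This is nothing like a real large language model, just a hypothetical illustration of the familiarity bias: a purely statistical generator that always picks the continuation it has seen most often.

```python
# A toy, hypothetical illustration of familiarity-driven text generation.
# Real models are vastly more complex, but the bias is the same:
# the statistically most familiar continuation wins.
from collections import Counter, defaultdict

training_text = "the cat sat on the mat . the cat sat on the rug .".split()

# Count how often each word follows each other word (a bigram model).
next_counts = defaultdict(Counter)
for current, following in zip(training_text, training_text[1:]):
    next_counts[current][following] += 1

def generate(start, length=6):
    word, output = start, [start]
    for _ in range(length):
        if word not in next_counts:
            break
        # Greedy choice: always the most familiar next word, never a surprising one.
        word = next_counts[word].most_common(1)[0][0]
        output.append(word)
    return " ".join(output)

print(generate("the"))  # loops through the most common pattern: "the cat sat on ..."
```

However crude, the sketch captures the point: a system optimized to reproduce the familiar can be reliably "good enough," but surprise has to come from somewhere else.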
ChatGPT, Claude, and Perplexity are not at fault for the lack of boldness and uniqueness in my portfolio. Their familiarity-preferring algorithms and predictive methods, imitating text found online, are strikingly similar to how our own brains work toward creative solutions.
In fact, Aristotle once said that art and creativity often stem from mimicking nature or other works. Confucian philosophy likewise valued learning and imitation, but only as a means of mastering knowledge before creating something new.
Observing myself in retrospect, I concluded that the lack of uniqueness in my marketing portfolio was solely my own responsibility. I had over-relied on AI, hooked on the rush of its fast answers that boosted my productivity, churning out deliverables like clockwork. I was vaping my way through prompts and answers, giving myself artificial injections of dopamine and serotonin that could otherwise have come naturally from thinking deeply about something for a long time and finally reaching a satisfying conclusion.
In the past two years since its public release, ChatGPT has undergone 23 updates, each significantly improving its accuracy, reasoning, and problem-solving capabilities. As for my own updates? Productivity, definitely. But reasoning, problem-solving, and accuracy? Maybe, though none as transformative as ChatGPT's evolution.

A summary of OpenAI's ChatGPT updates since November 2022
AI is designed to replicate the solutions we might reach with more time, making it easy to delegate tasks like reasoning, problem-solving, and ensuring accuracy. At first glance, this might not seem harmful. But the truth is, the process matters more than the outcome. Destinations reached through AI shortcuts are inherently different from those achieved without it, even if they seem identical on the surface.
After extensive research into mankind's greatest achievements, I've concluded that the grueling thinking processes we delegate to AI in exchange for speed and efficiency are the key to cultivating a trait that AI, no matter how advanced, can never possess.
According to philosophers such as Socrates and Immanuel Kant, along with Christian mystics, the Enlightenment thinkers, and the pioneers of Romanticism, there exists something in humans that transcends intelligence: a quality that designates certain individuals as exceptional and defines our ability to create and innovate. This essence, the very force that enabled us to bring AI into existence, must be nurtured to maintain our distinctiveness as a species.
Socrates was among the first to articulate this concept, referring to it in Greek as "daimonion." He described his daimonion as a guiding spirit, particularly in moral decisions. It was not a source of knowledge or intellect, but rather a moral compass that steered him away from improper behavior. Paradoxically, it was Socrates' humility and acknowledgment of his own ignorance (his famous belief that he knew nothing) that set him apart and elevated his wisdom above that of his peers.
The term "daimonion" would later pass into Latin, and eventually English, as "genius."
"Genius" is a word most of us cannot relate to, as it has become a term heavily used by the PR and marketing specialists of the world's most successful people. Today, it often serves as a convenient excuse for bad behavior, overshadowing its deeper significance. Modern cognitive science has further diluted the concept by framing human cognition in measurable terms (processing speed, memory, and problem-solving ability), reducing genius to mere variations of quantifiable traits.
However, the concept of genius predates modern intelligence theories and was initially linked to spiritual or otherworldly guidance rather than intellect. It was never synonymous with intelligence: Socrates' acknowledgment of his own ignorance (something A.I. is incapable of, by the way) was celebrated as a form of genius through his humility, insight, and ability to question deeply.
In the Christian era, genius became intertwined with spirituality, as mystic-saints sought unity with God. These figures believed in profound truths beyond intellectual comprehension, accessible through divine encounters or moments of ecstasy.
It wasn't until the Enlightenment in the 18th century that the term "genius" came closer to the way we understand it today, shifting its focus from divine visitation to individual creativity. Immanuel Kant emphasized that true genius created transformative art through inspiration, not imitation or adherence to rules. For Kant, genius was not about technical skill but about originality: the ability to forge new paths and redefine the boundaries of human achievement.
The Romantic movement further elevated the concept of genius, celebrating it as a profound expression of human truths through intuition and inspiration. For them, genius was not just a personal attribute but a force that connected humanity to deeper, universal truths.
Further into the 20th century, figures like Albert Einstein, Kurt Gödel, John von Neumann, and J. Robert Oppenheimer were celebrated as scientific geniuses, not solely for their extraordinary intellectual achievements but for their ability to think intuitively and creatively in ways that reshaped our understanding of the world. Their work often seemed to stem from an almost mystical insight, a quality that aligned them with the Romantic notion of genius as an unquantifiable force capable of reaching beyond the constraints of formal logic.
As is typical of modern science, which seeks to classify and quantify everything, most of us were excluded from classification as a "genius" and were left self-labeled "intelligent" at best. And in quantifiable measures of intelligence, such as speed and variety of solutions, A.I. has consistently outperformed us.
And for decades, films like 2001: A Space Odyssey, I, Robot, and The Terminator have offered captivating predictions of a future where A.I. breaks its programming and potentially overpowers humanity. This narrative fueled an instinctive yet false impression long before tools like ChatGPT were even introduced to the public. Our cultural prefiguration of A.I. has not only blinded us to the possibility that humans might surpass A.I. in certain ways but has also prematurely defined A.I. as inherently superior to us.
I'm sure I'm not the only one who has naturally found themselves taking the passenger's seat at work while A.I. takes the wheel, thinking, "It's better this way."
We've rarely considered the genius we might bring to the table because we've never seen ourselves as geniuses. And maybe, scientifically speaking, we aren't. But if genius were viewed as a spectrum rather than a binary (as many things are increasingly proven to be), humans would likely sit further along that spectrum than A.I.
After all, no matter how advanced algorithms become in reasoning, problem-solving, or accuracy, they remain bound by predefined parameters and rules, operating only within familiar frameworks, incapable of weighing morals, intuition, inspiration, and expression.
Moments where genius seems possible from A.I. appear in media reports about A.I. hallucinations, when A.I. makes up information that appears factual. For example, one such article tells the story of how A.I. hallucinations have benefited scientists, generating highly implausible yet novel (genius) ideas to tackle challenges like designing entirely new molecules.
However, an A.I. hallucination is nothing more than an error that can be neither controlled nor intended. It is joyous serendipity at best, and cannot be described as genius.
And shouldn't A.G.I. (the hypothetical intelligence of a machine that possesses the ability to understand or learn any intellectual task that a human being can) be considered genius?
Not really. A.I. was first introduced as an academic discipline in the 1950s at the Dartmouth Summer Research Project on Artificial Intelligence, where participants aimed to create machines capable of replicating human cognitive functions such as reasoning, learning, and problem-solving. A.G.I. is just one of those pre-defined fundamentals of A.I. being developed further.
Maybe humans love A.I. so much because of how highly intelligent it can be without the possibility of it becoming a genius. And it's perfectly fine to love A.I., as it stands as the genius invention of humanity: an incredible achievement of our kind.
Even after reading all this, you might be reluctant to admit that you are more of a "genius" than A.I., especially if you're a strong advocate of the science that told you you're not. Or perhaps the term raises discomfort because of its connotation of big ideas challenging societal norms. Figures like the medieval saints and Socrates were often seen as dangerous, as they rejected societal conventions and sometimes faced execution for their defiance. This just shows how powerful our genius can be.
But genius is not a test score, nor a quantifiable trait. Genius is not a cognitive function but rather a unique convergence of creativity, intuition, and insight that transcends raw intellectual ability. While cognitive functions like memory, reasoning, and problem-solving are essential components of intelligence, genius operates on a higher plane, marked by the ability to perceive connections, challenge conventions, and create transformative ideas. It is an expression of originality and vision that cannot be reduced to measurable mental processes, as it embodies inspiration, emotional resonance, and a capacity to innovate in ways that defy purely cognitive explanation.
Genius begins as a brief spark during our problem-solving and reasoning processes: an instinctive moment of questioning that whispers, "but," "what if," "why not?" or "could it be?"
It's this very genius, our capacity to explore and question intuitively beyond rationale, that we're giving up in exchange for A.I.-generated shortcuts.
I wrote this story to share how I underestimated myself and overestimated A.I., in the hope that you won't make the same mistake, since there is no manual on how to use A.I. properly, nor a guide on how to balance our intelligence with its own.
As A.I. has become an accessible tool for anyone over the age of 13, challenging ourselves has become a challenge in itself, with so many tempting shortcuts at our fingertips.
But perhaps, as long as we understand what sets us apart from this technology, we can avoid the discouragement and laziness that lead to over-reliance on it.
Now that you know about the human-exclusive genius within you, a capability that is fading in exchange for shortcuts, how will you put it to use?
This article was originally published by Lisa Kim on HackerNoon.