Reading Dostoevsky sounds quaint, but I wonder how many of today’s youth, raised on digital snippets and instant gratification, would fully engage with hundreds of pages of what they might consider obscure prose. Such a profound work demands sustained attention, deep reflection, and a willingness to grapple with complexity, attributes that grow ever scarcer.
Today, our lives tilt dramatically toward bits over atoms: digital interactions over tangible experiences, driven primarily by convenience and minimal effort. Online shopping, streaming, social media, and quick information searches offer undeniable efficiency. Yet I worry that this effortless convenience, accelerated significantly by AI’s growing presence, may unintentionally diminish our humanity.
Maybe today there is no need to read Dostoevsky. After all, there may be value in breadth over depth, i.e., having access to broad, albeit shallow, knowledge at one’s fingertips. However, I wonder if we are losing something in this low-effort ability to access and create knowledge. I have always believed that nothing worthwhile is easy, and that sustainable learning is possible only through struggle and perseverance. My concern is not so much with the quality of AI outputs or the efficiency of digital processes (I do not doubt that they will continue to improve by leaps and bounds). I am concerned that we grow numb to information assembled through statistical prediction, accepting it simply because it is adequate. Outputs may be just coherent enough to switch off critical thinking.
Let me mention two broad ways in which AI could erode our humanness. First, there is the subtle shift from active to passive intelligence. (See sloppy thinking.) Passive intelligence involves minimal cognitive effort: scrolling through social media, superficially absorbing algorithmically curated content. It feels effortless, but this passivity enriches AI models more than it enriches us. Active intelligence, by contrast, involves intentional effort: formulating nuanced questions, critically evaluating AI-generated outputs, and wrestling with complex problems. Active intelligence strengthens our cognitive faculties, but as AI grows increasingly sophisticated, the temptation to default to passive engagement intensifies. Ultimately, habitual reliance on AI’s convenience risks dulling our critical thinking and creativity, diminishing our intellectual independence.
Second, we are accepting virtual substitutes as adequate replacements for real-world interactions. Dating apps replace spontaneous conversations at social gatherings; texts supplant face-to-face friendships; binge-watching Netflix replaces cinema outings; quick digital messages substitute for thoughtful letters. Admittedly, these substitutions can improve accessibility and convenience, but they often strip away depth, intimacy, and authenticity. Scrolling endlessly on Instagram offers immediate, shallow entertainment compared to the rich storytelling of a compelling film or novel. Likewise, digital pornography is a poor substitute for sex, but it offers a simulacrum of anything you want, whenever you want it, without any negotiation with another human being’s needs. As we increasingly curate our ideal digital experiences, we may find ourselves ill-equipped for life’s realities, and its inevitable disappointments, outside the digital realm.
Are we becoming like the proverbial boiling frog, unaware that the water grows hotter until escape becomes impossible? A recent Netflix experiment found that consumers were more satisfied with AI’s recommendations than with their own selections, a subtle yet profound indicator of our growing reliance on artificial intelligence. If AI begins to know us better than we know ourselves, what does that say about us? Could our minds quietly rewire themselves to prefer superficial ease over meaningful struggle? Are we drifting toward a reality where appearing knowledgeable supplants genuine understanding, where authenticity and effort become unnecessary burdens?
Reflecting on this trajectory, we may need to decide whether ends achieved without the means (the struggle) are enough, or whether the comfort of digital convenience is worth the silent cost of losing ourselves. Dystopian thinking? Maybe, but perhaps the true danger lies in quietly surrendering our agency, one passive choice at a time. Preserving our humanity depends less on resisting technology than on protecting our capacity for struggle and authentic experience.