



The Last Human Century
Willem Van Lancker
—
Between 1920 and 2020, humanity did something unprecedented: we made a world almost entirely for ourselves. Not for gods, not for kings, but for the individual human being.
In these hundred years, we birthed mass culture, and it became our shared nervous system. Everything from Elvis’s hips to Sputnik’s launch to the Friends theme song stitched together a global identity. And for the first time, we archived not just the extraordinary but the mundane: countless laughs, smudges, and offhand remarks. The 20th century turned humanity’s ephemeral noise into a cathedral of data, one that would become the foundation for artificial minds.
It was also a fleeting time, we realize now—a century in which we wielded the power of mass creation without the support of artificial minds. Like all great transitions in human history—from oral tradition to written word, from scriptorium to printing press—this shift marks both an ending and a beginning.
Cultural memory has a certain rhythm: every generation romanticizes the era roughly 30 years before its own. Gen Z clings to the pixelated naivete of the 2000s; millennials buff the 1980s and 90s into a yuppie golden age; Gen X yearns for the idealism of the 1960s. But the 2020s mark a rupture in this continuum. When an AI-generated song hit the Billboard charts in 2023, the prior century collapsed into legend: the last time creativity wasn’t a dialogue with silicon.
Future generations, my own kids included, will be raised in a culture where every song, building, and political system is co-authored by machines. They will look back on this era with wonder and nostalgia: a mythic age when creation was raw, imperfect, fragile, and ours alone.
Consider the Apollo 11 moon landing, an apex moment of the 20th century. The Apollo Guidance Computer ran at roughly 2 MHz (an electric toothbrush has more processing power); schematics were drafted in pencil, mission control worked in a haze of cigarette smoke, and astronauts trained in swimming pools. Future humans will marvel not at the triumph, but at the audacity: They did this with slide rules and human spirit.
What made it mythic wasn’t the output, but the constraints. We awkwardly and inefficiently built atop the work of other humans, not synthetic minds. The Ford assembly line wasn’t designed by algorithm; Pixar’s Toy Story wasn’t a latent diffusion of a simple prompt. Apple was founded by misfits in a garage, not born of some rigorous analysis. Even our failures—the Edsel, Google Glass, Fyre Festival, the Juicero—were authentically ours. You can feel the humanity: vision, overreach, ego, lack of precision—the trembling hand of a species not fully optimized.
Then came the pivot point. As pandemic lockdowns accelerated our migration to digital symbiosis, something else awakened alongside us. OpenAI released GPT-3 to little public acclaim, but it marked a threshold we couldn’t uncross. Shortly thereafter came DALL-E 2 and Midjourney. Suddenly, anyone could type “a fox in a spacesuit, oil painting” and conjure it. The shock wasn’t just the output; it was how swiftly it felt normal.
Henceforth, every painting, pickup line, and thought piece (even this one!) carries a silent question: Was this you, or the machine?
The results aren’t perfect yet, but like the first steam engine or the typewriter, they hint at a world where creation requires less work. Less staring at blank pages. Less wandering. Less doubt.
By 2030, the cultural pendulum will have fully swung this way. A child will instantly generate a film that Kubrick might’ve made, or a dozen sequels to Grand Theft Auto. But future critics will likely note a telltale smoothness in post-2020s art—and originality will require new depths. We’ll have to go further to find our rough edges, our in-jokes, our fingerprints. The Warhol of 2070, raised on infinitely customizable and scalable AI, will marvel at how the original Warhol worked: cutting film by hand in a filthy loft, drinking Campbell’s Soup straight from the can, turning literal trash into something sacred.
The longing for this bygone era will grow exponentially. We’ll become their Roman Empire—a civilization whose artifacts speak of both triumph and limitation. Future humans will understand our art through AI-powered analysis, algorithms dissecting every petrified scroll and quantifying why a punk riff recorded in a garage technically feels a certain way. But comprehension isn’t communion. They’ll stare at our relics the way a parent gazes at home videos of a young child’s play—knowing its fleeting nature, mourning its loss.
The very concept of “human-made” will gain new significance, just as “handmade” became meaningful only after industrialization. An ad hoc TikTok dance, a cringe brand tweet, a SoundCloud rapper’s mixtape—these may become folk art, not because they’re profound, but because they’re uncalculated. Future creators, drowning in endless AI-generated permutations, will envy our constraints.
For a century, between the crackle of AM radio and the first whispers of the LLM, we lived in a world sized for human hands. Looking ahead, AI isn’t our end—it’s the culmination of our obsession with making: we finally built something that makes better than we can alone. No one will celebrate a website coded without copilots, just as no one mourns the loss of dial-up internet. There’s a temptation to resurrect that era—to shove this genie back into its bottle. But as the Luddites of the industrial age remind us: progress like this is often inevitable despite any well-meaning hopes to stop it.
Our descendants will split their gaze: some tending to islands of pure human creation, others resultsmaxxing in endless machine-made worlds. But they’ll return, always, to our century’s archive—that last great collection of purely human noise. Not because it was better, but because our very limitations made it meaningful. In a world of endless possibility, our constraints will seem romantic; our imperfections, divine.
—