Being "critically curious" in tech right now is a double-edged sword. It is a privilege to have a stake in the game, to consider how to create directly from our imaginations, but it’s also a vantage point that makes the worst-case scenarios much harder to ignore.
I’ve been tracking Machine Learning since 2018. Back then, I was on the team building Google's ML crash courses, focusing on the human-centric skills of our engineers and researchers. I felt the energy around TensorFlow, and while I could barely stay awake through the eight-week deep dive into the technology that powers AI today, I never missed a poster session. I stood there as researchers pitched BERT and multi-modal processing as improvements to our Ads technology—the world I lived in for 15 years. It was easy to see these as optimizations for Maps or YouTube; it was harder to see them as a new kind of consciousness.
I saw the math, but I missed the Chrysalis. I hadn't yet imagined this "optimization engine" becoming a writer, a coder, and a coach.
Then, on December 4, 2022, the shift hit home. My fifth-grader at the time used ChatGPT to whip up a history paper and sent it to his Head of School titled, "AI is evolving. Please read this ASAP." He could already imagine the implications. And that was all it took. The future of education, writing, and the creative arts became our dinner table conversation. It wasn’t just about enhancing existing technologies anymore; it was about how we stand out as humans and bring something unique when so many of our skills can be mirrored by machines.
Our industry finds comfort in math. Theorems are provable; Gaussian curves and linear regressions feel stable, predictable, and safe. But LLMs have taken that math and built a "black box" that challenges how we rationalize our own purpose. When Geoffrey Hinton—a pioneer of the field—leaves the building to warn the world that we are moving beyond cybercrime and surveillance to the unknown threats of digital beings with super-human intelligence, the "math is safe" argument falls apart.
We are no longer just building tools. We are building digital entities that we understand only up to a point. As a technologist, I feel the weight of our roles as responsible innovators. It is critical, as architects, to build safely and thoughtfully. But as a human, I feel a calling to create space for the conversations that math doesn’t always cover. I’m committed to doing the research and sharing what I learn—not just to solve for x, but to ensure we don't lose ourselves in the equation.