Doing Things the Hard Way

Date · 2025-10-27

I love using ChatGPT for all sorts of things. When I can explain a code-based task easily and efficiently in English, I often get it to help me write the code: plotting graphs, processing data, writing API endpoints or much of the frontend of this here website. There's also travel tips, shopping-related research (it helped me massively with the last bicycle I bought) and GitHub commands I should have learned by now, but still haven't. And of course there are the more fun things, like questions about what kind of bird would make the best top-tier management consultant - because I'm sure you're dying to know, it's a raven. But sometimes I catch myself reaching for ChatGPT when I don't have ideas, or when thinking is starting to feel like an effort. And then I stop...

It's been said that social media is eroding our ability to think critically, as we share and consume emotionally charged content presented to us with very little context or nuance. Similarly, I think LLMs are eroding our ability to think deeply. Obviously, like almost any tool, ChatGPT/Claude/Gemini etc. are amazing if used properly. I have a colleague who uses ChatGPT to understand topics in an iterative, back-and-forth manner so they can really grasp the fundamentals of a problem. Another friend uses ChatGPT to help them ask the right kind of questions to take a creative project forward. In both cases they're using ChatGPT as a kind of mentor.

I think, however, that asking an LLM to solve something for you as soon as it gets tough is the same as opening a math textbook, reading the exercises, skipping to the answers in the back and then saying you've grasped the material. Spoiler alert: you haven't.

A truth I think we all know about problem solving - and life in general - yet one we often find hard to accept, is that the good, juicy bits always take some trouble to get to. Not only is the reward better (think of that sweet dopamine hit) when you solve a tough problem, but you learn this way. In a deep way. A few days ago, I wrote a little matrix sorting algorithm using a method from my optimization class in my master's five years ago - and it was so much fun turning off Copilot and figuring it out from scratch! I also recently heard a Simon Sinek interview where he pointed out that habitually leaning on ChatGPT to give us the "right" answers in life robs us of the hard-earned and valuable lessons we learn when we make mistakes.

Along with this trend towards shallow understanding, I think we're experiencing a flood of AI-generated content, or "AI slop" as it's aptly called. There's just so much out there, and so much of it feels empty. It's essentially information fast food, devoid of anything genuine or authentic, and it usually leaves me unsatisfied. When I see one of those posts on LinkedIn with the typical emojis on almost every line (you know the ones, ✅ 🚀) and that snappy short-paragraph style, without even a single mistake, I just scroll past. It's undoubtedly something unoriginal and empty. Not so long ago I was such a stickler for perfect grammar, spelling and punctuation, yet nowadays I find the occasional mistake somewhat comforting, as if it signals "This was written by a real human being."

The irony of this flood of content is that after a little dip, when everyone started to think that creatives were all about to end up unemployed, the slop is beginning to make people with unique styles - people who worked for years to develop them - more valuable again. The vast majority of content out there just becomes background noise, against which the few original ideas and works of art stand out all the more.

From my experience as an A&R (talent scout) at Universal Music in the 2010s, I believe there is a place in the world for generic content. Sometimes we just want to dance and we don't care what's playing. But the stuff that really sticks around - and that moves people - is almost always unique and special in some way. This applies to music, to writing, to fine art and to ideas. I don't think that's going to change.

What's the takeaway here then? In Deep Work, Cal Newport makes an argument (pre-AI content deluge) for why the ability to focus, in a world constantly driven to distraction by modern technology, is a distinguishing trait. I believe that being able to create original content - built upon skills developed through years of deep learning and honing one's craft - will similarly distinguish creators amongst the rising tide of AI slop. I lean towards being an AI optimist and believe that AI will take over a lot of drudgery, like automating boring processes or writing throwaway copy that has to exist but that no one really reads. However, I still think that in the coming years, the original art and ideas that really touch and move us will be created by humans who did the hard work of learning and thinking through difficult problems. I'm all for using AI to augment what we do, but in a world that values originality and authenticity, I think we're still a long, long way from AI replacing us.