The Looking Glass: What AI Won't Change
Run more experiments, what discipline is, scripts for tough feedback, and making magic
Hello readers!
This week’s tidbits:
What AI won’t change
Run more experiments
What discipline is
From the archives: Diagnose with Data; treat with Design
For paid subscribers:
Subscriber mailbag: How to deliver feedback without reducing trust?
A script for giving tough feedback
50 words for a winning
Making magic
There are now over 50,000 subscribers! Crazy! What do y’all read this for? Inspiration? Tactical advice? Solidarity of suffering? (I hope you tell me it’s the bad poetry :P)
As a reminder, paying for a subscription (about the price of buying me a fancy coffee every month) encourages me to publish more and guilts me into not using AI-generated responses. (All words still custom crafted!) You’ll also get subscriber-only essays, Q&A, and double the number of tidbits.
If you like The Looking Glass, help me spread the word!
What AI won’t change
What is scarce carries more status.
What is hard delivers more pride.
What goals matter, over what time frame, can only be answered by the heart.
What destroys us will be our mundane hubris and fear.
Run more experiments
If you work in product — particularly if you work in growth — you probably know about the value of experimentation.
The premise goes like this: the more experiments you run, the faster you’ll improve your product. The faster you improve your product, the faster you’ll grow.
I was skeptical at first. What happened to quality over quantity? Surely spending more time on fewer, smarter ideas will yield more growth than a massive velocity of spray-and-pray.
It turns out, I was wrong.
This is because so much of growing is executing lots of little optimizations that compound over time. It’s rarely just a single improvement that changes the game.
Also: no matter how smart or experienced you are, your hit rate of knowing what will work is low. And because it is low, you need more shots at goal.
So, the combination of: needing to find many small optimizations AND a low hit rate of success = you should launch many experiments.
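To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (the hit rate and per-win lift are assumed numbers for illustration, not figures from this post) of how a low hit rate plus small compounding wins favors running more experiments:

```python
# Back-of-the-envelope: small wins compound, and a low hit rate means
# more experiments -> more wins. All numbers are illustrative assumptions.
hit_rate = 0.2       # assume only 1 in 5 experiments "works"
lift_per_win = 0.02  # assume each winning experiment improves the metric ~2%

for experiments in (5, 20, 50):
    expected_wins = experiments * hit_rate
    compounded_lift = (1 + lift_per_win) ** expected_wins - 1
    print(f"{experiments:>2} experiments -> ~{expected_wins:.0f} wins, "
          f"~{compounded_lift:.1%} compounded lift")
```

Under these assumed numbers, 5 experiments yield roughly one win and a ~2% lift, while 50 yield roughly ten wins and a ~22% compounded lift: same hit rate, very different outcome.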
This is true not just for growing products, but also for growing your career.
You do not improve unless you change what you do in your day-to-day. Status quo = no growth.
Of course there is always a base level of change in one’s job — a new task, a new project, a new collaboration. And the faster-paced your job environment, the faster you will grow. This is why working at a start-up usually leads to more personal growth than working at a big company — the environment forces you to change more frequently.
But you can take your growth even further by asking: what additional experiments can I run every week?
Think of an experiment as an intentional change you’ll try, that you hypothesize might help you make progress toward some larger goal.
Remember: your hypothesis might be right or wrong. But you won’t know unless you do it and see the results.
Some experiments can be big: let me try moving into management, let me try working at a start-up or a big company, let me try to teach a course.
But the vast majority of them can be small.
Can you ask your manager a new question?
Can you make a tweak to how you write status updates?
Can you try a new morning ritual?
The important thing is launching many experiments.
Can you try 5 career experiments next week?
Email me what you tried and learned!
What discipline is
Discipline is pain
Discipline is promise
Discipline is the sharp knife
That slices your skin
Separating who you were
From who you are becoming
Discipline is the life preserver
Tossed as you drown
In a sea of self-pity
Discipline is a spider’s weave
Quiet
Invisible
Delicate
But see how it connects
Branches and doorways
Your mind’s greatest dreams
Discipline is the looking glass
Sometimes sneering at you
Sometimes broken
But there will never be a victory
Where you grin in triumph
Without her grinning back
From the archives: Diagnose with Data; treat with Design
My co-founder Chandra Narayanan's quote has become something of a product-builder's mantra for us: Diagnose with data and treat with design.
First: "diagnose with data."
The job of data is to help you understand the ground truth of what is going on (with your product, user behavior, the market, etc.). Typically, we humans run on intuition, a rudimentary kind of pattern-matching. This is insufficient in many cases.
Intuition works if you've studied something deeply (think Serena playing tennis). But it does not serve you well in:
Making decisions for contexts you don't understand
Generalizing predictions at huge scale / complexity
Optimizing the impact of many tiny decisions
When I say "data," I mean objective facts that help you understand people's reactions to what you are building. This can be:
Qualitative observations (yes, user research or customer discovery is data!)
Quantitative behavioral data (clicks, views, etc.)
Market data (where do your competitors, or the worst / average / best in your field, stand?)
When you are data-informed, it means that you are paying attention to data points beyond your intuition that give you an understanding of what is actually happening, whether problems or opportunities, so that you can make the best decisions.
When people say they want to be "data-driven" with product decisions, I get nervous.
There are many shortfalls of data, including:
Data can tell you what is happening, but not what to do about it
Quantitative behavioral data can tell you what someone is doing, not why.
"Treat with design" means that once you understand what is happening in detail—what is the problem? What's possible (from benchmarks)? Where are the opportunities?— you can craft a solution. Design is creative, open-ended problem-solving.
Design is not the way it looks, or beautiful colors and animations. It's not the brand or the logo. It is how the product works. Designing is the process of exploring and arriving at a solution. I believe all builders are designers.
Design and data are not at odds with one another. One helps you understand phenomena and gives you a foundation on which to build your assumptions. The other is the joyful process of creation to solve problems based on those assumptions.
Of course, data and "design intuition" can point at different conclusions. This is usually because:
The data is being interpreted incorrectly
Design intuition is wrong
How do you know which?
An example of data being interpreted incorrectly: This change results in a higher action rate, so that means it's better.
Are you sure? Did you check how many people undid their action or pressed "back" after the fact? Maybe they did something they didn't mean to.
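A toy illustration of that trap (the numbers below are assumptions, not data from this post): a variant can "win" on raw action rate yet lose once you subtract the actions people immediately undid.

```python
# Toy numbers (assumed for illustration): variant B looks better on raw
# action rate, but worse once undone actions are subtracted out.
variants = {
    "A": {"users": 10_000, "actions": 1_200, "undos": 60},
    "B": {"users": 10_000, "actions": 1_500, "undos": 550},
}

for name, v in variants.items():
    raw_rate = v["actions"] / v["users"]
    net_rate = (v["actions"] - v["undos"]) / v["users"]
    print(f"Variant {name}: raw action rate {raw_rate:.1%}, "
          f"net (kept) action rate {net_rate:.1%}")
```

With these made-up numbers, B's raw rate is 15% vs. A's 12%, but its net rate drops to 9.5% vs. 11.4%: the "better" metric was measuring accidental actions.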
If design intuition tells you that some experience is bad (because it's hard to use, it's confusing, etc.), TRUST the intuition. It probably is bad for some group of people. The job now is to understand: for whom, and how bad?
After you learn that, you might still decide to ship the experience (maybe because those people aren't the target customer). But you should be informed in making the decision.
If design intuition tells you that A works better than B at a large scale, be wary.
For example, I like myself a clean, modern interface. I like white space. I prefer the elegance of icons only over icons + text. For a product with hundreds of millions of users, this intuition is wrong.
The more your target audience does not look like you, the more you should be skeptical of your design intuition. Despite having seen thousands of A/B test results for minor design details, I am still regularly surprised and humbled by my intuition being wrong.
In general, use quantitative behavioral data (segment analysis, A/B testing, etc.) when you're optimizing and growing something. Do customer discovery / user research when you are in the early 0->1 phase.
In conclusion, data helps you become a better designer. But data by itself does not lead to wonderful things. You still have to design them.