Don't be like my shampoo!
A shampoo that looked like banana juice made me reflect on how easily users can misinterpret design. In software we're lucky: we can test, iterate, and fix. Here's how to catch those "banana juice" moments before they ship.

A while back I bought a new shampoo labelled "Hair Food Banana", branded Fructis and marketed as containing 96% ingredients of natural origin.
I bought it because it was supposed to work well for my type of hair, and I didn't think much of it when I found it in the shampoo section. But then two things happened that made me question the design of the packaging.
First, when my mom saw it in the kitchen (I had recently taken all my groceries out of their bags and was about to bring the shampoo to my bathroom), she asked me if she could try that banana juice I’d just bought. I thought it was simply a misunderstanding because she doesn’t speak English.
But then, when I was about to leave it in my shower, I noticed a big round sticker reading "Do not consume" in Spanish and Portuguese. That’s when I thought — it’s not just my mom. It’s the manufacturer’s error for selling shampoo in packaging that looks awfully similar to juice!

I know shampoos (and other cosmetics) usually include warnings against consumption, but they’re typically written in small print.
So why did this shampoo need such a big, obvious warning?
Did they learn, after going to market, that people were actually confusing the shampoo with banana juice?
Did they add the sticker as a quick fix because they'd already invested in the bottles and didn't want them to go to waste?
The shape of the bottle itself does look a bit odd for a shampoo. What must the design process for physical products be like, if something like this can make it to market?
All these questions made me reflect on how lucky we are in software: mistakes like this can be reverted. Even better, they can be caught early with the right development processes.
So, if that shampoo were a piece of software, what could they have done to catch that design error early on?
How can we avoid making the same mistake?
How do we, as product engineers building software, make sure we don’t confuse our users?
Here’s a list of things you should be familiar with:
User Testing with Real People, Not Just Your Team
Watching how outsiders use your app can be a real eye-opener. It’s the actual test of your design assumptions.
Doing this while shadowing or using screen recordings can help uncover UX issues you’d never expect.
For example, at Ontruck our internal users had discovered hacks to do things we hadn’t even built features for. It was a real surprise and a clear sign of where the friction was.
A/B Tests
These work for all kinds of things, whether it’s feature testing, funnel optimization, or copy testing.
This process of showing different UX/UIs to users in parallel gives you a clear, data-backed view of what drives better results.
The best part? It’s complementary to the other methods in this list.
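For A/B results to be trustworthy, the same user should land in the same bucket every time they show up. As a minimal sketch of that idea (the function and experiment names here are illustrative, not from any particular library), you can assign variants deterministically by hashing the user ID:

```python
import hashlib


def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically assign a user to a variant.

    Hashing "experiment:user_id" means the same user always lands in the
    same bucket for a given experiment, but buckets are independent
    across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

Because assignment is a pure function of the ID, you don't need to store who saw what: any service can recompute the bucket on the fly.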
Continuous Deployment and Progressive Rollouts
As software engineers, this should already be a standard practice, but it’s worth calling out.
This includes using feature flags to toggle functionality with a click. It’s how you test new features without needing to roll out a whole new version labelled “Do not consume!”
Imagine you’re running an A/B test, but midway you discover a critical issue. Instead of rushing a new deployment, you just switch the feature off and go back to normal in seconds.
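At its core, a feature flag is just a lookup guarding a code path. Here's a bare-bones, in-memory sketch of the idea; in practice teams usually reach for a service like LaunchDarkly or Unleash, and all the names below are made up for illustration:

```python
class FeatureFlags:
    """Minimal in-memory flag store. Real flag services add targeting,
    percentage rollouts, and audit logs on top of this same idea."""

    def __init__(self):
        self._flags: dict[str, bool] = {}

    def enable(self, name: str) -> None:
        self._flags[name] = True

    def disable(self, name: str) -> None:
        self._flags[name] = False

    def is_enabled(self, name: str) -> bool:
        # Unknown flags default to off: new code paths stay dark
        # until someone explicitly turns them on.
        return self._flags.get(name, False)


flags = FeatureFlags()
flags.enable("new-checkout")


def checkout(cart) -> str:
    # The flag check is the "kill switch": flipping it off reverts
    # to the old flow with no deployment.
    if flags.is_enabled("new-checkout"):
        return "new checkout flow"
    return "old checkout flow"
```

The important property is that `disable("new-checkout")` takes effect instantly, with no build or deploy in between.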
Fake-Door Tests
This is a concept I haven’t implemented myself, but I’m seeing it more often, even in mature products.
The idea is simple: build the UI for a feature and track how many users try to use it. It can be as simple as a button that does nothing (or shows a message like “Coming soon”).
I was a little disappointed the first time I stumbled upon one, but I understand the value, and now I’m just waiting for the real feature to land.
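The mechanics are tiny: the "feature" is just a handler that records interest and tells the user it isn't ready yet. A minimal sketch, with hypothetical names and an in-memory log standing in for your analytics pipeline:

```python
# Stand-in for your real event pipeline (PostHog, Amplitude, etc.).
interest_log: list[dict] = []


def handle_fake_door_click(user_id: str, feature: str) -> str:
    """Record that a user wanted a feature that doesn't exist yet."""
    interest_log.append({"user_id": user_id, "feature": feature})
    return "Coming soon!"


def demand_for(feature: str) -> int:
    """How many distinct users tried the fake door?"""
    return len({e["user_id"] for e in interest_log
                if e["feature"] == feature})
```

If `demand_for("export-pdf")` stays near zero after a few weeks of traffic, you've saved yourself from building the feature at all.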
NPS + qualitative questions like “What confused you?”
This one works especially well when you’re running A/B tests or doing incremental rollouts (e.g. canary releases).
It’s a solid method to gather early feedback at scale, particularly when user testing or screen recording isn’t feasible.
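For reference, NPS is computed from 0–10 survey scores: the percentage of promoters (9–10) minus the percentage of detractors (0–6), with passives (7–8) counted in the total but in neither group. A small sketch:

```python
def net_promoter_score(scores: list[int]) -> float:
    """NPS = %promoters (9-10) minus %detractors (0-6), from -100 to +100."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)
```

The number on its own is coarse; it's the follow-up question ("What confused you?") attached to each low score that tells you what to fix.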
Mental Model Mapping
This technique leans more towards the design side. It’s about analyzing whether your design unintentionally resembles something else.
This exercise could’ve probably saved the shampoo manufacturer from having to add a sticker.
If something looks like juice and smells like juice, then people will assume it’s juice!
So be mindful of how your product presents itself, and avoid misleading associations.
Dogfooding: Use Your Product in Real-World Contexts
What better way to find out where your product is confusing or broken than using it yourself?
If your product solves a problem you experience, then use it. Do it while developing. It brings the feedback loop closer and can save you from figuring it out later, when it’s more costly.
In the case of my shampoo: if the designers had brought it home and left it on the counter, maybe someone they live with would’ve tried to drink it too!
Instrumentation and Event Tracking
None of the above methods that rely on data would work if your product isn’t instrumented correctly.
Every product engineer should know the basics of event tracking and product metrics. How else are you going to make “data-driven decisions” without actual data?
Hint: get familiar with tools like PostHog, Amplitude, or even Google Analytics. Build funnels on your own site. Track what defines your conversion.
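To make the basics concrete, here's an in-memory stand-in for what those tools do at scale (all names invented for illustration): track raw events, then count how many users survive each step of a funnel, in order:

```python
from collections import defaultdict

# Stand-in event store; a real setup would send these to PostHog,
# Amplitude, or Google Analytics instead.
events: list[dict] = []


def track(user_id: str, event: str) -> None:
    events.append({"user_id": user_id, "event": event})


def funnel(steps: list[str]) -> list[int]:
    """Count the users who reached each step of the funnel.

    A user counts for a step only if they also performed every
    earlier step, so the counts can only shrink left to right.
    """
    by_user: dict[str, set] = defaultdict(set)
    for e in events:
        by_user[e["user_id"]].add(e["event"])

    counts = []
    survivors = set(by_user)
    for step in steps:
        survivors = {u for u in survivors if step in by_user[u]}
        counts.append(len(survivors))
    return counts
```

A sharp drop between two adjacent counts is exactly where your users are getting confused; that's the screen to put in front of real people.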
Conclusion
We're lucky to build digital products. We don't get stuck with bad design decisions just because a huge batch of bottles has already been manufactured.
If we care about how our product is “packaged” and listen to feedback early, we can avoid costly errors, and stop over-investing in things that confuse or mislead users.
This is how I try to approach the software I build, and it’s a common theme in the indie hacker community.
Are you trying to build without investing too much in bad ideas? I'd love to help. Book a call if you want to talk about it.
Have you experienced something similar? Drop a comment below — I'm curious to hear your take.