IMPULSE #7: Life Story

A few days ago I had a very simple mission. Go to the store, buy a few things, get out. Instead it turned into an unplanned usability study. I needed cornstarch. That is not exactly an exotic product, so I walked to the baking section, then to the sauces, then to the international food aisle, then back to baking. I walked the same path again and again and still could not find it. At some point I just stood there in the middle of the aisle and realised that I was living inside my own thesis problem.

I knew the store had cornstarch, it is a common product and I had bought it there before, but my internal map completely failed. Shelf labels were tiny and placed at odd positions. My working memory was full of other items from my shopping list. After about twenty minutes of wandering, I finally found it in the very middle of a shelf, in a spot that should have been easy to notice, yet I had walked past it without seeing it at all. That moment was the first impulse. If I had my imagined AR glasses, connected to the store’s inventory, this would have been a two-second problem.

The story did not end there. When I finally picked up the cornstarch, there were two brands. The packaging looked almost identical. I could not see at a glance what the difference was, apart from a small price variation and some vague marketing text. I stood there comparing ingredients, Googling on my phone, opening product pages and reviews, trying to understand which one to choose. That felt like a second micro usability test. Finding the product is one task, choosing between options is another. Both were slower and more frustrating than they needed to be.

Later I told this story to friends, and a few people immediately answered with similar experiences. They knew the store had a product, but could not locate it. Or they found something, then spent ten minutes trying to compare slightly different versions without any help. Some of them are very tech-comfortable, so this is not a “user error”. It is a mix of confusing layout, poor signage and the cognitive load of making small decisions in a crowded, noisy environment.

This small field visit also changed how I think about evaluation. It is easy to say “AR will save time in the supermarket”. Now I have a real reference situation where I can ask people how long they typically search for items, how often they feel lost, and how they currently make product choices. In a prototype study, I could measure the difference between the current experience and a guided AR version. The frustration I felt in front of that shelf is exactly the kind of pain point that can justify the complexity of a combined AR and IoT system.

In the end, this was just a normal shopping trip, but it gave me a very strong validation that my topic is grounded in everyday life. People are already hacking the system with their phones and Google. My research question is how to turn that into a seamless, spatially aware experience that lives in the environment itself instead of on a small screen.

AI Disclaimer
This blog post was polished with the assistance of AI.
