Testing AR Platforms for Seamless Product Experience

1. AR Code – The Most Seamless Experience, But at a Cost

Website: https://ar-code.com

What I really liked about AR Code is how simple and fast the experience is. You don’t need to download any app; the product appears directly in AR in just two taps. This is a huge advantage for customers, especially in retail, where any extra step can lead to drop-off.

However, the downside is the pricing model. It works on a paid membership basis, and the free version is extremely limited. It is also quite expensive: the Standard membership costs 79 EUR per month and the Pro membership 890 EUR per month. I was only able to test it with basic text-based AR, which is the only free feature available, not with the core features that actually make the platform powerful.

These locked features include:

  • 3D object scanning
  • Face filters
  • Video in AR
  • Advanced interactive elements

Where AR Code Has Limitations / What to Check

Dependence on network/web — Because it’s Web AR, users need a decent internet connection. Offline use is not possible (unlike some native AR apps).

So while the platform is technically impressive and extremely customer-friendly, it’s not very accessible for students or small creators without a budget. I would say it’s pretty good for prototype development.

2. Overly – Affordable and Surprisingly Elegant

Website: https://overlyapp.com

Overly feels like a budget-friendly alternative to AR Code. The pricing is very accessible at around 19 EUR, which makes it much more realistic.

Visually, I actually prefer Overly’s display and viewing experience over any of the others. The way the display and animation appear feels clean and intuitive. One very interesting interaction detail is that the animation triggers automatically when the camera sees the image — there is no need to scan a QR code, which can be really aesthetically pleasing for customers. It also takes only about a second, and the animation starts without any tapping on the screen. This is especially effective for customers who might not realize they need to interact with the screen.

However, there is also an important limitation:

  • You cannot create AR content directly inside the mobile app (which I actually find good, since it will not confuse users)
  • The app works mainly as a viewer / review tool
  • Creation happens externally and then gets imported

Still, from a user experience perspective, Overly feels very polished and customer-friendly. You can design the experience so that it activates when the camera sees a specific poster or painting, which can be useful in retail. Of course there are limitations: the image needs to be visually complex enough for the program to recognize it.
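
To make the auto-trigger behaviour concrete, here is a minimal TypeScript sketch of the interaction logic I observed: the animation starts only after the image target has been tracked steadily for about one second, with no tap required. The names and the threshold are my own assumptions for illustration, not Overly’s actual implementation.

```typescript
// Minimal sketch of "auto-play after ~1 s of stable image tracking".
// All names and thresholds are hypothetical; real platforms (Overly, AR.js,
// MindAR, etc.) expose their own tracking events.

type FrameUpdate = { targetVisible: boolean; timestampMs: number };

class AutoTrigger {
  private visibleSinceMs: number | null = null;
  private started = false;

  constructor(
    private readonly holdTimeMs: number,        // how long the target must stay visible
    private readonly onStart: () => void,       // e.g. start the AR animation
  ) {}

  update(frame: FrameUpdate): void {
    if (!frame.targetVisible) {
      this.visibleSinceMs = null;               // lost tracking: reset the timer
      return;
    }
    if (this.visibleSinceMs === null) {
      this.visibleSinceMs = frame.timestampMs;  // first frame where the poster is seen
    }
    const heldFor = frame.timestampMs - this.visibleSinceMs;
    if (!this.started && heldFor >= this.holdTimeMs) {
      this.started = true;
      this.onStart();                           // no tap needed, the animation just begins
    }
  }
}

// Usage: simulate a camera feed that sees the poster from t = 500 ms onwards.
const trigger = new AutoTrigger(1000, () => console.log("Animation started"));
for (let t = 0; t <= 2000; t += 100) {
  trigger.update({ targetVisible: t >= 500, timestampMs: t });
}
```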

3. Adobe Aero – Powerful, But Coming to an End

Support update: https://helpx.adobe.com/aero/aero-end-of-support-faq.html

Unfortunately, Adobe Aero is slowly being phased out. Adobe officially announced the end of support, and this directly affects long-term usability.

  • The app is already facing technical issues and limited stability

This is especially problematic for students and researchers (like me) who invested time into building scenes inside Aero. Even if you already have it installed, it will not be a reliable long-term solution.

4. ARLOOPA – Weak Performance and Limited Results

Website: https://www.arloopa.com

Out of all the platforms I tested, ARLOOPA gave me the weakest results. The AR tracking felt unstable, object placement was inaccurate, and several features simply did not work properly on my phone.

They do offer:

  • a web-based preview
  • a Pro version that might perform better

But based on my current testing experience, the free version felt unreliable and low in quality. It may be usable with a professional subscription, but in its basic form it’s not convincing for serious product visualization.

In short:

  • AR Code offers the smoothest experience, but at a high financial entry barrier
  • Overly is the best budget-friendly and customer-oriented solution for viewing.
  • Adobe Aero is no longer future-proof
  • ARLOOPA currently feels too unstable for serious use

How AR Affects Time Perception in Retail Spaces

Why browsing feels faster, easier, and more intentional with augmented reality

Have you ever walked into a store thinking you’d “just stay for five minutes,” only to look at your phone and realize half an hour had passed? Or the opposite: you wanted to browse slowly, but the environment was so overwhelming that everything moved quickly? Time is an interesting thing in retail. It’s not just measured in minutes; it’s defined by how we feel during the experience. In other words, augmented reality is quietly changing your sense of time.

Why time perception matters in retail

Feeling like time is dragging can make customers bored or overwhelmed. Feeling rushed can push them into poor decisions. Both outcomes hurt the shopping experience.

Studies on customer engagement and AR/VR experiences show that immersive, supportive digital tools can reduce hesitation and improve decision quality (Pandya, 2024).

Time perception is deeply linked to those emotions — confidence, engagement, and clarity.

How AR changes the way we browse

AR affects time perception mainly because it changes how much effort we spend while shopping.
Here’s where the biggest difference happens:

1. AR makes the “first stage” of browsing much faster

Traditionally, this first stage means physically scanning through racks and shelves to see what is even available. AR changes this completely.

With AR, you can scroll through items digitally, preview them instantly, and remove 80% of the “search time.” Instead of checking 30–40 pieces physically, you preview them on a screen and quickly filter out what doesn’t fit your aesthetic.

A systematic review on virtual try-on and visualisation supports this idea: visuals increase decision certainty and reduce browsing time before trying something physically.
https://www.sciencedirect.com/science/article/pii/S2543925123000347
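
As a rough illustration of this “remove the search time” idea, here is a small TypeScript sketch that filters a rack of items down to a short try-on list based on the shopper’s stated preferences. The item data and preference fields are invented for the example; a real AR catalogue would of course be richer.

```typescript
// Hypothetical pre-filtering step: 30-40 physical items become a handful of
// digital favourites before anyone walks to the fitting room.

interface Item {
  name: string;
  style: "oversized" | "slim" | "relaxed";
  color: string;
  sizes: string[];
}

interface Preferences {
  styles: Item["style"][];
  colors: string[];
  size: string;
}

function previewFilter(rack: Item[], prefs: Preferences): Item[] {
  return rack.filter(
    (item) =>
      prefs.styles.includes(item.style) &&
      prefs.colors.includes(item.color) &&
      item.sizes.includes(prefs.size),
  );
}

// Example: only the pieces that match style, colour and size survive the preview.
const rack: Item[] = [
  { name: "Boxy blazer", style: "oversized", color: "black", sizes: ["S", "M", "L"] },
  { name: "Slim shirt", style: "slim", color: "white", sizes: ["S", "M"] },
  { name: "Relaxed hoodie", style: "relaxed", color: "black", sizes: ["M", "L"] },
];

const shortlist = previewFilter(rack, { styles: ["oversized", "relaxed"], colors: ["black"], size: "M" });
console.log(shortlist.map((i) => i.name)); // ["Boxy blazer", "Relaxed hoodie"]
```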

2. AR supports better “flow” in the store

When an experience feels smooth and natural, time feels shorter.
AR can guide shoppers with soft nudges:

  • “Here are similar items”
  • “You might also like…”
  • “This is available in your size”

This creates a sense of direction — you spend less time wandering and more time evaluating things that are actually relevant to you.

3. AR speeds up decisions but still respects the need for real try-ons

We all know virtual try-ons can’t fully replicate how clothing feels, or how it behaves on your body.
But AR can narrow down the options dramatically.

Instead of carrying 12 items to the fitting room, AR helps you identify your favorites digitally.
Then, when you do call a staff member or go to the fitting room, you’re only trying on pieces you’re genuinely interested in.

This is where AR’s biggest time effect happens:

Fast digital preview → precise physical try-on → fewer wasted minutes

In academic tone: AR enhances time efficiency by enabling rapid filtering and previewing. After identifying preferred items digitally, shoppers can request them for live try-on — a crucial step, since virtual try-ons still cannot fully replicate texture, fit, or physical comfort.

Why AR makes shopping feel shorter

When a shopper feels:

  • less overwhelmed
  • more in control
  • guided rather than lost
  • confident in their choices

the experience feels smoother and time seems to pass more lightly. Fan’s 2025 study confirms that AR reduces cognitive load in the evaluation phase, helping consumers stay focused and calm.
https://www.mdpi.com/2071-1050/17/2/728

Less stress = smoother time = more enjoyable shopping.

What this means for the future of retail

AR won’t replace physical stores — it will reshape how we use them.

Instead of spending time searching, customers will spend time evaluating.
Instead of wandering, they will move with clarity.
Instead of feeling rushed, they will feel supported.

The goal is not to make shopping faster — the goal is to make shopping feel better.
If AR can help shoppers spend their time more intentionally, then the entire retail experience becomes more human-centered.

References

  1. Pandya, H. (2024). Effect of AR and VR Experiences on Customer Engagement in Retail Stores.
    https://www.researchgate.net/publication/382681273_Effect_of_Augmented_Reality_AR_and_Virtual_Reality_VR_experiences_on_customer_engagement_and_purchase_behavior_in_retail_stores
  2. Fan, X. (2025). The Role of AR and VR in Shaping Retail Experiences.
    https://www.mdpi.com/2071-1050/17/2/728
  3. Jeong, H. (2023). AR Virtual Try-On and Its Influence on Return Reduction and Consumer Certainty.
    https://www.sciencedirect.com/science/article/pii/S2543925123000347

ChatGPT was used as a supportive tool during the writing process.

How AR-Based Store Design Can Create Comfort and Reduce Anxiety for Introverts

A people-centered perspective

Shopping should be an enjoyable experience — a space for discovery, experimentation, and inspiration that creates positive emotions.
But for many people, especially shy or introverted customers, it’s not that simple.

During my thesis research, one idea kept coming back to me:
store environments are not emotionally neutral.

Some customers feel overstimulated by noise, crowds, bright lights, or the unexpected approach of an employee. Others feel awkward when browsing slowly, unsure how to find their size, or embarrassed when they feel like they’ve “been there too long.” Some people hesitate to ask too many questions, even though they know it’s the staff’s job to help.

These experiences are not rare — people just don’t talk about them.
Even I experience this personally: I love going to stores, but there have been moments where, after asking many questions, I felt uncomfortable for “wasting their time” when I didn’t buy anything. Because of that, I sometimes avoid asking questions at all just to prevent that situation. Another personal challenge is when clothes are folded — after unfolding and refolding 20 pieces, I get tired and feel uncomfortable messing everything up.

This is exactly why AR has such huge potential.
It’s not just a technological “bonus,” but a way to create emotional comfort, personal space, and autonomy in physical stores.

Why Introverted Consumers Feel Uncomfortable in Stores

For decades, consumer behavior studies have shown that the atmosphere of a store directly affects emotions. Overly intense environments — whether visually, socially, or through noise — tend to create stress rather than pleasure. Introverts, who naturally prefer lower levels of stimulation, are especially sensitive to these factors.

This directly affects shopping behavior: some people simply need more space, more time, and more privacy to feel comfortable.

Before AR, the only alternative was online shopping. While it removes social pressure, it also lacks the tactile, sensory, and spatial benefits of a physical store. No matter how convenient online shopping becomes, many people still need to see and try on items in person.
Even I — someone who orders a lot online — have certain categories of clothing that I cannot buy unless I see how they look on my body.

AR fills this gap in a unique and immediate way. A relaxed atmosphere, for instance, can evoke positive emotions among consumers, fostering feelings of happiness and sparking the urge to make purchases (Kurniawan, 2013).

An empirical examination of perceived retail crowding, emotions and retail outcomes (Almeida C.P., 2019) — This study identifies how spatial crowding (high density of fixtures/space) in retail settings negatively affects positive emotions and satisfaction.  https://www.researchgate.net/publication/233041648_An_empirical_examination_of_perceived_retail_crowding_emotions_and_retail_outcomes

How AR Promotes Emotional Comfort

Many brands see AR as fun, interactive, or “cool,” but its real power is often emotional — and especially helpful for introverted or shy consumers.

1. AR lets customers browse at their own pace

Approaching a staff member or trying to decode chaotic racks can be stressful.
With AR, users can simply scan a QR code and instantly access:

  • product details
  • 3D or 2D visuals
  • color options
  • style recommendations
  • availability

This removes a major anxiety trigger:
“Are they watching me? Do I look like I don’t know what I’m doing?”
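
To show how little is technically required for this, here is a minimal TypeScript sketch of a QR deep link that opens a product page and fetches the details listed above. The URL scheme and the endpoint are placeholders I made up for illustration; every store would define its own.

```typescript
// Hypothetical QR payload: the code simply encodes a URL with a product ID,
// e.g. https://store.example.com/ar?item=SKU-1042
// Scanning it opens a lightweight page that loads the product details.

interface ProductInfo {
  name: string;
  colors: string[];
  sizesInStock: string[];
  modelUrl: string;          // 3D or 2D visual shown in AR
}

function parseItemId(qrPayload: string): string | null {
  try {
    return new URL(qrPayload).searchParams.get("item");
  } catch {
    return null;             // not a valid URL, ignore the scan
  }
}

async function loadProduct(qrPayload: string): Promise<ProductInfo | null> {
  const itemId = parseItemId(qrPayload);
  if (!itemId) return null;
  // Placeholder endpoint; a real store API would live here.
  const response = await fetch(`https://store.example.com/api/products/${itemId}`);
  if (!response.ok) return null;
  return (await response.json()) as ProductInfo;
}

// Usage: loadProduct("https://store.example.com/ar?item=SKU-1042")
//   .then((p) => console.log(p?.name, p?.sizesInStock));
```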

2. AR reduces uncertainty by visualizing the product clearly

Uncertainty is one of the biggest reasons anxious customers hesitate.
AR reduces this by showing the product in its real scale or in the real environment.
It helps people save time and preview items before looking at everything physically.

3. AR provides help without forced social interaction

Introverted customers often want assistance, just not the kind they have to initiate.
AR replaces awkward questions with calm, digital guidance:

  • similar items
  • outfit suggestions
  • size and fit information
  • store navigation

This turns assistance into genuine support, not pressure.

4. AR places information inside a calmer environment

AR doesn’t need to be flashy or loud.
In fact, it can reduce sensory overload.

Instead of searching visually through rows of items, the information comes directly to the user. It turns browsing into a quieter, more focused moment.

What I’m Planning for My AR Prototype

For my thesis, I want to develop a simple AR prototype using QR codes, PNG clothing items, and Adobe Aero.
With this, users will be able to:

  • view items in AR
  • zoom in and out
  • compare pieces
  • explore freely at their own pace
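
As a small technical note on the plan above, the QR side can be prepared with a few lines of code. The sketch below, written in TypeScript and assuming the npm "qrcode" package, generates one QR image per clothing item, each pointing at a placeholder URL where the corresponding AR scene would be published.

```typescript
// Sketch: generate one QR code image per clothing item for the prototype.
// Assumes the npm "qrcode" package (npm install qrcode); the scene URLs are
// placeholders and would point at wherever the AR scenes end up being hosted.

import QRCode from "qrcode";

const items = [
  { id: "jacket-01", sceneUrl: "https://example.com/ar/jacket-01" },
  { id: "dress-02", sceneUrl: "https://example.com/ar/dress-02" },
];

async function generateQrCodes(): Promise<void> {
  for (const item of items) {
    // Writes e.g. qr-jacket-01.png next to the script.
    await QRCode.toFile(`qr-${item.id}.png`, item.sceneUrl, { width: 512 });
    console.log(`Created QR code for ${item.id}`);
  }
}

generateQrCodes().catch(console.error);
```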

Ideally, I want users to walk away thinking:

  • “I didn’t feel rushed.”
  • “I could make decisions on my own.”
  • “The store felt less stressful.”

Because ultimately,
AR is not just a technological addition — it’s an emotional support tool.

AR Design for Introverts — Key Principles

1. Keep the interface simple and predictable

Introverted users value clarity and calm interactions.

2. Reduce stimulation with AR, not increase it

Soft colors, slow animations, and minimalist UI improve comfort.

Why This Direction Matters

Modern retail loves to talk about “experience,” but often forgets that not all consumers enjoy the same kind of experience.

Michael Solomon, in Consumer Behavior, writes that reducing psychological discomfort increases satisfaction and confidence in decision-making:
https://www.pearson.com/en-us/subject-catalog/p/consumer-behavior/P200000003579/9780136747053

In short:
Emotionally comfortable customers stay longer, explore more, and feel better about their choices.

“ChatGPT was used as a support tool for translation and grammatical refinement.”

IMPULSE: Graz Museum – “Demokratie! Heast?!”

Exploring Interactive Media and Communication Design in Public Space

As part of one of our class activities, I visited the “Demokratie! Heast?!” exhibition at the Graz Museum. The exhibition focuses on how people can participate and express their opinions in today’s world. It was an interesting mix of media, sound, and interaction: not a typical museum experience, but something that invited visitors to think and take part.

I spent around two hours there with my classmates, and what I really liked was how interactive the whole exhibition was. Instead of just reading information or looking at objects, we were encouraged to touch, listen, and move around. The space felt like a combination of education and play: serious topics presented in a way that was easy to connect with.

One installation that impressed me most was the cat in the glass box. At first, I thought it was just a decoration, but then we were asked to stand in a specific spot, and different texts, effects, and items suddenly came to life through projection. It wasn’t a real cat inside, of course; meanwhile, the projected items looked really impressive and worked really well together with the “fake cat sculpture” because of the way the internal projector interacted with the glass and light. The installation illustrated Schrödinger’s famous thought experiment about whether the cat in a closed box is dead or alive until observed. It was a smart way to explain a complex idea visually, and it made me curious about how the projection was created and synchronized with the model.

From a work perspective, this installation gave me a strong impulse for my own research. It showed how an interactive project can be designed and how storytelling can transform abstract theories into something understandable and engaging. I also started thinking about how projection mapping and AR could be used in similar ways to make hidden information visible, or to give digital “life” to physical objects.

Another part that I found inspiring/interesting was how the exhibition balanced education and experience. The topic of theories, history and research can feel complicated or even boring if presented traditionally, but the designers of the exhibition managed to make it approachable. The use of visuals, sound, and space created a rhythm that kept people engaged.

I would like to learn more about the technical side of installations like the cat box, but I could not find anything similar on YouTube (explanation videos): how the projection system works inside the glass box, how motion and light are timed, and what software was used. It is really interesting how much work goes into combining artistic ideas with technical precision, something I also want to explore in my AR prototype development.

So overall, “Demokratie! Heast?!” didn’t just show information; it made people feel involved in it, encouraged them to create something together, and even to share it on social media, just like I did with the picture I uploaded here.


References:

  1. Graz Museum – “Demokratie! Heast?!” Exhibition
  2. About Schrödinger’s Cat Thought Experiment

IMPULSE: “Oldboy” (2003)

In one of our ChatGPT tips classes, we learned how to use AI to get creative ideas and ask better questions. For fun, I asked ChatGPT to recommend a movie that would fit my mood: something emotional, with a strong story and a big plot twist. It suggested “Oldboy”, a South Korean movie from 2003 directed by Park Chan-wook. I don’t usually watch Korean films; it would be more accurate to say they are not my usual style, but I decided to watch it, and it honestly surprised me in every possible way.

From the first few minutes, the film caught my attention because of how it’s designed visually. The lighting, colors, camera angles: everything feels very alive. It’s not just telling a story; it’s creating an experience. Some scenes are quiet and slow, others feel chaotic and close, and it all builds up this weird mix of curiosity and tension.

Now here comes the big part — the plot twist (Spoiler warning for anyone who hasn’t seen it!)
Everything Dae-su went through — being locked up, being released, meeting a young woman who helps him — was actually part of someone’s long, cruel revenge plan. The shocking reveal is that the woman he falls in love with is his own daughter, and the villain set it all up to make him suffer. It’s a hard twist to process, but it completely changes how you see the story.

I started thinking about it from a media design point of view. The movie doesn’t just surprise the viewer; it builds that surprise carefully through design, and it scatters little hints all throughout the film. In “Oldboy,” both the character and the audience are trapped inside a story they don’t fully understand until the end. In a way, that’s similar to how designers shape user experiences in digital spaces — users think they’re exploring freely, but in reality, everything has been planned to create a certain feeling or reaction.

Watching this film after getting it from ChatGPT also showed me how AI can actually help in creative research. I wouldn’t have picked this movie myself, but it matched exactly what I asked for: something emotional and complex. So in the end, I was really happy with my “film of choice.”

“Oldboy” gave me a strong impulse for my AI-generated film; that is why I also wanted to do something with plot twists. It is the same kind of engagement I want to create through AR design in retail.

References:

  1. IMDb – “Oldboy” (2003)
  2. Roger Ebert Review – “Oldboy”
  3. Visual Storytelling in “Oldboy” – FilmSchoolRejects

This text was refined for grammar and flow with the assistance of OpenAI’s ChatGPT. ✅

IMPULSE: Klanglicht Festival Graz

“SPHÄREN”

During the Klanglicht Festival in Graz, I visited the installation “SPHÄREN” at the Graz Museum Schlossberg. It was made up of three big glowing spheres that people could walk into. Each one had floating light texts around it with a different mood: some were poetic, some philosophical, talking about things like space, perception, and so on. The main part of it was an AR setup that made it possible to see the installation.

At first, I wasn’t really sure what people were doing there or what the idea behind it was. But after watching for a bit and seeing people having fun and enjoying their time, it started to make sense, and it turned out to be really impressive. The light, the sound, and the way people moved through the spheres created not only a fun but also a calm atmosphere.

This experience gave me even more inspiration for my master’s thesis, “Designing Immersive Retail Experiences: How Design in AR/VR Environments Shapes Consumer Engagement”. The installation showed me how AR design can change people’s mood and be fun. I clearly saw that it guided people emotionally and made them explore, even without clear instructions. Also, after figuring out the app and how it works, it turned out not to be that complicated after all.

Thinking about it from an AR design point of view, the night setting played a big role in how the installation felt. The darkness made the glowing spheres and text stand out much more, almost like how digital elements appear on a phone screen in low light. Contrast and environment are therefore really important for visibility in AR. The lighting created focus and made every interaction clearer, something I definitely want to consider.

Even though Klanglicht is an art event, it reminded me how immersive media design can change how people behave and feel in a space. It’s not just about technology; it’s about a new way of showing your product and changing people’s experiences. That’s exactly what I want to explore in my own project.

(Text refined for grammar and flow with the assistance of OpenAI’s ChatGPT.)

Reviewing master’s theses within the field of digital media and augmented reality

Intro
For this task, I conducted my research on Google Scholar using targeted academic search
queries to identify relevant master’s theses within the field of digital media and augmented
reality. I used keywords such as “Augmented Reality in Retail Master Thesis”, “AR
marketing tools in customer experience,” and “Gamification in retail environments.”
After reviewing several available works, I selected the following master’s thesis for
systematic evaluation:
Author: Anas Raza
Title: The Investigation of Augmented Reality Marketing Tool Creation and Adaptability in
Retail
Degree: Master of Design (MDes) in Digital Futures
Institution: OCAD University, Toronto, Canada
Year of Publication: 2024
I chose to write about this master’s thesis because the topic strongly relates to my own field
of study and personal research interests. I am particularly interested in exploring how
Augmented Reality (AR) can be integrated into the retail and fashion industries to enhance
customer engagement and digital experiences. This work aligns closely with the direction I
plan to pursue in my own academic research and design projects.
This thesis investigates how Augmented Reality (AR) can function as a marketing and
engagement tool for retail, specifically exploring the impact of AR games on customer
purchase journeys. The work combines literature review, prototype creation, and
evaluation through the Research through Design (RtD) methodology. The practical
artifact—a multiplayer AR scavenger-hunt-style game built in Unity and Vuforia—
represents an ambitious attempt to combine theory and creative practice.
Evaluation According to CMS Criteria
Overall Presentation Quality
The overall presentation is of high academic quality, well-organized, and visually clear.
However, in my opinion, the thesis could benefit from a richer use of illustrations and visual
design elements that show the step-by-step process or visual outcomes of the artifact. Since
this is a design-related topic, the visual presentation felt slightly lacking in comparison to
the conceptual depth of the written sections.
Assessment: Good to very good.
Degree of Innovation
The project is innovative in its interdisciplinary nature, combining AR marketing,
gamification, and flow theory. The author’s concept of transforming retail into a playful,
interactive experience is fresh and relevant. However, the prototype’s execution feels
somewhat limited in scope, and the potential for further technical or experiential
exploration is not fully realized.
Assessment: Good.
Independence
The author demonstrates a high level of independence and technical competence. The
creation of the prototype and the integration of multiple technologies show solid autonomy
and problem-solving skills.
Assessment: Very good.
Organization and Structure
The thesis follows a logical and comprehensible structure—moving from literature review
to research design, creation, and discussion. Nonetheless, some sections could be more
concise, and certain arguments would benefit from clearer transitions between theoretical
reflection and practical application.
Assessment: Good.
Communication
The language is clear and professional, and complex ideas are presented effectively.
However, some parts repeat information that could have been summarized more efficiently.
The tone remains consistent and academic, but at times the text lacks critical self-reflection
on the design decisions made.
Assessment: Good.
Scope
The scope of the research is appropriate for a master’s level project. The author successfully
covers both the theoretical context and the creation of a functional prototype. Still, the
project could have gained more depth through a stronger user testing phase or evaluation
with real participants, which would better support the claims about the tool’s effectiveness.
Assessment: Good to very good.
Accuracy and Attention to Detail
The work is detailed in describing the research and technical process, especially regarding
the use of Unity, Vuforia, and Photon networking. However, the visual documentation could
be more comprehensive—for example, showing interface design, user flow, or AR
environment screenshots. This would strengthen the connection between the written
explanation and the actual artifact.
Assessment: Good.

Literature
The literature review is well-structured and covers relevant areas such as AR marketing,
flow theory, and consumer behavior. However, it leans slightly toward summarization
rather than critical analysis. More engagement with recent design research or case studies
would provide a deeper contextual understanding.
Assessment: Good.
Engagement with the Artifact
The artifact is the strongest part of the work conceptually, but it could have been
documented and visualized in greater depth. The prototype’s description is clear, yet the
connection between theory and user interaction remains somewhat abstract. Including
more screenshots, scenario walkthroughs, or visual evidence of testing would make the
documentation more convincing.
Assessment: Good to very good.
Overall Assessment
Overall, Anas Raza’s thesis presents a thoughtful and relevant exploration of augmented
reality marketing tools in retail contexts. It effectively bridges theory and practice and
demonstrates a solid understanding of both design and technological methods. It has many strengths, such as:

  • Innovative concept integrating AR and gamification in retail
  • Clear structure and professional presentation
  • Solid technical foundation and independent execution

Weaknesses:

  • Lacks sufficient visual and illustrative documentation
  • Limited critical reflection on user feedback and testing results
  • Some sections of text could be more concise and analytical

Despite these limitations, the thesis represents a valuable and inspiring contribution to the field of design and AR marketing. It provides meaningful insights that can influence how interactive technologies are used to shape customer experiences.

Final Evaluation: Good to Very Good — A strong and relevant master’s thesis that combines technical skill, design thinking, and conceptual understanding, though it would benefit from deeper critical reflection and visual documentation.

Note: For this written evaluation, I used ChatGPT to enhance the clarity, structure, and academic flow of my text based on my own notes, analysis, and critical reflections.

How Fashion Brands Create AR Filters in the Real World

The Software, Workflow & Time Investment

Augmented Reality is becoming the new favourite tech of fashion brands for grabbing the attention of shoppers and optimizing their shopping experience. From virtual try-ons to interactive fashion campaigns, AR filters bring clothes to digital life — and they are a powerful marketing and personalization tool, as well as a vehicle for product discovery.
In this article, I’ll address how fashion AR filters are produced, the necessary tools and platforms, the kind of data needed, and the time required to build a full AR experience from scratch.

📝 Note: Some of these are also questions I would like to ask Ines Alpha, the 3D makeup and AR artist who presented at OFFF Festival Barcelona on the subject of “3D software and augmented reality to merge makeup with tech.” I’m super excited to have an interview scheduled with her, and can’t wait to get more behind-the-scenes info on her creative process and how she uses AR in the world of digital fashion/beauty.

Why Fashion Brands Use AR Filters

Fashion AR filters are being used on various platforms, from Instagram and Snapchat to branded apps, to produce:

  • Virtual Try-Ons – Allowing consumers to see how an item will look and move on their body.
  • Digital Dress-Up – Enabling consumers to mix and match articles of clothing in real time to produce curated looks.

Not only do these experiences personalize shopping, but they increase customer engagement and social shares — and even conversion rates.

How to Make an AR Filter for Fashion

1. Concept and Design

Defining the concept and user experience is the first step. Brands must decide:

  • Are they featuring a product?
  • Are they building an interactive fashion show?
  • Are they promoting a new campaign?

Early on, teams align on:

  • Format of experience: try-ons, storytelling, or brand-specific interactions.
  • User interaction: Are users going to swipe, zoom, or rotate to view the garment?
  • Brand identity: preserving branded colors, logos, textures, and tone of voice.

Designers often create wireframes or storyboards before developing their ideas.

2. 3D Model Creation

At the core of AR filters, you’ll find high-quality 3D models of clothing. These have to look and move realistically.
Steps include:

  • 3D Scanning: To digitize real pieces (especially more complex garments).
  • 3D Modeling Software: Using Blender, Maya, or 3ds Max to create highly detailed virtual clothing.
  • Rigging: Adding skeletons so that the clothes can move naturally with the body and its movements.

3. AR Filter Development

After 3D models are completed, they’re brought into an AR development platform.

  • Spark AR Studio (Instagram/Facebook)
  • Lens Studio (for Snapchat)

Tasks include:

  • Importing 3D assets
  • Using your own textures or materials
  • Introducing interaction (e.g. gestures, tap-to-change, animations; see the sketch after this list)
  • Supporting multiple devices and platforms (iOS/Android)

The filters are then checked for accurate fit, clean tracking, and smooth rotation.
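
As a simplified illustration of the “tap-to-change” interaction mentioned above, here is a platform-agnostic TypeScript sketch that cycles through garment colour variants on each tap. Real filters would implement the same idea with the platform’s own scripting API (Spark AR or Lens Studio), so treat this purely as a sketch of the logic.

```typescript
// Sketch of a tap-to-change interaction: each tap swaps the garment texture.
// Platform specifics (how taps are detected, how materials are assigned) are
// abstracted away behind small callbacks.

interface GarmentVariant {
  label: string;
  textureUrl: string;
}

class TapToChange {
  private index = 0;

  constructor(
    private readonly variants: GarmentVariant[],
    private readonly applyTexture: (url: string) => void, // platform-specific hook
  ) {
    this.applyTexture(this.variants[0].textureUrl);        // show the first variant
  }

  onTap(): void {
    this.index = (this.index + 1) % this.variants.length;  // cycle through variants
    this.applyTexture(this.variants[this.index].textureUrl);
  }
}

// Usage: wire onTap() to the platform's tap event and applyTexture() to its material API.
const filter = new TapToChange(
  [
    { label: "Red", textureUrl: "textures/jacket_red.png" },
    { label: "Blue", textureUrl: "textures/jacket_blue.png" },
  ],
  (url) => console.log(`Apply texture: ${url}`),
);
filter.onTap(); // -> Apply texture: textures/jacket_blue.png
```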

Performance Testing

Smooth performance is key. Filters must:

  • Load quickly
  • Perform well on low-spec phones
  • Not drain battery

Users drop off if filters lag or freeze. Checking that everything runs correctly is essential if users are to have a friction-free experience.
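
A very basic way to sanity-check the “loads quickly” requirement is to time the asset loading step and compare it against a budget. The sketch below is a generic TypeScript example with invented thresholds; real teams would rely on the profiling tools built into Spark AR, Lens Studio, or the target device.

```typescript
// Rough load-time budget check (thresholds are invented for illustration).

interface LoadReport {
  assetName: string;
  loadMs: number;
  withinBudget: boolean;
}

async function timedLoad(
  assetName: string,
  load: () => Promise<void>,   // e.g. fetch + decode a 3D model or texture
  budgetMs = 1500,
): Promise<LoadReport> {
  const start = Date.now();
  await load();
  const loadMs = Date.now() - start;
  return { assetName, loadMs, withinBudget: loadMs <= budgetMs };
}

// Usage with a fake loader that just waits 300 ms:
const fakeLoader = () => new Promise<void>((resolve) => setTimeout(resolve, 300));
timedLoad("jacket_model.glb", fakeLoader).then((report) =>
  console.log(`${report.assetName}: ${report.loadMs} ms, ok=${report.withinBudget}`),
);
```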

Launch and Monitoring

Once they’re ready, you can then launch your filters through social networks or mobile applications. Brands monitor:

  • Engagement: How frequently users apply and share the filter.
  • Conversion: What is the purchase rate after a user applies the filter?
  • Feedback: What do users love? What do they want more or less of?

This refining process is useful for future campaigns.
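
To make the monitoring step concrete, here is a tiny TypeScript sketch that turns raw event counts into the engagement and conversion figures described above. The event names and numbers are made up for the example.

```typescript
// Sketch: compute simple campaign metrics from raw event counts.

interface FilterEvents {
  opens: number;       // how many times the filter was opened
  shares: number;      // how many times it was shared
  purchases: number;   // purchases attributed to the filter
}

function campaignMetrics(e: FilterEvents) {
  return {
    shareRate: e.opens === 0 ? 0 : e.shares / e.opens,
    conversionRate: e.opens === 0 ? 0 : e.purchases / e.opens,
  };
}

// Example: 12,000 opens, 1,800 shares, 420 purchases.
const metrics = campaignMetrics({ opens: 12000, shares: 1800, purchases: 420 });
console.log(metrics); // { shareRate: 0.15, conversionRate: 0.035 }
```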


How Accurate Are AR Filters?

Accuracy depends on:

  • 3D model quality
  • Body tracking (face/body detection)
  • Device functionality (iOS/Android)

Today’s AR tech can be impressively life-like, but there are discrepancies between platforms.

Conclusion

Making AR filters for fashion requires a blend of technical precision, artistic freedom, and user experience design. Every step along the way, from the initial design process to launch, requires a delicate touch to ensure that the experience “feels” intuitive, engaging, and beautiful.

And while building 100 filters might sound like a Herculean effort today, advanced tools and workflows are making the process simpler and faster every day.

Interview with Ines Alpha coming soon…


This text was grammar-corrected with the assistance of ChatGPT.

Generative AI in Retail Product Discovery


Abstract

McKinsey & Company estimates that Generative AI could create between $240 billion and $390 billion in economic benefits for retailers, possibly boosting margins by as much as 1.9 percentage points [1]. This paper aims to find a middle ground between the hype surrounding Generative AI and its actual potential in retail. We’ll look at practical applications offering genuine advantages while addressing overhyped scenarios and providing tips for effective integration of AI tech. Readers will come away with valuable insights on how to use Generative AI wisely without falling into common traps.

CCS Concepts

  • Applied computing → Online shopping
  • Computing methodologies → Natural language processing
  • Information systems → Recommender systems; Search interfaces

Keywords

Generative AI, Retail, Machine Learning, Recommender Systems, Natural Language Processing


Introduction

We kick things off with an overview showing how generative technologies are transforming retail landscapes—pointing out significant opportunities for enhancing shopper experiences while optimizing processes [6]. We also discuss how much retailers are investing in these new technologies, along with their impact on global markets.

Key Applications of Generative AI in Retail

Next up is our dive into major ways GenAI can be used within retail settings:

  • Smart Assistants: Discover how interactive chatbots powered by GenAI offer customized answers about products [3] while guiding both customers and staff.
  • Curated Shopping Experience: See how retailers can use GenAI to craft personal shopping journeys that feel like having your own virtual assistant—with tailored comparisons based on specific criteria.
  • Enhanced Search Capabilities: Learn about advancements such as guided navigation improvements alongside better understanding of user queries, which leads to refined search accuracy [5] (a small sketch of this idea follows after this list).
  • Recommender Systems: Find out how GenAI fine-tunes product suggestions and creates fresh categories for easier discovery, aligned closely with marketing strategies [2].
  • Multimodal Product Content: Explore using GenAI for extracting features efficiently — from automatically generating optimized titles to creating alt text aimed at improving accessibility and SEO.
  • Marketing Optimization & User Experiences: Uncover ways that data-driven campaigns are enhanced through closer attention to consumer behavior, benefiting overall site experience optimization via AI-driven innovations.
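
To illustrate the “better understanding of user queries” point in the list above, here is a deliberately simple TypeScript sketch that maps a free-text query to structured search filters with plain pattern matching. A production system would use an LLM or a dedicated NLU model instead; the categories and rules here are invented for the example.

```typescript
// Toy query-understanding step: free text -> structured filters.
// Real systems would use an LLM or NLU model; this only shows the shape of the output.

interface SearchFilters {
  category?: string;
  color?: string;
  maxPrice?: number;
}

const KNOWN_COLORS = ["red", "blue", "black", "white", "green"];
const KNOWN_CATEGORIES = ["dress", "jacket", "shirt", "sneakers"];

function parseQuery(query: string): SearchFilters {
  const text = query.toLowerCase();
  const filters: SearchFilters = {};

  filters.color = KNOWN_COLORS.find((c) => text.includes(c));
  filters.category = KNOWN_CATEGORIES.find((c) => text.includes(c));

  const priceMatch = text.match(/under\s+(\d+)/);   // e.g. "under 50"
  if (priceMatch) filters.maxPrice = Number(priceMatch[1]);

  return filters;
}

console.log(parseQuery("red summer dress under 50 euros"));
// -> { color: "red", category: "dress", maxPrice: 50 }
```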

The Hype Trap – Overblown Use Cases

After discussing useful applications, we’ll dissect some currently hyped-up use cases where expectations might have gotten ahead of reality, including:

  • End-to-End Search Systems: Relying solely on GenAI to manage every part of search independently ignores the conventional components that are still necessary; a hybrid approach proves smarter (see the sketch after this list).
  • Comprehensive Recommendation Models: Leaning completely on generative models alone misses business goals and degrades what traditional algorithms already do well; pairing them together reaps rewards.
  • Fully Automated Customer Service: Humans still play critical roles in navigating complex issues that require empathy beyond what bots alone can provide. While generative AI automates many tasks, human oversight remains essential to maintain creativity, ethical standards, and quality control. Retailers must strike a balance between automation and human input.
  • Automated Messaging and Copywriting: Letting machines do this work risks losing coherence in brand voice unless human input stays consistently involved throughout the messaging process [4].
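
To show what the “hybrid approach” argued for above could look like in practice, the TypeScript sketch below blends a conventional relevance score (for example from keyword search or collaborative filtering) with a generative model’s semantic score using a simple weighted sum. The weights, scores, and item data are all assumptions for illustration.

```typescript
// Sketch of a hybrid ranking step: traditional score + GenAI score, weighted.
// Both scores are assumed to be normalised to the range 0..1 upstream.

interface ScoredProduct {
  name: string;
  traditionalScore: number;  // e.g. BM25 keyword relevance or a CF prediction
  genAiScore: number;        // e.g. semantic similarity from an embedding/LLM
}

function hybridRank(products: ScoredProduct[], genAiWeight = 0.4): ScoredProduct[] {
  const traditionalWeight = 1 - genAiWeight;
  return [...products].sort(
    (a, b) =>
      traditionalWeight * b.traditionalScore + genAiWeight * b.genAiScore -
      (traditionalWeight * a.traditionalScore + genAiWeight * a.genAiScore),
  );
}

const ranked = hybridRank([
  { name: "Classic denim jacket", traditionalScore: 0.9, genAiScore: 0.3 },
  { name: "Oversized puffer", traditionalScore: 0.5, genAiScore: 0.95 },
  { name: "Linen shirt", traditionalScore: 0.2, genAiScore: 0.4 },
]);
console.log(ranked.map((p) => p.name));
// -> ["Oversized puffer", "Classic denim jacket", "Linen shirt"]
```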

Challenges and Considerations

1. Data Privacy and Security

The use of generative AI requires access to vast amounts of customer data, raising concerns about data privacy and security. Retailers must ensure compliance with regulations such as GDPR and implement robust data protection measures.

2. Transparency and Trust

AI-generated content, such as product descriptions and images, can sometimes be misleading. Retailers must prioritize transparency and ensure that AI outputs align with brand values and customer expectations [8].


Future Implications

The adoption of generative AI in retail is expected to accelerate in the coming years. By 2025, 50% of fashion executives identify product discovery as the top use case for generative AI [9]. Key trends driving this adoption include:

  1. Multimodal AI: Combining text, image, and video capabilities to create richer shopping experiences [10].
  2. Advanced Personalization: Leveraging AI to create hyper-personalized experiences at scale.

References

  1. McKinsey & Company (2024)
    LLM to ROI: How to Scale Gen AI in Retail
    A comprehensive industry insight exploring the economic impact and integration strategies of Generative AI in retail.
  2. Yashar Deldjoo et al. (2024)
    A Review of Modern Recommender Systems Using Generative Models (Gen-RecSys)
    Presented at the 30th ACM SIGKDD Conference, this paper reviews how generative models are shaping the future of product recommendation systems.
    DOI: 10.1145/3637528.3671474
  3. Feriel Khennouche et al. (2023)
    Revolutionizing Customer Interactions with Generative Chatbots
    This arXiv paper discusses the challenges and insights behind deploying AI-driven chatbots for FAQ and customer support systems.
    Available at: arXiv:2311.09976
  4. Katherine Lee, A. Feder Cooper, James Grimmelmann (2024)
    Talkin’ ’Bout AI Generation: Copyright and the Generative-AI Supply Chain
    From the Symposium on Computer Science and Law, this study explores copyright and legal implications of generative AI content.
    DOI: 10.1145/3614407.3643696
  5. Zheng Liu et al. (2024)
    Information Retrieval Meets Large Language Models
    This paper, presented at the ACM Web Conference, dives into how language models are transforming traditional search and information retrieval.
    DOI: 10.1145/3589335.3641299
  6. Mari Sako (2024)
    How Generative AI Fits into Knowledge Work
    Published in Communications of the ACM, this article reflects on GenAI’s role in reshaping how professionals manage and apply knowledge.
    DOI: 10.1145/3638567
  7. Macy Takaffoli, Sijia Li, Ville Mäkelä (2024)
    Generative AI in UX Design: Industry Insights
    A study from the ACM Designing Interactive Systems Conference on how UX teams and companies use GenAI in practice.
    DOI: 10.1145/3643834.3660720
  8. (Digixplanet, 2025).
  9. (BoF Insights, 2024)
  10. (Retail TouchPoints, 2025)

Note: This text was grammar-corrected and structured with the assistance of ChatGPT.

Can We Make AR Try-Ons More Personal?

During one of our design research classes, I had the opportunity to interview someone in connection with my research topic digital try-ons and the future of virtual fashion experiences. The conversation turned out to be more insightful than I expected.

One simple, but great question they asked was:
“What if I want the oversized look?”

It made me pause. In the current world of online shopping and AR filters, we often focus on how a garment fits but not how someone wants it to fit. Sometimes the goal isn’t to see if a shirt hugs your waist, but how it hangs off your shoulders. This led to an even deeper point:

“What if I just want to see how it looks on my body—oversized or not?”

That moment made me wonder: are digital fashion tools really reflecting our bodies—or just a generic mannequin?

Later, while browsing an outlet’s website, I came across something interesting: a quiz feature that helps users determine their body type. It made me think—what if this same input could be used to generate a personalized 3D body model? Not just a one-size-fits-all avatar, but a real approximation of how clothes might fall on your unique shape.

Imagine this:

  1. You answer a few questions in a body type quiz, just like on OUTNET.
  2. The system generates a simplified 3D model based on your input.
  3. You can try on clothes virtually—with options to toggle fits like oversized, slim, or relaxed.

It sounds futuristic, but maybe not that far off. I already found something really close to what I was searching for: Zalando’s virtual fitting room allows users to adjust the fit of an item on their avatar (tight vs. loose). It’s only a pilot version, but it is coming soon…
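
Out of curiosity, I also sketched in TypeScript what the quiz-to-avatar step above could look like: a few quiz answers become rough body measurements, and a fit toggle scales the garment relative to them. Everything here (the quiz fields, the formulas, the fit factors) is a made-up approximation, purely to show the idea.

```typescript
// Hypothetical mapping: body-type quiz answers -> simplified avatar measurements,
// plus a fit toggle (slim / regular / oversized) applied to a garment.

interface QuizAnswers {
  heightCm: number;
  build: "petite" | "average" | "curvy" | "athletic";
}

interface AvatarParams {
  heightCm: number;
  shoulderCm: number;
  chestCm: number;
}

const BUILD_PRESETS: Record<QuizAnswers["build"], { shoulder: number; chest: number }> = {
  petite:   { shoulder: 0.23, chest: 0.49 },
  average:  { shoulder: 0.25, chest: 0.52 },
  curvy:    { shoulder: 0.25, chest: 0.58 },
  athletic: { shoulder: 0.27, chest: 0.55 },
};

function avatarFromQuiz(answers: QuizAnswers): AvatarParams {
  const preset = BUILD_PRESETS[answers.build];
  return {
    heightCm: answers.heightCm,
    shoulderCm: answers.heightCm * preset.shoulder,   // crude proportional estimate
    chestCm: answers.heightCm * preset.chest,
  };
}

type Fit = "slim" | "regular" | "oversized";
const FIT_EASE: Record<Fit, number> = { slim: 1.0, regular: 1.05, oversized: 1.2 };

// Garment width shown on the avatar = body measurement * chosen fit ease.
function garmentChestWidth(avatar: AvatarParams, fit: Fit): number {
  return Math.round(avatar.chestCm * FIT_EASE[fit]);
}

const avatar = avatarFromQuiz({ heightCm: 170, build: "average" });
console.log(garmentChestWidth(avatar, "oversized")); // ~106 cm on a 170 cm "average" avatar
```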

From Body Scan to Personalized Avatar

Zalando’s Size & Fit team is leveraging body measurement technology to create tailored 3D avatars. Beyond generic mannequins, these avatars reflect real proportions, helping users visualize how clothes will look and feel.

Dynamic Fit Visualization

They’re experimenting with dynamic poses (sitting, stretching, walking) to better showcase how an item behaves in real life. To communicate fit, they’ve used color-coded overlays to highlight tight or loose areas, recognizing that fit is style-dependent and context matters.

The Path to Trust & Sustainability

Zalando aims to reduce size-related returns, contributing to more sustainable shopping. By prioritizing accuracy, positivity, and inclusivity, and involving real customers in testing, their virtual fitting room has already reached tens of thousands of early users.

I’ll share some visuals from the outlet site below that sparked this idea. Let me know what you think—could this be the next step in personalized AR try-ons?

References

https://ww.fashionnetwork.com/news/Zalando-tests-a-virtual-fitting-room-in-its-25-markets,1509717.html

https://medium.com/zalando-design/bringing-irl-into-digital-with-the-zalando-virtual-fitting-room-4cacc037b943

Note: This text was created and corrected with the assistance of AI to improve clarity and structure.