
Augmented Reality · June 2021 – Dec 2021

Adidas · Adidas Lens

In stores, consumers can touch and try products before buying them, but digital commerce has yet to replicate these critical aspects of the in-store experience. Quality, craftsmanship, dimensions and fit help people determine whether a product meets their expectations and needs; they provide the tangibility that builds trust and increases confidence. After all, experience matters. With this project, I wanted to understand whether augmented reality can amplify the feeling of the product and better support the buying decision.

Virtual reality
UX/UI

Product design

NFT

eCommerce

Augmented reality

Task

Determine, via an interactive AR prototype, whether the experience improves understanding of the Adidas NMD shoe, informs users about its technical data, and facilitates customisation via spatial interactions.

ADIDAS LENS

My role in the project


Service Design

AR Product Design

3D modelling

3D texturing

3D Prototyping


My objective was to gain the knowledge necessary to accomplish all the stages of an AR prototype's development. Following the double diamond methodology, I spent the initial part of the process discovering the opportunities and constraints of both the market and the technology. I then narrowed down the use cases and determined the critical requisites for the prototype to respond to basic user needs. Once happy with the target for the execution, I started designing a few concepts and user flows to determine the best interaction patterns. During this stage, I faced various challenges in the 3D production pipeline: I designed, modelled and textured all the UI components that would serve the AR prototype. The USDZ file creation and the ARKit learning curve were steep. Still, this exercise ultimately gave me confidence in working on an AR prototype for an entire end-to-end user journey.

Market Research

As is my standard approach, I spent some time researching how a complete AR experience could leverage market opportunities and user behaviours. I did some qualitative analysis by investigating a few market reports to understand whether the bigger tech players are currently monitoring such eCommerce use cases and whether there are opportunities driven by unmet customer needs. Also, by filtering out the hype around emerging technologies, I wanted to understand whether users were ready for experiences that go beyond the traditional mobile and desktop touchpoints and stretch the interaction zone into physical space. Quite interestingly, I found several data points that supported my thesis, such as:

How might we allow consumers to better understand their shoes' performance and easily customize them, so they can buy with more confidence for both digital and physical consumption?
FUNDAMENTAL DESIGN QUESTION
S.M.A.R.T. Objectives

I set myself objectives following the SMART framework. This framework allowed me to clarify from the beginning how each segment of the experience was supposed to be accomplished, and it helped me break larger goals into smaller tasks and progress in the execution. It also helped me determine the definition of "done", something not as straightforward to identify when working against your own ambitions and expectations for a project.

Best practices are the foundation of any software application. However, it is still very early days, and those best practices are under construction for augmented reality, virtual try-on and non-fungible tokens (NFTs). The technology is very new and still maturing both in adoption and in user experience. With this exercise, I wanted to dive deep into the design disciplines that constitute the solution. The goal was to demonstrate how the AR framework could create more confidence to influence buying decisions and drive further engagement within the eCommerce space, thanks to an amplified sensory experience, and eventually to determine new interaction patterns.

ADIDAS LENS

The Challenge
10M

VR units sold globally. Back at Connect 5 in 2018, Meta's annual XR developer conference, CEO Mark Zuckerberg explained that he believed 10 million VR users on a single platform represented an important milestone for the company to build a sustainable ecosystem for VR developers. The bar was reached in March 2021, as declared by Cristiano Amon, Chief Executive of Qualcomm, chipset supplier for the Oculus Quest 2.

$10B

in digital assets linked to real-world activity across metaverse platforms; Fortnite alone has sold more than $1 billion. The metaverse is also expected to have a strong connection with the real-world economy, and eventually to become an extension of it. In other words, the metaverse and NFTs give companies and individuals the ability to participate in economic activity in the same way they do today in the real world. This means building, trading and investing in products, goods and services as we usually do in the physical environment.

47%

of consumers are already prepared to consume 3D virtual content. Consumers trust immersive experiences to close the sensory gap and provide the purchasing "proof points" they need to evaluate a product digitally. This aspect is so meaningful for them that they associate premium brand value with immersive experiences. Nearly half of consumers would pay extra for a product if they could customize or personalize it using immersive technologies. In addition, three in five consumers expect to buy more from brands that enable them to use these technologies to interact with and evaluate products.

 

 

Further consumer motivations for going immersive

52% Viewing products without visiting the store.

42% Assessing product features and capabilities.

42% Experiencing products before purchasing.

39% Increasing confidence in purchasing decisions.

29% Changing, customizing or personalizing products.

64%

of leading brands are investing in immersive technologies. As much as consumers love the convenience of digital commerce, there is no purchasing certainty when they shop online for products they haven't experienced in person. Surprisingly, leading consumer brands already understand this: the analysis reveals that a full 64% of those companies are starting to invest in immersive experiences for commerce today. However, many are not investing in scalable or connected ways across the business. They are investing in pockets, doing things like uploading 3D models to product pages, curating personalized make-up palettes and hosting virtual fashion shows to bring people closer to products in the digital world.

Source: Accenture analysis

Core Behaviours to Understand

How frequently do people buy new sneakers online?
How do people generally get informed about their next shoes? What is the process?
What are the essential factors people seek out when looking for a new pair of sneakers?
What aspects do people like or dislike when choosing a new pair of shoes?
What level of personalization would they like to see in an ideal scenario, without constraints?
To what extent should owning a new pair of shoes also entitle the owner to a digital version of the item, usable in metaverse contexts?


Virtual Try-On Competition

The offering of augmented reality products that let users interact with and customise virtual items in the physical world is still very limited today. I had the opportunity to review many minor apps; however, a few big brands are approaching the AR scenario thoughtfully. The cases currently most noteworthy are apps from Ikea, Nike and Porsche. Although quite interactive and visually compelling, the broader experience feels cumbersome and thin for all these brands. I also wanted to understand how these players were moving into digital NFT eCommerce. To my great surprise, none of them even hinted at the opportunity to buy items for use in the metaverse, where NFTs can generate significant profits as an alternative monetisation route. Another important element is that the brands currently most active in AR eCommerce are expanding their AR presence within social platforms. The lever is to offer communities augmented lenses and filters that extend monetisation opportunities by projecting virtual goods into the physical world. It is still too early to tell whether this approach will become mainstream or whether it is simply driven by the innovation push of tech giants like Meta and Snapchat. The future will tell.

Porsche AR Visualiser

Despite the low buzz around its launch, I found this digital product quite interesting. The service lets customers choose their favourite car from an extensive catalogue and visualise it in physical space. Once projected, the user can customise parts of the vehicle and virtually drive the car on an infinite floor. The experience is comprehensive, although the wheel interaction pattern doesn't help usability much.

Nike Fit

The Nike Fit app was released back in 2019 to solve the common user problem of choosing the right shoe size. Using AR and AI, the app determines the correct size, enabling the user to select the proper shoe. Nike has since discontinued the standalone app and fully embedded the AR feature in its flagship digital product. In the Nike app, the customer can virtually try on the selected shoes and see in real time how their favourite sneaker looks in the physical environment.

Ikea Places

Ikea Places takes advantage of the extensive Ikea catalogue and lets the user project a 3D representation of a piece of furniture into physical space. The application helps the user understand whether the selected model fits in the room and with the rest of the furniture, supporting a more informed decision. The interaction is straightforward and facilitated by non-diegetic UI; however, it becomes quite hard to interact with the space once several objects are selected. The purchase transaction is then completed via a traditional flat mobile experience.

Farfetch on Snapchat

An interesting case is the extension of branded AR filters onto Snapchat. Quite a few brands are exploring this option, and the interaction is pretty intuitive.
The Farfetch experience uses Snapchat's current technical capabilities, such as body segmentation and face tracking, to project digital content. What I found interesting is the level of exposure these mechanics have to younger audiences. However, it is still not possible to complete a purchase on the Snapchat platform, nor to direct the user to Farfetch's own website.

Problem Definition

Is there evidence for a consumer demand/unmet need?

Unlike projects that solve traditional business problems, this question is not straightforward to answer in the realm of disruptive innovation. Although there are some evident, minor consumer problems with shoe eCommerce, and a level of user expectation around AR-based interactions, shoe eCommerce platforms in their current state essentially work the way they should. However, while this industry is quite mature and profitable, my main focus is on the things consumers don't know they don't know. Hence, while there is an opportunity to validate innovative ideas with consumers later in the process, there is only minor evidence of an unmet need.

To research the problem space, I decided to investigate both the functional features of the shoes and the ability to personalise them; these needs are typically the main drivers when buying a new pair of sneakers. I didn't go deep into the cultural and demographic factors that usually affect younger generations' purchase decisions, and therefore considered cultural influences out of scope for this project. I also discovered a website, SolarView, that illustrated the core features of many famous, acclaimed sneakers, including the Adidas NMD. I thought it would be interesting to determine whether an AR visualisation of like-for-like content would improve the consultation experience and amplify the buying stimulus.

At this point, in a work environment, I would have run a few qualitative surveys alongside stakeholder interviews to find valuable insights and identify the opportunities and challenging areas, then used affinity maps to collate the information and create strategic routes to pursue. Given that this is a personal project of a speculative nature, I continued the exercise with assumptions based on my understanding of market trends and by empathising with the customer to accommodate their most substantial needs.

The Opportunities

As a result of my research, and after defining the boundaries of this project, I outlined some key opportunities. As a consequence of Covid, we witnessed an acceleration of digital behaviour that improved the usability, and the understanding of the benefits, of digital commerce platforms for larger audiences. Common retail pain points also continue to generate losses and low profitability at higher sales volumes, and this solution might mitigate those challenges. Last but not least, after Facebook's shift to become a metaverse company, the VR/AR industry has reached maturity, and it will become more urgent for brands to identify tangible use cases that amplify the attractiveness of their products and services in the digital space. Alongside the surge of non-fungible tokens as a new digital currency, the scenario framed as the scope of this project seems highly relevant and worth designing further.

1.

People want to get more information about the product they are looking to buy and make a more informed decision.

2.

Brands are looking to decrease the return rate to save costs and increase customer satisfaction.

3.

VR/AR has reached maturity, and brands are looking to establish themselves in the immersive ecosystem.

4.

NFTs and cryptocurrencies enable individuals to own digital goods and become legitimate owners of a selected digital item.

The Strategy

To design this solution, I also outlined a barebones strategy that would support the execution of the prototype and further sustain my thesis. The concept is based on a fully augmented reality solution with diegetic controls, giving the user a full 360-degree experience and spatial interactions. The user can access technical information about the product via compelling visualisations and affordances that create more effective storytelling. Ultimately, the customer can customise the product to make it more personal and decide whether to buy it physically or as an entirely owned digital property.

Given the limited accessibility of Microsoft HoloLens, Snapchat Spectacles and Magic Leap devices, I decided to build the prototype to run on a tablet. While this solution enabled me to understand the constraints and opportunities of creating XR interactions in physical space, I was aware it limited the opportunity to leverage more advanced interactions like gaze, voice and gesture controls. I also acknowledged that the framing of a handheld device crops the visible stage. Nonetheless, this setup exposed me to usability constraints and opportunities in XR that simply don't surface when designing for traditional flat screens. This last factor was ultimately another reason to initiate such a personal project.

Sitemap and Core Features

Once I determined the business opportunities, the nature of the challenge and the type of users to target for this project, I defined the core functions I wanted to explore in the prototype. Luckily, thanks to the SolarView review page mentioned earlier, it was easy to deconstruct the main user journeys and structure a simple sitemap that kept me oriented during execution and gave me a backlog of tasks to design.

Persona

As part of my process, I sought to stay as close as possible to the needs and expectations of younger audiences without being naive. In an ideal scenario, I would have created the persona as a result of customer research and behavioural analysis. However, given the nature of the exercise as a personal project, I assembled the archetype based both on my understanding of the problem and on the business opportunities outlined above, tracing assumptions from my best attempt to empathise with the younger target segment.

How close should the experience be? How big should the content be in relation to the user's location and positioning? How clear are elements that overlay a noisy background? These were some of the questions I had to answer during my design process. After testing and iterating a few different solutions, and within an ideal scenario (clean background, proper lighting, clean surfaces), it made sense for the experience to be projected at roughly arm's length, within the "personal" zone. With this placement, I was looking to provide a good appreciation of the sneaker's parts and textures, and good text readability. However, since environments differ greatly case by case, I left it to the user to decide the most suitable size and distance from the eye to accommodate the best experience.
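To make that adjustability concrete, here is a minimal RealityKit sketch (Swift) of how standard gestures could let the user reposition and resize the projected content. It is an illustration under my own assumptions, not the prototype's actual code, and the asset name "sneaker" is hypothetical.

```swift
import RealityKit
import UIKit

// Sketch only: let the user decide size and distance of the projected content
// with RealityKit's built-in gestures. "sneaker" is a hypothetical USDZ asset.
final class PlacementViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Anchor the experience on a horizontal surface (e.g. the lounge floor).
        let anchor = AnchorEntity(plane: .horizontal, minimumBounds: [0.3, 0.3])

        if let shoe = try? Entity.loadModel(named: "sneaker") {
            // Collision shapes are required for hit-testing and gestures.
            shoe.generateCollisionShapes(recursive: true)
            anchor.addChild(shoe)

            // Drag to reposition, pinch to rescale, two-finger rotate:
            // the user chooses the most comfortable size and distance.
            _ = arView.installGestures([.translation, .scale, .rotation], for: shoe)
        }

        arView.scene.addAnchor(anchor)
    }
}
```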

User Flow

I created a chart to determine the steps users take to achieve their goals. The term "flow" depicts movement; my objective was always to keep the user from feeling lost by providing an intuitive interaction model and helping them stay oriented throughout the experience. The experience is divided into the steps the user takes from the entry point, through the consultation and customisation funnels, to the final conversion action: completing the purchase of either physical or digital goods.

Interaction Model

AR uses the world around you to represent and accommodate digital information, so I had to consider the boundaries of each layer and what happens within them, how each part intertwines with the others, and the implications for the user's awareness of those layers. I believe this was the most complex part to resolve throughout the entire project. I planned the interaction affordances to live in the "controls" area, subdivided into two sections; this area is the closest to the eye and is consequently perceived as the most relevant. The "stage" area, instead, is where users can appreciate the result of their interaction. The goal was a mental model that would not confuse users: avoid mixing controls with content, and provide a clear and pleasant experience.
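As an illustration of this two-layer model (again, not code from the actual prototype), a RealityKit sketch might separate the layers with two anchors: the controls follow the camera so they stay in the personal zone, while the stage content remains anchored in the room. The entity names and offsets below are assumptions.

```swift
import RealityKit

// Illustrative only: split the experience into a "controls" layer that follows
// the camera and a "stage" layer anchored in the environment. The entities
// passed in (controlsPanel, stageContent) are hypothetical.
func layoutLayers(in arView: ARView, controlsPanel: Entity, stageContent: Entity) {
    // Controls layer: pinned to the camera, roughly 40 cm in front of the eye,
    // so it always sits in the user's "personal" zone.
    let controlsAnchor = AnchorEntity(.camera)
    controlsPanel.position = [0, -0.1, -0.4] // slightly below centre, 0.4 m away
    controlsAnchor.addChild(controlsPanel)

    // Stage layer: anchored to a detected horizontal surface in the room,
    // where the user appreciates the result of their interactions.
    let stageAnchor = AnchorEntity(plane: .horizontal, minimumBounds: [0.3, 0.3])
    stageAnchor.addChild(stageContent)

    arView.scene.addAnchor(controlsAnchor)
    arView.scene.addAnchor(stageAnchor)
}
```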

Creating a clear backlog with broader epics and specific feature tickets was critical for me to focus on the prototype execution. I also did this exercise to ensure all the elements necessary for a Minimum Viable Product launch were covered, so I could learn from customers for further iterations. I didn't prioritise or estimate any ticket for actual design execution and implementation; in an ideal scenario, however, I would have determined which capabilities were essential for users as a result of ad-hoc user research.

Wireframes

At this point, I was ready to describe each section of the experience in more detail. I'm still struggling to decide whether it is right to design for AR with a "screen-based" mental model, or whether we should script the experience like a movie, describing actions and situations as a continuous flow instead of subdividing every part. However, since this is a learning exercise and to avoid dragging on, I decided to stick to a "scene-based" description for simplicity. Reality Composer's interaction model probably influenced my choice: the software suggests adopting slide-like scenes to ring-fence similar content and intents, and using this framework to easily bridge the different parts of the experience. In any case, this exercise helped me frame the details of the interaction mechanics, as happens when wireframing traditional flat digital products. It allowed me to see more tangibly the relationships between content and navigational affordances and how the user would understand the augmented space. Ultimately, I wanted to favour the discovery of new content from different angles, acknowledging that freedom of movement would give the user an extra layer of engagement.

Visual Design References

I firmly believe that form follows function; however, good visuals ensure that the user is immediately pulled into the experience through pleasant emotions. As I discovered more about this industry, I spent some time understanding shoe parts and construction to plan the storytelling and interactions carefully. As the subject can get fairly complex and technical, I also wanted visual inspiration for conveying technical data in a way that would feel harmonious with the shoe and easy to interpret. For the menus, my objective was a clean and sleek look and feel, while making sure the notion of depth wouldn't confuse the user while browsing. For the icons, I wanted to increase affordance by taking advantage of the extra dimensionality. Ultimately, these key elements had to reflect the augmented nature of the experience, and I was keen to structure the visual design language around the Z coordinate to bring the experience to the next level.

Adidas Styleguide

I wanted this experience to feel like Adidas in every aspect, so I investigated the brand's look and feel and followed its minimalistic, pragmatic approach as best I could. A simple colour palette paired with a charming typographic scheme guarantees that Adidas communications are effective and identifiable, and that the company's goods and services always stand out. The ITC Avant Garde Gothic Std typeface is a sublime example of personality and readability that demonstrates information hierarchies with ease while embodying the dynamism of the brand's mission. Adidas iconography is also straightforward, has a pleasant personality, and ensures users navigate smoothly through a very dense catalogue of items and services.

Previz Concepts

Before jumping into the prototype execution, I investigated the physical elements in the scene, particularly the nature of the materials and the positioning and illumination of the components. I discovered pretty early in the process that it was easy to control readability in such an artificial context by choosing bespoke lighting conditions and clean backgrounds. While this setup allowed me to understand weights and spatial relationships, it also exposed me to fundamental questions for AR design, such as: what are the right size and proportions of elements to facilitate smooth interactions and clear affordances? Is the brand colour palette the right option to keep features in the scene clear and readable against noisy backgrounds? How close should the content be to the main navigation so as not to interfere with the interaction flow?

Process Pipeline 

I had to flex my design muscles to develop a reliable pipeline and create a working prototype. Designing for AR isn't as straightforward as designing for flat screens; you are forced to wear different hats throughout the pipeline. One day you are a product designer, another a 3D modeller, another a texturing artist, and another a game artist or even a developer. It is easy to lose focus and derail. Designing for emerging technologies entails competencies in so many design disciplines that it is very hard not to blur the lines. No matter what, I stayed laser-focused on my goal, and I had to learn the hard way that the tools out there are NOT developed for designing AR digital products. All of which made things even more fascinating to explore.

Working in 3D often means you don't have (or can't download) all the 3D assets required for the project, so you have to build them yourself. For this particular exercise, I didn't model the shoes: I downloaded them from CGTrader, modelled by the talented artist Vissanu. As I wanted to adhere as closely as possible to the Adidas visual language, I decided to create, model and texture bespoke 3D iconography to adequately represent the visual style and convey the feeling that the brand wholly owns this experience.

In the first part of the design process, I relied heavily on Sketch to produce mood boards, wireframes and user flows. Sketch works well in this space, primarily for getting the key concepts down, blocking out the areas to explore and setting the foundation for the mechanics and assets that would constitute the experience.

At this stage, once I had all the areas in focus, it was time to jump into my favourite 3D software. I have always used Cinema 4D as my preferred tool, though Blender is becoming the major choice nowadays because of its direct compatibility with AR/VR visualisations. Working with full-fledged 3D software allows you to design and feel in 3D, something that current traditional XD tools can't guarantee. If you want to become a good XR designer, it is essential to master such software, as it provides efficient instruments and workflows that make you familiar with the fundamentals of 3D design: selecting an object, positioning, rotating and scaling it, and understanding how texturing and lighting work in a spatial environment. These notions play a critical role in designing spatial interactions later, in the actual digital product development process.
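Those same fundamentals translate almost one to one into code once the assets reach the AR runtime. A tiny RealityKit sketch, with a hypothetical "shoe" entity, of the position/rotation/scale trio:

```swift
import RealityKit
import simd

// Sketch: the position / rotation / scale fundamentals learned in a 3D tool,
// applied in code to a hypothetical "shoe" entity at runtime.
func placeShoe(_ shoe: Entity) {
    shoe.transform = Transform(
        scale: SIMD3<Float>(repeating: 0.5),                   // half size
        rotation: simd_quatf(angle: .pi / 4, axis: [0, 1, 0]), // 45° around the Y axis
        translation: [0, 0.1, -0.5]                            // 10 cm up, 0.5 m ahead
    )
}
```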

Apple's plans for augmented reality are explicit in terms of file formats: USDZ is meant to guarantee the tech giant compatibility, easy authoring and real-time content packaging, allowing Apple to deliver AR solutions across devices and platforms. Even if this sounds very promising, the reality is that the format is still only sparingly adopted by other big names like Adobe and Maxon. Therefore, I relied heavily on Vectary, a powerful online 3D authoring tool, to polish my texturing and deliver a final look to the 3D assets. By importing .gltf files from Cinema 4D, Vectary ultimately allows exporting 3D assets as USDZ without many conversion headaches; I could then import and use them in Reality Composer immediately. When 3D assets didn't require further enrichment, I also used a basic tool, also from Apple, called Reality Converter, which translates most 3D file formats into USDZ and, importantly, can export rigging and skeletal animation data.
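As a side benefit of the USDZ route, an exported asset can be sanity-checked on device with Apple's AR Quick Look before any custom RealityKit work. A minimal Swift sketch; the file name "NMD.usdz" is hypothetical:

```swift
import UIKit
import QuickLook

// Sketch: previewing an exported USDZ with AR Quick Look, no custom rendering
// code required. "NMD.usdz" is a hypothetical file bundled with the app.
final class ShoePreviewController: UIViewController, QLPreviewControllerDataSource {

    func showPreview() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // Quick Look handles plane detection, placement and lighting for USDZ.
        let url = Bundle.main.url(forResource: "NMD", withExtension: "usdz")!
        return url as NSURL
    }
}
```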

One of the critical features of this experience was the ability for the user to customise the shoes, so I spent some time diversifying the original shoe skins. This process, in particular, extended my understanding of UV mapping and texturing, and led me to discover that PBR texturing has become the standard for real-time rendering. Out of this process, I created various versions of the same shoe model, which allowed plenty of variation for each shoe part, plus full-fledged style presets for lazier users.
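For context, the prototype itself switches between pre-textured variants inside Reality Composer, but a similar colourway swap could be expressed in RealityKit by replacing the material on a shoe-part entity. A hedged sketch (SimpleMaterial stands in for a full PBR texture set; the function and entity are hypothetical):

```swift
import RealityKit
import UIKit

// Sketch: swap the colourway of a hypothetical shoe-part entity by replacing
// its material. SimpleMaterial is a simplified stand-in for a full PBR set
// (base colour, roughness, metallic, normal maps, etc.).
func applyColourway(to part: ModelEntity, colour: UIColor) {
    let material = SimpleMaterial(color: colour, roughness: .float(0.4), isMetallic: false)
    part.model?.materials = [material]
}
```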

Once I had all the 3D assets modelled, textured and exported, I imported everything into Reality Composer and, to a degree, created my own 3D design system. This toolkit allowed me to concentrate on designing the flow rather than jumping in and out of applications to build assets. I soon discovered one crucial notion: from the moment you kick off your 3D asset production, keep an eye on the poly count and texture file sizes. Real-time rendering is very demanding and requires lots of processing power to hold a decent frame rate, or even to avoid crashing. I learned this the hard way, and it is something I will keep in mind for my next projects.

To demonstrate my concept, I also recreated the solution in a completely virtual environment. I planned the Adidas Lens experience to happen at home, from the comfort of your lounge room, where you can interact with the product without distractions. I rebuilt the environment and the digital product with ShapesXR, a VR tool that allows quick prototyping in 360-degree environments. In this situation, I had a much clearer understanding of the different feel of diegetic vs non-diegetic controls. The interaction model I created for this solution worked nicely; however, allowing the customer to interact with gestures or device pointers gives an additional level of comfort that clearly can't be delivered via handheld mobile devices. The immersion also provides a sense of presence that those devices can't achieve. In a nutshell, even if handheld solutions let you see the augmented digital layer, they also become a barrier to the content itself, clearly limiting the interactions and therefore the full range of opportunities that augmented reality solutions can unleash.

I wanted virtual try-on to be a key feature, but I soon realised that Reality Composer doesn't support body segmentation or external cross-linking. It still made sense to create a mock bridge to Snapchat and take advantage of Snapchat AR lenses to let customers see the result of the customisation directly on their feet. For this, I used Lens Studio to design the experience and output the solution as a Snapchat lens. Lens Studio is a potent tool and offers a few templates to start from; in particular, I took advantage of a shoe try-on preset. I imported the models I had previously customised in Cinema 4D, adjusted a few parameters in Lens Studio, and published to the Snapchat creators platform.

Solution Analysis in VR

The original concept for this project was meant to target head-mounted displays such as HoloLens or Magic Leap. However, since I couldn't access these devices, I wasn't able to test AR usability appropriately: spatial interaction through a tablet undermines the quality of the experience because of the lack of gesture or gaze control, and I believe these types of interaction can take the experience to a whole new level. For this reason, as a learning exercise, I wanted to test the solution by recreating an HMD experience in VR, to understand further how augmented reality solutions could unlock opportunities and mitigate spatial interaction pain points.

NFT Purchase
Reviews & Suggestions
Style Presets
Customisation
Performance
General Details
Journey Outputs

At this stage, I was able to produce all the specific journeys and create a fully interactive end-to-end product. Each scene is linked via the main menu, and the user can browse each section by tapping the 3D icons on the device screen. By enabling the Reality Composer AR feature, I can then project the experience into physical space and interact with the 3D elements as if they lived in the real environment. The wow moment is secured!

I decided to prototype this experience with Reality Composer. The main reason is that, despite all the hassle of creating 3D assets, Apple's tool is probably the most straightforward way to prototype AR experiences. On desktop, the software is embedded within Xcode, and you can integrate the immersive experience into your next application with a few lines of code. By leveraging ARKit and LiDAR, the anchoring system is powerful and stable, and with a few simple taps it enables the projection of 3D digital content into physical space. Not only that: I was delighted to discover that designing 3D interactions is effortless. Thanks to a widget framework and behaviour presets, typical interactions are very easy to set up. For example, in the shoe customisation section, I imported all the different parts in different colours and assigned each component a show/hide action triggered by tapping the corresponding colour on the switcher. The whole process is a bit time-consuming, but once the logic is clear, it is easy to create deep functionality without writing a single line of code: a dream for any XR designer. The software itself is still far from mature, though; there are quite a few flaws in the selection and grouping capabilities. However, I believe Apple is taking the right steps to turn product designers into XR experience creators.
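For comparison, the same tap-to-show/hide behaviour could be written by hand in RealityKit instead of being wired up in Reality Composer: hit-test the tap and toggle entity visibility. The entity names ("swatch_red", "sole_red", "sole_blue") are hypothetical, and the snippet is only a sketch of the pattern, not the prototype's code.

```swift
import RealityKit
import UIKit

// Sketch: the show/hide behaviour, written by hand instead of wired up in
// Reality Composer. Entity names are hypothetical; collision shapes must
// exist on the tappable swatches for entity(at:) to find them.
final class CustomiserViewController: UIViewController {
    @IBOutlet var arView: ARView! // connected in the storyboard (assumed setup)

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ sender: UITapGestureRecognizer) {
        let point = sender.location(in: arView)
        // Topmost entity under the tap, if any.
        guard let tapped = arView.entity(at: point) else { return }

        if tapped.name == "swatch_red" {
            // Show the red sole variant and hide the blue one.
            arView.scene.findEntity(named: "sole_red")?.isEnabled = true
            arView.scene.findEntity(named: "sole_blue")?.isEnabled = false
        }
    }
}
```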

User Testing

In an ideal scenario, the last step would be testing the solution with real users. As an experience designer, it is critical to detach yourself from your own understanding of the product and give people the opportunity to express their judgement, to help evolve the solution in the right direction and make sure it solves real problems. For this reason, I would validate the experience by conducting a qualitative survey with actual users to gain tangible feedback. The outcome would help me iterate on the product and, in particular, determine which capabilities are essential, which are secondary, and which are entirely unnecessary for achieving the customer's goals. The next step would be to get in touch with business stakeholders to determine implementation issues and, from a commercial perspective, any additional levers that could be introduced to maximise conversions.

Below are some of the questions I would pose during testing to determine the next steps:

Adidas Lens Questionnaire

1. Was the experience enjoyable?
2. How well did you understand the products and their features?
3. How easy was it to navigate the experience?
4. Were you able to find the information you were looking for?
5. Which product features do you consider the most valuable?

6. Are there any missing features in the prototype you would like to suggest?
7. How would you briefly describe the product?

8. How well does the product fit your needs when buying new sneakers?
9. Did the experience generate any side effects, such as nausea, vomiting, etc.?
10. How hard did you have to work, mentally and physically, to digest the information provided?

11. How much would this type of experience influence your buying decision?
12. Would you be interested in trying the same experience via an HMD headset?
13. Would you recommend the experience to friends?

Final Reflections

It was great to work on this exercise; it was necessary for understanding AR design pain points, opportunities and process pipelines. Working in the XR industry means mastering different disciplines: it isn't just about product design and heuristics anymore. Creating immersive solutions entails learning and mastering 3D tools and, as an individual, being a multidisciplinary jack of all trades, sometimes a writer, a game designer, a visual artist, and eventually a programmer.

What Went Well

  • Design thinking and double diamond methodologies rightly apply to AR products.

  • Working with Reality Composer and its interaction widgets lets you focus on designing the experience.

  • Spatial interaction usability patterns still need to be fully established.

  • NFTs are going to revolutionize eCommerce behaviours.

  • It is valuable to test the AR solution in a VR mock-up environment.

  • At the end of the exercise, you feel there's much more to explore.

  • The whole process was gratifying and fascinating to explore.

Even Better If

  • User research with real users could have helped prioritize features.

  • The solution could be tested with HMD devices. 

  • The USDZ file format were more accessible and standardized across 3D tools.

  • A more robust and flexible AR prototyping tool would exist.

  • I had had the opportunity to send the prototype as a stand-alone app to gain feedback quickly.

  • I had the opportunity to work with a developer.


Designing an XR experience is simply fascinating; there is so much potential and so many opportunities to uncover. Although tools, pipelines and design patterns are far from fully validated, the potential of this type of experience is, I reckon, on a different scale from traditional mobile and desktop digital products, and by applying design thinking methodologies the opportunities become even broader and more granular. My journey certainly included moments of frustration and uncertainty, and rest assured, the learning curve was steep, especially when transitioning to high-end real-time software like Unity and Unreal. However, by acknowledging Reality Composer's limitations, I was surprised to find that, with proper planning and a focused approach, you can achieve exciting results to demonstrate an AR idea.

Once the prototype was ready, the idea would be to collaborate with a dedicated team of creative technologists and engineers to iterate on the experience. Below, I broke the experience down so each section can be appreciated, further demonstrating my XR design journey. These outputs are raw captures from the iPad; no CGI or compositing was applied. A Bluetooth mouse cursor was enabled to demonstrate the interaction flow, although the experience works simply by tapping the three-dimensional icons and CTAs immersed in the physical environment on screen.
