Key Challenges of Implementing Virtual Try-on Apps

Wojciech Dudzik

Updated Dec 12, 2023 • 15 min read

Virtual try-on apps are evolving fast, with companies like IKEA getting on board, alongside luxury retailers such as Burberry and beauty brands like Sephora.

Powered by augmented reality (AR) and machine learning technology, these experiences deliver measurable results: Deloitte research shows that interacting with AR-based virtual try-on experiences leads to a 94% higher conversion rate.

No longer simply a nice-to-have feature, virtual try-on applications are fast becoming an essential part of business, with the pandemic having accelerated pre-existing trends.

Bringing many advantages, from increasing online sales, traffic, and customer satisfaction to reducing returns, virtual try-on software has been dubbed "the new frontier of online shopping". However, it's also important to recognize the challenges that implementing virtual try-on apps poses, from technology complexities to balancing the fun factor against producing an authentic experience.

What are virtual try-on apps?

Starting with the basics, a virtual try-on app lets you try on clothes, accessories, shoes, cosmetics, or jewellery with the help of augmented reality (AR), which overlays virtual objects on your real-world surroundings, unlike virtual reality, which replaces them entirely.

This often comes in the form of a mobile app, but there are other types too, including desktop virtual try-on plugins and smart or virtual mirror apps in shops equipped with a camera.

Why are virtual try-on apps the new standard of customer experience?

These applications are a way for consumers to enjoy a real-life shopping experience from the comfort of their homes – more relevant than ever since Covid – and brands are increasingly turning to them to showcase their products and create an engaging experience. But where did it all begin?

Virtual try-on as a golden child of the digital acceleration in retail

The phenomenon of virtual try-on apps powered by augmented reality (AR) and machine learning models has been experimented with for a decade or so. For example, brands like Converse dabbled with immersive virtual try-on technology as far back as 2012.

Then in 2016, AR moved more mainstream, when Niantic launched Pokémon GO, the AR-enhanced game for iOS and Android.

Virtual try-on is just another application available via a mobile phone. So, why didn't we see more of them with the rise of smartphones? For a start, hardware capabilities. Running a proper pipeline – estimating position, finding key points, displaying the given object, and calculating lighting – is not an easy task.

Only recently, with the advancement of phone chips and dedicated hardware such as graphics processing units (GPUs) and neural processing units (NPUs), has that become a possibility.

Furthermore, the software needed to mature, because deep learning (regularly used in virtual try-on apps) relies heavily on proper implementation and support. Solving a key point detection problem (for example, locating a wrist in an image) on powerful GPUs and servers isn't the same as running a model on a phone. A dedicated version of the library must be written that is aware of, and capable of using, those mobile GPUs and NPUs.
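As a rough illustration of what that means in practice, here's a minimal Kotlin sketch of loading an on-device key point model with TensorFlow Lite, preferring the GPU delegate where the hardware supports it (the model buffer and helper name are placeholders):

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.CompatibilityList
import org.tensorflow.lite.gpu.GpuDelegate
import java.nio.MappedByteBuffer

// Builds a TensorFlow Lite interpreter for a key point detection model,
// offloading inference to the phone's GPU when the device supports it.
fun buildInterpreter(model: MappedByteBuffer): Interpreter {
    val options = Interpreter.Options()
    if (CompatibilityList().isDelegateSupportedOnThisDevice) {
        options.addDelegate(GpuDelegate())   // hardware-accelerated path
    } else {
        options.setNumThreads(4)             // multi-threaded CPU fallback
    }
    return Interpreter(model, options)
}
```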

Over time, with both software and hardware increasingly available, virtual try-on apps have become more popular, and the sector is thriving. By 2025, it’s estimated nearly 75% of the global population and almost all people who use social apps will be regular AR users.

Why do virtual try-ons pay off?

Virtual try-ons have the same pros as online shopping, plus more. By using virtual try-on apps, customers get a better idea of how they look wearing a given item.

That either drives them to other items or lowers return rates, because customers can immediately decide whether the look is appealing.

As more computational power is available, it’s easier to deliver virtual try-on apps to mobile phones, greatly increasing potential customer range. The privacy of data processed on a local device is another plus point.

The pandemic also plays an important role, with people less likely to visit a shop. That reluctance is coupled with a pre-pandemic trend whereby the online shopping market was already growing.

Technology challenges in building virtual try-on apps

Despite the advantages of virtual try-ons and their growing prevalence, there are some challenges. From a technological point of view, virtual try-on apps use augmented reality to display an object, like a shoe, on an image acquired from a phone camera.

It’s important that the end result is photorealistic. There are additional technical aspects that apply to every type of virtual try-on app, regardless of how photorealistic the final solution is.

Optimized application performance

For example, you must constrain your solution to achieve proper performance. In the case of video, that means a stable 25 frames per second. It’s a bit easier with image processing, because you have more time, but the user shouldn’t wait longer than a second or two.
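To put that budget in concrete terms, a steady 25 frames per second leaves roughly 40 ms per frame for detection and rendering. A minimal sketch of a frame-budget guard, assuming a hypothetical process() callback that runs key point detection and updates the overlay:

```kotlin
// Hypothetical frame-budget guard: at 25 fps each frame has roughly a 40 ms budget,
// so when processing falls behind it is better to drop a camera frame than to queue work.
class FrameBudgetGuard(private val budgetMs: Long = 40L) {
    private var busy = false

    fun onCameraFrame(frame: ByteArray, process: (ByteArray) -> Unit) {
        if (busy) return                    // drop this frame instead of building a backlog
        busy = true
        val start = System.nanoTime()
        process(frame)                      // key point detection + overlay update (placeholder)
        val elapsedMs = (System.nanoTime() - start) / 1_000_000
        if (elapsedMs > budgetMs) {
            // over budget: consider a smaller model, lower camera resolution, or fewer effects
        }
        busy = false
    }
}
```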

Regarding performance, it’s hard to estimate how the app will work on different devices. The iOS ecosystem is much easier, but with the variety of Android devices, things get tricky. The best solution is to test the final application on more than just high-end phones.

It’s tempting to use cloud computing, given the pretty much infinite power from the point of view of a virtual try-on app. However, there are several major drawbacks to that approach. First of all, the application is heavily dependent on a fast internet connection. With temporary slowdowns, the application may feel unresponsive, discouraging users from using it.

Data privacy

Another problem is data privacy, an increasingly prominent issue. These days, app users have much more data awareness. On top of that, sharing photos or videos may make people think twice, especially in cases such as trying on tight clothing.

Also, cloud computing introduces potential attack vectors, requiring extra precautions and additional expertise to verify security. It's possible to mitigate these issues by running computations on the device – a simpler solution with less room for error.

3D modeling

The last thing to mention is acquiring the 3D models. You need those models, but the end result depends on their quality. You can approach that challenge in several ways. For example, you may choose to buy a dedicated 3D scanner with proper software that will provide you with a model of the highest possible quality. Another option is to hire a professional 3D artist to create the models.

The most popular option is to use 3D photogrammetry and structure-from-motion. That solution requires taking 10 photos or more (the more the better) using a high-end smartphone and ensuring proper lighting. The hardware cost is smaller compared to a dedicated 3D scanner, while the end model achieves the desired quality.

These 3D models must be prepared ahead of time, because algorithms may require 20-40 minutes to convert photos into 3D models. One more thing to note: the models must be kept in the cloud and downloaded onto the end device when needed, because they can weigh tens of megabytes.

It's worth preparing the models in two quality levels – low and high. The low-quality model is much lighter, so it can be downloaded quickly and presented to the user, while the higher-quality model is downloaded in the background and swapped in when ready.
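A minimal sketch of that two-tier loading, with placeholder URLs and a hypothetical render() callback; in a real app the downloads would go through your networking stack and run off the main thread:

```kotlin
import java.net.URL

// Two-tier model loading: show a lightweight scan immediately, then swap in the
// detailed one once it has finished downloading in the background.
fun loadShoeModel(render: (ByteArray) -> Unit) {
    val lowRes = URL("https://example.com/models/shoe_low.glb").readBytes()
    render(lowRes)                           // user sees the try-on within a second or two

    Thread {
        val highRes = URL("https://example.com/models/shoe_high.glb").readBytes()
        render(highRes)                      // seamlessly replace with the detailed model
    }.start()
}
```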

Off-the-shelf virtual try-on applications

As mentioned above, proper hardware is a must, but there are additional challenges to building a realistic virtual try-on.

These fine-grained problems are highlighted in the graphic below.

[Graphic: fine-grained challenges of building virtual try-on applications]

Let’s start with off-the-shelf try-ons. These pose a number of issues regarding object placement, including:

  • Adjusting orientation
  • Scaling
  • Stabilizing object position

With out-of-the-box augmented reality software, you can place a virtual object in the scene. However, that's far from providing a good virtual try-on experience. Most ready-to-go solutions provide some form of object scaling and orientation adjustment, but simply scaling up or down may not work as expected.
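For context, the baseline placement such frameworks provide looks roughly like this sketch using ARCore's hit test on Android (the renderer that actually draws the 3D model at the anchor is assumed):

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame

// Projects a screen tap into the scene and attaches the virtual item to an anchor
// at the first surface the hit test finds.
fun placeItem(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    val hit = frame.hitTest(tapX, tapY).firstOrNull() ?: return null
    return hit.createAnchor()   // the rendering engine draws the 3D model at this anchor's pose
}
```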

Let's take shoes as an example. With a single-size 3D model, simple scaling up or down will not work, because shoelace holes shouldn't change size. If the scan (3D model) shows a size 38 shoe (European scale), a size 48 might just look ridiculous, as it would be scaled linearly in all dimensions (imagine a pair of clown shoes): the shoelaces would be disproportionately long and thick, and the shoes unnaturally tall.

Moreover, good stabilization of the position is needed. Ready-made solutions often flicker, or the model floats around the detected key points. That problem originates from the way these key points are found. Most of the time, they come from either computer vision algorithms or a deep learning model.

That means they can move between video frames because of changes in lighting, camera focus, or other environmental shifts. So, it's hard to distinguish between proper movement (of a foot, for example) and random flickering of the model.

That problem is clearly visible in a lot of ready-made solutions. While it's solvable to some extent with the help of algorithms from control theory, you still need to balance stabilization quality against reaction to genuine movement, so the rendered object doesn't lag behind.
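As a simple stand-in for those control-theory approaches, an exponential low-pass filter over the detected key points already illustrates the trade-off: a larger smoothing factor reacts faster to real movement, while a smaller one suppresses more flicker but lags. A minimal sketch:

```kotlin
// Smooths detected key point coordinates across frames with an exponential low-pass
// filter. alpha close to 1.0 follows the raw detections; alpha close to 0.0 is very
// stable but slow to react to genuine movement.
class KeypointSmoother(private val alpha: Float = 0.4f) {
    private var smoothed: FloatArray? = null

    fun update(raw: FloatArray): FloatArray {
        val prev = smoothed
        if (prev == null) {
            smoothed = raw.copyOf()
            return raw
        }
        for (i in raw.indices) {
            prev[i] = alpha * raw[i] + (1 - alpha) * prev[i]
        }
        return prev
    }
}
```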

Custom virtual try-on application development

Custom virtual try-ons also come with issues, often more complex in nature, as the applications tend to be more precise and deliver more detailed images. That brings additional technology challenges.

Simulation of natural lighting

One of them is compensating for ambient light saturation and object brightness. Without proper lighting, the object looks synthetic. For the customer, that could ruin the immersive experience provided by the virtual try-on app.

To remedy that, gather sensor data about the lighting conditions. That estimation can be extracted from ARKit (iOS) or ARCore (Android). It gives you basic information about the ambient light: its direction, intensity, whether it's changing, and so on. You should take that information into account and apply it properly to your object.

That will also allow for proper shadowing and shading, such as which part of the object should be light or dark. Alongside that basic lighting work, it’s also important to handle reflections and specular highlights.

These are more complicated cases where part of an item is shiny, and the user expects the virtual try-on application to render it properly. So, handling reflections is needed, along with adjustments for the color of the light and of nearby objects. Specular highlights should also change relative to the viewer's position in the scene.
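As an illustration on the Android side, a sketch of reading ARCore's environmental HDR light estimate so the rendered item can be lit and shadowed consistently with the room (the applyLighting() hook is a placeholder for whatever your rendering engine expects):

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session

// Switches the session to environmental HDR light estimation.
fun configureLighting(session: Session) {
    val config = session.config
    config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    session.configure(config)
}

// Reads the dominant light direction and intensity for the current frame and
// forwards them to the renderer.
fun onFrame(frame: Frame, applyLighting: (direction: FloatArray, intensity: FloatArray) -> Unit) {
    val estimate = frame.lightEstimate
    applyLighting(
        estimate.environmentalHdrMainLightDirection,
        estimate.environmentalHdrMainLightIntensity
    )
}
```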

Scene understanding and occlusion handling

As well as simulation of natural lighting, there's also general scene understanding to think about. Connected to lighting, depth information is crucial here. For example, knowing what's in the foreground and background is really important, not only for finding the proper location of an object, but also for handling occlusion.

Let's take the example of trying on a high heel shoe. To look natural, the pant leg should cover the shoe to some extent. The best solution? Obtaining depth information from a dedicated sensor, such as the iPhone's LiDAR scanner.

However, while it provides great information, it's not available on all devices. So most of the time, you need to estimate depth from a 2D image. That introduces inaccuracies and consumes precious processing power, but it comes with a much lower hardware requirement, which means you can reach more customers.
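Where supported on Android, ARCore's Depth API can provide that information directly; a sketch of enabling it and acquiring a per-frame depth image for occlusion tests (returning null while depth isn't yet available):

```kotlin
import android.media.Image
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Turns on depth only when the device supports it; other devices keep working
// without occlusion rather than failing outright.
fun enableDepthIfSupported(session: Session) {
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        val config = session.config
        config.depthMode = Config.DepthMode.AUTOMATIC
        session.configure(config)
    }
}

// Per-pixel depth in millimeters for the current camera frame, used by the renderer
// to hide parts of the virtual object that sit behind real-world geometry.
fun acquireDepth(frame: Frame): Image? = try {
    frame.acquireDepthImage16Bits()
} catch (e: NotYetAvailableException) {
    null
}
```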

Business perspective

We've covered the technical point of view, but even the greatest technology eventually fails without a proper business model. Let's take a look at the challenges of virtual try-on apps from the business angle.

Fun versus an authentic experience

An important problem to mention is balancing the fun factor with offering a realistic experience. Virtual try-ons are making their way into news headlines and gathering interest, so it’s tempting to jump on the hype train. Releasing fast and early versions is great for marketing, but keeping clients interested and retaining them is a real challenge.

At first glance, virtual try-on applications are all about fun and technological excitement. However, creating a smooth and seamless experience is essential for the application to become a practical sales tool that customers actually use. As discussed above, make the app visually pleasing, give a good estimate and visualization of how an item looks, and streamline the order process.

The wider ecosystem

Buying a product in a few clicks is definitely the future of accelerating purchasing decisions, but to achieve it, the application must be part of a larger sales ecosystem. If there's already an existing loyalty app, the virtual try-on should be part of it and make use of the customer data that the loyalty app gathers. Receiving worthwhile recommendations, similar items to try on, or matching accessories may boost sales.

Also consider proposing sizes based on a customer's purchase history, providing measurements, or allowing people to scan themselves from a single photo. A personalized experience motivates consumers to click the buy button.

Building an ecosystem takes time, vision, and resources. It’s possible to put different things in place simultaneously, but generally, the development of virtual try-on applications is a continuous process, with new components added and kept up to date.

Retailers who already invest in virtual try-on apps are a step ahead of their competitors, especially when they combine them with desktop virtual try-on plugins.

Incorporating new technologies into business and diving into uncharted territory is tricky. A convincing and authentic virtual try-on that integrates into a sales ecosystem requires experts in the field, not just regarding AR and machine learning, but also in terms of UX, mobile app development, and web design.

A poorly orchestrated try-on could actually discourage a consumer instead of helping them make a purchase decision. What I've seen a lot of recently is people trying to utilize these new channels, but not really putting in the know-how and the expertise and the experience design side of things.

Jinder Kang

Innovation Consultancy Lead at Netguru

Customers’ trust

The last thing to overcome is the psychological barrier on the side of the customer, and convincing them to use and trust virtual try-on applications. That requires being extra clear about what’s happening with user data such as images and videos, where they’re processed, how they’re used, and so on.

Long, 'never-ending scroll' terms and conditions agreements are not the way to go. Explain it simply, for example with a slideshow or a 30-second video that shows where the data goes and what happens as you use the application. Gaining trust is hard and it's a two-way street, but once you earn it, you may win customers and their loyalty.

The future of virtual try-on apps

Ultimately, experts must work together to provide a valuable and authentic virtual try-on app – something that will always require strong programming skills. As you've seen, the processing pipeline on mobile devices is quite complicated, with problems such as lighting, occlusion, object stabilization, and scaling to solve, all while maintaining performance.

The technical part alone is challenging, before you even think about the business aspect.

Regardless of the challenges, when well applied, virtual try-on apps can bring exceptional value to retailers by reducing returns, increasing sales, and providing additional ways to communicate with customers and personalize their experience. Even better if you can combine a virtual try-on with a pre-existing mobile app as added value for your users.
