Close your eyes and imagine yourself as a shopkeeper of the future, walking through your shop with special glasses. What would you like to see? What does your dream image look like? And what can be achieved with today’s technology? This is the challenge we tackled during this Broad Horizon Labs project. Within four days, we presented a working prototype to be proud of: the Virtual Shop Manager Assistant.
Mixed reality in retail: now is the time
During this Labs project, we looked into the world of retail and the role mixed reality will play in the near future. Although the disappointment of Google Glass made us wait years, mixed reality finally seems to be on the verge of a large-scale breakthrough. If Facebook and Ray-Ban are to be believed, a fashionable pair of glasses will be on the market within a year and may well be the successor to the smartphone.
However, we are not quite there yet. To the average consumer, purchasing mixed reality glasses is currently too far removed from reality. Glasses like Microsoft’s HoloLens 2 are rather large and too expensive to buy for the whole family. But is the purchase different for business purposes, such as for use by a shopkeeper or branch manager? We think so. And this offers opportunities not only five years from now, but today and tomorrow as well.
With a multidisciplinary team of ten colleagues, we set to work to explore ideas and possibilities. After a brief introduction to the retail market, its challenges, and the possibilities of mixed reality, the group started brainstorming and exploring pioneering ideas. This ultimately led to the Virtual Shop Manager Assistant.
Virtual Shop Manager Assistant
This Virtual Shop Manager Assistant aims to provide visual reports, tasks, and insights while you stand in front of the shelf. View the shop through a HoloLens and instantly see what your best-selling item is and how it is selling compared with other items and sales figures of other branches. In the meantime, the combination of a heat map in the shop with the product data shows how often customers approach and look at the shelf in question. And based on that insight, you can immediately set follow-up actions in motion.
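To make the idea concrete, here is a minimal sketch of the kind of figures the assistant surfaces in front of a shelf: the best-selling item in your branch and how that product sells at other branches. The function names and data are invented for this illustration, not taken from the project’s actual code.

```python
def best_seller(sales):
    """sales: {product: units_sold} for one branch; returns the top product."""
    return max(sales, key=sales.get)

def compare_with_branches(product, branches):
    """Units sold for one product across branches, ranked best-first.
    branches: {branch_name: {product: units_sold}}."""
    return sorted(
        ((name, sales.get(product, 0)) for name, sales in branches.items()),
        key=lambda item: item[1],
        reverse=True,
    )

# Invented example data for two branches of the same chain
branches = {
    "Utrecht": {"single malt": 40, "blended whisky": 25},
    "Rotterdam": {"single malt": 55, "blended whisky": 30},
}

top = best_seller(branches["Utrecht"])          # the branch's best-selling item
ranking = compare_with_branches(top, branches)  # how other branches sell it
```

In the prototype, figures like these would be rendered as holograms next to the shelf rather than printed to a console.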
To develop a Virtual Shop Manager Assistant, first of all you need a shop. Therefore, a bookcase in the office was emptied and converted into our physical ‘shop shelves’. A colleague offered to lend his collection of whisky bottles for this project, and we transformed the lab into a bona fide liquor store. We used the Microsoft HoloLens 2 as our glasses.
After the idea had been translated into functional sub-areas and corresponding technical components, the tasks were divided and construction could begin. Dynamics 365 was set up as the primary data source. The data consultants set to work devising and entering the right data, and the software and Azure developers immediately built a first version of the app in Unity.
In addition, a link was made with a purpose-made Power App, through which the shop assistant could access the tasks directly on a terminal or their own smartphone. Meanwhile, Power BI reports were developed that can be read visually while standing in front of the shelf wearing the HoloLens. While the virtual shop was being optimized, the physical shop was stocked with real products and provided with product and price information. The team was ready to present the final result where the two worlds meet.
The final result
At the end of day four, the result was presented with a significant number of innovative features.
3D data visualization: Real-time insight into figures where it all happens
Viewing results in an Excel sheet or a dashboard at your desk is now a thing of the past. With mixed reality, you can see all these figures as you navigate the shop floor. And because the data is shown not in lists and graphs but visually, right at the relevant products, interesting insights emerge immediately. For example, you can see the heat map as you walk through it, which automatically gives you a sense of why certain areas of the shop are visited more or less often than others. You can then take immediate action, for example by moving signage. You can also present sales figures directly at the relevant products, so it immediately becomes apparent when prominent shelf spaces are performing worse or better than expected. By comparing all this with earlier periods or other branches within the same chain, a tour of the shop floor promises more insights than a few hours behind a screen at your desk.
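The heat map underlying this view can be built from something as simple as customer position samples binned into floor cells. The sketch below is our own simplified illustration of that idea, with invented coordinates; it is not the project’s actual tracking code.

```python
from collections import Counter

def build_heatmap(positions, cell_size=0.5):
    """Bin (x, y) customer positions (in metres) into square floor cells
    of cell_size metres and count how often each cell was visited."""
    counts = Counter()
    for x, y in positions:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts

# Invented example: three position samples near the whisky shelf,
# one near the entrance
visits = [(1.1, 2.0), (1.2, 2.1), (1.3, 2.2), (5.0, 0.2)]
heatmap = build_heatmap(visits)
```

Cells with high counts would then be rendered as warm-coloured overlays on the shop floor through the glasses.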
AI predictive models in action: The glasses proactively think with you
While the visualization of data in 3D space is quite impressive, how much more powerful would it be if you could also bring prediction models directly into it? That is exactly what we did. For example, the glasses suggested placing certain products close together because they score highly in a cross-sell algorithm. A different algorithm, focused on upsell opportunities, can help entice customers to choose product D (the more expensive option) instead of product C.
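A cross-sell algorithm of this kind can, in its simplest form, count which products are frequently bought together and suggest placing the strongest pairs on nearby shelves. The sketch below shows that co-occurrence idea with invented basket data; the real prototype’s models were more elaborate, and all names here are our own assumptions.

```python
from itertools import combinations
from collections import Counter

def cross_sell_pairs(baskets, min_support=2):
    """Count how often product pairs appear in the same basket.
    Pairs at or above the support threshold are candidates for
    placing close together on the shop floor."""
    pair_counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(set(basket)), 2):
            pair_counts[pair] += 1
    return [(pair, n) for pair, n in pair_counts.most_common() if n >= min_support]

# Invented purchase baskets from the whisky shop
baskets = [
    {"single malt", "tasting glass"},
    {"single malt", "tasting glass", "water jug"},
    {"blended whisky", "cola"},
    {"single malt", "water jug"},
]
suggestions = cross_sell_pairs(baskets)
```

An upsell variant would instead compare products within the same category and flag the pricier alternative a customer might be nudged towards.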
Activating workflows: allocating tasks in practice
The final step we achieved during this Labs project was a process improvement on the shop floor. It is no longer necessary, for example, for a branch manager to walk through the shop together with a colleague and point out what work needs to be done. You can walk through the shop with the glasses at any time of the day, even before or after closing time, and create tasks for anything that catches your eye. Is a shelf almost empty? Have you noticed rubbish somewhere? Do you need to move products to positively influence sales figures? With a few simple gestures or voice commands, these tasks are created on the spot and recorded in the back-office application in the background. An employee can then call up these tasks in an app at a later moment, carry them out, and sign them off.
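Conceptually, each gesture or voice command produces a small task record that travels from the glasses to the back office and later to the employee’s app. The data model below is a minimal sketch of that lifecycle under our own assumptions; the actual prototype stored tasks via the Power App and Dynamics 365 rather than in Python objects.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ShelfTask:
    """A task spotted through the glasses, recorded for the back office."""
    description: str
    shelf_id: str
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    status: str = "open"  # open -> done when an employee signs it off

def sign_off(task):
    """Mark a task as completed, as the employee would do in the app."""
    task.status = "done"
    return task

# A manager spots an empty shelf and creates a task on the spot
task = ShelfTask("Restock the single-malt shelf", shelf_id="A3")
sign_off(task)
```

In the prototype, the equivalent of `sign_off` happens when the employee ticks the task off in the Power App, which updates the record in the back office.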
The next step
We were able to realize the above functionalities within a few days. That is a good result, but at the same time we realize there is room to improve and develop further. What could this mean, for example, for onboarding new staff and quickly transferring the right knowledge? How can you use this to combine information from different ‘data silos’ and make it accessible? And how can mixed reality help verify that the information stored in systems matches reality? In short: mixed reality can help tackle many challenges in retail, and we are eager to help.
If you are interested or just want to discuss the possibilities, my colleagues and I would like to hear from you.