
1 Introduction

Augmented reality fitting rooms facilitate the shopping experience by allowing customers to try on apparel and mix and match garments without being physically present in the retail shop [10]. They are used to improve the process of trying on clothes and to maximize time efficiency [8]. These platforms are not only powerful decision tools for the online shopper [3], but also contribute to the fun factor of in-store shopping. E-commerce is a strong application field for augmented reality. Specifically, in the area of retail, the inability of users to foresee how particular clothing items will fit them when shopping online has always been a significant weakness [11]. Fitting rooms are applications targeting both personal computers and mobile devices [9]. The main interaction technique used by augmented reality fitting rooms is real-time detection of the body and hands, which enables gesture-based interaction. HD camera and/or Microsoft Kinect sensor inputs are usually used for detection [2, 6, 12]. Another, less widespread, interaction technique is spoken language; it has been observed that the combination of spoken language input and virtually real image output provides natural and robust interaction [7]. This paper presents the development of an augmented reality mirror that uses Microsoft Kinect v.2 for body and hand detection in order to offer the user a more natural way of interaction, providing new features with respect to related approaches and thus enhancing the customer's shopping experience. The proposed AR mirror operates both online and offline and can be installed in a retail or department store as well as at home.

2 AR Mirror

2.1 System Description

The AR mirror comprises a large LED display, a Microsoft Kinect v.2 sensor and a PC. Users stand in front of the system and see themselves on the display wearing virtual garments and accessories (Fig. 1).

Fig. 1. System setup

The initial “Welcome screen” of the system familiarizes the user standing in front of it with the gestures that the system uses as its main interaction modality. In more detail, in order to introduce users to the system quickly and make learning its interaction as intuitive as possible, a supportive assistance system has been developed, which trains the user step by step with the aid of animations and confirmatory messages indicating whether he/she is manipulating the system correctly. This “Help system” acts as an interactive assisting tool [1] and prompts the user to interact with the AR mirror. In this way, users are quickly introduced to the system's environment, become accustomed to it, and gain a sense of “early achievement” in terms of interaction.
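To illustrate the flow of such step-by-step training, the following minimal sketch shows how each gesture could be demonstrated, detected and confirmed. The `ui` and `tracker` interfaces and the step definitions are hypothetical and are not part of the described system.

```python
# Minimal sketch of a step-by-step gesture tutorial (hypothetical interfaces).
TUTORIAL_STEPS = [
    ("raise_hand",   "raise_hand.anim",   "Great! The mirror can now track your hand."),
    ("hover_button", "hover_button.anim", "Hovering over a button selects it."),
]

def run_tutorial(ui, tracker, timeout_s=30):
    """Walk the user through each gesture, confirming correct execution."""
    for gesture, animation, confirmation in TUTORIAL_STEPS:
        ui.play_animation(animation)                  # demonstrate how the gesture is performed
        if tracker.wait_for_gesture(gesture, timeout_s):
            ui.show_message(confirmation)             # confirmatory feedback to the user
        else:
            ui.play_animation(animation)              # repeat the demonstration once more
    ui.show_main_screen()
```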

After the user’s quick training, the “Main screen” is displayed. The main screen is divided into three sections (top, middle and bottom) (Fig. 2).

Fig. 2. Main screen of the system

The first section, at the top, presents the logo of the store as well as menu options, such as language, gender and clothes category (Fig. 3), while the second section, in the middle, is the actual AR area of the system, displaying a real-time video of the user standing in front of the system with virtual garments of the selected category superimposed on his/her body. In this way, users perceive how they would look if they tried the garments on.

Fig. 3. Top section of the system: (left) logo of the store, (right) menu options.

The middle section offers three more options to the user: the “Like” and “Dislike” interactive buttons, as well as an information bubble (Fig. 4). By selecting the Like or Dislike button, the user declares his/her preference about the suggested outfit; the system then removes the current outfit and provides a new one according to the user's preferences (which have been defined through the options menu). On the left-hand side of the user, an info bubble provides information about the product the user is currently “trying on”. The third section, at the bottom, comprises the Buy and View Photos buttons (Fig. 3). The latter displays a strip of photos of the user wearing the recommended outfits. A photo of the user interacting with the system is taken every time the garment changes, and a gallery is created for the user to review what he/she has “tried on” previously.
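As an illustration of this feedback loop, the sketch below outlines how a Like/Dislike selection could trigger the choice of the next outfit from the preference-filtered catalogue and the capture of a gallery photo. The session fields and the `capture_user_photo` helper are assumptions made for illustration, not the system's actual implementation.

```python
import random

def next_outfit(catalogue, prefs, current):
    """Pick the next outfit matching the user's menu preferences (gender, category)."""
    candidates = [o for o in catalogue
                  if o["gender"] == prefs["gender"]
                  and o["category"] == prefs["category"]
                  and o["id"] != current["id"]]
    return random.choice(candidates) if candidates else current

def on_feedback(session, liked, capture_user_photo):
    """Called when the user selects Like or Dislike on the current outfit."""
    if liked:
        session.liked_outfits.append(session.current_outfit)   # remember for the personal page
    session.gallery.append(capture_user_photo())               # snapshot taken on every garment change
    session.current_outfit = next_outfit(session.catalogue, session.prefs,
                                         session.current_outfit)
```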

Fig. 4. “Like” and “Dislike” buttons and the “Info” section.

Fig. 5. (Left) passport screen, (right) buy options.

2.2 Novel Features

Beyond the help functionality described above, there are a few more features which introduce novel elements into the user's purchasing experience. For example, users are able to purchase the offered products either online or offline using a passport code. The AR mirror may be installed in retail stores or be made available through online stores, to which each user connects through an XBOX console and a TV-mounted Kinect sensor in order to try on available products and buy them on demand. In the first case, the user can take the passport code shown on the screen to the cashier and buy the product associated with this code. In the case of online stores, users can buy products via a personal web page that is generated while they try clothes on and that contains all the outfits they liked, each accompanied by a link to the online shop service of the store.
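The sketch below illustrates one possible way to generate such a passport code and the corresponding personal page URL. The hashing scheme and the URL layout (shop.example.com) are assumptions made for illustration only.

```python
import hashlib
import secrets

def generate_passport(session_id: str) -> str:
    """Create a short, hard-to-guess passport code for the current session
    (illustrative scheme; the deployed system may use a different one)."""
    raw = f"{session_id}:{secrets.token_hex(8)}"
    return hashlib.sha256(raw.encode()).hexdigest()[:10]

def personal_page_url(passport: str) -> str:
    # Hypothetical URL layout of the store's online service.
    return f"https://shop.example.com/passport/{passport}"
```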

Another feature of the AR mirror that enhances the user's shopping experience is the double projection when he/she tries on items. In more detail, the system provides in real time a split view of the AR area, presenting the user wearing two different outfits side by side, thus offering the opportunity to compare them (Fig. 6).
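A possible way to compose this split view from a single camera frame is sketched below. The `renderer` interface and the NumPy-style frame are assumptions made for illustration, not the AR mirror's actual rendering pipeline.

```python
def render_split_view(frame, body, outfit_left, outfit_right, renderer):
    """Augment the same camera frame with two different outfits and
    place the two views side by side (hypothetical renderer interface)."""
    h, w = frame.shape[:2]
    left = renderer.overlay(frame.copy(), body, outfit_left)    # user wearing outfit A
    right = renderer.overlay(frame.copy(), body, outfit_right)  # user wearing outfit B
    # Scale each augmented view to half the display width and join them.
    return renderer.hstack(renderer.resize(left, w // 2, h),
                           renderer.resize(right, w // 2, h))
```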

Fig. 6. Augmented reality split-screen feature

Another interesting feature of the system is the “Photo Gallery”, which shows a strip of photos of the user wearing the recommended outfits, and a “Passport” code for retrieving these photos later on via his/her favourite browser (Fig. 5). The “Passport” is a hash code which is given to a customer once he/she stands in front of the system and is used as a unique identifier giving access to an online HTML page containing all the products he/she liked, as well as the photos of the user taken by the system. The Passport can also be obtained as a QR code, so that users can gain immediate access to their personal page through their mobile devices (Fig. 5).
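For illustration, the passport's personal-page URL could be encoded as a QR code with a few lines of Python using the third-party `qrcode` package; the URL layout is again an assumption rather than the system's actual address scheme.

```python
import qrcode  # third-party "qrcode" package

def passport_qr(passport: str, out_path: str = "passport_qr.png") -> str:
    """Encode the passport's personal-page URL as a QR code image so the
    user can open it directly on a mobile device (illustrative URL layout)."""
    url = f"https://shop.example.com/passport/{passport}"
    qrcode.make(url).save(out_path)
    return out_path
```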

The system also supports both horizontal and vertical displays, with automatic rearrangement of its UI components in order to fit the aspect ratio of each screen. All features are provided equally in both orientations (Fig. 7).
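A minimal sketch of such orientation-dependent rearrangement is given below, assuming a hypothetical `ui` placement interface; the section names and height ratios are illustrative, not the system's actual values.

```python
def arrange_ui(ui, screen_w: int, screen_h: int):
    """Rearrange the three UI sections to fit the display's aspect ratio
    (hypothetical `ui` interface; ratios are illustrative assumptions)."""
    vertical = screen_h > screen_w
    ui.place("logo_and_menu",   region="top",    height_ratio=0.10 if vertical else 0.15)
    ui.place("ar_area",         region="middle", height_ratio=0.80 if vertical else 0.70)
    ui.place("buy_and_gallery", region="bottom", height_ratio=0.10 if vertical else 0.15)
```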

Fig. 7. The vertical layout of the system

An assistive administrative environment has also been developed, providing functionality such as entry and management of the products displayed by the AR mirror, management of the offers or discounts displayed, statistical insights, etc.

2.3 Interaction Techniques

Interaction with the system is achieved through gestures via hand tracking [4]. Using a Microsoft Kinect version 2 depth sensor [13], the user's skeleton is acquired [5] and the movement of his/her hand is recognized, so that the hand movement is mapped to a virtual hand cursor (mouse emulation). When the virtual hand cursor lies over an interactive component, a selection can be performed by leaving the cursor on top of the component for a specific duration, which is equivalent to a mouse click. Apart from hand tracking for selection, skeleton tracking is also used to calculate the minimum distance needed in order to start tracking the user and enable the application, to detect when the user has left the mirror, and to place the virtual clothes correctly on the user's body according to the skeleton joints tracked by the sensor.
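The following sketch illustrates the dwell-based selection mechanism described above. The dwell duration, engagement distance and normalised hand coordinates are assumed values for illustration, not the system's actual parameters.

```python
import time

ENGAGE_DISTANCE_M = 1.2   # assumed minimum distance at which tracking is enabled
DWELL_SECONDS = 2.0       # assumed hover time that counts as a "click"

def hand_to_cursor(hand_xy, screen_w, screen_h):
    """Map the tracked hand joint (normalised 0..1 coordinates assumed)
    to screen-space cursor coordinates."""
    return int(hand_xy[0] * screen_w), int(hand_xy[1] * screen_h)

class DwellSelector:
    """Trigger a selection when the cursor rests on the same interactive
    component for DWELL_SECONDS (mouse-click emulation)."""

    def __init__(self):
        self.target = None
        self.since = 0.0

    def update(self, hovered_component):
        now = time.time()
        if hovered_component != self.target:
            self.target, self.since = hovered_component, now   # new hover target, restart timer
            return None
        if self.target is not None and now - self.since >= DWELL_SECONDS:
            self.since = now                                   # avoid repeated triggers
            return self.target                                 # emit the "click"
        return None
```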

3 Conclusion

An AR mirror has been presented, aiming to provide virtual try-on of apparel and accessory products in retail stores. The system initiates an engaging digital interaction experience for all users, through which they can snap and share their virtual fittings online while at the same time connecting with the store's online presence. It can be installed in various commercial locations. Customers are able to try as many products as they wish, without waiting in fitting-room queues, and buy them from the comfort of their home or while a store is closed, without having to carry them. Furthermore, user preferences and user experience feedback are collected for evaluation purposes, in order to assess the system in terms of its usability and overall user experience. Future work pursues the inclusion of the application in popular online stores (e.g., the Windows Store) or its installation on alternative devices, such as XBOX or PS4 consoles, enabling on the one hand its use by home users and on the other hand allowing brands to merge aspects of online and in-store shopping into one exciting solution. The product presented in this work is commercialized in collaboration with LiateR.