Our system classifies points by their source and visualizes them as virtual objects. It differs from current point systems, which accept a single input (money) and generate a single output (points). Missions are designed as an additional input to remind users of the multiple values in consumption beyond money alone, and the feedback mechanism is expanded from a single output to two kinds of output: points and a virtual pet. Our system presents this feedback through spatial visualization instead of the traditional two-dimensional textual information, and the mode of interaction is expanded from operations on a two-dimensional screen to spatial gesture interaction.
4.1 Point Classification
Current point systems take into account the quantity, expiration, and scope of use of points. However, the source of a point is also an important feature that distinguishes points from one another. In this paper, we propose the idea of point classification: points are classified according to where they were collected. Points come from various sources, such as food, books, or flights, so our system defines several point types indicating their source, and points from the same source fall into the same category. By using AR, users are reminded of the source of their points and can see the number in each category in a more intuitive way.
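The classification step described above can be sketched as a simple grouping of point records by source. This is a minimal illustration, not the deployed implementation; the record format and category names ("food", "book", "flight") are assumptions taken from the examples in the text.

```python
from collections import defaultdict

def classify_points(records):
    """Group (source, amount) point records by source and sum each category."""
    totals = defaultdict(int)
    for source, amount in records:
        totals[source] += amount
    return dict(totals)

# Illustrative records: points earned from three kinds of purchases.
records = [("food", 30), ("book", 12), ("food", 8), ("flight", 150)]
print(classify_points(records))  # {'food': 38, 'book': 12, 'flight': 150}
```

Each resulting category would then be rendered as its own virtual object, with the per-category total shown when the user clicks the coin model.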
4.2 Multi-value Creation
A point system rewards spending with points in order to promote consumption and retain customers; it is economy-oriented. However, such a design ignores other important factors in consumption. Many value factors, such as a healthy diet and environmental protection, are critical when making purchasing decisions, yet they are not considered in current designs. Therefore, we introduce missions into our system to remind users of the multiple values that cannot be reflected by points alone.
In our system, missions are issued during shopping. Users earn points from a certain amount of consumption and receive feedback from the virtual pet after completing a mission. This motivates users to take full account of the value of their consumption.
4.3 Interactive System with Spatial Visualization
The point system is essentially a reward system grounded in gamification theory, where points serve as a mechanism for immediate feedback and progress tracking. In current systems, customers receive point information in textual form. The problem is that this immediate feedback cannot be fully expressed by a static textual description, which prevents point systems from presenting appealing rewards in a dynamic way.
Point. Points are visualized in our system, and their source is reflected by the objects they are visualized as: one object represents one category of points, and coin models are placed next to the object representing the points' source. Users can get a rough idea of how many points they have through this spatial visualization, and can see the exact number in each category by clicking the coin model. The spatial visualization of points is shown in Fig. 1.
Virtual Pet. The spatial visualization of the virtual pet is shown in Fig. 2, where the pet's state differs across subgraphs. In subgraph (a), both indicators are high and the pet is active and healthy. In subgraph (b), both indicators are normal and the pet looks calm. In subgraph (c), the blue indicator is low and the pet looks unhappy. In subgraph (d), both indicators are low and the pet is sick. The characters are colorful, simply designed creatures based on animals and people: they look like common animals but can move like human beings. Users can name the pet according to their preference. After the pet interface is opened, the pet appears on the screen, with its name and level displayed at its bottom-right and information about its state displayed in the top-right corner of the screen. Two indicators determine how healthy and happy the pet is, and each is a measure of consumption: if the user earns many points, the green indicator is high; if the user earns many kinds of points, the blue indicator is high. The two indicators are combined to evaluate the pet's health, which in turn determines its expression: if both indicators are high, the pet is active; if both are normal, it looks fine; if one indicator is low, it looks unhappy; and if both are low, it may fall ill. The pet also goes through several distinct stages of development over its life cycle. Each stage lasts for a period of time that depends on the pet's level, and the level depends on mission completion: completing a mission grants experience value, and when a threshold is met the pet levels up. After reaching a certain level, the pet enters a new stage and its appearance changes, which constitutes the pet's evolution.
The body shape of the pet also varies depending on how many points the user has. By introducing the level and the two indicators to evaluate the pet's health state, the user's consumption is evaluated and the result is presented through spatial visualization.
Mission. Missions are the main method in our system for introducing multi-value criteria (Fig. 3). The content of a mission is displayed in a game-like environment, and the process of obtaining a mission is designed as a dynamic animation. Users learn the content of a mission by reading text on the screen. After a mission is completed, the system plays a fireworks animation to create a joyful atmosphere in which users feel successful.
Usage. When users go shopping, they can start the point system and enter the mission interface. A box sits at the center of the screen while white clouds float in the air. When the user clicks the box, it emits light and opens; once the animation stops, a star rises into the air. When the user clicks the star, it bursts, revealing several patterns on the ground. Clicking a pattern shows the content of the mission on the GUI (Fig. 3). If the user completes the mission, the system confirms it after the purchase; once confirmed, the pet receives the experience value and the completion is recorded.
The number and variety of points, as well as mission completion, affect the pet's health state. On the pet interface, the virtual pet appears after some animations. The pet's life cycle stages are baby, child, teen, and adult. A state bar at the top-right corner of the screen indicates how healthy and happy the pet is; it contains two indicators, Hunger and Happy.
The higher each indicator, the better the pet's state. Hunger is related to the number of points the user earns: when the user gets points, the Hunger indicator rises. Happy is related to the variety of points: if the user earns points from various sources, the Happy indicator is high. Thus, Hunger is filled by obtaining points, and Happy is filled by earning points from different kinds of goods. The two indicators are considered together to assess the pet's health state, and the pet performs different actions and expressions according to that state (Fig. 2).
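The indicator logic above can be sketched as follows. The caps, thresholds, and state labels are illustrative assumptions; the text only specifies that Hunger tracks the number of points, Happy tracks the variety of sources, and the four combined states correspond to the subgraphs of Fig. 2.

```python
def indicators(points_by_source, point_cap=100, variety_cap=5):
    """Derive (hunger, happy) in [0, 1] from the user's classified points."""
    total = sum(points_by_source.values())
    hunger = min(total / point_cap, 1.0)                    # number of points
    happy = min(len(points_by_source) / variety_cap, 1.0)   # variety of sources
    return hunger, happy

def health_state(hunger, happy, high=0.7, low=0.3):
    """Combine the two indicators into one of the four pet states."""
    if hunger >= high and happy >= high:
        return "active"    # Fig. 2(a): both indicators high
    if hunger >= low and happy >= low:
        return "calm"      # Fig. 2(b): both indicators normal
    if hunger >= low or happy >= low:
        return "unhappy"   # Fig. 2(c): one indicator low
    return "sick"          # Fig. 2(d): both indicators low
```

In this sketch, the pet's animation and expression would be selected from the returned state label.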
Level describes the pet's current life stage. The pet's experience value increases when the user completes a mission, and when the experience reaches a threshold the pet levels up. Upon reaching certain levels, the pet enters a new life stage, which is considered its evolution.
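The leveling and evolution mechanics can be sketched as below. The experience threshold and the number of levels per stage are assumed values; only the stage names (baby, child, teen, adult) and the threshold-based level-up come from the description above.

```python
STAGES = ["baby", "child", "teen", "adult"]

class Pet:
    def __init__(self, exp_threshold=100, levels_per_stage=5):
        self.level = 1
        self.exp = 0
        self.exp_threshold = exp_threshold      # assumed: 100 exp per level
        self.levels_per_stage = levels_per_stage  # assumed: evolve every 5 levels

    def complete_mission(self, reward_exp):
        """Completing a mission grants experience; level up at each threshold."""
        self.exp += reward_exp
        while self.exp >= self.exp_threshold:
            self.exp -= self.exp_threshold
            self.level += 1

    @property
    def stage(self):
        """Every `levels_per_stage` levels the pet evolves to the next stage."""
        idx = min((self.level - 1) // self.levels_per_stage, len(STAGES) - 1)
        return STAGES[idx]
```

For example, a new pet that earns 250 experience reaches level 3 with 50 experience left over, still in the baby stage.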
The results, including points, missions, and the virtual pet, are all presented to the user via spatial visualization.
Spatial Gesture. Users interact with the system through the GUI it provides. In conventional interaction, users operate on a two-dimensional screen; however, this prevents the virtual world created by AR from integrating well with the real world and reduces users' interest in the system to some extent. To improve the interactive experience of the point system, we replace traditional two-dimensional interaction with spatial gesture interaction. We use a Leap Motion controller to capture the user's real-world hand movements and map them into the virtual world, so users can interact with the system using gestures (Fig. 4).
When the camera scans the surrounding scene, features of the point card are extracted and compared with the recorded features, and a mapping is then established between the real-world coordinate system and the screen coordinate system. The user's hand movement is captured by the Leap Motion and rendered in the real-time video from the webcam, so users can see the real-world movement of their hands on the screen. Virtual objects are superimposed on the real world in this video, allowing users to adjust the position and movement of their hands to touch and interact with the virtual objects as if they were real.
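A highly simplified sketch of the final mapping step is given below: a tracked hand position in the sensor's coordinate system is linearly mapped onto screen pixels. Real Leap Motion tracking and the marker-based registration of the point card are considerably more involved; the sensor ranges and screen size here are illustrative assumptions only.

```python
def map_to_screen(hand_xyz, sensor_range=((-200, 200), (0, 400)),
                  screen_size=(1280, 720)):
    """Linearly map the sensor's (x, y) range (in mm) onto screen pixels."""
    (xmin, xmax), (ymin, ymax) = sensor_range
    w, h = screen_size
    x, y, _z = hand_xyz  # depth (z) could instead drive virtual-touch detection
    sx = (x - xmin) / (xmax - xmin) * w
    sy = (1 - (y - ymin) / (ymax - ymin)) * h  # sensor y points up, screen y down
    return sx, sy

print(map_to_screen((0, 200, 50)))  # (640.0, 360.0), the center of the screen
```

With such a mapping in place, a collision test between the mapped hand position and a virtual object's screen region would decide whether the user is "touching" it.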