Experience summary 📖

This experience is a modified version of the AR Games experience: hunt and destroy all the elements within a time limit. Once users find a game point, they have a mission: hunt as many elements as possible, matching them by color. Like the logo hunting game, this type of dynamic can offer different kinds of user interaction in environments such as music festivals, marketing campaigns tied to events, loyalty programs, gamification, and activities for cultural and shopping centers.

These experiences use a fairly simple AR mode: elements are placed around the user, using sensors such as the gyroscope to position them in the immediate environment and applying rotations or movements that make the content harder to hunt.

In this example the user has a configurable time limit to hunt the 5 pairs of logos. If the user hunts the pairs in the wrong order, the logos disappear, penalizing the error.

Visualize this experience

Scan the QR code and enjoy the AR experience.



Uses and benefits 🌱

These experiences are very useful for adding an extra component to different dynamics around events, launches of new establishments, shopping-mall anniversaries, and a host of other scenarios related to the retail sector.

As already mentioned, these experiences work very well when connected to larger systems with missions and rewards, such as scavenger hunts or treasure hunts. In these projects, more elements can be added around the AR to help create a 360° experience:

  • New user registration system: customer acquisition. 
  • Scoring system: engagement, retention, loyalty.
  • Access to rewards and other types of benefits: brand image, customer relationship, reward systems.

Features and tips 💡

This dynamic was built in Onirix using the tool's most powerful customization features on top of the Onirix Studio scene editor: the online code editor and the Embed SDK for web AR. With these components you can reach a higher level of complexity, accessing the web programming layer Onirix provides through its JavaScript SDK. We also rely on the more common building blocks of this type of experience: the scene editor, audio, and events and interactions.

Code editor: HTML, CSS and JavaScript

For this experience we have created a game structure that users can modify as they like. At its core is an OnirixGame class, which exposes a series of events or callbacks: onStart, onTimeChange, onGameEnd, onScoreChange.
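As an illustration of that structure, here is a minimal, self-contained sketch of such a class. The callback names come from this tutorial; all internal fields and methods (duration, tick, pairHunted, etc.) are assumptions for illustration, not the actual Onirix code:

```javascript
// Minimal sketch of a game state holder modeled on the OnirixGame
// callbacks named in this tutorial (onStart, onTimeChange, onGameEnd,
// onScoreChange). Internal fields and method names are illustrative
// assumptions, not the real Onirix implementation.
class OnirixGame {
  constructor({ duration = 60, totalPairs = 5 } = {}) {
    this.duration = duration;      // configurable time limit in seconds
    this.totalPairs = totalPairs;  // 5 pairs of logos in this experience
    this.remaining = duration;
    this.score = 0;
    this.running = false;
    // Callbacks the UI layer can attach to.
    this.onStart = () => {};
    this.onTimeChange = (seconds) => {};
    this.onScoreChange = (score) => {};
    this.onGameEnd = (score) => {};
  }

  start() {
    this.running = true;
    this.remaining = this.duration;
    this.score = 0;
    this.onStart();
  }

  // Called once per second by a timer (e.g. setInterval in the real UI).
  tick() {
    if (!this.running) return;
    this.remaining -= 1;
    this.onTimeChange(this.remaining);
    if (this.remaining <= 0) this.end();
  }

  // Called when the player completes a pair of same-colored logos.
  pairHunted() {
    if (!this.running) return;
    this.score += 1;
    this.onScoreChange(this.score);
    if (this.score >= this.totalPairs) this.end();
  }

  end() {
    this.running = false;
    this.onGameEnd(this.score);
  }
}

// Usage: wire a callback and simulate a short game.
const game = new OnirixGame({ duration: 3, totalPairs: 2 });
let finalScore = null;
game.onGameEnd = (score) => { finalScore = score; };
game.start();
game.pairHunted();
game.pairHunted(); // second pair completes the game
console.log(finalScore); // → 2
```

The UI only has to assign the callbacks; all game rules stay in one place, which is what makes the structure easy to modify for your own versions.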

By copying this experience you will be able to analyze the code in detail and see how we shape the whole game, updating the user interface through the different events that happen in the AR scene.

In addition, the code editor lets us add the UI elements relevant to this type of dynamic:

  • Onboarding: initial tips screen to explain the dynamics of the game.
  • Scoring: top menu showing the status of the game (for example, the number of logos hunted so far).
  • Timer: indicator of the game's seconds (remaining or elapsed).
  • Final screen: at the end of each game a summary screen is shown with the score obtained.
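For instance, the timer indicator needs to turn a raw seconds count into display text; a small hypothetical helper (not part of Onirix) could look like this:

```javascript
// Hypothetical helper for the timer UI element: formats a number of
// seconds as m:ss for the on-screen countdown.
function formatTimer(totalSeconds) {
  const clamped = Math.max(0, totalSeconds); // never show negative time
  const minutes = Math.floor(clamped / 60);
  const seconds = clamped % 60;
  return `${minutes}:${String(seconds).padStart(2, "0")}`;
}

console.log(formatTimer(90)); // → "1:30"
console.log(formatTimer(5));  // → "0:05"
```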

Access to the online documentation of the code editor.

Embed SDK for webAR in JavaScript

As discussed in the previous section, a game structure has been created within Onirix, which the user can modify to generate their own versions. In addition, access to the Onirix SDK is provided, where it is possible to subscribe to various events and thereby modify different parts of the AR scene.

For the game in question, the following events are programmed to handle interactions with the scene:

  • EmbedSDK.Events.SCENE_LOAD_END: the full scene load event is used to start the game and leave the initial user interface ready with all its components.
  • EmbedSDK.Events.ELEMENT_CLICK: this event handles the click on each element of the scene, so we can trigger each element's destruction animation and its disappearance. It is also the entry point for adding new points and advancing the game.
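The wiring of these two events can be sketched as follows. To keep the snippet self-contained, a tiny stub stands in for the real SDK object; in an actual project you would subscribe on the Onirix Embed SDK instance instead:

```javascript
// Self-contained sketch of the event wiring described above. The stub
// below only mimics the subscribe/notify pattern so the snippet runs on
// its own; swap it for the real Onirix Embed SDK instance in a project.
const EmbedSDK = {
  Events: { SCENE_LOAD_END: "scene-load-end", ELEMENT_CLICK: "element-click" },
  _handlers: {},
  subscribe(event, handler) {
    (this._handlers[event] ??= []).push(handler);
  },
  _emit(event, payload) {
    (this._handlers[event] ?? []).forEach((h) => h(payload));
  },
};

const state = { started: false, hunted: [] };

// Start the game once the scene has fully loaded.
EmbedSDK.subscribe(EmbedSDK.Events.SCENE_LOAD_END, () => {
  state.started = true;
});

// On each element click, record the hunt (the real experience would also
// trigger the destruction animation and hide the element here).
EmbedSDK.subscribe(EmbedSDK.Events.ELEMENT_CLICK, (element) => {
  if (!state.started) return;
  state.hunted.push(element.name);
});

// Simulate the scene loading and two clicks (element names are made up).
EmbedSDK._emit(EmbedSDK.Events.SCENE_LOAD_END);
EmbedSDK._emit(EmbedSDK.Events.ELEMENT_CLICK, { name: "logo-red-1" });
EmbedSDK._emit(EmbedSDK.Events.ELEMENT_CLICK, { name: "logo-red-2" });
// state.hunted now holds both clicked element names.
```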

Access the online documentation of the Embed SDK.

Scenes, events and interactions editor

Apart from the programming of the game itself, the scene starts with all the AR content included, i.e. all the logos to be hunted are already available in the scene. The experience starts with a fixed rotating logo that, when clicked, launches the game dynamics: a total of 10 colored elements appear floating around the user, each with different movements. All this content and its movements are set up in the scene editor, with different associated events:

  • 10 3D models representing the colored logos, plus 10 3D models with destruction animations, one for each element.
  • The logos are grouped in 2 collections with different rotation and orbit properties. They start disabled (hidden) and clicking on the start logo shows them to the user.
  • In addition, rotation events are triggered for the 2 orbitals, with different speeds and spins on various axes, thus achieving the desired effect.
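Although the scene editor configures these orbits visually, the underlying idea is simply rotating each logo's position around the user over time. A minimal illustration of that math (not the editor's implementation):

```javascript
// Illustrative math behind an orbit: rotate a point around the vertical
// (Y) axis by a given angle. The scene editor configures this visually;
// this is only the underlying idea, not Onirix code.
function orbitAroundY([x, y, z], angleRadians) {
  const cos = Math.cos(angleRadians);
  const sin = Math.sin(angleRadians);
  return [x * cos + z * sin, y, -x * sin + z * cos];
}

// A logo 2 meters in front of the user, rotated a quarter turn.
const pos = orbitAroundY([0, 0, -2], Math.PI / 2);
// pos is approximately [-2, 0, 0]: the logo has moved to the user's side.
```

Running two such rotations at different speeds and on different axes, as the scene does with its 2 collections, is what produces the layered floating effect.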

For more information see our documentation on the scene editor and documentation on events and interaction.

Sounds and effects

The experience also includes sounds that give the dynamics a greater level of immersion.

Video-game-style background music plays throughout the experience.

In addition, a destruction sound effect plays every time an element is clicked, giving the user a more realistic feeling.

For more information see our documentation on sounds and effects.

Surface scene: gyroscope tracking and autoload

In this kind of experience, where no concrete marker is needed (neither an image nor a specific surface), an AR mode driven by the phone's gyroscope is the most effective option. The scene is loaded using a surface-scene feature: autoload. With it, the user does not have to decide where to place the content; it simply appears in front of them, ready to interact with. The advantage of this type of scene is that it can be placed anywhere, indoors or outdoors, and requires no physical reference to be consumed correctly.

For more information see our documentation on surface scenes with gyroscope.
