Touch Resource (Object Interaction)

For an overview of Resources, see the Resources article.

For information on what Object Interactions are, see the Object Interactions article.

Deep Learning on Resources is found in Motive Academy; see the course this lesson is part of.

 

Overview

The Touch object interaction emits an Event once the learner makes contact with an object in the Scene using their hand.

For example, the Touch object interaction could be used when the learner is required to press a button: once the button is touched, the Scenario progresses.
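Motive Resources are configured in the visual editor rather than written in code, but as a rough mental model, the sketch below shows the pattern the Overview describes: contact between the learner's hand and a designated object fires an Event. The class, method, and event names and the object IDs here are hypothetical illustrations, not part of Motive's API.

```typescript
// Illustrative sketch only: the general pattern of "hand contact with a
// target object fires an Event". Not Motive's actual implementation.

type EventName = "activate" | "open" | "close" | "complete";
type Listener = () => void;

class TouchInteractionSketch {
  private listeners = new Map<EventName, Listener[]>();

  constructor(private targetObjectIds: Set<string>) {}

  on(event: EventName, listener: Listener): void {
    const existing = this.listeners.get(event) ?? [];
    this.listeners.set(event, [...existing, listener]);
  }

  // Called by the (hypothetical) runtime whenever the learner's hand
  // touches an object in the Scene.
  handleHandContact(objectId: string): void {
    if (this.targetObjectIds.has(objectId)) {
      this.emit("complete");
    }
  }

  private emit(event: EventName): void {
    for (const listener of this.listeners.get(event) ?? []) {
      listener();
    }
  }
}

// Usage: a button that advances the Scenario once touched.
const pressButton = new TouchInteractionSketch(new Set(["button-01"]));
pressButton.on("complete", () => console.log("Button touched, Scenario progresses"));
pressButton.handleHandContact("button-01"); // logs the message above
```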

Required Fields

World Objects: The object to be touched.

Optional Fields

Require All Targets: If more than one World Object (also referred to as a Target Object) is designated, all of them must be touched.

Prompts: Add an effect or text to draw attention to the item you'd like the learner to touch.

Prompt Anchor: Object Interaction prompts without an Anchor appear on the object you are targeting, since they're meant as annotation or extra information about that object. If you want to accompany an Object Interaction with additional information for the learner, you can also add a Notification or Screen Message to the Frame.

Event: Specify an Event or Custom Event to fire along with any other Events that would normally fire. This field is generally used to distinguish between different options for branching.

Require All Inputs: If more than one Input is designated, all of them are required.

Persistent: The Resource stays open, continues listening for Input, and continues to fire scripted Events. If this is chosen, the Close Event cannot be used because the Resource never closes.

Interacted Objects: If multiple World Objects are identified, you can add a Variable here that records which World Objects the learner has interacted with; the Variable then applies only to those in that list.
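To make the Require All Targets and Interacted Objects behaviour concrete, here is a minimal hypothetical sketch (again not Motive's API): the interaction completes only once every designated World Object has been touched, and the touched objects are passed along the way an Interacted Objects Variable might record them. All names and IDs below are assumptions for illustration.

```typescript
// Hypothetical sketch of "Require All Targets": complete only after every
// designated World Object has been touched, reporting the touched set.

class RequireAllTargetsSketch {
  private touched = new Set<string>();

  constructor(
    private targets: string[],
    private onComplete: (interactedObjects: string[]) => void,
  ) {}

  handleHandContact(objectId: string): void {
    if (!this.targets.includes(objectId)) return;
    this.touched.add(objectId);
    // Complete only when all designated targets have been touched.
    if (this.touched.size === this.targets.length) {
      this.onComplete([...this.touched]);
    }
  }
}

// Usage: two valves must both be touched before the Scenario moves on.
const touchBothValves = new RequireAllTargetsSketch(
  ["valve-left", "valve-right"],
  (interacted) => console.log("Complete. Interacted Objects:", interacted),
);
touchBothValves.handleHandContact("valve-left");
touchBothValves.handleHandContact("valve-right"); // fires onComplete
```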

Events

Activate, Open, Close (if not persistent), Complete, and Custom.

Example in the Headset

As a simple example, in the Motive Lab, I want the learner to locate and touch the Fume Hood.

  1. Add the Fume Hood as an object to a Frame

  2. (Optional) Add instructions for the learner

  3. Add the Touch Resource and fill it out, setting World Objects to the Fume Hood (a rough sketch of this configuration follows these steps)

  4. (Optional) Add a confirmation message once the task is complete
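As a rough mental model of what gets filled in for this example, the sketch below mirrors the field names described in this article. The data shape and property names are hypothetical, not a real Motive file format; in practice these fields are set in Motive's editor.

```typescript
// Hypothetical data-shape sketch of the Fume Hood example above.
interface TouchResourceConfigSketch {
  worldObjects: string[];        // Required: the object(s) to be touched
  requireAllTargets?: boolean;   // Optional: require every listed object
  prompts?: string[];            // Optional: effect or text drawing attention
  event?: string;                // Optional: Custom Event used for branching
  persistent?: boolean;          // Optional: keep listening after completion
}

const touchFumeHood: TouchResourceConfigSketch = {
  worldObjects: ["Fume Hood"],
  prompts: ["Highlight the Fume Hood"],
  event: "fume-hood-touched", // could be used to branch the Scenario
};

console.log(touchFumeHood);
```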

 

This is what it would look like in the headset:

https://vimeo.com/722956512

View the video full screen on Vimeo at the link above.

 

Related Articles

Grasp Resource

Gaze Resource

Use Resource