Stream Query Repository: Immersive Environments

Introduction

This page provides the schema and query specifications for an immersive environment application. Such environments allow a user to become immersed in an augmented or virtual reality setting in order to interact with people, objects, places, and databases. The schema and queries were provided by members of the USC-AIMS project.

Immersive Classroom Application

Within Immersive Classroom, a 3-D immersive environment, a group of children, some typically developing and some diagnosed with ADHD (Attention Deficit Hyperactivity Disorder), perform particular attention tasks. The objective of the application is to differentiate between these two groups of users by analyzing their interactions with the environment. The virtual environment consists of a typical classroom containing student desks, a teacher's desk, a virtual teacher, a blackboard, a large window looking out onto a playground with buildings, vehicles, and people, and a pair of doorways at each end of the wall opposite the window through which activity occurs.

A typical task consists of alphabetical characters being displayed on the blackboard and having the child press a button when a particular pattern is seen. In one of the designed tasks, the AX task, users are instructed to press the mouse button as quickly as possible upon detecting an X that follows an A (a hit) and to withhold their responses to any other pattern. At the same time, a series of typical classroom distractions is systematically manipulated within the environment (e.g., ambient classroom noise, a paper airplane flying around the room, students walking into the room, activity occurring outside the window). During the test, trackers placed on the child's head, hands, and legs monitor body movements and stream the data continuously.

Schema

The schema consists of three relations - User, StaticObject, and MovingObject - and four streams - Characters, UserResponse, UserBodyPositions, and MovingObjectPositions.
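
The detailed attribute lists are not reproduced on this page. As a rough illustration only, one plausible set of definitions might look like the sketch below; all attribute names, including the consecutive charID sequence number used in the query sketches later on this page, are assumptions rather than the original schema.

      -- Illustrative sketch only; all attribute names are assumed.
      User                 (userID, name, sex, age, adhdDiagnosed)
      StaticObject         (objectID, name, xPos, yPos, zPos)
      MovingObject         (objectID, name)

      Characters           (charID, char, displayTime)
      UserResponse         (userID, clickTime)
      UserBodyPositions    (userID, bodyPart, xPos, yPos, zPos)
      MovingObjectPositions(objectID, xPos, yPos, zPos)

In a data stream system such as STREAM, each stream tuple also carries a timestamp; the explicit displayTime and clickTime attributes are kept here only to make the query sketches easier to read.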

Queries

Queries in English

  1. Hits Query: For each user, display the number of mouse clicks that follow an AX pattern in the Characters stream. To count, a click must occur before the character that follows the X is displayed on the blackboard (see the sketch following this list).

  2. Average Response Time Query: For each user, report the average response time over all hits. The response time for a hit is the time between the display of the X and the click (see the sketch following this list).

  3. Body Movement Query: For each male user, report the average speed of hand movements during all hits.

  4. User Status Query: Report which users are shaking during the test. Shaking can be defined as a frequent pattern of movement common to all body parts.

  5. Distraction Query: For each pattern that a user missed, show which distractions were present at the time.

  6. Attention Query 1: For each pattern that a user missed, show whether she was looking at a distractor instead of the blackboard.

  7. Attention Query 2: Monitor the user's attention to each static object over the past 2 minutes, updating the results every minute. Attention can be reported as the number of times she was looking at the object versus the number of times she was looking at the blackboard.

  8. Gesture Recognition Query: When the user raises her hand, pause the task.

  9. Nearest Neighbor Query: Every 2 minutes, return the 3 static or moving objects closest to the subject.

  10. Touch Query: Every minute, return the static object that the user has touched most often in the past 3 minutes (see the sketch following this list).

  11. Monitor New Moving Objects Query: Find moving objects that were present during the 2-minute period prior to each user's first miss.
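
For illustration, the following sketches show how a few of these queries might be written in a CQL-style continuous query language. They use the assumed attribute names from the schema sketch above (including the assumed consecutive charID), so they are not the repository's official formulations. A possible form of the Hits Query (query 1):

      -- Hits Query sketch (assumed schema): per user, count the clicks that
      -- arrive after an A followed by an X but before the next character.
      SELECT R.userID, Count(*)
      FROM   Characters   [Unbounded] AS A,
             Characters   [Unbounded] AS X,
             Characters   [Unbounded] AS N,
             UserResponse [Unbounded] AS R
      WHERE  A.char = 'A' AND X.char = 'X'
        AND  X.charID = A.charID + 1        -- the X immediately follows the A
        AND  N.charID = X.charID + 1        -- N is the character shown after the X
        AND  R.clickTime > X.displayTime    -- click after the X appears ...
        AND  R.clickTime < N.displayTime    -- ... and before the next character
      GROUP BY R.userID;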
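
The Average Response Time Query (query 2) uses the same pattern join but averages the elapsed time instead of counting; again, this is only a sketch over the assumed attributes:

      -- Average Response Time sketch (assumed schema): mean time from the
      -- display of the X to the user's click, taken over all hits.
      SELECT R.userID, Avg(R.clickTime - X.displayTime)
      FROM   Characters   [Unbounded] AS A,
             Characters   [Unbounded] AS X,
             Characters   [Unbounded] AS N,
             UserResponse [Unbounded] AS R
      WHERE  A.char = 'A' AND X.char = 'X'
        AND  X.charID = A.charID + 1
        AND  N.charID = X.charID + 1
        AND  R.clickTime > X.displayTime
        AND  R.clickTime < N.displayTime
      GROUP BY R.userID;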
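
Sliding-window queries such as the Touch Query (query 10) can be sketched in the same style. Touches(userID, objectID) below is a hypothetical derived stream that an upstream proximity detector would have to produce from UserBodyPositions and StaticObject; it is not part of the original schema, and the once-a-minute refresh is noted only in a comment because window-slide syntax differs between systems.

      -- Touch Query sketch: per user, the static object touched most often in
      -- the last 3 minutes.  Touches is an assumed derived stream; the result
      -- would be re-evaluated every minute.
      SELECT T.userID, T.objectID, Count(*) AS touches
      FROM   Touches [Range 3 Minutes] AS T
      GROUP BY T.userID, T.objectID
      HAVING Count(*) >= ALL (SELECT Count(*)
                              FROM   Touches [Range 3 Minutes] AS T2
                              WHERE  T2.userID = T.userID
                              GROUP BY T2.objectID);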


Last modified: Feb 6 2003. Please send comments and questions to shivnath@stanford.edu