During this session of the OGC AR Summit, entitled “Context for Discovery: introduction and status update from the AR Community Discovery Task Force,” members of the AR Community Discovery Task Force will introduce the motivations for AR Discovery, the vision for a future with AR Discovery, the issues it will raise, and its technology requirements. AR developer Mikel Salazar will present a novel user interface that can support a wide range of 3D user experiences found through discovery.
Title | Description | Speaker |
Role and scope of user context filtering in discovery of geospatial AR experiences | At some point in the evolution of Augmented Reality, there will be Web-scale volumes of content that are or can be represented as AR experiences. Discovery methodologies, infrastructure, and standards will be needed to provide AR users with the content they need when, where, and how they need it. This presents the challenge of incorporating personal, social, and location-based context into content search on the Web while addressing privacy concerns. Many of the issues it raises and the solutions it requires are also relevant to current, more general mobile geospatial (or hybrid geospatial/visual) content discovery scenarios, such as first responders operating in indoor environments. | Josh Lieberman, Tumbling Walls |
Applying Context for Continuous AR Discovery | This talk summarizes the status of the AR discovery task force and suggests how continuous, contextual, automatic discovery will address future challenges facing the publishers and users of Augmented Reality experiences. | Christine Perey, PEREY Research & Consulting |
A comprehensive Interaction Model for AR Systems | As reflected in the ARML 2.0 specification, the main focus of current AR browsers is to let end users access media content (i.e., digital assets such as text, images, 3D models, etc.) by associating it with a geospatial anchor or a fiducial marker. While this approach greatly simplifies the discovery of, and access to, new content relevant to users, the interaction mechanisms associated with these media resources are very limited, often restricted to basic web-style data presentation and simple scripts for menu navigation. Against this background, this talk will present an interaction model that aims to overcome these limitations by providing the mechanisms user interface designers need to create more advanced and interoperable user experiences for AR systems: an open and extensible solution that simplifies the creation of interaction spaces (by facilitating the definition of the physical entities present in the environment) and enables end users to discover content relevant to their actual needs and desires (by continuously monitoring their physiological and psychological context). The talk will introduce the elements of the proposed interaction model and how they relate to ARML 2.0. Attendees will be able to try several demos of the proposed interaction model and learn about novel use cases where it can be applied. | Mikel Salazar, DeustoTech |
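To make the idea of user context filtering for geospatially anchored AR content more concrete, the following is a minimal, purely illustrative sketch. It is not part of ARML 2.0 or of any system described in the talks above; all names (ARContent, UserContext, discover) and the simple distance-plus-interest filter are assumptions chosen for the example.

```python
# Hypothetical sketch: filtering geospatially anchored AR content by user context.
# Names and filtering logic are illustrative only, not from ARML 2.0 or the talks above.
from dataclasses import dataclass, field
from math import radians, sin, cos, asin, sqrt


@dataclass
class ARContent:
    title: str
    lat: float                 # geospatial anchor latitude (WGS 84)
    lon: float                 # geospatial anchor longitude (WGS 84)
    tags: set = field(default_factory=set)


@dataclass
class UserContext:
    lat: float
    lon: float
    interests: set
    radius_m: float = 500.0    # how far around the user to search


def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))


def discover(catalogue, ctx):
    """Return content anchored near the user that matches at least one interest."""
    nearby = [c for c in catalogue
              if distance_m(ctx.lat, ctx.lon, c.lat, c.lon) <= ctx.radius_m]
    return [c for c in nearby if c.tags & ctx.interests]


if __name__ == "__main__":
    catalogue = [
        ARContent("Museum tour", 46.948, 7.447, {"culture", "history"}),
        ARContent("Hydrant locations", 46.949, 7.448, {"first-responder"}),
    ]
    ctx = UserContext(lat=46.948, lon=7.447, interests={"first-responder"})
    print([c.title for c in discover(catalogue, ctx)])   # -> ['Hydrant locations']
```

A real discovery service would of course go further, for example by weighting results with social and physiological context and by keeping that context on the device to address the privacy concerns raised above.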