AS ISO/IEC 18039:2025

$242.19

Information technology – Computer graphics, image processing and environmental data representation – Mixed and augmented reality (MAR) reference model

AS ISO/IEC 18039:2025 identically adopts ISO/IEC 18039:2019, which defines the scope and key concepts of mixed and augmented reality (MAR), the relevant terms and their definitions, and a generalized system architecture that together serve as a reference model for MAR applications, components, systems, services and specifications.

Table of contents
Header
About this publication
Preface
Foreword
Introduction
1 Scope
2 Normative references
3 Terms, definitions and abbreviated terms
3.1 Terms and definitions
3.2 Abbreviated terms
4 Mixed and augmented reality (MAR) domain and concepts
4.1 General
4.2 MAR continuum
5 MAR reference model usage example
5.1 Designing an MAR application or service
5.2 Deriving an MAR business model
5.3 Extending existing or creating new standards for MAR
6 MAR reference system architecture
6.1 Overview
6.2 Viewpoints
6.3 Enterprise viewpoint
6.3.1 General
6.3.2 Classes of actors
6.3.2.1 Class 1: providers of authoring/publishing capabilities
6.3.2.2 Class 2: providers of MAR execution engine components
6.3.2.3 Class 3: service providers
6.3.2.4 Class 4: MAR end user
6.3.3 Business model of MAR systems
6.3.4 Criteria for successful MAR systems
6.4 Computational viewpoint
6.4.1 General
6.4.2 Sensors: pure sensor and real world capturer
6.4.3 Context analyser: recognizer and tracker
6.4.4 Spatial mapper
6.4.5 Event mapper
6.4.6 MAR execution engine
6.4.7 Renderer
6.4.8 Display and user interface
6.4.9 MAR system API
6.5 Information viewpoint
6.5.1 General
6.5.2 Sensors
6.5.3 Recognizer
6.5.4 Tracker
6.5.5 Spatial mapper
6.5.6 Event mapper
6.5.7 Execution engine
6.5.8 Renderer
6.5.9 Display and user interface
7 MAR component classification framework
8 MAR system classes
8.1 General
8.2 MAR Class V — Visual augmentation systems
8.2.1 Local recognition and tracking
8.2.2 Local registration, remote recognition and tracking
8.2.3 Remote recognition, local tracking and registration
8.2.4 Remote recognition, registration and composition
8.2.5 MAR Class V-R: visual augmentation with 3D environment reconstruction
8.3 MAR type 3DV: 3D video systems
8.3.1 Real-time, local depth estimation, condition-based augmentation
8.3.2 Real-time, local depth estimation, model-based augmentation
8.3.3 Real-time, remote depth estimation, condition-based augmentation
8.3.4 Real-time, remote depth estimation, model-based augmentation
8.3.5 Real-time, multiple remote user reconstructions, condition-based augmentation
8.4 MAR Class G: points of interest (POI) — GNSS-based systems
8.4.1 Content-embedded POIs
8.4.2 Server-available POIs
8.5 MAR type A: audio systems
8.5.1 Local audio recognition
8.5.2 Remote audio recognition
8.6 MAR type 3DA: 3D audio systems
8.6.1 Local audio spatialization
9 Conformance
10 Performance
11 Safety
12 Security
13 Privacy
14 Usability and accessibility
Annex A
A.1 Overview
A.2 MPEG ARAF
A.3 KML, ARML, KARML (see Figure A.2)
A.4 X3D
A.5 JPEG AR (see Figure A.3)
A.6 ARToolKit and OSGART
A.7 OpenCV and OpenVX
A.8 QR codes and bar codes
Annex B
B.1 Overview
B.2 Use case categories
B.2.1 Guide use case category
B.2.2 Publish use case category
B.2.3 Collaborate use case category
B.3 MagicBook (Class V, Guide)
B.3.1 What it does
B.3.2 How it works
B.3.3 Mapping to MAR-RM and its various viewpoints
B.4 Human Pac-Man (Type G, Collaborate) and ARQuake (Class V and G, Collaborate)
B.4.1 What it does
B.4.2 How it works
B.4.3 Mapping to MAR-RM and various viewpoints
B.5 Augmented haptics — Stiffness modulation (Class H, Guide)
B.5.1 What it does
B.5.2 How it works
B.5.3 Mapping to MAR-RM and various viewpoints
B.6 Hear-through augmented audio (Class A, Guide)
B.6.1 What it does
B.6.2 How it works
B.6.3 Mapping to MAR-RM and various viewpoints
B.7 CityViewAR on Google Glass (Class G, Guide)
B.7.1 What it does
B.7.2 How it works
B.7.3 Mapping to MAR-RM and various viewpoints
B.8 Diorama-projector-based spatial augmented reality (Class 3DV, Publish)
B.8.1 What it does
B.8.2 How it works
B.8.3 Mapping to MAR-RM and various viewpoints
B.9 Mobile AR with PTAM (Class 3DV, Guide)
B.9.1 What it does
B.9.2 How it works
B.9.3 Mapping to MAR-RM and various viewpoints
B.10 KinectFusion (Class 3DV, Guide)
B.10.1 What it does
B.10.2 How it works
B.10.3 Mapping to MAR-RM and various viewpoints
B.11 ARQuiz (Class X, Guide)
B.11.1 What it does
B.11.2 How it works
B.11.3 Mapping to MAR-RM and various viewpoints
B.12 Augmented Printed Material (Class X, Guide)
B.12.1 What it does
B.12.2 How it works
B.12.3 Mapping to MAR-RM and various viewpoints
B.13 Augmented reality for laparoscopic surgery (Class X, Guide)
B.13.1 What it does
B.13.2 How it works
B.13.3 Mapping to MAR-RM and various viewpoints
B.14 Coexistent reality (Class X, Collaborate)
B.14.1 What it does
B.14.2 How it works
B.14.3 Mapping to MAR-RM and various viewpoints
Bibliography

Cited references in this standard
Content history
DR AS ISO/IEC 18039:2025


Published: 06/06/2025

Pages: 62
