Augmented reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics or GPS data. It is related to a more general concept called computer-mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. Augmented reality enhances one's current perception of reality, whereas virtual reality replaces the real world with a simulated one. Augmentation techniques are typically performed in real time and in semantic context with environmental elements, such as overlaying supplemental information like scores over a live video feed of a sporting event.
With the help of advanced AR technology (e.g. adding computer vision and object recognition), information about the user's surrounding real world becomes interactive and digitally manipulable. Information about the environment and its objects is overlaid on the real world. This information can be virtual or real, e.g. other real sensed or measured information, such as electromagnetic radio waves, overlaid in exact alignment with where it actually is in space. Augmented reality brings components of the digital world into a person's perceived real world. One example is an AR helmet for construction workers that displays information about the construction site. The first functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the Virtual Fixtures system developed at the U.S. Air Force's Armstrong Labs in 1992.
Technology
Hardware
The hardware components for augmented reality are: processor, display, sensors and input devices. Modern mobile computing devices like smartphones and tablet computers contain these elements, often including a camera and MEMS sensors such as an accelerometer, GPS, and a solid-state compass, making them suitable AR platforms.
Software
Location-based Augmented Reality
Marker-based and location-based AR with AR.js
Learning Task
See A-Frame at Work
- A-Frame examples from the open-source portal (some examples can be viewed only with Chromium/Chrome; some also work with Firefox). With A-Frame you can create and test web-based Virtual Reality (WebVR) objects that can be used as 3D models in AR.js.
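As an illustration, a minimal A-Frame scene is just an HTML page. This is a sketch; the script URL and release number are assumptions, so check aframe.io for the current version:

```html
<!-- Minimal A-Frame scene: the box, sphere and plane are rendered
     as web-based VR objects directly in the browser. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D2"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```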
See AR.js at Work
- YouTube video of basic features by Jerome Etienne (2017-02-23)
- AR.js with image tracking and location-based AR
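A marker-based AR.js scene can be sketched in a few lines of HTML on top of A-Frame (the script URLs are assumptions; check the AR.js documentation for current builds). Pointing the camera at a printed "hiro" marker overlays the 3D box on the live camera image:

```html
<!-- Marker-based AR: a yellow box is overlaid wherever the camera
     sees the "hiro" marker. -->
<script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
<script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
<a-scene embedded arjs>
  <a-marker preset="hiro">
    <a-box position="0 0.5 0" material="color: yellow;"></a-box>
  </a-marker>
  <a-entity camera></a-entity>
</a-scene>
```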
See TrackingJS at Work
- Analyse how users can interact with the implemented tracking.js feature and how digital objects can be manipulated through hand movement.
- Create a software design in which people with disabilities who are unable to type can interact with a computer and browser-based applications. Compare these options with the open-source KDE Simon or the PocketSphinx.js open-source library for web-based speech recognition, as alternatives to submitting audio recordings to commercial providers ("TwoogleBook" et al.).
- People with disabilities might not be able to type. Analyse open-source speech recognition for them in comparison with human-computer interaction based on movement detection.
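As a starting point for these tasks, a tracking.js colour tracker can be attached to a camera stream roughly as follows (a sketch based on the examples on trackingjs.com; the rectangle coordinates reported in the `track` event are what a hand-interaction design would build on):

```html
<!-- Sketch: track coloured objects (e.g. a coloured glove) in the webcam stream. -->
<script src="tracking-min.js"></script>
<video id="video" width="400" height="300" preload autoplay loop muted></video>
<script>
  var tracker = new tracking.ColorTracker(['magenta', 'cyan', 'yellow']);
  tracker.on('track', function (event) {
    // Each rect is a detected coloured region; its position over time
    // can be mapped to gestures that manipulate digital objects.
    event.data.forEach(function (rect) {
      console.log(rect.x, rect.y, rect.width, rect.height, rect.color);
    });
  });
  tracking.track('#video', tracker, { camera: true });
</script>
```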
See Mixare at Work
Watch the Mixare demo video about the application of Mixare in Vienna, Austria, to understand how computer-generated information related to geolocation is placed in the live camera image of a smartphone. Compare the similarities and differences to AR.js.
Create your own Mixare Database
The following part of the tutorial supports you in creating your own Mixare database.
- Select an area in your hometown e.g. the zoo in your city.
- Attach digital information to real geolocations. To do this you need to search for
- Wikipedia information, e.g. articles about the animals you can observe in your zoo, or
- YouTube videos (or existing videos used in Wikiversity) that show the animals in their natural environment. These media or learning resources are attached to a real geolocation.
- Select the geolocation for the digital learning resources, e.g. the spot in your zoo where you can watch the real animals, and augment the real experience with your selected digital learning resources. Create the Mixare database with the Mixare4JSON editor. This Mixare database can be used as your own data source. The created Mixare database can then be stored on a web server of your choice. This can be done in two ways:
- Storage web server: select a storage location on a web server for your Mixare JSON database created with the Mixare4JSON editor.
- Ask the administrator of the zoo's web server if you can share your work on the zoo's web page, and place a link to Mixare on the website as well. Other zoo visitors can then use your contribution to the Mixare database, and the zoo's administration might support your work in the future if they like your augmented reality approach.
- If you cannot use an existing web server for the Mixare database you created with the Mixare4JSON editor, create a GitHub account and a GitHub repository, e.g. MixareDB4MyZoo. Create a subdirectory docs/ in your GitHub repository with an index.html that contains a download link for your JSON Mixare database. Learn to populate your GitHub repository.
- Press the menu button on your Android phone to select your own data source for Mixare. Select the JSON file as the data source.
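The steps above can be sketched in code. The following JavaScript builds a mixare-style JSON data source (the field names follow the data-source format documented on mixare.org; the coordinates, title and URL are hypothetical examples):

```javascript
// Sketch: build a mixare-style JSON data source. Field names follow the
// format documented on mixare.org; all coordinates, titles and URLs
// below are hypothetical examples.
function buildMixareDatabase(pois) {
  return {
    status: "OK",
    num_results: pois.length,
    results: pois.map(function (poi, i) {
      return {
        id: String(i + 1),
        lat: String(poi.lat),               // mixare expects coordinates as strings
        lng: String(poi.lng),
        elevation: String(poi.elevation || 0),
        title: poi.title,
        has_detail_page: poi.webpage ? "1" : "0",
        webpage: poi.webpage || ""
      };
    })
  };
}

// Example: one point of interest at a hypothetical zoo enclosure.
var db = buildMixareDatabase([
  {
    lat: 48.2081,
    lng: 16.3713,
    title: "Elephant enclosure",
    webpage: "https://en.wikipedia.org/wiki/Asian_elephant"
  }
]);
console.log(JSON.stringify(db, null, 2));
```

The resulting JSON can be saved to a file and uploaded to the web server or the GitHub docs/ directory described above.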
See also
- Open Educational Resources
- User-driven spatial activities
- Real World Lab
- 3D Modelling with AR.js and Markers
External Links
References
- ↑ Steuer, Jonathan. Defining Virtual Reality: Dimensions Determining Telepresence https://web.archive.org/web/20160524233446/http://ww.cybertherapy.info/pages/telepresence.pdf 24 May 2016, Department of Communication, Stanford University. 15 October 1993.
- ↑ Introducing Virtual Environments National Center for Supercomputing Applications, University of Illinois.
- ↑ Chen, Brian X. If You’re Not Seeing Data, You’re Not Seeing, Wired, 25 August 2009.
- ↑ Maxwell, Kerry. Augmented Reality, Macmillan Dictionary Buzzword.
- ↑ Augmented reality-Everything about AR, Augmented Reality On.
- ↑ Azuma, Ronald. A Survey of Augmented Reality, Presence: Teleoperators and Virtual Environments, pp. 355–385, August 1997.
- ↑ Chatzopoulos, D., Bermejo, C., Huang, Z., and Hui, P. Mobile Augmented Reality Survey: From Where We Are to Where We Go.
- ↑ Huang, Z., Hui, P., et al. Mobile Augmented Reality Survey: A Bottom-up Approach.
- ↑ Phenomenal Augmented Reality, IEEE Consumer Electronics, Volume 4, No. 4, October 2015, cover+pp92-97
- ↑ Time-frequency perspectives, with applications, in Advances in Machine Vision, Strategies and Applications, World Scientific Series in Computer Science: Volume 32, C Archibald and Emil Petriu, Cover + pp99-128, 1992.
- ↑ Rosenberg, L.B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments". Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.
- ↑ Rosenberg, L.B. (1993). "Virtual Fixtures: Perceptual Overlays for Telerobotic Manipulation". In Proc. of the IEEE Annual Int. Symposium on Virtual Reality (1993): pp. 76–82.
- ↑ Dupzyk, Kevin (2016). "I Saw the Future Through Microsoft's Hololens".
- ↑ Metz, Rachel. Augmented Reality Is Finally Getting Real Technology Review, 2 August 2012.
- ↑ Samani, Nitin. mixare – A New Augmented Reality Engine For Android, Augmented Planet, March 19, 2010.
- ↑ "mixare – Open Source Augmented Reality Engine". mixare.org
- ↑ TrackingJS - OpenSource library for object detection in images and video streams (2017) - https://trackingjs.com/
- ↑ TwoogleBook – an artificial word for commercial data-harvesting companies, used to avoid naming real companies.
- ↑ Peter Grasch - KDE Simon Speech - Prototype Simon Dictation (2015) - YouTube: https://www.youtube.com/watch?v=uItCqkpMU_k&t=4s
- ↑ Sinha, G., Shahi, R., & Shankar, M. (2010, November). Human computer interaction. In Emerging Trends in Engineering and Technology (ICETET), 2010 3rd International Conference on (pp. 1-4). IEEE.
- ↑ Jaimes, A., & Sebe, N. (2007). Multimodal human–computer interaction: A survey. Computer vision and image understanding, 108(1), 116-134.