Augmented Reality/Tutorial

[Image: Virtual Fixtures, the first AR system, developed in 1992 for the U.S. Air Force at Wright-Patterson AFB]
[Image: NASA X-38 display showing video map overlays, including runways and obstacles, during a flight test in 2000]

Augmented reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics, or GPS data. It is related to a more general concept called computer-mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. Augmented reality enhances one's current perception of reality, whereas virtual reality replaces the real world with a simulated one.[1][2] Augmentation techniques are typically performed in real time and in semantic context with environmental elements, such as overlaying supplemental information like scores on a live video feed of a sporting event.

With the help of advanced AR technology (e.g. adding computer vision and object recognition), information about the user's surrounding real world becomes interactive and digitally manipulable. Information about the environment and its objects is overlaid on the real world. This information can be virtual[3][4][5][6][7][8] or real, e.g. other sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space.[9][10] Augmented reality brings components of the digital world into a person's perceived real world. One example is an AR helmet for construction workers that displays information about the construction site. The first functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the Virtual Fixtures system developed at the U.S. Air Force's Armstrong Labs in 1992.[11][12][13]


Technology

Hardware

Hardware components for augmented reality are a processor, a display, sensors, and input devices. Modern mobile computing devices such as smartphones and tablet computers contain these elements, often including a camera and MEMS sensors such as an accelerometer, GPS, and a solid-state compass, which makes them suitable AR platforms.[14]
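
A web-based AR application can read these sensors directly in the browser. The following minimal sketch, using only the standard Geolocation API and DeviceOrientationEvent, logs the GPS position and compass heading that a location-based AR engine needs in order to place overlays:

  // Sketch: reading the sensors a browser exposes for location-based AR.
  // Uses only standard Web APIs (Geolocation, DeviceOrientationEvent).
  navigator.geolocation.watchPosition(
    (pos) => {
      // GPS fix: latitude/longitude in degrees, accuracy in meters
      console.log('lat', pos.coords.latitude,
                  'lng', pos.coords.longitude,
                  'accuracy', pos.coords.accuracy, 'm');
    },
    (err) => console.error('geolocation error:', err.message),
    { enableHighAccuracy: true }
  );

  window.addEventListener('deviceorientation', (event) => {
    // alpha: compass heading (0-360 degrees); beta/gamma: device tilt
    console.log('heading', event.alpha, 'tilt', event.beta, event.gamma);
  });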

Software

Location-based Augmented Reality

MixARE (mix Augmented Reality Engine) was one of the first open-source (GPLv3) software packages to support augmented reality on mobile devices (Android and iPhone). Location-based augmented reality applications like MixARE use the camera image and the sensors of the smartphone to display icons in the camera image. The application can be used to display context-dependent information that refers to the user's current geolocation (2010). The development of web-based technologies and the improved performance of mobile devices allow a MixARE-like infrastructure[15][16] to be implemented as an HTML5 application in JavaScript. Key elements for web-based, location-based augmented reality are GeoAR and AR.js as open-source technologies.
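
As a sketch of what such a web-based, location-based AR application can look like, the following A-Frame/AR.js scene places a text label at a fixed geolocation. The gps-camera and gps-entity-place components follow the AR.js location-based documentation, but component names and build URLs differ between AR.js versions, and the coordinates are placeholders:

  <!-- Sketch: location-based AR in the browser with A-Frame + AR.js.
       Component names follow the AR.js location-based docs; versions differ. -->
  <script src="https://aframe.io/releases/1.3.0/aframe.min.js"></script>
  <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar-nft.js"></script>
  <a-scene vr-mode-ui="enabled: false" embedded arjs="sourceType: webcam; debugUIEnabled: false;">
    <!-- Placeholder coordinates: replace with a real point of interest -->
    <a-text value="Hello from this geolocation!"
            look-at="[gps-camera]" scale="25 25 25"
            gps-entity-place="latitude: 48.2082; longitude: 16.3738">
    </a-text>
    <a-camera gps-camera rotation-reader></a-camera>
  </a-scene>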

Marker-based and location-based AR with AR.js

AR.js implements augmented reality in JavaScript (see the YouTube example tutorial). Explore the open-source GitHub repository of AR.js to learn how to create your own augmented reality web application, or start with the WebAR Playground.
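
A minimal marker-based sketch, adapted from the well-known Hiro preset example in the AR.js documentation: pointing the camera at a printed Hiro marker renders a red box on top of it (the script URLs pin illustrative, version-dependent builds):

  <!-- Sketch: marker-based AR with A-Frame + AR.js (Hiro preset marker). -->
  <script src="https://aframe.io/releases/1.3.0/aframe.min.js"></script>
  <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  <a-scene embedded arjs="sourceType: webcam;">
    <!-- The box is rendered whenever the Hiro marker is visible -->
    <a-marker preset="hiro">
      <a-box position="0 0.5 0" material="color: red;"></a-box>
    </a-marker>
    <a-entity camera></a-entity>
  </a-scene>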

Learning Task

See A-Frame at Work

  • A-Frame examples from the open-source portal [1] (some examples can be viewed only in Chromium/Chrome; some work in Firefox). With A-Frame you create and test web-based virtual reality (WebVR) objects that can be used as 3D models in AR.js; see the minimal scene sketch below.
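
A minimal A-Frame scene sketch: a single HTML file that renders a few primitives, which you can open in a WebVR-capable browser to test objects before embedding them in an AR.js scene (the script URL pins an illustrative A-Frame version):

  <!-- Sketch: a minimal A-Frame WebVR scene with three primitives. -->
  <html>
    <head>
      <script src="https://aframe.io/releases/1.3.0/aframe.min.js"></script>
    </head>
    <body>
      <a-scene>
        <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
        <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
        <a-sky color="#ECECEC"></a-sky>
      </a-scene>
    </body>
  </html>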

See AR.js at Work

See TrackingJS at Work

The open-source JavaScript library tracking.js[17] supports augmented reality without GPS, accelerometer, or device-orientation data. It is mainly based on object detection in the camera image. Recognized objects can be replaced by 3D objects, videos, or images, and even specific hand movements can trigger augmented audio samples (e.g. an augmented bouncing ball that creates a sound every time it touches the real ground in the camera image).
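
A minimal sketch of the tracking.js API: a ColorTracker watches the webcam stream and reports bounding boxes of detected colored objects, which an AR application could cover with graphics or use to trigger sounds (the CDN URL is illustrative):

  <!-- Sketch: object detection with tracking.js (ColorTracker on a webcam). -->
  <script src="https://cdn.jsdelivr.net/npm/tracking@1.1.3/build/tracking-min.js"></script>
  <video id="video" width="400" height="300" preload autoplay loop muted></video>
  <script>
    // Track magenta/cyan/yellow objects in the live camera image
    var tracker = new tracking.ColorTracker(['magenta', 'cyan', 'yellow']);
    tracking.track('#video', tracker, { camera: true });
    tracker.on('track', function (event) {
      event.data.forEach(function (rect) {
        // rect.x/y/width/height: bounding box of one detected object;
        // an AR overlay or a sound trigger would be attached here
        console.log('detected', rect.color, 'at', rect.x, rect.y);
      });
    });
  </script>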

  • Analyse how users can interact with the implemented tracking.js features and how digital objects can be manipulated with hand movements.
  • Create a software design in which people who are unable to type can interact with a computer and with browser-based applications. Compare these options with the open-source Simon (KDE) or the PocketSphinx.js open-source library for web-based speech recognition, as alternatives to submitting the audio recording to TwoogleBook et al.[18] (a sketch of browser-based speech input follows this list).
  • People with disabilities might not be able to type. Analyse open-source speech recognition[19] for them in comparison to human-computer interaction[20] with movement detection[21].
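
As a starting point for the speech-input tasks, the following sketch uses the browser's Web Speech API (exposed as webkitSpeechRecognition in Chromium-based browsers). Note that, unlike PocketSphinx.js, which runs recognition locally, this API may send the audio to a remote service; the command mapping at the end is a hypothetical illustration:

  // Sketch: browser-based speech input via the Web Speech API.
  // Chromium's implementation may send audio to a remote service;
  // PocketSphinx.js is a local, open-source alternative.
  var SpeechRecognition = window.SpeechRecognition || window.webkitSpeechRecognition;
  var recognition = new SpeechRecognition();
  recognition.lang = 'en-US';
  recognition.continuous = true;       // keep listening for commands
  recognition.interimResults = false;  // only report final results

  recognition.onresult = function (event) {
    var transcript = event.results[event.results.length - 1][0].transcript.trim();
    console.log('heard:', transcript);
    // Hypothetical mapping of a spoken command to a browser action:
    if (transcript === 'scroll down') window.scrollBy(0, 200);
  };

  recognition.start();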

See Mixare at Work

Watch the Mixare demo video about the application of MixARE in Vienna, Austria, to understand how computer-generated information related to the current geolocation is placed in the real-life camera image of a smartphone. Compare the similarities and differences to AR.js.

[Image: Mixare with a circle in the camera image]

Create your own Mixare Database

[Image: Mixare - use your own URL for a Mixare database of geolocations]
[Image: Mixare - edit the source of your own URL for the augmented reality app]

The following part of the tutorial supports you in creating your own Mixare database.

  1. Select an area in your hometown, e.g. the zoo in your city.
  2. Attach digital information to real geolocations. To do this, search for
    • Wikipedia information, e.g. articles about the animals you can observe in your zoo, or
    • YouTube videos (or existing videos used on Wikiversity) that show the animals in their natural environment. These media or learning resources are attached to a real geolocation.
  3. Select the geolocations for the digital learning resources, e.g. the places in your zoo where you can watch the real animals, and augment the real experience with your selected digital learning resources. Create the Mixare database with the Mixare4JSON editor; this becomes your own database (a sketch of a record in this format follows this list). The created Mixare database can then be stored on a web server of your choice. This can be done in two ways:
  4. Storage web server: select a storage location on a web server for your Mixare JSON database created with the Mixare4JSON editor.
    • Ask the administrator of the zoo's web server if you want to share your work on the zoo's web page, and place a link to Mixare on the website as well. Other zoo visitors can then use your contribution to the Mixare database, and maybe the zoo's administration will support your work in the future if they like your augmented reality approach.
    • If you cannot use an existing web server for the Mixare database you created with the Mixare4JSON editor, create a GitHub account and a GitHub repository, e.g. MixareDB4MyZoo. Create a subdirectory docs/ in your GitHub repository with an index.html that contains a download link for your JSON Mixare database. Learn to populate your GitHub repository.
    • Press the menu button on your Android phone to select your own data source for Mixare, and select your JSON file as the data source.
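
A sketch of what such a Mixare database can look like: the field names follow the JSON data-source format documented by the Mixare project, and all values here (coordinates, elevation, title, URL) are placeholders to be replaced with your own points of interest.

  {
    "status": "OK",
    "num_results": 1,
    "results": [
      {
        "id": "1",
        "lat": "48.2100",
        "lng": "16.3600",
        "elevation": "170",
        "title": "Elephant enclosure",
        "has_detail_page": "1",
        "webpage": "https://en.wikipedia.org/wiki/Asian_elephant"
      }
    ]
  }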

References

  1. Steuer, Jonathan. Defining Virtual Reality: Dimensions Determining Telepresence. Department of Communication, Stanford University, 15 October 1993. Archived 24 May 2016: https://web.archive.org/web/20160524233446/http://ww.cybertherapy.info/pages/telepresence.pdf
  2. Introducing Virtual Environments National Center for Supercomputing Applications, University of Illinois.
  3. Chen, Brian X. If You’re Not Seeing Data, You’re Not Seeing, Wired, 25 August 2009.
  4. Maxwell, Kerry. Augmented Reality, Macmillan Dictionary Buzzword.
  5. Augmented reality-Everything about AR, Augmented Reality On.
  6. Azuma, Ronald. A Survey of Augmented Reality Presence: Teleoperators and Virtual Environments, pp. 355–385, August 1997.
  7. Chatzopoulos, D., Bermejo, C., Huang, Z., and Hui, P. Mobile Augmented Reality Survey: From Where We Are to Where We Go.
  8. Huang, Z., Hui, P., et al. Mobile Augmented Reality Survey: A Bottom-up Approach.
  9. Phenomenal Augmented Reality, IEEE Consumer Electronics, Volume 4, No. 4, October 2015, cover + pp. 92–97.
  10. Time-Frequency Perspectives, with Applications, in Advances in Machine Vision, Strategies and Applications, World Scientific Series in Computer Science: Volume 32, C. Archibald and Emil Petriu, cover + pp. 99–128, 1992.
  11. Rosenberg, L. B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments". Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB, OH, 1992.
  12. Rosenberg, L. B. (1993). "Virtual Fixtures: Perceptual Overlays for Telerobotic Manipulation". In Proc. of the IEEE Annual Int. Symposium on Virtual Reality (1993): pp. 76–82.
  13. Dupzyk, Kevin (2016). "I Saw the Future Through Microsoft's Hololens".
  14. Metz, Rachel. Augmented Reality Is Finally Getting Real Technology Review, 2 August 2012.
  15. Samani, Nitin. mixare – A New Augmented Reality Engine For Android, Augmented Planet March 19, 2010
  16. "mixare – Open Source Augmented Reality Engine". mixare.org
  17. TrackingJS - OpenSource library for object detection in images and video streams (2017) - https://trackingjs.com/
  18. TwoogleBook - an artificial word standing in for commercial data-harvesting companies, used to avoid naming real companies.
  19. Peter Grasch - KDE Simon Speech - Prototype Simon Dictation (2015) - YouTube: https://www.youtube.com/watch?v=uItCqkpMU_k&t=4s
  20. Sinha, G., Shahi, R., & Shankar, M. (2010, November). Human computer interaction. In Emerging Trends in Engineering and Technology (ICETET), 2010 3rd International Conference on (pp. 1-4). IEEE.
  21. Jaimes, A., & Sebe, N. (2007). Multimodal human–computer interaction: A survey. Computer vision and image understanding, 108(1), 116-134.