Learn ‘o’ Sys – VR/AR App for Learning Human Body - For School Children Week - 2 Blog


Welcome back, readers. Having understood the significance and impact of the problem statement, we are able to make the application better than originally planned by enabling Augmented Reality (AR) as well. Yes, the app is going to use both AR and VR techniques. Together, these approaches are expected to give the users (children) a fantastic learning experience and a lively, interactive feel. While AR gives the users an experience connected to the real world in real time, VR is expected to give them a completely immersive, in-depth experience.

 

Before diving deeper into the architecture, plan, and implementation, we feel it is necessary to go over certain definitions to enable better and more meaningful learning.

 

  •   “Virtual Reality (VR) refers to a high-end user interface that involves real-time simulation and interaction through multiple sensory channels. It is immersive and largely disconnected from the real world.”

 

Whereas, 

 

  •   “Augmented Reality (AR) is a highly sought-after field of research that deals with combining real-world and computer-generated data. It keeps you connected with what happens in the real world; you are never disconnected.”

 

A simple diagrammatic representation is presented below to make the distinction clear.

 

Fig. 1 The car and the background are both imaginary

 

Fig. 2 The yellow car is a digital inclusion; the rest is real.

 

One can understand from figs. 1 and 2 that Virtual Reality produces a digitally created illusion, whereas Augmented Reality includes that illusion (which could be interactive) alongside content that exists in the real world. In this project, we have decided to present the users with both of these experiences by building both AR and VR applications for the problem statement in hand.

 

Fine, a quick recap of the problem statement will help you proceed further with ease – we need to build an interactive AR/VR application for understanding human biology easily.

 

Coming to the implementation, we shall first handle the fundamentals of AR application development.

 

The aim is to build an Augmented Reality application that has a sensor and a smartphone as its primary components. The smartphone handles the virtual computing part, while the sensor captures the hand gestures and tasks in real time. This setup lets users interact with the real-world environment, which is precisely what Augmented Reality is about. The application development is targeted at the Android platform, for which an AR API should get the job done. We may have to use edge detection algorithms in addition to the core AR operations.
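Before any SDK enters the picture, the rough shape of the AR side can be sketched as plain Java interfaces. Every name below (GestureSensor, ArRenderer, LearnOSysArApp) is hypothetical and simply mirrors the components just described; it is an outline, not an implementation.

```java
// Hypothetical component outline for the AR side of Learn 'o' Sys.
// None of these names come from an SDK; they only mirror the architecture above.

interface GestureSensor {
    // Start streaming hand-gesture events from the physical sensor.
    void start(GestureHandler handler);
}

interface GestureHandler {
    // Called in real time whenever the sensor recognises a gesture.
    void onGesture(String gestureName);
}

interface ArRenderer {
    // Overlay an anatomy model on the live camera view of the smartphone.
    void renderOverlay(String organModel);
}

public class LearnOSysArApp implements GestureHandler {
    private final GestureSensor sensor;
    private final ArRenderer renderer;

    public LearnOSysArApp(GestureSensor sensor, ArRenderer renderer) {
        this.sensor = sensor;
        this.renderer = renderer;
    }

    public void start() {
        sensor.start(this); // gestures drive what the AR view shows
    }

    @Override
    public void onGesture(String gestureName) {
        // Example policy: a swipe switches the organ being displayed
        if ("SWIPE".equals(gestureName)) {
            renderer.renderOverlay("heart");
        }
    }
}
```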

 

But does edge detection add overhead to the process? No. Edge detection helps in identifying sharp changes in image brightness and captures even minor changes in the properties of the real-world scene. This smooths the pipeline and significantly improves the quality of the results.
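As a rough illustration of how lightweight this step is, here is a short sketch using OpenCV's Java bindings (org.opencv), which we assume can sit alongside the AR SDK; the file names are hypothetical and the Canny thresholds are just example values.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

public class EdgeDetectionDemo {
    static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); } // load the native OpenCV library

    public static Mat detectEdges(Mat frame) {
        Mat gray = new Mat();
        Mat edges = new Mat();
        // Convert the camera frame to grayscale before edge detection
        Imgproc.cvtColor(frame, gray, Imgproc.COLOR_BGR2GRAY);
        // Light blur to suppress sensor noise, then Canny edge detection
        Imgproc.GaussianBlur(gray, gray, new Size(5, 5), 0);
        Imgproc.Canny(gray, edges, 50, 150); // low/high hysteresis thresholds (example values)
        return edges;
    }

    public static void main(String[] args) {
        Mat frame = Imgcodecs.imread("sample_frame.png"); // hypothetical test image
        Mat edges = detectEdges(frame);
        Imgcodecs.imwrite("edges.png", edges);
    }
}
```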

 

As one may be aware, many AR APIs are already available in the market, including Vuforia, Metaio, in2ar, etc. We have chosen Vuforia for our work due to its simplicity of use, without compromising on features.
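To give a feel for what this looks like in code, here is a minimal initialization sketch assuming the com.vuforia Java bindings as used in the classic Vuforia Android samples. The license-key placeholder and exact constants such as INIT_FLAGS.GL_20 are assumptions and may differ between Engine versions, so treat this as an outline rather than production code.

```java
import android.app.Activity;

import com.vuforia.CameraDevice;
import com.vuforia.INIT_FLAGS;
import com.vuforia.ObjectTracker;
import com.vuforia.Tracker;
import com.vuforia.TrackerManager;
import com.vuforia.Vuforia;

public class ArSession {

    // Placeholder only; a real key comes from the Vuforia developer portal
    private static final String LICENSE_KEY = "YOUR_VUFORIA_LICENSE_KEY";

    public void start(Activity activity) {
        // Initialise the engine (the official samples run this loop in a background task)
        Vuforia.setInitParameters(activity, INIT_FLAGS.GL_20, LICENSE_KEY);
        int progress;
        do {
            progress = Vuforia.init();
        } while (progress >= 0 && progress < 100);

        // Create and start an image/object tracker
        TrackerManager trackerManager = TrackerManager.getInstance();
        Tracker tracker = trackerManager.initTracker(ObjectTracker.getTrackerClassType());
        if (tracker != null) {
            tracker.start();
        }

        // Start the device camera so frames flow into the tracker
        // (the samples also select a video mode and load a target dataset here)
        CameraDevice.getInstance().init(CameraDevice.CAMERA_DIRECTION.CAMERA_DIRECTION_DEFAULT);
        CameraDevice.getInstance().start();
    }
}
```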

 

Alright, here we need a sensor to capture the gestures, and yes, the obvious option was the Leap Motion sensor. It can be replaced by any sensor that can capture gestures.
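For a first feel of gesture capture, here is a minimal sketch assuming the Leap Motion V2 Java SDK (com.leapmotion.leap). The GestureCapture class is our own illustration; the actual gesture handling for the app is covered in next week's post.

```java
import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Gesture;
import com.leapmotion.leap.Listener;
import com.leapmotion.leap.SwipeGesture;

// Illustrative listener that prints swipe gestures captured by the Leap Motion sensor
public class GestureCapture extends Listener {

    @Override
    public void onConnect(Controller controller) {
        // Gestures must be enabled explicitly before the controller reports them
        controller.enableGesture(Gesture.Type.TYPE_SWIPE);
        controller.enableGesture(Gesture.Type.TYPE_CIRCLE);
    }

    @Override
    public void onFrame(Controller controller) {
        Frame frame = controller.frame();
        for (Gesture gesture : frame.gestures()) {
            if (gesture.type() == Gesture.Type.TYPE_SWIPE) {
                SwipeGesture swipe = new SwipeGesture(gesture);
                System.out.println("Swipe " + swipe.direction() + " at " + swipe.speed() + " mm/s");
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Controller controller = new Controller();
        controller.addListener(new GestureCapture());
        Thread.sleep(30_000); // keep the process alive while frames arrive
    }
}
```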

 

What does the literature say?

 

It is good to start with a bit of a literature study, isn't it? We have quickly gone through the literature to understand the state of the art, and it is really impressive. Many good articles have been written; a few of them that are very relevant to the application being developed are presented below.

 

[1] The article authored by Guna, J. et al. discusses the performance of the Leap Motion controller in detail and was very helpful in understanding its suitability for the kind of application being built.

 

[2] The article by Hull, J.J. et al. is very interesting and commendable. It touched on a new approach to augmented reality: attaching electronic data to paper without adding to or changing the appearance of the paper. This was a stunning attempt for its time, and the results are really encouraging. The algorithm for text patch recognition was nothing less than revolutionary.

[3] This article is truly a fabulous one. The authors thought about AR remarkably early and captured all its aspects in a neat and precise manner. The paper discusses the defining properties of Augmented Reality, such as blending the real and the virtual in a real environment, real-time interaction, and accurate alignment of virtual objects with real ones.

[4] Here, the markerless approach was discussed, and it was, one should say, a paradigm shift.

 

So, by now, readers would have clearly understood that the AR application will include a smartphone interfaced with the Leap Motion sensor to get the application up and running. In next week's blog, we shall share some interesting technical inputs about the Leap and its software toward building the application. Details about the gestures will also be presented for better understanding.

 

Detailed descriptions and discussions about VR will be presented once the AR application development is done.

 

Stay tuned folks. 

 

Video link:

 

References: 

 

[1] Guna, J., Jakus, G., Pogačnik, M., Tomažič, S. and Sodnik, J., 2014. An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors, 14(2), pp. 3702-3720.

[2] Hull, J.J., Erol, B., Graham, J., Ke, Q., Kishi, H., Moraleda, J. and Van Olst, D.G., 2007. Paper-Based Augmented Reality. In 17th International Conference on Artificial Reality and Telexistence (ICAT 2007), pp. 205-209. IEEE.

[3] Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S. and MacIntyre, B., 2001. Recent advances in augmented reality. IEEE Computer Graphics and Applications, 21(6), pp. 34-47.

[4] Avola, D., Cinque, L., Levialdi, S., Petracca, A., Placidi, G. and Spezialetti, M., 2014. Markerless Hand Gesture Interface Based on LEAP Motion Controller. In DMS (pp. 260-266).