
Augmented Reality on Mobile Devices with an App Tutorial for Android! (UPDATED)

What is augmented reality?

Augmented reality has become quite a popular term in the past few years thanks to Google Glass, but the idea is older than the first Android phone.

Do you remember the Terminator movie? The protagonist's vision mapped the surrounding area and displayed additional information about objects and people.

And that, in essence, was AR.




To show you what AR really is, let me refer to the definition of an augmented reality guru - Ronald Azuma. In his 1997 survey he established the three main characteristics that define augmented reality:

  1. It combines the real and virtual worlds,
  2. It's interactive in real time,
  3. It's registered in 3D, i.e. virtual objects are aligned with real ones.

Basically, augmented reality adds digital components to the real world, often by using the camera on a smartphone to enhance the experience.

Virtual reality vs augmented reality

It's important not to confuse augmented reality with virtual reality, because the two technologies are simply not the same.

The crucial idea of AR is to overlay digitised information on the real (offline) world, e.g. by displaying info about the building you are looking at. VR, by contrast, immerses you in a fully computer-generated environment, usually experienced through a headset. Sounds great, doesn't it?

In the past this seemed like science fiction... and actually it was!

Back then you could only experience AR on special devices like Virtual Fixtures. They were heavy, uncomfortable, and had very limited features. Nowadays, dedicated AR hardware is being displaced by smartphones, which make the augmented reality experience widely accessible. Let's dive deeper and see what types of AR there are, along with some hot use cases.

How does augmented reality work on mobile?

There are basically two types of AR that can be used in mobile apps. They differ in the sensors and technologies they use, but the basic principle is the same: both display virtual 3D objects on top of a camera view.

Marker-based Augmented Reality

Marker-based AR is, in my opinion, the most powerful technique. It uses image-recognition algorithms to determine the position and rotation of markers, and then displays, for example, a 3D object in that exact place.

You may ask: what can serve as a marker? Well, in the beginning image recognition wasn't well developed and the marker was just a QR code. But nowadays there are tools that can recognize almost anything - from text to human faces. Let's see some cool examples of marker-based AR solutions.

Face filters

[Image: face filters in augmented reality. Source: TNW]

Face filters are one of the most popular uses of AR. Their success story began in 2015, when Snapchat introduced them in its app. People went crazy for them and started using them heavily.

But how is it actually possible to show, say, a dog's ears and tongue on a human head? The answer is face-recognition algorithms and some 3D magic. Recognizing human faces is not a trivial feature, but nowadays there are tools that let developers create their own face filters, for example Firebase ML Kit.

With Firebase ML Kit it is possible to detect the positions of the eyes, mouth, nose, and face contours. This data can then be used to place a 3D mesh with the proper graphics over the camera image. Wouldn't it be great to have your own filters in your app?

Inkhunter

[Image: the Inkhunter tattoo app. Source: Bless This Stuff]

Have you ever wanted a tattoo, but weren't sure whether a particular design would look good on you?

Augmented Reality can help you with this. Inkhunter is another example of marker-based AR: it places a virtual tattoo on your body. The first step is to draw a smile on your hand, or wherever you want the tattoo - the smile serves as the marker in this app. Then select a piece of art and point your smartphone's camera at the smile you've just drawn.

Markerless Augmented Reality

The previous type of AR used image recognition to place 3D objects in the real world: the image from the camera is processed to extract the position and orientation of the marker.

Markerless AR is less complicated in terms of the algorithms used, but more complicated when it comes to hardware sensors. It uses sensors to determine the position and orientation of the device. Which sensors, you may ask? Three of them are used in this type of AR:

  • Accelerometer - measures the acceleration applied to the device,
  • Gyroscope - measures the angular speed around all axes in 3D space,
  • Magnetometer - measures the ambient magnetic field in all axes.

Thanks to them, it's possible to calculate the device's exact rotation around each axis in 3D space. In many cases it's also necessary to use GPS data to place 3D objects at a distance. Let's see some examples of such apps.

IKEA Place

[Image: the IKEA Place app. Source: Engadget]

IKEA Place is a very useful app for anyone who wants to buy new furniture or arrange their home from scratch. The main feature of the app is the ability to see whether a given product fits your interior. IKEA claims that its products are displayed in the app with 98% size accuracy. Thanks to ARCore and Sceneform, the experience is smooth and products are rendered in really good detail.

Google Maps AR

[Image: Google Maps AR navigation. Source: The Verge]

We all know Google Maps, and it's great, but I bet many of you have been in a situation where you started navigation in the middle of nowhere, it said "Head north", you picked a direction and, if you weren't lucky, you had to walk back because you'd chosen wrong.

In 2018, Google presented a new concept for a Google Maps feature. It uses map and Street View data along with the device's sensors to show routes to the user. What's more, it can display additional info about nearby places and guide you to them.

Pokemon GO AR+ technology

At the beginning of July 2016, Pokemon GO was released. Fans all over the world started to catch 'em all, raising serious concerns about the safety of augmented reality users. Pokemon GO is an example of a location-based AR app: besides the device's rotation sensors, it also uses GPS data to display Pokemon in the right positions. The hype around Pokemon GO is history, but the technology is still hot.

Tips on how to create an app like Pokemon Go

Backend with Database

Every mobile application needs a server that manages users and the data flow. In our case, we need somewhere to store information about our Pokemon objects.

To begin with, the following fields should be enough:

  • Latitude and longitude (these can be provided by a pseudorandom generator),
  • Name,
  • Combat Power (CP),
  • Hit Points (HP),
  • A 3D model of the animated Pokemon.
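
As a sketch, such a record could be modelled as a plain class; all names here are illustrative, not taken from any real Pokemon GO codebase:

```java
// Minimal data model for a spawned Pokemon; all field names are illustrative.
public class PokemonEntry {
    public double latitude;    // spawn point, e.g. from a pseudorandom generator
    public double longitude;
    public String name;
    public int combatPower;    // CP
    public int hitPoints;      // HP
    public String modelUrl;    // reference to the animated 3D model asset

    public PokemonEntry() { }  // no-arg constructor, handy for NoSQL deserialisation
}
```

A class with public fields and a no-arg constructor maps cleanly onto document-style NoSQL records, which is why it's sketched this way.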

When you're developing an MVP and want a prototype quickly, it's better to choose a cloud solution than to build your own backend. I would recommend giving Firebase a try. It's a really powerful NoSQL database that can be easily integrated with the Google Cloud Platform and, for example, push notifications. Moreover, Firebase has its own Android SDK and it's really easy to get started with.

To populate your database, just use the Pokemon API - a database with detailed information that can be used for future improvements such as evolution. Maybe you fancy spawning Water Pokemon near lakes and rivers?

AR Algorithm

You'll need an algorithm for displaying augmented information on top of the camera view.

You can kick off with the one presented in the tutorial below, but it requires some fine-tuning:

  1. Add the ability to work with sets of points instead of a single predefined spot. Probably the easiest approach is to detect the closest Pokemon and bring it up on the display: grab the object closest to you from the set, for example by using Android's Location.distanceTo() method.
  2. It will probably be necessary to implement a simple low-pass filter to stabilise the Pokemon on the screen. A very nice example is provided here; the formula is written in JavaScript, but it's quite language-agnostic.
  3. Additionally, we don't want to show the Pokemon only in the centre of the screen. That's why you need to map the azimuth angle to screen pixels so the Pokemon moves smoothly across the display.
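
For point 2, a simple exponential low-pass filter can be sketched like this. The ALPHA value is an assumption to be tuned per device, and the wrap-around handling keeps the azimuth from swinging through 180° when it crosses 0°/360°:

```java
// Exponential low-pass filter for a noisy azimuth signal (in degrees).
// ALPHA close to 0 = heavy smoothing, close to 1 = almost no smoothing.
public class AzimuthFilter {
    static final float ALPHA = 0.25f; // illustrative value, tune per device

    static float lowPass(float input, float output) {
        float delta = input - output;
        // take the shortest way around the circle (359° -> 1° is a 2° step)
        if (delta > 180f) delta -= 360f;
        if (delta < -180f) delta += 360f;
        float result = output + ALPHA * delta;
        // normalise back to [0, 360)
        return (result % 360f + 360f) % 360f;
    }
}
```

You would feed every raw sensor reading through lowPass() and render using the filtered value only.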

Map

One of the key features of Pokemon GO is that you actually need to walk around to catch Pokemon. To pull this off, we can use Google Maps with a first-person view, which is really simple to set up in your app. Thanks to markers, you can place nearby Pokemon or Pokestops on the map. You can also set a custom location icon with your avatar.

UI

The most interesting part of the UI/UX in Pokemon Go is the Pokeball throwing animation.

As you might expect, there are as many ways to implement this as there are developers, and it all depends on how accurate and sophisticated you want the result to be.

As a starting point, you might want to check out ObjectAnimator from the Android SDK. It allows you to animate a bitmap of a Pokeball. After a user interaction that hits the Pokemon, just play the pre-programmed catching animation.

Remember! This Is Just the Beginning!

Challenges with Augmented Reality on Mobile

So, we've gone through the basics of augmented reality. You now know the types of augmented reality and have seen some examples of how they work.

But is this technology mature and devoid of flaws? It has a few shortcomings, though not many:

  1. Accuracy - mobile device sensors are prone to bias, which means the accuracy of displayed objects drifts over time. In newer devices the sensors are much more accurate, but keep in mind that they are not perfect.
  2. Battery - display on, camera on, orientation sensors, image recognition, GPS: these are among the most energy-consuming components of any mobile device. Playing with augmented reality for even a few minutes can significantly drain your battery.
  3. Accessibility - in 2018 Google released ARCore, a framework for creating AR apps. Although it's well optimised and really cool, it's supported on a limited number of devices. The list is growing, but it currently contains only around 100 devices. Even so, ARCore is currently the easiest way to create augmented reality apps.

App tutorial: the introduction to Augmented Reality theory

Have you ever wondered what the implementation of a location-based augmented reality app looks like in code? You'll get a chance to see how to build a simple app that displays objects in the real world. But let's start with a bit of theory.

*The geodesy theory in this section is based on the book: A. Jagielski, Geodezja I, GEODPIS, 2005.

Wikipedia explains the azimuth as:

“The azimuth is the angle formed between a reference direction (North) and a line from the observer to a point of interest projected on the same plane as the reference direction orthogonal to the zenith”


What we will do is attempt to recognise the destination point by comparing the azimuth calculated from the basic properties of a right-angle triangle with the actual azimuth the device is pointing at.

Let’s list what we need to achieve this:

  • Get the GPS location of the device,
  • Get the GPS location of the destination point,
  • Calculate the theoretical azimuth based on GPS data,
  • Get the real azimuth of the device,
  • Compare both azimuths based on accuracy and call an event.

Now the question is how to calculate the azimuth. It turns out to be quite simple, because we will ignore the Earth's curvature and treat it as a flat surface:


As you can see, we have a simple right-angle triangle, and we can calculate the angle ᵠ between points A and B from a simple equation:

tan ᵠ = ΔY / ΔX, so ᵠ = arctan(ΔY / ΔX), where ΔX = latB − latA and ΔY = lonB − lonA

The table below presents the relationship between the angle ᵠ and the azimuth A(AB) (shown here in degrees), depending on the quadrant in which point B lies relative to point A:

  • Quadrant I (ΔX > 0, ΔY > 0): A(AB) = ᵠ
  • Quadrant II (ΔX < 0, ΔY > 0): A(AB) = 180° − ᵠ
  • Quadrant III (ΔX < 0, ΔY < 0): A(AB) = 180° + ᵠ
  • Quadrant IV (ΔX > 0, ΔY < 0): A(AB) = 360° − ᵠ

App tutorial: how to create augmented reality apps on Android?

In this tutorial, I will not describe how to get the location and azimuth orientation of the device, because this is very well documented and there are a lot of tutorials online. For reference, please check the Android Sensors Overview (especially TYPE_ROTATION_VECTOR) and Location Services documentation.

Once you’ve prepared data from the sensors, it’s time to implement CameraViewActivity. The first and most important thing is to implement SurfaceHolder.Callback to cast an image from the camera to our layout. SurfaceHolder.Callback implements the three methods responsible for this: surfaceChanged() surfaceCreated() surfaceDestroyed().

The next step is to create an XML layout file. This is quite simple:
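
A minimal sketch of such a layout, assuming a SurfaceView for the camera preview with an ImageView overlay for the pointer (the view ids and the pointer drawable are illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- camera preview driven by SurfaceHolder.Callback -->
    <SurfaceView
        android:id="@+id/cameraView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <!-- pointer shown when the azimuths match; hidden by default -->
    <ImageView
        android:id="@+id/pointerIcon"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center"
        android:src="@drawable/pointer_icon"
        android:visibility="invisible" />
</FrameLayout>
```

A FrameLayout is used so the pointer simply stacks on top of the camera surface.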

Once we have the whole presentation layer, we need to connect it with the sensor data providers. In this case, this just means implementing and initialising the proper listeners: OnLocationChangedListener and OnAzimuthChangedListener.

When we’ve gotten the current location, we can calculate the theoretical azimuth based on the theory presented earlier.

When the orientation of the phone changes, we calculate the accuracy window, compare both angles and, if the current angle falls within that window, show a pointer on the screen. Of course, this approach is quite simplistic, but it illustrates the idea.
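
The pointer's horizontal position can be sketched as a mapping from the angular offset between the two azimuths to a pixel coordinate. The field-of-view constant and the method name are assumptions, not values from any particular device:

```java
// Maps the angle between "where the phone points" and "where the target is"
// onto a horizontal screen coordinate. A 0° offset lands in the centre.
public class ScreenMapper {
    static final double CAMERA_HORIZONTAL_FOV = 60.0; // degrees, assumed value

    static double azimuthToPixelX(double targetAzimuth, double deviceAzimuth,
                                  int screenWidthPx) {
        double offset = targetAzimuth - deviceAzimuth;
        // normalise to (-180, 180] so the pointer takes the short way round
        if (offset > 180) offset -= 360;
        if (offset < -180) offset += 360;
        return screenWidthPx / 2.0 + (offset / CAMERA_HORIZONTAL_FOV) * screenWidthPx;
    }
}
```

In a real app you would query the camera's actual horizontal field of view instead of hard-coding it.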

A method that calculates the azimuth between two points might look like the following:
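
The original listing isn't reproduced here, so below is a sketch of how such a method might look, following the flat-surface equation and quadrant table from the theory section (the method name and parameter order are assumptions):

```java
// Theoretical azimuth from point A (my position) to point B (the target),
// measured clockwise from north, using the flat-surface approximation.
public class AzimuthCalculator {
    public static double calculateTheoreticalAzimuth(double myLat, double myLon,
                                                     double targetLat, double targetLon) {
        double dX = targetLat - myLat; // "north" leg of the triangle
        double dY = targetLon - myLon; // "east" leg of the triangle

        if (dX == 0) {                 // target due east or west
            return dY > 0 ? 90 : 270;
        }

        // the angle phi from tan(phi) = dY / dX
        double phiAngle = Math.toDegrees(Math.atan(Math.abs(dY / dX)));

        // quadrant adjustment so the azimuth runs clockwise from north
        if (dX > 0 && dY >= 0) return phiAngle;        // quadrant I
        if (dX < 0 && dY >= 0) return 180 - phiAngle;  // quadrant II
        if (dX < 0)            return 180 + phiAngle;  // quadrant III
        return 360 - phiAngle;                          // quadrant IV
    }
}
```

Note that the flat-surface approximation is only reasonable over short distances, which is fine for an app showing nearby points.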

To compare the measured azimuth with the theoretical one, we will use the isBetween() and calculateAzimuthAccuracy() methods:
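
The snippets aren't included above, so here is a sketch of what these two helpers might do. The signatures are assumptions, and the ±5° accuracy window is purely illustrative:

```java
// Helpers that decide whether the measured azimuth falls inside a tolerance
// window around the theoretical one; the window may wrap around 0°/360°.
public class AzimuthComparator {
    static final double AZIMUTH_ACCURACY = 5.0; // degrees, illustrative

    // returns {minAngle, maxAngle} of the window around the theoretical azimuth
    static double[] calculateAzimuthAccuracy(double azimuth) {
        double min = azimuth - AZIMUTH_ACCURACY;
        double max = azimuth + AZIMUTH_ACCURACY;
        if (min < 0) min += 360;
        if (max >= 360) max -= 360;
        return new double[] { min, max };
    }

    static boolean isBetween(double minAngle, double maxAngle, double azimuth) {
        if (minAngle > maxAngle) {
            // the window wraps around north, e.g. [358°, 8°]
            return azimuth >= minAngle || azimuth <= maxAngle;
        }
        return azimuth >= minAngle && azimuth <= maxAngle;
    }
}
```

When isBetween() returns true for the current device azimuth, the pointer is made visible.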

Things to consider before implementing AR

Of course, this tutorial is quite basic. My goal was to show you that it doesn’t take much to make a simple AR app. I would like to point out some things that should be considered during further implementation:

  • The accuracy of the sensors - unfortunately it's not perfect, mainly because of the magnetic field emitted by the device itself.
  • It will probably be necessary to implement a simple low-pass filter to stabilise the indicators on the screen. A very nice example is provided here; the formula is written in JavaScript, but it's quite language-agnostic.
  • You will probably also need to implement a distance filter to show only the closest points. To achieve this, simply calculate the distance between the points:

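A common way to compute that distance is the haversine formula; on Android you could also simply use Location.distanceTo(). A plain-Java sketch:

```java
// Great-circle distance in metres between two lat/lon points (haversine formula).
public class GeoDistance {
    static final double EARTH_RADIUS_M = 6_371_000; // mean Earth radius

    static double distanceBetween(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }
}
```

Sort your points by this distance and keep only those below some threshold, e.g. a few hundred metres.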

And that’s all you need to know to start your Augmented Reality journey. I really encourage you to use your imagination and creativity to deliver innovative apps.

The potential is huge and it’s our job as developers to make it happen! Good luck!

This article was first published on Nov 4, 2017
