For decades, augmented reality has been used to overlay annotations, videos, and images on physical objects viewed through a camera. Because of the high computational cost of matching an image against an enormous number of candidate images, it has been challenging to use augmented reality on a smartphone without significant processing delays. Although the Global Positioning System (GPS) can be very useful for outdoor localization, it is not suitable for indoor localization. To address the problem of indoor localization, we propose using mobile augmented reality in an indoor environment. Since most smartphones have many useful sensors, such as accelerometers, magnetometers, and Wi-Fi sensors, we can leverage these sensors to determine the phone's location, field of view, and viewing angle. Using Mobile Augmented Reality (MAR) based on data from several smartphone sensors, we can achieve indoor localization with reduced processing time. We tested MAR in simulated environments and deployed the system in the Love Building (LOV) at Florida State University. Using 200 images in the simulated environment, we compared the matching time of multiple object recognition algorithms and reduced it from 2.8 seconds to only 0.17 seconds using the BRISK algorithm.
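BRISK owes much of its speed to its binary descriptors: each keypoint is summarized as a bit string (512 bits in the standard formulation), and two images are compared by pairing descriptors with the smallest Hamming distance, which reduces to XOR and popcount operations. The sketch below illustrates only this matching step, using toy 8-bit descriptors as stand-ins for real BRISK output; the descriptor values, the `max_dist` threshold, and the greedy nearest-neighbor strategy are illustrative assumptions, not the thesis's implementation.

```python
# Illustrative sketch of binary-descriptor matching as used with BRISK.
# Real BRISK descriptors are 512-bit strings computed from image keypoints;
# the toy 8-bit values below are stand-ins, not actual BRISK output.

def hamming(a: int, b: int) -> int:
    """Hamming distance between two equal-length bit strings stored as ints."""
    return bin(a ^ b).count("1")

def match_descriptors(query, database, max_dist=128):
    """Greedy nearest-neighbor matching under Hamming distance.

    Returns (query_index, db_index, distance) for each query descriptor
    whose best database match is within max_dist bits.
    """
    matches = []
    for qi, q in enumerate(query):
        best_di, best_d = min(
            ((di, hamming(q, d)) for di, d in enumerate(database)),
            key=lambda t: t[1],
        )
        if best_d <= max_dist:
            matches.append((qi, best_di, best_d))
    return matches

# Toy 8-bit descriptors: query[0] matches database[1] exactly;
# query[1] has no database entry within 2 bits, so it is rejected.
query = [0b10110010, 0b01001101]
database = [0b11111111, 0b10110010, 0b00000000]
print(match_descriptors(query, database, max_dist=2))  # [(0, 1, 0)]
```

In practice this step would be handled by a library, e.g. OpenCV's `cv2.BRISK_create()` for detection and description and a brute-force matcher with `cv2.NORM_HAMMING` for matching; the bitwise distance is what makes BRISK far cheaper than floating-point descriptors such as SIFT.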