Pat Narendra, PhD
Product & Solution Strategy and Innovation
Zebra Technologies

Pat Narendra is a member of the Zebra Enterprise Mobile Computer Emerging Technology organization, where he passionately explores augmented reality, mobile locationing, and deep and machine learning in the enterprise ecosystem. Pat has a PhD in EE/CS from Purdue University (his thesis on pattern recognition and machine learning resulted in over a dozen refereed journal articles with over 4,000 citations) and spent his early career in computer vision and signal processing research at Honeywell. Mid-career, he obtained an MBA in Strategy from the University of Minnesota. He launched his product development career at Motorola Mobility, Motorola Solutions, and Zebra, where he has created and launched over a dozen products (and holds 9 granted patents). Pat is a hands-on business and technical architect, equally at home prospecting for customer ROI and coding up prototypes using the latest augmented reality frameworks on Zebra mobile computers and machine learning platforms such as TensorFlow Lite.

For more Zebra AR context, please take a look at the recent blogs below:
http://developer.zebra.com/blog/augmented-reality-extended-developing-enterprise-applications-zebra-mobile-computers
https://www.zebra.com/us/en/blog/posts/2019/extending-augmented-reality-across-the-enterprise.html

Contact Information: patnarendra@zebra.com
LinkedIn: https://www.linkedin.com/in/pat-narendra/

Abstract

Augmented Reality Extended: Beyond Games and Novelty to Creating Value in the Enterprise

Augmented Reality has hit the main stage in consumer smartphone applications. We have all been exposed to AR emoji, tried out furniture in our living rooms, used one of the many “measure” apps, and played interactive AR games.

In this talk, we pull back the curtain on what the game changer is in smartphone AR, the current framework limitations, and how we can extend it to enterprise applications. First, the core technology that underpins AR – precise motion tracking – is briefly explained: the real-time fusion of the IMU (accelerometer and gyroscope) with the camera's visual tracking to estimate precise position and orientation. The insertion of virtual objects into the camera scene (i.e., AR) flows from this.
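To make the fusion idea concrete, here is a minimal one-axis sketch in Python. It is an illustration only: production AR frameworks fuse full 6-DoF pose with an extended Kalman filter over visual features, whereas this uses a simple complementary filter on a single rotation angle, and all names and numbers below are invented for the example.

```python
# 1-D sketch of IMU/visual fusion via a complementary filter.
# The gyro is sampled fast but drifts; the camera's visual estimate is
# slower and noisier but drift-free. Blending the two bounds the drift.

def fuse_orientation(theta_prev, gyro_rate, visual_theta, dt, alpha=0.98):
    """Blend dead-reckoned gyro integration with a visual orientation fix."""
    gyro_estimate = theta_prev + gyro_rate * dt   # integrate angular rate
    return alpha * gyro_estimate + (1 - alpha) * visual_theta

# Simulate 5 s at 100 Hz: the gyro reads a constant 0.01 rad/s bias while
# the camera keeps reporting the true angle of 0 rad.
theta = 0.0
for _ in range(500):
    theta = fuse_orientation(theta, gyro_rate=0.01, visual_theta=0.0, dt=0.01)

# Pure integration of the bias would have drifted to 0.05 rad by now;
# the visual correction holds the error near alpha * bias * dt / (1 - alpha).
```

The `alpha` weight trades responsiveness against trust in the camera: close to 1 follows the gyro between visual fixes, lower values snap harder to the visual estimate.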

We describe how the AR framework can be extended to meet the enterprise challenge:

1. The AR scene model needs to persist across sessions, time, users, and devices – and even reproduce in a totally different location. Most current applications of the AR frameworks are transient sessions: once you put the device in your pocket, you cannot recover what you were doing before.

2. Make the solutions robust within the workflow constraints of the enterprise user. The enterprise associate needs to be able to seamlessly execute a function on demand, switch contexts, and later return to the function without even being aware of the “AR” limitations beneath the app (such as getting “lost” when the camera loses track).

3. Developing a family of innovative solution apps – for planogram generation and compliance, AR-assisted picking in store, AR-assisted stocking, and remote dimensioning using additional sensor integration. A sample of these applications will be demonstrated in this talk.
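The persistence requirement in point 1 can be sketched as ordinary pose bookkeeping: each virtual object is keyed by a stable anchor ID and its pose is serialized so a later session (or a different device) can restore the scene. This is a hypothetical illustration, not any framework's actual API; real persistent-anchor systems also store a visual feature map so the pose can be re-localized against the physical environment.

```python
# Hypothetical sketch of persisting AR anchors across sessions: serialize
# each anchor's pose (position vector + orientation quaternion) to JSON.
import json

def save_anchors(anchors, path):
    """anchors: {anchor_id: {"position": [x, y, z], "rotation": [x, y, z, w]}}"""
    with open(path, "w") as f:
        json.dump(anchors, f)

def load_anchors(path):
    """Restore the anchor dictionary written by save_anchors."""
    with open(path) as f:
        return json.load(f)

# Example scene: one virtual label pinned to a (made-up) store shelf endcap.
anchors = {"shelf-endcap-3": {"position": [1.2, 0.0, -3.4],
                              "rotation": [0.0, 0.707, 0.0, 0.707]}}
save_anchors(anchors, "ar_session.json")
restored = load_anchors("ar_session.json")   # a new session recovers the scene
```

In a real deployment the JSON file would live on a server so the scene survives across users and devices, which is exactly the gap between transient consumer AR sessions and the enterprise persistence described above.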