Will Glass Go Mainstream? OnTheGo Platforms Creates SDK To Help With That

April 9, 2014


Four years ago, Ryan Fink, CEO of OnTheGo Platforms, began working on a running application called Ghost Runner. Of course, back then there weren’t really any “smart glasses.” The idea was to have a heads-up display that recorded time, route, and distance: the usual metrics runners look for in an application. Where Ghost Runner differed was in what happened when you fell behind. “You’d start to see a 3D avatar start to run exactly where you ran the time before, so you essentially would race your own ghost,” said Fink. (See a demo of Ghost Runner here.)

However, after building out a prototype and partnering with another company, Fink realized that the hardware wasn’t quite there yet, and that a lot of infrastructure would be needed to make smart glasses mainstream. “So, we set out to build a platform that makes it easier for developers to create smart glass applications,” said Fink.

The interface they built is called Ari, short for Augmented Reality Interface, and last week they released their Beta SDK on their website for developers to try out for themselves. Fink explained the interface this way: “It makes a single outward facing camera into a motion tracking system, so you can control the glasses, like the basic functions and applications on the glasses, with gestures and motions.”

One of the interesting things about Ari is that it’s built on top of the Android stack, which means it can remain device-agnostic: it can be distributed across all major smart glass devices for the foreseeable future (i.e., a universal smart glass user experience).
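The article doesn’t document what Ari’s API actually looks like, so the sketch below is purely illustrative: a guess at how a camera-driven gesture interface might be exposed to an Android developer. Every name in it (AriStyleGestures, Listener, onFrame, and so on) is hypothetical, not taken from OnTheGo’s Beta SDK.

```java
// Hypothetical sketch only; NOT OnTheGo's actual API. It mirrors the idea
// described above: preview frames from the single outward-facing camera
// are fed to a detector, which fires swipe/gesture callbacks that stand
// in for the touchpad on the side of the glasses.
public final class AriStyleGestures {

    public enum Direction { LEFT, RIGHT, UP, DOWN }

    /** App code implements this to react to detected gestures. */
    public interface Listener {
        void onSwipe(Direction direction);
        void onHandOpen();
        void onHandClose();
    }

    private final Listener listener;

    public AriStyleGestures(Listener listener) {
        this.listener = listener;
    }

    /**
     * Feed one camera preview frame (NV21 is the common Android camera
     * format). A real detector would track motion across frames here and
     * invoke the listener once a gesture completes.
     */
    public void onFrame(byte[] nv21, int width, int height) {
        // motion-tracking pipeline omitted in this sketch
    }
}
```

Because a design like this rides on the standard Android camera stack rather than device-specific hardware, the same listener code could in principle run unchanged on Glass, Vuzix, Epson, or any other Android-based glasses, which is exactly the device-agnostic point Fink is making.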

While motion and gesture detection has come a long way, I wouldn’t say it’s completely accurate yet. If you’ve tried to activate gestures on a Samsung television, you know exactly what I’m talking about. However, Fink is confident that you won’t have those issues with Ari. “Ours is extremely accurate. You don’t have to sync when you put it on or wave at it or act all goofy,” said Fink. “It works out of the box. You put it on, turn it on, and when you do a swipe or a gesture, we detect it.”

When you think about it, that’s pretty amazing. Fink and his team aren’t just building an interface that detects gestures from a static camera, as most systems with this type of control do. They have to account for two different motions at once: yours and the camera’s.
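Fink doesn’t say how Ari solves this, but a classic computer-vision approach to the problem is to estimate the camera’s own (ego) motion from tracked background features and subtract it, so that whatever still moves afterward is moving independently, like a gesturing hand. The OpenCV-based sketch below (plain Java, purely illustrative, not OnTheGo’s pipeline) shows the idea.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;
import org.opencv.video.Video;

/**
 * Illustrative only: a textbook way to separate "the camera moved" from
 * "a hand moved in front of the camera". Track feature points between two
 * grayscale frames, take the median flow as the camera's ego motion, and
 * flag points whose flow disagrees with it as independent motion.
 * Assumes the OpenCV native library is already loaded.
 */
public final class EgoMotionDemo {

    /** Returns true if a significant independently moving region is found. */
    public static boolean independentMotion(Mat prevGray, Mat currGray) {
        // 1. Pick trackable corners in the previous frame.
        MatOfPoint corners = new MatOfPoint();
        Imgproc.goodFeaturesToTrack(prevGray, corners, 200, 0.01, 10);
        if (corners.empty()) return false;

        // 2. Track them into the current frame with pyramidal Lucas-Kanade.
        MatOfPoint2f prevPts = new MatOfPoint2f(corners.toArray());
        MatOfPoint2f currPts = new MatOfPoint2f();
        MatOfByte status = new MatOfByte();
        MatOfFloat err = new MatOfFloat();
        Video.calcOpticalFlowPyrLK(prevGray, currGray, prevPts, currPts, status, err);

        Point[] p0 = prevPts.toArray(), p1 = currPts.toArray();
        byte[] ok = status.toArray();

        // 3. The median flow approximates the camera's own (ego) motion.
        List<Double> dxs = new ArrayList<>(), dys = new ArrayList<>();
        for (int i = 0; i < p0.length; i++) {
            if (ok[i] == 1) {
                dxs.add(p1[i].x - p0[i].x);
                dys.add(p1[i].y - p0[i].y);
            }
        }
        if (dxs.isEmpty()) return false;
        Collections.sort(dxs);
        Collections.sort(dys);
        double egoX = dxs.get(dxs.size() / 2), egoY = dys.get(dys.size() / 2);

        // 4. Points whose residual flow (after subtracting ego motion) is
        //    large are moving independently of the head, e.g. a hand.
        int outliers = 0;
        for (int i = 0; i < p0.length; i++) {
            if (ok[i] != 1) continue;
            double rx = (p1[i].x - p0[i].x) - egoX;
            double ry = (p1[i].y - p0[i].y) - egoY;
            if (Math.hypot(rx, ry) > 5.0) outliers++;  // 5 px threshold
        }
        return outliers > p0.length / 10;  // more than 10% disagree
    }
}
```

On real glasses a pipeline like this would have to run on every frame within a tight CPU budget, which connects directly to the hardware constraints Fink describes next.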

And compensating for two motions at once isn’t the only challenge they’re facing. “The biggest one is the hardware constraints,” said Fink. “The CPU and guts of Glass and other smart glasses are essentially the equivalent of a smartphone from about two years ago. Your smartphone of today is more powerful, but it’s proven to be a positive. Us having to develop this technology in a constrained environment, it’s made our software extremely powerful. When we put Ari on a smartphone of today, it runs extremely fast and accurate.”

Fink is what you might call an early adopter.  He wears his Glass quite often in spite of the awkward looks and glances from people, and he’s hopeful that with Ari the experience will get better.  “I wear it quite a bit, but still, I’ll be honest, it’s kind of like having an iPhone with very few apps and little software,” said Fink.  “You can see the potential, but it’s not quite there.  So I use it for directions and sometimes texting and calling when I’m in motion.  But there are more and more use cases popping up.  As more developers start to adopt the platform, we’ll see more and more, and I’ll use it more.”

While this release will help push smart glasses toward the mainstream, Fink was transparent about the fact that people aren’t quite ready for them. “No. People aren’t ready,” said Fink. “That’s why we’re seeing a lot of adoption in the medical, manufacturing and fitness industries where voice and touchpad on the side of your face just doesn’t work. That’s where it will start first.”

He also shared that he thinks we’ll begin seeing more adoption this fall, with companies like Google making good strides toward making the technology “less geeky.” But he estimates that mainstream adoption won’t happen for another 12 to 24 months.

While this is a good start, Fink and his team aren’t finished yet. They’re working on quite a few projects to help make that timeline a reality. “We’ve partnered with one of the largest brands in the world, so we’ll be releasing some products with our software running on it in the next 6-8 months,” shared Fink of their future plans. “We also have other pieces of our platform that we’re going to start releasing as well in the fall.”

You can download a free demo of their Beta SDK from their website here.  “If someone is interested and wants to check it out, come to our website and play with it,” urged Fink.  “I think you’ll be very happy with it.”

So, if you see someone waving their hands around while walking down the street, don’t be alarmed. They may just be using OnTheGo’s interface to control their smart glasses.

What do you think? Will you be the one swiping, opening and closing your hand, or making some other gesture? We promise not to make fun of you at Techli, but we can’t make that same promise for everyone else quite yet. See it in action in the video below.

Introducing OnTheGo Platforms from OnTheGo Platforms on Vimeo.