What happens now that phones can see?

By Julian Green | Posted on June 25, 2014
 
Photos give us a window to the world; our own experiences can only show us so much, and photos add a richness of experience and perspective that we would otherwise lack. The sudden existence of billions of smartphones, which take trillions of photos and videos per year, has given us the biggest increase in rich user data we have yet seen, and now we have the technology to start making sense of that data.
With this new technology, it is now possible to analyze photos for myriad applications. For example, at Jetpac we use hundreds of millions of travel photos shared on Instagram to build the Jetpac City Guides app, running object recognition on the photos so we can recommend places such as restaurants with patios (from the blue skies in the photos), hipster bars (from the unusually high number of mustaches we spot), and the best coffee shops (from the highest proportion of latte art). We have also developed deep learning AI technology that broadens our ability to recognize objects of all kinds in photos. Putting deep learning on the iPhone (Spotter by Jetpac) means we can now recognize many objects in the iPhone's real-time video feed, locally on the phone. And because we have shrunk the technology so far, there is also room to let you train the phone to recognize a specific object of your own choosing (Deep Belief by Jetpac).
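The on-device recognition described above was released by Jetpac as the open-source DeepBeliefSDK, which exposes a small C-style API. As a rough sketch based on the SDK's published sample code (file names such as jetpac.ntwk and photo.jpg are placeholders here, and exact signatures should be checked against the library's own header), classifying a single image looks roughly like this:

    #include <stdio.h>
    #include "libjpcnn.h"   /* DeepBeliefSDK header (name per the SDK's samples) */

    int main(void) {
      /* Load the pre-trained convolutional network shipped with the SDK. */
      void* network = jpcnn_create_network("jetpac.ntwk");

      /* Load an image from disk; on iOS this buffer would come from camera frames. */
      void* image = jpcnn_create_image_buffer_from_file("photo.jpg");

      float* predictions;
      int predictionsLength;
      char** labels;
      int labelsLength;

      /* Run the network over the image and get back per-label confidence scores. */
      jpcnn_classify_image(network, image, 0, 0,
                           &predictions, &predictionsLength,
                           &labels, &labelsLength);

      jpcnn_destroy_image_buffer(image);

      /* Print every label whose confidence clears a simple threshold. */
      for (int i = 0; i < predictionsLength; i += 1) {
        if (predictions[i] > 0.1f) {
          printf("%s - %0.2f\n", labels[i % labelsLength], predictions[i]);
        }
      }

      jpcnn_destroy_network(network);
      return 0;
    }

Running that same classification call on successive camera frames instead of a file is what allows an app like Spotter to label objects in the live video feed, entirely on the phone.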
 
Full article at betanews: http://betanews.com/2014/06/25/what-happens-now-that-phones-can-see/

