Google I/O 2015: Let’s Imagine “What If”

Jason Atwood

6 min read

Jun 4, 2015

At I/O 2015, Google’s engineers announced an amazing array of new projects and technologies, in addition to updating us on the progress of existing projects. We Android Nerds were impressed with the new features coming in Android M, and the new developer tools in Android Studio.

And while those things certainly interest us in our day-to-day development, we also really geek out over the big ideas that are way down the road. One of my favorite things to do during I/O is to think about the possibilities these new technologies present, how the technologies might be combined, and what these technologies might ultimately mean for Android.

Project Jacquard

Project Jacquard is an effort to manufacture textile-based touch surfaces, much like our current phones and tablets, but flexible, stretchable and capable of withstanding all of the abuse we expect from modern textiles.

Project Jacquard introduces the possibility of new gesture detectors in Android. All of our UI events in Android are based on interacting with a rigid surface: our fingers touch, drag and lift. But with textile-based touch surfaces, there will be so many new gestures we can detect. Imagine all the interactions we already have with fabrics. We could see UIs responding to smear, pinch, palm press or crumple gestures. Imagine a drawing app working in conjunction with Project Jacquard: at any time, the user could clear her drawing canvas by crumpling the textile!
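
None of this exists as an Android API today, but to make the idea concrete, here’s a minimal sketch of what a crumple callback might look like in that drawing app. JacquardPanel, OnCrumpleListener and DrawingCanvasView are names I’m inventing purely for illustration.

```java
import android.app.Activity;
import android.os.Bundle;

// Hypothetical sketch: JacquardPanel, OnCrumpleListener and DrawingCanvasView
// are invented names; nothing like this API exists today.
public class DrawingActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_drawing);

        final DrawingCanvasView canvas =
                (DrawingCanvasView) findViewById(R.id.drawing_canvas);
        JacquardPanel fabric = (JacquardPanel) findViewById(R.id.fabric_panel);

        // When the wearer crumples the fabric, wipe the canvas clean.
        fabric.setOnCrumpleListener(new JacquardPanel.OnCrumpleListener() {
            @Override
            public void onCrumple(float intensity) {
                // Ignore light, accidental brushes; only a firm crumple clears.
                if (intensity > 0.8f) {
                    canvas.clear();
                }
            }
        });
    }
}
```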

And we can think about more than just clothes—we can consider any surface that is textile-based. Imagine a multitouch remote control for your TV, sewn right into the arm of your sofa. What about touch controls sewn into the door panel of your car? Imagine swiping an area of the door panel, indistinguishable from the rest of the panel, which unlocks the doors or rolls down the window!

The really big idea behind Project Jacquard is not the multitouch sensor; it’s conductive yarn in general. Because Google is driving this technology forward, they are solving a lot of the general problems associated with mass-producing electronics embedded in washable, stretchable, durable textiles.

Google’s goal was, of course, a capacitive touch matrix, but the sky really is the limit. Ivan Poupyrev was right when he called Project Jacquard a raw material. Imagine sewing an induction charging loop into the pocket of your jeans, wired to an embedded battery. Every time you put your phone back in your pocket, it starts recharging!

Project Soli

Project Soli is an attempt to capture hand gestures independently of a physical device. We already have a large hand motion vocabulary, so why can’t we observe those gestures without a touch device? The goal is to replace all input with gestures of the hand, such as sliding the thumb over the fingers or making a twisting motion of the thumb and forefinger.

I anticipate this will give us additional UI events in Android. I can imagine setting an OnTwistListener on a dial view or a SlideListener on a slider view. Imagine: instead of bug reporting with rage shake, we could listen for fist pump gestures!
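
To picture one of these, here’s a minimal sketch of wiring that hypothetical OnTwistListener to a dial view. DialView, OnTwistListener and rotateBy() are names I’m inventing, not anything Google has announced.

```java
import android.app.Activity;
import android.os.Bundle;

// Hypothetical sketch: DialView, OnTwistListener and rotateBy() are invented
// names imagining how a Soli-backed UI event might surface in Android.
public class ThermostatActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_thermostat);

        final DialView dial = (DialView) findViewById(R.id.temperature_dial);

        // The imagined radar sensor reports a thumb-and-forefinger twist;
        // we turn the on-screen dial by the same number of degrees.
        dial.setOnTwistListener(new DialView.OnTwistListener() {
            @Override
            public void onTwist(float degrees) {
                dial.rotateBy(degrees);
            }
        });
    }
}
```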

The number of possible hand gestures is really endless. It would be amazing if the API allowed developers to implement their own gesture recognizers, perhaps by providing a training data set. Take the example of flicking a virtual soccer ball: your game app might want to implement a two-finger flick. If this gesture weren’t recognized by the API out of the box, you could train the algorithm by providing your own sample data.
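
If such an API ever allowed developer-supplied training data, registering that two-finger flick might look something like this sketch. GestureTrainer, GestureSample and SoliRecognizer are invented names standing in for whatever the real API would provide.

```java
import java.util.List;

// Purely speculative: GestureTrainer, GestureSample and SoliRecognizer are
// invented names sketching how a trainable gesture API might feel.
public class TwoFingerFlickTrainer {

    public SoliRecognizer buildFlickRecognizer(List<GestureSample> recordedFlicks) {
        // Feed recorded radar traces of the two-finger flick into the trainer
        // and get back a recognizer the app can register for UI events.
        GestureTrainer trainer = new GestureTrainer("two_finger_flick");
        for (GestureSample sample : recordedFlicks) {
            trainer.addSample(sample);
        }
        return trainer.train();
    }
}
```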

Project Tango

Project Tango has integrated 3D mapping and localization hardware and software into a handheld device. The user can build 3D maps of her environment simply by navigating through the environment, capturing video. The user can then “replay” that 3D map later via the built-in positioning sensors.

The possibilities that this technology presents are endless. Google has demoed some of the capabilities of Project Tango before, and expanded on those capabilities this year, as seen in this demo video. What is amazing is that the Project Tango form factor has moved down from tablet size to smartphone size. While this is still very early and only a few development partners will get one of these phones, it’s important to see the direction they are taking. I really expect to see some of these technologies make it into consumer smartphones.

For us as Android developers, we will have to deal with the fact that users will be generating a lot of 3D content. I’m excited to think about the UI patterns that will display and handle all of that content. If you had an app that cataloged people’s 3D room captures, would it make sense to list all of those videos as flat 2D Material cards? Maybe not.

I anticipate more augmented reality interactions with connected devices. Google owns Nest, and Nest works with dozens of connected devices, so I imagine greater cross-pollination between Project Tango and these devices.

Imagine that you’re sitting in the living room and you want to check the status of your Nest thermostat or Whirlpool refrigerator in the kitchen. Instead of pulling up the Nest or Whirlpool apps, you point your phone’s camera into the kitchen, and the screen shows an augmented reality window displaying your entire kitchen with status bubbles hovering over each device!
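
Here’s a rough sketch of how that overlay might be driven, assuming a Tango-style API that can project a device’s known room position into the camera frame. ArOverlayView, RoomMap, ConnectedDevice and ScreenPoint are all invented types for illustration.

```java
import java.util.List;

// Speculative sketch: ArOverlayView, RoomMap, ConnectedDevice and ScreenPoint
// are invented types standing in for a Tango-style AR API that doesn't exist.
public class KitchenStatusRenderer {

    private final ArOverlayView mOverlay;
    private final RoomMap mRoomMap;

    public KitchenStatusRenderer(ArOverlayView overlay, RoomMap roomMap) {
        mOverlay = overlay;
        mRoomMap = roomMap;
    }

    public void render(List<ConnectedDevice> devices) {
        mOverlay.clearBubbles();
        for (ConnectedDevice device : devices) {
            // Project each device's known 3D position in the room map onto
            // the current camera frame, then float a status bubble there.
            ScreenPoint point = mRoomMap.projectToScreen(device.getRoomPosition());
            mOverlay.addBubble(point, device.getName() + ": " + device.getStatus());
        }
    }
}
```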

Any company that has built an app to help users interact with the physical world can benefit from this technology. There are probably hundreds of apps by companies that already have brick-and-mortar stores. These apps are designed to capture more customers who might not come into the physical store. Eventually, the distinction between in-store and online customers will blur, and these apps will serve more people. Imagine an app that relies on augmented reality to navigate a customer to the correct aisle and shelf to find the product he is looking for.

Project Abacus

Project Abacus hopes to remove the need for single-mode authentication (e.g., passwords, pattern recognition, facial recognition, fingerprint recognition) and instead fuse all of these modes into a multi-modal recognition that could provide an “authentication score.”

I can definitely see this being included in the Auth Play Services API. Just like FusedLocationProvider handles all of the sensor data available on a device to locate a user in physical space, we could see a FusedAuthenticationProvider that will handle all sensor data available on a device to help authenticate a user. A single call into Google Play Services could provide real-time authentication of the user of your application.

As developers, we would benefit from the most secure combination of sensor data available to each user. By relying on this Play Services API, our apps would get instant benefit as Google tweaked and improved the authentication algorithm. I could even imagine a parallel to setPriority() where we decide how secure an authentication we need from the service.
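
As a thought experiment, the call might mirror today’s fused location requests. Everything below with “Auth” in the name (FusedAuthentication, AuthRequest, AuthScore, AuthListener) is hypothetical; only the GoogleApiClient pattern is borrowed from the real FusedLocationProvider, and lockSensitiveContent() stands in for whatever your app would do when confidence drops.

```java
// Hypothetical sketch modeled loosely on the FusedLocationProvider pattern.
// FusedAuthentication, AuthRequest, AuthScore and AuthListener are invented
// names; no such API exists in Google Play Services.
private void startAuthScoring(GoogleApiClient client) {
    AuthRequest request = new AuthRequest()
            .setInterval(30 * 1000)                           // re-score roughly every 30 seconds
            .setPriority(AuthRequest.PRIORITY_HIGH_SECURITY); // parallel to LocationRequest.setPriority()

    FusedAuthentication.AuthApi.requestAuthUpdates(client, request, new AuthListener() {
        @Override
        public void onAuthScoreChanged(AuthScore score) {
            // Gate the sensitive parts of the app on the fused confidence score.
            if (score.getConfidence() < 0.9f) {
                lockSensitiveContent();  // hypothetical helper for this sketch
            }
        }
    });
}
```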

I also think it would be crazy to incorporate Project Jacquard into the multi-modal authentication. What if a Project Jacquard jacket had additional sensors around the jacket to interpret the body geometry of the wearer? This body geometry data could be fed into the authentication algorithm and “unlock” the touch surface only once the wearer had been verified.

If you are at all interested in these projects, I encourage you to read up on them. Each project has its own page on ATAP’s site. Most of what I’ve presented here are out-there ideas, and I don’t expect to see them any time soon, but one thing I love about Google I/O is that it lets me put on my Jules Verne hat and ask “what if?”

Mark Dalrymple

Reviewer, Big Nerd Ranch

MarkD is a long-time Unix and Mac developer, having worked at AOL, Google, and several start-ups over the years. He’s the author of Advanced Mac OS X Programming: The Big Nerd Ranch Guide, over 100 blog posts for Big Nerd Ranch, and an occasional speaker at conferences. Believing in the power of community, he’s a co-founder of CocoaHeads, an international Mac and iPhone meetup, and runs the Pittsburgh PA chapter. In his spare time, he plays orchestral and swing band music.
