How Google's Pixel 4 radar system could be more than a gimmick

Before you write off Google's new hand gesture feature as yet another mobile tech parlor trick, consider some critical points about the technology behind it.


Ladies and gentlemen, the future is upon us.

Well, either that, or we're about to be fed yet another Impressive-Seeming Phone Feature™ that looks incredible in marketing videos but ends up being more limited-use novelty than life-changing tech tool in the real world.

Ever since Google confirmed the presence of a radar-based hand-gesture detection system in its upcoming Pixel 4 phone, ample attention has been heaped onto the question of whether such a system would be incredible or ineffectual — whether it'd represent a new and transformative way of interacting with our phones or merely be a new twist on a tired old gimmick.

To be sure, Android device-makers have worked to get our attention with gesture controls before. Most recently, LG tried its hand at the task by showing off a feature it called Air Motion on its LG G8 ThinQ (gesundheit!) flagship.

Here's how it worked, in LG's implementation: You'd hold your hand up four inches from the phone's front-facing camera for a few seconds until the camera noticed — then curl your hand into a claw-like shape and wait another few seconds for the system to recognize that. Only then could you move your hand in one of a few set patterns and hope the camera picked up on what you were trying to do.

If you think that sounds awkward to read about, it was even more awkward to watch in LG's own demos.

Yeaaaaaaaaah.

The system was about as effective as you'd expect, and reviewers tore it to shreds accordingly. The website 9to5Google said: "Eight times out of 10, Air Motion doesn’t actually detect your hand, and the couple of times it does, it barely works. You have to perfectly place your hand to get this sort of thing working, and even then, it takes so much longer to get the feature working."

Tom's Guide was even more blunt: "In theory, Hand ID and Air Motion are revelatory. In practice, they made me want to throw the G8 ThinQ directly in the nearest trash can."

And Android Central was equally unmoved: "While it has promise, unfortunately, it's effectively a flop. Despite lots of training and experimentation, I just can't get the G8 to recognize my hand gestures very often or very quickly."

So why should this new Pixel system be any different? Why shouldn't we write it off right away as an equally insignificant gimmick? And why should we pay any attention at all to what Google's doing here?

I certainly can't answer any of those questions definitively at this point. It's entirely possible Google's Pixel 4 hand gesture system will be more sparkle than substance and won't be something we'll want to use in the real world. At the very least, though, I'm optimistic this situation could be different — that there may be more to this story than substance-free, marketing-friendly flash.

And there are three specific reasons why.

1. Accuracy

Much of the failure of LG's hand gesture system, as our friendly neighborhood reviewers noted, revolved around the fact that the thing just wasn't very good at figuring out what you were trying to do. I mean, c'mon: Even just activating the system sounds like an exercise in frustration — and after that, you're relying on what's ultimately a fancy camera sensor to detect your gestures and interpret them accordingly.

Google's system, in contrast, uses a shrunken-down radar chip created by the company's Motorola-born Advanced Technology and Projects (ATAP) group. It's something the group has been working on independently of Android since 2015, and it's something we've seen demoed numerous times along the way.

The whole point of using radar, as I explained in my in-depth Project Soli exploration earlier this summer, is that it's supposedly able to track the tiniest hand movements — "micromotions" or "twitches," as they're lovingly called. The system built around it, according to ATAP's engineers, was designed to "extract specific gesture information" from the radar signal at a "high frame rate."

What that ultimately means is that the chip can, at least in theory, sense precisely and reliably how you're moving your hand — making a twisting motion as if you were turning a volume knob up or down, for instance, or tapping your thumb and index finger together as if you were pressing a button — and then perform an action on your device that's mapped to that specific movement. And it doesn't require any complicated, time-consuming sequence of hocus-pocus hand-into-claw manipulation to activate.
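
To make that pipeline a little more concrete, here's a minimal sketch of the gesture-to-action mapping idea — with the heavy caveat that Google has published no Soli API, so the gesture names and callback shape below are pure invention for illustration's sake:

```kotlin
// Hypothetical sketch only: no public Soli API exists, so the gesture
// labels and callback below are invented to illustrate the concept.
enum class SoliGesture { DIAL_CLOCKWISE, DIAL_COUNTERCLOCKWISE, FINGER_TAP, SWIPE_LEFT, SWIPE_RIGHT }

// Imagined output of the high-frame-rate radar classifier: a discrete
// gesture label, which the device maps to a specific action.
fun onGestureDetected(gesture: SoliGesture) = when (gesture) {
    SoliGesture.DIAL_CLOCKWISE        -> println("Volume up")
    SoliGesture.DIAL_COUNTERCLOCKWISE -> println("Volume down")
    SoliGesture.FINGER_TAP            -> println("Play / pause")
    SoliGesture.SWIPE_LEFT            -> println("Previous track")
    SoliGesture.SWIPE_RIGHT           -> println("Next track")
}

fun main() {
    // Simulated gesture stream, standing in for the radar chip's classifier.
    listOf(SoliGesture.FINGER_TAP, SoliGesture.DIAL_CLOCKWISE, SoliGesture.SWIPE_RIGHT)
        .forEach(::onGestureDetected)
}
```

The mapping itself is the easy part, of course; the hard part — and where ATAP's real work lives — is the radar classifier feeding into that callback, the piece that turns raw micromotions into reliable gesture labels.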

ATAP's public demos show exactly those sorts of interactions at work.

What's more, "even though these controls are virtual," Google's ATAP team has said, the interactions "feel physical and responsive" — with feedback "generated by the haptic sensation of fingers touching each other."

Now, let's be clear: Seeing the setup in a carefully controlled demo isn't the same as actually using it in the real world. But it's pretty apparent that this is a different level of technology than what LG attempted and that it has the potential to open up some interesting new doors.

Even if it works well, though, it has to offer some genuine, practical benefit beyond mere novelty. And that's where our next two reasons for optimism come into play.

2. Distance

LG's gesture detection system, much like Samsung's Air Gesture feature before it, requires you to hold your hand a few short inches from your phone's screen — at which point it's easy to think, "Well, golly jeepers: If my hand is four inches in front of my phone already, why don't I just reach out and touch the dad-gummed thing instead of fussing around with all this hand gesture mumbo-jumbo?"

Use of "dad-gummed" aside, it's a perfectly reasonable question to ponder. And it's another area where Google's Pixel 4 radar system should — or at least could — be different.

According to previous ATAP demos, the Project Soli radar system can sense and detect gestures being performed as far as 15 meters — roughly 49 feet — away. Forty-nine feet! That's almost a third of the width of a U.S. football field. And since it's using radar, not a camera, to "see" and interpret your hand gestures, you shouldn't have to position your hand directly in any line of sight in order for your commands to be detected.

Imagine — provided, of course, that this all works as well as the demos suggest — what sorts of practical possibilities that could create for controlling your phone while you're driving, running, working out, working outside, or doing (ahem) anything else where your hands aren't readily available.

And on a related note...

3. Ability to detect through materials

This last factor is huge: According to Google's ATAP group, the nature of the radar technology being used in the Pixel 4 allows the system to detect hand movements even through fabrics — without any visible path between your hand and the gadget.

Again, we're going off of unproven information here and working without the context of the Pixel 4's specific implementation, but the technology's general capability certainly suggests the Soli-enabled gestures could work even when a phone is tucked away in a pocket, purse, or backpack. Intriguing, no?

Still, even if we go out on a limb and assume that this all works consistently well, even in messy real-world conditions, there's more to consider.

The bigger picture

No matter how much we may know about the technology behind Google's Pixel 4 gesture system, there's one big, prickly unknown — and that's what exactly the phone's gestures will empower us to do. In its initial tease of the feature this week, Google mentioned the system allowing you to skip songs, snooze alarms, and silence phone calls. Those are certainly things you'd want to do when you can't easily swipe around on your phone's screen, for one reason or another, but they're also a relatively limited set of actions for such a powerful-seeming piece of technology.

Google also, however, said something that seems significant: "These capabilities are just the start, and just as Pixels get better over time, Motion Sense will evolve as well."

So what else could the system eventually accomplish? All it takes is a little creative thinking to imagine the possibilities. It's entirely conceivable that this technology could let you swipe your hand left or right from across the room to move through slides or images in a presentation you're casting to a larger screen — or to scroll through a document or web page in a similar manner. Adjusting volume from afar seems like an obvious possibility (and one we've already seen demonstrated with Project Soli, in fact). And it wouldn't be much of a stretch to imagine the system integrating with connected smart hardware and allowing you to do things like adjust the level of light in a room by moving your hand up or down in a particular way — something along the lines of the sketch below.
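
Here's what that last idea might look like in code — again, purely hypothetical, since no such Soli-to-smart-home bridge exists publicly; the SmartLight class and the radar readings below are invented for illustration:

```kotlin
// Hypothetical sketch: translates a radar-reported change in hand height
// into a connected light's brightness. SmartLight is an invented stand-in
// for whatever smart-home integration such a system might actually use.
class SmartLight(var brightness: Int = 50) {  // brightness in percent, 0..100
    fun adjustBy(deltaPercent: Int) {
        brightness = (brightness + deltaPercent).coerceIn(0, 100)
        println("Light set to $brightness%")
    }
}

fun main() {
    val light = SmartLight()
    // Pretend each value is the hand's vertical movement in centimeters,
    // as a radar tracker might report it; scale it into a brightness change.
    val handDeltasCm = listOf(5, 5, -3, 10)
    for (delta in handDeltasCm) light.adjustBy(delta * 2)
}
```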

There's also the fact that the Pixel 4 will almost certainly be just the first of many Google-made devices to feature this technology. (Google itself suggested as much in its announcement: "Pixel 4 will be the first device with Soli" — thus implying it won't be the last.) Over the past few years, Google's ATAP team has talked about the Soli radar technology working with wearables, speakers, phones, computers, and even vehicles — all areas where Google has a hand in creating products.

So, yeah: It's no huge stretch to say Soli could eventually become the common thread across Google's various device lines and serve as a distinguishing feature no other company is likely to match. It could be the missing piece of the puzzle that really, truly shows off the value of Google's homegrown hardware effort and its end-to-end control of the entire user experience.

For now, there's every reason to remain skeptical about how Soli will fare outside of Google's walls and how valuable it'll be from a tech-using-human perspective. But there are also some pretty compelling reasons to be optimistic — to think that maybe, just maybe, there might be more to this than what we've seen before.
