You may have seen some of the analysis taking place around Google Soli, which is being viewed with a great deal of excitement (even though it will not be out until next year).
There has been significant work in this space over the years, with Leap Motion focused on hand-based gestures and Microsoft Kinect addressing whole-body or room-scale sensing, along with numerous examples of specialized application interfaces.
The first time I recall writing about gesture-based interfaces was back in 2007, although the Wii came out in 2006 (hard to believe that was almost a decade ago). The excitement about Soli did surprise me, since Leap Motion technology is available today (version 2.2.6 was released this week) and can perform much of the same gesture sensing (although it doesn't have the same range as Soli).
In any case, I think we'll see a whole new level of experimentation in how humans and computers can interact in a more intuitive fashion, and that's a great thing.