Today, the U.S. Patent and Trademark Office officially granted Apple a patent covering computer systems that include a display generation component and one or more input devices that provide computer-generated experiences, including but not limited to electronic devices that provide virtual reality and mixed reality experiences through a view. Apple's invention largely delves into a future mixed reality headset that will work in tandem with micro hand gestures and eye tracking, which could be helpful when playing video games, navigating menus, and controlling media playback.
While displaying a three-dimensional environment, a computer system detects a hand at a first position that corresponds to a portion of the three-dimensional environment. In response to detecting the hand at the first position: In accordance with a determination that the hand is held in a first predefined configuration, the computer system displays a visual indication of a first operating context for gesture input using hand gestures in the three-dimensional environment; and in accordance with a determination that the hand is not held in the first predefined configuration, the computer system dispenses with displaying the visual indication.
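The claim language above amounts to a conditional check. As a minimal sketch, the logic could look like the following; the function name and the "ready" configuration label are illustrative assumptions, not Apple's actual implementation:

```python
# Hypothetical sketch of the claim's ready-state check. The names and
# the "ready" configuration label are illustrative assumptions.

def should_show_context_indicator(hand_configuration: str,
                                  hand_in_target_region: bool) -> bool:
    """Display a visual indication of the gesture operating context only
    when the hand is at a position corresponding to the relevant portion
    of the 3D environment AND held in the predefined configuration;
    otherwise, forgo displaying the indication."""
    return hand_in_target_region and hand_configuration == "ready"
```

In other words, position alone is not enough: the system requires both the predefined hand configuration and the relevant location before surfacing the indicator.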
In some versions, a computer system allows a user to use micro gestures performed with small movements of fingers relative to other fingers or parts of the same hand to interact with a three-dimensional environment (for example, a virtual or mixed reality environment).
The micro gestures are detected using cameras (e.g., cameras integrated into a head-mounted device or installed away from the user (e.g., in a CGR room)), as opposed to touch-sensitive surfaces or other physical controllers.
Different movements and locations of the micro gestures and different movement parameters are used to determine the operations performed in the three-dimensional environment. By using the cameras to capture the micro gestures for interacting with the three-dimensional environment, the user can move freely through the physical environment without being hindered by physical input devices, allowing the user to explore the three-dimensional environment more naturally and efficiently.
In addition, micro gestures are discreet and unobtrusive and suitable for interactions that may take place in public and/or require decorum.
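The patent describes mapping different micro gestures and their movement parameters to different operations. A hedged sketch of such a dispatch might look like this; the gesture names, operations, and magnitude handling are all assumptions for illustration:

```python
# Illustrative dispatch of recognized micro gestures to operations.
# Gesture names, operation names, and the magnitude parameter are
# assumptions, not taken from the patent.

def operation_for(gesture: str, magnitude: float) -> str:
    """Map a recognized micro gesture to an operation in the 3D
    environment; movement parameters (here, magnitude) can modulate
    the operation, e.g. how far media playback is scrubbed."""
    table = {
        "thumb_tap": "select",
        "thumb_swipe": "scrub_media",
        "finger_flick": "dismiss",
    }
    op = table.get(gesture, "none")
    if op == "scrub_media":
        return f"scrub_media:{round(magnitude, 2)}"
    return op
```

This reflects the idea in the passage above that both *which* gesture is performed and *how* it is performed (its movement parameters) determine the resulting operation.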
The configuration of the ready state of the hand is used by a computer system as an indication that the user intends to interact with the computer system in a predefined operating context different from the currently displayed operating context. For example, the predefined operating context is one or more interactions with the device that are outside the currently displayed application (e.g., a game, communication session, media playback session, navigation, etc.).
Apple’s patent FIG. 4 below illustrates hand tracking unit #243. In some embodiments, an eye tracker is configured to track the position and movement of the user’s gaze relative to the user’s hand. Patent FIG. 4 further includes a schematic representation of a depth map #410 captured by the image sensors #404.
Patent FIG. 4 also schematically illustrates a hand skeleton #414 that a controller eventually retrieves from the depth map of hand #40. In FIG. 4, the skeleton is superimposed on a hand background #416 that is segmented from the original depth map.
In some embodiments, key features of the hand (e.g., points corresponding to the knuckles, fingertips, the center of the palm, the end of the hand connected to the wrist, etc.), and optionally points on the wrist or arm connected to the hand, are identified and located on hand skeleton #414.
In some embodiments, the locations and movements of these key feature points across multiple image frames are used by the controller #110 to determine the hand gestures performed by the hand or the current state of the hand.
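The frame-over-frame tracking described above could be sketched as follows. This is a toy classifier, assuming hypothetical joint names ("thumb_tip", "index_tip") and distance thresholds; it is not the patent's actual method:

```python
import math

# Hypothetical classifier over key feature points tracked across frames.
# Each frame maps joint names to 3D coordinates; the joint names and
# the 1 cm / 2 cm thresholds are assumptions for illustration.

def classify_micro_gesture(frames: list) -> str:
    """Classify a micro gesture from the thumb tip's motion relative to
    the index fingertip across consecutive frames."""
    dists = [math.dist(f["thumb_tip"], f["index_tip"]) for f in frames]
    touched = min(dists) < 0.01          # fingers came into contact
    if touched and dists[0] > 0.02 and dists[-1] > 0.02:
        return "tap"                     # contact, then separated again
    if dists[-1] < 0.01:
        return "pinch_hold"              # fingers ended in contact
    return "none"
```

The key idea, as in the passage above, is that it is the trajectory of feature points across frames, not any single frame, that determines the gesture or hand state.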
Apple’s patent FIG. 7A above shows block diagrams illustrating user interactions with a three-dimensional environment.
Apple’s patent FIG. 7B below illustrates an example user interface context with menu #7170 that contains user interface objects #7172-7194. The menu is displayed in a mixed reality environment (e.g., floating in the air or on top of a physical object in a three-dimensional environment, and corresponding to operations associated with the mixed reality environment or with the physical object). In FIG. 7C, menu #7170 is displayed by a device (e.g., device #7100 or an HMD) overlaying at least part of a view of a physical environment captured by one or more cameras of device #7100. In some embodiments, the menu is displayed on a transparent or semi-transparent display of a device (for example, a heads-up display or an HMD) through which the physical environment is visible.
Apple’s patent FIG. 7G above illustrates example gestures performed with a hand in a ready state and example responses of a rendered three-dimensional environment that depend on a user’s gaze.
In some embodiments, the user’s gaze directed at a virtual object in a three-dimensional environment that responds to gesture input causes a visual indication of one or more available interaction options for the virtual object to be displayed only when the user’s hand is also held in a predefined ready state for providing gesture input.
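The gaze-plus-ready-state gating described above can be sketched as a simple lookup. The object IDs, state labels, and option lists below are assumptions for illustration, not Apple's actual API:

```python
# Illustrative gaze + ready-state gating. Object IDs, the "ready"
# state label, and the options mapping are assumptions for the sketch.

def visible_interaction_hints(gaze_target: str, hand_state: str,
                              interactive_objects: dict) -> list:
    """Return the interaction options to surface: only for the object
    the gaze is directed at, and only while the hand is held in the
    predefined ready state."""
    if hand_state != "ready":
        return []
    return interactive_objects.get(gaze_target, [])
```

Gaze alone selects the target, but no hints appear until the hand signals readiness, which keeps the environment uncluttered during passive viewing.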
For more details, see Apple’s granted patent 11,340,756.
Other recent reports on this subject from Patently Apple:
May 19: Apple invents new hand gesture tracking system based on camera data from events and more
May 14: Hand-tracking technology used to control future MR headset user interfaces is closer to market than you think
April 21: A new Apple patent describes in-depth the use of mid-air gestures to control a new kind of Mixed Reality headset interface
For more, check out Patently Apple’s archive of in-air gesture patents, which covers 60 patents.