Recently, the technology outlet PiunikaWeb reported that Google is preparing to introduce a new feature called "Look and Sign" on the upcoming Pixel Tablet. The move aims to further optimize and enhance how users interact with the device's AI.
"Look and Sign" is the introduction of a feature that will allow users to express commands through simple gestures and will allow for more direct interaction with the Google Assistant . For example, users can communicate their intentions through a thumbs-up or other obvious gesture, greatly simplifying the communication process between the user and the device and reducing reliance on verbal instructions. The launch of this feature will not only provide ordinary users with a more convenient way of interaction, but is also expected to bring good news to user groups who use sign language. Become an important feature with accessibility features.
Google previously launched a similar "Look and Talk" feature on its Nest Hub Max. Using the device's camera, users can skip the "Hey Google" wake word and simply look toward the Nest Hub Max while speaking, and the device will respond. The arrival of "Look and Sign" would further enrich and expand how users interact with smart devices, bringing AI technology closer to real user needs.