(Nweon December 14, 2023) In a recently released support document, Apple stated that starting with iOS 17.2, it will begin collecting map-related data to improve the speed and accuracy of its augmented reality (AR) features.
In a support document titled "Help improve Augmented Reality Location Accuracy in Maps", Apple asks users to share data about their surroundings while using the AR feature, thereby helping to improve the speed and accuracy of augmented reality features in Apple Maps.
When you use the augmented reality feature in Apple Maps, you hold up your iPhone to scan your surroundings, and the system detects feature points on nearby buildings and other physical features.
Apple points out that when you use the augmented reality feature, photos are neither sent to Apple nor stored on the device; the system collects only feature-point data. Feature points are recorded in a form that humans cannot read, and the iPhone camera filters out moving objects such as people and vehicles, since Apple Maps only needs feature-point data for the stationary objects around you.
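Apple has not described how this filtering works internally, but the idea of keeping only stationary feature points can be sketched in a simple, purely hypothetical way: track each point across consecutive camera frames and drop any point that moved more than a small threshold. The function name, threshold, and point format below are illustrative assumptions, not Apple's implementation.

```python
def filter_static_points(prev, curr, max_displacement=0.02):
    """Keep only feature points that barely moved between two frames.

    Hypothetical sketch: drops points whose 3D position changed by more
    than `max_displacement` (e.g. points on a passing person or vehicle).
    `prev` and `curr` are matched lists of (x, y, z) tuples in meters.
    """
    static = []
    for p, c in zip(prev, curr):
        # Euclidean distance between the point's positions in the two frames
        displacement = sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5
        if displacement <= max_displacement:
            static.append(c)
    return static

prev = [(1.00, 2.00, 3.00), (0.50, 0.50, 1.00)]
curr = [(1.01, 2.00, 3.00), (0.90, 0.70, 1.20)]  # second point moved a lot
static = filter_static_points(prev, curr)  # only the first point survives
```

A real system would match points by descriptor rather than by list position, but the thresholding idea is the same.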
Apple also takes additional measures to protect user privacy. The collected data is encrypted so that it cannot be associated with any individual user or Apple ID. In addition, Apple adds "noise" to the data to make it more irregular, preventing the original scene from being reconstructed from it.
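Apple has not published the exact noise mechanism or its parameters, but noise injection of this kind is commonly done by adding small random perturbations to each coordinate of the anonymized feature points. The following is a minimal sketch under that assumption; the function, scale, and data layout are illustrative, not Apple's actual design.

```python
import random

def add_noise(points, scale=0.05, rng=None):
    """Perturb anonymized 3D feature points with small Gaussian offsets.

    Hypothetical illustration of noise injection: each coordinate gets
    an independent random offset drawn from N(0, scale^2), so individual
    points no longer match the true scene geometry exactly.
    """
    rng = rng or random.Random()
    return [
        tuple(coord + rng.gauss(0.0, scale) for coord in point)
        for point in points
    ]

points = [(1.20, 0.85, 3.10), (4.00, 2.50, 0.75)]
noisy = add_noise(points, scale=0.05, rng=random.Random(42))
```

The aggregate structure stays useful for localization while any single perturbed point reveals less about the exact scene.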
You can start or stop sharing your data at any time to help improve the accuracy of augmented reality positioning.