There’s no doubt privacy was a major selling point at Apple’s WWDC (Worldwide Developers Conference) this year. From letting developers add a whole new way to sign up and authenticate users with ‘Sign In with Apple’ to adding a bunch of small yet powerful privacy-focused features in iOS 13 and macOS, Apple has clearly left no stone unturned in its push towards privacy. Developers are still discovering new iOS 13 features that reflect Apple’s big privacy push.
In iOS 13, Apple is making it easier for users to control their personal information, including their location. Apps may need your location for a variety of reasons, but you may not want every app to have access to it.
According to 9to5Mac, the iOS 13 developer beta displays pop-up notifications when an app has been using your location data in the background. The notification includes a map of the locations the app has been tracking, along with the developer’s stated reason for capturing location data in the background.
Users can choose to continue giving the app access to their location or switch it to ‘only while using’. At the WWDC 2019 keynote, Apple also made it clear that with iOS 13, users can allow an app to access location data just once; the next time the app is opened, it must ask for permission again.
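From an app’s side, the new ‘Allow Once’ option doesn’t require a separate API call: it surfaces through the existing CoreLocation authorization flow. Below is a minimal sketch (the class name `LocationPermissionManager` is a hypothetical example); an ‘Allow Once’ grant reports `.authorizedWhenInUse` for the current session and reverts to `.notDetermined` on the next launch.

```swift
import CoreLocation

final class LocationPermissionManager: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Triggers the system prompt; on iOS 13 the user can pick
    // "Allow Once", "Allow While Using App", or "Don't Allow".
    func requestAccess() {
        manager.requestWhenInUseAuthorization()
    }

    func locationManager(_ manager: CLLocationManager,
                         didChangeAuthorization status: CLAuthorizationStatus) {
        switch status {
        case .authorizedWhenInUse:
            // Covers both "Allow While Using App" and "Allow Once";
            // an "Allow Once" grant does not persist across launches.
            manager.startUpdatingLocation()
        default:
            break
        }
    }
}
```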
These notifications won’t appear on Apple’s other platforms: tvOS and macOS don’t support always-on location tracking, and on watchOS the notification isn’t needed.
Apart from its privacy-centric features, iOS 13 also adds multi-cam support. This means the mobile operating system will allow apps to simultaneously capture photos, video, metadata, and depth from multiple microphones and cameras on a single iOS device.
At WWDC, Apple executives demonstrated the new feature during a presentation. The Cupertino-based company demoed a video recording app with a picture-in-picture mode that simultaneously recorded video from the front camera and the primary camera at the back.
The feature will enable developers to access the dual TrueDepth cameras and combine multiple streams. Multi-cam in iOS 13 will be limited to the iPhone XS, iPhone XS Max, iPhone XR, and the iPad Pro. Apple also made it clear that, due to limited resources, iPhones and iPads will only be able to run a single multi-cam session at a time.
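In AVFoundation, this capability is exposed through the new `AVCaptureMultiCamSession` class. The sketch below (the function name is a hypothetical example) shows the general shape of wiring the front and back cameras into one session; unlike a regular `AVCaptureSession`, inputs and outputs are typically added without implicit connections so each camera can be routed to its own output explicitly.

```swift
import AVFoundation

// Sketch: a session streaming the front and back wide-angle cameras at once.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    // Multi-cam only works on supported hardware (e.g. iPhone XS/XR-class devices).
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    for position in [AVCaptureDevice.Position.back, .front] {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: position),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { continue }
        // Add without implicit connections so the wiring is explicit.
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        guard session.canAddOutput(output) else { continue }
        session.addOutputWithNoConnections(output)

        // Connect this camera's video port to its own output.
        guard let port = input.ports(for: .video,
                                     sourceDeviceType: device.deviceType,
                                     sourceDevicePosition: position).first
        else { continue }
        let connection = AVCaptureConnection(inputPorts: [port], output: output)
        if session.canAddConnection(connection) {
            session.addConnection(connection)
        }
    }
    return session
}
```

A picture-in-picture recorder like the one demoed at WWDC would attach a delegate to each `AVCaptureVideoDataOutput` and composite the two streams.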
Another camera feature coming with iOS 13 is Semantic Segmentation Mattes. This API lets developers identify regions such as skin, hair, and teeth in a photo, which could let them build apps that apply a range of visual effects to users’ photos.
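In practice, these mattes are requested through `AVCapturePhotoOutput` and read back from the captured `AVCapturePhoto`. A minimal sketch (function names here are hypothetical examples):

```swift
import AVFoundation

// Sketch: ask for skin/hair/teeth mattes in a photo capture (iOS 13+).
func configuredPhotoSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // Matte types must first be enabled on the output; what's available
    // depends on the device and the active capture format.
    output.enabledSemanticSegmentationMatteTypes =
        output.availableSemanticSegmentationMatteTypes

    let settings = AVCapturePhotoSettings()
    settings.enabledSemanticSegmentationMatteTypes =
        output.availableSemanticSegmentationMatteTypes
    return settings
}

// In the AVCapturePhotoCaptureDelegate callback, each matte can be read back.
func mattes(from photo: AVCapturePhoto) -> [AVSemanticSegmentationMatte] {
    [.skin, .hair, .teeth].compactMap { photo.semanticSegmentationMatte(for: $0) }
}
```

Each matte is a per-pixel mask, so an effects app could, for example, recolor only the hair region while leaving skin untouched.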