The iOS 16 Photo Feature I Can’t Wait to Try


    
    A new tool in iOS 16 lets you remove the background from a photo just by tapping and holding on it.
    Apple
    


    This story is part of WWDC 2022, CNET’s complete coverage from and about Apple’s annual developers conference.
    


    Apple’s WWDC keynote gave us previews of MacOS Ventura, iPadOS 16, WatchOS 9 and, of course, iOS 16. The next major version of iPhone software will include editable Messages and a customizable lock screen. But there was one feature that truly grabbed my attention Monday. It stood out despite taking up less than 15 seconds of the nearly 2-hour keynote.
    The feature doesn’t have a name, but here’s how it works: You tap and hold on a photo to remove the background. And if you keep holding your finger against the screen, you can then lift the foreground “cutout” into another app to post or share. It seems similar to the way Portrait mode photos separate a person from their backdrop.
    Technically, the tap-and-hold photo feature is part of Visual Lookup, which was first launched with iOS 15 and can recognize objects in your photos such as plants, food, landmarks and even pets. Visual Lookup doesn’t just identify objects; it can also give you information about them. In my Photos app, for example, a picture I took of the Golden Gate Bridge has a link to Siri Knowledge that shows information about the bridge and a link to Maps displaying how to get there.
    



    Robby Walker, Apple senior director of Siri Language and Technologies, demonstrated the new tap-and-hold tool on a photo of a French bulldog. The dog was “cut out” of the photo and then dragged and dropped into the text field of a message.
    


    “It feels like magic,” Walker said.
    Sometimes Apple overuses the word “magic,” but this tool does seem impressive. Walker was quick to point out that the effect was the result of an advanced machine-learning model, accelerated by Core ML and Apple’s Neural Engine to perform 40 billion operations per second.
    Knowing the amount of processing and machine learning required to cut a dog out of a photo thrills me to no end. New phone features often need to be revolutionary or solve a serious problem. I guess you could say the tap-and-hold tool solves the problem of removing a photo’s background, which for at least some people is a serious matter.
    


    
    


    
    I couldn’t help noticing the similarity to another photo feature in iOS 16. On the lock screen, the photo editor separates the foreground subject from the background of the photo used for your wallpaper. That lets lock screen elements like the time and date be layered behind the subject of your wallpaper but in front of the photo’s background, so it looks like the cover of a magazine.
    I haven’t been able to try the new Visual Lookup feature, so instead I’ve been watching the part of the WWDC keynote where that French bulldog gets pulled out of its photo, over and over. If you have a spare iPhone to try it on, a developer beta of iOS 16 is already available, and a public beta will be out in July.

For more, check out everything that Apple announced at WWDC 2022, including the new M2 MacBook Air.