Apple’s hidden photo feature that will blow your mind

Written by Kamil Arli

Forbes columnist Ian Morris wrote an article on Apple’s hidden photo feature.

As we leave 2016 behind, it’s worth taking a look at one of the features of Apple’s iOS 10 that doesn’t get as much coverage as it should. In this case, it’s the iPhone’s ability to search your photos and tag both people and objects. What’s interesting here, though, is that Apple does this locally, rather than using the cloud as Google does with its Photos service.

IT IS NOT INSIGNIFICANT THAT THIS HAPPENS LOCALLY EITHER

It’s not insignificant that this happens locally either. For one thing, it’s a lot harder to do this sort of analysis on a phone than on a cluster of supercomputers. But it also means that those who don’t sync images to iCloud can still enjoy the service. I do use cloud sync, which means I can also see photos from other uploads I might make, although I tend not to use iCloud for non-iPhone images.

BOTH GOOGLE’S AND APPLE’S SERVICES USE COMPUTER VISION

Both Google’s and Apple’s services use “computer vision” to detect what’s in photos. There is also hardware that can do this with lookup tables; Qualcomm has previously demonstrated this to me as part of its recent Snapdragon processors. But on the iPhone it’s incredibly easy to pick a subject, say “cats”, and search your photo library for images that feature a cat.

To spare you power and performance issues, iOS only catalogues photos while the phone is charging. When you take a new photo, Apple’s Image Signal Processor is able to detect objects and people, and will then look them up against its inbuilt database. At the moment this isn’t shared between your devices though, so don’t expect to see your “favourite” people turn up on an iPad when you tag them on the iPhone.
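
For the curious, Apple later exposed a public slice of this kind of on-device analysis to developers through its Vision framework. The snippet below is only a rough sketch of the idea, not the Photos app’s private pipeline: it classifies a single image entirely on the device, and the imageURL parameter is just a placeholder for a photo on disk.

```swift
import Foundation
import Vision

// A minimal sketch, not Apple's private Photos pipeline: classify one image
// entirely on-device with the public Vision framework (added in later iOS
// releases). `imageURL` is a placeholder for a photo on disk.
func labels(for imageURL: URL) throws -> [String] {
    let request = VNClassifyImageRequest()                        // built-in scene/object classifier
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])                                // runs locally, no cloud round trip

    // Keep only reasonably confident labels, e.g. "cat", "beach", "dog".
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.3 }
        .map { $0.identifier }
}
```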

IT IS WORTH TRYING A FEW SEARCHES TO SEE WHAT YOU CAN COME UP WITH

It’s worth trying a few searches to see what you can come up with. To get going, just open the Apple Photos app and tap the little search magnifying glass. Amazingly, iOS recognises Father Christmas, cats and the names of people. In the latter case you can tell Apple Photos who appears in your images, and it should then be able to identify those people again. You can also go through a process where the phone suggests photos of the person you tagged and asks you to confirm that it’s them; this is quicker than letting the AI do it.
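
The people side of this rests on on-device face detection. Apple’s own face-recognition pipeline in Photos isn’t public, but the sketch below shows the basic building block using Core Image’s long-standing face detector; photoURL here is a placeholder.

```swift
import CoreImage
import Foundation

// A rough sketch of on-device face detection with Core Image's public detector,
// which predates iOS 10; the Photos app's own face-recognition pipeline is
// private. `photoURL` is a placeholder for an image on disk.
func faceCount(in photoURL: URL) -> Int {
    guard let image = CIImage(contentsOf: photoURL),
          let detector = CIDetector(ofType: CIDetectorTypeFace,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    else { return 0 }
    return detector.features(in: image).count   // one CIFaceFeature per detected face
}
```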

Apple’s photo app can detect people and objects – image credit: Ian Morris

Another big advantage of this feature is that the phone or tablet can use what it knows to build you collections of photos. Obviously, like all mobile devices these days, it can place photos based on location. But Apple’s Photos feature is a bit cleverer than that: the idea is that it can build slideshows based on themes as well as locations. It should be able to make slideshows that you actually want to watch, rather than just bundling up a bunch of photos taken at one time, in one place.
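
The theme-based grouping described above is Apple’s own and isn’t something developers can query directly, but the simpler building block, gathering every photo taken in a given window of time, is available through the public PhotoKit API. The sketch below assumes photo-library permission has already been granted; dayStart and dayEnd are placeholder dates.

```swift
import Photos

// A simple sketch of the "photos taken at one time" building block using the
// public PhotoKit API; the theme-based grouping the article describes is
// Apple's own and isn't exposed to developers. Assumes photo-library access
// has been granted. `dayStart` and `dayEnd` are placeholder dates.
func photos(from dayStart: Date, to dayEnd: Date) -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "creationDate >= %@ AND creationDate < %@",
                                    dayStart as NSDate, dayEnd as NSDate)
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
    return PHAsset.fetchAssets(with: .image, options: options)
}
```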

If anything, this feature is more exciting when you consider the future. It’s only going to get better as time goes on, and each update to iOS will more than likely see the AI improve and bring new features. It’s certainly a lot easier to search for pictures of your cat by typing “cat” than it is by any other means. Really, this feature is about building on the brilliance of geo-tagged photos to make search even more useful to humans.

About the author

Kamil Arli

Editor of DigitalReview.co. Digital Media Consultant
