According to a Mashable columnist, Facebook’s search tool is about to get way more visual…
Director of Applied Machine Learning Joaquin Candela published a blog post today (accompanying his presentation at the Machine Learning @Scale event in New York City) to share updates about Facebook’s AI-based image-recognition tool. The improvements let the system analyze photos down to the “pixel level” and will let users search images based on their content, whether or not they’ve been manually tagged.
“Until recently, online search has always been a text-driven technology, even when searching through images,” he writes. “Whether an image was discoverable was dependent on whether it was sufficiently tagged or had the right caption — until now.”
Facebook’s computer-vision tools were originally envisioned to help the visually impaired navigate the service, discerning what’s in a photo just by scanning it. But today’s news shows that general Facebook users stand to benefit from the feature as well.
Crediting “a lot” of teams for the advancements, Candela wrote that Facebook’s general-purpose AI platform, FBLearner Flow, is now running 1.2 million AI experiments a month — six times more than it was just a year ago.
Built on top of that is Lumos, Facebook’s specialized platform for image and video understanding. Using Lumos, the network’s search tool can identify features in images and video automatically. For users, that capability will help pinpoint searches to the exact pic they’re looking for; for Facebook, the automation will make it easier to identify inappropriate content and spam.
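Facebook hasn’t published Lumos’s internals or API, but the core idea of content-based search is straightforward: a vision model predicts labels for each photo, and queries are matched against those predicted labels rather than against manually added tags or captions. Here is a minimal, purely illustrative sketch of that idea in Python; the photo names, labels, and the stub `classify` function are all hypothetical stand-ins for a real image-understanding model.

```python
# Illustrative sketch only -- not Facebook's actual Lumos code.
# A stub classifier stands in for an image-understanding model that
# predicts labels from pixels; search then matches queries to labels.

def classify(photo):
    """Hypothetical stand-in for a vision model: photo -> predicted labels."""
    fake_model_output = {
        "beach.jpg": {"beach", "ocean", "people"},
        "party.jpg": {"people dancing", "indoors"},
        "dog.jpg": {"dog", "grass"},
    }
    return fake_model_output.get(photo, set())

def build_index(photos):
    """Invert predicted labels into a label -> photos index, so untagged,
    caption-less photos become searchable by their visual content."""
    index = {}
    for photo in photos:
        for label in classify(photo):
            index.setdefault(label, set()).add(photo)
    return index

def search(index, query):
    """Return photos whose predicted labels match the query, sorted."""
    return sorted(index.get(query, set()))

photos = ["beach.jpg", "party.jpg", "dog.jpg"]
index = build_index(photos)
print(search(index, "people dancing"))  # ['party.jpg']
```

Note that none of the photos above carry manual tags; the search works entirely off the classifier’s output, which is the shift Candela’s post describes.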
These systems are also being employed to improve the platform’s automatic alt text (AAT) for photos, which makes the visual aspects of the platform more accessible to the visually impaired. With the new tools, a set of 12 new actions, like “people dancing,” has been added to the automatic image descriptions.