
Meta is Probably Training AI on Images Taken by Meta Ray-Bans


Facebook parent company Meta last week added new AI features to its camera-equipped Ray-Ban Meta Glasses. The camera on the glasses can now be used to get information about your surroundings and to remember things like where you parked, and there is also new support for video-based AI for "continuous real-time help."


With all of these new features that involve the camera continually viewing what’s around the wearer, there are new questions about what Meta is doing with that data. TechCrunch specifically asked Meta if it was using the images collected by the Meta Glasses to train AI models, and Meta declined to say.

"We're not publicly discussing that," Anuj Kumar, a senior director who works on AI wearables, told TechCrunch. "That's not something we typically share externally," another spokesperson added. When asked to clarify whether images are being used to train AI, the spokesperson said, "we're not saying either way."

TechCrunch doesn't come out and say it, but if the answer isn't a clear and definitive "no," it's likely that Meta does indeed plan to use images captured by the Meta Glasses to train Meta AI. If that weren't the case, there would seem to be little reason for Meta to be evasive, especially given all of the public scrutiny of the methods and data that companies use for AI training.

Meta does train its AI on publicly posted Instagram and Facebook images and stories, which it considers publicly available data. But images collected from the Meta Ray-Ban Glasses specifically for private interactions with the AI assistant aren't the same as a publicly posted Instagram image, and that distinction is what makes this concerning.


As TechCrunch notes, the new AI features for the Meta Glasses will capture a lot of passive images to feed to AI to answer questions about the wearer's surroundings. Asking the Meta Glasses for help picking an outfit, for example, means dozens of images of the inside of the wearer's home will be captured and uploaded to the cloud.

The Meta Glasses have always been used for capturing images and video, but in an active way. You generally know when you're taking a photo or video because it's for the express purpose of uploading to social media or saving a memory, as with any camera. With AI, though, these aren't images you're choosing to keep; they're being collected passively for the express purpose of interacting with the AI assistant.

Meta is pointedly declining to confirm what happens to images from the Meta Glasses that are uploaded to its cloud servers for AI use, and that's something Meta Glasses owners should be aware of. Using these new AI features could result in Meta collecting hundreds of private photos that wearers never intended or realized they were sharing.

If Meta is in fact not using Meta Glasses images this way, it should say so explicitly, so customers know exactly what is being shared with Meta and what it is being used for.


