Saturday, November 23, 2024

Harvard Students Develop Meta Smart Glasses App That Reveals People’s Sensitive Details

Ray-Ban Meta smart glasses were used by two Harvard engineering students to build an app that can reveal sensitive information about people without them realising it. The students posted a video demo on X (formerly known as Twitter) showcasing the app’s capabilities. Notably, the app is not being made publicly available; instead, the students built it to highlight the dangers of AI-powered wearable devices whose discreet cameras can capture photos and videos of people.

The app, dubbed I-Xray, uses artificial intelligence (AI) for facial recognition and then processes the visual data to doxx individuals. Doxxing, an Internet slang term derived from “dropping dox” (dox being an informal spelling of docs, or documents), is the act of revealing personal information about someone without their consent.

The app was integrated with the Ray-Ban Meta smart glasses, but the developers said it would work with any smart glasses that have a discreet camera. It uses an AI model similar to PimEyes and FaceCheck for reverse face search, matching an individual’s face to publicly available images of them online and collecting the URLs of the pages where those images appear.

A large language model (LLM) is then fed these URLs along with an automatically generated prompt to extract the person’s name, occupation, address, and other similar details. The app also searches publicly available government records, such as voter registration databases, and uses an online people-search tool named FastPeopleSearch.

In a short video demonstration, Harvard students AnhPhu Nguyen and Caine Ardayfio showed the app in action. They approached strangers with the glasses’ camera already recording, asked for their names, and let the AI-powered app take over from there to surface personal details about each individual.

In a Google Docs document, the developers wrote, “This synergy between LLMs and reverse face search allows for fully automatic and comprehensive data extraction that was previously not possible with traditional methods alone.”

The students have stated that they do not intend to make the app publicly available and developed it only to highlight the risks of AI-enabled wearable devices that can discreetly record people. However, that does not mean bad actors could not build a similar app using the same methodology.
