No more nervous dates or interviews: AI-powered glasses prompt answers to everything
Getting the jitters on a first date? Heart pounding before a big interview? Students at Stanford University may have a solution, so that no one needs to rack their brains for the right words.
They have developed a pair of glasses that “listens to your conversation and tells you exactly what to say next.” In a tweet, Stanford student Bryan Hau-Ping Chiang explains the newly developed prototype.
say goodbye to awkward dates and job interviews ☹️
we made rizzGPT -- real-time Charisma as a Service (CaaS)
it listens to your conversation and tells you exactly what to say next 😱
built using GPT-4, Whisper and the Monocle AR glasses
with @C51Alix @varunshenoy_ pic.twitter.com/HycQGGXT6N
— Bryan Hau-Ping Chiang (@bryanhpchiang) March 26, 2023
In a video demonstration of the AI/AR glasses' capabilities, the Stanford researchers can barely contain their delight. Alix poses questions to Varun, who is wearing the AR monocle. The glasses pick up the question and, after a short delay, display a suggested answer on the lens, which Varun then reads out.

The glasses are built on GPT-4, OpenAI's Whisper, a speech-recognition model, and the Monocle AR glasses provided by Brilliant Labs. The glasses also have a microphone, a high-resolution display, and a camera.
How the glasses work
The AI glasses 'rizzGPT' communicate via Bluetooth with a web application running on a host device, such as the user's phone. When the wearer engages in conversation, the captured audio is transcribed to text in real time by OpenAI's Whisper; the transcript is then fed to GPT-4, which suggests a response and sends it back to the glasses' display.
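To make that loop concrete, here is a minimal sketch of the audio-to-suggestion step, assuming the OpenAI Python SDK's pre-1.0 interface. The rizzGPT source code has not been published, so the prompt wording, the suggest_reply helper, and the file name conversation_chunk.wav are illustrative assumptions, not the team's implementation.

```python
# Illustrative sketch of a rizzGPT-style loop: mic audio -> Whisper transcript -> GPT-4 suggestion.
# Not the team's code; names and prompt text below are hypothetical.
import openai

openai.api_key = "sk-..."  # the user's own API key

def suggest_reply(audio_path: str) -> str:
    # 1. Speech-to-text: Whisper turns the captured conversation audio into a transcript.
    with open(audio_path, "rb") as audio_file:
        transcript = openai.Audio.transcribe("whisper-1", audio_file)["text"]

    # 2. Language model: GPT-4 proposes what the wearer could say next.
    completion = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "You are listening to a live conversation. "
                        "Suggest a short, natural reply for the wearer."},
            {"role": "user", "content": transcript},
        ],
    )
    return completion.choices[0].message["content"]

# 3. In the prototype, the host app would push this string over Bluetooth
#    to the Monocle display instead of printing it.
print(suggest_reply("conversation_chunk.wav"))
```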
All of this happens in a split second, provided the internet connection is fast enough; the transcription speed depends on the Wi-Fi connection.

Also, the glasses shown above can't yet be worn on an actual date or interview, as the team's pitch suggests. The prototype still needs a longer battery life, a lighter and less bulky frame and, most importantly, a faster response time. Only then might we be looking at a product that can be used in everyday life. As the researchers say in their tweet, "lots more to build here, especially once multimodal GPT4 arrives."