Rogue AI bot is giving recipes for human flesh and chlorine gas
Developed to suggest recipes for the leftover food in people’s homes, a meal bot is now handing out (disaster) recipes to customers. The AI bot is a product of Pak'nSave, a New Zealand-based supermarket chain, and is powered by OpenAI’s GPT-3.5.
People have taken to social media to post the recipes that the Savey Meal-bot has come up with. A user on X, formerly Twitter, asked the bot what they could make using only water, bleach, and ammonia. The bot came up with a recipe for what it called an ‘aromatic water mix,’ which, as the user pointed out, is essentially a recipe for poisonous chlorine gas.
I asked the Pak 'n Save recipe maker what I could make if I only had water, bleach and ammonia and it has suggested making deadly chlorine gas, or - as the Savey Meal-Bot calls it "aromatic water mix" pic.twitter.com/ybuhgPWTAo
— Liam Hehir (@PronouncedHare) August 4, 2023
Interesting Engineering, too, tried putting in the same ingredients as X user Liam Hehir but was met with a message from the bot that read: “Invalid ingredients found, or ingredients too vague. Please try again!”
Not reviewed by humans
As with all large language models, GPT-3.5 is trained on vast amounts of data, and its behavior can be refined over time through further fine-tuning and added guardrails. That makes it less prone, over time, to mistakes like handing out recipes that could result in someone’s death.
A spokesperson for Pak'nSave told The Guardian that the company was disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose.”
The spokesperson added that the company would “keep fine-tuning our controls” of the bot to ensure it was safe and useful.
Before a user can put in their list of ingredients, the website does warn that the recipes are generated by an AI and are not reviewed by a human being.
“To the fullest extent permitted by law, we make no representations as to the accuracy, relevance or reliability of the recipe content that is generated, including that portion sizes will be appropriate for consumption or that any recipe will be a complete or balanced meal, or suitable for consumption,” reads the bot’s terms of use.
“You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot.”
“A free service aimed to help you reduce food waste”
For another X user, the Savey Meal-bot veered toward cannibalism, handing out a recipe for a ‘Mysterious Meat Stew’ calling for 500 grams of chopped human flesh along with potatoes, carrots, and onions.
Stick man? More like sick, man. pic.twitter.com/lyVvHHdbeS
— Camryn Brown (@camrynpetebrown) August 4, 2023
Interesting Engineering again tried entering an outlandish list of ingredients, including tar, bread, stones, mayonnaise, lettuce, and petrol, but the bot rejected the request once more. It looks to us like Pak'nSave may have fixed the bug.
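The rejection message suggests Pak'nSave now screens ingredient lists before they ever reach the language model. The company has not said how that screening works; the sketch below is purely illustrative, assuming a simple keyword blocklist and a minimal allowlist of recognisable foods, with all names and messages hypothetical.

```python
# Illustrative sketch only: one way a recipe bot could screen ingredient lists
# before passing them to a language model. This is not Pak'nSave's actual code.

UNSAFE_KEYWORDS = {"bleach", "ammonia", "petrol", "tar", "human flesh"}
KNOWN_FOODS = {"water", "bread", "potatoes", "carrots", "onions",
               "mayonnaise", "lettuce"}

REJECTION = "Invalid ingredients found, or ingredients too vague. Please try again!"


def validate_ingredients(ingredients):
    """Return a rejection message if the list should be blocked, else None."""
    normalized = [item.strip().lower() for item in ingredients]
    # Block anything containing a known unsafe keyword.
    if any(word in item for item in normalized for word in UNSAFE_KEYWORDS):
        return REJECTION
    # Require at least one recognisable food item.
    if not any(item in KNOWN_FOODS for item in normalized):
        return REJECTION
    return None  # safe to forward to the recipe-generating model


if __name__ == "__main__":
    print(validate_ingredients(["water", "bleach", "ammonia"]))   # rejected
    print(validate_ingredients(["potatoes", "carrots", "onions"]))  # None, allowed
```

A real deployment would likely combine checks like these with moderation on the model's output as well, which may be what the company means by continuing to “fine-tune our controls.”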