Recipes suggested by an artificial intelligence bot have would-be cooks saying "no more"

An artificial intelligence bot from New Zealand is making people who just want to prepare practical meals say "no more" by suggesting literally deadly concoctions.

New Zealand-based supermarket chain Pak’nSave offers shoppers a tool that uses artificial intelligence to create recipes from ingredients they already have in the fridge. The Savey Meal-Bot was first introduced in June, and Pak’nSave says you need to enter at least three ingredients to get a recipe that requires no extra trip to the store.

But sometimes things don’t go as planned. New Zealand commentator Liam Hehir wrote on Twitter that he asked the Pak’nSave bot to create a recipe using only water, ammonia and bleach. The bot complied and offered a recipe for an “Aromatic Water Blend”, “the perfect soft drink to quench your thirst and refresh your senses.” As any elementary school chemistry teacher will stress, this mixture produces deadly chlorine gas. The bot was just as willing to offer a recipe when presented with water, bread and other ingredients including “ant-venom-flavored jelly”.

The Savey bot’s recipes have already given plenty of people a laugh, but more and more users have noticed that the bot places no restrictions on what can go into a recipe, including cleaning products, glue and even cat food.

Normal ingredients don’t always give good results either.

Even when you enter normal ingredients, you won’t always get a recipe that is actually edible. While the company told The Guardian it was disappointed that some people were using the tool “inappropriately”, the bot has no qualms about suggesting a liquid, thyme-flavored milk sauce for a sage-marinated tofu and nori sandwich. Nor does it seem to leave any of the ingredients you offer out of the recipe. In some cases it even adds ingredients, such as bread or milk, which defeats the purpose of sparing you a trip to the grocery store.

Fortunately, Pak’nSave says it’s working to improve the bot’s settings.

Users of Savey are informed that the recipes are not reviewed by a human and that Pak’nSave makes no claims about the “accuracy, suitability, or safe consumption” of the suggested recipes or their ingredients. With disclaimers like these, it’s debatable why the bot was introduced in the first place.

It’s not known which AI model the company uses for its Savey bot, but ChatGPT and other major chatbots have restrictions against requests to create poison gas or other harmful chemicals or devices, whether Molotov cocktails or chlorine gas. Those same chatbots, however, are notoriously bad at producing quality, or even edible, recipes.

This is not a new problem. Neural networks and early chatbots have failed to create viable, edible recipes for as long as they have been around. While modern AI bots are getting better at recognizing individual ingredients, the AI is trained on which ingredients are often used together, not on what actually results from combining them.
