Enhancing Automation with AI Tongue: Empowering Robots to Experience the Pleasures of Food

Robots Can Now “Taste” Food Thanks to Artificial Intelligence

This article explores a project by researchers at Pennsylvania State University that has produced an electronic tongue, enabling robots to “taste” food. Rather than simply making robots consume food, the goal is to make them more human-like, able to perceive flavors and better serve our needs.

The electronic tongue consists of two parts: a tongue and a gustatory cortex. The synthetic taste buds use graphene-based electronic sensors called chemitransistors, which can detect the basic flavors: sweet, salty, sour, bitter, and umami. The gustatory cortex is built from memtransistors, devices made of molybdenum disulfide that can remember past signals. Together these form an “electronic gustatory cortex” linking components such as a “hunger neuron,” an “appetite neuron,” and a “feeding circuit.” For example, the sensors detect sodium ions in salty food, which is how the device “tastes” salt.
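To make the sensor-to-cortex flow concrete, here is a toy software analogue of that signal chain. Everything in it is illustrative: the thresholds, the decision rule, and the class names are inventions for this sketch, while the real device implements the “neurons” in graphene and molybdenum disulfide hardware.

```python
TASTES = ["sweet", "salty", "sour", "bitter", "umami"]

def sense(readings):
    """Return the dominant taste from per-channel sensor readings (0-1)."""
    return max(readings, key=readings.get)

class GustatoryCortex:
    """Toy cortex: gates a 'feeding circuit' on hunger, remembers past tastes."""
    def __init__(self, hunger=1.0):
        self.hunger = hunger  # 'hunger neuron' state
        self.history = []     # memtransistor-like memory of past signals

    def feed(self, readings):
        taste = sense(readings)
        self.history.append(taste)
        appetite = self.hunger * readings[taste]   # 'appetite neuron'
        self.hunger = max(0.0, self.hunger - 0.3)  # eating reduces hunger
        return taste, appetite > 0.2               # 'feeding circuit' decision

cortex = GustatoryCortex()
salty = {"sweet": 0.1, "salty": 0.9, "sour": 0.05, "bitter": 0.0, "umami": 0.2}
print(cortex.feed(salty))  # -> ('salty', True)
```

After repeated feedings the simulated hunger drops toward zero and the feeding circuit stops firing, mirroring the hunger-gated behavior the researchers describe.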

The potential applications of the AI tongue are vast. A robot equipped with this technology could suggest diets that are appealing to humans by “tasting” meals. The future possibility of training an AI system to become a wine taster is also intriguing. Additionally, the device could assist artificial intelligence programs in creating personalized weight loss plans. However, before expanding its capabilities, the research team plans to improve the electronic tongue’s range. They aim to replicate the 10,000 taste receptors found on the human tongue using arrays of graphene devices, each slightly different from the others, allowing for a more nuanced understanding of tastes.
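The planned arrays of slightly different graphene devices can be pictured with a small simulation. This sketch is hypothetical, with made-up sensitivity numbers and stimuli; the point is only that many non-identical sensors produce a combined response pattern that separates stimuli a single uniform sensor would confuse.

```python
import random

random.seed(0)
TASTES = ["sweet", "salty", "sour", "bitter", "umami"]

def make_array(n):
    """n simulated devices, per-taste sensitivity jittered around 1.0."""
    return [{t: 1.0 + random.uniform(-0.2, 0.2) for t in TASTES}
            for _ in range(n)]

def response(array, stimulus):
    """One reading per device: sensitivity-weighted sum over the stimulus."""
    return [sum(dev[t] * stimulus[t] for t in TASTES) for dev in array]

array = make_array(8)
cola  = {"sweet": 0.8, "salty": 0.0, "sour": 0.3, "bitter": 0.1, "umami": 0.0}
broth = {"sweet": 0.1, "salty": 0.7, "sour": 0.0, "bitter": 0.0, "umami": 0.6}
print(response(array, cola))  # an 8-element "fingerprint" for this stimulus
```

Scaling the array from 8 toy devices toward the 10,000 receptors of a human tongue would make these fingerprints correspondingly richer.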

In addition to the AI tongue, another fascinating project discussed is the creation of an AI nose. Joel Mainland and his team developed an AI model that correlates a molecule’s smell with its molecular structure. They trained the model on a dataset containing the molecular makeup and olfactory descriptions of about 5,000 known odorants. The model outperformed the average human panelist on 53% of the compounds and even made accurate predictions about odor strength, despite never being trained for that task.
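The structure-to-smell idea can be sketched very crudely in code. The published model is a neural network trained on thousands of labeled odorants; this stand-in fakes the concept with a nearest-neighbor lookup over invented molecular “fingerprint” vectors and odor labels, all of which are hypothetical.

```python
import math

# Hypothetical training set: fingerprint vector -> odor label.
TRAIN = [
    ((1.0, 0.2, 0.1), "fruity"),
    ((0.1, 1.0, 0.3), "sulfurous"),
    ((0.2, 0.1, 1.0), "floral"),
]

def predict(fingerprint):
    """Return the odor label of the nearest training molecule."""
    nearest = min(TRAIN, key=lambda fl: math.dist(fl[0], fingerprint))
    return nearest[1]

print(predict((0.9, 0.3, 0.2)))  # -> fruity
```

The real advance is that the model learns which structural features drive which smells, rather than just memorizing neighbors, but the input-to-label mapping is the same shape.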

These advancements in artificial intelligence bring us closer to digitizing odors and recording them for future reproduction. They could also lead to the identification of new fragrances and flavors for various industries, reducing the reliance on endangered plants and opening up possibilities for functional scents like mosquito repellents or malodor masking.

In conclusion, the Pennsylvania State University AI tongue and Joel Mainland’s AI nose are changing how machines perceive food and smells. These breakthroughs not only extend the capabilities of robots but also carry implications for many industries and for our understanding of sensory experience.

For more information on this research, you can check out the research paper published in Nature Communications.

Stay updated with more exciting digital trends by visiting Inquirer Tech.

Denial of responsibility! Vigour Times is an automatic aggregator of Global media. In each content, the hyperlink to the primary source is specified. All trademarks belong to their rightful owners, and all materials to their authors. For any complaint, please reach us at – [email protected]. We will take necessary action within 24 hours.