- Researchers at Penn State have developed an electronic gustatory system, simulating the human sense of taste to infuse AI with emotional intelligence.
- This breakthrough technology offers AI applications ranging from personalized dietary recommendations based on emotions to refined AI systems capable of distinguishing nuanced tastes, such as a person's favorite foods.
Penn State University researchers have made significant strides in the development of artificial intelligence (AI) that can simulate and understand human taste. Their work centers on creating a “gustatory emotional intelligence” in AI systems, aiming to combine the physiological aspects of taste with the psychological aspects of human emotion.
While AI has made remarkable advancements in various fields, it has largely failed to incorporate the complex emotional facets of human behavior.
Bridging the Gap: Emotion in Artificial Intelligence
Dr. Saptarshi Das, an associate professor of engineering science and mechanics at Penn State and a co-author of the study, emphasized the importance of bringing emotional intelligence to AI. This involves understanding the emotional connections we have with our choices, particularly when it comes to favorite foods.
Complex Relationship Between Physiology and Psychology
In their study, the researchers observed that the process of gustation (our sense of taste) plays a significant role in determining what we eat. It is not solely about physiological needs like hunger; our psychological state and emotional preferences play a pivotal role. Dr. Das gave the example of choosing a favorite food from several available options based on preference, even when you are not hungry.
Mimicking Human Taste with Graphene-Based Sensors
The researchers created a simplified, biomimetic version of the human gustatory system, comprising an electronic “tongue” and “gustatory cortex.”
- This system utilizes 2D materials, which are only one to a few atoms thick, and electronic sensors made of graphene (chemitransistors) to detect gas and chemical molecules.
- They also employed memtransistors made of molybdenum disulfide, which can remember past signals.
- By combining these materials, they developed an “electronic gustatory cortex” that mimics the human physiological state (“hunger neuron”), psychological state (“appetite neuron”), and the decision-making process (“feeding circuit”).
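To make the hunger/appetite/feeding-circuit idea concrete, here is a minimal toy model in Python. This is an illustration only, not the researchers' actual hardware or algorithm: the function names, scoring, and threshold are assumptions, sketching how a "feeding circuit" might weigh a physiological signal (hunger) against a psychological one (appetite for a particular food) before deciding to eat.

```python
# Toy sketch (assumed, not the authors' implementation) of a "feeding
# circuit" combining a physiological signal with psychological preferences.

def feeding_circuit(hunger, appetites, threshold=0.5):
    """Decide whether to eat, and what.

    hunger    -- physiological drive, 0.0 (full) to 1.0 (starving)
    appetites -- dict mapping food name -> psychological appeal, 0.0 to 1.0
    Returns the chosen food, or None if the overall drive is too weak.
    """
    # The "appetite neuron": pick the food with the strongest pull.
    food, appetite = max(appetites.items(), key=lambda kv: kv[1])
    # The "feeding circuit": a strong enough appetite can trigger eating
    # even with little hunger, mirroring eating a favorite food when full.
    drive = max(hunger, appetite)
    return food if drive >= threshold else None

# Full after lunch, but dessert is tempting:
print(feeding_circuit(hunger=0.1, appetites={"salad": 0.3, "cake": 0.9}))
# Neither hungry nor tempted -> no eating:
print(feeding_circuit(hunger=0.2, appetites={"salad": 0.3, "cake": 0.2}))
```

The first call returns "cake" even though hunger is low, while the second returns None; swapping in a weighted sum instead of `max` would be an equally plausible way to blend the two signals.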
Expanding the Capabilities of the Electronic Tongue
The AI-driven tongue can detect different tastes, such as sweet, salty, sour, bitter, and umami. The researchers envision the technology as having various applications, from creating AI-generated diets based on emotional intelligence to personalizing restaurant menus. Moreover, they plan to expand the electronic tongue’s capabilities to mimic the subtleties of human taste by creating arrays of graphene devices that can simulate the thousands of taste receptors on our tongues.
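One way to picture how an array of sensors could resolve the five basic tastes is nearest-profile matching: each taste leaves a characteristic pattern of responses across the sensor array, and a reading is labeled by the closest reference pattern. The profiles and distance metric below are illustrative assumptions, not data from the Penn State device.

```python
# Toy sketch (assumed, not the actual device): classify a vector of
# sensor-array responses by its nearest reference "taste profile".
import math

# Hypothetical response patterns of a 5-sensor array to each basic taste.
PROFILES = {
    "sweet":  [0.9, 0.1, 0.0, 0.0, 0.1],
    "salty":  [0.1, 0.9, 0.1, 0.0, 0.0],
    "sour":   [0.0, 0.1, 0.9, 0.1, 0.0],
    "bitter": [0.0, 0.0, 0.1, 0.9, 0.1],
    "umami":  [0.1, 0.0, 0.0, 0.1, 0.9],
}

def classify(reading):
    """Return the taste whose profile is closest (Euclidean) to the reading."""
    return min(PROFILES, key=lambda taste: math.dist(reading, PROFILES[taste]))

print(classify([0.85, 0.15, 0.05, 0.0, 0.1]))  # closest to "sweet"
```

Scaling this idea from five profiles to arrays simulating thousands of taste receptors would let the same matching scheme distinguish much subtler taste mixtures.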
Advancing Emotional and Sensory Intelligence
In the near future, the team plans to integrate the artificial tongue and gustatory circuit into a single chip to further simplify and optimize the technology.
- Additionally, they hope to explore emotional intelligence in AI across other senses, such as olfaction (smell), to advance the capabilities of future AI systems.
- The researchers believe that this technology can potentially be refined to replicate human behavior more closely and contribute significantly to AI's development.
This groundbreaking work by Penn State researchers has laid the foundation for a new era of AI with emotional and sensory intelligence, transforming the way AI understands and responds to human behavior and preferences.
1. What is the purpose of the electronic gustatory system developed by Penn State researchers?
The purpose of this system is to infuse AI with emotional intelligence by simulating the human sense of taste.
2. How does the electronic gustatory system mimic human taste perception?
It uses artificial taste buds made of graphene-based chemitransistors to detect different taste profiles such as sweet, salty, sour, bitter, and umami.
3. What’s the next step in the research?
The researchers aim to make an integrated gustatory chip and expand the technology to simulate other human senses like sight, sound, touch, and smell in AI.