AI learns to touch 👋 🧣

Exploring Human-AI Perception Alignment in Touch Experiences

We introduce perceptual alignment as a critical aspect of human-AI interaction, focusing on bridging the gap between human and AI tactile perception.

Our research presents the first exploration of this alignment through the “textile hand” task across two studies, examining how well LLMs align with human touch experiences (Zhong et al., 2024a; 2024b).

Study 1

We assess how multimodal LLMs (MLLMs) interpret the tactile qualities of textiles compared to human perception, a key challenge for online shopping environments. In an in-person study with 30 participants, we evaluate models from three GenAI families (a minimal prompt sketch follows the list below):

  • OpenAI GPTs,
  • Google Geminis,
  • Anthropic Claude 3.
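
As a rough illustration of this setup, the sketch below sends a textile image to a multimodal model and asks for tactile descriptors. It is a minimal example, not the study's exact evaluation pipeline; the prompt wording, model name, and image URL are placeholders.

```python
# Minimal sketch: asking a multimodal LLM to describe a textile's "hand".
# The prompt wording, model name, and image URL are illustrative placeholders,
# not the exact setup used in the study.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Describe how this textile would feel to touch "
                     "(e.g. softness, warmth, stretch, surface texture)."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/textile_swatch.jpg"}},
        ],
    }],
)

print(response.choices[0].message.content)  # the model's tactile description
```

The resulting description can then be compared against what human participants say about the same physical swatch.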

For full details, see the paper Feeling Textiles through AI: An exploration into Multimodal Language Models and Human Perception Alignment.

Human descriptions (crosses), AI-generated descriptions from MLLMs (dots), and from LLMs (triangles), visualised with 2D t-SNE.
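
For readers curious how such a plot is produced, here is a minimal sketch that embeds a few descriptions and projects them with 2D t-SNE. The embedding model and the example descriptions are placeholders, not the study's data or exact pipeline.

```python
# Minimal sketch: project human, MLLM, and LLM textile descriptions into 2D with t-SNE.
# The embedding model and the example descriptions below are placeholders.
import numpy as np
import matplotlib.pyplot as plt
from sentence_transformers import SentenceTransformer
from sklearn.manifold import TSNE

human_desc = ["soft and fluffy, with a slight stretch", "stiff, smooth, and cool to the touch"]
mllm_desc = ["a plush, warm knit with a springy feel", "a crisp, dense weave that feels cold"]
llm_desc = ["velvety and warm", "slick and rigid"]

texts = human_desc + mllm_desc + llm_desc
model = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence-embedding model works here
emb = np.asarray(model.encode(texts))

# Perplexity must stay below the number of samples.
xy = TSNE(n_components=2, perplexity=min(5, len(texts) - 1), random_state=0).fit_transform(emb)

markers = ["x"] * len(human_desc) + ["o"] * len(mllm_desc) + ["^"] * len(llm_desc)
for (x, y), m in zip(xy, markers):
    plt.scatter(x, y, marker=m, color="k")
plt.title("Human (x), MLLM (o), LLM (^) descriptions, t-SNE projection")
plt.show()
```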

Study 2

  • We introduce a novel interactive task that probes LLMs’ learned representations for alignment with humans.
  • The first study of alignment between human touch experiences and LLMs in embedding space (a minimal sketch of this idea follows the list below).
  • LLMs show perceptual biases, aligning better with certain textiles than others.
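
As a rough sketch of what “alignment in embeddings” can mean, the example below scores how close a human description and an LLM description of the same textile sit in a shared embedding space using cosine similarity. The textiles, descriptions, and embedding model are hypothetical placeholders, not the study's protocol.

```python
# Minimal sketch: per-textile alignment between human and LLM descriptions,
# measured as cosine similarity in a shared embedding space.
# Textile names, descriptions, and the embedding model are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

pairs = {
    "wool": ("warm, fuzzy, slightly itchy", "a soft, insulating fibre with a springy hand"),
    "silk": ("smooth, cool, drapes easily", "a slick, lightweight fabric with a fluid drape"),
    "denim": ("thick, stiff, rough at the seams", "a heavy, rigid cotton twill"),
}

model = SentenceTransformer("all-MiniLM-L6-v2")

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Higher similarity = the LLM's description sits closer to the human's for that textile;
# uneven scores across textiles reflect the kind of perceptual bias noted above.
for textile, (human_text, llm_text) in pairs.items():
    h, l = model.encode([human_text, llm_text])
    print(f"{textile:6s} alignment: {cosine(h, l):.3f}")
```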

For details, check our video and paper.

Overview of the user study setup: a participant handles textiles in the "Guess What Textile" task while interacting with a customised AI system.

References

2024

  1. Feeling Textiles through AI: An exploration into Multimodal Language Models and Human Perception Alignment
    Shu Zhong, Elia Gatti, Youngjun Cho, and 1 more author
    In Proceedings of the 26th International Conference on Multimodal Interaction, 2024
  2. Exploring Human-AI Perception Alignment in Sensory Experiences: Do LLMs Understand Textile Hand?
    Shu Zhong, Elia Gatti, Youngjun Cho, and 1 more author
    arXiv preprint arXiv:2406.06587, 2024