
Upcoming Exhibitions and Conferences

AI-Driven Perception and Semantic Understanding for Autonomous and Collaborative Robots (2025/09/19)

Overview

Recent advances in deep learning and multimodal sensing are rapidly transforming how robots perceive and interpret the world. Yet, enabling robots to move beyond raw data toward true semantic understanding (recognizing objects, contexts, human intentions, and task constraints) remains a grand challenge. This workshop will bring together international researchers from robotics, computer vision, machine learning, and cognitive science to discuss emerging methods that fuse perception, mapping, and reasoning for autonomous and collaborative robots. Topics include, but are not limited to, semantic SLAM, vision-language models for embodied agents, task-driven perception for multi-robot cooperation, and real-time adaptation in unstructured environments. By bridging bio-inspired locomotion, advanced actuation, and high-level semantic understanding, we aim to chart the next generation of robots capable of seamless interaction with humans and with each other. Participants will share cutting-edge results, identify open problems, and outline research directions that can accelerate the deployment of autonomous, trustworthy, and context-aware robotic systems across manufacturing, healthcare, transportation, and urban infrastructure.

Event Information

  • Date: Monday, June 1, 2026 (Tentative)
  • Time: 08:30–12:30
  • Venue: VIECON – Vienna Congress & Convention Center, Vienna, Austria

Agenda

  • 08:30–08:40  Opening Remarks
  • 08:40–09:10  Speech 1: Bipedal Walking
    Pei-Chun Lin, Distinguished Professor and Associate Dean, Department of Mechanical Engineering, National Taiwan University
  • 09:10–09:40  Speech 2: Autonomous Vehicles
    Chieh-Chih (Bob) Wang, Professor, Institute of Electrical and Computer Engineering, National Yang Ming Chiao Tung University
  • 09:40–10:10  Speech 3: Smart Grippers
    Chao-Chieh Lan, Professor, Department of Mechanical Engineering, National Taiwan University
  • 10:10–10:40  Coffee Break
  • 10:40–11:10  Speech 4: Robotics
    Shu Huang, Principal Researcher, Mechanical and Mechatronics Systems Research Labs (MMSL), Industrial Technology Research Institute (ITRI)
  • 11:10–12:00  Short Talks by Paper Presenters (5 minutes each; approx. 10 CFP-accepted speakers, subject to submissions)
  • 12:00–12:30  Panel Discussion and Q&A
  • 12:30–       Lunch / Networking

Call for Papers and Important Dates

Call for Papers: AI-Driven Perception and Semantic Understanding Enabling Next-Generation Autonomous and Collaborative Robots

This workshop invites original research contributions that advance the state of the art in robot perception and semantic understanding to enable autonomy and collaboration in complex, real-world environments. We particularly welcome works that bridge low-level sensing and high-level reasoning, and that demonstrate scalability, adaptability, and trustworthiness in practice.

Topics of Interest (include but are not limited to):

  • Semantic mapping and lifelong environment understanding
  • Vision–language and multimodal learning for embodied agents
  • Task-driven perception and reasoning for multi-robot coordination
  • Adaptive sensor fusion and real-time semantic inference in unstructured settings
  • Human intention prediction, explainability, and semantic interaction in human–robot teaming
  • Benchmarking, datasets, and evaluation metrics for semantic perception in robotics

Accepted contributions will be presented in oral or poster sessions during the workshop and are intended to stimulate discussion on open challenges and future directions for AI-driven perception in autonomous and collaborative robotics.

Organizer

Hosted by the Industrial Technology Research Institute (ITRI).