This workshop highlights interdisciplinary advances in neuromorphic intelligence, spanning software algorithms, hardware systems, and their synergistic co-design. It draws inspiration from the brain’s efficient, parallel processing, which offers a compelling paradigm for addressing the limitations of conventional deep learning in energy efficiency, real-time processing, and adaptability to dynamic environments.
Topics
- Neuromorphic Sensing: Neuromorphic sensors (event sensors, spike sensors), data representation, simulation, processing, and applications (in Computer Vision, Audio, etc.).
- Neuromorphic Networks: Spiking Neural Networks (SNNs), Neuromorphic learning algorithms, Neuromorphic temporal processing.
- Algorithm-Hardware Co-design: Interplay between neuromorphic algorithms and their underlying hardware architecture.
- Benchmarking: Establishing standardized methodologies and datasets to evaluate the performance, efficiency, and robustness of neuromorphic systems.
Format of Workshop
This is a one-day workshop featuring a mix of invited talks, paper presentations, and a panel session. The workshop is organized around specific topics, fostering a deep dive into the core research challenges and advancements in neuromorphic intelligence.
Submission
We invite submissions that present theoretical advancements, practical applications, and experimental results in the field of brain-inspired, energy-efficient, and real-time AI. Submissions should be formatted using the AAAI template. Long papers are limited to a maximum of 8 pages, while short papers are limited to a maximum of 4 pages, both excluding references.
Please note that this workshop does not have formal proceedings.
Please submit your papers to the following OpenReview website: https://openreview.net/group?id=AAAI.org/2026/Workshop/NI
Contact: neurointelworkshop@googlegroups.com
Important Dates
Note: All deadlines are Anywhere on Earth (AoE, UTC-12).
- Paper submission deadline: ~~October 22, 2025~~ October 29, 2025
- Paper notification: ~~November 5, 2025~~ November 10, 2025
- Workshop: January 27, 2026
Workshop Committee
- Dr. Yueyi Zhang, Midea, zhyuey@gmail.com
- Dr. Zongwei Wu, University of Würzburg, zongwei.wu@uni-wuerzburg.de
- Prof. Lin Wang, Nanyang Technological University, linwang@ntu.edu.sg
- Prof. Zhiwei Xiong, University of Science and Technology of China, zwxiong@ustc.edu.cn
- Prof. Pascal Vasseur, University of Picardie Jules Verne, pascal.vasseur@u-picardie.fr
Workshop Schedule
Time: Jan. 27, 2026
Location: Conference H, Singapore Expo
Poster Location: Level 2 of the Singapore Expo (not in the Expo Halls), WS11-WS14
9:00 - 9:10 AM | Opening
- Dr. Yueyi Zhang (Midea, China)
9:10 - 9:45 AM | Invited Talk: Event-Driven Perception for Depth, Motion, and Continuous Scene Understanding
- Prof. Gim Hee Lee (NUS, Singapore)
9:45 - 10:20 AM | Invited Talk: When Bio-inspired Sensing Meet Foundation AI Models: Challenges and Approaches for Embodied AI
- Prof. Lin Wang (NTU, Singapore)
10:20 - 10:55 AM | Invited Talk: High-speed and High Dynamic Range 3D Sensing with Event-based Structured Light
- Prof. Zhiwei Xiong (USTC, China)
10:55 - 11:20 AM | Oral Presentation: E3NeRF: Efficient Event-Enhanced Neural Radiance Fields from Blurry Images
- Yunshan Qi (BUAA, China)
11:20 AM - 1:30 PM | Break
1:30 - 2:10 PM | Oral Presentations
- Daye Kang (Seoul City University, Korea)
- 1. Adaptive Spiking Transformer for SAR Image Classification
- 2. Reinforcement-Learned Dynamic Execution for Spiking Swin-B
2:10 - 2:45 PM | Invited Talk: Scene understanding with bio-inspired and efficient AI
- Dr. Benoit Cottereau (CNRS, France)
2:45 - 3:20 PM | Invited Talk: TBD
- Prof. Yulia Sandamirskaya (Zurich University of Applied Sciences, Switzerland)
3:20 - 3:55 PM | Invited Talk: Harnessing Graph Neural Networks and Neuromorphic Sensors for Low-latency, Low-power Edge AI
- Dr. Manon Dampfhoffer (CEA, France)
3:55 - 4:30 PM | Invited Talk: TBD
- Dr. Danda Paudel (INSAIT, Sofia University, Bulgaria)
4:30 - 4:35 PM | Closing
- Dr. Yueyi Zhang (Midea, China)
Accepted Papers
- “Adaptive Spiking Transformer for SAR Image Classification,” Donghun Kang, Daye Kang, Hyeongboo Baek
- “Reinforcement-Learned Dynamic Execution for Spiking Swin-B,” Daye Kang, Donghun Kang, Hyeongboo Baek
- “Reinforcing Swarm Macro Behaviors using Spiking Neural Networks,” Kevin Zhu, Ricardo Vega, Maryam Parsa, Cameron Nowzari
- “E3NeRF: Efficient Event-Enhanced Neural Radiance Fields from Blurry Images,” Yunshan Qi, Lin Zhu, Yifan Zhao, Yu Zhang, Jia Li