Prof. Yang CHAI, a Management Committee member of RI-IWEAR and Associate Dean of the Faculty of Science, has been awarded the BOCHK Science and Technology Innovation Prize (STIP) 2024 in the field of Artificial Intelligence and Robotics.
The award recognizes Prof. CHAI’s outstanding contributions to AI sensor technology, particularly his development of disruptive in-sensor computing paradigms for artificial vision sensors, which process information directly within the sensors. Today’s AI relies heavily on digital hardware, which is inherently inefficient at executing iterative AI algorithms, particularly when processing massive volumes of unstructured sensory data. As the number of sensor nodes in the Internet of Things grows rapidly, the data generated at sensor terminals has surged, requiring frequent data transmission between sensors and computing units and severely limiting system performance in energy efficiency, speed, and security.
To address the grand challenge of computational inefficiency at sensory terminals, Prof. CHAI proposed a bioinspired in-sensor computing paradigm that revolutionizes AI computation at the sensor level. The innovation minimizes frequent data transfers to external computing units, enhances computational efficiency, and enriches sensor functionalities for Internet of Things applications. He also realized the paradigm in hardware for vision sensors, achieving higher efficiency and new capabilities, including improved image recognition, visual adaptation to extremely dim or bright lighting, and agile perception of dynamic motion. This bioinspired in-sensor computing paradigm is highlighted in the US Semiconductor Research Corporation’s Decadal Plan.
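To make the paradigm concrete, the following is a minimal illustrative sketch, not Prof. CHAI’s actual device code. It assumes the commonly described in-sensor computing scheme in which each pixel’s photoresponsivity acts as a programmable weight, so the summed photocurrents yield a weighted sum of the incident light pattern during sensing itself, and only the single computed result, rather than the full raw image, leaves the sensor.

```python
import numpy as np

rng = np.random.default_rng(0)

light = rng.uniform(0.0, 1.0, size=(3, 3))     # incident optical intensities on a 3x3 pixel array
weights = rng.uniform(-1.0, 1.0, size=(3, 3))  # tunable per-pixel photoresponsivities (assumed programmable)

# Conventional pipeline: ship all 9 raw pixel values to an external
# processor, then compute the weighted sum there.
transferred_conventional = light.size
result_external = np.sum(light * weights)

# In-sensor computing: each photocurrent is intensity x responsivity,
# and summing the currents on a shared output line performs the
# multiply-accumulate inside the sensor; only 1 value is transferred.
photocurrents = light * weights
result_in_sensor = photocurrents.sum()

print(f"conventional pipeline, values transferred: {transferred_conventional}")
print("in-sensor computing, values transferred: 1")
print(f"results agree: {np.isclose(result_external, result_in_sensor)}")
```

The sketch shows only the data-movement argument: the computation is identical in both paths, but the in-sensor path reduces the sensor-to-processor traffic from the full image to a single value, which is the source of the energy, speed, and security gains described above.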