Title: Towards Intelligent Human-Machine Interactions
Speaker: Professor Yingzi Lin, Director of the Intelligent Human-Machine Systems (IHMS) Laboratory, Northeastern University, Boston, MA 02115 USA
In this seminar, I will give an overview of my research projects in the emerging area of intelligent human-machine interactions, in particular non-intrusive biosensing, multimodal fusion, and the science and technology underpinning future intelligent human-machine interactions.
Work in my Intelligent Human-Machine Systems (IHMS) Laboratory covers a wide range of applications, from driver performance analysis in transportation to patient safety in healthcare. With a focus on the interaction between humans and machines, we use physiological sensors both to collect relevant data during testing and as inputs for machine operation. While the work may seem broad, it all serves the same goal: to improve the level and quality of communication between humans and machines.
Sample projects include:

Physiological cues such as heart rate, respiration rate, eye-gaze state, and electroencephalography (EEG) can be used to infer a human subject's cognitive state, for example levels of fear, frustration, or anger. We collect this type of data while subjects run through one of the many simulations developed in my lab. Observing cognitive state in specific, controlled scenarios, in which the subject is tasked to drive a car, interact with a robot, memorize and recall a list of words, or take a series of geometry-based tests, builds an understanding of how people react to and handle certain situations. Providing a model of these reactions to a machine allows it to predict and adapt to the state of its human operator, enabling two-way awareness and assistance during human-machine operation.

Other work uses physiological cues to replace traditional interfaces. Eye gaze and simple hand gestures have been shown to replace the traditional mouse-and-keyboard interface for playing virtual games, which can give disabled users greater accessibility and serve as a foundation for new at-home physical rehabilitation practices. A system of eye gestures has been shown to replace a hand-held controller for piloting a quadcopter, further increasing the accessibility of control.
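As a loose illustration of the multimodal-fusion idea described above, the sketch below combines several normalized physiological cues into a single arousal estimate via weighted late fusion. All baselines, scales, weights, and thresholds here are hypothetical placeholders for exposition; they are not the IHMS Laboratory's actual models or parameters.

```python
from dataclasses import dataclass

@dataclass
class PhysioSample:
    """One snapshot of physiological cues (units: bpm, bpm, milliseconds)."""
    heart_rate_bpm: float
    respiration_rate_bpm: float
    gaze_fixation_ms: float  # mean eye-fixation duration

def normalize(value: float, baseline: float, scale: float) -> float:
    """Deviation from an (assumed) per-subject resting baseline."""
    return (value - baseline) / scale

def fused_arousal_score(s: PhysioSample) -> float:
    """Weighted late fusion of normalized cues. Weights and baselines
    are illustrative assumptions, not fitted model parameters."""
    hr = normalize(s.heart_rate_bpm, baseline=70.0, scale=15.0)
    rr = normalize(s.respiration_rate_bpm, baseline=14.0, scale=4.0)
    # Shorter fixations are treated here as a sign of higher arousal.
    gz = -normalize(s.gaze_fixation_ms, baseline=300.0, scale=100.0)
    return 0.5 * hr + 0.3 * rr + 0.2 * gz

def infer_state(s: PhysioSample) -> str:
    """Map the fused score to a coarse cognitive-state label."""
    score = fused_arousal_score(s)
    if score > 1.0:
        return "high arousal"   # candidate for fear/frustration
    if score < -0.5:
        return "low arousal"
    return "neutral"

calm = PhysioSample(heart_rate_bpm=68, respiration_rate_bpm=13, gaze_fixation_ms=320)
stressed = PhysioSample(heart_rate_bpm=105, respiration_rate_bpm=22, gaze_fixation_ms=150)
print(infer_state(calm))      # → neutral
print(infer_state(stressed))  # → high arousal
```

In practice such a mapping would be learned from labeled simulator data rather than hand-tuned, but the fusion structure — normalize each modality against a subject baseline, then combine — is the same.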
Despite its great technical and social significance, the modeling of human states and behaviors remains one of the greatest challenges in science and technology. Human states and behaviors are highly nonlinear, uncertain, and random, which challenges many scientific disciplines. Human-machine operation has a significant and ever-growing presence in our world, and my work aims to enable safer, more accessible, and more productive cooperation between humans and machines.
Dr. Yingzi Lin is the Director of the Intelligent Human-Machine Systems (IHMS) Laboratory and an Associate Professor (tenured) in the Department of Mechanical and Industrial Engineering, College of Engineering, Northeastern University, Boston, MA, USA. Her research has been funded by the National Science Foundation (NSF), the Natural Sciences and Engineering Research Council of Canada (NSERC), and industry partners including General Motors (GM) and Bose. She is a recipient of several prestigious research awards, including an NSF CAREER Award and an NSERC University Faculty Award (UFA). She has published over 100 technical papers in refereed journals, books, and conference proceedings. Her areas of expertise include intelligent human-machine systems, driver-vehicle systems, smart structures and systems, sensors and sensing systems, multimodal information fusion, human-machine interface design, and human-friendly mechatronics. Dr. Lin was Chair of the Virtual Environments Technical Group of the Human Factors and Ergonomics Society (HFES) and has served on committees of the Transportation Research Board (TRB) of the National Academy of Sciences. She served as an Associate Editor of IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, and has been an invited reviewer for many professional journals and conferences. She has also served on the organizing committees of numerous professional meetings in the areas of advanced sensors, mechatronic systems, dynamic systems and control, advanced smart materials and smart structures, and human-machine interaction.