Zachary (Zach) Tudor
Associate Laboratory Director
National & Homeland Security
Idaho National Laboratory
Tudor leads Idaho National Laboratory’s (INL) National and Homeland Security (N&HS) mission. N&HS is a major center for national security technology development and demonstration, employing 500 scientists and engineers across $300M in programs at the laboratory. He oversees INL’s Nuclear Nonproliferation, Critical Infrastructure Protection, and Defense Systems missions, which include heavy manufacturing of armor and the application of INL’s unique, full-scale infrastructure (power grid, wireless test bed, explosives range, and numerous research facilities). In addition to the Department of Energy, these missions support major programs for the Department of Defense, the Department of Homeland Security, and the Intelligence Community.
Title: Preserving the Roles of Humans in an Automated Society
Abstract: Science continues to make great leaps in developing systems and processes that perform tasks in human-like ways. Automated and autonomous systems will replace humans in dangerous and tedious tasks and will augment human capabilities in complex endeavors. Humans will need new skills to understand and use these autonomous tools. This talk will highlight qualities such as responsiveness, reliability, accountability, and trustworthiness that are needed for the continued adoption and deployment of these new capabilities, and will suggest ways people must adapt to gain the most benefit from this new ecosystem.
Virginia Commonwealth University
Dr. Ye is a Professor in the Department of Computer Science at Virginia Commonwealth University. Dr. Ye’s group works on creating robots and robotic assistive devices for people with visual or physical disabilities and on developing algorithms that allow these robots and devices to interact with their environments, collaborate with human users in accomplishing tasks, and learn from those users through interaction. His research aims to improve the quality of life of people with disabilities and the elderly through the use of co-robots in daily life. His team is also developing flash-LIDAR-based autonomous navigation methods for robotic planetary exploration.
Title: Towards Seamless Integration of Human and Robot in Assistive Co-robots
Abstract: Assistive co-robots will play an important role in future healthcare. Because these robots are small and must collaborate with human users in accomplishing tasks, autonomy under limited computational resources and an effective human-robot interface are challenges that must be addressed before such robots can be developed and deployed in healthcare applications. In this talk, I will present our recent research on two co-robotic devices: the robot-cane and the wearable object-manipulation aid, which may be used by the visually impaired for wayfinding and for assisted object detection/grasping, respectively. Both devices use a 3D camera for device localization and object detection. I will discuss pose estimation, wayfinding, object detection, and their real-time implementation, as well as the human-centered design of the robots, which allows for natural human-robot interaction and collaborative task accomplishment.
Gdańsk University of Technology
Dr. Ruminski is a researcher, engineer, and teacher in the fields of informatics and biomedical engineering. His academic background includes an MSc in medical devices and a PhD in healthcare informatics. He has been professionally affiliated with Gdansk University of Technology since 1995 and is the author of more than 180 scientific book chapters, articles, and conference papers. He has practical experience in medical imaging, image processing and analysis, human-interaction systems, wearable technology, healthcare information systems, and data mining.
Title: Augmented Perception
Abstract: Human perception is the process by which sensory information is translated into organized experience. The rapid development of sensors makes it possible to reproduce human perception as machine perception. In parallel, new sensors and dedicated algorithms can be applied to augment human perception. For example, the introduction of X-ray imaging profoundly changed modern medicine: X-ray imaging enables human observers to see what the human visual system cannot naturally perceive. However, such systems are still not portable enough to be implemented as wearable devices. At the same time, many other sensors (e.g., thermal cameras, NIR cameras) have recently been highly miniaturized and can be embedded in portable devices such as smartphones and smart glasses. The portability of such devices offers a new opportunity to enhance human perception in everyday life.
The lecture will introduce the audience to selected algorithms for visible-light video processing and thermal-sequence analysis that extract valuable information from image series, information not naturally perceived by an average human observer. Examples of methods to be presented include color magnification, color processing for dichromats, pulse wave and pulse rate extraction, and respiratory wave and respiratory rate extraction. The special role of deep learning methods will be highlighted in the presentation. Finally, potential applications of such algorithms in smart glasses will be described, including:
– smart glasses as a tool for health professionals (e.g., as a screening/evaluation tool: pulse rate and respiratory rate evaluation, facial skin temperature analysis, emotion analysis, and monitoring of facial muscle dynamics),
– smart glasses as an assistive tool for users with disabilities (e.g., navigation, and color recognition and identification for the color-blind),
– smart glasses as a tool for behavioral analysis (e.g., analysis of a person’s activity in relation to their environment, such as eye-tracking versus observations).
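To illustrate the flavor of the pulse rate extraction mentioned above, the sketch below shows one common camera-based approach (an assumption for illustration, not the speaker’s actual pipeline): average a color channel over a skin region in each video frame, then take the dominant frequency of that time series within the physiological pulse band. All names here (`estimate_pulse_rate`, the synthetic signal) are hypothetical.

```python
# Minimal sketch of camera-based pulse-rate estimation (illustrative only;
# not the method presented in the lecture). The input is assumed to be a
# per-frame mean of one color channel over a skin region of interest.
import numpy as np

def estimate_pulse_rate(channel_means, fps, band=(0.7, 3.0)):
    """Estimate pulse rate in beats/min from a per-frame mean-channel signal."""
    signal = np.asarray(channel_means, dtype=float)
    signal = signal - signal.mean()                 # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))          # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Keep only physiologically plausible frequencies (~42-180 bpm).
    mask = (freqs >= band[0]) & (freqs <= band[1])
    peak_hz = freqs[mask][np.argmax(spectrum[mask])]
    return peak_hz * 60.0

# Synthetic check: a 1.2 Hz (72 bpm) oscillation buried in noise.
np.random.seed(0)
fps = 30.0
t = np.arange(0, 10, 1.0 / fps)
fake_signal = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
print(round(estimate_pulse_rate(fake_signal, fps)))  # prints 72
```

Real systems add face/skin detection, motion compensation, and band-pass filtering before the spectral step; the sketch keeps only the core frequency-domain idea.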