Currently, a new industrial revolution, also known as Industry 4.0, is transforming manufacturing industries toward higher production efficiency, a higher level of automation, more intelligent processes, more complex products, and more flexible customization. Collaborative robots (cobots), equipped with advanced sensors and Artificial Intelligence (AI), are considered key enablers of the future of manufacturing. Cobots will work alongside human workers to increase production rates, relieve human workers’ physical and mental loads, and replace human workers in hazardous or extreme working environments. However, the current level of collaboration between humans and robots falls far short of what is expected. From the human perspective, current methods for instructing robots are unintuitive and time-consuming, resulting in a steep learning curve. Additionally, human workers have difficulty understanding robots’ motion paths and identifying possible collisions, leading to fear and distrust. Conversely, it is challenging for robots to understand human workers’ high-level intentions and respond appropriately. To improve the level of human-robot collaboration, Dr. Zhang stepped into a multidisciplinary research area at the intersection of Human-Computer Interaction (HCI), AI, Robotics, Computer Vision, and Computational Geometry. In this talk, Dr. Zhang will present his research progress in three aspects: 1) intuitive and informative extended reality (XR) interfaces for better understanding and communication; 2) robot programming by demonstration for capturing humans’ high-level intentions; and 3) fundamental computational methods for reducing XR visualization errors and enhancing 3D depth sensing. Future research directions will also be discussed.