Kunyu Peng, M.Sc.
- PhD Student
- Group: CV:HCI
- Room: CS 07.08
- Phone: +49 721 608-41954
- Email: kunyu peng ∂kit edu
- Address: Vincenz-Priessnitz-Str. 3, 76131 Karlsruhe
Introduction
I received my B.Sc. degree in Automation from Beijing Institute of Technology (BIT) and my M.Sc. degree in Electrical Engineering and Information Technology from Karlsruhe Institute of Technology (KIT) in 2021. I am currently a Ph.D. candidate at the Computer Vision for Human-Computer Interaction Lab at KIT. My research focuses on generalizable human activity recognition and video understanding. Human activity recognition (HAR) plays a crucial role in enhancing daily life by enabling technologies such as healthcare monitoring, smart home automation, and assistive robotics. It supports the tracking and analysis of physical activities, allowing for personalized interventions, improved safety, and optimized productivity. As the foundation of many advanced AI-driven systems, HAR is vital for creating seamless human-machine interactions that improve overall well-being and quality of life.
Developing generalizable human activity recognition algorithms would therefore greatly benefit society. Students interested in activity recognition and video understanding are welcome to apply for Master's and Bachelor's thesis opportunities. Thesis topics are flexible and can be tailored to your interests in discussion with me.
Google Scholar: https://scholar.google.com/citations?user=pA9c0YsAAAAJ&hl=zh-CN
Recent publications:
- Peng, K., Wen, D., Yang, K., Luo, A., Chen, Y., Fu, J., Sarfraz, M.S., Roitberg, A., & Stiefelhagen, R. (2024). Advancing Open-Set Domain Generalization Using Evidential Bi-Level Hardest Domain Scheduler. The Thirty-eighth Annual Conference on Neural Information Processing Systems. (NeurIPS 2024)
- Peng, K., Schneider, D., Roitberg, A., Yang, K., Zhang, J., Deng, C., Zhang, K., Sarfraz, M.S., & Stiefelhagen, R. (2024). Towards Activated Muscle Group Estimation in the Wild. ACM Multimedia 2024. (MM 2024)
- Peng, K., Fu, J., Yang, K., Wen, D., Chen, Y., Liu, R., Zheng, J., Zhang, J., Sarfraz, M.S., Stiefelhagen, R., & Roitberg, A. (2024). Referring Atomic Video Action Recognition. The 18th European Conference on Computer Vision. (ECCV 2024)
- Xu, Y., Peng, K., Wen, D., Liu, R., Zheng, J., Chen, Y., Zhang, J., Roitberg, A., Yang, K., & Stiefelhagen, R. (2024). Skeleton-Based Human Action Recognition with Noisy Labels. IEEE/RSJ International Conference on Intelligent Robots and Systems. (IROS 2024) (Master Student Work)
- Peng, K., Yin, C., Zheng, J., Liu, R., Schneider, D., Zhang, J., Yang, K., Sarfraz, M. S., Stiefelhagen, R., & Roitberg, A. (2024). Navigating Open Set Scenarios for Skeleton-Based Action Recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 38(5), 4487-4496. (AAAI 2024)
- Peng, K., Roitberg, A., Yang, K., Zhang, J., & Stiefelhagen, R. (2023). Delving Deep Into One-Shot Skeleton-Based Action Recognition With Diverse Occlusions. IEEE Transactions on Multimedia, 25, 1489-1504. doi: 10.1109/TMM.2023.3235300.
- Xiao, H*., Peng, K*., Huang, X., Roitberg, A., Li, H., Wang, Z., & Stiefelhagen, R. (2023). Toward Privacy-Supporting Fall Detection via Deep Unsupervised RGB2Depth Adaptation. IEEE Sensors Journal, 23, 29143-29155.
- Peng, K., Roitberg, A., Schneider, D., Koulakis, M., Yang, K., & Stiefelhagen, R. (2021). Affect-DML: Context-Aware One-Shot Recognition of Human Affect Using Deep Metric Learning. 2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021), 1-8.
- Wei, Y., Peng, K., Roitberg, A., Zhang, J., Zheng, J., Liu, R., Chen, Y., Yang, K., & Stiefelhagen, R. (2024). Elevating Skeleton-Based Action Recognition with Efficient Multi-Modality Self-Supervision. ICASSP 2024. (Master Student Work)
- Peng, K., Roitberg, A., Yang, K., Zhang, J., & Stiefelhagen, R. (2022). TransDARC: Transformer-based Driver Activity Recognition with Latent Space Feature Calibration. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 278-285.
- Peng, K., Roitberg, A., Yang, K., Zhang, J., & Stiefelhagen, R. (2022). Should I take a walk? Estimating Energy Expenditure from Video Data. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2074-2084.
- Wei, P., Peng, K., Roitberg, A., Yang, K., Zhang, J., & Stiefelhagen, R. (2022). Multi-modal Depression Estimation Based on Sub-attentional Fusion. ECCV Workshops. (Bachelor Student Work)
- Tanama, C., Peng, K., Marinov, Z., Stiefelhagen, R., & Roitberg, A. (2023). Quantized Distillation: Optimizing Driver Activity Recognition Models for Resource-Constrained Environments. 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 5479-5486. (Master Student Work)
- Roitberg, A., Peng, K., Marinov, Z., Seibold, C., Schneider, D., & Stiefelhagen, R. (2022). A Comparative Analysis of Decision-Level Fusion for Multimodal Driver Behaviour Understanding. 2022 IEEE Intelligent Vehicles Symposium (IV), 1438-1444.
- Roitberg, A., Peng, K., Schneider, D., Yang, K., Koulakis, M., Martínez, M., & Stiefelhagen, R. (2022). Is My Driver Observation Model Overconfident? Input-Guided Calibration Networks for Reliable and Interpretable Confidence Estimates. IEEE Transactions on Intelligent Transportation Systems, 23, 25271-25286.