Mr. Yashwanth Maddipatla | Human Machine Interaction | Best Researcher Award
XR Engineer at Iowa State University, United States
Yash Maddi is a dynamic Applications Engineer with over six years of experience specializing in Spatial Computing, Mobile Development, and Human-Computer Interaction. With a strong foundation in Cognitive Psychology, he has made significant contributions to cutting-edge technologies through research and development, user-centered design, and interdisciplinary collaboration. Yash has played a pivotal role in developing immersive applications, enterprise solutions, and machine learning-powered systems that enhance user experiences across multiple industries, including retail, healthcare, and industrial training.
Profile
Education
Yash Maddi earned his Master’s degree in Human-Computer Interaction from Iowa State University in 2024, where he focused on XR for Training and Education, Cognitive Psychology, and Human-Robot Interaction. His thesis, titled “VR Co-Lab: A Virtual Reality Platform for Human-Robot Disassembly Training,” demonstrated innovative applications of VR in industrial training. Prior to that, he completed his Bachelor of Engineering in Computer Science from Sathyabama University in 2017, with a specialization in Software Engineering, Mobile Application Development, and Computer Graphics. His academic journey has equipped him with deep expertise in computational perception, cognitive engineering, and qualitative research methods in HCI.
Professional Experience
Yash has held key roles at several prestigious organizations, leading groundbreaking projects in XR and AI-driven technologies. As a Lead XR Engineer at Gambit Labs, he developed an adaptive neurofeedback framework for EEG-integrated VR and mobile applications targeting cognitive training and wellness interventions. His work at Meta Spark Partner Network involved beta-testing AR and MR features, creating experimental AR effects, and collaborating with Meta’s product teams to enhance spatial computing applications. As a Research Assistant at Iowa State University’s Virtual Reality Applications Center, he spearheaded multiple projects, including VR Co-Lab for human-robot training and a CBT-based Audio AR meditation tool for anxiety management. Additionally, his tenure at Cognizant’s Emerging Technologies Lab saw him developing AR/VR retail experiences, predictive analytics systems, and indoor navigation solutions using ARKit, ARCore, and Azure Spatial Anchors.
Research Interests
Yash’s research interests lie at the intersection of XR development, cognitive psychology, and machine learning. His work primarily focuses on enhancing user interactions in virtual environments, adaptive neurofeedback systems, and spatial computing applications for industrial training, healthcare, and retail. His expertise in integrating EEG, LiDAR, and RGBD cameras with machine learning models has led to the development of immersive and context-aware XR experiences. Additionally, he is passionate about designing human-centered AI solutions that leverage cognitive load measurement, biometric analysis, and behavior-driven XR interfaces.
Awards and Recognitions
Yash has received numerous accolades for his innovative contributions to XR and AI technologies. He was the winner of the XR BrainJam 2022 and a recipient of the Macy’s Tech Innovation Award in 2019. He has also been an active participant in leading industry hackathons, including the Meta Presence Hack (2023 and 2024) and the MIT Reality Hack (2023 and 2024). His projects have been recognized for their impact in spatial computing and immersive technology, demonstrating his ability to push the boundaries of human-computer interaction.
Selected Publications
“VR Co-Lab: A Virtual Reality Platform for Human-Robot Disassembly Training” – MDPI, March 2025
“Tracking and Visualization of Benchtop Components” – ASME-MSEC, June 2023
“Low-Cost Automated Facial Recognition and Alert System” – IEEE, June 2017
These publications have been cited in multiple research articles and conference proceedings, contributing to advancements in XR for industrial training, object tracking, and facial recognition technologies.
Conclusion
Yash Maddi’s extensive experience in spatial computing, mobile development, and human-computer interaction has positioned him as a leading innovator in the field of immersive technologies. His contributions to VR training, adaptive neurofeedback systems, and AI-driven spatial applications continue to advance the state of the art. With a strong research background, industry experience, and a passion for cognitive psychology-driven XR solutions, Yash remains committed to pioneering new frontiers in human-computer interaction and emerging technologies.