A new software program developed by a team of computer scientists watches a user's movements and gives spoken feedback on what to change to accurately complete a yoga pose. The program, called Eyes-Free Yoga, uses the Microsoft Kinect motion sensor to track body movements and offer auditory feedback in real time for six yoga poses, including Warrior I and II, Tree and Chair poses.
Project lead Kyle Rector, a UW doctoral student in computer science and engineering, wrote programming code that instructs the Kinect to read a user's body angles and then gives verbal feedback on how to adjust his or her arms, legs, neck or back to complete the pose.
The result is an accessible yoga "exergame" - a video game used for exercise - that allows people without sight to interact verbally with a simulated yoga instructor.
Rector and collaborators Julie Kientz, a University of Washington assistant professor in Human Centered Design and Engineering, and Cynthia Bennett, a research assistant in computer science and engineering, believe the program can transform a typically visual activity into one that blind people can also enjoy.
Each of the six poses has about 30 different commands for improvement based on a dozen rules deemed essential for each yoga position. Rector worked with a number of yoga instructors to put together the criteria for reaching the correct alignment in each pose.
The Kinect first checks a person's core and suggests alignment changes, then moves to the head and neck area, and finally the arms and legs. It also gives positive feedback when a person is holding a pose correctly.
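A minimal sketch of how that prioritized checking might work, assuming hypothetical joint names, thresholds and spoken corrections (the real rules were compiled with the yoga instructors Rector consulted):

```python
# Hypothetical rule table: each entry names a body region, a check on the
# measured joint angles (in degrees), and the correction to speak if it fails.
# Regions are listed in the order the system checks them: core first, then
# head and neck, then arms and legs.
RULES = [
    ("core", lambda a: abs(a["hip"] - 180) <= 10, "Straighten your hips."),
    ("head and neck", lambda a: abs(a["neck"] - 180) <= 10, "Look straight ahead."),
    ("arms and legs", lambda a: abs(a["knee"] - 90) <= 10, "Bend your front knee to a right angle."),
]

def next_feedback(angles):
    """Return the first failing rule's spoken correction, or praise if all pass."""
    for region, check, correction in RULES:
        if not check(angles):
            return correction
    return "Great job, you are holding the pose correctly."

print(next_feedback({"hip": 178, "neck": 165, "knee": 92}))  # -> "Look straight ahead."
```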
The technology uses simple geometry and the law of cosines to calculate the angles a user's joints form during each pose.
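That computation fits in a few lines. A sketch, assuming each Kinect joint arrives as an (x, y, z) coordinate (the function and variable names are illustrative, not from the project's code):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b-a and b-c,
    computed with the law of cosines from the three side lengths."""
    ab = math.dist(a, b)   # e.g. shoulder to elbow
    bc = math.dist(b, c)   # e.g. elbow to wrist
    ac = math.dist(a, c)   # side opposite the joint angle
    # Law of cosines: ac^2 = ab^2 + bc^2 - 2 * ab * bc * cos(angle at b)
    cos_b = (ab**2 + bc**2 - ac**2) / (2 * ab * bc)
    cos_b = max(-1.0, min(1.0, cos_b))  # clamp against floating-point drift
    return math.degrees(math.acos(cos_b))

# A nearly straight arm should come out close to 180 degrees.
print(joint_angle((0, 0, 0), (0.3, 0, 0), (0.6, 0.01, 0)))
```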
Source: ANI