A musical robot that can play the piano alongside a human, creating a harmonic accompaniment in real time, has won an award at the Center for Human-Inspired Artificial Intelligence (CHIA) Conference 2024.
Cambridge Engineering Ph.D. student Huijiang Wang from the Bio-Inspired Robotics Laboratory (BIRL), Department of Engineering, won the Best Demo Award for his 'Harmony Robot', a custom anthropomorphic hand attached to a robot arm, with its 'fingers' preset into a chord-hitting pose.
A related research paper, published in the journal IEEE Transactions on Robotics, accompanied the award-winning demonstration.
In the paper, titled “Human-robot cooperative piano playing with learning-based real-time music accompaniment,” Huijiang and his co-authors, Dr. Xiaoping Zhang and Professor Fumiya Iida, present a collaborative robot that uses machine learning to predict appropriate piano chord progressions based on a human’s piano playing.
In experiments, the robot learned chords inspired by pop music and achieved a 93% accuracy rate.
Meanwhile, a behavior-adaptive controller keeps the robot temporally synchronized with the human player, so that it can generate harmonic chord accompaniment for the human-played melody in real time. In effect, human and robot play a duet.
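To make the idea concrete, here is a minimal, self-contained sketch of one way such chord prediction could work: a first-order transition model learned from pop-style progressions that suggests the next accompaniment chord. This is not the authors' code; the training data and function names are hypothetical, and the published system uses a more sophisticated learning-based predictor together with its behavior-adaptive controller for timing.

```python
# Minimal sketch of learning-based chord prediction (hypothetical, not the paper's code).
from collections import Counter, defaultdict

# Toy training corpus: chord progressions inspired by pop songs (invented data).
PROGRESSIONS = [
    ["C", "G", "Am", "F"],
    ["C", "Am", "F", "G"],
    ["Am", "F", "C", "G"],
]

def train_transitions(progressions):
    """Count how often each chord follows another (first-order transition model)."""
    counts = defaultdict(Counter)
    for prog in progressions:
        for prev, nxt in zip(prog, prog[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next_chord(counts, current_chord, default="C"):
    """Return the most likely chord to follow the current one."""
    followers = counts.get(current_chord)
    if not followers:
        return default
    return followers.most_common(1)[0][0]

if __name__ == "__main__":
    model = train_transitions(PROGRESSIONS)
    # A robot accompanist would call this once per bar, in time with the human melody.
    print(predict_next_chord(model, "Am"))  # -> "F" with this toy corpus
```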
In the robotic setup, the robot arm moves the custom anthropomorphic hand vertically to strike the piano keys, while a separate horizontal motion switches between chords, expanding the robot's expressive capabilities in piano-playing tasks.
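The motion decomposition can be illustrated with a short, hypothetical sketch: the hand keeps its fixed chord pose, a horizontal move selects the chord, and a vertical down-up motion strikes the keys. The arm interface, positions, and strike depth below are invented for illustration and are not the actual robot's control code.

```python
# Illustrative sketch of the vertical-strike / horizontal-switch decomposition.
# The MockArm interface and all numbers are assumptions, not the real robot's API.
class MockArm:
    """Stand-in for the robot arm; just logs the commanded motions."""
    def move_horizontal(self, x_mm: float):
        print(f"slide hand to x = {x_mm} mm (chord switch)")
    def move_vertical(self, dz_mm: float):
        print(f"move hand vertically by {dz_mm} mm")

def play_chord(arm, chord_x_mm: float, strike_depth_mm: float = 12.0):
    """Switch to the chord position, then press and release the keys."""
    arm.move_horizontal(chord_x_mm)      # horizontal motion: chord switching
    arm.move_vertical(-strike_depth_mm)  # vertical motion: strike the keys
    arm.move_vertical(+strike_depth_mm)  # release

if __name__ == "__main__":
    play_chord(MockArm(), chord_x_mm=240.0)
```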
“We have developed a musical robot that can play the piano alongside a human, creating good duets in real time,” said Huijiang Wang. “Our robot ‘listens’ to the melody the human plays and uses machine learning techniques to predict the next chords, ensuring the music stays harmonious.”
He added, “By using a special control system, the robot matches its timing with the human player, making the performance synchronized. We also have a system that checks how well the robot and human are playing together, ensuring high-quality collaboration.”
Future work includes enriching the robot's musical accompaniment by incorporating more chords, and improving the dexterity of the anthropomorphic hand so that its position and orientation can be adapted to press a broader range of chords.
Professor Iida said, “With advancements in AI, we are now aiming to push the boundaries of robotics into emotionally aware scenarios, such as artistic and entertainment performances, where robots must be capable of capturing, interpreting and responding to the emotional cues of their human counterparts.
“Developing an improvisation algorithm capable of managing intensive computational resources—while preserving chord prediction accuracy—is also essential for future endeavors in this field.”
“Our research on the Harmony Robot advances the integration of soft robotics in creative and collaborative roles,” said Huijiang. “By leveraging cutting-edge machine learning and soft robotics, we aim to achieve seamless human-robot cooperation in the future, enabling robots to respond adaptively in dynamic, real-world scenarios.”
More information:
Huijiang Wang et al, Human–Robot Cooperative Piano Playing With Learning-Based Real-Time Music Accompaniment, IEEE Transactions on Robotics (2024). DOI: 10.1109/TRO.2024.3484633
Provided by University of Cambridge