Authors: Gil Weinberg, Mason Bretan, Guy Hoffman, Scott Driscoll
Title: Robotic Musicianship: Embodied Artificial Creativity and Mechatronic Musical Expression
Publisher: Springer-Verlag
ISBN: 9783030389307
Edition: 1
Price: CHF 142.50
Subject area: Mechanical Engineering, Manufacturing Technology
Language: English
Pages: 270
Copy protection: Watermark/DRM
Devices: PC/MAC/eReader/Tablet
Format: PDF

This book discusses the principles, methodologies, and challenges of robotic musicianship through an in-depth review of the work conducted at the Georgia Tech Center for Music Technology (GTCMT), where the concept was first developed. Robotic musicianship is a relatively new research field that focuses on the design and development of intelligent music-making machines. The motivation behind the field is to develop robots that not only generate music, but also collaborate with humans by listening and responding in an expressive and creative manner. This combination of human and machine creativity has the potential to surprise and inspire us to play, listen, compose, and think about music in new ways.

The book provides an in-depth view of the robotic platforms designed at the GTCMT Robotic Musicianship Group, including the improvisational robotic percussionists Haile and Shimon, the personal robotic companion Shimi, and a number of wearable robots, such as the Robotic Drumming Prosthesis, The Third Drumming Arm, and the Skywalker Piano Hand. The book discusses numerous research studies based on these platforms in the context of five main principles: Listen like a Human, Play Like a Machine, Be Social, Watch and Learn, and Wear It.


Foreword  7
Preface  9
Contents  12
1 Introduction  17
1.1 Abstract  17
1.2 Why Robotic Musicianship  17
1.3 Sound Production and Design—Survey  19
1.3.1 Traditional Instruments  20
1.3.2 Augmented and Novel Instruments  24
1.4 Musical Intelligence  25
1.4.1 Sensing and Perception  26
1.4.2 Music Generation  30
1.5 Embodiment  32
1.6 Integrating Robotic Musicianship into New Interfaces  34
1.6.1 Musical Companion Robots  34
1.6.2 Wearable Robotic Musicians  35
1.7 Discussion  36
References  37
2 Platforms—Georgia Tech's Robotic Musicians  41
2.1 Abstract  41
2.2 Haile—A Robotic Percussionist  42
2.2.1 Motivation  42
2.2.2 Design  42
2.3 Shimon—A Robotic Marimba Player  47
2.3.1 Striker Design  47
2.3.2 Mallet Motor Control  50
2.3.3 Slider Motor Control  53
2.3.4 Shimon's Socially Expressive Head  56
2.4 Shimi—A Music Driven Robotic Dancing Companion  59
2.4.1 Robotic Musical Companionship  60
2.4.2 Design  61
2.4.3 Software Architecture  62
2.4.4 Core Capabilities  63
2.5 The Robotic Drumming Prosthetic  66
2.5.1 Motivation  67
2.5.2 Related Work  68
2.5.3 Platform  69
2.5.4 Generative Physical Model for Stroke Generation  70
2.5.5 Conclusions  75
References  75
3 "Listen Like A Human"—Human-Informed Music Perception Models  78
3.1 Abstract  78
3.2 Rhythmic Analysis of Live Drumming  79
3.2.1 Onset Detection  79
3.2.2 Beat Detection  79
3.2.3 Rhythmic Stability and Similarity  80
3.2.4 User Study  83
3.3 Tonal Music Analysis Using Symbolic Rules  84
3.3.1 Implementation  85
3.3.2 Evaluation  88
3.4 Music Analysis Using Deep Neural Networks  91
3.4.1 Deep Musical Autoencoder  91
3.4.2 Music Reconstruction Through Selection  94
3.5 Real-Time Audio Analysis of Prerecorded Music  95
3.5.1 Introduction  95
3.5.2 Previous Work  97
3.5.3 System Design  97
3.5.4 Live Audio Analysis  98
3.5.5 Gesture Design  101
3.5.6 Network Design  105
3.5.7 User Study  107
3.5.8 Summary  108
References  109
4 "Play Like A Machine"—Generative Musical Models for Robots  110
4.1 Abstract  110
4.2 Genetic Algorithms  111
4.2.1 Related Work  111
4.2.2 Method  111
4.3 Markov Processes ("Playing with the Masters")  115
4.3.1 Related Work  115
4.3.2 Implementation  115
4.3.3 Summary  119
4.4 Path Planning Driven Music Generation  120
4.4.1 Search and Path Planning  120
4.4.2 Musical Path Planning  121
4.4.3 Planning  123
4.4.4 Evaluation  128
4.4.5 Discussion  130
4.5 Rule Based Jazz Improvisation  130
4.5.1 Parametrized Representations of Higher-Level Musical Semantics  131
4.5.2 Joint Optimization  138
4.5.3 Musical Results  140
4.5.4 Discussion  142
4.6 Neural Network Based Improvisation  143
4.6.1 Introduction  144
4.6.2 Semantic Relevance  146
4.6.3 Concatenation Cost  147
4.6.4 Ranking Units  148
4.6.5 Evaluating the Model  149
4.6.6 Discussion  149
4.6.7 Subjective Evaluation  150
4.6.8 Results  150
4.6.9 An Embodied Unit Selection Process  152
4.7 Conclusion  154
References  155
5 "Be Social"—Embodied Human-Robot Musical Interactions  158
5.1 Abstract  158
5.2 Embodied Interaction with Haile  158
5.2.1 Interaction Modes  159
5.2.2 Leader-Follower Interaction  161
5.2.3 Evaluation  162
5.2.4 Data Analysis  165
5.2.5 Results  168