DOI: https://doi.org/10.15368/theses.2010.175
Available at: https://digitalcommons.calpoly.edu/theses/410
Date of Award
12-2010
Degree Name
MS in Computer Science
Department/Program
Computer Science
Advisor
John Clements
Abstract
Computers have long assisted in recording, sound synthesis, and other areas of music production. The actual performance of music, however, remains an area in which human players are chosen over computer performers. Musical performance is a domain where personalization matters more than consistency: human players react to the phrases and ideas of the musicians they play with. Computer performers lack the ability to react to changes in the performance that humans perceive naturally, giving human players an advantage over computer performers.
This thesis presents a framework for describing unique musical performers that can play along in real time with human players. FrTime, a reactive programming language, is used to continuously create new musical phrases. Phrases are constructed by unique, user-programmed performers and by the chord changes that the framework provides. Because the reactive language produces multiple candidate phrases for each point in time, a simple module chooses which phrases to perform at performance time.
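To make the architecture concrete, the following is a minimal sketch in FrTime of how such performers and a selection module might fit together. The names `chord-changes`, `arpeggio-performer`, `root-performer`, and `select-phrase` are hypothetical illustrations, not the thesis's actual API; only the FrTime primitives (`map-e`, `merge-e`, `hold`) are taken from the language itself.

```racket
#lang frtime
;; Sketch only: assumes `chord-changes` is an event stream of chord symbols
;; supplied by some surrounding framework. A "performer" is modeled here as
;; a function from that stream to an event stream of candidate phrases.

;; Hypothetical performer: spell out each new chord as a short arpeggio.
(define (arpeggio-performer chord-changes)
  (map-e (lambda (chord) (list chord 'third 'fifth 'third)) chord-changes))

;; Hypothetical performer: simply repeat the root of each new chord.
(define (root-performer chord-changes)
  (map-e (lambda (chord) (list chord chord)) chord-changes))

;; A simple selection module: merge the candidate phrase streams and hold
;; the most recently produced phrase as the one to perform right now.
(define (select-phrase . phrase-streams)
  (hold (apply merge-e phrase-streams) '()))
```

The key idea this sketch tries to capture is that performers are ordinary reactive definitions: they stay live, emitting new candidate phrases whenever the chord-change stream updates, while a separate, much simpler component decides which candidate is actually played.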
Included in
Artificial Intelligence and Robotics Commons, Graphics and Human Computer Interfaces Commons, Other Computer Sciences Commons, Programming Languages and Compilers Commons