NoISE (short for Novel Intelligent Sound Exploration, sometimes also Novel Intelligent Sound Editing or Novel Interfaces for Sound Editing) was a one-year interdisciplinary, multinational project.
The project was part of the Digital Media M.Sc. studies at the University of Applied Sciences Bremen, Germany, and ran from May 2008 to May 2009.
Within the scope of the project, the group designed and implemented a variety of prototypes, published papers at international conferences, conducted a series of user studies, and gave talks at other universities around the globe. Some of the project results even laid the groundwork for a successful startup company.
The NoISE project employs a topic from music informatics to equip its participants with well-founded skills and competencies in fields such as computer science and media design in general, the design of standard and non-standard user interfaces, visualization and sonification, development of multimedia software, design and evaluation of scientific experiments, and intercultural relations.
The NoISE project is built on the following observation:
Music synthesizers possess user interfaces that offer up to hundreds of parameters for control; the basic waveforms and samples may stem from libraries that span gigabytes of disk space. Both hardware and software manufacturers have tried to come up with solutions that make sound design more approachable, but the underlying concepts have not changed much over the last decades.
In this project, we aim at classifying and analyzing existing concepts in terms of human-computer interaction, acoustics, and signal processing, learning from their strengths and weaknesses, and creating and testing novel approaches. To this end, one can apply methods from diverse fields such as visualization, image processing, pen-based interfaces, gestural interfaces, music information visualization, and data mining. Sound design is not only concerned with creating fixed sounds but also with constructing expressive musical instruments, which means manipulating a well-chosen set of parameters in an artistic fashion. We want to investigate which parameters should be offered for real-time control during play and which type of control interface to use. The type of control offered may have drastic musical implications, particularly in improvised music, in that it determines which melody, which phrasing, and which rhythm is literally “handy.” Given the malleability of software-based instruments, such interrelations can be subjected to scientific study relatively easily.
We hope that the variety of the students' cultural backgrounds provides a large range of music styles and musical instruments to learn from and to apply the research to. Practical, specific results of this interdisciplinary endeavor may in particular comprise new control methods for existing software synthesizers. Employing our existing contacts with manufacturers of music production software, we will try to bring the solutions to market. Scientifically, we aim at results concerning intelligent interfaces for a variety of applications, ranging from sound editing to navigation and data mining. The project is intended to lead to several contributions to international scientific conferences.
Make some NoISE!