By Carolyn Heneghan
Digital music programming has advanced by leaps and bounds over the past half century to become the collection of creation tools it is today. One such example is the selection of hands-on music apps and devices on display at Sonar 2013 in Barcelona. But we’ll get to that in Part 2.
To really understand the future of digital & mobile music programming, we take a look at the history of music software and video games, bringing to light where music programming has come from and how it continues to evolve in its current directions.
Early Music Software
In the 1960s and 70s, as computers themselves improved, developers began to implement music software. While primitive at first, the development of this synthetic music-creation programming paved the way for the programs that were to come. One such program was MUSIC, written by Max Mathews at Bell Labs in 1957, which could play single-line tunes, followed by MUSIC II, which could play four-part harmony. Even on the most powerful computers of the day, it took roughly an hour of computation to generate one minute of music. Universities picked up the concepts, and music software began to take shape.
In 1983, the MIDI standard was introduced, and several programs came about soon after. MIDI gave users tactile control of an instrument, letting them play directly into the software. In the 1990s, these controls only became more capable and easier to use.
Sound Tools (1989) introduced a hard disk audio recording system—a two-track recorder and editor used with Q-Sheet software. In 1991, Pro Tools followed as a four-channel system. Multi-track recording soon became the standard, and Pro Tools was at the head of the game for the next decade.
Early Music Video Games
Music-oriented video games also got their start around this time. Gamers deem Simon (1978), though technically not a video game, to be one of the very first music games, wherein the user copies sequences of colored notes introduced by the device. Break Dance (1984) for the Commodore 64 used a similar concept of copying moves determined by the game.
A precursor to Guitar Hero, PaRappa the Rapper (1997) had users copy the raps of an on-screen teacher using buttons on the controller, and it introduced a rhythm element to the gaming experience, with a rating system of Cool, Good, Bad and Awful. Dance Dance Revolution (1999) took rhythm gaming one step further by making the user and his or her live dance moves the controller for the video game.
The Next Generation
Apple’s GarageBand (2004) was a part of the next generation of music software and gave the user complete control over both music created within the software and recorded organic instruments. The program has evolved enormously over the past decade and continues to be a go-to for many musicians looking to record and edit their music.
Then came Guitar Hero (2005) and a generation of similar, more hands-on games: Rock Band (2007), DJ Hero (2009), Rocksmith (2011) and BandFuse: Rock Legends (2013). Guitar Hero, Rock Band and DJ Hero used simulated instruments as controllers and let users play along with familiar songs by hitting the instrument at the very moment colored notes crossed a marker onscreen, generally on the beat or with the melody.
Rocksmith and BandFuse have taken this concept one incredible step further by letting players plug in real guitars for the same purpose. They also now include instruction and virtual lessons within the app to improve a user’s performance on an actual, organic instrument rather than a synthetic controller. Instead of just shredding on Rock Band, users can learn to master a real guitar using the same type of gameplay.
… Next we’ll get into where music programming is headed in Part 2…
(cover shot: InkScapes, by Adrià Navarro via Sonar)