Music educators write a lot about “musical literacy.” To classically trained musicians, the definition is self-evident: it’s the ability to read staff notation. As far as they’re concerned, reading chord charts or guitar tablature with rhythmic notation doesn’t usually count. Defining “musical literacy” this way is fine in the context of classical and other “traditional” music styles. Blind musicians notwithstanding, you have to read staff notation to be a classical musician.
It’s true that reading music is a step toward greater literacy. Reading opens up a world of music which can’t be accessed any other way. But the narrow definition of “literacy” advanced by the classical school can’t be applied as-is to rock musicians, because rock music is usually conceived through improvisation. Rock recording sessions are usually organized using chord charts, when there’s any written documentation at all.
Rock music isn’t usually written out before it’s recorded, but that isn’t a deficiency, and it doesn’t make rock music inferior. Music that’s written out in detail is not necessarily better than music that relies on the interpretation of the musicians; Indian classical music, a largely improvised tradition, is one example. Music that’s scored by a composer isn’t inherently superior to music that’s improvised or simply memorized.
Most people would not write off a rock musician who has a solid understanding of chord theory, and who can dissect complex musical passages quickly and accurately by ear, as “musically illiterate.” Most people wouldn’t dismiss legendary flamenco guitarist Paco de Lucia as “illiterate,” either. Some disagree, and argue that the “illiterate” label is appropriate. That disagreement is part of the disconnect. But the fact remains that many rock and country recording artists can’t read the sheet music to their own songs.
Music educators say we can advance “musical literacy” by developing a deeper understanding of the concepts at work in music, going beyond identifying notes and pressing the right keys. They recommend that students learn to write music in addition to learning to read it. They say “literacy” should include performance skills and practical application. They say it’s important to listen to a variety of music, and that exposure to music in childhood contributes to musical literacy.
These are all great ideas. The problem is that music educators often borrow a little too freely from more rigorous academic disciplines. It seems natural to conflate “musical literacy” with literacy in your native language, but the analogy breaks down quickly. Spoken and written language plays a more fundamental role in our lives than music does.
Let’s take a fresh look at “musical literacy.” “Literacy” has two meanings. First, it’s the ability to read and write in your native language. Second, it’s knowledge of a particular subject. But before a person can be literate in either of these two ways, he or she has to first acquire basic language skills.
Learning theorists describe how people acquire primary language skills. (Search “language acquisition model.”) We acquire language as children, or we will likely never acquire it at all. We know this from those rare, unfortunate cases where people reach adulthood without having been exposed to language as children. As babies, we go through a stage where we babble randomly, then we learn to control the sounds we make with our voices. After that, we imitate sounds from the speech we hear around us. Then we attach meanings to words. Then we learn syntax, grammar and the other mechanics of spoken language. This whole time, we’re also taking in our culture and learning about the wider world.
This occurs from infancy to about the age of five, before we’re old enough to read or write. All learning in early childhood is direct and intuitive. There’s no program or system. There is only cumulative learning from direct experience.
Music has to be acquired in much the same way we acquire primary language. One difference is that people can acquire music at any age. Exposure to music early in life helps, but you’re not limited to a narrow time window in early childhood. The “language acquisition” analogy also breaks down as it applies to music, because we have spoken language to help us navigate the process of acquiring a fundamental sense of music and learning to play. In other ways, though, the “acquisition” analogy holds. Acquiring music is an internal process of listening, memorizing and imitating. Words won’t take you very far in acquiring a fundamental sense of music.
Acquiring primary language requires that you hear and imitate speech before you understand what any of it means. Likewise, we learn to comprehend music intuitively, without the help of any theoretical model. To “comprehend” music means that you can follow basic melodies and rhythms: you can reconcile melodies to a twelve-note scale, follow a melody when it modulates, and follow rhythmic divisions by 2, 3 or 4. There’s no direct analogy for this in spoken or written language. It’s usually easy to assess whether someone comprehends the fundamentals of melody and rhythm.
Teachers of students with reading challenges aren’t supposed to allow those challenges to get in the way of teaching subjects and concepts suited to the student’s age and intelligence. Reading difficulties become truly debilitating when teachers limit students to subjects and concepts based solely on their reading level, regardless of their age, intelligence or ability. Unfortunately, most music teachers do just that. It’s typical for music teachers to limit what their students can play to what they can read in standard notation. They won’t allow you to play beyond your reading ability, because in their worldview, your reading ability is the most important measure of your musical ability overall.
For modern musicians, “musical literacy” has to include interactive skills. Educators tell us that children’s learning depends on social interaction, and music is social interaction. Every time you hear a musical instrument, whether live or recorded, there’s a human being playing it; even computer-generated music had to be programmed by someone. But music isn’t taught that way. Music should first be taught as a conversation. The hieroglyphics can wait until the student is at least minimally conversant. That’s how we learn spoken language, and it’s why playing with live sound in real time is a crucial part of learning music.
Glen Campbell was one of the top LA studio guitarists, and played iconic parts on dozens of hit records. In the documentary film Session Men, Campbell said, “A musician that’s a player, it doesn’t matter what you stick in front of them, they can play it. It’s like a horse eating hay. It doesn’t matter what kind of hay it is. Just throw it out there, we’ll chew it up.” Campbell read only chord charts. His biographies say he never learned to read staff notation. But you can’t convince me that Glen Campbell was musically illiterate.
© 2020 Greg Varhaug