One of the eureka moments I had studying music was when I began to understand the harmonic series and how this in turn relates to pitch, intervals, harmony, tuning, waveforms, frequency and, finally, timbre.
The topic was broached by my piano teacher when I was trying to transcribe some Charles Mingus, and struggling with it. To get around the problem he introduced me to a program to slow the audio down. At this juncture it’s worth noting that we didn’t use Ableton Live or other DAWs that would have preserved the pitch information.
The program was called the Amazing Slow Downer, and it worked much like a vinyl record or tape would: if you slow something down the pitch drops, and if you speed it up the pitch rises.
We found (or rather I found, because Pete already knew) that reducing the speed by half caused the pitch to drop by one octave, and doubling the speed caused the pitch to increase by one octave; this was the first example I’d seen of pitch relating to speed.
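That tape-style relationship between speed and pitch can be put in numbers: changing the playback speed by a factor r shifts the pitch by 12·log2(r) semitones. Here’s a minimal sketch (the function name is mine, not anything from the Amazing Slow Downer):

```python
import math

def pitch_shift_semitones(speed_factor):
    """Semitone shift caused by a playback-speed change when
    pitch is not preserved (tape/vinyl behaviour)."""
    return 12 * math.log2(speed_factor)

print(pitch_shift_semitones(0.5))  # half speed -> -12.0 (one octave down)
print(pitch_shift_semitones(2.0))  # double speed -> 12.0 (one octave up)
```

Halving the speed gives exactly -12 semitones (an octave down), which is just what Pete and I heard.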
In this article I want to discuss a few different topics that are all interconnected in one way or another: frequency and its relation to pitch, our tuning system(s) and how we came by them, what overtones are and how they differ from harmonics, what timbre is, and (hopefully) what the point of knowing all this stuff is.
I’m going to leave that story there and pick it up again after covering a few basics. Depending on how familiar you are with waveform displays or tools such as LFOs and filters, you may already have come across the term frequency in music technology. In everyday use the word means the number of times something happens, and in short it means the same thing here.
Sound consists of vibrations, or fluctuations in air pressure, perceived by a listener (so who knows about the tree that falls in an empty forest?).
Any sort of acoustic phenomenon causes air molecules to be pushed back and forth. This creates a (sometimes) audible event from which our ears can detect and infer certain information, like distance, pitch and relative location.
The number of times these air molecules complete a cycle each second determines the sound’s frequency. This means that if something is vibrating faster the pitch is higher, and if it’s vibrating more slowly the pitch is lower – you get the idea. Imagine a motorbike or car driving along: as it accelerates, the sound the engine makes rises in pitch.
We measure frequency in Hertz (named after Heinrich Rudolf Hertz), which is abbreviated to Hz. 1 Hz is one cycle per second. You might see graphs like the one below, where the x-axis represents time, and the y-axis measures amplitude (which you can understand as volume for now, but there is a difference which we won’t go into yet).
Please excuse my poor Photoshop and Illustrator skills, there probably is better software that can be more accurate!
If in the above example our time span is 1 second, then this is a 2 Hz wave, i.e. it has completed 2 cycles in one second.
You may also notice that amplitude is measured in both positive and negative values – a bipolar measurement. We probably think of volume as unipolar or unidirectional, i.e. it goes from quiet to loud. I don’t have enough time to fully go into why this is, but if you’re interested, you can read more about amplitude on the wiki.
I understand amplitude as describing the strength of a wave, and not just audio waves: gravitational, electromagnetic and seismic waves all have amplitudes too. Amplitude is usually measured as the peak (or peak-to-peak) value of these waves.
Someone on Yahoo Answers can explain it better than me:
Volume is used to express (audible) sound and it is usually based on a logarithmic scale, decibels. Amplitude could be based on any scale with a magnitude such as displacement or power/voltage/current, et al. Therefore, volume is an amplitude but not vice versa.
Moving on, in the image below we double the frequency:
In the same amount of time as the above image, the wave completes 4 cycles, making this a frequency of 4 Hz.
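If you want to convince yourself of the cycle counts in the two figures, here’s a rough sketch that samples a sine wave over one second and counts its upward zero crossings (the small phase offset just keeps the crossings away from the exact sample points):

```python
import math

def count_cycles(freq_hz, duration_s=1.0, sample_rate=1000):
    """Count the full cycles a sine wave completes by counting
    upward (negative-to-positive) zero crossings in the sampled
    signal."""
    phase = 0.1  # small offset so no crossing lands exactly on a sample
    n = int(duration_s * sample_rate)
    samples = [math.sin(2 * math.pi * freq_hz * i / sample_rate + phase)
               for i in range(n)]
    return sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)

print(count_cycles(2))  # a 2 Hz wave completes 2 cycles in one second
print(count_cycles(4))  # doubling the frequency doubles the cycle count
```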
We should note that all pitches are frequencies, but not all frequencies are pitches. Older readers may remember AM radio frequencies being described in kHz (kilohertz, 1,000 hertz) and FM in MHz (megahertz, 1,000,000 hertz), or those of you familiar with LFOs may know their range to be (typically) 0.01 Hz to 20 Hz.
Our hearing ranges from roughly 20 Hz to 20 kHz, so anything outside this range is inaudible. LFOs tend to be infrasonic (below human hearing), while radio frequencies sit far above it (strictly speaking, radio waves are electromagnetic rather than sound at all). That’s why spectral analysers display ranges between 20 Hz and 20 kHz.
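Those limits are easy to capture in code. A toy classifier, using the conventional terms infrasonic and ultrasonic for frequencies below and above the nominal 20 Hz – 20 kHz hearing range:

```python
def describe_frequency(freq_hz):
    """Place a frequency relative to the nominal human hearing
    range of 20 Hz to 20 kHz."""
    if freq_hz < 20:
        return "infrasonic (below human hearing)"
    if freq_hz > 20_000:
        return "ultrasonic (above human hearing)"
    return "audible"

print(describe_frequency(0.5))      # a typical LFO rate
print(describe_frequency(440))      # concert-pitch A
print(describe_frequency(720_000))  # an AM radio carrier, expressed in Hz
```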
Let’s move away from abstract physics and think about an acoustic instrument, as this is perhaps easier to understand. A nylon string guitar is a good place to start.
You may know that the distance between the nut and the bridge on a guitar is a carefully measured length, and the 12th fret is exactly halfway between these points.
Plucking the open A string gives you the lowest A note possible in standard tuning. Playing that same string with the 12th fret depressed gives you a note at twice the frequency: one octave up = double the frequency.
The open A string has a frequency of 110 Hz; this means that in one second the string oscillates 110 times. We can infer that the A at the 12th fret on the same string has a frequency of 220 Hz, twice that of our open string.
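The halfway-point trick is a special case of how frets are spaced: in twelve-tone equal temperament (which guitar frets follow), each fret multiplies the frequency by 2^(1/12), so twelve frets give exactly a doubling. A small sketch under that assumption:

```python
def fret_frequency(open_hz, fret):
    """Frequency of a note `fret` semitones above an open string,
    assuming twelve-tone equal temperament (each fret is a factor
    of 2 ** (1/12))."""
    return open_hz * 2 ** (fret / 12)

print(fret_frequency(110, 0))   # open A string -> 110.0 Hz
print(fret_frequency(110, 12))  # 12th fret -> 220.0 Hz, one octave up
```

Notice that twelve applications of 2^(1/12) multiply out to exactly 2, which is why the 12th fret lands precisely an octave above the open string.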
Here’s someone playing Eric Clapton’s Tears in Heaven (amongst other things), filmed from underneath the strings with an iPhone 4. You can quite clearly see how different strings oscillate at different frequencies depending on their pitches.
Sadly the above video may not be as clear cut as we first thought, with various debates occurring on Reddit about the camera’s frame rate, aliasing and all sorts of other reasons for the above phenomena. You can read it here if you are so inclined.
It was in fact pointed out to me on a Facebook music theory group by Jim Harrison that the video is not doing what it claims:
It’s not possible for a human’s (or phone’s, for that matter) visual system to actually “see” the vibrations on a guitar string (or even a double-bass, for that matter) as clearly as that video claims to demonstrate. At best, a person can see that the string is moving rapidly, but cannot make out the “waves” the string describes in space. What is being shown is what’s known as beat frequencies – the perceived difference between (or multiple of) two frequencies occurring together. Most consumer digital video capture systems capture at 24 frames per second (fps). What the video displays is the string’s vibrating frequency divided by the camera’s capture rate. If you’ve seen the video of water flow being altered by sound, then you’ve seen the same effect, just using a different medium.
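What Jim describes is what signal-processing people call aliasing: the camera only samples the string’s position once per frame, so the apparent frequency is the distance from the string’s true frequency to the nearest multiple of the frame rate. A rough sketch under that assumption (the 24 fps figure comes from the quote above; many phones actually shoot at 30 fps):

```python
def apparent_frequency(true_hz, frame_rate):
    """Aliased frequency seen when a vibration at `true_hz` is
    sampled at `frame_rate` frames per second: the distance to
    the nearest integer multiple of the frame rate."""
    remainder = true_hz % frame_rate
    return min(remainder, frame_rate - remainder)

# A 110 Hz open A string filmed at 24 fps appears to wobble slowly:
print(apparent_frequency(110, 24))  # -> 10 (110 is 10 Hz away from 5 * 24)
```

So the lazy, rolling “waves” in the video are a roughly 10 Hz artefact of the capture rate, not the string’s actual 110 Hz vibration.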