Sound Properties: Amplitude, period, frequency, wavelength

How to find the amplitude, period, frequency, and wavelength for a sound wave. Created by David SantoPietro.

Want to join the conversation?

  • Galba
    I understand that amplitude determines volume and frequency determines pitch. What changes the quality of a sound? For example, an oboe, a violin, and the speaker in this video can all produce a sustained 440 Hz A at the same volume. Why do they sound different?
    (39 votes)
    • Teacher Mackenzie (UK)
      Good question.
      I think firstly it has to do with the shape of the wave, which is determined by the features of the instrument (e.g. a violin tends to produce a roughly triangular wave shape, I believe). There's a short code sketch just before the video transcript below that illustrates the idea.
      As well as the physics, I expect there is also something going on inside your brain that 'interprets' or even adds to the sound depending on what your other senses pick up... for example, if you see an oboe, it can affect the quality of the sound you experience. Obviously things like echo or resonance will also have an impact on quality.

      Sorry it's a bit vague, but I hope it helps.
      (11 votes)
  • jamie_chu78
    In the displacement vs. position graph, what do we mean by position? Where is time? Isn't this just position through time, which is basically displacement? Isn't displacement the y-axis and time the x-axis?

    Furthermore, in the displacement vs. position graph it seems that we are tracking the horizontal motion of a particle and graphing its distance over time in the vertical direction. That looks like exactly the same thing: the particle goes right, the line of the graph goes up; it's even rotated and shown horizontally to reflect this. So there seems to be no difference, as the y- and x-axes seem to represent the same information, and the same type of information seems to be graphed.
    (8 votes)
    • Harrison Sona Ndalama
      No. That graph is like a picture showing where MANY particles were at ONE specific moment in time. The x-axis represents the "equilibrium location" of the medium's particles; it may be measured as the distance between the particles and the sound source or some given point. In effect this graph is saying that, at this point in time, the particle whose equilibrium is x meters from the source has moved y meters from its equilibrium.
      (9 votes)
  • umar
    Why can't we hear sounds with frequencies above 20,000 Hz?
    (4 votes)
    • DAllenSchneider
      We really just aren't made to. In order to hear a sound, our eardrum must vibrate, and 20,000 times per second is just too darn fast for it - the energy is dissipated through the bones and softer tissues around it, which act as a damper. For more insight on this, look up "natural frequency" and see if you can connect the dots.
      (12 votes)
  • 😊
    If the period increases, does the wavelength increase with it, or does it stay the same?
    (8 votes)
  • Vijeya Patel
    What was the name of the device that was hooked up to the speaker in order to get the visualization of sound waves?
    (4 votes)
  • Lakshmi.manda07
    How can we get a visual representation of sound waves using an oscilloscope?
    (3 votes)
  • Melanie Kelso
    Is there a difference between the amplitude as described at :47 (which is on the displacement vs. time graph) and the green arrows (on the displacement vs. 'position x' graph)?
    (5 votes)
  • vasudha
    How does amplitude affect the sound?
    (2 votes)
  • ritamde101004
    I am still having a problem understanding the difference between the graphical representations of displacement vs. time and displacement vs. equilibrium position, since in the vs. time graph the x-axis also seems ultimately to represent the equilibrium position of an air molecule. Can anyone help me out?
    (3 votes)
  • Scarecrow786
    Does a higher amplitude in a sound wave mean that the wave carries more energy (since higher amplitude means a louder noise)?
    (2 votes)
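Following up on the first question above, about why instruments sound different even at the same pitch and volume: here is a small, hypothetical Python/NumPy sketch (not part of the original video) that builds two waveforms sharing a 440 Hz fundamental, a pure sine and one with added harmonics. The harmonic amplitudes are arbitrary illustrative choices; the point is that the two waves have the same period, and so the same pitch, but different shapes.

```python
import numpy as np

sample_rate = 44_100                       # samples per second
t = np.arange(0, 0.01, 1 / sample_rate)    # 10 ms of signal
f0 = 440.0                                 # fundamental frequency in Hz (the A note)

# A pure tone: a single sine wave at 440 Hz.
pure = np.sin(2 * np.pi * f0 * t)

# An "instrument-like" tone: the same 440 Hz fundamental plus weaker
# 2nd and 3rd harmonics (880 Hz and 1320 Hz). The 0.5 and 0.25 weights
# are illustrative assumptions, not measured instrument data.
rich = (np.sin(2 * np.pi * f0 * t)
        + 0.5 * np.sin(2 * np.pi * 2 * f0 * t)
        + 0.25 * np.sin(2 * np.pi * 3 * f0 * t))
rich = rich / np.abs(rich).max()           # scale to the same peak amplitude

# Both waveforms repeat every T = 1/440 s, so they have the same pitch,
# but their shapes within one cycle differ, which we hear as a difference
# in quality (timbre).
print(f"Period of both tones: {1000 / f0:.3f} ms")
print("Pure tone, first 5 samples:", np.round(pure[:5], 3))
print("Rich tone, first 5 samples:", np.round(rich[:5], 3))
```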

Video transcript

- This is what a sound wave sounds like, (speaker hums) but what does a sound wave look like? Well, the air through which the sound wave is traveling looks something like this, but if you want another visual representation of the sound, we can hook this speaker up to an oscilloscope, and it gives us this graph. (speaker hums) We say that this shape represents the sound wave, because if we focus on a single molecule of air, we see that it moves back and forth, just like a sine or cosine graph. The horizontal axis here represents time, and the vertical axis can be thought of as representing the displacement of that air molecule as it oscillates back and forth. The center line here represents the equilibrium position or undisturbed position of that air molecule.

If we turn up the volume, we see that the oscillations become larger, and the sound becomes louder. The maximum displacement of the air molecule from its undisturbed position is called the amplitude. Be careful. The amplitude is not the length of the entire displacement. It's only the maximum displacement measured from the equilibrium position.

Another key idea is the period of a sound wave. The period is defined to be the time it takes for an air molecule to fully move back and forth one time. We call this back and forth motion a cycle. We measure the period in seconds. So, the period is the number of seconds it takes for one cycle. We use the letter capital T to represent the period. If we decrease the period, the time it takes for the air molecules to oscillate back and forth decreases, and the note or the pitch of the sound changes. The less time it takes the air molecules to oscillate back and forth, the higher the note that we perceive.

An idea intimately related to the period is called the frequency. Frequency is defined to be one over the period. So, since the period is the number of seconds per oscillation, the frequency is the number of oscillations per second. Frequency has units of one over seconds, and we call one over a second a hertz. Typical sounds have frequencies in the hundreds or even thousands of hertz. For instance, this note, which is an A note, is causing air to oscillate back and forth 440 times per second. So, the frequency of this A note is 440 hertz. Higher notes have higher frequencies, and lower notes have lower frequencies. Humans can hear frequencies as low as about 20 hertz and as high as about 20,000 hertz, but if a speaker were to oscillate air back and forth more than about 20,000 times per second, it would create sound waves, but we wouldn't be able to hear them. (sound starts, then stops) For instance, this speaker is still playing a note, but we can't hear it right now. Dogs could hear this note, though. Dogs can hear frequencies up to at least 40,000 hertz.

Another key idea in sound waves is the wavelength of the sound wave. The idea of a wavelength is that when this sound is traveling through a region of air, the air molecules will be compressed close together in some regions and spread far apart from each other in other regions. If you find the distance between two compressed regions, that would be the wavelength of that sound wave. Since the wavelength is a distance, we measure it in meters. Be careful. People get wavelength and period mixed up all the time. The period of a sound wave is the time it takes for an air molecule to oscillate back and forth one time. The wavelength of a sound wave is the distance between two compressed regions of air.
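The period-frequency relationship in the transcript, frequency = 1 / period, is easy to check numerically. Here is a minimal Python sketch, illustrative only and not part of the video, using the 440 Hz A note and the roughly 20 Hz to 20,000 Hz hearing range mentioned above; the helper function name is an assumption made for this sketch.

```python
# A 440 Hz A note oscillates air 440 times per second, so one cycle takes
# T = 1/f seconds.
frequency = 440.0                  # oscillations per second (hertz)
period = 1 / frequency             # seconds per oscillation
print(f"Period of a 440 Hz note: {period:.6f} s (about {period * 1000:.2f} ms)")

# Going the other way, frequency = 1/period.
print(f"A period of {period:.6f} s corresponds to {1 / period:.0f} Hz")

def audible_to_humans(f_hz):
    """Rough check against the ~20 Hz to ~20,000 Hz range mentioned in the video."""
    return 20 <= f_hz <= 20_000

for f in (440, 20_000, 40_000):
    print(f"{f} Hz audible to humans? {audible_to_humans(f)}")
```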
People get these mixed up because there's an alternate way to create a graph of this sound wave. Consider this. Before the wave moves through the air, each air molecule has some undisturbed position from the speaker that we can measure in meters. This number represents the equilibrium undisturbed position of that air molecule. Then as the sound wave passes by, the air molecules get displaced slightly from that position. So, an alternate graph that we could make would be the displacement of the air molecule versus the undisturbed position or equilibrium position of that air molecule. This graph would let us know, for a particular moment in time, how displaced the air molecule is at that particular position in space. This graph shows us that in some regions the air is displaced a lot from its equilibrium position, and in other regions, the air is not displaced much at all from its equilibrium position. For this kind of graph, the distance between peaks represents the wavelength of the sound wave, not the period, because it would be measuring the distance between compressed regions in space.

So, be careful. For a sound wave, a displacement versus time graph represents what that particular air molecule is doing as a function of time, and on this type of graph, the interval between peaks represents the period of the wave, but a displacement versus position graph represents a snapshot of the displacement of all the air molecules along that wave at a particular instant of time, and on this type of graph, the interval between peaks represents the wavelength.
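To make the contrast between the two graphs concrete, here is a small, hypothetical Python/NumPy sketch (not from the video) that samples both: displacement versus time for one air molecule, and displacement versus position for a snapshot of many molecules at one instant. It assumes a speed of sound of about 343 m/s and uses the standard relation wavelength = speed x period, neither of which is stated in the video.

```python
import numpy as np

amplitude = 1.0            # maximum displacement from equilibrium (arbitrary units)
frequency = 440.0          # Hz
period = 1 / frequency     # s: peak-to-peak spacing on a displacement vs. time graph

speed_of_sound = 343.0     # m/s, an assumed typical value (not given in the video)
wavelength = speed_of_sound * period   # m: peak-to-peak spacing on a
                                       # displacement vs. position graph

# Displacement vs. time: one air molecule, many moments.
t = np.linspace(0, 3 * period, 300)
displacement_vs_time = amplitude * np.sin(2 * np.pi * t / period)

# Displacement vs. position: many air molecules, one moment (a snapshot).
x = np.linspace(0, 3 * wavelength, 300)
displacement_vs_position = amplitude * np.sin(2 * np.pi * x / wavelength)

print(f"Spacing between peaks in time (the period):      {period:.6f} s")
print(f"Spacing between peaks in space (the wavelength): {wavelength:.4f} m")
```

Under these assumptions, a longer period also means a longer wavelength for sound traveling at a fixed speed, which touches on the wavelength question asked in the comments above.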