For my uni course I have to do a portfolio of different tasks. One is to make a short program that plays a sine wave at a frequency the user has entered, then plays two others, one an octave higher and one an octave lower. The first frequency and time are given to us: 1000.0 for frequency and 1000 for time (milliseconds). My question is: by how much do I raise and lower the time in order to raise and lower the frequency by an octave? Hope that makes sense, thanks.

Isn't it just asking you to play a frequency for a given time, then play it an octave higher (freq * 2) and then one an octave lower (freq / 2), each for the same amount of time?

No. With sine waves (often referred to as sound waves), stretching or shrinking a waveform to fit the same sine wave into a shorter space of time will either raise or lower its pitch, depending on whether you speed it up or slow it down accordingly. I just don't know by how much. edit: this is the exact question.

I believe my way is easier. But since they ask you to do it by changing the duration: to raise an octave you'd need to play the same clip in half the time, since you'd then have double the cycles in a given amount of time (effectively doubling the frequency). At least that's my thinking on it... it has been 10 years since Multimedia Audio 101 and 11 years since high school band...
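A quick sketch of the arithmetic above (Python is just an assumption here, since the thread doesn't say what language the course uses; the function name is made up for illustration). The clip contains a fixed number of cycles, so squeezing it into half the time doubles the frequency and stretching it to double the time halves it:

```python
def octave_shift_by_time(freq_hz, duration_ms):
    """Keep the same number of wave cycles but change the playback window.

    cycles = frequency * duration. Playing those same cycles in half the
    time doubles the frequency (octave up); playing them over double the
    time halves it (octave down).
    """
    cycles = freq_hz * (duration_ms / 1000.0)   # total cycles in the clip
    up_ms = duration_ms / 2.0                   # octave up: half the time
    down_ms = duration_ms * 2.0                 # octave down: double the time
    freq_up = cycles / (up_ms / 1000.0)         # same cycles, shorter window
    freq_down = cycles / (down_ms / 1000.0)     # same cycles, longer window
    return freq_up, freq_down

# With the given values (1000.0 Hz for 1000 ms):
print(octave_shift_by_time(1000.0, 1000))  # (2000.0, 500.0)
```

So the answer to "by how much" is: halve the time for an octave up, double it for an octave down.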

Aye, your way would be easier, but hey, if I can do it both ways I have a choice, I guess. I think you're right. I understand the science, so to speak, behind why it works: having the same number of peaks of a wave in a shorter time span moves them closer together, raising the frequency, and vice versa to lower it. I just don't know by how much.

Sure thing, give me a sec. I stopped on it there, didn't see the point as I don't know by how much the time needs to be changed, so I moved onto the next question.

If what your teacher wants is a change in frequency based on a change in time, then the time (milliseconds) is an indicator of the period of the wave (not the duration). The calculation is frequency = 1 / period. Calculating the period (T) from frequency is T = 1 / freq. So T = 1 / (1000 Hz) = 0.001 s (1 millisecond); at 500 Hz, T = 0.002 s (2 milliseconds). I still have no idea what the original 1000 Hz at 1000 ms means, unless the time is the duration, not the period.
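The period/frequency relationship above is a one-liner in any language; here it is in Python (the language choice is an assumption, as the thread never names one):

```python
def period_seconds(freq_hz):
    """Period T of a wave in seconds: T = 1 / f."""
    return 1.0 / freq_hz

print(period_seconds(1000.0))  # 0.001  -> 1 ms per cycle at 1000 Hz
print(period_seconds(500.0))   # 0.002  -> 2 ms per cycle at 500 Hz
print(period_seconds(2000.0))  # 0.0005 -> 0.5 ms per cycle at 2000 Hz
```

Note that halving the frequency (octave down) doubles the period, and doubling the frequency (octave up) halves it, which matches the "half the time" rule for the duration interpretation too.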

Can't help you with the code, but simply speaking, if it goes up an octave, it doubles in frequency. So 1 kHz taken up an octave would be 2 kHz. Up an octave, double the frequency. Down an octave, halve the frequency. And if this helps, the old way of writing it was cycles per second, so 1 kHz is 1000 cycles per second. EDIT: yes, I'm saying the same thing as Jizzler, but just thought I'd rephrase it.
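To tie the thread together, here is a minimal sketch of generating the three tones (base, octave up, octave down) as raw sample values. This is Python with a 44100 Hz sample rate, both assumptions, since the thread never specifies a language or an audio API; actually sending the samples to a sound device is platform-dependent and left out:

```python
import math

SAMPLE_RATE = 44100  # samples per second (CD-quality rate; an assumption)

def sine_samples(freq_hz, duration_ms, sample_rate=SAMPLE_RATE):
    """One channel of sine-wave samples in [-1.0, 1.0]."""
    n = int(sample_rate * duration_ms / 1000)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

base = 1000.0  # Hz, the given starting frequency
for f in (base, base * 2, base / 2):  # original, octave up, octave down
    samples = sine_samples(f, 1000)   # 1000 ms of each tone
    print(f, len(samples))
```

Each clip lasts the same 1000 ms; only the frequency changes, which is the freq * 2 / freq / 2 approach from earlier in the thread.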