Hey all,
The documentation for AVAudioUnitTimePitch's pitch property describes it as a float, and says:
"The pitch is measured in 'cents', a logarithmic value used for measuring musical intervals. One octave is equal to 1200 cents. One musical semitone is equal to 100 cents. The default value is 1.0. The range of values is -2400 to 2400."

My issue is with seeing the default value listed as 1.0. Shouldn't this really be 0.0, to signify that no pitch shifting is taking place? The 1.0 default implies that, by default, AVAudioUnitTimePitch is shifting the audio up one cent, so processing is occurring and the original data is getting altered.
Or, if 1.0 is indeed the value for no processing or shifting, then perfect semitone shifts would have to occur in steps of 100n + 1 cents:
Upward: 101, 201, 301...
Downward: -99, -199, -299...
Otherwise, every shift would be one cent off from a true semitone.
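For context, here's the math behind my assumption: a pitch shift of c cents corresponds to a frequency ratio of 2^(c/1200), so 0 cents is the only value that leaves the signal untouched (ratio 1.0). A quick sketch in plain Swift (not using AVFoundation, just illustrating the cents scale):

```swift
import Foundation

// Frequency ratio produced by a pitch shift of `cents` cents.
// 0 cents -> ratio 1.0 (no shift), 100 cents -> ~1.0595 (one semitone),
// 1200 cents -> 2.0 (one octave).
func frequencyRatio(cents: Double) -> Double {
    pow(2.0, cents / 1200.0)
}
```

Under this definition, a default of 1.0 cents would give a ratio of about 1.000578, i.e. a very slight but nonzero upward shift, which is why the documented default seems wrong to me.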
Please advise. I think the official documentation could also use clarification in this regard.
Thanks,
Geoff