The Core Motion framework provides not only the raw values recorded by the hardware (IMU) but also a processed version of those values.
The Apple documentation says that "Processed values do not include forms of bias that might adversely affect how you use that data."
What kinds of sensor fusion algorithms are applied internally to obtain the processed (unbiased) acceleration and gyroscope values?
Is there any documentation explaining which sensor fusion algorithms are used, such as an extended Kalman filter (EKF) or a complementary filter?
Has anyone found any information about this?
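For context, Apple does not publish the internal algorithm, but a complementary filter (one of the techniques mentioned above) is a common way to fuse gyroscope and accelerometer data. The following is a minimal illustrative sketch of that general idea, not Apple's actual implementation: it estimates pitch by blending integrated gyro rate (accurate short-term, but drifts) with the accelerometer's gravity-derived tilt (noisy short-term, but drift-free).

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Estimate pitch (rad) by fusing gyro rate (rad/s) with
    accelerometer readings (ax, az) in g's.

    alpha weights the gyro path; (1 - alpha) weights the
    accelerometer path, which slowly corrects gyro drift.
    This is an illustrative toy, not Core Motion's algorithm.
    """
    pitch = 0.0
    estimates = []
    for omega, (ax, az) in zip(gyro_rates, accels):
        # Tilt implied by the gravity vector (valid when the
        # device is not accelerating strongly).
        accel_pitch = math.atan2(ax, az)
        # Blend the gyro integration with the accelerometer tilt.
        pitch = alpha * (pitch + omega * dt) + (1 - alpha) * accel_pitch
        estimates.append(pitch)
    return estimates
```

For a device held statically at a 0.1 rad tilt with a quiet gyro, the estimate converges to the accelerometer-derived angle, which is how the accelerometer term washes out accumulated gyro drift; an EKF achieves a similar effect with a statistically weighted correction instead of a fixed blend factor.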