Sensor fusion algorithm in Core Motion?

Hi all,


The Core Motion framework provides not only the raw values recorded by the hardware (IMU) but also a processed version of those values.

The Apple documentation says that "Processed values do not include forms of bias that might adversely affect how you use that data."


What kinds of sensor fusion algorithms are applied internally to obtain the processed (unbiased) acceleration and gyro values?

Is there any documentation explaining the sensor fusion algorithms used, such as an EKF or a complementary filter?
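For anyone curious what a complementary filter does in this context, here is a minimal single-axis sketch. This is purely illustrative: Apple does not document Core Motion's actual internal algorithm, and the constants here (`alpha`, `dt`, the bias value) are assumptions. The idea is to fuse a gyro-integrated angle (accurate short-term, but it drifts due to bias) with an accelerometer-derived tilt angle (noisy, but drift-free), which is one way "bias" can be removed from the reported values.

```python
# Illustrative complementary-filter sketch (NOT Core Motion's documented
# algorithm). Fuses a gyro-integrated angle with an accelerometer tilt angle.

def complementary_update(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: high-pass the gyro path, low-pass the accel path."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Usage: a device held at a constant 0.5 rad tilt, with a biased gyro.
angle = 0.0
true_angle = 0.5
gyro_bias = 0.01  # rad/s constant bias the filter must reject
for _ in range(2000):
    angle = complementary_update(angle,
                                 gyro_rate=gyro_bias,     # gyro reads only its bias
                                 accel_angle=true_angle,  # accelerometer sees the tilt
                                 dt=0.01)
print(angle)  # converges near 0.5 despite the gyro bias
```

Note how the accelerometer term slowly pulls the estimate back toward the true tilt, so the constant gyro bias cannot accumulate into unbounded drift.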


Thanks,

Pyojin Kim.

Did you find any information?

You can check this video; it gives an in-depth explanation of Core Motion and its sensor fusion algorithms, which might help: https://developer.apple.com/devcenter/download.action?path=/videos/wwdc_2012__hd/session_524__understanding_core_motion.mov
