FSensor – An Open Source Android Sensor Filtering Library

FSensor (FusionSensor) is an Android library that (hopefully) removes much of the complexity of using Android's orientation sensors (acceleration, magnetic and gyroscope).

FSensor expands greatly on the "out-of-the-box" sensor implementations provided by Android, allowing you to customize sensor filters and fusions for your specific needs, or simply add default filters to what Android already provides.

  • Provides device/sensor agnostic averaging filters in the mean, median and low-pass varieties
  • Provides IMU sensor fusion backed estimations of device orientation in the complementary and Kalman varieties
  • Provides estimations of linear acceleration (linear acceleration = acceleration - gravity) in the averaging filter and sensor fusion varieties

Get FSensor:

In the project level build.gradle:

allprojects {
    repositories {
        maven { url "https://jitpack.io" }
    }
}

In the module level build.gradle:

dependencies {
    implementation 'com.github.KalebKE:FSensor:v1.0'
}

The source of FSensor is available on GitHub.

Averaging Filters:

FSensor implements three of the most common smoothing filters: low-pass, mean and median filters. All the filters are user configurable via a time constant in units of seconds. The larger the time constant, the smoother the signal; however, latency also increases with the time constant. Because the filter coefficient is defined in the time domain, differences in sensor output frequencies have little effect on the performance of the filter, so these filters should perform about the same across all devices regardless of sensor frequency. This matters because sensor output frequencies vary greatly by device model and manufacturer; FSensor's time-constant-based implementation is agnostic to them.

Low-Pass Filter:

FSensor implements an IIR single-pole low-pass filter. The coefficient (alpha) can be adjusted based on the sample period of the sensor to produce the desired time constant that the filter will act on. It takes the simple form of output[0] = alpha * output[0] + (1 - alpha) * input[0], where alpha = timeConstant / (timeConstant + dt), the time constant is the length of signals the filter should act on, and dt is the sample period (1/frequency) of the sensor. It is computationally efficient compared to a mean or median filter (constant time versus linear time).
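As a rough sketch (the class and method names here are illustrative, not FSensor's actual API), the single-pole filter described above could look like this:

```java
// Illustrative single-pole IIR low-pass filter with a time-domain coefficient.
class SinglePoleLowPass {
    private final float timeConstant; // seconds; larger = smoother, more latency
    private float[] output;

    SinglePoleLowPass(float timeConstant) {
        this.timeConstant = timeConstant;
    }

    // dt is the sample period (1/frequency) of the sensor, in seconds
    float[] filter(float[] input, float dt) {
        if (output == null) {
            output = input.clone(); // seed the filter with the first sample
            return output;
        }
        float alpha = timeConstant / (timeConstant + dt);
        for (int i = 0; i < input.length; i++) {
            output[i] = alpha * output[i] + (1 - alpha) * input[i];
        }
        return output;
    }
}
```

Because alpha is recomputed from dt on every sample, the effective smoothing window stays at roughly one time constant even if the sensor's output frequency changes.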

Mean Filter

FSensor implements a mean filter designed to smooth the data points based on a time constant in units of seconds. The mean filter averages the samples that occur over the period defined by the time constant; the number of samples that are averaged is known as the filter window. This approach allows the filter window to be defined over a period of time instead of a fixed number of samples.

Read more in Mean Filters.
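A minimal sketch of the idea, assuming timestamped samples in nanoseconds (class and method names are illustrative, not FSensor's API): the window holds whatever samples fall inside the time constant, so the sample count adapts automatically to the sensor's output frequency.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative mean filter whose window is a period of time, not a sample count.
class TimeWindowMeanFilter {
    private static class Sample {
        final long timestampNanos;
        final float[] values;
        Sample(long t, float[] v) { timestampNanos = t; values = v; }
    }

    private final long windowNanos;
    private final Deque<Sample> window = new ArrayDeque<>();

    TimeWindowMeanFilter(float timeConstantSeconds) {
        this.windowNanos = (long) (timeConstantSeconds * 1e9);
    }

    float[] filter(float[] input, long timestampNanos) {
        window.addLast(new Sample(timestampNanos, input.clone()));
        // Evict samples older than the time constant
        while (timestampNanos - window.peekFirst().timestampNanos > windowNanos) {
            window.removeFirst();
        }
        // Average whatever samples remain in the window
        float[] mean = new float[input.length];
        for (Sample s : window) {
            for (int i = 0; i < input.length; i++) mean[i] += s.values[i];
        }
        for (int i = 0; i < input.length; i++) mean[i] /= window.size();
        return mean;
    }
}
```

A median filter would differ only in the last step, taking the per-axis median of the window instead of the average.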

Median Filter

FSensor uses a median filter designed to smooth the data points based on a time constant in units of seconds. The median filter takes the median of the samples that occur over the period defined by the time constant; the number of samples that are considered is known as the filter window. This approach allows the filter window to be defined over a period of time instead of a fixed number of samples.

Median Filter Usage:

private BaseFilter filter;
private float[] acceleration = new float[3];
private float[] filteredAcceleration;

private void init() {
  filter = new ... // LowPassFilter(), MeanFilter(), MedianFilter();
  filter.setTimeConstant(0.18);
}

@Override
public void onSensorChanged(SensorEvent event) {
    // Could be any of the Android sensors
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, acceleration, 0, event.values.length);
      filteredAcceleration = filter.filter(acceleration);
    } 
}

Sensor Fusions

FSensor offers two different estimations of rotation using IMU sensor fusions. One fusion is based on a quaternion-backed complementary filter and the second is based on a quaternion-backed Kalman filter. Both fusions use the acceleration sensor, magnetic sensor and gyroscope sensor to provide an estimation of the device's orientation relative to world space coordinates.

The gyroscope is used to measure the device's orientation. However, the gyroscope tends to drift due to round-off errors and other factors. Most MEMS gyroscopes also work by measuring the Coriolis effect on a tiny vibrating structure, which means they really do not like external vibrations. Because of drift and external vibrations, the gyroscope has to be compensated with a second estimation of the device's orientation, which comes from the acceleration sensor and magnetic sensor. The acceleration sensor provides the pitch and roll estimations while the magnetic sensor provides the azimuth.

Quaternion Complementary Filter

Quaternions offer an angle-axis solution to rotations which does not suffer from many of the singularities, including gimbal lock, that you will find with rotation matrices. Quaternions can also be scaled and applied to a complementary filter. The quaternion complementary filter is probably the most elegant, robust and accurate of the filters, although it can also be the most difficult to implement.

The complementary filter is a frequency domain filter. In its strictest sense, the definition of a complementary filter refers to the use of two or more transfer functions, which are mathematical complements of one another. Thus, if the data from one sensor is operated on by G(s), then the data from the other sensor is operated on by I-G(s), and the sum of the transfer functions is I, the identity matrix. In practice, it looks nearly identical to a low-pass filter, but uses two different sets of sensor measurements to produce what can be thought of as a weighted estimation.

A complementary filter is used to fuse the two orientation estimations (from the gyroscope and the acceleration/magnetic sensors, respectively) together. It takes the form of gyro[0] = alpha * gyro[0] + (1 - alpha) * accelMag[0], where alpha = timeConstant / (timeConstant + dt), the time constant is the length of signals the filter should act on, and dt is the sample period (1/frequency) of the sensor.
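A simplified scalar sketch of that weighting, applied to a single orientation angle rather than a full quaternion (names are illustrative, not FSensor's API): the gyroscope is integrated for short-term accuracy, then blended toward the drift-free acceleration/magnetic estimate.

```java
// Illustrative complementary filter on one orientation angle (radians).
class ComplementaryAngle {
    private final float timeConstant;
    private float angle;
    private boolean initialized;

    ComplementaryAngle(float timeConstant) {
        this.timeConstant = timeConstant;
    }

    // gyroRate: angular velocity (rad/s); accelMagAngle: absolute angle estimate
    float update(float gyroRate, float accelMagAngle, float dt) {
        if (!initialized) {
            angle = accelMagAngle; // seed with the absolute estimate
            initialized = true;
            return angle;
        }
        float alpha = timeConstant / (timeConstant + dt);
        // Integrate the gyroscope, then blend toward the accel/magnetic estimate
        angle = alpha * (angle + gyroRate * dt) + (1 - alpha) * accelMagAngle;
        return angle;
    }
}
```

With a large time constant the gyroscope dominates (smooth, vibration-resistant), while the small (1 - alpha) weight on the accel/magnetic estimate slowly pulls the integrated angle back toward truth, cancelling drift.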

Quaternion Kalman Filter

Kalman filtering, also known as linear quadratic estimation (LQE), is an algorithm that uses a series of measurements observed over time, containing noise (random variations) and other inaccuracies, and produces estimates of unknown variables that tend to be more precise than those based on a single measurement alone. More formally, the Kalman filter operates recursively on streams of noisy input data to produce a statistically optimal estimate of the underlying system state. Much like complementary filters, Kalman filters require two sets of estimations, which we have from the gyroscope and the acceleration/magnetic sensors.
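A minimal one-dimensional sketch of that recursive predict/update cycle (illustrative only; FSensor's real filter operates on quaternion state, and the noise parameters here are assumptions):

```java
// Illustrative scalar Kalman filter: predict with the gyro, correct with accel/magnetic.
class ScalarKalman {
    private float x;       // state estimate (e.g., one orientation angle)
    private float p = 1f;  // estimate variance
    private final float q; // process noise (gyro drift per step)
    private final float r; // measurement noise (accel/magnetic jitter)

    ScalarKalman(float processNoise, float measurementNoise) {
        q = processNoise;
        r = measurementNoise;
    }

    float update(float gyroRate, float measurement, float dt) {
        // Predict: integrate the gyroscope; uncertainty grows
        x += gyroRate * dt;
        p += q;
        // Update: blend in the accel/magnetic measurement by the Kalman gain
        float k = p / (p + r);
        x += k * (measurement - x);
        p *= (1 - k);
        return x;
    }
}
```

Unlike the complementary filter's fixed alpha, the gain k adapts each step to the relative confidence in the prediction versus the measurement.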

Sensor Fusion Usage:

private OrientationFusion orientationFusion;
private float[] acceleration = new float[3];
private float[] magnetic = new float[3];
private float[] rotation = new float[3];
private float[] fusedOrientation;

private void init() {
  orientationFusion = new ...  // OrientationComplimentaryFusion(), new OrientationKalmanFusion();
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, acceleration, 0, event.values.length);
      orientationFusion.setAcceleration(acceleration);
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, magnetic, 0, event.values.length);
      orientationFusion.setMagneticField(this.magnetic);
    } else if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, rotation, 0, event.values.length);
      // Filter the rotation 
      fusedOrientation = orientationFusion.filter(this.rotation);
    }
}

Linear Acceleration

FSensor offers a number of different linear acceleration filters. Linear acceleration is defined as linearAcceleration = (acceleration - gravity). An acceleration sensor alone is not capable of determining the difference between gravity/tilt and true linear acceleration. There is one standalone approach, a low-pass filter, and several sensor fusion based approaches. Acceleration Explorer offers implementations of all the common linear acceleration filters as well as the Android API implementation.

Low-Pass Linear Acceleration

The simplest linear acceleration filter is based on a low-pass filter. It has the advantage that no other sensors are required to estimate linear acceleration, and it is computationally efficient. The low-pass filter is implemented in such a way that only very long term (low-frequency) signals (i.e., gravity) are allowed to pass through; anything short term (high-frequency) is filtered out. The gravity estimation is then subtracted from the current acceleration sensor measurement, providing an estimation of linear acceleration. The low-pass filter is an IIR single-pole implementation, essentially the same as the Wikipedia LPF. The coefficient (alpha) can be adjusted based on the sample period of the sensor to produce the desired time constant that the filter will act on. It takes the simple form of gravity[0] = alpha * gravity[0] + (1 - alpha) * acceleration[0], where alpha = timeConstant / (timeConstant + dt), the time constant is the length of signals the filter should act on, and dt is the sample period (1/frequency) of the sensor. This implementation can work very well, assuming the acceleration sensor is mounted in a relatively fixed position and the periods of linear acceleration are relatively short.

Read more in Low-Pass Linear Acceleration.
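A sketch of the gravity-isolation-and-subtraction idea described above (class and method names are illustrative, not FSensor's API):

```java
// Illustrative low-pass linear acceleration: a stiff single-pole low-pass
// filter isolates gravity, which is then subtracted from the raw reading.
class LowPassLinearAcceleration {
    private final float timeConstant;
    private float[] gravity;

    LowPassLinearAcceleration(float timeConstant) {
        this.timeConstant = timeConstant;
    }

    // dt is the sample period (1/frequency) of the sensor, in seconds
    float[] filter(float[] acceleration, float dt) {
        if (gravity == null) gravity = acceleration.clone();
        float alpha = timeConstant / (timeConstant + dt);
        float[] linear = new float[acceleration.length];
        for (int i = 0; i < acceleration.length; i++) {
            // Only the long-term (low-frequency) component survives: gravity
            gravity[i] = alpha * gravity[i] + (1 - alpha) * acceleration[i];
            linear[i] = acceleration[i] - gravity[i];
        }
        return linear;
    }
}
```

A large time constant (several seconds) keeps the gravity estimate "stiff", so brief bursts of linear acceleration pass through to the output instead of leaking into the gravity estimate.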

Low-Pass Linear Acceleration Usage:

// Still a BaseFilter under the hood
private AveragingFilter averagingFilter;
private LinearAcceleration linearAccelerationFilter;

private void init() {
  averagingFilter = new ... // LowPassFilter(), MeanFilter(), MedianFilter();
  // Make the filter "stiff" with a large time constant since gravity is a constant (mostly)
  averagingFilter.setTimeConstant(5);
  
  // Initialize the linear acceleration filter with the averaging filter
  linearAccelerationFilter = new LinearAccelerationAveraging(averagingFilter);
}

@Override
public void onSensorChanged(SensorEvent event) {
    // Could be any of the Android sensors
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, acceleration, 0, event.values.length);
      
      // You need to drive *both* filters. The inputs and outputs can be driven at different rates...
      // For instance, once the averaging filter has essentially converged on gravity, it doesn't
      // need any more inputs...
      averagingFilter.filter(acceleration);
      linearAccelerationFilter.filter(acceleration);
    } 
}

IMU Sensor Fusion Linear Acceleration

Calculating the gravity components of a normalized orientation is trivial, so FSensor can easily use the IMU orientation fusions to provide an estimation of linear acceleration that is far more customizable than what Android provides alone.
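As an illustration of why this is trivial, here is a hedged sketch that derives device-frame gravity from pitch and roll under one common Android-style angle convention (the class, method names and sign conventions are assumptions for illustration, not FSensor's actual implementation, which works on quaternions):

```java
// Illustrative: gravity components from orientation, then linear acceleration.
class FusionLinearAcceleration {
    private static final float G = 9.80665f; // standard gravity, m/s^2

    // pitch/roll in radians; returns gravity components in the device frame
    static float[] gravityFromOrientation(float pitch, float roll) {
        float[] gravity = new float[3];
        gravity[0] = (float) (-G * Math.cos(pitch) * Math.sin(roll));
        gravity[1] = (float) (-G * Math.sin(pitch));
        gravity[2] = (float) (G * Math.cos(pitch) * Math.cos(roll));
        return gravity;
    }

    // linearAcceleration = acceleration - gravity, per axis
    static float[] linearAcceleration(float[] accel, float pitch, float roll) {
        float[] gravity = gravityFromOrientation(pitch, roll);
        float[] linear = new float[3];
        for (int i = 0; i < 3; i++) linear[i] = accel[i] - gravity[i];
        return linear;
    }
}
```

For a device lying flat (pitch = roll = 0), gravity resolves entirely to the z-axis, so a stationary accelerometer reading of roughly (0, 0, 9.81) yields near-zero linear acceleration.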

IMU Sensor Fusion Linear Acceleration Usage:

private LinearAcceleration linearAccelerationFilter;
private OrientationFusion orientationFusion;
private float[] acceleration = new float[3];
private float[] magnetic = new float[3];
private float[] rotation = new float[3];
private float[] linearAcceleration;

private void init() {
  orientationFusion = new ...  // OrientationComplimentaryFusion(), new OrientationKalmanFusion();
  linearAccelerationFilter = new LinearAccelerationFusion(orientationFusion);
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, acceleration, 0, event.values.length);
      orientationFusion.setAcceleration(acceleration);
      
      // Apply the orientation to the raw acceleration to estimate linear acceleration
      linearAcceleration = linearAccelerationFilter.filter(acceleration);
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, magnetic, 0, event.values.length);
      orientationFusion.setMagneticField(this.magnetic);
    } else if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
      // Android reuses events, so you probably want a copy
      System.arraycopy(event.values, 0, rotation, 0, event.values.length);
      // Filter the rotation 
      orientationFusion.filter(this.rotation);
    }
}
