I’ve got some real-time accelerometer and gyroscope data coming in on my hand-gesture recognition project. Naturally, I would like to remove jitter and noise from the data as painlessly as possible, which puts us into the world of real-time digital filtering. Many books have been written on this subject and it is easy to dive ‘down the rabbit hole’ and lose a lot of your life testing filters. I want something that is ‘good enough’, quickly. From this Stack Overflow answer, I got the idea for a simple implementation of a moving average filter:
The first term, x’, is the new filtered value, calculated by taking the last filtered value and adding α × (last measured value − last filtered value).
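Expanding that product, the update is just a weighted blend of the new measurement and the previous filtered value. Here is a minimal Python sketch of one update step (the function name and sample values are my own, for illustration):

```python
ALPHA = 0.5  # smoothing factor, 0 < alpha <= 1 (illustrative value)

def ema_step(old_filtered, new_measured, alpha=ALPHA):
    """One filter update: x' = x + alpha * (measurement - x)."""
    return old_filtered + alpha * (new_measured - old_filtered)

# Algebraically the same thing, written as a weighted average:
#   x' = alpha * measurement + (1 - alpha) * x
print(ema_step(0.0, 1.0))   # 0.5
print(ema_step(0.5, 1.0))   # 0.75
```

With alpha near 1 the output tracks the measurement closely; with alpha near 0 it smooths heavily but responds slowly.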
So how to simulate and implement this?
First of all, I will simulate the idea. Filters of this kind are implemented using convolution: the input data is convolved with a filter operator. So I tried out a filter operator:
Here’s a simple script I ran using a Jupyter notebook and Python 3. I lifted the base code from the matplotlib examples page. I generate a sine wave and add some random noise in line 6. The filter operator is defined in line 5; I am using alpha = 0.5 for this example. The function np.convolve in line 7 implements the filter. I had to knock off the last element of the filtered and difference data to get them all to plot, as one of the characteristics of a filter is that it will elongate the data set. Really, you need to ‘pad’ a data set at each end before applying convolution to remove the ‘edge effects’ of the filter. But we are looking to quickly test and implement a filter here, not get bogged down in the technical minutiae of filter design. Rabbit hole. Avoid.
import numpy as np
import matplotlib.pyplot as plt
ALPHA = 0.5
x = np.linspace(0, 2 * np.pi, 100)
filter = (1 - ALPHA, ALPHA * 1)
y = 2 * np.sin(x) + 0.1 * np.random.normal(size=x.size)  # zero-mean noise
y_filt = np.convolve(y, filter)
y_diff = y - y_filt[:-1]

print(y)
print(y_filt[:-1])
print(y_diff[:-1])

fig, (ax0, ax1, ax2) = plt.subplots(nrows=3)
ax0.plot(x, y)
ax0.set_title('input')
ax1.plot(x, y_filt[:-1])
ax1.set_title('output')
ax2.plot(x, y_diff)
ax2.set_title('difference')

# Hide the right and top spines
ax1.spines['right'].set_visible(False)
ax1.spines['top'].set_visible(False)
# Only show ticks on the left and bottom spines
ax1.yaxis.set_ticks_position('left')
ax1.xaxis.set_ticks_position('bottom')

# Tweak spacing between subplots to prevent labels from overlapping
plt.subplots_adjust(hspace=0.5)
plt.show()
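One detail worth pinning down from the listing: np.convolve in its default ‘full’ mode returns len(input) + len(kernel) − 1 samples, which is why the last element has to be knocked off. A quick check (the kernel values are illustrative):

```python
import numpy as np

a = np.ones(100)        # stand-in for 100 input samples
kernel = (0.5, 0.5)     # two-tap filter operator with alpha = 0.5

out = np.convolve(a, kernel)    # default mode='full'
print(len(out))                 # 101 = 100 + 2 - 1, hence the [:-1] trim
print(len(np.convolve(a, kernel, mode='same')))   # 100
```

mode='same' keeps the output the same length as the input, although it centres the kernel rather than trimming the tail, so the samples are aligned slightly differently.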
For an input, filtered output and difference plot, see below. Note that the difference plot uses a different scale from the input and filtered output. The filtered output looks to have the same amplitude as the input, and some of the random noise has been removed. It is not perfect, but it has helped and was fast and easy to implement.
I am using micropython v1.7 on a pyboard v1.0 with an mpu6050 accelerometer/gyroscope for my hardware platform – see the diagram below. So how hard could it be to implement a simple one point filter? Errrr….
The filter code is straightforward, see the snippet below. The function filter takes the latest sensor value as new_value and the last filtered value as old_value, returning the latest filtered value. I am using ALPHA as 0.5 for this test.
def filter(self, old_value, new_value):
    ''' simple moving average filter '''
    return (new_value*ALPHA + (1-ALPHA)*old_value)
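As a quick sanity check away from the board, the same update can be exercised in desktop Python (a standalone copy of the function, fed an illustrative step input):

```python
ALPHA = 0.5

def filter(old_value, new_value):
    ''' simple moving average filter, standalone copy for testing '''
    return new_value * ALPHA + (1 - ALPHA) * old_value

# Feed a step input: the filtered value converges towards 1.0
value = 0.0
for _ in range(5):
    value = filter(value, 1.0)
print(value)  # 0.96875 after five samples
```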
This function is called from the main sensor scan and process while loop for each of the x, y and z accelerometer axes, shown in the snippet below.
while True:
    if self.run_flag:
        if self.acc_read_flag:
            self.counter += 1
            (delta, x, y, z) = self.read_acc()
            x_acc = x                      # keep the raw x reading for comparison
            x = self.filter(old_x, x)
            y = self.filter(old_y, y)
            z = self.filter(old_z, z)
            print(START, self.counter, delta, x_acc, x, x - x_acc, END)
            old_x, old_y, old_z = x, y, z
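One thing the snippet assumes is that old_x, old_y and old_z hold sensible values before the first pass. A host-side sketch of the same loop shape, seeding them from the first reading and using a stand-in for read_acc() (the signal values and the helper are my own, for illustration):

```python
import random

ALPHA = 0.5

def filter(old_value, new_value):
    ''' simple moving average filter '''
    return new_value * ALPHA + (1 - ALPHA) * old_value

def read_acc():
    # Stand-in for the mpu6050 read: a 1 g x-axis signal plus jitter
    return (1.0 + 0.1 * random.uniform(-1, 1),
            0.0 + 0.1 * random.uniform(-1, 1),
            0.0 + 0.1 * random.uniform(-1, 1))

# Seed the 'old' values from the first reading so the filter does not
# start from an arbitrary zero.
old_x, old_y, old_z = read_acc()

for _ in range(100):
    x, y, z = read_acc()
    x, y, z = filter(old_x, x), filter(old_y, y), filter(old_z, z)
    old_x, old_y, old_z = x, y, z

print(old_x)  # stays close to the underlying 1.0 signal
```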
Have a look at the plot below. This shows the x-axis from an mpu6050 module being sampled through a pyboard v1.0 at 100 Hz. I wrote the firmware for this board using micropython v1.7 and the display software using python 3.4 with the pyqtgraph library. The x scale shows samples, the y scale shows acceleration in g.
So what can we see? The raw data looks jittery, the filtered data looks smoother and we can see the jitter that has been taken out in the difference plot. To characterise this filter properly I would need to start looking at the frequency spectrum of the raw and filtered data. But this is heading down the rabbit hole again.
I’ve quickly implemented a filter that looks to be doing what I want it to – removing noise from the data. I can play with the alpha value to change the amount of smoothing. ‘The proof of the pudding is in the eating’: if I can get my gesture recognition system to work with this simple filter in place, then it is good enough.
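The effect of playing with alpha can be eyeballed quickly by running the per-sample update over one noisy trace at a few settings (a sketch with illustrative values, reusing the simulated sine from earlier):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 2 * np.sin(np.linspace(0, 2 * np.pi, 100))
noisy = signal + 0.1 * rng.standard_normal(100)

def ema(data, alpha):
    ''' apply the per-sample filter along a whole trace '''
    out = [data[0]]                 # seed with the first sample
    for sample in data[1:]:
        out.append(alpha * sample + (1 - alpha) * out[-1])
    return np.array(out)

for alpha in (0.8, 0.5, 0.2):
    residual = noisy - ema(noisy, alpha)
    print(alpha, round(float(np.abs(residual).mean()), 3))
# Smaller alpha removes more jitter per sample but tracks the
# underlying sine more slowly.
```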