Wednesday, 1 January 2014

Breathing Detection using Kinect and OpenCV - Part 1 - Image Processing

I have had a go at detecting breathing using an Xbox Kinect depth sensor and the OpenCV image processing library.
I have seen a research paper that did breathing detection, but it relied on fitting the Kinect output to a skeleton model to identify the chest area to monitor.  I would like a less computationally intensive route, so I am trying to do it with image processing alone.

To detect the small movements of the chest during breathing, I am doing the following:
• Start with a background depth image of the empty room.
• Grab a depth image from the Kinect.
• Subtract the background so only the test subject remains.
• Subtract a rolling-average background image and amplify the resulting small differences - this makes the image very sensitive to small movements.


Resulting video shows image brightness changing due to chest movements from breathing.

We can calculate the average brightness of the test subject image - the value clearly changes with the breathing movements.  The job for tomorrow night is to do some statistics to work out the breathing rate from this data.

The source code of the python script that does this is the 'benfinder' program in the OpenSeizureDetector archive.
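As an illustrative sketch (not the actual benfinder code - the function and parameter names below are made up), the processing chain boils down to a few lines of numpy:

```python
import numpy as np

def breathing_signal(frame, background, rolling_avg, alpha=0.05, gain=20.0):
    """One step of the processing chain described above.

    frame       - current depth image as a float32 numpy array
    background  - depth image of the empty room
    rolling_avg - running average of recent frames (updated in place)
    """
    # Remove the static room so only the test subject remains.
    subject = np.abs(frame - background)

    # Update the rolling average, then amplify the small frame-to-frame
    # differences caused by chest movement.
    rolling_avg += alpha * (frame - rolling_avg)
    diff = np.clip(gain * np.abs(frame - rolling_avg), 0, 255)

    # Average brightness of the amplified difference image - this is the
    # value that rises and falls with breathing.
    return diff.mean()
```

The alpha and gain values here are guesses; in practice they would need tuning against the real depth data.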

    Tuesday, 31 December 2013

    A Microsoft Kinect Based Seizure Detector?

    Background

    I have been trying to develop an epileptic seizure detector for our son on-and-off for the last year.   The difficulty is that it has to be non-contact as he is autistic and will not tolerate any contact sensors, and would not lie on a sensor mat etc.
    I had a go at a video based version previously, but struggled with a lot of noise, so put it on hold.

    At the weekend I read the book "OpenCV Computer Vision with Python" by Joseph Howse - this was a really good summary of how to build an application around OpenCV video processing, covering things like separating the user interface from the video processing.   Most significantly, he pointed out that it is now quite easy to use a Microsoft Kinect sensor with OpenCV (it looked rather complicated earlier in the year when I last looked), so I thought I should give it a go.

    Connecting Kinect

    When I saw a Kinect sensor in a second hand gadgets shop on Sunday, I had to buy it and see what it can do.

    The first pleasant surprise that I got was that it came with a power supply and had a standard USB plug on it (I thought I would have to solder a USB plug onto it) - I plugged it into my laptop (Xubuntu 13.10), and it was immediately detected as a Video4Linux webcam - a very good start.

    System Software

    I installed the libfreenect library and its python bindings (I built it from source, but I don't think I had to - there is an ubuntu package python-freenect which would have done it).

    I deviated from the advice in the book here: the author suggested using the OpenNI library, but this didn't seem to work - it looks like they no longer support Microsoft Kinect sensors (I suspect it is a licensing issue...).   Also, the particularly clever software that does skeleton detection (Nite) is not open source, so you have to install it as a binary package, which I do not like.   It seems that the way to get OpenNI working with the Kinect is to use a wrapper around libfreenect, so I decided to stick with libfreenect.

    The only odd thing is whether you need to be root to use the Kinect - sometimes I need to access it as root first, and after that it works as a normal user.  It must be something to do with udev rules, so I will look into it later - not a big deal at the moment....

    BenFinder Software

    To see whether the Kinect looks promising as a seizure detector, I wrote a small application based on the framework in Joseph Howse's book.   I had to modify it to work with libfreenect - basically a custom frame grabber.
    The code does the following:
    • Display video streams from kinect, from either the video camera or the infrared depth camera on the kinect - works!  (switch between the two with the 'd' key).
    • Save an image to disk ('s' key).
    • Subtract a background image from the current image, and display the resulting image ('b' key).
    • Record a video (tab key).

    The idea is that it should be able to distinguish Benjamin from the background reliably, so we can then start to analyse his image to see if his movements seem odd (those who know Benjamin will know that 'odd' is a bit difficult to define for him!).
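The core of the frame handling can be sketched like this (an illustrative sketch only - the real code is in the repository; the raw 11-bit depth format is what the libfreenect sync interface returns, and the function names here are mine):

```python
import numpy as np

def depth_to_display(depth11):
    """Scale a raw 11-bit Kinect depth frame (e.g. from
    freenect.sync_get_depth()) down to an 8-bit greyscale image."""
    return (depth11 >> 3).astype(np.uint8)

def subtract_background(frame, background, threshold=5):
    """Absolute difference between the current frame and a saved
    background, with small differences zeroed out as sensor noise."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    diff[diff < threshold] = 0
    return diff.astype(np.uint8)
```

The threshold value is a guess at the depth camera's noise floor and would need tuning.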

    Output

    I am very pleased with the output - it looks like it could work - a few images:

    Output from Kinect Video Camera (note the clutter to make detection difficult!)
    Kinect Depth Camera Output - Note black hole created by open door.



    Depth Camera Output with background image subtracted - note that the subject stands out quite clearly.
    Example of me trying to do Benjamin-like behaviours to see if I can be detected.

    Conclusion & What Next

    Background subtraction from the depth camera makes the test subject stand out nice and clearly - should be quite easy to detect him computationally.
    Next stage is to see if the depth camera is sensitive enough to detect breathing (when lying still) - I will try subtracting each image from the average of the last 30 or so, and amplifying the differences to see if the movement shows up.
    If that fails, I will look at Skeltrack to fit a body model to the images and analyse movement of limbs (but this will be much more computationally costly).
    Then I will have to look at infrastructure to deploy this - I will either need a powerful computer in Benjamin's room to interface with the Kinect and do the analysis, or maybe use a Raspberry Pi to interface with the kinect and serve the depth camera output as a video stream.

    Looking promising - will add another post with the breathing analysis in the new year...

    Thursday, 5 December 2013

    Using a Kobo Ebook Reader as a Gmail Notifier

    A certain person that I know well does not read her emails very often, and sees it as a chore to switch on the computer to see if she has any.  And no, I can't interest her in a smartphone that would do email for her....  This post is about making a simple device to hang on the wall like a small picture, next to the calendar, so she can always see whether she has emails and know if it is worth putting the computer on.

    I was in WH Smith the other day and realised that they were selling Kobo Mini e-book readers for a very good price (<£30).   When you think about it, the reader is a small battery-powered computer with a wifi interface, a 5" e-ink screen and a touch screen.    This sounds like just the thing to hang on the wall and use to display the number of unread emails.

    Fortunately some clever people have worked out how to modify the software on the device - it runs Linux, and the manufacturer has published the open source part of the device firmware (https://github.com/kobolabs/Kobo-Reader).   I haven't done it myself, but someone else has compiled python to run on the device, using the pygame library to handle writing to the screen (http://www.mobileread.com/forums/showthread.php?t=219173).  Note that I needed this later build of python to run on my new Kobo Mini, as some of the other builds that are available crashed without any error messages - I think this is down to the version of some of the C libraries installed on the device.
    Finally someone called Kevin Short wrote a programme to use a kobo as a weather monitor, which is very similar to what I am trying to do and was a very useful template to start from - thank you, Kevin! (http://www.mobileread.com/forums/showthread.php?t=194376).

    The steps I followed to get this working were:

    • Enable telnet and ftp access to the kobo (http://wiki.mobileread.com/wiki/Kobo_Touch_Hacking)
    • Put python on the 'user' folder of the device (/mnt/onboard/.python).
    • Extend the LD_LIBRARY_PATH in /etc/profile to point to the new python/lib and pygame library directories.
    • Add 'source /etc/profile' into /etc/init.d/rcS so that we have access to the python libraries during boot-up.
    • Prevented the normal kobo software from starting by commenting out the lines that start the 'hindenburg' and 'nickel' applications in /etc/init.d/rcS.
    • Killed the boot-up animation screen by adding the following into rcS:
            killall on-animator.sh
            sleep 1
    • Added my own boot-up splash screen by adding the following to rcS:
            cat /etc/images/SandieMail.raw | /usr/local/Kobo/pickel showpic 
    • Enabled wifi networking on boot up by referencing a new script /etc/network/wifiup.sh in rcS, which contains:
            insmod /drivers/ntx508/wifi/sdio_wifi_pwr.ko
            insmod /drivers/ntx508/wifi/dhd.ko
            sleep 2
            ifconfig eth0 up
            wlarm_le -i eth0 up
            wpa_supplicant -s -i eth0 -c /etc/wpa_supplicant/wpa_supplicant.conf -C /var/run/wpa_supplicant -B
            sleep 2
            udhcpc -S -i eth0 -s /etc/udhcpc.d/default.script -t15 -T10 -A3 -f -q
    • Started my new gmail notifier program using the following in rcS:
            cd /mnt/onboard/.apps/koboGmail
            /usr/bin/python gmail.py > /mnt/onboard/gmail.log 2>&1 &
    The actual python program that does the notifying is quite simple - it uses the pygame library to draw the screen image, and a utility called 'full_update', which is part of the kobo weather project, to push it to the e-ink screen.   The program does the following:
    • Get the battery status, and create an appropriate icon to show battery state.
    • Get the wifi link status and create an appropriate icon to show the link state.
    • Get the 'atom' feed of the user's gmail account using the url, username and password stored in a configuration file.
    • Draw the screen image showing the number of unread emails, and the sender and subject of the first 10 unread mails, and render the battery and wifi icons onto it.
    • Update the kobo screen with the new image.
    • Wait a while (5 seconds at the moment for testing, but will make it longer in the future - 5 min would probably be plenty).
    • Repeat indefinitely.
    The source code is in my github repository.
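The feed handling at the heart of it can be sketched as below (the fetch itself is just an HTTP GET of the Gmail atom feed with basic auth; this parsing sketch assumes Gmail's old Atom 0.3 namespace, and the function name is mine, not from the real program):

```python
import xml.etree.ElementTree as ET

ATOM = '{http://purl.org/atom/ns#}'   # namespace used by Gmail's atom feed

def parse_gmail_feed(xml_text, max_entries=10):
    """Return (unread_count, [(sender, subject), ...]) from the feed XML."""
    root = ET.fromstring(xml_text)
    count = int(root.find(ATOM + 'fullcount').text)
    entries = []
    for entry in root.findall(ATOM + 'entry')[:max_entries]:
        subject = entry.find(ATOM + 'title').text or ''
        sender = entry.find(ATOM + 'author/' + ATOM + 'name').text or ''
        entries.append((sender, subject))
    return count, entries
```

The count drives the big number on the display, and the (sender, subject) pairs fill the list of the first 10 unread mails.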

    The resulting display is pretty basic, but functional as shown in the picture.

    Things to Do

    There are a few improvements I would like to make to this:
    1. Make it less power intensive by switching off wifi when it is not needed (it can flatten its battery in about 12 hours so will need to be plugged into a mains adapter at the moment).
    2. Make it respond to the power switch - you can switch it off by holding the power switch across for about 15 seconds, but it does not shut down nicely - no 'bye' message on the screen or anything like that - it just freezes.
    3. Get it working as a usb mass storage device again - it does usb networking at the moment instead, so you have to use ftp to update the software or log in and use vi to edit the configuration files - not user friendly.
    4. Make it respond to the touch screen - I will need to interpret the data that appears in /dev/input for this.  The python library evdev should help with interpreting the data, but it uses native c code so I need a cross compiler environment for the kobo to use that, which I have not set up yet.  Might be as easy to code it myself as I will only be doing simple things.
    5. Get it to flash its LED to show that there are unread emails - might have to modify the hardware to add a bigger LED that faces the front rather than top too.
    6. Documentation - if anyone wants to get this working themselves, they will need to put some effort in, because the above is a long way off being a tutorial.   It should be possible to make a kobo firmware update file that would install it if people are interested in trying though.


    Tuesday, 22 October 2013

    Raspberry Pi and Arduino

    I am putting together a data logger for the biogas generator.

    I would like it networked so I don't have to go out in the cold, so I will use a Raspberry Pi.   To make interfacing the sensors easy I will connect the Pi to an Arduino microcontroller.   This is a bit over the top, as I should be able to do everything I need using the Pi's GPIO pins, but the Arduino has a lot of libraries to save me programming....

    To get it working I installed the following packages using:
    apt-get install gcc-avr avr-libc avrdude arduino-core arduino-mk

    To test it, copy the Blink.ino sketch from /usr/share/arduino/examples/01.Basics/Blink/ to a user directory.
    Then create a Makefile in the same directory that has the following contents:
    ARDUINO_DIR  = /usr/share/arduino
    TARGET       = Blink
    ARDUINO_LIBS =
    BOARD_TAG    = uno
    ARDUINO_PORT = /dev/ttyACM0
    include /usr/share/arduino/Arduino.mk
    Then just do 'make' to compile it, then upload it to the Arduino (in this case an Uno) using:
    avrdude -F -V -p ATMEGA328P -c arduino -P/dev/ttyACM0  -U build-cli/Blink.hex
    The LED on the Arduino Uno starts to blink - success!
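    For the logging side on the Pi, something like the following should do.  This is a sketch: it assumes the Arduino sketch prints one comma-separated reading per line over the USB serial port (the field names are placeholders for whatever sensors get attached), and the actual port access needs the pyserial package:

```python
import time

def parse_reading(line, fields=('temp_c', 'gas_ppm')):
    """Turn one comma-separated line from the Arduino into a dict.
    The field names here are placeholders, not a fixed format."""
    values = [float(v) for v in line.strip().split(',')]
    reading = dict(zip(fields, values))
    reading['timestamp'] = time.time()
    return reading

def log_readings(port='/dev/ttyACM0', baud=9600, path='biogas_log.csv'):
    """Append readings from the Arduino to a CSV file indefinitely."""
    import serial  # pyserial - only needed when actually logging
    with serial.Serial(port, baud, timeout=2) as conn, open(path, 'a') as out:
        while True:
            line = conn.readline().decode('ascii', 'ignore').strip()
            if line:
                r = parse_reading(line)
                out.write('%.1f,%s\n' % (r['timestamp'], line))
                out.flush()
```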

    Saturday, 19 October 2013

    Small Scale Biogas Generator

    I heard on the radio last week that some farmers are using anaerobic digesters to produce methane-rich biogas from vegetable waste.
    This got me wondering if we could use our domestic waste to produce usable fuel gas - maybe to heat the greenhouse or something similar.

    I thought I would make a small scale experimental digester to see if it works, and what amount of gas it makes, to see if it is worth thinking about something bigger.

    My understanding is that the methane producing bacteria work best at over 40 degC, so I will heat the digester.  I will do this electrically for the experimental set up because it is easy, and I can measure the energy consumption easily that way.

    I am using a 25 litre fermentation vessel for the digester - I got one with a screw on cap rather than a bucket so I can run it at slightly elevated pressure if it starts to make gas.
    For simplicity I got a 1 m2 electric underfloor heating blanket to heat the vessel.  I will use an electro-mechanical thermostat as a protection device in case the electronic temperature controller I will build loses its marbles and tries to melt the vessel.


    To start with I just wrapped the blanket around the vessel.

    But before I tested it I realised that this approach is no good - the vessel will not be full of liquid, so I do not want the heating element all the way up the sides.

    So, I removed the heating element from the underfloor heating mat, and wrapped it around the bottom of the vessel instead.

    To improve heat transfer between the heating element and the vessel, I pushed as much silicone grease as I could get in around the element wires, then wrapped it in gaffer tape to make sure it all held together and I don't get covered in grease:

    It is looking promising now - the element gets warm, and the thermostat trips it out when it starts to get hot.  The dead band on the thermostat is too big to be useful for this application (it is about 10 degC), so I will just use it as an over-heat protection device, and use an Arduino microcontroller to control and log the temperature.
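    The control logic I have in mind for the Arduino is simple hysteresis around a 40 degC setpoint - sketched here in Python for clarity (the 1 degC dead band is a guess at what the electronics can manage; the real thing will be C on the Arduino):

```python
def heater_state(temp_c, heater_on, setpoint=40.0, deadband=1.0):
    """Hysteresis control: switch the heater on below setpoint - deadband/2,
    off above setpoint + deadband/2, otherwise keep the current state so the
    relay is not chattering on and off around the setpoint."""
    if temp_c < setpoint - deadband / 2:
        return True
    if temp_c > setpoint + deadband / 2:
        return False
    return heater_on
```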

    To get the proof of concept prototype working, I think I need to:
    • Sort out a temperature controller - will use an arduino and a solid state relay to switch the heater elements on and off.
    • Gas Handling - I will need to do something with the gas that is generated, while avoiding blowing up the house or garage - I have seen a recommendation somewhere to use an aluminised mylar balloon, which sounds like a good idea if I can find one.
    • Gas Composition Measurement - I will need to find out the proportion of methane to carbon dioxide that I am generating - still not sure how to do that.   It would be possible with a tunable IR laser diode, but not sure if that is feasible without spending real money.  Any suggestions appreciated!
    • Gas volume measurement - the other thing I am interested in is how much gas is generated - not sure how best to measure very low gas flow rates.  I am wondering about modifying a U-bend type airlock to detect how many bubbles pass through - maybe detect the water level changing before the bubble passes through.
    If this looks feasible, the next stages of development would be:
    • Automate gas handling to use the gas generated to heat the digester - success would be making it self-sustaining, so that it generates enough gas to keep itself warm.  That would mean a scaled-up version would produce excess gas that I could use for something else.
    • Think about how far I can scale it up - depends on what fuel to use - kitchen and 'soft' garden waste is limited, so might have to look for something else....
    Will post an update when I get it doing something.



    Saturday, 5 October 2013

    Using Raspberry Pi as an IP Camera to Analogue Converter

    I have an old-fashioned analogue TV distribution system in our house.   We use it for a video monitor for our disabled son so we can check he is ok.
    The quality of the analogue camera we use is not good, but rather than getting a new analogue one, I thought I should really get into digital IP cameras.
    I have had quite a nice IP camera with decent infra-red capabilities for a while (a Ycam Knight).   You can view the images and hear the audio on a computer, but it is not as useful as it working on the little portable flat panel TVs we have installed in a few rooms for the old analogue camera.

    I am trying an experiment using a Raspberry Pi to take the audio and video from the IP camera and convert them to analogue signals, so my old equipment can be used to view it.

    What we have is:

    • IP Camera connected to home network.
    • Raspberry Pi connected to same network.
    • Analogue video and audio signals from Pi connected to an RF modulator, which is connected to our RF distribution system.
    Using this I can tune the TVs on the RF distribution system to view the Raspberry Pi output.

    I set up the Pi to view the audio and video streams from the IP camera by using the omxplayer video player, which is optimised for the Pi.   I added the following to /etc/rc.local:
    omxplayer rtsp://192.168.1.18/live_mpeg4.sdp &
    Now when the Pi boots, it displays the video from the IP camera on its screen, which is visible to other monitors via the RF modulator.

    My concern is how reliable this will be - I tried earlier in the year and the Pi crashed after a few weeks with a completely mangled root filesystem, which is no good at all.   This time I am using a new Pi and new SD card for the filesystem, so I will see how long it lasts.
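    If the failures turn out to be omxplayer dying rather than the whole Pi locking up, a small supervisor script could restart it automatically - a sketch (the stream URL is the one above; a full system crash would still need a hardware watchdog):

```python
import subprocess
import time

STREAM_URL = 'rtsp://192.168.1.18/live_mpeg4.sdp'

def run_player(cmd):
    """Run the player once and return its exit code, or None if the
    executable is missing."""
    try:
        return subprocess.call(cmd)
    except OSError:
        return None

def watch(cmd=('omxplayer', STREAM_URL), delay=5):
    """Restart the player whenever it exits, pausing briefly in between."""
    while True:
        run_player(list(cmd))
        time.sleep(delay)
```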

    Sunday, 22 September 2013

    Human Power Meters

    I have just done a triathlon with my disabled son, Benjamin (team-bee.blogspot.co.uk)

    While we were training I started trying to calculate the energy requirements for the event, because I was worried about running out of glycogen before the end.   Most of the calculation methods cannot take account of weather - especially wind - so I am starting to wonder how to make a power meter for our bike.   I could either go for strain gauges in the cranks, which is likely to be difficult mechanically, or I wonder if I could just use my heart rate.
    I have just got a Garmin 610 sports watch with a heart rate monitor.  It uses a wireless protocol called ANT.  I'll have to look at how good heart rate is as a surrogate for power output.
    I may have to go to a gym to calibrate myself against a machine that measures work done...a winter project I think!
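    In the meantime, the standard physics estimate of cycling power does let wind into the calculation, even if it ignores gusts and drafting - a rough sketch, where the mass, rolling resistance and drag area values are guesses rather than measurements:

```python
def cycling_power(speed_ms, headwind_ms=0.0, total_mass_kg=160.0, grade=0.0,
                  crr=0.006, cda=0.6, rho=1.225, drivetrain_eff=0.95):
    """Rough watts needed to hold speed_ms (m/s) into a given headwind.

    total_mass_kg is a guess for rider, passenger and machine together;
    crr (rolling resistance coefficient) and cda (drag area, m2) are
    typical textbook values, not measured ones.
    """
    g = 9.81
    rolling = crr * total_mass_kg * g                       # tyre losses
    gravity = total_mass_kg * g * grade                     # climbing
    aero = 0.5 * rho * cda * (speed_ms + headwind_ms) ** 2  # air drag
    return (rolling + gravity + aero) * speed_ms / drivetrain_eff
```

Integrating this over the course speed profile would give the total work done, which is what matters for the glycogen question.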