Arduino sound to TTL trigger for EEG


A recent query on the Psychtoolbox mailing list about triggers for EEG prompted me to write an explanation of the system we developed at York to do this. I use the method below for steady state EEG, but it should work just as well for ERP designs. The Arduino sound trigger was first constructed by my colleague Becky Gilbert, and she has kindly allowed me to share the design and her code.

It uses an Arduino UNO, with an audio jack socket wired to analog input pin A0 (and a ground pin), and a BNC socket wired to digital output pin 13. The board was mounted in a standard box. We had our technician drill holes in the box and mount the ports in it for stability. The whole thing is small (11×6.5cm), and powered from USB. Here is a picture with the lid off:


Arduino trigger

We then used the Arduino software to upload a script to the board (appended below). It’s a very simple bit of code that just monitors the analog input for changes in voltage above a specific threshold. When a voltage change happens it sends a brief trigger on the digital output, at the appropriate voltage for a TTL pulse. Note that on some systems this pulse is too brief to be detected. If this occurs, uncommenting the //delay(100); line after digitalWrite(outPin, HIGH) will extend the pulse.

I connect the jack socket to the headphone socket of the stimulus computer, and the BNC to the trigger socket on the EEG/MEG amplifier system. I use the PsychPortAudio commands in Psychtoolbox to produce a 50ms binary pulse:

    PsychPortAudio('FillBuffer', tr, ones(1,220));

which I play at stimulus onset (e.g. directly after a Flip command) using:

    PsychPortAudio('Start', tr);

I’ve tested the Arduino with an oscilloscope, and it responds within a millisecond of receiving the sound pulse. Of course, the timing accuracy of the sound pulse will depend on your computer hardware – see the PTB documentation and use PsychPortAudioTimingTest to check this. On most systems I’ve tried, the latency is very small indeed, so triggers should be completed well within one screen refresh.

I hope people find this useful, and particular credit goes to Becky (@BeckyAGilbert) for designing the system. If you have any problems, please leave comments below. Of course, producing the box requires some soldering, and we take no responsibility for any resulting injury, death etc. 😉



#define DEBUG 0 // change to 1 to debug
#define THRESHOLD 0.1 // adjust this to control sensitivity in detection of voltage increase

const int outPin = 13; // change output (D) pin number here
float prev_voltage = 0; // must be initialized to something

void setup() {
  if (DEBUG) {
    // initialize serial communication at 9600 bits per second
    // (to write debugging info back to terminal)
    Serial.begin(9600);
  }
  // set output pin
  pinMode(outPin, OUTPUT);
}

void loop() {
  // read input on analog pin 0
  int sensorValue = analogRead(A0); // change input (A) pin number here
  // convert analog reading (from 0-1023) to a voltage (0-5V)
  float voltage = sensorValue * (5.0 / 1023.0);
  if (DEBUG) {
    // print value to serial port (view in Tools > Serial Port Monitor)
    Serial.println(voltage); // This delays the trigger by
                             // about 6ms so remove before using
                             // during experiments
  }
  // Simplest algorithm ever.
  if (voltage > prev_voltage + THRESHOLD) {
    digitalWrite(outPin, HIGH);
    //delay(100); // uncomment to lengthen the pulse if it is too brief to detect
  }
  digitalWrite(outPin, LOW);
  prev_voltage = voltage;
}
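The threshold logic is simple enough to prototype off the board before flashing. Here is a minimal Python sketch of the same rising-edge rule (the function name and sample values are my own, for illustration only):

```python
# Rising-edge threshold detector, mirroring the logic of the Arduino
# loop: fire a trigger whenever the voltage exceeds the previous
# sample by more than THRESHOLD.

THRESHOLD = 0.1  # same role as the #define in the sketch

def detect_triggers(voltages, threshold=THRESHOLD):
    """Return the indices at which a trigger pulse would be sent."""
    triggers = []
    prev = 0.0
    for i, v in enumerate(voltages):
        if v > prev + threshold:
            triggers.append(i)
        prev = v
    return triggers

# A quiet line with one sound pulse starting at sample 3:
print(detect_triggers([0.0, 0.01, 0.02, 0.5, 0.52, 0.02, 0.01]))  # [3]
```

Raising THRESHOLD makes the detector less sensitive to background noise on the audio line, at the cost of possibly missing quiet pulses.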

Videos of ECVP noise symposium


A few weeks ago, at ECVP in Bremen, we held a symposium entitled “Visual noise: new insights”. The talks were varied and interesting, and have been recorded for the benefit of those who were unable to attend the symposium.

The videos are available via the following YouTube links:

Remy Allard
Stan Klein [Slides (with additional notes)]
Josh Solomon
Peter Neri
Keith May
Daniel Baker [Slides]

Details of the talks (including abstracts) are available here.

Many thanks to Udo Ernst and his team for hosting and filming the event, and Keith May for editing and uploading the videos.

On learning new motor skills


About four weeks ago, at the age of 31, I learned to ride a bike. This followed some six months of travelling around by tricycle:

Me on my tricycle


Although the trike is great, and really useful for transporting shopping and other things around in its basket, sometimes it just doesn’t go very fast. Also, drunk guys walking home from the pub find it hilarious – as do my friends:

Especially Mladen...


So anyway, last month I finally consented to undergo cycling lessons. Mostly, this involved a couple of hours in a car park after work trying to get both feet on the pedals without falling off. Eventually I managed, and then progressed to some advanced techniques, like using the gears and going round corners.

The interesting bit about learning to ride as an adult was how counterintuitive balancing on a bike turns out to be. If you feel like you’re about to topple off to the left, instinct says you should lean or steer to the right. That’s what you’d do if you were balancing on a beam or something. But on a bike, you do the opposite. Falling to the left is fixed by steering and leaning leftwards.

Apparently this is to do with physics. Imagine going round a corner on a bike. You lean into the curve, which causes the bike to accelerate round the bend (probably this is centrifugal force, but high school physics was a long time ago – it may have changed since then…). If you did the opposite, the whole thing would topple over. I found a nice explanation of some of this here.

Physics aside, the amazing thing is how quickly learning this counterintuitive rule becomes automatic. In fact, now I’ve got the hang of it, I don’t even know I’m doing it. Since most people learn to cycle as children, it must seem so much like second nature that they can’t understand how anyone wouldn’t be able to do it. Also, I definitely felt like I got better in my sleep, which is apparently pretty standard for consolidating motor skills.

I got quite excited about learning to ride, so I went and bought a shiny new bike. Then last week I did a 46k ride with some friends from work. I was surprised how much fun that was, and somehow ended up agreeing to do the Action Medical Research York 100k bike ride that took place yesterday. Coincidentally, it left from right outside the psychology department – the very place I’d learned to cycle a few weeks before!

The ride was mostly enjoyable, with a few tough hills, and lots of wind on the last leg. We had good weather though, and managed to get round all 100km (actually more like 106) in 6 hours 45 minutes. I feel much more confident after cycling all that way… Here’s a photo at the finish line:

Tess, Mark and I just after finishing.


Why can some people’s brains see better than others’?


On Friday I had a new paper published in the open access journal PLoS ONE. It addresses the question of why some people have better sensitivity to contrast (variations in light levels across an image) than others, sometimes by quite substantial amounts. Unlike differences in eyesight (acuity) that can usually be optically corrected, contrast sensitivity differences occur even for large (low frequency) stimuli that aren’t affected much by optical blur. Presumably then, the sensitivity differences are neural in origin. I was surprised that nobody had really tried to answer this question before, so I thought I should give it a go.

The paper is divided into two parts. The first section uses an equivalent noise technique to assess whether sensitivity differences are due to different amounts of noise, or a difference in the efficiency with which stimuli are processed. Although I rule out the latter explanation, the noise masking method cannot tease apart a difference in internal noise from a difference in contrast gain. So, the second part of the study looks at a large corpus of contrast discrimination data, collated from 18 studies in the literature. By looking at the between-subject differences in discrimination performance, I conclude that individual differences at threshold are primarily a consequence of differences in contrast gain. Whether this is due to differences in structure, anatomy, neurotransmitter levels or developmental factors is unclear at the moment.

Since I spent quite a long time putting together all of the dipper function data, I thought I should make it available online. Most of the data were extracted from the published figures using the excellent GraphClick program. The data can be downloaded here in Matlab format. They are organised into a cell array, with each of the 22 cells containing data from one experiment. Each cell is further divided into separate cells for each individual observer, with the ‘data’ array containing the x- and y-values used to produce these plots. I hope these data become a useful resource for other researchers interested in basic visual processes.

A first look at the Olimex EEG-SMT


Last week I ordered and received a small EEG device manufactured by a Bulgarian company called Olimex. Called the EEG-SMT, it is part of the OpenEEG project, and is a small USB device that looks like this:

The Olimex EEG device.


It has five audio jacks for connecting custom electrodes. The ground electrode is passive, and the other four electrodes are active and comprise two bipolar channels. The system is very basic, and at around €150 (including the electrodes) is obviously not going to compete with high end multi-channel EEG rigs.  But, I’m interested in running some steady state VEP experiments that can be run with a single channel, and in principle are quite robust to lower signal to noise ratios from lower quality equipment. Given the price, I thought it was worth a shot.

Although there are several PC packages capable of reading data from the device, I ideally want to integrate EEG recording into the Matlab code I use for running experiments. So, I decided to try and directly poll the USB interface.

The first stage was to install a driver for the device. I’m using a Mac running OSX 10.8, so I went with the FTDI virtual COM port driver. I also found it useful to check the device was working with this serial port tool. The driver creates a virtual serial port, the location of which can be discovered by opening a Terminal window and entering:

    ls -l /dev/tty.*

On my machine this lists a couple of bluetooth devices, as well as the serial address of the Olimex device:


Matlab has its own tool for polling serial ports (Serial). I was able to read from the device this way, but I found it less flexible than the IOPort function that comes with Psychtoolbox 3. The rest of this post uses that function.

First we open the serial port and give it a handle:

    [h,e] = IOPort('OpenSerialPort','/dev/tty.usbserial-A9014SQP');

Then we can set a few parameters, including the baud rate for data transmission, buffer size etc:


To start recording, we purge the buffer and then send this command.


We wait for a while, then we check how much data is waiting for us in the buffer and read it out into a vector:

    bytestoget = IOPort('BytesAvailable',h)
    [longdata,when,e] = IOPort('Read',h,1,bytestoget);

Finally, we stop recording, purge the buffer and close the port:


I had some trouble initially streaming data from the device. If you forget to purge the buffer it can cause your entire system (not just Matlab) to hang and restart. This is very annoying, and slows development progress.

Now that we have some data, we need to process it. The vector is a stream of bytes in packets of 17. We can separate it out like this:

    for n = 1:17
        parseddata(n,:) = longdata(n:17:end);
    end

And plot each signal separately:

Outputs from the Olimex serial interface


According to the device’s firmware, the first two plots are control lines that always output values of 165 and 90. This provides an anchor that lets us know the order of the signals. The next plot tells us the firmware version (version 2), and the fourth plot is a sample counter that increases by 1 each time the device samples the electrodes. The sampling happens at a fixed frequency of 256Hz, so 256 samples represent one second of activity. Plots 5-16 are the outputs of the electrodes (this is what we’re interested in), and I don’t really understand plot 17 yet.
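This packet layout can be captured in a short parsing function. The following Python sketch just illustrates the structure described above; the field names and example bytes are my own inventions, not taken from the device firmware:

```python
# Split one 17-byte EEG-SMT packet into labelled fields, following the
# layout described in the text: two fixed sync bytes (165, 90), a
# firmware version byte, a wrapping sample counter, 12 data bytes
# (6 channels x 2 bytes), and one trailing byte of unknown purpose.

def parse_packet(packet):
    if len(packet) != 17:
        raise ValueError("expected a 17-byte packet")
    if packet[0] != 165 or packet[1] != 90:
        raise ValueError("sync bytes not found - stream misaligned?")
    return {
        "version": packet[2],
        "counter": packet[3],     # wraps at 256; 256 samples = 1 second
        "data":    packet[4:16],  # the electrode outputs
        "trailer": packet[16],    # purpose unclear (see text)
    }

example = [165, 90, 2, 7, 1, 44, 1, 60, 0, 200, 1, 10, 0, 99, 1, 3, 0]
fields = parse_packet(example)
print(fields["version"], fields["counter"])  # 2 7
```

A real reader would also need to re-synchronise on the 165/90 anchor bytes if the stream happened to start mid-packet.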

Each channel gets 2 bytes (i.e. 16 bits), but only uses 10 of those bits. This means that to get the actual output, we need to combine the data from two adjacent bytes (paired by colour in the above plots). The data are in big-endian format, which means that the first byte contains the most significant bits and the second byte the least significant. We can combine them by converting each byte to binary notation, sticking them together, and then converting back:

    lineID = reshape(5:16, 2, 6)'; % adjacent byte pairs for the six channels
    for l = 1:6
        for m = 1:length(parseddata)
            trace(l,m) = bin2dec(strcat(dec2bin(parseddata(lineID(l,1),m),8),dec2bin(parseddata(lineID(l,2),m),8)))./1023;
        end
    end
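Equivalently, the big-endian byte pair can be combined with bit arithmetic rather than string conversion, which is faster and harder to get wrong. A small Python illustration (the function name and example values are mine):

```python
# Combine a big-endian byte pair into one 10-bit sample: the first
# (most significant) byte is shifted left by 8 bits and OR-ed with
# the second (least significant) byte.

def combine_bytes(high, low):
    """Return the raw value (0-1023 for a 10-bit ADC) from two bytes."""
    return (high << 8) | low

raw = combine_bytes(2, 200)  # high byte 0b10, low byte 0b11001000
print(raw)           # 712
print(raw / 1023.0)  # normalised to 0-1, as in the Matlab snippet
```

This gives the same values as the string-based version, provided each byte there is padded to a full 8 binary digits before concatenation.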

We now have six ten bit signals, which we can plot as follows:

Channel outputs


Although the waveforms look exciting, they aren’t very informative because most of what we’re seeing is an artefact from the ‘hum’ of AC mains electricity. We can see this if we examine the Fourier spectrum of one of our waveforms:

Example EEG fourier spectrum


It is clear that much of the energy is concentrated at 0 and at 50Hz. We can remove these using a bandpass filter that includes only frequencies between (approximately) 1 and 49Hz. Taking the inverse Fourier transform then gives us a more sensible waveform:

Bandpass filtered waveform

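The filtering step itself is only a few lines. Here is a rough Python (numpy) sketch of the same idea, zeroing the FFT bins outside roughly 1-49Hz and inverse-transforming, applied to synthetic data since the real recordings aren’t included here:

```python
import numpy as np

def bandpass(signal, fs, lo=1.0, hi=49.0):
    """Crude FFT bandpass: zero all frequency bins outside [lo, hi] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0  # removes DC drift and 50Hz hum
    return np.fft.irfft(spectrum, n=len(signal))

# Synthetic example: a 7Hz 'signal' buried in a DC offset and 50Hz
# mains hum, sampled at the device's 256Hz for 2 seconds.
fs = 256
t = np.arange(2 * fs) / fs
raw = 0.5 + np.sin(2 * np.pi * 7 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)
clean = bandpass(raw, fs)  # leaves (almost exactly) the 7Hz component
```

A narrow notch at 50Hz plus a high-pass would do much the same job; the brute-force FFT mask is just the simplest thing to write.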

Actually though, I’m more interested in what is happening in the frequency domain. This is because I want to run experiments to measure the response of visual cortex to gratings flickering at a particular frequency. However, there are some problems to overcome first. Critically, I don’t understand how the four active electrodes on the device map onto the six channel outputs that I read over the serial connection. They all seem to produce a signal, and my initial thought was that the first four must be the outputs of individual electrodes, and the final two the differences between positive and negative electrodes for channels 1 & 2. As far as I can tell, that isn’t what’s actually happening though. I have posted on the OpenEEG mailing list, so hopefully someone with experience of using these devices will get back to me.

If anyone is interested, I have put a version of the code outlined above here (with a few extra bells and whistles). Note that it may require some modifications on your system, particularly the serial address of the device. You will also need to have Matlab (or maybe Octave), Psychtoolbox and the driver software installed. Finally, your system may hang if there are problems, and I hereby absolve myself of responsibility for any damage, loss, electrocution etc. that results from using my code. However, I’d be very interested to hear from anyone else using one of these devices!

New job, first month


So, at the beginning of the month I started working at York. It’s been a busy few weeks, meeting lots of people and finding out how things work. The department is great: people have been very friendly and welcoming. So far I’ve mostly been walking in to work, and the winter mornings have been (occasionally) lovely:

Mist on the Ouse


My new computers showed up this week and are nearly set up the way I want them. I managed to install Grace under OSX Mountain Lion, which was rather easier than I was expecting. The lab setup is coming together. I’ve now got two very sturdy Headspot chin rests all the way from Houston and a couple of height adjustable tables are on their way.

Also, excitingly, I ordered a small EEG device (EEG-SMT) from a Bulgarian company called Olimex. It’s a very basic open source design, but should hopefully be good enough for measuring VEPs with. It arrived on Friday, and I’ve managed to stream data from it into Matlab using the USB port (and the Psychtoolbox IOPort command). Working out how to interpret that data might take a little work, and I’ll likely produce a blog post once I’ve cracked it. I think the device has lots of potential if it produces clean enough data, and is easily affordable (at well under £200) for anyone interested.

At the start of this month I went to the EPS meeting in London. It was the first one I’ve been to and I really enjoyed it. There was much more vision on the program than I was expecting, and I met some really interesting people. I’ll definitely start going to more of the meetings and maybe also join the society.

Last day at Aston


So, today is my last day on campus at Aston. I’m amazed at how quickly the last three and a half years have passed; it feels like no time at all since I was starting back here after postdoccing in Southampton. Still, it’s been a productive time, and on balance I’m glad I chose to come back here rather than do something else.

This morning I made a Wordle from the text of all the papers I’ve published. I might put it on my new website:


The other day we went for some leaving drinks at the Bull, which went like this:

Lots of people at the Bull to celebrate me leaving.


At the moment our house is full of boxes. Laura is off work today to do some last minute packing (mostly of her craft materials) and hopefully sell her car. Then tomorrow some burly men will arrive and load everything into a lorry and take it to York. I’ll be there already (hopefully) with the cats to tell them where to put everything.

Next week I’m giving a talk at the AVA Christmas meeting in London, and then I’m back in London again just after New Year for the EPS meeting. It’s the first one I’ve been to, and I’m looking forward to going to a more general psychology conference. I’ll need to make my talk less geeky though!

Lastly, John Cass and I submitted our first collaborative paper together yesterday. We sent it to a journal that will probably reject it without even bothering to review, but hey, it doesn’t hurt to aim high!