Welcome to new PhD students

05/10/2014

Our term started last week. Returning to the lab is Greta Vilidaitė, who completed a summer project with us last year and is now back as a PhD student. Her project will investigate abnormalities of neural noise in autism spectrum disorders. She is pictured here drinking some very strong mead from her native Lithuania.

Greta

Another new addition is Dave Coggan. Dave completed our Cognitive Neuroscience masters course last year, and now returns to start a PhD supervised by Tim Andrews and myself. He will study processes of mid-level vision using fMRI. He is pictured here singing at the annual ECR karaoke night.

Dave Coggan


Conferences and various things

22/03/2014

I’ve just finished a very busy term. It was my first term of proper teaching, which took up a lot more time than I was expecting. It seemed to go well though – I got some very positive feedback from the students, as well as some hilarious comments (“Iron your shirts!!!” being my favourite…).

I’ve also been planning two conferences that we’re holding at York this year. The AVA meeting is in a few weeks’ time, on the 11th of April. We’ve got some excellent talks lined up, and have just finalised the programme. Also, the BACN meeting is taking place in September, so there’s still plenty of time to submit abstracts for that.

Last week I was up in Glasgow, where I gave a talk in the optometry department at Glasgow Caledonian. I also went to an excellent gig at King Tut’s Wah Wah Hut, where my old band played a few times about 15 years ago. We saw two bands from the 90s: Republica and Space. It looked like this:

Space at King Tut’s

Other than that, I’ve had a couple of papers out this year, and am working on some more. I’m also anxiously awaiting news of a BBSRC grant proposal I put in back in September. I got some very positive reviews in January, and should hear next month whether it’s been funded. Fingers crossed!


Arduino sound to TTL trigger for EEG

22/10/2013

A recent query on the Psychtoolbox mailing list about triggers for EEG prompted me to write an explanation of the system we developed at York to do this. I use the method below for steady state EEG, but it should work just as well for ERP designs. The Arduino sound trigger was first constructed by my colleague Becky Gilbert, and she has kindly allowed me to share the design and her code.

It uses an Arduino UNO, with an audio jack socket wired to the analog input pin A0 (and a ground pin), and a BNC socket wired to the digital output pin 13. The board was mounted in a standard box. We had our technician drill holes in the box and mount the ports in it for stability. The whole thing is small (11×6.5cm), and powered from USB. Here is a picture with the lid off:


Arduino trigger

We then used the Arduino software to upload a script to the board (appended below). It’s a very simple bit of code that just monitors the analog input for changes in voltage above a specific threshold. When a voltage change happens, it sends a brief trigger on the digital output, at the appropriate voltage for a TTL pulse. Note that on some systems this pulse is too brief to be detected. If this occurs, uncommenting the //delay(100); line after the digitalWrite(outPin, HIGH) line will extend the pulse.

I connect the jack socket to the headphone socket of the stimulus computer, and the BNC to the trigger socket on the EEG/MEG amplifier system. I use the PsychPortAudio commands in Psychtoolbox to produce a 50ms binary pulse:

PsychPortAudio('FillBuffer', tr, ones(1,220));

which I play at stimulus onset (e.g. directly after a Flip command) using:

PsychPortAudio('Start', tr);
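Putting those pieces together, a minimal end-to-end sketch looks something like the following. The window pointer win, the default audio device and the default sample rate are assumptions here, so adjust for your own setup (and note that the pulse duration depends on the sample rate):

% Minimal sketch: open audio, fill the buffer with a pulse, fire it at stimulus onset
InitializePsychSound(1);                        % request low-latency audio
tr = PsychPortAudio('Open', [], 1, 1, [], 1);   % default device, playback only, mono, default sample rate
PsychPortAudio('FillBuffer', tr, ones(1,220));  % brief binary pulse

Screen('Flip', win);                            % stimulus onset (win is your PTB window pointer)
PsychPortAudio('Start', tr);                    % trigger fires almost immediately after the flip

% ... run the trial ...
PsychPortAudio('Close', tr);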

I’ve tested the Arduino with an oscilloscope, and it responds within a millisecond of receiving the sound pulse. Of course, the timing accuracy of the sound pulse itself will depend on your computer hardware – see the PTB documentation and use PsychPortAudioTimingTest to check this. On most systems I’ve tried, the latency is very small indeed, so triggers should be completed well within one screen refresh.

I hope people find this useful, and particular credit to Becky (@BeckyAGilbert) for designing the system. If you have any problems, please leave comments below. Of course, producing the box requires some soldering, and we take no responsibility for any resulting injury, death etc. ;-)


#define DEBUG 0 // change to 1 to debug

#define THRESHOLD 0.1 // adjust this to control sensitivity in detection of voltage increase

const int outPin = 13;    // change output (D) pin number here
float prev_voltage = 0.0; // previous voltage reading; must be initialized to something

void setup() {
  if (DEBUG) {
    // initialize serial communication at 9600 bits per second (to write debugging info back to terminal)
    Serial.begin(9600);
  }
  // set output pin
  pinMode(outPin, OUTPUT);
}

void loop() {
  // read input on analog pin 0
  int sensorValue = analogRead(A0); // change input (A) pin number here
  // convert analog reading (0-1023) to a voltage (0-5V)
  float voltage = sensorValue * (5.0 / 1023.0);

  if (DEBUG) {
    // print value to serial port (view in Tools > Serial Port Monitor)
    Serial.println(voltage); // This delays the trigger by
                             // about 6ms so remove before using
                             // during experiments
  }

  // Simplest algorithm ever.
  if (voltage > prev_voltage + THRESHOLD) {
    digitalWrite(outPin, HIGH);
    //delay(100); // uncomment to lengthen the pulse if your amplifier misses it
  } else {
    digitalWrite(outPin, LOW);
  }

  prev_voltage = voltage;
}

Videos of ECVP noise symposium

24/09/2013

A few weeks ago, at ECVP in Bremen, we held a symposium entitled “Visual noise: new insights”. The talks were varied and interesting, and have been recorded for the benefit of those who were unable to attend the symposium.

The videos are available via the following YouTube links:

Introduction
Remy Allard
Stan Klein [Slides (with additional notes)]
Josh Solomon
Peter Neri
Keith May
Daniel Baker [Slides]
Discussion

Details of the talks (including abstracts) are available here.

Many thanks to Udo Ernst and his team for hosting and filming the event, and Keith May for editing and uploading the videos.


On learning new motor skills

19/08/2013

About four weeks ago, at the age of 31, I learned to ride a bike. This followed some six months of travelling around by tricycle:

Me on my tricycle

Although the trike is great, and really useful for transporting shopping and other things around in its basket, sometimes it just doesn’t go very fast. Also, drunk guys walking home from the pub find it hilarious – as do my friends:

Especially Mladen…

So anyway, last month I finally consented to undergo cycling lessons. Mostly, this involved a couple of hours in a car park after work trying to get both feet on the pedals without falling off. Eventually I managed, and then progressed to some advanced techniques, like using the gears and going round corners.

The interesting bit about learning to ride as an adult was how counterintuitive balancing on a bike turns out to be. If you feel like you’re about to topple off to the left, instinct says you should lean or steer to the right. That’s what you’d do if you were balancing on a beam or something. But on a bike, you do the opposite. Falling to the left is fixed by steering and leaning leftwards.

Apparently this is to do with physics. Imagine going round a corner on a bike. You lean into the curve, which causes the bike to accelerate round the bend (probably this is centrifugal force, but high school physics was a long time ago – it may have changed since then…). If you did the opposite, the whole thing would topple over. I found a nice explanation of some of this here.

Physics aside, the amazing thing is how quickly learning this counterintuitive rule becomes automatic. In fact, now I’ve got the hang of it, I don’t even know I’m doing it. Since most people learn to cycle as children, it must seem so much like second nature that they can’t understand how anyone wouldn’t be able to do it. Also, I definitely felt like I got better in my sleep, which is apparently pretty standard for consolidating motor skills.

I got quite excited about learning to ride, so I went and bought a shiny new bike. Then last week I did a 46km ride with some friends from work. I was surprised how much fun that was, and somehow ended up agreeing to do the Action Medical Research York 100k bike ride that took place yesterday. Coincidentally, it left from right outside the psychology department – the very place I’d learned to cycle a few weeks before!

The ride was mostly enjoyable, with a few tough hills, and lots of wind on the last leg. We had good weather though, and managed to get round all 100km (actually more like 106) in 6 hours 45 minutes. I feel much more confident after cycling all that way… Here’s a photo at the finish line:

Tess, Mark and I just after finishing.


Why can some people’s brains see better than others’?

28/07/2013

On Friday I had a new paper published in the open access journal PLoS ONE. It addresses the question of why some people have better sensitivity to contrast (variations in light levels across an image) than others, sometimes by quite substantial amounts. Unlike differences in eyesight (acuity) that can usually be optically corrected, contrast sensitivity differences occur even for large (low frequency) stimuli that aren’t affected much by optical blur. Presumably then, the sensitivity differences are neural in origin. I was surprised that nobody had really tried to answer this question before, so thought I should give it a go.

The paper is divided into two parts. The first section uses an equivalent noise technique to assess whether sensitivity differences are due to different amounts of noise, or a difference in the efficiency with which stimuli are processed. Although I rule out the latter explanation, the noise masking method cannot tease apart a difference in internal noise from a difference in contrast gain. So, the second part of the study looks at a large corpus of contrast discrimination data, collated from 18 studies in the literature. By looking at the between-subject differences in discrimination performance, I conclude that individual differences at threshold are primarily a consequence of differences in contrast gain. Whether this is due to differences in structure, anatomy, neurotransmitter levels or developmental factors is unclear at the moment.

Since I spent quite a long time putting together all of the dipper function data, I thought I should make it available online. Most of the data were extracted from the published figures using the excellent GraphClick program. The data can be downloaded here in Matlab format. They are organised into a cell array, with each of the 22 cells containing data from one experiment. Each cell is further divided into separate cells for each individual observer, with the ‘data’ array containing the x- and y-values used to produce these plots. I hope these data become a useful resource for other researchers interested in basic visual processes.
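For anyone who wants a quick start with the data, something like the following should pull out and plot one observer’s dipper function. This is only a sketch – the variable names, field names, filename and column order are my guesses, so check what is actually inside the .mat file (e.g. with whos) and adjust accordingly:

load('dipperdata.mat');   % hypothetical filename - use whatever the downloaded file is called
expt = dipperdata{1};     % one of the 22 experiments (assumed variable name)
obs = expt{1};            % one observer within that experiment
x = obs.data(:,1);        % pedestal contrast (assumed column order)
y = obs.data(:,2);        % discrimination threshold
loglog(x, y, 'o-');
xlabel('Pedestal contrast'); ylabel('Threshold');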


A first look at the Olimex EEG-SMT

31/01/2013

Last week I ordered and received a small EEG device manufactured by a Bulgarian company called Olimex. Called the EEG-SMT, it is part of the OpenEEG project, and is a small USB device that looks like this:

The Olimex EEG device.

It has five audio jacks for connecting custom electrodes. The ground electrode is passive, and the other four electrodes are active and comprise two bipolar channels. The system is very basic, and at around €150 (including the electrodes) it is obviously not going to compete with high-end multi-channel EEG rigs. But I’m interested in running some steady-state VEP experiments, which need only a single channel and should in principle be quite robust to the lower signal-to-noise ratio of cheaper equipment. Given the price, I thought it was worth a shot.

Although there are several PC packages capable of reading data from the device, I ideally want to integrate EEG recording into the Matlab code I use for running experiments. So, I decided to try and directly poll the USB interface.

The first stage was to install a driver for the device. I’m using a Mac running OSX 10.8, so I went with the FTDI virtual COM port driver. I also found it useful to check the device was working with this serial port tool. The driver creates a virtual serial port, the location of which can be discovered by opening a Terminal window and entering:

    ls -l /dev/tty.*

On my machine this lists a couple of bluetooth devices, as well as the serial address of the Olimex device:

    /dev/tty.usbserial-A9014SQP

Matlab has its own tool for polling serial ports (the serial function). I was able to read from the device this way, but I found it less flexible than the IOPort function that comes with Psychtoolbox 3. The rest of this post uses that function.
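For comparison, a basic read using Matlab’s built-in serial object looks roughly like this (same port address and settings as below); from here on, though, everything uses IOPort:

    % Rough sketch using Matlab's built-in serial object instead of IOPort
    s = serial('/dev/tty.usbserial-A9014SQP', 'BaudRate', 57600);
    s.InputBufferSize = 65536;
    fopen(s);
    pause(3);                          % let some data accumulate
    data = fread(s, s.BytesAvailable); % read whatever has arrived
    fclose(s);
    delete(s);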

First we open the serial port and give it a handle:

    [h,e] = IOPort('OpenSerialPort','/dev/tty.usbserial-A9014SQP');

Then we can set a few parameters, including the baud rate for data transmission, buffer size etc:

    IOPort('ConfigureSerialPort',h,'BaudRate=57600');
    IOPort('ConfigureSerialPort',h,'BlockingBackgroundRead=0');
    IOPort('ConfigureSerialPort',h,'InputBufferSize=65536');
    IOPort('ConfigureSerialPort',h,'PollLatency=0.0039');

To start recording, we purge the buffer and then send these commands:

    IOPort('Purge',h);
    IOPort('ConfigureSerialPort',h,'StartBackgroundRead=1');

We wait for a while, then we check how much data is waiting for us in the buffer and read it out into a vector:

    WaitSecs(3);
    bytestoget = IOPort('BytesAvailable',h)
    [longdata,when,e] = IOPort('Read',h,1,bytestoget);

Finally, we stop recording, purge the buffer and close the port:

    IOPort('ConfigureSerialPort',h,'StopBackgroundRead');
    IOPort('Purge',h);
    IOPort('Close',h);

I initially had some trouble streaming data from the device. If you forget to purge the buffer, it can cause your entire system (not just Matlab) to hang and restart, which is very annoying and slows development progress.
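One habit that helps here: wrap the recording in a try/catch, so the port always gets purged and closed even if something goes wrong mid-read. A minimal sketch:

    try
        IOPort('Purge',h);
        IOPort('ConfigureSerialPort',h,'StartBackgroundRead=1');
        WaitSecs(3);
        [longdata,when,e] = IOPort('Read',h,1,IOPort('BytesAvailable',h));
    catch err
        disp(err.message);   % report the problem rather than leaving the port open
    end
    IOPort('ConfigureSerialPort',h,'StopBackgroundRead');
    IOPort('Purge',h);
    IOPort('Close',h);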

Now that we have some data, we need to process it. The vector is a stream of bytes in packets of 17. We can separate it out like this:

    for n = 1:17
        parseddata(n,:) = longdata(n:17:end);
    end

And plot each signal separately:

Outputs from the Olimex serial interface

According to the device’s firmware, the first two plots are control lines that always output values of 165 and 90. This provides an anchor that lets us know the order of the signals. The next plot tells us the firmware version (version 2), and the fourth plot is a sample counter that increases by 1 each time the device samples the electrodes. The sampling happens at a fixed frequency of 256Hz, so 256 samples represent one second of activity. Plots 5-16 are the outputs of the electrodes (this is what we’re interested in), and I don’t really understand plot 17 yet.
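As an aside, those fixed 165/90 bytes can also be used to align the stream before reshaping, in case a read starts partway through a packet. A quick sketch (note the 165/90 pair could in principle also occur by chance within the data, so a more careful parser would check that the pattern repeats every 17 bytes):

    % Align the byte stream to a packet boundary using the 165/90 sync bytes
    sync = find(longdata(1:end-1)==165 & longdata(2:end)==90, 1, 'first');
    aligned = longdata(sync:end);
    aligned = aligned(1:17*floor(length(aligned)/17));  % trim to whole 17-byte packets
    parseddata = reshape(aligned, 17, []);              % one packet per column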

Each channel gets 2 bytes (i.e. 16 bits), but only uses 10 of those bits. This means that to get the actual output, we need to combine the data from two adjacent bytes (paired by colour in the above plots). The data are in big-endian format, which means that the first byte contains the most significant bits, and the second byte the least significant. We can combine them by converting each byte to binary notation, sticking them together, and then converting back:

    % pad the low byte to 8 bits so the two binary strings concatenate correctly
    for l = 1:6
        for m = 1:size(parseddata,2)
            trace(l,m) = bin2dec(strcat(dec2bin(parseddata(lineID(l,1),m)),dec2bin(parseddata(lineID(l,2),m),8)))./1023;
        end
    end
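The same combination can be done arithmetically, without the string round-trip, which is a bit quicker and perhaps easier to follow. Here lineID is the same channel-to-byte-row lookup assumed by the loop above, and the double() calls just guard against integer saturation if the data came back as uint8:

    for l = 1:6
        % big-endian: high byte * 256 + low byte, then scale to the range 0-1
        trace(l,:) = (double(parseddata(lineID(l,1),:))*256 + double(parseddata(lineID(l,2),:))) ./ 1023;
    end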

We now have six ten bit signals, which we can plot as follows:

Channel outputs

Although the waveforms look exciting, they aren’t very informative because most of what we’re seeing is an artefact from the ‘hum’ of AC mains electricity. We can see this if we examine the Fourier spectrum of one of our waveforms:

Example EEG Fourier spectrum

It is clear that much of the energy is concentrated at 0, and at 50Hz. We can remove these using a bandpass filter, that includes only frequencies between (approximately) 1 and 49Hz. Taking the inverse Fourier transform then gives us a more sensible waveform:

Bandpass filtered waveform
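I haven’t posted the filtering code itself, but the general idea is just to zero the unwanted Fourier components and transform back. A crude brick-wall sketch (assuming one channel in a vector called signal, sampled at 256Hz):

    fs = 256;                                        % Olimex sample rate
    n = length(signal);
    f = (0:n-1)*fs/n;                                % frequency of each FFT bin
    ft = fft(signal);
    keep = (f>=1 & f<=49) | (f>=fs-49 & f<=fs-1);    % keep ~1-49Hz plus its mirror (conjugate symmetry)
    ft(~keep) = 0;                                   % zero DC, 50Hz mains and everything else
    filtered = real(ifft(ft));                       % back to the time domain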

Actually though, I’m more interested in what is happening in the frequency domain. This is because I want to run experiments to measure the response of visual cortex to gratings flickering at a particular frequency. However, there are some problems to overcome first. Critically, I don’t understand how the four active electrodes on the device map onto the six channel outputs that I read over the serial connection. They all seem to produce a signal, and my initial thought was that the first four must be the outputs of individual electrodes, and the final two the differences between positive and negative electrodes for channels 1 & 2. As far as I can tell, that isn’t what’s actually happening though. I have posted on the OpenEEG mailing list, so hopefully someone with experience of using these devices will get back to me.

If anyone is interested, I have put a version of the code outlined above here (with a few extra bells and whistles). Note that it may require some modifications on your system, particularly the serial address of the device. You will also need to have Matlab (or maybe Octave), Psychtoolbox and the driver software installed. Finally, your system may hang if there are problems, and I hereby absolve myself of responsibility for any damage, loss, electrocution etc. that results from using my code. However, I’d be very interested to hear from anyone else using one of these devices!

