Alex Irvine, MRC Cognition and Brain Sciences Unit, Cambridge
@alexanderirvine
My talk at BeOnline 2020 will be on the accuracy and consistency of visual presentation and the recording of keyboard responses online. If you are running research that involves presenting images for short durations, or that looks at small differences (less than a few hundred milliseconds) in reaction times, this talk will be of interest to you. The preprint paper I am presenting compares four platforms (Gorilla, jsPsych, PsychoPy/Pavlovia and lab.js) across 4 different web browsers and 4 different device categories.

It is split into two parts. In the first, we used a photodiode to detect the onset of stimuli on the screen, and a robotic actuator to press the keyboard at specific times. This allowed us to measure two key metrics: accuracy (the difference between expected and actual timing) and precision (the variability of these differences). The philosophy behind these testing rigs was to represent as closely as possible the devices that participants would have in their own homes, so we created the tasks on each platform as a normal user would, and used standard keyboards.

In the second part of the paper, we look at the demographics of a large sample of Gorilla participants (200,000 people): where they are, what platforms were used to recruit them, and what their devices looked like. This puts the timing results from the first part into context.

The results show that for most platforms and devices, timing was very good. The most common browser and device combination, Chrome on a PC, also showed sufficiently good timing. If you want to see the results in full, you should virtually attend my talk, and if you are really keen, find the preprint on PsyArXiv beforehand. Thanks, and I hope to see you there.
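To make the accuracy/precision distinction concrete, here is a minimal sketch of how those two metrics can be computed from paired expected and measured event times. The function name and the sample numbers are illustrative assumptions, not taken from the paper; the definitions (accuracy as the mean error, precision as the standard deviation of the error) follow the description above.

```python
import statistics

def timing_accuracy_precision(expected_ms, measured_ms):
    """Compute accuracy (mean error) and precision (SD of error)
    for paired expected vs. measured event times, in milliseconds."""
    errors = [m - e for e, m in zip(expected_ms, measured_ms)]
    accuracy = statistics.mean(errors)    # systematic offset from target
    precision = statistics.stdev(errors)  # trial-to-trial variability
    return accuracy, precision

# Hypothetical photodiode measurements for five 200 ms presentations
expected = [200, 200, 200, 200, 200]
measured = [216.7, 216.4, 217.1, 216.2, 216.6]
acc, prec = timing_accuracy_precision(expected, measured)
# Here the display is consistently ~16.6 ms late (accuracy), but the
# lag varies by well under 1 ms between trials (precision)
```

A platform can therefore be imprecise, inaccurate, or both: a constant lag mostly shifts all reaction times by the same amount, whereas high variability adds noise to every trial.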

