Yi-Chia Chen

Online Experiments & Personal Website Workshop

Updated: 2021-09-16


[Example Experiment Code]

[Example Experiment]

[Example Personal Website Code]

[Example Personal Website]

Online Experiment Resources

Updated: 2020-10-06

Web Programming Tutorials

Programming & Debugging Tools

Experiment Tools

Subject Recruitment

Other Services

Other Tools

Useful Online Experiment References

Updated: 2021-08-24

Feel free to contact me with suggestions for papers to include.

Controlling Size

Brascamp, J. W. (2021). Controlling the spatial dimensions of visual stimuli in online experiments. Journal of Vision, 21(8):19.

Li, Q., Joo, S. J., Yeatman, J. D., & Reinecke, K. (2020). Controlling for participants’ viewing distance in large-scale, psychophysical online experiments using a virtual chinrest. Scientific Reports, 10:904.
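The card-matching idea behind this kind of calibration can be sketched in plain JavaScript: the participant resizes an on-screen rectangle until it matches a real card held against the display, which yields the display's pixel density. A minimal sketch, assuming the ISO/IEC 7810 ID-1 card width of 85.6 mm; the function names are illustrative, not from either paper:

```javascript
// Physical width of a standard ID-1 card (credit card), in millimeters.
const CARD_WIDTH_MM = 85.6;

// Once the participant has matched an on-screen rectangle to a real card,
// its width in CSS pixels tells us how many pixels span one millimeter.
function pixelsPerMm(matchedCardWidthPx) {
  return matchedCardWidthPx / CARD_WIDTH_MM;
}

// Convert a desired physical stimulus size into CSS pixels for this display.
function mmToPx(mm, matchedCardWidthPx) {
  return mm * pixelsPerMm(matchedCardWidthPx);
}

// Example: the card was matched at 320 px wide, and we want a 40 mm square.
const stimulusPx = mmToPx(40, 320);
```

Note that this only fixes physical size on the screen; controlling visual angle additionally requires an estimate of viewing distance (e.g., the blind-spot procedure described by Li et al., 2020).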

Controlling Color and Contrast

To, L., Woods, R. L., Goldstein, R. B., & Peli, E. (2013). Psychophysical contrast calibration. Vision Research, 90, 15-24.


Eye Tracking

Semmelmann, K., & Weigelt, S. (2018). Online webcam-based eye tracking in cognitive science: A first look. Behavior Research Methods, 50, 451-465.

RT Recording & Display Timing

Anwyl-Irvine, A., Massonnié, J., Flitton, A., Kirkham, N., & Evershed, J. K. (2020). Gorilla in our midst: An online behavioral experiment builder. Behavior Research Methods, 52, 388-407.

Anwyl-Irvine, A., Dalmaijer, E. S., Hodges, N., & Evershed, J. K. (2020). Online timing accuracy and precision: A comparison of platforms, browsers, and participant’s devices. PsyArXiv.

Bridges, D., Pitiot, A., MacAskill, M. R., & Peirce, J. W. (2020). The timing mega-study: Comparing a range of experiment generators, both lab-based and online. PsyArXiv.
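A recurring point in these timing comparisons is that browsers repaint on the display's refresh cycle, so stimulus durations are effectively quantized to whole frames. A minimal sketch of that arithmetic (the function names are illustrative, not taken from any of the platforms above):

```javascript
// Duration of one frame at a given refresh rate (e.g., 60 Hz -> ~16.67 ms).
function frameDurationMs(refreshRateHz) {
  return 1000 / refreshRateHz;
}

// A requested duration can only be shown for a whole number of frames;
// return the nearest achievable count and the duration it actually yields.
function quantizeDuration(targetMs, refreshRateHz) {
  const frame = frameDurationMs(refreshRateHz);
  const frames = Math.max(1, Math.round(targetMs / frame));
  return { frames, actualMs: frames * frame };
}

// Example: a "100 ms" stimulus on a 60 Hz display is 6 frames (exactly 100 ms),
// but a "30 ms" stimulus rounds to 2 frames (~33.3 ms).
const q100 = quantizeDuration(100, 60);
const q30 = quantizeDuration(30, 60);
```

Requesting durations in frames (and verifying achieved durations from frame timestamps) is one reason the platforms benchmarked above differ in precision.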

Data Quality

Armitage, J., & Eerola, T. (2020). Reaction time data in music cognition: Comparison of pilot data from lab, crowdsourced, and convenience web samples. Frontiers in Psychology, 10:2883.

Bartneck, C., Duenser, A., Moltchanova, E., & Zawieska, K. (2015). Comparing the similarity of responses received from studies in Amazon’s Mechanical Turk to studies conducted online and with direct recruitment. PLoS ONE, 10:e0121595.

Casler, K., Bickel, L., & Hackett, E. (2013). Separate but equal? A comparison of participants and data gathered via Amazon’s MTurk, social media, and face-to-face behavioral testing. Computers in Human Behavior, 29, 2156-2160.

Clifford, S., & Jerit, J. (2014). Is there a cost to convenience? An experimental comparison of data quality in laboratory and online studies. Journal of Experimental Political Science, 1, 120-131.

Dandurand, F., Shultz, T. R., & Onishi, K. H. (2008). Comparing online and lab methods in a problem-solving experiment. Behavior Research Methods, 40, 428-434.

Gould, S. J. J., Cox, A. L., Brumby, D. P., & Wiseman, S. (2015). Home is where the lab is: A comparison of online and lab data from a time-sensitive study of interruption. Human Computation, 2, 45-67.

Hauser, D. J., & Schwarz, N. (2015). Attentive Turkers: MTurk participants perform better on online attention checks than do subject pool participants. Behavior Research Methods, 48, 400-407.

Hilbig, B. E. (2015). Reaction time effects in lab- versus Web-based research: Experimental evidence. Behavior Research Methods, 48, 1718-1724.

de Leeuw, J. R., & Motz, B. A. (2015). Psychophysics in a web browser? Comparing response times collected with JavaScript and Psychophysics Toolbox in a visual search task. Behavior Research Methods, 48, 1-12.

Rodd, J. (2019). How to maintain data quality when you can't see your participants. APS Observer, 32(3). https://www.psychologicalscience.org/observer/how-to-maintain-data-quality-when-you-cant-see-your-participants

Zhou, H., & Fishbach, A. (2016). The pitfall of experimenting on the web: How unattended selective attrition leads to surprising (yet false) research conclusions. Journal of Personality and Social Psychology, 111, 493–504.
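Zhou and Fishbach (2016) show that dropout that differs across conditions can bias the remaining sample, so a basic diagnostic is to compute and report attrition per condition. A minimal sketch of that bookkeeping (the record shape and function name are illustrative assumptions, not from the paper):

```javascript
// Compute the dropout rate per condition from per-participant records.
// records: [{ condition: "easy", completed: true }, ...]
function attritionByCondition(records) {
  const tally = {};
  for (const { condition, completed } of records) {
    const t = (tally[condition] ??= { started: 0, dropped: 0 });
    t.started += 1;
    if (!completed) t.dropped += 1; // a dropout: started but never finished
  }
  const rates = {};
  for (const [cond, t] of Object.entries(tally)) {
    rates[cond] = t.dropped / t.started;
  }
  return rates;
}

// Example: 1 of 4 drops out of "easy" but 2 of 4 drop out of "hard" --
// exactly the asymmetry the paper warns can produce spurious conclusions.
const rates = attritionByCondition([
  { condition: "easy", completed: true },
  { condition: "easy", completed: true },
  { condition: "easy", completed: true },
  { condition: "easy", completed: false },
  { condition: "hard", completed: true },
  { condition: "hard", completed: true },
  { condition: "hard", completed: false },
  { condition: "hard", completed: false },
]);
```

Logging every participant who starts (not just those who finish) is what makes this comparison possible in the first place.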