January 2023: Habit Update

We're still here! I've been fielding emails and making minor changes to Habit for some time.

This version fixes a bug that was present in all versions of Habit since 2.2.1. The bug affects only experiments that use specific settings, and it requires a precisely timed keypress during an experiment; even then, only the results output is affected, not the operation of your experiment. You can install this version over an existing one. You can also continue using older versions, but read about the precautions you should take to keep erroneous data out of your results.

A Habit 2 experiment in progress

About Habit

Habit (pronounced hə-BIT, as in "habituation") is software for looking-time experiments. It can be configured to present visual and auditory stimuli to subjects in habituation experiments, and a wide variety of configuration options allow it to model many experimental designs.

Habit was developed at the University of Texas under the direction of Les Cohen. The first version ran on Mac OS 9 and was a collaboration between Lisa Oakes and Les Cohen when she was a graduate student in his lab, back in the 20th century. It has been revised and reworked several times since then and has been used by a number of labs around the world to study infant perceptual and cognitive development.

Habit underwent a major revision beginning in 2013, when it was ported to Mac OS X; version 2.1.16 was released in 2015. After version 2.1.26 was released in 2017, another major upgrade took place, and with the release of version 2.2.1 in 2018, Habit ran on both Mac and Windows.

Ongoing development of Habit continues under the direction of Lisa Oakes of the Infant Cognition Lab at the UC Davis Center for Mind and Brain.

Citing Habit in publications: A methods paper describing Habit 2 in detail is now available. Please cite this paper when referencing Habit in your work.

Credits: Daniel J. Sperka (developer) was supported by NIH Vision Research Core Grant, P30EY012576. 

Photo credit: Debbie Aldridge