Self review checklist

Here is a checklist to work through as you prepare to submit your study. Lookit staff will look through these categories as well during internal review, so you can avoid unnecessary rounds of review by thinking through each item before submission.

Title

  • Is it short and memorable?

    Note: it’s fine for a title to be cute but not super informative, as long as it has something to do with the content or purpose of the study and could be used to distinguish it from other similar studies. One of its most important functions is to be memorable enough that parents can say “Hey, I did the XYZ study and [you should too! / I have a question about when my gift card will arrive / etc.]”

    Feel free to suggest any clever ideas you have!

  • Would a follow-up study, another study from the same group, or another study using the same method fall naturally under the same title? If so, the name should be more specific.

  • Does it include information that’s covered elsewhere, e.g. eligibility criteria or duration? Don’t duplicate this here. (E.g., “XYZ for 5-month-olds” or “XYZ: a 10-minute study about cats” should just be “XYZ.”)

Examples

Good examples:

  • “A fish, THE fish, MY fish” (babies’ understanding of articles)
  • “Baby see, baby do” (neonatal imitation)
  • “Cue the music!” (singing in a language cueing attention to that language)
  • “Is two (voices) a crowd?” (kids’ ability to understand speech with one/many background voices)
  • “Little drummers” (toddlers’ production of rhythm)
  • “Does your baby know what you are thinking?” (infant theory of mind)

Examples of titles we asked for more specific versions of:

  • “Thinking about actions”
  • “Look at the pictures”
  • “Emotion learning study”

Thumbnail image

  • If people are pictured and they’re white, do they need to be? (It’s easy to “default to” pictures of white kids, and too many researchers do.)
  • Is this image relevant to the study? (E.g., does it show a child in the target age range, or images the child may actually see during the study?)
  • Do you have rights to use this image royalty-free, either based on the license or because you created or bought rights to the image?

Short description / “What happens”

  • Does this clearly explain to parents what they/their child will be doing during this experiment in a few sentences?
  • Does it contain unnecessary detail (e.g. exact numbers or timing of trials) or information covered in other fields (e.g. duration)? If so, remove it here and cover it only in the appropriate field; this avoids duplication and makes it easier to keep information consistent if it changes.
  • Does it adhere to the style guide?

Discoverable

When you start a new study, we recommend making it non-discoverable while you pilot. That way you can take your time fixing any issues that come up during piloting. Once it’s discoverable, it will be listed on the Lookit “studies” page and email announcements will be sent out to all families with a child who’s eligible.

  • If you want your study posted on the Lookit studies page once you start collecting data, is the “discoverable” box checked? If you don’t (because you want to pilot or recruit only from your own database), is it unchecked?

Researcher contact information

Exit URL

Purpose / “What we’re studying”

No seriously, please take a hard look at this section.

This is the single biggest challenge for researchers. Writing a clear and accurate description of the point of your study is HARD but worthwhile, and because this is a critical element of parent communication we’re sticklers for all of the below.

  • Is it truly accurate given the study design?

  • Does it explain what this particular study addresses (not a broader research program)? Example: the purpose of your study is probably not to understand how babies learn new words.

    Note: it’s completely ok if your study is not testing some novel question. You might be developing or testing out a measure to see if it works at all (e.g., “can we get parents to live-code infant looking?”), confirming that you can replicate a lab-based finding, norming stimuli for future work on X, trying to pin down a more precise estimate of an effect size for a model, collecting a dataset that will be used for a variety of exploratory work, etc. In that case, say that, and include a brief explanation as appropriate about the point of the related work (e.g., the work you’re replicating).

  • Does it explain why this matters? (This is not a grant agency - don’t stretch it - but “this hasn’t been studied before” isn’t a reason something matters. Neither, on its own, is “ability Y is used in skill X, which is important” a reason to find out whether infants can do Y.)

  • Is it readable by a bright high schooler without being condescending?

  • Does it adhere to the style guide?

Compensation

  • If providing compensation, have you included any conditions for payment (e.g., child needs to be in the age range, child needs to be visible at some point, only one payment per child)?
  • If providing compensation, have you included information about how long it will take to receive? (Make sure this is consistently stated throughout the study!)
  • Is compensation dependent on the child completing the study, or on the child’s behavior in any way? (This is generally not allowed per terms of use - check with us if you have questions.)
  • Be prepared to actually compensate people in that timeframe! If you’ve said three days, that means a participant from Friday needs to be paid by Monday. They may be counting on the money.

Eligibility description, min/max ages, eligibility criteria expression (all self-review & Lookit review only)

  • Are any eligibility criteria beyond age either language-based (e.g., speaking English or being bilingual) or rare (e.g., ASD)? We generally ask that other criteria be implemented as part of analysis, rather than preventing families from participating.
  • Don’t specify the age range in the criteria expression in addition to the min/max ages (it just introduces some potential for confusion if you later change one).
  • Is the participant eligibility description easy to understand? (E.g., translate ages into commonly-used terms; don’t say your study is for children between 56 and 70 weeks old.)
  • If participants can do the study more than once, is that clearly stated?
  • Sometimes it can be mildly complex to translate between an age range and its description. Please review the guidance on aligning ages to make sure your parent-facing description (e.g., “for 8-month-olds”) lines up with your min/max ages; there’s a small sketch of the arithmetic after this list.
  • Are any additional criteria in the eligibility criteria expression noted in the freeform description?
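
For reference, here’s a minimal sketch of the arithmetic involved in aligning a parent-facing age description with min/max ages. The cutoffs (e.g., treating “8-month-olds” as running up to, but not including, 9 months) and the days-per-month approximation are illustrative assumptions; follow the Lookit guidance on aligning ages for the convention actually used.

```python
# Illustrative only: assumes "8-month-olds" means from 8 months 0 days up to
# (but not including) 9 months, and approximates one month as 365.25 / 12 days.
# Check the Lookit age-alignment guidance for the convention actually used.
DAYS_PER_MONTH = 365.25 / 12

def age_bounds_in_days(min_months, max_months):
    """Rough day bounds for a description like 'for 8- to 10-month-olds'."""
    min_days = round(min_months * DAYS_PER_MONTH)
    max_days = round((max_months + 1) * DAYS_PER_MONTH) - 1
    return min_days, max_days

print(age_bounds_in_days(8, 8))   # day bounds for "8-month-olds"
```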

Duration

  • Have you made a realistic estimate of the duration of the study, including setup/consent and children’s responses, and confirmed it during peer review?

Protocol configuration

  • Is your study randomized correctly (e.g., is the right audio and video assigned to each condition)? Note: this is NOT something Lookit staff will confirm for you during review; we will generally run through one random condition, focusing on communication and any technical issues.
  • Do the audio and video play the way you want them to (e.g., is the video positioned where you intended on the screen)? Again, this is NOT something Lookit staff will confirm for you, as we don’t know how you wanted them to look!
  • Are all stimuli hosted at URLs starting with https://, not http://? (Insecure hosts aren’t allowed, for both security and performance reasons; see the sketch below for a quick way to scan your protocol for them.)
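
One quick way to catch insecure stimulus links before review is to scan your protocol configuration for http:// URLs. Below is a minimal sketch that assumes you’ve saved a local copy of your protocol JSON as protocol.json (the filename is just an example).

```python
import json

def find_http_urls(node, path="protocol"):
    """Recursively collect string values that start with http:// ."""
    hits = []
    if isinstance(node, dict):
        for key, value in node.items():
            hits.extend(find_http_urls(value, f"{path}.{key}"))
    elif isinstance(node, list):
        for i, value in enumerate(node):
            hits.extend(find_http_urls(value, f"{path}[{i}]"))
    elif isinstance(node, str) and node.startswith("http://"):
        hits.append((path, node))
    return hits

# "protocol.json" is a hypothetical local copy of your study's protocol configuration.
with open("protocol.json") as f:
    protocol = json.load(f)

for where, url in find_http_urls(protocol):
    print(f"Insecure URL at {where}: {url}")
```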

Version of experiment runner

  • Are you using a recent version of the experiment runner? (If not, why not?)

Initial setup

  • Are webcam setup & consent steps included? Does the information in the consent form make sense and avoid repetition?
  • Are these at the start of the study? If they come later, is there a good reason, and do they still come before any data collection (including video recording)?

Instructions

  • If children need to be visible or arranged a particular way, do you give the parent a chance to look at the webcam setup right before the study starts?
  • If parents are facing away or have their eyes closed, is it clear when they need to do that and when they can stop? Are there any points where it might seem like there’s a problem with the study if they can’t see what’s going on? Please ACTUALLY TRY your study following the directions given to parents.
  • Is it clear what you as a parent should be doing during the study?
  • Are the directions friendly? (i.e., you don’t want to sound demanding or condescending)
  • Do things “flow”? Are there abrupt transitions?
  • Are the instructions clear and straightforward (to the point you could read them while also supervising/holding a few children)? Is there ever an overwhelming amount of info on the screen at once?

Test trials

  • If possible, is there some indication to the parent of progress through the test trials, especially if the parent needs to be quiet or keep their eyes closed?
  • Have you run all your stimuli through a simulator like https://www.color-blindness.com/coblis-color-blindness-simulator/ to check whether kids with common forms of colorblindness will be able to see them? (Note that few parents of preschoolers and younger will know yet if their kids are colorblind. Even some adults find out by surprise!)
  • Is the audio clear enough to understand and reasonably well-balanced in volume throughout (e.g., not super-loud music paired with very quiet speech)? You can use software like Audacity to normalize your audio; the sketch below shows a quick way to spot imbalances.
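
To spot big volume imbalances across stimuli before hand-tuning in Audacity, you can batch-check average loudness. Below is a minimal sketch using pydub; the stimuli/ folder name, the .mp3 extension, and the 6 dB threshold are all illustrative assumptions.

```python
from pathlib import Path
from pydub import AudioSegment  # pip install pydub; needs ffmpeg for mp3/ogg

STIM_DIR = Path("stimuli")  # hypothetical folder holding your audio files

levels = {}
for audio_path in sorted(STIM_DIR.glob("*.mp3")):
    segment = AudioSegment.from_file(str(audio_path))
    levels[audio_path.name] = segment.dBFS  # average loudness in dBFS

for name, level in sorted(levels.items(), key=lambda item: item[1]):
    print(f"{name}: {level:.1f} dBFS")

if levels and max(levels.values()) - min(levels.values()) > 6:
    # 6 dB is an arbitrary threshold; adjust to taste.
    print("Warning: large loudness spread; consider normalizing these files.")
```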

Debriefing (after exit survey)

  • Did you clearly explain the point of the study again (as in the purpose field, this needs to actually get at why the question matters)?
  • Did you concretely walk through the study design and explain HOW the study will answer the question? This is the heart of the debriefing. Generally this will entail briefly explaining what happened during the study, what the dependent measure is and what it indexes if that’s not obvious, and an if-then prediction: e.g., if babies realize that she doesn’t know where the ball is, we expect them to look longer when she finds it right away, because that’s surprising!
  • Did you explain the multiple conditions if there was randomization?
  • Did you head off likely parental concerns or objections? E.g.:
    • There are many reasons a child might answer a particular way on any given trial (e.g., first/last option, favorite objects); that’s why we average over lots of kids and trial types.
    • Make sure parents know it’s OK if their child didn’t answer a particular way, look more or less on a given trial, or successfully perform some action.
  • Did you restate information about compensation and when to expect it? (Make sure this is stated consistently throughout the study.)
  • Did you link to someplace to learn more about this general topic if possible? (e.g. a TED talk, popular science article, website with more games, journal paper, other educational video, etc.)
  • You can use \n\n to add line breaks for readability and can insert links as <a href="https://…" target="_blank" rel="noopener">Cool Website</a>

General things to think about

  • Are any questions/tasks ambiguous or inappropriate for…

    • A single parent (due to choice, breakup/divorce, or death), an unmarried but partnered parent, a parent with a same-sex partner, a divorced parent who shares custody, a parent with more than one partner
    • A family that lost a child in infancy (e.g. “how many siblings” type questions)
    • Multiracial families (e.g. questions about race where it’s ambiguous whether you care about child, parent, or both)
    • Adoptive parents (e.g. questions about prenatal history)
    • A parent under 20 (e.g. educational background questions may be less informative measures)
    • A family of a child born very prematurely whose adjusted age does not match their chronological age, or who has developmental delays
    • A transgender parent or parent of a gender-nonconforming child
    • You / someone you know! (This is not meant as an exhaustive list, just some examples of places where questions sometimes reveal hidden assumptions.)

    In general, think about what information you actually need and ask for that specifically.

  • Are tasks/questions appropriate for the age range?

  • Is the study aesthetically pleasing to look at? (Remember, parents and children need to be able to stay engaged, and we don’t want things to come off as too “sterile.”)

  • Is all audio clear and easy to understand? Is it as engaging as possible (intonation, pauses, etc.) given the constraints of the study? (Sometimes we default to an unnecessarily flat tone.)