
# ERP Data Processing: Recommendations


How you process your data prior to analysis will affect the power of your mass univariate analyses. In the case of approximate permutation tests, it will also affect the Type I error rate (since noisier data will have a higher Type I error rate).

Of course, the decisions you make about how to process your data are important no matter how you analyze your data. Here I mention only two such decisions, because they have particular importance for mass univariate analyses. For more general advice, I recommend Steve Luck's *An Introduction to the Event-Related Potential Technique*.

## Sampling rate

A higher sampling rate means more data points and thus a potentially larger multiple comparisons problem. How this affects mass univariate analysis depends on which correction you use.

The cluster mass and FDR approaches will probably not be substantially affected by a higher sampling rate. FDR correction depends on the proportion of significant effects (i.e., significant electrodes/time points divided by total electrodes/time points), and this proportion isn't likely to change much with sampling rate. The sampling rate probably will also not substantially affect the cluster mass test: if two adjacent time points at a sampling rate of 250 Hz are in a cluster, the time point between them at 500 Hz will probably also be included (assuming there is little high-frequency noise; see below). Clusters will be larger, but their rankings probably won't change much. It is important to note, however, that the effects of sampling rate on these techniques have not been systematically tested.
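
To see why the proportion of small p-values is what matters, here is a minimal MATLAB sketch of the Benjamini-Hochberg FDR procedure (this is not the toolbox's own code; the p-values and the proportion of true effects are made up for illustration):

```matlab
% Minimal sketch of Benjamini-Hochberg FDR correction on hypothetical p-values.
% Doubling the number of tests (e.g., by doubling the sampling rate) while the
% proportion of small p-values stays the same barely moves the BH threshold.

p = rand(1, 1000);            % p-values for 1000 electrode/time point tests
p(1:100) = p(1:100) / 50;     % make roughly 10% of tests show real effects

q = 0.05;                     % desired false discovery rate
m = numel(p);
[p_sorted, sort_idx] = sort(p);

% BH step-up rule: find the largest k such that p_(k) <= (k/m)*q
below = p_sorted <= (1:m) / m * q;
k = find(below, 1, 'last');

sig = false(1, m);
if ~isempty(k)
    sig(sort_idx(1:k)) = true;  % the k smallest p-values are significant
end
fprintf('%d of %d tests significant at FDR q = %.2f\n', sum(sig), m, q);
```

Because the BH criterion compares the k-th smallest p-value to (k/m)q, scaling the number of tests m while holding the proportion of true effects constant leaves the effective threshold essentially unchanged.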

The Fmax approach, however, is affected by sampling rate. A higher sampling rate means more F statistics per permutation, so the maximum F will tend to be larger; this shifts the Fmax null distribution upward, raises the critical value, and decreases power. If you plan to use the Fmax approach, you should downsample your data. Something around 100 Hz will provide enough temporal resolution in most cases.
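
As a rough illustration (this is not the toolbox's permutation code, and it assumes independent tests, which overstates the effect for autocorrelated EEG data), the Fmax critical value grows with the number of tests:

```matlab
% Hedged illustration: the maximum of more null F statistics tends to be
% larger, so the Fmax critical value rises with the number of time points.
rng(1);
n_perm = 5000;
df1 = 1; df2 = 29;                 % e.g., a one-way effect with 30 subjects

for n_tests = [50 100 200 400]     % number of electrode/time point tests
    % Draw independent null F statistics (real EEG data are autocorrelated
    % across time points, which would shrink the differences shown here)
    Fs = frnd(df1, df2, n_perm, n_tests);
    Fmax = max(Fs, [], 2);
    fprintf('n_tests = %4d: Fmax critical value = %.2f\n', ...
            n_tests, prctile(Fmax, 95));
end
```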

Note that this is not about the sampling rate during data collection: you can downsample from the original sampling rate either in your data processing software or with the MUT function decimateGND.
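
For example, something like the following should downsample a GND variable by an integer factor (the file name is hypothetical, and you should check decimateGND's help text for the exact arguments your toolbox version accepts):

```matlab
% Downsample a GND variable from 250 Hz to 125 Hz with the MUT function
% decimateGND; the decimation factor must be an integer
load('my_experiment.GND', '-mat');   % hypothetical file name
GND = decimateGND(GND, 2);           % 250 Hz / 2 = 125 Hz
```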

Whatever approach you use, a lower sampling rate means your analyses will run faster and the output will be easier to view and work with. Given that you rarely need more than 100 Hz or so of temporal precision, downsampling may be worth it no matter which correction you use, especially for complex designs with a large number of permutations.

For further discussion, see the Mass Univariate Toolbox documentation, Groppe et al. (2011a), and Luck (2014).

## High-frequency noise and filtering

Because individual time points are examined, mass univariate analyses are more affected by high-frequency noise than mean amplitude analyses (averaging across time points is itself a kind of low-pass filter).
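
To see the filtering effect of averaging, note that a mean amplitude over a time window is a convolution with a boxcar kernel; this MATLAB sketch (the window length and sampling rate are arbitrary example values) plots the resulting frequency response:

```matlab
% Taking the mean over a time window is equivalent to filtering with a
% boxcar (moving-average) kernel, which attenuates high frequencies.
srate = 250;                      % Hz (arbitrary example)
win_ms = 100;                     % mean amplitude window length
n = round(win_ms / 1000 * srate); % samples in the window
b = ones(1, n) / n;               % boxcar kernel = n-point mean

[h, f] = freqz(b, 1, 512, srate); % frequency response of the mean
plot(f, abs(h));
xlabel('Frequency (Hz)'); ylabel('Gain');
title('A 100 ms mean amplitude acts as a low-pass filter');
```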

If you plan to use mass univariate analyses, you should minimize high-frequency noise as much as possible (e.g., eliminate sources of electrical noise, make sure participants are still and relaxed).

You should also apply a low-pass filter to your data before analysis to remove high-frequency noise. The effects of low-pass filters on power have not been systematically tested. Lower filter cut-offs are likely to increase power in general, at least for long-latency components. On the other hand, lower filter cut-offs reduce temporal precision, increase component overlap, and may reduce effect sizes for short-lived components (e.g., the early sensory response).
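
If your pipeline uses EEGLAB, a low-pass filter could be applied with pop_eegfiltnew, as below; the 30 Hz cutoff is an illustrative choice, not a recommendation for every dataset:

```matlab
% Apply a 30 Hz low-pass filter to continuous or epoched EEGLAB data.
% The cutoff is illustrative; choose it based on your components and
% research questions, and inspect the filter's properties before using it.
EEG = pop_eegfiltnew(EEG, [], 30);   % empty low cutoff = low-pass only
```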

The ideal filter will depend on your data and research questions, so it is important that you understand the effects of the filter you are applying. For an excellent introduction to filtering, I recommend Steve Luck's *An Introduction to the Event-Related Potential Technique*.