Analysis of human ECoG and sEEG recordings


This tutorial provides information on how to analyze human electrocorticography (ECoG, recorded using subdural electrode grids and strips) and stereotactic EEG (sEEG, recorded using electrode shafts inserted into the brain), invasive brain recordings known for their high spatiotemporal precision.

It is important to emphasize that human ECoG and sEEG datasets are acquired solely in medical contexts, for clinical purposes, and come in different shapes and sizes. Some medical institutes use photography or X-ray (e.g., see Analysis of monkey ECoG recordings) to include anatomy in the analysis of the functional recordings, whilst others use CT (a 3D image reconstructed from a series of X-rays) and/or MR, or combinations thereof. The datasets used in this tutorial are therefore not necessarily representative of the datasets obtained in the field, but are intended to serve as a platform for thinking about and dealing with the challenges of analyzing this type of data.


We will be working on two different datasets, providing example strategies for both the anatomical and the functional processing of ECoG / sEEG data.

The first dataset, used to illustrate the anatomical processing of the data, was obtained by members of the Knight Lab (Arjen Stolk & Cal Dewar - Helen Wills Neuroscience Institute, UC Berkeley), and was recorded at the UC Irvine Medical Center. This dataset includes a pre-implantation MR and a post-implantation CT, the latter thus showing the recording electrodes.

The second dataset, used to illustrate the functional processing of the data, was recorded at the Comprehensive Epilepsy Center of the New York University School of Medicine and processed in collaboration with members of the Multisensory Integration Research Group (Charité - University Medicine Berlin). This dataset includes neural recordings from an electrode grid, with an experimental manipulation that illustrates the spatiotemporal precision of this type of recording. We will repeat code to select the trials and preprocess the data as described in the time-frequency analysis tutorial. We assume that the reader is already familiar with preprocessing and time-frequency analysis.

Processing of anatomical data


The flowchart below globally captures the steps involved in the anatomical and functional processing of ECoG / sEEG data for a single subject. In the current example, we are going to fuse the subject's CT with the subject's MRI, such that the electrodes localized in the CT will be in the same coordinate space as the MRI. Given that the MRI generally captures the brain's outline and structure in more detail than the CT, this allows us, in the end, to visualize the neural traces on top of or inside the brain.

Setting up

The following will use the anatomical MRI and CT of SubjectUCI29, which can be downloaded together with the other tutorial data. The functions described in this part of the tutorial use toolboxes that are located under the fieldtrip/external folder. You do not have to add these toolboxes yourself, but it is important that you set up your MATLAB path properly. You can read about how to set up your MATLAB path here.

addpath <path to fieldtrip directory>
ft_defaults % sets up the FieldTrip path, including the external toolboxes

Electrode localization

By fusing the CT with the MRI, we can localize electrodes in the former that will then be in the same coordinate space as the latter. In the tutorial dataset, the MRI and CT have already been preprocessed and co-registered (step 1 and 2).

1. Preprocess the MRI

We will first assign a coordinate system to the MRI, ensuring a coordinate space identical to that of the CT. For convenience we choose the Talairach coordinate space, a convention that FreeSurfer is also familiar with. Realigning to the Talairach coordinate system requires marking a set of landmarks, namely the Anterior and Posterior Commissure (see here for their exact locations), a location along the positive midline (top center), and a location on the right side of the brain.

% read and align the MRI
mri             = ft_read_mri('xxx.nii'); % or a single file of a DICOM series

cfg             = [];
cfg.method      = 'interactive';
cfg.coordsys    = 'tal'; % anterior & posterior commissure, positive midline, and right side
mri = ft_volumerealign(cfg, mri);

% write to file, for later processing with FreeSurfer
ft_write_mri('SubjectUCI29_MR_preproc.mgz', mri.anatomy, 'transform', mri.transform, 'dataformat', 'mgz');
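
As a sanity check before moving on, you can inspect the realigned volume. This is a minimal sketch (not part of the original pipeline) that uses ft_sourceplot with its default orthogonal view; in the Talairach convention the origin should fall on the anterior commissure.

% optional check: browse the realigned MRI in orthogonal slices
cfg = [];
ft_sourceplot(cfg, mri); % click around; the position [0 0 0] should coincide with the anterior commissure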

2. Preprocess the CT and co-register to the MRI

Given that the Anterior and Posterior Commissure are usually not visible on the CT, we will first align the CT to an arbitrary coordinate system, and then convert it to the Talairach space. The CTF coordinate space is particularly useful here, as it allows the selection of fiducials that are recognizable on the CT, such as the left and right ears and the nasion. We additionally select a location along the positive midline (top center). We then fuse the CT with the MRI, which are now both described in the same coordinate space, using SPM12. Note: currently (February 2016) this requires downloading and adding SPM12 to the MATLAB path.

The option cfg.viewresult = 'yes' in ft_volumerealign will produce an interactive figure after the co-registration is completed, showing the pre-op MRI and the co-registered post-op CT/MRI superimposed. This can be used to judge whether the co-registration was successful. If it was unsuccessful, then either the initial realignment of the CT/MRI to the target coordinate system was not accurate enough, or the post-op CT/MRI contains too many artifacts and will likely need to be cleaned. After these issues are solved, the procedure can be rerun. It is important that this step is completed successfully, as the accuracy of the electrode locations with respect to the subject's anatomy depends strongly on it.

% read and align the CT
ct = ft_read_mri('xxx.nii'); % or a single file of a DICOM series

cfg             = [];
cfg.method      = 'interactive'; % nasion, left & right ear, and positive midline
ct = ft_volumerealign(cfg, ct);
ct = ft_convert_coordsys(ct, 'tal'); % convert ctf to spm/tal coordsys

% co-register the CT to the MRI
cfg             = [];
cfg.method      = 'spm';
cfg.spmversion  = 'spm12';
cfg.viewresult  = 'yes'; % view realignment result
ct = ft_volumerealign(cfg, ct, mri);

% write to file, as a backup
ft_write_mri('SubjectUCI29_CT_coreg.nii', ct.anatomy, 'transform', ct.transform, 'dataformat', 'nifti_spm');

3. Localize electrodes in the CT

Now we can localize electrodes in the CT. The ft_electrodeplacement function allows the a-priori specification of electrode labels from the recording file. Load the pre-computed header structure (SubjectUCI29_hdr.mat), or read the header from the recording file as shown below, and assign the label vector to cfg.channel. Within the ft_electrodeplacement environment, clicking an electrode label assigns the crosshair location to it. Enabling the 'magnet' option facilitates the selection procedure by moving the crosshair to the voxel with the greatest intensity within a certain radius. Double-clicking a previously assigned electrode label removes its marker.

Open SubjectUCI29_grids.png, which shows the layout and numbering order of the electrode grids. The convention for sEEG electrodes is to start numbering with the deepest electrodes.

% place and assign the electrodes
hdr             = ft_read_header('xxx.edf'); % recording file

cfg             = [];
cfg.channel     = hdr.label; % these will be assigned to a location
elec = ft_electrodeplacement(cfg, ct);

% write to file, for later visualization
save('SubjectUCI29_elec.mat', 'elec');

One can also load the MRI to perform the electrode localization, or to check the anatomical location of previously selected electrode coordinates. Note that cfg.magtype is specified as 'trough' here, as electrodes typically appear pitch dark in the (post-implantation) MRI. This complicates the localization of the electrodes in the MRI, which is why we used the CT for electrode localization above.

% check: plot elec positions on the MRI and possibly adjust
cfg             = [];
cfg.elec        = elec; % plot the previously identified electrodes on the mri
cfg.magtype     = 'trough';
elec = ft_electrodeplacement(cfg, mri);

Surface rendering

We will invoke FreeSurfer to create a brain surface model that is based on a description of the cortical sheet. Essentially, we will construct a triangulated cortical mesh (>100000 vertices per hemisphere), ideally consisting of a number of approximately equally sized triangles that form a topological sphere for each of the cerebral hemispheres. This latter feature is particularly convenient for the creation of an inflated cortex and/or intersubject realignment.

The input to the surface construction is the mgz file created in the MRI preprocessing step described above. The output is a collection of files, including processed anatomical volumes and, as we will visualize below, a pial surface model for each hemisphere. Note that the surface reconstruction takes up to 10 hours and requires a FreeSurfer installation. For the purpose of this tutorial, you may want to skip this step for now and use the pre-computed pial surface provided with the tutorial data.

% execute in the unix/osx terminal (substitute < and > by path locations)
export FREESURFER_HOME=<path to FreeSurfer>
source $FREESURFER_HOME/SetUpFreeSurfer.sh

mri_convert -c -oc 0 0 0 <SubjectUCI29_MR_preproc.mgz> <temp.mgz>

% here $DATA_PATH should point to the converted temp.mgz and $S_NAME to the FreeSurfer subject name
recon-all -i $DATA_PATH -subject $S_NAME -all

Read in the pial surface mesh generated by FreeSurfer, and plot the electrodes on top of it.

% plot pial surface
mesh = ft_read_headshape('SubjectUCI29_lh.pial');
ft_plot_mesh(mesh, 'facecolor', [0.781 0.762 0.664], 'EdgeColor', 'none')
view([-90 25])
lighting gouraud
material shiny

% plot electrodes
hs = ft_plot_sens(elec, 'style', 'ko');
set(hs, 'MarkerFaceColor', 'k', 'MarkerSize', 6);
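
The same approach can be used to visualize the other surfaces produced by FreeSurfer, for instance the inflated cortical sheet mentioned above. A minimal sketch, assuming the standard FreeSurfer output directory layout (the exact path depends on where recon-all wrote its results):

% plot the inflated left-hemispheric surface created by FreeSurfer
mesh_infl = ft_read_headshape('<path to FreeSurfer subject dir>/surf/lh.inflated');
figure
ft_plot_mesh(mesh_infl, 'facecolor', [0.781 0.762 0.664], 'EdgeColor', 'none')
view([-90 25])
lighting gouraud
material shiny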

Processing of functional data

In this section, we will demonstrate how to analyze functional brain activity in ECoG data. The tutorial contains instructions and code for event-related potentials (ERP), high-gamma power (HGP, 80 - 200 Hz) and time-frequency analyses, including statistical analyses and visualization of the outcome. HGP is a measure of neural activity that is specific to ECoG data analysis. Because high-frequency signals are strongly attenuated by the tissue and bone that lie between source and sensor, high-gamma activity can hardly be found in scalp EEG data. In ECoG data, however, HGP (80 - 200 Hz) is a very prominent neural signature. HGP does not seem to be of an oscillatory nature but has rather been associated with the population neural spiking rate (Manning et al., 2009; Ray & Maunsell, 2011). It is also highly locally specific, compared with low-frequency activity and ERPs.

Data analysis

This dataset was recorded at the Comprehensive Epilepsy Center of the New York University School of Medicine and processed by members of the Clinical Neurophysiology Lab (Thomas Thesen) and the Multisensory Integration Research Group (Martin Krebber, Daniel Senkowski, Charité - University Medicine Berlin). (Support by the CRCNS Data Sharing Grant 01GQ1416 is gratefully acknowledged.)

The patient suffered from intractable epilepsy and underwent invasive monitoring to localize the epileptogenic zone for subsequent surgical removal. Electrode placement was solely guided by clinical considerations. During the recording the patient performed a visual localizer task that was used to identify electrodes responsive to certain types of visual input. Stimuli from the following 7 categories were presented in random order (event codes in parentheses): false font (3), house (4), object (5), texture (6), body (7), text (8), face (9).

The data for this part of the tutorial can be downloaded along with the other tutorial datasets.

In line with the first tutorial section, you can use the following code for surface rendering and the plotting of electrodes. Note that in this dataset many steps are skipped and we are just plotting the result of the anatomical coregistration and electrode placement.

%% load electrode locations
fid = fopen('NY394_fmri_coor_T1.txt');
elec_info = textscan(fid, '%s %f %f %f %s');
fclose(fid);

% create FieldTrip electrode structure
elec         = [];
elec.label   = elec_info{1};
elec.elecpos = [elec_info{2} elec_info{3} elec_info{4}];
elec.unit    = 'mm';

%% load pial surface

% load the surface file provided with the dataset; it should define the
% variables 'coords' (vertices) and 'faces' (triangles)

% create FieldTrip surface mesh structure
mesh      = [];
mesh.pos  = coords;
mesh.tri  = faces;
mesh.unit = 'mm';

%% plot surface and electrodes
ft_plot_mesh(mesh, 'facecolor', [0.781 0.762 0.664], 'EdgeColor', 'none')
view([90 25])
lighting gouraud
material shiny

% plot electrodes
hs = ft_plot_sens(elec, 'style', 'ko', 'label', 'on');
set(hs, 'MarkerFaceColor', 'k', 'MarkerSize', 6);

1. Data preprocessing

First, we will load the data and segment them into trials using ft_preprocessing. Event information has already been extracted from the trigger channels and stored in my_trl_r1.mat. The segmentation of continuous data based on triggers is described in detail in one of the preprocessing tutorials.

% load trial info
load('my_trl_r1.mat'); % contains the trial definition my_trl_r1

% load and segment data
cfg            = [];
cfg.dataset    = 'NY394_VisualLoc_R1.edf';
cfg.trl        = my_trl_r1;
cfg.continuous = 'yes';
epoch_data = ft_preprocessing(cfg);

The data still contain some channels that are not required for the further analysis (e.g. 'Pulse' and 'ECG' channels). We will use ft_selectdata to select only channels that have the 'EEG' prefix. Only these data will be submitted to subsequent analysis steps.

cfg         = [];
cfg.channel = 'EEG*'; % select the 'EEG' channels
epoch_data = ft_selectdata(cfg,epoch_data);

Artifact rejection can be done by visually inspecting individual trials and channels, or by using summary statistics that are calculated across trials and channels (see tutorial here). We will first visually reject bad channels by browsing through the data channel-wise using ft_rejectvisual with the method 'channel'. You will notice that the data from channel 23 appear very noisy after about a quarter of the trials. This is probably a technical artifact due to a bad electrode contact. Therefore, the channel should be marked as bad.

cfg         = [];
cfg.method  = 'channel'; % browse through the data channel by channel
cfg.channel = 'all';
epoch_data_clean_chan = ft_rejectvisual(cfg, epoch_data);

For rejecting artifact trials, we will use the 'summary' method in ft_rejectvisual. Identifying artifact trials in ECoG is similar to EEG analysis and can be done according to the tutorial on visual artifact rejection. Note that ECoG data typically have higher amplitudes and better signal-to-noise ratios compared with data from scalp EEG, because they are recorded directly from the cortex. Still, a number of technical and physiological artifacts can be present in the data. Due to the clinical - and therefore less rigorously controlled - environment during the recording process, technical artifacts are quite common. The present dataset is relatively clean and, hence, does not need much rejection. Some moderate outliers can be found for the metrics maxabs, zvalue and maxzvalue.

cfg         = [];
cfg.method  = 'summary'; % summary statistics across channels and trials
cfg.channel = 'all';
epoch_data_clean = ft_rejectvisual(cfg, epoch_data_clean_chan);

2. Re-reference the data

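The details of this step depend on the montage and on clinical considerations. A common choice for ECoG grid recordings is a common average reference over the artifact-free channels; the sketch below illustrates how this could be done with ft_preprocessing (these are not the exact settings used for this dataset, and the subsequent steps in this tutorial simply continue with epoch_data_clean):

% sketch: common average reference across the remaining EEG channels
cfg              = [];
cfg.reref        = 'yes';
cfg.refchannel   = 'all'; % average of all channels that survived artifact rejection
cfg.refmethod    = 'avg';
epoch_data_reref = ft_preprocessing(cfg, epoch_data_clean); % hypothetical output, not used below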

3. ERP and HGP analysis

3.1 calculate and plot ERPs

The analysis of event-related potentials is done in accordance with the standard ERP tutorial. Here, we will calculate and compare the ERPs of two conditions ('object' and 'face'). In the parameters of ft_timelockanalysis we use the preprocessing options to apply filters to the data (30 Hz low-pass, 1 Hz high-pass). The high-pass filter reduces slow drifts while the low-pass filter eliminates high-frequency noise. Note that the exact filter settings depend on the ERP component under investigation. Baseline correction is subsequently done with respect to the time interval of -300 ms to -50 ms.

% calculate ERPs
cfg                  = [];
cfg.keeptrials       = 'yes'; % keep trials for statistics
cfg.preproc.lpfilter = 'yes';
cfg.preproc.lpfreq   = 30;    % smooth ERP with low-pass filter
cfg.preproc.hpfilter = 'yes';
cfg.preproc.hpfreq   = 1;     % reduce slow drifts
cfg.preproc.detrend  = 'yes';
cfg.trials = find(epoch_data_clean.trialinfo == 3); % select only 'object' trials (event code 3)
ERP_object = ft_timelockanalysis(cfg, epoch_data_clean);
cfg.trials = find(epoch_data_clean.trialinfo == 7); % select only 'face' trials (event code 7)
ERP_face   = ft_timelockanalysis(cfg, epoch_data_clean);
% baseline correction
cfg          = [];
cfg.baseline = [-.3 -.05];
ERP_object_bl = ft_timelockbaseline(cfg,ERP_object);
ERP_face_bl   = ft_timelockbaseline(cfg,ERP_face);

For plotting the data we select channel 'IO_03', located in or close to the fusiform face area, which is known to respond strongly to facial stimuli. In agreement with the literature, the ERPs appear to be larger for 'face' compared to 'object' stimuli.

cfg           = [];
cfg.parameter = 'avg';
cfg.xlim      = [-.3 .6];
cfg.channel   = 'EEG IO_03-REF'; % other responsive channels: 'EEG PT_04-REF', 'EEG IO_02-REF', 'EEG IO_04-REF', 'EEG SO_01-REF', 'EEG SO_02-REF', 'EEG SO_03-REF'
figure, ft_singleplotER(cfg,ERP_object_bl,ERP_face_bl)

3.2 calculate and plot HGPs

To calculate HGP, we will first perform ft_freqanalysis over the high-gamma frequency range (80 - 200 Hz). Then, we will manually correct for the 1/f power dropoff and average the data over the frequency dimension to get the HGP time course for each channel. The resulting data structure resembles ERP data and can be further processed using the ft_timelockxxx functions. Finally, the HGP data are baseline-corrected using ft_timelockbaseline.

% time-frequency analysis
cfg            = [];
cfg.method     = 'tfr';
cfg.keeptrials = 'yes';
cfg.toi        = epoch_data_clean.time{1}; % keep full temporal resolution
cfg.foi        = [80 85 90 95 100 105 110 115 125 130 135 140 145 150 155 160 165 170 175 185 190]; % 80 - 190 Hz, leave out harmonics of 60 Hz
cfg.width      = 10 * ones(1,length(cfg.foi));
cfg.trials = find(epoch_data_clean.trialinfo == 3); % select 'object' trials
TFR_object = ft_freqanalysis(cfg,epoch_data_clean);
cfg.trials = find(epoch_data_clean.trialinfo == 7); % select 'face' trials
TFR_face   = ft_freqanalysis(cfg,epoch_data_clean);
% create HGP as empty timelock structure with same dimensions as ERP, values will be filled in in the next steps
HGP_object = rmfield(ERP_object,{'trial','avg','var'});
HGP_face   = rmfield(ERP_face,{'trial','avg','var'});
% correct for the 1/f dropoff
freqcorr = reshape(TFR_object.freq.^2,[1 1 length(TFR_object.freq)]); %this vector accounts for the 1/f dropoff
% use repmat to create a matrix the same size as the TFR data
freqcorr_object = repmat(freqcorr,[size(TFR_object.powspctrm,1) size(TFR_object.powspctrm,2) 1 length(TFR_object.time)]);
freqcorr_face   = repmat(freqcorr,[size(TFR_face.powspctrm,1) size(TFR_face.powspctrm,2) 1 length(TFR_face.time)]);
% multiply data with freqcorr matrix and average over frequencies
HGP_object.trial = squeeze(nanmean(TFR_object.powspctrm(:,:,:,:) .* freqcorr_object,3));
HGP_face.trial   = squeeze(nanmean(TFR_face.powspctrm(:,:,:,:) .* freqcorr_face,3));
% calculate mean and variance
HGP_object.avg = squeeze(nanmean(HGP_object.trial,1));
HGP_object.var = squeeze(nanvar(HGP_object.trial,1));
HGP_face.avg   = squeeze(nanmean(HGP_face.trial,1));
HGP_face.var   = squeeze(nanvar(HGP_face.trial,1));
% baseline correction
cfg          = [];
cfg.baseline = [-.3 -.05];
HGP_object_bl = ft_timelockbaseline(cfg,HGP_object);
HGP_face_bl   = ft_timelockbaseline(cfg,HGP_face);
clear TFR*

Again, we decided to plot channel 'IO_03'. As was the case for the ERPs, the HGP response at this channel was larger for 'face' stimuli than 'object' stimuli.

cfg           = [];
cfg.parameter = 'avg';
cfg.xlim      = [-.3 .6];
cfg.channel   = 'EEG IO_03-REF'; % other responsive channels: 'EEG PT_03-REF', 'EEG PT_04-REF', 'EEG IO_02-REF', 'EEG IO_04-REF', 'EEG SO_01-REF', 'EEG SO_02-REF', 'EEG SO_03-REF'
figure, ft_singleplotER(cfg,HGP_object_bl,HGP_face_bl)

3.3 calculate timelock statistics

In the next analysis step, we statistically compare the responses of 'objects' vs. 'faces' for both ERP and HGP data to test which electrodes respond preferentially to one or the other stimulus class. To account for multiple comparisons, we use the cluster-based permutation approach as implemented in ft_timelockstatistics. When it comes to statistics, one distinctive feature of ECoG data is that, due to the high local specificity of the recorded activity, no spatial information is exploited for clustering. Consequently, the cfg.neighbours parameter is specified as an empty matrix.

cfg                  = [];
cfg.latency          = [0 .6];
cfg.parameter        = 'trial';
cfg.method           = 'montecarlo';
cfg.correctm         = 'cluster';
cfg.neighbours       = []; % no spatial information is exploited for statistical clustering
cfg.numrandomization = 500;
cfg.statistic        = 'indepsamplesT'; % independent samples t-test for statistical testing on the single-trial level
cfg.channel          = 'all';
cfg.alpha            = 0.05;
cfg.correcttail      = 'alpha';
cfg.design           = [ones(1,size(ERP_object_bl.trial,1)), 2*ones(1,size(ERP_face_bl.trial,1))];
cfg.ivar             = 1; % the condition labels are in the first row of cfg.design
stats_ERP = ft_timelockstatistics(cfg,ERP_object_bl,ERP_face_bl);
stats_HGP = ft_timelockstatistics(cfg,HGP_object_bl,HGP_face_bl);

We first check whether the cluster statistic revealed significant channels. If there are significant channels, we will plot them.

% look for significant channels in ERP stats
[chans time] = find(stats_ERP.mask)

% none of the channels contain significant differences between conditions

Although the visual inspection of the data indicated stronger ERP responses for 'face' stimuli in an electrode near the fusiform face area, the statistical analysis of ERPs did not reveal significant effects at any channel. Thus, we move on to the statistics of the HGP data.

% look for significant channels in HGP stats
[chans time] = find(stats_HGP.mask);
chans = unique(chans)

There are six channels with significantly different HGP time courses when comparing the ‘objects’ and ‘faces’ conditions. We will plot each of these channels using ft_singleplotER.

% first, average over trials, otherwise we'll have problems with ft_singleplotER
cfg = [];
HGP_object_bl = ft_timelockanalysis(cfg, HGP_object_bl)
HGP_face_bl   = ft_timelockanalysis(cfg, HGP_face_bl)
% add statistical mask to data
time_idx = find(HGP_object_bl.time == stats_HGP.time(1)) : find(HGP_object_bl.time == stats_HGP.time(end)); % find indices of timepoints in data corresponding to timepoints in stats
HGP_object_bl.mask = false(size(HGP_object_bl.avg));
HGP_object_bl.mask(:,time_idx) = stats_HGP.mask;
% plot the ERP traces
cfg               = [];
cfg.parameter     = 'avg';
cfg.xlim          = [-.3 .6];
cfg.maskparameter = 'mask';
% loop over significant channels and plot
for ichan = 1:length(chans)
    cfg.channel = chans(ichan);
    figure, ft_singleplotER(cfg, HGP_object_bl, HGP_face_bl)
end

The analysis revealed stronger HGP in response to 'faces' compared with 'objects' in some channels (IO3, SO2), while other channels show the opposite pattern (PT3, PT6, IO2, SO4). This finding - together with the absence of significant effects in ERPs - suggests that HGP might be more sensitive to the type of visual stimulus presented than ERPs. The results are also in line with the notion that HGP represents highly locally specific activity whereas ERPs and low frequency oscillations are more widespread.

4. Time-frequency analysis

4.1 calculate and plot TFRs

To get an idea of the spectral content of the ECoG dataset, we will calculate time-frequency representations (TFRs) using ft_freqanalysis, and then baseline-correct the data to reflect the relative change from baseline using ft_freqbaseline. More information about time-frequency analysis can be found in the time-frequency analysis tutorial. In the current dataset, we decided to analyze the frequency range from 2 to 200 Hz. To save computational resources, we increase the frequency steps at higher frequencies.

% time-frequency analysis
cfg            = [];
cfg.method     = 'tfr';
cfg.output     = 'pow';
cfg.keeptrials = 'yes'; % keep trials for statistics
cfg.foi        = [4:2:40 44:4:100 108:8:200]; % make the frequency spacing broader with higher frequencies
cfg.toi        = -.3:.02:.6;
cfg.trials = find(epoch_data_clean.trialinfo==3); % select 'object' trials
TFR_object = ft_freqanalysis(cfg, epoch_data_clean);
cfg.trials = find(epoch_data_clean.trialinfo==7); % select 'face' trials
TFR_face   = ft_freqanalysis(cfg, epoch_data_clean);
% baseline correction
cfg              = [];
cfg.baseline     = [-.3 -.05];
cfg.baselinetype ='relchange';
TFR_object_bl = ft_freqbaseline(cfg, TFR_object);
TFR_face_bl   = ft_freqbaseline(cfg, TFR_face);

To visualize the time-frequency data from the two conditions, we will plot TFRs for some representative channels using ft_singleplotTFR.

cfg           = [];
cfg.parameter = 'powspctrm';
cfg.xlim      = [-.3 .6];
cfg.zlim      = [-10 10];
cfg.channel   = 'EEG IO_03-REF'; % other responsive channels: 'EEG PT_03-REF', 'EEG PT_04-REF', 'EEG IO_02-REF', 'EEG IO_04-REF', 'EEG SO_01-REF', 'EEG SO_02-REF', 'EEG SO_03-REF'
figure, ft_singleplotTFR(cfg,TFR_object_bl)
figure, ft_singleplotTFR(cfg,TFR_face_bl)

4.2 Time-frequency statistics

In the next step, we statistically compare the TFRs of the 'object' and 'face' conditions using cluster-based permutation statistics, as implemented in ft_freqstatistics. The statistical approach is presented in more detail in one of the statistics tutorials.

cfg                  = [];
cfg.latency          = [0 .6];
cfg.parameter        = 'powspctrm';
cfg.method           = 'montecarlo';
cfg.correctm         = 'cluster';
cfg.neighbours       = [];
cfg.numrandomization = 500;
cfg.statistic        = 'indepsamplesT';
cfg.channel          = 'all';
cfg.alpha            = 0.05;
cfg.correcttail      = 'alpha';
cfg.clusteralpha     = 0.05;
cfg.design           = [ones(1, size(TFR_object_bl.powspctrm,1)), 2*ones(1, size(TFR_face_bl.powspctrm,1))];
cfg.ivar             = 1;
stats_TFR = ft_freqstatistics(cfg,TFR_object_bl,TFR_face_bl);

Finally, we will plot the masked t-values from significant channels of the statistical contrast.

% find significant channels
sig_tiles = find(stats_TFR.mask); % find significant time-frequency tiles
[chan freq time] = ind2sub(size(stats_TFR.mask),sig_tiles); % transform linear indices to subscript to extract significant channels, timepoints and frequencies
chan = unique(chan);
% plot TFRs
cfg               = [];
cfg.parameter     = 'stat';
cfg.maskparameter = 'mask';
cfg.maskalpha     = .4; % opacity value for non-significant parts
cfg.zlim          = [-4 4];
cfg.colorbar      = 'yes';
% loop over channel sets and plot
for ichan = 1:length(chan)
    cfg.channel = chan(ichan);
    figure, ft_singleplotTFR(cfg, stats_TFR)
end

Similar to the outcome of the HGP analysis, power differences are found in some occipital channels (IO, SO and PT). For instance, channel IO3 responds more strongly to 'face' stimuli, whereas channel IO2 responds more strongly to 'object' stimuli. Notably, the power differences are predominantly found in the high-gamma band range. This suggests that local HGP is highly sensitive to the presented stimulus type.

Data visualization

1. TFR/ERP Multiplot

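To get an overview of all channels at once, a 2D layout can be generated from the electrode positions and passed to ft_multiplotER or ft_multiplotTFR. A minimal sketch, assuming the elec structure and the baseline-corrected data from the previous sections; note that the channel labels in the layout have to match those in the functional data and, depending on the naming in the electrode file, may need to be adjusted first:

% create a 2D layout from the 3D electrode positions
cfg      = [];
cfg.elec = elec;
lay = ft_prepare_layout(cfg);

% plot the TFRs of the 'object' condition for all channels in the layout
cfg           = [];
cfg.layout    = lay;
cfg.parameter = 'powspctrm';
cfg.xlim      = [-.3 .6];
figure, ft_multiplotTFR(cfg, TFR_object_bl)

% the ERPs of both conditions can be overlaid in the same way
cfg.parameter = 'avg';
figure, ft_multiplotER(cfg, ERP_object_bl, ERP_face_bl)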

2. Surface rendering

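The statistical effects can finally be placed back into their anatomical context by rendering them on the pial surface. The sketch below is an illustration rather than the original code for this section: it highlights the channels with a significant HGP effect on the mesh, and assumes that the channel indices in chans can be matched to rows of elec.elecpos (the label conventions of the electrode file and the recording may differ and require mapping first):

% render the pial surface
figure
ft_plot_mesh(mesh, 'facecolor', [0.781 0.762 0.664], 'EdgeColor', 'none')
view([90 25])
lighting gouraud
material shiny

% overlay all electrodes in black and mark the significant HGP channels in red
hold on
plot3(elec.elecpos(:,1), elec.elecpos(:,2), elec.elecpos(:,3), ...
  'ko', 'MarkerFaceColor', 'k', 'MarkerSize', 6);
plot3(elec.elecpos(chans,1), elec.elecpos(chans,2), elec.elecpos(chans,3), ...
  'ro', 'MarkerFaceColor', 'r', 'MarkerSize', 8);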
