Hi,
I am not 100% sure whether this is a bug or rather a feature request. For now, I could find a work-around, but if it is a bug, it might be relevant to more users in the future.
Describe the bug
It seems that split MEG files (automatically split by the acquisition software) cannot be automatically rejoined when the filename contains dashes (`-`). At least, when I tried to read a BIDS-formatted file, the combination of the split files failed.
Steps to reproduce
Run `mne.io.read_raw_fif(infiles, preload=False)` on a file with dashes in its name, e.g. `sub-<sub_no>_ses-<ses_no>_task-<task_id>_raw.fif`.
If necessary, I can upload two files as a MWE, but I suppose it will happen with any file as long as there is a dash in the filename.
Expected results
I expected both file parts, i.e. `...raw.fif` and `...raw-1.fif`, to be automatically detected and read.
Actual results
Only the first part was read; the second one was ignored.
Additional information
I think I have located the problem in `_get_next_fname` in https://github.com/mne-tools/mne-python/blob/master/mne/io/open.py. In this function, in line 83, MNE checks whether this is the first file or one of the subsequent parts by locating dashes in the filename. If `idx2 == -1`, there are no dashes and the filename is the first part (`..._raw.fif`); if `idx2 != -1` and the part of the filename between `idx2` and `idx` is a number, then this is already a subsequent file (`..._raw-[n].fif`), and a potential next filename (`..._raw-[n+1].fif`) is generated.
However, when the filename contains dashes in any other place (as is common for BIDS files), this procedure fails: when the filename of the first part is analysed, these dashes are detected, the text after the last one is not a number, and the next filename is set to `None`.
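To illustrate the failure mode, here is a simplified, hypothetical sketch of the dash-based detection (the function name and details are mine, not the actual code in `open.py`):

```python
# Simplified sketch of the dash-based split detection (illustrative
# only, not the actual MNE implementation).
def guess_next_fname(fname):
    """Guess the filename of the next split part, or None."""
    base = fname[:-len(".fif")]          # strip the extension
    idx = base.rfind("-")                # position of the last dash
    if idx == -1:
        # No dash at all: this is the first part, next is "...-1.fif"
        return base + "-1.fif"
    part = base[idx + 1:]
    if part.isdigit():
        # "...-<n>.fif": generate "...-<n+1>.fif"
        return base[:idx] + "-%d.fif" % (int(part) + 1)
    # A dash was found, but what follows is not a number -> give up.
    return None

# Works for plain names:
print(guess_next_fname("test_raw.fif"))    # test_raw-1.fif
print(guess_next_fname("test_raw-1.fif"))  # test_raw-2.fif
# Fails for BIDS names: the dash in "task-rest" is found, "rest_raw"
# is not a number, so None is returned and the second split part is
# never looked for.
print(guess_next_fname("sub-01_task-rest_raw.fif"))  # None
```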
As a quick fix, to check whether I understood the problem, I used a different character for split-file detection (`#` instead of `-`), and then dashes in the filename no longer cause problems.
Therefore, the solution would entail making the procedure robust against arbitrary dashes in the filename. I propose a fix as a pull request, though I am not sure whether it is sufficiently robust.
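As a sketch of what such a fix could look like (hypothetical names, not the actual proposed patch), one could anchor the check to the end of the filename, so that only a trailing `-<n>` suffix right before the extension counts as a split marker:

```python
# Hypothetical robust variant: a dash is only treated as a split
# marker when it is immediately followed by digits and the extension.
import re

_SPLIT_RE = re.compile(r"^(?P<base>.*)-(?P<num>\d+)\.fif$")

def next_split_fname(fname):
    """Return the expected filename of the next split part."""
    match = _SPLIT_RE.match(fname)
    if match:
        # "...-<n>.fif" -> "...-<n+1>.fif", regardless of other dashes
        return "%s-%d.fif" % (match.group("base"),
                              int(match.group("num")) + 1)
    # No trailing "-<n>" suffix: this is the first part
    return fname[:-len(".fif")] + "-1.fif"

# Dashes elsewhere in a BIDS-style name no longer confuse the check:
print(next_split_fname("sub-01_task-rest_raw.fif"))    # sub-01_task-rest_raw-1.fif
print(next_split_fname("sub-01_task-rest_raw-1.fif"))  # sub-01_task-rest_raw-2.fif
```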