
Sync timestamps #1

Open · wants to merge 13 commits into base: main

Conversation

@agricolab (Member) commented Apr 7, 2019

As requested, I moved the pull request to the new repo structure according to sccn/xdf#28.

As mentioned there:

Allows resampling all streams (using interpolation for numeric values and shifting for strings) so that their timestamps are sample-wise in sync with the fastest stream.

Example:

```python
streams, fileheader = load_xdf(filename, sync_timestamps='linear')
```
This might help address #24 and mne-tools/mne-python#5180

I had to copy the code manually, as the history is broken, and I hope I did not miss anything. The tests ran fine.

@agricolab (Member, Author) commented Apr 7, 2019

On a side note, I added the feature before the ordering/recasting currently discussed in sccn/xdf#42. This implementation should therefore survive if a decision is made for the breaking change, but I'll have to adapt the tests.

@cboulay (Contributor) commented Apr 26, 2019

Can you make a trivial change and commit it? I'm hoping to trigger a CI build that will run the test suite. Maybe add a newline at the end of test_limit_on_synced_stamps.py.

@agricolab (Member, Author)

Seems to work!

@agricolab (Member, Author) commented Apr 26, 2019

The test in CI gave a warning: `pyxdf.py:230: DeprecationWarning: invalid escape sequence \R`. I also get this when I run pytest for the first time after a change; if I run the tests a second time, and from then on, the warning no longer pops up. The line 230 that causes this warning is the final `"""` of the load_xdf docstring. I haven't found a way to get rid of this warning. Might be a bug in pytest?
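For what it's worth, this pattern usually points at a backslash escape inside a non-raw docstring: CPython emits the warning at *compile* time, and because compiled bytecode is cached in `__pycache__`, the warning only shows on the first run after a change. A small self-contained sketch (not pyxdf code) that reproduces and fixes it:

```python
import warnings

def compile_warnings(src):
    """Compile ``src`` and return any warnings raised at compile time."""
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        compile(src, "<demo>", "exec")
    return caught

# The compiled source below literally contains \R (after un-escaping \\),
# which is an invalid escape sequence in a plain string.
bad = 'def f():\n    """matches \\Regex"""\n'
# The fix: make the docstring a raw string, so the backslash is literal.
good = 'def f():\n    r"""matches \\Regex"""\n'

bad_warns = compile_warnings(bad)    # warns: invalid escape sequence
good_warns = compile_warnings(good)  # no warning
```

On Python up to 3.11 this is a DeprecationWarning; from 3.12 on it is a SyntaxWarning.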

@agricolab (Member, Author)

I guess the next action item is to rebase this commit on the changes in master.

@cbrnr (Contributor) commented Sep 20, 2019

What's the status of this PR? I'm not even sure what this adds - is this implementing resampling of all streams to a common sampling frequency? @agricolab maybe you could update this PR with a short description in your top comment.

@agricolab (Member, Author)

Updated the top comment with a direct quote from the original pull request.

Regarding your question: yes, it syncs streams by interpolation/shifting so that samples match. Regarding the status: this is relatively old now and therefore does not (yet) account for the recent feature of subselecting streams from an XDF file.

@cbrnr (Contributor) commented Sep 23, 2019

Nice, this feature is important. I can rebase if you want.

@agricolab (Member, Author)

That'd be great. I am currently quite busy with work and would probably not find the opportunity to work on this before January.

@cbrnr (Contributor) commented Sep 24, 2019

Alright, I rebased and fixed some PEP8-related stuff. Maybe I can come back to this before January.

@cbrnr (Contributor) commented Oct 11, 2023

@agricolab I assume you don't have time to finalize this PR? In addition to rebasing, I think what's currently missing is support for stream selection (which has been added to main). Also, it would be nice if the frequency that is used for resampling could be defined manually. By default, it could still be the highest sampling frequency among all (selected) streams, but this is not always practical if there's e.g. an audio stream, which could also be downsampled.

@agricolab (Member, Author) commented Oct 11, 2023

I can make time in the next few weeks. Considering it's been apparently five years in the making, it really is time to resolve this 😅

@cbrnr (Contributor) commented Oct 11, 2023

😄 This would be really great! If this is available in PyXDF, downstream projects don't have to roll their own implementations.

Speaking of which, we've been discussing the problem of resampling here: cbrnr/mnelab#385

Specifically, we were thinking about a solution when there are long gaps between otherwise relatively regularly spaced samples. @DominiqueMakowski implemented linear interpolation (using pandas), which is certainly better than assuming that samples are regularly spaced, but I am not sure if interpolating such long periods is the best solution. I think that replacing these segments with NaNs would be a better option.

WDYT? Is this covered by your changes, and if so, how do you deal with large gaps?
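For what it's worth, the NaN idea can be sketched roughly like this (a hypothetical numpy helper, not code from this PR or from MNELAB): interpolate onto the regular grid first, then blank out every grid point whose nearest original sample is farther away than a gap threshold.

```python
import numpy as np

def interp_with_gaps(ts, values, new_ts, max_gap):
    """Linearly interpolate onto new_ts, but leave NaNs wherever the
    original sampling has a gap larger than max_gap seconds."""
    out = np.interp(new_ts, ts, values)
    # distance from each new timestamp to its nearest original sample
    idx = np.searchsorted(ts, new_ts).clip(1, len(ts) - 1)
    nearest = np.minimum(np.abs(new_ts - ts[idx - 1]), np.abs(ts[idx] - new_ts))
    out[nearest > max_gap] = np.nan
    return out

ts = np.array([0.0, 0.1, 0.2, 5.0, 5.1])   # long gap between 0.2 s and 5.0 s
vals = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
grid = np.arange(0.0, 5.1, 0.1)            # regular 10 Hz grid
out = interp_with_gaps(ts, vals, grid, max_gap=0.15)
```

Grid points inside the gap stay NaN, while regions covered by actual samples are interpolated.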

@agricolab (Member, Author) commented Oct 11, 2023

IIRC there was no consideration for large gaps. I'll make up my mind, but probably it's simplest to let the user decide. Which means even more arguments to the function 🥴

In the end, I'd say it depends on the source of the data, or rather on its complexity/spectrum/non-stationarity and the corresponding timescale.

@agricolab (Member, Author) commented Nov 3, 2023

I updated everything to integrate with the most recent version. I also updated the tests and integrated a smoke test with the example files (i.e. minimal.xdf). Note that I used a pytest fixture for that.
TODOs are

@DominiqueMakowski

Small note:

Handle long gaps between otherwise relatively regularly spaced samples

I would not only apply special treatment when long gaps are detected and otherwise assume that the data is regularly sampled. In fact, I would never assume that it is (from experience, it often is not, and a few milliseconds are vital in some applications). Plus, LSL returns a timestamp for every sample, so it seems a shame not to use it and lose precision.

@cbrnr (Contributor) commented Nov 3, 2023

This PR is about creating regularly sampled time series via resampling. So I think large gaps need to be treated somehow in order to avoid interpolating a large gap and instead insert e.g. NaNs.

@cbrnr (Contributor) commented Nov 3, 2023

@agricolab I was wondering if this PR can also deal with just a single stream, i.e. resample that stream to a regular sampling frequency (and dealing with larger gaps). Or is this already possible with the current release?

@agricolab (Member, Author) commented Nov 3, 2023

With use_samplingrate, one should now be able to resample a single stream (or multiple streams). This follows naturally, as even the stream with the highest sampling rate needs to be resampled anyway if a different sampling rate is requested.

@agricolab (Member, Author)

Ok. I implemented shift_alignment as the default and handle segments. Unit and system tests run fine for segmented and unsegmented streams, and when aligned_timestamps, sampling_rate, or an interpolation function is supplied by the user. See e.g. the system tests with minimal.xdf here: https://github.com/agricolab/xdf-Python/blob/sync_timestamps/pyxdf/test/test_data.py#L80-L123

So I guess it's time to let you fellows play with it. Looking forward to your feedback. Hope I didn't miss a big thing...

@agricolab (Member, Author) commented Dec 5, 2023

In regards to

note though you'd have to detect those gaps in there if someone passes use_time_stamps=streams['eeg'].timestamps

I made my life easy: if a user supplies timestamps that do not cover a span where a stream has samples, shift_alignment will fail with an exception. Therefore, this approach would only work if an interpolation function is supplied that can handle this (in fact, supplying an interpolation function for a stream skips any safety checks on whether its samples can be assigned to the new timestamps). In that case, it doesn't really matter whether the new_timestamps have gaps or not.

@chkothe (Contributor) commented Dec 6, 2023

Ok great -- thanks for pushing through! Will have a look this weekend at the latest.

@cboulay (Contributor) commented Dec 13, 2023

load_xdf has docstrings for the new arguments but these aren't yet used by load_xdf. Is that intentional?
So for now you're just looking for feedback on align_streams?

(Incidentally, is it necessary that use_samplingrate is int or str? What about float or str?)

@agricolab (Member, Author) commented Dec 13, 2023

load_xdf has docstrings for the new arguments but these aren't yet used by load_xdf. Is that intentional?

Sorry, that was a leftover from an earlier implementation. load_xdf should be almost untouched, except for passing the information about segments in the info field of the streams.

So for now you're just looking for feedback on align_streams?

Yes, and its helper functions.

Remove docstrings for unused arguments in load_xdf
@agricolab (Member, Author) commented Dec 13, 2023

I refactored the main alignment function align_streams and its helper functions into their own file for better maintainability, and added basic docstrings. align_streams will be imported into pyxdf.

@cbrnr (Contributor) commented Sep 24, 2024

@agricolab I tried to test this new functionality, but I'm not sure how exactly I'm supposed to use it.

For example, I have a data set with a large gap somewhere in the middle of the recording (available here). I would like to load this file and then sync and resample all streams to 256 Hz. Then I would like to end up with a 2D NumPy array containing all EEG channels. Is this possible? If so, how? The problem is that with my resampling implementation in MNELAB, I'm completely ignoring any gaps in the data (cbrnr/mnelab#385).

@agricolab (Member, Author) commented Sep 26, 2024

Talk about revenants... Oh boy, have to take a look myself after a year or so...

So, we have the free function align_streams (see https://github.com/agricolab/xdf-Python/blob/sync_timestamps/pyxdf/align.py#L91C5-L91C18), which takes:

  1. streams as returned by load_xdf
  2. a dictionary that maps stream ids to respective interpolation functions (default is _shift_align)
  3. user-defined aligned_timestamps (defaults to a span from the earliest to the latest timestamp across all streams)
  4. a user-defined sampling rate (defaults to the sampling rate of the fastest stream)

All you really need to supply is the streams; the rest uses defaults. As discussed, the default is _shift_align, which leaves the original data untouched: it simply moves each sample to the closest matching new timestamp. If samples cannot be assigned (because of effective downsampling or edge cases), the default approach is to fail with a runtime error. In that case, you at least know that you can only align your stamps if you "distort" your data with resampling. IIRC we discussed this issue extensively and decided that there should be no default that involves resampling. For me, the best argument against a resampling default was that the aligned data could then vary depending on the timestamps of the other streams. Anyway, long story short: actual resampling only works with user-defined interpolation functions.
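The shift-align behavior described above can be sketched roughly like this (a toy re-implementation for illustration, not the actual _shift_align from this branch):

```python
import numpy as np

def shift_align(old_ts, values, new_ts):
    """Move each sample to its closest new timestamp without touching the
    data; raise if two samples compete for the same new slot (toy version)."""
    new_ts = np.asarray(new_ts, dtype=float)
    aligned = np.full(len(new_ts), np.nan)
    taken = set()
    for ts, val in zip(old_ts, values):
        best = int(np.argmin(np.abs(new_ts - ts)))  # closest new timestamp
        if best in taken:
            raise RuntimeError("Can not align streams: no unique mapping")
        taken.add(best)
        aligned[best] = val
    return aligned

# two samples land on a 10 Hz grid; everything else stays NaN
aligned = shift_align([0.01, 0.21], [1.0, 2.0], np.arange(0.0, 0.5, 0.1))
```

The key property is visible here: the sample values are copied verbatim to the nearest grid point, never interpolated, and a collision raises instead of silently overwriting.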

So, if you need an interpolation function that can handle resampling, you have to specify the second argument and assign a callable to those stream ids that need resampling. In addition to the default _shift_align, the module also provides _interpolate, which wraps scipy.interpolate, or, if that dependency is not available, falls back to numpy interpolation.

That leaves open your question on how are gaps handled.

First, gaps are detected as defined by load_xdf. IIRC I had to change load_xdf to export this, so it might not work with recent versions of load_xdf. Second, each aligned timeseries is initialized with NaN, and only segments with actual samples are aligned/interpolated, leaving the gaps as NaNs. See https://github.com/agricolab/xdf-Python/blob/sync_timestamps/pyxdf/align.py#L162-L175

In conclusion

So, at least from reading the code and docs and from memory, you would need to supply: the streams, the sampling rate (256 in your case), and a dict defining a callable for your stream ids that can handle downsampling. Good luck! If you run into any more trouble, feel free to comment.

@cbrnr (Contributor) commented Sep 27, 2024

Thanks @agricolab, unfortunately it doesn't work for me using this dataset and the following code:

```python
import pyxdf

fname = "sub-13_ses-S001_task-HCT_run-001_eeg.xdf"
streams, header = pyxdf.load_xdf(fname, select_streams=[2])  # EEG stream

pyxdf.align_streams(streams)
```

This results in the following error:

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/clemens/Projects/pyxdf/pyxdf/align.py", line 165, in align_streams
    _new_timeseries = align(
                      ^^^^^^
  File "/Users/clemens/Projects/pyxdf/pyxdf/align.py", line 75, in _shift_align
    raise RuntimeError(f"Too few new timestamps. {missed} of {len(old_timestamps)} old samples could not be assigned.")
RuntimeError: Too few new timestamps. 1 of 43848 old samples could not be assigned.
```

Not sure if this is supposed to work with one single stream in the first place (which would be extremely important for me), but I also tried with two streams, which also yields an error:

```python
import pyxdf

fname = "sub-13_ses-S001_task-HCT_run-001_eeg.xdf"
streams, header = pyxdf.load_xdf(fname, select_streams=[2, 5])  # EEG and ACC streams

pyxdf.align_streams(streams)
```

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/clemens/Projects/pyxdf/pyxdf/align.py", line 165, in align_streams
    _new_timeseries = align(
                      ^^^^^^
  File "/Users/clemens/Projects/pyxdf/pyxdf/align.py", line 81, in _shift_align
    toomany[old_timestamps[source[v]]].append(new_timestamps[target[n]])
                                                             ~~~~~~^^^
IndexError: list index out of range
```

@agricolab (Member, Author) commented Oct 3, 2024

Hm. Sorry to hear that. I'll take a look asap, probably this weekend.
As to whether it's supposed to work with a single stream: considering the original idea was to align streams, and we didn't want to put a focus on resampling... feeding it only a single stream didn't cross my mind. But that should be caught regardless.

Can you describe a little bit more about your use case or needs for a single stream? I guess you'd want to align that stream to externally supplied timestamps?

@agricolab (Member, Author) commented Oct 4, 2024

Okay, this was no longer Python 3.12 compatible, because pkg_resources is deprecated... Guess that's another issue. Fixed it for now by wrapping it in a try/except.

First issue

Your first issue stems from the fact that there is no unique mapping from the old to the new timestamps, likely due to down-sampling. The bug is in the reporting, where I check how many new timestamps are possible for each old timestamp. This is not really relevant; I fixed it for the time being by catching the IndexError.

Now it complains:

```
The old time_stamp  182290.71920545248 is a closest neighbor of 20 new time_stamps: [np.float64(182427.2883667528), np.float64(182427.29227279904), np.float64(182427.29617884528), np.float64(182427.30008489153), np.float64(182427.30399093777), np.float64(182427.307896984), np.float64(182427.31180303026), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765)]
The old time_stamp  182290.73849332187 is a closest neighbor of 3 new time_stamps: [np.float64(182427.3157090765), np.float64(182427.3157090765), np.float64(182427.3157090765)]
Traceback (most recent call last):
  File "/media/rtgugg/sd/projects/xdf-Python/test.py", line 9, in <module>
    pyxdf.align_streams(streams)
  File "/media/rtgugg/sd/projects/xdf-Python/pyxdf/align.py", line 191, in align_streams
    _new_timeseries = align(
                      ^^^^^^
  File "/media/rtgugg/sd/projects/xdf-Python/pyxdf/align.py", line 102, in _shift_align
    raise RuntimeError("Can not align streams. Could not create an unique mapping")
RuntimeError: Can not align streams. Could not create an unique mapping
```

The gist is https://github.com/agricolab/xdf-Python/blob/8fa4645f50904dff6f3fe00ada57c06d1c87d6dc/pyxdf/align.py#L76:

```python
if len(set(source)) != len(source):  # non-unique mapping, will throw
```

In consequence, because your two streams cannot be aligned by shifting (multiple new timestamps share the same old timestamp as their closest neighbour), you would need to resample to align them.

@cbrnr (Contributor) commented Oct 4, 2024

Thanks @agricolab! Regarding my use case, I need to interpolate the original data to a regular grid of time stamps. This is necessary, because I cannot just assume that the recorded samples really align on a grid defined by the nominal sampling frequency. This is even true for the effective sampling frequency, because there could be gaps in the samples (and pretending samples are evenly spaced by their effective sampling frequency would neglect that).

Therefore, even for only a single stream, I think the safest option is to interpolate to the desired sampling frequency. On the other hand, if you are sure there are no gaps, and you either trust the nominal or the effective sampling frequency, I can just take the samples and pretend they are evenly spaced according to either of those two frequencies.

So I think users should be able to make this choice. Do we trust the nominal or effective sampling frequencies? If yes, no interpolation or resampling is required. If no, we have to interpolate (and as a post processing step fill gaps with e.g. NaNs).

@cbrnr (Contributor) commented Oct 4, 2024

Okay, no longer Python 3.12 compatible, because pkg_resources is deprecated... Guess that's another issue. Fixed it by wrapping it into a try-catch.

If you rebase onto main, that problem has already been fixed.

@agricolab (Member, Author)

Yeah, but if I rebase, I'll have to fix merge conflicts... :D

@agricolab (Member, Author) commented Oct 4, 2024

Okay, at first I thought that the second issue stems from some edge case, i.e. that new_timestamps earlier or later than any old timestamps (i.e. the original timestamps) could not be assigned. That did not seem to be the problem.

Instead, it appears that if you do not supply timestamps or a sampling rate yourself, the function picks them from the stream and creates new aligned_timestamps. Now we face some floating-point errors:

```python
if aligned_timestamps is None:
    # we pick the oldest and youngest timestamp of all streams
    stamps = [stream["time_stamps"] for stream in streams]
    ts_first = min(min(s) for s in stamps)
    ts_last = max(max(s) for s in stamps)
    full_dur = ts_last - ts_first
    step = 1 / sampling_rate
    # We create new regularized timestamps with np.arange. Using np.linspace
    # only differs in step if n_samples is different, as n_samples must be an
    # integer (see the implementation below). We therefore stick with
    # np.arange, in spite of possible floating-point error accumulation. To
    # make sure ts_last is included, we add a half-step. This comes at the
    # cost of a possible overshoot, but I consider that acceptable, as this
    # stamp would only come from one stream and not be part of all others,
    # and is therefore kind of arbitrary anyway.
    aligned_timestamps = np.arange(ts_first, ts_last + step / 2, step)
    # linspace implementation:
    # n_samples = int(np.round(full_dur * sampling_rate, 0)) + 1
    # aligned_timestamps = np.linspace(ts_first, ts_last, n_samples)
```

I even had that commented out, apparently, but alas, np.linspace fails as well.
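The endpoint behavior discussed in the comments above can be checked in isolation (just an illustration of the half-step trick, not code from the branch):

```python
import numpy as np

sampling_rate = 100.0
ts_first, ts_last = 0.0, 1.0
step = 1 / sampling_rate

# np.arange excludes the stop value, so ts_last could be dropped due to
# floating-point error; adding half a step guarantees it is included, at
# the cost of overshooting by at most half a sample.
aligned = np.arange(ts_first, ts_last + step / 2, step)

# the linspace alternative requires an integer sample count
n_samples = int(round((ts_last - ts_first) * sampling_rate)) + 1
aligned_ls = np.linspace(ts_first, ts_last, n_samples)
```

Both give the same number of samples here, but np.arange accumulates floating-point error in the step, while np.linspace hits ts_last exactly.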

@agricolab (Member, Author) commented Oct 4, 2024

So, maybe it comes from the regularization, i.e. the original timestamps somehow miss the beat and have an additional sample at a weird timestamp strewn in; or it comes from an interaction between the new aligned_timestamps and having segments.

```python
print(np.max(np.diff(new_timestamps)))
print(np.max(np.diff(old_timestamps)))
print(np.min(np.diff(new_timestamps)))
print(np.min(np.diff(old_timestamps)))
```

suggests the latter...
Maybe I'll find the time over the weekend...

@agricolab (Member, Author) commented Oct 7, 2024

Okay, I inspected the data a little bit more, and it seems _shift_align cannot assign an old_timestamp from the middle of a segment:

```
RuntimeError: Too few new timestamps. 1 old timestamps ([182337.61104443]) found no corresponding new timestamp because it was already taken by another old timestamp.
```

_shift_align goes through all new_timestamps and finds the closest old_timestamp (unless it is already mapped to another new_timestamp), and that appears to be the case here. As the new_timestamps throw in both cases (whether created by linspace or arange), but can be assumed to be unique and more or less correctly spaced, I would assume the issue stems from the fact that the timestamps were created differently.

By default, pyxdf dejitters timestamps, and this happens with these streams too (otherwise we wouldn't have segments). But jitter_removal in fact uses a different sampling rate for each segment, i.e. it linearizes from the first to the last timestamp (using linalg.lstsq), while the effective_srate is calculated across all segments.
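The linearization step mentioned above can be sketched as follows (a toy single-segment version for illustration; pyxdf's actual jitter_removal additionally splits streams into segments at break thresholds):

```python
import numpy as np

def dejitter(timestamps):
    """Fit timestamps ~ intercept + slope * index by least squares and
    return the linearized timestamps (toy single-segment version)."""
    timestamps = np.asarray(timestamps, dtype=float)
    idx = np.arange(len(timestamps))
    X = np.column_stack([np.ones_like(idx, dtype=float), idx])
    (intercept, slope), *_ = np.linalg.lstsq(X, timestamps, rcond=None)
    return intercept + slope * idx

# 500 samples at a nominal 500 Hz with ~0.1 ms of jitter
rng = np.random.default_rng(0)
jittered = 10.0 + 0.002 * np.arange(500) + rng.normal(0.0, 1e-4, 500)
clean = dejitter(jittered)
```

After the fit, the steps between samples are exactly constant, which is what makes small per-segment slope differences visible when comparing against one global effective srate.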

align_streams instead creates new_timestamps evenly spaced across the whole span from the very first to the very last timestamp of the stream.

So I would assume the issue lies there: a small misalignment makes it impossible to shift_align within a segment, because the effective srates do not align sufficiently well.

In general, I should note that shift_align is expected to fail in most cases, because it has very strict requirements...

As a solution, I expanded the exception messaging.

If a new_timestamp lies within a segment but could still not be assigned, _shift_align will now complain: `Non-unique mapping. Closest old timestamp for {new_timestamps[nix]} is {old_timestamps[closest]} but that one was already assigned to {new_timestamps[source.index(closest)]}`

If an old timestamp could not be assigned, it complains differently and suggests two solutions: `Too few new timestamps. {missed} old timestamps ({old_timestamps[unassigned_old]}) found no corresponding new timestamp because it was already taken by another old timestamp. If your stream has multiple segments, this might be caused by small differences in effective srate between segments. Try different dejittering thresholds or support your own aligned_timestamps.`

In your case:

```
RuntimeError: Too few new timestamps. 1 old timestamps ([182337.61104443]) found no corresponding new timestamp because it was already taken by another old timestamp. If your stream has multiple segments, this might be caused by small differences in effective srate between segments. Try different dejittering thresholds or support your own aligned_timestamps.
```

agricolab added a commit to agricolab/xdf-Python that referenced this pull request Oct 7, 2024
@agricolab (Member, Author) commented Oct 7, 2024

When I interpolate instead, i.e.

```python
streams, header = pyxdf.load_xdf(fname, select_streams=[2])  # EEG stream
align = {2: lambda x, y, xh: pyxdf.align._interpolate(x, y, xh, "nearest")}
aligned = pyxdf.align_streams(streams, align_foo=align)
```

it passes. Using "nearest" (or "previous" or "next") is probably the closest match to the pure _shift_align default. These algorithms are more lenient, as they allow non-unique mappings and ignore unassigned old_timestamps, but in practice there should be minimal difference if old and new timestamps are as close as in this case.

@cbrnr (Contributor) commented Oct 7, 2024

Thanks @agricolab, now this makes sense! With thorough documentation (including examples), I think this will be a nice addition!

However, I still have questions 😄:

Maybe my understanding of pyxdf is too limited, but what exactly are those segments you mention? Are they available in the output? And if the effective sampling frequency ignores segments, it is not really reliable in this case, right?

I'm viewing this problem from a user perspective. Basically, I have some streams, and pyxdf gives me those streams on a common time basis. However, I also need them on a uniform time grid, which means that these streams need to be resampled, even if it's just a single stream, but most definitely for two or more streams. For a single stream, I would like to be able to take the samples without any further resampling and just assume the nominal (or effective) sampling frequency is accurate. This is now possible with your pyxdf.align_streams() function, right? (The parameter name should probably be improved.)

@agricolab (Member, Author) commented Oct 7, 2024

I would like to be able to take the samples without any further resampling and just assume the nominal (or effective) sampling frequency is accurate.

The easiest approach then is to not use any alignment at all. Simply discard the information contained in the timestamps and stick with your assumption. I did that all the time with my pure EEG streams, and it usually works sufficiently reliably if the recording is not too long. Especially with medical-grade EEG amplifiers, one can usually assume that no samples are dropped and the sampling rate is sufficiently constant, and that all jitter stems from TCP/IP or other transport or protocol translations.

Now, more generally, we face two problems. First, segments. Second, aligning multiple streams with different clocks (typically markers and EEG).

Segments can occur when there is a break due to hardware or network issues and data is delayed or missing. It then depends on your additional knowledge or assumptions. If samples are missing, we are really talking about a gap between segments. If the data was just delayed in transport, you might want to ignore the gap and assume a constant sampling rate. If you had to change amplifiers during the recording, you might actually have a slightly different clock now. LSL/XDF cannot really know. All it offers are methods to dejitter, and to segment streams if gaps are too large. In that sense, it makes total sense that pyxdf assumes different segments might have slightly different steps between samples. This information about segments is relayed to align_streams.

Second, because each stream has its own clock and jitter, no two samples from different streams ever align perfectly. Markers are a typical example: we might want to know the closest sample in another stream. Or we might want to align different physiological signals, e.g. EEG and accelerometer data. LSL/XDF cannot really know. So you can decide whether you only want to shift the data (which usually works quite well with a marker stream) or actually interpolate. The latter means you have to rely heavily on the information in the timestamps (or manipulate them before passing them to align_streams).

align_streams attempts to map all data to common timestamps. That always works if one uses some kind of interpolation, but the default is shift_align, which guarantees that the data is not changed. If you feel comfortable with interpolated data, you can opt in explicitly. In that sense, the policy of LSL/XDF is that it manipulates and controls timestamps but passes the actual data (i.e. the timeseries) through untouched.

Now, back to the start: if you have a single, non-segmented stream and believe that the sampling rate should be uniform, just discard the timestamps and take the manufacturer's sampling rate as gospel.
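That last route can be sketched like this (a hypothetical helper; the dict fields mirror what load_xdf typically returns, i.e. info values stored as lists of strings):

```python
import numpy as np

def nominal_time_axis(stream):
    """Discard the recorded timestamps and construct a time axis from the
    nominal sampling rate, anchored at the first recorded timestamp."""
    srate = float(stream["info"]["nominal_srate"][0])
    n = len(stream["time_series"])
    t0 = stream["time_stamps"][0]
    return t0 + np.arange(n) / srate

# toy stream mimicking the pyxdf output structure
stream = {
    "info": {"nominal_srate": ["250"]},
    "time_series": np.zeros((1000, 8)),
    "time_stamps": np.linspace(100.0, 104.0, 1000),  # jittered in reality
}
t = nominal_time_axis(stream)
```

This is exactly the "take the manufacturer's rate as gospel" option: any drift between the device clock and the LSL clock silently accumulates over the recording.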

@cbrnr (Contributor) commented Oct 7, 2024

Thanks @agricolab, that's very reasonable! Just one more minor question, how do I know that pyxdf detected segments? Is this reflected in the output somehow?

@agricolab (Member, Author)

Jupp, you can control detection with jitter_break_threshold_* (see https://github.com/xdf-modules/pyxdf/blob/main/pyxdf/pyxdf.py#L165C1-L173C74).

IIRC, for some reason pyxdf does not expose this by default. I had to change https://github.com/agricolab/xdf-Python/blob/sync_timestamps/pyxdf/pyxdf.py#L405C9-L405C36 to actually pass this info. IMHO, it should be the default :D

@agricolab (Member, Author) commented Oct 7, 2024

Oh, and obviously you can turn dejittering, and therefore segmenting, completely off by setting dejitter_timestamps to False (see https://github.com/xdf-modules/pyxdf/blob/main/pyxdf/pyxdf.py#L122).

@cbrnr (Contributor) commented Oct 7, 2024

Interesting, I've never paid attention to those arguments. But how does the output structure differ if there are segments? The docs don't mention that...

@agricolab (Member, Author)

That's the point. AFAIK it doesn't. Only the fork from this PR exposes it explicitly in the info field so that align_streams can use that.

@cbrnr (Contributor) commented Oct 7, 2024

😮

5 participants