
WIP real-time base client #4110

Closed
teonbrooks wants to merge 7 commits into mne-tools:master from teonbrooks:rtclient

Conversation

@teonbrooks
Member

@teonbrooks teonbrooks commented Mar 28, 2017

TODOs

  • Add example for LSL client (using simulation)
  • Add test for _BaseClient
  • Add tests for LSL client

@jasmainak
Member

@teonbrooks have you looked at this by @alexandrebarachant : https://github.com/alexandrebarachant/muse-lsl ? Let's try to finish the refactoring tomorrow morning and then go for the openbci client later in the day ...

# Authors: Teon Brooks <teon.brooks@gmail.com>
#          Mainak Jas <mainak@neuro.hut.fi>
#          Martin Luessi <mluessi@nmr.mgh.harvard.edu>
#          Matti Hamalainen <msh@nmr.mgh.harvard.edu>
Member


I think the diff is really unreadable if we rename the files. I'd prefer keeping the filenames as is for now ...

@teonbrooks teonbrooks force-pushed the rtclient branch 2 times, most recently from bb3d809 to cd0e135 Compare March 29, 2017 14:40
@codecov-io

Codecov Report

Merging #4110 into master will decrease coverage by 0.23%.
The diff coverage is 31.57%.

@@            Coverage Diff             @@
##           master    #4110      +/-   ##
==========================================
- Coverage   86.18%   85.95%   -0.24%     
==========================================
  Files         354      355       +1     
  Lines       63729    63755      +26     
  Branches     9709     9709              
==========================================
- Hits        54927    54801     -126     
- Misses       6127     6272     +145     
- Partials     2675     2682       +7


@rodrigohubner
Contributor

I'll try to participate in this WIP (+1).

@jasmainak
Member

Great! Feel free to review it, pull the changes, and try them on your computer.

Contributor

@rodrigohubner rodrigohubner left a comment


Two questions:

  • To get the whole project, should I pull from Teon's repository?
  • How can I contribute commits to this same PR?

# License: BSD (3-clause)

import copy
import threading
Contributor


I think it is better to work with multiprocessing because of known problems with multithreading in Python (the GIL, for example).

Member Author


If you have a better solution, I welcome it. Here we are just extracting this from our existing code and wanted a lightweight base class object that would work across our existing clients. I wasn't aware of the problems with multithreading.

Contributor


I used threads for a similar function and never had any issues. Yes, Python has problems with the GIL, but that does not mean threads should never be used. What you want to avoid is a thread that never releases the GIL (i.e., one that runs continuously). For example, LSL's pull_sample() or a socket.recv will block until you receive data, but the blocking happens in a call to a C extension or a shared library that effectively releases the GIL while it waits.
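To illustrate the point, here is a self-contained sketch with `socket.recv` standing in for `pull_sample` (not the PR's code): the blocking call releases the GIL, so a receive thread can wait for data without starving the main thread.

```python
import socket
import threading

received = []


def _recv_worker(sock):
    # sock.recv blocks inside a C call that releases the GIL, so this
    # thread waits cheaply while the main thread keeps running.
    buf = b""
    while len(buf) < 12:
        data = sock.recv(1024)
        if not data:
            break
        buf += data
    received.append(buf)


a, b = socket.socketpair()
worker = threading.Thread(target=_recv_worker, args=(b,), daemon=True)
worker.start()
for chunk in (b"abcd", b"efgh", b"ijkl"):  # stand-in for streamed buffers
    a.send(chunk)
worker.join(timeout=5)
a.close()
b.close()
```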

Member


Thanks @alexandrebarachant . I'd say let's stick to threading for now, get the API right, add an example, so people can pull and test. Then we can smooth out other details. Make it work, then make it nice :)

@jaeilepp
Contributor

jaeilepp commented Apr 6, 2017

One option would be to take over and make another PR with changes on top of the ones made here (pull this branch, make changes, make another PR).
Another option would be to push to this branch directly, but that would require push rights from @teonbrooks.

@teonbrooks
Member Author

@rodrigohubner I can give you write privs to my branch if you would like; let me know.

For the LSLClient we are working on, I know that @aj-ptw has been working on a compatible LSL interface with the npm module he's worked on.
Also, @alexandrebarachant has made an interface for the Muse via LSL.
I think this should be our course of action for real-time clients: rely on a unified client and have devs make their hardware compatible with adopted standards (e.g. LSL, OSC, FieldTrip client).

@larsoner
Member

larsoner commented Apr 6, 2017

One option would be to take over and make another PR with changes on top of the ones made here (pull this branch, make changes, make another PR).
Another option would be to push to this branch directly, but that would require push rights from @teonbrooks.

Keep in mind that PRs can be made from any branch to any other branch -- we just usually make them from user/branchname into mne-tools/master. But you can just as easily make a PR from rodrigohubner/rtclient-fixes into teonbrooks/rtclient. This doesn't require push access or taking over.
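The cross-fork flow can be sketched like this, with local repositories standing in for the GitHub forks (all names here are illustrative):

```shell
set -e
tmp="$(mktemp -d)"
cd "$tmp"

# Stand-in for teonbrooks' fork on GitHub, holding the rtclient branch.
git init -q fork
git -C fork -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "rt base client WIP"
git -C fork branch rtclient

# The contributor clones the fork and starts a fix branch off rtclient.
git clone -q fork work
cd work
git checkout -q -b rtclient-fixes origin/rtclient
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "fix: polish the rt client"

# On GitHub you would now push rtclient-fixes to your own fork and open a
# PR whose base repository is teonbrooks/mne-python and base branch is
# rtclient, instead of mne-tools/master.
git log --oneline rtclient-fixes
```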

@teonbrooks
Member Author

Keep in mind that PRs can be made from any branch to any other branch

yeah, I agree, PR to my branch would work, and would be less hassle.

@jasmainak
Member

@teonbrooks how close are we here?

I am going to add a TODO at the top of the PR so we don't lose focus


from mne.io.meas_info import create_info

def _buffer_recv_worker(client):
Member


it's the same for all clients, no? can we put it in one file and import it?
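The shared version could be as small as the sketch below (module placement, the dummy client, and the exact method names are illustrative; the idea is that any client exposing `iter_raw_buffers` can reuse one worker):

```python
def _buffer_recv_worker(client):
    """Single worker shared by all clients that expose iter_raw_buffers."""
    try:
        for raw_buffer in client.iter_raw_buffers():
            client._push_raw_buffer(raw_buffer)
    except RuntimeError as err:
        client._recv_thread = None  # the client can be restarted later
        print('Buffer receive thread stopped: %s' % err)


class _DummyClient:
    """Minimal client used here only to exercise the shared worker."""

    def __init__(self, buffers):
        self._buffers = buffers
        self.pushed = []
        self._recv_thread = None

    def iter_raw_buffers(self):
        for buf in self._buffers:
            yield buf

    def _push_raw_buffer(self, raw_buffer):
        self.pushed.append(raw_buffer)


client = _DummyClient([[0.1, 0.2], [0.3, 0.4]])
_buffer_recv_worker(client)
```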

pass

def _enter_extra():
"""For system-specific loading and initializing during the enter
Member


system-specific -> client-specific

"""
inlet = pylsl.StreamInlet(self.client)

## add tmin and tmax to this logic
Member


still todo!

## add tmin and tmax to this logic

while True:
samples, timestamps = inlet.pull_chunk(max_samples=self.buffer_size)
Member


samples, times to conform to MNE nomenclature?


return self

def create_info(self):
Member


Missing docstring

self.buffer_size = buffer_size
self.verbose = verbose

def connect(self):
Member


missing docstring

@jasmainak
Member

@teonbrooks let's try to finish it while the momentum is still strong :)

@andrewjaykeller

yes @teonbrooks ?

@teonbrooks
Member Author

Hey guys, agreed. I was flying back to SF yesterday, so I didn't see any of this. I will address these comments this weekend :) Also @aj-ptw, is it straightforward to establish an LSL stream with the npm package? Maybe we can chat tomorrow to discuss what it would take to make a simple example.

@codyworld

Hi all, I have questions. I am able to stream EEG data from OpenBCI Cyton V3 hardware into the MATLAB environment using BCILAB via LSL, and I would like to polish the incoming data in real time prior to component analysis. I know BCILAB has the capability to perform data cleaning using filters like clean_rawdata, the PREP pipeline, ASR, Infomax, etc. Does anyone have any idea how to perform those processes online?

@jasmainak
Member

I would like to polish the incoming data in real time prior to component analysis. I know BCILAB has the capability to perform data cleaning using filters like clean_rawdata, the PREP pipeline, ASR, Infomax, etc. Does anyone have any idea how to perform those processes online?

You can probably use autoreject with the fit and transform API: the fit would be run offline and the transform would be run online. It depends how bad your data is. But if you're working with MATLAB, I'm not sure how you could make it work. Maybe you can try calling Python from MATLAB like here
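The fit-offline / transform-online split could look like the sketch below, with a toy threshold rejector standing in for autoreject (autoreject's real API also uses `fit` and `transform`; everything else here is illustrative):

```python
class ThresholdRejector:
    """Toy stand-in for an autoreject-style fit/transform cleaner."""

    def fit(self, calibration_epochs):
        # Offline: keep the 80th-percentile peak amplitude as the threshold.
        peaks = sorted(max(abs(x) for x in ep) for ep in calibration_epochs)
        self.threshold_ = peaks[int(0.8 * len(peaks))]
        return self

    def transform(self, epochs):
        # Online: drop incoming epochs whose peak amplitude is too large.
        return [ep for ep in epochs
                if max(abs(x) for x in ep) <= self.threshold_]


calibration = [[0.1, -0.2], [0.3, 0.1], [5.0, 0.2]] + [[0.2, 0.1]] * 7
rejector = ThresholdRejector().fit(calibration)       # run once, offline
clean = rejector.transform([[0.1, 0.0], [9.0, 0.1]])  # run per buffer, online
```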

@larsoner
Member

@teonbrooks @jasmainak what is the plan here? Any time to work on it?

@jasmainak
Member

I'll be seeing @teonbrooks end of this month for the CRN coding sprint. @teonbrooks do you think we could earmark a couple of days after the sprint to work on this?

@teonbrooks
Member Author

yeah, I think this can be something we can knock out while @jasmainak is in town :)

@larsoner
Member

@teonbrooks this is about a year old, still planning to work on this or should we close?

@teonbrooks
Member Author

let's close this for now. maybe when I get a weekend free...

@teonbrooks teonbrooks closed this May 16, 2018
@teonbrooks teonbrooks mentioned this pull request Apr 4, 2019


9 participants