[THIS IS CURRENTLY ONLY A FORK OF deep-finder DEDICATED TO napari-deepfinder]
The code in this repository is described in this pre-print; the paper has since been published in Nature Methods.
To reviewers: you can follow our tutorial to reproduce segmentations from our paper.
Disclaimer: DeepFinder is still in its early stages; any feedback that helps improve the user experience is welcome.
News: (29/01/20) A first version of the GUI is now available in folder pyqt/. [More information...](#using-the-gui)
News: (01/06/22) A first version of the Napari GUI (Napari plugin) is available here: https://github.com/deep-finder/napari-deepfinder.
- [System requirements](#system-requirements)
- [Installation guide](#installation-guide)
- [Instructions for use](#instructions-for-use)
- Documentation
- Google group
Deep Finder is implemented in Python 3 (Python >= 3.8 is required) and is based on the Keras package. It has been tested on Linux (Debian 10), and should also work on macOS and Windows.
The algorithm needs an Nvidia GPU and CUDA to run at reasonable speed (in particular for training). The present code has been tested on Tesla K80 and M40 GPUs. For running on other GPUs, some parameter values (e.g. patch and batch sizes) may need to be changed to adapt to available memory.
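For instance, on a GPU with less memory you can reduce the patch size and batch size used for training. Below is a minimal sketch, assuming the `Train` class and attribute names used in the example training script (they may differ in your version):

```python
# Hypothetical settings for a GPU with limited memory; class and attribute
# names are assumed from the example training script and may differ.
from deepfinder.training import Train

trainer = Train(Ncl=13, dim_in=40)  # dim_in: side length (in voxels) of the cubic training patches
trainer.batch_size = 10             # fewer patches per batch to fit in GPU memory
```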
- If the above conditions are not met, we cannot guarantee the functionality of our code at this time.

Deep Finder depends on the following packages. The versions our software has been tested with are given in parentheses:
- tensorflow (2.8.2)
- keras (2.8.0)
- h5py (3.6.0)
- lxml (4.8.0)
- mrcfile (1.3.0)
- scikit-learn (1.0.2)
- scikit-image (0.19.2)
- matplotlib (3.5.2)
- PyQt5 (5.15.6)
- pyqtgraph (0.12.4)
- openpyxl (3.0.9)
- scipy (1.7.3)
- numpy
- pycm
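You can check which of these packages (and which versions) are present in your environment with a small standard-library snippet (Python >= 3.8):

```python
# Print the installed version of each dependency listed above.
from importlib.metadata import version, PackageNotFoundError

packages = ["tensorflow", "keras", "h5py", "lxml", "mrcfile", "scikit-learn",
            "scikit-image", "matplotlib", "PyQt5", "pyqtgraph", "openpyxl",
            "scipy", "numpy", "pycm"]
for pkg in packages:
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```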
Before installation, you need a Python environment on your machine. If you do not have one, we advise installing Miniconda.
In your Python environment, run:

```bash
pip install em-deepfinder
```
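A quick way to confirm the installation is to import the package; the module name `deepfinder` is assumed here from the layout of this repository:

```python
# Post-install sanity check: the pip package em-deepfinder is assumed to
# provide the 'deepfinder' Python module, as in this repository.
import deepfinder
print("DeepFinder installed at:", deepfinder.__file__)
```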
You need to download the present repository. Then open a terminal, navigate to your deep-finder folder, and run:

```bash
cd /path/to/deep-finder/
pip install -r requirements.txt
```
Also, in order for Keras to work with your Nvidia GPU, you need to install CUDA. For more details about installing Keras and CUDA, please see the Keras installation instructions.
Once these steps have been completed, you should be able to run Deep Finder.
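To verify that TensorFlow can actually use your GPU (and hence that CUDA is set up correctly), a minimal check with TensorFlow 2.x is:

```python
# Lists the GPUs visible to TensorFlow; an empty list means training will run on CPU.
import tensorflow as tf
print(tf.config.list_physical_devices('GPU'))
```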
Instructions for using Deep Finder are provided in the examples/ folder. The scripts contain comments on how the toolbox should be used. To run a script, first navigate to its folder. For example, to run the target generation script:
```bash
cd examples/training/
python step1_generate_target.py
```
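For orientation, the core of the target generation script roughly corresponds to the sketch below. The names (`TargetBuilder`, `generate_with_spheres`, the `deepfinder.utils` helpers) and the parameter values are indicative only and may differ in your version of the example:

```python
# Sketch of sphere-based target generation; names, paths and values are indicative only.
import numpy as np
import deepfinder.utils.common as cm
import deepfinder.utils.objl as ol
from deepfinder.training import TargetBuilder

objl = ol.read_xml('in/object_list_train.xml')  # annotated particle positions
tomo_dim = (200, 512, 512)                      # (z, y, x) size of the training tomogram
radius_list = [6, 9, 12]                        # one sphere radius (in voxels) per class

tbuild = TargetBuilder()
initial = np.zeros(tomo_dim, dtype=np.int8)     # start from an empty label map
target = tbuild.generate_with_spheres(objl, initial, radius_list)
cm.write_array(target, 'out/target.mrc')
```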
A new GUI has been developed for DeepFinder as a Napari plugin; more information is available at https://github.com/deep-finder/napari-deepfinder.
Six commands (GUIs) are available and can be launched by typing the command name directly in your Python environment: display, annotate, generate_target, train, segment, cluster.
For example, to launch the target generation GUI, run the following in your environment:

```bash
generate_target
```
The GUI (Graphical User Interface) can be launched from the bin/ folder and should be more intuitive for those who are not used to working with scripts. Currently, 6 GUIs are available (tomogram display, tomogram annotation, target generation, training, segmentation, clustering) and offer the same functionalities as the scripts in examples/. To run a GUI, first open a terminal. For example, to run the target generation GUI:
```bash
/path/to/deepfinder/bin/generate_target
```
Notes:
- Working examples are contained in examples/analyze/, where Deep Finder processes the test tomogram from the SHREC'19 challenge (a rough sketch of this pipeline is given after these notes).
- The script in examples/training/ will fail because the training data is not included in this repository.
- The evaluation script (examples/analyze/step3_launch_evaluation.py) is the one used in SHREC'19 and needs additional packages (pathlib and pycm, which can be installed with pip). The performance of Deep Finder has been evaluated by an independent group, and the result of this evaluation has been published in Gubins et al., "SHREC'19 track: Classification in cryo-electron tomograms".
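For reference, the analysis pipeline in examples/analyze/ (segmentation followed by clustering) roughly corresponds to the sketch below; class and function names (`Segment`, `Cluster`, `to_labelmap`, etc.), paths and parameter values are assumed from the example scripts and may differ in your version:

```python
# Sketch of segmentation + clustering on a tomogram; names, paths and
# parameter values are indicative only.
import deepfinder.utils.common as cm
import deepfinder.utils.smap as sm
from deepfinder.inference import Segment, Cluster

tomo = cm.read_array('in/tomo9.mrc')                     # input tomogram
seg = Segment(Ncl=13, path_weights='in/net_weights.h5')  # 13 classes in SHREC'19
scoremaps = seg.launch(tomo)                             # per-class score maps

labelmap = sm.to_labelmap(scoremaps)                     # argmax over classes
cm.write_array(labelmap, 'out/tomo9_labelmap.mrc')

clust = Cluster(clustRadius=5)                           # clustering radius in voxels
objlist = clust.launch(labelmap)                         # detected particle positions
```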
