This is a Python-based analysis framework for Localized Surface Plasmon Resonance Imaging (LSPRi) in sensing applications. It features a PySide-powered Qt GUI that allows users to load folders of image stacks and perform detailed analysis.
Key features include:
- Calculation of extinction values from image stacks
- Spot grouping and extinction curve generation
- Extraction of sensorgrams using multiple metrics
- Interactive visualization with overlays on representative 2D images
- Statistical summaries, averaging, functional boxplots, and more
- Multiple analysis modes and on-demand data extraction
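The first feature rests on a simple optical relation: extinction is the negative base-10 logarithm of transmittance, i.e. of the ratio between a sample image and a reference (blank) image. A minimal sketch of that idea, not the project's actual implementation (the function name and the eps guard are ours):

```python
import numpy as np

def extinction(sample: np.ndarray, reference: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Per-pixel extinction from a sample image and a reference (blank) image.

    Extinction is -log10(transmittance), where transmittance is the ratio
    of transmitted to incident intensity. eps guards against division by
    zero and log of zero on dark pixels.
    """
    transmittance = (sample.astype(float) + eps) / (reference.astype(float) + eps)
    return -np.log10(transmittance)
```

Applied frame by frame across an image stack, such values yield the extinction curves and sensorgrams listed above.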
If you use this tool in your work, please cite the associated paper:

"A visualization framework for localized surface plasmon resonance imaging in sensing applications", Anna Sterzik, Tomáš Lednický, Andrea Csáki, Kai Lawonn, Computers & Graphics, 2025, https://doi.org/10.1016/j.cag.2025.104396
- Install conda, e.g. using Miniforge, if you don't have it already, and activate it with:

  conda activate

- Clone or download this repository.
- Create the environment by running:

  conda env create -f environment.yml

- Activate the environment:

  conda activate mika

- Then run the application:

  python app.py
The environment provided uses CPU-only OpenCV by default. If you want to enable GPU acceleration with CUDA, you need to install CUDA-enabled OpenCV in the environment manually.
To check if OpenCV was built with CUDA, run the following in Python:

  import cv2
  print(cv2.getBuildInformation())

Look for "CUDA: YES" in the output.
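If you prefer to check this in code rather than by eye, a small helper can scan the build-information string for the CUDA flag. The helper name and regex below are ours, not part of the project:

```python
import re

def cuda_enabled(build_info: str) -> bool:
    # OpenCV's build information contains a line such as
    # "NVIDIA CUDA: YES (ver 12.1, ...)" or "NVIDIA CUDA: NO".
    match = re.search(r"CUDA\s*:\s*(YES|NO)", build_info)
    return match is not None and match.group(1) == "YES"

# Usage, with OpenCV installed:
#   import cv2
#   print(cuda_enabled(cv2.getBuildInformation()))
```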
- The wavelengths and frames are extracted from the image names. The pattern matching is done in the function load_images in image_processing/circle_detection.py.
- If you want, you can set a default_path to an image folder that gets opened automatically upon startup in app.py, but this is not necessary.
- It is only possible to assign n groups with n = number of spots * 2. This is currently hardcoded; if more groups are necessary, this can be changed.
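As an illustration of the kind of filename pattern matching involved, here is a hypothetical parser for names of the form prefix_650nm_0012.tif. The naming scheme and regex are assumptions for this sketch; the real pattern lives in load_images in image_processing/circle_detection.py and may differ:

```python
import re

# Hypothetical naming scheme "<prefix>_<wavelength>nm_<frame>.tif";
# the actual pattern used by load_images may differ, so adapt this regex
# to the naming convention of your own image stacks.
FILENAME_RE = re.compile(r"_(?P<wavelength>\d+)nm_(?P<frame>\d+)\.tiff?$", re.IGNORECASE)

def parse_image_name(name: str) -> tuple[int, int]:
    # Return (wavelength in nm, frame index) extracted from the file name.
    match = FILENAME_RE.search(name)
    if match is None:
        raise ValueError(f"Unrecognized image name: {name}")
    return int(match.group("wavelength")), int(match.group("frame"))
```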
This repository is configured to demonstrate the core functionality presented in the paper. To keep the repository lightweight, we provide a Test Dataset (test_data/) for demonstration purposes.
The provided script launches the application with the demonstration data. By following the interactive steps, you can generate a result that verifies the functionality shown in Figure 4 of the paper.
- Note: As this dataset is used for demonstration, the output will differ from the full-scale result in the paper (fewer timesteps, fewer spots), but the algorithmic behavior is identical.
- Reference Output: You can view the expected result for this specific test run here: replicability.png
- We provide a single script that installs all system dependencies, sets up the environment, and automatically launches the application with the demonstration data. It requires no parameters:

  bash install_and_run.sh

- Since this application is interactive, specific mouse inputs are required to process the data. Please watch the short video guide to see the required steps: