Short Guide to Particle Tracking

rmcgorty edited this page Jan 21, 2019 · 7 revisions

Particle tracking

The code we're using is based off of IDL code developed by John Crocker and David Grier. Eric Weeks has developed routines for this code as well and has written a nice tutorial on its use. Though the tutorial covers the IDL code, it is still worthwhile to go through.

Loading in the data

We use the tiff_file module to read in images saved as tiff stacks. If your data is not saved as a tiff, or if you'd like to edit the images (crop, subtract a background as described below, etc.), you probably want to use Fiji.

Background subtraction

This code will find bright objects against a dark background. If you acquire images of fluorescent beads or molecules, you probably won't have to do any adjustments to the image. However, if you acquire bright-field images of beads, you may see the beads as dark against a light background. If that is the case, you can either invert the images (in Fiji, Edit->Invert) or subtract the images from a background. Subtracting from a background will both invert the images and remove fixed-pattern noise. The easiest way to generate a background image is to take the median of the movie you've acquired (in Fiji, Image->Stacks->Z Project... and select Median as the projection type). Then subtract your images from that background (in Fiji, Process->Image Calculator, select the two images to use and Subtract as the operation, and make sure to check 32-bit result (later save as 16-bit)).
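The Fiji steps above can be sketched in Python with NumPy, assuming the movie has already been loaded as a 3-D array (frames × height × width); this is a simplified illustration of the idea, not the repo's actual code:

```python
import numpy as np

def subtract_background(movie):
    """Invert bright-field frames by subtracting them from a median background.

    movie: 3-D array (n_frames, height, width). Returns float images in which
    dark beads become bright features and fixed-pattern noise is removed.
    """
    # Fiji equivalent: Image->Stacks->Z Project... with Median projection
    background = np.median(movie, axis=0)
    # Fiji equivalent: Process->Image Calculator with Subtract (32-bit result)
    return background - movie.astype(np.float64)
```

Because the result is background minus image, a dark bead on a light background comes out as a bright feature on a near-zero background, which is what the feature finder expects.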

Identifying features

Features within the image (e.g., images of beads) are identified and then kept or discarded based on a set of attributes. The first attribute to consider is:

  • masscut -- first threshold for the integrated intensity of features

If a feature's integrated intensity exceeds this threshold, it is further screened using the parameters:

  • barRg -- the maximum radius of gyration (features larger will be discarded)
  • barCC -- the maximum eccentricity accepted (an eccentricity of 0 is a perfect circle; larger values are more elliptical)
  • Imin -- the minimum intensity of the brightest pixel in a feature
  • IdivRg -- the minimum ratio of the integrated intensity to the square of the radius of gyration
  • barI -- the minimum integrated intensity
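A minimal sketch of how such a screen might work is below. The parameter names follow the list above, but the feature-array layout (columns for integrated intensity, radius of gyration, eccentricity, and peak intensity) is an assumption for illustration, not the repo's actual format:

```python
import numpy as np

def screen_features(features, masscut, barRg, barCC, Imin, IdivRg, barI):
    """Keep only the feature rows that pass every threshold.

    features: array with assumed columns [mass, Rg, eccentricity, peak].
    """
    mass, Rg, ecc, peak = (features[:, 0], features[:, 1],
                           features[:, 2], features[:, 3])
    keep = (
        (mass > masscut)           # first threshold on integrated intensity
        & (Rg < barRg)             # discard features that are too large
        & (ecc < barCC)            # discard features that are too elongated
        & (peak > Imin)            # brightest pixel must be bright enough
        & (mass / Rg**2 > IdivRg)  # intensity concentrated relative to size
        & (mass > barI)            # minimum integrated intensity
    )
    return features[keep]
```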

You should use the function test in the module mpretrack to fine-tune these parameters. It takes a frame of the video, identifies all features, and displays that frame with green dots on top of the features that made the cut.

You can then use the function run in mpretrack to identify the features in all frames using the parameters that you found worked well in test.

Linking features

Identifying the locations of all particles in every frame is not the end. We need to then link features from frame-to-frame to build tracks. This will be done with the function trackmem in the trackmem module.

This linking function requires the following parameters:

  • max_displacement -- the maximum displacement a particle can make between consecutive frames
  • goodenough -- the minimum length (in frames) a trajectory must have for you to keep it
  • memory -- the maximum number of consecutive frames a particle can be absent and still be linked

Once you've set those parameters, you also provide the trackmem function with the array of features found using mpretrack.run.
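To give a feel for what linking does, here is a deliberately simplified two-frame linker using greedy nearest-neighbor matching. The real trackmem solves the assignment globally over many frames and also handles the `memory` and `goodenough` parameters; this sketch only illustrates the role of max_displacement:

```python
import numpy as np

def link_two_frames(pos_a, pos_b, max_displacement):
    """Greedy nearest-neighbor linking between two frames (simplified sketch).

    pos_a, pos_b: (n, 2) arrays of particle coordinates in consecutive frames.
    Returns a list of (index_in_a, index_in_b) pairs.
    """
    links = []
    taken = set()
    for i, p in enumerate(pos_a):
        d = np.linalg.norm(pos_b - p, axis=1)
        for j in taken:          # a candidate can only be claimed once
            d[j] = np.inf
        j = int(np.argmin(d))
        if d[j] <= max_displacement:  # too-large jumps are not linked
            links.append((i, j))
            taken.add(j)
    return links
```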

The data that trackmem returns can now be used to do science (like calculating the mean squared displacement or the probability distribution of displacements over a given time lag). But before we get into that, there are a few things that may negatively impact your data that you should be aware of. We'll consider some below.
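As an example of the kind of analysis mentioned above, the mean squared displacement of a single trajectory can be computed like this (assuming the track has been extracted from the trackmem output as an (n_frames, 2) array of x, y positions):

```python
import numpy as np

def msd(track, max_lag):
    """Mean squared displacement of one trajectory (sketch).

    track: (n_frames, 2) array of x, y positions at consecutive frames.
    Returns an array of MSD values for lags 1..max_lag, averaging over
    all start times at each lag.
    """
    out = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = track[lag:] - track[:-lag]           # displacements at this lag
        out[lag - 1] = np.mean(np.sum(disp**2, axis=1))
    return out
```

For diffusive motion the MSD grows linearly with lag time, so plotting this curve is a quick sanity check on the tracks.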

Data quality

It is a good idea to understand how this particle tracking algorithm works before you start collecting data. This way, you can optimize things like frame rate, magnification, movie duration, etc.

If the quality of the movie you take is unavoidably bad in some ways, you may be able to adjust the image. For instance, if there is a non-uniform intensity or if dirt or smudges are present in the image, you can subtract a background as described above. In the movie below, the original data on the left has a non-uniform background plus some beads that are stuck to the glass and don't move. By generating a background from the median of all frames, we can remove the non-uniform background.

Silica beads

Pixel biasing

One thing to be careful of is pixel biasing. Take a look at the two trajectories shown below (one in yellow, one in red) of ~700 nm silica particles.

image of tracks

Those tracks might look fine. But upon closer inspection...

image of tracks, zoomed-in

... you can see that the found position of the bead seems to jump with steps of one pixel. It seems improbable that the bead is actually making steps that are consistently one pixel.

A good way to check for pixel biasing is to create a histogram of the fractional part of the found particle coordinates. Ideally, you should see that the fractional part of the coordinates is random and the histogram will appear flat. However, if the histogram is not flat (like the top plot below) then there is a problem.

histogram indicating pixel biasing

If pixel biasing is occurring, try adjusting the bandpass filtering or increasing the feature-size parameter.
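The histogram check described above is easy to run yourself, assuming you have the found x (or y) coordinates as a 1-D array:

```python
import numpy as np

def fractional_part_histogram(coords, bins=10):
    """Histogram of the fractional parts of found particle coordinates.

    coords: 1-D array of sub-pixel positions (x or y). A flat histogram
    suggests unbiased sub-pixel localization; strong peaks indicate
    pixel biasing.
    """
    frac = coords - np.floor(coords)                     # keep only sub-pixel part
    counts, edges = np.histogram(frac, bins=bins, range=(0.0, 1.0))
    return counts, edges
```

If the counts vary wildly between bins (rather than fluctuating around a common mean), the sub-pixel positions are biased and the feature-finding parameters should be revisited.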