
FLUORESCENCE MODULE

In our Top10 competent bacteria, we previously inserted, through a bacterial transformation process, the pSB1C3-BBa_J04450 backbone, which contains a gene coding for mRFP1. Hence, when the bacteria express this gene, they produce fluorescence, which signals the presence of the pathogenic bacterium.

The fluorescence expressed by the Top10 bacteria comes from the mRFP1 protein, whose excitation and emission spectra are given in Figure 1.


Figure 1: Excitation (left) and emission (right) mRFP1 spectra [1]

This fluorescence is detected by our home-made unit, shown below:


Figure 2: Exploded view of the fluorescence sensor prototype, life-size


Light pathway through the fluorescence sensor

1. Excitation light

To excite the fluorophore, it is necessary to use a light source whose emission spectrum is as narrow as possible. LEDs are therefore a good choice, as their emission spectra are usually very narrow. Moreover, they are easy to supply (they only need a series resistor to protect them and keep the light intensity constant) and very small. We used 565 nm LNJ309GKGAD LEDs (Panasonic), whose spectrum is given in Figure 3.
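For illustration only (the exact values depend on the supply voltage and on the LED datasheet): assuming a 5 V supply, a forward voltage of about 2 V and a target current of 20 mA, the series resistor would be R = (5 − 2) / 0.02 = 150 Ω.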


Figure 3: LED emission spectrum - peak at 565 nm [2]

It is a small and cheap LED; its light is not very bright, but it is sufficiently focused to excite the bacteria, as shown in Figure 4:


Figure 4: Fluorescence of a sample of TOP10 bacteria expressing the mRFP1 protein

If the red circle is not obvious on your screen, you can check the histogram of the picture in the “Learn more” section; each picture is associated with its histogram, which displays the number of pixels sharing the same intensity, expressed as a grey level. It is a very useful tool in image processing to avoid being misled by our eyes. The camera does not face such problems: it sees the picture as it is.


Filtering

Before reaching the camera, the incoming light must be filtered, that is to say, stripped of all the background light, which mainly comes from the LEDs. A simple way to do this is to use an optical filter. Depending on its characteristics, such a device keeps some colors and blocks others. This filter is highly recommended to reduce the noise level. For instance, our longpass filter [3] is theoretically able to block the yellowish-green LEDs completely (Figure 5):


Figure 6: Demonstration of the filter's efficiency. The LED is positioned in place of the sample; its light is totally filtered.

The picture in Figure 6 is completely dark, which means that the LED light cannot be seen by the camera. This is an essential condition to verify, to make sure that no light is captured by the camera during a control measurement, which is indeed the case, as Figure 7 shows:


Figure 7: Non-transformed Top10 picture

This proves that the light detected by the camera comes from fluorescence only, which makes the processing easier.


2. Detecting fluorescence

All of these pictures were taken with the camera used in our module, a Pi NoIR Camera V2 [4]. This camera is perfectly suited to our application, as it can easily be controlled by a small computer called a Raspberry Pi [5], which also drives the touch screen. To give an idea of the picture quality, this camera performs on par with a mid-range smartphone.
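As an illustration, here is a minimal sketch of how such a picture can be taken from the Raspberry Pi with the standard picamera library; the file name, warm-up delay and resolution below are assumptions for the example, not the exact settings used in our module.

    from picamera import PiCamera
    from time import sleep

    camera = PiCamera()
    camera.resolution = (3280, 2464)  # full 8 Mpx resolution of the V2 sensor
    sleep(2)                          # let the sensor adjust its gain and exposure
    camera.capture("fluorescence_sample.jpg")
    camera.close()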

Finally, computing the overall intensity of each picture gives the graph in Figure 8:

Figure 8: Overall luminance as a function of the fluorescent bacteria optical density at 600 nm

CONCLUSION

To conclude, here is some advice you should follow if you need to build your own fluorescence sensor and want to avoid painful troubles:

- Unless you have a photomultiplier available with the appropriate supply, make a camera your first choice over a photosensitive component such as a photoresistor or a photodiode. Fluorescence analysis is highly limited by the fluorescence intensity: it is easier to see fluorescence than to detect it. Hence, with a photosensitive component, the minimal sensitivity will not suit your application. Photosensitive components are helpful for quantitative measurements, as they directly link the light intensity to their output, but this is meaningless if you cannot detect low-level fluorescence. If a photosensitive sensor is mandatory for your application, you should use an optical fiber to guide the light to the sensor with minimal losses.

- Be careful with the materials you work with. The plastic you use, for instance, might be fluorescent as well, distorting your results. The material might also reflect the excitation light toward the sensor when you expected it to stay confined.

- Look carefully at the characteristics of the excitation light. The brighter the better, obviously, to boost the fluorescence intensity, but that is not enough. A wide viewing angle (more than 100°) can be interesting to illuminate a large surface, but beware not to illuminate your sensor. A narrow viewing angle ensures this does not happen, but the resulting light intensity is usually lower.

- To reduce the noise level, you should consider using an excitation light with a central wavelength lower than the optimal one when the excitation spectrum of the fluorescent protein overlaps the emission spectrum, thereby taking advantage of the spread of the excitation spectrum (choosing a longpass filter with a higher cutoff wavelength works as well).


LEARN MORE


About the geometry adopted

One problem in fluorescence analysis, which might seem anecdotal but is actually crucial, is the geometry. Depending on the relative positions of the light sensor, the sample and the light source, the results can vary a lot. The geometry we adopted, equivalent to epifluorescence, places the camera on an axis perpendicular to the light source (Figure 9):


Figure 9: Block diagram of the fluorescence sensor

In practice, the prototype looks as in Figure 2. It is only the prototype used for the experiments; its parts do not suit the final device. However, this prototype was designed to resemble the final device as much as possible, in which dark conditions would be ensured by an enclosed box isolating the diagnosis system from outside light. In particular, a sliding part was printed to place the fluorescent sample at the exact same location for each experiment, to ensure repeatability. Holes were drilled in the sample holder to place the LED as close as possible to the sample and orthogonally to the camera, to minimize the noise level.

About the camera's performance

The camera is located above the sample, so the top surface of the sample is pictured, appearing as a circle (Figure 10):


Figure 10: Fluorescence picture

As you can see, the picture is not very sharp. Despite its high resolution (8 Mpx), which ranks it among high-quality smartphones (8 to 12 Mpx usually), its depth of field (DOF) of 1 m limits the quality of the pictures. The DOF is the minimal distance at which objects appear unblurred on a picture [6]: the closer an object is to the DOF, the sharper its details. Yet, in a fluorescence detection application, the light sensor should be as close as possible to the fluorescent sample to capture as much light as possible. This is why a lens is used: to get the sharpest pictures possible in spite of this limitation.

Why use this camera rather than another one with a better DOF, then? First of all, it was the first camera we had at hand, so we could work with it quickly. Moreover, it is a user-friendly camera, well suited to the Raspberry Pi we already planned to use to drive the touchscreen. The hardware characteristics are fixed, but the pictures can easily be adjusted in software thanks to the Pillow library (Python). The advantage of Pillow over OpenCV, a classic image-processing library, on Raspbian is that it is easier to install. Yet, if you feel brave enough to install OpenCV [7], we advise you to do so, as OpenCV's functions are faster than Pillow's. We decided not to install OpenCV as we have no speed constraint.

Finally, the camera is still able to detect a low fluorescence light, even when the human eye struggles to. It can then be useful to plot the histogram of a picture. A histogram counts the pixels having the same value, that is to say, the same grey level. For instance, there is no doubt that the sample in Figure 4 is fluorescent, thanks to its histogram (Figure 11):


Figure 11: Histogram of a fluorescent sample

The tallest bar is at 0, which is expected as the picture is mostly dark, but other pixels have a grey level greater than 0, which means they are colored. And this color can come from fluorescence only, as explained above. As a comparison, Figure 12 displays the histogram of Figure 7, a non-fluorescent sample:


Figure 12: Non-fluorescent sample histogram. Here, absolutely all the pixels are dark, as they all have a grey level of 0.
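As an aside, such grey-level histograms can be computed in a few lines with Pillow and matplotlib; this is only an illustrative sketch, and the file name is a placeholder.

    from PIL import Image
    import matplotlib.pyplot as plt

    grey = Image.open("fluorescence_sample.jpg").convert("L")  # grey levels from 0 to 255
    counts = grey.histogram()  # number of pixels for each of the 256 grey levels

    plt.bar(range(256), counts)
    plt.xlabel("Grey level")
    plt.ylabel("Number of pixels")
    plt.show()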

About the image processing [8]

1. Compensating for stray light

It was explained earlier that only fluorescence light is detected by the camera, and this is true. Yet, we are not sheltered from unexpected hardware degradation. For instance, a door that does not close properly may let ambient light reach the camera, reducing the fluorescence level on the picture. To prevent that, the subtract() function from the ImageChops module is used, with the fluorescent and non-fluorescent pictures as arguments. Figure 13 shows an example of what this function does:


Figure 13: Sample picture before correction (left) and after correction (right)

The right picture is slightly darker than the left one. Subtracting one picture from another is exactly what this function does. As no fluorescence is expressed by non-fluorescent bacteria, the light captured with such a control sample can only be noise. Hence, subtracting the control picture from a fluorescent picture keeps the fluorescence contribution only.
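As a minimal sketch of this correction with Pillow (the file names are placeholders, not the ones used in our scripts):

    from PIL import Image, ImageChops

    fluorescent = Image.open("sample_fluorescent.jpg")
    control = Image.open("sample_control.jpg")

    # Pixel-wise subtraction: the noise captured with the control sample is
    # removed from the fluorescent picture; negative values are clipped to 0.
    corrected = ImageChops.subtract(fluorescent, control)
    corrected.save("sample_corrected.jpg")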

2. The fluorescence intensity

This intensity is obtained by summing the grey levels of all the pixels. The grey level is obtained from the RGB code through a relationship based on the human eye's sensitivity [9]:

grey level = 0.299 R + 0.587 G + 0.114 B

Note that, beforehand, the pictures are converted into 3D arrays containing the RGB code of each pixel, thanks to the getTabPxl() function, to enable the analysis.
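We do not reproduce getTabPxl() here, but as an illustrative equivalent using Pillow alone, whose convert("L") method applies the same weighting as the relationship above, the overall intensity of a picture can be computed as follows:

    from PIL import Image

    def overall_intensity(path):
        # convert("L") computes 0.299 R + 0.587 G + 0.114 B for each pixel,
        # i.e. the grey-level relationship given above
        grey = Image.open(path).convert("L")
        return sum(grey.getdata())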

Finally, plotting the fluorescence intensity as a function of the optical density of the samples gives the graph in Figure 8. The curve follows an affine function (with a good coefficient of determination), which is consistent, as the sensor is expected to have a limited sensitivity.

This graph can be compared to the one obtained with the Tristar LB 941 microplate reader (Figure 14):


Figure 14: Calibration curve of the Tristar microplate reader

The Tristar curve is much more linear. This is not surprising, as the sensitivity of a professional microplate reader is expected to be much better than that of a handmade unit. It is however reassuring that the two shapes look almost the same.

3. The fitted curve

Figure 8 displays two curves: the experimental one and the fitted one. An easy way to fit a curve with Python is to use the linregress() function, which returns, among other data, the theoretical coefficients of the line and the coefficient of determination of the fit. As for any fit, it is therefore necessary to know what shape the experimental curve should have. A first intuition would be that fluorescence is proportional to the number of fluorescent bacteria: the more bacteria there are, the more fluorescent the sample should be. This hypothesis is confirmed by the Tristar's calibration curve in Figure 14 and recovered on our fluorescence unit's calibration curve, although both curves are affine rather than strictly linear. This result is, in fact, consistent, as fluorescence detection is limited by the performance of the device: a strictly linear curve through the origin would mean that the fluorescence of a single bacterium could be detected, a performance that any fluorescence sensor tries to approach but can hardly reach, especially a cheap unit.
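For illustration, such a fit can be obtained with the linregress() function from SciPy; the optical density and intensity values below are placeholders, not our measurements.

    from scipy.stats import linregress

    od600 = [0.1, 0.2, 0.4, 0.8]              # optical densities (placeholders)
    intensity = [1.2e5, 2.1e5, 3.9e5, 7.6e5]  # overall luminance (placeholders)

    fit = linregress(od600, intensity)
    print("slope:", fit.slope)
    print("intercept:", fit.intercept)
    print("determination coefficient:", fit.rvalue ** 2)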


REFERENCES

[1] Spectra from the online fluorescent protein database: https://www.fpbase.org/protein/mrfp1/

[2] Spectrum from the LNJ309GKGAD (Panasonic) datasheet, July 2012, p. 1

[3] Spectrum from the longpass filter, https://www.edmundoptics.fr/p/50mm-diameter-red-dichroic-filter/10607/#downloads

[4] Picamera datasheet: https://www.raspberrypi.org/documentation/hardware/camera/

[5] https://www.raspberrypi.org/

[6] https://digital-photography-school.com/understanding-depth-field-beginners/

[7] https://www.pyimagesearch.com/2017/09/04/raspbian-stretch-install-opencv-3-python-on-your-raspberry-pi/

[8] Click to download the codes (zip file).

[9] https://www.w3.org/TR/AERT/#color-contrast