Team:Grenoble-Alpes/fluorescence module

FLUORESCENCE MODULE

Into our Top10 competent bacteria, we previously inserted, by bacterial transformation, the pSB1C3-BBa_J04450 backbone, which contains a gene coding for mRFP1. Hence, when the bacteria express this gene, they produce fluorescence, which signals the presence of the pathogenic bacterium.

The fluorescence expressed by the Top10 comes from the mRFP1 protein, whose excitation and emission spectra are given in Figure 1.


Figure 1: Excitation (left) and emission (right) mRFP1 spectra

This fluorescence is detected with our home-made unit, shown in Figure 2:


Figure 2: Exploded view of the fluorescence sensor prototype, life-size

Light pathway through the fluorescence sensor

Excitation light

To excite the fluorophore, the light source should have an emission spectrum that is as narrow as possible. LEDs are therefore a good choice, as their emission spectra are usually very narrow. Moreover, they are easy to supply (they only need a series resistor to protect them and keep the light intensity constant) and very small. We used 565 nm LNJ309GKGAD LEDs (Panasonic), whose spectrum is given in Figure 3.


Figure 3: LED emission spectrum - peak at 565 nm
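If you want to size the series resistor mentioned above yourself, the short sketch below shows the usual calculation. The supply voltage, LED forward voltage and target current are illustrative assumptions, not the exact values of our board.

# Minimal sketch: sizing the current-limiting resistor that protects the LED.
# All numbers are illustrative assumptions, not measurements from our prototype.
V_SUPPLY = 5.0       # supply voltage (V), assumed
V_FORWARD = 2.1      # typical forward voltage of a yellowish-green LED (V), assumed
I_LED = 0.010        # target LED current (A), assumed 10 mA

R = (V_SUPPLY - V_FORWARD) / I_LED          # Ohm's law on the resistor voltage drop
P = (V_SUPPLY - V_FORWARD) * I_LED          # power dissipated in the resistor

print(f"Series resistor: about {R:.0f} ohm, dissipating {P * 1000:.0f} mW")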

It is a small and cheap LED whose light is not very bright, but it is sufficiently focused to excite the bacteria, as showcased in Figure 4:


Figure 4: Fluorescence of a sample of TOP10 bacteria expressing the mRFP1 protein

If the red circle is not obvious on your screen, you can check the histogram of the picture in the “Learn more” section; each picture is associated with its histogram, which displays the number of pixels sharing the same intensity (grey level). It is a very useful tool in image processing, as it keeps our eyes from deceiving us. The camera does not face such problems: it sees the picture as it is.


Filtering

Before reaching the camera, the incoming light must be filtered, that is to say stripped of all the background light, which mainly comes from the LEDs. A simple way to do this is to use an optical filter. Depending on its characteristics, this device transmits some colors and blocks others. Such a filter is highly recommended to reduce the noise level. For instance, our filter is theoretically able to block the yellowish-green LEDs completely (Figure 5):


Figure 5: Transmission curve of the filter [3]

The cut-off wavelength of the filter is 605 nm. It does transmit a little light at slightly lower wavelengths, but the LED intensity is so low at the right end of its emission curve that it is completely blocked, as Figure 6 shows:


Figure 6: Picture of the filter’s efficiency. The LED is positioned in place of the sample. Its light is totally filtered.

The picture in Figure 6 is completely dark, which means that the LED light cannot be seen by the camera. This is an essential condition to verify, to make sure that no light is captured by the camera during a control measurement; this is indeed the case, as Figure 7 shows:


Figure 7: Non transformed Top10 picture

This proves that the light that is detected by the camera comes from fluorescence only, which makes the processing easier.
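The claim that the residual LED light passing the 605 nm longpass filter is negligible can also be checked numerically: the signal reaching the camera is roughly the LED emission multiplied by the filter transmission, summed over wavelength. The sketch below uses idealised stand-in curves (a Gaussian LED centred at 565 nm, a smooth step opening at 605 nm), not the datasheet spectra, so the result is only an order of magnitude.

# Rough numerical check of the LED leakage through the longpass filter.
# The two curves are idealised stand-ins, NOT the datasheet spectra.
import numpy as np

wl = np.arange(450.0, 750.0, 1.0)                    # wavelength grid (nm)
led = np.exp(-0.5 * ((wl - 565.0) / 15.0) ** 2)      # assumed LED emission, peak at 565 nm
filt = 1.0 / (1.0 + np.exp(-(wl - 605.0) / 2.0))     # assumed longpass transmission, cut-off ~605 nm

leak = np.sum(led * filt) / np.sum(led)              # fraction of the LED light transmitted
print(f"Estimated fraction of LED light passing the filter: {leak:.1e}")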


Detecting fluorescence

All of these pictures were taken with the camera used in our module, a Pi NoIR Camera V2 [4]. This camera is well suited to our application, as it can easily be controlled by a small computer called a Raspberry Pi [5], which also drives the touch screen. To give an idea of the picture quality, the camera performs about as well as a mid-range smartphone.
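If you want to reproduce this kind of measurement, the fragment below shows roughly how a picture can be taken from the Raspberry Pi with the picamera Python library, freezing gain and exposure so that successive captures stay comparable. The resolution, ISO and white-balance values are placeholders, not our calibrated settings.

# Rough sketch of a repeatable capture with the Pi NoIR Camera V2 (picamera library).
# Resolution, ISO and white-balance gains are placeholder values.
import time
from picamera import PiCamera

camera = PiCamera(resolution=(1640, 1232))
camera.iso = 800                              # high gain for faint fluorescence (assumed)
time.sleep(2)                                 # let the automatic gain control settle
camera.shutter_speed = camera.exposure_speed  # then freeze the exposure time
camera.exposure_mode = 'off'
camera.awb_mode = 'off'                       # freeze the white balance as well
camera.awb_gains = (1.5, 1.5)                 # fixed gains (assumed)
camera.capture('fluorescence_sample.jpg')
camera.close()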

Finally, calculating the overall intensity of each picture gives the linear graph in Figure 8:


Figure 8: Overall luminance as a function of the fluorescent bacteria optical density at 600 nm
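The “overall intensity” used for this graph can be obtained very simply with Pillow: convert each picture to grey levels and average the pixel values. The snippet below is a minimal sketch of that computation; the file names are hypothetical.

# Minimal sketch: overall luminance of each picture as its mean grey level.
# The file names are hypothetical examples.
from PIL import Image, ImageStat

def overall_luminance(path):
    grey = Image.open(path).convert('L')      # 8-bit greyscale
    return ImageStat.Stat(grey).mean[0]       # mean pixel value (0-255)

for name in ['od_0.2.jpg', 'od_0.4.jpg', 'od_0.8.jpg']:
    print(name, overall_luminance(name))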

CONCLUSION

To conclude, here is some advice you should follow if you need to build your own fluorescence sensor, to avoid painful trouble:

- Unless you have a photomultiplier available with the appropriate power supply, make a camera your first choice over a simple photosensitive device such as a photoresistor or a photodiode. Fluorescence analysis is highly limited by the fluorescence intensity: it is easier to see fluorescence than to measure it, and the minimal sensitivity of a photosensitive device will often not suit your application. Photosensitive devices are helpful for quantitative measurements, as they directly link the light intensity to their output, but this is meaningless if you cannot detect low-level fluorescence. If a photosensitive sensor is mandatory for your application, you should use an optical fiber to guide the light to the sensor with minimal losses.

- Be careful with the materials you work with. The plastic you use, for instance, might be fluorescent as well, distorting your results. The material might also reflect the excitation light towards the sensor when you expected it to be confined.

- Look carefully at the characteristics of the excitation light. Brighter is obviously better to boost the fluorescence intensity, but it is not enough. A wide viewing angle (more than 100°) can be useful to illuminate a large surface, but beware of illuminating your sensor. A narrow viewing angle would ensure this does not happen, but the resulting light intensity is usually lower.

- To reduce the noise level further, consider using an excitation light with a central wavelength lower than the optimal one when the excitation spectrum of the fluorescent protein overlaps its emission spectrum, thereby taking advantage of the spread of the excitation spectrum (choosing a longpass filter with a higher cut-off wavelength works as well).

Learn more


About the geometry adopted

One problem in fluorescence analysis, which might seem anecdotal but is actually essential, is the geometry. Depending on the positions of the light sensor, the sample and the light source, the results can vary a lot. The geometry adopted, which is equivalent to epifluorescence, places the camera on an axis perpendicular to the light source (Figure 9):


Figure 9: Block diagram of the fluorescence sensor

In practice, the prototype looks like Figure 2. It is only the prototype used for the experiments; its parts do not fit the final device. However, this prototype was designed to resemble the final device as closely as possible. In particular, a sliding part was printed so that the fluorescent sample is placed at exactly the same location for each experiment, to ensure repeatability. Holes were drilled in the sample holder to place the LED as close as possible to the sample and orthogonally to the camera, to minimize the noise level.

About the camera performance

The camera is located above the sample, which means that the top surface of the sample is pictured, appearing as a circle (Figure 10):


Figure 10: Fluorescence picture

As you can see, the picture is not very sharp. Despite its very high resolution (8 MPx), which ranks it among high-quality smartphones (usually 8 to 12 MPx), its depth of field (DOF) starts at 1 m, which limits the quality of the pictures. The DOF is the minimal distance at which objects appear unblurred in a picture [6]. The closer an object is to the DOF, the sharper its details. Yet, in a fluorescence detection application, the light sensor should be as close as possible to the fluorescent sample to capture as much light as possible. This is why a lens is used: to get the sharpest pictures possible in spite of this limitation of the camera.

Why use this camera and not another one with a better DOF, then? First of all, it is the first camera we had at hand, so we could start working with it quickly. Moreover, it is a user-friendly camera, well suited to the Raspberry Pi we had already planned to use to drive the touchscreen. The hardware has its limitations, but they can easily be compensated for in software, thanks to the Pillow library (Python). The advantage of Pillow over OpenCV, a classic image processing library, is that it is easier to install on Raspbian. Still, if you feel brave enough to install OpenCV [7], we advise you to do so, as OpenCV's functions are faster than Pillow's. We decided not to install OpenCV as we do not have any speed constraint.
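As an example of such a software adjustment, Pillow can crop the capture to the region of interest and apply a mild sharpening before analysis. This is only an illustrative sketch of the kind of correction meant here, not our actual processing chain; the file name and crop box are placeholders.

# Illustrative sketch: cropping and sharpening a capture with Pillow.
# The file name and crop coordinates are placeholders.
from PIL import Image, ImageFilter

img = Image.open('fluorescence_sample.jpg')
region = img.crop((400, 300, 1240, 930))          # keep only the area around the sample
sharpened = region.filter(ImageFilter.SHARPEN)    # mild sharpening to recover some detail
sharpened.save('fluorescence_sample_processed.jpg')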

Finally, the camera is still able to detect faint fluorescent light, even when the human eye struggles to. It can then be useful to plot the histogram of a picture. A histogram counts the pixels sharing the same value, that is to say the same color. For instance, there is no doubt that the sample in Figure 4 is fluorescent, thanks to its histogram (Figure 11):
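A histogram like the ones discussed here can be produced with Pillow's histogram() method, which returns the pixel counts for each of the 256 grey levels. The sketch below plots it with matplotlib; the file name is a placeholder.

# Minimal sketch: grey-level histogram of a fluorescence picture.
# The file name is a placeholder.
from PIL import Image
import matplotlib.pyplot as plt

grey = Image.open('fluorescence_sample.jpg').convert('L')
counts = grey.histogram()                 # 256 values: number of pixels per grey level

plt.bar(range(256), counts, width=1.0)
plt.xlabel('Grey level')
plt.ylabel('Number of pixels')
plt.title('Histogram of the fluorescence picture')
plt.show()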


REFERENCES