{{XMU-China}}
<html lang="en">

<head>
    <meta charset="UTF-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <!-- Use the Edge rendering mode in IE -->
    <meta content="width=device-width,initial-scale=1.0,maximum-scale=1.0,user-scalable=no" id="viewport" name="viewport">
    <meta name="renderer" content="webkit">
    <!-- Rendering mode for the 360 browser -->
    <meta name="format-detection" content="telephone=no, email=no" />
    <meta name="description" content="" />
    <meta name="keywords" content="" />
    <link rel="shortcut icon" href="favicon.ico" type="image/x-icon" />
    <meta name="apple-touch-fullscreen" content="yes" /><!-- Enable WebApp full-screen mode; removes the default iOS toolbar and menu bar -->
    <meta name="apple-mobile-web-app-status-bar-style" content="black" /><!-- iOS status-bar style: default (the default value), black or black-translucent -->
    <meta http-equiv="Cache-Control" content="no-siteapp" /><!-- Keep Baidu from transcoding the page -->
    <meta name="HandheldFriendly" content="true"><!-- Optimization for handheld devices, mainly older browsers (e.g. BlackBerry) that do not recognize viewport -->
    <meta name="MobileOptimized" content="320"><!-- Legacy Microsoft mobile browsers -->
    <meta name="screen-orientation" content="portrait"><!-- Force portrait orientation in UC browser -->
    <meta name="x5-orientation" content="portrait"><!-- Force portrait orientation in QQ browser -->
    <meta name="browsermode" content="application"><!-- UC application mode -->
    <meta name="x5-page-mode" content="app"><!-- QQ application mode -->
    <meta name="msapplication-tap-highlight" content="no"><!-- Disable tap highlight on Windows Phone -->
    <title>Team:XMU-China/Hardware - 2018.igem.org</title>
    <link rel="stylesheet" href="css/interlab.css">
    <link rel="stylesheet" href="css/font.css">
    <link href="http://cdn.bootcss.com/font-awesome/4.7.0/css/font-awesome.min.css" rel="stylesheet">
    <link rel="stylesheet" href="https://2018.igem.org/Team:XMU-China/css/cover?action=raw&ctype=text/css">
    <link rel="stylesheet" href="https://2018.igem.org/Team:XMU-China/css/footer?action=raw&ctype=text/css">
    <link rel="stylesheet" href="https://2018.igem.org/Team:XMU-China/css/nav?action=raw&ctype=text/css">
    <link rel="stylesheet" href="https://2018.igem.org/Team:XMU-China/css/interlab?action=raw&ctype=text/css">
    <link rel="stylesheet" href="https://2018.igem.org/Team:XMU-China/css/font?action=raw&ctype=text/css">
    <link rel="stylesheet" href="https://2018.igem.org/Team:XMU-China/css/nav_mobile?action=raw&ctype=text/css">
    <link rel="stylesheet" href="https://2018.igem.org/Team:XMU-China/css/material-scrolltop?action=raw&ctype=text/css">
</head>
  
<body>
    <header></header>
    <div id="container">
        <header>
            <div class="wrapper cf">
                <nav id="main-nav">
                    <ul class="first-nav">
                        <li class="Project">
                            <a href="#" target="_blank">Project</a>
                            <ul>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Description">Description</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Design">Design</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Results">Results</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Demonstrate">Demonstrate</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Parts">Parts</a></li>
                            </ul>
                        </li>
                    </ul>
                    <ul class="second-nav">
                        <li class="Hardware">
                            <a href="#">Hardware</a>
                            <ul>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Hardware">Overview</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Microfluidic_Chips">Microfluidic Chips</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Fluorescence_Detection">Fluorescence Detection</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Raspberry_Pi">Raspberry Pi</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Application">Application</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Software">Software</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Applied_Design">Product Design</a></li>
                            </ul>
                        </li>
                        <li class="Model">
                            <a href="#">Model</a>
                            <ul>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Model#Summary">Summary</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Model#Thermodynamic_model">Thermodynamic Model</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Model#Fluid_dynamics_model">Fluid Dynamics Model</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Model#Molecular_docking_model">Molecular Docking Model</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Model#The_dynamic_model">Derivation of Rate Equation</a></li>
                            </ul>
                        </li>
                        <li class="Human_Practice">
                            <a href="#">Social Works</a>
                            <ul>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Human_Practices">Human Practice</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Public_Engagement">Engagement</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Collaborations">Collaborations</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Entrepreneurship">Entrepreneurship</a></li>
                            </ul>
                        </li>
                        <li class="Other_Works">
                            <a href="#">Other Works</a>
                            <ul>
                                <li><a href="https://2018.igem.org/Team:XMU-China/InterLab">InterLab</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Improve">Improve</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Safety">Safety</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Measurement">Measurement</a></li>
                            </ul>
                        </li>
                        <li class="Notebook">
                            <a href="#">Notebook</a>
                            <ul>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Notebook">Notebook</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Experiments">Experiments</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Engineering">Engineering</a></li>
                            </ul>
                        </li>
                        <li class="Team">
                            <a href="#">Team</a>
                            <ul>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Team">Members</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Attributions">Attributions</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/Judging">Judging</a></li>
                                <li><a href="https://2018.igem.org/Team:XMU-China/After_iGEM">After iGEM</a></li>
                            </ul>
                        </li>
                    </ul>
                </nav>
                <a class="toggle"><span></span></a>
            </div>
        </header>
    </div>
    <script src="https://2018.igem.org/Team:XMU-China/js/hc-mobile-nav?action=raw&ctype=text/javascript"></script>
    <div class="header">
        <div class="logo">
            <img src="https://static.igem.org/mediawiki/2018/b/b5/T--XMU-China--singlelogo.png">
            <img src="https://static.igem.org/mediawiki/2018/3/35/T--XMU-China--iGEM_logo.png">
        </div>
        <div class="clear"></div>
        <div class="nav">
            <div id="Team">
                <div class="nav-word">Team</div>
                <ul>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Team">Members</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Attributions">Attributions</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Judging">Judging</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/After_iGEM">After iGEM</a></li>
                </ul>
            </div>
            <div id="Notebook">
                <div class="nav-word">Notebook</div>
                <ul>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Notebook">Notebook</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Experiments">Experiments</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Engineering">Engineering</a></li>
                </ul>
            </div>
            <div id="Other_Works">
                <div class="nav-word">Other Works</div>
                <ul>
                    <li><a href="https://2018.igem.org/Team:XMU-China/InterLab">InterLab</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Improve">Improve</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Safety">Safety</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Measurement">Measurement</a></li>
                </ul>
            </div>
            <div id="Human_Practice">
                <div class="nav-word">Social Works</div>
                <ul>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Human_Practices">Human Practice</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Public_Engagement">Engagement</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Collaborations">Collaborations</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Entrepreneurship">Entrepreneurship</a></li>
                </ul>
            </div>
            <div id="Model">
                <div class="nav-word">Model</div>
                <ul>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Model">Summary</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Model#Thermodynamic_model">Thermodynamic Model</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Model#Fluid_dynamics_model">Fluid Dynamics Model</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Model#Molecular_docking_model">Molecular Docking Model</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Model#The_dynamic_model">Derivation of Rate Equation</a></li>
                </ul>
            </div>
            <div id="Hardwork">
                <div class="nav-word">Hardware</div>
                <ul>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Hardware">Overview</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Microfluidic_Chips">Microfluidic Chips</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Fluorescence_Detection">Fluorescence Detection</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Raspberry_Pi">Raspberry Pi</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Hardware#Application">Application</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Software">Software</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Applied_Design">Product Design</a></li>
                </ul>
            </div>
            <div id="Project">
                <div class="nav-word">Project</div>
                <ul>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Description">Description</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Design">Design</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Results">Results</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Demonstrate">Demonstrate</a></li>
                    <li><a href="https://2018.igem.org/Team:XMU-China/Parts">Parts</a></li>
                </ul>
            </div>
            <div id="jiannan">
                <a href="https://2018.igem.org/Team:XMU-China">
                    <img src="https://static.igem.org/mediawiki/2018/6/6f/T--XMU-China--jiannanlogo.png">
                </a>
            </div>
        </div>
    </div>
    <div class="clear"></div>
    <div class="description_banner">
        <div class="word">Hardware</div>
    </div>
    <nav class="Quick-navigation">
        <div class="Quick-navigation_word">
            <img src="https://static.igem.org/mediawiki/2018/e/e6/T--XMU-China--right50.png">
            <a href="#Overview" class="Quick-navigation-item">
                <img id="turn_img" src="https://static.igem.org/mediawiki/2018/5/50/T--XMU-China--right51.png">
                <span id="Quick_A">Overview</span>
            </a>
            <a href="#Microfluidic_Chips" class="Quick-navigation-item">
                <img id="turn_img" src="https://static.igem.org/mediawiki/2018/0/04/T--XMU-China--right52.png">
                <span id="Quick_B">Microfluidic Chips</span>
            </a>
            <a href="#Fluorescence_Detection" class="Quick-navigation-item">
                <img id="turn_img" src="https://static.igem.org/mediawiki/2018/d/d0/T--XMU-China--right53.png">
                <span id="Quick_C">Fluorescence Detection</span>
            </a>
            <a href="#Raspberry_Pi" class="Quick-navigation-item">
                <img id="turn_img" src="https://static.igem.org/mediawiki/2018/1/11/T--XMU-China--right54.png">
                <span id="Quick_D">Raspberry Pi</span>
            </a>
            <a href="#Application" class="Quick-navigation-item">
                <img id="turn_img" src="https://static.igem.org/mediawiki/2018/b/ba/T--XMU-China--right55.png">
                <span id="Quick_F">Application</span>
            </a>
        </div>
    </nav>
        <div class="main Entrepreneurship">
            <section id="Overview" class="js-scroll-step">
                <div class="headline">
                    Overview
                </div>
                <h1>Background</h1>
                <p>Nowadays, most disease-diagnosing methods are confined to specific, delicate testing apparatus, which is expensive, time-consuming and of low sensitivity. The study of <i>point-of-care testing</i> (POCT), also called bedside testing (defined as medical diagnostic testing at or near the time and place of patient care), has attracted great attention because of its convenience, simplicity and high efficiency. The <i>Internet of Things</i> (IoT) is the network of physical devices, vehicles, home appliances and other items embedded with electronics, software, sensors, actuators and connectivity, which enables these objects to connect and exchange data.</p>
                <p>Here we came up with a design: we combined our project's <i>Aptamer-based Cell-free Detection system</i> (ABCD system) with IoT and the above concept of POCT to develop a microfluidic device that is small yet convenient for real-time cancer detection.</p>
                <p>Given that the biomarkers of different cancers overlap and our time was limited, we took pancreatic cancer as an example to certify the feasibility of our testing principle as well as our device.</p>
                <p class="video"><video src="https://static.igem.org/mediawiki/2018/9/9c/T--XMU-China--hardware.mp4" controls></video></p>
                <h1>Designs</h1>
                <p>We named our testing device "<i>Fang</i>". "<i>Fang</i>" means "cube" in Chinese, which describes the appearance of our hardware vividly.</p>
                <p>In general, <i>Fang</i> consists of four parts: the microfluidic chip, the fluorescence detection apparatus, the <i>Raspberry Pi</i> (RPi) and a mobile-phone application (software). Among these four parts, the Raspberry Pi is the main operating system of the entire device; its functions include chip-drive control, image capture and server-client data transmission.</p>
                <p>At the beginning, the sample added into our microfluidic chip, which is designed around the ABCD system, reacts with Cas12a and gives out a fluorescence signal. The camera controlled by the RPi then captures an image, which is transmitted to the App through a socket. Meanwhile, the App converts the image into a visual, readable analysis report based on its internal machine-learning sample database. Finally, the App also enables information sharing between user and doctor.</p>
                <p class="F25"><img src="https://static.igem.org/mediawiki/2018/f/f4/T--XMU-China--hardware-1.png"></p>
                <p class="Figure_word"><strong>Figure 1.</strong> The draft of our hardware, <i>Fang</i>.</p>
                <p>The overwhelming advantage of <i>Fang</i> is that it is a faithful interpretation of POCT. <i>Fang</i> overcomes the drawback that some tumor/cancer detections can only be carried out in large clinical laboratories on large, expensive equipment. It can be used much more widely in community hospitals, especially in remote areas that are short of necessary medical resources. In addition, <i>Fang</i> can also be used at home for risk monitoring by people with familial hereditary diseases.</p>
                <p class="F3"><img src="https://static.igem.org/mediawiki/2018/1/16/T--XMU-China--electronic_circui_testing2.png"></p>
                <p class="Figure_word"><strong>Figure 2.</strong> A photograph of our hardware, <i>Fang</i>.</p>
  
            </section>
            <section id="Microfluidic_Chips" class="js-scroll-step">
                <div class="headline">
                    Microfluidic chips
                </div>
 
                <p>According to the model of our Aptamer-Based Cell-Free testing system (ABCD system), we put forward the following design, using PMMA, a traditional material for hardware, as the layers.</p>
                <p>There are two rooms and two pipelines in our microfluidic chip. The first room is the Incubation Room, which we have coated in advance with the Biotin-Avidin System (BAS). Inspired by the project of <a class="click_here" href="https://2017.igem.org/Team:EPFL/Description/Aptamers">2017 EPFL</a>, we used the <strong>ternary affinity coating (TERACOAT) method</strong><sup>[1]</sup> to pre-treat our microfluidic chip, aiming to coat the aptamers that will compete with the target protein. The second room is the Detection Room, where we placed Cas12a (Cpf1) and DNase Alert in advance so that the protein signal can be converted into a fluorescence signal. In addition, between the Incubation Room and the Detection Room lies a pipeline we named the Pneumatic Valve. It controls liquid flow through changes in motor speed, which shift the balance between air pressure and centrifugal force.</p>
                <p>When a sample is added into the loading slot, it flows into the first room (Incubation Room). The biomarker competes for the complementary sequence bound to the aptamer, and the released sequence flows to the second room (Detection Room) when the rotation speed is increased. In the second room, the complementary sequence activates Cas12a (Cpf1) to cut DNase Alert. The short sequence between the quencher and the fluorophore is cut, so the quencher no longer suppresses the fluorophore. As a result, the Detection Room gives out the green fluorescence we want.</p>
                <p class="F3"><img src="https://static.igem.org/mediawiki/2018/6/67/T--XMU-China--hardware-2.png"><img src="https://static.igem.org/mediawiki/2018/5/52/T--XMU-China--hardware-3.png"></p>
                <p class="Figure_word"><strong>Figure 3.</strong> Cutaway view of the microfluidic chip (upper panel) and the sandwich structure of the Biotin-Avidin System (BAS) (lower panel).</p>
                <h1>References</h1>
                <p class="reference">[1] Piraino F, Volpetti F, Watson C, et al. A Digital–Analog Microfluidic Platform for Patient-Centric Multiplexed Biomarker Diagnostics of Ultralow Volume Samples. <i>ACS Nano</i>, <strong>2016</strong>, 10(1).</p>
            </section>
 
            <section id="Fluorescence_Detection" class="js-scroll-step">
                <div class="headline">
                    Fluorescence detection
                </div>
                <p>Inspired by the principle of the ultraviolet gel imager and by <a class="click_here" href="https://2014.igem.org/Team:Aachen">2014 Aachen</a>, we came up with our fluorescence detection design. Given that the fluorophore of DNaseAlert (<i>IDT</i>) is excited by green light at 535 nm and emits light at 565 nm, we selected green LED lamp beads (emitting around 535 nm) as the light source. A camera with a specific optical filter sits at the same level as the light source; it captures the emission light from the sample and photographs the whole microfluidic chip.</p>
                <p class="F2"><img src="https://static.igem.org/mediawiki/2018/e/e0/T--XMU-China--Fluorescence_detection.png"></p>
                <p class="Figure_word"><strong>Figure 4.</strong> The flow path of fluorescence detection.</p>
                <p>As shown in Figure 5, after the combination of EpCAM and the Cas12a protein, the Detection Room gave out significant green fluorescence compared with the other wells, which were not coated with the corresponding sequence. This proves that our method of converting protein signals into fluorescent signals is feasible.</p>
                <p class="F25"><img src="https://static.igem.org/mediawiki/2018/6/62/T--XMU-China--hardware-4.png"></p>
                <p class="Figure_word"><strong>Figure 5.</strong> Pictures taken by our fluorescence detection apparatus.</p>
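<p>To turn a photographed chip into a positive/negative call, one simple approach is to compare the mean green-channel intensity of the detection well against a blank control. This is only a sketch, not our exact analysis pipeline; the pixel format and the 2.0 threshold factor are illustrative assumptions:</p>

```python
def mean_green(pixels):
    """Mean green-channel value of a list of (R, G, B) pixels."""
    return sum(g for _, g, _ in pixels) / len(pixels)

def is_positive(sample_pixels, blank_pixels, ratio=2.0):
    """Call a well positive when its green signal exceeds the blank
    control by a given factor (2.0 is an illustrative threshold)."""
    return mean_green(sample_pixels) >= ratio * mean_green(blank_pixels)

# Example: a bright-green detection well vs. a dim blank well
sample = [(10, 200, 12), (8, 180, 15)]
blank = [(9, 40, 11), (10, 35, 12)]
print(is_positive(sample, blank))  # True
```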
            </section>
            <section id="Raspberry_Pi" class="js-scroll-step">
                <div class="headline">
                    Raspberry Pi
                </div>
                <p>The Raspberry Pi (RPi) plays an important role as the server: it receives commands from the App and applies them to the peripherals (motor and camera). All the user needs to do is tap a button on the mobile phone, and it is worth mentioning that the RPi then executes the whole detection process automatically. We connected to the RPi over SSH at its IP address and programmed on it directly. We used the Python <i>picamera</i> library for image capture and <i>WiringPi</i> from C, and implemented speed control with a PID speed loop.</p>
                <p class="F3"><img src="https://static.igem.org/mediawiki/2018/6/6f/T--XMU-China--Raspberry_Pi.png"></p>
                <p class="Figure_word"><strong>Figure 6.</strong> Circuit diagram of the Raspberry Pi operating system.</p>
                <p class="F3"><img src="https://static.igem.org/mediawiki/2018/9/9e/T--XMU-China--circuit2.png"></p>
                <p class="Figure_word"><strong>Figure 7.</strong> The physical map of the Raspberry Pi operating system.</p>
                <p>Speed acquisition is realized by the encoder and an interrupt callback. The RPi reads the square wave returned by the encoder through a pin: each level change triggers an external interrupt, and the interrupt function counts the pulses and then resets the counter. The pulse count over a fixed interval converts into the motor's current speed. The maximum output voltage of the PWM pin is only 3.3 V, but this signal drives a regulator stage powered by a 12 V battery, realizing 0-12 V voltage regulation, that is, speed regulation in the range of 0-500 rpm. <i>NodeJS</i> receives signals and parameters, then calls the corresponding file and executes the corresponding program, which lets the RPi respond to the phone's commands. For more details about the Raspberry Pi code, please see <a class="click_here" href="https://github.com/igemsoftware2018/Team_XMU_China">our GitHub page</a>.</p>
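<p>The pulse-counting and PID logic described above can be sketched in hardware-independent Python. The encoder resolution, PID gains and time step below are illustrative assumptions, not our calibrated values:</p>

```python
def pulses_to_rpm(pulse_count, pulses_per_rev, interval_s):
    """Convert an interrupt pulse count over a fixed interval to rpm."""
    return pulse_count / pulses_per_rev / interval_s * 60.0

class SpeedPID:
    """Minimal PID controller producing a PWM duty cycle (0-100 %)."""
    def __init__(self, kp, ki, kd, setpoint_rpm):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint_rpm
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured_rpm, dt):
        error = self.setpoint - measured_rpm
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        duty = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(0.0, min(100.0, duty))  # clamp to a valid PWM duty

# 140 pulses from a 20-pulse/rev encoder in 1 s -> 420 rpm
print(pulses_to_rpm(140, 20, 1.0))  # 420.0
pid = SpeedPID(kp=0.2, ki=0.05, kd=0.0, setpoint_rpm=280)
print(pid.update(measured_rpm=0, dt=0.1))  # large duty while far below setpoint
```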
                <p>From our experimental results, the time required for the sample to flow from the Sample Well to the Incubation Room is 15-20 seconds, at a corresponding threshold rotation speed of 280 rpm. After the sample competed for 30 minutes in the Incubation Room, we increased the speed to 320 rpm, letting the sample flow from the Incubation Room to the Detection Room in around 30 seconds. (Our speed design is calculated on our <a class="click_here" href="https://2018.igem.org/Team:XMU-China/Model#Fluid_dynamics_model">Fluid Dynamics Model page</a>; guided by those results, we also considered practical conditions obtained from experiments.)</p>
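<p>The spin protocol above amounts to a small sequence of (speed, duration) stages. A sketch: the 280/320 rpm thresholds and 30-minute incubation come from the text, while the stage names and the 18 s loading time (midpoint of the quoted 15-20 s) are illustrative:</p>

```python
def detection_protocol():
    """Yield (stage, rpm, duration_s) steps of the spin protocol."""
    yield ("load sample into Incubation Room", 280, 18)  # 15-20 s quoted
    yield ("incubate / compete", 280, 30 * 60)           # 30 minutes
    yield ("transfer to Detection Room", 320, 30)        # ~30 s quoted

for stage, rpm, seconds in detection_protocol():
    print(f"{stage}: {rpm} rpm for {seconds} s")
```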
            </section>
            <section id="Application" class="js-scroll-step">
                <div class="headline">
                    Application
                </div>
                <p>To let users control our hardware conveniently, we designed our own App to be used with <i>Fang</i>. It is worth mentioning that our App combines artificial intelligence technology (AIT) and blockchain technology, providing functions for an excellent user experience. As the navigation bar shows, the App consists of four function modules: Control, Analysis, Interaction and About.</p>
                <p class="F2"><img src="https://static.igem.org/mediawiki/2018/e/e5/T--XMU-China--hardware-5.png"></p>
                <p class="Figure_word"><strong>Figure 8.</strong> Interface design of our application.</p>
                <p>The first function module, Control, is used to control the operation of <i>Fang</i>. A switch turns the motor on or off, and the PHOTO button makes the camera capture the fluorescent signal and display it on the mobile phone. In addition, users are free to control the rotation speed (Auto/Set): the SET button changes the speed, and the REQUIRE button queries the current speed.</p>
                <p>The second function module, Analysis, uses AIT to analyze the images that <i>Fang</i> has photographed. We chose TensorFlow, the second-generation artificial-intelligence learning system developed by Google based on DistBelief. When a user takes a photo, it is sent to our cloud server, which holds models trained on large numbers of samples with a convolutional neural network (CNN). The uploaded photo is matched against these models by machine learning, and the user obtains accurate analysis results efficiently from the App.</p>
                <p>The third function module, Interaction, is based on blockchain technology. On our cloud server we have established a private Ethereum chain, and we issue our own digital currency so that users can deploy their own smart contracts containing all their diagnostic messages. There is a one-to-one correspondence between a user and a doctor: when a smart contract is deployed, only one doctor can confirm it. The doctor can then write therapeutic methods or suggestions into the smart contract for the patient to refer to. Thanks to its transparency, openness and immutability, a medical certificate (smart contract) corresponds to one specific patient and one specific doctor, which effectively avoids medical disputes such as misdiagnosis claims and denial of responsibility.</p>
                <p>The last function module, About, contains an introduction to our team and some user information.</p>
                <p>Following the functions introduced above comes the communication mechanism of our App. It is based on three-party communication among the Raspberry Pi, the App and the cloud server. An instruction sent from the App first arrives at the cloud server and is then forwarded to the Raspberry Pi; instructions from the Raspberry Pi travel the same way in reverse. Moreover, the machine learning and the blockchain mining mechanism both run on the cloud server, which enhances operational efficiency and real-time performance.</p>
            </section>
        </div>
        <script src="https://2018.igem.org/Team:XMU-China/js/right?action=raw&ctype=text/javascript"></script>
        <button class="material-scrolltop" type="button"></button>
        <script>
        window.jQuery || document.write('<script src="js/jquery-1.11.0.min.js"><\/script>')
        </script>
        <script src="https://2018.igem.org/Team:XMU-China/js/material-scrolltop?action=raw&ctype=text/javascript"></script>
        <div class="footer">
          <div class="footer_top">
                    <img src="https://static.igem.org/mediawiki/2018/3/3d/T--XMU-China--xmu_is_different.png">
                    <ul>
                        <li><a href="https://2018.igem.org/Team:XMU-China">Home</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Model">Model</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Parts">Parts</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Entrepreneurship">Entrepreneurship</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Attributions">Attributions</a></li>
                    </ul>
                    <ul>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Design">Design</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Hardware">Hardware</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Human_Practices">Human Practices</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Notebook">Notebook</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Judging">Judging</a></li>
                    </ul>
                    <ul>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Results">Results</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Software">Software</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Collaborations">Collaborations</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/Members">Members</a></li>
                        <li><a href="https://2018.igem.org/Team:XMU-China/After_iGEM">After iGEM</a></li>
                    </ul>
                </div>
        </div>
        <div class="bottom"></div>
</body>

</html>

Latest revision as of 03:07, 18 October 2018


Hardware
Overview

Background

Nowadays, most disease-diagnosing methods depend on specific, delicate testing apparatus that is expensive, time-consuming and of low sensitivity. Point-of-care testing (POCT), also called bedside testing (defined as medical diagnostic testing at or near the time and place of patient care), has therefore attracted intense interest for its convenience, simplicity and high efficiency. The Internet of Things (IoT) is the network of physical devices, vehicles, home appliances and other items embedded with electronics, software, sensors, actuators and connectivity, which enables these objects to connect and exchange data.

Here we came up with a design that combines our project, the Aptamer-Based Cell-free Detection system (ABCD system), with IoT and the POCT concept above, so as to develop a microfluidic device that is small yet convenient for real-time cancer detection.

Given that the biomarkers of different cancers overlap and our time was limited, we took pancreatic cancer as an example to verify the feasibility of our testing principle as well as our device.

Designs

We named our testing device "Fang". "Fang" in Chinese means "cube", which vividly describes our hardware's appearance.

In general, Fang consists of four parts: the microfluidic chip, the fluorescence detection apparatus, the Raspberry Pi (RPi) and a mobile-phone application (the software). Among these, the Raspberry Pi is the main operating system of the entire device; its functions include chip drive control, image capture and server-client data transmission.

At the beginning, the sample is added into our microfluidic chip, which is designed on the basis of the ABCD system; it reacts with Cas12a and gives out a fluorescence signal. The RPi-controlled camera then captures an image, which is transmitted to the App through a socket. Meanwhile, the App converts the image into a visual, readable analysis report using its internal machine-learning sample database. Finally, the App also provides information sharing between user and doctor.
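Because a TCP socket delivers a byte stream with no message boundaries, the image transfer from RPi to App needs a framing convention. A minimal sketch, assuming a hypothetical 4-byte length prefix (not necessarily the exact protocol our code uses):

```python
import struct

HEADER = struct.Struct(">I")  # assumed 4-byte big-endian length prefix

def frame_image(jpeg_bytes: bytes) -> bytes:
    """Prefix the JPEG payload with its length so the receiver
    knows where the frame ends on the TCP byte stream."""
    return HEADER.pack(len(jpeg_bytes)) + jpeg_bytes

def read_frame(buffer: bytes):
    """Return (payload, remaining_bytes) once a full frame has arrived,
    or None if the buffer is still incomplete."""
    if len(buffer) < HEADER.size:
        return None
    (length,) = HEADER.unpack_from(buffer)
    if len(buffer) < HEADER.size + length:
        return None
    start = HEADER.size
    return buffer[start:start + length], buffer[start + length:]
```

The receiver simply appends incoming bytes to a buffer and calls `read_frame` until it returns a complete payload.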

Figure 1. The draft of our hardware, Fang.

The overwhelming advantage of Fang is that it embodies the POCT concept. Fang overcomes the limitation that some tumor/cancer tests can only be performed in large clinical laboratories on large, expensive equipment. It can therefore be used more widely in community hospitals, especially in remote areas short of necessary medical resources. In addition, people with familial hereditary diseases can use Fang for risk monitoring at home.

Figure 2. A photograph of our hardware, Fang.

Microfluidic chips

Following the model of the Aptamer-Based Cell-Free testing system (ABCD system), we put forward the design below, using PMMA, a traditional hardware material, for the layers.

There are two rooms and two pipelines in our microfluidic chip. The first room is the Incubation Room, which we have already coated with the biotin-avidin system (BAS). Inspired by the 2017 EPFL project, we pre-treated the chip with the ternary affinity coating (TERACOAT) method [1] in order to immobilize the aptamers that will compete with the target protein. The second room is the Detection Room, where we place Cas12a (Cpf1) and DNase Alert in advance so that the protein signal is converted into a fluorescence signal. In addition, between the Incubation Room and the Detection Room there is a pipeline we named the Pneumatic Valve: it controls the liquid flow through changes in motor speed, which alter the balance between air pressure and centrifugal force.

When a sample is added into the loading slot, it flows into the first room (the Incubation Room). There the biomarker competes with the complementary sequence for the aptamer, and the displaced complementary sequence flows into the second room (the Detection Room) when the rotation speed is increased. In the second room, the complementary sequence activates Cas12a (Cpf1) to cut the DNase Alert reporter: the short sequence between the quencher and the fluorophore is cleaved, so the quencher no longer suppresses the fluorophore. As a result, the Detection Room gives out the green fluorescence we want.

Figure 3. Cutaway view of the microfluidic chip (upper panel) and the sandwich structure of the biotin-avidin system (BAS) (lower panel).

References

[1] Piraino F, Volpetti F, Watson C, et al. A Digital–Analog Microfluidic Platform for Patient-Centric Multiplexed Biomarker Diagnostics of Ultralow Volume Samples. ACS Nano, 2016, 10(1).

Fluorescence detection

Inspired by the principle of the ultraviolet gel imager and by the 2014 Aachen team, we designed the fluorescence detection module. Given that the fluorophore of DNaseAlert (IDT) is excited by green light at 535 nm and emits at 565 nm, we selected green LED lamp beads (emitting around 535 nm) as the light source. The camera, fitted with a suitable optical filter, sits level with the light source; it collects the emission light from the sample and photographs the whole microfluidic chip.

Figure 4. The Flow Path of Fluorescence Detection.

As shown in the figure below (Figure 5), after the combination of EpCAM and the Cas12a protein, the detection room gave out significant green fluorescence compared with the wells that were not coated with the corresponding sequence. This proves that our method of converting protein signals into fluorescence signals is feasible.

Figure 5. Pictures taken by our fluorescence detection module.

Raspberry Pi

The Raspberry Pi (RPi) plays an important role as the server: it receives commands from the App and applies them to the peripherals (motor and camera). All the user needs to do is tap a button on the mobile phone; it is also worth mentioning that the device can execute the whole detection process automatically. We connected to the RPi over SSH at its IP address and programmed on it directly. We call the picamera library from Python for image capture and WiringPi from C for motor control, and we realized speed control with a PID speed loop.
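The PID speed mode mentioned above can be sketched as a discrete PID loop. This is a minimal illustration, not our tuned controller; the gains and the 500 rpm output ceiling are placeholder values:

```python
class PID:
    """Discrete PID controller for motor speed (illustrative gains)."""

    def __init__(self, kp, ki, kd, out_min=0.0, out_max=500.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint_rpm, measured_rpm, dt):
        """One control step: error -> P + I + D terms -> clamped command."""
        error = setpoint_rpm - measured_rpm
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, out))
```

On the real device, `measured_rpm` would come from the encoder and the clamped output would be written to the PWM pin each cycle.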

Figure 6. Circuit diagram of the Raspberry Pi operating system.

Figure 7. Photograph of the Raspberry Pi operating system.

Speed acquisition is realized with the encoder and an interrupt callback. The RPi reads the square wave returned by the encoder on a GPIO pin: each electrical level change triggers an external interrupt, and the interrupt function increments a counter that is periodically read and reset. The pulse count over a fixed time window is then converted into the current motor speed. The maximum PWM output voltage of the RPi is only 3.3 V, but the signal is applied through a regulator stage fed by a 12 V battery, which realizes voltage regulation from 12 V down to 0 V, i.e., speed regulation over the range 500-0 rpm. Node.js receives the signals and parameters from the phone, calls the corresponding file and executes the corresponding program, so the RPi responds to the mobile phone's commands. For more details on the Raspberry Pi code, please see our GitHub page.
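The two conversions described above are simple arithmetic: pulses counted in a fixed window become rpm, and a target rpm becomes a PWM duty cycle over the 0-500 rpm range. A sketch, where the pulses-per-revolution value is a placeholder for the actual encoder resolution:

```python
def pulses_to_rpm(pulse_count, window_s, pulses_per_rev=20):
    """Convert edges counted in a fixed sampling window to motor rpm.
    pulses_per_rev is a placeholder for the real encoder resolution."""
    revolutions = pulse_count / pulses_per_rev
    return revolutions / window_s * 60.0

def rpm_to_duty(target_rpm, max_rpm=500.0):
    """Map a target speed onto a PWM duty cycle in [0, 1]; the driver
    stage scales the 3.3 V PWM signal up to the 12 V motor supply."""
    return max(0.0, min(1.0, target_rpm / max_rpm))
```

For example, 100 pulses in a 0.5 s window with a 20-pulse encoder corresponds to 600 rpm, and a 250 rpm target maps to a 50% duty cycle.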

From our experimental results, at the threshold rotation speed of 280 rpm the sample takes 15-20 seconds to flow from the Sample Well to the Incubation Room. After the sample has competed for 30 minutes in the Incubation Room, we increase the speed to 320 rpm, allowing the sample to flow from the Incubation Room to the Detection Room, which takes around 30 seconds. (The speeds were derived in our Fluid Dynamics Model page; guided by those results, we also took into account practical conditions observed in experiments.)
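The automated run described above can be summarized as a speed schedule that the RPi steps through. The timings and speeds are those quoted in the text; `speed_at` is a hypothetical helper for illustration, and the assumption that the 280 rpm speed is held during incubation (below the 320 rpm transfer threshold) is ours:

```python
# (phase name, duration in seconds, rotation speed in rpm), per the text:
# ~20 s loading at 280 rpm, 30 min incubation, ~30 s transfer at 320 rpm.
SCHEDULE = [
    ("load_to_incubation", 20, 280),
    ("incubate", 30 * 60, 280),   # assumption: held below 320 rpm
    ("transfer_to_detection", 30, 320),
]

def speed_at(t_seconds):
    """Return the commanded rpm at elapsed time t, or 0 after the run."""
    elapsed = 0
    for _name, duration, rpm in SCHEDULE:
        if t_seconds < elapsed + duration:
            return rpm
        elapsed += duration
    return 0
```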

Application

To let users control the hardware conveniently, we designed our own App to be used with Fang. It is worth mentioning that the App combines AIT (Artificial Intelligence Technology) and blockchain technology, providing functions that make for an excellent user experience. As the navigation bar shows, the App consists of four function modules: Control, Analysis, Interaction and About.

Figure 8. Interface design of our application.

The first function module, Control, is used to operate Fang. A switch turns the motor on and off, and the PHOTO button makes the camera capture the fluorescent signal and display it on the mobile phone. In addition, users are free to control the rotation speed (Auto/Set): the SET button changes the speed and the REQUIRE button queries the current speed.

The second function module, Analysis, takes advantage of AIT to analyze the images that Fang has photographed. The technology we chose is TensorFlow, the second-generation machine learning system developed by Google as the successor to DistBelief. When a user takes a photo, it is sent to our cloud server, where models trained on large numbers of samples with a CNN (Convolutional Neural Network) are waiting. The uploaded photo is classified against these models, and the user obtains accurate analysis results efficiently from the App.
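On the server side, the last step of the CNN analysis amounts to turning the network's raw output scores into a label and a confidence for the report. A minimal sketch of that step in plain Python, with illustrative class names (the real model and labels live on our cloud server):

```python
import math

LABELS = ["negative", "positive"]  # illustrative class names

def softmax(logits):
    """Convert raw CNN output scores into probabilities."""
    m = max(logits)                       # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return (label, confidence) for one uploaded photo's scores."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]
```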

The third function module, Interaction, is based on blockchain technology. On our cloud server we have established a private Ethereum chain, and we issue our own digital currency so that users can deploy smart contracts of their own containing all of their diagnostic messages. There is a one-to-one correspondence between a user and a doctor: when a smart contract is deployed, only one doctor is allowed to confirm it. The doctor can then write therapeutic methods or suggestions into the smart contract for the patient to consult. Because the chain is transparent, open and immutable, each medical certificate (smart contract) corresponds to one specific patient and one specific doctor, which effectively prevents medical disputes such as misdiagnosis and unscrupulous repudiation.
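The one-patient-one-doctor rules described here can be modeled independently of Ethereum. The toy class below mimics the intended contract semantics (a single confirming doctor, an append-only record) in plain Python; it is an illustration of the rules, not the Solidity contract we deployed:

```python
class DiagnosisContract:
    """Toy model of the smart contract rules: one patient, exactly one
    confirming doctor, and an append-only (immutable) record."""

    def __init__(self, patient, diagnostic_message):
        self.patient = patient
        self.records = [("patient", diagnostic_message)]
        self.doctor = None

    def confirm(self, doctor):
        """Only the first doctor to confirm is bound to this contract."""
        if self.doctor is not None:
            raise PermissionError("a doctor has already confirmed")
        self.doctor = doctor

    def add_advice(self, doctor, advice):
        """Only the confirming doctor may append advice; nothing is edited."""
        if doctor != self.doctor:
            raise PermissionError("only the confirming doctor may write")
        self.records.append((doctor, advice))
```

On the real chain, these checks would be enforced by the contract code and the immutability by the blockchain itself.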

The last function module, About, contains an introduction to our team and some user information.

Beyond the functions introduced above, the communication mechanism of our App deserves mention. It is based on three-party communication among the Raspberry Pi, the App and the cloud server. An instruction sent from the App first arrives at the cloud server, which forwards it to the Raspberry Pi; data sent from the Raspberry Pi travels the reverse path in the same way. Moreover, the machine learning and the blockchain mining mechanism both run on the cloud server, which improves operational efficiency and real-time performance.