Team:UC San Diego/Design

Product Design

Our Thought Process

As our team endeavored to take our wetlab idea and tailor it to real-world parameters, we realized it would be important to gain a fundamental understanding of the problem we were trying to solve, examine the clear flaws of the status quo and existing solutions, and then develop a solution that addresses these pain points. Along the way, we had to revise our designs to incorporate our stakeholders’ needs and address them in a comprehensive manner. Considerations of broader implementation and lifecycle use also helped optimize our design decisions. As a result of these many interactions, we created Epinoma, a modular beginning-to-end workflow for non-invasive cancer detection that uses machine learning to aid biomarker discovery, a functional assay that uses engineered proteins and principles of synthetic biology to detect specific epigenetic determinants, and a digital health platform that helps streamline doctor-patient communication. This journey would have been impossible without the input of many domain experts immersed in the diagnostic pathway, digital health experts, graduate students in the BLUE LINC incubator, industry professionals (researchers and department heads at Genentech and Roche), and social entrepreneurs and innovators at the TATA Institute for Genetics and Society.

[Figure: Product design flow]

Framing the problem and examining existing solutions

Talking to clinical researchers and diagnostic experts throughout the UC San Diego Health System helped us identify some of the fundamental problems with tissue specimen analysis. One of the primary concerns that healthcare professionals raised is that tissue specimen analysis cannot capture the inherent molecular heterogeneity of tumors or the ability of cancer genomes to evolve. From a diagnostician’s perspective, this decreases the method’s predictive value, making it suboptimal. In addition, doctors must often make a decision regarding a patient’s biopsy even though it carries the inherent risk of spreading the tumor and further complicating the case. From a patient’s perspective, tissue biopsy is often invasive and painful, and can also pose a serious economic burden in the current healthcare system.

As a result, our team turned to research in the liquid biopsy space, which focuses on non-invasive cancer detection techniques. After examining some of the main players and methods in the field, we realized that although liquid biopsy addresses the pain point of invasiveness, it still does not fix the issue of diagnostic accuracy. Our team identified several bottlenecks in the commercialization of liquid biopsy tests: (1) analysis of cell-free DNA in urine, blood, and saliva is often inaccurate because the low concentrations of DNA cannot be properly analyzed with existing technologies, and (2) analysis of the methylation signal often relies on treating DNA samples with sodium bisulfite, which can induce random breaks in DNA fragments and lead to incomplete deamination and inaccurate results. Conversations with several medical institutions led us to realize that our methodology should eliminate the chemical treatment step while still retaining a quantitative output.

With regards to the disease we were working with, hepatocellular carcinoma (HCC), we also wanted to quantify the potential impact that our solution would have. With 700,000 existing cases of HCC and an additional 43,000 cases predicted in America in the upcoming year, we felt it was important to address this sizable patient population. Under current healthcare economics and reimbursement strategies, a simple needle biopsy costs almost $500 and a surgical biopsy nearly $4,000; at our predicted price point of $250, we could spare many individuals a significant economic burden and ease the strain on healthcare systems around the world.
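As a rough, back-of-the-envelope illustration of this claim, the sketch below multiplies the figures quoted above; the assumption that every predicted U.S. case would otherwise receive a needle biopsy (rather than a surgical one) is purely illustrative.

```python
# Back-of-the-envelope savings estimate using the figures quoted above.
# Illustrative assumption: each Epinoma test replaces one needle biopsy.
needle_biopsy_cost = 500      # USD, approximate cost of a simple needle biopsy
epinoma_price = 250           # USD, our predicted price point
new_us_cases = 43_000         # predicted new HCC cases in America next year

savings_per_patient = needle_biopsy_cost - epinoma_price
print(f"Savings per patient: ${savings_per_patient}")
print(f"Aggregate savings across predicted US cases: ${savings_per_patient * new_us_cases:,}")
```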

[Figures: biopsy; chemical treatment]

Our interactions with hepatocellular carcinoma patients and healthcare professionals demonstrated that traditional tissue biopsies pose a number of problems. They are often invasive, can risk spreading the cancer in certain scenarios, and are fairly expensive. Most importantly, tissue specimens do not accurately capture the molecular heterogeneity of tumors or the ability of cancer genomes to evolve. With regards to methylation analyses, there are a number of challenges around the accuracy of the reading. Chemical treatment of the DNA can cause random breaks and errors and lead to faster degradation, and incomplete conversion of unmethylated cytosines can lead to an incorrect calculation of the beta methylation value.
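For reference, the beta methylation value mentioned above is the fraction of methylated signal at a CpG site. The sketch below shows the standard array-style calculation; the offset term and the example intensities are generic conventions, not values from our assay.

```python
def beta_value(methylated: float, unmethylated: float, offset: float = 100.0) -> float:
    """Beta methylation value for one CpG site: beta = M / (M + U + offset),
    where M and U are methylated and unmethylated signal intensities and the
    offset stabilizes low-intensity measurements. Incomplete bisulfite conversion
    leaves unmethylated cytosines unconverted, so they are misread as methylated,
    inflating M and therefore beta."""
    m = max(methylated, 0.0)
    u = max(unmethylated, 0.0)
    return m / (m + u + offset)

# Example: a promoter CpG with a strong methylated signal
print(beta_value(methylated=4500, unmethylated=500))  # ~0.88
```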

Incorporating synthetic biology into our product design

After reflecting on our initial stakeholder interactions, our team started thinking about how to address the issues at hand. After brainstorming several solutions, we determined that the first crucial pivot would be to identify and develop a diagnostic metric that could serve as a more consistent, useful assessment of an individual’s health. Thus, we decided that our solution would need to eliminate the invasiveness of tissue biopsy, be cost-effective, achieve clinically acceptable diagnostic accuracy, provide a quantitative output, and pose no threat in case of environmental release.

As we learned more about hepatocellular carcinoma, we realized that our tool would be extremely applicable in local and international low-resource communities. To understand what it would take to develop a clinically viable test for a variety of environments, we talked to Mr. Manoj Kumar, head of the TATA Institute for Genetics and Society, an NGO and philanthropic social incubator that seeks to bring better healthcare tools to low-resource communities in India. From this, we learned about the ASSURED criteria that are crucial to deploying point-of-care devices in low-resource communities; although we were not necessarily planning on a point-of-care (POC) device, we felt that many of these criteria could inform our design. As such, we realized our device had to be affordable, sensitive (low rates of false negatives), specific (low rates of false positives), user-friendly, rapid and robust (rigorously tested in different settings), equipment-free, and deliverable. Because a large number of these cases are brought to centralized government hospitals and associated clinical labs, we felt that the equipment-free requirement was not as crucial for the deployment of our project. In fact, given that the desired readout was a quantitative signal, it would be almost impossible to meet that requirement.

Our team decided that incorporating synthetic biology would be essential. Engineering a methyl-binding domain (MBD) protein with a fluorescent reporter would form the core of our biosensor, as it allowed for an easy, safe transduction element that avoided the issues of existing alternatives, namely invasiveness and inaccurate readout.

It was also crucial that we arrive at a solution that would be easy to acquire and analyze; the most important factors here were turnaround time, ease of materials acquisition, and overall detection accuracy.

[Figures: smartphone diagnostics; hydrogel; gold nanoparticle]

One idea our team had was to use smartphone diagnostics and RT-LAMP assays in a separate incubator unit that could quickly detect whether the genomic DNA sample was hypermethylated in the region of interest; however, this approach failed our ease-of-materials-acquisition criterion and did not allow for the specificity of MBD binding to the promoter region. Our team also struggled to decide between a fluorescent readout and a colorimetric readout; a colorimetric readout would have used hydrogels as color barometers, but distinguishing different shades of the hydrogel would be difficult and would not yield a truly quantitative readout. Another option for enhancing specificity was to use gold nanoparticles; however, this would be too expensive to implement in both clinical settings and point-of-care situations.

How did we use synthetic biology to achieve our target product criteria?

[Figures: Epinoma design]

Affordable: Using synthetic biology and a cell-free system requires less instrumentation, significantly driving down cost.

Specific: To give the MBD protein the specificity needed for higher diagnostic accuracy, our team needed a binding mechanism that would direct the MBD only to promoter methylation rather than to the entire gene body. Combining synthetic biology with materials science allowed us to achieve this goal.

Sensitive: Signal amplification strategies also played an important role in assuring diagnostic accuracy. Here synthetic biology was crucial: our exonuclease-driven amplification strategy avoids the inaccurate readings that treatment with sodium bisulfite can sometimes produce.

User-friendly: Although synthetic biology itself did not directly make the assay user-friendly, much of the biosafety of our system can be attributed to its cell-free implementation. This was also very important to clinical workers, who appreciated that our approach reduces a traditionally complex workflow.

Robust: Because the workflow is cell-free and relies only on standard laboratory equipment, it can be deployed across a wide range of settings.

Equipment-free: Although our particular clinical use case relies on some standard laboratory equipment, future iterations of our device will rely on an in-house microfluidic chamber to perform lab-on-chip reactions. We believe that the intersection of regulatory gene networks and the analytical capability of microfluidic chips could be a potent combination.

To improve the diagnostic accuracy of our assay (its sensitivity and specificity), we also felt that synthetic biology provided some inherent advantages. Discussions with individuals in academia and clinical labs gave us confidence that our signal amplification strategies, including an exonuclease-driven strategy, would work; they also allowed us to design several optimized circuits in addition to our baseline MBD-GFP construct. Foundational literature suggested that multimerization would enhance the binding ability of our MBD protein, which would be crucial for detection at attomolar levels. Discussions with materials scientists also pointed us toward a graphene oxide platform that could boost the signal-to-noise ratio of the assay. A follow-up conversation with several doctors associated with the TIGS initiative led us to add a microfluidic system to enable high-throughput analysis and reduce the overall complexity of our workflow.

Expanding our workflow and developing novel use cases

Our initial product, however, consisted of a single diagnostic assay; although several technological innovations strengthened its predictive power, our team was dedicated to addressing some of the other key lags in the development of clinically available liquid biopsy tests. Our interactions with the following individuals brought fresh perspective on how to broaden our impact.

Dr. Jian Dai, senior data scientist at Genentech

Part of our needs-finding had uncovered a critical lag in the development of liquid biopsy tests: researchers and clinicians were unsure which biomarkers to analyze for different diseases. Our team wanted to address this crucial gap in existing liquid biopsy workflows. Talking to Dr. Dai gave us additional perspective on the drylab side and made us aware of the tools needed for data analysis. Afterwards, our team developed a methodology for a machine learning framework to aid biomarker discovery. By using techniques such as Random Forest and Lasso-Cox, we can discover the optimal gene panel combinations for detecting promoter methylation across a subset of patients based on the disease of interest. This part of our workflow can integrate with any existing methylome dataset and will provide disease-specific markers.
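To make this step concrete, the sketch below shows a Random Forest feature-ranking pass of the kind described above, run on a synthetic matrix of CpG beta values; the data, panel size, and hyperparameters are illustrative assumptions, and the actual methodology is documented on the Modeling page.

```python
# Minimal sketch of Random Forest-based marker ranking on methylation data.
# Assumes a (patients x CpG sites) matrix of beta values with case/control labels;
# the synthetic data below stands in for a real methylome dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_patients, n_cpgs = 200, 1000
X = rng.beta(2.0, 2.0, size=(n_patients, n_cpgs))      # beta values in [0, 1]
y = rng.integers(0, 2, size=n_patients)                # 1 = HCC, 0 = control
X[y == 1, :10] = np.clip(X[y == 1, :10] + 0.3, 0, 1)   # pretend 10 CpGs are hypermethylated in HCC

forest = RandomForestClassifier(n_estimators=500, random_state=0)
forest.fit(X, y)

# Rank CpG sites by impurity-based feature importance and keep a small candidate panel.
panel = np.argsort(forest.feature_importances_)[::-1][:10]
print("Candidate CpG panel (column indices):", panel)
```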

Dr. Mikael Eliasson, head of Global Product Development & Strategic Innovation at Genentech

Our conversation with Dr. Eliasson identified another critical lag in the traditional diagnostics journey: in post-treatment therapy, there is a significant decrease in communication between doctors and patients, which can hinder one’s ability to determine whether the initial treatment was successful. In talking with Dr. Eliasson, we realized we could capitalize on trends in digital health and implement a digital health platform. In addition, we developed a completely novel use case: treating hypermethylation as a continuous variable that can be correlated with tumor burden to assess the effectiveness of a patient’s treatment.

Dr. Matthias Essenpreis, CTO at Roche Diagnostics

We also had an opportunity to meet with the CTO of Roche Diagnostics, Dr. Matthias Essenpreis. Talking to him helped us understand the value of a platform-based business that facilitates the exchange of information and services between two different stakeholder groups. Analysis of patient data (at both the early clinical-screening and the post-therapy stage) could be given to healthcare professionals to guide their decision-making, and it could also feed into pharmaceutical companies’ efforts to develop more effective treatments going forward. Dr. Essenpreis explained that a multi-sided platform business model would help address stakeholder needs more effectively, and emphasized that the value lies in the insight created by analyzing the data.

From this, we were able to construct two thorough lifecycle cases and the full clinical protocol.

[Figure: Epinoma design]

In addition to the traditional use case for novel diagnostics, our conversations with Dr. Eliasson helped us realize that Epinoma could have great value as a post-therapy response tool. Such a tool does not currently exist on the market and represents an exciting prospect for our workflow. We subsequently developed a lifecycle centered around post-therapy response.

[Figure: Epinoma design]

These interactions helped expand our workflow from a technologically innovative diagnostic assay into a full-scale diagnostics platform that can be implemented at multiple points in the patient care journey, (1) aids in biomarker discovery to address an industry-wide lag, and (2) uses digital health data collection and analysis to generate clinical utility beyond the initial interaction.

Technical Documentation

Because our product would be an in vitro diagnostic (IVD) device, it must go through the FDA and CLIA regulatory approval process described below. As such, we are not allowed to demonstrate the validity of our solution on humans or human samples. Instead, our team has thoroughly documented the process on the wiki and has provided internal documentation as well. We also worked with clinical researchers in the UCSD Health System to develop a clinical protocol and to envision what a kit, if implemented, would consist of.

The first part of our workflow is the biomarker discovery tool. Our team analyzed 485,000 unique CpG sites to determine the optimal panel for biomarker detection. The data are available upon request due to an agreement with UCSD, which hosts one of the United States' largest academic medical centers. After building our model (the methodology is given on the Modeling page), we generated a receiver operating characteristic (ROC) curve with an area under the curve (AUC) of 0.99 (the highest possible value is 1); to our knowledge, we are the first team in the iGEM competition to have generated an epigenetics-based model with this level of accuracy.
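The sketch below shows how an ROC curve and its AUC are computed from model scores on held-out samples; the labels and scores here are synthetic placeholders rather than our patient data, which is available upon request as noted above.

```python
# Minimal sketch of the ROC/AUC evaluation step on synthetic held-out data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=100)                 # 1 = HCC, 0 = control
scores = np.where(y_true == 1,
                  rng.normal(0.8, 0.1, size=100),     # model scores for cases
                  rng.normal(0.3, 0.1, size=100))     # model scores for controls

fpr, tpr, thresholds = roc_curve(y_true, scores)      # points on the ROC curve
print("AUC:", roc_auc_score(y_true, scores))          # area under the ROC curve
```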

The second part of our workflow is the wetlab assay. Although all experimental results are explained in the Results section of our wiki, we can point to two pieces of validation: (1) the successful EMSA results for the HRP-MBD hybrid and (2) the statistical verification of fluorescence quenching on our graphene oxide platform.
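As an illustration of point (2), a statistical check of quenching can be as simple as comparing fluorescence readings with and without graphene oxide; the values below are placeholders rather than our experimental measurements, which are reported on the Results page.

```python
# Sketch of a two-sample t-test for fluorescence quenching by graphene oxide (GO).
# Readings are illustrative placeholders in arbitrary fluorescence units.
from scipy import stats

fluorescence_without_go = [1050, 980, 1010, 1100, 995]
fluorescence_with_go = [310, 295, 330, 305, 320]

t_stat, p_value = stats.ttest_ind(fluorescence_without_go, fluorescence_with_go)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")  # small p-value -> significant quenching
```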

The third part of our workflow is the digital health platform, which generates additional clinical insight. We have attached screenshots of the logical flow of this platform and the functionality that we believe is crucial to streamlining doctor-patient communication. This was developed with the insight of industry professionals such as the CTO of Roche Diagnostics, Dr. Matthias Essenpreis, who emphasized the importance of striking a balance between data privacy and maximum utility.

Feasibility Analysis and Broader Considerations

In addition to having our clinical framework validated by academic and industry professionals, we were fortunate enough to meet with Dr. Mike Pellini, a venture capitalist who also has a clinical practice. He provided further validation of our entire workflow and found the use of promoter methylation monitoring for post-therapy response particularly interesting. He viewed the cell-free implementation as an interesting and acceptable way of addressing biosecurity concerns, and thought that using synthetic biology to advance liquid biopsy and shift cancer diagnosis paradigms was an elegant solution.

Although our project has the potential for broad, positive social impact because of its applications for other researchers and its relatively easy implementation in low-resource communities, it was also important to recognize some of the negative consequences of wrongful use of our product, especially in the last component of our workflow.

Challenges with Microfluidic Integration

One of our project’s main goals for future iterations is high-throughput capability. To achieve this, our team has focused on the potential of miniaturized diagnostic devices. One challenge our team still has to consider is the utility of a paper microfluidic sensor versus a controlled-flow microfluidic device; although both have their advantages, controlled flow makes more sense in larger settings such as central laboratories, while paper-based sensors make more sense in low-resource communities. This brings additional considerations of cost, overall lifecycle, and environmental impact.

Another consideration with microfluidic integration was compatibility with existing clinical processes. To ensure that our protocol is compatible with existing practices, a key factor for early adoption in medical centers, we have developed a streamlined, easy-to-follow protocol for implementing our liquid biopsy test in a clinical setting. We have attached the protocol below.

Understanding Data Privacy Concerns

Medical data and privacy are growing concerns, and discussions with industry professionals have consistently reiterated that point. Our team’s consultations with Harry Gandhi, founder of Medella Health, and his domain knowledge of medical data privacy helped us choose specific security protocols to ensure that data would be properly handled. However, a significant concern remains: patient data must be properly monitored and handled when it is used for drug and treatment development in the future.

Regulatory Mechanisms for our Device

After thorough literature research and discussion of regulatory obstacles, we realized it would be important to consider how regulatory approval would impact our overall timeline.

IVDs, like all devices evaluated by the FDA, are further categorized based on each product’s potential risk to patients: Class I (low risk), Class II (moderate risk), and Class III (high risk).

Class III devices are the most regulated and are subject to general controls, special controls, and a premarket approval process. New medical devices with technologies not marketed before 1976 are automatically classified into Class III by federal law. Class III products also include devices intended to directly sustain human life or prevent impairment of human health, and devices with the potential to cause injury. An example of a Class III IVD would be a product using an analyte-specific reagent, or a test intended to diagnose a contagious condition likely to result in a fatal outcome. Epinoma and other projects in the iGEM Diagnostics track would initially be classified as Class III devices by the FDA.

The approval process for an IVD can begin with a Pre-Submission process, which is highly recommended for groups seeking approval for new medical technologies. In this process, a formal request is made to the FDA detailing the product with specific questions regarding its classification and regulation. The FDA then provides initial feedback, defining the best regulatory pathway and data collection for preapproval. Pre-Submission can be done even before a product study has begun. The process can streamline the future approval process as the FDA will already be familiar with the IVD in question, and unnecessary research studies can be avoided.

Official requests for approval fall into several categories. Devices considered substantially equivalent to approved devices can apply through a 510(k) Premarket Notification, outlined in 21 CFR 807. The device is then evaluated in comparison to its analogous approved device with regard to accuracy, precision, and analytical specificity and sensitivity. Evaluations on clinical samples are often sufficient for IVDs approved this way. The FDA reviews these submissions on a 90-day timeline before marketing.

For novel high-risk devices, devices requiring wet-lab product evaluation, and devices measuring parameters that are not clearly defined, premarket approval (PMA) is required per FDA guidelines. This is the most stringent type of application the FDA requires. A PMA application includes clinical and non-clinical laboratory studies. The non-clinical section requires data on the product’s microbiology, immunology, toxicology, and biocompatibility, as well as its shelf life and response to stress and wear.

PMA clinical investigations include review of the product’s protocols, data on the product’s safety and effectiveness, a report of the device’s complications and adverse reactions, and statistical analyses of data collected from patients according to the FDA’s device-specific guidance documents to determine the product’s safety and accuracy in clinical use.

Devices that require a large amount of clinical data for PMA can request an Investigational Device Exemption (IDE), which allows a device to be used in a clinical study. An IDE requires submission of the research plan for the device, which must be approved by an institutional review board or by the FDA. It also requires informed consent from all patients, FDA monitoring of the study, disclaimers that the device is being used for investigational purposes only, and submission of data and reports to the FDA.

In addition to these regulations on specific devices, IVD systems that require further analysis at a clinical laboratory must comply with the Clinical Laboratory Improvement Amendments of 1988 (CLIA ’88). As part of the premarket process, these IVD groups must apply for an FDA evaluation of all laboratories where diagnostic results are to be evaluated. To ensure CLIA ’88 compliance, the FDA evaluates the proposed testing process and either waives further tests or designates the test as moderate or high complexity. Based on the lab’s methodology, specific lab standards are applied and evaluated; these include personnel, certification, patient management, quality assurance and control, and inspections.

References

  1. Drain, Paul K et al. “Evaluating Diagnostic Point-of-Care Tests in Resource-Limited Settings.” The Lancet infectious diseases 14.3 (2014): 239–249. PMC. Web. 18 Oct. 2018.
  2. https://www.ncbi.nlm.nih.gov/pubmed/1428896
  3. Yock, Paul G., et al. Biodesign: The Process of Innovating Medical Technologies. 2nd ed., Cambridge University Press, 2015.