Integrated Human Practices
Incorporating a Novel Communication Paradigm for iGEM Teams
As a new team, we looked to past teams who had engaged in stakeholder interactions and Integrated Human Practices to guide our own intuition. We quickly realized that teams often spoke to domain experts or end users without a fully thought-out approach for integrating their advice into project design and deployment. It was also important to recognize that many of the narratives teams put forward were extremely linear: the interactions did not invite the team to consider the impact of their decisions or help them optimize their overall design.
In our day-to-day operations, we also experienced the struggles of cross-team discussion. It was quite difficult for people in the wetlab to keep track of what the drylab team was doing, or to keep track of what the entrepreneurship group was trying to deploy. It was also sometimes difficult to ascertain the broad-scale impact of our interactions with certain experts or stakeholders. We were sure that our team was not the only one to experience such difficulties. To resolve these issues, our team came up with a novel paradigm that we believe will help streamline project development for future teams.
Similar to the design-build-test cycle that Imperial College introduced in 2006, our team believes that although there is a continuous flow of information and a constant need for integration into the project, the work can be organized into three phases with the following methodology.
Recognize
From the scientific method, something that every elementary school student has learned, we know it is important to use observations (our own and those of others) to identify problems and then seek solutions. Recognize can be split into two cyclical elements:
Problem Definition
Identify the particular space that you are interested in (the area where the problem resides) and begin to use foundational literature to understand core elements of the problem: Why does it exist? Does the problem differ across different settings? Why is this?
Constraint Determination
Although knowing about the problem you are trying to solve is important, it is important to realize that the problem does not exist in a vacuum; it exists because the real world imposes some sort of constraint (technical, environmental, social, etc.) that prevents the solution from being implemented in an ideal manner. Although there are several different types of constraints, as mentioned above, we encourage future teams to first nail down the specific technical constraints that exist.
There are several different approaches to constraint determination. Focusing on the values that are important to you as an individual and to your entire team can help guide your decision-making process.
Stakeholder identification can also dictate who you should talk to and what the topics of discussion should be; a stakeholder is anyone who may be impacted by your activity, or by someone who is directly impacted by your activity. As a diagnostics team, our challenge was to bridge the communication gap between engineers and medical professionals: to address this, we spoke with healthcare professionals, synthetic biologists, academics, industry professionals, and executives over the course of our project. Our initial interactions focused on the technical challenges of designing our methodology, especially the need for a non-invasive cancer detection technology that did not rely on chemical treatment.
Develop
After defining the technical parameters of the project, it is important to consider how they actually impact the solution-development aspect of the project. Oftentimes, teams already have a preconceived notion about the problem space they are trying to address. To clear the barriers to innovation, it is of utmost importance that the team begin to shape this unrefined idea after each of the initial stakeholder interactions rather than before them.
Development can also be split into two cyclic elements:
Modeling and Visualizations
Modeling is everyone's best friend, especially an iGEM team's, because it allows for further characterization of a part, device, or system before resources are actually used to build it. For us, it was important to visualize the development of our solution to make sure it fit the technical parameters we had determined beforehand. Diagnostics teams can use modeling to check the validity of their project via a bottom-up approach: protein modeling and an understanding of Michaelis-Menten kinetics can help validate a genetic circuit design; further modeling can characterize an entire system (e.g., if a team is developing components for a microfluidic device) and provide an ecosystem overview that demonstrates the impact of disease monitoring.
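To illustrate the kind of bottom-up kinetic modeling described above, here is a minimal sketch of the Michaelis-Menten rate law; the Vmax and Km values are hypothetical placeholders, not parameters measured for our system.

```python
# Michaelis-Menten reaction rate: v = Vmax * [S] / (Km + [S])
# VMAX and KM below are illustrative placeholder values.

def michaelis_menten(substrate: float, vmax: float, km: float) -> float:
    """Return the reaction velocity for a given substrate concentration."""
    return vmax * substrate / (km + substrate)

VMAX = 1.0   # maximal rate (arbitrary units)
KM = 0.5     # substrate concentration at half-maximal rate (arbitrary units)

# At [S] = Km, the rate is exactly half of Vmax.
half_max_rate = michaelis_menten(KM, VMAX, KM)

# Sweep substrate concentrations to see the saturation behavior.
for s in (0.1, 0.5, 1.0, 5.0, 50.0):
    print(f"[S] = {s:5.1f} -> v = {michaelis_menten(s, VMAX, KM):.3f}")
```

Even a sketch like this lets a team sanity-check whether the reporter signal will saturate over the concentration range of interest before committing bench time.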
Genetic Circuit Design
Although people often think that this step is all about the wetlab scientists and benchwork, it is actually a collaborative endeavor: feedback from earlier modeling and visualization can lead to tweaks or optimizations in the circuit design itself.
Deploy
Strategy Development
After the validation of the genetic circuit, the next step, especially for teams in the Diagnostics, Environment, Food and Energy, and Manufacturing tracks, is to develop a strategy for implementation in the real world. Here, teams should consider the other categories of constraints (social, resource-based, financial, etc.). We call these dimensional considerations. By thinking about entrepreneurial considerations, or what it will take to implement your solution with minimal resources, teams can guide the evolution of their project in a more pragmatic direction. One thing to keep in mind is that talking to these stakeholders may sometimes uncover information that changes a foundational assumption and sends you back to the drawing board. It can be disheartening at first, but it is important to treat this as a lesson and as a pivot point for the direction of your project.
Communication
Scientists are notoriously ineffective communicators, and this extends to iGEM teams as well. To help resolve this, it is important to identify your audience and determine their most crucial needs. Why are you presenting this information to them, and what are they hoping to get out of it? The goal of effective communication is to make sure that each party gets something out of it: the information given should be clearly described, free of logical missteps, and effectively delivered. This brings us to the second sub-component of communication: communication through design. As iGEM members, our most effective platform for communicating results is the wiki, and too often, too much technical depth is provided. After identifying your target audience(s), it can be difficult to tailor the information specifically to them, but that is where information design can be a very powerful tool. Make sure that your graphics are clean and communicate a single concept at a time; much like good writing, it is important to allow your reader or viewer to digest the information at their own pace and to communicate just what is necessary, nothing more, nothing less.
The beauty of our approach is that all parts are ongoing at all times! We encourage teams to use this paradigm and incorporate it throughout their entire project, and we hope that it will be of great benefit to teams going forward.
Reflecting on the Key Outcomes of this Paradigm
Development of a biomarker discovery tool
As a result of this paradigm, we uncovered a critical lag in the development of commercially available liquid biopsy tests. One of the reasons was that scientists and clinicians did not have a centralized methodology for determining biomarkers of interest for specific diseases. Instead, labs would independently identify these markers and then publish papers to communicate their results. Because our idea approached cancer diagnostics from a completely new angle, with an uncommon diagnostic metric, our team believed it would be important to create a modular biomarker discovery tool that could analyze any existing methylome data and also integrate existing datasets from The Cancer Genome Atlas.
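As an illustration of the kind of filtering a biomarker discovery tool might perform, here is a minimal sketch; the gene names, beta values, and the 0.3 difference threshold are hypothetical and not drawn from our actual pipeline.

```python
# Hypothetical mean promoter beta values (fraction methylated, 0-1)
# for tumor vs. normal cohorts. Real data would come from methylome
# datasets such as The Cancer Genome Atlas.
tumor_beta = {"GENE_A": 0.82, "GENE_B": 0.45, "GENE_C": 0.91}
normal_beta = {"GENE_A": 0.20, "GENE_B": 0.40, "GENE_C": 0.15}

def hypermethylated_candidates(tumor, normal, min_delta=0.3):
    """Return genes whose promoters are hypermethylated in tumor
    samples by at least `min_delta` relative to normal tissue."""
    return sorted(
        gene for gene in tumor
        if tumor[gene] - normal.get(gene, 0.0) >= min_delta
    )

print(hypermethylated_candidates(tumor_beta, normal_beta))
# -> ['GENE_A', 'GENE_C']
```

A real tool would also handle per-sample variance and multiple-testing correction; the point here is only the shape of the tumor-versus-normal comparison.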
Expansion of our workflow to integrate a digital health platform
In addition to addressing a core issue in cancer diagnostics, our team's exploration of the patient care journey led us to identify another significant economic burden on our healthcare system: because of poor doctor-patient communication, doctors are often unable to ascertain in a timely manner whether a treatment has been effective. To address this issue, our team developed a functional prototype of a digital health platform.
Development of novel use cases
Although clinicians cautioned us that using promoter methylation as a diagnostic indicator could impact the overall accuracy of our test, implementation of novel signal amplification strategies helped address many of these concerns. In addition to implementing our idea as an early screening tool, our interactions with industry professionals and social innovators led us to realize the value of hypermethylation's continuous nature. As such, we were able to develop a novel use case centered around post-therapy response.
IHP Flowchart
Expert Interviews
UCSD Health System
We were fortunate enough to be active in the clinical immersion process. Although the names of the patients are withheld for confidentiality purposes, we were able to gain an in-depth perspective on a cancer patient's journey through the diagnostic process. We learned that a patient normally goes through the following stages: prognosis, diagnosis, verification through companion diagnostics, treatment, and post-therapeutic monitoring. The most common method of identifying cancer is to take a tissue biopsy, and there were a number of issues here for patients, namely invasiveness and price point. The clinicians also talked about the significant emotional cost that a patient and their family face during this journey. A secondary concern was that a long turnaround time may impede treatment. Several clinicians also mentioned that tissue biopsies are not always the best option, as they can spread cancer in certain instances or cause further complications. In addition, the first biopsy often results in an inconclusive diagnosis, and attempting a second biopsy may present risks.
Takeaway:
At this point, our team was able to see that the gold standard, or "status quo," was clearly not enough in the cancer diagnostics space. There were issues of invasiveness, price point, and overall accuracy.
Poorya Sabounchi, internal affiliate with Illumina Accelerator
Dr. Cashin currently serves as the head of the Illumina Accelerator, and Dr. Sabounchi is a startup advocate there. Dr. Sabounchi helped answer some of our key questions about next-generation sequencing (NGS) technology. In addition to describing the mechanisms of NGS, he talked about some of the key diagnostic metrics that Illumina and GRAIL, a liquid biopsy startup spun out of Illumina, are currently focusing on. He explained that NGS uses deep sequencing runs to identify mutations and perform whole-genome sequencing, and he believes NGS can identify mutations at a better rate than existing practices. He also walked us through some of the companies in the Illumina Accelerator and how they have harnessed the power of NGS to advance genomics.
Takeaway:
Our conversation with Dr. Sabounchi helped us understand the promise of next-generation sequencing. Although NGS could resolve the price point, it did not address one of the key patient concerns, namely invasiveness. We felt that we could somehow engineer a CRISPR protein to detect mutations and then give us a quantifiable readout.
Pranav Singh, Bioinformatics Team, GRAIL
After speaking with the Illumina Accelerator, we felt it would be important to understand some of the current alternatives. We reached out to some members of the bioinformatics team at GRAIL to get a better sense of what they do. They first introduced the idea of liquid biopsy, the term given to the collection of procedures that enable non-invasive cancer detection. We learned that inherent tumor heterogeneity and the ability of cancer genomes to evolve are not properly captured by tissue specimens. GRAIL instead looks at cell-free DNA that is shed by cells and is trying to develop highly accurate tests around it. The team also took some time to explain their day-to-day operations, noting that bioinformatics is becoming exceedingly important due to the massive amount of data per patient. They use bioinformatics to derive patterns that can generate further clinical insight or help develop more effective treatments in the future.
Takeaway:
After learning from GRAIL, we understood that despite its limitations, liquid biopsy is more advantageous and addresses many of the concerns we had uncovered earlier. As a team, we decided to shift from tissue specimen analysis to liquid biopsy because of the inherent challenges posed by tumor heterogeneity in tissue samples.
Dr. Mikael Eliasson, head of Global Product Development and Strategic Innovation
We also got a chance to meet with Dr. Eliasson to discuss some of our questions. In talking to him, we wanted to get a better sense of the liquid biopsy space and the direction it was heading. He helped identify some of the key bottlenecks: although scientists have been working on liquid biopsy since the early 2000s, there are relatively few commercial tests available despite advances in research. He chalked this up to stringent regulatory oversight as well as some technical limitations. Although tissue biopsy provides a large amount of genomic DNA for sample analysis, liquid biopsy relies on analysis of significantly smaller amounts of DNA, and without highly accurate tests it is very easy to miss the mutations that would indicate cancer. For that reason, he told us that the viability of a diagnostic test depends on its overall predictive value, i.e., its sensitivity and specificity, which should be as close to 100% as possible.
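The accuracy metrics Dr. Eliasson emphasized can be made concrete with a small calculation; the confusion-matrix counts below are invented purely for illustration.

```python
# Hypothetical confusion-matrix counts for a diagnostic test.
TP, FN = 95, 5     # diseased patients: correctly / incorrectly classified
TN, FP = 90, 10    # healthy patients: correctly / incorrectly classified

sensitivity = TP / (TP + FN)   # fraction of true cases the test detects
specificity = TN / (TN + FP)   # fraction of healthy patients correctly cleared
ppv = TP / (TP + FP)           # positive predictive value
npv = TN / (TN + FN)           # negative predictive value

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
print(f"PPV = {ppv:.3f}, NPV = {npv:.3f}")
```

Note that predictive values also depend on disease prevalence in the tested population, which is why a screening test needs far higher specificity than these toy numbers suggest.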
Takeaway:
After talking with Dr. Eliasson, we reaffirmed that our project was going to be in the liquid biopsy space. We also realized that one of the most significant challenges would be to eliminate as many false positives and false negatives as possible. After performing some literature review, our team also came to the conclusion that the limitations of both NGS and tissue specimen analysis stemmed from their focus on point mutations. Thus, our focus shifted to identifying and developing a diagnostic metric that could identify cancer but was not necessarily mutation-based: we wanted to look at epigenetic determinants instead of somatic alterations.
Amogha Tadimety, PhD student at the Thayer School of Engineering
After performing further literature review, we came across a paper that detailed some of the prevalent methods of analysis in the liquid biopsy space. We contacted the primary author of the paper, Amogha Tadimety, a PhD student writing a thesis in a focus area similar to ours. She explained that there has been a recent push to expand liquid biopsy technologies into clinical and point-of-care settings. She walked us through biosensor design, explaining that a biosensor is a device that takes a biological material and puts it in contact with a recognition element to capture an analyte and a transduction element to produce a measurable signal. She cautioned us to focus on choosing a suitable medium of analysis from among circulating tumor DNA, circulating tumor cells, and exosomes.
From this, we learned that circulating tumor DNA was probably a good choice for our project, partially because it provides effective tumor genome profiles and can be sampled non-invasively and subsequently analyzed for specific metrics.
Takeaway:
After our conversation with Amogha, we realized that we were most likely going to work with circulating tumor DNA. Our next step would be to determine a specific epigenetic indicator of interest.
Dr. Kang Zhang, Shiley Eye Institute and UCSD School of Medicine
Dr. Zhang expressed great interest in our project and agreed that looking into epigenetic determinants as an earlier and more consistent indicator of cancer was a good idea. He suggested that the two primary ones we could look at were promoter methylation and histone modification; by looking at these structural modifications, we could use them as potential diagnostic metrics. He also went into depth on his work in epigenetics and expressed the hope that our team could come up with a methodology that addressed the issue of bisulfite treatment. He explained that when he and other clinicians performed methylation analysis, the default method was to treat the input DNA sample with a chemical called sodium bisulfite, which converts unmethylated cytosines into uracil; the sample can then be sequenced and the methylation level expressed as a ratio known as the beta methylation value. He also told us that our methodology should give a quantitative output rather than a qualitative one.
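The beta methylation value Dr. Zhang described is, at its core, a simple ratio; here is a minimal sketch with made-up read counts, not real sequencing data.

```python
def beta_value(methylated_reads: int, unmethylated_reads: int) -> float:
    """Beta value: fraction of reads at a CpG site that are methylated.
    Ranges from 0 (fully unmethylated) to 1 (fully methylated)."""
    total = methylated_reads + unmethylated_reads
    if total == 0:
        raise ValueError("no reads covering this site")
    return methylated_reads / total

# Illustrative read counts at a single CpG site: after bisulfite
# conversion, unmethylated cytosines read out differently, so the
# two read populations can be counted separately.
print(beta_value(30, 70))  # 0.3
```

This ratio is what a chemical-free methodology would ultimately need to reproduce or approximate quantitatively.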
Takeaway:
Our talk with Dr. Zhang allowed us to make some crucial decisions: namely, we decided that our indicator of interest was going to be promoter methylation because it has been associated with events prior to tumorigenesis. In addition, promoter methylation is correlated with tumor burden which can be valuable information in a clinical setting, whereas histone modification cannot be localized or traced to a tangible part of the cancer diagnostics journey. We also decided to focus on developing a methodology that did not require chemical treatment: our team believed that we could take the novel step of implementing synthetic biology in cancer diagnostics to identify a specific epigenetic determinant of interest, and then build a subsequent biosensor that can transduce the signal into a measurable output.
Harry Gandhi, founder at Medella Health
Our team wanted to get more information on the challenges of the diagnostics space. To do this, we got in touch with Harry Gandhi, founder of Medella Health, a startup that sells glucose sensors and is expanding into other spaces. Because of his background in the diagnostics space, we thought it would be a good opportunity to understand some of the challenges associated with developing a wetlab technology under real-world constraints. Our conversation with Harry primarily centered around the challenges of using hypermethylation as a clinical variable. He was intrigued by our choice and explained that one reason few people focus on it is that unlike most clinical variables, which are binary, hypermethylation lies on a continuum: everyone exhibits some degree of methylation. Thus, our methodology would need heightened diagnostic ability, an even greater challenge compared to traditional liquid biopsy tools. He also gave us further insight on sensor design and some of the key pivot points to be aware of.
Takeaway:
Our team was determined to show that, using synthetic biology, we could address many of the challenges Harry had brought up. Our baseline genetic circuit centered around engineering a protein that could detect methylation, coupled to a fluorescent reporter that would give a quantitative signal to be captured and analyzed. After some literature research, we decided to use the methyl-binding-domain (MBD) protein, and we evaluated different variants before settling on a murine variant for our circuit.
Amogha Tadimety, PhD student at the Thayer School of Engineering
After designing our baseline circuit and fleshing out some of our wetlab plans, we had a second discussion with Amogha. She helped validate our planned circuit methodology. Because she has extensive knowledge of liquid biopsy applications in clinical settings as well as in point-of-care devices, we asked her what to consider for the different use cases. Our initial instinct was to implement this as a point-of-care device, because we believed that is where it would have its greatest impact, but she explained that it would be better to first focus on a workflow for a clinical methodology that met clinicians' needs before moving on to the next use case. She re-emphasized that we should focus on sensitivity and specificity, because administering the test in a point-of-care setting would inevitably have lower accuracy due to the lack of high-quality instrumentation. She suggested that we try implementation strategies that would allow for sample DNA amplification or increased sensitivity.
Takeaway:
As a result, we engineered a second genetic circuit that relied on multimerized tandem-MBD proteins, which literature suggested have a higher binding affinity. We also agreed that it would be prudent to develop a MBD-based circuit and methodology for the clinical setting before expanding our use cases.
Dr. Ludmil Alexandrov, Bioengineering at UCSD, Moores Cancer Research Center
Although his work is not directly related to epigenetics, Dr. Alexandrov is known for developing Alexandrov signatures, mutational signatures that serve as cancer indicators. He is also currently the head of an international project that is developing a comprehensive map of mutational signatures. He expressed great interest in our project and thought that it held great promise. Although he was a little worried about the rigorous timeline of the project and the iGEM competition, he felt that advancing the use of epigenetic determinants as a cancer indicator was an important step that could have several ramifications down the road. Among the factors he thought we should consider was the capability for high-throughput analysis: whether in a clinical setting or out in the field, it is always better to carry out multiplexed analysis. He also emphasized that for maximum impact, our workflow should be modular and our circuit applicable to multiple types of cancer.
Takeaway:
Our conversation with Dr. Alexandrov allowed us to begin thinking about some of the bigger questions we were facing in our research, especially in its implementation. His emphasis on modularity really resonated with us, and we decided that we needed to work on finding a disease that would serve as a proof of concept.
Dr. Kang Zhang, Shiley Eye Institute and UCSD School of Medicine
Our second meeting with Dr. Zhang focused on deciding which cancer to focus on. Our team had narrowed the choices down to breast cancer and hepatocellular carcinoma. Dr. Zhang expressed interest in both, but said that identifying methylation markers in breast cancer could be quite difficult; his lab had been examining the problem and found that the vast majority of putative markers overlapped with uterine and ovarian cancer, so this would be difficult given our time constraints. When we brought up the idea of identifying pan-cancer markers, he told us that although this was a good idea, it would not be very useful in diagnostics: the amount of clinical insight that could be generated would be very limited, and researchers would be unable to devise a specific treatment. Instead, he confirmed that we should go forward with a single proof-of-concept disease before broadening our focus to other diseases. He encouraged us to identify markers specific to hepatocellular carcinoma and pointed us to some foundational literature and lab documentation to guide our decision making.
Takeaway:
Talking to Dr. Zhang helped solidify our choice of HCC as our proof of concept. We also began making specific technical design decisions based on the foundational literature he had provided us with.
Nina Vandeventer, Global Head of Personalized Development at Genentech
In talking to Ms. Vandeventer, we learned more about the broader challenges of implementing a diagnostic test in a clinical setting. She reiterated a crucial point that we had heard many times: the key to getting healthcare professionals and providers on board would be to demonstrate high levels of accuracy. She also suggested that in order to improve our overall design, we needed to discuss our perspectives with as many stakeholders as possible. In her position at Roche/Genentech, interacting with stakeholders is essential to understanding expectations, needs, and concerns. By embedding their feedback into our strategy and our implementation, we would be able to develop a sustainable solution that addresses the needs of more people. Important stakeholder groups included patients, healthcare professionals, providers, and the scientific community.
Takeaway:
After our conversation, our team went back and looked at the people we had talked to, as well as those we were planning to talk to. We decided to group them into buckets and identify their primary pain points. Because our team was interested in long-term sustainable implementation, we wanted to make sure that we took a stakeholder-based approach. Talking to patients would help us understand what it is like to live with a specific disease and the physical and emotional challenges facing patients and their families. Talking to healthcare professionals improved our understanding of how to generate clinical insight and of existing clinical methodologies and routine practices. Another important piece of a diagnostics product is its price point and healthcare reimbursement strategy: if it is too expensive or has too narrow a focus, hospitals will most likely not buy into it.
Dr. Michael Pellini, Section 32 and Former CEO of Foundation Medicine
Our team was lucky enough to have a sit-down meeting with the venture capital fund Section 32 in San Diego and meet with Dr. Pellini, who provided some great insight. Dr. Pellini has been in the diagnostics space for over 20 years, and he was able to validate much of our foundational understanding. As in most of our interactions, we were told that the key bottleneck for the development and implementation of liquid biopsy tests was the lack of predictive value strong enough to be of use to clinicians. He also thought that at this point, it would be a good idea to give some forethought to our intellectual property strategy. Because we were going to disclose a lot of our research through our wiki and presentation at the Giant Jamboree, he felt that filing a provisional patent would be an important component of defining our differentiating core.
Takeaway:
This conversation primarily helped confirm that we were on the right track with our intuition about how to navigate the space around epigenetic-based liquid biopsy, and it helped us understand some of the concerns about intellectual property. We also decided that in the wetlab, following successful testing of our baseline circuit, we would need to develop signal amplification strategies. Following this, we met with Parth Majmudar at the UCSD Office of Innovation and Commercialization, who walked us through the process. Prior to the Jamboree, the team had filed for a provisional patent.
Dr. Tina He, post-doctoral scholar at the Zhang Lab
One of our shifts in the wetlab was to add steps that would enhance the diagnostic power of our assay. One problem we had run into after over-expressing the MBD protein was that the protein would detect hypermethylation across the entire body of the sample DNA, not just the promoter region as we wanted. Thus, we needed to address the non-specificity of the MBD and derive a quantitative readout. To do this, our team implemented a graphene oxide platform, using its affinity for single-stranded DNA and its fluorescence-quenching behavior to provide the needed specificity.
Takeaway:
This was one of the most significant modifications to our wetlab design: we added a GO (graphene oxide) platform that would allow for the MBD to detect in the promoter region.
Dr. Jason Kreisberg, Research Scientist at the Ideker Lab
Dr. Kreisberg is part of the Cancer Cell Map Initiative (CCMI), a joint collaboration between UC San Diego and UC San Francisco. In talking with him and several members of his lab, we were able to validate the framework that we had put forward. They raised a number of technical considerations, including the ability of our protocols to differentiate between cell-free DNA and circulating tumor DNA, and wondered how we would improve the signal-to-noise ratio of the readout itself. They also emphasized that a diagnostic test alone has limited utility: we needed to be able to generate additional clinical insight from the data. Dr. Kreisberg's team also emphasized that in order to improve the overall accuracy of a test implemented in a clinical setting, we would benefit from a workflow with reduced complexity.
Takeaway:
Talking with CCMI helped us validate the framework and troubleshoot some of the key aspects. By walking through our workflow with others, we were able to anticipate and address some of the problems that we would run into. In addition, talking to Dr. Kreisberg helped us confirm the inclusion of an exonuclease amplification strategy in our workflow. In order to simplify our workflow, our team decided that the eventual implementation of a microfluidic system or microchip platform would be extremely beneficial. This also addressed an earlier point by Dr. Alexandrov, as we were able to provide high-throughput capability.
Dr. Douglas Densmore, Boston University
After these discussions, our team realized that we did not have enough experience working with microfluidics. We therefore turned to Boston University for a collaboration, because their hardware team was working with microfluidics. In talking to Dr. Densmore, we first explained the general framework of our idea. He then explained that there are several classes of microfluidic devices, including paper-based, controlled-flow, and droplet-based microfluidics. He laid out the advantages and disadvantages of each class and suggested that for our use case, it would make sense to consider controlled flow. He also explained that if we were to go in a point-of-care direction, paper microfluidics would make more sense, but those devices are typically one-step and not as accurate as controlled-flow devices, which are designed with precision in AutoCAD and fabricated via soft lithography using materials such as PDMS. He also described some of the work he does at the CIDAR lab, which develops broad-use tools for microfluidic chip design and fabrication.
Takeaway:
Our team decided that for the most part, we would focus on the implementation and verification of the genetic circuit and worry about microfluidic integration towards the end. We also decided that a controlled-flow microfluidic device would make the most sense for our use case. We also initiated contact with the Boston University iGEM team and had a video call to discuss our overall idea. They helped us iron out some of the challenges of using microfluidic devices in cancer diagnostics, and we kept this in mind for our design.
UCSD Biodynamics Lab
After selecting controlled-flow devices as our primary interest, we met with several undergraduate and graduate researchers at the UCSD Biodynamics Lab, run by Dr. Jeff Hasty. Our conversations helped us realize that there are several key parameters to keep in mind when developing a mathematical representation of a microfluidic system. Because the laws of physics remain the same in microscopic and macroscopic systems but give rise to different dominant forces at different scales, we had to consider how the reduction in size influences the characteristic times of the system. They also explained that integrating microfluidic devices with synthetic biology can have a number of benefits, such as faster response times and the formation of static and dynamic gradients at subcellular resolution.
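One way to see the scale effect the lab described is through the characteristic diffusion time, which scales with the square of the length (t ~ L^2 / D); the diffusion coefficient below is a typical order of magnitude for a small molecule in water, used purely for illustration.

```python
D = 1e-9  # m^2/s, order-of-magnitude diffusivity of a small molecule in water

def diffusion_time(length_m: float, diffusivity: float = D) -> float:
    """Characteristic time for diffusion across a distance L: t ~ L^2 / D."""
    return length_m ** 2 / diffusivity

# Shrinking a channel from 1 cm to 100 um cuts the mixing timescale
# by four orders of magnitude.
for length in (1e-2, 1e-3, 1e-4):
    print(f"L = {length:.0e} m -> t ~ {diffusion_time(length):.1e} s")
```

This quadratic scaling is one reason microfluidic systems respond so much faster than benchtop-scale reactions.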
Takeaway:
The Hasty Lab pointed us to foundational literature that gave us greater detail about the elements of a microfluidic device, how to design and fabricate each element, and how to use modeling to ensure the correct operation of the system. Consideration of all these parameters was crucial for the design of our prototype microfluidic system. Our conversations with them also led to exploration of biosecurity concerns, and helped us pivot to a cell-free system that can be more accurately modeled and also poses decreased threats to the environment in case of misuse or release.
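The scaling behavior the Hasty Lab described can be sketched with two standard order-of-magnitude estimates: the Reynolds number, which tells us whether flow is laminar, and the characteristic diffusion time across a channel. The sketch below is illustrative only; it assumes water-like fluid properties at room temperature, and the channel dimensions and flow speed are hypothetical values, not parameters from our actual device.

```python
# Illustrative scaling estimates for a microfluidic channel versus a
# macroscopic tube. Fluid properties assume water at room temperature;
# all dimensions and speeds below are hypothetical examples.

def reynolds_number(velocity_m_s, length_m, density=1000.0, viscosity=1e-3):
    """Re = rho * v * L / mu; Re << 1 means viscosity-dominated, laminar flow."""
    return density * velocity_m_s * length_m / viscosity

def diffusion_time_s(length_m, diffusivity=1e-9):
    """Characteristic time to diffuse across a distance L: tau ~ L^2 / D."""
    return length_m ** 2 / diffusivity

# A 100-micron channel versus a 1-cm tube at the same flow speed (1 mm/s).
for label, L in [("100 um channel", 100e-6), ("1 cm tube", 1e-2)]:
    re = reynolds_number(velocity_m_s=1e-3, length_m=L)
    tau = diffusion_time_s(L)
    print(f"{label}: Re ~ {re:.2e}, diffusion time ~ {tau:.0f} s")
```

Shrinking the length scale by two orders of magnitude drops the diffusion time by four (tau scales as L squared), which is one concrete way "reduction of size influences the characteristic times of the system."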
Naila Chowdhury, Director of Social Innovation, and Dr. Suresh Subramani, TIGS
As part of our exploratory process, we also wanted to consider the larger questions and take into account the broader impact of what we were doing. We reached out to the Director of Social Innovation, Ms. Naila Chowdhury, and explained the overall workflow. She expressed great interest and enthusiasm in our project and affirmed that, through her work, she saw the utility of a cost-effective liquid biopsy test for hepatocellular carcinoma, especially given its disproportionate impact on the Hispanic population in the US, where diagnosis rates are rising, as well as its prevalence in South and Southeast Asia. She also introduced us to Dr. Suresh Subramani, the former vice chancellor, who expressed similar enthusiasm for our project and helped us consider how we could potentially implement our idea on an international scale. As one of the presiding members of the TATA Institute of Genetics and Society, he was able to connect us to a number of other researchers who helped us understand the implications of applying our solution in low-resource settings.
Takeaway:
We were very excited to see the response to our diagnostic assay, and it gave us another lens through which to consider our implementation. Part of the challenge we faced in working with low-resource settings was that we needed to be able to perform our assay with limited instrumentation while still providing a reasonable turnaround time. The conversations also reaffirmed our decision to implement a cell-free system, because GMOs are still not well understood and carry a stigma in South Asia; since only the protein is used in detection, people are more likely to adopt our idea. In addition, Ms. Chowdhury was gracious enough to let us present our vision to over 300 people at the Global Empowerment Summit at the beginning of October.
Dr. Jian Dai, Senior Data Scientist for Personalized Health Care at Genentech
Part of our needs-finding had uncovered a critical lag in the development of liquid biopsy tests: researchers and clinicians were unsure which biomarkers to analyze for different diseases. Our team wanted to address this crucial gap in existing liquid biopsy workflows. Talking to Dr. Dai gave us additional perspective in the drylab and made us aware of the tools needed for data analysis. Afterward, our team was able to come up with a methodology for a machine learning framework that would aid in biomarker discovery. Using techniques such as Random Forest and Lasso-Cox, we would be able to discover the optimal gene panel combinations to detect promoter methylation across a subset of patients based on the disease of interest.
Takeaway:
We added a machine learning-based approach for biomarker discovery to the beginning of our workflow. This computational tool uses several techniques to narrow down lists of candidate biomarkers and analyzes sensitivity and specificity against a large data set in order to provide optimal results. We have included the biomarker discovery methodology on our wiki for further scrutiny. This step also enhanced the overall modularity of our approach, because the algorithm can find disease-specific markers for any particular cancer given the correct dataset. Subsequent conversations with Dr. Dai and several postdocs in the Zhang Lab gave us confidence in our methodology.
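The core shape of the discovery step described above is: score every candidate methylation marker by how well it separates cases from controls, then keep the top-ranked markers as the panel. The sketch below is a deliberately simplified stand-in for that idea; the actual methodology uses Random Forest importances and Lasso-Cox rather than this toy mean-difference score, and all marker names and beta values here are synthetic.

```python
# Simplified sketch of panel selection: rank candidate methylation
# markers by how strongly they separate cases from controls, then keep
# the top-k as the panel. A toy mean-difference score stands in for the
# Random Forest / Lasso-Cox ranking used in the real pipeline; all
# marker names and beta values below are synthetic.

def rank_markers(cases, controls):
    """Score each marker by |mean(case) - mean(control)| methylation level."""
    scores = {}
    for marker in cases[0]:
        case_mean = sum(sample[marker] for sample in cases) / len(cases)
        ctrl_mean = sum(sample[marker] for sample in controls) / len(controls)
        scores[marker] = abs(case_mean - ctrl_mean)
    return sorted(scores, key=scores.get, reverse=True)

# Synthetic beta values (0 = unmethylated, 1 = fully methylated).
cases = [{"mA": 0.9, "mB": 0.5, "mC": 0.2},
         {"mA": 0.8, "mB": 0.6, "mC": 0.3}]
controls = [{"mA": 0.1, "mB": 0.5, "mC": 0.2},
            {"mA": 0.2, "mB": 0.4, "mC": 0.3}]

panel = rank_markers(cases, controls)[:2]  # top-2 marker panel
print(panel)
```

Swapping the scoring function for a fitted model's feature importances (and adding cross-validated sensitivity/specificity checks) turns this skeleton into the kind of pipeline the methodology describes.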
Dr. Mikael Eliasson, head of Global Product Development and Strategic Innovation
A subsequent conversation with Dr. Eliasson focused on addressing our primary stakeholders' needs more acutely. Based on prior conversations and confirmation from both patients and healthcare professionals, we had identified another critical lag in the traditional diagnostics journey: in post-therapy monitoring, there is often a significant decrease in communication between doctors and patients, and this lack of post-treatment symptom reporting can be crucial in judging whether the treatment was successful. We wanted to address this, and based on trends in digital health, our team had the idea to develop a digital health platform. Talking to Dr. Eliasson helped us understand the key components, especially personalization and the ability to deliver actionable information to both the healthcare professional and the patient.
Takeaway:
Our conversations with Dr. Eliasson led us to believe that implementing this digital health platform would give us a novel beginning-to-end workflow for the liquid biopsy space. His advice helped us build functional prototypes of our platform for the Jamboree and will continue guiding us as we optimize it in our post-iGEM journey. In addition, reviewing our conversations with Dr. Eliasson allowed us to come up with a completely novel use case that leverages hypermethylation as a continuum: to address the anxiety that patients and family members experience in not knowing whether treatments are working, we could use the correlation between hypermethylation and tumor burden to see whether an individual is responding to treatment, decreasing the economic burden as well.
Harry Gandhi, founder at Medella Health
Our conversation with Harry centered on understanding the intricacies of smartphone diagnostics. One of the key concerns that patients expressed when approached about using smartphone applications as platforms for personalized medicine and health tracking was the possibility that confidential medical data would be released. He also encouraged us to focus on a clean design for the interface and to address as many pain points for HCPs and patients as possible. Although Harry is more involved in the technical deployment of the sensors, he was still able to provide guidance about security protocols. We had a robust discussion about the importance of complying with HIPAA regulations as well as the GDPR rules recently passed by the EU. He appreciated the novelty of our vision and looked forward to seeing a prototype.
Takeaway:
Talking with Harry gave us a good opportunity to understand the ethical debate that has engulfed many digital health efforts. Although the issue cannot be resolved with a simple yes or no, we felt it was vital to do our best to understand these concerns and plan our strategy accordingly.
BLUE LINC, Biodesign Incubator at UC San Diego
We were also fortunate to discuss some of the technical issues with graduate students and faculty at the medical school through BLUE LINC, an on-campus resource modeled after the Stanford Biodesign process. Several members of the cohort provided their expertise in medical device design and innovation. They were impressed with the thoroughness of our approach but encouraged us to think more about the regulatory obstacles that could hamper our ability to bring this product to market. Because laboratory testing of a potential clinical test is of vital importance, we would have to comply with the Clinical Laboratory Improvement Amendments (CLIA). Before we can test human samples for diagnostic purposes, we must demonstrate the rigor of our approach and obtain a waiver from the CDC and FDA. For some members of the cohort, a larger concern, picked up through their clinical immersion, was the integration of our clinical test into existing routines. They explained that individuals are less likely to be early adopters, especially for medical devices, if adoption places too much of a burden on them.
Takeaway:
BLUE LINC helped us home in on some of the regulatory work necessary to take this product to market and gave us a realistic timeline with concrete steps for the actions we would have to take. In addition, integration of our app and assay into existing technologies will be paramount going forward.
Manoj Kumar, TATA Board Trustee and CEO of Social Alpha
Building on our earlier interactions, Dr. Subramani put us in contact with one of his contacts on the TATA Board, Mr. Manoj Kumar, who oversees a $75 million grant to UCSD focused on advancing disease detection. He shared his vision for Social Alpha, a social innovation ecosystem centered in India that addresses some of the larger challenges a developing nation faces. We were given an opportunity to discuss our idea with him, and he affirmed that an HCC liquid biopsy test would be helpful in low-resource communities. Talking with several portfolio managers at Social Alpha gave us insight into the challenges of entering the Indian market, and they reiterated that it would be important to offer the test at a reasonable price point. The Indian healthcare system is not as robust as the American one, and there are far fewer localized hospitals; instead, most medical analysis happens at large, centralized government labs. They saw great value in the digital health platform, as the Indian government has recently embraced the "digital era," and they are pushing forward with a grant of $100,000 to support our research going forward.
Takeaway:
Talking to Mr. Kumar and the individuals who are part of the Social Alpha team opened our eyes to some of the challenges of running a social venture. They helped dissect the anatomy of a good business idea and then gave us the chance to explain our own; after talking to them, we realized that we would have to tweak our go-to-market strategy in India by either partnering with the government or plugging directly into Indian pharmaceutical companies that could serve as channels of distribution. In addition, our team decided to put together a policy framework describing the steps that policymakers and scientists can take to bring better-quality diagnostics to the areas that need them most.
Dr. Mike Pellini, Section 32 and Former CEO of Foundation Medicine
In order to get a thorough assessment of our clinical and financial validity, we turned to Dr. Mike Pellini, a venture capitalist here in San Diego. We had the chance to sit down with him and give an informal pitch laying out the existing problems in cancer diagnostics, the bottlenecks in liquid biopsy, and how our innovation in the lab directly addresses those concerns. He appreciated the modularity of our approach, telling us that as a potential investor, his primary concern would be the ability to enter adjacent markets. Because he is an investor in Freenome, which would be our most significant competitor in the space, he was able to answer some of the deeper questions about liquid biopsy and the direction of the industry. He also appreciated the stakeholder-focused approach we took and eventually provided us with his personal testimony, which we have included in the Entrepreneurship section and on the landing page of the wiki.
Takeaway:
Although most of the feedback we received from Dr. Pellini was positive, he did mention that to take our idea to market beyond the scope of iGEM, it would be essential to come up with a pricing framework and a successful business model that leveraged our value propositions. He validated our price point of $250, gave us most of our current information about healthcare reimbursement strategies, and encouraged us to perform TAM, SAM, and SOM calculations. All of these calculations are included in our business plan on the wiki, and we have also attached our pitch deck.
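For readers unfamiliar with the sizing exercise Dr. Pellini suggested, the arithmetic is a simple top-down funnel: total addressable market (TAM), the serviceable slice we could reach (SAM), and the share realistically obtainable early on (SOM). The sketch below only illustrates the mechanics; apart from the $250 price point, every number is a hypothetical placeholder, not a figure from our actual business plan.

```python
# Hypothetical top-down market sizing (TAM -> SAM -> SOM). The $250
# price point comes from our plan; the demand and share figures below
# are illustrative assumptions only.

price_per_test = 250                   # validated price point (USD)
tests_per_year_worldwide = 2_000_000   # hypothetical annual demand
serviceable_fraction = 0.30            # share reachable via our channels (assumed)
obtainable_fraction = 0.05             # realistic early market share (assumed)

tam = price_per_test * tests_per_year_worldwide
sam = tam * serviceable_fraction
som = sam * obtainable_fraction
print(f"TAM ${tam:,.0f}  SAM ${sam:,.0f}  SOM ${som:,.0f}")
```

The real calculations in the business plan replace these placeholders with sourced incidence and screening figures for each target geography.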