Integrated Human Practices
Incorporating a Novel Communication Paradigm for iGEM Teams
As a new team, we looked to past teams' stakeholder interactions and Integrated Human Practices to guide our own approach. We quickly realized that teams often spoke with domain experts or end users without a fully thought-out approach for integrating advice into project design and deployment. We also realized that many of the narratives teams put forward were extremely linear: the interactions did not invite the team to consider the impact of their decisions or help them optimize their overall design.
In our day-to-day operations, we also experienced the struggles of cross-team discussion. It was difficult for people in the wetlab to keep track of what the drylab team was doing, or what the entrepreneurship group was trying to deploy. It was also sometimes difficult to ascertain the broad-scale impact of our interactions with certain experts or stakeholders. We were sure that our team was not the only one to experience such difficulties. To resolve these issues, our team came up with a novel paradigm that we believe will help future teams streamline their projects.
Similar to the design-build-test cycle that Imperial College introduced in 2006, our team believes that, although information flows continuously and must be integrated throughout a project, the work can be organized into three phases with the following methodology.
Recognize
As the scientific method teaches, it is important to use observations (your own and those of others) to identify problems and then seek solutions.
Recognize can be split into two cyclical elements:
Problem Definition
Identify the space you are interested in (the area where the problem resides) and use foundational literature to understand the core elements of the problem: Why does it exist? Does it differ across settings? Why is this?
Constraint Determination
Knowing about the problem you are trying to solve is important, but realize that the problem does not exist in a vacuum; it exists because the real world imposes constraints (technical, environmental, social, etc.) that prevent the solution from being implemented in the ideal manner.
Although there are several types of constraints, as mentioned above, we encourage ourselves and future teams to first nail down the specific technical constraints that exist. There are several approaches to constraint determination. Focusing on the values that are important to you as an individual and to your entire team can help guide your decision-making process.
Stakeholder identification can also dictate whom you should talk to and what to discuss; a stakeholder is anyone who may be impacted by your activity, or by someone who is directly impacted by your activity. As a diagnostics team, our challenge was to bridge the gap in communication between engineers and medical professionals: to address this, we spoke with healthcare professionals, synthetic biologists, academics, industry professionals, and executives over the course of our project. Our initial interactions focused on the technical challenges of designing our methodology, especially the need for a non-invasive cancer detection technology that did not rely on chemical treatment.
Develop
After defining the technical parameters of the project, it is important to consider how they actually impact the solution-development aspect of the project. Teams often have a preconceived notion about the problem space they are trying to address. To clear the barriers to innovation, it is of utmost importance that the team begin to shape this unrefined idea after each of the initial stakeholder interactions rather than before.
Development can also be split into two cyclical elements:
Modeling and Visualizations
Modeling is everyone's best friend, especially an iGEM team's, because it allows a part, device, or system to be characterized before resources are spent building it. For us, it was important to visualize the development of our solution to make sure it fit the technical parameters we had determined beforehand. Diagnostics teams can use modeling to check the validity of their project via a bottom-up approach: protein modeling and an understanding of Michaelis-Menten kinetics can help validate a genetic circuit design; further modeling can characterize an entire system (e.g., if a team is developing components for a microfluidic device) and provide an ecosystem overview that demonstrates the impact of disease monitoring.
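To illustrate the kind of kinetic modeling mentioned above, here is a minimal sketch of Michaelis-Menten substrate depletion integrated with forward Euler. The parameter values (Vmax, Km, the starting concentration) are purely illustrative assumptions, not measurements from our project.

```python
# Minimal sketch: Michaelis-Menten substrate depletion, dS/dt = -Vmax*S/(Km+S),
# integrated with a simple forward-Euler loop. All values are hypothetical.
VMAX = 1.0   # maximal reaction rate (uM/s), hypothetical
KM = 0.5     # Michaelis constant (uM), hypothetical

def simulate(s0, dt=0.01, t_end=20.0):
    """Integrate dS/dt = -Vmax*S/(Km+S) and return the substrate trajectory."""
    s, traj = s0, [s0]
    for _ in range(int(t_end / dt)):
        s += dt * (-VMAX * s / (KM + s))
        traj.append(s)
    return traj

traj = simulate(2.0)  # start with 2.0 uM substrate
print(f"Substrate remaining after 20 s: {traj[-1]:.3f} uM")
```

A plot of such a trajectory is the simplest visualization a team can use to check whether an enzymatic step in a circuit operates on the timescale the design assumes.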
Genetic Circuit Design
Although people often think that this step is all about the wetlab scientists and benchwork, it is actually a collaborative endeavor: feedback from earlier modeling and visualization can lead to tweaks or optimizations in the circuit design itself.
Deploy
Strategy Development
After the validation of the genetic circuit, the next step, especially for teams in the Diagnostics, Environment, Food and Energy, and Manufacturing tracks, is to develop a strategy for implementation in the real world. Here, teams should consider the other categories of constraints (social, resource-based, financial, etc.). We call these dimensional considerations. By thinking about entrepreneurial considerations, or what it will take to implement your solution with minimal resources, teams can guide the evolution of their project in a more pragmatic direction.
One thing to keep in mind is that talking to these stakeholders may sometimes uncover information that changes a foundational assumption and sends you back to the drawing board. It can be disheartening at first, but it is important to treat this as a lesson learned and as a pivot point for the direction of your project.
Communication
Scientists are notoriously bad communicators, and this extends to iGEM teams as well. To help resolve this, it is important to identify your audience and determine what their most crucial needs are. Why are you presenting this information to them and what are they hoping to get out of it? The goal of effective communication is to make sure that each party gets something out of it: it is important to make sure that the information given is clearly described without logical missteps and is effectively delivered.
This brings us to our second sub-component of communication: communication through design. As iGEM members, our most effective platform for communicating results is the wiki, and too often, too much technical depth and information is provided. After identifying your target audience(s), it can be difficult to tailor the information specifically to them, but that is where information design can be a very powerful tool. Make sure that your graphics are clean and communicate a single concept at a time; much like good writing, it is important to let your reader or viewer digest the information at their own pace and to communicate just what is necessary, nothing more, nothing less.
The beauty of our approach is that all parts are ongoing at all times! We encourage teams to use this paradigm and take a non-linear approach to stakeholder interactions, and we hope that it will be of great benefit to teams going forward.
Reflecting on the Key Outcomes of this Paradigm
Development of a biomarker discovery tool
As a result of this paradigm, we uncovered a critical lag in the development of commercially available liquid biopsy tests. One of the reasons was that scientists and clinicians did not have a centralized methodology for determining biomarkers of interest for specific diseases. Instead, labs would independently identify these markers and then publish papers to communicate their results. Our team believed that because our idea approached cancer diagnostics from a completely new angle, using an uncommon diagnostic metric, it would be important to create a modular biomarker discovery tool that can analyze any existing methylome data and integrate existing datasets from The Cancer Genome Atlas.
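The core idea behind such a biomarker discovery tool can be sketched as a differential-methylation filter: flag promoter CpG sites whose mean methylation (beta value, 0 to 1) differs strongly between tumor and normal samples. This is a hypothetical sketch, not our actual tool; the site IDs, beta values, and the 0.3 threshold are all illustrative assumptions.

```python
# Hypothetical differential-methylation filter. Inputs are dicts mapping a
# CpG site ID to a list of beta values (fraction methylated, 0-1) observed
# in tumor and normal samples. Threshold and data are illustrative only.

def mean(xs):
    return sum(xs) / len(xs)

def candidate_markers(tumor, normal, min_delta=0.3):
    """Return (site, delta) pairs where the mean tumor-vs-normal beta
    difference exceeds min_delta in absolute value."""
    hits = []
    for site in tumor:
        if site in normal:
            delta = mean(tumor[site]) - mean(normal[site])
            if abs(delta) >= min_delta:
                hits.append((site, round(delta, 2)))
    return hits

# Toy beta values for three promoter CpG sites (hypothetical)
tumor = {"cg001": [0.8, 0.9, 0.85], "cg002": [0.4, 0.5], "cg003": [0.9, 0.95]}
normal = {"cg001": [0.1, 0.2], "cg002": [0.35, 0.45], "cg003": [0.2, 0.3]}
print(candidate_markers(tumor, normal))
```

In practice the same filter would be run over public methylome datasets such as those in The Cancer Genome Atlas, with statistical testing on top of the raw mean difference.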
Expansion of our workflow to integrate a digital health platform
In addition to addressing a core issue in cancer diagnostics, our team's exploration of the patient care journey led us to identify another significant economic burden on our healthcare system: because of poor doctor-patient communication, doctors are often unable to ascertain whether a treatment has been effective. To address this issue, our team developed a functional prototype of a digital health platform.
Development of novel use cases
Although clinicians cautioned us that using promoter methylation as a diagnostic indicator could impact the overall accuracy of our test, implementing novel signal amplification strategies helped address many of these concerns. In addition to implementing our idea as an early screening tool, our interactions with industry professionals and social innovators led us to realize the value of hypermethylation's continuous nature. As such, we were able to develop a novel use case centered around post-therapy response.
IHP Flowchart
Expert Interviews
UCSD Health System
We were fortunate enough to be active in the clinical immersion process. Although patient names are withheld for confidentiality, we gained an in-depth perspective on a cancer patient's journey through diagnostics. We learned that a patient normally goes through the following stages: prognosis, diagnosis, verification through companion diagnostics, and post-therapeutic monitoring. The most common method of identifying cancer is a tissue biopsy, which presents a number of issues for patients: namely its invasiveness and its price point. Clinicians also talked about the significant emotional cost that a patient and their family face during this journey, and a secondary concern was that a long turnaround time may impede treatment. Several clinicians also mentioned that tissue biopsies are not always the best option, as they can spread cancers in certain instances or cause further complications. In addition, the first biopsy often results in an inconclusive diagnosis, and attempting a second biopsy may present risks.
Takeaway:
At this point, our team was able to see that the gold standard, or "status quo," was clearly not enough in the cancer diagnostics space. There were issues of invasiveness, price point, and overall accuracy.
Poorya Sabounchi, internal affiliate with Illumina Accelerator
Dr. Cashin currently serves as the head of the Illumina Accelerator, and Dr. Sabounchi is a startup advocate there. Dr. Sabounchi helped answer some of our key questions about next-generation sequencing (NGS) technology. In addition to describing the mechanisms of NGS, he talked about some of the key diagnostic metrics that Illumina and GRAIL, a liquid biopsy startup founded by Illumina, are currently focusing on. He explained that NGS uses deep sequencing runs to identify mutations and perform whole-genome sequencing, and he believes it can identify mutations at a better rate than existing practices. He also walked us through some of the companies in the Illumina Accelerator and how they harness the power of NGS to improve genomics.
Takeaway:
Our conversation with Dr. Sabounchi helped us understand the promise of next-generation sequencing. Although NGS could resolve the price point, it did not address one of the key patient concerns: invasiveness. We felt that we could engineer a CRISPR protein to detect mutations and give us a quantifiable readout.
Pranav Singh, Bioinformatics Team, GRAIL
After speaking with the Illumina Accelerator, we felt it would be important to understand some of the current alternatives. We reached out to members of the bioinformatics team at GRAIL to get a better sense of what they do. They first introduced the idea of liquid biopsy, the name given to the collective set of procedures that enable non-invasive cancer detection. We learned that tissue specimens do not properly capture inherent tumor heterogeneity or the ability of cancer genomes to evolve. GRAIL instead looks at cell-free DNA shed by cells and is trying to develop highly accurate tests based on it. The team also took some time to explain their day-to-day operations, noting that bioinformatics is becoming exceedingly important due to the massive amount of data generated per patient. They use bioinformatics to derive patterns that can generate further clinical insight or help develop more effective treatments in the future.
Takeaway:
After learning from GRAIL, we understood that, despite its limitations, liquid biopsy is more advantageous and addresses many of the concerns we had uncovered earlier. As a team, we decided to shift from tissue specimen analysis to liquid biopsy because of the inherent challenges with molecular heterogeneity.