Gene sequencing is enabling disease tracking in new ways that include retesting laboratory specimens from before the SARS-CoV-2 outbreak to determine when it arrived in the US
On February 26 of this year, nearly 200 executives and employees of neuroscience-biotechnology company Biogen gathered at the Boston Marriott Long Wharf hotel for their annual leadership conference. Unbeknownst to the attendees, by the end of the following day, dozens of them had been exposed to and become infected by SARS-CoV-2, the coronavirus that causes the COVID-19 illness.
Researchers now have hard evidence that attendees at this meeting returned to their communities and spread the infection. The findings of this study will be relevant to pathologists and clinical laboratory managers who are cooperating with health authorities in their communities to identify infected individuals and track the spread of the novel coronavirus.
This “superspreader” event has been closely investigated and has led to intriguing conclusions about the use of genetic sequencing to reveal vital information about the COVID-19 pandemic. Recent improvements in gene sequencing technology are giving scientists new ways to trace the spread of COVID-19 and other diseases, as well as a method for monitoring mutations and speeding research into various treatments and vaccines.
Genetic Sequencing Traces an Outbreak
“With genetic data, a record of our poor decisions is being captured in a whole new way,” Bronwyn MacInnis, PhD, Director of Pathogen Genomic Surveillance at the Broad Institute of MIT and Harvard, told The Washington Post (WaPo) during its analysis of the COVID-19 superspreading event. MacInnis is one of many Broad Institute, Harvard, MIT, and state of Massachusetts scientists who co-authored a study that detailed the coronavirus’ spread across Boston, including from the Biogen conference.
What they discovered is both surprising and enlightening. According to WaPo’s report, at least 35 new cases of the virus were linked directly to the Biogen conference, and the same strain was discovered in outbreaks in two homeless shelters in Boston, where 122 people were infected. The variant tracked by the Boston researchers was found in roughly 30% of the cases that have been sequenced in the state, as well as in Alaska, Senegal, and Luxembourg.
“The data reveal over 80 introductions into the Boston area, predominantly from elsewhere in the United States and Europe. We studied two superspreading events covered by the data, events that led to very different outcomes because of the timing and populations involved. One produced rapid spread in a vulnerable population but little onward transmission, while the other was a major contributor to sustained community transmission,” the researchers noted in their study abstract.
“The same two events differed significantly in the number of new mutations seen, raising the possibility that SARS-CoV-2 superspreading might encompass disparate transmission dynamics. Our results highlight the failure of measures to prevent importation into [Massachusetts] early in the outbreak, underscore the role of superspreading in amplifying an outbreak in a major urban area, and lay a foundation for contact tracing informed by genetic data,” they concluded.
Genetic Sequencing and Mutation Tracking
The use of genetic sequencing to trace the virus could inform measures to control the spread in new ways. Currently, however, only about 0.33% of cases in the United States are being sequenced, MacInnis told WaPo, adding that not sequencing samples is “throwing away the crown jewels of what you really want to know.”
Another role that genetic sequencing is playing in this pandemic is in tracking viral mutations. One of the ways that pandemics worsen is when viruses mutate to become deadlier or more easily spread. Scientists are using genetic sequencing to monitor SARS-CoV-2 for such mutations.
A group of scientists at Texas A&M University led by Yue Xing, PhD, published a paper titled, “MicroGMT: A Mutation Tracker for SARS-CoV-2 and Other Microbial Genome Sequences,” which explains that “Although most mutations are expected to be selectively neutral, it is important to monitor if SARS-CoV-2 will eventually evolve to be a stronger or weaker infectious agent as time goes on. Therefore, it is vital to track mutations from newly sequenced SARS-CoV-2 genomes.”
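Conceptually, tracking mutations means comparing each newly sequenced genome against a reference and recording where the two differ. The following is only a minimal sketch of that idea, not MicroGMT’s actual implementation; the function name and the short sequence fragments are invented for illustration, and real tools align whole genomes and handle insertions and deletions as well:

```python
# Toy illustration of mutation tracking: compare a pre-aligned sample genome
# against a reference and report point substitutions in the conventional
# "ref-position-alt" notation (e.g., the D614G discussion below uses the
# same style for amino acids). NOT MicroGMT's implementation.

def point_mutations(reference: str, sample: str) -> list[str]:
    """Return substitutions as strings like 'G4A' (1-based positions)."""
    if len(reference) != len(sample):
        raise ValueError("sequences must be pre-aligned to equal length")
    return [
        f"{ref}{pos}{alt}"
        for pos, (ref, alt) in enumerate(zip(reference, sample), start=1)
        if ref != alt and alt != "N"  # skip ambiguous base calls
    ]

# Made-up fragments for demonstration:
ref = "ATGGCTAGCT"
smp = "ATGACTAGTT"
print(point_mutations(ref, smp))  # ['G4A', 'C9T']
```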
Findings from a separate study, led by computational biologist Bette Korber, PhD, of Los Alamos National Laboratory, are important because the mutation those scientists identified appears to have a fitness advantage. “Our data show that, over the course of one month, the variant carrying the D614G Spike mutation became the globally dominant form of SARS-CoV-2,” they wrote. Additionally, the study noted, people infected with the mutated variant appear to have a higher viral load in their upper respiratory tracts.
Genetic Sequencing, the Race for Treatments, Vaccines, and Managing Future Pandemics
If, as Anthony Fauci, MD, and David Morens, MD, of the National Institute of Allergy and Infectious Diseases predict, future pandemics are likely, improvements in gene sequencing and analysis will become even more important for tracing, monitoring, and suppressing outbreaks. Clinical laboratory managers will want to watch this closely, as medical labs that process genetic sequencing will, no doubt, be part of that operation.
Media reports in the United Kingdom cite bad timing and centralization of public health laboratories as reasons the UK is struggling to meet testing goals
Clinical pathologists and medical laboratories in the UK and the US function within radically different healthcare systems. However, both countries faced similar problems deploying widespread diagnostic testing for SARS-CoV-2, the novel coronavirus that causes COVID-19. And the differences between America’s private healthcare system and the UK’s government-run, single-payer system are exacerbating the UK’s difficulties expanding coronavirus testing to its citizens.
The Dark Daily reported in March that a manufacturing snafu had delayed distribution of a CDC-developed diagnostic test to public health laboratories. This meant virtually all testing had to be performed at the CDC, which further slowed testing. Only later that month was the US able to significantly ramp up its testing capacity, according to data from the COVID Tracking Project.
However, the UK has fared even worse, trailing Germany, the US, and other countries, according to reports in Buzzfeed and other media outlets. On March 11, the UK government established a goal of administering 10,000 COVID-19 tests per day by late March, but fell far short of that mark, The Guardian reported. The UK government now aims to increase this to 25,000 tests per day by late April.
This compares with about 70,000 COVID-19 tests per day in Germany, the Guardian reported, and about 130,000 per day in the US (between March 26 and April 14), according to the COVID Tracking Project.
What’s Behind the UK’s Lackluster COVID-19 Testing Response
In January, when the outbreak first hit, Public Health England (PHE) “began a strict program of contact tracing and testing potential cases,” Buzzfeed reported. But due to limited medical laboratory capacity and low supplies of COVID-19 test kits, the government changed course and de-emphasized testing, instead focusing on increased ICU and ventilator capacity. (Scotland, Wales, and Northern Ireland each have separate public health agencies and national health services.)
Later, when the need for more COVID-19 testing became apparent, UK pathology laboratories had to contend with global shortages of testing kits and chemicals, The Guardian reported. At present, COVID-19 testing is limited to healthcare workers and patients displaying symptoms of pneumonia, acute respiratory distress syndrome, or influenza-like illness, PHE stated in “COVID-19: Investigation and Initial Clinical Management of Possible Cases” guidance.
Another factor that has limited widespread COVID-19 testing is the country’s highly centralized system of public health laboratories, Buzzfeed reported. “This has limited its ability to scale and process results at the same speed as other countries, despite its efforts to ramp up capacity,” the outlet noted. Public Health England, which initially performed COVID-19 testing at one lab, has expanded to 12 labs. NHS laboratories also are testing for the SARS-CoV-2 coronavirus, PHE stated in “COVID-19: How to Arrange Laboratory Testing” guidance.
Sharon Peacock, PhD, PHE’s National Infection Service Interim Director, Professor of Public Health and Microbiology at the University of Cambridge, and honorary consultant microbiologist at the Cambridge clinical and public health laboratory based at Addenbrookes Hospital, defended this approach at a March hearing of the Science and Technology Committee (Commons) in Parliament.
“Laboratories in this country have largely been merged, so we have a smaller number of larger [medical] laboratories,” she said. “The alternative is to have a single large testing site. From my perspective, it is more efficient to have a bigger testing site than dissipating our efforts into a lot of laboratories around the country.”
Writing in The Guardian, Paul Hunter, MB ChB MD, a microbiologist and Professor of Medicine at University of East Anglia, cited historical factors behind the testing issues. The public health labs, he explained, were established in 1946 as part of the National Health Service. At the time, they were part of the country’s defense against bacteriological warfare. They became part of the UK’s Health Protection Agency (now PHE) in 2003. “Many of the laboratories in the old network were shut down, taken over by local hospitals or merged into a smaller number of regional laboratories,” he wrote.
US Facing Different Clinical Laboratory Testing Problems
Meanwhile, a few medical laboratories in the US are now contending with a different problem: Unused testing capacity, Nature reported. For example, the Broad Institute of MIT and Harvard in Cambridge, Mass., can run up to 2,000 tests per day, “but we aren’t doing that many,” Stacey Gabriel, PhD, a human geneticist and Senior Director of the Genomics Platform at the Broad Institute, told Nature. Factors include supply shortages and incompatibility between electronic health record (EHR) systems at hospitals and academic labs, Nature reported.
Politico cited the CDC’s narrow testing criteria, and a lack of supplies for collecting and analyzing patient samples—such as swabs and personal protective equipment—as reasons for the slowdown in testing at some clinical laboratories in the US.
Challenges Deploying Antibody Tests in UK
The UK has also had problems deploying serology tests designed to detect whether people have developed antibodies against the virus. In late March, Peacock told members of Parliament that at-home test kits for COVID-19 would be available to the public through Amazon and retail pharmacy chains, the Independent reported. And, Politico reported that the government had ordered 3.5 million at-home test kits for COVID-19.
However, researchers at the University of Oxford who had been charged with validating the accuracy of the kits, reported on April 5 that the tests had not performed well and did not meet criteria established by the UK Medicines and Healthcare products Regulatory Agency (MHRA). “We see many false negatives (tests where no antibody is detected despite the fact we know it is there), and we also see false positives,” wrote Professor Sir John Bell, GBE, FRS, Professor of Medicine at the university, in a blog post. No test [for COVID-19], he wrote, “has been acclaimed by health authorities as having the necessary characteristics for screening people accurately for protective immunity.”
He added that it would be “at least a month” before suppliers could develop an acceptable COVID-19 test.
In the United States, the Cellex COVID-19 test is intended for use by medical laboratories. In addition, many research sites, academic medical centers, clinical laboratories, and in vitro diagnostics (IVD) companies in the US are working to develop and validate serological tests for COVID-19.
Within weeks, it is expected that a growing number of such tests will qualify for a Food and Drug Administration (FDA) Emergency Use Authorization (EUA) and become available for use in patient care.
‘Prime editing’ is what researchers are calling the proof-of-concept technique that promises improved diagnostics and more effective treatments for patients with genetic defects
The scientists developed this technique, known as prime editing, as a more accurate way to edit deoxyribonucleic acid (DNA). In a paper published in Nature, the authors claim prime editing has the potential to correct up to 89% of disease-causing genetic variations. They also claim prime editing is more powerful, precise, and flexible than CRISPR.
The research paper describes prime editing as a “versatile and precise genome editing method that directly writes new genetic information into a specified DNA site using a catalytically impaired Cas9 endonuclease fused to an engineered reverse transcriptase, programmed with a prime editing guide RNA (pegRNA) that both specifies the target site and encodes the desired edit.”
And a Harvard Gazette article states, “Prime editing differs from previous genome-editing systems in that it uses RNA to direct the insertion of new DNA sequences in human cells.”
Assuming further research and clinical studies confirm the viability of this technology, clinical laboratories would have a new diagnostic service line that could become a significant proportion of a lab’s specimen volume and test mix.
In that e-briefing we wrote that Liu “has led a team of scientists in the development of a gene-editing protein delivery system that uses cationic lipids and works on animal and human cells. The new delivery method is as effective as protein delivery via DNA and has significantly higher specificity. If developed, this technology could open the door to routine use of genome analysis, worked up by the clinical laboratory, as one element in therapeutic decision-making.”
Now, Liu has taken that development even further.
Cell Division Not Necessary
CRISPR stands for Clustered Regularly Interspaced Short Palindromic Repeats. It is considered the most advanced gene-editing technology available. However, it has one drawback not found in prime editing: CRISPR relies on a cell’s ability to divide to generate desired alterations in DNA, while prime editing does not.
This means prime editing could be used to repair genetic mutations in cells that do not always divide, such as cells in the human nervous system. Another advantage of prime editing is that it does not cut both strands of the DNA double helix. This lowers the risk of making unintended, potentially dangerous changes to a patient’s DNA.
The researchers claim prime editing can eradicate long lengths of disease-causing DNA and insert curative DNA to repair dangerous mutations. These feats, they say, can be accomplished without triggering the potentially harmful genome responses that other forms of CRISPR can introduce.
“Prime editors are more like word processors capable of searching for targeted DNA sequences and precisely replacing them with edited DNA strands,” Liu told NPR.
The scientists involved in the study have used prime editing to perform over 175 edits in human cells. In the test lab, they have succeeded in repairing genetic mutations that cause both Sickle Cell Anemia (SCA) and Tay-Sachs disease, NPR reported.
“Prime editing is really a step—and potentially a significant step—towards this long-term aspiration of the field in which we are trying to be able to make just about any kind of DNA change that anyone wants at just about any site in the human genome,” Liu told News Medical.
Additional Research Required, but Results are Promising
Prime editing is very new and warrants further investigation. The researchers plan to continue their work on the technology by performing additional testing and exploring delivery mechanisms that could lead to human therapeutic applications.
“Prime editing should be tested and optimized in as many cell types as researchers are interested in editing. Our initial study showed prime editing in four human cancer cell lines, as well as in post-mitotic primary mouse cortical neurons,” Liu told STAT. “The efficiency of prime editing varied quite a bit across these cell types, so illuminating the cell-type and cell-state determinants of prime editing outcomes is one focus of our current efforts.”
Although further research and clinical studies are needed to confirm the viability of prime editing, clinical laboratories could benefit from this technology. It’s worth watching.
Genetic data captured by this new technology could lead to a new understanding of how different types of cells exchange information and would be a boon to anatomic pathology research worldwide
What if it were possible to map the interior of cells and view their genetic sequences using chemicals instead of light? Might that spark an entirely new way of studying human physiology? That’s what researchers at the Massachusetts Institute of Technology (MIT) believe. They have developed a new approach to visualizing cells and tissues that could enable the development of entirely new anatomic pathology tests that target a broad range of cancers and diseases.
Scientists at MIT’s Broad Institute and McGovern Institute for Brain Research developed this new technique, which they call DNA Microscopy. They published their findings in Cell, titled, “DNA Microscopy: Optics-free Spatio-genetic Imaging by a Stand-Alone Chemical Reaction.”
Joshua Weinstein, PhD, a postdoctoral associate at the Broad Institute and first author of the study, said in a news release that DNA microscopy “is an entirely new way of visualizing cells that captures both spatial and genetic information simultaneously from a single specimen. It will allow us to see how genetically unique cells—those comprising the immune system, cancer, or the gut for instance—interact with one another and give rise to complex multicellular life.”
The news release goes on to state that the new technology “shows how biomolecules such as DNA and RNA are organized in cells and tissues, revealing spatial and molecular information that is not easily accessible through other microscopy methods. DNA microscopy also does not require specialized equipment, enabling large numbers of samples to be processed simultaneously.”
New Way to Visualize Cells
The MIT researchers saw an opportunity for DNA microscopy to find genomic-level cell information. They claim that DNA microscopy images cells from the inside and enables the capture of more data than with traditional light microscopy. Their new technique is a chemical-encoded approach to mapping cells that derives critical genetic insights from the organization of the DNA and RNA in cells and tissue.
And that type of genetic information could lead to new precision medicine treatments for chronic disease. New Atlas notes that “Speeding the development of immunotherapy treatments by identifying the immune cells best suited to target a particular cancer cell is but one of the many potential applications for DNA microscopy.”
In their published study, the scientists note that “Despite enormous progress in molecular profiling of cellular constituents, spatially mapping [cells] remains a disjointed and specialized machinery-intensive process, relying on either light microscopy or direct physical registration. Here, we demonstrate DNA microscopy, a distinct imaging modality for scalable, optics-free mapping of relative biomolecule positions.”
How DNA Microscopy Works
The New York Times (NYT) notes that the advantage of DNA microscopy is “that it combines spatial details with scientists’ growing interest in—and ability to measure—precise genomic sequences, much as Google Street View integrates restaurant names and reviews into outlines of city blocks.”
And Singularity Hub notes that “DNA microscopy uses only a pipette and some liquid reagents. Rather than monitoring photons, here the team relies on ‘bar codes’ that chemically tag onto biomolecules. Like cell phone towers, the tags amplify, broadcasting their signals outward. An algorithm can then piece together the captured location data and transform those GPS-like digits into rainbow-colored photos. The results are absolutely breathtaking. Cells shine like stars in a nebula, each pseudo-colored according to their genomic profiles.”
“We’ve used DNA in a way that’s mathematically similar to photons in light microscopy,” Weinstein said in the Broad Institute news release. “This allows us to visualize biology as cells see it and not as the human eye does.”
In their study, researchers used DNA microscopy to tag RNA molecules and map locations of individual human cancer cells. Their method is “surprisingly simple,” New Atlas reported. Here’s how it’s done, according to the MIT news release:
Small synthetic DNA tags (dubbed “barcodes” by the MIT team) are added to biological samples;
The “tags” latch onto molecules of genetic material in the cells;
The tags are then replicated through a chemical reaction;
The tags combine and create more unique DNA labels;
The scientists use a DNA sequencer to decode and reconstruct the biomolecules;
A computer algorithm decodes the data and converts it to images displaying the biomolecules’ positions within the cells.
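The final decoding step above is, conceptually, an inverse problem: molecules that sit close together co-amplify more often, so the counts of paired barcode reads encode relative distances. The sketch below illustrates that idea only loosely; it is not the authors’ published algorithm, the function and the tiny association matrix are hypothetical, and classical multidimensional scaling stands in for the far more involved reconstruction the study describes:

```python
# Hypothetical sketch of the reconstruction idea behind DNA microscopy:
# physically close molecules produce more paired barcode reads, so a
# pairwise association-count matrix encodes distances. Classical MDS
# then recovers relative positions. Illustration only.
import numpy as np

def positions_from_associations(counts: np.ndarray, dim: int = 2) -> np.ndarray:
    """Embed molecules in `dim` dimensions from a symmetric count matrix."""
    # More co-occurrence => smaller dissimilarity (toy conversion).
    d = 1.0 / np.maximum(counts, 1e-9)
    np.fill_diagonal(d, 0.0)
    d2 = d ** 2
    n = d2.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    b = -0.5 * j @ d2 @ j                 # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)        # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:dim]  # keep the top `dim` eigenpairs
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Three molecules: 0 and 1 co-amplify strongly (close); 2 is far from both.
counts = np.array([[ 0.0, 10.0, 1.0],
                   [10.0,  0.0, 1.0],
                   [ 1.0,  1.0, 0.0]])
xy = positions_from_associations(counts)
# In the recovered layout, molecules 0 and 1 sit much closer to each
# other than either does to molecule 2.
```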
“The first time I saw a DNA microscopy image, it blew me away,” said Aviv Regev, PhD, a biologist at the Broad Institute, a Howard Hughes Medical Institute (HHMI) Investigator, and co-author of the MIT study, in an HHMI news release. “It’s an entirely new category of microscopy. It’s not just a technique; it’s a way of doing things that we haven’t ever considered doing before.”
Precision Medicine Potential
“Every cell has a unique make-up of DNA letters or genotype. By capturing information directly from the molecules being studied, DNA microscopy opens up a new way of connecting genotype to phenotype,” said Feng Zhang, PhD, MIT Neuroscience Professor, Core Institute Member of the Broad Institute, and Investigator at the McGovern Institute for Brain Research at MIT, in the HHMI news release.
In other words, DNA microscopy could someday have applications in precision medicine. The MIT researchers, according to STAT, plan to expand the technology further to include immune cells that target cancer.
The Broad Institute has applied for a patent on DNA microscopy. Clinical laboratory and anatomic pathology group leaders seeking novel resources for diagnosis and treatment of cancer may want to follow the MIT scientists’ progress.
Next step is to design Web portal offering low-cost ‘polygenic risk score’ to people willing to upload genetic data received from DNA testing companies such as 23andMe
Their study, published last month in Nature Genetics, found that a genome analysis called polygenic risk scoring can identify individuals with a high risk of developing one of five potentially deadly diseases: coronary artery disease, atrial fibrillation, type 2 diabetes, inflammatory bowel disease, or breast cancer.
Polygenic Scoring Predicts Risk of Disease Among General Population
To date, most genetic testing has been “single gene,” focusing on rare mutations in specific genes such as those causing sickle cell disease or cystic fibrosis. This latest research indicates that polygenic predictors could be used to discover heightened risk factors in a much larger portion of the general population, enabling early interventions to prevent disease before other warning signs appear, which is the ultimate goal of precision medicine.
“We’ve known for a long time that there are people out there at high risk for disease based just on their overall genetic variation,” senior author Sekar Kathiresan, MD, co-Director of the Medical and Population Genetics Program at the Broad Institute, and Director, Center for Genomic Medicine at Massachusetts General Hospital, said in a Broad Institute news release. “Now, we’re able to measure that risk using genomic data in a meaningful way. From a public health perspective, we need to identify these higher-risk segments of the population, so we can provide appropriate care.”
“What I foresee is in five years, each person will know this risk number—this ‘polygenic risk score’—similar to the way each person knows his or her cholesterol,” Sekar Kathiresan, MD (above), Co-Director of the Medical and Population Genetics Program at the Broad Institute, and Director, Center for Genomic Medicine at Massachusetts General Hospital, told the Associated Press (AP). He went on to say a high-risk score could lead to people taking other steps to lower their overall risk for specific diseases, while a low-risk score “doesn’t give you a free pass” since an unhealthy lifestyle can lead to disease as well. (Photo copyright: Massachusetts General Hospital.)
The researchers conducted the study using data from more than 400,000 individuals in the United Kingdom Biobank. They created a risk score for coronary artery disease by looking at 6.6 million single-letter genetic changes that are more prevalent in people who have had early heart attacks. Of the individuals in the UK Biobank dataset, 8% were more than three times as likely to develop the disease compared to everyone else, based on their genetic variation.
In absolute terms, only 0.8% of individuals with the very lowest polygenic risk scores had coronary artery disease, compared to 11% for people with the highest scores, the Broad Institute news release stated.
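At its core, a polygenic risk score is a weighted sum: each variant’s risk-allele dosage (0, 1, or 2 copies) is multiplied by an effect weight estimated from genome-wide association data, the products are summed, and the total is placed on the population distribution. The sketch below is a toy illustration of that arithmetic only; the variant IDs, weights, and population are invented, and real scores like the coronary artery disease score in the study combine millions of variants:

```python
# Toy polygenic risk score: weighted sum of risk-allele dosages.
# Variant IDs and effect weights are hypothetical; real scores use
# weights derived from GWAS summary statistics across millions of sites.
import numpy as np

weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}  # invented

def polygenic_score(dosages: dict[str, int]) -> float:
    """Sum of (risk-allele dosage x effect weight) over scored variants."""
    return sum(weights[v] * dosages.get(v, 0) for v in weights)

def percentile(score: float, population_scores: np.ndarray) -> float:
    """Where this score falls in the population distribution (0-100)."""
    return float(100.0 * np.mean(population_scores < score))

person = {"rs0001": 2, "rs0002": 1, "rs0003": 1}  # dosages: 0, 1, or 2
s = polygenic_score(person)  # 2*0.12 - 0.05 + 0.30 = 0.49
pop = np.random.default_rng(0).normal(0.0, 0.2, 10_000)  # simulated cohort
print(f"score={s:.2f}, percentile={percentile(s, pop):.0f}")
```

It is this percentile ranking, rather than the raw sum, that lets researchers flag the top slice of a population (such as the 8% noted above) as carrying several times the average risk.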
“The results should be eye-opening for cardiologists,” Charles C. Hong, MD, PhD, Director of Cardiovascular Research at the University of Maryland School of Medicine, told the AP. “The only disappointment is that this score applies only to those with European ancestry, so I wonder if similar scores are in the works for the large majority of the world population that is not white.”
In its news release, the Broad Institute noted the need for additional studies to “optimize the algorithms for other ethnic groups.”
The Broad Institute’s results suggest, however, that as many as 25 million people in the United States may be at more than triple the normal risk for coronary artery disease. And millions more may be at similar elevated risk for the other conditions, based on genetic variations alone.
Reanalyzing Data from DNA Testing Companies
The researchers are building a website that would enable users to receive a low-cost polygenic risk score for many common diseases by reanalyzing data they previously received from DNA testing companies such as 23andMe.
Kathiresan told Forbes his goal is for the 17 million people who have used genotyping services to submit their data to the web portal he is building. He told the magazine he’s hoping “people will be able to get their polygenic scores for about as much as the cost of a cholesterol test.”
Some Experts Not Impressed with Broad Institute Study
But not all experts believe the Broad Institute/MGH/Harvard Medical School study deserves so much attention. Ali Torkamani, PhD, Director of Genomics and Genome Informatics at the Scripps Research Translational Institute, offered a tepid assessment of the Nature Genetics study.
In an article in GEN that noted polygenic risk scores were receiving “the type of attention reserved for groundbreaking science,” Torkamani said the recent news is “not particularly” a big leap forward in the field of polygenic risk prediction. He described the results as “not a methodological advance or even an unexpected result,” noting his own group had generated similar data for type 2 diabetes in their analysis of the UK dataset.
Nevertheless, Kathiresan is hopeful the study will advance disease treatment and prevention. “Ultimately, this is a new type of genetic risk factor,” he said in the news release. “We envision polygenic risk scores as a way to identify people at high or low risk for a disease, perhaps as early as birth, and then use that information to target interventions—either lifestyle modifications or treatments—to prevent disease.”
This latest research indicates healthcare providers could soon be incorporating polygenic risk scoring into routine clinical care. Not only would doing so mean another step forward in the advancement of precision medicine, but clinical laboratories and pathology groups also would have new tools to help diagnose disease and guide treatment decisions.