Archive: Food for Thought

Food for Thought: The Landscape of Selection Biomarkers in Oncology Trials

Hundreds of clinical trials in oncology already use biomarkers to identify patients who have a higher or lower risk of disease progression, as well as to help predict how patients will respond to different treatments. However, there has been no systematic overview of the landscape of biomarker use in oncology trials.

In this week’s Science Translational Medicine, Robert Sikorski and Bin Yao present the results of their laudable and laborious effort to analyse the public database ClinicalTrials.gov for this kind of information.
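Anyone who wants to repeat this kind of survey today can query ClinicalTrials.gov programmatically. The sketch below is illustrative only: it assumes the current public ClinicalTrials.gov v2 REST API and its parameter names, which did not exist when Sikorski and Yao did their analysis, and it uses a crude free-text search term as a stand-in for their manual curation of selection biomarkers.

```python
# Illustrative sketch: count registered trials per cancer type whose record
# mentions "biomarker". Endpoint and parameter names are assumptions based on
# the public ClinicalTrials.gov v2 REST API; the search term is only a crude
# proxy for a proper curation of selection biomarkers.
import requests

BASE = "https://clinicaltrials.gov/api/v2/studies"

def count_trials(condition: str, term: str = "biomarker") -> int:
    """Return the total number of registered studies matching a condition and free-text term."""
    params = {
        "query.cond": condition,   # disease / condition, e.g. "prostate cancer"
        "query.term": term,        # naive free-text biomarker filter
        "countTotal": "true",      # ask the API to report the total hit count
        "pageSize": 1,             # we only need the count, not the records
    }
    resp = requests.get(BASE, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("totalCount", 0)

if __name__ == "__main__":
    for cancer in ["leukemia", "prostate cancer", "lung cancer", "breast cancer"]:
        print(f"{cancer:16s} {count_trials(cancer):6d} biomarker-mentioning trials")
```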

Their findings can be summarized as follows:

(1) More biomarker work is done in less frequent tumors (such as leukemias) than in more frequent types such as prostate cancer, so it seems that the big cancer indications will not be the first to be segmented into smaller populations with better treatment options.

(2) There are relatively few selection biomarkers for the major solid tumor indications that will enter clinical practice through current Phase III trials. However, as the database does not include retrospective analyses of completed trials, this point is difficult to assess.

(3) Implementation of biomarkers in clinical trials adds a substantial layer of complexity, increasing costs and making intertrial comparisons more difficult.

(4) Next-generation approaches that apply whole-cancer genome analysis to identify changes associated with therapeutic response will increasingly serve as a major disruptive force reshaping the cancer biomarker landscape.

They forecast that the future of oncology trials will be the study of biomarker-defined patients in smaller, randomized Phase III trials. In addition, they conclude that it should soon be feasible to obtain the complete profile of DNA alterations, DNA copy number changes, and even DNA methylation patterns within a tumor for all subjects in Phase I and II studies.

“The resulting ability,” the authors write, “to target new treatments by tumor molecular signatures early in drug development is transformative and offers the promise of demonstrating significantly greater clinical utility with smaller study populations.”
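To make the “smaller study populations” argument concrete, here is a back-of-the-envelope power calculation. It is a sketch under standard textbook assumptions (two-arm trial, normal approximation, 5% two-sided alpha, 80% power) and the effect sizes are invented for illustration, not taken from the paper: if biomarker selection enriches for responders and thereby doubles the standardized treatment effect, the required number of patients per arm drops roughly fourfold.

```python
# Back-of-the-envelope illustration of why biomarker enrichment allows smaller
# trials: standard two-arm sample-size formula for comparing two means
# (normal approximation). Effect sizes below are hypothetical.
from math import ceil
from statistics import NormalDist

def n_per_arm(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Patients per arm to detect a standardized effect (Cohen's d) of `effect_size`."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_beta = NormalDist().inv_cdf(power)            # desired power
    return ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# Hypothetical numbers: an unselected population dilutes the treatment effect,
# a biomarker-selected population concentrates it.
print("unselected (d = 0.2):", n_per_arm(0.2), "patients per arm")   # ~393
print("selected   (d = 0.4):", n_per_arm(0.4), "patients per arm")   # ~99
```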

Food for Thought: Why tissue sample quality matters for personalized medicine

“We now have the technical ability to get the wrong answers with unprecedented speed.” Carolyn Compton, Director, Office of Biorepositories and Biospecimen Research

When the U.S. National Cancer Institute recently started its Cancer Genome Atlas initiative and asked biobanks all over the world for cancer biopsy samples, it was puzzled to find that the quality of the donated samples was so poor that the NCI was unable to meet the moderate target of collecting 1,500 biopsy samples per cancer. In a telling article in Wired magazine, Steve Silberman gives the example of a university biobank that claimed to have more than 12,000 samples of glioblastoma in its collection; the initiative judged only 18 of them good enough to use. Even after contacting biobanks on a global scale, the researchers did not reach 500 glioblastoma samples of satisfactory quality and barely got to 500 in ovarian cancer, the fifth most common cancer in women. In lung cancer, the initiative was unable to start at all because it simply could not obtain the minimum number of biopsy samples of adequate quality. “However, all biobanks thought they were doing a superb job,” summarized Carolyn Compton, director of NCI’s Office of Biorepositories and Biospecimen Research (OBBR) and responsible for the biopsy sampling part of the initiative.
The reason for the poor quality is simple: within minutes of cutting tissue off from its blood supply, cells react with massive changes in gene methylation patterns, gene expression and translation, proteome composition, enzymatic activities, surface protein patterns, and more. These changes affect hundreds of genes, and in many hospitals it is common for the resected cancer tissue to lie around for hours at room temperature in the operating theater before it is put in the freezer, only to be formalin-fixed a few days later.
What is more, the medication the patient has been given prior to or during the operation (sedatives, anesthetics, etc.) has a profound impact on these parameters as well.
Therefore, it is very often impossible to judge whether the differences observed between individual patients are a result of their inherently different metabolisms and genetic makeup or a consequence of differences in how the biopsies were sampled and handled and in the medications given.
“We now have the technical ability to get the wrong answers with unprecedented speed,” Compton says. “If we put the wrong stuff into the front end of our analytical pipeline, we will not only lose the war on cancer, we’ll pollute the scientific literature with incorrect data that will take us a long time to sort out. This is a crisis that requires disruptive innovation.”
OBBR is now systematically looking into the problem and has chosen one company to perform the first systematic studies: Hamburg-based Indivumed GmbH. The company did pioneering research and devised standards for cancer biopsy sampling that are applied in a network of clinics Indivumed collaborates with in the Hamburg and Washington, DC, areas. The company runs the only ISO-certified biobank in the world and offers biospecimens, related patient data, and services including biomarker development for the purpose of developing personalized cancer therapies. By employing specially trained nurses, the company guarantees that each sample is frozen or fixed within 12 minutes, and each sample comes with a data package comprising several hundred data points on the patient’s medical history and lifestyle. Further information about Indivumed, a client of akampion, can be found here.

Food for Thought: Synthetic Biology

A lot has been written since last week’s publication of Craig Venter’s latest coup – the creation of the first cell controlled by a synthetic genome. While the reactions range from the alarmist (“playing God”) to the dismissive (“nothing new”), most commentaries overlook that Venter has demonstrated that life – for now, bacteria – can be customized to an extent that far exceeds what conventional genetic technologies, which merely introduce a few new genes into existing organisms, can achieve.

For now, it is impossible to forecast the success of building and introducing synthetic genomes to manufacture organisms that churn out biofuel or clean up polluted shores with unprecedented efficiency. Synthetic genomes have only recently become available because the technology to accurately synthesize and assemble large pieces of DNA has made tremendous progress, driving down the price of synthetic DNA. Venter used 1,078 cassettes of 1,080 base pairs each, which were assembled into a genome of 1.08 million base pairs. In comparison, the E. coli genome consists of about 4.6 million base pairs.
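The cassette figures invite a quick sanity check. The arithmetic below is only a sketch based on the numbers quoted above and on the assumption that neighbouring cassettes overlap so they can be stitched together in a roughly circular assembly; it suggests an average overlap of about 80 base pairs per junction.

```python
# Quick arithmetic on the synthetic-genome assembly figures quoted above.
# Assumes neighbouring cassettes overlap (so they can be joined) and that the
# assembled genome is circular, i.e. there are as many junctions as cassettes.
cassettes = 1078
cassette_len = 1080          # base pairs per cassette
genome_len = 1.08e6          # assembled genome, base pairs

raw_total = cassettes * cassette_len          # 1,164,240 bp of synthesized DNA
overlap_total = raw_total - genome_len        # DNA "used up" in the joints
overlap_per_junction = overlap_total / cassettes

print(f"synthesized DNA:        {raw_total:,.0f} bp")
print(f"total overlap:          {overlap_total:,.0f} bp")
print(f"avg overlap / junction: {overlap_per_junction:.0f} bp")   # roughly 80 bp
```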

The real challenge now is understanding and mastering the interplay of the many genes that make up a functioning genome. Venter’s new bacteria provide an exciting testing ground for this kind of research, which is why the first companies to profit from this innovation will be the ones providing the technologies to synthesize and assemble large, complex stretches of DNA.

In the meantime, the places to follow the field’s progress on a regular basis are SyntheticBiology and the Biobricks Foundation.

Food for Thought: Open Source Principles – A Concept for the Life Sciences?

In the IT industry, open source is an established development model for software, built on peer review and a transparent development process. The promise of open source is better quality, i.e. higher reliability, more flexibility, and lower costs, among other benefits.
Now, this principle is spreading to the life sciences. For one, there is the Open Source Sensing Initiative, which is trying to apply a bottom-up, decentralized approach to the development of sensors for security and environmental purposes. Read more…
