Since each cell in the body contains roughly 23,000 genes, identifying the specific genes involved in cancer growth is an exceedingly complex task. Researchers used a form of artificial intelligence called machine learning to identify three genes that allowed them to determine whether a tumour was fed by estrogen.
"People can’t possibly sort through all this information and find the important patterns," said senior author Russ Greiner, a professor in the Department of Computing Science and investigator with the Alberta Innovates Centre for Machine Learning. "Machines have other limitations, but what they can do is go through high-dimensional data. With our techniques, we can find combinations of biomarkers that can predict important properties of specific breast cancers."
Greiner’s team created an algorithm that proved 93 per cent accurate in predicting the estrogen receptor status of tumours. To do this, they relied on data gathered from 176 frozen tumour samples stored at the Canadian Breast Cancer Foundation Tumor Bank at the Cross Cancer Institute in Edmonton.
The same algorithm was later tested on other data sets available online, with similar success, and its predictions were cross-checked against the results of traditional estrogen-receptor testing performed by pathologists.
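The article gives only a high-level description of the method, so the following is a minimal sketch of that general workflow: select a small gene panel from high-dimensional expression data, train a classifier to predict estrogen receptor (ER) status, and check it against an independently labelled data set. All data below are synthetic placeholders for the tumour-bank and online cohorts, and the feature-selection and classification choices (univariate selection plus logistic regression) are assumptions rather than the researchers' published approach.

```python
# A hedged sketch of the workflow described above, NOT the researchers' actual
# method (the article does not specify the learning algorithm or the three genes).
# Synthetic data stands in for the 176 tumour samples and the external data set.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

def synthetic_cohort(n_samples, n_genes=23000, n_informative=3):
    """Toy expression matrix in which a few genes track ER status."""
    y = rng.integers(0, 2, size=n_samples)        # 1 = ER-positive, 0 = ER-negative
    X = rng.normal(size=(n_samples, n_genes))
    X[:, :n_informative] += y[:, None] * 2.0      # planted signal for illustration
    return X, y

X_train, y_train = synthetic_cohort(176)          # stand-in for the tumour-bank cohort
X_ext, y_ext = synthetic_cohort(60)               # stand-in for an online data set

model = make_pipeline(
    SelectKBest(f_classif, k=3),                  # keep a three-gene panel
    LogisticRegression(max_iter=1000),
)

# Cross-validated accuracy on the training cohort...
cv_acc = cross_val_score(model, X_train, y_train, cv=5).mean()

# ...then a single fit and an external check against pathologist-style ER labels.
model.fit(X_train, y_train)
ext_acc = accuracy_score(y_ext, model.predict(X_ext))
print(f"cross-validated accuracy: {cv_acc:.2f}, external accuracy: {ext_acc:.2f}")
```

Keeping the gene selection inside the pipeline means it is re-fit within each cross-validation fold, so the reported accuracy is not inflated by information leaking from held-out samples.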
"Essentially, we’ve identified something inexpensive and simple that could replace receptor testing done in a clinical lab," said co-author John Mackey, director of Cross Cancer Institute Clinical Trials Unit, Alberta Health Services. "This is a new way of sifting through thousands of signals and pulling out the wheat from the chaff. In principle, this could be applied to other biomarkers and distil data down into something that a clinician can use."
Mackey, who is also a professor of medical oncology with the Faculty of Medicine & Dentistry, said the technique is poised to take advantage of new gene-sequencing technologies, part of the field of genomics, which aims to understand the inner workings of cancer cells with the goal of tailoring treatments to individual patients.
It’s still premature to consider the algorithm as a replacement for traditional lab tests, but that could change as new technologies become more affordable, perhaps in five to eight years.
"We’re not there yet, but at some point it’s going to be cheaper to take a tumour and put it into the machine and get these thousands of signals about its biology than it is to do the increasing number of required tests using traditional techniques in a lab," Mackey said. "When those two lines intersect, we’re going to switch to using the new technologies, and we will need algorithms like this to make sense of the data."
Source: http://www.sciencedaily.com/releases/2013/12/131202171924.htm