Artificial Intelligence Development

News

Airport security-style technology could help doctors decide on stroke treatment

A new computer program could help doctors predict which patients might suffer potentially fatal side-effects from a key stroke treatment.

The program, which assesses brain scans using pattern recognition software similar to that used in airport security and passport control, has been developed by researchers at Imperial College London. Results of a pilot study using the software, funded by the Wellcome Trust, are published in the journal Neuroimage Clinical.

Stroke affects over 15 million people each year worldwide. Ischemic strokes are the most common; these occur when small clots interrupt the blood supply to the brain. The most effective treatment is intravenous thrombolysis, in which a chemical is injected into the blood vessels to break up or 'bust' the clots, allowing blood to flow again.

However, because intravenous thrombolysis effectively thins the blood, it can cause harmful side effects in about six per cent of patients, who suffer bleeding within the skull. This often worsens the disability and can cause death.

Clinicians attempt to identify patients most at risk of bleeding on the basis of several signs assessed from brain scans. However, these signs can often be very subtle and human judgements about their presence and severity tend to lack accuracy and reliability.

In the new study, researchers trained a computer program to recognize patterns in the brain scans that represent signs such as brain-thinning or diffuse small-vessel narrowing, in order to predict the likelihood of bleeding. They then pitted the automated pattern recognition software against radiologists' ratings of the scans. The computer program predicted the occurrence of bleeding with 74 per cent accuracy compared to 63 per cent for the standard prognostic approach.
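
As a rough illustration of this pattern-recognition step (a minimal sketch, not the Imperial team's actual pipeline), the snippet below trains a generic classifier on per-patient feature vectors assumed to have been extracted from the CT scans and estimates its accuracy by cross-validation; the feature extraction, classifier choice and placeholder data are all assumptions.

```python
# Minimal sketch, not the study's code: fit a classifier to per-patient scan
# features and estimate prediction accuracy by cross-validation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Assume each scan has been reduced to a feature vector capturing signs such
# as brain thinning or small-vessel disease; random placeholders stand in here.
n_patients, n_features = 116, 50
X = rng.normal(size=(n_patients, n_features))        # one row per patient scan
y = rng.integers(0, 2, size=n_patients)              # 1 = later bled, 0 = did not

clf = SVC(kernel="linear", class_weight="balanced")  # balanced: bleeds are rare
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```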

Dr Paul Bentley from the Department of Medicine, lead author of the study, said: "For each patient that doctors see, they have to weigh up whether the benefits of a treatment will outweigh the risks of side effects. Intravenous thrombolysis carries the risk of very severe side effects for a small proportion of patients, so having the best possible information on which to base our decisions is vital. Our new study is a pilot but it suggests that ultimately doctors might be able to use our pattern recognition software, alongside existing methods, in order to make more accurate assessments about who is most at risk and treat them accordingly. We are now planning to carry out a much larger study to more fully assess its potential."

The research team conducted a retrospective analysis of computerized tomography (CT) scans from 116 patients. These are scans that use x-rays to produce 'virtual slices' of the brain. All the patients had suffered ischemic strokes and undergone intravenous thrombolysis at Charing Cross Hospital. The sample included scans from 16 patients who had subsequently developed serious bleeding within the brain.

Without knowing the outcomes of the treatment, three independent experts examined the scans and used standard prognostic tools to predict whether patients would develop bleeding after treatment.

In parallel the computer program directly assessed and classified the patterns of the brain scans to produce its own predictions.

Researchers evaluated the performance of both approaches by comparing their predictions of bleeding with the actual experiences of the patients.

Using a statistical test, the researchers showed that the computer program predicted the occurrence of bleeding with 74 per cent accuracy, compared to 63 per cent for the standard prognostic approach.

The researchers also gave the computer a series of 'identity parades' by asking the software to choose which patient out of ten scans went on to suffer bleeding. The computer correctly identified the patient 56 per cent of the time while the standard approach was correct 31 per cent of the time.
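
The 'identity parade' scoring can be illustrated with a short, hypothetical calculation: for each parade of ten scans containing exactly one patient who later bled, the model scores a hit if it assigns that patient the highest predicted risk. The scoring rule and synthetic data below are assumptions, not the study's code.

```python
# Hypothetical 'identity parade' evaluation: one bleeder hidden among ten scans.
import numpy as np

rng = np.random.default_rng(1)

def parade_accuracy(risk_scores, bled_index):
    """risk_scores: (n_parades, 10) predicted risks; bled_index: true index per parade."""
    picks = risk_scores.argmax(axis=1)               # pick the highest-risk scan
    return (picks == bled_index).mean()

n_parades = 1000
bled_index = rng.integers(0, 10, size=n_parades)
scores = rng.normal(size=(n_parades, 10))
scores[np.arange(n_parades), bled_index] += 0.8      # assumed signal for true bleeders
print(f"identity-parade accuracy: {parade_accuracy(scores, bled_index):.2f}")
```

With ten candidates per parade, picking at random would be right only 10 per cent of the time, so both the software's 56 per cent and the experts' 31 per cent are well above chance.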

The researchers are keen to explore whether their software could also be used to identify stroke patients who might be helped by intravenous thrombolysis but are not currently offered this treatment. At present only about 20 per cent of patients with strokes are treated using intravenous thrombolysis, as doctors usually exclude those with particularly severe strokes or patients who suffered the stroke more than four and a half hours before arriving at hospital. The researchers believe that their software has the potential to help doctors identify which of those patients are at low risk of suffering side effects and hence might benefit from treatment.

Story Source:

The above story is based on materials provided by Imperial College London. Note: Materials may be edited for content and length.

 

New technology for greenhouses developed

Agricultural and fruit producers could acquire high-tech greenhouses at considerably lower cost, thanks to experts at the Autonomous University of Zacatecas (UAZ) in northern Mexico who have developed computer systems to control climatic variables within such infrastructure.

According to Luis Octavio Solís Sánchez, researcher in the Department of Electrical Engineering, the factors that drive up the cost of imported greenhouses include the sophistication of their automation technology, their size, and the type of materials used.

For this reason, the experts developed technology to automate climatic variables that costs nearly half a million Mexican pesos; that is, only 10 percent of the maximum purchase price of an imported greenhouse.

Solís Sánchez said the technology consists of a motherboard, embedded computer systems (for specific functions), a graphical interface for monitoring variables such as humidity, temperature, wind speed and radiation, as well as elements that enable wireless connectivity between the greenhouse and mobile devices like cell phones.
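
To make that architecture concrete, here is a hypothetical control-loop sketch of how such an embedded system might poll climatic variables and drive actuators; the sensor and actuator functions, target ranges and variable names are illustrative assumptions, not the UAZ implementation.

```python
# Illustrative greenhouse control loop with placeholder sensor/actuator drivers.
import time

TARGETS = {                       # assumed target ranges for the crop
    "temperature_c": (18.0, 28.0),
    "humidity_pct": (60.0, 85.0),
}

def read_sensors():
    """Placeholder for the embedded system's sensor drivers."""
    return {"temperature_c": 30.2, "humidity_pct": 55.0,
            "wind_speed_ms": 1.4, "radiation_wm2": 610.0}

def actuate(name, state):
    """Placeholder for relay control (vents, fans, misting)."""
    print(f"actuate {name} -> {state}")

def control_step():
    readings = read_sensors()
    _, t_hi = TARGETS["temperature_c"]
    actuate("vents_open", readings["temperature_c"] > t_hi)
    h_lo, _ = TARGETS["humidity_pct"]
    actuate("misting_on", readings["humidity_pct"] < h_lo)
    return readings               # in the real system, also sent to mobile devices

if __name__ == "__main__":
    for _ in range(3):            # in practice this would run continuously
        control_step()
        time.sleep(1)
```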

The UAZ development also saves costs in that it can be repaired within the country, and operator training is included at the time of purchase, with no need for foreign specialists.

The progressive implementation of these greenhouses in different areas of the country would entail multiple benefits for domestic producers. "Automatic control of microclimates has the potential to reduce the total cost of water for agriculture, which in Mexico accounts for almost 70 percent of water use. It also allows users to obtain crops equivalent to an area of 10 hectares in just 500 square meters," says the researcher.

Solís Sánchez stressed that this project has passed the pilot stage and the technology has been transferred to companies interested in marketing it. A second phase of the technology is the development of neural networks to give the greenhouses some artificial intelligence. (Agencia ID)

Story Source:

The above story is based on materials provided by Investigación y Desarrollo. Note: Materials may be edited for content and length.

 

Applying math to biology: Software identifies disease-causing mutations in undiagnosed illnesses

A computational tool developed at the University of Utah (U of U) has successfully identified diseases with unknown gene mutations in three separate cases, U of U researchers and their colleagues report in a new study in The American Journal of Human Genetics. The software, Phevor (Phenotype Driven Variant Ontological Re-ranking tool), identifies undiagnosed illnesses and unknown gene mutations by analyzing the exome, the protein-coding regions of DNA, in individual patients and small families.

Sequencing the genomes of individuals or small families often produces false predictions of mutations that cause diseases. But the study, conducted through the new USTAR Center for Genetic Discovery at the U of U, shows that Phevor's unique approach allows it to identify disease-causing genes more precisely than other computational tools.

Mark Yandell, Ph.D., professor of human genetics, led the research. He was joined by co-authors Martin Reese, Ph.D., of Omicia Inc., an Oakland, Calif., genome interpretation software company; Stephen L. Guthery, M.D., professor of pediatrics, who saw two of the cases in clinic; a colleague at the MD Anderson Cancer Center in Houston; and other U of U researchers. Marc V. Singleton, a doctoral student in Yandell's lab, is the first author.

Phevor represents a major advance in personalized health care, according to Lynn B. Jorde, Ph.D., U of U professor and chair of human genetics and also a co-author on the study. As the cost of genome sequencing continues to drop, Jorde expects it to become part of standardized health care within a few years, making diagnostic tools such as Phevor more readily available to clinicians.

"With Phevor, just having the DNA sequence will enable clinicians to identify rare and undiagnosed diseases and disease-causing mutations," Jorde said. "In some cases, they'll be able to make the diagnosis in their own offices."

Phevor works by using algorithms that combine the probabilities of gene mutations being involved in a disease with databases of phenotypes, the physical manifestations of a disease, and information on gene functions. By combining those factors, Phevor identifies an undiagnosed disease or the gene mutation most likely to be causing a disease. It is particularly useful when clinicians want to identify an illness or gene mutation involving a single patient, or the patient and two or three other family members, which is the most common clinical situation for undiagnosed diseases.
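
A minimal sketch of this re-ranking idea is shown below, assuming a simple multiplicative combination of a variant-prioritization score and a phenotype-match score for each candidate gene; the combination rule, gene names and numbers are hypothetical and do not represent Phevor's actual algorithm or data.

```python
# Hypothetical re-ranking: combine variant evidence with phenotype/gene-function fit.
def rerank(variant_scores, phenotype_scores):
    """Both inputs map gene -> score in [0, 1]; returns genes ranked by combined score."""
    combined = {}
    for gene, v in variant_scores.items():
        p = phenotype_scores.get(gene, 0.01)   # small prior if no phenotype link known
        combined[gene] = v * p                 # assumed combination rule
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative candidate genes surviving exome filtering (names are placeholders).
variant_scores   = {"GENE_A": 0.80, "GENE_B": 0.75, "GENE_C": 0.60}
phenotype_scores = {"GENE_B": 0.90, "GENE_C": 0.20}  # GENE_B best fits the phenotype
print(rerank(variant_scores, phenotype_scores))      # GENE_B rises to the top
```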

Yandell, the lead developer of the software, describes Phevor as the application of mathematics to biology. "Phevor is a way to try to get the most out of a child's genome to identify diseases or find disease-causing gene mutations," Yandell said.

The published research cites the case of a 6-month-old infant who was ill with what appeared to be a liver problem, but the child's health care providers couldn't diagnose exactly what was wrong. Phevor solved the mystery by identifying the disease and finding an unknown gene mutation that caused it. In two other cases, Phevor identified unknown gene mutations related to an immunodeficiency disease and autoimmunity disorder in the same way -- by sifting through sequenced parts of the genomes of the two young patients and two or three family members.

In one case, Yandell and colleagues used Phevor and another computational tool, VAAST (Variant Annotation, Analysis, Search Tool), to look for the likely mutation in an immunodeficiency syndrome found in three of four members in a family. Blood was taken from each family member, plus an unrelated person who showed the same symptoms as the mother and two children in the family, for DNA sequencing.

VAAST, also developed in Yandell's laboratory, identified a number of mutations that might have caused the syndrome, but couldn't identify an individual candidate as the causative gene. But using the results from VAAST in combination with Phevor, Yandell and colleagues identified the one gene that most likely caused the syndrome. Follow-up studies confirmed Phevor's prediction.

In a similar case with a 12-year-old whose exome was sequenced without any family data, Phevor built on the analysis of VAAST to identify a gene mutation causing the illness, another autoimmune syndrome. In this case, Phevor needed the exome from only the patient to identify the syndrome.

   

Odds that global warming is due to natural factors: Slim to none

An analysis of temperature data since 1500 all but rules out the possibility that global warming in the industrial era is just a natural fluctuation in the earth's climate, according to a new study by McGill University physics professor Shaun Lovejoy.

The study, published online April 6 in the journal Climate Dynamics, represents a new approach to the question of whether global warming in the industrial era has been caused largely by man-made emissions from the burning of fossil fuels. Rather than using complex computer models to estimate the effects of greenhouse-gas emissions, Lovejoy examines historical data to assess the competing hypothesis: that warming over the past century is due to natural long-term variations in temperature.

"This study will be a blow to any remaining climate-change deniers," Lovejoy says. "Their two most convincing arguments - that the warming is natural in origin, and that the computer models are wrong - are either directly contradicted by this analysis, or simply do not apply to it."

Lovejoy's study applies statistical methodology to determine the probability that global warming since 1880 is due to natural variability. His conclusion: the natural-warming hypothesis may be ruled out "with confidence levels greater than 99%, and most likely greater than 99.9%."

To assess the natural variability before much human interference, the new study uses "multi-proxy climate reconstructions" developed by scientists in recent years to estimate historical temperatures, as well as fluctuation-analysis techniques from nonlinear geophysics. The climate reconstructions take into account a variety of gauges found in nature, such as tree rings, ice cores, and lake sediments. And the fluctuation-analysis techniques make it possible to understand the temperature variations over wide ranges of time scales.
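
The logic of such a test can be sketched as a one-sided significance calculation, assuming we had a large sample of century-scale natural temperature fluctuations from a pre-industrial proxy reconstruction; the synthetic distribution below is a placeholder, and the calculation is a simplified stand-in for the paper's fluctuation analysis.

```python
# Simplified stand-in for the test: how often do natural century-scale
# fluctuations match or exceed the observed industrial-era warming?
import numpy as np

rng = np.random.default_rng(42)

# Placeholder sample of century-scale temperature changes that would, in the
# real analysis, come from multi-proxy reconstructions of pre-industrial climate.
natural_fluctuations = rng.normal(loc=0.0, scale=0.25, size=1_000_000)

observed_warming = 0.9            # degrees Celsius since 1880 (figure from the study)
p_value = np.mean(natural_fluctuations >= observed_warming)
print(f"P(natural fluctuation >= {observed_warming} C) ~ {p_value:.2g}")
```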

For the industrial era, Lovejoy's analysis uses carbon-dioxide from the burning of fossil fuels as a proxy for all man-made climate influences - a simplification justified by the tight relationship between global economic activity and the emission of greenhouse gases and particulate pollution, he says. "This allows the new approach to implicitly include the cooling effects of particulate pollution that are still poorly quantified in computer models," he adds.

While his new study makes no use of the huge computer models commonly used by scientists to estimate the magnitude of future climate change, Lovejoy's findings effectively complement those of the Intergovernmental Panel on Climate Change (IPCC), he says. His study predicts, with 95% confidence, that a doubling of carbon-dioxide levels in the atmosphere would cause the climate to warm by between 2.5 and 4.2 degrees Celsius. That range is more precise than, but in line with, the IPCC's prediction that temperatures would rise by 1.5 to 4.5 degrees Celsius if CO2 concentrations double.

"We've had a fluctuation in average temperature that's just huge since 1880 - on the order of about 0.9 degrees Celsius," Lovejoy says. "This study shows that the odds of that being caused by natural fluctuations are less than one in a hundred and are likely to be less than one in a thousand.

"While the statistical rejection of a hypothesis can't generally be used to conclude the truth of any specific alternative, in many cases - including this one - the rejection of one greatly enhances the credibility of the other."

Story Source:

The above story is based on materials provided by McGill University. Note: Materials may be edited for content and length.

 

New 'switch' could power quantum computing: Light lattice traps atoms, builds networks of quantum information transmitters

Using a laser to place individual rubidium atoms near the surface of a lattice of light, scientists at MIT and Harvard University have developed a new method for connecting particles -- one that could help in the development of powerful quantum computing systems.

The new technique, described in a paper published today in the journal Nature, allows researchers to couple a lone atom of rubidium, a metal, with a single photon, or light particle. The coupling allows the atom and the photon each to switch the quantum state of the other, providing a mechanism through which quantum-level computing operations could take place.

Moreover, the scientists believe their technique will allow them to increase the number of useful interactions occurring within a small space, thus scaling up the amount of quantum computing processing available.

"This is a major advance of this system," says Vladan Vuletic, a professor in MIT's Department of Physics and Research Laboratory for Electronics (RLE), and a co-author of the paper. "We have demonstrated basically an atom can switch the phase of a photon. And the photon can switch the phase of an atom."

That is, photons can have two polarization states, and interaction with the atom can change the photon from one state to another; conversely, interaction with the photon can change an atom's energy level from its "ground" state to its "excited" state. In this way the atom-photon coupling can serve as a quantum switch to transmit information -- the equivalent of a transistor in a classical computing system. And by placing many atoms within the same field of light, the researchers may be able to build networks that can process quantum information more effectively.
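
An idealized way to picture this switch (a textbook abstraction, not a model of the experiment) is to treat the atom and the photon as two qubits coupled by a controlled-phase gate, so the state of one particle flips the phase of the other; the short sketch below works through that picture.

```python
# Idealized atom-photon phase switch modeled as a two-qubit controlled-phase gate.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # atom ground state / photon polarization H
ket1 = np.array([0, 1], dtype=complex)        # atom excited state / photon polarization V

CZ = np.diag([1, 1, 1, -1]).astype(complex)   # controlled-phase gate on atom (x) photon

photon = (ket0 + ket1) / np.sqrt(2)           # photon in an equal superposition of polarizations

for atom in (ket0, ket1):                     # atom in ground vs. excited state
    joint = np.kron(atom, photon)             # joint atom-photon state
    out = CZ @ joint
    print(np.round(out, 3))                   # the photon's |V> amplitude flips sign only
                                              # when the atom is in the excited state
```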

"You can now imagine having several atoms placed there, to make several of these devices -- which are only a few hundred nanometers thick, 1,000 times thinner than a human hair -- and couple them together to make them exchange information," Vuletic adds.

Using a photonic cavity

Quantum computing could enable the rapid performance of calculations by taking advantage of the distinctive quantum-level properties of particles. Some particles can be in a condition of superposition, appearing to exist in two places at the same time. Particles in superposition, known as qubits, could thus contain more information than particles at classical scales, and allow for faster computing.

However, researchers are in the early stages of determining which materials best allow for quantum-scale computing. The MIT and Harvard researchers have been examining photons as a candidate material, since photons rarely interact with other particles. For this reason, an optical quantum computing system, using photons, could be harder to knock out of its delicate alignment. But since photons rarely interact with other bits of matter, they are difficult to manipulate in the first place.

In this case, the researchers used a laser to place a rubidium atom very close to the surface of a photonic crystal cavity, a structure of light. The atoms were placed no more than 100 or 200 nanometers -- less than a wavelength of light -- from the edge of the cavity. At such small distances, there is a strong attractive force between the atom and the surface of the light field, which the researchers used to trap the atom in place.

Other methods of producing a similar outcome have been considered before -- such as, in effect, dropping atoms into the light and then finding and trapping them. But the researchers found that they had greater control over the particles this way.

"In some sense, it was a big surprise how simple this solution was compared to the different techniques you might envision of getting the atoms there," Vuletic says.

The result is what he calls a "hybrid quantum system," where individual atoms are coupled to microscopic fabricated devices, and in which atoms and photons can be controlled in productive ways. The researchers also found that the new device serves as a kind of router separating photons from each other.

"The idea is to combine different things that have different strengths and weaknesses in such a way to generate something new," Vuletic says, adding: "This is an advance in technology. Of course, whether this will be the technology remains to be seen."

'Still amazing' to hold onto one atom

The paper, "Nanophotonic quantum phase switch with a single atom," is co-authored by Vuletic; Tobias Tiecke, a postdoc affiliated with both RLE and Harvard; Harvard professor of physics Mikhail Lukin; Harvard postdoc Nathalie de Leon; and Harvard graduate students Jeff Thompson and Bo Liu.

The collaboration between the MIT and Harvard researchers is one of two advances in the field described in the current issue of Nature. Researchers at the Max Planck Institute of Quantum Optics in Germany have concurrently developed a new method of producing atom-photon interactions using mirrors, forming quantum gates, which change the direction of motion or polarization of photons.

If the research techniques seem a bit futuristic, Vuletic says that even as an experienced researcher in the field, he remains slightly awed by the tools at his disposal.

"For me what is still amazing, after working in this for 20 years," Vuletic reflects, "is that we can hold onto a single atom, we can see it, we can move it around, we can prepare quantum superpositions of atoms, we can detect them one by one."

   
