Tuesday, November 30, 2010

Genomic Fault Zones Come and Go: Fragile Regions in Mammalian Genomes Go Through 'Birth and Death' Process

"The genomic architecture of every species on Earth changes on the evolutionary time scale and humans are not an exception. What will be the next big change in the human genome remains unknown, but our approach could be useful in determining where in the human genome those changes may occur," said Pavel Pevzner, a UC San Diego computer science professor and an author on the new study. Pevzner studies genomes and genome evolution from a computational perspective in the Department of Computer Science and Engineering at the UC San Diego Jacobs School of Engineering.

The fragile regions of genomes are prone to "genomic earthquakes" that can trigger chromosome rearrangements, disrupt genes, alter gene regulation and otherwise play an important role in genome evolution and the emergence of new species. For example, humans have 23 pairs of chromosomes while some other apes have 24 pairs, a consequence of a genome rearrangement that fused two chromosomes of our ape ancestor into human chromosome 2.

This work was performed by Pevzner and Max Alekseyev -- a computer scientist who recently finished his Ph.D. in the Department of Computer Science and Engineering at the UC San Diego Jacobs School of Engineering. Alekseyev is now a computer science professor at the University of South Carolina.

Turnover Fragile Breakage Model

"The main conclusion of the new paper is that these fragile regions are moving," said Pevzner.

In 2003, Pevzner and UC San Diego mathematics professor Glenn Tesler published results claiming that genomes have "fault zones," or genomic regions that are more prone to rearrangements than others. Their "Fragile Breakage Model" countered the then largely accepted "Random Breakage Model," which implies that there are no rearrangement hotspots in mammalian genomes. While the Fragile Breakage Model has been supported by many studies over the last seven years, the precise locations of fragile regions in the human genome remain elusive.

The new work, published in Genome Biology, offers an update to the Fragile Breakage Model called the "Turnover Fragile Breakage Model." The findings demonstrate that fragile regions undergo a birth-and-death process over evolutionary timescales and provide a clue to where the fragile regions in the human genome are located.

Do the Math: Find Fragile Regions

Finding the fragile regions within genomes is akin to looking at a mixed up deck of cards and trying to determine how many times it has been shuffled.

Looking at a genome, you may identify breaks, but to call a region fragile, you have to know that breaks occurred more than once at the same genomic position. "We are figuring out which regions underwent multiple genome earthquakes by analyzing the present-day genomes that survived these earthquakes that happened millions of years ago. The notion of rearrangements cannot be applied to a single genome at a single point in time. It's relevant when looking at more than one genome," said Pevzner, explaining the comparative genomics approach they took.
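
In computational terms, a genome can be represented as an ordering of shared synteny blocks, and a breakpoint is a block adjacency present in one genome but missing in another; a position broken in several independent comparisons is a candidate fragile region. Below is a minimal Python sketch of that bookkeeping, using invented toy block orders rather than real genome data (and ignoring block orientation for simplicity):

```python
from collections import Counter

def adjacencies(genome):
    """Unordered neighboring block pairs (orientation ignored for simplicity)."""
    return {frozenset((abs(a), abs(b))) for a, b in zip(genome, genome[1:])}

def breakpoints(g1, g2):
    """Adjacencies of g1 that are absent from g2."""
    return adjacencies(g1) - adjacencies(g2)

# Toy synteny-block orders for four genomes (negative sign = flipped block).
human = [1, 2, 3, 4, 5, 6]
chimp = [1, 2, -4, -3, 5, 6]
mouse = [1, 2, 5, 4, 3, 6]
dog   = [1, -3, -2, 4, 5, 6]

reuse = Counter()
for other in (chimp, mouse, dog):
    for bp in breakpoints(human, other):
        reuse[tuple(sorted(bp))] += 1

# Adjacencies broken in more than one comparison hint at breakpoint reuse,
# the signature of a fragile region.
print({bp: n for bp, n in reuse.items() if n > 1})   # {(2, 3): 2}
```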

"It was noticed that while fragile regions may be shared across different genomes, most often such shared fragile regions are found in evolutionarily close genomes. This observation led us to a conclusion that fragility of any particular genomic position may appear only for a limited amount of time. The newly proposed Turnover Fragile Breakage Model postulates that fragile regions are subject to a 'birth and death' process and thus have limited lifespan," explained Alekseyev.

The Turnover Fragile Breakage Model suggests that genome rearrangements are more likely to occur at the sites where rearrangements have recently occurred -- and that these rearrangement sites change over tens of millions of years. Thus, the best clue to the current locations of fragile regions in the human genome is offered by rearrangements that happened in our closest relatives -- chimpanzees and other primates.
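
The turnover idea is easy to caricature in a few lines of code: keep a fixed-size set of fragile sites, replace a small fraction each generation, and compare lineages that diverged at different times. All the rates and sizes below are illustrative assumptions, not estimates from the paper.

```python
import random

random.seed(1)
POSITIONS = range(1000)   # toy genome: 1000 candidate breakage positions
N_FRAGILE = 50            # number of fragile sites at any moment
TURNOVER = 0.05           # fraction of fragile sites replaced per step

def evolve(fragile, steps):
    """Let the fragile set drift by birth-and-death for a number of steps."""
    fragile = set(fragile)
    for _ in range(steps):
        dying = random.sample(sorted(fragile), int(TURNOVER * len(fragile)))
        fragile -= set(dying)
        while len(fragile) < N_FRAGILE:
            fragile.add(random.choice(POSITIONS))   # a new fragile site is born
    return fragile

ancestor = set(random.sample(POSITIONS, N_FRAGILE))
close = evolve(ancestor, 10)     # lineage that diverged recently
far = evolve(ancestor, 200)      # lineage that diverged long ago

share = lambda a, b: len(a & b) / N_FRAGILE
print(f"fragile sites shared with close relative:   {share(ancestor, close):.0%}")
print(f"fragile sites shared with distant relative: {share(ancestor, far):.0%}")
```

As in the observation quoted above, the fraction of shared fragile sites falls off with evolutionary distance.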

Pevzner is eagerly awaiting sequenced primate genomes from the Genome 10K Project. Sequencing the genomes of 10,000 vertebrate species -- including hundreds of primates -- is bound to provide new insights into human evolutionary history and possibly even future rearrangements in the human genome.

"The most likely future rearrangements in human genome will happen at the sites that were recently disrupted in primates," said Pevzner.

Work tied to the new Turnover Fragile Breakage Model may also be useful for understanding genome rearrangements at the level of individuals, rather than entire species. In the future, the computer scientists hope to use similar tools to study the chromosomal rearrangements that occur again and again within the cells of individual cancer patients, in order to develop new cancer diagnostics and drugs.

Pavel Pevzner is the Ronald R. Taylor Professor of Computer Science at UC San Diego; Director of the NIH Center for Computational Mass Spectrometry; and a Howard Hughes Medical Institute (HHMI) Professor.



Source

Monday, November 29, 2010

'Racetrack' Magnetic Memory Could Make Computer Memory 100,000 Times Faster

Annoyed by how long it took his computer to boot up, Kläui began to think about an alternative. Hard disks are cheap and can store enormous quantities of data, but they are slow; every time a computer boots up, 2-3 minutes are lost while information is transferred from the hard disk into RAM (random access memory). The global cost in terms of lost productivity and energy consumption runs into the hundreds of millions of dollars a day.

Like the tried and true VHS videocassette, the proposed solution involves data recorded on magnetic tape. But the similarity ends there; in this system the tape would be a nickel-iron nanowire, a million times smaller than the classic tape. And unlike a magnetic videotape, in this system nothing moves mechanically. The bits of information stored in the wire are simply pushed around inside the tape using a spin polarized current, attaining the breakneck speed of several hundred meters per second in the process. It's like reading an entire VHS cassette in less than a second.
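
Logically, then, a racetrack behaves like a shift register whose read/write head stays put while the data moves. The toy model below captures just that behavior; the class and its methods are invented for this sketch and are not a physical simulation.

```python
# Illustrative model of racetrack memory as a shift register: the bits
# (magnetic domains along the nanowire) are moved past a fixed read/write
# element by current pulses, rather than moving any medium mechanically.

class Racetrack:
    def __init__(self, bits):
        self.bits = list(bits)   # magnetization of each domain (0 or 1)
        self.head = 0            # position of the fixed read/write element

    def pulse(self):
        """One spin-polarized current pulse shifts all domain walls one slot."""
        self.bits = self.bits[1:] + self.bits[:1]   # rotate toward the head

    def read(self):
        return self.bits[self.head]

    def write(self, value):
        self.bits[self.head] = value

track = Racetrack([1, 0, 1, 1, 0, 0, 1, 0])
out = []
for _ in range(len(track.bits)):     # stream the whole wire past the head
    out.append(track.read())
    track.pulse()
print(out)   # [1, 0, 1, 1, 0, 0, 1, 0] -- the stored pattern; the head never moved
```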

In order for the idea to be feasible, each bit of information must be clearly separated from the next so that the data can be read reliably. This is achieved by using domain walls with magnetic vortices to delineate two adjacent bits. To estimate the maximum velocity at which the bits can be moved, Kläui and his colleagues carried out measurements on vortices and found that the physical mechanism could allow even higher access speeds than expected.

Their results were published online October 25, 2010, in the journal Physical Review Letters. Scientists at the Zurich Research Center of IBM (which is developing a racetrack memory) have confirmed the importance of the results in a Viewpoint article. Millions or even billions of nanowires would be embedded in a chip, providing enormous capacity on a shock-proof platform. A market-ready device could be available in as little as 5-7 years.

Racetrack memory promises to be a real breakthrough in data storage and retrieval. Racetrack-equipped computers would boot up instantly, and their information could be accessed 100,000 times more rapidly than with a traditional hard disk. They would also save energy. RAM needs to be powered every millionth of a second, so an idle computer consumes up to 300 mW just maintaining data in RAM. Because racetrack memory doesn't have this constraint, energy consumption could be slashed by nearly a factor of 300, to a few mW while the memory is idle. It's an important consideration: computing and electronics currently consume 6% of worldwide electricity, a share forecast to rise to 15% by 2025.
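
A quick back-of-envelope check, using only the idle-power figures quoted above:

```python
# Back-of-envelope check of the idle-power claim, using the article's numbers.
idle_ram_mw = 300    # power to keep data maintained in RAM while idle
reduction = 300      # claimed reduction factor for racetrack memory
print(f"racetrack idle power ~ {idle_ram_mw / reduction:.0f} mW")  # ~1 mW, i.e. "a few mW"
```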


Source

Sunday, November 28, 2010

Supercomputing Center Breaks the Petaflops Barrier

NERSC's newest supercomputer, a 153,408 processor-core Cray XE6 system, posted a performance of 1.05 petaflops (quadrillions of calculations per second) running the Linpack benchmark. In keeping with NERSC's tradition of naming computers for renowned scientists, the system is named Hopper in honor of Admiral Grace Hopper, a pioneer in software development and programming languages.
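
For a sense of scale, the quoted Linpack figure works out to just under 7 gigaflops per core:

```python
# Per-core Linpack throughput for Hopper, from the figures quoted above.
petaflops = 1.05e15    # measured Linpack performance, flops per second
cores = 153_408
print(f"{petaflops / cores / 1e9:.1f} gigaflops per core")  # ~6.8
```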

NERSC serves one of the largest research communities of all supercomputing centers in the United States. The center's supercomputers are used to tackle a wide range of scientific challenges, including global climate change, combustion, clean energy, new materials, astrophysics, genomics, particle physics and chemistry. The more than 400 projects being addressed by NERSC users represent the research mission areas of DOE's Office of Science.

The increasing power of supercomputers helps scientists study problems in greater detail and with greater accuracy, such as increasing the resolution of climate models and creating models of new materials with thousands of atoms. Supercomputers are increasingly used to complement scientific experimentation by allowing researchers to test theories using computational models and to analyze large scientific data sets. NERSC is also home to Franklin, a 38,128-core Cray XT4 supercomputer with a Linpack performance of 266 teraflops (trillions of calculations per second). Franklin is ranked number 27 on the newest TOP500 list.

The system, installed in September 2010, is funded by DOE's Office of Advanced Scientific Computing Research.


Source

Saturday, November 27, 2010

'Space-Time Cloak' to Conceal Events

Previously, a team led by Professor Sir John Pendry at Imperial College London showed that metamaterials could be used to make an optical invisibility cloak. Now, a team led by Professor Martin McCall has mathematically extended the idea of a cloak that conceals objects to one that conceals events.

"Light normally slows down as it enters a material, but it is theoretically possible to manipulate the light rays so that some parts speed up and others slow down," says McCall, from the Department of Physics at Imperial College London. When light is 'opened up' in this way, rather than being curved in space, the leading half of the light speeds up and arrives before an event, whilst the trailing half is made to lag behind and arrives too late. The result is that for a brief period the event is not illuminated, and escapes detection. Once the concealed passage has been used, the cloak can then be 'closed' seamlessly.

Such a space-time cloak would open up a temporary corridor through which energy, information and matter could be manipulated or transported undetected. "If you had someone moving along the corridor, it would appear to a distant observer as if they had relocated instantaneously, creating the illusion of a Star Trek transporter," says McCall. "So, theoretically, this person might be able to do something and you wouldn't notice!"

While using the spacetime cloak to make people move undetected is still science fiction, there are many serious applications for the new research, which was funded by the Engineering and Physical Sciences Research Council (EPSRC) and the Leverhulme Trust. Co-author Dr Paul Kinsler developed a proof of concept design using customised optical fibres, which would enable researchers to use the event cloak in signal processing and computing. A given data channel could for example be interrupted to perform a priority calculation on a parallel channel during the cloak operation. Afterwards, it would appear to external parts of the circuit as though the original channel had processed information continuously, so as to achieve 'interrupt-without-interrupt'.

Alberto Favaro, who also worked on the project, explains: "Imagine computer data moving down a channel to be like a highway full of cars. You want to have a pedestrian crossing without interrupting the traffic, so you slow down the cars that haven't reached the crossing, while the cars that are at or beyond the crossing get sped up, which creates a gap in the middle for the pedestrian to cross. Meanwhile, an observer down the road would only see a steady stream of traffic." One issue that cropped up during their calculations was how to speed up the transmitted data without violating the laws of relativity. Favaro solved this by devising a clever material whose properties vary in both space and time, allowing the cloak to be formed.
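
Favaro's highway analogy is easy to make concrete. The toy simulation below moves cars along a one-dimensional road; while the cloak is "open," cars short of the crossing slow down and cars beyond it speed up, opening a gap, and the speed profile is then reversed so the gap roughly seals again. All speeds, positions and intervals are arbitrary illustrative values.

```python
# Cartoon of the highway analogy: a gap opens at the crossing while the
# "cloak" is open and is (approximately) sealed again afterwards. This is
# an illustration of the analogy, not of the optical design itself.

CROSSING = 50.0
T_OPEN, T_CLOSE = 10, 25

def advance(cars, t, dt=1.0):
    new = []
    for x in cars:
        v = 1.0
        if T_OPEN <= t < T_CLOSE:                          # cloak opening
            v = 0.5 if x < CROSSING else 1.5
        elif T_CLOSE <= t < T_CLOSE + (T_CLOSE - T_OPEN):  # cloak closing
            v = 1.5 if x < CROSSING else 0.5
        new.append(x + v * dt)
    return new

cars = [float(x) for x in range(0, 100, 5)]   # evenly spaced traffic
for t in range(60):
    cars = advance(cars, t)
    if t in (5, 20, 55):                       # before, during, after
        gaps = [b - a for a, b in zip(cars, cars[1:])]
        print(f"t={t:2d}  max gap between cars = {max(gaps):4.1f}")
```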

"We're sure that there are many other possibilities opened up by our introduction of the concept of the spacetime cloak,' says McCall,"but as it's still theoretical at this stage we still need to work out the concrete details for our proposed applications."

Metamaterials research is an expanding field of science with a vast array of potential uses, spanning defence, security, medicine, data transfer and computing. Many ordinary household devices that work using electromagnetic fields could be made more cheaply or to work at higher speeds. Metamaterials could also be used to control other types of waves besides light, such as sound or water waves, opening up potential applications for protecting coastal or offshore installations, or even engineering buildings to withstand earthquake waves.


Source

Friday, November 26, 2010

Intelligent Detector Provides Real-Time Information on Available Parking Spaces

Testing of the new technology is currently underway at the Universitat Politècnica de Catalunya's North Campus, and a patent is being sought. The system can be used to provide users with information via mobile devices such as phones, laptop computers, and iPads, or via luminous panels in public thoroughfares. In the coming months it will be installed in the 22@Barcelona innovation district and in downtown Figueres.

A team at the Department of Electronic Engineering of the Castelldefels School of Telecommunications and Aerospace Engineering (EETAC), part of the Universitat Politècnica de Catalunya (UPC), has designed a new method for continuously detecting the presence of vehicles using both an optical and a magnetic sensor. The detector incorporates the two sensors in a 4 by 13 cm casing that is set into the pavement of each parking space. Urbiòtica, a company set up by UPC professors and their industrial partners, is testing the system at the UPC's North Campus prior to placing it on the market.

The device works by first detecting the sudden change in the amount of light reaching the pavement that occurs when a vehicle passes over it. The optical sensor then activates the magnetic sensor to verify that the shadow is being produced by a vehicle. This is done by detecting the slight disturbance in Earth's magnetic field that occurs when a car passes over or stops above the device. The two sensors are connected to a microcontroller that executes an algorithm to determine whether or not a vehicle is present. The system's optical sensor is always active but consumes an insignificant amount of power.
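
The two-stage logic described above -- an always-on, low-power optical sensor that wakes the magnetic sensor only when a sudden shadow appears -- can be sketched as follows. The actual algorithm running on the microcontroller is not published; the thresholds, units and sensor interfaces here are hypothetical.

```python
# Hypothetical sketch of the two-stage vehicle detection: the optical
# sensor runs continuously; only a sudden drop in light wakes the
# power-hungry magnetometer to confirm a vehicle is perturbing Earth's field.

LIGHT_DROP = 0.5     # light below 50% of its previous level = "sudden shadow"
FIELD_DELTA = 2.0    # deviation from local baseline (microtesla) = "vehicle"

def space_occupied(light_before, light_now, read_magnetometer, baseline_ut):
    if light_now > LIGHT_DROP * light_before:
        return False                  # no sudden shadow: magnetometer stays asleep
    field_ut = read_magnetometer()    # woken only for the verification step
    return abs(field_ut - baseline_ut) > FIELD_DELTA

# A car shades the sensor and locally perturbs Earth's ~50 uT field:
print(space_occupied(800, 120, lambda: 53.1, baseline_ut=50.0))  # True
# A passing cloud dims the light but leaves the field untouched:
print(space_occupied(800, 350, lambda: 50.0, baseline_ut=50.0))  # False
```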

When a vehicle is detected, the microcontroller sends a radio-frequency signal, which conveys this information to an antenna connected to a transceiver. This way of transmitting signals is much more economical than using wiring. The transceiver, designed for installation on street lights, receives the information and transmits it to the database or control center within seconds (using technologies such as Wi-Fi or GPRS). Potential clients for the system include municipal services and parking lot operators.

According to Ramon Pallàs, head of the UPC team that developed the technology, the plan is to make the parking information available both on luminous panels in public thoroughfares and on users' mobile devices.

The innovative features of the product (which the UPC's AntenaLAB group also worked on) relate to the field of sensors, the circuits connecting the sensors to the microcontroller, the method for supplying power to the sensors, and management of the power supply for the system as a whole.

Continuous operation with low power consumption

The invention overcomes the shortcomings of the best existing systems for detecting stationary vehicles. There currently exist devices that emit a signal when a car passes over a sensor, but they do not detect whether the vehicle stops. In an enclosed facility these systems can be used to count vehicles entering and leaving and thus determine the number of parking spaces available, but they do not indicate where the free spaces are. Also, the magnetic sensors now in use consume too much energy to be kept running all the time.

In contrast, the system developed by the UPC group and marketed by Urbiòtica operates continuously and uses very little power because the optical sensor is the only component that is always active and the magnetic sensor is activated less frequently than in other similar systems. The fact that the sensors are connected directly to the microcontroller, without any intermediate electronic circuit, also reduces power consumption.

Practical applications

The new system can be used to manage and monitor vehicles on public and private thoroughfares, particularly in urban areas. This makes it possible to monitor points of access to centers of population, restricted zones, security zones, and grade crossings, and to manage parking on streets, at airports, and in commercial and underground parking areas. These applications can reduce the time drivers spend looking for a parking spot, resulting in lower fuel consumption and less pollution.

The characteristics of the system also facilitate other applications, such as the reservation of parking spaces for disabled drivers and payment based on the real time that a parking space is used. The system could also be used to detect areas where lighting is absent or insufficient.

Once pilot testing has been successfully completed, the system will be installed in the 22@Barcelona innovation district (from December on) as part of a Barcelona City Council project to deploy sensor systems, and in the town of Figueres (early in 2011), where it will be used to monitor traffic entering and leaving the city center.


Source

Thursday, November 25, 2010

Short, on-Chip Light Pulses Will Enable Ultrafast Data Transfer Within Computers

Details appeared online in the journal Nature Communications on November 16.

This miniaturized short pulse generator eliminates a roadblock on the way to optical interconnects for use in PCs, data centers, imaging applications and beyond. These optical interconnects, which will aggregate slower data channels with pulse compression, will have far higher data rates and generate less heat than the copper wires they will replace. Such aggregation devices will be critical for future optical connections within and between high speed digital electronic processors in future digital information systems.

"Our pulse compressor is implemented on a chip, so we can easily integrate it with computer processors," said Dawn Tan, the Ph.D. candidate in the Department of Electrical and Computer Engineering at UC San Diego Jacobs School of Engineering who led development of the pulse compressor.

"Next generation computer networks and computer architectures will likely replace copper interconnects with their optical counterparts, and these have to be complementary metal oxide semiconductor (CMOS) compatible. This is why we created our pulse compressor on silicon," said Tan, an electrical engineering graduate student researcher at UC San Diego, and part of the National Science Foundation funded Center for Integrated Access Networks.

The pulse compressor will also provide a cost-effective method to derive short pulses for a variety of imaging technologies, such as time-resolved spectroscopy, which can be used to study lasers and electron behavior, and optical coherence tomography, which can capture biological tissues in three dimensions.

In addition to increasing data transfer rates, switching from copper wires to optical interconnects will reduce power consumption caused by heat dissipation, switching and transmission of electrical signals.

"At UC San Diego, we recognized the enabling power of nanophotonics for integration of information systems close to 20 years ago when we first started to use nano-scale lithographic tools to create new optical functionalities of materials and devices -- and most importantly, to enable their integration with electronics on a chip. This Nature Communications paper demonstrates such integration of a few optical signal processing device functionalities on a CMOS compatible silicon-on-insulator material platform," said Yeshaiahu Fainman, a professor in the Department of Electrical and Computer Engineering in the UC San Diego Jacobs School of Engineering. Fainman acknowledged DARPA support in developing silicon photonics technologies which helped to enable this work, through programs such as Silicon-based Photonic Analog Signal Processing Engines with Reconfigurability (Si-PhASER) and Ultraperformance Nanophotonic Intrachip Communications (UNIC).

Pulse Compression for On-Chip Optical Interconnects

The compressed pulses are seven times shorter than the original -- the largest compression demonstrated to date on a chip.

Until now, pulse compression featuring such high compression factors was only possible using bulk optics or fiber-based systems, both of which are bulky and not practical for optical interconnects for computers and other electronics.

The combination of high compression and miniaturization is possible thanks to a nanoscale, light-guiding tool called an "integrated dispersive element," developed and designed primarily by electrical engineering Ph.D. candidate Dawn Tan.

The new dispersive element offers a much needed component to the on-chip nanophotonics tool kit.

The pulse compressor works in two steps. In step one, the spectrum of the incoming laser light is broadened. For example, if green laser light were the input, the output would be red, green and blue laser light. In step two, the new integrated dispersive element developed by the electrical engineers manipulates the light so that each spectral component of the pulse travels at the same speed. This speed synchronization is where pulse compression occurs.

Imagine the laser light as a series of cars. Looking down from above, the cars are initially in a long caravan. This is analogous to a long pulse of laser light. After stage one of pulse compression, the cars are no longer in a single line and they are moving at different speeds. Next, the cars move through the new dispersive grating where some cars are sped up and others are slowed down until each car is moving at the same speed. Viewed from above, the cars are all lined up and pass the finish line at the same moment.

This example illustrates how the on-chip pulse compressor transforms a long pulse of light into a spectrally broader and temporally shorter pulse of light. This temporally compressed pulse will enable multiplexing of data to achieve much higher data speeds.
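
The same two steps can be mimicked numerically. In the NumPy sketch below (toy parameters chosen for illustration, not the device's actual numbers), a long, linearly chirped Gaussian pulse stands in for the spectrally broadened light of step one, and a quadratic spectral phase of the opposite sign plays the role of the dispersive element of step two:

```python
import numpy as np

t = np.linspace(-50, 50, 4096)
b = 0.2                                       # chirp rate (arbitrary units)
pulse = np.exp(-t**2 / 200 + 1j * b * t**2)   # long, linearly chirped pulse

w = 2 * np.pi * np.fft.fftfreq(t.size, d=t[1] - t[0])
spectrum = np.fft.fft(pulse)
# "Dispersive element": quadratic spectral phase chosen to cancel the chirp.
compressed = np.fft.ifft(spectrum * np.exp(1j * w**2 / (4 * b)))

def fwhm(x, field):
    """Full width at half maximum of the intensity profile |field|^2."""
    intensity = np.abs(field) ** 2
    on = np.where(intensity >= intensity.max() / 2)[0]
    return x[on[-1]] - x[on[0]]

print(f"FWHM before: {fwhm(t, pulse):.2f}, after: {fwhm(t, compressed):.2f}")
```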

"In communications, there is this technique called optical time division multiplexing or OTDM, where different signals are interleaved in time to produce a single data stream with higher data rates, on the order of terabytes per second. We've created a compression component that is essential for OTDM," said Tan.

The UC San Diego electrical engineers say they are the first to report a pulse compressor on a CMOS-compatible integrated platform with compression strong enough for OTDM.

"In the future, this work will enable integrating multiple 'slow' bandwidth channels with pulse compression into a single ultra-high-bandwidth OTDM channel on a chip. Such aggregation devices will be critical for future inter- and intra-high speed digital electronic processors interconnections for numerous applications such as data centers, field-programmable gate arrays, high performance computing and more," said Fainman, holder of the Cymer Inc. Endowed Chair in Advanced Optical Technologies at the UC San Diego Jacobs School of Engineering and Deputy Director of the NSF-funded Center for Integrated Access Networks.

This work was supported by the Defense Advanced Research Projects Agency, the National Science Foundation (NSF) through Electrical, Communications and Cyber Systems (ECCS) grants, the NSF Center for Integrated Access Networks ERC, the Cymer Corporation and the U.S. Army Research Office.


Source

Wednesday, November 24, 2010

A New Electromagnetism Can Be Simulated Through a Quantum Simulator

There are two fundamental aspects that make these devices attractive to scientists. On the one hand, quantum simulators will play a leading role in clarifying some important, but as yet unsolved, puzzles of theoretical physics. On the other hand, such deeper understanding of a given phenomenon will certainly give rise to useful technological applications.

One of the best quantum simulators consists of a gas of extremely cold atoms loaded into an artificial crystal made of light: an optical lattice. Experimental physicists have developed efficient techniques to control the quantum properties of this system, to such an extent that it serves as an ideal quantum simulator of different phenomena.

So far, efforts have been focused on condensed-matter systems, where many open and interesting problems remain to be solved.

In a recent work published in Physical Review Letters by a collaboration of international teams (Universidad Complutense de Madrid: A. Bermudez and M.A. Martin-Delgado; ICFO Barcelona: M. Lewenstein; Max Planck Institute, Garching: L. Mazza and M. Rizzi; Université Libre de Bruxelles: N. Goldman), this platform has also been shown to be a potential quantum simulator of high-energy physics.

The authors have proposed a clean and controllable setup in which a variety of exotic, but still unobserved, phenomena arise. They describe how to build a quantum simulator of axion electrodynamics (high-energy physics) and of 3D topological insulators (condensed matter). In particular, these results pave the way to the fabrication of an axion, a long-sought missing particle in the standard model of elementary particles. They show that their atomic setup constitutes an axion medium, in which an underlying topological order gives rise to a non-vanishing axion field.

They also show how the axion field can attain arbitrary values, and how its dynamics and spatial dependence can be experimentally controlled. Accordingly, their optical-lattice simulator offers a unique possibility to observe diverse effects, such as the Witten effect, the wormhole effect, or a fractionally charged capacitor, in atomic-physics laboratories.

This work has an interdisciplinary character, which brings together physicists specializing in lattice gauge theories, atomic molecular and optical physics, and condensed matter physics.


Source

Tuesday, November 23, 2010

Software Allows Interactive Tabletop Displays on Web

Tabletop touch-operated displays are becoming popular with professionals in various fields, said Niklas Elmqvist, an assistant professor of electrical and computer engineering at Purdue University.

"These displays are like large iPhones, and because they are large they invite collaboration," he said."So we created a software framework that allows more than one display to connect and share the same space over the Internet."

Users are able to pan and zoom using finger-touch commands, said Elmqvist, who named the software Hugin after a raven in Norse mythology that served as the eyes and ears of the god Odin.

"Hugin was designed for touch screens but can be used with any visual display and input device, such as a mouse and keyboard," he said.

Commercially available tabletop displays are about the size of a coffee table. The researchers created a unit roughly twice that size -- 58 inches by 37 inches -- for laboratory studies. They tested the software on 12 users, in three groups of four, on Purdue's main campus in West Lafayette, Ind., and at the University of Manitoba in Canada. The teams worked together to solve problems on the tabletop systems.

Findings were detailed in a research paper presented earlier this month during the ACM International Conference on Interactive Tabletops and Surfaces 2010 in Saarbrücken, Germany.

The collaborative capability would aid professionals such as defense and stock market analysts and authorities managing emergency response to disasters. The program allows users to work together with "time-series charts," like the stock market index or similar graphics that change over time, said Elmqvist, who is working with doctoral student Waqas Javed and graduate student KyungTae Kim.

"This system could be run in a command center where you have people who have access to a tabletop," Elmqvist said."In future iterations it might allow integration of mobile devices connected to the tabletop so emergency responders can see on their small device whatever the people in the command center want them to see."

Participants have their own "territorial workspaces," where they may keep certain items hidden for privacy and practical purposes.

"Everyone only sees the things you send to a public domain on the display," Elmqvist said."This is partly for privacy but also because you don't want to overload everybody with everything you are working on."

The researchers are providing Hugin free to the public and expect to make the software available online in December.

"Other people will be able to use it as a platform to build their own thing on top of," he said."They will be able to download and contribute to it, customize it, add new visualizations."

The research paper was written by Kim, Javed and Elmqvist, all from Purdue's School of Electrical and Computer Engineering, and two researchers from the University of Manitoba: graduate student Cary Williams and Pourang Irani, a professor in the university's Department of Computer Science.

The researchers are working with the Pacific Northwest National Laboratory to develop technologies for command and control in emergency situations, such as first response to disasters.


Source

Physicists Demonstrate a Four-Fold Quantum Memory

Their work, described in the November 18 issue of the journal Nature, also demonstrated a quantum interface between the atomic memories -- which represent something akin to a computer "hard drive" for entanglement -- and four beams of light, thereby enabling the four-fold entanglement to be distributed by photons across quantum networks. The research represents an important achievement in quantum information science by extending the coherent control of entanglement from two to multiple (four) spatially separated physical systems of matter and light.

The proof-of-principle experiment, led by William L. Valentine Professor and professor of physics H. Jeff Kimble, helps to pave the way toward quantum networks. Similar to the Internet in our daily life, a quantum network is a quantum "web" composed of many interconnected quantum nodes, each of which is capable of rudimentary quantum logic operations (similar to the "AND" and "OR" gates in computers) utilizing "quantum transistors" and of storing the resulting quantum states in quantum memories. The quantum nodes are "wired" together by quantum channels that carry, for example, beams of photons to deliver quantum information from node to node. Such an interconnected quantum system could function as a quantum computer, or, as proposed by the late Caltech physicist Richard Feynman in the 1980s, as a "quantum simulator" for studying complex problems in physics.

Quantum entanglement is a quintessential feature of the quantum realm and involves correlations among components of the overall physical system that cannot be described by classical physics. Strangely, for an entangled quantum system, there exists no objective physical reality for the system's properties. Instead, an entangled system contains simultaneously multiple possibilities for its properties. Such an entangled system has been created and stored by the Caltech researchers.

Previously, Kimble's group entangled a pair of atomic quantum memories and coherently transferred the entangled photons into and out of the quantum memories. For such two-component -- or bipartite -- entanglement, the subsystems are either entangled or not. But for multi-component entanglement with more than two subsystems -- or multipartite entanglement -- there are many possible ways to entangle the subsystems. For example, with four subsystems, all of the possible pair combinations could be bipartite entangled but not be entangled over all four components; alternatively, they could share a "global" quadripartite (four-part) entanglement.
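
For ensembles sharing a single spin-wave excitation, the textbook example of such global four-part entanglement is a W state, in which one excitation is coherently shared among the four memories (this is the standard form for this class of experiments, not a formula quoted from the paper):

\[
|W\rangle = \tfrac{1}{2}\bigl(|1000\rangle + |0100\rangle + |0010\rangle + |0001\rangle\bigr)
\]

Losing or tracing out any one ensemble still leaves the remaining three entangled, which is what distinguishes a global W state from a mere collection of bipartite entangled pairs.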

Hence, multipartite entanglement is accompanied by increased complexity in the system. While this makes the creation and characterization of these quantum states substantially more difficult, it also makes the entangled states more valuable for tasks in quantum information science.

To achieve multipartite entanglement, the Caltech team used lasers to cool four collections (or ensembles) of about one million cesium atoms, separated by 1 millimeter and trapped in a magnetic field, to within a few hundred millionths of a degree above absolute zero. Each ensemble can have atoms with internal spins that are "up" or "down" (analogous to spinning tops) and that are collectively described by a "spin wave" for the respective ensemble. It is these spin waves that the Caltech researchers succeeded in entangling among the four atomic ensembles.

The technique employed by the Caltech team for creating quadripartite entanglement is an extension of the theoretical work of Luming Duan, Mikhail Lukin, Ignacio Cirac, and Peter Zoller in 2001 for the generation of bipartite entanglement by the act of quantum measurement. This kind of "measurement-induced" entanglement for two atomic ensembles was first achieved by the Caltech group in 2005.

In the current experiment, entanglement was "stored" in the four atomic ensembles for a variable time, and then "read out" -- essentially, transferred -- to four beams of light. To do this, the researchers shot four "read" lasers into the four, now-entangled, ensembles. The coherent arrangement of excitation amplitudes for the atoms in the ensembles, described by spin waves, enhances the matter-light interaction through a phenomenon known as superradiant emission.

"The emitted light from each atom in an ensemble constructively interferes with the light from other atoms in the forward direction, allowing us to transfer the spin wave excitations of the ensembles to single photons," says Akihisa Goban, a Caltech graduate student and coauthor of the paper. The researchers were therefore able to coherently move the quantum information from the individual sets of multipartite entangled atoms to four entangled beams of light, forming the bridge between matter and light that is necessary for quantum networks.

The Caltech team investigated the dynamics by which the multipartite entanglement decayed while stored in the atomic memories. "In the zoology of entangled states, our experiment illustrates how multipartite entangled spin waves can evolve into various subsets of the entangled systems over time, and sheds light on the intricacy and fragility of quantum entanglement in open quantum systems," says Caltech graduate student Kyung Soo Choi, the lead author of the Nature paper. The researchers suggest that the theoretical tools developed for their studies of the dynamics of entanglement decay could be applied to studying the entangled spin waves in quantum magnets.

Further possibilities of their experiment include the expansion of multipartite entanglement across quantum networks and quantum metrology. "Our work introduces new sets of experimental capabilities to generate, store, and transfer multipartite entanglement from matter to light in quantum networks," Choi explains. "It signifies the ever-increasing degree of exquisite quantum control to study and manipulate entangled states of matter and light."

In addition to Kimble, Choi, and Goban, the other authors of the paper are Scott Papp, a former postdoctoral scholar in the Caltech Center for the Physics of Information now at the National Institute of Standards and Technology in Boulder, Colorado, and Steven van Enk, a theoretical collaborator and professor of physics at the University of Oregon, and an associate of the Institute for Quantum Information at Caltech.

This research was funded by the National Science Foundation, the National Security Science and Engineering Faculty Fellowship program at the U.S. Department of Defense (DOD), the Northrop Grumman Corporation, and the Intelligence Advanced Research Projects Activity.


Source