JILA strontium atomic clock sets new records in both precision and stability.

Heralding a new age of terrific timekeeping, a research group led by a National Institute of Standards and Technology (NIST) physicist has unveiled an experimental strontium atomic clock that has set new world records for both precision and stability—key metrics for the performance of a clock.

The clock is in a laboratory at JILA, a joint institute of NIST and the University of Colorado Boulder.

Described in a new paper in Nature, the JILA lattice clock is about 50 percent more precise than the record holder of the past few years, NIST’s quantum logic clock. Precision refers to how closely the clock approaches the true resonant frequency at which its reference atoms oscillate between two electronic energy levels. The new strontium clock is so precise that it would neither gain nor lose one second in about 5 billion years, if it could operate that long—a period longer than the age of the Earth, estimated at 4.5 billion years.
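The headline figure can be sanity-checked with quick arithmetic: one second of error over five billion years corresponds to a fractional frequency uncertainty of a few parts in 10^18. A minimal sketch (the year length used is an illustrative assumption):

```python
# Back-of-the-envelope check of the "one second in 5 billion years" claim:
# it corresponds to a fractional frequency uncertainty of a few parts in 10^18.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, assumed for illustration

total_seconds = 5e9 * SECONDS_PER_YEAR
fractional_uncertainty = 1 / total_seconds
print(f"{fractional_uncertainty:.2e}")  # ~6.3e-18
```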

The strontium clock’s stability—the extent to which each tick matches the duration of every other tick—is about the same as that of NIST’s ytterbium atomic clock, another world leader in stability unveiled in August 2013. Stability determines in part how long a clock must run to achieve its best performance through continual averaging. The strontium and ytterbium lattice clocks are so stable that in just a few seconds of averaging they outperform other types of atomic clocks that have been averaged for hours or days.
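Why averaging helps can be sketched with the standard rule of thumb that, for white frequency noise, a clock's fractional instability improves with the square root of the averaging time. The starting instability below is an illustrative assumption, not a figure from the paper:

```python
import math

# Illustrative: for white frequency noise, fractional instability
# averages down as sigma(tau) = sigma_1 / sqrt(tau),
# where tau is the averaging time in seconds.
sigma_1 = 3e-16  # assumed instability after 1 s of averaging (illustrative)

for tau in (1, 10, 100, 1000):
    print(f"tau = {tau:5d} s  ->  sigma ~ {sigma_1 / math.sqrt(tau):.1e}")
```

This is why a very stable clock reaches its best performance in seconds while a less stable one needs hours or days of averaging to catch up.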

“We already have plans to push the performance even more,” NIST/JILA Fellow and group leader Jun Ye says. “So in this sense, even this new Nature paper represents only a ‘mid-term’ report. You can expect more new breakthroughs in our clocks in the next 5 to 10 years.”

The current international definition of units of time requires the use of cesium-based atomic clocks, such as the current U.S. civilian time standard clock, the NIST-F1 cesium fountain clock. Hence only cesium clocks are accurate by definition, even though the strontium clock has better precision. The strontium lattice clock and some other experimental clocks operate at optical frequencies, much higher than the microwave frequencies used in cesium clocks. Thanks to the work at NIST, JILA and other research organizations across the world, the strontium lattice clock and other experimental clocks may someday be chosen as new timekeeping standards by the international community.

The strontium clock is the first to hold world records for both precision and stability since the 1990s, when cesium fountain atomic clocks were introduced. In the past decade, the rapid advances in experimental atomic clocks at NIST and other laboratories around the world have surprised even some of the scientists leading the research. NIST, which operates the NIST-F1 time standard, pursues multiple clock technologies because scientific research can take unpredictable turns, and because different types of atomic clocks are better suited for different practical applications.

JILA’s experimental atomic clock based on strontium atoms held in a lattice of laser light is the world’s most precise and stable atomic clock. The image is a composite of many photos taken with long exposure times and other techniques to make the lasers more visible. Credit: Ye group and Baxley/JILA

In JILA’s world-leading clock, a few thousand atoms of strontium are held in a column of about 100 pancake-shaped traps called an optical lattice formed by intense laser light. JILA scientists detect strontium’s “ticks” (430 trillion per second) by bathing the atoms in very stable red laser light at the exact frequency that prompts the switch between energy levels.
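The quoted tick rate is consistent with red laser light, which can be checked from the speed of light (a quick sketch; 430 THz is the article's rounded figure):

```python
C = 299_792_458  # speed of light, m/s
F = 430e12       # "430 trillion ticks per second" (rounded)

wavelength_nm = C / F * 1e9
print(f"{wavelength_nm:.0f} nm")  # ~697 nm, in the red part of the spectrum
```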

To check the performance, the JILA team compared two versions of the strontium clock, one built in 2005 and the other just last year. Both clocks have set previous records of various types. In the latest work, the two clocks fully agreed with each other within their reported precision—demonstrating the ability to make a duplicate copy and maintain the performance level. This is an advantage for clock comparisons to lay the groundwork for the eventual selection of a next-generation time standard.

Recent technical advances enabling the strontium clocks’ record performance include the development of ultrastable lasers and precise measurements of key effects—atom collisions and environmental heating—that cause tiny changes in the clock’s ticking rate.

Next-generation atomic clocks have already contributed to scientific research and are expected to lead to the development of novel technologies such as super-sensors for quantities such as gravity and temperature.

Fish contain genetic blueprint for limbs

Think of a world populated only by giant insects on land and fishes in water. According to Joost Woltering of the University of Geneva, that is what Earth would look like if the transition from fins to limbs had not happened. In a study published this week in PLoS Biology, Woltering and colleagues have found some definitive clues about this transition. By studying a group of ‘architect’ genes present in both fish and mammals — the Hox genes — the scientists found that the DNA structure and regulatory mechanism for limb and digit formation were present in fish even before the transition happened, but the enhancements required to activate digit formation evolved only in tetrapods (i.e., four-legged land animals).

The role of Hox genes in limb and fin formation is crucial. Malfunctioning Hox genes result in animals missing large segments of their limbs. Mammalian Hox genes have an interesting feature. “In the forming limbs the HoxA and HoxD genes are switched on in two independent ‘waves’, the first making the proximal limb (arm/leg) and the second making the digits (toes/fingers),” said Woltering in an email to this correspondent.

Limb formation in tetrapods is usually attributed to this ‘bimodal’ behaviour of Hox genes. So the scientists were surprised to observe the same mechanism in Hox genes in zebrafish fin radials (the bony part at the end of fins), too. So are the two structures ancestrally the same or “homologous” structures?

To test this, the team inserted fish Hox genes into mouse embryos and found that in the resulting mice, Hox genes were active only in the proximal part of the limbs, not in the digits. “This showed that the fish counterpart of the mouse ‘digit’ domain cannot yet ‘make’ digits,” said Woltering. Therefore fish fin radials and tetrapod digits are not “homologous” in the classical sense.

However, keeping in mind the shared regulatory mechanism in the Hox genes of fish and those of mice, the team proposes a re-definition of “homology.” “For instance, there are genes that are expressed in the hand and in hair follicles; this fact doesn’t make them homologous structures,” he said. “Only if the underlying ‘switches’ that determine where a gene is expressed are homologous, the structures are homologous.”

Dr. Arkhat Abzhanov, an evolutionary biologist from Harvard University who was not involved in the study, agrees. “Identical expression patterns of the same gene(s) could in principle be established by non-homologous regulatory mechanisms so it might be very helpful to look at the regulatory details,” he said in an email.

EU pollution target ‘still too high’

car exhaust
Stricter curbs on emissions are needed, say public health experts

A study confirming a link between atmospheric pollution and heart-attack risk strengthens the EU case for tougher clean-air targets, experts say.

Research in the BMJ looking at long-term data for 100,000 people in five European countries found evidence of harm, even at permitted concentrations.

Experts stressed that the risk to an individual was still relatively small.

And some argued the results were not conclusive as they did not take account of previous exposure to higher levels.


Other factors, such as smoking or having high blood pressure, contribute more to a person’s risk of heart attack than pollution from traffic fumes and industry, they say.

But repeated, long-term exposure to air pollution – living next to a busy road in a city, for example – does take its toll, the research, involving a collaboration of European universities and institutes, reveals.

Tougher limits

The BMJ study found that for each 5 µg/m3 increase in annual exposure to fine-particulate (PM 2.5) air pollution – thought to be the most damaging type, as smaller particles can penetrate deep into the body – there is a 13% relative increase in the incidence of heart attacks, even after taking into account other risk factors such as smoking.
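Under the usual epidemiological convention that relative risk compounds multiplicatively per exposure increment (an assumption here, not spelled out in the article), the reported figure scales as follows:

```python
# Reported: a 13% relative increase in heart-attack incidence
# per 5 µg/m³ of annual PM2.5 exposure.
def relative_risk(delta_pm25_ugm3, rr_per_5=1.13):
    # Assumes risk compounds multiplicatively per 5 µg/m³ increment
    return rr_per_5 ** (delta_pm25_ugm3 / 5)

print(f"{relative_risk(5):.2f}")   # 1.13
print(f"{relative_risk(10):.2f}")  # ~1.28
```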

Similarly, rising levels of larger-particulate air pollution (PM 10) were also linked to heart-attack risk.

And these associations remained even when exposure concentrations were below the current European limits.

The authors of the study – the largest ever to look at the impact of pollution exposure on Europeans – say its results support the case for lowering EU limits for particulate matter air pollution.

Current EU legislation sets the annual mean limit on PM 2.5 at twice that recommended by the World Health Organization.

Case ‘not made’

Jon Ayres, professor of environmental and respiratory medicine at the University of Birmingham, said: “There is no doubt that further reduction of PM levels would result in improvements in cardiac health in Europe.

“One can only hope that our European politicians will be persuaded of the importance of these findings and reassess their position on air pollution in Europe.”

Prof David Coggon, from the Department of Occupational and Environmental Medicine at the University of Southampton, was more cautious.

“This study adds to the evidence that particulate air pollution is a cause of heart disease, but it does not establish that there are important health risks from levels of exposure below current exposure limits,” he said.

“This is because the differences in risk that were observed may have been a long-term effect of exposures in the past when levels of pollution were higher.”

UK estimates suggest nearly 30,000 people die prematurely each year as a direct result of exposure to air pollution, which has been linked to asthma and other lung diseases, including cancer, as well as heart problems.

A recent report by Defra on the issue says evidence suggests that there is no “safe” limit for exposure to PM 2.5, and that this type of man-made pollution cuts the average life expectancy of people living in the UK by seven to eight months.

A Defra spokeswoman said: “Air quality has improved significantly in recent decades and the UK currently meets the EU limits for this type of pollution.

“We want to keep improving air quality and reduce the impact it can have on human health and the environment.”

‘Fastest ever’ broadband passes test

BT Tower
The test was carried out between BT’s central London tower and its site in Ipswich

The “fastest ever” broadband speeds have been achieved in a test in London, raising hopes of more efficient data transfer via existing infrastructure.

Alcatel-Lucent and BT said speeds of 1.4 terabits per second were achieved during their joint test – enough to send 44 uncompressed HD films a second.

The test was conducted on a 410km (255-mile) link between the BT Tower in central London and Ipswich.

However, it may be many years before consumers notice any effect.


But the breakthrough is being seen as highly important for internet service providers (ISPs), as it means a greater amount of information can be sent through existing broadband infrastructure, reducing the need for costly upgrades.

“BT and Alcatel-Lucent are making more from what they’ve got,” explained Oliver Johnson, chief executive of broadband analyst firm Point Topic.

“It allows them to increase their capacity without having to spend much more money.”

Alcatel-Lucent told the BBC that the demand for higher bandwidth grew by around 35% every year, making the need for more efficient ways to transfer data a massively pressing issue for ISPs, particularly with the growing popularity of data-heavy online services, such as film-streaming website Netflix.

How much?!

The speed achieved by the researchers topped out at 1.4 terabits per second. But what does that figure mean?

Data transfer is measured in bits, not to be confused with bytes, which are a measure of stored information.

1.4 terabits is a huge amount, enough capacity to transmit 44 HD movies at once.

To give that context, the current fastest package for consumers in the UK (excluding Hull) is Virgin Media’s 120 megabits per second.

There are 1,024 megabits in just one gigabit and 1,024 gigabits in one terabit.
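Those conversions make the per-film arithmetic easy to check. A sketch, assuming the article's "44 uncompressed HD films" share the link evenly:

```python
MEGABITS_PER_TERABIT = 1024 * 1024  # 1,024 Mb per Gb x 1,024 Gb per Tb

total_mbps = 1.4 * MEGABITS_PER_TERABIT
per_film_mbps = total_mbps / 44     # assumes an even split across 44 streams
ratio_vs_virgin = total_mbps / 120  # vs Virgin Media's 120 Mbps package

print(f"{per_film_mbps:,.0f} Mbps per film")       # ~33,400 Mbps
print(f"~{ratio_vs_virgin:,.0f}x consumer speed")  # ~12,200x
```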

There are faster methods of transmitting data – such as the use of complex laser technology – but this is the first test to achieve such high speeds in “real world” conditions, outside testing labs.

Rush-hour traffic

The high speeds were achieved using existing fibre cable technology that has already been installed in much of the UK and other parts of the world.

Kevin Drury, optical marketing leader at Alcatel-Lucent, likened the development to reducing space between lanes on a busy motorway, enabling more lanes of traffic to flow through the same area.

He said this flexibility meant some channels could be adapted to specific needs, like opening an extra lane during the morning rush hour.

In internet terms, this would mean, for example, streaming video would get a large, wide lane, while accessing standard web pages would need only a small part of the fibre’s capacity.

However, pushing more data through fibre cables presents a challenge.

The test will be welcome news for Reed Hastings, chief executive of streaming service Netflix, interviewed by the BBC earlier this month

“The trade-off is, the more you squeeze into a fibre line, the more potential there is for interference and for error,” explained Mr Johnson.

“What has got better is the fact they are able to pack these channels closer together and into the same space.”

Alcatel-Lucent and BT said their test demonstrated “stable, error-free operation”.

Biggest asteroid vents water vapour

Ceres impression
An artist’s impression of water out-gassing from two sources on Ceres

Observations of the Solar System’s biggest asteroid suggest it is spewing plumes of water vapour into space.

Ceres has long been thought to contain substantial quantities of ice within its body, but this is the first time such releases have been detected.

The discovery was made by Europe’s infrared Herschel space telescope, and is reported in the journal Nature.

Scientists believe the vapour is coming from dark coloured regions on Ceres’ surface, but are not sure of the cause.

One idea is that surface, or near-surface, ice is being warmed by the Sun, turning it directly to a gas that then escapes to space.

“Another possibility,” says the European Space Agency’s Michael Kuppers, “is that there is still some energy in the interior of Ceres, and this energy would make the water vent out in a similar way as for geysers on Earth, only that with the low pressure at the surface of the asteroid, what comes out would be a vapour and not a liquid.”

The quantity being out-gassed is not great – just 6 kg per second – but the signature is unmistakable to Herschel, which was perfectly tuned to detect water molecules in space.
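For scale, even that modest rate adds up quickly; simple arithmetic on the reported figure:

```python
RATE_KG_PER_S = 6       # reported out-gassing rate, kg per second
SECONDS_PER_DAY = 86_400

tonnes_per_day = RATE_KG_PER_S * SECONDS_PER_DAY / 1000
print(f"~{tonnes_per_day:.0f} tonnes of water vapour per day")
```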

The telescope’s observations were made before its decommissioning last year.

Ceres pictured by Hubble
Currently, our best image of Ceres comes from the Hubble Space Telescope

Scientists will get a better idea of what is going on in 2015, when Ceres is visited by the American space agency’s Dawn probe.

The satellite will go into orbit around the 950km-wide body, mapping its surface and determining its composition and structure.

“It will be able to observe those dark regions at high resolution, and will probably solve the question of what process is creating the water vapour,” explained Dr Kuppers.

Ceres is often now referred to as a “dwarf planet” – the same designation used to describe Pluto following its demotion from full planet status in 2006.

The asteroid’s sheer size means gravity has pulled it into a near-spherical form.

It is regarded as quite a primitive body in that it has clearly not undergone the same heating and processing of its materials that many other objects in the asteroid belt between Mars and Jupiter have experienced.

Scientists suspect water-ice is buried under Ceres’ crust because its density is less than that of the Earth. And this reputation as a “wet body” is supported by the presence of many minerals at its surface that have water bound into their structure.

One theory to explain why Ceres has so much more water-ice than other members of the surrounding asteroid population is that it formed further away from the Sun, and only later migrated to its present location.

This could have happened if perturbed by Jupiter, whose gravity plays a key role in corralling the asteroids in the belt they occupy today.

“We now have a more sophisticated model for the evolution of the Solar System called the Nice model, which successfully explains many of the features of the Solar System, with the planets having migrated outwards and then maybe also inwards,” said Dr Kuppers.

Flu shot lies and media propaganda.



Vaccine adjuvant aluminum hydroxide causes neurological disease.


The percentage of veterans of the 1990-1991 Gulf War who ultimately suffered from Gulf War Syndrome (GWS) was astounding: more than one-third, or approximately 250,000 of the 697,000 vets who served in the first Gulf War campaign, are afflicted, according to the National Academy of Sciences. In 2009, a study published in the Journal of Inorganic Biochemistry posited that the substance aluminum hydroxide could be the primary GWS culprit, as documented at Science.NaturalNews.com:

Gulf War Syndrome is a multi-system disorder afflicting many veterans of Western armies in the 1990-1991 Gulf War. A number of those afflicted may show neurological deficits including various cognitive dysfunctions and motor neuron disease, the latter expression virtually indistinguishable from classical Amyotrophic lateral sclerosis (ALS) except for the age of onset. This ALS “cluster” represents the second such ALS cluster described in the literature to date. Possible causes of GWS include several of the Adjuvants in the Anthrax vaccine and others. The most likely culprit appears to be Aluminum hydroxide.

Digital contact lenses come into focus.

The idea of having images and text streamed live across your contact lenses has been used in many a Hollywood film. Now an international team of researchers has developed the first working prototype for such a device – they have constructed and tested a prototype contact lens capable of streaming real-time information across the field of vision; potentially providing the wearer with information updates. The researchers claim that their device will have a variety of uses, such as a biosensor, augmented reality systems, gaming devices, navigation systems and even as an aid to the hearing impaired.

Photograph of the completed contact lens system

In a study published in the Journal of Micromechanics and Microengineering, the researchers from the University of Washington, Seattle, and Aalto University in Finland describe how they constructed a computerized single-pixel contact lens and then demonstrated its safety by testing it on live eyes that showed no adverse side effects.

“Our group has expertise in miniaturization and integration of devices into unconventional substrates. The contact lens is a perfect platform for this. We also wanted to explore if it is possible to have a single personal display instead of numerous devices with numerous displays per person,” explains lead researcher Babak Parviz from the University of Washington. The current design works as a “proof of concept” for more advanced lenses with multiple pixels, which could be used to display short e-mails and text messages.

Eye of knowledge?

The lens display consists of an antenna to harvest power sent out by an external source, as well as an integrated circuit to store this energy and transfer it to a transparent sapphire chip containing a single blue LED. All these modifications do not affect the lens’s function as a normal contact lens in any way, according to Parviz.

The device could overlay computer-generated visual information onto the “real world”, making it easy to access information instantly from platforms such as mobile phones. It could also be linked to a biosensor in the user’s body to generate updates, alerting the wearer to any changes in his or her glucose or lactate levels.

Close-range focus

One of the major challenges the team faced in making the lens system more biocompatible was that the human eye cannot resolve objects on a contact lens: its minimum focal distance is at least several centimetres, so any information on the lens itself looks blurry and out of focus. To combat this, the researchers incorporated a set of micro-Fresnel lenses – which are generally thinner and flatter than an average lens and have short focal lengths – into the contact lens, allowing it to focus the projected image onto the retina.

After testing the contact lens in free space, it was fitted to the eye of a rabbit, under the strict guidelines for animal use in the laboratory, to see the effect of wearing the contact lens on the cornea and the body in general. In addition to visualizing techniques, a fluorescent dye was added to the eye of the rabbit to test for any abrasion or thermal burning. The team saw absolutely no adverse effects to the cornea after testing.

Hi-res lenses of the future

In the days to come, the researchers will be looking into making improvements that will allow them to produce a fully functional, remotely powered, high-resolution display on the lens. Currently, the lens can be wirelessly powered in free space from approximately one metre away, but this range was reduced to about two centimetres when the lens was placed on the rabbit’s eye. They are also looking into ways to resolve text more easily on the display. “We still need to perfect the focusing mechanism further before we can do this if the text is arbitrary. Pre-determined text is a lot easier,” explains Parviz.

5 Foods to Avoid if You Have High Blood Pressure

One in three Americans is at risk for kidney disease due to high blood pressure, also known as hypertension. High blood pressure is a leading cause of kidney disease and increases your risk of heart attack and stroke. There is no cure, but treatment and lifestyle changes—taking blood pressure medications, following a healthy diet and exercising regularly—can lower blood pressure.


Even if you’re taking medications to lower blood pressure, it’s important to reduce your sodium intake by cutting down on high salt foods. The National Kidney Foundation recommends that those with high blood pressure or kidney disease limit daily sodium intake to 2,000 milligrams (mg). To put it in perspective, a teaspoon of salt contains about 2,400 mg of sodium, so 2,000 mg can add up quickly. Here are the National Kidney Foundation’s five foods to avoid if you have high blood pressure:

  1. Processed foods: These foods are usually high in salt and other additives that contain sodium. Processed foods include American cheese, frozen dinners, canned soups and fast foods. For example, two thin slices of regular American cheese contain 456 mg of sodium. Always read labels and compare various brands of the same food item until you find the one with the lowest sodium content, since this varies from brand to brand. Cook rice, pasta and hot cereals without salt, and cut back on instant or flavored rice, pasta and cereal mixes, which usually have added salt.
  2. Table salt: Avoid table salt whenever possible. Just half a teaspoon contains about 1,200 mg of sodium, 60% of the recommended daily limit. Instead, use herbs, spices, and salt-free seasoning blends in cooking and at the table.
  3. Luncheon meats: Use fresh poultry, fish, and lean meat rather than canned or processed deli meats. Typically a sodium-phosphate solution is injected into processed deli meats, making them high in sodium. Two slices of regular ham contain 604 mg of sodium, almost a third of the daily recommended limit.
  4. Salad dressings: Prepared salad dressings can be very high in sodium, depending on the dressing type and brand, so be sure to look for the “low-sodium” variety. Reduced-fat ranch dressing is a better choice, containing approximately 336 mg of sodium per two tablespoons.
  5. Soy sauce: One tablespoon of soy sauce has 1,005 mg of sodium. When shopping at the grocery store or eating at a restaurant, choose low-sodium or light soy sauce.
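To see how quickly the 2,000 mg limit is reached, the article's own figures can be tallied. An illustrative day's worth of the items above, not a recommended menu:

```python
DAILY_LIMIT_MG = 2000  # National Kidney Foundation recommendation

# Sodium figures taken from the article (illustrative tally)
items_mg = {
    "2 slices American cheese": 456,
    "1/2 tsp table salt": 1200,
    "2 slices regular ham": 604,
}

total_mg = sum(items_mg.values())
over = "over limit" if total_mg > DAILY_LIMIT_MG else "within limit"
print(f"{total_mg} mg total: {over}")  # 2260 mg total: over limit
```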