3D surgery in the brain.

Surgeons carry out operations on the brain using 3D technology

When Avi Yaron was 26 years old he had a motorbike accident – a day he describes as the luckiest of his life.

As doctors scanned his head to check for damage, they found a tumour deep inside his brain which might otherwise have remained undetected.

And in 1993 the electronic engineering student from Israel was told the mass they had just discovered was nestling close to areas of the brain critical for movement and thought.

He now faced a choice – to have complex, invasive surgery that carried a risk of paralysis, or to find another way.

After a year of searching, Mr Yaron came across a surgeon in New York who removed part of the tumour successfully, and samples showed it was benign. The engineer was then advised to wait until technology had improved enough for the next operation to be less risky.

But for Mr Yaron this was not an option. The possibility remained that the tumour could put pressure on parts of the brain as it grew.

He said: “I was young and thinking of starting a family. I could not be passive about this sword hanging over my head.”

After five years of seeking out key surgeons and experts in technology, Mr Yaron had the rest of the tumour removed during a conventional operation – with good results.

Picture of operation using 3D glasses
The whole team wears 3D glasses and watches the surgery unfold on screens in the operating theatre

But this epic search for better surgical options continued to play on his mind. He kept thinking of and experimenting with ways to do brain surgery in a less invasive way.

And over the last few years he perfected a way to do surgery on the brain – in 3D.

Surgery through a scope

In the last 25 years minimally invasive surgery has become commonplace for the relatively easy-to-reach areas of the body, such as keyhole surgery on the abdomen and womb. And more recently surgeons have been able to use scopes (tube-like instruments) in brain surgery too.

Case study

Photo of Elizabeth Watson
  • 71-year-old Elizabeth Watson is one of the first patients in the UK to have had an operation using this system (2013)
  • She had a benign tumour growing on her pituitary – a key hormone-producing gland in the brain
  • She says: “The new equipment helped convince me to have the operation. It looks as though it has been really successful”

During these procedures a thin scope is inserted via a surgically made or naturally occurring opening in the body. A camera attached to the end of the scope relays images to a screen for the surgeon to see.

And surgical instruments are passed down the scopes to take samples of tissues or remove masses.

Early versions allowed surgeons to look at 2D images in standard definition, evolving over the last decade into high-definition systems.

Surgeons constantly translate these 2D images into 3D as they operate, much as we do when watching 2D television screens.

More recently 3D technology has become available for certain types of operation. But 3D brain surgery has been a much harder feat to achieve.

In neurosurgery the scopes need to be very small in diameter so they can pass through narrow ports such as the nose.

But most 3D scopes rely on two optical channels, each containing its own sensor. The two sensors capture two slightly different images that are then combined on screen to give the appearance of three dimensions as a user looks at the screen – mirroring the way human eyes see.
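The way a pair of offset images encodes depth can be sketched with the standard pinhole stereo relation: a point's depth is the focal length times the baseline between the two channels, divided by the disparity between its positions in the left and right images. The figures below are made up for illustration and are not real scope specifications:

```python
# Illustrative sketch of dual-channel stereo depth perception.
# depth = focal_length * baseline / disparity (pinhole stereo model);
# the numbers are hypothetical, not real endoscope parameters.

def stereo_depth_mm(focal_length_mm: float, baseline_mm: float,
                    disparity_mm: float) -> float:
    """Depth of a point from the disparity between left/right images."""
    if disparity_mm <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_mm * baseline_mm / disparity_mm

# A nearer structure produces a larger disparity, hence a smaller depth:
near = stereo_depth_mm(focal_length_mm=4.0, baseline_mm=3.0, disparity_mm=0.30)
far = stereo_depth_mm(focal_length_mm=4.0, baseline_mm=3.0, disparity_mm=0.05)
assert near < far
```

The relation also shows why miniaturisation is hard: shrinking the scope shrinks the baseline, which shrinks the disparities the sensors must resolve.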

It has so far proved difficult to make an instrument small enough that is able to produce the high-quality images neurosurgeons need.

‘Insect eye’

But Mr Yaron says his team have cracked this puzzle by thinking laterally. Rather than copying human anatomy, their scope mimics the compound eye of a bee.

The scope contains a miniature sensor with hundreds of thousands of micron-sized elements, each looking in slightly different directions and mapping the surgical field from many different points.

Software translates this into images for the left and right eye. With this single-sensor system, Mr Yaron’s company, Visionsense, has produced a scope small enough to operate on the brain.
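One way to picture how a single multi-element sensor can yield a stereo pair is to group its elements by the direction they look: leftward-looking elements feed one virtual view, rightward-looking elements the other. The toy sketch below illustrates only that grouping idea, with invented numbers – it bears no relation to Visionsense's actual algorithms:

```python
# Toy sketch: one sensor, many elements, two virtual views.
# All values are hypothetical and purely illustrative.

n_elements = 1000
# Each element's horizontal viewing angle, evenly spread (radians):
angles = [-0.2 + i * 0.4 / (n_elements - 1) for i in range(n_elements)]
# Each element's light reading (dummy data standing in for real output):
readings = [(i * 37) % 255 for i in range(n_elements)]

# Software splits the one sensor's readings into two virtual views:
left_view = [r for a, r in zip(angles, readings) if a < 0]
right_view = [r for a, r in zip(angles, readings) if a >= 0]

# Together the two virtual views cover the full element array,
# giving a left/right image pair from a single physical sensor.
assert len(left_view) + len(right_view) == n_elements
```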

Shahzada Ahmed from the Queen Elizabeth Hospital in Birmingham, who carried out one of the first 3D endoscopic neurosurgical procedures in the UK, says: “A bit like going to the movies, Avatar is a great movie in HD but it is an even better one in 3D.

“When I use the scope there is a better appreciation of depth and the pictures feel more real to me.”

3D brain surgery: possible uses

Illustration of endoscopic brain surgery
  • Removal of tumours and masses at the base of the skull and in the nose
  • Removal of pituitary tumours
  • Sinus surgery

It also allows him to see his instruments in 3D, which he feels gives him a better understanding of where they are in relation to key parts of anatomy.

Model brain

A number of studies are now being carried out to see if the 3D approach is better than commonly used 2D high definition systems.

Hani Marcus, a neurosurgeon at the Hamlyn Centre, Imperial College London, recently compared the scope to conventional tools, using a model brain and surgeons who are new to this endoscopic approach.

The study suggests the 3D aspect is beneficial – leading to a faster operation and subjective improvements in depth perception.

But Mr Marcus says it would be a mistake to automatically assume 3D is definitely better than 2D, and thinks further studies are needed.

There are a number of potential problems – surgeons accustomed to operating in 2D may find the new approach hard to adjust to.

And just as some people don’t enjoy watching 3D films and feel slightly nauseous, the same may hold for some surgeons.

But for Mr Yaron, whose scope is now being used in the US and across Europe, this invention is the bright side of an issue that has been playing on his mind for many years.

He says: “If I hadn’t had this accident I wouldn’t have been able to offer this solution. And I really know how it feels to need options.”


CERN experiment produces first beam of antihydrogen atoms for hyperfine study.

The ASACUSA experiment at CERN has succeeded for the first time in producing a beam of antihydrogen atoms. In a paper published today in Nature Communications, the ASACUSA collaboration reports the unambiguous detection of 80 antihydrogen atoms 2.7 metres downstream of their production, where the perturbing influence of the magnetic fields used initially to produce the antiatoms is small. This result is a significant step towards precise hyperfine spectroscopy of antihydrogen atoms.

Primordial antimatter has so far never been observed in the Universe, and its absence remains a major scientific enigma. Nevertheless, it is possible to produce significant amounts of antihydrogen in experiments at CERN by mixing antielectrons (positrons) and low energy antiprotons produced by the Antiproton Decelerator.

The spectra of hydrogen and antihydrogen are predicted to be identical, so any tiny difference between them would immediately open a window to new physics, and could help in solving the antimatter mystery. With its single proton accompanied by just one electron, hydrogen is the simplest existing atom, and one of the most precisely investigated and best understood systems in modern physics. Thus comparisons of hydrogen and antihydrogen atoms constitute one of the best ways to perform highly precise tests of matter/antimatter symmetry.

Matter and antimatter annihilate immediately when they meet, so aside from creating antihydrogen, one of the key challenges for physicists is to keep antiatoms away from ordinary matter. To do so, experiments take advantage of antihydrogen’s magnetic properties (which are similar to hydrogen’s) and use very strong non-uniform magnetic fields to trap antiatoms long enough to study them. However, the strong magnetic field gradients degrade the spectroscopic properties of the (anti)atoms. To allow for clean high-resolution spectroscopy, the ASACUSA collaboration developed an innovative set-up to transfer antihydrogen atoms to a region where they can be studied in flight, far from the strong magnetic field.

“Antihydrogen atoms having no charge, it was a big challenge to transport them from their trap. Our results are very promising for high-precision studies of antihydrogen atoms, particularly the hyperfine structure, one of the two best known spectroscopic properties of hydrogen. Its measurement in antihydrogen will allow the most sensitive test of matter/antimatter symmetry.  We are looking forward to restarting this summer with an even more improved set-up,” said Yasunori Yamazaki of RIKEN, Japan, a team leader of the ASACUSA collaboration. The next step for the ASACUSA experiment will be to optimize the intensity and kinetic energy of antihydrogen beams, and to understand better their quantum state.

Progress with antimatter experiments at CERN has been accelerating in recent years. In 2011, the ALPHA experiment announced trapping of antihydrogen atoms for 1000 seconds and reported observation of hyperfine transitions of trapped antiatoms in 2012. In 2013, the ATRAP experiment announced the first direct measurement of the antiproton’s magnetic moment with a fractional precision of 4.4 parts in a million.

A New Drug Could Help You Forget Long-Term Traumatic Memories.

In the 2004 sci-fi film Eternal Sunshine of the Spotless Mind, a pair of characters end their stormy romance with a bizarre solution: They pay a company called Lacuna, Inc. to erase all memory of the relationship from their brains as they sleep.

When the movie came out, the premise was pure fantasy. But a group of neuroscientists from MIT and elsewhere have recently identified a drug that could someday help us dislodge traumatic memories in the real world.

The drug, a histone deacetylase inhibitor (HDACi), interferes with one of the ways brain cells record memories: the precise placement of proteins called histones on certain segments of DNA, which affects which genes are expressed. The hope is that, using this principle, doctors could someday prescribe drugs that aid in the treatment of post-traumatic stress disorder (PTSD).

Right now, those seeking relief from PTSD typically use exposure therapy, in which a patient mentally revisits a traumatic memory in hopes of overcoming the anxiety associated with it. But “the options for treatment of PTSD are very limited. There’s really no good medication, and exposure-based psychotherapy is often ineffective for older memories,” says Li-Huei Tsai, the lead author of a new study documenting the research, published in the journal Cell. “This study suggests that manipulating histone-based mechanisms involved in memory deserves a serious investigation, and could someday be applied to patients.”

Exposure therapy usually involves intentionally re-experiencing stimuli associated with a traumatic memory in hopes of replacing the original memory with a new, harmless one. A war veteran experiencing PTSD, for instance, might put on a pair of virtual-reality goggles that portray a traumatic war experience, while consciously aware that he or she is safe in a therapist’s office.

For relatively recent memories, this approach has been found to be fairly effective, in part because the brain’s natural neuroplasticity allows it to replace associations. After several years, however, old memories seem to harden and can no longer be dislodged by new ones.

Interestingly, the same pattern has been observed in mice – and the use of an HDACi appears to lengthen the key period of neuroplasticity. If the finding can be applied to humans, it could dramatically extend the window during which exposure therapy is effective.

Researchers demonstrated this effect on neuroplasticity through trials in which mice were exposed to a brief electric shock just after hearing a loud tone, which forces them to associate the sound with a traumatic event. Normally, if the mice hear the same sound a day later without being shocked, they’re able to replace the old memory with the new one, and will stop freezing up in fear when they hear the sound again. However, if a month passes before they listen to the sound again, the association between sound and pain is mentally cemented and permanent.

When the researchers examined activity going on at the molecular level, they noticed that the activity of histone proteins on DNA played a key role in the neuroplasticity that allowed the exposure to sound without a shock to dislodge very recent traumatic memories and replace them with new ones. This gave the researchers an idea: to use a drug like HDACi (currently used in research on cancer treatments) to artificially increase neuroplasticity for older memories by altering how histone proteins attach to DNA.

To do so, they exposed mice to the same tone-and-shock regimen, waited about a month without playing the tone, then injected them with an HDACi and tried to dislodge the memory with the same exposure treatment as before. This time, it worked. The mice didn’t freeze up in terror when they heard the sound. On a cellular level, the researchers observed the same patterns that had previously occurred only when day-old memories were replaced.

Obviously, humans are not mice, but previous research has suggested that the same principles related to neuroplasticity seem to apply to exposure therapy in both species. That’s why the researchers suggest that combining an HDACi with conventional exposure therapy could someday be a way to weaken the hold of older traumatic memories in people suffering from PTSD, replacing them with new memories devoid of anxiety.

“Persistent fearful memories are a problem very relevant to our society. A lot of people suffer from an inability to subside very traumatic events in their lives,” Tsai says. “Combining this sort of treatment with exposure-based psychotherapy might eventually provide an option for them.”

There are still a whole lot of hurdles to be cleared before this is a possibility. The MIT researchers—neuroscientists working in a rapidly emerging field called epigenetics, involving the regulation of gene expression—are attempting to answer basic questions about how the brain encodes memories. They’re not researchers that develop drugs, so it’d likely be another team that carries the research forward, and it would first be necessary to prove that this sort of novel approach is safe for humans.

But it’s worth noting that the researchers extended the mice’s natural forgetting process, allowing the mice to replace a traumatic memory a month—rather than just a day—after it was formed. It’s not as radical as Lacuna, Inc. magically erasing memories à la Eternal Sunshine, but it’s also a lot more similar to the processes that already go on inside the brain, and therefore a much more realistic future treatment.

China’s Pollution Wafting Across Pacific To The U.S., Study Finds.

Pollution from China travels in large quantities across the Pacific Ocean to the United States, a new study has found, making environmental and health problems unexpected side effects of U.S. demand for cheap China-manufactured goods.

On some days, acid rain-inducing sulphate from burning of fossil fuels in China can account for as much as a quarter of sulphate pollution in the western United States, a team of Chinese and American researchers said in the report published by the U.S. National Academy of Sciences, a non-profit society of scholars.

Cities like Los Angeles received at least an extra day of smog a year from nitrogen oxide and carbon monoxide from China’s export-dependent factories, it said.

“We’ve outsourced our manufacturing and much of our pollution, but some of it is blowing back across the Pacific to haunt us,” co-author Steve Davis, a scientist at University of California Irvine, said.

Between 17 and 36 percent of various air pollutants in China in 2006 were related to the production of goods for export, according to the report, with a fifth of that specifically tied to U.S.-China trade.
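Taking the quoted figures at face value, the share of China's 2006 air pollution tied specifically to U.S.-China trade works out to roughly 3 to 7 percent – a back-of-envelope reading of the numbers above:

```python
# Back-of-envelope: a fifth of the export-linked share (17-36%)
# gives the portion tied specifically to U.S.-China trade.
export_share_low, export_share_high = 0.17, 0.36
us_trade_fraction = 1 / 5

low = export_share_low * us_trade_fraction    # about 3.4%
high = export_share_high * us_trade_fraction  # about 7.2%
print(f"U.S.-trade-linked share: {low:.1%} to {high:.1%}")
```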

One third of China’s greenhouse gases now comes from export-based industries, according to the Worldwatch Institute, a U.S.-based environmental research group.

China’s neighbours, such as Japan and South Korea, have regularly suffered noxious clouds from China in the last couple of decades as environmental regulations have been sacrificed for economic and industrial growth.

However, the new report showed that many pollutants, including black carbon, which contributes to climate change and is linked to cancer, emphysema and heart and lung diseases, travelled huge distances on global winds known as “westerlies”.

Trans-boundary pollution has for several years been an issue in international climate change negotiations, where China has argued that developed nations should take responsibility for a share of China’s greenhouse gas emissions, because they originate from production of goods demanded by the West.

The report said its findings showed that trade issues must play a role in global talks to cut pollution.

“International cooperation to reduce transboundary transport of air pollution must confront the question of who is responsible for emissions in one country during production of goods to support consumption in another,” it said.

Air quality is of increasing concern to China’s stability-obsessed leaders, anxious to douse potential unrest as a more affluent urban population turns against a growth-at-all-costs economic model that has poisoned much of the country’s air, water and soil.

Authorities have invested in various projects to fight pollution, but none so far has worked.

Is the Magic Bullet Becoming Reality?

Researchers at a recent New York conference discuss what the future of nanomedicine may hold.

Nanotech seems promising, but regulatory and patent issues remain. [© Anterovium – Fotolia.com]
Slightly over a century ago, Paul Ehrlich coined the term “magic bullet” to refer to therapeutic compounds designed to selectively target a pathogen without affecting the host. Subsequently, this idea flourished not only for infectious diseases but also for other fields, such as cancer therapy. At the recent “Nanomedicines: Addressing the Scientific and Regulatory Gap” conference held at the New York Academy of Sciences in late November, investigators discussed key concepts shaping a vibrant field that promises to bring this concept closer to the clinic.

  • Liposomes

    “When we entered this field, from the few systems that existed, we chose to work on liposomes,” said Yechezkel Barenholz, Ph.D., professor of biochemistry at the Hebrew University – Hadassah Medical School in Jerusalem. Liposomes presented the advantage that relatively more knowledge existed about their pharmacokinetic properties. The compound that Dr. Barenholz and colleagues started to work on, doxorubicin, is one of the most effective first-line anticancer therapeutics ever developed, but one of its disadvantages is that adverse effects occur in many organs upon systemic administration. Work by Dr. Barenholz and colleagues on a liposome-based doxorubicin formulation culminated with the development of Doxil (doxorubicin), the first nanodrug approved by the FDA in 1995.

    “In a way, the success of this project started from a failure,” Dr. Barenholz said. Investigators in his lab initially developed a liposome-based doxorubicin formulation to reach the liver and treat hepatocellular carcinoma, and even though this worked well in an animal model, the pharmacokinetics was not favorable in humans. “As a result we became more determined, and in a new approach, we decided to first determine what kind of performance we need from liposomes, and then we used a materials science approach,” he added.

    A key concept in developing the doxorubicin-loaded liposomes was the enhanced permeability and retention effect. This phenomenon, which results from differences in vasculature between normal and inflamed tissue, refers to the ability of the poorly aligned, fenestrated endothelial cells from the malignant tumor neovasculature to allow 10–300 nm diameter nanoparticles to cross and become selectively enriched in the tumor.

    “This is not the case in normal tissues, and it represents the Achilles’ heel of the tumor,” said Dr. Barenholz. The defective lymphatic drainage of malignant tissues further facilitates the accumulation of nanoparticles.

    To load the liposomes with doxorubicin, Dr. Barenholz and colleagues relied on a transmembrane ammonium sulfate gradient that acted as the driving force for the loading process. As a result, doxorubicin reached 100-fold higher concentrations in the intraliposomal aqueous phase as compared to the loading medium. The circulation time of liposomes was extended by stabilizing them with a formulation composed of phospholipids with high melting temperature, cholesterol, and a pegylated lipopolymer. Clinical studies in humans revealed that the nanoparticles accumulated in tumors, and doxorubicin reached higher concentrations in the tumor than could be achieved with systemic administration.

    “This formulation improves patient compliance and the quality of life,” Dr. Barenholz said.

  • TNF

    “We wanted to use nanotechnology in cancer therapy and change the way we treat this disease,” said Lawrence Tamarkin, Ph.D., president and CEO of CytImmune. The approach that Dr. Tamarkin and colleagues developed relies on 27 nm gold nanoparticles; colloidal gold has been used since the 1930s to treat psoriatic arthritis and has an established history of safety.

    The surface of the colloidal gold nanoparticles was simultaneously bound to covalently linked thiolated PEG, to avoid immune detection, and recombinant human tumor necrosis factor (TNF). Clinically, TNF has been successfully used in Europe to treat tumors of the extremities, in a procedure known as isolated limb perfusion. Infusing high-dose TNF prior to chemotherapy can achieve 15–25-fold higher concentrations as compared to systemic administration, without the same risks of adverse effects. With this strategy, several studies found up to 95% response rates after one treatment in patients with melanoma and sarcoma. “We wanted to mimic this clinical experience systemically,” said Dr. Tamarkin.

    The 27 nm-diameter gold nanoparticles are small enough to travel through the blood vessels, and the 2–4 nm gaps between endothelial cells in healthy blood vessels are too small to allow them to cross into tissues, due to the presence of the tight junctions. “But at the site of the tumor, where the neovasculature fenestrations are 200–400 nm, the blood pressure forces them into the tumor bed, where TNF molecules bind to TNF receptors on the endothelial cells and start causing vascular disruption,” he added.
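The selectivity described above is at heart a size filter, as a toy comparison using the gap sizes quoted in the article makes clear (an illustration of the idea, not a biophysical model):

```python
# Toy size-filter model of why 27 nm particles stay inside healthy
# vessels but cross leaky tumor neovasculature (sizes from the article).
PARTICLE_NM = 27
HEALTHY_GAP_NM = (2, 4)              # endothelial gaps in healthy vessels
TUMOR_FENESTRATION_NM = (200, 400)   # openings in tumor neovasculature

def can_cross(particle_nm: float, gap_nm: tuple) -> bool:
    """A particle escapes the vessel only if it fits through the gap."""
    return particle_nm < gap_nm[1]

assert not can_cross(PARTICLE_NM, HEALTHY_GAP_NM)     # retained in blood
assert can_cross(PARTICLE_NM, TUMOR_FENESTRATION_NM)  # enters tumor bed
```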

    In a Phase I clinical trial, CYT-6091, the nanotherapeutic that Dr. Tamarkin and colleagues developed, delivered up to 1.2 mg TNF, as compared to 0.4 mg, which is the maximum tolerated human dose, without any signs of dose-limiting toxicity. “The promise of nanotechnology is that we can reduce or eliminate toxicity and improve the therapeutic index,” he said. Moreover, the drug accumulated at tumor sites, and very few gold nanoparticles were seen in healthy tissues.

    “We intended not simply to reformulate an already approved drug, but to create a safe and effective therapeutic by using nanotechnology,” he said. Two patients, one with inoperable breast cancer and another with pancreatic cancer, neither of whom had previously undergone surgical treatment, showed the greatest nanoparticle accumulation in the tumor relative to adjacent healthy tissue. “This indicated that perhaps treating patients surgically so quickly might not be a good idea, because it tears up the roadway that nanoparticles use to reach their targets,” he commented.

    Gold nanoparticles accumulated in malignant tissues even at the lowest doses, but accumulation did not increase in a dose-dependent manner. Reducing the tumor burden in situ offers the possibility to reduce the need for sophisticated surgery and the hospitalization time.

    “We have the promise to dramatically improve healthcare because of decreased treatment costs,” he concluded.

    • Generic Nanotech

      The concept of generic substitution, which guides the replacement of a prescribed brand of a drug with an identical formulation of the same active compound made by a different manufacturer, appears to open uncharted territory when applied to nanomedicine. Generic medicinal products are normally therapeutically equivalent and therefore interchangeable and substitutable to the reference (innovator) product because they are pharmaceutically comparable and bioequivalent, and they do not require additional clinical efficacy or safety studies.

      “But it has to be ensured that the drug can be fully characterized,” said Stefan Mühlebach, Ph.D., professor of pharmacology at the University of Basel and scientific director at Vifor Pharma. The generic paradigm was successful in the past for small molecules such as aspirin, but it is more problematic for the more complex biological drugs, which are much larger and more difficult to characterize. A third category of medicinal products, the nonbiological complex drugs, is distinct from both the small molecules and the biological therapeutics by the presence of multiple different large molecular structures, some of which may be nanoparticulate, and by not being a biological.

      In nonbiological complex drugs, the entire product represents the active pharmaceutical ingredient, all the components contribute to the characteristics of the final product, and the properties cannot be fully characterized by physicochemical means, which is a prerequisite to show pharmaceutical comparability to a reference listed drug (RLD) and requested for generics.

      “The characterization of nonbiological complex drugs is seriously limited by the fact that we do not always know what to look for when characterizing clinically meaningful differences,” said Dr. Mühlebach. Additionally, the characteristics of nonbiological complex drugs are highly dependent on the elaborate, multistep synthetic manufacturing process.

      Examples of nonbiological complex drugs are the iron carbohydrates, such as iron sucrose used for intravenous iron treatment, liposomal drugs, and some polymeric polypeptides like the glatiramoids. Iron sucrose, a colloidal solution, was introduced into therapy almost 50 years ago and used safely without knowing its nanomedicine character. “A challenge for some of these products is that the first generation of compounds even started to be replaced by competitors (similars) in the absence of awareness of their nano properties and of established or appropriate regulatory evaluation tools,” he added.

      The example of an iron sucrose similar that was used to substitute for the original compound is revealing. A retrospective analysis that investigated adverse effects in 600–700 postpartum gynecology patients from South Korea showed that the original product consistently caused fewer adverse effects than the new formulation. When diluted and administered over a longer time, a measure classically used to improve the tolerance of a parenteral drug, even more adverse effects were reported with the new formulation, contrary to predictions, but understandable given the complexity and fragile stability of the products. As these results indicate, the conventional generic paradigm is no longer reliable in the case of nanosimilars, and concluding that two products could be interchangeable, substitutable, or therapeutically equivalent may be wrong.

      “What we know about nanosimilars is that we need to go into the details of understanding the complexity of the manufacturing process, not only starting with the materials, but also regarding the final product, because these aspects are of highest importance for pharmaceutical equivalence, bioequivalence or the fate of the product in the body and finally efficacy and safety of the therapeutic product for the patient,” Dr. Mühlebach said.

    • Regulatory Issues

      “Nanotechnology, along with the promise and benefits that it brings, may also raise some questions and concerns over safety and effectiveness,” says Ritu Nalubola, Ph.D., senior policy advisor at the Food and Drug Administration. Some of the most significant regulatory considerations revolve around unveiling the properties of nanomaterials and understanding the relevance of those properties to the regulatory status of the specific products.

      To provide a framework for the regulatory oversight of emerging technologies in general and of nanotechnology products in particular, in 2011 the Emerging Technologies Policy Coordination Committee prepared two strategic documents. “Building on these U.S. government policy principles, the FDA developed its own agency-specific regulatory approach, and these emphasize our mission to protect and promote public health, adopt risk-based regulatory approaches based on sound science, and develop transparent and predictable regulatory pathways that are grounded in the best available science,” says Dr. Nalubola.

      The definition of nanoparticles—including their size, which has commonly been placed in the 1-100 nm range—continues to present ample interest for regulatory purposes. “There are challenges in deciding how to address aggregates, agglomerates, and some other complex structures, and whether, in addition, we should also take into account novel engineering properties for regulatory purposes”, Dr. Nalubola commented.

      In 2011, the FDA issued a draft guidance that provides a broad screening tool for regulatory processes. “The FDA draft guidance recognizes our interests in size, but also that our interest extends beyond size, and that other properties are also relevant for safety and efficacy reviews,” she said. The guidance document encourages industry to seek FDA consultation early during product development, to ensure that any questions related to safety, effectiveness, and regulatory status can be identified and addressed adequately and in a timely manner.

      “We also articulated our general position, which is that we do not categorically judge all nanotechnology products as being inherently benign or harmful but, rather, that we are looking at products and their characteristics on a case-by-case basis,” she added. This ensures that the current regulatory framework is sufficiently robust and flexible to consider a variety of nanomaterials, and to concomitantly identify existing regulatory gaps.

      The FDA is actively engaged in the national nanotechnology initiative and participates in some of its research activities. “On the policy side, and in context of the Emerging Technologies Interagency Policy Coordination Committee, we have ongoing dialogues on developing policy approaches and policy-related coordination, and we also participate in the international arena with our regulatory counterparts,” Dr. Nalubola said.

      Of the more than 80,000 articles on nanoparticles available on PubMed in January 2014, over half were published after 2010, revealing the interest and progress marking this area. As a multidisciplinary field, nanotechnology has affected diverse disciplines including agriculture, engineering, energy production, communications, information technology, cosmetics, and biomedicine. Among these, the diverse biomedical applications provide a clear indication that Ehrlich’s era of the “magic bullet” is becoming reality.

    • Fool’s Gold in the Nanotech Patent Rush

      A lack of a universal nanonomenclature is one factor complicating the patenting of nanotechnologies. [© Raj Bawa]

      “The past few years, especially the past decade, have witnessed a nanotechnology patent boom, a sort of patent ‘land grab’ by ‘patent prospectors’ who have captured upstream, foundational nanotechnology-related technologies,” says Raj Bawa, Ph.D., patent agent at Bawa Biotech LLC in Ashburn, VA, and adjunct professor at Rensselaer Polytechnic Institute, Troy, NY.

      At the recently concluded New York Academy of Sciences meeting, Dr. Bawa discussed some of the main considerations with respect to nanotechnology and nanopharma patents. According to information obtained from the U.S. Patent and Trademark Office (PTO), as of December 2012 over 8,000 U.S. nanopatents had been issued by various PTO technology centers and classified under Class 977. However, Dr. Bawa does not put much stock in these numbers, and even considers the classification strategy inadequate.

      “These data simply reflect an upward trend, a sort of ‘nano-explosion’, and nothing more,” he commented. “It is based on the ill-conceived National Nanotechnology Initiative (NNI) definition of nanotechnology, which limits all nanostructures and nanoproducts to the sub-100 nm range. Obviously, such a narrow (1-100 nm) and arbitrary classification scheme by the PTO misses many, if not most, U.S. nanopatents, and the actual numbers are meaningless to a researcher, policy-maker, or patent practitioner.”

      Another significant conundrum pertaining to nanotechnology patenting is the lack of a universal nanonomenclature, whereby researchers and policy-makers often use distinct terms to refer to the same or similar nanostructures. Furthermore, the late 1980s and early 1990s saw the issuance of more than one U.S. patent for the same invention.

      “This is contrary to the foundation of U.S. patent law, under which only one U.S. patent may be issued per invention,” Dr. Bawa added. “Partly, this situation developed because the search tools and commercial databases used by patent examiners at the PTO, while well suited to searching patents on established technologies, were poorly suited to searching the early nanotechnology literature, which resided in scientific publications rather than U.S. patents. Also, U.S. patent examiners generally lacked expertise and training in the emerging field of nanotechnology.”

      The classic example of these limitations is the issuance of multiple U.S. patents on carbon nanotubes. In a study of carbon nanotube patents, Dr. Bawa and colleagues analyzed the claims of approximately 200 existing patents. “We discovered that many foundational patents on carbon nanotubes recite overlapping or ‘legally identical’ patent claims,” he said. “A classic patent thicket exists today with respect to carbon nanotube patents.”

      One of the reasons major conflicts have not yet emerged in this area is that few products have been commercialized. “The hope is that such U.S. patents will expire prior to widespread commercialization, so that there is little or no need for litigation. However, some have proposed that the government employ provisions under the Bayh-Dole Act of 1980 by imposing compulsory licensing, while others have even urged the creation of an open-source-type process to rectify the erroneous issuance of some of these basic, foundational U.S. nanopatents, so that downstream development of nanotechnologies is not stifled,” Dr. Bawa said.

Stick-on screens offer new horizons

The researchers demonstrate the plastic screen they say could turn any window into a movie screen

Scientists have created a transparent screen that can turn any window into a display for moving images.

The screens, which are made by mixing nanoparticles that reflect blue light into a liquid polymer, can be stuck on to any window.

The nanoparticles are invisible to humans, creating the transparency, but images projected in blue light show up.

The team asserts in the journal Nature Communications that it is simpler to make than existing similar screens.

Stick-on screens could be used to host advertisements on shop windows or for office presentations.

Pour it on

Transparent screens themselves are not new; from Google Glass to Minority Report-style computer screens, many are now in commercial development.

Simple diagram of how the transparent display works (c) MIT

“You could have your presentation shown on the window of your office” – Chia Wei Hsu, MIT

But while these employ complicated, expensive technology – a head-up display projected onto a tiny prism embedded in Google Glass, or LEDs embedded within transparent computer screens – this new method is very simple.

“We literally just pour the nanoparticles into the polymer before it solidifies,” explained lead researcher Chia Wei Hsu, a graduate student at MIT and Harvard University in the US.

Only a few thousandths of a gram of nanoparticles are needed per square centimetre of screen, meaning the technology is also relatively cheap.
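The article does not give an exact figure, but the quoted order of magnitude implies a surprisingly small amount of material even for a large window. A rough back-of-the-envelope sketch, assuming a hypothetical loading of 0.003 g per square centimetre:

```python
# Rough estimate of nanoparticle mass for a stick-on screen.
# The loading value is a hypothetical stand-in for the article's
# "a few thousandths of a gram per square centimetre".

LOADING_G_PER_CM2 = 0.003  # assumed loading, g/cm^2

def particle_mass_grams(width_m: float, height_m: float) -> float:
    """Mass of nanoparticles needed to coat a screen of the given size."""
    area_cm2 = (width_m * 100) * (height_m * 100)  # m^2 -> cm^2
    return area_cm2 * LOADING_G_PER_CM2

# A 2 m x 1.5 m shop window would need on the order of 90 g of particles:
print(round(particle_mass_grams(2.0, 1.5), 1))
```

At those quantities, the nanoparticles contribute very little to the cost of the polymer sheet, which is the point the researchers are making.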

‘So simple’

Because each type of particle reflects only one colour, the screens in this latest development display just one colour of image.

The researchers say it would be possible to have multiple colours, but the more nanoparticles that are added to the screen, the more opaque it would become.

“Since it’s so simple to deploy, you could paste this [plastic] sheet onto any surface,” Mr Hsu told BBC News.

“You could put it on store front windows for advertisements or information about what’s inside the store, or we could use it for office windows, so you could then have your presentation [shown] on the window of your office.”

How to get through the night shift

Woman at desk asleep at night

We already knew that night shifts were bad for us. Now it turns out that they’re really bad for us. Ben Milne – a veteran of the overnight – is not surprised.

Night shifts hurt us at a molecular level, according to the latest research. But unless someone comes up with a way to make illness, fires and accidents (let alone such inessentials as air travel and road repairs) happen only between 09:00 and 17:00, somebody is always going to have to work them.

So what’s the best way to get through the wee hours? Firstly, switch your brain down to the essential functions. This is not just to preserve energy. It also blinds you to the sheer madness of being awake and sober at 02:00. In the words of REM’s Daysleeper – the best song ever written on the subject – “I am the screen, I work at night.”

Secondly, nutrition: Eat the right food – lots of fresh fruit and plenty of water. Yeah, right.

This is the reality – it’s 03:30 and the only source of sustenance is a well-kicked vending machine, whose contents have mysteriously started to look quite attractive. Chilli-burger flavour corn snacks washed down with a jumbo-sized can of fizzy tropical fruit drink? Don’t mind if I do.

Do you sleep during your breaks or push on through? In the labyrinth of the now-closed Television Centre where I worked the vast majority of my nights, I had a secret list of unoccupied offices complete with comfortable sofas and low light where I could catch half an hour’s zeds during the equivalent of my lunch hour. The downside was waking up after 20 minutes’ unsatisfactory dozing, dry-mouthed and hair aloft from the static of a synthetic sofa, with the realisation that the shift was far from over.

And then you’re home at 10:00 and mysteriously wide awake. Or there are building workers next door very inconsiderately working in daylight hours. Trying to stay asleep in the day is like holding a ball underwater – there’s a part of your consciousness which is always trying to surface. And then you’re back at work again, even more tired than the night before.

So the short answer is, don’t do it. But then you realise that all those extra night payments might cover this year’s holiday. Or they’ve just changed your terms and conditions and you’re stuffed anyway.

Sweet dreams.

Now, send your name to an asteroid on Nasa’s chip

So what if you can’t go to space? You can send your name there!
US space agency Nasa is inviting people around the world to submit their names to be etched on a microchip aboard a spacecraft headed to an asteroid in 2016. The microchip will travel to the asteroid, named Bennu, aboard the agency’s Origins-Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) spacecraft.

The robotic mission will spend more than two years at the 1,760-foot-wide asteroid. The spacecraft will collect a sample of Bennu’s surface and return it to Earth in a sample return capsule.

“We’re thrilled to be able to share the OSIRIS-REx adventure with people across the Earth, to Bennu and back,” said Dante Lauretta, principal investigator of the OSIRIS-REx mission at the University of Arizona. “It’s a great opportunity for people to get engaged with the mission and join us for launch,” he said.

Those wishing to participate in “Messages to Bennu!” should submit their names online by Sept 30.

Chinese scientists decode locust’s genome

Chinese scientists have decoded the genome sequence of the migratory locust – the largest animal genome sequenced so far – providing new ways of combating the destructive pest. The genome spans about 6.5 gigabases, the Chinese Academy of Sciences (CAS) announced. The research was led by Le Kang of the CAS Institute of Zoology.

The scientists assessed changes in gene families related to long-distance migration, feeding and other biological processes unique to the locust and identified genes that might serve as potential pesticide targets.

The findings indicate that the large genome size is likely to be because of transposable element proliferation combined with slow rates of loss for these elements, they reported in the journal Nature Communications.

During their research, scientists found significant expansion of gene families associated with energy consumption and detoxification, consistent with long-distance flight capacity and phytophagy, state-run Xinhua news agency reported.

Now, a headband to control your dreams

A new headband that measures brain waves and eye-movement activity to allow a sleeping person to take control of their dreams has been developed.

The headband works by determining when a person enters REM sleep — a stage of sleep characterized by rapid eye movement where dreams are more likely to occur — by measuring brain waves and eye movement activity. Then, it emits a series of lights that will not wake up the user, but instead allow them to realize they are dreaming.

Users can then enter a lucid dream state and control their dreams, ‘Fox News’ reported.
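The control loop the article describes – poll the sensors, decide whether the wearer is in REM sleep, then trigger gentle light cues – can be sketched as below. The signal names and thresholds are hypothetical illustrations; the article does not disclose the device’s actual detection algorithm, and real devices would use far more robust classifiers.

```python
# Minimal sketch of the headband's described behaviour:
# detect REM sleep from brain-wave and eye-movement measurements,
# then pulse cue lights dim enough not to wake the sleeper.
# All values and thresholds here are invented for illustration.

def is_rem(eeg_theta_power: float, eye_movement_rate: float) -> bool:
    """Crude REM detector: elevated theta-band EEG plus rapid eye movement.

    Both inputs are assumed to be normalised to the 0..1 range.
    """
    return eeg_theta_power > 0.6 and eye_movement_rate > 0.5

def control_step(eeg_theta_power: float, eye_movement_rate: float) -> str:
    """One polling step: emit light cues only during likely REM sleep."""
    if is_rem(eeg_theta_power, eye_movement_rate):
        return "pulse_lights"  # dim cues signalling "you are dreaming"
    return "idle"

print(control_step(0.8, 0.7))  # likely REM -> cue lights
print(control_step(0.2, 0.1))  # deep/quiet sleep -> do nothing
```

The key design point the article highlights is the one this sketch encodes: the cue is delivered only inside the REM window, where the sleeper can register it without waking.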