Nanogenerator Harvests Swipes To Power LCD Screens


There’s a whole lot of energy out there that’s just kind of hanging around. The brakes on cars and trains turn kinetic energy into heat, for example, which we now have systems for recapturing and recycling. But there are many more examples of wasted “ambient” energy that we don’t recapture. Even regular old walking around as bipedal animals is an inefficient process; the energy we expend in a single stride is greater than what a perfectly efficient gait would require.

Such is life, but nowadays we’re surrounded by devices that don’t require all that much power to operate. A couple of volts goes a long way. A newly developed nanogenerator, described this week in the journal Nano Energy, puts that into perspective, offering a means of converting the energy expended in a standard touchscreen swipe into sufficient power to light up a touchscreen.

The nanogenerator in question is what’s known as a biocompatible ferroelectret nanogenerator, or FENG—a paper-thin sheet of layered materials including silver, polyimide, and a charged polymer foam known as polypropylene ferroelectret. The layers of the FENG are loaded up with ions, which results in a construction that, when compressed, produces electrical energy.

The high-level picture is that the FENG winds up with really huge electric dipoles—pairs of charges of opposite sign separated by a small distance—on its different layers, which then shift in relation to each other as the material is deformed under pressure. This change produces a difference in electrical potential, which is what gives us useful electrical energy.

So, we hear about self-powered devices kind of a lot. What makes this one interesting is that it’s a new kind of device. That is, a FENG is not piezoelectric (electricity via squishing crystals) or triboelectric (electricity via certain kinds of friction).

The paper describes some advantages: “their simple fabrication allows for encapsulated low-cost devices. In view of the environment, health, and safety, the fabrication of encapsulated FENG avoids the use of harmful elements (e.g. lead) or toxic materials (e.g. carbon nanotubes), making it more attractive for biocompatible and perhaps even implantable applications.”

The device also has the neat property of becoming more powerful when folded. In a statement, lead investigator Nelson Sepulveda explains: “Each time you fold it you are increasing exponentially the amount of voltage you are creating. You can start with a large device, but when you fold it once, and again, and again, it’s now much smaller and has more energy. Now it may be small enough to put in a specially made heel of your shoe so it creates power each time your heel strikes the ground.”
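
If we take the quoted doubling at face value, the scaling is simple geometric growth (a back-of-the-envelope sketch of the claim, not a figure from the paper): each fold stacks the generating layers, so after n folds the output is roughly

\[ V_n \approx 2^{\,n}\, V_0 , \]

where V_0 is the voltage from the unfolded sheet. Four folds would give around sixteen times the voltage from a device with one-sixteenth the footprint.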

Sepulveda and co.’s current task is to develop technology that would allow the energy generated by said heel strike to be transmitted to devices like headsets.

NASA’s ‘impossible’ EM Drive works: German researcher confirms and it can take us to the moon in just 4 HOURS


Over the past year, there’s been a lot of excitement about the electromagnetic propulsion drive, also known as the EM Drive – a seemingly impossible engine that has defied almost everyone’s expectations by continuing to stand up to experimental study. The EM Drive is so thrilling because it promises thrust that could hypothetically blast us to Mars in only 70 days, without the need for heavy and costly rocket fuel. Instead, it’s propelled forward by microwaves bouncing back and forth inside a sealed-off chamber, and this is what makes the EM Drive so powerful, and at the same time so controversial.

As effective as this kind of propulsion may sound, it challenges one of the essential concepts of physics – the conservation of momentum, which states that for anything to be propelled forward, some kind of propellant must be pushed out in the opposite direction. For that reason, the drive was generally laughed at and overlooked when it was designed by British engineer Roger Shawyer in the early 2000s. But a few years later, a group of Chinese researchers decided to construct their own version, and to everyone’s amazement, it really worked. Then an American inventor built something similar and convinced NASA’s Eagleworks Laboratories, led by Harold ‘Sonny’ White, to give it a try – and they, too, reported that it actually works. Now Martin Tajmar, a well-known professor and chair of Space Systems at Dresden University of Technology in Germany, has built his own EM Drive, and has once again found that it produces thrust – although for reasons he can’t yet explain.

Tajmar presented his results at the 2015 American Institute of Aeronautics and Astronautics’ Propulsion and Energy Forum and Exposition in Florida on 27 July, and you can read his entire paper here. He has a long history of experimentally testing (and debunking) revolutionary propulsion systems, so his results are a big deal for those looking for outside confirmation of the EM Drive.

Most importantly, his system produced a comparable amount of thrust to what was initially predicted by Shawyer, which is more than a few thousand times greater than that of a typical photon rocket.
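
For scale, a photon rocket (a thruster that simply shines light out the back) has a thrust-to-power ratio fixed by the momentum of light; this is a standard textbook figure rather than a number from Tajmar’s paper:

\[ F = \frac{P}{c} \approx 3.3\ \mu\mathrm{N}\ \text{per kilowatt of beam power} , \]

so “a few thousand times greater” corresponds to thrusts on the order of millinewtons per kilowatt of injected microwave power.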

So where does all of this leave us with the EM Drive? While it’s fun to speculate about just how revolutionary it could be for humanity, what we really need now are results published in a peer-reviewed journal – which is something that Shawyer claims he is just a few months away from doing, as David Hambling reports for Wired.

So it might turn out that we need to modify some of our laws of physics in order to explain how the drive actually works. But if that opens up the possibility of human travel throughout the entire Solar System – and, more significantly, beyond – then it’s a sacrifice we’re certainly willing to make.

Science Confirms That People Absorb Energy From Others


A biological research team at Bielefeld University has made a groundbreaking discovery showing that plants can draw an alternative source of energy from other plants. This finding could also have a major impact on the future of bioenergy, eventually providing the evidence to show that people draw energy from others in much the same way.

Members of Professor Dr. Olaf Kruse’s biological research team have confirmed for the first time that a plant, the green alga Chlamydomonas reinhardtii, not only engages in photosynthesis, but also has an alternative source of energy: it can draw it from other plants. The research findings were released this week in Nature Communications, an online journal from the publishers of the renowned journal Nature.

Flowers need water and light to grow and people are no different. Our physical bodies are like sponges, soaking up the environment. “This is exactly why there are certain people who feel uncomfortable in specific group settings where there is a mix of energy and emotions,” said psychologist and energy healer Dr. Olivia Bader-Lee.

Plants engage in the photosynthesis of carbon dioxide, water, and light. In a series of experiments, Professor Dr. Olaf Kruse and his team cultivated the microscopically small green alga species Chlamydomonas reinhardtii and observed that when faced with a shortage of energy, these single-cell plants can draw energy from neighbouring vegetable cellulose instead. The alga secretes enzymes (so-called cellulose enzymes) that ‘digest’ the cellulose, breaking it down into smaller sugar components. These are then transported into the cells and transformed into a source of energy: the alga can continue to grow. ‘This is the first time that such a behaviour has been confirmed in a vegetable organism’, says Professor Kruse. ‘That algae can digest cellulose contradicts every previous textbook. To a certain extent, what we are seeing is plants eating plants’. Currently, the scientists are studying whether this mechanism can also be found in other types of alga. Preliminary findings indicate that this is the case.

“When energy studies become more advanced in the coming years, we will eventually see this translated to human beings as well,” stated Bader-Lee. “The human organism is very much like a plant, it draws needed energy to feed emotional states and this can essentially energize cells or cause increases in cortisol and catabolize cells depending on the emotional trigger.”

Bader-Lee suggests that the field of bioenergy is ever evolving and that studies on the plant and animal world will soon translate and demonstrate what energy metaphysicians have known all along — that humans can heal each other simply through energy transfer just as plants do. “Humans can absorb and heal through other humans, animals, and any part of nature. That’s why being around nature is often uplifting and energizing for so many people,” she concluded.

HERE ARE FIVE ENERGY TOOLS TO USE TO CLEAR YOUR SPACE AND PREVENT ENERGY DRAINS WHILE RELEASING PEOPLE’S ENERGY:

Stay centered and grounded. If you are centered within your spiritual self (instead of your analyzer or ego) you will sense right away when something has moved into your space. If you are fully grounded, you can easily release other people’s energy and emotions down your grounding cord with your intention.

Be in a state of non-resistance. What we resist sticks. If you feel uncomfortable around a certain person or in a group, don’t go into resistance as a way to protect yourself, as this will only keep foreign energy stuck in your space. Move into a state of non-resistance by imagining that your body is clear and translucent like clear glass or water. This way, if someone throws some invalidation at you, it will pass right through you.

Own your personal aura space. We each have an energetic aura surrounding our body. If we don’t own this personal space we are vulnerable to foreign energy entering it. Become aware of your aura boundaries (about an arm’s length away from your body all the way around, above and below) as a way to own your personal space.

Give yourself an energy cleanse. The color gold has a high vibration which is useful for clearing away foreign energy. Imagine a gold shower nozzle at the top of your aura (a few feet above your head) and turn it on, allowing clear gold energy to flow through your aura and body space and release down your grounding. You will immediately feel cleansed and refreshed.

Call back your energy. When we have our energy in our own space there is less room for others’ energy to enter. But as we focus on other people and projects we sometimes spread our energy around. Create an image of a clear gold sun several feet above your head and let it be a magnet, attracting all of your energy back into it (and purifying it in the gold energy). Then bring it down through the top of your aura and into your body space, releasing your energy back into your personal space.

Scientists just found the part of our brain that actually gets physics


Not all of your brain is confused by physics.

Many science students might dread the more complex aspects of physics, but far removed from the mathematical equations that define how our physical world behaves, we each have an inner intuitive sense for how things will bounce, wobble, or fall.

Now, researchers say they’ve identified the brain region responsible for making these instinctive, immediate calculations for the movement of physical objects, dubbing it the brain’s “physics engine”.

“We run physics simulations all the time to prepare us for when we need to act in the world,” says cognitive scientist Jason Fischer from Johns Hopkins University. “It is among the most important aspects of cognition for survival. But there has been almost no work done to identify and study the brain regions involved in this capability.”

Interestingly, while we might think of these kinds of physical calculations as largely visual in nature – for example, trying to predict where a basketball might bounce after it hits the rim or backboard – the cerebral region that handles the actual work isn’t in the brain’s vision centre, but in our action planning areas: the premotor cortex and supplementary motor area.

“Our findings suggest that physical intuition and action planning are intimately linked in the brain,” says Fischer. “We believe this might be because infants learn physics models of the world as they hone their motor skills, handling objects to learn how they behave. Also, to reach out and grab something in the right place with the right amount of force, we need real-time physical understanding.”

To identify the region in our brain that makes physics-based calculations, Fischer and a team of researchers from MIT had 12 participants look at a video of Jenga-style blocks assembled into a tower.

While scanning their brain activity via functional magnetic resonance imaging (fMRI), the researchers asked the participants to predict where they thought the blocks might land if the tower were to collapse.

The fMRI results showed that the premotor cortex and the supplementary motor area were the most responsive areas – whereas a simple visual test, in which the participants only had to identify whether the static tower contained more blue or yellow blocks, didn’t stimulate activity in their physics engine.

Regardless of whether you guess the correct responses based on each tower (there are three separate collapses in the video), the parts of the brain you’re using when you try to figure it out are the premotor cortex and the supplementary motor area, according to the researchers.

The team found evidence of the same brain activity in two separate physics-based experiments. In one, the participants watched a video of two dots bouncing, and had to try to predict where the dots would bounce next.

In a final test, the participants simply watched short movie clips featuring lots of physics content while having their brain activity monitored.

Even without being asked to respond to the clips in any way – they only had to watch the video – the results showed that the brain’s physics engine was stimulated: the more physical content a clip contained, the more the premotor cortex and supplementary motor area were activated.

“The brain activity reflected the amount of physical content in a movie, even if people weren’t consciously paying attention to it,” says Fischer. “This suggests that we are making physical inferences all the time, even when we’re not even thinking about it.”

It’s worth pointing out that this was a very small study, with only 12 participants taking part, so the findings will need to be replicated in larger studies before any firm conclusions can be drawn.

But if the findings check out, the researchers say it could help lead to better designs for robots, built with physics engines that resemble our own. The research could also help us understand more about motor disorders like apraxia, where people have difficulty planning and carrying out physical movements.

It’s not the first time researchers have found evidence of physics-based thinking in the brain. Earlier in the year, a team of Japanese scientists discovered that even cats have a very basic grasp of physics, based on experiments involving containers that either rattled or were silent when shaken – which gave the animals a clue as to whether something was inside them (or not).

We’re finding out more and more about the human brain all the time – with scientists identifying almost 100 brand-new brain regions just last month.

It’s an exciting time for neuroscience, and we can’t wait to find out just what else our heads have in store for us.

Watch the video: https://youtu.be/1vwa8-wUJI0

‘Optical fibre’ made out of thin air.


Scientists say they have turned thin air into an ‘optical fibre’ that can transmit and amplify light signals without the need for any cables.

In a proof-of-principle experiment they created an “air waveguide” that could one day be used as an instantaneous optical fibre to any point on earth, or even into space.

The findings, reported in the journal Optica, have applications in long range laser communications, high-resolution topographic mapping, air pollution and climate change research, and could also be used by the military to make laser weapons.

“People have been thinking about making air waveguides for a while, but this is the first time it’s been realised,” says Professor Howard Milchberg of the University of Maryland, who led the research, which was funded by the US military and National Science Foundation.

Lasers lose intensity and focus with increasing distance as photons naturally spread apart and interact with atoms and molecules in the air.

Fibre optics solves this problem by beaming the light through glass cores with a high refractive index, which is good for transmitting light.

The core is surrounded by material with a lower refractive index that reflects light back into the core, preventing the beam from losing focus or intensity.
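
In textbook terms (the standard guiding condition, not anything specific to the new work), light stays trapped in the core by total internal reflection whenever it meets the core-cladding boundary at an angle from the normal larger than the critical angle

\[ \theta_c = \arcsin\!\left( \frac{n_{\mathrm{cladding}}}{n_{\mathrm{core}}} \right) , \]

so even a modest refractive-index step between core and cladding is enough to keep the beam guided.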

Fibre optics, however, are limited in the amount of power they can carry and by the need for a physical structure to support them.

Light and air

Milchberg and colleagues made the equivalent of an optical fibre out of thin air by generating a laser with its light split into a ring of multiple beams forming a pipe.

They used very short and powerful pulses from the laser to heat the air molecules along the beam extremely quickly.

Such rapid heating produced sound waves that took about a microsecond to converge to the centre of the pipe, creating a high-density area surrounded by a low-density area left behind in the wake of the laser beams.

“A microsecond is a long time compared to how far light propagates, so the light is gone and a microsecond later those sound waves collide in the centre, enhancing the air density there,” says Milchberg.
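
A rough scale estimate (ours, not the researchers’) makes the point. In one microsecond light travels

\[ c\,t \approx (3\times10^{8}\ \mathrm{m/s}) \times (10^{-6}\ \mathrm{s}) = 300\ \mathrm{m} , \]

while sound in air, at roughly 340 m/s, covers only about 0.3 mm in the same interval, so the acoustic waves are still converging on the axis of the “pipe” long after the heating pulses have passed.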

The lower density region of air surrounding the centre of the air waveguide had a lower refractive index, keeping the light focused.

“Any structure [even air] which has a higher density will have a higher index of refraction and thereby act like an optical fibre,” says Milchberg.

Amplified signal

Once Milchberg and colleagues created their air waveguide, they used a second laser to spark the air at one end of the waveguide turning it into plasma.

An optical signal from the spark was transmitted along the air waveguide, over a distance of a metre to a detector at the other end.

The signal collected by the detector was strong enough to allow Milchberg and colleagues to analyse the chemical composition of the air that produced the spark.

The researchers found the signal was 50 per cent stronger than a signal obtained without an air waveguide.

The findings show the air waveguide can be used as a “remote collection optic,” says Milchberg.

“This is an optical fibre cable that you can reel out at the speed of light and place next to [something] that you want to measure remotely, and have the signal come all the way back to where you are.”

Australian expert Professor Ben Eggleton of the University of Sydney says this is potentially an important advance for the field of optics.

“It’s sort of like you have an optical fibre that you can shine into the sky, connecting your laser to the top of the atmosphere,” says Eggleton.

“You don’t need big lenses and optics, it’s already guided along this channel in the atmosphere.”

 

Quantum gas goes below absolute zero.


It may sound less likely than hell freezing over, but physicists have created an atomic gas with a sub-absolute-zero temperature for the first time [1]. Their technique opens the door to generating negative-Kelvin materials and new quantum devices, and it could even help to solve a cosmological mystery.

Lord Kelvin defined the absolute temperature scale in the mid-1800s in such a way that nothing could be colder than absolute zero. Physicists later realized that the absolute temperature of a gas is related to the average energy of its particles. Absolute zero corresponds to the theoretical state in which particles have no energy at all, and higher temperatures correspond to higher average energies.

However, by the 1950s, physicists working with more exotic systems began to realise that this isn’t always true: Technically, you read off the temperature of a system from a graph that plots the probabilities of its particles being found with certain energies. Normally, most particles have average or near-average energies, with only a few particles zipping around at higher energies. In theory, if the situation is reversed, with more particles having higher, rather than lower, energies, the plot would flip over and the sign of the temperature would change from a positive to a negative absolute temperature, explains Ulrich Schneider, a physicist at the Ludwig Maximilian University in Munich, Germany.
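
The underlying statistics can be written in one line; this is the standard Boltzmann relation rather than a formula from the new paper. The probability of finding a particle with energy E scales as

\[ P(E) \propto e^{-E/(k_B T)} , \]

so for positive T the occupation falls off as energy rises, while a distribution in which the high-energy states are the most heavily occupied can only be described by assigning T a negative value.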

Schneider and his colleagues reached such sub-absolute-zero temperatures with an ultracold quantum gas made up of potassium atoms. Using lasers and magnetic fields, they kept the individual atoms in a lattice arrangement. At positive temperatures, the atoms repel, making the configuration stable. The team then quickly adjusted the magnetic fields, causing the atoms to attract rather than repel each other. “This suddenly shifts the atoms from their most stable, lowest-energy state to the highest possible energy state, before they can react,” says Schneider. “It’s like walking through a valley, then instantly finding yourself on the mountain peak.”

At positive temperatures, such a reversal would be unstable and the atoms would collapse inwards. But the team also adjusted the trapping laser field to make it more energetically favourable for the atoms to stick in their positions. This result, described today in Science [1], marks the gas’s transition from just above absolute zero to a few billionths of a Kelvin below absolute zero.

Wolfgang Ketterle, a physicist and Nobel laureate at the Massachusetts Institute of Technology in Cambridge, who has previously demonstrated negative absolute temperatures in a magnetic system [2], calls the latest work an “experimental tour de force”. Exotic high-energy states that are hard to generate in the laboratory at positive temperatures become stable at negative absolute temperatures — “as though you can stand a pyramid on its head and not worry about it toppling over,” he notes — and so such techniques can allow these states to be studied in detail. “This may be a way to create new forms of matter in the laboratory,” Ketterle adds.

If built, such systems would behave in strange ways, says Achim Rosch, a theoretical physicist at the University of Cologne in Germany, who proposed the technique used by Schneider and his team [3]. For instance, Rosch and his colleagues have calculated that whereas clouds of atoms would normally be pulled downwards by gravity, if part of the cloud is at a negative absolute temperature, some atoms will move upwards, apparently defying gravity [4].

Another peculiarity of the sub-absolute-zero gas is that it mimics ‘dark energy’, the mysterious force that pushes the Universe to expand at an ever-faster rate against the inward pull of gravity. Schneider notes that the attractive atoms in the gas produced by the team also want to collapse inwards, but do not because the negative absolute temperature stabilises them. “It’s interesting that this weird feature pops up in the Universe and also in the lab,” he says. “This may be something that cosmologists should look at more closely.”

Single photon detected but not destroyed.


First instrument built that can witness the passage of a light particle without absorbing it.

Physicists have seen a single particle of light and then let it go on its way. The feat was possible thanks to a new technique that, for the first time, detects optical photons without destroying them. The technology could eventually offer perfect detection of photons, providing a boost to quantum communication and even biological imaging.

Plenty of commercially available instruments can identify individual light particles, but these instruments absorb the photons and use the energy to produce an audible click or some other signal of detection.

Quantum physicist Stephan Ritter and his colleagues at the Max Planck Institute of Quantum Optics in Garching, Germany, wanted to follow up on a 2004 proposal of a nondestructive method for detecting photons. Instead of capturing photons, this instrument would sense their presence, taking advantage of the eccentric realm of quantum mechanics in which particles can exist in multiple states and roam in multiple places simultaneously.

Ritter and his team started with a pair of highly reflective mirrors separated by a half-millimeter-wide cavity. Then they placed a single atom of rubidium in the cavity to function as a security guard. They chose rubidium because it can take on two distinct identities, which are determined by the arrangement of its electrons. In one state, it’s a 100 percent effective sentry, preventing photons from entering the cavity. In the other, it’s a totally useless lookout, allowing photons to enter the cavity. When photons get in, they bounce back and forth about 20,000 times before exiting.

The trick was manipulating the rubidium so that it was in a so-called quantum superposition of these two states, allowing one atom to be an overachiever and a slacker at the same time. Consequently, each incoming photon took multiple paths simultaneously, both slipping into the cavity undetected and being stopped at the door and reflected away. Each time the attentive state of the rubidium turned away a photon, a measurable property of the atom called its phase changed. If the phases of the two states of the rubidium atom differed, the researchers knew that the atom had encountered a photon.

To confirm their results, the researchers placed a conventional detector outside the apparatus to capture photons after their rubidium rendezvous, the team reports November 14 in Science.

“It’s a very cool experiment,” says Alan Migdall, who leads the quantum optics group at the National Institute of Standards and Technology in Gaithersburg, Md. But he warns that identifying photons without destroying them does not mean that the outgoing photon is the same as it was prior to detection. “You’ve pulled some information out of it, so you do wind up affecting it,” he says. Ritter says he expects the photons’ properties are largely unchanged, but he acknowledges that his team needs to perform more measurements to confirm that hypothesis.

Ritter notes that no photon detector is perfect, and his team’s is no exception: It failed to detect a quarter of incoming photons, and it absorbed a third of them. But he says the power of the technique is that, for many applications of single-photon detectors, each detector wouldn’t have to be perfect. Ritter envisions a nested arrangement of improved detectors that, as long as they did not absorb photons, would almost guarantee that every photon is counted. Ultimately, that could benefit fields such as medicine and molecular biology, in which scientists require precise imaging of objects in low-light environments.
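
A toy estimate illustrates why nesting helps (our illustration, taking the reported 75 per cent single-stage detection at face value and ignoring absorption). If each stage independently spots a passing photon with probability 0.75, the chance that N stages in a row all miss it is

\[ P_{\mathrm{miss}} = (1 - 0.75)^{N} , \qquad P_{\mathrm{miss}}(N=4) \approx 0.4\% , \]

so even a short chain of absorption-free detectors would come close to registering every photon, which is why cutting the one-third absorption rate matters more than perfecting any single stage.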

Perfect Imaging, From Theory to Reality via Simulations.


Perfect imaging refers to the idea of producing images with details below the diffraction limit, where even the smallest elements can be resolved to unlimited sharpness regardless of the wavelength of light being used. While it was just a theory 150 years ago, research has brought it closer to reality over the years. Now, by way of simulation, researchers at Cedint Polytechnic University of Madrid in Spain are taking it one step further.

Maxwell Fish-Eye Lens and the Diffraction Limit

Imaging systems have long been the subject of study for famous physicists like Maxwell, who proposed a fish-eye lens that uses a gradient index lens between a pair of points in space. The pair is defined by two opposite points lying on the spherical surface. Such a lens was supposed to be a “perfect imaging” system or, in other words, a system capable of focusing (imaging) the smallest detail from one point of its surface to another. Maxwell’s proposal was considered impossible to implement with an ordinary material with a positive index of refraction due to the diffraction limit. In practice, this means that in processes like photolithography, the size of features of an electronic device cannot be smaller than the wavelength of the light being used.
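
For reference, the textbook (Abbe) form of the diffraction limit says a conventional imaging system cannot resolve features much smaller than

\[ d \approx \frac{\lambda}{2\,\mathrm{NA}} , \]

where \lambda is the wavelength of the light and NA is the numerical aperture of the optics; with NA close to 1 this works out to roughly half a wavelength, which is the barrier that perfect-imaging schemes aim to beat.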

Below the Diffraction Limit with Ordinary Materials

In 2004 it was proven that an artificial material with a negative refractive index (also known as a metamaterial) could be used to overcome the diffraction limit. Later, in 2009, a breakthrough theory showed that an ordinary material could in fact be used to manufacture a Maxwell fish-eye lens. The latter approach intrigued professor Juan Carlos Miñano and his research team at Cedint Polytechnic University of Madrid. They decided to use simulation to prove the theory that the diffraction limit could be surpassed by designing a device with the equivalent optical properties of a Maxwell fish-eye lens, but with a different geometry: a spherical geodesic waveguide.

Spherical Geodesic Waveguide for Perfect Imaging

A spherical geodesic waveguide, which was designed by Miñano and his colleagues, is a very thin spherical metallic waveguide filled with a non-magnetic material (see figure below). At the moment, it’s still a proposed device that can be studied, optimized, and fabricated thanks to simulation. Miñano’s team couldn’t resort to geometrical optics, and therefore, to solve Maxwell’s equations with real-world accuracy, they decided to rely on COMSOL Multiphysics and the RF Module. The spherical geodesic waveguide model was designed and simulated using COMSOL by postdoctoral researcher Dejan Grabovickic from Miñano’s group.

The spherical geodesic waveguide with the drain port on top (left), where a cross section of the coaxial cable and its mesh are shown. The cross section (right) of the spherical geodesic waveguide including the drain port.

The spherical geodesic waveguide is designed for short-distance transmission and demonstrated super imaging properties: it can sense changes in the position of its receiver that are much smaller than the wavelength of the light being used. Super imaging could drastically reduce the size of integrated electronics, and as Dejan states in the IEEE Spectrum magazine insert, Multiphysics Simulation, it could allow for the production of integrated electronics that are “much smaller than what is the state of the art — something like 100 times smaller.”

7 Things They Should Teach You in School but Don’t.


“It is hard to convince a high school student that he will encounter a lot of problems more difficult than those of algebra and geometry.” ~ Edgar W. Howe

You know what I realized recently? That I don’t really remember much of the things I have learned in school. I don’t know about you, but when I think about the things that shaped and helped me improve myself and my life, the things that contributed to my growth and evolution, I realize that they didn’t come from school but from what I have learned after I finished my studies… I guess Dr. Seuss was right: “You can get help from teachers, but you are going to have to learn a lot by yourself, sitting alone in a room.”

What I will share with you today is a list of 7 things I think they should teach you in school but don’t. Enjoy.

1. The more positive your thoughts are, the happier your life gets

Most of us have no idea how powerful our thoughts are and how, because of our polluted and toxic way of thinking, we make our lives a lot harder and unhappier than they should be. Just look how beautifully Albert Einstein talks about the power of our thoughts: “The world as we have created it is a process of our thinking. It cannot be changed without changing our thinking.”

Our thoughts shape and make us who we are. If our thoughts are negative and destructive, our beliefs will mirror our thoughts and based on these beliefs we will craft our lives. If the mind is made pure, everything else will be made pure.

“If you can change your mind, you can change your life.” ~ William James

I wish they would’ve taught me these things in school. I wish they would’ve told me that once you change your thinking you are in fact changing your whole life and that the more positive your thoughts are, the happier your life will get.

2. People will rise or fall to the level of your expectations

When I was a little kid, I didn’t like my teachers very much simply because I felt like they were treating their students differently based on the way we all looked, the way we were dressed and based on who our parents were.

They were placing labels on little kids, treating all of us differently, condemning those who came from less favorable homes to stay stuck in that place while encouraging those who came from more fortunate backgrounds to become better and better at everything they did.

The way I see it, a teacher’s job and responsibility is to shape his or her students in a beautiful and powerful way, to raise them up and not to tear them down.

One of my favorite quotes of all time comes from Goethe, where he talks about how people will rise or fall to the level of your expectations: “If you treat an individual as he is, he will remain how he is. But if you treat him as if he were what he ought to be and could be, he will become what he ought to be and could be.”

3. The value of Self Love 

If you ask me, self love is the key to a happy and meaningful life. When you honor and love yourself for who you are and for who you are not, you will know how to honor and love the world around you as well.

Because you have so much love for yourself, you will extend that love outwards, on to your family and friends, your work, the environment and every living thing you come in contact with.

We project outwards that which we are inwards and if we learn to love and accept ourselves we will know how to love and accept the whole world.

“The most powerful relationship you will ever have is the relationship with yourself.” ~ Steve Maraboli

4. A musician must make music, an artist must paint and a poet must write

I studied art for 12 years but, because I was led to believe that you can’t really make a living from pursuing a career in this field, I was “advised” to go study Economics, because you see, that’s where the money is.

Being the “well behaved human” that I was at that time, and because I had no idea what I wanted to do with my life, I did what the “experts” suggested I do. I went to University and studied Economics. Biggest mistake I have ever made.

Being stuck in a place where I felt like I did not belong, studying something that made no sense to me whatsoever, led me to believe I was stupid. I remember looking at my books and not understanding anything that was in there. I just couldn’t understand why it was so hard for me to understand accounting, finance, banking and all the other subjects I was studying…

“Everybody is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid.” Einstein

It took me quite some time to understand that you can’t fit a square peg into a round hole and that if you are an artist, you have to create art, and if you are an accountant, you have to do what accountants do :)

I think Abraham Harold Maslow said it best: “A musician must make music, an artist must paint, a poet must write, if he is to be ultimately at peace with himself. What a man can be, he must be.”

5. The importance of being present and engaged in the now

I wish I would’ve learned in school how to be present and engaged in the present moment and how to appreciate life for what it is and also for what it is not. To be thankful for whatever the present moment has to offer and, if I say I want to be happy, to be happy here and now and not to wait for things to change and time to pass in order to feel those good feelings I say I want to feel.

Little children are very good at being present and engaged in the now but as they grow older and older, things change. Their minds start to wander, keeping them from being fully present in the now, keeping them from having as much fun as they used to when they were younger. They start to worry, to fear and to stress about the many things that might go wrong with them and the world around them… they stop enjoying life.

At the core, we all know how to be present, we all know how to be happy, how to enjoy life, but for different reasons we tend to forget, and that’s why I think school would be a great place for us to start remembering and start living.

“Unease, anxiety, tension, stress, worry — all forms of fear — are caused by too much future, and not enough presence. Guilt, regret, resentment, grievances, sadness, bitterness, and all forms of nonforgiveness are caused by too much past, and not enough presence.” ~ Eckhart Tolle

6. Be a first rate version of yourself and not a second rate version of someone else

Since we are all unique individuals with unique gifts and talents, I think it would be great if they would start encouraging students in schools to embrace their authenticity, to teach them that it’s okay to be who they really are without the fear of being judged, labeled or criticized. Teachers shouldn’t ask their students to be normal but rather to be themselves.

We are all different and we can’t all play by the same rules. Our differences should be celebrated, appreciated and encouraged in schools, not judged, ridiculed and labeled harshly…

We need more authentic people in this world and schools would be a great place to learn about its value and importance.

“Being yourself is one of the hardest things because it’s scary. You always wonder whether you’ll be accepted for who you really are. I decided to call my record ‘Inside Out’ because that’s my motto about life. I don’t think you ever succeed at trying to be anyone else but who you truly are.” ~ Emmy Rossum

7. Never get your sense of worth from outside yourself.

Why don’t they teach you in school not to get your sense of worth from outside yourself?  Why don’t they have a class called “Who you are is more than enough and you should never look outside yourself for approval and validation”?

When I was in school, comparison and competition were strongly encouraged by my teachers. As a result, we were all trying to get the best grades and we were all trying to be better than the other students. We were all perceiving ourselves as being more or less valuable based on our grades.

Because I wanted to be the best in my class and because I wanted my teachers to like me, I was studying for the wrong reason. I wasn’t studying because I enjoyed studying or because I was passionate about the subjects I was studying but because I was after their approval.

It took me a long time until I realized that self worth comes from who you are internally and not from how other people treat you externally.

Never get your sense of worth from outside yourself. Don’t let other people tell you how much you’re worth, decide for yourself. It’s called self worth not others worth.

“Life is too short to waste any amount of time on wondering what other people think about you. In the first place, if they had better things going on in their lives, they wouldn’t have the time to sit around and talk about you. What’s important to me is not others’ opinions of me, but what’s important to me is my opinion of myself.” ~ C. JoyBell C.

Even though nobody taught me these things in school, I am grateful for the many books I have read and the many great teachers that came my way that taught me all of these things and many others.

Materials Prediction Scores a Hit


Figure 1: The energetics of predicting materials. A schematic free energy landscape for different crystallographic configurations is given by the blue line. Note the small difference in energy between various structures compared with the total energy of a crystal, demanding high computational accuracy. The application of pressure as done by Gou et al. will modify the energy landscape (red curve), potentially stabilizing new structures. The ground state (such as superconductivity, magnetism, or other forms of order) for a given structure is determined at even lower energy scales, as depicted in the inset. The addition of strong electronic correlations in some materials will further modify the landscape over large energy scales up to 10 eV, making predictions even more challenging.

Had the great American philosopher Yogi Berra been a condensed matter physicist, he might have said “It’s difficult to make predictions, especially about superconductivity.” Predictions about a material’s structure and even more so its function have been goals of materials research for a long time, but the track record for predicting that a given compound will superconduct is notoriously bad [1]. Fortunately, advances in the fidelity and resolution of electronic structure calculations are beginning to change this trend [2]. In fact, the White House’s Materials Genome Initiative [3] represents a recognition that with recent advances in computational capability and materials models, such breakthroughs are not only possible but likely. In a paper in Physical Review Letters, Huiyang Gou at the University of Bayreuth, Germany, and colleagues [4] describe a success story in the search for predictability. They report the observation of superconductivity in iron tetraboride (FeB4) at approximately 3 kelvin (K). Not only did they find superconductivity where electronic structure calculations told them to look, they used high-pressure synthesis techniques to discover a compound that wasn’t readily apparent in nature. Further, the resulting compound, orthorhombic FeB4, turns out to be very mechanically hard as well as superconducting, thus possessing two desirable traits.

Most attempts to predict superconductivity invoke the physicist Bernd Matthias [5]. In the 1950s–1970s Matthias articulated a number of empirical “rules” that anticipated a large number of superconducting materials based on crystal structure and the number of valence electrons per atom. However, these rules were clearly based on intuition and not predictive theory. The experimental discovery that cuprates, magnesium diboride (MgB2), and more recently, iron pnictides all superconduct drove home the reality that serendipity was still the best materials discovery engine. However, that reality is beginning to evolve.

Why is it so difficult to predict new superconducting materials? One issue is the difficulty predicting the structural stability of a compound, that is, whether the binding energy between atoms is large enough to keep them stuck together in a particular configuration. Electronic structure calculations provide the total energy for a crystal, which is on the order of 10^5 electron volts per atom (eV/atom) (see Fig. 1). However, the stability with respect to competing phases is typically as small as 10^-2 eV/atom, thus demanding incredibly high accuracy of the calculations. Furthermore, calculations are typically performed at T = 0 K in ideal crystals, while the thermal energy at which the crystals are synthesized and the energy scale created by defects can easily shift the relative total energies of competing phases by similar amounts. Another factor is that superconductivity is a very low-energy instability of the electronic structure. For a superconductor with a transition temperature Tc of 3 K, as discovered by Gou et al., this amounts to an energy scale of 10^-4 eV. Few predictive models (yet) have accuracy at the parts per billion level covered by these energy scales.
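
That last number is simply the thermal energy associated with the transition temperature; as a quick sanity check on the quoted scale, using the Boltzmann constant:

\[ E \sim k_B T_c \approx (8.6\times10^{-5}\ \mathrm{eV/K}) \times (3\ \mathrm{K}) \approx 2.6\times10^{-4}\ \mathrm{eV} , \]

roughly nine orders of magnitude below the total energy per atom that the calculations start from, which is where the parts-per-billion accuracy requirement comes from.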

Adding to the increased effectiveness of advanced electronic structure calculations for predictions is the relative accessibility and availability of high-pressure techniques. Recent discoveries demonstrate that surprises still exist at high pressure [6]. We now know that a dozen or so additional elements superconduct at elevated pressure even though they are normal materials under ambient conditions, including calcium at 220 gigapascals (with Tc = 29 K, the highest Tc for an elemental superconductor). More broadly, materials science has been transformed by our ability to apply sufficient pressure to tune structural energetics on this scale to make new states of matter available.

In 2010, Kolmogorov, a coauthor of the present study, and colleagues predicted additional phases in the iron–boron (Fe-B) binary phase diagram that had yet to be observed [7]. They used a high-throughput search method coupled to an evolutionary algorithm to identify new structures for which superconductivity was theoretically evaluated. Subsequently, Bialon et al. suggested that the stability of iron tetraboride (FeB4) would be enhanced under pressure, and predicted the material’s hardness [8]. In the present paper, Gou et al. confirmed that FeB4 can be synthesized under pressure and, furthermore, that it possesses the two novel predicted properties: superconductivity and high incompressibility. In addition, though not definitive, Gou et al. obtained preliminary data suggesting that the superconductivity is phonon mediated, as in other conventional superconductors.

While the paper by Gou et al. gives promise that theory may finally be able to guide experimentalists where to look for conventional superconductors, it’s important to remember that the predicted Tc was 5 times too large in a structure that couldn’t be synthesized at ambient pressure. Further, the situation remains much more challenging for unconventional superconductors such as the cuprates, pnictides, heavy fermion materials, and organics. The biggest issue is that strong electronic correlations alter the electronic structure in these materials over an energy scale of order 1–10 eV. While modern electronic structure calculations such as dynamical mean-field theory are making progress in understanding these effects, we currently lack the ability to reliably identify an additional superconducting instability on this strongly correlated background. How these electronic correlations modify the ability to compute structural stability of compounds also remains an open question. Given that superconductivity emerges in strongly correlated systems in ways we least expect it [9], future searches would be aided by guidance on where to find such correlations and competing electronic instabilities.

Gou et al. provide an encouraging step in the quest for materials by design, but one can also hope that this is a harbinger of even more and better things to come. Leveraging advanced computational capabilities and associated materials algorithms, together with synthetic techniques that allow broader access to phase space, including metastable materials, holds the exciting potential of delivering on the vision of the Materials Genome Initiative. We look forward to this, bearing in mind the quote attributed to Yogi Berra: “It’s difficult to make predictions, especially about the future.”

Acknowledgment

Our work in this area has been supported by the Department of Energy’s Office of Basic Energy Sciences Division of Materials Science and Engineering.

References

  1. I. I. Mazin, “Superconductivity Gets an Iron Boost,” Nature 464, 183 (2010).
  2. R. Akashi and R. Arita, “Development of Density-Functional Theory for a Plasmon-Assisted Superconducting State: Application to Lithium Under High Pressures,” Phys. Rev. Lett. 111, 057006 (2013).
  3. Materials Genome Initiative for Global Competitiveness, http://www.whitehouse.gov/blog/2011/06/24/materials-genome-initiative-renaissance-american-manufacturing.
  4. H. Gou et al., “Discovery of a Superhard Iron Tetraboride Superconductor,” Phys. Rev. Lett. 111, 157002 (2013).
  5. B. T. Matthias, T. H. Geballe, and V. B. Compton, “Superconductivity,” Rev. Mod. Phys. 35, 1 (1963).
  6. M. Sakata, Y. Nakamoto, K. Shimizu, T. Matsuoka, and Y. Ohishi, “Superconducting state of Ca-VII below a critical temperature of 29 K at a pressure of 216 GPa,” Phys. Rev. B 83, 220512(R) (2011).
  7. A. N. Kolmogorov, S. Shah, E. R. Margine, A. F. Bialon, T. Hammerschmidt, and R. Drautz, “New Superconducting and Semiconducting Fe-B Compounds Predicted with an Ab Initio Evolutionary Search,” Phys. Rev. Lett. 105, 217003 (2010).
  8. A. F. Bialon, T. Hammerschmidt, R. Drautz, S. Shah, E. R. Margine, and A. N. Kolmogorov, “Possible Routes for Synthesis of New Boron-Rich Fe–B and Fe1-xCrxB4 Compounds,” Appl. Phys. Lett. 98, 081901 (2011).
  9. Z. Fisk, H. R. Ott, and J. D. Thompson, “Superconducting materials: What the record tells us,” Philos. Mag. 89, 2111 (2009).