We can all agree that without science and technology the world would be a dramatically different place. As we move forward, it is science and technology that will supply the resources we need to make effective, long-term change that benefits everyone.

Sustainable Science & Technology explores the trends and cutting-edge discoveries that will help move us forward, from laboratories around the world to the young minds of the students, teachers, and instructors who are guiding the way.

Here we'll share inspirational stories, ideas, and resources that revolve around how science and technology are helping our environment and making life more sustainable around the world.


Wind Energy Testing Site Harnessed for Wave Power



In the first-ever trial of its kind in the U.S., the huge dynamometers housed at the National Wind Technology Center (NWTC) in Boulder, Colorado, will be used to help bring a breakthrough wave energy system to fruition.


The facility is usually reserved for testing the latest onshore and offshore wind turbine technology, but engineers from the National Renewable Energy Laboratory (NREL), which operates the NWTC on behalf of the U.S. government, will now work with Columbia Power Technologies to develop its StingRAY system.


“Though designed to benefit the wind industry, the NWTC’s large dynamometer facility is being leveraged to help advance new ocean energy technology,” said NWTC Director Daniel Laird.

“While still in the early stages of development, ocean energy is progressing rapidly. Over the coming decades, ocean energy could become a major source of electricity powering high-population-density areas near the coasts.”


Columbia Power’s StingRAY wave power system harnesses ocean energy using forward and aft floats coupled to low-torque, high-speed generators located in the device's nacelle. As each float rotates in response to passing waves and ocean swell, electricity is produced and sent via offshore substations to the mainland grid.


The system can be arrayed to build offshore wave farms, much like wind turbines. Columbia Power states the StingRAY is designed for deployment at least 3-4 km from shore in water depths of more than 60 metres, away from sensitive coastal habitats.


With each generator the height of a two-storey building, the StingRAY is the largest device ever tested at the NWTC’s dynamometer facility, which is capable of mimicking the back-and-forth movement of the sea to ensure the system can handle actual ocean conditions, including rogue wave strikes.


“We have one of the only facilities in the country with a dynamometer that can apply rotational torque at the speeds and forces required while also applying non-torque loads, which are side forces that simulate the action of a rogue wave hitting a wave energy converter in the ocean,” said Mark McDade, NWTC project manager.


“This matters because the structures of these energy conversion devices must be designed to handle the side forces without damage. The work is pioneering in the field of ocean energy conversion.”





2016 Warmest Year on Record Globally



Earth's 2016 surface temperatures were the warmest since modern recordkeeping began in 1880, according to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA).


Globally-averaged temperatures in 2016 were 1.78 degrees Fahrenheit (0.99 degrees Celsius) warmer than the mid-20th century mean. This makes 2016 the third year in a row to set a new record for global average surface temperatures.


The 2016 temperatures continue a long-term warming trend, according to analyses by scientists at NASA's Goddard Institute for Space Studies (GISS) in New York. NOAA scientists concur with the finding that 2016 was the warmest year on record based on separate, independent analyses of the data.


Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. However, even taking this into account, NASA estimates 2016 was the warmest year with greater than 95 percent certainty.


"2016 is remarkably the third record year in a row in this series," said GISS Director Gavin Schmidt. "We don't expect record years every year, but the ongoing long-term warming trend is clear."


The planet's average surface temperature has risen about 2.0 degrees Fahrenheit (1.1 degrees Celsius) since the late 19th century, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere.


Most of the warming occurred in the past 35 years, with 16 of the 17 warmest years on record occurring since 2001. Not only was 2016 the warmest year on record, but eight of the 12 months that make up the year – from January through September, with the exception of June – were the warmest on record for those respective months. October, November, and December of 2016 were the second warmest of those months on record – in all three cases, behind records set in 2015.


Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Researchers estimate the direct impact of the natural El Niño warming in the tropical Pacific increased the annual global temperature anomaly for 2016 by 0.2 degrees Fahrenheit (0.12 degrees Celsius).  


Weather dynamics often affect regional temperatures, so not every region on Earth experienced record average temperatures last year. For example, both NASA and NOAA found the 2016 annual mean temperature for the contiguous 48 United States was the second warmest on record. In contrast, the Arctic experienced its warmest year ever, consistent with record low sea ice found in that region for most of the year.


NASA's analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations. These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. The result of these calculations is an estimate of the global average temperature difference from a baseline period of 1951 to 1980.
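
To make the idea of a baseline anomaly concrete, here is a minimal sketch of the calculation described above, using invented station data. It is not NASA's GISTEMP algorithm, which additionally weights for uneven station spacing and corrects for urban heating, but it shows how each year is expressed as a departure from the 1951-1980 mean.

```python
import numpy as np

# Invented annual-mean temperatures (deg C) for a handful of stations.
# The real analysis uses ~6,300 stations plus ship, buoy and Antarctic data.
years = np.arange(1950, 2017)
rng = np.random.default_rng(0)
station_temps = (14.0 + 0.008 * (years - 1950)[None, :]
                 + rng.normal(0.0, 0.1, (5, years.size)))

# Simple per-station weights (placeholders standing in for area weighting).
weights = np.array([0.3, 0.25, 0.2, 0.15, 0.1])
global_mean = weights @ station_temps

# Baseline: the 1951-1980 mean, as in the NASA analysis described above.
baseline = global_mean[(years >= 1951) & (years <= 1980)].mean()

# Anomaly = departure of each year's global mean from that baseline.
anomaly = global_mean - baseline
print(f"2016 anomaly (synthetic data): {anomaly[years == 2016][0]:+.2f} deg C")
```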


NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures.


GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.


NASA monitors Earth's vital signs from land, air and space with a fleet of satellites, as well as airborne and ground-based observation campaigns. The agency develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.


The full 2016 surface temperature data set and the complete methodology used to make the temperature calculation are available at:


http://data.giss.nasa.gov/gistemp




New Delhi chokes on pollution as scientists urge action


Photo Credit: Wonderlane


Scientists and air pollution experts are calling for Indian government officials to back a ten-point plan to improve air quality following a severe episode at the beginning of November when New Delhi was blanketed under an unprecedented cloud of choking smog.


On Sunday 6 November, the Indian government declared an emergency as toxic pollutants reached severe levels. This week, fine particulate levels continued to be well above Indian government standards, with air quality rated "poor" or "very poor".


The new report, 'Breathing Cleaner Air', by a task force of eminent Indian and international experts, outlines solutions that can significantly reduce air pollution, including critical near-term measures such as:


Prevent agricultural burning by turning crop residue into fuel for electricity generation;


Provide cleaner fuels (LPG, electricity) and, for those who cannot afford LPG, biomass stoves with an efficiency of 50% or more and a forced-draft fan;

Switch to low-sulphur fuel, implement Euro VI-equivalent standards for engine emissions, and shift freight transport from road to lower-emission rail and shipping;


Introduce a new street-cleaning program to limit emissions from road dust.

The government meanwhile announced that it was already moving to implement another overarching report recommendation for the creation of a new National Clean Air Mission to coordinate across local, state and national jurisdictions to prevent life-threatening pollution in Indian cities.


Dr Ajay Mathur, Director General of The Energy and Resources Institute (TERI), said that Delhi air pollution emergencies were becoming an annual event and that coordinated action needed to be taken to prevent them occurring in the future.


"These solutions [in the report] are both technically feasible and cost effective and the most encouraging part is that the experts and expertise exists in India," he said. "We have seen what is possible in places like California, which once had some of the most polluted cities in the world, and we know that air pollution cannot and should not be part of India's future.


"The time has come to clean the air and achieve standards that can protect people's health, agriculture, and overall quality of life."


In recent winters, Indian cities have experienced increasingly dangerous levels of air pollution. The start of this winter saw an exceptional situation, with severe levels recorded on a daily basis. In the first week of November, recorded daily average levels were more than 20 times the equivalent WHO guideline value for PM2.5, with peak concentrations of over 1,000 micrograms per cubic metre at some locations. Schools were subsequently closed, construction activities halted and other emergency measures taken by local authorities.


PM2.5, or extremely fine particulate matter, is dangerous to human health because the particles can penetrate deep into the lungs and enter the bloodstream. The report finds that 80 per cent of Indian cities do not meet national standards for particulate matter (PM) pollution.


Erik Solheim, Executive Director of UN Environment, said air pollution is currently the greatest environmental threat to human health and one of the fastest growing issues on the global health agenda.


"The recent smog crisis in India highlights how important it is to work across borders on an issue like air pollution," he said. "Activities in one place can affect the health and well-being of people hundreds of miles away.


"We need an integrated approach, with solutions that provide real benefits for people. Efforts to improve air quality improve the quality of life, protect our climate and supports sustainable development."


Improving urban air quality is complicated by the fact that local air pollution regulations have little impact on activities, like agricultural burning, that may take place hundreds of miles away in a different jurisdiction.


This is why the task force recommended as its highest priority the launch of a National Clean Air Mission (CAM-INDIA) with the mandate to implement government air pollution reduction policies across several ministries - including transport, power, construction, agriculture, rural development, and environment - and across city and state jurisdictions.


"Technologies, monitoring instruments and governance strategies are all available to solve the air pollution problem and provide cleaner air for citizens of India," report Chair Professor Ramanathan said. "The solutions recommended by our taskforce were based on lessons learned in living laboratories around the world."


Dr Henk Bekedam, WHO Representative to India, said air pollution is a major constraint to public health and sustainable development.


"It is everybody's business and each one is a stakeholder," he said. "Tackling air pollution requires a concerted whole of society approach backed by strong political in order to make a difference to our present and future."


The Climate and Clean Air Coalition, in collaboration with the World Health Organization and UN Environment, recently launched the BreatheLife campaign, an initiative that aims to inform people about the health and climate impacts of air pollution. The campaign provides concrete solutions that can make a difference, including many of those featured in this report. The campaign at http://breathelife2030.org/ is building an alliance of cities working to improve air quality so they can share knowledge and showcase success stories.


The report task force was chaired by Professor Veerabhadran Ramanathan of the University of California, San Diego and Co-Chaired by Mr Sumit Sharma and Dr Ibrahim Rehman of TERI, with support from the United Nations Environment's Climate and Clean Air Coalition (CCAC). The World Health Organization's (WHO) Regional Office for South-East Asia also supported the organization of stakeholder consultations that contributed key elements of the review.




Moving Beyond Batteries to Harvest Energy from Air



An emerging and potentially revolutionary class of energy-harvesting computer systems requires neither a battery nor a power outlet, instead drawing energy from the surrounding environment. While radio waves, solar energy, heat, and vibrations can all power such devices, harvested energy sources are weak, leading to "intermittent execution," with periodic power failures and unreliable behavior.


Brandon Lucia, an assistant professor of electrical and computer engineering at Carnegie Mellon University, and his Ph.D. student Alexei Colin created the first programming language designed to build reliable software for intermittent, energy-harvesting computers. Colin will present the work at the 2016 SPLASH conference in Amsterdam, Netherlands, on November 3rd.


"Energy is not always available in the environment for a device to harvest," explains Lucia. "Intermittent operation makes it difficult to build applications because existing software programming languages—and programmers themselves—assume that energy is a continuously available resource."


The innovative new programming language, called Chain, asks an application developer to define a set of computational tasks that compute and exchange data through a novel way of manipulating the computer's memory, called a channel. Chain guarantees that tasks execute correctly despite arbitrary power failures.


"When power is not continuously available, power failures disrupt the software's execution, often leading to unrecoverable errors," says Lucia. "Chain solves this problem by requiring computational tasks in the program to use a novel channel-based memory abstraction that ensures tasks complete without error."


Channel-based memory is the key to Chain's ability to avoid software errors: regardless of when power fails, channels ensure that a computational task always has an intact version of the data it needs when power resumes. Restarting a Chain program after a failure has virtually zero time cost, because Chain does not rely on an expensive, conventional approach, like memory checkpointing. The extreme scarcity of energy makes efficient restarting essential for energy-harvesting applications including IoT devices and implantable or ingestible medical devices.
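
Chain itself extends C, but the channel idea can be illustrated with a short, hypothetical Python sketch: each task writes to a channel whose contents are committed only when the task finishes, so a simulated power failure mid-task leaves the previous, intact data in place. The class and function names below are invented for illustration and are not Chain's actual API.

```python
import random

class Channel:
    """Toy stand-in for Chain's channel-based memory: writes become visible
    only when a task commits, so an interrupted task never exposes partial state."""
    def __init__(self, initial):
        self.committed = initial   # persists across simulated power failures
        self.pending = None

    def write(self, value):
        self.pending = value

    def commit(self):
        if self.pending is not None:
            self.committed, self.pending = self.pending, None

def sample_task(ch_sample):
    ch_sample.write(random.randint(0, 100))            # pretend sensor reading

def total_task(ch_sample, ch_total):
    ch_total.write(ch_total.committed + ch_sample.committed)   # running total

def run_with_failures(schedule, steps=50, failure_rate=0.3):
    """Each step, a 'power failure' may abort the current task before commit;
    on the next attempt the task simply re-runs against committed channel data."""
    for _ in range(steps):
        for task, channels in schedule:
            if random.random() < failure_rate:
                continue                                # failed before commit: no state change
            task(*channels)
            for ch in channels:
                ch.commit()

sample, total = Channel(0), Channel(0)
run_with_failures([(sample_task, (sample,)), (total_task, (sample, total))])
print("last committed sample:", sample.committed, "| committed total:", total.committed)
```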


"Chain provides important reliability guarantees in a familiar and flexible programming interface that is well-positioned to be the foundation for today's and future energy-harvesting applications," says Lucia.


Lucia, Colin, and Dr. Alanson Sample, a collaborator at Disney Research Pittsburgh, worked together to push Chain into real-world deployment; early next year, in cooperation with nano-satellite company KickSat, software written in Chain will run on board two tiny, postage-stamp-sized satellites in low Earth orbit. Once in orbit, these satellites will use tiny solar panels to harvest solar energy, powering them to collect and process sensor data and send information back to Earth. While satellites are typically powered by solar energy, these will be the first with the strong software-correctness guarantees furnished by Chain, ensuring continuous, reliable operation.


"The potential benefit of reliable energy-harvesting computer systems is far-reaching," says Lucia. "Small satellites are proliferating and the space industry itself is expanding.  If we can guarantee that even tiny, energy-harvesting satellites operate without interruption, we can make it easier to conduct other scientific research in space. Further out, we may even see future applications like extraterrestrial natural resource discovery relying on this technology."


The paper, "Chain: Tasks and Channels for Reliable Intermittent Programs," is available here. Learn more about Professor Lucia's research group here.



Expanding Antarctic sea ice linked to natural variability


Image Credit: NASA/Jim Yungel


The recent trend of increasing Antarctic sea ice extent -- seemingly at odds with climate model projections -- can largely be explained by a natural climate fluctuation, according to a new study led by the National Center for Atmospheric Research (NCAR).


The study offers evidence that the negative phase of the Interdecadal Pacific Oscillation (IPO), which is characterized by cooler-than-average sea surface temperatures in the tropical eastern Pacific, has created favorable conditions for additional Antarctic sea ice growth since 2000.


The findings, published in the journal Nature Geoscience, may resolve a longstanding mystery: Why is Antarctic sea ice expanding when climate change is causing the world to warm?


The study's authors also suggest that sea ice may begin to shrink as the IPO switches to a positive phase.


"The climate we experience during any given decade is some combination of naturally occurring variability and the planet's response to increasing greenhouse gases," said NCAR scientist Gerald Meehl, lead author of the study. "It's never all one or the other, but the combination, that is important to understand."


Study co-authors include Julie Arblaster of NCAR and Monash University in Australia, Cecilia Bitz of the University of Washington, Christine Chung of the Australian Bureau of Meteorology, and NCAR scientist Haiyan Teng. The study was funded by the U.S. Department of Energy and by the National Science Foundation, which sponsors NCAR.


Expanding ice


The sea ice surrounding Antarctica has been slowly increasing in area since the satellite record began in 1979. But the rate of increase rose nearly fivefold between 2000 and 2014, following the IPO's transition to a negative phase in 1999.


The new study finds that when the IPO changes phase, from positive to negative or vice versa, it touches off a chain reaction of climate impacts that may ultimately affect sea ice formation at the bottom of the world.


When the IPO transitions to a negative phase, the sea surface temperatures in the tropical eastern Pacific become somewhat cooler than average when measured over a decade or two. These sea surface temperatures, in turn, change tropical precipitation, which drives large-scale changes to the winds that extend all the way down to Antarctica.


The ultimate impact is a deepening of a low-pressure system off the coast of Antarctica known as the Amundsen Sea Low. Winds generated on the western flank of this system blow sea ice northward, away from Antarctica, helping to enlarge the extent of sea ice coverage.


"Compared to the Arctic, global warming causes only weak Antarctic sea ice loss, which is why the IPO can have such a striking effect in the Antarctic," said Bitz. "There is no comparable natural variability in the Arctic that competes with global warming."


Sifting through simulations


To test if these IPO-related impacts were sufficient to cause the growth in sea ice extent observed between 2000 and 2014, the scientists first examined 262 climate simulations created by different modeling groups from around the world.


When all of those simulations are averaged, the natural variability cancels itself out. For example, simulations with a positive IPO offset those with a negative IPO. What remains is the expected impact of human-caused climate change: a decline in Antarctic sea ice extent.


But for this study, the scientists were not interested in the average. Instead, they wanted to find individual members that correctly characterized the natural variability between 2000 and 2014, including the negative phase of the IPO. The team discovered 10 simulations that met the criteria, and all of them showed an increase in Antarctic sea ice extent across all seasons.


"When all the models are taken together, the natural variability is averaged out, leaving only the shrinking sea ice caused by global warming," Arblaster said. "But the model simulations that happen to sync up with the observed natural variability capture the expansion of the sea ice area. And we were able to trace these changes to the equatorial eastern Pacific in our model experiments."


Scientists suspect that in 2014, the IPO began to change from negative to positive. That would indicate an upcoming period of warmer eastern Pacific Ocean surface temperatures on average, though year-to-year temperatures may go up or down, depending on El Niño/La Niña conditions. Accordingly, the trend of increasing Antarctic sea ice extent may also change in response.


"As the IPO transitions to positive, the increase of Antarctic sea ice extent should slow and perhaps start to show signs of retreat when averaged over the next 10 years or so," Meehl said.



 Solar panel study reveals impact on Earth



Environmental Scientists at Lancaster University and the Centre for Ecology and Hydrology monitored a large solar park, near Swindon, for a year.


They found that solar parks altered the local climate, measuring cooling of as much as 5 degrees Celsius under the panels during the summer, though the effects varied depending on the time of year and the time of day.


Because climate controls biological processes such as plant growth rates, this is important information that can help us understand how best to manage solar parks so that they deliver environmental benefits in addition to supplying low-carbon energy.


Their paper, 'Solar park microclimate and vegetation management effects on grassland carbon cycling', is published in the journal Environmental Research Letters.


Increasing energy demands and the drive towards low carbon energy sources have prompted a rapid increase in ground-mounted solar parks across the world.


This represents a significant land-use change on a global scale and has prompted urgent calls for a detailed understanding of the impacts of solar parks on the fields beneath them.


Dr Alona Armstrong, of Lancaster University, said the new study raises some key questions for the future.


She said: "Solar parks are appearing in our landscapes but we are uncertain how they will affect the local environment."


"This is particularly important as solar parks take up more space per unit of power generated compared with traditional sources. This has implications for ecosystems and the provision of goods, for example crops, and services, such as soil carbon storage. But until this study we didn't understand how solar parks impacted climate and ecosystems."


"With policies in dominant economies supporting solar energy, it is important that we understand the environmental impacts to ensure we get more than just low carbon energy from the land they occupy."



The authors of the study say understanding the climate effects of solar parks will give farmers and land managers the knowledge they need to choose which crops to grow and how best to manage the land; there is potential to maximise biodiversity and improve yields.


Dr Armstrong added: "This understanding becomes even more compelling when applied to areas that are very sunny that may also suffer water shortages. The shade under the panels may allow crops to be grown that can't survive in full sun. Also, water losses may be reduced and water could be collected from the large surfaces of the solar panels and used for crop irrigation."


The post is provided by Lancaster University



New York High School Student Receives Prestigious Award for Developing Clean Drinking Water Project



A Dix Hills, New York, high school senior has won the U.S. Environmental Protection Agency’s Patrick H. Hurd Sustainability Award for a project currently providing clean drinking water affordably to a community in Kenya.


"On behalf of the agency, I am proud to announce Alexis D’Alessandro as the Patrick H. Hurd Sustainability award winner for her work to protect public health and the environment” said Dr. Thomas A. Burke, EPA science advisor and deputy assistant administrator of EPA's Office of Research and Development. “This award is just one way EPA helps encourage and support the next generation of scientists and engineers to put their passion for innovation into practice."


Alexis D’Alessandro, a senior at Half Hollow Hills High School West in Dix Hills, New York, won the award for providing a cost-effective alternative to reverse osmosis water purification for a community in the Turkana Basin of Kenya. A ninth-grade classroom lesson on global water scarcity inspired her environmental engineering project, “Design and Implementation of a Sustainable Permeate Gap Membrane Distillation System for Water Purification in the Turkana Basin of Kenya.”


Ms. D’Alessandro took part in the Intel International Science & Engineering Fair in Phoenix, Ariz., this week, along with 1,759 other student scientists and engineers from 77 countries, regions, and territories. The EPA Patrick H. Hurd Sustainability award gives the student winner and a chaperone the opportunity to display the student's project at EPA's National Sustainable Design Expo in Washington, DC, featuring the People, Prosperity, and the Planet (P3) Student Design Competition, in the spring of 2017.


The Intel International Science and Engineering Fair, a program of the Society for Science & the Public, is the world's largest pre-college science competition. Students advance to the International Science and Engineering Fair from several levels of local and school-sponsored, regional and state fairs showcasing their independent research. The Society for Science & the Public, a nonprofit organization dedicated to public engagement in scientific research and education, owns and has administered the International Science and Engineering Fair since its inception in 1950.



Canada, a Preview of Climate Change?



It sounds like something out of a Hollywood disaster movie.


"More than 88,000 people have left wildfire-ravaged Fort McMurray in western Canada, as authorities warned the blaze could double in size by the end of Saturday."

- The Guardian



"Witnesses reported the Flying-J gas station exploded, while the Super 8 Motel and a Denny's restaurant were gutted."

- CBC


"Bernie Schmitte, an official at Alberta's Agriculture and Forestry Ministry, said the "catastrophic fire" had so far "resisted all suppression methods."

- BBC


Since Sunday, the blaze has destroyed almost 25,000 acres and an estimated 1,600 buildings, and as this article is being written it is still burning. As we watch this terrible disaster play out, a number of very interesting articles have connected it to climate change. According to reports, the fire started on May 1st, and authorities have said it could have been sparked by a human, such as a discarded cigarette or an out-of-control campfire, or by nature, such as lightning. So, after reading what the authorities have to say, can we make the connection to climate change?


There are many articles that are making that suggestion. One article that caught our attention was from CNN entitled “The fire in Canada looks a lot like climate change -- and that should scare you.”  The article goes on to say, “scientists and researchers say this fire looks a whole lot like climate change. And that should alarm all of us.” The article does say that it’s impossible to absolutely say that this fire is a result of climate change, but poignantly shows the contributing factors that connect the two.


What was also interesting was that the fire was creating its own weather pattern. Brian Proctor, a warning preparedness meteorologist for Environment Canada, says when fires get big enough, they begin to create weather systems of their own.


These firestorms are a powerful force. They can alter weather patterns, funnel smoke and particulates high into the stratosphere, and produce powerful lightning strikes.


"They tend to promote their own kind of conditions," Proctor said.


Observing this from a distance, many of us cannot comprehend the magnitude of what is happening, or what it might feel like to be one of the residents who have lost their homes. But if this is truly a preview, and is connected to climate change, then we had all better start preparing, as this may be the new norm.



A One-of-a-kind Solar Research Facility



FIU and Florida Power & Light Company unveiled a new commercial-scale solar installation at FIU's College of Engineering and Computing – the only solar research facility of its kind that FPL has installed at a Florida university.


The 1.4-megawatt solar array comprises more than 4,400 solar panels on canopy-like structures that provide clean electricity to FPL's grid and shade for about 400 parking spaces. The unique solar array incorporates a 24-foot by 12-foot FIU logo that is visible from high above.


Engineering faculty and students from the Energy, Power & Sustainability (EPS) program at FIU will use the installation to conduct important research that will help FPL advance solar energy in the state.


Through a five-year research grant, faculty and students are analyzing data from the on-site solar panels to understand the impacts of intermittent solar power on the electric grid in South Florida's tropical climate. The researchers will also look at historic weather patterns and develop predictive models to forecast the reliability of solar power generation.
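
The article doesn't detail the team's models, but the basic forecasting idea can be sketched as a simple regression from weather features to array output. The feature names, array size and synthetic data below are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_hours = 1000

# Invented hourly weather features: cloud-cover fraction, air temperature (deg C)
# and a clear-sky irradiance proxy (W/m^2).
cloud = rng.uniform(0.0, 1.0, n_hours)
temp = rng.normal(28.0, 4.0, n_hours)
clear_sky = rng.uniform(200.0, 1000.0, n_hours)

# Synthetic measured output of a 1.4 MW array (kW): driven by irradiance,
# reduced by cloud cover, plus noise.
output_kw = 1400 * (clear_sky / 1000) * (1 - 0.75 * cloud) + rng.normal(0, 30, n_hours)

# Ordinary least-squares fit: output ~ intercept + cloud + temp + clear_sky.
X = np.column_stack([np.ones(n_hours), cloud, temp, clear_sky])
coef, *_ = np.linalg.lstsq(X, output_kw, rcond=None)

# Forecast for a hypothetical upcoming hour (40% cloud, 30 deg C, 850 W/m^2).
next_hour = np.array([1.0, 0.4, 30.0, 850.0])
print(f"forecast output: {next_hour @ coef:.0f} kW")
```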


"This research project builds on our long-standing relationship with FPL," said FIU President Mark B. Rosenberg. "We're engaging in groundbreaking, problem-solving research to address the challenges of our region and beyond. Our students will get hands-on experience and see how the research they conduct in the lab will have an impact in the real world – gaining skills that will help them compete for high tech 21st century jobs. This solar power facility is a win-win for FIU, FPL, and our community."


FPL is the state's largest generator of solar energy and operates three utility-scale, or universal, solar plants in Florida. The company is constructing three new 74.5-megawatt solar energy centers that will cost-effectively triple its solar capacity by the end of 2016.


"We work hard every day to deliver our customers electricity that is among the cleanest and most reliable in the county for a price well below the national average," said Eric Silagy, president and CEO of FPL. "Through this innovative partnership, we will continue to make our energy infrastructure even smarter. The faculty and students working on this project are contributing to our state's energy future – a future that includes more solar power."


FIU researchers are looking closely at Florida's climate as part of their research.


"Solar power depends on the sun for fuel and with South Florida's tropical weather conditions the amount of sun can vary greatly from one moment to the next," said Arif Sarwat, a professor in the Department of Electrical and Computer Engineering who serves as director of the FIU and FPL Solar Research Facility and EPS. "In Florida, where sunshine can vary moment to moment, our team is researching how intermittent power generation impacts the grid with an eye toward a better understanding of how to best leverage solar power."


For more than three decades, FPL and FIU have partnered on various projects. In addition to hundreds of FPL employees who are FIU alumni, the energy company runs an on-campus customer care training center where students answer calls from customers. FPL also donated an electric vehicle from its clean fleet to FIU's College of Engineering and Computing to further research and test wireless charging technology.


Besides conducting research on wireless charging, EPS students also work at the FPL laboratories every week to conduct high-end experiments and research on batteries and access points.


"This project further demonstrates FIU's commitment to working with FPL to help prepare our students for addressing society's needs for renewable energy," said Ranu Jung, interim dean of the College of Engineering and Computing. "Our faculty and students are engaged in research related to multiple facets of power generation, and this partnership will help strengthen their contributions to making solar energy viable and economical."




NASA Selects ‘Game Changing’ Solar Projects



NASA’s Game Changing Development (GCD) program has selected four proposals to develop solar cell technologies that can operate in the high-radiation and low-temperature conditions of deep space.


Deep space can get cold – very cold, down to -270.45 degrees Celsius. High levels of radiation from cosmic rays, solar wind and storms can also play havoc with electrical equipment, and repairs are difficult or impossible to carry out.


Solar technology has been proving itself in space applications beyond Earth’s orbit for years. One of the Mars rovers has now been operating for 12 years, 48 times longer than the mission objective. In January, NASA’s Juno probe broke the record for humanity’s most distant solar powered spacecraft. Juno is heading for Jupiter, where sunlight intensity will be 25 times less than on Earth.


Beyond Jupiter, sunlight intensity continues to drop, meaning even more efficient solar cells will be required.
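
That drop-off follows the inverse-square law: irradiance scales with one over the square of the distance from the Sun. A quick back-of-the-envelope check, using approximate mean orbital distances, roughly reproduces the "25 times weaker at Jupiter" figure quoted above.

```python
# Solar irradiance falls off with the square of the distance from the Sun.
EARTH_IRRADIANCE = 1361.0  # W/m^2 at 1 AU (approximate solar constant)

distances_au = {"Earth": 1.0, "Mars": 1.52, "Jupiter": 5.2, "Saturn": 9.5}

for body, d in distances_au.items():
    print(f"{body:8s} ~{EARTH_IRRADIANCE / d**2:7.1f} W/m^2  "
          f"({d**2:5.1f}x weaker than at Earth)")
```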


Thirteen proposals were received in response to the Extreme Environment Solar Power Appendix to the SpaceTech-REDDI-2015 NASA Research Announcement. The initial contract awards are up to US$400,000 each, which provides the teams with nine months of funding to work on system design, component testing and analysis.


The four proposals chosen:





    • Concentrator Solar Power Systems for Low-intensity Low Temperature and High Radiation Game Changing Technology Development – ATK Space Systems



NASA anticipates selecting two of the technologies after the nine months, which will receive up to $1.25 million to develop and test hardware. A third phase may see one of the projects selected to advance development and deliver scalable system hardware.


“The ultimate goal of increasing end of life performance and enhanced space power applications will greatly impact how we execute extended missions, especially to the outer planets,”  said Lanetra Tate, the GCD program executive in NASA’s Space Technology Mission Directorate.


The Game Changing Development Program is a part of NASA’s Space Technology Mission Directorate.




Kids Robotics Champions Challenge Keurig to Make K-Cups 100% Recyclable



Three inspiring children from Birmingham, Alabama are preparing to compete in the First Lego League (FLL) World Festival in St. Louis in April 2016. This global competition challenges 9- to 14-year-olds to think like engineers and scientists. The robotics competition began with 29,000 teams competing locally and then regionally; now approximately 100 teams from 82 countries will compete internationally.


Known as The Variables, the team of three is gearing up for this year's FLL Challenge, Trash Trek. Participants are asked to find solutions to the world's growing trash problems. Each Challenge has three parts: the Robot Game, the Project, and the Core Values.


"As for the project, we identified K-cups as a source of plastic waste. K-cups taste great and are super convenient but are wasteful and not recyclable yet," The Variables explained. "And the amount of K-cups that are thrown out every year would circle the Earth 10.5 times!"


In February the kids made a video specifically requesting the help of Keurig CEO Brian Kelley. They challenge Keurig to be more aggressive with its plan to make K-Cup pods 100% recyclable, a small request that would only require a short video response.


On March 10, Keurig's CEO and Chief Sustainability Officer responded to the kids in a video. Moreover, Keurig agreed to fly the kids to Boston to meet the company, tour its facilities and better understand its sustainability goals.


While Keurig has a plan to make K-Cups 100% recyclable by 2020, the kids think that's a long way off and hope to inspire the company to speed up its plan. "If Keurig could accelerate their sustainability plan by just six months, it would make a huge change," The Variables said.


It's great to see that these issues are on the minds of our future generations, and we're sure they'll help to innovate the solutions we need for a healthier, and more sustainable planet.



Europe Hottest in Over 2 Millennia



New research finds that temperatures over the past 30 years lie outside the range of natural variations, supporting the conclusions reached by the Intergovernmental Panel on Climate Change (IPCC) that recent warming is mainly caused by anthropogenic activity.


Most of Europe has experienced strong summer warming over the course of the past several decades, accompanied by severe heat waves in 2003, 2010 and 2015. New research now puts the current warmth in a 2100-year historical context using tree-ring information and historical documentary evidence to derive a new European summer temperature reconstruction.


The work was published today (Friday 29 January) in the journal Environmental Research Letters by a group of 45 scientists from 13 countries.


Warm summers were experienced during Roman times, up to the 3rd century, followed by generally cooler conditions from the 4th to the 7th centuries. A generally warm medieval period was followed by a mostly cold Little Ice Age from the 14th to the 19th centuries. The pronounced warming early in the 20th century and in recent decades is well captured by the tree-ring data and historical evidence on which the new reconstruction is based.


The evidence suggests that past natural changes in summer temperature are larger than previously thought, suggesting that climate models may underestimate the full range of future extreme events, including heat waves. This past variability has been associated with large volcanic eruptions and changes in the amount of energy received from the sun.


The new research finding that temperatures over the past 30 years lie outside the range of these natural variations supports the conclusions reached by the Intergovernmental Panel on Climate Change (IPCC) that recent warming is mainly caused by anthropogenic activity.


"We now have a detailed picture of how summer temperatures have changed over Europe for more than two thousand years and we can use that to test the climate models that are used to predict the impacts of future global warming," says the coordinator of the study, Professor Jürg Luterbacher from the University of Giessen in Germany.


Story provided by Institute of Physics


Rapid, affordable energy transformation possible


 

Since the sun is shining or the wind is blowing somewhere across the United States all of the time, researchers theorized that the key to resolving the dilemma of intermittent renewable generation might be to scale up the renewable energy generation system to match the scale of weather systems.



The United States could slash greenhouse gas emissions from power production by up to 78 percent below 1990 levels within 15 years while meeting increased demand, according to a new study by NOAA and University of Colorado Boulder researchers.


The study used a sophisticated mathematical model to evaluate future cost, demand, generation and transmission scenarios. It found that with improvements in transmission infrastructure, weather-driven renewable resources could supply most of the nation's electricity at costs similar to today's.


"Our research shows a transition to a reliable, low-carbon, electrical generation and transmission system can be accomplished with commercially available technology and within 15 years," said Alexander MacDonald, co-lead author and recently retired director of NOAA's Earth System Research Laboratory (ESRL) in Boulder.


The paper is published online today in the journal Nature Climate Change.


Although improvements in wind and solar generation have continued to ratchet down the cost of producing renewable energy, these energy resources are inherently intermittent. As a result, utilities have invested in surplus generation capacity to back up renewable energy generation with natural gas-fired generators and other reserves.


"In the future, they may not need to," said co-lead author Christopher Clack, a physicist and mathematician with the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder.


Since the sun is shining or winds are blowing somewhere across the United States all of the time, MacDonald theorized that the key to resolving the dilemma of intermittent renewable generation might be to scale up the renewable energy generation system to match the scale of weather systems.


So MacDonald, who has studied weather and worked to improve forecasts for more than 40 years, assembled a team of four other NOAA scientists to explore the idea. Using NOAA's high-resolution meteorological data, they built a model to evaluate the cost of integrating different sources of electricity into a national energy system. The model estimates renewable resource potential, energy demand, emissions of carbon dioxide (CO2) and the costs of expanding and operating electricity generation and transmission systems to meet future needs.


The model allowed researchers to evaluate the affordability, reliability, and greenhouse gas emissions of various energy mixes, including coal. It showed that low-cost and low-emissions are not mutually exclusive.


"The model relentlessly seeks the lowest-cost energy, whatever constraints are applied," Clack said. "And it always installs more renewable energy on the grid than exists today."


Even in a scenario where renewable energy costs more than experts predict, the model produced a system that cuts CO2 emissions 33 percent below 1990 levels by 2030, and delivered electricity at about 8.6 cents per kilowatt hour. By comparison, electricity cost 9.4 cents per kWh in 2012.


If renewable energy costs were lower and natural gas costs higher, as is expected in the future, the modeled system sliced CO2 emissions by 78 percent from 1990 levels and delivered electricity at 10 cents per kWh. The year 1990 is a standard scientific benchmark for greenhouse gas analysis.


A scenario that included coal yielded lower cost (8.5 cents per kWh), but the highest emissions.


At the recent Paris climate summit, the United States pledged to cut greenhouse emissions from all sectors up to 28 percent below 2005 levels by 2025. The new paper suggests the United States could cut total CO2 emissions 31 percent below 2005 levels by 2030 by making changes only within the electric sector, even though the electrical sector represents just 38 percent of the national CO2 budget. These changes would include rapidly expanding renewable energy generation and improving transmission infrastructure.


In identifying low-cost solutions, researchers enabled the model to build and pay for transmission infrastructure improvements--specifically a new, high-voltage direct-current transmission grid (HVDC) to supplement the current electrical grid. HVDC lines, which are in use around the world, reduce energy losses during long-distance transmission. The model did choose to use those lines extensively, and the study found that investing in efficient, long-distance transmission was key to keeping costs low.


MacDonald compared the idea of an HVDC grid with the interstate highway system, which transformed the U.S. economy in the 1950s. "With an 'interstate for electrons', renewable energy could be delivered anywhere in the country while emissions plummet," he said. "An HVDC grid would create a national electricity market in which all types of generation, including low-carbon sources, compete on a cost basis. The surprise was how dominant wind and solar could be."


The new model is drawing interest from other experts in the field.


"This study pushes the envelope," said Stanford University's Mark Jacobson, who commented on the findings in an editorial he wrote for the journal Nature Climate Change. "It shows that intermittent renewables plus transmission can eliminate most fossil-fuel electricity while matching power demand at lower cost than a fossil fuel-based grid -- even before storage is considered."


Nano-reactor created for the production of hydrogen biofuel


 

Scientists at Indiana University have created a highly efficient biomaterial that catalyzes the formation of hydrogen -- one half of the "holy grail" of splitting H2O to make hydrogen and oxygen for fueling cheap and efficient cars that run on water.

A modified enzyme that gains strength from being protected within the protein shell -- or "capsid" -- of a bacterial virus, this new material is 150 times more efficient than the unaltered form of the enzyme.

The process of creating the material was recently reported in "Self-assembling biomolecular catalysts for hydrogen production" in the journal Nature Chemistry.

"Essentially, we've taken a virus's ability to self-assemble myriad genetic building blocks and incorporated a very fragile and sensitive enzyme with the remarkable property of taking in protons and spitting out hydrogen gas," said Trevor Douglas, the Earl Blough Professor of Chemistry in the IU Bloomington College of Arts and Sciences' Department of Chemistry, who led the study. "The end result is a virus-like particle that behaves the same as a highly sophisticated material that catalyzes the production of hydrogen."

Other IU scientists who contributed to the research were Megan C. Thielges, an assistant professor of chemistry; Ethan J. Edwards, a Ph.D. student; and Paul C. Jordan, a postdoctoral researcher at Alios BioPharma, who was an IU Ph.D. student at the time of the study.

The genetic material used to create the enzyme, hydrogenase, is produced by two genes from the common bacteria Escherichia coli, inserted inside the protective capsid using methods previously developed by these IU scientists. The genes, hyaA and hyaB, are two genes in E. coli that encode key subunits of the hydrogenase enzyme. The capsid comes from the bacterial virus known as bacteriophage P22.

The resulting biomaterial, called "P22-Hyd," is not only more efficient than the unaltered enzyme but also is produced through a simple fermentation process at room temperature.

The material is potentially far less expensive and more environmentally friendly to produce than other materials currently used to create fuel cells. The costly and rare metal platinum, for example, is commonly used to catalyze hydrogen as fuel in products such as high-end concept cars.

"This material is comparable to platinum, except it's truly renewable," Douglas said. "You don't need to mine it; you can create it at room temperature on a massive scale using fermentation technology; it's biodegradable. It's a very green process to make a very high-end sustainable material."

In addition, P22-Hyd both breaks the chemical bonds of water to create hydrogen and also works in reverse to recombine hydrogen and oxygen to generate power. "The reaction runs both ways -- it can be used either as a hydrogen production catalyst or as a fuel cell catalyst," Douglas said.

The form of hydrogenase used is one of three occurring in nature: di-iron (FeFe)-, iron-only (Fe-only)- and nickel-iron (NiFe)-hydrogenase. The third form was selected for the new material due to its ability to easily integrate into biomaterials and tolerate exposure to oxygen.

NiFe-hydrogenase also gains significantly greater resistance upon encapsulation to breakdown from chemicals in the environment, and it retains the ability to catalyze at room temperature. Unaltered NiFe-hydrogenase, by contrast, is highly susceptible to destruction from chemicals in the environment and breaks down at temperatures above room temperature -- both of which make the unprotected enzyme a poor choice for use in manufacturing and commercial products such as cars.

These sensitivities are "some of the key reasons enzymes haven't previously lived up to their promise in technology," Douglas said. Another is that they are difficult to produce.

"No one's ever had a way to create a large enough amount of this hydrogenase despite its incredible potential for biofuel production. But now we've got a method to stabilize and produce high quantities of the material -- and enormous increases in efficiency," he said.

The development is highly significant, according to Seung-Wuk Lee, professor of bioengineering at the University of California, Berkeley, who was not part of the study.

"Douglas' group has been leading protein- or virus-based nanomaterial development for the last two decades. This is a new pioneering work to produce green and clean fuels to tackle the real-world energy problem that we face today and make an immediate impact in our life in the near future," said Lee, whose work has been cited in a U.S. Congressional report on the use of viruses in manufacturing.

Beyond the new study, Douglas and his colleagues continue to craft P22-Hyd into an ideal ingredient for hydrogen power by investigating ways to activate a catalytic reaction with sunlight, as opposed to introducing electrons using laboratory methods.

"Incorporating this material into a solar-powered system is the next step," Douglas said.



Story Source: Indiana University


Socks with Urine Generate Power


 

A pair of socks embedded with miniaturised microbial fuel cells (MFCs) and fuelled with urine pumped by the wearer's footsteps has powered a wireless transmitter to send a signal to a PC. This is the first self-sufficient system powered by a wearable energy generator based on microbial fuel cell technology.

The scientific paper, "Self-sufficient Wireless Transmitter Powered by Foot-pumped Urine Operating Wearable MFC," is published in Bioinspiration and Biomimetics.

The paper describes a lab-based experiment led by Professor Ioannis Ieropoulos, of the Bristol BioEnergy Centre at the University of the West of England (UWE Bristol). The Bristol BioEnergy Centre is based in Bristol Robotics Laboratory, a collaborative partnership between the University of the West of England (UWE Bristol) and the University of Bristol.

Soft MFCs embedded within a pair of socks were supplied with fresh urine, circulated by the wearer walking. Normally, continuous-flow MFCs would rely on a mains-powered pump to circulate the urine over the microbial fuel cells, but this experiment relied solely on human activity. The manual pump was based on a simple fish circulatory system, and the action of walking caused the urine to pass over the MFCs and generate energy. Soft tubes placed under the heels ensured frequent fluid push-pull as the wearer walked. The wearable MFC system successfully ran a wireless transmission board, which was able to send a message every two minutes to the PC-controlled receiver module.

Professor Ieropoulos says, "Having already powered a mobile phone with MFCs using urine as fuel, we wanted to see if we could replicate this success in wearable technology. We also wanted the system to be entirely self-sufficient, running only on human power -- using urine as fuel and the action of the foot as the pump."

"This work opens up possibilities of using waste for powering portable and wearable electronics. For example, recent research shows it should be possible to develop a system based on wearable MFC technology to transmit a person's coordinates in an emergency situation. At the same time this would indicate proof of life since the device will only work if the operator's urine fuels the MFCs."

Microbial fuel cells (MFCs) use bacteria to generate electricity from waste fluids. They tap into the biochemical energy used for microbial growth and convert it directly into electricity. This technology can use any form of organic waste and turn it into useful energy without relying on fossil fuels, making this a valuable green technology.

The Centre has recently launched a prototype urinal in partnership with Oxfam that uses pee-power technology to light cubicles in refugee camps.



Story Source: Institute of Physics



UCLA Climate Change Researchers Find Way to Improve Accuracy of Global Climate Models


 

Article by Alison Hewitt 

UCLA researchers have discovered a way to reduce uncertainty in global climate models’ projections of how climate change will alter rain and other precipitation.

Global climate models agree that climate change will cause precipitation to increase on average, with wet regions becoming wetter and dry areas getting drier. However, according to the findings published today in Nature, most models overestimate the amount that precipitation will increase due to climate change. The study, part of a joint project by UCLA and the Lawrence Livermore National Laboratory, shows that the global average of precipitation will go up, but by approximately 40 percent less than global models currently predict.

Not only will the research make the climate models more accurate, it will also decrease variation among the different models by an estimated 35 percent, according to the study. That will give greater clarity to the international policy makers who depend upon the models to plan for the effects of climate change, researchers said. Global climate models are analyzed and applied by scientists for reports used by the United Nations, including at the U.N. COP21 Paris climate talks wrapping up this week.

“The projected change in the hydrologic cycle, or water cycle, is one of the most important dimensions of climate change apart from warming because it’s critical for understanding the future of things like water resources, agriculture and ecosystem health,” said principal investigator Alex Hall, a professor in UCLA’s department of atmospheric and oceanic sciences and director of UCLA’s Center for Climate Change Solutions. “Changes in water will be one of the key factors reshaping the planet.”

A goal of the research is to increase the accuracy and decrease the variation among different models, said Anthony DeAngelis, the lead author and a postdoctoral researcher in UCLA’s department of atmospheric and oceanic sciences. While global climate models agree on many details, the broad range in the models’ anticipated global precipitation changes attracted UCLA atmospheric scientists’ attention.

“The greater the agreement in the global climate models, the higher our confidence in the projections,” DeAngelis said. “There are physical laws behind important factors controlling precipitation, and the results flowing from these factors should be predictable.”

In examining a variety of potential causes for the variation, the scientists found that different global climate models disagreed on how much sunlight is absorbed in the atmosphere, which influences the amount of precipitation. Comparing model simulations with satellite observations, they found that many models underestimated sunlight absorption. Sunlight — in the form of shortwave solar radiation — is absorbed by water vapor in the atmosphere. The more the sun’s energy is absorbed, the warmer the atmosphere becomes, and the less precipitation falls.
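
A hedged sketch of that comparison: across an ensemble of models, regress the projected precipitation increase against each model's simulated shortwave absorption, then read off the value implied by the satellite-observed absorption. The numbers below are synthetic and chosen only to mimic the qualitative relationship (more absorption, smaller precipitation increase); this is not the UCLA analysis itself.

```python
import numpy as np

rng = np.random.default_rng(3)
n_models = 25

# Synthetic inter-model spread: models that absorb more shortwave radiation in
# the atmosphere project a smaller precipitation increase per degree of warming.
absorption = rng.normal(75.0, 5.0, n_models)        # W/m^2 (invented)
precip_increase = 3.2 - 0.02 * (absorption - 75.0) + rng.normal(0, 0.1, n_models)  # % per K

# Fit the across-model relationship, then apply the "observed" absorption value.
slope, intercept = np.polyfit(absorption, precip_increase, 1)
observed_absorption = 80.0                          # hypothetical satellite estimate
constrained = slope * observed_absorption + intercept

print(f"raw multi-model mean:    {precip_increase.mean():.2f} % per K")
print(f"observation-constrained: {constrained:.2f} % per K")
```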

A stupendous number of calculations — enough to tax even the most powerful supercomputers — are needed to project the complex interaction between the sun’s energy and water vapor in the air, DeAngelis said.

“The theory behind sunlight absorption is sound, but the implementation of this theory in global climate models varies significantly,” DeAngelis said. Some models use scientifically outdated or oversimplified calculations, which contributes to the wide variance in projected precipitation, he said.

The UCLA researchers emphasized their confidence in global climate models’ ability to project other key aspects of the planet’s future climate, adding that although the models are not perfect, they are constantly being improved. This study is part of that process.

The research contributes to the goals of UCLA’s Sustainable L.A. Grand Challenge, a campuswide initiative to transition the Los Angeles region to 100 percent renewable energy, 100 percent local water and enhanced ecosystem health by 2050.

The implications of the findings at a local scale remain to be seen, Hall said. The research projects climate change will cause a less dramatic increase in precipitation from rain, snow, storms and other components of the hydrologic cycle, but that relates only to the global average of precipitation, Hall emphasized.

“It’s necessary to paint this bigger picture before we can zoom in,” Hall said. “We still don’t understand what this change in the hydrologic cycle means. We know it probably means more floods and more droughts, among other impacts. Even though our findings show the increased precipitation will be 40 percent smaller than currently projected, it’s a leap to say that we expect 40 percent fewer droughts and floods.”


Energy Storage Key To Quadrupling

Germany’s Solar Capacity

 

The German electricity system as it stands could cope with four times as much solar power being installed without any major difficulties – if energy storage is properly integrated.


According to a study by Agora Energiewende, a scenario of 150 gigawatts of solar capacity in combination with 40 gigawatts of battery storage presented no issues for the country’s electricity infrastructure.


“Scenarios of 150 or 200 gigawatts of photovoltaics in Germany, which until recently were considered by many to be completely unrealistic, are not only technically feasible but also economically viable,” says Dr. Patrick Graichen, director of Agora Energiewende.


Dr. Graichen recommends German energy policy should be further refined to accommodate a boom in solar + storage. He also questions the need for high-voltage lines on the current network development plan past 2025 if Germany’s energy storage revolution really kicks in as expected.


As it did with solar, Germany has also been an energy storage pioneer. The country initiated support for the purchase of battery storage integrated with home solar power systems in 2013. By December last year, more than 15,000 German households had home battery systems. It’s been estimated 100,000 residential energy storage units will be installed in 2018.


In the first seven months of  2015, German development bank KfW reportedly supported 35% more solar + storage projects than in the same period last year.


The Agora Energiewende study report (in German) can be viewed here.


According to the International Renewable Energy Agency’s (IRENA’s) Renewable Energy Capacity Statistics 2015 report, by the end of last year Germany had 38,236 MW of solar PV capacity installed.
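
As a quick back-of-the-envelope check of the headline, the 150 gigawatt scenario in the Agora study is roughly four times the installed capacity figure quoted above. The sketch below simply restates those two figures; nothing in it comes from the study itself.

    # Rough check of the "quadrupling" headline using the two figures quoted above.
    installed_gw = 38236 / 1000        # Germany's installed solar PV at the end of last year, in GW
    scenario_gw = 150                  # Agora Energiewende's solar capacity scenario
    print(scenario_gw / installed_gw)  # ~3.9, i.e. roughly a quadrupling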



New Global Wind Atlas


 

A publicly-available, highly detailed map of wind energy resources around the world has been launched by the International Renewable Energy Agency (IRENA) and the Technical University of Denmark (DTU).



The Global Wind Atlas provides wind resource data at one-kilometre resolution; offering far more detail than other maps available to the public that are 10-kilometre resolution at best. That low level of resolution has resulted in underestimation of wind resources in the past, increased risk and boosted costs for wind energy planners.


For example, information available until now indicates wind speeds over large areas, missing some of the elevated features that can boost wind resources, such as hills and ridges, and making the amount of wind energy that could be captured by turbines appear weaker than it actually is. The new maps factor in more of these features, providing wind speed data at three different heights.
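
One reason under-read wind speeds matter so much is that the power available to a turbine scales with the cube of wind speed. The sketch below is purely illustrative; the wind speeds and air density are assumed values, not data from the Atlas.

    # Wind power density scales with the cube of wind speed, so a map that
    # under-reads speed over a hill or ridge under-reads energy far more.
    rho = 1.225                                   # air density in kg/m^3 (assumed, sea level)
    def power_density(v_m_per_s):                 # watts per square metre of rotor area
        return 0.5 * rho * v_m_per_s ** 3
    coarse, fine = 6.0, 7.0                       # hypothetical 10 km vs 1 km map readings, m/s
    print(power_density(fine) / power_density(coarse))  # ~1.59: about 59% more energy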


“Wind energy potential across the globe is vast, but the upfront costs of measuring potential and determining the best locations for projects is an obstacle in many countries,” said IRENA Director-General Adnan Z. Amin. “The new Global Wind Atlas provides this needed data directly and for free, making it a ground-breaking tool to help jumpstart wind energy development worldwide.”


The Wind Atlas, funded by Denmark’s government, adds to the functionality of IRENA’s Global Atlas, which also provides tools mapping bio, hydro, geothermal, marine and solar energy resources.


The Global Wind Atlas can be accessed here.


IRENA’s membership is made up of 142 countries, including Australia, plus the European Union. IRENA promotes the widespread uptake and sustainable use of all forms of renewable energy.


By the end of last year, more than 268,000 wind turbines were in operation around the world. In 2012, researchers from the University of Delaware and Stanford University calculated the saturation wind power potential globally is greater than 250 terawatts, far more than the world’s energy demands.

Organic Solar Cells


 

A new all-polymer organic solar cell developed by a Korean research team is tougher and more flexible than its other organic counterparts.


Organic solar cells have been attracting interest for quite a while given their low cost, flexibility and potential for applications such as wearable devices, but their major drawbacks have been efficiency and durability.


The somewhat fragile nature of organic solar cells (OSCs) is due to the presence of fullerenes, hollow cage-like molecules of carbon. The other drawbacks of fullerenes are weak light absorption and un-optimized energy levels.


Researchers at the Korea Advanced Institute of Science and Technology (KAIST) found all-polymer solar cells (PSCs) based on a BDTTTPD polymer donor and the P(NDI2HD-T) polymer acceptor overcame these issues.


Their efforts have resulted in an organic solar cell that demonstrates a 60-fold increase in flexibility and a 470-fold improvement in strength compared with polymer/PCBM cells.


“The superior mechanical properties of all-polymer solar cells afford greater tolerance to severe deformations than conventional polymer-fullerene solar cells, making them much better candidates for applications in flexible and portable devices,” says a paper on the team’s work, recently published in the journal Nature Communications.


As for efficiency, the team’s cells currently achieve up to 6.64 per cent, with ‘great potential’ for further enhancement.


That conversion efficiency is some way off conventional solar cells, which offer triple or better sunlight conversion rates, but it’s certainly a step in the right direction.


“Our results provide guidelines for the design of new material systems for high-performance all-PSCs and demonstrate their potential for future applications in portable and wearable devices that require both high performances and mechanical stability,” says the team.


Meanwhile the hunt for the OSC holy grail goes on, with researchers around the world looking to evolve the technology.


In June, the University of Adelaide’s Chemistry Department announced it would be studying the molecular nature of conductive plastics to better understand how they absorb and convert light into energy, with a view to ‘tuning’ these materials to dramatically improve conversion efficiency.


Invisibility Cloak For Solar Panels


 

Scientists at Karlsruhe Institute of Technology (KIT) have come up with an interesting approach to boosting the conversion efficiency of solar panels.


Even the most efficient solar panels only convert a fraction of the sun’s energy into electricity. A number of issues impact on panel efficiency, including heat, shading and the general quality of the solar cells used.


Another factor is the cell contact fingers – the small lines on cells that carry the current generated. Up to ten percent of a solar module’s surface may be covered by these contacts, which are “optically dead” areas.


If the contacts could be made “invisible” using a cheap method easily integrated into existing solar cell manufacturing, it could provide a significant boost in efficiency; enabling valuable rooftop real estate to be better utilised and further reducing costs of solar power projects.


The researchers at KIT believe they have found a way to do this – perhaps a couple of ways.


The first involves coating the fingers in a polymer shaped in a way to refract sunlight onto an active part of the cell. The other also uses a polymer coating, but this method involves the surface of the cloak layer being grooved along the contact fingers, refracting light away from them and onto the active surface area of the solar cell.


“The second concept is particularly promising, as it can potentially be integrated into mass production of solar cells at low costs,” according to the team.


Computer models indicate both methods would be successful.


“When applying such a coating onto a real solar cell, optical losses via the contact fingers are supposed to be reduced and efficiency is assumed to be increased by up to 10%,” said doctoral student Martin Schumann, who carried out the experiments and simulations.
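
The “up to 10%” figure follows from simple shading arithmetic: if up to a tenth of the cell is optically dead and a cloak redirects essentially all of that light onto active area, relative output rises by roughly the same amount. The numbers below are assumptions used only to illustrate that arithmetic, not KIT data.

    # Back-of-the-envelope shading arithmetic (assumed values, not KIT data).
    shaded_fraction = 0.10      # up to 10% of the module covered by contact fingers
    recovered = 1.0             # fraction of that light an ideal cloak redirects
    baseline = 1 - shaded_fraction
    cloaked = 1 - shaded_fraction * (1 - recovered)
    print(cloaked / baseline - 1)   # ~0.11, i.e. roughly a 10% relative gain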


The technology could also be used with other optoelectronic devices where efficient coupling to light with minimal losses is highly desirable, such as in light-emitting diodes or optical detectors.


For a more uber-geeky explanation of the concepts, read the team’s paper, “Cloaked contact grids on solar cells by coordinate transformations: designs and prototypes”, which can be accessed here.


The Karlsruhe Institute of Technology was established by the merger of the Forschungszentrum Karlsruhe GmbH and the Universität Karlsruhe (TH) in 2009.


Kirigami-Inspired Sun Tracking Solar Cells


 

University of Michigan researchers have successfully combined thin-film solar technology with the Japanese paper-cutting art form Kirigami to produce a flat solar cell capable of splitting into segments and tracking the sun throughout the day.


Solar tracking systems are usually reserved for large-scale, grid-connected solar PV farms covering acres of land. Although solar cells capture up to 40 per cent more energy when they can track the sun, installing the same bulky and expensive technology for rooftop solar power just isn’t feasible, so the U-M team had to build the sun-tracking into the panel.
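
A toy model shows where that tracking gain comes from: a fixed, flat panel only collects the component of direct sunlight perpendicular to its surface, while a tracker keeps facing the sun. The sketch below assumes a clear-sky, direct-beam-only day at the equator on the equinox, so it overstates the real-world benefit; the up-to-40-per-cent figure above already accounts for diffuse light and other losses.

    # Idealised fixed-vs-tracking comparison; assumptions as stated above.
    import math
    hours = [h + 0.5 for h in range(6, 18)]             # daylight hours, 06:00-18:00
    def sun_angle_from_vertical(h):
        return math.radians(abs(12 - h) * 15)           # sun moves ~15 degrees per hour
    fixed = sum(max(math.cos(sun_angle_from_vertical(h)), 0) for h in hours)
    tracked = len(hours)                                # tracker always faces the sun head-on
    print(tracked / fixed - 1)  # ~0.57 extra in this idealised case; real gains are lower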


“The beauty of our design is, from the standpoint of the person who’s putting this panel up, nothing would really change,” said Max Shtein, an associate professor of materials science and engineering at U-M. “But inside, it would be doing something remarkable on a tiny scale: the solar cell would split into tiny segments that would follow the position of the sun in unison.”


The work, “Dynamic Kirigami structures for integrated solar tracking”, is published in the journal Nature Communications.


To fabricate the cells, researchers used flexible thin-film gallium arsenide strips stuck onto a highly malleable space-grade plastic called Kapton. Shtein then brought in fellow U-M lecturer and Kirigami expert Matthew Shlian to design and cut patterns into the material with a carbon dioxide laser.


They found a simple pattern of lattice cuts produced the best results. When the solar cell was stretched, the rows of segments first spread apart, then rotated in a precisely measurable proportion. The design also provided the most sun-tracking tilt without losing width as the pattern was stretched.


While the U-M lab does not have the capacity to manufacture a realistic prototype of the Kirigami cell, simulations carried out using data from the Arizona summer solstice showed the optimised design achieved a 36 per cent improvement over a standard flat solar panel.


“We think it has significant potential, and we’re actively pursuing realistic applications,” said Shtein. “It could ultimately reduce the cost of solar electricity.” The University has now applied for a patent and is seeking private investment to bring the technology to market.


Making the New Silicon



Gallium nitride electronics could drastically cut energy usage in data centers, consumer devices.


An exotic material called gallium nitride (GaN) is poised to become the next semiconductor for power electronics, enabling much higher efficiency than silicon.


In 2013, the Department of Energy (DOE) dedicated approximately half of a $140 million research institute for power electronics to GaN research, citing its potential to reduce worldwide energy consumption. Now MIT spinout Cambridge Electronics Inc. (CEI) has announced a line of GaN transistors and power electronic circuits that promise to cut energy usage in data centers, electric cars, and consumer devices by 10 to 20 percent worldwide by 2025.


Power electronics is a ubiquitous technology used to convert electricity to higher or lower voltages and different currents — such as in a laptop’s power adapter, or in electric substations that convert voltages and distribute electricity to consumers. Many of these power-electronics systems rely on silicon transistors that switch on and off to regulate voltage but, due to speed and resistance constraints, waste energy as heat.


CEI’s GaN transistors have at least one-tenth the resistance of such silicon-based transistors, according to the company. This allows for much higher energy-efficiency, and orders-of-magnitude faster switching frequency — meaning power-electronics systems with these components can be made much smaller. CEI is using its transistors to enable power electronics that will make data centers less energy-intensive, electric cars cheaper and more powerful, and laptop power adapters one-third the size — or even small enough to fit inside the computer itself.
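
To see why lower on-resistance matters, conduction loss in a switching transistor goes roughly as the square of the current times the on-resistance, so cutting resistance tenfold cuts that loss term tenfold. The figures below are illustrative assumptions, not CEI device data.

    # Illustrative conduction-loss arithmetic (assumed values, not CEI data).
    current_a = 10.0                 # load current in amperes (assumed)
    r_on_silicon = 0.10              # hypothetical silicon on-resistance, ohms
    r_on_gan = r_on_silicon / 10     # "at least one-tenth the resistance"
    def conduction_loss(r_on):
        return current_a ** 2 * r_on # P = I^2 * R while the transistor conducts
    print(conduction_loss(r_on_silicon), conduction_loss(r_on_gan))  # 10.0 W vs 1.0 W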


This is a once-in-a-lifetime opportunity to change electronics and to really make an impact on how energy is used in the world,” says CEI co-founder Tomás Palacios, an MIT associate professor of electrical engineering and computer science who co-invented the technology.


Making GaN feasible


While GaN transistors have several benefits over silicon, safety drawbacks and expensive manufacturing methods have largely kept them off the market. But Palacios, Lu, Saadat, and other MIT researchers managed to overcome these issues through design innovations made in the late 2000s.


Power transistors are designed to flow high currents when on, and to block high voltages when off. Should the circuit break or fail, the transistors must default to the “off” position to cut the current to avoid short circuits and other issues — an important feature of silicon power transistors.


But GaN transistors are typically “normally on” — meaning, by default, they’ll always allow a flow of current, which has historically been difficult to correct. Using resources in MIT’s Microsystems Technology Laboratory, the researchers — supported by Department of Defense and DOE grants — developed GaN transistors that were “normally off” by modifying the structure of the material.


To make traditional GaN transistors, scientists grow a thin layer of GaN on top of a substrate. The MIT researchers layered different materials with disparate compositions in their GaN transistors. Finding the precise mix allowed them to create a new kind of GaN transistor that goes to the off position by default.


“We always talk about GaN as gallium and nitrogen, but you can modify the basic GaN material, add impurities and other elements, to change its properties,” Palacios says.


But GaN and other nonsilicon semiconductors are also manufactured in special processes, which are expensive. To drop costs, the MIT researchers — at the Institute and, later, with the company — developed new fabrication technologies, or “process recipes,” Lu says. This involved, among other things, switching out gold metals used in manufacturing GaN devices for metals that were compatible with silicon fabrication, and developing ways to deposit GaN on large wafers used by silicon foundries.


“Basically, we are fabricating our advanced GaN transistors and circuits in conventional silicon foundries, at the cost of silicon. The cost is the same, but the performance of the new devices is 100 times better,” Lu says.


Major applications


CEI is currently using its advanced transistors to develop laptop power adaptors that are approximately 1.5 cubic inches in volume — the smallest ever made.


Among the other feasible applications for the transistors, Palacios says, is better power electronics for data centers run by Google, Amazon, Facebook, and other companies, to power the cloud.

Currently, these data centers eat up about 2 percent of electricity in the United States. But GaN-based power electronics, Palacios says, could save a very significant fraction of that.


Another major future application, Palacios adds, will be replacing the silicon-based power electronics in electric cars. These are in the chargers that charge the battery, and the inverters that convert the battery power to drive the electric motors. The silicon transistors used today have a constrained power capability that limits how much power the car can handle. This is one of the main reasons why there are few large electric vehicles.


GaN-based power electronics, on the other hand, could boost power output for electric cars, while making them more energy-efficient and lighter — and, therefore, cheaper and capable of driving longer distances. “Electric vehicles are popular, but still a niche product. GaN power electronics will be key to make them mainstream,” Palacios says.


Innovative ideas


In launching CEI, the MIT founders turned to the Institute’s entrepreneurial programs, which contributed to the startup’s progress. “MIT's innovation and entrepreneurial ecosystem has been key to get things moving and to the point where we are now,” Palacios says.


Palacios first earned a grant from the Deshpande Center for Technological Innovation to launch CEI. Afterward, he took his idea for GaN-based power electronics to Innovation Teams (i-Teams), which brings together MIT students from across disciplines to evaluate the commercial feasibility of new technologies. That program, he says, showed him the huge market pull for GaN power electronics, and helped CEI settle on its first products.


“Many times, it’s the other way around: You come out with an amazing technology looking for an application. In this case, thanks to i-Teams, we found there were many applications looking for this technology,” Palacios says.


For Lu, a key element for growing CEI was auditing Start6, a workshop hosted by the Department of Electrical Engineering and Computer Science, where entrepreneurial engineering students are guided through the startup process with group discussions and talks from seasoned entrepreneurs. Among other things, Lu gained perspective on dividing equity, funding, building a team, and other early startup challenges.


“It’s a great class for a student who has an idea, but doesn’t know exactly what’s going on in business,” Lu says. “It’s kind of an overview of what the process is going to be like, so when you start your own company you are ready.”





Facebook's Solar Powered Eagle



Meet Aquila (the Eagle), Facebook’s solar powered answer to providing Internet connectivity to underserved communities and areas where access to the web isn’t available.


Created by the social network’s Connectivity Lab, Aquila has the wingspan of a Boeing 737, but weighs less than a car. The monocoque wing is constructed from a cured carbon fiber that has three times the strength of steel, but is lighter than aluminium.




The craft will cruise at an altitude of around 90,000 feet during the day while its solar cells recharge Aquila’s batteries. At night, Aquila will descend to around 60,000 feet, drawing on the gravitational potential energy gained during the day so it consumes less power. In both instances, the craft will be well above commercial air traffic and adverse weather.
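
A rough estimate gives a feel for how much energy that day-to-night altitude difference stores. Facebook only says the craft weighs less than a car, so the 400 kg mass below is an assumption for illustration.

    # Energy stored in the 30,000 ft altitude difference (mass is assumed).
    mass_kg = 400.0                          # assumed; "weighs less than a car"
    g = 9.81                                 # gravitational acceleration, m/s^2
    drop_m = (90000 - 60000) * 0.3048        # 30,000 ft expressed in metres
    energy_joules = mass_kg * g * drop_m
    print(energy_joules / 3.6e6)             # ~10 kWh of gravitational potential energy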


One of the cutting edge technologies to be used with Aquila is free space laser communications, which will be used for transmitting data between a group of these aircraft. The system can transmit data at tens of billions of bits per second (Gbps). Facebook says the rate is around that of a fiber-optic network, only in this case it’s happening through the air.


A ground station will transmit a radio Internet signal to a “mother” aircraft, and that aircraft will then repeat the signal to others within the constellation using the laser technology. Those aircraft will beam radio Internet signals back to the ground. Receiver towers and dishes will convert the signals into a Wi-Fi or LTE signal; providing Internet access to people in the surrounding area.

The system that guides the lasers has to be incredibly accurate. The project’s Engineering Lead of Aviation Laser Communications, Chien Chen, said it had to be able to “hit a dime” (~18.5 millimeter diameter) from more than 6 kilometres away.
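
Those figures translate into an angular pointing requirement of only a few microradians, as a quick small-angle calculation on the quoted dime diameter and distance shows.

    # Pointing accuracy implied by "hit a dime from more than 6 kilometres away".
    import math
    target_m = 0.0185                        # ~18.5 mm dime diameter
    distance_m = 6000                        # quoted distance in metres
    angle_rad = target_m / distance_m        # small-angle approximation
    print(angle_rad * 1e6)                   # ~3.1 microradians
    print(math.degrees(angle_rad) * 3600)    # ~0.6 arcseconds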


Each aircraft within the constellation can create a 50-km communications radius for up to 90 days; powered purely by the sun.


Facebook says test flights for the full-scale prototype should occur later this year.


“We’re proud of the progress we’ve made so far,” says Facebook’s Yael Maguire. “In 14 short months, we’ve designed and built an aircraft from start to finish and made great strides in developing the technology required to distribute high-capacity data streams through the air.”


Aquila is part of the Internet.org project, which has a goal of making affordable access to basic internet services available to every person in the world. Other Internet.org partners include Ericsson, Nokia, Qualcomm and Samsung. Internet.org is already providing free basic Internet services to people in 17 countries.




The Solar Powered Anti-Poaching Drone



A 3D printed solar powered drone may play an important role in the battle against poaching.


Project Icarus is a solar UAV (Unmanned Aerial Vehicle) pushing the limits of endurance for airframes under 5kg. Icarus 3.0 in its full form will be solar powered and capable of incorporating other power systems, including hydrogen fuel.


An entry in the 2015 Hackaday Prize competition, its creator Toby Lankford has a goal of seeing the plane in service in wildlife refuges around the world as a weapon in the anti-poaching arsenal.


“I wanted to focus this year on promoting drones as a tool for social and environmental good,” he says. “Poaching is an issue that presented itself as the most urgent.”


While the penalties for poaching in some countries are massive, so are the spoils. A kilogram of rhino horn is worth more than a kilogram of gold or cocaine.


Poaching can be a deadly business for all involved. National Parks and Wildlife rangers in Australia tend to be rather non-threatening folks; but their counterparts in some regions of Africa are more akin to soldiers and armed with powerful rifles. Poachers and rangers are regularly killed in gun battles.


The rangers are fighting a losing battle on some fronts. Fewer than 5000 black rhinos exist in the wild. As their numbers dwindle, this further boosts the value of horns and in turn, the threat of black rhino extinction.




Icarus 3.0 could prove to be very effective in monitoring remaining rhino herds and alerting rangers to any suspicious activity; allowing them to better prepare for engagement and perhaps avoid loss of life – both animals and humans.


Assisted by solar cells covering its wings, Icarus 3.0 will have a range of more than 200 km and flight time exceeding 180 minutes.  The UAV will be able to carry a payload of 6kg. The craft features custom image recognition software, “Fire and Forget” autonomous mission from launch to recovery and TCP/IP Cloud Control for observation or control from anywhere in the world.
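
Taken together, the quoted range and endurance imply a fairly modest cruise speed, which suits a lightweight solar airframe loitering over a reserve. The arithmetic below simply treats those figures as round numbers.

    # Average speed implied by the range and flight-time figures above.
    range_km = 200
    flight_time_hours = 180 / 60
    print(range_km / flight_time_hours)   # ~67 km/h average speed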


“It (poaching) is a problem seeking an immediate solution,” says Mr. Lankford. “We hope that we can be part of that answer with our anti-poaching cloud swarm UAV system.”


The 3D-printed Icarus 3.0 will also be released as an open-source design on Thingiverse, available free of charge to hobbyists.


“While we produce UAVs commercially, we wanted to give back to the community as well,” says Mr. Lankford.


The 2015 Hackaday Prize competition encourages entrants to “solve a problem that matters”. The first prize is a choice of a trip into space or nearly USD $200,000 in cash.




$100 A Kilowatt Hour Lithium Battery



Cambridge, Massachusetts company 24M has declared its semi-solid lithium battery technology is the most significant change to Li-ion in two decades.


The company says its battery cell design combined with manufacturing innovations will slash today’s lithium-ion costs by half while providing enhanced battery performance.


“The lithium-ion battery is a brilliant, enabling technology, but its economics are flawed. It’s prohibitively expensive; it’s cumbersome and inefficient to make; and today’s version is approaching the limits of its cost reductions,” said Dr. Yet-Ming Chiang, 24M’s Chief Scientist.


“24M has fixed the flaws. We’ve made the world’s favorite battery better, fundamentally changing its cost curve by designing a more elegant and simpler cell and then making the batteries the right way – the way they should have been made from day one.”


So what’s in the secret sauce of this battery also claimed to have unprecedented safety and abuse tolerance?


24M says lithium-ion battery cells in use today have significant inactive, non-charge carrying materials such as metals and plastics. 24M’s semisolid thick electrode eliminates more than 80% of these inactive materials and increases the active layer thickness compared to traditional lithium-ion by up to 5x. As well as reducing costs, the thicker electrodes mean more energy storage within the same amount of space.


Another improvement is the way the batteries are manufactured. Usually the production process for lithium-ion batteries is expensive and takes days. 24M says because its technology doesn’t require binding, drying, solvent recovery or calendering, cell creation takes just 20% of the time needed to produce a conventional battery.


As no organic solvents are used, 24M is offering a greener battery; one that is 100% recyclable.


The company says a manufacturing plant using the technology costs about 10% of a conventional plant to establish and requires one-fifth the space.


Throop Wilder, 24M’s CEO, says 24M battery costs will be less than USD $100 a kilowatt-hour (kWh) by 2020.


Before solar households get too excited, that’s just the cost of the battery – there’s a lot more that goes into a home battery storage system. Still, given developments such as this and Tesla continuing to push at the cost boundaries with Powerwall in the months ahead, affordable home energy storage isn’t far off.
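
To put the target in household terms, here is what $100 per kilowatt-hour would mean for the cells alone in a home-sized battery. The 7 kWh capacity is just an assumed, Powerwall-class example, and, as noted above, inverters, enclosures and installation add substantially to the final price.

    # Cell cost at the $100/kWh target for an assumed home-sized battery.
    target_usd_per_kwh = 100
    home_battery_kwh = 7               # assumed capacity for illustration
    print(target_usd_per_kwh * home_battery_kwh)   # ~$700 of cells, before system costs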

For those still considering going solar, it’s a very good time to do so, but perhaps with future storage in mind. A battery-upgradeable system may be a wise choice – save today and store tomorrow.

24M’s cells are currently undergoing customer trials.




Solar Powered Internet Of Things



A new highly-efficient microchip capable of converting 80 per cent of the energy it receives from a tiny solar cell could have dramatic effects on the future of interconnected wireless technology.


The so-called Internet of Things (IoT) will be powered by tiny embedded sensors streaming information from machinery, automobiles and household appliances to networked servers.


It’s estimated that by 2020, up to 50 billion objects and devices worldwide will be connected to the IoT, but that vision requires ultra-low power sensors that can run for months at a time without changing batteries.


Solar power is a perfect solution to the problem, but current power-conversion chips only convert 40-50 per cent of the energy, and can only charge a battery or power the sensor.


A recently unveiled chip from engineers at the Massachusetts Institute of Technology does both. Measuring just three-by-three millimetres, the chip’s single-inductor circuit determines whether to power the device, charge the battery, or both, and regulates an array of switches that control current flow to maintain optimal efficiency.


“We still want to have battery-charging capability, and we still want to provide a regulated output voltage,” says Dina Reda El-Damak, MIT graduate student in electrical engineering and computer science.


“And we want to do it without compromising the performance, at very limited input power levels — 10 nanowatts to 1 microwatt — for the Internet of Things.”


The primary challenge facing the researchers was to maintain an even voltage into the chip. Too much or too little flow would degrade the battery, shortening its life. This was particularly vital with a variable energy source such as a solar cell. They used a single wire-coil inductor, which creates a magnetic field when current passes through it, resisting changes in current.


A switch array in the path of the inductor ensures the current flow drops to zero if it gets too high, and a set of electrical timing components called capacitors open the gate to more current as required, improving the efficiency of the chip. As the current drops, it charges a subset of those capacitors, whose selection is determined by the solar cell’s voltage.
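
In very rough terms, the decision the chip’s control circuitry makes can be pictured as follows. This is an illustrative sketch of the power-routing choice described above, not MIT’s actual implementation; the function name and wattage values are invented for the example.

    # Much-simplified power-routing decision (illustrative only, not the MIT chip's logic).
    def route_power(harvested_w, load_w):
        if harvested_w >= load_w:
            # Enough sunlight: run the sensor and put the surplus into the battery.
            return {"to_load": load_w, "to_battery": harvested_w - load_w, "from_battery": 0.0}
        # Not enough sunlight: the battery makes up the shortfall.
        return {"to_load": harvested_w, "to_battery": 0.0, "from_battery": load_w - harvested_w}

    print(route_power(1e-6, 0.4e-6))    # bright light: power the sensor and charge
    print(route_power(50e-9, 0.4e-6))   # dim light: draw the difference from the battery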


“In this technology space, there’s usually a trend to lower efficiency as the power gets lower, because there’s a fixed amount of energy that’s consumed by doing the work,” says Brett Miwa, who leads a power conversion development project as a fellow at the chip manufacturer Maxim Integrated.


“If you’re only coming in with a small amount, it’s hard to get most of it out, because you lose more as a percentage. [El-Damak’s] design is unusually efficient for how low a power level she’s at.”



Earth 'entering new extinction phase' - US study



The Earth has entered a new period of extinction, a study by three US universities has concluded, and humans could be among the first casualties.

The report, led by the universities of Stanford, Princeton and Berkeley, said vertebrates were disappearing at a rate 114 times faster than normal.


The findings echo those in a report published by Duke University last year.

One of the new study's authors said: "We are now entering the sixth great mass extinction event."


The last such event was 65 million years ago, when dinosaurs were wiped out, in all likelihood by a large meteor hitting Earth.


"If it is allowed to continue, life would take many millions of years to recover and our species itself would likely disappear early on," said the lead author, Gerardo Ceballos.


The scientists looked at historic rates of extinction for vertebrates - animals with backbones - by assessing fossil records.


They found that the current extinction rate was more than 100 times higher than in periods when Earth was not going through a mass extinction event.

Since 1900, the report says, more than 400 more vertebrates have disappeared.


Such a loss would normally be seen over a period of up to 10,000 years, the scientists say.
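
As a rough sanity check only (the study’s 114-times figure is derived from fossil-record background rates, not from this simple ratio), losing in roughly a century what would normally take up to 10,000 years implies a rate on the order of 100 times the background.

    # Order-of-magnitude check; not how the study itself computes the 114x figure.
    normal_span_years = 10000          # time such losses would normally take
    observed_span_years = 2015 - 1900  # roughly the period since 1900
    print(normal_span_years / observed_span_years)   # ~87, the same order as 114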


The study - published in the Science Advances journal - cites causes such as climate change, pollution and deforestation.


Given the knock-on effect of ecosystems being destroyed, the report says benefits such as pollination by bees could be lost within three human generations.


Stanford University professor Paul Ehrlich said: "There are examples of species all over the world that are essentially the walking dead.

"We are sawing off the limb that we are sitting on."


The International Union for Conservation of Nature (IUCN) says at least 50 animals move closer to extinction every year.


Around 41% of all amphibians and 25% of mammals are threatened with extinction, it says.


According to the IUCN, the lemur faces a real struggle to avoid extinction in the wild in the coming years.


The group says that 94% of all lemurs are under threat, with more than a fifth of all lemur species classed as "critically endangered".


As well as seeing their habitat in Madagascar destroyed by illegal logging, lemurs are also regularly hunted for their meat, the IUCN says.


Last year, a report by Stuart Pimm, a biologist and extinction expert at Duke University in North Carolina, also warned mankind was entering a sixth mass extinction event.


But Mr Pimm's report said the current rate of extinction was more than 1,000 times faster than in the past, not 114, as the new report claims.


The new report's authors said it was still possible to avoid a "dramatic decay of biodiversity" through intensive conservation, but that rapid action was needed.



Energy Storage

Flexible Wood-Based Aerogel Batteries



Researchers at KTH Royal Institute of Technology in Sweden and Stanford University have developed a flexible, compressible foam-like battery material made from wood pulp.


The elastic high-capacity batteries are based on nanocellulose broken down from tree fibres, around one million times thinner than the original fibre. The nanocellulose is treated so the material does not collapse and is built layer-by-layer (LbL). This results in a material called aerogel, also sometimes referred to as “frozen smoke” due to its light weight and ghost-like nature.


Any visual resemblance to frozen smoke is lost when a conductive ink is then used to coat the entire surface – inside and out. The full device is a seven-layer structure, including the aerogel substrate wall.


“The result is a material that is both strong, light and soft,” according to Max Hamedi,  a researcher at KTH and Harvard University. “The material resembles foam in a mattress, though it is a little harder, lighter and more porous.”




The massive internal surface area is where the energy storage potential lies. While there are limits to how thin a traditional battery can be, this becomes less relevant in a 3D structure, and aerogels have among the highest specific surface areas of any man-made material. Mr. Hamedi says the internal surface area of a single cubic decimetre of the aerogel material would cover most of a soccer pitch.


Test batteries have been reversibly bent to 90° or compressed by up to 75% without any observable structural damage. The devices showed stable cycling behaviour at a 60 C rate for 400 cycles, and 75% of the initial capacitance was maintained when the charging rate was increased to 160 C (22 s).
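
The “(22 s)” figure follows directly from how C-rates are defined: a 1 C rate charges or discharges the cell’s full capacity in one hour, so an n C rate takes 3600/n seconds.

    # Converting a C-rate into a full charge/discharge time.
    def seconds_at_c_rate(n):
        return 3600 / n                       # 1 C = one hour; n C = 1/n of an hour
    print(seconds_at_c_rate(60), seconds_at_c_rate(160))   # 60.0 s and 22.5 s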


With further development, one of the possible applications for such a material is in the bodies and seats of electric cars; or perhaps even insulation in homes – making the house itself a giant battery; storing electricity produced by solar panels on its rooftop.


A paper on the battery technology has been published in the latest issue of Nature Communications.


“We have shown that fully interdigitated 3D supercapacitors and batteries can be self-assembled inside high-surface area aerogels using a rapid and scalable methodology. The obtained devices show stable operation without short-circuiting, are bendable, compressible and can be made with arbitrary form factors,” state the paper’s authors.


“These results are very promising and show that this LbL-based methodology can produce fully interdigitated 3D devices with a complex structure containing a variety of materials.”


GE Creates Digital ‘Twin’ Wind Farm



GE Renewable Energy has released details of a new digital wind farm concept that it says would make wind power production 20 per cent more efficient and create $100 million in extra value over the lifetime of a 100 megawatt facility.
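
For a sense of scale, a rough reconstruction of those headline numbers is sketched below. The capacity factor, electricity price and farm lifetime are all assumptions chosen for illustration, not GE figures, and they land in the same order of magnitude as the quoted $100 million.

    # Rough reconstruction of the headline value (all inputs are assumptions, not GE data).
    capacity_mw = 100
    capacity_factor = 0.35            # assumed typical onshore value
    hours_per_year = 8760
    uplift = 0.20                     # "20 per cent more efficient"
    price_usd_per_mwh = 50            # assumed wholesale electricity price
    lifetime_years = 25               # assumed farm lifetime
    extra_mwh_per_year = capacity_mw * capacity_factor * hours_per_year * uplift
    print(extra_mwh_per_year * price_usd_per_mwh * lifetime_years)   # ~$77 million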


The concept uses software to pair physical wind turbines located on real-world wind farms with a “digital twin” located on a virtual wind farm inside a computer. All data is stored and processed on GE’s industrial internet cloud platform, Predix, giving engineers the power to customise the performance of individual turbines.


“Every wind farm has a unique profile, like DNA or a fingerprint,” says Keith Longtin, general manager for wind products at GE Renewable Energy. “We thought if we could capture data from the machines about how they interact with the landscape and the wind, we could build a digital twin for each wind farm [and] use it to design the most efficient turbine for each pad on the farm, and then keep optimizing the whole thing.”


The system enables engineers to model about 20 different turbine configurations – varying blade diameter, pole height and turbine output – and design the ideal wind turbine for a given site. According to GE, the software can optimise performance for any type of wind turbine, regardless of size or manufacturer.


Once a wind farm is established, sensors in each turbine begin sending a flood of data back to its twin in the cloud, including wind speed, nacelle yaw and generator torque. The software crunches the information and sends back real-time changes that boost the overall efficiency of the entire wind farm.


“This is a real-time analytical engine using deep data science and machine learning,” says Ganesh Bell, chief digital officer at GE. “There is a lot of physics built into it. We get a picture that feels real, just like driving a car in a new video game. We can do things because we understand the physics – we build turbines – but also because we write software.”


The data generated by the digital wind farm can even be used to control noise, by changing the rotor speed depending on the wind direction to ensure it stays below any noise thresholds.




Moth Eyes Inspire More

Efficient Solar Cell



The nanoscale structures of a moth’s eye and lotus leaves have inspired scientists from the USA’s Oak Ridge National Laboratory (ORNL) to create a new water-repelling, anti-reflective glass coating that could increase the efficiency of solar panels by up to six per cent.


“While lotus leaves repel water and self-clean when it rains, a moth’s eyes are antireflective because of naturally covered tapered nanostructures where the refractive index gradually increases as light travels to the moth’s cornea,” said Tolga Aytug, member of ORNL’s Materials Chemistry Group.


“Combined, these features provide truly game-changing ability to design coatings for specific properties and performance.”


The researchers have developed a process for manufacturing a highly robust nanostructured base material that takes advantage of the unique hydrophobic nature of the lotus leaf, whereby water literally bounces off the surface, taking dirt and dust with it.


“We developed a method that starts with depositing a thin layer of glass material on a glass surface followed by thermal processing and selective material removal by etching,” Aytug said. “This produces a surface consisting of a porous three-dimensional network of high-silica content glass that resembles microscopic coral.”


The material is produced using inexpensive industry-standard techniques and can be easily scaled up for a variety of uses, such as algae-resistant marine glass, the group said. It is also super tough, capable of withstanding high temperatures.


“This quality differentiates it from traditional polymeric and powder-based counterparts, which are generally mechanically fragile,” Aytug said. “We have shown that our nanostructure glass coatings exhibit superior mechanical resistance to impact abrasion – like sand storms – and are thermally stable to temperatures approaching 500 degrees Celsius.”


The nanoporous nature of the coating suppresses “Fresnel” light reflection – the amount of light reflected versus the amount of light transmitted – allowing more wavelengths of light at wider angles to flow through glass surfaces. It also effectively blocks ultraviolet light, which can degrade sensitive optical technology.
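
For reference, the Fresnel loss being suppressed is easy to quantify at normal incidence: an abrupt interface between media of refractive indices n1 and n2 reflects R = ((n1 - n2)/(n1 + n2))^2 of the light, about four per cent per surface for bare glass. The second value below simply illustrates how a more air-like, graded layer shrinks that mismatch; the 1.15 index is an arbitrary example, not an ORNL figure.

    # Normal-incidence Fresnel reflectance at an abrupt interface.
    def fresnel_reflectance(n1, n2):
        return ((n1 - n2) / (n1 + n2)) ** 2
    print(fresnel_reflectance(1.0, 1.5))    # ~0.04: bare glass loses about 4% per surface
    print(fresnel_reflectance(1.0, 1.15))   # ~0.005: a more air-like layer reflects far less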


This enhanced light-absorbing characteristic could increase the light-to-electricity conversion efficiency of solar cells by three to six per cent, according to the ORNL team. And the superhydrophobic glass coating would remove the need to clean solar panels, lowering the overall cost of rooftop solar power.


The material has many potential applications beyond solar energy, including goggles, periscopes, optical instruments, photodetectors and sensors.


The ORNL team’s work, titled “Monolithic Graded-Refractive-Index Glass-based Antireflective Coatings: Broadband/Omnidirectional Light Harvesting and Self-Cleaning Characteristics,” is published in the Journal of Materials Chemistry C.



NASA Study Shows Antarctica’s Larsen B Ice Shelf Nearing Its Final Act



A new NASA study finds the last remaining section of Antarctica's Larsen B Ice Shelf, which partially collapsed in 2002, is quickly weakening and likely to disintegrate completely before the end of the decade.


A team led by Ala Khazendar of NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California, found the remnant of the Larsen B Ice Shelf is flowing faster, becoming increasingly fragmented and developing large cracks. Two of its tributary glaciers also are flowing faster and thinning rapidly.


"These are warning signs that the remnant is disintegrating," Khazendar said. "Although it’s fascinating scientifically to have a front-row seat to watch the ice shelf becoming unstable and breaking up, it’s bad news for our planet. This ice shelf has existed for at least 10,000 years, and soon it will be gone."


Ice shelves are the gatekeepers for glaciers flowing from Antarctica toward the ocean. Without them, glacial ice enters the ocean faster and accelerates the pace of global sea level rise. This study, the first to look comprehensively at the health of the Larsen B remnant and the glaciers that flow into it, has been published online in the journal Earth and Planetary Science Letters.


Khazendar's team used data on ice surface elevations and bedrock depths from instrumented aircraft participating in NASA's Operation IceBridge, a multiyear airborne survey campaign that provides unprecedented documentation annually of Antarctica's glaciers, ice shelves and ice sheets. Data on flow speeds came from spaceborne synthetic aperture radars operating since 1997.




Khazendar noted his estimate of the remnant's remaining life span was based on the likely scenario that a huge, widening rift that has formed near the ice shelf's grounding line will eventually crack all the way across. The free-floating remnant will shatter into hundreds of icebergs that will drift away, and the glaciers will rev up for their unhindered move to the sea.


Located on the coast of the Antarctic Peninsula, the Larsen B remnant is about 625 square miles (1,600 square kilometers) in area and about 1,640 feet (500 meters) thick at its thickest point. Its three major tributary glaciers are fed by their own tributaries farther inland.


"What is really surprising about Larsen B is how quickly the changes are taking place," Khazendar said. "Change has been relentless."


The remnant's main tributary glaciers are named Leppard, Flask and Starbuck -- the latter two after characters in the novel Moby Dick. The glaciers' thicknesses and flow speeds changed only slightly in the first couple of years following the 2002 collapse, leading researchers to assume they remained stable. The new study revealed, however, that Leppard and Flask glaciers have thinned by 65-72 feet (20-22 meters) and accelerated considerably in the intervening years. The fastest-moving part of Flask Glacier had accelerated 36 percent by 2012 to a flow speed of 2,300 feet (700 meters) a year -- comparable to a car accelerating from 55 to 75 mph.


Flask's acceleration, while the remnant has been weakening, may be just a preview of what will happen when the remnant breaks up completely. After the 2002 Larsen B collapse, the glaciers behind the collapsed part of the shelf accelerated as much as eightfold – comparable to a car accelerating from 55 to 440 mph.


The third and smallest glacier, Starbuck, has changed little. Starbuck's channel is narrow compared with those of the other glaciers, and strongly anchored to the bedrock, which, according to authors of the study, explains its comparative stability.


"This study of the Antarctic Peninsula glaciers provides insights about how ice shelves farther south, which hold much more land ice, will react to a warming climate," said JPL glaciologist Eric Rignot, a coauthor of the paper.


The research team included scientists from JPL, the University of California, Irvine, and the University Centre in Svalbard, Norway. The paper is online at:


http://go.nasa.gov/1bbpfsC


NASA uses the vantage point of space to increase our understanding of our home planet, improve lives and safeguard our future. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.


For more information about NASA’s Earth science activities, visit:


http://www.nasa.gov/earth



Solar Implant Could Restore Sight To The Blind



Stanford University researchers have developed a retinal implant that harnesses solar power to improve vision for patients suffering from degenerative eye diseases.


In a paper published in Nature Medicine, the scientists behind the work say the breakthrough could restore functional sight to those afflicted with diseases such as retinitis pigmentosa or macular degeneration – ailments that destroy photoreceptors in the eye, blocking light signals to the brain.


The tiny silicon implant, composed of hexagonal photovoltaic pixels, converts light transmitted from special glasses worn by the patient into an electric current, which in turn stimulates retinal neurons known as bipolar cells.




Bipolar cells process light from photoreceptors and send the signals to the brain. By stimulating these cells, the implant bypasses damaged photoreceptors and produces functional sight five times better than existing devices according to the researchers.


Successful testing of the implant has been carried out on rats, and a clinical trial with patients blinded by retinitis pigmentosa is planned for next year in collaboration with French company Pixium Vision.


“The performance we’re observing at the moment is very encouraging,” said Georges Goetz, a lead author of the paper and graduate student in electrical engineering at Stanford. “Based on our current results, we hope that human recipients of this implant will be able to recognize objects and move about.”


According to Daniel Palanker, PhD, professor of ophthalmology and a senior author of the paper, the biggest advantage of the photovoltaic implant is its wireless capability. This eliminates wiring to extraocular devices, a procedure requiring invasive surgery and with generally poor visual acuity results of 20/1,200.


Vision tests of the Stanford device in rats have shown it restores visual acuity to an equivalent of 20/250, and the team plans to further improve these results by developing smaller PV pixels and targeting electrodes to specific cell layers.
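
Those acuity figures also line up with the “five times better” claim earlier in the article, as a quick ratio shows (smaller denominators mean sharper vision).

    # Ratio of the acuity figures quoted above.
    wired_devices = 1200        # 20/1,200 reported for wired implants
    stanford_implant = 250      # 20/250 measured in rats for the photovoltaic implant
    print(wired_devices / stanford_implant)   # 4.8, i.e. roughly five times better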


“Eventually, we hope this technology will restore vision of 20/120,” Palanker said. “And if it works that well, it will become relevant to patients with age-related macular degeneration.”



Paper Silicon Solar Cells Possible



Paper-based silicon solar cells might not sound like such a great thing in themselves, but the new technology developed by researchers at Delft University of Technology in the Netherlands that makes them possible certainly is.


Printable electronics to date have mainly focused on the use of organic and metal-oxide ink materials given their compatibility with the application process. However, these materials don’t offer the same performance as silicon-based electronics.


There have been some inroads in the use of silicon ink, but this has been hampered somewhat by the thermal annealing process, which involves temperatures of around 350°C. This has meant some low-cost substrates can’t be used because they cannot withstand such high temperatures.


Using excimer laser irradiation, Professor Ryoichi Ishihara and his team have formed poly-Si directly on top of paper with a single laser pulse lasting just a few tens of nanoseconds and at a maximum temperature of only 150 °C.


The method enables silicon device formation on inexpensive, temperature sensitive substrates such as polyethylene terephthalate, polyethylene naphthalate or even paper.


“We coated liquid polysilane directly on paper by doctor-blading, or skimming it by a blade directly in an oxygen-free environment,” said Professor Ryoichi Ishihara. “Then we annealed the layer with an excimer-laser [a conventional tool used for manufacturing smartphone displays]. And it worked.”


Thin-film transistors using the laser-printed layer exhibited mobilities as good as those demonstrated in conventional poly-silicon conductors.


The initial application of this development is in wearable electronics, but by further improving the production process of the thin-film transistors to incorporate additional non-silicon layers, Professor Ishihara says the process can be expanded to biomedical sensor and solar cell areas.


Professor Ishihara received his B.E., M. E., and Ph. D. degrees from Department of Physical Electronics, Tokyo Institute of Technology, Japan in 1991, 1993, and 1996. His main research activities include low-temperature chemical vapor deposition of silicon nitride film, fabrication of amorphous-Si and poly-Si thin-film transistors (TFTs) and more recently, excimer-laser crystallization of Si films.


He has been with Delft Institute of Microsystems and Nanoelectronics, Delft University of Technology, since 1996.

The Delft University team have described their work in “Solution-Processed Polycrystalline Silicon On Paper”.


Breakthrough Method for Hydrogen Fuel



Scientists at Virginia Tech have come up with a brand new way to create hydrogen fuel: It’s cheap, fast and produces clean results, and involves plain old corn stalks, cobs and husks.


It’s long been known that the use of hydrogen has tremendous potential, both for increasing energy efficiency and for greatly reducing greenhouse gas emissions. But as the Virginia team write in their study, published in the journal Proceedings of the National Academy of Sciences, “producing it in a distributed, carbon-neutral, low-cost manner requires new technologies.”


Virginia Tech professor Percival Zhang and his team have already received a grant for the next phase of the research, which is to outline methods for mass production.


This is where xylose, the most abundant simple plant sugar, comes in. The amounts of hydrogen achieved in the study were previously possible only in theory.


The production method proposed by the Virginia Tech scientists releases barely any greenhouse gases, does not involve heavy metals and relies exclusively on sugars processed from biomass. The ability to use the dirty biomass of corn stover is also a bonus, as the fuel would become readily available fresh out of the processing plants. This means it would be possible to make the sale of such fuel a local enterprise.


"We have demonstrated the most important step toward a hydrogen economy - producing distributed and affordable green hydrogen from local biomass resources," Zhang says.


Joe Rollin, lead author on the paper, Zhang’s former doctoral student and co-founder with Zhang of the start-up Cell-free Bioinnovations, created a complex multi-step algorithm describing the breakdown of corn stover into hydrogen and carbon dioxide. His model also allows glucose to be used alongside xylose simultaneously, further increasing the rate of production; this, too, was not possible before.


This approach achieves a three-fold increase in the rate of production and greatly reduces the size of the facility required, to about the size of a gas station. Most importantly, it does not use natural gas, which would entail further distribution costs and would certainly be worse for the environment.


Rollin’s model also generates ultra-pure hydrogen, the grade used in fuel cell vehicles. “We believe this exciting technology has the potential to enable the widespread use of hydrogen fuel cell vehicles around the world and displace fossil fuels,” Rollin says.


Xylose is key here, as it splits water molecules, in turn producing higher-purity hydrogen.
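
The overall stoichiometry commonly cited for this route converts one xylose molecule plus water into as many as ten hydrogen molecules and carbon dioxide; the real pathway runs through a long chain of enzymatic steps, so the reaction below is only the sum. A quick atom-balance check of that overall reaction:

    # Atom balance for the overall reaction usually quoted for this route:
    #   C5H10O5 + 5 H2O -> 10 H2 + 5 CO2   (sum of many enzymatic steps)
    reactants = {"C": 5, "H": 10 + 5 * 2, "O": 5 + 5}
    products = {"C": 5, "H": 10 * 2, "O": 5 * 2}
    print(reactants == products)   # True: carbon, hydrogen and oxygen all balance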


Though the process is complex, its energy efficiency is 100 percent, according to the researchers. By contrast, other sugar-based processes for biofuel generation – ethanol and butanol – can’t touch these figures.


Zhang’s previous research, on which Rollin’s model is built, used starch to produce the resulting hydrogen, but the process was too costly for mass production. The current technique covers all the bases required to start thinking big.


“It really doesn’t make sense to use non-renewable natural resources to produce hydrogen,” Zhang says, “We think this discovery is a game-changer in the world of alternative energy.”


The world has been looking for decades for a way to wean itself off dirty energy dependency and high fuel costs. The idea of using hydrogen has long been in the making. The US Department of Energy has acknowledged that a clean hydrogen fuel could dramatically alter our lifestyles.


Are alternatively-powered cars a futuristic, utopian concept for now? Not at all. There were plenty of examples at this year’s North American Auto Show of car makers boldly stepping into new terrain with myriad new technologies – hydrogen-based fuel among them. Check out Honda’s Fuel Cell Vehicle (FCV), slated for production in 2016.


But the scientists need to hurry. The market for electric cars is likewise on the rise, so much so that it could cut fuel costs by 13 billion pounds and drive down UK oil imports by 40 percent by 2030, according to a new study.


The technology, however, is here to stay. China, in its bid to curb greenhouse emissions – among the highest in the world – rolled out its first hydrogen-powered trams a couple of weeks ago, becoming the first country to do so.


Official data currently puts the commercial market for hydrogen gas at $100 billion, but today’s hydrogen is slow, dirty and expensive to produce. Zhang and Rollin’s model is set to change that.

Plastic Threatens Eco-System in

Mediterranean Sea


Large quantities of plastic debris are building up in the Mediterranean Sea, say scientists.


A survey found around one thousand tonnes of plastic floating on the surface, mainly fragments of bottles, bags and wrappings.


The Mediterranean Sea's biological richness and economic importance means plastic pollution is particularly hazardous, say Spanish researchers.


Plastic has been found in the stomachs of fish, birds, turtles and whales.


Very tiny pieces of plastic have also been found in oysters and mussels grown on the coasts of northern Europe.




"We identify the Mediterranean Sea as a great accumulation zone of plastic debris," said Andres Cozar of the University of Cadiz in Puerto Real, Spain, and colleagues.


"Marine plastic pollution has spread to become a problem of planetary scale after only half a century of widespread use of plastic materials, calling for urgent management strategies to address this problem."


Plastic is accumulating in the Mediterranean Sea at a similar scale to that in oceanic gyres, the rotating ocean currents in the Indian Ocean, North Atlantic, North Pacific, South Atlantic and South Pacific, the study found.


A high abundance of plastic has also been found in other seas, including the Bay of Bengal, South China Sea and Barents Sea in the Arctic Ocean.


Microplastics


Commenting on the study, published in the journal PLOS ONE, Dr David Morritt of Royal Holloway, University of London, said scientists were particularly concerned about very small pieces of plastic (less than 5mm in length), known as microplastics.


The study found more than 80% of plastic items in the Mediterranean Sea fell into this category.


"These very small plastic fragments lend themselves to being swallowed by marine species, potentially releasing chemicals into the gut from the plastics," Dr Morritt, of the School of Biological Sciences.


"Plastic doesn't degrade in the environment - we need to think much more carefully about how we dispose of it, recycle it, and reduce our use of it."


The Mediterranean Sea represents less than 1% of the global ocean area, but is important in economic and ecological terms.


It contains between 4% and 18% of all marine species, and provides tourism and fishing income for Mediterranean countries.


"Given the biological wealth and concentration of economic activities in the Mediterranean Sea, the effects of plastic pollution on marine and human life could be particularly relevant in this plastic accumulation zone," said Dr Cozar.




Millions in Precious Metals in Sewage


Scientists are perusing poop at America’s wastewater treatment facilities for gold, silver, copper and other useful metals. The sewage from one million people could net $13 million in metals each year, all while making fertilizer more efficient.


More than seven million dry tons of biosolids are generated in the US annually by more than 16,500 municipal wastewater treatment facilities. And that sewage contains metals that people ingest and otherwise flush down the toilet, or rinse out in the laundry and shower.


"There are metals everywhere," Dr. Kathleen Smith of the US Geological Survey (USGS) said in a statement, noting that they are "in your hair care products, detergents, even nanoparticles that are put in socks to prevent bad odors."


Smith leads a team of researchers looking to get metals out of biosolids because about half of all human waste ‒ about 3.5 million tons in the US ‒ is used as fertilizer on farms and in forests, while the other half is incinerated or sent to landfills.


“If you can get rid of some of the nuisance metals that currently limit how much of these biosolids we can use on fields and forests, and at the same time recover valuable metals and other elements, that's a win-win," Smith said.


It may be odd thinking of precious metals like silver and gold as nuisances, but they impede the usefulness of fertilizers.


"We have a two-pronged approach," Smith said. "In one part of the study, we are looking at removing some regulated metals from the biosolids that limit their use for land application."


"In the other part of the project, we're interested in collecting valuable metals that could be sold, including some of the more technologically important metals, such as vanadium and copper that are in cell phones, computers and alloys," she added.


Smith’s team has collected samples from small towns in the Rocky Mountains, rural communities and big cities, but will also combine their findings with years of existing data collected by the US Environmental Protection Agency (EPA) and the USGS. They will present their research at the 249th National Meeting & Exposition of the American Chemical Society on Tuesday.


The EPA analyzed 28 metals for their 2009 Targeted National Sewage Sludge Survey, using samples randomly selected from 3,337 facilities that treat more than one million gallons of sewage per day, Smith wrote in her paper’s abstract. The agency discovered that the samples averaged 30 mg of silver per kilogram, 563 mg of copper per kilogram and 36 mg of vanadium per kilogram of waste.




A similar eight-year study by the USGS involved monthly sampling and analysis of biosolids from a municipal wastewater treatment plant. Those samples contained an average 28 mg of silver, 638 mg of copper, 49 mg of vanadium and less than one milligram of gold per kilogram of waste.


About 80 percent of vanadium is used in producing rust resistant and high-speed tool steels, according to Los Alamos National Laboratory.


"The gold we found was at the level of a minimal mineral deposit," Smith said, meaning that if that amount were in rock, it might be commercially viable to mine it. She added that "the economic and technical feasibility of metal recovery from biosolids needs to be evaluated on a case-by-case basis."


In a January Environmental Science & Technology paper, scientists at Arizona State University (ASU) in Tempe calculated that the waste from one million Americans could contain as much as $13 million worth of metals, including $2.6 million in gold and silver.


The group analyzed sewage sludge for 58 regulated and non-regulated elements, and used electron microscopy to explore opportunities for removal and recovery. Based on their model of the relative potential for economic value from biosolids, the most lucrative elements present (silver, copper, gold, phosphorus, iron, palladium, manganese, zinc, iridium, aluminum, cadmium, thallium, gallium and chromium) had a combined value of US$280 per ton of sludge.
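As a rough illustration of how a per-tonne figure like this is built up, the short Python sketch below multiplies measured concentrations (milligrams of metal per kilogram of dry sludge) by market prices. The concentrations are the EPA survey averages quoted earlier; the prices are placeholder assumptions for illustration only and are not taken from the ASU study.

# Rough sketch: dollar value of three metals in one tonne of dry biosolids.
# Concentrations are the EPA 2009 survey averages quoted above; the prices
# are illustrative placeholder assumptions, not figures from the ASU paper.
concentrations_mg_per_kg = {
    "silver": 30,     # mg per kg of dry sludge
    "copper": 563,
    "vanadium": 36,
}
assumed_price_usd_per_kg = {
    "silver": 550.0,   # placeholder bullion price
    "copper": 6.0,     # placeholder base-metal price
    "vanadium": 25.0,  # placeholder price
}

def value_per_tonne(conc_mg_per_kg, price_usd_per_kg):
    """Value (USD) of one metal contained in 1,000 kg of dry sludge."""
    kg_of_metal = conc_mg_per_kg * 1000 / 1e6  # mg/kg -> kg per tonne
    return kg_of_metal * price_usd_per_kg

total = 0.0
for metal, conc in concentrations_mg_per_kg.items():
    value = value_per_tonne(conc, assumed_price_usd_per_kg[metal])
    total += value
    print(f"{metal:>9}: ${value:,.2f} per tonne")
print(f"    total: ${total:,.2f} per tonne (these three metals only)")

Under these assumptions the three metals alone come to only a few tens of dollars per tonne; the bulk of the $280-per-ton figure in the ASU model comes from the longer list of elements, gold and phosphorus among them.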


The study’s lead author, environmental engineer Paul Westerhoff, says it could prove worthwhile for cities looking for ways to gain value from something that can be a costly disposal problem.


One place has already figured out how to profit from poop: A sewage treatment facility in Japan has been mining sludge for gold since 2009, Reuters reported at the time. The Nagano prefecture, northwest of Tokyo, once recorded finding 1,890 grams of gold per ton of ash from incinerated sludge. The district expected to earn about 15 million yen (currently US$125,000) for that fiscal year, depending on the price of gold.




China Faces 'Huge Impact' From Climate Change



Climate change could have a "huge impact" on China, reducing crop yields and harming the environment, the country's top weather scientist has warned, in a rare official admission.


Zheng Guogang told Xinhua news agency that climate change could be a "serious threat" to big infrastructure projects.

He said temperature rises in China were already higher than global averages.


China, the world's biggest polluter, has said its emissions of gases that cause climate change will peak by 2030.


However, the country has not set a specific target for cutting emissions of the gases, mainly carbon dioxide.


'Emphasise climate security'


Mr Zheng, the head of China's meteorological administration, said warming temperatures exposed his country to a growing "risk of climate change and climate disasters".


He said temperature rises in China had already been higher than the global average for the past century.


These are rare admissions from a Chinese official, RN Asia correspondent Michael Perira says.


China's leaders have acknowledged the damage from global warming but they usually do not lay out the full scale of the problems.


Mr Zheng warned of more droughts, rainstorms, and higher temperatures, which would threaten river flows and harvests, as well as major infrastructure projects such as the Three Gorges Dam. He urged China to pursue a lower-carbon future.


"To face the challenges from past and future climate change, we must respect nature and live in harmony with it," the Xinhua news agency quoted him as saying.


"We must promote the idea of nature and emphasise climate security."


China and the US together produce around 45% of global carbon emissions.


Leaders from the two countries are taking part in a summit in Paris this year that will aim for a global deal on cutting carbon emissions, to take effect from 2020.


China's decades-long pursuit of rapid economic growth has boosted demand for energy, particularly coal.


Scientists fear that pledges made so far to cut emissions will not be enough to avoid the harmful impact of climate change.



Solar Eclipse In Germany To Test Electricity Grid


Many will be closely monitoring the effects this week’s solar eclipse has on Germany’s electricity infrastructure.


Germany’s 1.4 million solar power systems generate nearly 7 percent of the nation’s electricity and, at their most productive, can supply up to half of its power demand.


But what happens if all those systems rapidly stop producing power during the day? Germany is about to find out – and it’s unlikely to be an electricity Armageddon.


On the morning of March 20, a solar eclipse will unfold over the span of about 1.25 hours. Sunlight will be reduced by 73 percent, causing solar electricity production to change up to 2.7 times faster than it normally would.


According to Opower, the effect will be similar to turning off a medium-sized power generation plant every minute for a full hour. As the eclipse ends, the reverse will occur – solar power output could jump by as much as 18 gigawatts in just over an hour.
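To put that rebound in perspective, a back-of-the-envelope calculation turns the quoted 18 gigawatt swing into an average ramp rate. The 65-minute window and the 300 MW size of a "medium" plant are illustrative assumptions, not Opower figures.

# Back-of-the-envelope ramp rate for the post-eclipse rebound in German solar output.
rebound_gw = 18.0        # projected jump in solar output, from the article above
rebound_minutes = 65.0   # assumed length of the rebound ("just over an hour")
ramp_mw_per_minute = rebound_gw * 1000 / rebound_minutes
print(f"Average ramp: {ramp_mw_per_minute:.0f} MW per minute")
# Compare with an assumed 300 MW "medium-sized" generating unit:
medium_plant_mw = 300.0
print(f"Equivalent to about {ramp_mw_per_minute / medium_plant_mw:.1f} such plants per minute")

That works out to roughly 280 MW per minute – close to a medium-sized power station's worth of output appearing on the grid every minute, which mirrors the comparison Opower draws for the darkening phase.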


Much of what will happen will depend on the weather. Should the morning of March 20 be cloudy, the impacts will likely be negligible. Even if it is a sunny day, Opower believes Germany will cope quite well: being a nation well known for perfectionism, it of course has strategic flexibility planning in place specifically for a situation like this.


As solar output drops, the nation can turn to releasing more hydro resources, switching on quick-start natural gas power plants, or importing electricity from neighboring countries.


What occurs will be closely watched by the electricity sector all over the world as it’s believed to be the most significant event of its type since the solar revolution began.


The interest isn’t just in the impact of solar eclipses. As solar panel capacity increases, so too will the degree of variance in power production, which poses some challenges for grid operators.


“Assuming the German government achieves its target of 66 gigawatts of solar capacity, a clear-sky sunrise in 2030 could drive an increase in solar power supply as steep as the rebound phase of 2015’s eclipse,” says Opower.


It’s situations like this that will also help propel the uptake of home and commercial energy storage – and in the process, drive down storage prices.


Germany led the world in solar uptake through its revolutionary feed-in tariff scheme, and the nation is also a pioneer in supporting energy storage through a generous subsidy scheme already in place. By the end of last year, more than 15,000 households in Germany had installed energy storage systems.




History being Made, 1st Solar-Powered Flight Around The World!


ABU DHABI: The first attempt to fly around the world in a solar-powered plane will take off from Abu Dhabi on Monday, its pilots said, in a landmark journey aimed at promoting green energy.


The takeoff of Solar Impulse 2, which was delayed on Saturday due to high winds, would cap 13 years of research and testing by Swiss pilots Andre Borschberg and Bertrand Piccard.


“This project is a human project, it is a human challenge,” Borschberg, cofounder and chief executive of Solar Impulse, told reporters on Sunday.


The wingspan of the one-seater plane, known as the Si2, is slightly bigger than that of a jumbo jet, but its weight is around that of a family car.


It will take off from Abu Dhabi on Monday at 6:30 a.m. (0230 GMT), landing first in Muscat.


From there, it will make 12 stops on an epic journey spread over five months, with a total flight time of around 25 days.


The longest single leg will see a lone pilot fly non-stop for five days across the Pacific Ocean between Nanjing, China and Hawaii, a distance of 8,500 km.


All this will happen without burning a drop of fuel.


“We want to share our vision of a clean future,” said Piccard, chairman of Solar Impulse.


“Climate change is a fantastic opportunity to bring in the market new green technologies that save energy, save natural resources of our planet, make profit, create jobs, and sustain growth,” Piccard said.


The pilots’ idea was ridiculed by the aviation industry when it was first unveiled.


But Piccard, who hails from a family of scientist-adventurers and was the first person, in 1999, to circumnavigate the globe in a hot air balloon, clung to his belief that clean technology and renewable energy “can achieve the impossible.”


The plane is powered by more than 17,000 solar cells built into wings that, at 72 meters, are longer than a jumbo jet's and approach those of an Airbus A380 superjumbo.


Thanks to an innovative design, the lightweight carbon fiber aircraft weighs only 2.3 tons, about the same as a family 4X4 and less than one percent of the weight of the A380.


Si2 is the first solar-powered aircraft able to stay aloft for several days and nights.


The propeller craft has four 17.5 horsepower electric motors with rechargeable lithium batteries.


It will travel at 50-100 km per hour, with the slower speeds at night to prevent the batteries from draining too quickly.
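As a quick consistency check on those figures, the longest leg – 8,500 km across the Pacific flown non-stop over about five days – implies an average speed that sits comfortably inside the quoted 50-100 km/h range. A minimal sketch of the arithmetic:

# Consistency check on the Pacific leg figures quoted above.
leg_distance_km = 8500.0   # Nanjing to Hawaii
leg_duration_days = 5.0    # flown non-stop by a single pilot
average_speed_kmh = leg_distance_km / (leg_duration_days * 24.0)
print(f"Average speed over the Pacific leg: {average_speed_kmh:.0f} km/h")  # ~71 km/h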


It is scheduled to arrive back in Abu Dhabi in July.


Its progress can be monitored via live video streaming at www.solarimpulse.com.




Solar Shrimp, More Than A Tasty Treat


Scientists from Queen Mary University of London (QMUL) have created electricity-generating solar cells using chemicals derived from the shells of shrimp and other crustaceans, a development that could have a major impact on the cost of producing solar power.


The research is focused on nanotechnology, specifically the highly conductive light-absorbing quantum dots used in thin-film and spray-on solar technology.


These tiny crystals can be tuned to specific wavelengths of light, multiplying the energy production of electrons throughout solar devices. They can also trap and convert infrared light to energy, light that would otherwise heat up and degrade photovoltaic processes.


The team discovered that two materials found in the shells of shrimp and other crustaceans, chitin and chitosan, could be used in place of ruthenium and platinum, the expensive and rare metals currently used to make carbon quantum dots (CQDs).


Using a process known as hydrothermal carbonisation, the QMUL scientists incorporated the shrimp-derived chemicals to successfully produce CQDs. They then coated zinc oxide nanorods with the quantum dots to make solar cells.


“This could be a great new way to make these versatile, quick and easy to produce solar cells from readily available, sustainable materials,” said Dr Joe Briscoe, a researcher on the project.


The efficiency of the solar cells is low compared to the silicon-based solar panels used in rooftop PV systems, but the team hopes their discovery of an organic replacement for rare and expensive metals in the production of CQDs will result in cheaper solar energy, at least in the field of thin-film technology.


“Once we’ve improved their efficiency they could be used anywhere that solar cells are used now, particularly to charge the kinds of devices people carry with them every day,” Dr Briscoe said.


Professor Magdalena Titirici, Professor of Sustainable Materials Technology at QMUL, added,


“New techniques mean that we can produce exciting new materials from organic by-products that are already easily available. Sustainable materials can be both high-tech and low-cost…We’ve also used biomass, in that case algae, to make the kinds of supercapacitors that can be used to store power in consumer electronics, in defibrillators and for energy recovery in vehicles.”



‘Megadroughts’: They’re Coming


If you're living in the Northeastern United States at this time, where cities such as Boston have received almost seven feet of snow, the last thing you may be thinking about is a drought. Well, that may be changing soon, based on a recent report from NASA scientist Dr. Benjamin Cook.


As bad as recent droughts have been in California and elsewhere in the Southwestern and Midwestern parts of the country, scientists say far worse “megadroughts” are coming — and they’re projected to last for decades.


“Unprecedented drought conditions” – the worst in more than 1,000 years – are likely to come to the Southwest and Central Plains after 2050 and stick around because of global warming, according to a new study published in the journal Science Advances.


During the years 2050 to 2100, the Southwest and Great Plains will face a persistent “megadrought” worse than anything seen in the past 1,000 years, and the dry conditions will be “driven primarily” by human-induced global warming, scientists said.


“The future of drought in western North America is likely to be worse than anybody has experienced in the history of the United States,” said Dr. Benjamin Cook, the study’s lead author and a climate scientist at NASA’s Goddard Institute for Space Studies in New York City. “These are droughts that are so far beyond our contemporary experience that they are almost impossible to even think about.”


There’s more than an 80 percent chance that much of this part of the country will have a 35-year-or-longer “megadrought” later this century if climate change continues unabated, said Dr. Cook. A megadrought is defined as a drought that lasts for decades or longer, such as those that scorched portions of the West in the 12th and 13th centuries. Dr. Cook said megadroughts should be considered a natural hazard on par with earthquakes and hurricanes.


Megadroughts will be ‘worse than anybody has experienced in the history of the United States’


The new research is based on the current rising rate of carbon dioxide emissions and complex simulations run by 17 different computer models, which generally agreed on the outcome, Dr. Cook said.


The regions Dr. Cook’s team looked at include California, Nevada, Utah, Colorado, New Mexico, Arizona, northern Texas, Oklahoma, Kansas, Nebraska, South Dakota, most of Iowa, southern Minnesota, western Missouri, western Arkansas, and northwestern Louisiana.


To identify past droughts, scientists studied tree rings to find out how much — or little — rain fell hundreds or even thousands of years ago. Scientists used that historical data in combination with the computer simulations to predict what changes we may see this century.


The models showed robust and consistent drying in the Southwest and Plains, due to a combination of reduced precipitation and warmer temperatures that dried out the soils, the researchers said.


Historical data showed there were megadroughts in the Southwest and Central Plains in the 1100s and 1200s that lasted for decades, but these will be worse, said Dr. Cook. Those were natural and not caused by climate change, unlike those forecast for the future, he explained.


Scientists predict ‘bleak future’ if climate change is left unchecked


Though previous studies have predicted climate change would increase the odds of worse droughts, this study predicts an even bleaker future, showing “more convincingly than ever before that unchecked climate change will drive unprecedented drying across much of the United States—even eclipsing the huge megadroughts of medieval times,” said Jonathan Overpeck, co-director of the Institute of the Environment at the University of Arizona, who was not involved in the study.


Drought, of course, will have serious consequences, including the potential for water shortages throughout the country. According to the Natural Resources Defense Council’s “Climate Change, Water, and Risk” report, at least 1,100 U.S. counties – one-third of all counties in the lower 48 states – face higher risks of water shortages by mid-century as a result of climate change. More than 400 of these counties will face extremely high risks of water shortages.


As temperatures rise and precipitation decreases, water quality can also be jeopardized. Shrinking water levels can concentrate contaminants such as heavy metals, industrial chemicals and pesticides, and sediments and salts. Additionally, droughts make drinking water supplies more susceptible to harmful algal blooms and other microorganisms.


Changes in precipitation and water availability will also have serious consequences for commercial agriculture – crops yield less in dry seasons, and food security suffers. In fact, scientists have warned that the effects of climate change will threaten the global food supply unless urgent action is taken. Drought conditions can also help fuel out-of-control wildfires.


But it’s not too late to mitigate some of these effects, scientists say. “The good news is that we have ample warning and know what to do to stop the unprecedented drying from becoming reality—we just need to make serious cuts in greenhouse gas emissions,” said Dr. James Famiglietti, a researcher from the University of California, Irvine.


“Otherwise,” he warned, “the next generations of Americans are going to have a huge problem on their hands.”


Mars' Solar-Powered Helicopter



Engineers at NASA’s Jet Propulsion Laboratory in Pasadena, California are developing a small solar powered helicopter to act as an added set of eyes for future Mars rover missions.

While Mars rovers have some rather whizz-bang camera technology, their field of vision is limited. Spacecraft orbiting Mars can also help guide the planning of rover routes, but what’s really needed are more eyes on the ground – well, closer to it anyway.

The Mars Helicopter could potentially triple the distance rovers currently cover in a Martian day, peering over obstacles, helping mission control back on Earth plan the best driving route, and spotting potential exploration and sample collection opportunities.

Controlling a drone can be tricky on Earth, but on Mars it will be even more so. Aside from the distance between operator and drone, Mars’ atmosphere is a hundred times thinner than Earth’s. Helicopters gain lift from their rotors pushing against the atmosphere, so the thinner the air, the harder lift is to generate. The heavier the helicopter, the bigger the blades needed – especially on Mars. As space is at a premium on any mission, there was another option – go lighter and smaller.

The box-like craft will likely weigh around 2.2 pounds (1 kilogram) and measure 3.6 feet (1.1 meters) between blade tips.



The blades of the Mars Helicopter will need to spin at about 2400 rpm to provide lift and the vehicle will be designed to fly for 2-3 minutes each day; enough to cover around half a kilometre.
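To see why the blades must spin so fast in such thin air, a simplified actuator-disk (momentum theory) estimate can be made from the quoted mass and tip-to-tip measurement. The Martian gravity and air density values, and the ideal-rotor assumption itself, are illustrative inputs for a sketch, not JPL's design numbers.

import math

# Simplified momentum-theory estimate of hover power for a ~1 kg helicopter
# with ~1.1 m between blade tips, in Mars' thin atmosphere.
# Gravity, air density and the ideal-rotor assumption are illustrative only.
mass_kg = 1.0              # quoted mass of the craft
rotor_diameter_m = 1.1     # quoted tip-to-tip measurement, treated as the rotor diameter
g_mars, g_earth = 3.71, 9.81        # surface gravity, m/s^2 (assumed values)
rho_mars, rho_earth = 0.017, 1.225  # near-surface air density, kg/m^3 (assumed values)
disk_area_m2 = math.pi * (rotor_diameter_m / 2) ** 2

def ideal_hover_power(thrust_n, rho):
    """Ideal induced power to hover, from actuator-disk momentum theory."""
    induced_velocity = math.sqrt(thrust_n / (2 * rho * disk_area_m2))
    return thrust_n * induced_velocity

print(f"Ideal hover power on Mars:  {ideal_hover_power(mass_kg * g_mars, rho_mars):.0f} W")
print(f"Same craft on Earth:        {ideal_hover_power(mass_kg * g_earth, rho_earth):.0f} W")

Even though Martian gravity is barely a third of Earth's, the thin air roughly doubles the ideal hover power in this estimate, and real rotors are far from ideal – which is part of why the blades need to turn at around 2,400 rpm and why flights are limited to a few minutes per day.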

In between flights, a solar panel situated on top of the rotor hub will recharge the batteries, which won’t just be used for flight, but also to help keep the helicopter warm in Mars’ frigid conditions.

Project engineers are working towards making the unit bullet-proof as it will need to endure landings each day on a surface that can be strewn with rocks.

The helicopter may be used in conjunction with NASA’s 2020 Mars rover mission. That rover’s design will be derived from the Curiosity rover. Unlike the Spirit and Opportunity vehicles before it, which are/were solar powered (communication with Spirit was lost in 2010), Curiosity is powered by a radioisotope thermoelectric generator (RTG).



NASA Sees Tropical Forests as Key


Tropical forests are critical in absorbing human-made carbon emissions, a NASA study has found. (Photo: Leonard S. Jacobs/flickr/cc)


Tropical forests have emerged as a crucial factor in the fight against climate change, according to a new NASA-led study published Friday which finds that they are absorbing carbon dioxide at a far higher rate than previously thought.


As atmospheric levels of greenhouse gases have continued to rise, tropical forests, like those found in Malaysia, have been absorbing roughly 1.4 billion metric tons of carbon dioxide out of a total global absorption of 2.5 billion, NASA found. Those rates are not only higher than previously estimated, they are also higher than those of the vast boreal forests found in northern regions like Canada and Siberia—which are diminishing.


"This is good news, because uptake in boreal forests is already slowing, while tropical forests may continue to take up carbon for many years," said Dr. David Schimel, NASA Jet Propulsion Laboratory senior research scientist and lead author of a paper on the study.


Forests use human-made emissions to grow faster, which in turn reduces the amount of carbon dioxide in the atmosphere—an effect known as carbon fertilization. They also remove up to 30 percent of airborne human emissions through photosynthesis. If those processes slowed down, the rate of global warming would increase.


Why was it important to determine which kind of forest is more adept at that process?


Because the answer "has big implications for our understanding of whether global terrestrial ecosystems might continue to offset our carbon dioxide emissions or might begin to exacerbate climate change," said Britton Stephens, co-author of the study and a scientist at the National Center for Atmospheric Research.


Schimel added, "All else being equal, the effect is stronger at higher temperatures, meaning it will be higher in the tropics than in the boreal forests."


The problem lies in other harmful impacts of climate change that also affect forests. Warming temperatures decrease water availability and increase larger and more frequent wildfires—which, in turn, release large amounts of carbon into the atmosphere.


Still, NASA's discovery is largely auspicious. "What we've had up till this paper was a theory of carbon dioxide fertilization based on phenomena at the microscopic scale and observations at the global scale that appeared to contradict those phenomena," Schimel said. "Here, at least, is a hypothesis that provides a consistent explanation that includes both how we know photosynthesis works and what's happening at the planetary scale."


The study is groundbreaking in its methodology, as it is the first to use a variety of models, technology, and data to create an "apples-to-apples" comparison of carbon dioxide estimates between forests, NASA explained.


By using computer models of ecosystem processes, inverse models of atmospheric concentrations, satellite images, and other data and analysis, the researchers were able to determine the accuracy of their results "based on how well they reproduced independent, ground-based measurements."



Solar Powered Water Desalination


Makers of the Desolenator tout it to be an easy to use, low cost, solar powered and fairly portable water desalination system. It can even make sea water drinkable.

“Water, water, every where, Nor any drop to drink,” laments the Rime Of The Ancient Mariner. Things have certainly changed since the times of those fabled doomed sailors, with simple desalinators now a common feature in survival kits for boats.

96 percent of the Earth’s water can be found in our oceans and increasingly this massive resource is being tapped and turned from undrinkable to potable water.

However, desalination to the point of being truly useful in a day-to-day domestic situation is generally an energy intensive process requiring sophisticated, cumbersome and expensive equipment.

The Desolenator may help change all that. The system reportedly can produce 15 litres of water a day and lasts for up to 20 years. It uses no consumables, no filters and only requires basic maintenance. Its makers claim it can desalinate water at a lower cost per litre than any system at this scale currently available.

The device consists of a 100 watt solar panel, 100 watt immersion heater, storage tank, capillary tubes, glass chamber and battery. The casing incorporates all-terrain wheels for ease of movement and an LCD panel.

The Desolenator team recently won 2nd place at the recent Climate KiC Clean Launch Pad competition and has forged academic partnerships with Liverpool University (UK), Imperial College (UK) and College of Engineering Trivandrum (India).

After 18 months of development and 5 prototype iterations, the Desolenator is nearly ready for prime time. Its inventors are now seeking to raise $150,000 through a crowdfunding campaign to accelerate the product development process and help evolve the current prototype into a finished product ready for mass production. Funders contributing US$450 will receive one of the first Desolenators to roll out of production.

With half the world’s population forecast to live in water stressed areas by 2030 according to the UN, devices such as the Desolenator will not just be interesting and useful gadgets, but life-savers.

Solar Energy Milestone:
Over 40% Conversion Efficiency


Researchers at the University of New South Wales have reported converting over 40% of the sunlight hitting a solar panel system into electricity, the highest efficiency ever recorded.


The result has been independently confirmed by the National Renewable Energy Laboratory (NREL) at their outdoor test facility in the United States.


“We used commercial solar cells, but in a new way, so these efficiency improvements are readily accessible to the solar industry,” said Dr Mark Keevers, manager of the project.


The breakthrough results from the use of a custom optical bandpass filter to capture sunlight normally wasted by commercial solar cells on towers and convert it to electricity at a higher efficiency than the solar cells themselves ever could.


“The new results are based on the use of focused sunlight, and are particularly relevant to photovoltaic power towers being developed in Australia,” said UNSW Scientia Professor Martin Green, Director of the Australian Centre for Advanced Photovoltaics (ACAP).


The work was funded by the Australian Renewable Energy Agency (ARENA) and supported by the Australia-US Institute for Advanced Photovoltaics (AUSIAPV).


“We hope to see this home-grown innovation take the next steps from prototyping to pilot scale demonstrations. Ultimately, more efficient commercial solar plants will make renewable energy cheaper, increasing its competitiveness,” said ARENA CEO Ivor Frischknecht, who also stated the project further demonstrates the value of investing in Australia’s renewable energy ingenuity.


UNSW has been a leader in developing solar technology for decades. Its researchers were responsible for the first photovoltaic system to convert sunlight to electricity with over 20% efficiency in 1989.


Professor Green is referred to by some as the “father of photovoltaics”. It was way back in 1974 that Professor Green launched the Solar Photovoltaics Group at the University. In recognition of his pioneering efforts, Professor Green was made a Member of the Order of Australia in 2012.


Earlier this year, the professor was ranked among the top 1% of most-cited researchers in his subject field for publications between 2002 and 2012.


The 40% efficiency achievement will soon be published by the Progress in Photovoltaics journal and will also be presented at the Australian PV Institute’s Asia-Pacific Solar Research Conference.


Shaping the Future of
Energy Storage With Conductive Clay


A special type of clay could play an important role in the energy storage systems of the future.


Invented by Drexel University College of Engineering’s materials scientists, MXene clay exhibits conductivity on par with that of metals and can be formed into a variety of shapes and sizes with ease.


MXenes are a family of 2D transition metal carbides and/or nitrides discovered and being developed at Drexel.


“Both the physical properties of the clay, consisting of two-dimensional titanium carbide particles, as well as its performance characteristics, seem to make it an exceptionally viable candidate for use in energy storage devices like batteries and supercapacitors,” said Professor Yury Gogotsi, co-author of a paper on MXene clay to be published in Nature on December 4.


“The procedure to make the clay also uses much safer, readily available ingredients than the ones we used to produce MXene electrodes in the past.”




While electrode materials such as graphene are hydrophobic, meaning they repel water, MXene is hydrophilic.


“The fact that we can now roll our electrodes rapidly and efficiently, and not have to use binders and/or conductive additives, renders this material quite attractive from a mass production point of view,” said Professor Michel Barsoum, one of the inventors.


“Being able to make a conductive clay, essentially out of titanium carbide with the help of a common fluoride salt and hydrochloric acid is the materials equivalent of making a chocolate chip cookie – everybody has these ingredients in the pantry.”

Solar Cloth Company Scores Industry Award


The Solar Cloth Company, based in the UK, has developed an award-winning solar solution for non-load bearing roofing and car parks.

The firm produces lightweight, flexible thin-film photovoltaic (TFPV) solar panels that weigh less than 3.3 kg/m2 – ten percent of the weight of traditional glass-faced solar panels.


Technologies currently being used for the solar cells are aSi (amorphous silicon) and CIGS (copper indium gallium diselenide). The company’s flagship installation at Cambridge Research Park provides a power output of 15kWp and covers 12 car parking spaces.

In the UK alone, there are an estimated 834 million square metres of non-load-bearing roofing and 353 million square metres of car parking – a huge area that could be used to generate power. If this area were fully utilized, it would represent 120GW of clean electricity generation capacity – more than three times the energy required to power the UK’s National Grid.
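That 120GW estimate is consistent with simply multiplying the quoted surface area by a typical thin-film power density. The roughly 100 watts-peak per square metre used below is an illustrative assumption, not a company specification.

# Rough check of the 120GW claim: available area times an assumed panel power density.
roofing_m2 = 834e6          # non-load-bearing roofing in the UK (quoted above)
car_parking_m2 = 353e6      # car parking area in the UK (quoted above)
assumed_wp_per_m2 = 100.0   # illustrative thin-film power density, Wp per square metre
capacity_gw = (roofing_m2 + car_parking_m2) * assumed_wp_per_m2 / 1e9
print(f"Potential capacity: {capacity_gw:.0f} GW")  # ~119 GW, in line with the quoted figure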

With electricity prices in the UK gradually creeping up, the company says its solution is also cost-competitive.

“By 2020, the wholesale price of electricity is predicted to rise as high as 14p/kWh, whereas our lightweight, flexible solar panels are estimated to have a Levelised Cost of Energy (LCOE) as low as 8p/kWh over the same period.”

The company believes the market for its product will be worth over £1 billion within a decade.

The Solar Cloth Company was a winner in the recent Solar UK Industry Awards 2014 and has been selected as a finalist in Cleantech Innovate 2015: BIPV Solar Innovation of the Year.

The firm is currently collaborating with the University of Cambridge and other leading European universities on projects to create ultra-low-cost TFPV, and is seeking an investment of £750,000 to expand its production facilities and sales teams to deliver its 2015 sales pipeline.
 

CSP ‘Black Hole’ Material Unveiled



University of California, San Diego engineers have developed a nanomaterial that dramatically enhances the collection of solar energy in concentrating solar plants.


The material has been dubbed the “black hole,” due to its enormous ability to absorb solar energy.


The team were challenged as part of the U.S. government’s SunShot program to increase the efficiency of large-scale concentrated solar power (CSP) technology, which involves thousands of reflectors focusing sunlight onto a single collection point, called a solar absorber. The immense heat produced generates steam to drive an electric turbine. Conventional solar absorbers rapidly break down under the strain, requiring lengthy and costly delays while they are replaced.


By contrast, the UCSD team’s nanotech absorber is capable of converting more than 90 per cent of the sunlight it captures into heat, and is designed to withstand extreme temperatures of over 700 degrees Celsius for years despite exposure to the elements. Their work is published in the journal Nano Energy.


“We wanted to create a material that absorbs sunlight that doesn’t let any of it escape. We want the black hole of sunlight,” said Sungho Jin, a professor in the department of Mechanical and Aerospace Engineering at the UC San Diego Jacobs School of Engineering.


To achieve this black hole effect, Jin and his colleagues created a “multiscale” surface using silicon-boride nanoparticles ranging in size from 10 nanometres to 10 micrometres. When sunlight hits the material, it is trapped and absorbed in the multiscale surface more efficiently than the black paint material used in conventional CSP plants.


The team have developed a spray-on application method for delivering the material onto a metal substrate in laboratory conditions and say they are close to ensuring the material will last for years in humidity and the open air.


“Current CSP plants are shut down about once a year to chip off the degraded sunlight absorbing material and reapply a new coating, which means no power generation while a replacement coating is applied and cured,” the University stated.


“The UC San Diego research team is aiming for many years of usage life, a feat they believe they are close to achieving.”



100% of World's Power From Renewables

by 2050?



A global low-carbon energy economy is not only feasible - it could actually double electricity supply by 2050, while also reducing air and water pollution, according to new research.


Even though photovoltaic power requires up to 40 times more copper than conventional power plants, and wind power uses up to 14 times more iron, the world wins on a switch to low-carbon energy.


These positive findings are published in the Proceedings of the National Academy of Sciences by Edgar Hertwich and Thomas Gibon, of the Norwegian University of Science and Technology Department of Energy and Process Engineering.


They and international research colleagues report that they have made - as far as they know - the first global life-cycle assessment of the economic and environmental costs of renewable and other clean sources of energy in a world that responds to the threat of climate change.


Other studies have looked at the costs in terms of health, pollutant emissions, land use change or the consumption of metals. The Norwegian team set out to consider the lot.


There were some things they had to leave out. These include bioenergy: the conversion of corn, sugar cane or other crops to ethanol for fuel, because that would also require a comprehensive assessment of the food system; and nuclear energy, because they could not reconcile what they called "conflicting results of competing assessment approaches."


But they tried to consider the whole-life costs of solar power, wind power, hydropower and gas and coal generators that used carbon capture and storage to reduce greenhouse gas emissions.


They took into account the demand for aluminium, copper, nickel and steel, metallurgical grade silicon, flat glass, zinc and clinker. They thought about the comparative costs of 'clean' and 'dirty' power generation.


And they considered the impact of greenhouse gases, particulate matter, toxicity in ecosystems, and the eutrophication - the overwhelming blooms of plankton - of rivers and lakes.


They also assessed the impact of such future power plants on the use of land, and they made allowances for the economic benefits of increasing amounts of renewable power in the extraction and refinement of minerals needed to make yet more renewable power.


Then they contemplated two scenarios: one in which global electricity production rises by 134% by 2050, with fossil fuels accounting for two-thirds of the total; and one in which electricity demand in 2050 grows by 13% less because energy use becomes more efficient.



Earth's Ozone Layer on Track to Recovery



Earth's protective ozone layer is well on track to recovery in the next few decades thanks to concerted international action against ozone depleting substances, according to a new assessment by 300 scientists.

The Assessment for Decision-Makers, a summary document of the Scientific Assessment of Ozone Depletion 2014, is being published by the United Nations Environment Programme (UNEP) and the World Meteorological Organization (WMO), and is the first comprehensive update in four years.

The stratospheric ozone layer, a fragile shield of gas, protects Earth from harmful ultraviolet rays of the sun. Without the Montreal Protocol and associated agreements, atmospheric levels of ozone depleting substances could have increased tenfold by 2050. According to global models, the Protocol will have prevented 2 million cases of skin cancer annually by 2030, averted damage to human eyes and immune systems, and protected wildlife and agriculture, according to UNEP.

The phase-out of ozone depleting substances has had a positive spin-off for the global climate because many of these substances are also potent greenhouse gases. However, the assessment report cautions that the rapid increase in certain substitutes, which are themselves also potent greenhouse gases, has the potential to undermine these gains. The assessment also notes that there are possible approaches to avoiding the harmful climate effects of these substitutes.




"There are positive indications that the ozone layer is on track to recovery towards the middle of the century. The Montreal Protocol -- one of the world's most successful environmental treaties -- has protected the stratospheric ozone layer and avoided enhanced UV radiation reaching the earth's surface," said UN Under-Secretary-General and UNEP Executive Director Achim Steiner.

"However, the challenges that we face are still huge. The success of the Montreal Protocol should encourage further action not only on the protection and recovery of the ozone layer but also on climate. On September 23, the UN Secretary General will host Heads of State in New York in an effort to catalyse global action on climate. The Montreal Protocol community, with its tangible achievements, is in a position to provide strong evidence that global cooperation and concerted action are the key ingredients to secure the protection of our global commons," he added.

"International action on the ozone layer is a major environmental success story," said WMO Secretary-General Michel Jarraud. "This should encourage us to display the same level of urgency and unity to tackle the even greater challenge of climate change. This latest assessment provides solid science to policy-makers about the intricate relationship between ozone and climate and the need for mutually-supportive measures to protect life on earth for future generations."

"Human activities will continue to change the composition of the atmosphere. WMO's Global Atmosphere Watch programme will therefore continue its crucial monitoring, research and assessment activities to provide scientific data needed to understand and ultimately predict environmental changes, as it has done for the past 25 years" said Mr Jarraud.

Key findings:



Actions taken under the Montreal Protocol on Substances that Deplete the Ozone Layer are enabling the return of the ozone layer to benchmark 1980 levels.

Under full compliance with the Montreal Protocol, the ozone layer is expected to recover to 1980 benchmark levels - the time before significant ozone layer depletion - before the middle of the century in mid-latitudes and the Arctic, and somewhat later in the Antarctic.

The Montreal Protocol and associated agreements have led to decreases in the atmospheric abundance of gases, such as CFCs (chlorofluorocarbons) and halons, once used in products such as refrigerators, spray cans, insulation foam and fire suppression.

Total column ozone declined over most of the globe during the 1980s and early 1990s. It has remained relatively unchanged since 2000, but there are recent indications of its future recovery.

The Antarctic ozone hole continues to occur each spring and it is expected to continue occurring for the better part of this century given that ozone depleting substances persist in the atmosphere, even though their emissions have ceased.

The Arctic stratosphere in winter/spring 2011 was particularly cold, which led to large ozone depletion as expected under these conditions.


The climate benefits of the Montreal Protocol could be significantly offset by projected emissions of HFCs (hydrofluorocarbons) used to replace ozone depleting substances.


The Montreal Protocol has made large contributions toward reducing global greenhouse gas emissions. In 1987, ozone-depleting substances contributed about 10 gigatonnes CO2-equivalent emissions per year. The Montreal Protocol has now reduced these emissions by more than 90 per cent. This decrease is about five times larger than the annual emissions reduction target for the first commitment period (2008-2012) of the Kyoto Protocol on climate change.

Hydrofluorocarbons (HFCs) do not harm the ozone layer but many of them are potent greenhouse gases. They currently contribute about 0.5 gigatonnes of CO2-equivalent emissions per year. These emissions are growing at a rate of about 7 per cent per year. Left unabated, they can be expected to contribute very significantly to climate change in the next decades (a rough projection of this growth rate appears at the end of these key findings).

Replacements of the current mix of high-GWP HFCs with alternative compounds with low GWPs or not-in-kind technologies would limit this potential problem.

The annual Antarctic ozone hole has caused significant changes in Southern Hemisphere surface climate in the summer.

Ozone depletion has contributed to cooling of the lower stratosphere and this is very likely the dominant cause of observed changes in Southern Hemisphere summertime circulation over recent decades, with associated impacts on surface temperature, precipitation, and the oceans.

In the Northern Hemisphere, where the ozone depletion is smaller, there is no strong link between stratospheric ozone depletion and tropospheric climate.


CO2, Nitrous Oxide and Methane will have an increasing influence on the ozone layer


What happens to the ozone layer in the second half of the 21st century will largely depend on concentrations of CO2, methane and nitrous oxide -- the three main long-lived greenhouse gases in the atmosphere. Overall, CO2 and methane tend to increase global ozone levels. By contrast, nitrous oxide, a by-product of food production, is both a powerful greenhouse gas and an ozone depleting gas, and is likely to become more important in future ozone depletion.
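As a rough sense of what the quoted HFC growth rate implies, compounding today's 0.5 gigatonnes of CO2-equivalent per year at about 7 per cent annually gives a several-fold increase within a few decades. The start year and horizons in the sketch below are illustrative assumptions; the starting level and growth rate are the figures from the key findings above.

# Rough projection of HFC emissions, compounding at the quoted ~7% per year.
current_emissions_gt = 0.5   # Gt CO2-equivalent per year (quoted above)
growth_rate = 0.07           # ~7% per year (quoted above)
start_year = 2015            # assumed reference year for illustration
for years_ahead in (10, 20, 35):
    projected = current_emissions_gt * (1 + growth_rate) ** years_ahead
    print(f"{start_year + years_ahead}: ~{projected:.1f} Gt CO2-eq per year if growth continues unabated")

On those assumptions, unabated HFC emissions would reach roughly 1 gigatonne a year within a decade and around 5 gigatonnes a year by mid-century, which is why the report flags them as a potentially significant offset to the Protocol's climate benefits.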

The Scientific Assessment Panel is expected to present the key findings of the new report at the annual Meeting of the Parties to the Montreal Protocol, to be held in Paris in November 2014. The full body of the report will be issued in early 2015.


MSU Develops Transparent Plastic Solar Cell



Michigan State University scientists have developed a new plastic see-through solar collector that they say can harvest solar energy and be used to power smart phones and tablets without blocking the view.

 

The team created the system using luminescent solar collecting material, or LSC – inexpensive plastic slabs with the ability to absorb light and re-emit it at much higher intensities. By manipulating organic molecules on the surface of the LSC, the researchers were able to tune it to collect specific nonvisible wavelengths of sunlight.

 

Although colourful LSCs have previously taken the form of stained glass windows and tiny American flags, the MSU team’s ultimate goal was a completely transparent solar collector.

   

"No one wants to sit behind coloured glass," said Richard Lunt, an assistant professor of chemical engineering and materials science at MSU. "It makes for a very colourful environment, like working in a disco. We take an approach where we actually make the luminescent active layer itself transparent."

 

"Because the materials do not absorb or emit light in the visible spectrum, they look exceptionally transparent to the human eye."

 

The system converts near-infrared and ultraviolet wavelengths of sunlight into a form of 'glowing' infrared light, which then passes to photovoltaic strips at the edge of the material and is converted into electricity.

    

The MSU team say more work is needed to bring the technology up to scale. Right now the transparent solar collector produces a puny one per cent solar conversion efficiency – compared to seven per cent for most coloured LSCs. The wide range of applications beyond traditional solar sources, they believe, will be the key to its success.

   

"It opens a lot of area to deploy solar energy in a non-intrusive way," Lunt said. "It can be used on tall buildings with lots of windows or any kind of mobile device that demands high aesthetic quality like a phone or e-reader. Ultimately we want to make solar harvesting surfaces that you do not even know are there."



Cigarette Butts A Commodity?



For those of us who have walked along a beautiful beach and enjoyed the warm water upon our feet, only to have the moment utterly destroyed when we come upon a slew of cigarette butts in our path, there may be some good news in possibly curbing such horrible pollution. News out of Seoul, Korea just may make cigarette butts a commodity in helping build, of all things, electric cars!

SEOUL, South Korea, August 6, 2014 (ENS) – How can used cigarette butts contribute to the development of superior electric vehicles? South Korean researchers have found a way.

Five scientists from Seoul National University’s College of Engineering have converted used cigarette filters into a high-performing material for supercapacitors that could be integrated into electric vehicles to store energy.

Unlike batteries that offer limited charging/discharging rates, supercapacitors require only seconds to charge and can feed electricity back into the vehicle’s air-conditioning system, GPS, radio, and other devices as needed.

Publishing their findings August 5 in the Institute of Physics Publishing’s journal “Nanotechnology,” the scientists say they have demonstrated the material’s superior performance compared to commercially available carbon, graphene and carbon nanotubes.

They hope the material derived from cigarette butts can be used to coat the electrodes of supercapacitors – electrochemical components that can store large amounts of electrical energy – while also offering a solution to the growing environmental problem caused by trillions of used cigarettes filters discarded annually.

It is estimated that as many as 5.6 trillion used cigarettes, or 766,571 metric tons, are deposited into the environment worldwide every year.

Study co-author Professor Jongheop Yi said, “Our study has shown that used cigarette filters can be transformed into a high-performing carbon-based material using a simple one step process, which simultaneously offers a green solution to meeting the energy demands of society.”

“Numerous countries are developing strict regulations to avoid the trillions of toxic and non-biodegradable used-cigarette filters that are disposed of into the environment each year,” said Yi. “Our method is just one way of achieving this.”

Scientists around the world are currently working towards improving the characteristics of supercapacitors, such as energy density, power density and cycle stability, while reducing production costs.

As compared to the basic electrostatic capacitor used to tune radio frequencies, the supercapacitor is ideal for energy storage that undergoes frequent charge and discharge cycles at high current and short duration.

Carbon is the most popular material that supercapacitors are made of, due to its low cost, high surface area, high electrical conductivity and long-term stability.

In their study, the Seoul researchers demonstrated that the cellulose acetate fibers of which cigarette filters are made could be transformed into a carbon-based material using a simple, one-step burning technique called pyrolysis, conducted in a nitrogen-rich environment.

Used cigarette filters from Marlboro Light Gold, Bohem Cigar Mojito and The One Orange from the Korea Tobacco & Ginseng Corp. were collected.

They were pyrolyzed for two hours in an atmosphere of argon and NH3 (ammonia), a colorless, pungent gas composed of nitrogen and hydrogen.

The carbon-based material resulting from this burning process contained both tiny nano-pores and medium-sized nano-pores, increasing its performance as a supercapacitive material.

“A high-performing supercapacitor material should have a large surface area, which can be achieved by incorporating a large number of small pores into the material,” said Professor Yi.

The porous carbon material developed by Yi and his colleagues spontaneously contains both micropores, with pore diameters of less than two nanometers, and mesopores, with pore diameters of about 25 nanometers.

“The unique self-developed pore structure allowed a favorable pathway for electrolyte permeation and contact probability, resulting in the extended rate capability for the supercapacitor,” said Yi.

“A combination of different pore sizes ensures that the material has high power densities, which is an essential property in a supercapacitor for the fast charging and discharging,” he said.

Once fabricated, the carbon-based material was attached to an electrode and tested in a three-electrode system to see how well the material could charge and discharge.

The material stored a higher amount of electrical energy than commercially available carbon, the researchers found. It also had a higher storage capacity compared to the graphene and carbon nanotubes reported in previous studies.

Solar Powered Robot To Hitchhike Across Canada


Drivers on Canada's highways may happen upon a chatty robot attempting to hitchhike across the country next month.
 
Known as hitchBOT, it will commence its coast-to-coast journey at the Institute for Applied Creativity at the Nova Scotia College of Art and Design (NSCAD) in Halifax; with a goal of reaching Victoria, the capital city of British Columbia.
 
Hitchbot will incorporate speech recognition, a social media and Wikipedia API, artificial intelligence technologies and 3G and wifi connectivity to help it on its journey. It will be able to continually call home with its location - so would-be thieves watch out.
 
"We expect hitchBOT to be charming and trustworthy enough in its conversation to secure rides through Canada," says Dr. Frauke Zeller of Ryerson University; one of hitchBOT's creators.
 
While the final design of hitchBOT is yet to be unveiled, Dr. David Harris Smith of McMaster University says it will look "like somebody has cobbled together odds and ends to make the robot, such as pool noodles, bucket, cake saver, garden gloves, Wellies, and so forth." Hitchbot will be unable to move and will be entirely dependent on humans for transport.
 
According to the Toronto Star, hitchBOT's batteries will be powered by solar panels covering the beer cooler bucket that will be its torso, and can also be recharged from car cigarette lighters or a standard power outlet - so hitchBOT might not just bum a ride, but some juice as well.
   
It was unknown at the time of publishing if hitchBOT will incorporate a death-ray, taser or any other weaponry; or if contingency plans are in place should it become self-aware on its journey and start plotting the enslavement of humanity.
 
Rather than being a research initiative of any kind, it's being considered a collaborative art project.


Citizen Scientists Take Control Of
Solar Powered Spacecraft



Citizen scientists have, with permission, taken control of a NASA spacecraft launched in 1978.
   
The International Sun-Earth Explorer-3 (ISEE-3), later renamed to International Cometary Explorer (ICE), was launched August 12, 1978. After successfully completing its original mission; it was then re-tasked to study the interaction between the solar wind and a cometary atmosphere.
 
NASA eventually lost contact with the spacecraft and officially suspended attempts at contact with ISEE-3 in 1998.
 
In 2008, it was discovered that the spacecraft had not been switched off and most of its equipment was still functioning, powered exclusively by an upper and lower ring of 16 solar panels each, with an original collective capacity of 175 watts. The battery on the craft failed two years after launch, as designed.
 
Recently, members of Space College made two-way contact with the decades-old spacecraft as part of its crowdfunded ISEE-3 Reboot Project; which is operating out of an old McDonald's store.
 
In order to interact with the spacecraft, the group needed to locate the original commands and then develop a software version of the original hardware that was used to communicate with ISEE-3.
 
"Over the coming days and weeks our team will make an assessment of the spacecraft's overall health and refine the techniques required to fire its engines and bring it back to an orbit near Earth," states the group.
 
This isn't without substantial risk - there is the prospect of the craft crashing into the moon.
 
Once it is under full control, the group hopes to conduct a privately funded mission to fly-by a comet.
 
The project has certainly stirred up interest from around the world. The crowdfunding campaign had originally sought to raise $125,000; but went on to reach $159,000 by the end of the campaign on May 23.



Solar Panels Draining The Sun


We're hoping the Federal Government and the Renewable Energy Target review panel don't read this article - it seems 'forced photovoltaic drainage' is another evil trait of solar panels.
 
According to National Report, boffins at the Wyoming Institute of Technology have discovered that energy radiated from the sun isn’t just harvested by solar panels, but that energy is directly physically drawn from the sun by those panels in a process they refer to as "forced photovoltaic drainage."
 
"If every home in the world had solar panels on their roofs, global temperatures would drop by as much as thirty degrees over twenty years, and the sun could die out within three hundred to four hundred years."
   
Forced Photovoltaic Drainage was also covered by another fine news source, The Onion, which alerted us to the fact that the Sun doesn't come with "free refills".

The Onion's coverage also warned of the evils of wind turbines. It seems these offensive devices could knock the Earth off its axis and send us hurtling towards the (greatly depleted thanks to solar panels) Sun or away from it.

Add to this the threat of wind-in-brain disease and it becomes clear that renewable energy is truly the work of Satan.
  
Of course, both the National Report and The Onion items are just satire and it's clear the claims are ridiculous. However, so too are some of the 'facts' being bandied about at the moment by certain special interests.



Europe Looking to Convert CO2
Into Methanol For Fuel



Leading research firm Tecnalia has collaborated in a study for the European Parliament’s Science and Technology Options Assessment Panel (STOA) on the future use of methanol, produced from carbon dioxide, in motorised transport.

STOA is the panel that advises the European Parliament in the sphere of Science and Technology.

The study analysed the technological, environmental and economic barriers to producing methanol from carbon dioxide, as well as the options that could allow its use in automobile transport in the medium and long term.
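For context, the most common route from CO2 to methanol is catalytic hydrogenation (CO2 + 3 H2 -> CH3OH + H2O). The study does not commit to a single process, so the quick mass balance below is only a sketch of the underlying chemistry.

```python
# Back-of-envelope mass balance for catalytic CO2 hydrogenation to methanol
# (CO2 + 3 H2 -> CH3OH + H2O). This is the textbook route, not necessarily the
# exact process assessed in the STOA study.
M_CO2, M_H2, M_CH3OH, M_H2O = 44.01, 2.016, 32.04, 18.02   # molar masses, g/mol

co2_in = M_CO2                # grams of CO2 per mole converted
h2_in = 3 * M_H2              # grams of hydrogen required
methanol_out = M_CH3OH
water_out = M_H2O

print(f"{co2_in:.1f} g CO2 + {h2_in:.1f} g H2 -> "
      f"{methanol_out:.1f} g methanol + {water_out:.1f} g water")
print(f"~{co2_in / methanol_out:.2f} kg of CO2 consumed per kg of methanol produced")
```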

The Report


Costs and benefits were evaluated from a life-cycle perspective, both to compare the various raw materials for producing methanol and to reflect the potential benefits of methanol obtained from CO2.

The report concluded that medium- and long-term benefits can be anticipated: producing an alternative fuel from a residual greenhouse gas would reduce European dependence on conventional fossil fuels and thereby minimise risks to supply security.

The study highlights, however, that a sustained research and development effort will be needed to turn CO2 into a competitive raw material, to produce methanol efficiently from emissions, and to make it an attractive fuel for the transport sector as well as for other industries.

Europe’s growing difficulty in securing fossil fuels at acceptable prices is forcing it to consider alternatives that can keep transport affordable for industry and citizens during the transition towards an economy less dependent on oil.

Antarctica's Melting Speed Doubles

Photo by Christopher.Michel at Flickr

If we needed more disturbing news on a melting Antarctica, scientists are supplying it. Days after we learned that the melting of the West Antarctic ice sheet is apparently unstoppable, researchers find that the continent is losing ice twice as quickly as when it was last measured. Information from Europe's CryoSat spacecraft shows that Antarctic ice is now melting at a rate of 160 billion metric tons, or 176 billion short tons, per year, the BBC reports.

That rate will raise sea levels by about 0.017 inches yearly. The surface of the Antarctic ice sheet, meanwhile, is dropping by about 0.79 inches per year. The new study focuses on data from the years 2010 to 2013; the previous data reflected the years 2005 to 2010. "We find that ice losses continue to be most pronounced in West Antarctica, along the fast-flowing ice streams that drain into the Amundsen Sea," says a researcher.
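Those figures are easy to sanity-check. The quick calculation below assumes a global ocean area of about 360 million square kilometres, a standard round figure that is our assumption rather than the article's.

```python
# Quick sanity check of the reported sea-level figure.
ice_loss_gt_per_year = 160        # billion metric tons of ice lost per year
ocean_area_km2 = 3.6e8            # assumed global ocean area (round figure)

water_volume_m3 = ice_loss_gt_per_year * 1e9        # 1 Gt of meltwater ~ 1 km^3 ~ 1e9 m^3
rise_m = water_volume_m3 / (ocean_area_km2 * 1e6)   # spread over the ocean surface
print(f"{rise_m * 1000:.2f} mm per year (~{rise_m * 39.37:.3f} inches)")
# ~0.44 mm, or about 0.017 inches, matching the figure reported above.
```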



Extreme Drought Causes Texans To Turn
To Toilet Water


Photo by Earl McGehee, Flickr

If you thought there wasn't a price to pay for climate change, just ask the folks living in Wichita Falls, Texas. After three years of severe drought, the city has imposed harsh restrictions that are forcing them to look for alternatives when it comes to their water needs.

We came across this interesting report on NPR and suggest you take a look at it. It definitely has us wondering whether this will also be our future, and the world's future. Clean water is a necessity; unlike oil or natural gas, we can't live without it.

The city of Wichita Falls, Texas, may soon become the first in the country where half of the drinking water comes directly from wastewater.

Yes, that includes water from toilets.

The plan to recycle the water became necessary after three years of extreme drought, which has also imposed some harsh restrictions on Wichita Falls residents, says Mayor Glenn Barham.

"No outside irrigation whatsoever with potable water," he says. "Car washes are closed, for instance, one day a week. If you drain your pool to do maintenance, you're not allowed to fill it."

Barham says residents have cut water use by more than a third, but water supplies are still expected to run out in two years.

So the city has built a 13-mile pipeline that connects its wastewater plant directly to the plant where water is purified for drinking. That means the waste that residents flush down their toilets will be part of what's cleaned up and sent back to them through the tap.

Wichita Falls constructed a 13-mile pipeline to deliver the city's wastewater to a purification plant.


For some citizens, that's a little tough to swallow.

"I think it's gross," says Wichita Falls resident Marissa Oliveras. "I mean, it's recycled wastewater that we could possibly be drinking."

Oliveras isn't the only Wichita Falls resident who says she plans to switch to bottled water. At Gidget's Snack Shack downtown, customer Kira Smith also plans to spend extra money on bottled water when the recycled wastewater begins to flow.

"The thought of it definitely grosses me out," Smith says. "I'm sure that they would clean it and filter it up to standards, but I think just the idea would be — it's sort of a mindset kind of thing, you know what I'm talking about?"

The mayor insists the water will be clean and safe, and the city has undertaken a massive education campaign to explain the science behind the process, known as direct potable reuse. Several other Texas cities are pursuing the process. One small hamlet started recycling wastewater in 2011, but not on the scale that's being done here.

Some people unceremoniously call it "toilet-to-tap," but the city official overseeing this process, Daniel Nix, says that's not really how it works.

"The vast majority of water that enters a wastewater plant did not come from a toilet," he says. "They come from sinks, and bathtubs, and washing machines and dishwashers."

Rhode Island Rally for Climate Resilience 


Climate change is poised to make coastal water levels along the Eastern seaboard rise at more than three times the global average rate. That fact was articulated by Brown Professor of Geological Science Dr. Tim Herbert on the front steps of the Rhode Island State House at the Climate Change Rally that took place over the weekend.

People of all ages gathered together to support The Resilient Rhode Island Act of 2014, a bill now pending before the state legislature. The bill targets a number of areas that would help make Rhode Island one of the states taking aggressive action on climate change.
More about the bill.

During the rally, Mara Freilich of the Rhode Island Student Climate Coalition and Abel Collins, Program Director of the Sierra Club of Rhode Island, articulated their concerns about climate change and how it will affect those living in Rhode Island and in the Northeast. They also spoke about the importance of the bill and what needs to happen next to support it and see it through.

Fuel From the Sea: Navy Uses Scale Model as Proof of Concept


Navy researchers at the U.S. Naval Research Laboratory (NRL), Materials Science and Technology Division, have demonstrated proof-of-concept of novel NRL technologies developed for the recovery of carbon dioxide (CO2) and hydrogen (H2) from seawater and their conversion to a liquid hydrocarbon fuel.


Fueled by a liquid hydrocarbon—a component of NRL's novel gas-to-liquid (GTL) process that uses CO2 and H2 as feedstock—the research team demonstrated sustained flight of a radio-controlled (RC) P-51 replica of the legendary Red Tail Squadron, powered by an off-the-shelf (OTS) and unmodified two-stroke internal combustion engine.

Using an innovative and proprietary NRL electrolytic cation exchange module (E-CEM), both dissolved and bound CO2 are removed from seawater at 92 percent efficiency by re-equilibrating carbonate and bicarbonate to CO2 and simultaneously producing H2. The gases are then converted to liquid hydrocarbons by a metal catalyst in a reactor system.

"In close collaboration with the Office of Naval Research P38 Naval Reserve program, NRL has developed a game changing technology for extracting, simultaneously, CO2 and H2 from seawater,"
said Dr. Heather Willauer, NRL research chemist. "This is the first time technology of this nature has been demonstrated with the potential for transition, from the laboratory, to full-scale commercial implementation."

CO2 in the air and in seawater is an abundant carbon resource, and the concentration in the ocean (100 milligrams per liter [mg/L]) is about 140 times greater than that in air, and about one-third the concentration of CO2 in stack gas (296 mg/L). Two to three percent of the CO2 in seawater is dissolved CO2 gas in the form of carbonic acid, one percent is carbonate, and the remaining 96 to 97 percent is bound in bicarbonate.
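Putting those percentages together, a rough breakdown of the roughly 100 mg/L of carbon dioxide carried in seawater looks like this (our arithmetic, using the approximate figures quoted above):

```python
# Rough breakdown of the ~100 mg/L of CO2 carried in seawater, using the
# approximate percentages quoted above.
total_mg_per_l = 100
fractions = {
    "dissolved CO2 gas (carbonic acid)": 0.025,   # "two to three percent"
    "carbonate": 0.01,
    "bicarbonate": 0.965,                         # "96 to 97 percent"
}
for species, frac in fractions.items():
    print(f"{species}: ~{total_mg_per_l * frac:.1f} mg/L")
# The E-CEM step re-equilibrates carbonate and bicarbonate back to CO2, which is
# how the reported 92 percent of this carbon becomes available as feedstock.
```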

NRL has made significant advances in the development of a gas-to-liquids (GTL) synthesis process to convert CO2 and H2 from seawater to a fuel-like fraction of C9-C16 molecules. In the first patented step, an iron-based catalyst has been developed that can achieve CO2 conversion levels up to 60 percent and decrease unwanted methane production in favor of longer-chain unsaturated hydrocarbons (olefins). These value-added hydrocarbons from this process serve as building blocks for the production of industrial chemicals and designer fuels.

In the second step, these olefins can be converted to compounds of higher molecular weight using controlled polymerization. The resulting liquid contains hydrocarbon molecules in the C9-C16 range, suitable for use as a possible renewable replacement for petroleum-based jet fuel.

The predicted cost of jet fuel using these technologies is in the range of $3-$6 per gallon, and with sufficient funding and partnerships, this approach could be commercially viable within the next seven to ten years. Pursuing remote land-based options would be the first step towards a future sea-based solution.

The minimum modular carbon capture and fuel synthesis unit is envisioned to be scaled up by the addition of individual E-CEM modules and reactor tubes to meet fuel demands.

NRL operates a lab-scale fixed-bed catalytic reactor system and the outputs of this prototype unit have confirmed the presence of the required C9-C16 molecules in the liquid. This lab-scale system is the first step towards transitioning the NRL technology into commercial modular reactor units that may be scaled-up by increasing the length and number of reactors.



The process efficiencies, and the capability to simultaneously produce large quantities of H2 and process the seawater without the need for additional chemicals or pollutants, make these technologies far superior to previously developed and tested membrane and ion exchange technologies for recovery of CO2 from seawater or air.


Bangladesh, Ground ZERO on Climate Change 


When it comes to climate change, do you ever wonder where the effects will be felt most? With last week's UN Intergovernmental Panel on Climate Change report continuing to shine a light on the dire effects of global warming on the earth and its population, we came across an article on NEWSER that identifies Bangladesh as Ground Zero for climate change.

Bangladesh only produces 0.3% of the world's greenhouse gases. But few nations are poised to suffer more as sea levels rise due to climate change, the New York Times points out in an in-depth piece on the nation's plight. The country is extremely flat, low, prone to cyclones and flooding, and among the world's most densely populated, with 160 million people crammed into a space less than a quarter of the size of France (which has roughly 100 million fewer people). Its sea walls are also poorly built, and because it has long relied heavily on wells, rather than its polluted rivers, for drinking water, its cities are sinking.

"There are a lot of places in the world at risk from rising sea levels, but Bangladesh is at the top of everybody's list," says one environmental policy expert. Indeed, a 2010 study named Bangladesh the country most vulnerable to climate change. And things could be worse than expected. Scientists widely predict that sea levels will rise as much as three feet by 2100, but those increases won't be evenly distributed. One scientist thinks Bangladesh could see waters rise up to 13 feet. "The reaction among Bangladeshi government officials has been to tell me that I must be wrong," he says. "That’s completely understandable, but it also means they have no hope of preparing themselves." For much more, read the full report  .

If this story doesn't give you pause to ask what your own community may be dealing with in years to come, it should. Along with the articles and reports identified in the NEWSER story, we also suggest reading the report from the 2014 UN Intergovernmental Panel on Climate Change.



Southeast England most at risk of rising deaths due to climate change


Warmer summers brought on by climate change will cause more deaths in London and southeast England than the rest of the country, scientists predict.

Researchers at Imperial College London looked at temperature records and mortality figures for 2001 to 2010 to find out which districts in England and Wales experience the biggest effects from warm temperatures.

In the most vulnerable districts, London and the southeast, the odds of dying from cardiovascular or respiratory causes increased by over 10 per cent for every 1C rise in temperature. Districts in the far north were much more resilient, seeing no increase in deaths at equivalent temperatures.

Writing in Nature Climate Change, the researchers say local variations in climate change vulnerability should be taken into account when assessing the risks and choosing policy responses.

Dr James Bennett, the lead author of the study from the MRC-PHE Centre for Environment and Health at Imperial College London, said: “It’s well known that warm weather can increase the risk of cardiovascular and respiratory deaths, especially in elderly people. Climate change is expected to raise average temperatures and increase temperature variability, so we can expect it to have effects on mortality even in countries like the UK with a temperate climate.”

Across England and Wales as a whole, a summer that is 2C warmer than average would be expected to cause around 1,550 extra deaths, the study found. Just over half would be in people aged over 85, and 62 per cent would be in women. The extra deaths would be distributed unevenly, with 95 out of 376 districts accounting for half of all deaths.
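To see how a per-degree rise in the odds of dying feeds into a death toll, here is a deliberately simplified sketch. It treats the odds increase as an approximately proportional increase in deaths (a fair shortcut when baseline risk is small) and uses an invented baseline, so it is illustrative only and not a reproduction of the Imperial College model.

```python
# Purely illustrative sketch, not the Imperial College model. The baseline death
# count is invented; the 10 percent per degree figure is the study's estimate for
# the most vulnerable districts.
baseline_summer_deaths = 15_000      # hypothetical cardiovascular/respiratory deaths
increase_per_degree = 0.10           # rise in odds of death per 1C of warming
warming_c = 2.0                      # the "2C warmer than average" summer scenario

extra_deaths = baseline_summer_deaths * ((1 + increase_per_degree) ** warming_c - 1)
print(f"~{extra_deaths:.0f} extra deaths in a {warming_c:.0f}C warmer summer")
```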

The effects of warm temperature were similar in urban and rural districts. The most vulnerable districts included deprived districts in London such as Hackney and Tower Hamlets, with the odds of dying more than doubling on very hot days like those of August 2003.

“The reasons for the uneven distribution of deaths in warm weather need to be studied,” said Professor Majid Ezzati, from the School of Public Health at Imperial, who led the research. “It might be due to more vulnerable individuals being concentrated in some areas, or it might be related to differences at the community level, like quality of healthcare, that require government action.

“We might expect that people in areas that tend to be warmer would be more resilient, because they adapt by installing air conditioning for example. These results show that this isn’t the case in England and Wales.

“While climate change is a global phenomenon, resilience and vulnerability to its effects are highly local. Many things can be done at the local level to reduce the impact of warm spells, like alerting the public and planning for emergency services. Detailed information about which communities are most at risk from high temperatures can help to inform these strategies.”


The researchers received funding from the Medical Research Council, Public Health England, and the National Institute for Health Research (NIHR) Imperial Biomedical Research Centre.

Date: March 23, 2014
Source: Imperial College London



Louisiana's Coastline is Disappearing Too Quickly for Mappers to Keep Up


The area south of the town of Buras, Louisiana, in 1990 (left) and today (right). NOAA has retired the names English Bay, Bay Jacquin, and Scofield Bay, acknowledging the vast water that now separates Buras from the barrier along Pelican Island.



Each year, this part of the coastline loses around 16 square miles of land, according to David Muth, the state director of the National Wildlife Federation's Mississippi River Delta Restoration Project. And until quite recently, even the most advanced maps of the area did little to reflect the changing environmental reality.

But in the last few years, renewed mapping efforts from the National Oceanic and Atmospheric Administration have begun to catalog these changes. These new maps show water where there was once marshy land, and bays where there were once small inlets.


And in the last few years, more than 30 of the region’s names — including English Bay, Bay Jacquin, and Scofield Bay — have been officially retired from the map. Meredith Westington, chief geographer at NOAA’s Office of Coast Survey, keeps a running list of these newly extinct places handy on her desk. Though communities have become attached to the names of their nearby landscape, she explains, it just “doesn’t make sense to leave some island name in the chart where there’s no island there anymore.”


Coastal erosion has dramatically changed the size and shape of Adams Bay and Bastian Bay, once far more distinct waterways. (NOAA)

These marshy regions of coastal Louisiana aren’t normally the types of areas that attract close attention from NOAA, which focuses on larger ports with more significant commercial navigation traffic. But following Katrina and Rita, reports of debris prompted an unusual amount of new surveying efforts, according to Mike Espey, who oversees these projects as the chief of the Applications Branch of the Remote Sensing Division of NOAA’s National Geodetic Survey.

In the old days, mapmakers used on-the-water travel or interviews about local terminology with Bayou fishermen to get the lay of the land. Now, Espey’s office uses a combination of aerial photographs and satellite images to catalog the new geography of this rapidly eroding region.

This, Espey explains, makes it possible to track changes to these tiny and navigationally insignificant areas. Since much of the surveyed land is marshy and ill-defined, these maps tend to mark where vegetation has stopped growing, more than anything else. The changes to these most recent charts are still nowhere near the finest scale of detail possible.

Still, the land loss we can see is stark. “You’re kind of seeing a culmination of decades of changes,” Westington says. “And it looks very dramatic.”

In many cases, the names lost forever from the maps represent places that have long since ceased to exist. “You’re cruising along and the water will be three, four feet deep, and the GPS will say you’re on land,” Muth says. “The official maps are really trying to catch up, but land loss is so fast in certain parts of the coast that no one can keep up. You can have a piece of land out there that retreats 20, 30, 40 feet a year.”


Bob Taylors Pond, one of the names officially put on the historical list, has become a part of Zinzin Bay. (NOAA)

The latest surveys are still being processed, and already Westington’s office has decided to retire 10 additional names that appear on even NOAA’s most recent set of charts. And future changes to the landscape could alter these maps even more. Soon, far larger bodies of water — like the several-miles-across Barataria Bay and Terrebonne Bay — may lose their natural barriers and combine. And land rebuilding efforts, outlined in the state’s 2012 Coastal Master Plan, could lead to new deposits of sediment, and eventually new land, in other parts of the river’s vast delta.

No matter what, the map will certainly change again. “Because deltas are so dynamic, they’re either building or they’re eroding,” Muth says. “The idea that you can pick a point in time and say, ‘This is how we want the coast to look,’ is, first of all, the wrong way to think about it. And second of all, it creates an impossible situation.”



Bloomberg's 'Vibrant Oceans' Initiative Invests in the Future of Sustainable Fisheries

A Man of Action,
A Man Who Walks The Talk...

Last month former New York City Mayor Michael Bloomberg worked with Henry Paulson under the Risky Business initiative to assess the amount of financial risk that climate change poses to the American economy. But the highly committed former mayor is clearly not waiting around to find out the answer.

He has just set out on a new venture, investing $53 million of his own money to try to do something about the sorry state of the world’s fisheries. The U.N. Food and Agriculture Organization‘s 2006 study published in Science predicted that many ocean fish stocks will be producing less than 10 percent of their peak catch levels by the end of this century.

Already, the U.N. reports that 32 percent of global fish stocks are overexploited or depleted, and as much as 90 percent of large species like tuna and marlin have been fished out. The International Programme on the State of the Ocean found that the world’s marine species faced threats “unprecedented in human history,” and overfishing is part of the problem. Another part, of course, is climate change, which has raised ocean temperatures and acidity levels, wreaking havoc with coral reefs and other sensitive breeding areas.

Meanwhile, global demand for seafood and fish products is expected to rise by 20 percent in the next six years.

In many places fish protein is now being obtained through farming rather than fishing. But the aquaculture boom has its own problems. It is currently the fastest growing means of food production, with output jumping 50-fold since the 1950s. Today it provides roughly half of the seafood consumed around the world. But the problems associated with fish farming include disease, which often spreads to wild fish; the proliferation of antibiotics and anti-fungal agents to combat that disease; significant amounts of nitrogen and phosphorus waste released into the ocean; natural habitat destruction; and the voracious appetites of carnivorous fish, which often require two pounds of fish meal for every pound of fish produced. With all that, the end result is often less nutritious and, in the case of farmed salmon, dyed pink to resemble its wild cousins.

Still, given the widespread incidence of cardiovascular disease, diabetes and obesity, many experts, including doctors, are recommending more fish in the American diet.

Despite the dire state of things, there are reasons to believe an investment like Bloomberg’s Vibrant Oceans Initiative can make a difference. Why? Just look at it from a business perspective, says Svati Kirsten Narula in the Atlantic. The difference between what the sea is producing today and what it could be producing if stocks were restored is somewhere around $50 billion annually. But in order to realize that opportunity, two things have to happen. First, we need to stop subsidizing the global fishing fleet in its quest to get every last fish out of the water. And second, we need to encourage the growth of existing fish stocks by managing them sustainably.

One study, published in Science, found that fishing reform could really make a difference, generating an estimated 56 percent increase in abundance and anywhere between an 8 and 40 percent increase in global catches. A second study puts the cost of rebuilding at $203 billion, with a profitable outcome in 12 years.

What kind of reforms are we talking about? The current industrial fleet already has the capacity to catch twice as many fish as there are in the ocean today. And some methods in use today, such as bottom trawling, discard 10 pounds of sea life for every pound caught. Clearly, these types of practices are the very antithesis of sustainable.

A lot of this, says Bloomberg, can be addressed with proper management. Since most of the world’s fisheries lie within national boundaries, individual countries can set policy to manage their resources. Bloomberg will be supporting three initiatives with three different partners utilizing three different leverage points.

Oceana, whose motto is “promote responsible fishing,” is an advocacy and educational group. They work with governments to set and enforce science-based limits on the amount of fish that can be caught, and reduce the amount of sea life that is unintentionally caught and killed.

Rare, which operates in the Philippines, focuses more on the 12 million smaller fishermen who work within 10 miles of shore. They work on educating these fishermen in sustainable practices, and on the development and enforcement of protected areas where fish stocks can rebuild. They offer fishers exclusive fishing rights in exchange for agreeing to respect protected zones.

EKO Asset Management Partners works from the financial end of things, partnering with investors and agencies to “assess, develop and deploy innovative financing strategies that result in meaningful environmental outcomes and attractive financial returns.” Because both local and industrial fishers tend to over-fish for basic survival, with a short-term focus, EKO works to create financial incentives that reward more sustainable fishing practices. These incentives will, hopefully, reward fishermen for implementing changes and allow them to participate in the returns those changes are expected to bring about.

Says EKO co-founder Jason Scott, about the J-shaped fishery recovery curve, “If the science behind that is true, then this is a really attractive investment.”
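The "J-shaped recovery curve" Scott mentions can be illustrated with the simplest textbook stock model, logistic (surplus-production) growth under reduced fishing pressure. Every number in the sketch below is hypothetical and describes no particular fishery.

```python
# Minimal surplus-production (logistic growth) sketch of stock rebuilding once
# fishing pressure is cut; all parameters are hypothetical and purely illustrative.
r = 0.3               # intrinsic growth rate per year (assumed)
K = 1.0               # carrying capacity, normalised to 1
biomass = 0.2         # depleted stock starting at 20% of carrying capacity
harvest_rate = 0.05   # post-reform fishing mortality (fraction of stock per year)

for year in range(1, 13):
    surplus = r * biomass * (1 - biomass / K)
    biomass += surplus - harvest_rate * biomass
    print(f"year {year:2d}: biomass at {biomass:.0%} of carrying capacity")
```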



Fish living near the equator will not thrive in the warmer oceans of the future


According to an international team of researchers, the rapid pace of climate change is threatening the future presence of fish near the equator.

"Our studies found that one species of fish could not even survive in water just three degrees Celsius warmer than what it lives in now," says the lead author of the study, Dr Jodie Rummer from the ARC Centre of Excellence for Coral Reef Studies (Coral CoE) at James Cook University.

Dr Rummer and her colleagues studied six common species of fish living on coral reefs near the equator. She says many species in this region only experience a very narrow range of temperatures over their entire lives, and so are likely adapted to perform best at those temperatures.

This means climate change places equatorial marine species most at risk, as oceans are projected to warm by two to three degrees Celsius by the end of this century.

"Such an increase in warming leads to a loss of performance," Dr Rummer explains. "Already, we found four species of fish are living at or above the temperatures at which they function best."

The team measured the rates at which fish use oxygen, the fuel for metabolism, across different temperatures -- at rest and during maximal performance. According to the results, at warmer temperatures fish lose scope for performance. In the wild, this would limit activities crucial to survival, such as evading predators, finding food, and generating sufficient energy to breed.
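In other words, the key quantity is aerobic scope, the gap between maximal and resting oxygen use. The sketch below shows the calculation with invented numbers; they are not Dr Rummer's measurements.

```python
# Illustrative only: aerobic scope is the difference between maximal and resting
# oxygen use. The values below are invented to show the calculation.
# temperature (C): (resting MO2, maximal MO2) in mg O2 per kg per hour
measurements = {
    29: (80, 420),
    31: (95, 430),
    33: (115, 400),
    35: (140, 340),
}
for temp_c, (resting, maximal) in sorted(measurements.items()):
    scope = maximal - resting
    print(f"{temp_c} C: aerobic scope = {scope} mg O2/kg/h")
# A shrinking scope at warmer temperatures leaves less energy for escaping
# predators, finding food and breeding.
```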

Because many of Earth's equatorial populations are now living close to their thermal limits, there are dire consequences ahead if these fish cannot adapt to the pace at which oceans are warming.

Dr Rummer suggests there will be declines in fish populations as species may move away from the equator to find refuge in areas with more forgiving temperatures.

"This will have a substantial impact on the human societies that depend on these fish,"
she says.

A concentration of developing countries lies in the equatorial zone, where fish are crucial to the livelihoods and survival of millions of people, including those in Papua New Guinea and Indonesia.

In an era of rapid climate change, understanding the link between an organism and its environment is crucial to developing management strategies for the conservation of marine biodiversity and the sustainable use of marine fisheries.

"This is particularly urgent when considering food security for human communities."



How To Tap The Sun’s Energy
Through Heat as Well As Light



A new approach to harvesting solar energy, developed by MIT researchers, could improve efficiency by using sunlight to heat a high-temperature material whose infrared radiation would then be collected by a conventional photovoltaic cell. This technique could also make it easier to store the energy for later use, the researchers say.

In this case, adding the extra step improves performance, because it makes it possible to take advantage of wavelengths of light that ordinarily go to waste. The process is described in a paper published this week in the journal Nature Nanotechnology, written by graduate student Andrej Lenert, associate professor of mechanical engineering Evelyn Wang, physics professor Marin Soljačić, principal research scientist Ivan Celanović, and three others.

A conventional silicon-based solar cell “doesn’t take advantage of all the photons,” Wang explains. That’s because converting the energy of a photon into electricity requires that the photon’s energy level match that of a characteristic of the photovoltaic (PV) material called a bandgap. Silicon’s bandgap responds to many wavelengths of light, but misses many others.

To address that limitation, the team inserted a two-layer absorber-emitter device — made of novel materials including carbon nanotubes and photonic crystals — between the sunlight and the PV cell. This intermediate material collects energy from a broad spectrum of sunlight, heating up in the process. When it heats up, as with a piece of iron that glows red hot, it emits light of a particular wavelength, which in this case is tuned to match the bandgap of the PV cell mounted nearby.



This basic concept has been explored for several years, since in theory such solar thermophotovoltaic (STPV) systems could provide a way to circumvent a theoretical limit on the energy-conversion efficiency of semiconductor-based photovoltaic devices. That limit, called the Shockley-Queisser limit, imposes a cap of 33.7 percent on such efficiency, but Wang says that with TPV systems, “the efficiency would be significantly higher — it could ideally be over 80 percent.”

There have been many practical obstacles to realizing that potential; previous experiments have been unable to produce a STPV device with efficiency of greater than 1 percent. But Lenert, Wang, and their team have already produced an initial test device with a measured efficiency of 3.2 percent, and they say with further work they expect to be able to reach 20 percent efficiency — enough, they say, for a commercially viable product.

The design of the two-layer absorber-emitter material is key to this improvement. Its outer layer, facing the sunlight, is an array of multiwalled carbon nanotubes, which very efficiently absorbs the light’s energy and turns it to heat. This layer is bonded tightly to a layer of a photonic crystal, which is precisely engineered so that when it is heated by the attached layer of nanotubes, it “glows” with light whose peak intensity is mostly above the bandgap of the adjacent PV, ensuring that most of the energy collected by the absorber is then turned into electricity.

In their experiments, the researchers used simulated sunlight, and found that its peak efficiency came when its intensity was equivalent to a focusing system that concentrates sunlight by a factor of 750. This light heated the absorber-emitter to a temperature of 962 degrees Celsius.
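If we assume the absorber-emitter radiates roughly like a blackbody, Wien's displacement law gives a back-of-envelope feel for those numbers (this check is ours, not from the paper):

```python
# Back-of-envelope estimate of the peak emission wavelength of a body at 962 C
# and the corresponding photon energy.
WIEN_B_UM_K = 2898.0                  # Wien's displacement constant, micrometre-kelvin
emitter_temp_k = 962 + 273.15

peak_wavelength_um = WIEN_B_UM_K / emitter_temp_k
photon_energy_ev = 1.2398 / peak_wavelength_um    # E[eV] ~ 1.2398 / wavelength[um]

print(f"Peak emission near {peak_wavelength_um:.2f} um, "
      f"photons of roughly {photon_energy_ev:.2f} eV")
# Around 2.3 um and 0.5 eV, well below silicon's ~1.1 eV bandgap, consistent with
# pairing the emitter with a PV cell whose bandgap sits in the infrared.
```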

This level of concentration is already much lower than in previous attempts at STPV systems, which concentrated sunlight by a factor of several thousand. But the MIT researchers say that after further optimization, it should be possible to get the same kind of enhancement at even lower sunlight concentrations, making the systems easier to operate.

Such a system, the team says, combines the advantages of solar photovoltaic systems, which turn sunlight directly into electricity, and solar thermal systems, which can have an advantage for delayed use because heat can be more easily stored than electricity. The new solar thermophotovoltaic systems, they say, could provide efficiency because of their broadband absorption of sunlight; scalability and compactness, because they are based on existing chip-manufacturing technology; and ease of energy storage, because of their reliance on heat.

Some of the ways to further improve the system are quite straightforward. Since the intermediate stage of the system, the absorber-emitter, relies on high temperatures, its size is crucial: The larger an object, the less surface area it has in relation to its volume, so heat losses decline rapidly with increasing size. The initial tests were done on a 1-centimeter chip, but follow-up tests will be done with a 10-centimeter chip, they say.

Zhuomin Zhang, a professor of mechanical engineering at the Georgia Institute of Technology who was not involved in this research, says, “This work is a breakthrough in solar thermophotovoltaics, which in principle may achieve higher efficiency than conventional solar cells because STPV can take advantage of the whole solar spectrum. … This achievement paves the way for rapidly boosting the STPV efficiency.”

The research team also included MIT graduate students David Bierman and Walker Chan, former postdoc Youngsuk Nam, and research scientist Ivan Celanović. The work was funded by the U.S. Department of Energy through MIT’s Solid-State Solar Thermal Energy Conversion (S3TEC) Center, as well as the Martin Family Society, the MIT Energy Initiative, and the National Science Foundation.



The Hills Are Alive With Ecosystem Research: The Western Mountain Initiative 


Ten years ago, the United Nations General Assembly signed a resolution designating every December 11 as International Mountain Day (IMD). IMD celebrates both the importance of mountain ecosystems to the planet’s food and water resources and the individuals who live within and support those ecosystems.

The tenth anniversary of IMD provides an excellent opportunity to reflect on the contributions of USGS science that support research and conservation programs in the rapidly changing mountain ecosystems of the western United States.

USGS and the Western Mountain Initiative

Mountain ecosystems of the western United States are ideally suited to address ecological questions associated with climate change. They contain:

(1) compressed climatic and biogeographic zones containing many ecosystems within relatively small areas,

(2) rich paleoecological resources, which record past environmental changes and consequent ecosystem responses, and

(3) common ecological drivers, such as snowpack, which facilitate comparisons across ecosystems.

Scientists from the USGS and USDA Forest Service have teamed up to better understand and predict the responses of Western mountain ecosystems to climate change. The research emphasizes sensitivities, thresholds, resistance, and resilience to climate change.

Since the national parks and wilderness areas of the western mountains have experienced minimal human disturbance, the effects of environmental changes on ecosystems can be inferred with fewer confounding influences than on intensively managed lands.

Western mountain ecosystems are important to society, providing water, wood products, carbon sequestration, biodiversity, and recreational and spiritual opportunities.

More than two decades of USGS research provides the foundation for broad syntheses of existing knowledge.

“Western Mountain Initiative research has provided the scientific foundation for resource management and policy decisions for individual national parks and forests, and has also pulled together a west-wide understanding of how and why the effects of climate change and climate variability differ across mountains and elevations,” says Jill Baron, one of the WMI principal investigators. “The regional context provides an important perspective for local decisions related to natural resource adaptation options.”

The Western Mountain Initiative addresses each of the 5-year goals and objectives of the USGS Climate and Land Use Change research and development.

The Rivers of the Colorado Rockies


One of the areas of focus in the Western Mountain Initiative is the Colorado Rocky Mountains and the various rivers and streams that run through the range.

The Rocky Mountains are one of the major mountain ranges of North America, stretching 3,000 miles from British Columbia in western Canada to New Mexico in the southwestern United States. The continental divide, an important hydrologic feature, runs along the crest of the Rockies, separating waters that flow east to the Atlantic Ocean or the Gulf of Mexico from those that flow west to the Pacific Ocean. Within the United States, at least two major rivers, the Rio Grande and the Colorado, as well as numerous tributaries to rivers such as the Missouri and Columbia, have headwaters originating in these mountains.

The Colorado, Rio Grande, Arkansas and South Platte have their headwaters within the state of Colorado, and according to 2005 data from the USGS Colorado Water Science Center, water from these river basins serves as the primary freshwater supply for an estimated 3.5 million people in Colorado alone. The Colorado River basin provides water to some 40 million people in seven states. Water from the Rio Grande basin serves an estimated 2.7 million people in the United States and an additional 6 million in Mexico.


Valley in Rocky Mountain National Park, Colorado.

In the arid western states, where there is insufficient rainfall to grow crops, irrigation and agriculture are closely linked. Water from the Colorado River basin is used to irrigate approximately 3.2 million acres within the basin and is reported to irrigate another 2.5 million acres outside the basin. Irrigating this many acres consumes an estimated 70 percent of the water in the basin.

The conservation of the Colorado Rockies ecosystem and other mountain ecosystems will help to ensure many of the nation’s water supplies for municipal and agricultural needs.

Water-Shedding Some Light on the Data

Rivers gather their flow from watersheds, and many of the mountains in the Western United States contain natural watersheds within their ecosystems.

In 1991, the USGS initiated the Water, Energy, and Biogeochemical Budgets (WEBB) program to understand the processes controlling water, energy and biogeochemical changes over time. Five research watersheds were selected for the study, including the Loch Vale Watershed in the mountains of Colorado.

The Loch Vale Watershed is exceptionally sensitive to man-made atmospheric contamination and to climate change due to its mountainous and alpine environments, limited forest cover, extensive tundra, steep slopes, and rock and snow glaciers. Research at the site has taken advantage of this sensitivity since 1983 by investigating the effects of climate on weathering rates and the effects of nitrogen deposition on the algae in the lake. Research indicates that snowmelt is occurring two weeks earlier than in the late 1970s and that runoff timing has shifted by a similar amount. These trends are strongly correlated with warming springtime temperatures. A warming climate and melting permafrost appear to be affecting groundwater flow and solute fluxes at the site.

In addition to watersheds, mountain ecosystems host a multitude of other exciting natural wonders, such as volcanos.

What You Need to Volca-Know

There are many volcanos in the Western United States. Some are still active and some are dormant with no cause for concern. Though there are many negatives associated with volcanic eruptions, volcanos have multiple positive effects on the environment.

Volcanos provide nutrients to surrounding soil, and volcanic ash often contains minerals that are helpful to many plants. In fact, the finer the ash, the quicker it breaks down and mixes into the soil.

Many volcanos also have very steep slopes that are inaccessible to humans. This terrain can provide refuges to many different types of plant and animal species that would normally be in danger of human interaction.

The Earth’s water and atmosphere can be attributed to volcanic gases; supplying both was a slow process, the result of 4.5 billion years of volcanic activity. As volcanos erupt, they also cool the planet’s interior as hot gases are ejected into the sky.

It All Comes Together


Mountain ecosystems serve as an intersection to many different types of ecological research and environmental studies. The research and data collected through the efforts of the Western Mountain Initiative connects all of the mission areas of the U.S. Geological Survey.



Offshore Wind Farms Could
Protect Us From Hurricanes


Giant offshore wind farms could do more than provide electricity for major cities. They could suck the life and the power out of hurricanes barreling toward those cities, too, according to Stanford University research presented recently at the American Geophysical Union fall meeting.

Stanford civil and environmental engineering professor Mark Z. Jacobson and his research team found that if it was feasible to build tens of thousands of wind power turbines off the shores of some of America’s cities most vulnerable to extreme weather, those cities would see lower wind speeds and less severe storm surges from approaching hurricanes.

The researchers imagined what would have happened if a massive wall of tens of thousands of wind turbines had been built before hurricanes Katrina and Sandy and ran computer simulations of both storms with and without offshore turbines constructed in their paths.

They concluded that the wind turbines could have sapped Katrina of so much energy that wind speeds would have been reduced by up to 50 percent at landfall and the hurricane's storm surge could have been reduced by about 72 percent, Jacobson said. The array also would have generated 0.45 terawatts of wind power.

Hurricane Katrina, which devastated New Orleans and the Louisiana Gulf Coast in 2005, was the costliest and one of the deadliest hurricanes ever to hit the U.S.

Jacobson's Katrina simulations assumed arrays of 70,000 turbines — 300 gigawatts of installed power — had been built 100 kilometers offshore southeast of New Orleans and were designed to withstand winds of up to 50 meters per second, just above the strength of a Category 3 hurricane, or roughly 111 mph.
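A couple of quick unit checks on that set-up (our arithmetic, not figures quoted by the researchers):

```python
# Quick unit checks on the simulation set-up described above.
turbines = 70_000
installed_power_gw = 300
survival_wind_m_s = 50

per_turbine_mw = installed_power_gw * 1_000 / turbines
survival_wind_mph = survival_wind_m_s * 2.23694

print(f"~{per_turbine_mw:.1f} MW per turbine; survival wind of ~{survival_wind_mph:.0f} mph")
# ~4.3 MW turbines and about 112 mph, matching the "roughly 111 mph" quoted above.
```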

The simulations showed that the turbines would create a net energy reduction in the atmosphere, slashing wind speeds as energy was sapped from the storm and dramatically reducing storm surge, which is caused by high winds pushing water inland as a hurricane barrels toward the coast.

Wind speeds would have been reduced enough to allow the wind turbines to survive the storm themselves, with winds never reaching 50 meters per second, above which the turbines could topple.

A similar array of wind turbines just offshore of New York and New Jersey could have reduced wind speeds of Hurricane Sandy by up to 29 meters per second, or 65 mph, with a storm surge reduction of about 21 percent, he said.

“If we have large arrays of offshore wind turbines — large walls of turbines — we could dissipate winds and storm surge quite a bit,” particularly in the vicinity of the turbines themselves, Jacobson said.

Jacobson did not address the feasibility or the political and environmental challenges of building such massive offshore wind farms along hurricane-prone coastlines. Proposals for smaller offshore wind farms have generated significant opposition and controversy, especially along Cape Cod in Massachusetts.

A study Jacobson co-authored in 2012 showed that offshore wind power can generate enough power to meet a third of U.S. energy needs.

“You’re generating electricity year-round, so (an array) would pay for itself,” he said.

Jacobson said he has also envisioned constructing turbines worldwide to produce green energy that would meet half the world's energy needs. He said it would require 4 million wind turbines globally to do so.

"1.5 billion turbines would reduce wind speeds worldwide by 50 percent," Jacobson said.

Asked by an audience member how wind-farm construction on a such a large scale would affect local wind speeds and global weather patterns during normal conditions in the absence of hurricanes and other extreme weather, Jacobson said the large turbine arrays would likely reduce local shoreline wind speeds at most times, but would not likely affect global weather patterns overall, even if offshore wind farms were constructed on a global scale.




Typhoon Haiyan Influenced by
Climate Change, Scientists Say


Extreme storm events such as super typhoon Haiyan, which wreaked havoc in the Philippines on Friday, are more likely in the future as the build-up of greenhouse gases warms the planet, scientists say.

Winds from typhoon Haiyan were estimated to have been 314km/h or higher when the monster storm made landfall on the Philippine island of Samar. That speed, if confirmed, would make it the strongest storm on record, exceeding hurricane Camille, which hit Mississippi in the US in 1969, according to US meteorologist Jeff Masters' WunderBlog.

Australian scientists say gauging the intensity of the storm – which included a tsunami-like storm surge and heavy rainfall – would be difficult because of limited information emanating from the storm-battered region. The death toll from the city of Tacloban alone may exceed 10,000 people, local authorities say.

Professor Will Steffen, a researcher at the ANU and member of the Climate Council, said scientists understand how a hotter, moister climate is already affecting storms such as Haiyan.

“Once [cyclones] do form, they get most of their energy from the surface waters of the ocean,” Professor Steffen said. “We know sea-surface temperatures are warming pretty much around the planet, so that's a pretty direct influence of climate change on the nature of the storm.”

Data compiled from the US National Oceanic and Atmospheric Administration shows sea temperatures were about 0.5 to 1 degree above normal in the waters to the east of the Philippines as Haiyan began forming. The waters cooled in the storm's wake, an indication of how the storm sucked up energy.

Typhoons – or tropical cyclones as they are known in Australia, and hurricanes in the US – require sea-surface temperatures of at least 26.5 degrees to form, according to the Bureau of Meteorology. The low-pressure systems can persist over lower sea-surface temperatures once they get going.

Temperature gradient

Kevin Walsh, an associate professor at the University of Melbourne and an expert in tropical meteorology, said warmer sea-surface temperatures are only one factor in determining the ferocity of a cyclonic event. The key is the temperature difference between those seas and the tops of the storms, high in the troposphere.

While data on temperatures at sea level are well known, gathering information at levels up to about 20 kilometres above the surface was more difficult until recent times; weather balloons and other devices are now more common.

That temperature differential in cyclones, though, is expected to widen as storm heights push higher in the atmosphere, Dr Walsh said.

“In the future, you’re talking about the difference of the sea-surface temperature and the temperature of the troposphere height – that increases even though the upper troposphere warms in a warming world.”


