Wednesday, 24 February 2010

How big is the problem of electronic waste, and can it be tackled?

By Michael McCarthy, Environment Editor

Why are we asking this now?
Because yesterday the UN issued a new report on electronic waste, highlighting the danger from "rocketing" sales of mobile phones, PCs and electronic appliances, in the developing countries especially.
What danger is that?
Modern electronic devices might look clean, sleek and spotless on the outside, but inside they contain a lot of materials used in manufacture which are potentially hazardous to human health. Typical ones are PVC (polyvinyl chloride) plastic, used as an insulator with internal cabling, and brominated flame retardants, chemicals used to laminate printed circuit boards to prevent them catching fire.
Most of these substances can be disposed of safely, but considerable investment in waste-handling infrastructure is needed to do so, and in the past, many countries, especially the US, have declined to make the investment and instead taken the "out of sight, out of mind" attitude, and simply shipped their e-waste abroad, usually to developing nations such as China and India. There, instead of being properly processed, appliances are either dumped in unmanaged landfills or broken up for scrap in unofficial recycling facilities – not infrequently by children.
But why break up dangerous waste?
Electronic goods don't just contain hazardous substances – they contain valuable substances as well. A device such as a laptop may contain as many as 60 different elements – many valuable, some dangerous, some both. To poor people in the developing countries, there can be real money in a discarded computer or mobile phone. Copper wire is just the start of it. Mobiles and PCs are now estimated to take up three per cent of the gold and silver mined worldwide each year, 13 per cent of the palladium and 15 per cent of the cobalt, as well as substantial amounts of very rare metals such as hafnium. But trying to recover these can pose real hazards, as substantial plumes of toxic pollution, for example, can be produced by backyard incineration. And the concern is, the stream of e-waste is growing ever larger around the world.
How big is the e-waste stream?
A couple of years ago the United Nations Environment Programme (UNEP) estimated that, worldwide, between 20 and 50 million tonnes of electrical and electronic goods which had come to the end of their lives were being thrown away every year. The latest UNEP report now estimates the annual total at 40 million tonnes, with America in the lead, producing 3m tonnes domestically every year, followed by China with 2.3m tonnes. (The UK total is thought to be more than 1m tonnes, about 15 per cent of the EU total – it is the fastest-growing waste stream in Britain). But more important, the figure is starting to soar upwards, especially with a gigantic surge of disposable electronics use in the developing countries.
What sort of goods, and in what sort of numbers?
Globally more than a billion mobile phones were sold in 2007, up from 896m in 2006. (In many parts of Africa telephone communications have skipped the landline stage and gone from no phones to mobile phones in one step.) In the US alone, more than 150m mobiles and pagers were sold in 2008, up from 90m five years earlier. The waste streams are correspondingly burgeoning, and the new UN report focuses on China, India and the other relatively poor but expanding economies.
In China, for example, the report predicts that by 2020, e-waste from old computers will have jumped by 200 to 400 per cent from 2007 levels, and the same holds true for South Africa, while the figure for India is a staggering 500 per cent. By that same year in China, e-waste from discarded mobile phones will be about seven times higher than 2007 levels and, in India, 18 times higher, while e-waste from televisions will be 1.5 to 2 times higher in China and India, and in India e-waste from discarded refrigerators will double or triple. Add to that the vast amounts of e-waste that are still being imported from countries such as the US, and you have a quite colossal e-waste mountain in prospect, with its corresponding dangers for human health and the environment. "The issue is exploding," says Ruediger Kuehr, of the United Nations University in Tokyo.
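Read literally, these projections are multipliers on 2007 baselines, so converting them into tonnages is simple arithmetic. A minimal sketch, in which the 2007 baseline figures are illustrative placeholders and not numbers from the report:

```python
# Converting the UN report's growth figures into projected 2020 tonnages.
# A rise "by 500 per cent" means 6x the 2007 level; "18 times higher" is
# read here as a straight 18x multiplier (the report's phrasing is loose).
# The baseline tonnages below are hypothetical, NOT figures from the report.

def grow_by_per_cent(baseline_tonnes, pct):
    """Growth 'by 500 per cent' turns 100,000 t into 600,000 t."""
    return baseline_tonnes * (1 + pct / 100)

def times_higher(baseline_tonnes, factor):
    """'18 times higher' taken as an 18x multiplier on the baseline."""
    return baseline_tonnes * factor

pc_waste_2007 = 100_000      # tonnes, hypothetical baseline for India
phone_waste_2007 = 10_000    # tonnes, hypothetical baseline for India

print(grow_by_per_cent(pc_waste_2007, 500))   # 600000.0
print(times_higher(phone_waste_2007, 18))     # 180000
```

The point of the sketch is only that the report's multipliers compound quickly: even a modest 2007 waste stream becomes a very large 2020 one.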
What can we do about it?
The first thing to do is recognise the problem. The electronics revolution of the past 30 years has seemed different in kind from the original industrial revolution, characterised by smokestacks belching very obvious filth; it has seemed clean, green and lean. But we have gradually come to realise that in two ways in particular, modern hi-tech can be bad for the planet too. The first is its energy use; so enormous is the worldwide scale of IT that electronics now accounts for fully two per cent of global carbon emissions, which is about the same as aviation, whose emissions have become highly controversial. The second is the hardware when it comes to the end of its natural life, which, increasingly, is pretty short. We have been largely ignorant of this increasingly important waste stream, so much so that a Greenpeace report on e-waste two years ago referred to it as "the hidden flow". We need to be aware of it.
Once we've recognised the problem, then what?
The European Union has shown the way by adopting a key principle: producer responsibility – that is, make the producers of electronic goods responsible for their disposal at the end of their lives. This is enshrined in the European Union's WEEE (Waste Electrical and Electronic Equipment) directive of 2002, which is now law in Britain and across the EU. In practice, it means that electronics retailers must either take back the equipment they sold you, or help to finance a network of drop-off points, such as council recycling sites. There have been some problems with the directive's initial operation, but its main feature is impressive in its ambition: it aims to deal with "everything with a plug".
Has producer responsibility been adopted elsewhere?
Hardly at all as yet, and the EU is very much in the vanguard. The US did nothing in terms of federal legislation during the George W Bush years, and such rules as exist are implemented by the states, such as California. The new UN report suggests that all countries should start to establish proper e-waste management networks, which could not only cut down on health problems but generate employment, cut greenhouse gas emissions and recover a wide range of valuable substances from gold to copper.
Is there anything else that can be done?
Yes: design the problem out. Groups such as Greenpeace have led the way in putting pressure on companies like Apple to find substitutes for the toxic chemicals inside their products, and have had some success in forcing them to develop non-toxic alternatives. This may be the real way forward.
Is the rising tide of e-waste going to swamp us?
No...
* Once we recognise the problem, it becomes possible to deal with it, and the need is paramount
* The adoption of producer responsibility for disposal, as championed by the EU, is a major step forward
* Some of the hazards can actually be designed out, and that must be a priority for manufacturers
Yes...
* The growth of the global e-waste stream is becoming simply too large to handle
* In many countries there are no incentives to install official recycling schemes
* Informal recycling is so large in countries such as China that it will hamper official schemes

Academic attempts to take the hot air out of climate science debate

Judith Curry aims to turn inflammatory debate of 'climategate' into reasoned online discussions to rebuild trust with the public

Professor Judith Curry, who currently chairs the Georgia Institute of Technology's School of Earth and Atmospheric Sciences, has embarked on what she's describing as a "blogospheric experiment". Having written a lengthy essay entitled Losing the Public's Trust which will be published later today, she decided to alert many bloggers across the climate change debate in "the hope of demonstrating the collective power of the blogosphere to generate ideas and debate them". She has asked the likes of Anthony Watts, Andrew Revkin, Roger Pielke Jr, among many others, to pitch in with their own thoughts about her essay with the goal of "bringing some sanity to this whole situation surrounding the politicization of climate science and rebuilding trust with the public". I genuinely hope she achieves her aims.
As and when other bloggers publish their own responses I will try and provide links to them below, but here are my own thoughts on Curry's article. First, I agree with her opening premise that "credibility is a combination of expertise and trust" and that the climate research establishment has failed to understand that the "climategate" furore is "primarily a crisis of trust".
Curry writes: "In their misguided war against the skeptics, the CRU emails reveal that core research values became compromised. Much has been said about the role of the highly politicized environment in providing an extremely difficult environment in which to conduct science that produces a lot of stress for the scientists. There is no question that this environment is not conducive to science and scientists need more support from their institutions in dealing with it. However, there is nothing in this crazy environment that is worth sacrificing your personal or professional integrity. And when your science receives this kind of attention, it means that the science is really important to the public. Therefore scientists need to do everything possible to make sure that they effectively communicate uncertainty, risk, probability and complexity, and provide a context that includes alternative and competing scientific viewpoints. This is an important responsibility that individual scientists and particularly the institutions need to take very seriously."
If the "climate research establishment" is to take away one lesson from this sorry episode it will surely be the need to "effectively communicate uncertainty, risk, probability and complexity, and provide a context that includes alternative and competing scientific viewpoints".
Up to this point I strongly agree with Curry's sentiments, but I think she is a little complacent in her assessment of the "changing nature of scepticism about global warming". She correctly identifies that climate scepticism is a multi-headed and ever-shifting beast. There are as many flavours to the sceptics as there are to environmentalists. To label them all as flat-earthers and big oil deniers is just as ill-judged and lacking in subtlety as labelling all environmentalists as "eco-Nazis intent on taking us all back to the caves". Genuine climate science sceptics such as Climate Audit's Steven McIntyre are a world apart from the out-and-out denial pumped out by the likes of Prison Planet's Alex Jones. Somewhere in between are the likes of Anthony Watts who risks polluting his legitimate scepticism about the scientific processes and methodologies underpinning climate science with his accompanying politicised commentary. But Curry bags them up together and describes Watts and McIntyre both as "climate auditors":
They are technically educated people, mostly outside of academia. Several individuals have developed substantial expertise in aspects of climate science, although they mainly audit rather than produce original scientific research. They tend to be watchdogs rather than deniers; many of them classify themselves as "lukewarmers". They are independent of oil industry influence. They have found a collective voice in the blogosphere and their posts are often picked up by the mainstream media. They are demanding greater accountability and transparency of climate research and assessment reports… So how did this group of bloggers succeed in bringing the climate establishment to its knees (whether or not the climate establishment realizes yet that this has happened)? Again, trust plays a big role; it was pretty easy to follow the money trail associated with the "denial machine". On the other hand, the climate auditors have no apparent political agenda, are doing this work for free, and have been playing a watchdog role, which has engendered the trust of a large segment of the population.
I think Curry has misjudged this point a tad. If the "climate auditors" were exactly as billed above I would agree they are a most welcome addition to the debate. But to claim these blogs have no political agenda is naïve, I feel. Granted, both McIntyre and Watts do make regular efforts to tone down some of the very worst off-topic comments that follow their posts, but it doesn't take much analysis to know where the political heartbeat of these blogs lies. For right or wrong, they have attracted a particular crowd of followers – predominantly right-wingers in favour of the free-market and libertarianism – and it must be a difficult horse for McIntyre and Watts to ride at times without playing to the crowd.
Curry goes on to say:
There is a large group of educated and evidence driven people (eg, the libertarians, people that read the technical skeptic blogs, not to mention policy makers) who want to understand the risk and uncertainties associated with climate change, without being told what kinds of policies they should be supporting.
I think this is an important point. Some sceptics such as Bjørn Lomborg and Nigel Lawson have made a very conscious shift in their stance in recent years away from one that questioned the science to one that now largely focuses on questioning the policy responses to climate change. If we are to have a fierce, politicised debate let it lie here, surely. But let's keep the politics out of both the climate science and those that choose to try and audit it via their blogs.
And it is on this point that I think Curry makes her most powerful point:
While the blogosphere has a "wild west" aspect to it, I have certainly learned a lot by participating in the blogospheric debate including how to sharpen my thinking and improve the rhetoric of my arguments. Additional scientific voices entering the public debate particularly in the blogosphere would help in the broader communication efforts and in rebuilding trust. And we need to acknowledge the emerging auditing and open source movements in the internet-enabled world, and put them to productive use. The openness and democratization of knowledge enabled by the internet can be a tremendous tool for building public understanding of climate science and also trust in climate research.
I, too, think it would be a grave mistake not to make better use of the obvious open-source and crowd-source advantages enabled by blogs such as Climate Audit. Just as the SETI@Home project has made use of thousands of otherwise idle computers to scan radio telescope data for signs of extraterrestrial life, if people are willing and able to interrogate climate datasets in their spare time it would be strange in my view not to try and make use of this collective resource.
But the key for me is that word "trust" again. I think that until those who frequent these sites come out from behind the cloak of anonymity that most of them choose to hide behind, very few people, particularly climate scientists, will be willing to trust the motives of this army of DIY auditors. Anonymity allows for some spicy free speech beneath blogs such as this one, but it is not the right tool if we're seeking the "openness and democratization of knowledge". If we are to once again try and drive a wedge between science and politics, then all the participating actors, on both sides of the debate, need to be open about who they are and where their motives and vested interests, if any, lie.

Scotland not doing enough to meet emissions target, ministers told

Emissions cuts from cars, homes and farming are key if Scotland is to meet its target of a 42% reduction, new report says
Severin Carrell, Wednesday 24 February 2010 11.39 GMT
Scottish ministers have been warned they need to aggressively target carbon emissions from car use, home energy and farming if Scotland is to meet its ambitious target of cutting CO2 levels by 42% in the next decade.
The Committee on Climate Change, an influential government advisory body chaired by Lord Adair Turner, has told Alex Salmond's nationalist government it needs to show much greater "political will and leadership" if Scotland is to build a truly low-carbon economy by 2020.
In a report released today, the committee complimented the devolved government for setting "ambitious targets", and confirmed they were tougher and farther-reaching than the UK government's interim target of a 34% cut by 2020.
The UK government has promised to increase that target to around 42% if a global climate deal is agreed upon, but unlike Scotland, it has refused to include aviation and shipping in its calculations or to set annual targets.
Stop Climate Chaos Scotland, an influential umbrella group of more than 60 environment groups, faith groups, civic organisations and development charities, said the committee's conclusions would increase pressure on UK ministers to set a similar and binding target.
Mike Robinson, the group's chair, said: "This is a great opportunity for the UK government to be ahead of the curve and show some leadership. I do think this shows the UK should up its game. The world needs more ambitious targets."
David Kennedy, the committee's chief executive, stopped short of endorsing that view. But the committee's report confirmed the widely held belief that Scotland's target is heavily dependent on the negotiation of a new global climate treaty. After the failure of the Copenhagen talks last December, no deal is expected before next year.
The Scottish government has direct control over only a minority of Scotland's CO2 emissions, which in 2007 amounted to 56.9m tonnes a year. The committee did not establish the extent of that control, but Scottish officials said it is roughly 30%.
The bulk of Scotland's CO2 emissions are covered by either the EU emissions trading scheme for large energy users, such as power stations, or UK government policies on fuel and car taxes.
The committee warned that a failure to sign a global deal on emissions would make it extremely difficult to hit the 42% target. Even with a deal, though, it said Scotland still needs a "step change" in its policies on transport, housing, waste and agriculture, and to aggressively push renewable power through the planning system.
It specifically recommended greater efforts to promote electric cars: successive Scottish governments and local councils have been far slower than English authorities to invest in low-energy transport, such as hybrid buses or electric vehicle charging points. The SNP has also been strongly criticised for its substantial road-building programme.
The committee said, however, that including aviation and shipping, sectors ignored by EU and UK carbon budgets, meant other parts of the Scottish economy would bear a heavier burden for cutting emissions, increasing the scale of the overall challenge.
Professor Jim Skea, a member of the committee, said: "These are ambitious targets that go further than those in the rest of the UK. A step change will be needed to unlock potential emissions reductions in Scotland, but we believe this to be achievable with new policies."
Late yesterday, Stewart Stevenson, the Scottish climate change minister, retracted an earlier statement saying the report was a "robust and complex piece of work", and did not directly respond to the committee's challenge on strengthening government policy.
In a revised statement, he said: "The need to take action to reduce our emissions is clear and everyone has a role to play in helping Scotland meet its world leading climate change targets.
"Achieving the necessary reductions in emissions will require hard decisions, not only by governments but also by businesses, the public sector, voluntary and community groups and individuals."

Reject sceptics' attempts to derail global climate deal, UN chief urges

Ban Ki-moon urges environment ministers to reject attempts by sceptics to undermine negotiations by exaggerating shortcomings in Himalayan glaciers report

Associated Press, Wednesday 24 February 2010 10.20 GMT

The UN secretary-general, Ban Ki-moon, today urged environment ministers to reject attempts by sceptics to undermine efforts to forge a climate change deal, stressing that global warming poses "a clear and present danger".
In a message read by a UN official, Ban referred to the controversy over mistakes made in a 2007 report issued by the UN-affiliated Intergovernmental Panel on Climate Change (IPCC) which have been criticised by climate sceptics.
Despite the failure to forge a binding deal on curbing heat-trapping greenhouse gas emissions at a UN conference in Copenhagen last December, Ban said the meeting made an important step forward by setting a target to keep global temperature from rising and establishing a program of climate aid to poorer nations.
"To maintain the momentum, I urge you to reject last-ditch attempts by climate sceptics to derail your negotiations by exaggerating shortcomings in the ... report," Ban said in the statement read at the start of an annual UN meeting of environmental officials from 130 countries on the Indonesian resort island of Bali.
"Tell the world that you unanimously agree that climate change is a clear and present danger," Ban said. A British poll yesterday showed public conviction about the threat of climate change has declined sharply in the last year.
The Indonesian president, Susilo Bambang Yudhoyono, said time was running out, but expressed confidence that a binding climate change deal could be forged at the next climate change summit later this year in Cancun, Mexico.
"I'm convinced that we're still not too late," he said at the Bali conference.
Indonesian foreign minister, Marty Natalegawa, said Indonesia will hold an informal meeting of all environmental ministers and officials from 130 countries Friday in Bali to discuss ways of ensuring that a binding treaty on greenhouse gas cutbacks could be forged in Cancun.
"No sealed deal happened in Copenhagen, so it's now more urgent than ever for us to work diligently between now and Mexico," Natalegawa told The Associated Press in an interview.
"It should have been urgent last year, but we didn't live up to that urgency," he said.

GM and farming technology 'key to fighting climate change'

Lord Smith tells National Farmers' Union that climate change 'could provide opportunities for novel crops and systems'

Rebecca Smithers, consumer affairs correspondent, Wednesday 24 February 2010 06.00 GMT

Genetically modified oilseed rape, one of the four main commercial GM crops. The Environment Agency is encouraging GM and other precision farming methods in order to combat the problems agriculture will face from climate change. Photograph: Christopher Furlong/Getty
The government's drive to push controversial genetically modified crops up the national agenda will receive a further boost today, when former cabinet minister Chris Smith will tell farmers that the technology has a key role in helping the UK beat climate change.
Lord Smith, former culture secretary under Tony Blair and now chair of the Environment Agency, will say that both GM crops and new technologies to support "precision farming" - including nanotechnology - could help tackle growing climate pressures such as water shortages.
Addressing delegates at the National Farmers' Union's (NFU) annual conference in Birmingham, Lord Smith will tell farmers that climate change "will create new demands on land and environmental resources" and "could provide opportunities for novel crops and systems".
Intense lobbying by food companies, the growing significance of climate change, recent international food crises and shortages and a major independent Royal Society report have all helped to give the government the authority to put GM back on the national agenda. The controversial technology was the focus of intense campaigns including destruction of GM crop trials by environmentalists in the 1990s, and last month came under renewed attack from academics and organic food campaigners at the Oxford Real Farming Conference.
Lord Smith will say: "We can already see wildlife following climate change – the mayfly is now found some 40 miles further north than before and warmer winters and wetter summers are thought to be a major factor in the rapid decline of pollinating insects with UK bee populations, in particular, falling by 10-15% over the last two years.
"The reliance on seasonal weather patterns means that farming will follow climate change too. My own personal view is that we probably need to be readier to explore GM options, coupled of course with proper environmental safeguards, in adapting to the changes that the climate will bring."
The GM industry now involves 14 million farmers in 25 countries who are growing 134m hectares of GM crops around the world. This is a 7% increase compared with last year.
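As a quick sanity check, the two acreage figures above are mutually consistent: a 7% year-on-year increase to 134m hectares implies roughly 125m hectares the year before. A throwaway calculation (the rounding is mine):

```python
# Implied prior-year GM crop area from the article's figures:
# 134m hectares after a 7% year-on-year increase.
current_mha = 134
prior_mha = current_mha / 1.07
print(round(prior_mha, 1))  # 125.2
```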
Lord Smith will recommend more use of new technology: "New tools and technologies are becoming available, nanotechnology for example, as well as the use of satellites, IT and other tools to support precision farming. We need to understand the environmental implications of novel approaches in order to embrace them and be clear how they will help us achieve long-term goals.
"We need to ensure that science is at the forefront of development and innovation and that effective knowledge transfer means farmers can adapt and innovate. Innovation has already seen British agriculture adapt to the economic challenges it has faced over the last 15 years or so and I know it will do so into the future."
Organic farmers have been more resistant to the use of GM than "conventional" farmers represented in the membership of the NFU, although the latter broadly agrees that any such developments must be subject to proper scientific evaluation.
Yesterday Paul Kelly, founder of Kelly's Turkeys, told the conference: "GM has had a terrible press and consumers are very confused. But it is only a matter of time before we are feeding our turkeys GM feed."
As well as exploring the potential of new crops and technologies, Lord Smith will underline the need for agriculture to become more water efficient as climate change ushers in longer, hotter, drier summers.
On the opening day of the conference yesterday, the Conservatives set out plans to prevent development on top quality farmland, reform the body which delivers EU subsidies to farmers and set up a review of red tape as part of efforts to back British farming.
The Liberal Democrats also set out proposals to improve delivery of subsidies by the Rural Payments Agency, which in 2006 left farmers without EU grants after problems with its computer system.

Monday, 22 February 2010

Where Batteries Go to Be Tortured

If something can go wrong with lithium-ion cells, better in the 'abuse lab' than in your car
ALBUQUERQUE, N.M.—It's known as the "abuse lab."
And with good reason.
At Sandia National Laboratories, scientists purposely crush, overheat, and salt batteries to see how much abuse they can take before exploding. The effort is all in the name of reducing the risk to consumers who use laptops and drive electric cars. WSJ's Stephanie Simon reports.

Behind a 2,000-pound blast door, federal researcher Peter Roth spends his days torturing electric-car batteries. He overcharges them, drives nails into them, presses them between scalding brass plates. He dunks them in salt water, sets them on fire, crushes them, drops them, dissects them. Again and again, he watches them explode.
The goal: To make sure all this mayhem happens in his lab—and not in your car. Because, as Mr. Roth says dryly, "One bad incident can spoil the public's opinion."
Electric cars are about to hit the market en masse: General Motors Co. plans to launch its Chevrolet Volt this fall; Nissan Motor Co.'s Leaf is slated to debut by December; and new gas-electric hybrids are in the works at Toyota Motor Corp., Honda Motor Co. and Volkswagen AG's Audi unit. Even BMW AG has an electric compact in the works. Nearly all are powered by lithium-ion batteries, which pack six times the punch of a standard lead acid car battery and more than twice as much as the nickel-metal-hydride batteries used in hybrids such as the Toyota Prius.
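The energy-density ratios quoted above can be turned into rough absolute numbers. A back-of-envelope sketch, assuming a typical lead-acid specific energy of about 35 Wh/kg (my assumption, not a figure from the article):

```python
# Rough specific-energy comparison implied by the article's ratios.
# ASSUMPTION: ~35 Wh/kg for a lead-acid car battery (a common round figure).
lead_acid_wh_per_kg = 35
li_ion_wh_per_kg = 6 * lead_acid_wh_per_kg     # "six times the punch" of lead acid
nimh_upper_wh_per_kg = li_ion_wh_per_kg / 2    # li-ion is "more than twice" NiMH,
                                               # so this bounds NiMH from above

print(li_ion_wh_per_kg)        # 210
print(nimh_upper_wh_per_kg)    # 105.0
```

On that assumption, the article's ratios put lithium-ion cells in the low hundreds of Wh/kg, which is why the same chemistry that runs a laptop can plausibly run a car.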
Lithium-ion batteries have been around for years in cellphones, laptops and other consumer electronics. Even on this small scale, the batteries have caused sporadic trouble; several computers have been recalled in recent years after their lithium-ion batteries were found to spontaneously catch fire. Scaling up the technology enough to power a car raises fresh safety and reliability questions.
Shoestring Operation
That's where Mr. Roth comes in. In a windowless warren of small test bays—several singed with the soot from past explosions—Mr. Roth seeks to discover what can go wrong with different types of lithium-ion cells, and under what conditions. His guiding principle? "If you build it, it can fail."

The abuse lab is located at Sandia National Laboratories, in a high-security building mostly used for nuclear research; the reception desk displays a small sign instructing couriers: "All explosives go to Room 1107."
Much of Mr. Roth's research is funded by the Department of Energy, which recently awarded the lab $4.2 million in stimulus money to upgrade equipment. Auto companies and battery makers also pay the lab directly for tests on proprietary technology. "It's our key go-to national lab for abuse tolerance testing," says Ted Miller, a senior manager of energy storage strategy and research at Ford Motor Co.
Mr. Roth's lab for the most part studies lithium-ion batteries—from single cells that can be smaller than a tube of lipstick to full-size automobile battery packs weighing several hundred pounds. Then there are the thin silver pouch batteries, which look fit and trim when they are new but "swell up like a Jiffy-Pop bag when they go bad," Mr. Roth says, pulling out badly charred, misshapen pouches.
He and his research partner, Chris Orendorff, emphasize that they are testing these batteries under worst-case scenarios, often after disabling internal controls. "Then we can develop strategies to mitigate those problems," Mr. Orendorff says. "Knowledge is power."
The lab, whose clients also include the National Aeronautics and Space Administration, the U.S. military and consumer-electronics manufacturers, relies on an unlikely mix of sophisticated equipment and home-made contraptions for research.
The scientists use a state-of-the-art accelerating rate calorimeter to measure the heat generated by various types of batteries when they begin to overheat and a Fourier transform infrared spectrometer, which can cost about $50,000, to analyze the gases released as a battery breaks down after a catastrophic failure. Thanks to the stimulus funds, the lab will soon get a CT scanner, for peering inside single cells, and a thermal chamber to test how batteries react to extreme temperatures—anything from minus 70 to 200 degrees Celsius.
Yet the researchers rely on an old locomotive relay switch to perform other critical tests, such as the short circuit (a battery is hooked up and the switch is thrown closed, which causes an immediate and intense short circuit). They protect the hydraulic lines on another machine with crumpled tin foil. And their computing center looks like it was built in the 1980s and never updated. "It's a bit of a shoestring operation," Mr. Roth says.

With a shock of white hair, an unruly beard and impish eyes, Mr. Roth, 62 years old, looks every bit the mad scientist as he bounds through the lab reminiscing about disasters he has engineered. In one memorable test a few years back that didn't involve lithium-ion technology, he overcharged a battery made of 48 cells lashed together, then exposed it to sparks. The cells immediately began venting a tremendous cloud of gas and then exploded like fireworks, ricocheting off walls and disintegrating so completely that nothing was left but a thick layer of grit so toxic that cleanup crews had to wear hazmat suits.
His report to the manufacturer was simple, Mr. Roth says: "Back to the drawing board."
As for his test bay? "We put in a steel ceiling after that," he says.
Making an Impact
Overcharging is one of Mr. Roth's standard tests. He has repeatedly found failings in the electronic monitors that are supposed to shut off the current when the battery is full. That can cause overheating—known as "thermal runaway"—and explosions.
Armed with this data, battery manufacturers have developed a backup system of mechanical circuit breakers that interrupt the current flow when the battery's temperature begins to climb to unstable levels.
The Sandia lab also compares the safety of various chemistries used for the positive and negative electrodes in a lithium-ion battery. Much of this information is confidential, but the researchers say that certain materials are clearly superior in terms of safety and that the industry is shifting in that direction.
"They've made a significant contribution to automotive technology," says Menahem Anderman, president of Advanced Automotive Batteries, a consulting firm in Oregon House, Calif.
Mr. Roth's lab has also dispelled some fears. Manufacturers worried, for instance, that if a car plunged off a bridge, its lithium-ion battery might electrify the water and shock first responders. Mr. Roth tested the scenario and dismissed the concern as unfounded.
Mr. Roth, who plans to retire this spring and turn over the lab to Mr. Orendorff, says that in more than a decade studying battery technology he has seen huge advances in safety and has been impressed by the industry's attentiveness to his research.
So does he plan to buy a car powered by a lithium-ion battery? He hesitates. "I will certainly be inclined to buy one eventually," he says. "But I am disinclined to buy the first of anything." — Ms. Simon is a staff reporter in The Wall Street Journal's Dallas bureau.

What Utilities Have Learned From Smart-Meter Tests...

...And why they aren't putting those lessons to use
Utilities have learned a lot about how smart meters can compel consumers to save electricity. Unfortunately, too often they aren't putting the knowledge to good use.
Smart meters are more precise than traditional meters in that they send readings on electricity usage to utility billing departments throughout the day. Not only do smart meters provide customers with a clearer picture of how they use electricity on a daily basis, they also make it possible for utilities to charge more for power when demand is highest—in the afternoon—and less when usage falls off—at night.
By making variable pricing plans possible, smart meters are expected to play a big role in getting customers to reduce their peak-hour energy consumption, a key goal of utility executives and policy makers. Electricity grids are sized to meet the maximum electricity need, so a drop in peak demand would let utilities operate with fewer expensive power plants, meaning they could provide electricity at a lower cost and with less pollution.
Utilities have run dozens of pilot tests of digital meters and found that people cut power consumption the most when faced with higher peak-hour rates. But utility executives and regulators have been reluctant to implement rate plans that penalize people for too much energy use, fearing that if customers associate smart meters with higher bills, the backlash will stall the technology's advance just as it is gaining traction. Only about 5% of U.S. electric meters are "smart" today, according to the U.S. Department of Energy, but that figure is expected to grow to about one-third in the next five years.
So, many utilities are trying an approach that is less controversial, but also less effective: offering rebates to customers who conserve energy in key periods of the day. By doing things like turning off clothes dryers and adjusting air conditioners on hot summer afternoons, customers earn credits that can reduce their electricity bills.
Preventing Rebellions
"Most CEOs struggle over this issue more than anything else," says Ted Craver, chief executive of Edison International, the Rosemead, Calif., parent of Southern California Edison, which is in the midst of a massive smart-meter rollout. "You could have a real rebellion" if smart meters push up customers' rates, especially if utilities' other capital expenses are increasing, he says.

Pacific Gas & Electric Co., a unit of PG&E Corp., got a taste of the public-relations risk last summer when it installed smart meters in Bakersfield, Calif., as part of a broad upgrade in its Northern California service territory. When customers—who weren't participating in any sort of experimental rate plan—received dramatically higher bills shortly afterward, they blamed the meters for what they assumed was faulty billing. The San Francisco utility investigated and concluded that the meters were functioning properly. It found that the higher bills were simply a case of unfortunate timing: An increase in conventional rates had taken effect just ahead of unseasonably hot temperatures.
"What it told us is that people aren't really knowledgeable about smart meters coming down the pike and they don't pay much attention to rate changes," says Dian Grueneich, a member of the California Public Utilities Commission, which is monitoring the situation. "It told us there needs to be a lot of consumer education" before making big changes.
PG&E now has a voluntary program in which customers agree to pay higher peak rates of 60 cents a kilowatt-hour for no more than 15 days a year, in exchange for a discount of three cents a kilowatt-hour for electricity used at other times. So far some 26,000 customers have signed up.
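As a back-of-envelope sketch of the trade-off in that opt-in plan: only the 60-cent peak rate, the 3-cent discount and the 15-day cap below come from the program as described; the baseline rate and the household usage figures are hypothetical assumptions.

```python
# Hypothetical illustration of PG&E's voluntary plan described above.
# The 15 "critical" days carry a 60-cent peak rate; all other usage
# earns a 3-cent discount. The baseline rate and usage figures are
# assumptions, not PG&E's actual tariff.

BASELINE = 0.15          # assumed flat rate, $/kWh (hypothetical)
PEAK_RATE = 0.60         # critical-peak rate from the article
DISCOUNT = 0.03          # off-peak discount from the article
CRITICAL_DAYS = 15       # maximum number of critical-peak days per year

annual_kwh = 7000                  # assumed annual household usage
peak_kwh_per_critical_day = 5      # assumed usage during peak hours

peak_kwh = CRITICAL_DAYS * peak_kwh_per_critical_day
other_kwh = annual_kwh - peak_kwh

flat_bill = annual_kwh * BASELINE
plan_bill = peak_kwh * PEAK_RATE + other_kwh * (BASELINE - DISCOUNT)

print(f"flat-rate bill: ${flat_bill:,.2f}")
print(f"opt-in bill:    ${plan_bill:,.2f}")
```

Under these assumed numbers the customer comes out ahead, because the year-round discount on ordinary usage outweighs the premium paid during the 15 critical-peak events.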
To date, 16,000 to 18,000 people have participated in more than five dozen pilot tests involving smart meters and experimental rate plans, according to Ahmad Faruqui, a consultant with the Brattle Group who has helped utilities develop some of the programs. He says that while it is sometimes disheartening to see utility executives ignore their own findings, he understands the desire to move slowly until people become comfortable with smart-meter technology.
Pepco Holdings Inc. recently did a pilot test in Washington, D.C., of three rate plans designed to gauge how customers respond to different price signals. One plan pegged the price, which ranged from a penny to 37 cents a kilowatt-hour, to the wholesale cost of electricity. One charged a "critical peak price" of 75 cents a kilowatt-hour during certain hours on a handful of days, and 11 cents per kwh at other times. The final plan gave customers 75 cents for each kilowatt-hour of energy saved and charged 11 cents per kwh for power used.
Results showed that people responded most when threatened with the 75-cent-per-kwh peak pricing. Those customers cut their overall energy consumption between 22% and 34%, depending on whether they also had programmable thermostats that could automatically change temperature settings. Customers offered rebates reduced their usage 9% to 15%—again, with the deeper cuts among those who had smart thermostats.
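One way to see why the 75-cent peak price outperformed the rebate is to sketch a sample month's bill under each plan. The prices are those quoted above; the usage and baseline figures below are hypothetical.

```python
# Sketch of Pepco's critical-peak and rebate plans using the prices
# quoted above; the monthly usage figures are hypothetical.

PEAK = 0.75      # $/kWh during critical-peak hours
OFFPEAK = 0.11   # $/kWh at all other times
REBATE = 0.75    # $/kWh credited for each kWh saved vs. a baseline

def critical_peak_bill(peak_kwh, other_kwh):
    # Every kWh consumed during a critical event is billed at 75 cents.
    return peak_kwh * PEAK + other_kwh * OFFPEAK

def rebate_bill(used_kwh, baseline_kwh):
    # All consumption costs 11 cents; only the *reduction* earns a credit.
    saved = max(baseline_kwh - used_kwh, 0)
    return used_kwh * OFFPEAK - saved * REBATE

# A household that trims 20 kWh of critical-peak use in a month:
print(f"critical-peak plan: ${critical_peak_bill(10, 570):.2f}")
print(f"rebate plan:        ${rebate_bill(580, 600):.2f}")
```

Under the peak plan the remaining 10 kWh of event-hour use still cost 75 cents each, so failing to cut back is punished directly; under the rebate plan, doing nothing merely forfeits the credit.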
Despite evidence that sticks are better motivators than carrots, the utility intends to offer rebates in the future in an effort to change behavior. "Our general sense is that consumers would prefer a rate structure with no downside," says Steven Sunderhauf, a program manager for Pepco. "From a purist's standpoint, I may prefer critical peak pricing because it gets the boldest response…but using rebates will help people get comfortable with smart meters."
Offering Protections
In addition to fearing a customer rebellion, utility executives and regulators worry that the introduction of peak pricing for the hottest or coldest days of the year could harm vulnerable members of society. Many experts feel that not enough research has been done to protect those who aren't able to change their electricity usage.
"I'm mindful that an elderly person with medical equipment can't say, 'I'm not going to run the equipment at the 'peak' time,' " says Kevin DelGobbo, a Connecticut utilities commissioner. "We have to be careful with these rate structures."
The same holds true for commercial customers—some may not have the option of cutting usage on weekday afternoons.
Last summer, Connecticut Light & Power Co., a subsidiary of Northeast Utilities Service Co., gave new meters to 3,000 residential and business customers, testing three types of rates. Like other utilities, it found that homes facing the highest peak-hour pricing—$1.60 per kwh at certain times—responded the most, cutting peak use 16% to 23%, depending on whether they had other aids like smart thermostats. Commercial customers, in a similar test, cut their demand far less, only 7%.
That was instructive, says Jessica Brahaney Cain, director of CL&P's smart-grid planning, because it told the utility that many commercial customers don't have the option of cutting usage during times of peak demand. "A restaurant has to use its ovens," she says. "A dentist has to use his drills."
One surprise, says Ms. Cain, was that almost all of the customers who participated in the pilot test reported more satisfaction with the Berlin, Conn., utility than those who didn't. They liked that the meters gave them greater insight into how they use electricity, she says.
CL&P expects to file a plan for mass-meter deployment and new dynamic-pricing plans by the end of March. It plans to offer rebates for conservation, at least in the beginning.
Better Tools
Southern California Edison says it also plans to adopt a rebate strategy by the end of the year, even though it won't have all its meters in place until 2012.
"If customers do nothing, they'll get the same bill they otherwise would get," says Lynda Ziegler, senior vice president for customer service at the Edison International unit. Those that cut peak consumption will get a credit of 75 cents to $1.25 for each kilowatt-hour of reduction. The main concern of regulators, she says, is making sure meter readings are accurate.
The utility chose rebates over penalties partly because a law passed during the California energy crisis a decade ago limits its ability to involuntarily switch people to higher peak-hour pricing plans right now. A new law may allow it after 2013.
But the utility also concluded that it wouldn't be fair to really crank up peak pricing until homeowners have greater access to automation tools such as smart appliances and controllers. In the future, devices will contain computer chips and software so they can go into energy-saving mode in response to a signal sent from the utility or another energy manager that higher prices are kicking in. — Ms. Smith is a Wall Street Journal staff reporter in San Francisco.

Methane levels may see 'runaway' rise, scientists warn

A rapid acceleration may have begun in levels of a gas far more harmful than CO2
By Michael McCarthy, Environment Editor
Monday, 22 February 2010
Atmospheric levels of methane, the greenhouse gas which is much more powerful than carbon dioxide, have risen significantly for the last three years running, scientists will disclose today – leading to fears that a major global-warming "feedback" is beginning to kick in.
For some time there has been concern that the vast amounts of methane, or "natural gas", locked up in the frozen tundra of the Arctic could be released as the permafrost is melted by global warming. This would give a huge further impetus to climate change, an effect sometimes referred to as "the methane time bomb".

This is because methane (CH4) is even more effective at retaining the Sun's heat in the atmosphere than CO2, the main focus of international climate concern for the last two decades. Over a relatively short period, such as 20 years, CH4 has a global warming potential more than 60 times as powerful as CO2, although it decays more quickly.
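As a minimal worked example of what that 20-year figure means in practice (the emission quantity below is purely illustrative, and the multiplier is the lower bound of the "more than 60 times" figure quoted above):

```python
# Converting a methane emission to its CO2-equivalent over 20 years,
# using the "more than 60 times" potency figure quoted above.
# The 2.5-tonne emission is purely illustrative.

GWP_20_CH4 = 60  # 20-year global warming potential, lower bound

def co2_equivalent(ch4_tonnes, gwp=GWP_20_CH4):
    """CO2-equivalent tonnes of a methane emission over the GWP window."""
    return ch4_tonnes * gwp

print(co2_equivalent(2.5))  # 2.5 t of CH4 warms like 150.0 t of CO2
```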
Now comes the first news that levels of methane in the atmosphere, which began rising in 2007 when an unprecedented heatwave in the Arctic caused a record shrinking of the sea ice, have continued to rise significantly through 2008 and 2009.
Although researchers cannot yet be certain, and there may be non-threatening explanations, there is a fear that rising temperatures may have started to activate the positive feedback mechanism. This would see higher atmospheric levels of the gas producing more warming, which in turn would release more methane, which would produce even further warming, and so on into an uncontrollable "runaway" warming effect. This is believed to have happened at the end of the last Ice Age, causing a very rapid temperature rise in a matter of decades.
The new figures will be revealed this morning at a major two-day conference on greenhouse gases in the atmosphere, taking place at the Royal Society in London. They will be disclosed in a presentation by Professor Euan Nisbet, of Royal Holloway College of the University of London, and Dr Ed Dlugokencky of the Earth System Research Laboratory in Boulder, Colorado, which is run by the US National Oceanic and Atmospheric Administration (NOAA).
Both men are leading experts on CH4 in the atmosphere, and Dr Dlugokencky in particular, who is in charge of NOAA's global network of methane monitoring stations, is sometimes referred to as "the keeper of the world's methane". In a presentation on "Global atmospheric methane in 2010: budget, changes and dangers", the two scientists will reveal that, after a decade of near-zero growth, "globally averaged atmospheric methane increased by [approximately] 7ppb (parts per billion) per year during 2007 and 2008."
They go on: "During the first half of 2009, globally averaged atmospheric CH4 was [approximately] 7ppb greater than it was in 2008, suggesting that the increase will continue in 2009. There is the potential for increased CH4 emissions from strong positive climate feedbacks in the Arctic where there are unstable stores of carbon in permafrost ... so the causes of these recent increases must be understood."
Professor Nisbet said at the weekend that the new figures did not necessarily mark a new excursion from the trend. "It may just be a couple of years of high growth, and it may drop back to what it was," he said. "But there is a concern that things are beginning to change towards renewed growth from feedbacks."
The product of biological activity by microbes, usually in decaying vegetation or other organic matter, "natural gas" is emitted from natural sources and human activities. Wetlands may give off up to a third of the total amount produced. But large amounts are also released from the production of gas for fuel, and also from agriculture, including the production of rice in paddy fields and the belches of cows as they chew the cud (which is known as "bovine eructation"). However, methane breaks down and disappears from the atmosphere quite quickly, and until recently it was thought that the Earth's methane "budget" was more or less in balance.
Global atmospheric levels of the gas now stand at about 1,790 parts per billion. Measurements began in 1984, when levels stood at about 1,630ppb and were rising steadily. It was thought that this was due to the Russian gas industry, which before the collapse of the Soviet Union was affected by enormous leaks.
After 1991, substantial amounts were invested in stopping the leaks by a privatised Russian gas industry, and the methane rise slowed.
Methane in the atmosphere: The recent rise
Many climate scientists think that frozen Arctic tundra, like this at Sermermiut in Greenland, is a ticking time bomb in terms of global warming, because it holds vast amounts of methane, an immensely potent greenhouse gas. Over thousands of years the methane has accumulated under the ground at northern latitudes all around the world, and has effectively been taken out of circulation by the permafrost acting as an impermeable lid. But as the permafrost begins to melt in rising temperatures, the lid may open – with potentially catastrophic results.

Transport for London unveils UK's largest hydrogen fuel cell

New technology part of wide-ranging green building makeover.
Cath Everett, BusinessGreen, Monday 22 February 2010 12.28 GMT
Transport for London (TfL) hopes to cut its carbon emissions by 40 per cent and save £90,000 per annum on utility bills with a newly unveiled green power plant at its head office that includes the UK's largest hydrogen fuel cell.
TfL and the London Development Agency (LDA), which is housed in the same building, also announced last week that they plan to sign up to the 10:10 energy efficiency campaign from this April.
As a result, they have committed to reduce carbon emissions by a further 10 per cent and cut energy bills by £400,000 over the next financial year.
The £2.4m combined heat and power plant, which was unveiled late last week, is located at TfL's Palestra building in Southwark and was implemented as part of a major green retrofit.
The plant is expected to supply all the facility's power needs at off-peak times and 25 per cent of requirements during peak hours.
Waste heat will also be pumped into a unit on the roof to ensure the building keeps cool and supplement its six existing electric chillers.
The hydrogen fuel cell, which was funded out of TfL's £25m Climate Change Fund, will likewise provide electricity, heat and cooling and provide the office's hot water supply.
Speaking at the opening of the new facilities, Kit Malthouse, deputy mayor of London and chairman of the London Hydrogen Partnership, said: "Zero-polluting hydrogen fuel has the potential to radically transform the way we power our city to create a more pleasant environment. This isn't a fuel of the future but is available right now."
He added that "to catalyse its use more widely", the technology's benefits would be promoted to visitors and passers-by via a permanent multimedia exhibition display fuelled by energy generated on the site.
In a bid to meet its 10:10 commitments, TfL likewise plans to cut general waste and paper consumption at its 32 sites and to retrofit 22 of them in accordance with the Building Energy Efficiency Programme.
Solar panels will be introduced to heat water, while green roofs will be installed to boost insulation, absorb rainwater and improve local ecology.
The deployment of new building management software is also planned to control temperature, heating and cooling systems more effectively, while new energy management and enhanced automated meter reading systems will be similarly installed.
Low-NOx condensing boilers will replace old ones in three buildings and 2,500 lights will be swapped for energy-efficient replacements. About 1,000 halogen lamps will likewise be replaced with low-energy LED lights that should cut energy consumption by 90 per cent and extend lamp life 25-fold.
The company said that alongside the building improvements, a staff awareness programme will be launched from April to encourage personnel to cut their energy consumption.
In broader terms, the organisation plans to spend £23m on green programmes over the next year to help Londoners reduce their carbon emissions. For example, TfL is planning to introduce a public cycle hire scheme in the capital later this year and also aims to add 300 new diesel-electric hybrid buses to its current fleet of 56 by March 2011, after which time all new additions will have to be hybrids.

Sunday, 21 February 2010

Barack Obama's climate change policy in crisis

President Barack Obama's climate change policy is in crisis amid a barrage of US lawsuits challenging government directives and the defection of major corporate backers from his ambitious green programmes.

By Philip Sherwell in Washington Published: 5:13PM GMT 20 Feb 2010

The legal challenges and splits in the US climate consensus follow revelations of major flaws in the UN Intergovernmental Panel on Climate Change (IPCC) report, which declared that global warming was no longer scientifically contestable.
Critics of America's Environmental Protection Agency (EPA) are now mounting a series of legal challenges to its so-called "endangerment finding" that greenhouse gases are a threat to human health.

That ruling, based in part on the IPCC's work, gave the agency sweeping powers to force business to curb emissions under the Clean Air Act. An initial showdown is expected over rules on vehicle emissions.
Oil-rich Texas, the Lone Star home state of Mr Obama's predecessor George W Bush, is mounting one of the most prominent challenges to the EPA, claiming new regulations will impose a crippling financial toll on agriculture and energy producers.
"With billions of dollars at stake, EPA outsourced the scientific basis for its greenhouse gas regulation to a scandal-plagued international organization that cannot be considered objective or trustworthy," said Greg Abbott, Texas's attorney general.
"Prominent climate scientists associated with the IPCC were engaged in an ongoing, orchestrated effort to violate freedom of information laws, exclude scientific research, and manipulate temperature data.
"In light of the parade of controversies and improper conduct that has been uncovered, we know that the IPCC cannot be relied upon for objective, unbiased science - so EPA should not rely upon it to reach a decision that will hurt small businesses, farmers, ranchers, and the larger Texas economy."
Mr Abbott’s comments follow the controversy over the work of the University of East Anglia’s Climate Research Unit, whose research was at the heart of IPCC findings. Leaked emails indicated that the freedom of information act was breached and that data was manipulated and suppressed to strengthen the case for man-made climate change.
A series of exaggerated claims, factual mistakes and unscientific sourcing have subsequently been uncovered in the 2007 IPCC report - such as the alarming but unjustified warning that Himalayan glaciers might disappear by 2035. Scientists insistent that humans are causing climate change have said the mistakes do not overturn an overwhelming burden of proof backing their case.
The case brought by Texas is one of 16 challenging the EPA over its data or procedures. They have been lodged variously by states, Republican congressmen, trade associations and advocacy groups before last week's cut-off to file court actions.
The pro-market Competitive Enterprise Institute (CEI) and US Chamber of Commerce are also mounting high-profile battles to overturn the EPA decisions through petitions filed with the US Circuit Court of Appeals in Washington.
"The Clean Air Act is an incredibly flawed way to regulate greenhouse gas emissions and the findings on which it is based are full of very shoddy science," said Myron Ebell, director of energy and global warming policy at the CEI.
"Many policies and proposals that would raise energy prices through the roof for American consumers and destroy millions of jobs in energy-intensive industries still pose a huge threat."
Among those he listed were the EPA's decision to regulate greenhouse gas emissions using the Clean Air Act, efforts to use the Endangered Species Act to stop energy production and new power plants, the higher fuel economy standards for new passenger vehicles enacted in 2007, and bills in Congress that require buildings to use more renewable electricity and introduce higher energy efficiency standards.
The EPA, a federal agency which is increasingly key to Mr Obama's green agenda as his legislative policies become bogged down in Congress, rejected the charges.
"The evidence of and threats posed by a changing climate are right before our eyes," said Catherine Milbourn, EPA spokeswoman. "That science came from an array of highly respected, peer-reviewed sources from both within the United States and across the globe."
The Environmental Defence Fund is leading the defence of the EPA's findings, arguing that critics are deliberately ignoring science to set back efforts to tackle climate change. "The EPA's decision is based on a 200-page synthesis of major scientific assessments," said the Fund, denying the work was simply attributable to the IPCC.
Also last week, the United States Climate Action Partnership, a grouping of businesses backing national legislation on reductions of greenhouse gas emissions, suffered a major blow when oil firms BP America and Conoco Phillips and construction giant Caterpillar left the group.
The two oil firms, the most significant departures, walked out on the industry-green alliance protesting that "cap and trade" legislation would have awarded them far fewer free emission allowances than their rivals in the coal and electricity industries.
Last week also saw the United Nations' top climate official, Yvo de Boer, announce his resignation after the failure of the recent Copenhagen climate conference to agree to more than vague promises to limit carbon dioxide emissions.

Acidified landscape around ocean vents foretells grim future for coral reefs

Underwater vents allow scientists to assess the acidic effect of carbon dioxide on ocean life

Robin McKie, science editor
The Observer, Sunday 21 February 2010
Huge vents covering the sea-floor – among the strangest and most spectacular sights in nature – pour carbon dioxide and other gases into the deep waters of the oceans.
Last week, as researchers reported that they had now discovered more than 50,000 underwater volcanic springs, they also revealed a new use for them – as laboratories for measuring the impact of ocean acidification on marine life.
The seas are slowly being made more acidic by the increasing amounts of carbon dioxide from factories and cars being pumped into the atmosphere and then dissolved in the sea. The likely impact of this acidification worries scientists, because they have found that predicting the exact course of future damage is a tricky process.
That is where the undersea vents come in, says Dr Jason Hall-Spencer of the University of Plymouth. "Seawater around these vents becomes much more acidic than normal seawater because of the carbon dioxide that is being bubbled into it," he told a meeting of the American Association for the Advancement of Science in San Diego, California, last week. "Indeed, it reaches a level that we believe will be matched by the acidity of oceans in three or four decades. That is why they are so important."
As part of his research, Hall-Spencer has scuba-dived into waters around vents and used submersibles to study those in deeper waters. In both cases the impact was dramatic, he told the conference.
"The sea floor is often very colourful. There are corals, pink algae and sea urchins. But I have found that these are wiped out when the water becomes more acidic and are replaced by sea grasses and foreign, invasive algae.
"There is a complete ecological flip. The seabed loses all its richness and variety. And that is what is likely to happen in the next few decades across the world's oceans."
Hall-Spencer also noted that in acidic seawater a type of algae known as coralline algae – which act as the glue holding coral reefs together – are destroyed.
"When coralline algae are destroyed, coral reefs fall apart," he said. "So we can see that coral islands like the Maldives face a particularly worrying future. Rising sea levels threaten to drown them, while acidic waters will cause them to disintegrate.
"It is a very worrying combination."

Climate scientists withdraw journal claims of rising sea levels

Study claimed in 2009 that sea levels would rise by up to 82cm by the end of century – but the report's author now says true estimate is still unknown
David Adam, Sunday 21 February 2010 18.00 GMT
Scientists have been forced to withdraw a study on projected sea level rise due to global warming after finding mistakes that undermined the findings.
The study, published in 2009 in Nature Geoscience, one of the top journals in its field, confirmed the conclusions of the 2007 report from the Intergovernmental Panel on Climate Change (IPCC). It used data over the last 22,000 years to predict that sea level would rise by between 7cm and 82cm by the end of the century.
At the time, Mark Siddall, from the Earth Sciences Department at the University of Bristol, said the study "strengthens the confidence with which one may interpret the IPCC results". The IPCC said that sea level would probably rise by 18cm-59cm by 2100, though stressed this was based on incomplete information about ice sheet melting and that the true rise could be higher.
Many scientists criticised the IPCC approach as too conservative, and several papers since have suggested that sea level could rise more. Martin Vermeer of the Helsinki University of Technology, Finland and Stefan Rahmstorf of the Potsdam Institute for Climate Impact Research in Germany published a study in December that projected a rise of 0.75m to 1.9m by 2100.
Siddall said that he did not know whether the retracted paper's estimate of sea level rise was an overestimate or an underestimate.
Announcing the formal retraction of the paper from the journal, Siddall said: "It's one of those things that happens. People make mistakes and mistakes happen in science." He said there were two separate technical mistakes in the paper, which were pointed out by other scientists after it was published. A formal retraction was required, rather than a correction, because the errors undermined the study's conclusion.
"Retraction is a regular part of the publication process," he said. "Science is a complicated game and there are set procedures in place that act as checks and balances."
Nature Publishing Group, which publishes Nature Geoscience, said this was the first paper retracted from the journal since it was launched in 2007.
The paper – entitled "Constraints on future sea-level rise from past sea-level change" – used fossil coral data and temperature records derived from ice-core measurements to reconstruct how sea level has fluctuated with temperature since the peak of the last ice age, and to project how it would rise with warming over the next few decades.
In a statement the authors of the paper said: "Since publication of our paper we have become aware of two mistakes which impact the detailed estimation of future sea level rise. This means that we can no longer draw firm conclusions regarding 21st century sea level rise from this study without further work.
"One mistake was a miscalculation; the other was not to allow fully for temperature change over the past 2,000 years. Because of these issues we have retracted the paper and will now invest in the further work needed to correct these mistakes."
In the Nature Geoscience retraction, in which Siddall and his colleagues explain their errors, Vermeer and Rahmstorf are thanked for "bringing these issues to our attention".

In Japan, Even a Used Prius Shines

TOKYO—As Toyota Motor Corp.'s best-selling Prius gets caught up in the auto maker's broader quality issues, it can still count on one solid base of support: Japanese drivers who want one now.
It is too early to tell how the brake-system issues Toyota disclosed earlier this month in some Priuses will affect sales of the gasoline-electric hybrid in Japan, where it is the top-selling car. Sales topped 22,300 in Toyota's home country last month, according to the Japan Automobile Dealers Association.
But dealers report demand remains strong for used models of the 2010-model Prius, which is at the center of brake concerns that prompted Toyota to recall more than 400,000 Prius and other hybrid models. That's because car buyers can drive a used Prius off the lot immediately instead of enduring the wait of up to 4½ months for delivery of a new one.

Tokyo-based Gulliver International Co., one of Japan's biggest used-car trading firms, said used 2010-model Priuses continue to sell at prices that are 5% to 10% higher than the same model sold new.
The average price for a used Prius S, one of the basic Prius models released last May, is around 2.4 million yen, or about $26,300, the firm says. The sticker price for a new version of the same model is 2.2 million yen.
Some used models in popular colors such as pearl, black and silver are quoted at even higher prices, between 2.5 million yen and 2.8 million yen, on Gulliver's Web site.
"There are people who decided not to buy a Prius out of safety concerns, but not everyone is like that. Some go for a used one so they can drive it immediately," said Shoichi Suzuki, head of research at Gulliver.
Japan Plant Service, a used-car dealer in Hachioji city outside central Tokyo, sells about five second-hand Priuses a month, which account for half of the company's monthly business. Hiroyuki Endo, an employee there, said the pace of 2010 Prius sales hasn't slowed. The company has sold two Priuses so far this month.
Unlike in the U.S., where sales of most Toyota models, including the Prius, have fallen, "it seems customers do not mind. Prius remains the most popular car for our business," said Mr. Endo.
Sellers of the used Priuses are required to make sure the necessary repairs to the braking system are made.
Japanese consumers pay close attention to product sales rankings. When something reaches the No. 1 spot, be it cars, cosmetics or beverages, a buying frenzy sometimes follows.
The Prius delivery time has been reduced from its peak of eight months. Still, Toyota said Feb. 5 that customers who ordered a Prius after Feb. 3 wouldn't have their vehicle delivered until late June.
Netz Toyota Yokohama, an independent dealer with 31 outlets outside Tokyo, has received four cancellations out of about 100 new Priuses that have been ordered but haven't yet been shipped. The customers who canceled cited the safety issue, said company spokesman Takeshi Kitagawa.
But, he added, "While cancellation is not zero, confusion is much less than we expected. We thought customers would be much more upset given the widespread media reports."
At Exfeel Corp., an independent used-car dealer in the commuter town of Chiba, president Taro Yoshikawa sold some used Priuses this week to other used-vehicle dealers at prices higher than those for brand new ones, indicating that dealers assume they can still sell the Prius at premium prices for some time.
"The popularity of Prius is still strong and I don't anticipate the price [for the used ones] would drop dramatically," he said. "Its popularity is persistent."
Write to Miho Inada and Mariko Sanchanta

A green future for Teesside

With proper levels of investment and government support, offshore windfarms could fill economic gaps in the north-east

Ed Cox, Thursday 18 February 2010 17.00 GMT
In October 1346, English forces from across the north of England rode into Durham to take up positions against the Scottish army of King David II. They arrived just in time to take up the better ground and in the ensuing Battle of Neville's Cross they saw off the invaders and captured their king. This week the cabinet is meeting close by the scene of the battle, but it remains to be seen whether the arrival of the cavalry will see off the economic forces currently hammering the north-east.
The cabinet meeting in Durham is certainly timely, for just down the road the final steel will be poured at Teesside Cast Products, bringing to a close 150 years of steelmaking. The process – known as chasing the salamander – involves emptying the last dregs from the smelt before mothballing the plant and any remaining hopes of a last-minute takeover.
The perceived need for this cabinet meeting is symbolic of a parliamentary system that has so distanced the executive from the day-to-day life of steelworkers that such gestures count for a great deal. (Roll on powers for local authorities to drive their own economic development.) Ministers will seize upon the opportunity to beat their breasts about the Corus closure and reiterate promises of a £60m investment package to bring new jobs to the area: new, green jobs in offshore wind-farming.
The argument for green jobs is persuasive. As unemployment continues to rise and the economy stumbles out of recession, the need to create new jobs and new markets is pressing. At the same time, the imperative to drastically reduce emissions of greenhouse gases means that we need to rethink how the UK economy will be structured. But three issues must be addressed if the "green new deal" is to become a reality.
First, the UK must seize some competitive advantage. Taking offshore wind-farming alone, a recent Institute for Public Policy Research report argues that anything between 20,000 and 70,000 jobs could be created in the sector. But with the significant majority of job growth predicted to be in turbine and component manufacture, unless the UK can attract a turbine manufacturer to its shores it is likely to lose out to Denmark, Spain or Germany with their better-developed wind markets. Not only do we need to secure the domestic market through reforms to the renewables obligation and investment in the national grid infrastructure, but a combination of tax incentives and loan guarantees might also be required.
Second, we need to do the maths. Even the best estimates suggest that for every four jobs lost at Corus, only one will be created in offshore wind locally. Analysis for the Department for Business, Innovation and Skills found that in 2007-08 there were only 445,000 UK jobs in the entire green sector (including the low-carbon support and supply chain), and with growth projected at just 5% per annum we have to look more widely not just at new green jobs but at the greening of existing industry.
Third, we need to build up our skills base. We simply don't have enough engineers and skilled technicians to fill emerging opportunities. Offshore wind industries will look to attract offshore oil and gas talent before they turn to retraining Corus workers. A demand-led skills strategy is going to be too little, too late; a more proactive, long-term plan is needed urgently.
Records have it that despite being outnumbered as a result of fighting wars on many fronts, the English army prevailed in 1346 by pooling the best forces from across the northern regions and seeking the better ground. The cabinet should dwell on this lesson, for victory at the Battle of Neville's Cross led to many decades of peace and prosperity in England's north-east.

MPA, NTU launch Maritime Clean Energy Research Programme

Posted: 18 February 2010 1821 hrs

SINGAPORE: Research funding of up to S$15 million will be available over five years under a newly launched "Maritime Clean Energy Research Programme". The programme, jointly launched by the Maritime and Port Authority of Singapore (MPA) and Nanyang Technological University (NTU), aims to help the development of green technologies in shipping and port management. Research will be conducted through the Centre for Maritime Energy Research, a new centre under the Energy Research Institute at NTU. The first call for proposals for the research grant opened on Thursday.

Clipper confirms plans for wind turbine factory

Manufacturing of wind turbines set to return to England, with factory in Newcastle making world's largest blades

Tim Webb, Thursday 18 February 2010 17.27 GMT
Manufacturing of wind turbines is set to return to England after Clipper Windpower finally confirmed plans for a factory in the north-east, providing a much-needed boost to a region that has suffered badly from the recession.
The American company said it would invest in a testing facility at Walker, on the banks of the Tyne near Newcastle, for what it claims are the world's largest turbine blades.
Each of Clipper's "Britannia" turbines would be able to generate up to 10MW of electricity, enough for 10,000 homes. If the blades work properly, Clipper will make up to 75 of them a year, employing up to 500 people by 2020.
Gordon Brown and the energy and climate change secretary, Ed Miliband, visited the site today. The government recently announced it was helping to fund the project.
Clipper has been planning the scheme for some time, but it had been delayed while it negotiated the sale of a 49% stake in itself to UTC for $270m (£172m), which was completed in December.
The company hopes the turbines will be used in the giant offshore wind projects for which the Crown Estate recently awarded operating licences. The turbines will be manufactured in a new factory being built in Walker by Shepherd Offshore, the business run by former Newcastle United chairman Freddie Shepherd.
Brown said: "The essential work of tackling climate change brings with it new ways of doing things, which in turn brings with it new jobs. Today's announcement is clear evidence of new, green industries being firmly established in the UK.
"The UK is a global leader in offshore wind power and the north-east is at the forefront in providing the skills, expertise and enterprise to capitalise on this rapidly expanding market, which has the potential to create thousands of green jobs."
Shepherd, whose business has risen to prominence in the wake of the decline of the shipbuilding industry, said: "This industry will build on our proud history, our skills and our ambition. We are determined that we will create here the very best location for the international offshore wind industry."
Danish firm Vestas closed its turbine plant on the Isle of Wight last year, blaming a slowdown in orders worldwide and "nimby" objectors to onshore projects in the UK. Skykon's plant in Scotland is currently the only factory in the UK making components for the wind industry.

Friday, 19 February 2010

World's top firms cause $2.2tn of environmental damage, report estimates

Report for the UN into the activities of the world's 3,000 biggest companies estimates one-third of profits would be lost if firms were forced to pay for use, loss and damage of the environment
Juliette Jowit, Thursday 18 February 2010 18.19 GMT
The cost of pollution and other damage to the natural environment caused by the world's biggest companies would wipe out more than one-third of their profits if they were held financially accountable, a major unpublished study for the United Nations has found.
The report comes amid growing concern that no one is made to pay for most of the use, loss and damage of the environment, which is reaching crisis proportions in the form of pollution and the rapid loss of freshwater, fisheries and fertile soils.
Later this year, another huge UN study - dubbed the "Stern for nature" after the influential report on the economics of climate change by Sir Nicholas Stern - will attempt to put a price on such global environmental damage, and suggest ways to prevent it. The report, led by economist Pavan Sukhdev, is likely to argue for abolition of billions of dollars of subsidies to harmful industries like agriculture, energy and transport, tougher regulations and more taxes on companies that cause the damage.
Ahead of changes which would have a profound effect - not just on companies' profits but also their customers and pension funds and other investors - the UN-backed Principles for Responsible Investment initiative and the United Nations Environment Programme jointly ordered a report into the activities of the 3,000 biggest public companies in the world, which include household names from the UK's FTSE 100 and other major stockmarkets.
The study, conducted by London-based consultancy Trucost and due to be published this summer, found the estimated combined damage was worth US$2.2 trillion (£1.4tn) in 2008 - a figure bigger than the national economies of all but seven countries in the world that year.
The figure equates to 6-7% of the companies' combined turnover, or an average of one-third of their profits, though some businesses would be much harder hit than others.
"What we're talking about is a completely new paradigm," said Richard Mattison, Trucost's chief operating officer and leader of the report team. "Externalities of this scale and nature pose a major risk to the global economy and markets are not fully aware of these risks, nor do they know how to deal with them."
The biggest single impact on the $2.2tn estimate, accounting for more than half of the total, was emissions of greenhouse gases blamed for climate change. Other major "costs" were local air pollution such as particulates, and the damage caused by the over-use and pollution of freshwater.
The true figure is likely to be even higher because the $2.2tn does not include damage caused by household and government consumption of goods and services, such as energy used to power appliances, or waste; nor the "social impacts" such as the migration of people driven out of affected areas; nor the long-term effects of any damage other than that from climate change. The final report will also include a higher total estimate which includes those long-term effects of problems such as toxic waste.
Trucost did not want to comment before the final report on which sectors incurred the highest "costs" of environmental damage, but they are likely to include power companies and heavy energy users like aluminium producers because of the greenhouse gases that result from burning fossil fuels. Heavy water users like food, drink and clothing companies are also likely to feature high up on the list.
Sukhdev said the heads of the major companies at this year's annual economic summit in Davos, Switzerland, were increasingly concerned about the impact on their business if they were stopped or forced to pay for the damage.
"It can make the difference between profit and loss," Sukhdev told the annual Earthwatch Oxford lecture last week. "That sense of foreboding is there with many, many [chief executives], and that potential is a good thing because it leads to solutions."
The aim of the study is to encourage and help investors lobby companies to reduce their environmental impact before concerned governments act to restrict them through taxes or regulations, said Mattison.
"It's going to be a significant proportion of a lot of companies' profit margins," Mattison told the Guardian. "Whether they actually have to pay for these costs will be determined by the appetite for policy makers to enforce the 'polluter pays' principle. We should be seeking ways to fix the system, rather than waiting for the economy to adapt. Continued inefficient use of natural resources will cause significant impacts on [national economies] overall, and a massive problem for governments to fix."
Another major concern is the risk that companies simply run out of resources they need to operate, said Andrea Moffat, of the US-based investor lobby group Ceres, whose members include more than 80 funds with assets worth more than US$8tn. An example was the estimated loss of 20,000 jobs and $1bn last year for agricultural companies because of water shortages in California, said Moffat.

Drax power plant suspends plan to replace coal with greener fuel

Ben Webster, Environment Editor

Britain’s biggest power station has suspended its plan to replace coal with greener fuel, leaving the Government little chance of meeting its target for renewable energy.
Drax, in North Yorkshire, which produces enough electricity for six million homes, is withdrawing a pledge to cut CO2 emissions by 3.5 million tonnes a year, or 17.5 per cent.
The power station, which is the country’s largest single source of CO2, has invested £80 million in a processing unit for wood, straw and other plant-based fuels, known as biomass. The unit is designed to produce more renewable electricity than 600 wind turbines, but will operate at only a fraction of its capacity because Drax says it is cheaper to continue to burn coal.
Drax is also one of dozens of companies delaying investments in new biomass power stations because of uncertainty over the Government’s policy on long-term subsidies. Hundreds of farmers growing biomass crops may now struggle to sell their produce.

Drax’s decision will make it almost impossible for the Government to meet its commitment to increase the proportion of electricity from renewable sources from 5.5 per cent to 30 per cent by 2020. Renewable energy is a key component of Britain’s legally binding targets to cut overall emissions by 34 per cent by 2020 and 80 per cent by 2050.
In an interview with The Times, Dorothy Thompson, Drax’s chief executive, blamed the Government for failing to give sufficient subsidy to biomass to make it competitive with coal.
Drax has bought two million tonnes of biomass, but Ms Thompson said that it was considering selling it overseas because it no longer made economic sense to burn it in its six boilers.
Ms Thompson said: “We are not confident that the [subsidy] regime for what is one of the cheapest forms of renewable energy will support operating the biomass unit at full load. The UK is missing out massively on the potential for renewable energy from biomass. We want to run in a low-carbon way but policy is against us.”
She accused the Department of Energy and Climate Change of lacking the skills to develop a successful biomass policy and focusing too heavily on expensive and unreliable wind turbines. “I think they simply have not put enough expertise into biomass. Wind is not a silver bullet; its benefits have been overstated.”
Ms Thompson said that the Government was holding back biomass by offering it only a quarter of the subsidy given to offshore wind farms and capping the amount of crops that can be burnt in coal-fired power stations.
She said that it was cheaper for Drax to pay for emissions permits to burn coal, the most polluting fossil fuel, than to switch to biomass.
Each megawatt hour of electricity costs Drax £31 to produce from coal and £40 from biomass.
Ms Thompson said that Drax would also be unable to proceed with its £2 billion plan for three new biomass power plants unless the Government offered longer-term support. “We do not believe we can create a credible investment case for our shareholders if there is complete regulatory uncertainty. This is a very serious issue because renewable energy through biomass is a key component for delivering the 2020 target.”
The Renewable Energy Association said that plans for more than 50 biomass projects, totalling £13 billion of investment, had been suspended because of uncertainty over policy. Lord Hunt of Kings Heath, the Energy Minister, said this month that the subsidy regime for biomass needed to be reviewed. Wind farm developers are guaranteed fixed subsidies for 20 years, but biomass investors could have the subsidy cut after four years.

CDP sends out carbon questionnaires to 4,500 firms

London, 18 February:
More than 4,500 companies have been sent questionnaires about their greenhouse gas emissions and climate strategies this year by the Carbon Disclosure Project (CDP). This is up from 3,700 companies questioned last year.
The number of institutional investors signed up to this year’s CDP request has also increased, to 534 with $64 trillion in assets under management, from 475 investors in 2009, at that time managing around $55 trillion.
Signing up to the request for information this year for the first time were Wells Fargo, BNY Mellon and the Industrial Bank of Korea, among others.
For 10 years, the CDP has asked constituents of the FT500 – the world’s largest listed companies – to disclose investment-related information about climate change. It has gradually increased its reach.
The CDP this year wrote for the first time to companies in Turkey, Peru, Morocco, Egypt and Israel.
The largest global companies reporting to the CDP will be given a performance score on the actions they have taken to address climate change, such as setting emission reduction targets, delivering achieved and expected reductions, establishing governance structures and having their data externally verified.
For the first time, the results will be published in a performance leadership index, expected to be released in September this year.
The CDP has also upgraded its system for reporting this year, introducing online tools developed with Accenture, Microsoft and SAP. Henk de Bruin, head of corporate sustainability at electronics firm Philips, which piloted the upgraded tools, said they “enable greater analysis of the CDP data to benchmark against peers and other sectors and geographies”.

Companies not satisfying thirst for water reporting

London, 18 February:
One hundred water-dependent firms have been chastised by investors for weak disclosure of their exposure to water risks.
The best score in a survey by investor coalition Ceres of global listed companies in water-intensive sectors such as chemicals, electric power and mining was earned by UK drinks firm Diageo – but it managed to garner only 43 out of a possible 100 points.
Eighty of the companies scored less than 30 points.
Ceres’ president Mindy Lubber said: “Most companies provide basic disclosure on overall water use and water scarcity concerns, but their focus and attention so far is not nearly at the level needed given the enormity of this growing global challenge.”
No companies disclosed “comprehensive” data on water use, and reporting of water risks in their supply chains was particularly poor, Ceres found, although Danone, SABMiller and Unilever did provide estimates of the water use embedded in their supply chains. “For example, Danone reports the water footprint for its milk and water divisions at each stage of the product lifecycle, including raw material production, processing, packaging and logistics,” Ceres said.
The report chided companies for using “vague, boilerplate language” about water risks in their annual reports or financial filings. “They fail to reference specific at-risk operations or supply chains and lack any attempt to quantify or monetise risk,” it said.
Likewise, Ceres, which published the report with US financial services firm UBS and data provider Bloomberg, found some companies disclosing risks in their sustainability report, but leaving the information out of financial reports, “a finding that indicates an ongoing lack of integration between voluntary reports and regulated financial filings”.
Only 21 of the companies disclosed a quantified goal to reduce their water use, the report found, while only three – Diageo, chemicals firm DuPont and Swiss mining company Xstrata – set reduction targets differentiated by the level of water stress at specific facilities.
Just 14 provided data on water use broken down to regional or site-specific levels. Ceres said: “Because water risk is geographically dependent, this absence of context makes it nearly impossible for investors and analysts to assess corporate exposure to water scarcity, or to understand if corporate actions to mitigate risk are either appropriate or effective.”
“This report makes clear that companies are not providing investors with the kind of information they need to understand the risks and opportunities posed by water scarcity,” said Jack Ehnes, CEO of the California State Teachers’ Retirement System, the second largest public pension fund in the US with $134 billion under management.