Greenpeace outlined a scenario in which 95 percent of energy generation would come from renewable sources by 2050, creating 12 million jobs and cutting CO2 emissions by 80 percent, at a price tag of around $18 trillion in global investment.
According to the Greenpeace report, Energy [R]evolution: A Sustainable World Energy Outlook, such a drastic revolution in energy production is necessary, since even a 50 percent decrease in CO2 emissions by 2050 might not be enough to prevent runaway climate change. Under the Greenpeace scenario, CO2 emissions would peak in 2015 before dropping by more than 80 percent by 2050.
In its first edition of the Energy [R]evolution report in 2007, Greenpeace had predicted that 156 GW of renewable energy would be produced in 2010. As of the end of 2009, 158 GW were being produced.
The report makes several policy recommendations, such as phasing out all subsidies for fossil and nuclear fuel businesses, establishing legally binding targets for renewable energy, and setting strict efficiency standards. According to the report, conventional fuel sources receive an estimated $250–300 billion in worldwide subsidies, with coal alone receiving $63 billion.
The Energy [R]evolution scenario would create about 12 million jobs by 2030, with 8.5 million in the renewables sector alone. Without the report's policy recommendations, however, only 2.4 million renewable-energy jobs would be created. The renewable energy sector already employs two million people worldwide. The report also projects that the market for renewable technology will grow from $100 billion today to more than $600 billion by 2030.
Actual energy consumption is expected to increase by up to 60 percent by 2050, according to the report. Implementing its policy recommendations, including improved insulation and design for buildings, adopting efficiency standards and replacing heating systems with renewable technology, would instead decrease energy consumption by 20 percent.
Greenpeace also reported that renewable energy resources alone have the potential to generate up to 32 times current global power demands.
The report estimates the potential savings in fuel costs from switching to renewable systems at $282 billion per year. However, the annual investment required between now and 2030 is estimated at $782 billion, with no further investment costs beyond that time horizon. Under current policies, Greenpeace estimates global energy investment at $11.2 trillion from now until 2030, while under the Energy [R]evolution scenario global investment reaches $17.9 trillion.
Tuesday, June 8, 2010
Saturday, June 5, 2010
Sunday, May 23, 2010
Jairam Ramesh calls for convergence on reducing biodiversity loss
Minister of State for Environment and Forests Jairam Ramesh on Saturday called for effective and coherent convergence of policies at the global, national and regional levels to achieve a significant reduction of biodiversity loss.
Addressing participants at a TERI-sponsored seminar on climate change at the India Habitat Centre here, Ramesh said that the conservation and sustainable use of biodiversity offers resilience to climate variability and natural disasters.
He said: "India has been the initial signatory of the CBD (Convention on Biological Diversity), hence safeguarding biodiversity requires action at all levels with a strong commitment to contribute towards achieving international targets."
"Over the years India has taken several key initiatives to achieve the objectives of the CBD at the national level: through the creation of an institutional structure, through the Traditional Knowledge Digital Library (TKDL), an important initiative by India to fight against bio-piracy globally, and through the People's Biodiversity Register (PBR) to document the oral knowledge on biodiversity," he added.
Ramesh also mentioned that India has proactively developed a link between climate change and biodiversity by creating an Indian version of IPCC, i.e. Indian Network for Comprehensive Climate Change Assessment that will conduct a 4x4 assessment in the sectors of agriculture, forests, water and health, and regions of Himalayas, Western Ghats, North-East India, and Coastal regions, measuring the impact of Climate Change.
On forests and climate change, he said India is going to double its afforestation targets as part of the Greening India Mission, one of the missions under the National Action Plan on Climate Change.
Dr. Leena Srivastava, Executive Director (Operations), TERI, said that the contribution of biodiversity to human and economic well-being is important; hence adequate management and governance at the national level are key strengths that need to be developed.
Emphasizing the need for strengthening and addressing the key challenges afflicting economic resources, bio-resources and the environment, Dr. Srivastava said: "There is an urgent need for strengthening the institutional base for biodiversity and a requirement for checking the impact of climate change on biodiversity, environment, ecosystems, health, etc, ensuring continued benefits to people and opportunities for poverty reduction and economic development."
The seminar also had deliberations on issues such as:
Tropical forest productivity: conservation for development, which highlighted the forest productivity of tropical evergreen forests from different parts of the world to build a stronger case for tropical forests as an active sink for mitigation. This function strengthens the case for conserving these ecosystems and hence also provides an important developmental opportunity for the local communities that depend on them.
Poverty alleviation using agro-forestry models, which provide an important opportunity for small and medium farmers to gain from the market value of the products as well as from the carbon markets.
REDD+: Addressing poverty alleviation of forest dependent communities, addressing the participation of local communities and financial mechanisms to benefit the local communities from biodiversity conservation.
Discovering new drugs: banking on species diversity to explore the rich base of biodiversity in developing new drugs against future health problems.
Biofuels: a biodiversity-based adaptation strategy focusing on the various options available in nature to meet future fuel demands and fuel types.
Key recommendations of the seminar included strengthening biodiversity monitoring systems, creating linkages amongst pertinent institutions, focusing on marine ecosystems along with terrestrial ones, and understanding the economic values of biodiversity.
The seminar concluded that the lessons from the Indian experience have wider benefits in the South Asian and pan-tropical contexts for developing mitigation and adaptation programs based on biodiversity resources; hence country leadership and increased support from development cooperation are critical for the implementation of the Convention on Biological Diversity, both nationally and internationally.
PM hurts Canada by leaving climate off agenda: May
The decision by Prime Minister Stephen Harper to keep climate change off the agenda when leaders of the world's most powerful nations gather in Ontario is unprecedented and hurts Canada's credibility, Elizabeth May, leader of the Green Party of Canada, said yesterday.
May, speaking at a news conference in Victoria, was flanked by Nobel Prize co-winner Andrew Weaver, a University of Victoria professor and Canada Research Chair in climate modeling, as she called for the federal government to reinstate climate change on the agenda for the G8 and G20 summits to be held in Ontario next month.
"We haven't had a G8 or G20 summit in over 20 years when climate change was not on the agenda," May said.
"It's only because Stephen Harper is hosting this meeting that this is possible and the rest of the world is stunned."
Previous summits have included a meeting of environment ministers, giving them a chance to debate their positions on climate change face to face.
There is mounting pressure from the European Union and United Nations to include climate change and May said she has been in touch with G20 ambassadors to Canada who are horrified at the omission.
"It has never happened before that a G20 summit has been hijacked by a host government which refused to put a critical issue on the agenda," said May, who is calling for Vancouver Island residents to mobilize and put pressure on the federal government.
May, who is running in Saanich-Gulf Islands in the next federal election, will be one of the speakers at a rally to be held June 7 at Alix Goolden Hall which will push for a "green economy and climate sanity."
"If we make our call as citizens, before the summits begin, our pressure will be joined by internal diplomatic efforts which I know are ongoing, to get Canada to open the agenda and allow world governments to address world issues," May said.
Weaver said that, especially in light of the latest report from the National Round Table on the Environment and the Economy, which rates Canada's low-carbon performance sixth out of the G8 countries, there is a staggering disconnect between science and policy in Canada.
"The policies we have in place in Canada make it impossible to stay below the two degree [increase in temperature] threshold," Weaver said.
"I am here to support Elizabeth May and her call to get Canada to put climate change on the G8 and G20 agendas."
Although the economy -- the focus of the summits -- is important, climate change is the most important issue facing the world today, May said.
Others calling for Canada to host a meeting of environment ministers include the federal NDP and environmental organizations, who say a face-to-face meeting is essential before the UN climate change conference is held in Mexico in November.
Report from the Heartland Institute Climate Change Conference
Nancy Thorner
Most likely American Thinker readers are aware that the Heartland Institute held its Fourth International Conference on Climate Change earlier in the week, May 16 to 18. I'm still feeling electrified from the impact the event had on me.
The Heartland Institute of Chicago, Joseph L. Bast, President, held its Fourth International Conference on Climate Change in Chicago at the Marriott Magnificent Mile Hotel on Michigan Avenue from May 16 to 18. The Heartland Institute is a nonprofit, nonpartisan Chicago-based research organization founded in 1984. Its purpose is to discover, develop and promote free-market solutions to social and economic problems. For more information about the Heartland Institute, visit http://www.heartland.org/ or call 312/377-4000.
It is appropriate that the theme of this year's conference was Reconsidering the Science and Economics, as much has happened since Heartland's Third International Conference on Climate Change held in Washington, D.C. in June of last year. Among the happenings: It was in November of last year that emails and other documents from the Climatic Research Unit at the University of East Anglia revealed a pattern of mismanagement of temperature data, interference with peer review, and an effort to suppress academic debate on global warming (Climategate). In December of 2009, negotiations in Copenhagen, meant as a successor to the Kyoto Protocol, collapsed, leaving the world without a binding international agreement after Kyoto expires in 2012.
Attending Heartland's Fourth International Conference were seventy-three distinguished scientists, economists, and policy experts from twenty-three countries. The speakers were all united in thought that the time is now to reconsider the science and economics of global warming. New scientific discoveries cast doubt on how much of the warming during the twentieth century was man-caused, and how much was due to natural causes. Governments around the world have begun to recognize the astronomical cost of reducing emissions, and how the cost of slowing or stopping global warming might exceed the societal benefits. Even so, not all seventy-three of the invited guests agreed on the causes, extent, or the consequences of climate change.
Among the seventy-three distinguished speakers were two global climate believers: 1) Tam Hunt, J.D. who owns and runs Community Renewable Solutions LLC and is also a lecturer in Climate Change Law and Policy at UC Santa Barbara's Bren School of Environmental Science & Management (a graduate-level program), and 2) A. Scott Denning, PhD, a professor at the Cooperative Institute for Research in the Atmosphere, a joint project of the National Oceanic and Atmospheric Administration and Colorado State University.
Although Heartland Institute extended invitations to many global warming believers, only Hunt and Denning were brave enough to accept. Heartland's president, Joseph L. Bast, hopes to persuade more speakers with opposing viewpoints to attend next year's conference.
The electricity generated by the speakers was felt by the 700-plus individuals who registered to attend the conference. As examples of the caliber of the distinguished guest speakers, I've arbitrarily chosen those I came in contact with at the conference and whose names are known to many: Howard Hayden, PhD; Christopher C. Horner, J.D.; Paul C. "Chip" Knappenberger; Jay H. Lehr, PhD; Ben Lieberman; Richard Lindzen, PhD; Stephen McIntyre; Patrick J. Michaels, PhD; Lord Christopher Monckton; Ian Plimer, PhD; S. Fred Singer, PhD; Roy W. Spencer, PhD; and James M. Taylor, J.S. To view the names of all speakers and conference events go to: http://www.heartland.org/events/2010Chicgo/program.html
Four of the above guest speakers participated in book signing sessions: Ian Plimer, PhD - Heaven and Earth: Global Warming, the Missing Science; Roy W. Spencer, PhD - The Great Global Warming Blunder: How Mother Nature Fooled the World's Top Scientists; S. Fred Singer, PhD - Hot Talk, Cold Science; and Christopher C. Horner, J.D. - Power Grab: How Obama's Green Policies Will Steal your Freedom and Bankrupt America.
Much visited by convention participants were the thirteen Conference Exhibitors. All thirteen deserve recognition, but listing them all would not be practical in this format. Pajamas Media deserves special recognition because of its "on location" coverage from the Copenhagen Climate Conference. The online video arm of the new media company has also been at the forefront and has broken many key stories on the global warming controversy from both the scientific and business perspectives. Pajamas Media videotaped the entire Heartland Conference. http://www.pajamasmedia.com/
My one regret is that I could not listen to the presentations of all seventy-three of the distinguished speakers. Tracks were set up from which conference participants could select those speakers they wished to hear based on their interests. Four tracks were available at each of the five sessions, three on Monday and two on Tuesday. Two of the tracks were devoted to Science and one track each to Economics and Public Policy. Each of the four tracks in every session featured either three or four guest speakers. With this in mind, as a participant who attended all five of the sessions, I was limited to hearing, at most, seventeen of the featured speakers. Additionally, however, there were two keynote speakers at Sunday's opening supper and two each at breakfast and lunch on both Monday and Tuesday.
As a conference participant, I would like to comment about two of the guest speakers. One of them, James Delingpole, was the only non-scientist in the group of seventy three. He is an author, broadcaster, and blogger who helped break the Climategate story in the United Kingdom. Having earned an English degree from Oxford University, Delingpole "felt like a shepherd boy who had been transported to Mt. Olympus."
According to Delingpole, the Climategate story fell into his lap and changed his life. His pitch to the conference attendees was that we represent the happy people who want a good life, and that we have a place in this war. The war we are fighting is for our liberty; it is between two opposing views of the world. It is also a propaganda war. James Delingpole is the author of Obamaland: I've Seen Your Future And It Doesn't Work.
There could be no question as to the climax of the conference. Even the president of Heartland Institute, Joseph Bast, concluded as much, when he decided to present the wrap-up of the conference before Lord Christopher Monckton had spoken at the final lunch gathering on Tuesday, May 18. Lord Monckton is chief policy adviser to the Science and Public Policy Institute. Monckton was also a policy adviser to Prime Minister Margaret Thatcher. He now travels the world, all for the truth of sound science. Monckton describes truth "as the center of every lasting consensus."
Lord Monckton's speech was anticipated by all, and he didn't disappoint. His tongue-in-cheek British humor was entertaining, but then Monckton turned serious. It was because of Lord Monckton that the "Hockey Stick" report by the IPCC was thoroughly discredited. As Monckton described it, bogus facts were used to construct the computer model in an attempt to show that the rate of global warming is accelerating and that it is because of man.
According to Lord Monckton, even if all economic activity were closed down to forestall global warming for a period of 100 years, the temperature reduction would only amount to 1 degree Fahrenheit. This would be the height of folly and cruelty!
In speaking about Cap and Trade, Monckton warned how any measure to curtail global warming would result in abject failure. Monckton then listed three current approaches that are doomed to failure because they would have no effect on climate change: Kerry-Lieberman Bill, EPA regulations, and an attempt to push a new treaty at Cancun to replace the failed Copenhagen one.
Further words of truth spoken by Lord Monckton described how science and economics cannot be divorced from politics, that Cap and Trade is nothing less than an attempt by the rich and powerful to take away the chance for the little guy to face up to the big guy, and that scientific truth will always remain the truth because it doesn't matter how many lies are told.
It was during the conclusion of Lord Monckton's remarks, Global Warming: The Trojan that Menaces Global Freedom, that not a dry eye was left in the room. In a dramatic presentation, Monckton quoted Lincoln's Gettysburg Address. Not only did Lord Monckton tear up, but so did his attentive and enraptured audience, as Monckton passionately intoned its final words: ". . . that this nation, under God, shall have a new birth of freedom -- and that government of the people, by the people, for the people, shall not perish from the earth."
Listed below are but a few of the many salient facts about global warming which conference participants were privileged to hear:
Weather stations can no longer be trusted. 90% of the 1,064 weather stations do not meet government standards because contamination is present.
The billions of dollars spent by government and others to fund science just perpetuates problems rather than solving them. Funding only continues if research shows what those funding it wish it to prove; otherwise funding is discontinued.
If temperature can't be projected for a week, how is it possible to project temperature to 2050 and beyond?
The public is susceptible to scare tactics: Silent Spring, by Rachel Carson, published in September of 1962, helped to start the environmental movement. A marine biologist, Carson documented the detrimental effects of pesticides on the environment, which led to the banning of DDT here in the U.S. and millions of deaths in malaria-prone countries.
70% of the public believes that we're almost running out of fossil fuel.
Uncertainty allows for the possibility of disaster. Something must be done even though that something might make things worse.
Computer models are not reliable because garbage in yields garbage out. Facts are often cherry-picked and can be tweaked to create the results that the computer modeler is looking for.
Peer review is a way of screening out opposing views. Skeptics of climate change have a difficult time getting published.
The influence of CO2 is so small that it's at a noise level.
Science education is in a general decline. Students are taught that science is based on evidence, and yet all they are presented are inaccurate models.
It was disconcerting and even unconscionable that the Chicago media ignored Heartland's Fourth International Conference on Climate Change. While there were reporters and TV stations present at the conference, they were not from Chicago media sources. The bias shown by the Chicago media is unforgivable.
Shame on the Chicago Tribune, the Chicago Sun-Times, and the Daily Herald for not even including a short blurb in their newspapers about Heartland's outstanding Fourth International Conference on Climate Change with its distinguished worldwide list of speakers. Sadly, the media has taken the politically approved stance that global warming is man-made. As such, the media is not about to inform its readers of the thousands of leading scientists around the world who reject global warming. Is it any wonder that newspapers are losing subscribers when they only tell one side of the story? Kudos to Pajamas Media for filming the entire conference!
A Heartland Institute sign prominently displayed at the conference said it all: "Global Warming? It is not man-made, it's a natural variation, the human impact is very small, computer models are flawed, and there is no 'consensus'. Global warming is also not harmful, past warmings were beneficial, no current warming harms, future warmings will be modest, and warming is better."
Sixty-four cosponsors from twenty-three countries signed on to Heartland's Fourth International Conference on Climate Change, including Americans for Prosperity, the Ayn Rand Center for Individual Rights, FreedomWorks, the Illinois Policy Institute, JunkScience.com, the George C. Marshall Institute, the National Center for Public Policy Research, and the Science and Public Policy Institute.
Most likely American Thinker readers are aware that the Heartland Institute held its Fourth International Conference on Climate Change earlier in the week - May 16 - 18. I'm still feeling electrified from the impact the event had on me.
The Heartland Institute of Chicago, Joseph L. Bast, President, held its Fourth International Conference on Climate Change in Chicago at the Marriott Magnificent Mile Hotel on Michigan Avenue from May 16 - l8. The Heartland Institute is a nonprofit, nonpartisan Chicago-based research organization founded in 1984. Its purpose is to discover, develop and promote free-market solutions to social and economic problems. For more information about the Heartland Institute, visit http://www.heartland.org/ or call 312/377-4000.
It is appropriate that the theme of this year's conference was Reconsidering the Science and Economics, as much has happened since Heartland's Third International Conference on Climate Change held in Washington, D.C. in June of last year. Among the happenings: It was in November of last year that emails and other documents from the Climatic Research Unit at the University of East Anglia revealed a pattern of mismanagement of temperature data, interference with peer review, and an effort to suppress academic debate on global warming (Climategate). In December of 2009, negotiations in Copenhagen, meant as a successor to the Kyoto Protocol, collapsed, leaving the world without a binding international agreement after Kyoto expires in 2012.
Attending Heartland's Fourth International Conference were seventy-three distinguished scientists, economists, and policy experts from twenty-three countries. The speakers were all united in thought that the time is now to reconsider the science and economics of global warming. New scientific discoveries cast doubt on how much of the warming during the twentieth century was man-caused, and how much was due to natural causes. Governments around the world have begun to recognize the astronomical cost of reducing emissions, and how the cost of slowing or stopping global warming might exceed the societal benefits. Even so, not all seventy-three of the invited guests agreed on the causes, extent, or the consequences of climate change.
Among the seventy-three distinguished speakers were two global climate believers: 1) Tam Hunt, J.D. who owns and runs Community Renewable Solutions LLC and is also a lecturer in Climate Change Law and Policy at UC Santa Barbara's Bren School of Environmental Science & Management (a graduate-level program), and 2) A. Scott Denning, PhD, a professor at the Cooperative Institute for Research in the Atmosphere, a joint project of the National Oceanic and Atmospheric Administration and Colorado State University.
Although Heartland Institute extended invitations to many global warming believers, only Hunt and Denning were brave enough to accept. Heartland's president, Joseph L. Bast, hopes to persuade more speakers with opposing viewpoints to attend next year's conference.
The electricity generated by the speakers was felt by the 700-plus individuals who registered to attend the conference. As examples of the caliber of the distinguished guest speakers, I've arbitrarily chosen those I came into contact with at the conference and whose names are known to many: Howard Hayden, PhD; Christopher C. Horner, J.D.; Paul C. "Chip" Knappenberger; Jay H. Lehr, PhD; Ben Lieberman; Richard Lindzen, PhD; Stephen McIntyre; Patrick J. Michaels, PhD; Lord Christopher Monckton; Ian Plimer, PhD; S. Fred Singer, PhD; Roy W. Spencer, PhD; and James M. Taylor, J.D. To view the names of all speakers and conference events, go to: http://www.heartland.org/events/2010Chicgo/program.html
Four of the above guest speakers participated in book signing sessions: Ian Plimer, PhD - Heaven and Earth: Global Warming, the Missing Science; Roy W. Spencer, PhD - The Great Global Warming Blunder: How Mother Nature Fooled the World's Top Scientists; S. Fred Singer, PhD - Hot Talk, Cold Science; and Christopher C. Horner, J.D. - Power Grab: How Obama's Green Policies Will Steal your Freedom and Bankrupt America.
Much visited by convention participants were thirteen Conference Exhibitors. All thirteen deserve recognition, but to list them all would not be practical in this format. Pajamas Media deserves special recognition because of its "on location" coverage from the Copenhagen Climate Conference. Pajamas Media's online video arm has also been at the forefront of the global warming controversy, breaking many key stories from both the scientific and business perspectives. Pajamas Media videotaped the entire Heartland Conference. http://www.pajamasmedia.com/
My one regret is that I could not listen to the presentations of all seventy-three of the distinguished speakers. Tracks were set up from which conference participants could select those speakers they wished to hear based on their interests. Four tracks were available at each of the five sessions, three on Monday and two on Tuesday. Two of the tracks were devoted to Science and one track each to Economics and Public Policy. Each of the four tracks in every session featured either three or four guest speakers. With this in mind, as a participant who attended all five of the sessions, I was limited to hearing, at the most, seventeen of the featured speakers. Additionally, however, there were two keynote speakers at Sunday's opening supper and two each at breakfast and lunch on both Monday and Tuesday.
As a conference participant, I would like to comment about two of the guest speakers. One of them, James Delingpole, was the only non-scientist in the group of seventy-three. He is an author, broadcaster, and blogger who helped break the Climategate story in the United Kingdom. Having earned an English degree from Oxford University, Delingpole "felt like a shepherd boy who had been transported to Mt. Olympus."
According to Delingpole, the Climategate story fell into his lap and changed his life. His pitch to the conference attendees was that we represent the happy people who want a good life, and that we have a place in this war. The war we are fighting is for our liberty; it is between two opposing views of the world, and it is also a propaganda war. James Delingpole is the author of Obamaland: I've Seen Your Future And It Doesn't Work.
There could be no question as to the climax of the conference. Even the president of Heartland Institute, Joseph Bast, concluded as much, when he decided to present the wrap-up of the conference before Lord Christopher Monckton had spoken at the final lunch gathering on Tuesday, May 18. Lord Monckton is chief policy adviser to the Science and Public Policy Institute. Monckton was also a policy adviser to Prime Minister Margaret Thatcher. He now travels the world advocating for the truth of sound science. Monckton describes truth "as the center of every lasting consensus."
Lord Monckton's speech was anticipated by all and he didn't disappoint. His tongue-in-cheek British humor was entertaining, but then Monckton turned serious. It was because of Lord Monckton that the "Hockey Stick" report by the IPCC was thoroughly discredited. As Monckton described it, bogus facts were used to construct the computer model in an attempt to show that the rate of global warming is accelerating and that it is man-caused.
According to Lord Monckton, even if all economic activity were closed down to forestall global warming for a period of 100 years, the temperature reduction would only amount to 1 degree Fahrenheit. This would be the height of folly and cruelty!
In speaking about Cap and Trade, Monckton warned that any such measure to curtail global warming would result in abject failure. Monckton then listed three current approaches that are doomed to failure because they would have no effect on climate change: the Kerry-Lieberman Bill, EPA regulations, and an attempt to push a new treaty at Cancun to replace the failed Copenhagen one.
Further words of truth spoken by Lord Monckton described how science and economics cannot be divorced from politics, that Cap and Trade is nothing less than an attempt by the rich and powerful to take away the chance for the little guy to face up to the big guy, and that scientific truth will always remain the truth because it doesn't matter how many lies are told.
It was during the conclusion of Lord Monckton's remarks - Global Warming: The Trojan that Menaces Global Freedom - that not a dry eye was left in the room. In a dramatic presentation, Monckton quoted Lincoln's Gettysburg Address. Not only did Lord Monckton tear up, but so did his attentive and enraptured audience, as Monckton passionately intoned its final words: "...that this nation, under God, shall have a new birth of freedom -- and that government of the people, by the people, for the people, shall not perish from the earth."
Listed below are but a few of the many salient facts about global warming which conference participants were privileged to hear:
Weather stations can no longer be trusted. 90% of the 1,064 weather stations do not meet government standards because contamination is present.
The billions of dollars spent by government and others to fund science just perpetuate problems rather than solving them. Funding only continues if research shows what those funding it wish it to prove; otherwise funding is discontinued.
If temperature can't be projected for a week, how is it possible to project temperature to 2050 and beyond?
The public is susceptible to scare tactics: Silent Spring, by Rachel Carson, published in September of 1962, helped to start the environmental movement. A marine biologist, Carson documented the detrimental effects of pesticides on the environment, which led to the banning of DDT here in the U.S. and millions of deaths in malaria-prone countries.
70% of the public believes that we're almost running out of fossil fuel.
Uncertainty allows for the possibility of disaster. Something must be done even though that something might make things worse.
Computer models are not reliable because garbage in yields garbage out. Facts are often cherry-picked and can be tweaked to create the results that the computer modeler is looking for.
Peer review is a way of screening out opposing views. Skeptics of climate change have a difficult time getting published.
The influence of CO2 is so small that it is at the noise level.
Science education is in a general decline. Students are taught that science is based on evidence, and yet all they are presented with are inaccurate models.
It was disconcerting and even unconscionable that the Chicago media ignored Heartland's Fourth International Conference on Climate Change. While there were reporters and TV stations present at the conference, they were not from Chicago media sources. The bias shown by the Chicago media is unforgivable.
Shame on the Chicago Tribune, the Chicago Sun-Times, and the Daily Herald for not even including a short blurb in their newspapers about Heartland's outstanding Fourth International Conference on Climate Change with its distinguished worldwide list of speakers. Sadly, the media has taken the politically approved stance that global warming is man-made. As such, the media is not about to inform its readers of the thousands of leading scientists around the world who reject global warming. Is it any wonder that newspapers are losing subscribers, when they only tell one side of the story? Kudos to Pajamas Media for filming the entire conference!
A Heartland Institute sign prominently displayed at the conference said it all: "Global Warming? It is not man-made, it's a natural variation, the human impact is very small, computer models are flawed, and there is no 'consensus.' Global Warming is also not harmful, past warmings were beneficial, no current warming harms, future warmings will be modest, and warming is better."
Sixty-four cosponsors from twenty-three different countries signed on to Heartland's Fourth International Conference on Climate Change, including Americans for Prosperity, the Ayn Rand Center for Individual Rights, FreedomWorks, the Illinois Policy Institute, JunkScience.com, the George C. Marshall Institute, the National Center for Public Policy Research, and the Science and Public Policy Institute.
Help nature or risk humanity: report
THE economic case for global action to stop the destruction of the natural world is even more powerful than the argument for tackling climate change, a report for the United Nations will declare later this year.
The Stern report on climate change, which was prepared for the British Treasury and published in 2006, stated that the cost of limiting climate change would be about 1 to 2 per cent of annual global wealth, but the longer-term economic benefits would be five to 20 times that figure.
The UN's biodiversity report, dubbed Stern for Nature, is expected to say that the value of saving ''natural goods and services'', such as pollination, medicines, fertile soils, clean air and water, will be even higher - between 10 and 100 times the cost of saving the habitats and species that provide them.
The report will advocate immense changes to the way the global economy is run to factor in the value of the natural world.
The measures it will recommend include:
■ Paying communities to conserve nature rather than deplete it.
■ Setting strict limits on what companies can take from the environment, and fining or taxing them to curb overexploitation.
■ Asking businesses and national governments to publish accounts of their use of natural and human capital alongside their financial results.
■ Reforming subsidies worth more than US$1 trillion a year for industries such as agriculture, fisheries, energy and transport.
The potential economic benefits of protecting biodiversity are huge. Setting up and running a comprehensive network of protected areas would cost $45 billion a year globally, according to one estimate, but the benefits of preservation within these zones would be worth $4-5 trillion a year.
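The scale of that claimed payoff can be sanity-checked with simple arithmetic; the short sketch below uses only the figures quoted above ($45 billion a year in costs against $4-5 trillion a year in benefits) and is merely an illustrative back-of-the-envelope check, not part of the report:

```python
# Back-of-the-envelope benefit-cost check using the article's figures.
cost_per_year = 45e9                     # estimated annual cost of a protected-area network ($45 billion)
benefit_low, benefit_high = 4e12, 5e12   # estimated annual preservation benefits ($4-5 trillion)

ratio_low = benefit_low / cost_per_year
ratio_high = benefit_high / cost_per_year

# Roughly 89x to 111x, consistent with the report's "10 to 100 times" range.
print(f"Benefits outweigh costs by roughly {ratio_low:.0f}x to {ratio_high:.0f}x")
```

That the computed ratio brushes the top of the "between 10 and 100 times" range cited earlier suggests the protected-areas estimate sits at the optimistic end of the report's own figures.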
''We need a sea change in human thinking and attitudes towards nature,'' said the report's author, the economist Pavan Sukhdev, who is a former senior banker with Deutsche Bank and a special adviser to the UN environment program. He called for nature to be seen ''not as something to be vanquished or conquered, but rather something to be cherished and lived within''.
The UN report's authors say if the goods and services provided by the natural world are not valued and factored into the global economic system, the environment will become more fragile and less resilient to shocks, risking human lives, livelihoods and the global economy.
The changes will involve a revolution in the way humans do business, consume and think about their lives, Mr Sukhdev said. He referred to the damage being inflicted on the natural world as ''a landscape of market failures''.
The report follows a series of studies showing that the world is in the grip of a mass extinction event as pollution, climate change, development and hunting destroy habitats of all types.
However, only two of the world's 100 biggest companies believe reducing biodiversity is a strategic threat to their business, according to another report released on Friday by PricewaterhouseCoopers, which is advising the team compiling the UN report.
Mr Sukhdev said: ''We do have limits to how much we can extract, and why and where.''
The Economics of Ecosystems and Biodiversity (TEEB) report shows that, on average, one-third of Earth's habitats have been damaged by humans.
Friday, May 21, 2010
Emissions Reductions is Top Environmental Concern for U.S. Businesses
The top concern of U.S. businesses for climate change and environmental issues is reducing carbon emissions, according to PricewaterhouseCoopers’ Appetite for Change global survey.
U.S. survey respondents ranked reduction of carbon-dioxide emissions as the issue to most impact their companies over the next two to five years (16 percent), followed by new regulation, (13 percent), energy efficiency (12 percent), and legislation/new laws (11 percent).
“The Obama Administration recently announced that the federal government would reduce its own carbon footprint by 28 percent by 2020. If the government were to push down that requirement through its supply chain to all government contractors and suppliers, the impact on U.S. business would be quite significant,” says Kathy Nieland, leader of the Sustainability and Climate Change practice of PricewaterhouseCoopers LLP.
Eighty-seven percent of U.S. survey respondents say change is likely over the next few years as a result of the climate change and environmental debate. Twenty-eight percent believe these changes could be significant.
Although more than half the respondents (55 percent) noted that the climate change and environmental debate has had an impact on the way their organization conducts business, 45 percent said it has had little or no impact at all.
The survey also finds that there is broad-based support for tax incentives for renewable energy and energy efficiency. Eighty-eight percent of American companies surveyed said that tax incentives were effective in encouraging businesses to reduce their environmental impact, although 67 percent said that tax incentives currently in place are not sufficiently motivating them to change their business behavior to obtain them.
One in four U.S. respondents (23 percent) said government should have primary responsibility for leading behavioral change around climate initiatives, rather than businesses overall or their own industry. In comparison, 44 percent of respondents globally said government should have primary responsibility in this area.
A significantly higher proportion of U.S. respondents (38 percent) want business/the market to have primary responsibility for leading behavioral change, compared with only 18 percent globally.
A majority (56 percent) of U.S. respondents do not feel that government engages effectively with business to ensure its environmental policies take industry views into account. Only 17 percent said they believe the government has a clear, unambiguous policy with regard to environmental economic instruments.
Another finding shows that U.S. businesses are split on whether voluntary programs to disclose their carbon emissions result in a reduction of their environmental impact. Fifty percent said such programs are not very or not at all effective, while the remaining 50 percent say they are effective.
More than four in 10 (44 percent) of respondents said the potential cost savings from introducing energy-efficient measures was “very influential” on their organization’s environmental behavior.
HSBC climate change fund linked to deforestation
Environmental charity Greenpeace has made claims that HSBC's asset management arm is sidestepping the bank's environmental guidelines by holding shares in a company accused of destroying Indonesian forests and peatland.
The Greenpeace campaign points to the bank's Global Investment Funds 2009 annual report as evidence that its Global Climate Change Fund is investing in Golden Agri-Resources Ltd, the palm oil arm of Sinar Mas.
Late last year, the charity released a report on Sinar Mas' activities in Indonesia, claiming that it has been selective in complying with requirements of the Round Table on Sustainable Palm Oil, and that its subsidiaries were flouting legal requirements in developing land for palm oil production.
The report estimated that the annual CO2 emissions from the company's palm oil concessions in Indonesia's Riau province amount to 2.5m tonnes.
Greenpeace raised the alarm on the investment as part of its effort to stop companies from doing business with Sinar Mas. The charity has already persuaded Nestlé to cut the palm oil producer out of its supply chain.
Although HSBC has an ethical forestry policy, which states that the bank "will not finance plantations converted from natural forest since June 2004", the rule currently does not apply to its asset management funds.
Francis Sullivan, the bank's adviser on the environment, told Guardian Sustainable Business: "I can confirm that neither Sinar Mas nor any of its subsidiaries are clients of HSBC, which would be consistent with the forest policy that we do have." But he added that the policy was not extended to HSBC's Global Investment Funds.
The Global Climate Change Fund is described on the bank's website as "an innovative fund that invests in carefully selected companies that are considered best placed to benefit from addressing the challenges presented by climate change".
Sullivan said the fund had singled out palm oil as a promising raw material for producing low carbon biofuels. It has begun to invest in the most profitable companies in the palm oil sector, including Golden Agri-Resources.
The fund's selection method had, according to Sullivan, sifted out companies producing biofuels from corn and rapeseed after calculating that the net carbon emissions of the process were too high for the product to be considered a low carbon alternative to fossil fuels.
But he claimed HSBC "could not find specific scientific evidence" to quantify the carbon emissions from deforestation by individual companies.
HSBC is due to review the company selection criteria for the fund in September, but Sullivan said the company cannot guarantee it will find enough scientific evidence to change the way it selects palm oil companies for investment.
The Renewable Fuels Agency, the UK government's biofuels regulator, reviewed the net carbon emissions of palm oil grown in deforested and peatland areas in a report, published in January.
It stated: "If palm oil expansion causes loss of natural forest, the carbon release associated will negate any potential carbon savings from the use of palm biodiesel." The land use change emissions from deforestation would take "130 years to repay" in carbon benefits from palm oil biodiesel, and the "carbon payback for biodiesel feedstock produced on peatland can be measured in millennia".
Greenpeace predicts that, compared with levels in 2000, palm oil demand will more than double by 2030 and triple by 2050. In January 2007 Golden Agri-Resources made one of the biggest single biofuel investments worldwide, signing a deal to invest $5.5bn over eight years to develop biodiesel based on palm oil and bioethanol based on sugar cane or cassava.
Thursday, May 20, 2010
Cancer by the numbers: How many are caused by the environment?
Traces of chemicals known to cause human cancer lurk everywhere. But after decades of research, figuring out how many people might contract cancer because of them remains an elusive goal.
More than 60 percent of U.S. cancer deaths are caused by smoking and diet. But what about the rest?
A report by the President's Cancer Panel, released earlier this month, reignited a 30-year-old controversy among cancer experts and environmental epidemiologists about how large a role environmental factors play in the No. 2 killer of Americans.
Some experts, including the President’s panel, say a decades-old estimate that six percent of cancer deaths are due to environmental and occupational exposures is outdated and far too low. But scientists most likely will never be able to tease out the true role of environmental contaminants because environmental exposures, genetics and lifestyle seem to all intertwine.
“It’s like looking at strands of a spider web and deciding which one is important,” said Dr. Ted Schettler, director of the Science and Environmental Health Network, a nonprofit group that advocates use of science in setting environmental policy.
From the womb to old age, people around the world are exposed to countless carcinogens in their food, air, water and consumer goods.
The National Institutes of Health has classified 54 compounds as known human carcinogens based on studies indicating they cause at least one type of cancer in people, according to the nation’s 11th Report on Carcinogens. The highest exposures occur in an occupational setting, but there are environmental exposures as well.
For example, benzene, a known cause of human leukemia, is a common pollutant in vehicle exhaust. Radon, a natural radioactive gas found in many homes, raises the risk of lung cancer. Arsenic, linked to skin, liver, bladder and lung cancer, contaminates some drinking water supplies. Other known human carcinogens include asbestos, hexavalent chromium, aflatoxins and vinyl chloride.
Since 1981, agencies and institutes have cited the same estimate when regulating carcinogens in the workplace, air, water and consumer products. Roughly four percent of cancer deaths – or 20,000 deaths per year – may be attributable to occupational exposures, and two percent – or 10,000 deaths per year – to environmental exposures.
In its new report, the panel, appointed by former President Bush, called that estimate “woefully out of date,” reporting that “the true burden of environmentally induced cancers has been grossly underestimated.”
But the American Cancer Society took issue with that statement, saying there is no scientific consensus.
“On what grounds do you know it's being grossly underestimated? It's a possibility, but many hypotheses have been proposed, and unless you have real evidence, you can’t say that it is,” said Dr. Michael Thun, vice president emeritus of epidemiology and surveillance research for the American Cancer Society.
Thun said the President’s panel overstates the concern about environmental causes when the best way to prevent cancer is to combat the largest risks that people encounter: tobacco, diet, physical inactivity and sun.
But many environmental epidemiologists say quibbling over the numbers becomes a diversionary tactic.
They say the American Cancer’s Society’s statement sounds a bit like a principle espoused by industry groups – don’t act without absolute proof of harm. Many environmental epidemiologists are in favor of moving toward the precautionary principle – reducing people’s exposure to environmental pollutants even if there is uncertainty about the risks.
"It would be unfortunate if people came away with the message that chemicals in the environment are the most important cause of cancer at the expense of those lifestyle factors, like tobacco, physical activity, nutrition, and obesity, that have by far the most potential in reducing cancer deaths." - Michael Thun, American Cancer Society It’s an "erroneous exercise” to try to assign each chemical or exposure a specific fraction of cancer, said Richard Clapp of Boston University's School of Public Health, who co-authored a 2005 review and 2007 update on environmental and occupational causes of cancer.
"It's estimating a fiction, because nobody knows and nobody can know," said Clapp. "Why do we keep beating this dead horse? If there are things we can move on, let's work on those."
Cancer is the second leading killer of Americans, and the leading cause of death worldwide. Every year, about 1.5 million new cases are diagnosed in the United States and more than half a million people die from the disease, according to the American Cancer Society.
Experts agree that most cancers are caused by lifestyle factors such as smoking, diet and alcohol. Smoking alone accounts for at least 30 percent of all U.S. cancer deaths, and about the same percentage is attributed to diet, obesity and physical inactivity, according to the American Cancer Society.
But it’s the remaining cancers – roughly one out of every three – that trigger debate.
A 1981 report by two scientists, Sir Richard Doll and Sir Richard Peto, published in the Journal of the National Cancer Institute, estimated that two percent of cancer deaths were attributable to exposures to pollutants in the environment and four percent to exposures in occupational settings. In 2009, those percentages amounted to about 30,000 U.S. deaths.
“If you looked at the number of deaths per day, if that were a plane crash it would be a national news story,” Clapp said.
The 1981 report only considered deaths, not cancer cases. (About half of those diagnosed with cancer die.) Also, the study only included Caucasians under the age of 65, although many cancers increase with age and many minority groups are more highly exposed to environmental contaminants.
US Centers for Disease Control and Prevention
Asbestos fibers lodged in a lung.
The old two percent estimate for environmentally induced cancers is still commonly used – despite advances in modern cancer biology.
New areas of cancer research are focusing on the potential for pollutants to interact with one another and with genetic factors. Carcinogens can act by damaging DNA, disrupting hormones, inflaming tissues, or switching genes on or off.
Also, exposure to hormonally active agents during critical periods of human development – particularly in the womb or during childhood – may trigger cancer later in life. For example, the risk of breast cancer could be influenced by exposures during puberty.
All these elements make it tricky to calculate the magnitude of environmentally induced cancers.
Scientists now know that getting cancer is like being attacked by a multi-headed monster: How can you really be sure which part did the most damage?
Schettler said “we now know from cancer biology that multiple interacting factors” are involved so it’s impossible to assign percentages to certain causes.
“It’s really important that we understand the limits of this notion. We have to be humbled by this and know that our estimates may be way off,” he said.
Margaret Kripke, a professor at University of Texas' M.D. Anderson Cancer Center and co-author of the President’s Cancer Panel report, said the idea that cancer biologists can put a number on the environmental component of cancer is fraught with limitations.
She uses the example of a person who is genetically predisposed to lung cancer, but also smokes and lives in an area with high air pollution. If this person develops cancer, it is almost always attributed to smoking because almost 90 percent of lung cancer deaths are caused by tobacco. But researchers can't simply dismiss the remaining 10 percent. The way these fractions are teased apart is crucial, and important contributors are easily overlooked by limitations in study design.
Smoking alone accounts for at least 30 percent of all U.S. cancer deaths, and about the same percentage is attributed to diet, obesity and physical inactivity. But it’s the remaining cancers – about one out of every three – that trigger debate.There is substantial evidence that synergism between two different exposures can cause some cancers. Asbestos, for example, enhances the carcinogenicity of tobacco smoke, so the rate of lung cancer was especially high among people who smoked and also were exposed to asbestos in their workplaces.
The major reason that it’s so difficult to pin down how many cancers are due to environmental factors is that studies that allow epidemiologists to link human cancers to an environmental pollutant are rare opportunities.
Scientists need a setting where they can be absolutely certain about what and when people were exposed to something, and then be able to follow up with the patients many years later, since cancer takes decades to develop. Yet this is hardly ever possible, said Dr. Richard Jackson, former director of the federal Centers for Disease Control and Prevention’s National Center for Environmental Health.
Humans aren't lab rats; they tend to move around, so they don't know what they were exposed to, said Jackson, who is now a UCLA professor. Also, tracking systems for environmental exposures and chemicals are inadequate.
Let Ideas Compete/Flickr
Smoking is relatively easy to study, but determining people's exposure to other carcinogens is much more difficult.
Smoking is relatively easy to study – you can ask someone about their smoking habits – but if you ask someone if they were exposed to benzene, chlorinated solvents, or pesticides, they probably won’t have the slightest idea or they certainly won’t know how much they were exposed to, Schettler said.
There are examples of natural experiments where communities have banned a suspected carcinogen such as a pesticide and then seen cancer rates drop, such as when Sweden banned phenoxy herbicides over a decade ago. While these natural experiments are useful to epidemiologists, they usually only confirm that a chemical is harmful and reveal little about its overall contribution to cancer death.
In most cases, environmental agencies estimate the number of cases attributable to a certain environmental chemical by extrapolating from studies of lab animals or occupational settings where cancer rates rise among workers, then estimating the public's exposures. But those risk assessments carry many uncertainties.
The two members of the President’s Cancer Panel believe their claim about the “grossly underestimated” role of the environment is justified because technologies such as CT scans, which expose people to large amounts of radiation, are in greater use today. Also, there are more known carcinogens today, and the original estimates didn't consider multiple exposures over a person’s life.
“We think all of those things combine to make the current estimate higher. They certainly won't go down, but are probably much larger than estimated,” said Kripke.
She said the panel’s intent was to bring attention to human carcinogens in the environment that the public is unaware of, such as radon and formaldehyde, she said.
The panel pointed out bisphenol A, used in polycarbonate plastic and can linings, along with radon, formaldehyde and benzene, as carcinogens that need more regulation.
Clapp said instead of worrying about specific numbers, the focus should be on banning or restricting workplace carcinogens with strong evidence that they are harmful. One example is methylene chloride, used in semiconductor factories.
The National Institutes of Health has classified 54 compounds as known human carcinogens based on studies indicating they cause at least one type of cancer in people. Included are asbestos, benzene and radon.Reducing use of CT scans and cleaning up military bases are other ways to reduce exposures, according to the President’s Cancer Panel report.
The American Cancer Society agreed with much of the panel’s report, and in the past, it has expressed concern about environmental chemicals.
“Although the relatively small risks associated with low-level exposure to carcinogens in air, food, or water are difficult to detect in epidemiological studies, scientific and regulatory bodies throughout the world have accepted the principle that it is reasonable and prudent to reduce human exposure to substances shown to be carcinogenic at higher levels of exposure,” the American Cancer Society said in a 2009 Cancer Facts and Figures report.
But the group worries that the President’s Cancer Panel overstated the risks and detracts from combating the bigger causes of cancer.
Kelly Sue/flickr
Combating the largest cancer risks includes avoiding too much sun exposure.
“There is no doubt that environmental pollution is an important issue to address to improve the lives of Americans. At the same time, it would be unfortunate if people came away with the message that the chemicals in the environment are the most important cause of cancer at the expense of those lifestyle factors, like tobacco, physical activity, nutrition, and obesity, that have by far the most potential in reducing cancer deaths,” Thun said in a statement.
Thun added in an interview that “many of the carcinogens in smoking are the same ones that people worry about in the general environment,” such as benzene. But in cigarettes, “they are much more concentrated and people are inhaling them deep into their lungs. The magnitude of exposure is just gigantically different.”
But Kripke pointed out that there has been plenty of emphasis on smoking, diet and other causes of cancer over the past few years. Last year’s 2009 President’s Cancer Panel report focused on lifestyle-related cancers.
"To say that we have ignored those factors doesn't take into account that we have put a lot into work into it,” Kripke said. "We're very cognizant that there are other, larger factors that contribute to cancer, but that doesn't mean we shouldn't look at the smaller ones.”
The general public can understand that many factors can lead to disease and that all should be addressed, Schettler said.
“People can walk and chew gum at the same time. We can pay attention to many factors at the same time,” he said.
More than 60 percent of U.S. cancer deaths are caused by smoking and diet. But what about the rest?
A report by the President's Cancer Panel, released earlier this month, reignited a 30-year-old controversy among cancer experts and environmental epidemiologists about how large a role environmental factors play in the No. 2 killer of Americans.
Some experts, including the President’s panel, say a decades-old estimate that six percent of cancer deaths are due to environmental and occupational exposures is outdated and far too low. But scientists most likely will never be able to tease out the true role of environmental contaminants because environmental exposures, genetics and lifestyle seem to all intertwine.
“It’s like looking at strands of a spider web and deciding which one is important,” said Dr. Ted Schettler, director of the Science and Environmental Health Network, a nonprofit group that advocates use of science in setting environmental policy.
From the womb to old age, people around the world are exposed to countless carcinogens in their food, air, water and consumer goods.
The National Institutes of Health has classified 54 compounds as known human carcinogens based on studies indicating they cause at least one type of cancer in people, according to the nation’s 11th Report on Carcinogens. The highest exposures occur in an occupational setting, but there are environmental exposures as well.
For example, benzene, a known cause of human leukemia, is a common pollutant in vehicle exhaust. Radon, a natural radioactive gas found in many homes, raises the risk of lung cancer. Arsenic, linked to skin, liver, bladder and lung cancer, contaminates some drinking water supplies. Other known human carcinogens include asbestos, hexavalent chromium, aflatoxins and vinyl chloride.
Since 1981, agencies and institutes have cited the same estimate when regulating carcinogens in the workplace, air, water and consumer products. Roughly four percent of cancer deaths – or 20,000 deaths per year – may be attributable to occupational exposures, and two percent – or 10,000 deaths per year – to environmental exposures.
In its new report, the panel, appointed by former President Bush, called that estimate “woefully out of date,” reporting that “the true burden of environmentally induced cancers has been grossly underestimated.”
But the American Cancer Society took issue with that statement, saying there is no scientific consensus.
“On what grounds do you know it's being grossly underestimated? It's a possibility, but many hypotheses have been proposed, and unless you have real evidence, you can’t say that it is,” said Dr. Michael Thun, vice president emeritus of epidemiology and surveillance research for the American Cancer Society.
Thun said the President’s panel overstates the concern about environmental causes when the best way to prevent cancer is to combat the largest risks that people encounter: tobacco, diet, physical inactivity and sun.
But many environmental epidemiologists say quibbling over the numbers becomes a diversionary tactic.
They say the American Cancer Society’s statement sounds a bit like a principle espoused by industry groups – don’t act without absolute proof of harm. Many environmental epidemiologists are in favor of moving toward the precautionary principle – reducing people’s exposure to environmental pollutants even if there is uncertainty about the risks.
It’s an “erroneous exercise” to try to assign each chemical or exposure a specific fraction of cancer, said Richard Clapp of Boston University's School of Public Health, who co-authored a 2005 review and 2007 update on environmental and occupational causes of cancer.
"It's estimating a fiction, because nobody knows and nobody can know," said Clapp. "Why do we keep beating this dead horse? If there are things we can move on, let's work on those."
Cancer is the second leading killer of Americans, and the leading cause of death worldwide. Every year, about 1.5 million new cases are diagnosed in the United States and more than half a million people die from the disease, according to the American Cancer Society.
Experts agree that most cancers are caused by lifestyle factors such as smoking, diet and alcohol. Smoking alone accounts for at least 30 percent of all U.S. cancer deaths, and about the same percentage is attributed to diet, obesity and physical inactivity, according to the American Cancer Society.
But it’s the remaining cancers – roughly one out of every three – that trigger debate.
A 1981 report by two scientists, Sir Richard Doll and Sir Richard Peto, published in the Journal of the National Cancer Institute, estimated that two percent of cancer deaths were attributable to exposures to pollutants in the environment and four percent to exposures in occupational settings. In 2009, those percentages amounted to about 30,000 U.S. deaths.
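The arithmetic behind that figure is easy to check (a rough sketch; the annual-deaths number is taken from the American Cancer Society figure of more than half a million cited above):

```python
# Rough check of the Doll-Peto attribution arithmetic, using an
# approximate figure for annual U.S. cancer deaths around 2009.
annual_us_cancer_deaths = 560_000  # approximate; "more than half a million"

environmental_share = 0.02   # Doll & Peto: pollutants in the general environment
occupational_share = 0.04    # Doll & Peto: occupational exposures

attributable = annual_us_cancer_deaths * (environmental_share + occupational_share)
print(f"Estimated attributable deaths per year: {attributable:,.0f}")
```

Six percent of roughly 560,000 deaths works out to a little over 33,000, consistent with the article's "about 30,000" figure given the rounding in the inputs.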
“If you looked at the number of deaths per day, if that were a plane crash it would be a national news story,” Clapp said.
The 1981 report only considered deaths, not cancer cases. (About half of those diagnosed with cancer die.) Also, the study only included Caucasians under the age of 65, although many cancers increase with age and many minority groups are more highly exposed to environmental contaminants.
US Centers for Disease Control and Prevention
Asbestos fibers lodged in a lung.
The old two percent estimate for environmentally induced cancers is still commonly used – despite advances in modern cancer biology.
New areas of cancer research are focusing on the potential for pollutants to interact with one another and with genetic factors. Carcinogens can act by damaging DNA, disrupting hormones, inflaming tissues, or switching genes on or off.
Also, exposure to hormonally active agents during critical periods of human development – particularly in the womb or during childhood – may trigger cancer later in life. For example, the risk of breast cancer could be influenced by exposures during puberty.
All these elements make it tricky to calculate the magnitude of environmentally induced cancers.
Scientists now know that getting cancer is like being attacked by a multi-headed monster: How can you really be sure which part did the most damage?
Schettler said “we now know from cancer biology that multiple interacting factors” are involved so it’s impossible to assign percentages to certain causes.
“It’s really important that we understand the limits of this notion. We have to be humbled by this and know that our estimates may be way off,” he said.
Margaret Kripke, a professor at University of Texas' M.D. Anderson Cancer Center and co-author of the President’s Cancer Panel report, said the idea that cancer biologists can put a number on the environmental component of cancer is fraught with limitations.
She uses the example of a person who is genetically predisposed to lung cancer, but also smokes and lives in an area with high air pollution. If this person develops cancer, it is almost always attributed to smoking because almost 90 percent of lung cancer deaths are caused by tobacco. But researchers can't simply dismiss the remaining 10 percent. The way these fractions are teased apart is crucial, and important contributors are easily overlooked by limitations in study design.
There is substantial evidence that synergism between two different exposures can cause some cancers. Asbestos, for example, enhances the carcinogenicity of tobacco smoke, so the rate of lung cancer was especially high among people who smoked and also were exposed to asbestos in their workplaces.
The major reason that it’s so difficult to pin down how many cancers are due to environmental factors is that studies that allow epidemiologists to link human cancers to an environmental pollutant are rare opportunities.
Scientists need a setting where they can be absolutely certain about what and when people were exposed to something, and then be able to follow up with the patients many years later, since cancer takes decades to develop. Yet this is hardly ever possible, said Dr. Richard Jackson, former director of the federal Centers for Disease Control and Prevention’s National Center for Environmental Health.
Humans aren't lab rats; they tend to move around, so they don't know what they were exposed to, said Jackson, who is now a UCLA professor. Also, tracking systems for environmental exposures and chemicals are inadequate.
Let Ideas Compete/Flickr
Smoking is relatively easy to study, but determining people's exposure to other carcinogens is much more difficult.
Smoking is relatively easy to study – you can ask someone about their smoking habits – but if you ask someone if they were exposed to benzene, chlorinated solvents, or pesticides, they probably won’t have the slightest idea or they certainly won’t know how much they were exposed to, Schettler said.
There are examples of natural experiments where communities have banned a suspected carcinogen such as a pesticide and then seen cancer rates drop, such as when Sweden banned phenoxy herbicides over a decade ago. While these natural experiments are useful to epidemiologists, they usually only confirm that a chemical is harmful and reveal little about its overall contribution to cancer death.
In most cases, environmental agencies estimate the number of cases attributable to a certain environmental chemical by extrapolating from studies of lab animals or occupational settings where cancer rates rise among workers, then estimating the public's exposures. But those risk assessments carry many uncertainties.
The two members of the President’s Cancer Panel believe their claim about the “grossly underestimated” role of the environment is justified because technologies such as CT scans, which expose people to large amounts of radiation, are in greater use today. Also, there are more known carcinogens today, and the original estimates didn't consider multiple exposures over a person’s life.
“We think all of those things combine to make the current estimate higher. They certainly won't go down, but are probably much larger than estimated,” said Kripke.
She said the panel’s intent was to bring attention to human carcinogens in the environment that the public is unaware of, such as radon and formaldehyde.
The panel pointed out bisphenol A, used in polycarbonate plastic and can linings, along with radon, formaldehyde and benzene, as carcinogens that need more regulation.
Clapp said instead of worrying about specific numbers, the focus should be on banning or restricting workplace carcinogens with strong evidence that they are harmful. One example is methylene chloride, used in semiconductor factories.
Reducing use of CT scans and cleaning up military bases are other ways to reduce exposures, according to the President’s Cancer Panel report.
The American Cancer Society agreed with much of the panel’s report, and in the past, it has expressed concern about environmental chemicals.
“Although the relatively small risks associated with low-level exposure to carcinogens in air, food, or water are difficult to detect in epidemiological studies, scientific and regulatory bodies throughout the world have accepted the principle that it is reasonable and prudent to reduce human exposure to substances shown to be carcinogenic at higher levels of exposure,” the American Cancer Society said in a 2009 Cancer Facts and Figures report.
But the group worries that the President’s Cancer Panel overstated the risks and detracts from combating the bigger causes of cancer.
Kelly Sue/flickr
Combating the largest cancer risks includes avoiding too much sun exposure.
“There is no doubt that environmental pollution is an important issue to address to improve the lives of Americans. At the same time, it would be unfortunate if people came away with the message that the chemicals in the environment are the most important cause of cancer at the expense of those lifestyle factors, like tobacco, physical activity, nutrition, and obesity, that have by far the most potential in reducing cancer deaths,” Thun said in a statement.
Thun added in an interview that “many of the carcinogens in smoking are the same ones that people worry about in the general environment,” such as benzene. But in cigarettes, “they are much more concentrated and people are inhaling them deep into their lungs. The magnitude of exposure is just gigantically different.”
But Kripke pointed out that there has been plenty of emphasis on smoking, diet and other causes of cancer over the past few years. Last year’s 2009 President’s Cancer Panel report focused on lifestyle-related cancers.
"To say that we have ignored those factors doesn't take into account that we have put a lot of work into it,” Kripke said. "We're very cognizant that there are other, larger factors that contribute to cancer, but that doesn't mean we shouldn't look at the smaller ones.”
The general public can understand that many factors can lead to disease and that all should be addressed, Schettler said.
“People can walk and chew gum at the same time. We can pay attention to many factors at the same time,” he said.
SOLAR AERO'S BLADELESS WIND TURBINE
A research company in New Hampshire recently announced a patent on its bladeless wind turbine, which is based on a patent issued to Nikola Tesla in 1913. The Fuller Wind Turbine developed by Solar Aero has only one rotating part, the turbine-driveshaft. The entire assembly is contained inside a housing, giving this turbine several advantages over blade-style (primarily horizontal-axis) turbines. With a screened inlet and outlet, the turbine does not present a danger to wildlife such as bats and birds. To an outside observer, the only visible movement is the entire turbine housing adjusting to track the wind. This also makes it a good candidate for use near military surveillance and radar installations, where moving blades would otherwise cause difficulties.
According to the company, the turbine is expected to deliver power at a cost comparable to coal-fired power plants. Total operating costs over the lifetime of the unit are expected to be about $0.12/kWh. The turbine also should have fewer maintenance requirements, leading to lower lifetime operating costs. The turbine itself can also be supported on magnetic bearings, and all of the generating equipment kept at ground level, which will also make maintenance easier. The company estimates "final costs will be about $1.50/watt rated output, or roughly 2/3 the cost of comparable bladed units."
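Taking the company's quoted figures at face value, the up-front cost of a small unit is easy to estimate (an illustrative sketch; the 10 kW rating, 25 percent capacity factor, and 20-year service life are assumptions for illustration, not company figures):

```python
# Illustrative cost sketch using Solar Aero's quoted $1.50/W capital cost
# and $0.12/kWh lifetime operating cost. The 10 kW size, capacity factor,
# and service life below are assumptions, not company specifications.
rated_watts = 10_000                    # assumed small-unit rating
capital_cost = 1.50 * rated_watts       # $/W times rated watts
print(f"Up-front cost: ${capital_cost:,.0f}")

# Energy produced over an assumed 20-year life at a 25% capacity factor:
hours = 20 * 365 * 24
energy_kwh = (rated_watts / 1000) * hours * 0.25
print(f"Lifetime output: {energy_kwh:,.0f} kWh at ~$0.12/kWh all-in cost")
```

Under these assumptions a 10 kW unit would cost about $15,000 up front and produce on the order of 400,000 kWh over its life, which is the kind of arithmetic behind the company's per-kWh claim.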
The Tesla turbine operates using the viscous flow of a fluid to move the turbine and thereby produce energy. The Tesla turbine "consists of a set of smooth disks, with nozzles applying a moving gas to the edge of the disk. The gases drag on the disk by means of viscosity and the adhesion of the surface layer of the gas. As the gas slows and adds energy to the disks, it spirals in to the center exhaust. Since the rotor has no projections, it is very sturdy." Disks in the turbine need to be closely spaced in order to capture the viscous flow. To be effective, the Tesla turbine also needs extremely thin disks to minimize turbulence at the edges. Tesla was not able to find metals of sufficient quality to make this work effectively, but apparently, nearly a century later, those limitations have been overcome.
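A rough back-of-the-envelope calculation shows why the disks must be so closely spaced: the gap needs to be on the order of the laminar boundary-layer thickness, roughly sqrt(nu/omega), so the viscous layers on adjacent disks merge and drag the whole gas film. The figures below are illustrative assumptions, not Solar Aero specifications:

```python
import math

# Characteristic disk spacing for a Tesla turbine: on the order of the
# boundary-layer thickness sqrt(nu / omega). Illustrative values only.
nu_air = 1.5e-5                    # kinematic viscosity of air, m^2/s (room temp)
rpm = 10_000                       # assumed rotor speed
omega = rpm * 2 * math.pi / 60     # angular speed in rad/s

gap = math.sqrt(nu_air / omega)    # characteristic gap, metres
print(f"Disk gap on the order of {gap * 1e3:.2f} mm")
```

At an assumed 10,000 rpm the gap comes out at roughly a tenth of a millimetre, which illustrates why Tesla needed thinner, flatter, more rigid disks than the metallurgy of his era could supply.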
Solar Aero's current example is an unassuming trailer-mounted unit, but a unit the size of the one pictured (see website) "should be capable of 10kW output with no problem," according to the inventor. The number of disks determines the amount of power that can be produced. It will be interesting to see if this technology takes off, and if the technology is something that can be scaled up to provide utility level power production, or if it is only a niche system. In any case, it is interesting to see alternatives to bladed wind turbines.
Wednesday, May 19, 2010
Shifting rivers threaten India's top tea region
Shifting rivers in India's largest tea-producing state and abnormally high rainfall this year are destroying hundreds of acres of tea gardens and could cut output in the world's second-largest tea grower.
More than a tenth of the 18,000 hectares of plantations, or tea gardens, in India's northeast state of Assam could be washed away as the mighty Himalaya-born Brahmaputra and other smaller rivers flood the region where century-old operations grow over half of India's tea.
"Some tea gardens have already fallen into rivers and some of them are on the verge of disappearing," said Dipanjol Deka, secretary general of Tea Association of India (TAI) in Guwahati, the main city in the region.
"In the long run there is a possibility of production loss and overall loss to the industry."
India consumes the bulk of its tea production. Last year, it exported a fifth of its 979 million kilograms (kg) of tea output, earning about $570 million.
Tea has been commercially grown in Assam since the early 19th century, after the East India Company which governed British possessions in the subcontinent lost its monopoly on tea-trade with China.
Assam's 850 gardens employ over 800,000 people and export the strong tea to over 80 countries including Russia and Britain. In 2009 the state produced nearly 500 million kg of tea.
But annual summer floods in the state, which receives heavy monsoon rains, have led to rivers breaking their banks and wearing away the slopes of gardens where tea is grown. This year, the region has received more rain than usual, weather officials say.
Saturday, May 15, 2010
Letter from Secretary Napolitano and Secretary Salazar On Oil Spill
“Dear Dr. Hayward:
The BP Deepwater Horizon oil spill may prove to be one of the most devastating environmental disasters this nation has ever faced. As one of the responsible parties for this event, BP is accountable to the American public for the full clean up of this spill and all the economic loss caused by the spill and related events.
We recognize that, to date, BP has undertaken to promptly pay the damages associated with the Deepwater Horizon events, in addition to all removal costs. In an interview with Reuters on April 30, 2010, you stated that, “We are taking full responsibility for the spill and we will clean it up, and where people can present legitimate claims for damages we will honor them. We are going to be very, very aggressive in all of that.”
Mr. Lamar McKay, Chairman and President of BP America, in his May 11, 2010 testimony before the Senate Energy and Natural Resources Committee, also acknowledged BP’s responsibility for clean up and compensation associated with the oil spill. “[W]e are the responsible party. Our obligation is to deal with the spill, clean it up and make sure the impacts of that spill are compensated and we are going to do that.” Mr. McKay further noted in his testimony, “BP will pay all necessary clean up costs and is committed to paying legitimate claims for other loss and damages caused by the spill.” Finally, we note that Mr. McKay in his Senate testimony also agreed that BP will pay all claims even if they exceed what he described as an “irrelevant” statutory cap of $75 million per incident.
On May 10, 2010, BP reiterated this point in a letter from its U.S. General Counsel, John E. Lynch, Jr., to the Attorneys General of the five Gulf Coast states: “[I]t is BP’s position that the cap on liability under the Oil Pollution Act is not relevant; BP will pay necessary clean up costs associated with the spill and legitimate claims for other loss and damage.”
Based on these statements, we understand that BP will not in any way seek to rely on the potential $75 million statutory cap to refuse to provide compensation to any individuals or others harmed by the oil spill, even if more than $75 million is required to provide full compensation to all claimants, and BP will not seek reimbursement from the American taxpayers, the United States Government, or the Oil Spill Liability Trust Fund for any amount.
The public has a right to a clear understanding of BP’s commitment to redress all of the damage that has occurred or that will occur in the future as a result of the oil spill. Therefore, in the event that our understanding is inaccurate, we request immediate public clarification of BP’s true intentions.
Thank you for your prompt attention to this matter.
Sincerely,
Ken Salazar
Secretary of the Interior
Janet Napolitano
Secretary of Homeland Security
cc: Lamar McKay, Chairman and President, BP America”
Tuesday, April 20, 2010
Beach Erosion: Can We Stop Beach Erosion from Destroying Our Coastlines?
Dear EarthTalk: I’ve noticed a lot of beach erosion along the eastern U.S. coast. Beaches are virtually non-existent in places. Is this a usual cycle that will self-correct, or are these beaches permanently gone from sea level rise or other environmental causes? – Jan Jesse, Morristown, TN
Unfortunately for beach lovers and owners of high-priced beach-front homes, coastal erosion in any form is usually a one-way trip. Man-made techniques such as beach nourishment—whereby sand is dredged from offshore sources and deposited along otherwise vanishing beaches—may slow the process, but nothing short of global cooling or some other major geomorphic change will stop it altogether.
Beach Erosion Not Simply “Shifting Sands”
According to Stephen Leatherman (“Dr. Beach”) of the National Healthy Beaches Campaign, beach erosion is defined by the actual removal of sand from a beach to deeper water offshore or alongshore into inlets, tidal shoals and bays. Such erosion can result from any number of factors, including the simple inundation of the land by rising sea levels resulting from the melting of the polar ice caps.
Beach Erosion is an Ongoing Problem
Leatherman cites U.S. Environmental Protection Agency estimates that between 80 and 90 percent of the sandy beaches along America’s coastlines have been eroding for decades. In many of these cases, individual beaches may be losing only a few inches per year, but in some cases the problem is much worse. The outer coast of Louisiana, which Leatherman refers to as “the erosion ‘hot spot’ of the U.S.,” is losing some 50 feet of beach every year.
Is Global Warming Accelerating Beach Erosion?
Of particular concern is the effect that climate change has on beach erosion: it not only causes sea levels to rise but also increases the severity, and possibly the frequency, of harsh storms. “While sea level rise sets the conditions for landward displacement of the shore, coastal storms supply the energy to do the ‘geologic work’ by moving the sand off and along the beach,” writes Leatherman on his DrBeach.org website. “Therefore, beaches are greatly influenced by the frequency and magnitude of storms along a particular shoreline.”
What Can You Do Personally to Stop Beach Erosion? Not Much
Besides collectively lowering our greenhouse gas emissions substantially, there is little that individuals—let alone coastal landowners—can do to stop beach erosion. Building a bulkhead or seawall along one or a few coastal properties may protect homes from damaging storm waves for a few years, but could end up doing more harm than good. “Bulkheads and seawalls may accelerate beach erosion by reflecting wave energy off the facing wall, impacting adjacent property owners as well,” writes Leatherman, adding that such structures along retreating shorelines eventually cause diminished beach width and even loss.
Slowing or Stopping Beach Erosion is Possible, but Pricey
Other larger-scale techniques like beach nourishment may have better track records, at least in terms of slowing or delaying beach erosion, but are expensive enough to require massive taxpayer expenditures. In the early 1980s, the city of Miami spent some $65 million adding sand to a 10-mile stretch of fast-eroding shoreline. Not only did the effort stave off erosion, it helped revitalize the tony South Beach neighborhood and rescue hotels, restaurants and shops there that cater to the rich and famous.
GOT AN ENVIRONMENTAL QUESTION? Send it to: EarthTalk, c/o E/The Environmental Magazine, P.O. Box 5098, Westport, CT 06881; submit it at: www.emagazine.com/earthtalk/thisweek/, or e-mail: earthtalk@emagazine.com.
EarthTalk is a regular feature of E/The Environmental Magazine. Selected EarthTalk columns are reprinted on About Environmental Issues by permission of the editors of E/The Environmental Magazine.
Billions of People Face Food Shortages Due to Global Warming, Study Warns
Half of the world’s population could face severe food shortages by the end of this century as rising temperatures shorten the growing season in the tropics and subtropics, increase the risk of drought, and reduce the harvests of dietary staples such as rice and maize by 20 percent to 40 percent, according to a study published in the journal Science.
Global warming is expected to affect agriculture in every part of the world, but it will have a greater impact in the tropics and subtropics, where crops are less able to adapt to climate change and food shortages are already starting to occur due to rapid population growth.
Scientists at Stanford University and the University of Washington, who worked on the study, discovered that by 2100 there is a 90 percent chance that the coolest temperatures in the tropics during the growing season will be higher than the hottest temperatures recorded in those regions through 2006. Even more temperate parts of the world can expect to see previously record-high temperatures become the norm.
With the world population expected to double by the end of the century, the need for food will become increasingly urgent as rising temperatures force nations to retool their approach to agriculture, create new climate-resistant crops, and develop additional strategies to ensure an adequate food supply for their people.
All of that could take decades, according to Rosamond Naylor, who is director of food security and the environment at Stanford. Meanwhile, people will have fewer and fewer places to turn for food when their local supplies begin to run dry.
"When all the signs point in the same direction, and in this case it's a bad direction, you pretty much know what's going to happen," said David Battisti, the University of Washington scientist who led the study. "You're talking about hundreds of millions of additional people looking for food because they won't be able to find it where they find it now."
Readers Respond: What are you doing to help reduce global warming
Reducing global warming may require global solutions, but it also requires personal action by millions of individuals. What are you doing in your own life to help reduce global warming, and which of those strategies would you recommend to others? Share Your Ideas
Number of animals raised per year
Those who argue against the veg diet to stop global warming have some serious misunderstandings. Worldwide 60 billion (yes, 60 BILLION) animals are killed every year for meat. Those 60 billion animals create a far bigger burden than 6 billion humans. They create 130 times more excrement (which is untreated and poured into our waterways or buried where it pollutes underground water sources), over 50% of greenhouse gases, causes the majority of deforestation in the Amazon, and a whole host of other environmental problems (thousands of miles of poisoned rivers, 400 dead zones in our oceans, uses 70% of all usable water, over 60% of grain, etc). These animals do not happen naturally - they are raised in artificial numbers that would never happen in nature. They are an enormous burden on our planet and its people (while 10000 children die each day of starvation, we feed over 60% of the world's grain to animals. That grain would feed billions more people) Go veg, be green!
—Guest Vegg Mom
Most important thing left off list!
The meat industry is responsible for more greenhouse gas emissions than all forms of transportation combined! The Worldwatch Institute recently released a study indicating that the meat industry is responsible for 51% (!) of greenhouse gases. Cut your meat consumption to save the planet. Go Veg, Be Green to save our planet!
—Guest Debra
Small steps add up
Reduce, reuse, recycle are the tenets we live by. We: 1. eliminated bottled water; now use reusable stainless steel bottles & own water 2. use CFLs 3. grow many of our own veggies 4. shop local - farmers markets & locally grown meats & eggs 5. Barter for use of items we don't use often 6. Use freecycle/donate to charity to give others an opportunity to use items we no longer need 7. Hang our clothes outside on a clothesline to dry 8. Use reusable shopping bags everywhere we shop 9. Carpool to work with 2 others 10. Replaced our water heater; turned down the temp 11. Replaced our furnace; use a programmable thermostat 12. Replaced our home's windows 13. Planted multi-purpose landscape beds with ornamentals as well as small fruits (e.g, blueberries, strawberries, currants, rhubarb) and room for veggies 14. Use blinds to cut heat loss in winter/heat gain in summer 15. Turn off the lights, PC, TV, etc 16. Taught our kids to be environmentally conscious!
—Guest Green one
Global warming
Use public transport instead of using private vehicles.
—Guest Sandy
Getting Involved
Get involved in a community activity (one that educates others about reducing energy or doing Green Habitat For Humanity or Green the Ghetto-type projects) will not only get you out of the house and not using power while there, but you'll bring yourself closer to your community and create a sense of cooperative involvement; "we're all in this together" or "I am my brother's keeper". Perhaps most important, you'll feel better about yourself as will the people you help who in turn can be persuaded to serve their community in the same way.
—Guest Andy
Advocacy
Setting up an advocacy to educate and inform my community, protect and conserve the environment. People's attitudes have to change.
—Guest omang dave
zizi
I'll plant trees and encourage others to do so. It's so relaxing, enjoying fresh air under a big tree you planted. It's more refreshing than an air-conditioned house and you'll be saving the environment too.
—Guest zizi
Save Mother Earth
We all should make resolution on this new year to plant at least 5 plants, save electricity, walk instead of using vehicles--it will make us healthy too. Make others aware of Global Warming. We also should discuss this in our offices and work places so that more and more people will join the mission to save Mother Earth. It is our duty to save earth because in this way we will gift a safe, beautiful and clean place to live in to our next generation. It is very important and urgent issue.
—Guest Rahul
Opinion
The best change is to try and teach our next generation of kids to be environmentally friendly and energy efficient. We have to stop all forms of mainstream utilization of fossil fuels. And to most who believe their one small appliance doesn't do much damage, they are wrong. You are a contributing factor that is capable of producing several tons of carbon dioxide every year; how much you expel depends on your lifestyle. The problem is the Earth has reached its carrying capacity for the human population. We are slowly taking over and have already impacted EVERY piece of wildlife known to man, either directly or indirectly, through pollution, farming, logging, habitat fragmentation, construction, and hunting/poaching. We just all need to play our part to help the right people (who have the power to change things) understand the value and ecological importance of the factors nature contributes to us.
—Guest Freeman
global warming
I switch off all the lights whenever I am not at home.
—Guest taj
Terrace farming
When hills/mountains are not in use, a round, curved slope should be formed and plants should be grown there.
—Guest honey
reduce
We should use fewer electrical appliances, which can affect the atmosphere's layers.
—nur79
save electricity
I always switch off electric appliances that are not being used, e.g., not leaving the TV in stand-by mode.
—Guest kanika
Global Warmings
1. Started gardening 2. Bathing in cold water rather than using warm water 3. Switching off the PC and lights when not in use 4. Avoided using plastic bags and plastic materials 5. Replaced CFL bulbs with normal tube lights
—Guest Shanmuga Kumar
heat your house
you can use geo-thermal energy to heat your house rather than use oil, which releases harmful gases in the environment.
—Guest rinku
Tuesday, April 6, 2010
Arctic Sea Ice News and Analysis
Cold snap causes late-season growth spurt
Arctic sea ice reached its maximum extent for the year on March 31 at 15.25 million square kilometers (5.89 million square miles). This was the latest date for the maximum Arctic sea ice extent since the start of the satellite record in 1979.
Early in March, Arctic sea ice appeared to reach a maximum extent. However, after a short decline, the ice continued to grow. By the end of March, total extent approached 1979 to 2000 average levels for this time of year. The late-season growth was driven mainly by cold weather and winds from the north over the Bering and Barents Seas. Meanwhile, temperatures over the central Arctic Ocean remained above normal and the winter ice cover remained young and thin compared to earlier years.
Figure 1. Arctic sea ice extent for March 2010 was 15.10 million square kilometers (5.83 million square miles). The magenta line shows the 1979 to 2000 median extent for that month. The black cross indicates the geographic North Pole. Sea Ice Index data. About the data.
—Credit: National Snow and Ice Data Center
Overview of conditions
Arctic sea ice extent averaged for March 2010 was 15.10 million square kilometers (5.83 million square miles). This was 650,000 square kilometers (250,000 square miles) below the 1979 to 2000 average for March, but 670,000 square kilometers (260,000 square miles) above the record low for the month, which occurred in March 2006.
Ice extent was above normal in the Bering Sea and the Baltic Sea, but remained below normal over much of the Atlantic sector of the Arctic, including Baffin Bay and the seaboard of the Canadian Maritime Provinces. Extent in other regions was near average.
Figure 2. The graph above shows daily sea ice extent as of April 4, 2010. The solid light blue line indicates 2010; green shows 2007; dark blue indicates 1999, the year with the previous latest maximum extent, which occurred on March 29, 1999; and solid gray indicates average extent from 1979 to 2000. The gray area around the average line shows the two standard deviation range of the data. Sea Ice Index data.
—Credit: National Snow and Ice Data Center
Conditions in context
Sea ice reached its maximum extent for the year on March 31, the latest maximum date in the satellite record. The previous latest date was on March 29, 1999. The maximum extent was 15.25 million square kilometers (5.89 million square miles). This was 670,000 square kilometers (260,000 square miles) above the record low maximum extent, which occurred in 2006.
Sea ice extent seemed to reach a maximum during the early part of the month, but after a brief decline, ice extent increased slowly and steadily through the end of the month. By the end of the month, extent had approached the 1979 to 2000 average. During March 2010, ice extent grew at an average of 13,200 square kilometers (5,100 square miles) per day. Usually there is a net loss of ice through the month.
Figure 3. Monthly March ice extent for 1979 to 2010 shows a decline of 2.6% per decade.
—Credit: National Snow and Ice Data Center
March 2010 compared to past years
The average ice extent for March 2010 was 670,000 square kilometers (260,000 square miles) higher than the record low for March, observed in 2006. The linear rate of decline for March over the 1978 to 2010 period is 2.6% per decade.
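A trend like "2.6% per decade" is a least-squares slope expressed relative to a baseline average. The calculation can be sketched as follows; the extent series below is synthetic, chosen only to resemble the reported numbers, not the actual NSIDC record:

```python
# Sketch: a least-squares trend in March sea ice extent, expressed as
# percent per decade relative to a 1979-2000 baseline mean, mimicking
# how NSIDC-style trend figures are reported. The series is synthetic.
def trend_percent_per_decade(years, extents, ref_mean):
    n = len(years)
    mean_y = sum(years) / n
    mean_e = sum(extents) / n
    slope = sum((y - mean_y) * (e - mean_e) for y, e in zip(years, extents)) \
            / sum((y - mean_y) ** 2 for y in years)   # million km^2 per year
    return 100.0 * slope * 10.0 / ref_mean            # percent of baseline per decade

years = list(range(1979, 2011))                        # 1979..2010
extents = [16.0 - 0.0405 * (y - 1979) for y in years]  # million km^2, synthetic
baseline = sum(extents[:22]) / 22                      # 1979-2000 average
print(round(trend_percent_per_decade(years, extents, baseline), 1))  # -2.6
```

The slope is in million square kilometers per year; multiplying by ten and dividing by the baseline mean converts it into the percent-per-decade form used in the figures.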
Figure 4. The map of sea level pressure (in millibars) for March 2010 shows high pressure over the central Arctic (areas in yellow and orange) and areas of low pressure over the Bering and Barents seas (areas in blue and purple). The low pressure systems over the Bering and Barents seas have helped to push the ice edge southward.
—Credit: National Snow and Ice Data Center courtesy NOAA/ESRL Physical Sciences Division
Late-season growth spurt
The maximum Arctic sea ice extent may occur as early as mid-February to as late as the last week of March. As sea ice extent approaches the seasonal maximum, extent can vary quite a bit from day to day because the thin, new ice at the edge of the pack is sensitive to local wind and temperature patterns. This March, low atmospheric pressure systems persisted over the Gulf of Alaska and north of Scandinavia. These pressure patterns led to unusually cold conditions and persistent northerly winds in the Bering and Barents Seas, which pushed the ice edge southward in these two regions.
Figure 5. This map of air temperature anomalies for March 2010, at the 925 millibar level (roughly 1,000 meters or 3,000 feet above the surface), shows warmer than usual temperatures over most of the Arctic Ocean, but colder than usual temperatures in the Bering and Barents seas, where sea ice extent is above normal. Areas in orange and red correspond to positive (warm) anomalies. Areas in blue and purple correspond to negative (cool) anomalies.
—Credit: National Snow and Ice Data Center courtesy NOAA/ESRL Physical Sciences Division
Meanwhile, elsewhere in the Arctic
This winter's strong negative mode of the Arctic Oscillation was moderated through the month of March. Average air temperatures for the month nevertheless remained above average over the Arctic Ocean region. Overall for the winter, temperatures over most of the Arctic were above average, while northern Europe and Siberia were colder than usual.
Figure 6. These images show the change in ice age from fall 2009 to spring 2010. The negative Arctic Oscillation this winter slowed the export of older ice out of the Arctic. As a result, the percentage of ice older than two years was greater at the end of March 2010 than over the past few years.
—Credit: National Snow and Ice Data Center courtesy J. Maslanik and C. Fowler, CU Boulder
Ice age and thickness
The late date of the maximum extent, though of special interest this year, is unlikely to have an impact on summer ice extent. The ice that formed late in the season is thin, and will melt quickly when temperatures rise.
Scientists often use ice age data as a way to infer ice thickness—one of the most important factors influencing end-of-summer ice extent. Although the Arctic has much less thick, multiyear ice than it did during the 1980s and 1990s, this winter has seen some replenishment: the Arctic lost less ice the past two summers compared to 2007, and the strong negative Arctic Oscillation this winter kept more ice from moving out of the Arctic. The larger amount of multiyear ice could help more ice to survive the summer melt season. However, this replenishment consists primarily of younger, two- to three-year-old multiyear ice; the oldest and thickest multiyear ice has continued to decline. Although thickness plays an important role in ice melt, summer ice conditions will also depend strongly on weather patterns through the melt season.
At the moment there are no Arctic-wide satellite measurements of ice thickness, because of the end of the NASA Ice, Cloud, and Land Elevation Satellite (ICESat) mission last October. NASA has mounted an airborne sensor campaign called IceBridge to fill this observational gap.
Sunday, February 14, 2010
Study links mother's age to child's risk of autism
Women who give birth after age 40 are nearly twice as likely to have a child with autism as those under 25, but it is unlikely that delayed parenthood plays a big role in the current autism epidemic, California researchers reported Monday.
The findings were expected to draw widespread attention because of the intense public interest in autism, but their true impact was expected to be simply in suggesting further avenues of research.
Surprisingly, the age of the father plays little role unless the mother is younger than 30 and the father is over 40, according to the analysis of all births in California in the 1990s.
The number of women over age 40 in California giving birth increased by 300% in the 1990s, while the diagnosis of autism increased by 600%. At first glance, it might seem that the rise in older pregnancies could be responsible for the rise in autism, which is now thought to affect as many as one child in every 100. But the authors, from UC Davis, calculate that older mothers account for less than 5% of the increase in autism diagnoses.
"There is a long history of blaming parents" for the development of autism, said senior author Dr. Irva Hertz-Picciotto, a professor of public health sciences and a researcher at the UC Davis MIND Institute who has been studying potential causes for the autism increase. "We're not saying this is the fault of mothers or fathers. We're just saying this is a correlation that will direct research in the future."
Researchers have long known that the age of the parents plays a role in a child's risk of developing autism, but how big a role and how that role varies with the sex of the parent has remained confusing, with contradictory results reported in different studies.
To investigate, Hertz-Picciotto, graduate student Janie E. Shelton and epidemiologist Daniel J. Tancredi of UC Davis analyzed all the singleton births in California during the 1990s for which information was available about the ages of both parents, a total of about 4.9 million births and 12,159 cases of autism.
Because of the large sample size, they were able to show how the risk was affected by each parent's age. They reported in the February issue of the journal Autism Research that women over 40 were 77% more likely to deliver an autistic child than those younger than 25 and 51% more likely than those age 25 to 29, independent of the age of the father.
For men over 40, there was a 59% increased risk of autism if the mother was younger than 30, but virtually no increased risk if the mother was over 30.
The researchers also calculated that the recent trend toward delayed childbearing contributed about a 4.6% increase in autism diagnoses over the decade.
"Five percent is probably indicating that there is something besides maternal age going on because we are seeing a rise in every age group of parents," Shelton said.
Also, noted Hertz-Picciotto, older women may be followed more closely during pregnancy, which would mean more ultrasounds -- which a few researchers have suggested might play a role in autism. Older women are more likely to suffer gestational diabetes and to develop autoimmune disorders, both of which have been linked to an increased risk of autism.
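The attribution the researchers describe—how much of the rise is explained purely by a shift toward older mothers—can be sketched by holding age-specific risk fixed and changing only the age distribution of births. All age groups, risk weights, and birth counts below are hypothetical illustrations, not values from the study:

```python
# Hypothetical attribution sketch: hold age-specific autism risk fixed
# and ask how much expected diagnoses rise purely because the maternal
# age distribution shifts older. All numbers are illustrative.
def expected_cases(births_by_age, risk_by_age):
    return sum(births_by_age[age] * risk_by_age[age] for age in births_by_age)

# Relative risk weights by maternal age (arbitrary units; higher = riskier)
risk = {"<25": 20.0, "25-29": 23.0, "30-39": 27.0, "40+": 35.4}

# Two birth distributions with the same total, the later one skewed older
births_early = {"<25": 400_000, "25-29": 350_000, "30-39": 230_000, "40+": 20_000}
births_late = {"<25": 350_000, "25-29": 330_000, "30-39": 260_000, "40+": 60_000}

base = expected_cases(births_early, risk)
shifted = expected_cases(births_late, risk)
print(f"rise attributable to the age shift alone: {100 * (shifted / base - 1):.1f}%")
```

Because the risk weights are held constant, any increase in expected cases comes only from the distribution shift—which is why a large rise in diagnoses across every age group points to factors beyond maternal age.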
"We still have a real long way to go" in determining the causes of autism, she concluded.
FDA addresses radiation safety
The Food and Drug Administration has decided to impose new safety controls on medical imaging devices and encourage development of more precise dosing standards in a bid to reduce unnecessary exposure of patients to diagnostic radiation.
The agency also will promote a personal medical imaging history card that will enable patients to keep track of the number of images, and the amount of radiation, they receive over time, according to a medical imaging safety initiative unveiled Tuesday.
The safety push comes months after Cedars-Sinai Medical Center in Los Angeles discovered that it had accidentally exposed more than 260 patients to eight times the normal dose of radiation for CT brain scans over a period of 18 months.
Two other Los Angeles County healthcare facilities -- Providence St. Joseph Hospital in Burbank and Glendale Adventist Medical Center -- and one hospital in Huntsville, Ala., reported possible overdoses by imaging equipment to at least 104 people.
"We're aware that the exposure of the American public to [diagnostic] radiation was increasing fairly dramatically over the past 20 years," said Jeffrey Shuren, director of the FDA's Center for Devices and Radiological Health. "These tests can provide tremendous medical benefit. We're trying to optimize that benefit while lowering the risk."
The FDA will hold a public meeting March 30 and 31 to collect suggestions about new safety features and training that should be required for CT and fluoroscopic devices, Shuren said.
Equipment might be automatically calibrated to a recommended dose for a given procedure, so that any dosage increase would require an action by the operator. Equipment might also be designed to require identification of the operator as a way of tracking errors.
Shuren said regulators also are looking at software upgrades and other fixes to existing equipment.
In addition, the FDA is encouraging the development of a voluntary national database to determine the optimal dosages for a given procedure, fine-tuned to variables such as age and body type. Such a database also would allow individual practitioners to measure their use of radiation against that of their peers, Shuren said.
The FDA effort will build on data gathering already underway by the American College of Radiology and the National Council for Radiation Protection.
Some medical radiation experts questioned whether the FDA went far enough.
Requiring scanner manufacturers to add safeguards to their machines would have prevented the CT overdoses at Cedars-Sinai Medical Center, said David Brenner, director of the Center for Radiological Research at Columbia University Medical Center. But "it doesn't address what I see as the central issue: too many CT scans being done without medical justification," he said.
The number of CT scans performed in the U.S. each year has climbed to more than 70 million, more than triple the number in 1995.
The trend is driven by a variety of factors, including economic incentives for doctors and hospitals to order tests, and doctors' fear of being sued if they miss a problem.
CT has become standard procedure in emergency rooms for diagnosing head injuries, kidney stones and appendicitis. Scans are often repeated as part of routine follow-up or if a patient is transferred.
Shuren said the FDA was not directly addressing the question of medical justification because "it's really outside FDA's scope." But Shuren pointed out that putting imaging history cards in the hands of patients will call attention to previous exposures to diagnostic radiation.
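The imaging-history-card idea is essentially a per-patient log of procedures and doses. A minimal sketch of such a record follows; the procedure names and dose values are hypothetical placeholders, not clinical reference doses:

```python
# Minimal sketch of a patient-side imaging history card: a log of
# procedures with effective doses, so cumulative exposure can be
# tallied. Dose values here are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class ImagingRecord:
    date: str
    procedure: str
    dose_msv: float  # effective dose in millisieverts

@dataclass
class ImagingHistoryCard:
    patient: str
    records: list = field(default_factory=list)

    def add(self, date, procedure, dose_msv):
        self.records.append(ImagingRecord(date, procedure, dose_msv))

    def total_dose(self):
        return sum(r.dose_msv for r in self.records)

card = ImagingHistoryCard("example patient")
card.add("2010-01-05", "chest x-ray", 0.1)
card.add("2010-03-12", "head CT", 2.0)
print(card.total_dose())  # 2.1
```

A running total like this is what would let a patient or physician notice when repeated scans are accumulating significant exposure.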
Snowstorm and climate change
As record snowfall buried the nation's capital this week, the quickest joke around town was, "So much for global warming."
The quip was timely, given the recent controversies over Climategate -- the release of e-mails allegedly showing some leading climate scientists trying to suppress criticism -- and new questions about the integrity of the United Nations' Intergovernmental Panel on Climate Change.
After 55-plus inches of snow fell in the Washington area, critics are delighting in the irony, and those who warn of climate change are taking pains to say the snow fits the pattern of a warming world.
So who's right? If the earth is warming, why all the snow?
Snow and global warming aren't mutually exclusive, climate scientists say. For starters, the amount of recorded warming over the last century, about 1 degree Fahrenheit above preindustrial levels, is nowhere near enough to eradicate winter in the mid-Atlantic.
Also, weather is variable: The planet would have extreme highs and lows with or without an overall warming trend.
And for all the recent snow in Washington, it hasn't been that cold -- mostly in the 20s or low 30s. The average temperature in Washington in January, according to the National Climatic Data Center, was about a degree warmer than the average for the last 40 years.
But the reverse is also true: The fact that Vancouver, Canada, is experiencing record-high temperatures and importing snow for the Winter Olympics doesn't prove a warming trend.
Are the snowstorm and climate science completely unrelated, then?
Not necessarily. Increased snowfall fits a pattern suggested by many climate models, in which rising temperatures warm the world's bodies of water, leading to more evaporation.
Climate scientists say the amount of atmospheric moisture has increased, which they predict will bring more rain in warmer conditions and more snow in freezing temperatures.
"All you need is cold air and moisture to meet each other" to make snow, said Jay Gulledge, senior scientist for the Pew Center on Global Climate Change. "And with global warming, the opportunities to do that should be more frequent."
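The link between warming and atmospheric moisture can be illustrated with a standard Magnus-type approximation for saturation vapor pressure, which implies roughly 7% more moisture-holding capacity per degree Celsius near typical surface temperatures. The coefficients below are the commonly used Bolton values:

```python
import math

# Magnus/Bolton approximation for saturation vapor pressure over water:
# how much more moisture can air hold per degree of warming?
def saturation_vapor_pressure_hpa(t_celsius):
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

e0 = saturation_vapor_pressure_hpa(10.0)
e1 = saturation_vapor_pressure_hpa(11.0)
print(f"{100 * (e1 / e0 - 1):.1f}% more moisture capacity per 1 C warming")
```

Because the relationship is exponential, even modest warming raises the moisture available for precipitation—rain where it is warm, heavy snow where it is still below freezing.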
How will the snow affect the politics of the climate bill?
Probably not much, because proponents are pitching the bill as a boost to national security and a creator of clean-energy jobs, as opposed to a curb on global warming. The swing voters who will dictate the bill's fate are senators who more or less say they accept the science behind climate change.
On the other hand, the snow has given some novel arguments to opponents of the "cap and trade" system of limiting emissions, which is the heart of most climate proposals. Critics warn that such a system, in which emitters must obtain permits to cover the carbon dioxide and other greenhouse gases they release, would send electricity prices soaring -- particularly during extreme weather events.
William O'Keefe, a cap-and-trade critic and chief executive of the nonprofit George C. Marshall Institute, observed: "As Washington works to clear its streets from this week's 'snowpocalypse,' policymakers should work to clear a new path on climate policy."
William O'Keefe, a cap-and-trade critic and chief executive of the nonprofit George C. Marshall Institute, observed: "As Washington works to clear its streets from this week's 'snowpocalypse,' policymakers should work to clear a new path on climate policy."
Obama nuke plant loan reflects new energy strategy
The Obama administration's planned loan guarantee to build the first nuclear power plant in the U.S. in almost three decades is part of a broad shift in energy strategy to lessen dependence on foreign oil and reduce the use of other fossil fuels blamed for global warming.
President Barack Obama called for "a new generation of safe, clean nuclear power plants" in his Jan. 27 State of the Union speech and followed that by proposing to triple loan guarantees for new nuclear plants. He wants to use nuclear power and other alternative sources of energy in his effort to shift energy policy.
Obama in the coming week will announce the loan guarantee to build the nuclear power plant, an administration official said Friday. The two new Southern Co. reactors to be built in Burke, Ga., are part of a White House energy plan that administration officials hope will draw Republican support.
Loan guarantees for other sites are expected to be announced in the coming months, the official said, who would speak only on condition of anonymity ahead of Obama's announcement. The federal guarantees are seen as essential for construction of any new reactor because of the expense involved. Critics call the guarantees a form of subsidy and say taxpayers will assume a huge risk, given the industry's record of cost overruns and loan defaults.
"The last thing Americans want is another government bailout for a failing industry, but that's exactly what they're getting from the Obama administration," said Ben Schreiber, an analyst for the environmental group Friends of the Earth. "This is great news for Wall Street but a bad deal for Main Street."
Even with next week's announcement, construction of the first reactor is still years away. The Southern Co. has applied to the Nuclear Regulatory Commission for a construction and operating license for the plant, one of 13 such applications the agency is considering. NRC spokesman Eliot Brenner said the earliest any of those could be approved would be late 2011 or early 2012.
The Southern Co. has begun site preparation in Burke but cannot begin construction without NRC approval.
Obama's budget for the coming year would add $36 billion in new federal loan guarantees on top of $18.5 billion already budgeted — but not spent — for a total of $54.5 billion. That's enough to help build six or seven new nuclear plants, which can cost $8 billion to $10 billion each.
The proposed new reactors would generate power for some 1.4 million people and employ about 850 people, the administration official said, adding that the Georgia project would create about 3,000 construction jobs.
Spiraling costs, safety concerns and opposition from environmentalists have kept utilities from building any new nuclear power plants in the U.S. since the early 1980s. The 104 nuclear reactors now in operation in 31 states provide about 20 percent of the nation's electricity. But they are responsible for 70 percent of the power from pollution-free sources, including wind, solar and hydroelectric dams that Obama has championed as a way to save the environment and economy at the same time.
Environmentalists and fiscal hawks oppose new nuclear plants and note that they come at the same time Obama has proposed eliminating a long-planned nuclear waste dump at Yucca Mountain in Nevada. Obama has appointed a commission to find a safe solution for dealing with nuclear waste, but in the meantime the government has no long-term plan to store commercial radioactive waste.
Republicans like South Carolina Sen. Lindsey Graham welcome the shift, but some pro-nuclear Republicans remain nervous about the heart of the Obama-backed climate bill — a plan to limit heat-trapping pollution, which would raise energy costs.
Thursday, February 11, 2010
Somali pirates hold science to ransom
SOMALI pirates terrorising the Indian Ocean are a hazard to more than shipping and tourists. They are also killing important scientific research and may be indirectly damaging the ocean's ecosystem.
Fishing boats in the Indian Ocean routinely carry scientists who gather data about fish stocks and threatened species while ensuring that boats comply with fishing rules. The piracy threat has put a stop to that. "We can't monitor and we can't do experiments because of the pirates," says Laurent Dagorn of France's Research Institute for Development (IRD).
Boats now carry guards and no longer have room for scientists, who have had to confine their own research vessels to port. IRD has cancelled all of its cruises in the last nine months.
By eliminating scientific observers, piracy may be indirectly increasing by-catch. It could also be encouraging the use of damaging fishing methods like fish-attracting devices (FADs) -- bamboo rafts held together with netting that are left at sea for days or weeks. Fish such as tuna congregate under FADs, making them easier to catch, but FADs also snag and kill turtles and sharks. Michel Goujon, director of the French tuna-boat owners' association, Orthongel, has evidence that their use is on the rise.
Regional governments accept the need to resume research. But, "I don't see any sign that piracy is going to decrease", says Goujon. "In fact, every time a ransom is paid it's an incentive for new attacks."