Category: Seas / Oceans


WHALES AHOY


by Staff Writers
Tokyo (AFP) April 18, 2014

Japan said Friday it would redesign its controversial Antarctic whaling mission in a bid to make it more scientific, after a United Nations court ruled it was a commercial hunt masquerading as research.

The bullish response, which could see harpoon ships back in the Southern Ocean next year, sets Tokyo back on a collision course with environmentalists.

Campaigners had hailed the decision by the International Court of Justice, with hopes that it might herald the end of a practice they view as barbaric.

“We will carry out extensive studies in cooperation with ministries concerned to submit a new research programme by this autumn to the International Whaling Commission (IWC), reflecting the criteria laid out in the verdict,” said Yoshimasa Hayashi, minister of agriculture, forestry and fisheries.

Japan, a member of the IWC, has hunted whales under a loophole allowing for lethal research. It has always maintained that it was intending to prove the whale population was large enough to sustain commercial hunting.

But it never hid the fact that the by-product of whale meat made its way onto menus.

“The verdict confirmed that the (IWC moratorium) is partly aimed at sustainable use of whale resources.

“Following this, our country will firmly maintain its basic policy of conducting whaling for research, on the basis of international law and scientific foundations, to collect scientific data necessary for the regulation of whale resources, and aim for resumption of commercial whaling.”

Hayashi, who had met with Japanese Prime Minister Shinzo Abe earlier in the day, confirmed a previous announcement that the 2014-15 hunt in the Southern Ocean would not go ahead.

Last month’s court ruling does not apply to Japan’s two other whaling programmes: a “research” hunt in the northwestern Pacific, and a much smaller programme that operates along the coast, which is not subject to the international ban.

 

Read More Here

 

…..

SBS News

Hundreds of Japanese officials and pro-whaling lobbyists have eaten whale in defiance of an international court ruling that ordered the country to stop its Antarctic whaling program.
By SBS with AAP
UPDATED 2:05 PM – 16 Apr 2014

The 26th whale meat tasting event in Tokyo was hosted near the nation’s parliament and was attended by lawmakers, officials and pro-whaling lobbyists.

Agriculture, Forestry and Fisheries Minister Yoshimasa Hayashi told attendees that the country must protect its whale-eating culture.

“[Japan] has a policy of harvesting and sustainably using the protein source from the ocean, and that is unshakable,” Associated Press quoted Mr Hayashi as saying.

Meanwhile, a lower house MP criticised the arguments against whaling as emotional and not based on reason.

“Japan’s whaling is based on scientific reasons, while counterarguments by anti-whaling groups are emotional, saying they are against the hunts because whales are cute or smart,” the Japan Times reported Shunichi Suzuki of the ruling Liberal Democratic Party as saying.

 

Read More Here

 

…..

Japan ‘will continue whaling in Pacific’

Updated: 15:21, Friday April 18, 2014


Japan has decided to continue its whaling program in the Pacific Ocean, reports say, despite losing a United Nations court case on its other “research” hunt in the Antarctic.

If confirmed, the move will likely spark anger among environmentalists who hailed a ruling in March by the UN’s International Court of Justice (ICJ) that Tokyo’s hunt in the Southern Ocean was a commercial activity disguised as science.

Japan has exploited a loophole in a 1986 moratorium that allowed it to conduct lethal research on the mammals, but has openly admitted their meat makes its way onto dinner tables.

Campaigners urged Tokyo to follow the spirit of the ruling, and not just its letter, which specifically referred to Japan’s hunt in the Antarctic, not its other research scheme in the northwest Pacific or its smaller coastal program.

But after the ICJ verdict, a government review has said the Pacific hunt should press ahead, public broadcaster NHK and Kyodo News Agency reported on Friday.

The review suggests the Pacific mission should reduce its catch and focus more on carrying out research that does not involve catching whales.

A spokesman for the fisheries agency said he was unable to comment.

 

Read More Here

…..


LiveScience

Unusual Bacteria Gobbles Up Carbon in the Ocean


 

Underneath the floating debris in the Pacific Ocean.

Credit: NOAA – Marine Debris Program.

 

The finding may help researchers better understand how carbon cycling works in marine ecosystems.

“We found that an individual bacterial strain was capable of consuming the same amount of carbon in the ocean as diverse [bacterial] communities,” said study author Byron E. Pedler at the University of California, San Diego.

The researchers found the results surprising because of the immense diversity of molecules that constitute dissolved carbon in one form or another in the ocean, Pedler told Live Science.

Those molecules include both “young” carbon recently produced by phytoplankton, the tiny organisms that form the foundation of the marine food web, and much older carbon that is hundreds of years old. Some of this carbon consists of carbohydrates, but a significant portion of it “is simply uncharacterizable, in that even modern chemical techniques cannot determine what it is,” Pedler said.

 

Read More Here


 


A worker peels the spine from a tuna at New York’s Fulton Fish Market—the world’s largest after the Tsukiji Market in Tokyo, Japan—on March 29, 2013.

PHOTOGRAPH BY JOHN MINCHILLO, AP

Brian Clark Howard

Published April 9, 2014

Do you know if the fish on your plate is legal? A new study estimates that 20 to 32 percent of wild-caught seafood imported into the U.S. comes from illegal or “pirate” fishing. That’s a problem, scientists say, because it erodes the ability of governments to limit overfishing and the ability of consumers to know where their food comes from.

The estimated illegal catch is valued at $1.3 billion to $2.1 billion annually and represents between 15 and 26 percent of the total value of wild-caught seafood imported into the U.S., report scientists in a new study in the journal Marine Policy.

Study co-author Tony Pitcher says those results surprised his team. “We didn’t think it would be as big as that. To think that one in three fish you eat in the U.S. could be illegal, that’s a bit scary,” says Pitcher, who is a professor at the fisheries center of the University of British Columbia in Vancouver.

To get those numbers, Pitcher and three other scientists analyzed data on seafood imported into the U.S. in 2011. They combed through government and academic reports, conducted fieldwork, and interviewed stakeholders.

The scientists report that tuna from Thailand had the highest volume of illegal products, 32,000 to 50,000 metric tons, representing 25 to 40 percent of tuna imports from that country. That was followed by pollock from China, salmon from China, and tuna from the Philippines, Vietnam, and Indonesia. Other high volumes were seen with octopus from India, snappers from Indonesia, crabs from Indonesia, and shrimp from Mexico, Indonesia, and Ecuador.

Imports from Canada all had levels of illegal catches below 10 percent. So did imports of clams from Vietnam and toothfish from Chile.

Graphic showing percent of seafood imported into the U.S. that is illegal and unreported.

NG STAFF. SOURCE: P. GANAPATHIRAJU, ET AL., MARINE POLICY

In response to the study, Connie Barclay, a spokesperson for the National Oceanic and Atmospheric Administration (NOAA) Fisheries, said, “We agree that [pirate] fishing is a global problem, but we do not agree with the statistics that are being highlighted in the report.” Barclay says data are too scarce to make the conclusions verifiable.

But, she adds, “NOAA is working to stop [pirate] fishing and the import of these products into the U.S. market.” She points to recent increased collaboration with other law enforcement agencies and improved electronic tracking of trade data.

Pirate Fishing

The U.S. is important to consider when it comes to fishing because it is tied with Japan as the largest single importer of seafood, with each nation responsible for about 13 to 14 percent of the global total, says Pitcher. Americans spent $85.9 billion on seafood in 2011, with about $57.7 billion of that spent at restaurants, $27.6 billion at retail, and $625 million on industrial fish products.

However, what few Americans realize, says Pitcher, is that roughly 90 percent of all seafood consumed in the United States is imported, and about half of that is wild caught, according to NOAA.

Pirate fishing is fishing that is unreported to authorities or done in ways that circumvent fishery quotas and laws. In their paper, the authors write that pirate fishing “distorts competition, harms honest fishermen, weakens coastal communities, promotes tax evasion, and is frequently associated with transnational crime such as narcotraffic and slavery at sea.” (See: “West Africans Fight Pirate Fishing With Cell Phones.”)

Scientists estimate that between 13 and 31 percent of all seafood catches around the world are illegal, worth $10 billion to $23.5 billion per year. That illegal activity puts additional stress on the world’s fish stocks, 85 percent of which are already fished to their biological limit or beyond, says Tony Long, the U.K.-based director of the Pew Charitable Trust’s Ending Illegal Fishing Project.

“The ocean is vast, so it is very difficult for countries to control what goes on out there,” says Long. He explains that pirate fishers are often crafty, going to remote areas where enforcement is lax. They may leave a port with a certain name on the boat and the flag of a particular country, engage in illegal fishing, then switch the name and flag and unload their catch at a different port.

 

Read More Here

…..

 

The oceans are vast and humans are small — as the monthlong hunt for a vanished Malaysian jetliner demonstrates. Think of the challenge, then, for law enforcement and fisheries managers in going after fleets of shady boats that engage in illegal, unreported and unregulated fishing. These criminals ply the seas and sell their catches with impunity, making off with an estimated 11 million to 26 million metric tons of stolen fish each year, a worldwide haul worth about $10 billion to $23.5 billion. Many use banned gear like floating gillnets, miles long, that indiscriminately slaughter countless unwanted fish along with seabirds, marine mammals, turtles and other creatures.

The danger that illegal fishing poses to vulnerable ocean ecosystems is self-evident, but the harm goes beyond that. Illegal competition hurts legitimate commercial fleets. And lawless fishermen are prone to other crimes, like forced labor and drug smuggling. The convergence of illegal fishing with other criminal enterprises makes it in every country’s interest to devise an effective response.

That’s the job of the Port State Measures Agreement. It is a treaty adopted by the United Nations in 2009 that seeks to thwart the poachers in ports when they try to unload their ill-gotten catches. Many countries have been unable or unwilling to enforce their own laws to crack down on poachers flying their flags.

 

Read More Here

…..

 


 

Oceana Report Sheds Light On Staggering By-Catch Problem In U.S. Fisheries

 

Posted: 03/20/2014 5:37 pm EDT Updated: 03/20/2014 5:59 pm EDT

That fish dish at your favorite neighborhood bistro may be hiding a gruesome secret.

“When you buy fish at a grocery store or restaurant, you might also be getting a side order of sea turtle or dolphin to go with it,” said Dominique Cano-Stocco, Oceana‘s campaign director of responsible fishing, referring to the large number of dead sea creatures tossed by fishermen each year.

According to a new Oceana report, United States fisheries discard about 17 percent to 22 percent of everything they catch every year. That amounts to a whopping 2 billion pounds of annual by-catch — injured and dead fish and other marine animals unintentionally caught by fishermen and then thrown overboard. This includes endangered creatures like whales and sharks, as well as commercially viable fish that may have been too young or too damaged to bring to port.

“By-catch is one of the biggest challenges facing the U.S. today,” Cano-Stocco said. “It’s one of the largest threats to the proper management of our fisheries and to the health of our oceans and marine ecosystems.” Due to underreporting, by-catch numbers are probably an underestimate, she explained.

Released Friday, Oceana’s report strives to highlight the need to document by-catch numbers and develop better management strategies to prevent the high level of unnecessary slaughter in our oceans.


Bull shark trapped in fishing net

 

The report identifies nine of the worst by-catch fisheries in the nation. These fisheries — defined as groups of fishermen that target a certain kind of fish using a particular kind of fishing gear in a specific region — are reportedly responsible for more than half of all domestic by-catch; however, they’re only responsible for about 7 percent of the fish brought to land, the report notes.

Some of these fisheries reportedly discard more fish than they keep; others are said to throw out large amounts of the very fish species they aim to catch. California fishermen who use drift gillnets (walls of netting that drift in the water) to capture swordfish, for example, reportedly throw out about 63 percent of their total catch.

Between 2008 and 2012, about 39,000 common molas and 6,000 sharks, as well as hundreds of seals, sea lions and dolphins, were seriously injured or killed in the California drift gillnet fishery, Oceana notes.


Read More Here

 


Oil firm agrees to abide by EPA monitoring arrangements for five years, allowing it to bid for drilling contracts in Gulf of Mexico

BP is still awaiting a US court ruling about whether it was grossly negligent over the Deepwater Horizon blowout in 2010. Photograph: AFP/Getty Images

BP is closer to restoring its operations and reputation in the US after agreeing a deal with environmental protection authorities that will enable the oil firm to bid for new drilling rights in the Gulf of Mexico.

The British-based group had started legal proceedings against the US environmental protection agency (EPA) which had banned BP from new contracts on the grounds that it had failed to correct problems properly since the Deepwater Horizon disaster in 2010.

BP said it had now dropped its lawsuit after resolving outstanding problems with the EPA, but the firm will have to abide by monitoring arrangements with the agency for the next five years.

“After a lengthy negotiation, BP is pleased to have reached this resolution, which we believe to be fair and reasonable,” said John Mingé, head of BP America. “Today’s agreement will allow America’s largest energy investor to compete again for federal contracts and leases.”

Read More Here

 

…..

March 18, 2014

Huffpost Business

Government Declares BP a ‘Responsible’ Contractor: Workers and Taxpayers Beware

A scant five days before the Department of Interior opens a new round of bids for oil leases in the Gulf of Mexico, the EPA has blinked, pronouncing BP, the incorrigible corporate scofflaw of the new millennium, once again fit to do business with the government.

To get right to the point, the federal government’s decision that BP has somehow paid its debt and should once again be eligible for federal contracts is a disgrace. Not only does it let BP off the hook, it sends an unmistakable signal to the rest of the energy industry: that no matter how much harm you do, no matter how horrid your safety record, the feds will cut you some slack.

Back in 2012, the agency’s intrepid staff had finally gotten permission to pull the trigger on the company, debarring it from holding any new U.S. contracts on the grounds that it was not running its business in a “responsible” way. Undoubtedly under pressure from the Cameron government and the U.S. Defense Logistics Agency, BP’s most loyal customer, the EPA settled its debarment suit for a sweet little consent decree that will try to improve the company’s sense of ethics by having “independent” auditors come visit once a year.

To review the grim record: BP, now the third-largest energy company in the world, is the first among the roster of companies that have caused the most memorable industrial fiascos in the post-modern age.

  • Its best-known disaster, the explosion aboard the Deepwater Horizon, a drilling rig moored in the Gulf of Mexico that BP had hired to develop its lease of the Macondo well, killed 11 and deposited 205 million gallons of crude oil along the southern coast of the United States — the worst environmental disaster in American history.
  • In a troubling precursor, another explosion killed 15 and injured 180 at the company’s Texas City refinery in July 2005. This incident happened even after the plant manager there had gone on bended knee to John Manzoni, BP’s second in command worldwide, to plead for money to address severe maintenance problems that jeopardized safety at that plant after a consultant surveying refinery workers reported that many thought they ran a real risk of being killed at work. Those fears were warranted, it turned out.
  • Also in 2005, 200,000 gallons of oil spilled from a BP pipeline on Alaska’s North Slope.

 

Read More Here

 

…..


missingsky102

 

Published on Mar 7, 2014

Fracking fluids dumped into the ocean
Environmentalists are trying to convince the EPA to ban the dumping of fracking fluids in federal waters off the California coast. The Center for Biological Diversity claims that at least a dozen offshore rigs in Southern California are dumping wastewater right into the Pacific. RT’s Ramon Galindo has the story.
Find RT America in your area: http://rt.com/where-to-watch/
Or watch us online: http://rt.com/on-air/rt-america-air/

RT’s Ramon Galindo talks about a recent legal petition by environmental groups in California calling for the Federal government to force an end to the practice of offshore fracking, and the dumping of hundreds of millions of gallons of fracking waste in the ocean every year.

Abby Martin calls out Exxon CEO Rex Tillerson for his blatant hypocrisy after he filed a lawsuit against a fracking water tower being built near his property.


 

Acidic ocean deadly for Vancouver Island scallop industry

 

CBC News

Posted: Feb 25, 2014 8:58 PM PT Last Updated: Feb 26, 2014 7:04 PM PT


High acidity levels in B.C.’s oceans mean millions of the shellfish die before they reach full maturity.

The deteriorating health of B.C.’s oceans is impacting not only the province’s marine life, but also its economy.

 

Millions of shellfish are dying off before they can be harvested at Island Scallops, near Parksville, B.C., due to increased acidity levels in the ocean.

 

One-third of the workforce at Island Scallops — 20 people — is being laid off because, since 2009, the business has lost more than 10 million scallops before they were able to reach maturity.

 

“It’s obviously kicked our feet out from underneath us,” said CEO Rob Saunders.

 


Island Scallops, near Parksville, B.C., is laying off 20 employees because high acidity in the oceans has meant the loss of millions of scallops.

 

He said low pH levels in the water appear to be the root of the problem.

 

Read More Here

 

…..

 

“Acidic Waters Kill 10 Million Scallops Off Vancouver”

 
By Kiley Kroh on February 26, 2014 at 11:16 am
 

 


A worker harvests oysters for Taylor Shellfish in Washington, another company grappling with the effects of ocean acidification.

CREDIT: AP Photo/Ted S. Warren, File

 

A mass die-off of scallops near Qualicum Beach on Vancouver Island is being linked to the increasingly acidic waters that are threatening marine life and aquatic industries along the West Coast.

 

Rob Saunders, CEO of Island Scallops, estimates his company has lost three years’ worth of scallops and $10 million — forcing him to lay off approximately one-third of his staff.

 

“I’m not sure we are going to stay alive and I’m not sure the oyster industry is going to stay alive,” Saunders told The Parksville Qualicum Beach NEWS. “It’s that dramatic.”

Ocean acidification, often referred to as global warming’s “evil twin,” threatens to upend the delicate balance of marine life across the globe.

 

Read More Here

 

…..

 

Struggling shellfish farmers eye genomic research

Industry looks for answers to cope with rising carbon dioxide levels, increased acidity

 
 
 
 

High acidity is being blamed for a mass die-off of B.C. scallops.

Shellfish farmers are appealing to the federal and provincial governments to support genomic research in an effort to identify oysters, mussels and scallops suited to withstand the west coast’s rapidly changing marine environment.

Oyster and scallop farmers from Oregon right up the coast of British Columbia are experiencing massive die-offs of animals associated with rising carbon dioxide levels and increasing acidity in local waters.

“We’ve been aware of these problems for quite a while and we just have to learn to operate our farms under new parameters,” said Roberta Stevenson, executive director of the B.C. Shellfish Growers Association. “Genomics offers us an opportunity to develop an animal that is more capable of adapting to this new pH level.”

Shellfish farms employ about 1,000 people in mostly rural parts of the coast and generate about $33 million in sales each year, Stevenson said.

 

Read More Here

 

…..


Geoengineering side effects could be potentially disastrous, research shows

Comparison of five proposed methods shows they are ineffective, alter weather systems or could not be safely stopped
Geoengineering techniques need more study, says science coalition

Geoengineering the planet’s climate: even when applied on a massive scale, the most that could be expected is a temperature drop of about 8%, new research shows. Photograph: Nasa/REUTERS

Large-scale human engineering of the Earth’s climate to prevent catastrophic global warming would not only be ineffective but would have severe unintended side effects and could not be safely stopped, a comparison of five proposed methods has concluded.

Science academies around the world as well as some climate activists have called for more research into geoengineering techniques, such as reflecting sunlight from space, adding vast quantities of lime or iron filings to the oceans, pumping deep cold nutrient-rich waters to the surface of oceans and irrigating vast areas of the north African and Australian deserts to grow millions of trees. Each method has been shown to potentially reduce temperature on a planetary scale.

But researchers at the Helmholtz Centre for Ocean Research Kiel, Germany, modelled these five potential methods and concluded that geoengineering could add chaos to complex and not fully understood weather systems. Even when applied on a massive scale, the most that could be expected, they say, is a temperature drop of about 8%.

The side effects would be potentially disastrous, say the scientists, writing in Nature Communications. Ocean upwelling, or the bringing up of deep cold waters, would cool surface water temperatures and reduce sea ice melting, but would unbalance the global heat budget, while adding iron filings or lime would affect the oxygen levels in the oceans. Reflecting the sun’s rays into space would alter rainfall patterns, and reforesting the deserts could change wind patterns and could even reduce tree growth in other regions.

In addition, say the scientists, two of the five methods considered could not be safely stopped. “We find that, if solar radiation management or ocean upwelling is discontinued then rapid warming occurs. If the other methods are discontinued, less dramatic changes occur. Essentially all of the CO2 that was taken up remains in the ocean.”

 

Read More Here


 

LiveScience

 

 

 

A diagram of the geoengineering projects people have proposed to combat climate change. The laws surrounding such projects are still uncertain.
Credit: Diagram by Kathleen Smith/LLNL

 

Current schemes to minimize the havoc caused by global warming by purposefully manipulating Earth’s climate are likely to either be relatively useless or actually make things worse, researchers say in a new study.

 

The dramatic increase in carbon dioxide levels in the atmosphere since the Industrial Revolution is expected to cause rising global sea levels, more-extreme weather and other disruptions to regional and local climates. Carbon dioxide is a greenhouse gas that traps heat, so as levels of the gas rise, the planet overall warms.

 

In addition to efforts to reduce carbon dioxide emissions, some have suggested artificially manipulating the world’s climate in a last-ditch effort to prevent catastrophic climate change. These strategies, considered radical in some circles, are known as geoengineering or climate engineering.

 

 

Many scientists have investigated and questioned how effective individual geoengineering methods could be. However, there have been few attempts to compare and contrast the various methods, which range from fertilizing the ocean so that marine organisms suck up excess carbon dioxide to shooting aerosols into the atmosphere to reflect some of the sun’s incoming rays back into space. [8 Ways Global Warming is Already Changing the World]

 

Now, researchers using a 3D computer model of the Earth have tested the potential benefits and drawbacks of five different geoengineering technologies.

 

Will it work?

 

The scientists found that even when several technologies were combined, and even when each technology was deployed continuously and at scales as large as currently deemed possible, geoengineering would be unable to prevent average surface temperatures from rising more than 3.6 degrees Fahrenheit (2 degrees Celsius) above current temperatures by the year 2100. That threshold is the limit on which international negotiations are currently focused.

 

“The potential of most climate engineering methods, even when optimistic deployment scenarios were assumed, were much lower than I had expected,” said study author Andreas Oschlies, an earth system modeler at the GEOMAR Helmholtz Centre for Ocean Research in Kiel, Germany.

 

Read More Here

 

…..

International Law Encourages Use of Geoengineering Weather Modification

 

 

Derrick Broze

According to a new study due to be published in 2014, geoengineering field research is not only allowed under international law, it is encouraged.

The study was authored by Jesse Reynolds at Tilburg Law School in the Netherlands. Reynolds researched the legal status of geoengineering research by analyzing international documents and treaties.

Geoengineering is the science of manipulating the climate for the stated purpose of fighting man-made climate change. Its techniques include solar radiation management (SRM), the practice of spraying aerosols into the sky in an attempt to deflect the Sun’s rays and combat climate change.

According to a recent congressional report:

“The term “geoengineering” describes this array of technologies that aim, through large-scale and deliberate modifications of the Earth’s energy balance, to reduce temperatures and counteract anthropogenic climate change. Most of these technologies are at the conceptual and research stages, and their effectiveness at reducing global temperatures has yet to be proven. Moreover, very few studies have been published that document the cost, environmental effects, socio-political impacts, and legal implications of geoengineering. If geoengineering technologies were to be deployed, they are expected to have the potential to cause significant transboundary effects.

In general, geoengineering technologies are categorized as either a carbon dioxide removal (CDR) method or a solar radiation management (SRM) method. CDR methods address the warming effects of greenhouse gases by removing carbon dioxide (CO2) from the atmosphere. CDR methods include ocean fertilization, and carbon capture and sequestration. SRM methods address climate change by increasing the reflectivity of the Earth’s atmosphere or surface.

Aerosol injection and space-based reflectors are examples of SRM methods. SRM methods do not remove greenhouse gases from the atmosphere, but can be deployed faster with relatively immediate global cooling results compared to CDR methods.”

Reynolds’ study will be published in the Journal of Energy, Climate and the Environment around the same time that the Intergovernmental Panel on Climate Change presents its Fifth Assessment Report. The study continues the calls for an international body to regulate the controversial weather modification techniques.

Some believe the answer is an international agreement governing international tests, while low-risk domestic research continues in order to inform the overall decision of what to do with geoengineering.

One of the many dangers of manipulating the weather is the loss of blue skies. According to a report by the New Scientist, Ben Kravitz of the Carnegie Institution for Science has shown that releasing sulphate aerosols high in the atmosphere would scatter sunlight into the atmosphere. He says this could decrease the amount of sunlight that hits the ground by 20% and make the sky appear more hazy.

 

Read More Here

 

…..

Yale University

 

09 Jan 2014: Report

Solar Geoengineering: Weighing Costs of Blocking the Sun’s Rays

With prominent scientists now calling for experiments to test whether pumping sulfates into the atmosphere could safely counteract global warming, critics worry that the world community may be moving a step closer to deploying this controversial technology.

by Nicola Jones

In 1991, Mount Pinatubo in the Philippines erupted in one of the largest volcanic blasts of the 20th century. It spat up to 20 million tons of sulfur into the upper atmosphere, shielding the earth from the sun’s rays and causing global temperatures to drop by nearly half a degree Celsius in a single year. That’s more than half of the amount the planet has warmed due to climate change in 130 years.

Now some scientists are thinking about replicating Mount Pinatubo’s dramatic cooling power by intentionally spewing sulfates into the atmosphere to counteract global warming. Studies have shown that such a strategy would be powerful, feasible, fast-acting, and cheap, capable in principle of reversing all of the expected worst-case warming over the next century or longer, all the while increasing plant productivity. Harvard University physicist David Keith, one of the world’s most vocal advocates of serious research into such a scheme, calls it “a cheap tool that could green the world.” In the face of anticipated rapid climate change, Keith contends that the smart move is to intensively study both the positive and negative effects of using a small fleet of jets to inject sulfate aerosols high into the atmosphere to block a portion of the sun’s rays.

The 1991 Mount Pinatubo eruption lowered temperatures nearly half a degree Celsius.
Credit: Arlan Naeg/AFP/Getty Images

Yet even Keith acknowledges that there are serious concerns about solar geoengineering, both in terms of the environment and politics. Growing discussion about experimentation with solar radiation management has touched off an emotional debate, with proponents saying the technique may be needed to avert climate catastrophe and opponents warning that deployment could lead to international conflicts and unintended environmental consequences — and that experimentation would create a slippery slope that would inevitably lead to deployment. University of Chicago geophysicist Raymond Pierrehumbert has called the scheme “barking mad.” Canadian environmentalist David Suzuki has dismissed it as “insane.” Protestors have stopped even harmless, small-scale field experiments that aim to explore the idea. And Keith has received a couple of death threats from the fringe of the environmentalist community.

Clearly, there are good reasons for concern. Solar geoengineering would likely make the planet drier, potentially disrupting monsoons in places like India and creating drought in parts of the tropics. The technique could help eat away the protective ozone shield of our planet, and it would cause air pollution. It would also do nothing to counteract the problem of ocean acidification, which occurs when the seas absorb high levels of CO2 from the atmosphere.

Some worry that solar geoengineering would hand politicians an easy reason to avoid reducing greenhouse gas emissions. And if the impacts of climate change worsen and nations cannot agree on what scheme to deploy, or at what temperature the planet’s thermostat should be set, then conflict or even war could result as countries unilaterally begin programs to inject sulfates into the atmosphere. “My greatest concern is societal disruption and conflict between countries,” says Alan Robock, a climatologist at Rutgers University in New Jersey.

As Keith himself summarizes, “Solar geoengineering is an extraordinarily powerful tool. But it is also dangerous.”

Studies have shown that solar radiation management could be accomplished and that it would cool the planet. Last fall, Keith published a book, A Case for Climate Engineering, that lays out the practicalities of such a scheme. A fleet of ten Gulfstream jets could be used to annually inject 25,000 tons of sulfur — as finely dispersed sulfuric acid, for example — into the lower stratosphere. That would be ramped up to a million tons of sulfur per year by 2070, in order to counter about half of the world’s warming from greenhouse gases. The idea is to combine such a scheme with emissions cuts, and keep it running for about twice as long as it takes for CO2 concentrations in the atmosphere to level out.

Under Keith’s projections, a world that would have warmed 2 degrees C by century’s end would instead warm 1 degree C. Keith says his “moderate, temporary” plan would help to avoid many of the problems associated with full-throttle solar geoengineering schemes that aim to counteract all of the planet’s warming, while reducing the cost of adapting to rapid climate change. He estimates this scheme would cost about $700 million annually — less than 1 percent of what is currently spent on clean energy development. If such relatively modest cost projections prove to be accurate, some individual countries could deploy solar geoengineering technologies without international agreement.


The idea of solar geoengineering dates back at least to the 1970s; researchers have toyed with a range of ideas, including deploying giant mirrors to deflect solar energy back into space, or spraying salt water into the air to make more reflective clouds. In recent years the notion of spraying sulfates into the stratosphere has moved to the forefront. “Back in 2000 we just thought of it as a ‘what if’ thought experiment,” says atmospheric scientist Ken Caldeira of the Carnegie Institution for Science, who did some of the first global climate modeling work on the concept. “In the last years, the thing that’s surprising is the degree to which it’s being taken more seriously in the policy world.”

In 2010, the first major cost estimates of sulfate-spewing schemes were produced. In 2012, China listed geoengineering among its earth science research priorities. Last year, the Intergovernmental Panel on Climate Change’s summary statement for policymakers controversially mentioned geoengineering for the first time in the panel’s 25-year history. And the National Academy of Sciences is working on a geoengineering report, funded in part by the U.S. Central Intelligence Agency.

Solar geoengineering cannot precisely counteract global warming. Carbon dioxide warms the planet fairly evenly, while sunshine is patchy: There’s more in the daytime, in the summer, and closer to the equator. Back in the 1990s, Caldeira was convinced that these differences would make geoengineering ineffective. “So we did these simulations, and much to our surprise it did a pretty good job,” he says. The reason is that a third factor has a bigger impact on climate than either CO2 or sunlight: polar ice. If you cool the planet enough to keep that ice, says Caldeira, then this dominates the climate response.

 

Read More Here

 

…..

Geoengineering could bring severe drought to the tropics, research shows

Study models impact on global rainfall when artificial volcanic eruptions are created in a bid to reverse climate change

A view from the space shuttle Atlantis of three layers of volcanic dust in the Earth’s atmosphere, following the 1991 eruption of Mount Pinatubo in the Philippines. Photograph: ISS/NASA/Corbis

Reversing climate change via huge artificial volcanic eruptions could bring severe droughts to large regions of the tropics, according to new scientific research.

The controversial idea of geoengineering – deliberately changing the Earth’s climate – is being seriously discussed as a last-ditch way of avoiding dangerous global warming if efforts to slash greenhouse gas emissions fail.

But the new work shows that a leading contender – pumping sulphate particles into the stratosphere to block sunlight – could have side-effects just as serious as the effects of warming itself. Furthermore, the impacts would be different around the world, raising the prospect of conflicts between nations that might benefit and those suffering more damage.

“There are a lot of issues regarding governance – who controls the thermostat – because the impacts of geoengineering will not be uniform everywhere,” said Dr Andrew Charlton-Perez, at the University of Reading and a member of the research team.

The study, published in the journal Environmental Research Letters, is the first to convincingly model what happens to rainfall if sulphates were deployed on a huge scale.

While the computer models showed that big temperature rises could be completely avoided, they also showed cuts in rain of up to one-third in South America, Asia and Africa. The consequent droughts would affect billions of people and also fragile tropical rainforests that act as a major store of carbon. “We would see changes happening so quickly that there would be little time for people to adapt,” said Charlton-Perez.

Another member of the research team, Professor Ellie Highwood, said: “On the evidence of this research, stratospheric aerosol geoengineering is not providing world leaders with any easy answers to the problem of climate change.”

 

Read More Here

 

…..

 


Scientists say lack of government-supported research forcing them to use volunteers, predictive models

- Lauren McCauley, staff writer

An image from a model of the progression of a radioactive plume coming across the Pacific following the Fukushima nuclear meltdown. (via BBC News)

A radioactive plume released from the Fukushima meltdown is expected to reach the west coast of the U.S. in April, said a panel of researchers in Honolulu Monday. However, without any federal or international monitoring, scientists are bereft of “actual data,” guessing at the amount of radiation coming at us.

Monitors along the Pacific U.S. coast have yet to detect any traces of cesium-134, said Ken Buesseler, a chemical oceanographer at the Woods Hole Oceanographic Institution (WHOI), speaking on a panel at the meeting of the American Geophysical Union’s Ocean Sciences. However, sampling undertaken by Dr. John Smith at the Bedford Institute of Oceanography has helped develop models that forecast the “probable future progression of the plume.”

According to Buesseler, initial traces should be detectable along the Pacific coast in April.

One of the radioactive isotopes that is formed during a nuclear accident is cesium-134. With a short half-life of two years, any traces of it detected by monitoring instruments can be specifically attributed to the Fukushima nuclear accident.

Another isotope, cesium-137, decays very slowly with a half-life of 30 years. Though traces of cesium-137 have been detected in the world’s oceans, their source may be attributed to previous nuclear-weapons tests.
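
As a rough illustration (a sketch added here, not part of the original reporting) of why the short-lived isotope serves as a Fukushima “fingerprint,” the fraction of each isotope remaining can be computed directly from its half-life. The figures below assume roughly three years between the March 2011 accident and this 2014 coverage.

```python
# Sketch, not from the article: fraction of a radioactive isotope remaining
# after a given time, computed from its half-life. Cs-134 (half-life ~2 years)
# from older sources has largely decayed away, so any detected today points to
# Fukushima, while Cs-137 (~30 years) persists from many past sources.
def fraction_remaining(years_elapsed: float, half_life_years: float) -> float:
    return 0.5 ** (years_elapsed / half_life_years)

elapsed = 3.0  # assumed: ~3 years between March 2011 and this 2014 report
print(f"Cs-134 remaining: {fraction_remaining(elapsed, 2.0):.0%}")   # ~35%
print(f"Cs-137 remaining: {fraction_remaining(elapsed, 30.0):.0%}")  # ~93%
```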

One shortcoming of the current models available to the scientists is that the lack of solid data is creating varying predictions about the amount of radiation and when it is expected to reach the west coast. And though the estimated levels fall far below acceptable drinking water concentrations, according to the WHOI, the concern is not direct exposure but rather the “uptake by the food web and, hence, the potential for human consumption of contaminated fish.”

“To my mind, this is not really acceptable,” said Buesseler, speaking of the variation between the predictive models. “We need better studies and resources to do a better job, because there are many reactors on coasts and rivers and if we can’t predict within a factor of 10 what cesium or some other isotope is downstream—I think that’s a pretty poor job.”

Individuals have recently spread alarm about the presence of radioactive isotopes already found along the Pacific coast, although those concerns were debunked.

Without any federal or international agencies currently monitoring ocean waters from Fukushima on this side of the Pacific, Buesseler and the WHOI have had to recruit volunteers to collect seawater at 16 sites along the California and Washington coasts and two in Hawaii and ship the samples back to the Cape Cod, Mass. laboratory.

“We need to know the real levels of radiation coming at us,” said Bing Dong, a retired accountant and one of the volunteers with the WHOI project. “There’s so much disinformation out there, and we really need actual data.”

_____________________

Radioactive water travelling from Fukushima power plant being measured by scientists

Researchers are attempting to predict the amount of radioactive water from Japan that will hit the North American coast.

The concentration of radioactive water from the Fukushima power plant in Japan expected to hit North American coasts should be known in the next two months.

So far only small traces of pollution have been recorded in Canadian continental waters, but this is expected to increase as contaminants move eastwards on Pacific currents, BBC News has reported.

In 2011, three nuclear reactors at the Japanese facility went into meltdown, leaking radiation into the Pacific Ocean.

Researchers from the Bedford Institute of Oceanography in Canada have been analysing water along a line running almost 2,000km due west of Vancouver, British Columbia, since the 2011 Fukushima accident.

In June 2013, radioactive caesium-137 and caesium-134 were detected along the entire length of the sampling line.

Scientists stress that even with the probable increases taken into account, the measurements will be well within limits set by safety authorities.

Researchers have harnessed the radioactive water to test two forecasting models to try to map the probable future progression of the plume.

Using one model, the scientists have predicted that a maximum concentration of 27 becquerels per cubic metre of water will appear on the Canadian coast by mid-2015, but the other model predicts no more than about two becquerels per cubic metre of water.

Bedford’s Dr John Smith told BBC News that further measurements currently being taken in the ocean should give researchers a fair idea of which model is correct.

Read More Here


February 20, 2014 11:58 am

There’s clearly a lot of honor in being named the first offshore wind farm in the U.S., and developers keep that in mind with each deal they strike and announcement they make.

In the past two weeks, Deepwater Wind announced deals that it believes keep its Block Island Wind Farm “on target to become the nation’s first offshore wind farm.” First, the Providence, RI-based firm signed a deal with the French Alstom Group for five 6-megawatt (MW) turbines that will power the farm to be constructed in waters near Rhode Island’s Block Island. Next, Deepwater Wind tapped Oslo, Norway-based Fred. Olsen Windcarrier to provide the vessel for the farm’s turbine installation.

Video screenshot: Deepwater Wind

“This agreement represents a giant leap forward for the Block Island Wind Farm, and the start of turbine construction just last month marked a major project milestone,” said Deepwater Wind CEO Jeffrey Grybowski.

Alstom’s 6-MW Haliade 150 turbines are 589 feet tall. The company has 2.3 gigawatts of offshore wind farm substations delivered or under construction around the world.

The 30-MW Block Island Wind Farm will generate more than 125,000 MW hours annually, enough to power about 17,000 homes. The energy will be exported to the mainland electric grid through a 21-mile, bi-directional Block Island transmission system that includes a submarine cable proposed to make landfall in Narragansett, RI.
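
As a back-of-the-envelope check (an illustration added here, not figures supplied by Deepwater Wind or the article), the numbers above imply a capacity factor and per-home consumption in a plausible range for offshore wind:

```python
# Sketch using only the article's figures: 30 MW of nameplate capacity
# producing 125,000 MWh per year and serving about 17,000 homes.
nameplate_mw = 30.0
annual_output_mwh = 125_000.0
homes_served = 17_000
hours_per_year = 8_760.0

capacity_factor = annual_output_mwh / (nameplate_mw * hours_per_year)
print(f"Implied capacity factor: {capacity_factor:.1%}")                        # ~47.6%
print(f"Implied use per home: {annual_output_mwh / homes_served:.1f} MWh/year")  # ~7.4
```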

Read More Here
