
Archive for November, 2015



……………………………………………………..

 

 

Image: Drugs Information Online (drugline.org)

………………………………………………………………………………………………

Vaccinated Norwegians Get Mumps During “Outbreak”

 

Roughly 80 Norwegian college students have reportedly contracted mumps. Of the 80, many reportedly were previously vaccinated with the MMR vaccine, which, by all logical and reasonable accounts, should have protected them. However, Norwegian health officials are making excuses over the matter.

According to Outbreak News Today:

Several of those who are now sick with mumps are Norwegian students who have previously received the two recommended doses of MMR vaccine.

It is possible to get sick with mumps even if you have been fully vaccinated against the disease, confirms Margrethe Greve-Isdahl, chief physician at the Department of Vaccines, Norwegian Institute of Public Health (FHI).

 

Read More Here

………………………………………………………………………………………..

 

The Norwegian Institute of Public Health, or Folkehelseinstituttet, has announced a mumps outbreak, primarily among university students. The first reported cases appeared in late October in the Trondheim area, at the Norwegian University of Science and Technology (NTNU) and Sør-Trøndelag University College (HiST).

Norway/CIA


Now the case count hovers around 80 and health officials expect the cases to increase in coming weeks.

Several of those who are now sick with mumps are Norwegian students who have previously received the two recommended doses of MMR vaccine.

It is possible to get sick with mumps even if you have been fully vaccinated against the disease, confirms Margrethe Greve-Isdahl, chief physician at the Department of Vaccines, Norwegian Institute of Public Health (FHI).

In Norway, vaccination against mumps is offered through the MMR vaccine (measles, mumps and rubella) in the childhood immunization program: the first dose at 15 months of age and the second at 11 years of age (6th grade). FHI generally recommends that everyone who has not received two doses of MMR vaccine catch up, and this also applies to students who come to Norway. For initial vaccination, at least three months between doses is recommended, though a longer interval benefits the immune response.

 

Read More Here




……………………………………………………………………………………

 

Costco and Red Lobster Say No to GMO Salmon

November 25, 2015, 9:21 am

The U.S. Food and Drug Administration’s (FDA) controversial approval of AquaBounty’s genetically modified (GMO) salmon has garnered further backlash from national grocery stores and restaurant chains.

Costco, the second largest retailer in the world with 487 stores and one of the largest retailers of salmon and seafood in the U.S., has made a firm commitment not to sell GMO salmon.


“Although the FDA has approved the sale of GM salmon, Costco has not sold and does not intend to sell GM salmon at this time,” the company said in a statement.

Costco’s move to reject GMO salmon comes after vehement opposition from anti-GMO activists.

According to a statement from Friends of the Earth, Walmart and Publix are among the last remaining large retail grocers in the U.S. that have not yet rejected GMO salmon.

“The market is rejecting GMO salmon. Stores won’t sell it and people don’t want to eat it,” said Friends of the Earth Campaigner Dana Perls. “Now other retailers like Walmart and restaurants need to follow suit, and we need mandatory GMO labeling so that consumers know how to avoid GMO salmon.”

More than 60 grocery store chains representing more than 9,000 stores across the U.S. have made commitments to not sell GMO salmon, including Safeway, Kroger, Target, Trader Joe’s, Whole Foods, Aldi and many others.

The nation’s largest seafood restaurant is also denying GMO salmon. Red Lobster, with 705 North American locations and more than 40 internationally, told the Dallas Morning News last Friday that it would not sell GMO salmon.

Incidentally, consumers might not even know they’re eating GMO salmon. AquaBounty’s salmon, which is genetically altered to grow to market size in half the time of conventional salmon, will not require a GMO label under FDA guidelines.

It’s unclear if we’ll ever see labels for genetically altered food, period, not just salmon. Currently, the hotly contested Safe and Accurate Food Labeling Act, dubbed by opponents the Denying Americans the Right to Know Act or DARK Act, languishes in the Senate.

The act, H.R. 1599, which passed the House of Representatives in July, bans states from issuing mandatory labeling laws for foods containing GMOs. The bill gives the FDA the authority to establish national standards and regulations for GMO food. The Department of Agriculture would be granted full discretion over the law’s implementation.

 

Read More Here


…………………………………………………………………………………..

India Opines

Are Bill and Melinda Gates Guilty of Committing Fraud in India?


How is the Bill & Melinda Gates Foundation connected to the HPV vaccine and its trials in India? In 2002, the Bill and Melinda Gates Foundation (BMGF) acquired shares in Merck, and the charge is that the BMGF, along with an organisation called GAVI (a vaccine alliance funded by the BMGF), is pushing a vaccine agenda in India and elsewhere in the world in collusion with health authorities who recommend the use of vaccines without proper testing, actions amounting to vaccine fraud.

It has long been an ugly truth that Indian lives are considered cheaper than those of their western counterparts. Poorer sections of Indian communities are routinely used as human guinea pigs, subjected to testing by pharmaceutical companies: testing that could be illegal, expensive or impossible in western nations; testing that is most often done without informed consent and at times by employing coercion, concealment and other underhand tactics.

 


The charge made by community health activists against the BMGF and GAVI (which has members from pharma companies on its board) is that they are working with the WHO and UNICEF to promote vaccine use in poorer countries, among populations that typically wouldn’t be able to afford the vaccines. Some commentators are describing this as a “fraud” perpetrated on the third world, with the Bill & Melinda Gates Foundation and its “vaccine empire” under fire.

The perception is that vaccine empires in the western world are crumbling, and Big Pharma is looking to consolidate its position by plugging expensive and often unnecessary drugs in poorer nations in collusion with local authorities.


One in a line of questionable activities by international pharmaceutical companies in India concerns the HPV vaccine made by the American company Merck, which happens to be one of the largest pharmaceutical companies in the world. The United States government earns royalties on the sale of Merck’s vaccine, and there is a strong perception that any negative news report about this vaccine is not only discouraged but actively squashed. In the US, the well-known news broadcaster Katie Couric was made to apologise for her interview with a mother whose daughter died after receiving the Gardasil vaccine. Later, the assistant Surgeon General appeared on her show to assure everyone the vaccine was safe.

 

Read More Here


………………………………………………………………………………………..

 

VOA (Voice of America)

Mysterious Illness Kills Dozens of Children in Indonesian Village

 

FILE – Mosquito nets helped control malaria after an outbreak in the late 1990s. Today, a mysterious illness initially thought to be malaria has hit Papua, killing at least 41 children within three weeks.


Fatiyah Wardah

A mysterious illness in Indonesia has killed dozens of children in a village in the remote eastern province of Papua in the past three weeks, leading to charges that the government has failed to take aggressive action.

Read More Here

………………………………………………………………………………………

41 kids die from mystery disease in Papua

A large number of children, many below the age of seven, have died of an unexplained disease in Mbuwa district, Nduga regency, Papua, following the start of the rainy season in early November.

A medical team consisting of health workers from Nduga, Wamena and Jayawijaya regencies arrived at the location but have yet to ascertain the cause of the deaths.

“As many as 41 children have died, as of today. They present with a slight illness at first but die shortly after these initial signs. The medical team from the Nduga Health Office, assisted by the Wamena Health Office, may have returned home, but the cause of these deaths remains uncertain,” said Mbuwa district chief Erias Gwijangge, during a call to The Jakarta Post on Monday.

Erias said Nduga and surrounding areas had experienced drought and were exposed to haze from forest fires. Rain only fell in the past month. When the rain began, a number of livestock, such as pigs and poultry, also died abruptly.

“Many of the children died prior to the livestock but there was no report of child fatalities, only in the last three days,” said Erias.

When contacted by the Post, Wamena City community health clinic analysis member Yan Hubi, who joined the trip to Mbuwa district, said his clinic analyzed blood samples of the children to find out if the children had been infected by malaria, but all were negative.

Yan returned to Mbuwa on Nov. 17. A doctor and several other medical workers are also continuing to conduct medical treatment in Mbuwa.

 


…………………………………………………………………………………………..

July 27, 2015

by Rob Wallace

The notion of a neoliberal Ebola is so beyond the pale as to send leading lights in ecology and health into apoplectic fits.

Here’s one of bestseller David Quammen’s five tweets denouncing my hypothesis that neoliberalism drove the emergence of Ebola in West Africa. I’m an “addled guy” whose “loopy [blog] post” and “confused nonsense” Quammen hopes “doesn’t mislead credulous people.”

Scientific American’s Steve Mirsky joked that he feared “the supply-side salmonella”. He would walk that back when I pointed out the large literature documenting the ways and means by which the economics of the egg sector is driving salmonella’s evolution.

The facts of the Ebola outbreak similarly turn Quammen’s objection on its head.

Guinea Forest Region in 2014 (Photo Credit Daniel Bausch)

The virus appears to have been spilling over for years in West Africa. Epidemiologist Joseph Fair’s group found antibodies to multiple species of Ebola, including the very Zaire strain that set off the outbreak, in patients in Sierra Leone as far back as five years ago. Phylogenetic analyses meanwhile show the Zaire strain Bayesian-dated in West Africa as far back as a decade.

An NIAID team showed the outbreak strain as possessing no molecular anomaly, with nucleotide substitution rates typical of Ebola outbreaks across Africa.

That result demands an explanation for Ebola’s ecotypic shift from intermittent forest killer to a protopandemic infection, one sickening 27,000 and killing over 11,000 across the region and leaving bodies in the streets of the capital cities Monrovia and Conakry.

Explaining the rise of Ebola

The answer, little explored in the scientific literature or the media, appears in the broader context in which Ebola emerged in West Africa.

The truth of the whole, in this case connecting disease dynamics, land use and global economics, routinely suffers at the expense of the principle of expediency. Such contextualization often represents a threat to many of the underlying premises of power.

In the face of such an objection, it was noted that the structural adjustment to which West Africa has been subjected over the past decade included the kinds of divestment from public health infrastructure that permitted Ebola to incubate at the population level once it spilled over.

The effects, however, extend even farther back in the causal chain. The shifts in land use in the Guinea Forest Region from where the Ebola epidemic spread were also connected to neoliberal efforts at opening the forest to global circuits of capital.

Daniel Bausch and Lara Schwarz characterize the Forest Region, where the virus emerged, as a mosaic of small and isolated populations of a variety of ethnic groups that hold little political power and receive little social investment. The forest’s economy and ecology are also strained by thousands of refugees from civil wars in neighboring countries.

The Region is subjected to the tandem trajectories of accelerating deterioration in public infrastructure and concerted efforts at private development dispossessing smallholdings and traditional foraging grounds for mining, clear-cut logging, and increasingly intensified agriculture.

The Ebola hot zone as a whole comprises a part of the larger Guinea Savannah Zone the World Bank describes as “one of the largest underused agricultural land reserves in the world.” Africa hosts 60% of the world’s last farmland frontier. And the Bank sees the Savannah best developed by market commercialization, if not solely on the agribusiness model.

As the Land Matrix Observatory documents, such prospects are in the process of being actualized. There, one can see the 90 deals by which U.S.-backed multinationals have procured hundreds of thousands of hectares for export crops, biofuels and mining around the world, including multiple deals in Sub-Saharan Africa. The Observatory’s online database shows similar land deals pursued by other world powers, including the UK, France, and China.

Under the newly democratized Guinean government, the Nevada-based and British-backed Farm Land of Guinea Limited secured 99-year leases for two parcels totaling nearly 9000 hectares outside the villages of N’Dema and Konindou in Dabola Prefecture, where a secondary Ebola epicenter developed, and 98,000 hectares outside the village of Saraya in Kouroussa Prefecture. The Ministry of Agriculture has now tasked Farm Land Inc to survey and map an additional 1.5 million hectares for third-party development.

While these as of yet undeveloped acquisitions are not directly tied to Ebola, they are markers of a complex, policy-driven phase change in agroecology that our group hypothesizes undergirds Ebola’s emergence.

The role of palm oil in West Africa

Our thesis orbits around palm oil, in particular.

Palm is a vegetable oil of highly saturated fats derived from the red mesocarp of the African oil palm tree now grown around the world. The fruit’s kernel also produces its own oil. Refined and fractionated into a variety of byproducts, both oils are used in an array of food, cosmetic and cleaning products, as well as in some biodiesels. With the abandonment of trans fats, palm oil represents a growing market, with global exports totaling nearly 44 million metric tons in the 2014 growing season.

Oil palm plantations, covering more than 17 million hectares worldwide, are tied to deforestation and expropriation of lands from indigenous groups. We see from this Food and Agriculture Organization map that while most of the production can be found in Asia, particularly in Indonesia, Malaysia and Thailand, most of the suitable land left for palm oil can be found in the Amazon and the Congo Basin, the two largest rainforests in the world.

Palm oil represents a classic case of Lauderdale’s paradox. As environmental resources are destroyed what’s left becomes more valuable. A decaying resource base, then, is no due cause for agribusiness turning into good global citizens, as industry-funded advocates have argued. On the contrary, agribusiness seeks exclusive access to our now fiscally appreciating, if ecologically declining, landscapes.

Food production didn’t start that way in West Africa, of course.

Natural and semi-wild groves of different oil palm types have long served as a source of red palm oil in the Guinea Forest Region. Forest farmers have been raising palm oil in one or another form for hundreds of years. Fallow periods allowing soils to recover, however, were reduced over the 20th century from 20 years in the 1930s to 10 by the 1970s, and still further by the 2000s, with the added effect of increasing grove density. Concomitantly, semi-wild production has been increasingly replaced with intensive hybrids, and red oil replaced by, or mixed with, industrial and kernel oils.

Other crops are grown too, of course. Regional shade agriculture includes coffee, cocoa and kola. Slash-and-burn rice, maize, hibiscus, and corms of the first year, followed by peanut and cassava of the second and a fallow period, are rotated through the agroforest. Lowland flooding supports rice. In essence, we see a move toward increased intensification without private capital but still classifiable as agroforestry.

But even this kind of farming has since been transformed.

The Guinean Oil Palm and Rubber Company (with the French acronym SOGUIPAH) began in 1987 as a parastatal cooperative in the Forest Region but has since grown to the point that it is better characterized as a state company. It is leading efforts that began in 2006 to develop plantations of intensive hybrid palm for commodity export. SOGUIPAH economized palm production for the market by forcibly expropriating farmland, which to this day continues to set off violent protest.

International aid has accelerated industrialization. SOGUIPAH’s new mill, with four times the capacity of one it previously used, was financed by the European Investment Bank.

The mill’s capacity ended the artisanal extraction that as late as 2010 provided local populations with full employment. The subsequent increase in seasonal production has led at one and the same time to harvesting above the mill’s capacity in season and operation below capacity off-season, producing a conflict between the company and some of its 2,000 now partially proletarianized pickers, some of whom insist on processing a portion of their own yield to cover the resulting gaps in cash flow. Pickers who process their own oil during the rainy season now risk arrest.

The new economic geography has also initiated a classic case of land expropriation and enclosure, turning a tradition of shared forest commons toward expectations whereby informal pickers working fallow land outside their family lineage obtain an owner’s permission before picking palm.

Palm oil and Ebola

What does all this have to do with Ebola?

Fig. 1 Palm Oil and Ebola

Fig. 1 Palm Oil and Ebola

The figure at top left (of Fig. 1) shows an archipelago of oil palm plots in the Guéckédou area, the outbreak’s apparent ground zero. The characteristic landscape is a mosaic of villages surrounded by dense vegetation and interspersed by crop fields of oil palm (in red) and patches of open forest and regenerated young forest.

The general pattern can be discerned at a finer scale as well, above, west of the town of Meliandou, where the index cases appeared.

The landscape embodies a growing interface between humans and frugivore bats, a key Ebola reservoir, including hammer-headed bats, little collared fruit bats and Franquet’s epauletted fruit bats.

Nur Juliani Shafie and colleagues document a variety of disturbance-associated fruit bats attracted to oil palm plantations. Bats migrate to oil palm for food and shelter from the heat while the plantations’ wide trails also permit easy movement between roosting and foraging sites.

Bats aren’t stupid. As the forest disappears they shift their foraging behavior to what food and shelter are left.

Bush meat hunting and butchery are one means by which subsequent spillover may take place. But to move away from the kinds of Western ooga booga epidemiology that wraps outbreaks in such ‘dirty’ cultural cloth, agricultural cultivation may be enough. Fruit bats in Bangladesh transmitted Nipah virus to human hosts by urinating on the date fruit humans cultivated.

Almudena Marí Sáez and colleagues have since proposed that the initial Ebola spillover occurred outside Meliandou when children, including the putative index case, caught and played with Angolan free-tailed bats in a local tree. These bats are an insectivorous species also previously documented as an Ebola virus carrier.

Whatever the specific reservoir source, shifts in agroeconomic context still appear a primary cause. Previous studies show the free-tailed bats also attracted to expanding cash crop production in West Africa, including of sugar cane, cotton, and macadamia.

Indeed, every Ebola outbreak appears connected to capital-driven shifts in land use, including back to the first outbreak in Nzara, Sudan in 1976, where a British-financed factory spun and wove local cotton. When Sudan’s civil war ended in 1972, the area rapidly repopulated and much of the local rainforest—and bat ecology—was reclaimed for subsistence farming, with cotton returning as the area’s dominant cash crop.

Are New York, London and Hong Kong as much to blame?

Clearly such outbreaks aren’t merely about specific companies.

We have started working with University of Washington’s Luke Bergmann to test whether the world’s circuits of capital as they relate to husbandry and land use are related to disease emergence. Bergmann and Holmberg’s maps, still in preparation, show the percent of land whose harvests are consumed abroad as agricultural goods or in manufactured goods and services for croplands, pastureland and forests.

The maps show landscapes are globalized by circuits of capital. In this way, the source of a disease may be more than merely the country in which it may first appear and indeed may extend as far as the other side of the world. We need to identify who funded the development and deforestation to begin with.

Such an epidemiology raises the question of whether we might more accurately characterize such places as New York, London and Hong Kong, key sources of capital, as disease ‘hot spots’ in their own right. Diseases are relational in their geographies, and not solely absolute, as the ecohealth cowboys chronicled by David Quammen claim.

Similarly, such a new approach ruins the neat dichotomy between emergency responses and structural interventions.

Some disease hounds who acknowledge global structural issues tend to still focus on the immediate logistics of any given outbreak. Emergency responses are needed, of course. But we need to acknowledge that the emergency arose from the structural. Indeed, such emergencies are used as a means by which to avoid talking about the bigger picture driving the emergence of new diseases.

The forest may be its own cure

There’s another false dichotomy to unpack—this one between the forest’s ecosystemic noise and deterministic effect.

The environmental stochasticity at the center of forest ecology isn’t synonymous with random noise.

Here a bit of math can help. A simple stochastic differential model of exponential pathogen population growth can include fractional white noise of an index 0 to 1 defined by a covariance relationship across time and space. An Ito expansion produces a classic result in population growth:
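The equation itself appears to have been lost from this excerpt (it was likely an image in the original post). As a hedged reconstruction of the “classic result” in the simplest case, exponential growth driven by standard (non-fractional) white noise rather than the fractional noise described above:

```latex
% Geometric growth with multiplicative environmental noise:
dN = r N\,dt + \sigma N\,dB_t .
% Ito's lemma applied to \ln N gives
d(\ln N) = \left(r - \tfrac{\sigma^2}{2}\right)dt + \sigma\,dB_t ,
\qquad
N(t) = N(0)\,\exp\!\left[\left(r - \tfrac{\sigma^2}{2}\right)t + \sigma B_t\right].
```

The long-run growth rate is $r - \sigma^2/2$: when the noise intensity $\sigma^2/2$ sits below the threshold set by the growth rate $r$, the pathogen population grows almost surely; above it, the population decays toward extinction. The fractional-noise case replaces $B_t$ with fractional Brownian motion of index between 0 and 1, changing the covariance structure but not this threshold logic.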

When below a threshold, the noise exponent is small enough to permit a pathogen population to explode in size. When above the threshold, the noise is large enough to control an outbreak, frustrating efforts on the part of the pathogen to string together a bunch of susceptibles to infect.

Never mind the technical details. The important point is that disease trajectories, even in the deepest forest, aren’t divorced from their anthropogenic context. That context can impact upon the forest’s environmental noise and its effects on disease.

How exactly in Ebola’s case?

It’s been long known that if you can lower an outbreak below an infection Allee threshold—say by a vaccine or sanitary practices—an outbreak, not finding enough susceptibles, can burn out on its own. But commoditizing the forest may have lowered the region’s ecosystemic threshold to such a point where no emergency intervention can drive the Ebola outbreak low enough to burn out on its own. The virus will continue to circulate, with the potential to explode again.
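A toy simulation, not from the original post and with purely illustrative parameter values, can make the burn-out mechanism concrete, assuming multiplicative growth with environmental noise and extinction once the transmission chain falls below a single case:

```python
import math
import random

def simulate_outbreak(r=0.05, sigma=0.1, steps=300, i0=10.0, seed=1):
    """Multiplicative-growth sketch of an outbreak: case counts grow at
    rate r but are buffeted by environmental noise of strength sigma.
    Once fewer than one case remains, the transmission chain is broken."""
    random.seed(seed)
    cases = i0
    for _ in range(steps):
        # log-growth increment: drift (r - sigma^2/2) plus Gaussian noise
        cases *= math.exp(r - 0.5 * sigma**2 + sigma * random.gauss(0.0, 1.0))
        if cases < 1.0:
            return 0.0  # burn-out: the outbreak extinguishes itself
    return cases

# Same underlying transmission rate; only the environmental noise differs.
low_noise = simulate_outbreak(sigma=0.1)   # sigma^2/2 < r: tends to explode
high_noise = simulate_outbreak(sigma=0.8)  # sigma^2/2 > r: tends to burn out
```

Under these assumptions the quiet environment sustains the outbreak while the noisy one extinguishes it, which is the sense in which stripping out the forest’s stochastic friction can push an epidemic past the point where emergency control alone drives it to burn-out.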

In short, neoliberalism’s structural shifts aren’t just a background on which the emergency of Ebola takes place. The shifts are the emergency as much as the virus itself.

In contrast to Nassim Taleb’s Black Swan—history as shit happens—we have here an example of stochasticity’s impact arising out of deterministic agroeconomic policy—a phenomenon I’ve taken to calling the Red Swan.

Here, sudden switches in land use may explain Ebola’s emergence. Deforestation and intensive agriculture strip out traditional agroforestry’s stochastic friction that until this point had kept the virus from stringing together enough transmission.

Under certain conditions, the forest may act as its own epidemiological protection. We risk the next deadly pandemic when we destroy that capacity.

Rob Wallace is an evolutionary biologist and public health phylogeographer currently visiting the Institute of Global Studies at the University of Minnesota. He also blogs at Farming Pathogens.


………………………………………………………………………………….

Why the United States Leaves Deadly Chemicals on the Market

November 21, 2015  

By Valerie Brown and Elizabeth Grossman

 


Scientists are trained to express themselves rationally. They avoid personal attacks when they disagree. But some scientific arguments become so polarized that tempers fray. There may even be shouting.

Such is the current state of affairs between two camps of scientists: health effects researchers and regulatory toxicologists. Both groups study the effects of chemical exposures in humans. Both groups have publicly used terms like “irrelevant,” “arbitrary,” “unfounded” and “contrary to all accumulated physiological understanding” to describe the other’s work. Privately, the language becomes even harsher, with phrases such as “a pseudoscience,” “a religion” and “rigged.”

The rift centers around the best way to measure the health effects of chemical exposures. The regulatory toxicologists typically rely on computer simulations called “physiologically based pharmacokinetic” (PBPK) modeling. The health effects researchers—endocrinologists, developmental biologists and epidemiologists, among others—draw their conclusions from direct observations of how chemicals actually affect living things.

The debate may sound arcane, but the outcome could directly affect your health. It will shape how government agencies regulate chemicals for decades to come: how toxic waste sites are cleaned up, how pesticides are regulated, how workers are protected from toxic exposure and what chemicals are permitted in household items. Those decisions will profoundly affect public health: the rates at which we suffer cancer, diabetes, obesity, infertility, and neurological problems like attention disorders and lowered IQ.

The link from certain chemicals to these health effects is real. In a paper published earlier this year, a group of leading endocrinologists concluded with 99 percent certainty that environmental exposure to hormone-disrupting chemicals causes health problems. They estimate that this costs the European Union healthcare system about $175 billion a year.

Closer to home, Americans are routinely sickened by toxic chemicals whose health effects have been long known. To cite one infamous example, people exposed to the known carcinogen formaldehyde in FEMA trailers after Hurricane Katrina suffered headaches, nosebleeds and difficulty breathing. Dozens of cancer cases were later reported. Then there are workplace exposures, which federal government estimates link to as many as 20,000 cancer deaths a year and hundreds of thousands of illnesses.

“We are drowning our world in untested and unsafe chemicals, and the price we are paying in terms of our reproductive health is of serious concern,” wrote the International Federation of Gynecology and Obstetrics in a statement released on October 1.

Yet chemical regulation in the United States has proceeded at a glacial pace. And corporate profit is at the heart of the story.

That the chemical industry exerts political influence is well documented. What our investigation reveals is that, 30 years ago, corporate interests began to control not just the political process but the science itself. Industry not only funds research to cast doubt on known environmental health hazards; it has also shaped an entire field of science—regulatory toxicology—to downplay the risk of toxic chemicals.

Our investigation traces this web of influence to a group of scientists working for the Department of Defense (DOD) in the 1970s and 1980s—the pioneers of PBPK modeling. It quickly became clear that this type of modeling could be manipulated to minimize the appearance of chemical risk. PBPK methodology has subsequently been advanced by at least two generations of researchers—including many from the original DOD group—who move between industry, government agencies and industry-backed research groups, often with little or no transparency.

The result is that chemicals known to be harmful to human health remain largely unregulated in the United States, often with deadly results. For chemicals whose hazards are just now being recognized, such as the common plastics ingredient bisphenol A (BPA) and others, this lack of regulation is likely to continue unless the federal chemical review process becomes more transparent and relies less heavily on PBPK modeling.

Here we lay out the players, the dueling paradigms and the high-stakes health consequences of getting it wrong.

The dawn of PBPK simulation

The 1970s and 1980s saw a blizzard of environmental regulation. The Clean Air Act, Clean Water Act and Toxic Substances Control Act, along with the laws that established Superfund and Community Right-to-Know Programs, for the first time required companies—and military bases—using and producing chemicals to account for their environmental and health impacts. This meant greater demand for chemical risk assessments as the Occupational Safety and Health Administration (OSHA) and the Environmental Protection Agency (EPA) began to establish safety standards for workplace exposures and environmental cleanups.

In the 1980s, the now-defunct Toxic Hazards Research Unit at the Wright-Patterson Air Force Base in Dayton, Ohio, was investigating the toxicity and health effects of chemicals used by the military. Of particular concern to the DOD were the many compounds used by the military to build, service and maintain aircraft, vehicles and other machinery: fuels and fuel additives, solvents, coatings and adhesives. The military is responsible for about 900 of the approximately 1,300 currently listed Superfund sites, many of which have been contaminated by these chemicals for decades.

In the mid-1980s, scientists at the Wright-Patterson Toxic Hazards Research Unit began using PBPK simulations to track how chemicals move through the body. Known as in silico (in computers) models, these are an alternative to testing chemicals in vivo (in live animals) or in vitro (in a test tube). They allow scientists to estimate what concentrations of a chemical (or its breakdown products) end up in a particular organ or type of tissue, and how long they take to exit the body. The information can then be correlated with experimental data to set exposure limits—or not.
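In rough outline, a PBPK model is just a set of mass-balance equations over body compartments. The sketch below is a minimal, hypothetical two-compartment version in Python—all parameter values are invented for illustration and correspond to no real chemical or to any agency's actual model. It tracks a single dose as it exchanges between blood and liver and is metabolized away, producing the kind of peak-tissue-concentration estimate described above.

```python
# Minimal two-compartment PBPK sketch (hypothetical parameters only):
# follow a chemical's amount in blood and liver after a single dose,
# with flow-limited blood/liver exchange and first-order metabolism.

def simulate_pbpk(dose_mg=10.0, hours=24.0, dt=0.01):
    v_blood, v_liver = 5.0, 1.5      # compartment volumes (L), illustrative
    q_liver = 90.0                   # blood flow to the liver (L/h), illustrative
    partition = 2.0                  # liver:blood partition coefficient
    k_met = 4.0                      # hepatic metabolic clearance (L/h), illustrative
    a_blood, a_liver = dose_mg, 0.0  # amounts (mg) in each compartment
    t, peak_liver = 0.0, 0.0
    while t < hours:
        c_blood = a_blood / v_blood
        c_liver = a_liver / v_liver
        # flow-limited exchange: driven by the blood/liver concentration gap
        flux = q_liver * (c_blood - c_liver / partition)
        metabolized = k_met * (c_liver / partition)  # mass removed by the liver
        a_blood += -flux * dt                        # simple Euler integration
        a_liver += (flux - metabolized) * dt
        peak_liver = max(peak_liver, a_liver / v_liver)
        t += dt
    return peak_liver, a_blood + a_liver  # peak liver conc. (mg/L), mg remaining

peak, remaining = simulate_pbpk()
print(f"peak liver concentration: {peak:.2f} mg/L")
print(f"amount left in body after 24 h: {remaining:.5f} mg")
```

Everything a model like this predicts—peak concentration, residence time—depends entirely on the volumes, flows and clearance rates plugged in, which is precisely the leverage point discussed below.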

PBPK simulations made testing faster and cheaper, something attractive to both industry and regulators. But the PBPK model has drawbacks. “It tells you nothing about effects,” says Linda Birnbaum, director of both the National Institute of Environmental Health Sciences (NIEHS) and National Toxicology Program (NTP). Observational studies and laboratory experiments, on the other hand, are designed to discover how a chemical affects biological processes.

Even regulatory toxicologists who support PBPK acknowledge its limitations: “[PBPK models] are always going to be limited by the quality of the data that go into them,” says toxicologist James Lamb, who worked for the NTP and EPA in the 1980s and is now principal scientist at the consulting firm Exponent.

The late health effects researcher Louis Guillette, a professor at the Medical University of South Carolina famous for studies on DDT’s hormone-disrupting effects in Florida alligators, put it more bluntly: “PBPK? My immediate response: Junk in, junk out. The take-home is that most of the models [are] only as good as your understanding of the complexity of the system.”

Many biologists say PBPK-based risk assessments begin with assumptions that are too narrow, and thus often fail to fully capture how a chemical exposure can affect health. For example, a series of PBPK studies and reviews by toxicologist Justin Teeguarden of the Pacific Northwest National Laboratory in Richland, Wash., and his colleagues suggested that BPA breaks down into less harmful compounds and exits the body so rapidly that it is essentially harmless. Their research began with certain assumptions: that BPA only mimics estrogen weakly, that it affects only the body’s estrogen system, and that 90 percent of BPA exposure is through digestion of food and beverages. However, health effects research has shown that BPA mimics estrogen closely, can affect the body’s androgen and thyroid hormone systems, and can enter the body via pathways like the skin and the tissues of the mouth. When PBPK models fail to include this evidence, they tend to underestimate risk.
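The effect of narrow starting assumptions can be made concrete with a toy calculation. The numbers below are invented and describe no real chemical or published model; they simply show how the same internal-dose formula returns very different answers depending on which exposure-route assumptions are fed in.

```python
# "Junk in, junk out" in miniature (all figures hypothetical): estimate the
# internal dose reaching circulation under two different assumption sets.

def internal_dose(dose_mg, absorbed_fraction, first_pass_loss):
    # fraction of an external dose that survives absorption and
    # first-pass liver metabolism to reach general circulation
    return dose_mg * absorbed_fraction * (1.0 - first_pass_loss)

dose = 1.0  # mg/day external exposure, illustrative

# Assumption set A: exposure is 90% dietary, with heavy first-pass metabolism
oral_only = internal_dose(dose * 0.9, absorbed_fraction=0.9, first_pass_loss=0.95)

# Assumption set B: 30% of exposure is dermal/sublingual, bypassing the liver
dermal = internal_dose(dose * 0.3, absorbed_fraction=0.6, first_pass_loss=0.0)
mixed_route = internal_dose(dose * 0.7, 0.9, 0.95) + dermal

print(f"internal dose, diet-only assumption:   {oral_only:.4f} mg")
print(f"internal dose, mixed-route assumption: {mixed_route:.4f} mg")
```

With these invented numbers, the mixed-route assumption yields an internal dose several times higher than the diet-only one; a model built on the narrower assumption would correspondingly underestimate risk.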

Because of its reliance on whatever data are included, PBPK modeling can be deliberately manipulated to produce desired outcomes. Or, as University of Notre Dame biologist Kristin Shrader-Frechette, who specializes in human health risk assessment, says: “Models can offer a means of avoiding the conclusions derived from actual experiments.” In other words, PBPK models can be customized to provide results that work to industry’s advantage.

That’s not to say PBPK itself is to blame. “Let’s not throw the baby out completely with the bathwater,” says New York University associate professor of environmental medicine and health policy Leo Trasande. “However, when you have biology telling you there are basic flaws in the model, that’s a compelling reason that it’s time for a paradigm shift.”

A handy tool for industry

That PBPK studies could be used to make chemicals appear safer was as clear in the 1980s as it is now. In a 1988 paper touting the new technique, Wright-Patterson scientists explained how their modeling had prompted the EPA to stop its regulation process for a chemical of great concern to the military: methylene chloride.

Methylene chloride is widely used as a solvent and as an ingredient in making plastics, pharmaceuticals, pesticides and other industrial products. By the 1990s, the U.S. military would become the country’s second-largest user. Methylene chloride was—and remains—regulated under the Clean Air Act as a hazardous air pollutant because of its carcinogenic and neurotoxic effects.

Between 1985 and 1986, the National Institute for Occupational Safety and Health estimated that about 1 million workers a year were exposed to methylene chloride, and the EPA classified the compound as a “probable human carcinogen.” A number of unions, including United Auto Workers and United Steelworkers, also petitioned OSHA to limit on-the-job exposure to methylene chloride.

In 1986, OSHA began the process of setting occupational exposure limits. Stakeholders were invited to submit public comments.

Among the materials submitted was a PBPK study by Melvin Andersen, Harvey Clewell—both then working at Wright-Patterson—and several other scientists, including two employed by methylene chloride product manufacturer Dow Chemical. Published in 1987, this study concluded, “Conventional risk analyses greatly overestimate the risk in humans exposed to low concentrations [of methylene chloride].”

Later that year, the EPA revised its previous health assessment of methylene chloride, citing the Wright-Patterson study to conclude that the chemical was nine times less risky than previously estimated. The EPA “has halted its rulemaking on methylene chloride [based on our studies],” wrote Wright-Patterson scientists in 1988.

OSHA, too, considered the Wright-Patterson study in its methylene chloride assessment—and its rulemaking dragged on another 10 years before the agency finally limited exposure to the chemical.

The usefulness of PBPK modeling to industry did not escape the Wright-Patterson researchers. “The potential impact,” wrote Andersen, Clewell and their colleagues in 1988, “is far reaching and not limited to methylene chloride.” Using PBPK models to set exposure limits could help avoid setting “excessively conservative”—i.e., protective—limits that could lead to “unnecessary expensive controls” and place “constraints on important industrial processes.” In other words, PBPK models could be used to set less-stringent environmental and health standards, and save industry money.

So far, they’ve been proven right. The work done at Wright-Patterson set the stage for the next 30-plus years. Results obtained using PBPK modeling—especially in industry-funded research, often conducted by former Wright-Patterson scientists—have downplayed the risk and delayed the regulation of numerous widely used and commercially lucrative chemicals. These include formaldehyde, styrene, tricholorethylene, BPA and the pesticide chlorpyrifos. For many such chemicals, PBPK studies contradict what actual biological experiments conclude. Regulators often defer to the PBPK studies anyway.

A web of influence

At the time PBPK modeling was being developed, the chemical industry was struggling with its public image. The Bhopal, India, disaster—the methyl isocyanate release that killed and injured thousands—happened in 1984. The following year, a toxic gas release at a West Virginia Union Carbide plant sent about 135 people to hospitals.

In response to these incidents, new federal regulations required companies to account for the storage, use and release of hazardous chemicals. The minutes from a May 1988 Chemical Manufacturers Association (CMA) meeting show industry was feeling the pressure. Noting the federal scrutiny and the growing testing requirements, the CMA recommended that industry help “develop exposure data” and “explore innovative ways to limit required testing to that which is needed.”

Industry had already begun to do this by founding a number of research institutes such as the Chemical Industry Institute of Toxicology (CIIT), a nonprofit toxicology research institute (renamed the Hamner Institutes in an act of linguistic detoxification in 2007). This period also saw the rise of for-profit consulting firms like Environ (1982), Gradient (1985), ChemRisk (1985) and K.S. Crump and Company (1986), with which industry would collaborate advantageously in the following decades.

“Our goal was to do the science that would help the EPA and other regulatory bodies make the policies,” explained William Greenlee, Hamner president and CEO, in an interview for a business website. Indeed, over the past 30 years, Hamner and these consultancies have produced hundreds of PBPK studies, often with the support of chemical companies or trade groups. Overwhelmingly, these studies downplay or cast doubt on chemicals’ health effects—and delay regulation.

“I have seen how scientists from the Hamner Institutes can present information in a way that carefully shapes or controls a narrative,” says Laura Vandenberg, an assistant professor of environmental health sciences at University of Massachusetts Amherst. She explains that Hamner scientists often use narrow time windows or present data in a limited context, rejecting information that does not conform to their models. “These are the kinds of tactics used to manufacture doubt,” she says.

A close look at the authors of studies produced by these industry-linked research groups reveals a web of influence traceable to Wright-Patterson (see chart on following page). At least 10 researchers employed at or contracted by Wright-Patterson in the 1980s went on to careers in toxicology at CIIT/Hamner, for-profit consulting firms or the EPA. About half have held senior positions at Hamner, including the co-authors of many of the early Wright-Patterson PBPK studies: Melvin Andersen, now a chief scientific officer at Hamner, and Harvey Clewell, now a senior investigator at Hamner and principal scientist at the consulting firm ENVIRON. “I’m probably given credit as the person who brought PBPK into toxicology and risk assessment,” Andersen told In These Times.

A revolving door between these industry-affiliated groups and federal regulators was also set in motion. More than a dozen researchers have moved from the EPA to these for-profit consultancies; a similar number have gone in the other direction, ending up at the EPA or other federal agencies.

Further blurring the public-private line, CIIT/Hamner has received millions of dollars in both industry and taxpayer money. The group stated on its website in 2007 that $18 million of its $21.5 million annual operating budget came from the “chemical and pharmaceutical industry.” Information about its corporate funders is no longer detailed there, but Hamner has previously listed as clients and supporters the American Chemistry Council (formerly the CMA, and one of the most powerful lobbyists against chemical regulation), American Petroleum Institute, BASF, Bayer CropScience, Dow, ExxonMobil, Chevron and the Formaldehyde Council. At the same time, over the past 30 years, CIIT/Hamner has received nearly $160 million in grants and contracts from the EPA, DOD and Department of Health and Human Services. In sum, since the 1980s, these federal agencies have awarded hundreds of millions of dollars to industry-affiliated research institutes like Hamner.

But the federal reliance on industry-linked researchers extends further. Since 2000, the EPA has signed a number of cooperative research agreements with the ACC and CIIT/Hamner. All involve chemical toxicity research that includes PBPK modeling. And in 2014, Hamner outlined additional research it would conduct for the EPA’s next generation of chemical testing—the ToxCast and Tox21 programs. Over the past five years, Hamner has received funding for this same research from the ACC and Dow.

Meanwhile, the EPA regularly contracts with for-profit consultancies to perform risk assessments, assemble peer review panels and select the scientific literature used in chemical evaluations. This gives these private organizations considerable sway in the decision-making process, often with little transparency about ties to chemical manufacturers. The upshot: Experts selected to oversee chemical regulation often overrepresent the industry perspective.

These cozy relationships have not gone unnoticed; the EPA has been called to task by both its own Office of Inspector General and by the U.S. Government Accountability Office. “These arrangements have raised concerns that ACC or its members could potentially influence, or appear to influence, the scientific results that may be used to make future regulatory decisions,” wrote the GAO in a 2005 report.

Asked for comment by In These Times, the EPA said these arrangements do not present conflicts of interest.

Decades of deadly delay

PBPK studies have stalled the regulation of numerous chemicals. In each case, narrowly focused models developed by industry-supported research concluded that risks were lower than previously estimated or were not of concern at likely exposure levels.

Take, for example, methylene chloride, the subject of the 1987 paper Wright-Patterson scientists bragged had halted the EPA’s regulatory process. Despite the chemical being identified as “probably carcinogenic to humans” by the U.N. International Agency for Research on Cancer, a “reasonably anticipated” human carcinogen by the U.S. National Toxicology Program, and an “occupational carcinogen” by OSHA, the EPA has yet to limit its use. EPA researchers noted this year that the 1987 PBPK model by the Wright-Patterson scientists remains the basis for the agency’s risk assessment.

Today, methylene chloride remains in use—to produce electronics, pesticides, plastics and synthetic fabrics, and in paint and varnish strippers. The Consumer Product Safety Commission, OSHA and NIOSH have issued health warnings, and the FDA bars methylene chloride from cosmetics—but no U.S. agency has totally banned the chemical. The EPA estimates that some 230,000 workers are exposed directly each year. According to OSHA, between 2000 and 2012, at least 14 people died in the United States of asphyxiation or heart failure after using methylene chloride-containing products to refinish bathtubs. The Center for Public Integrity reports that methylene chloride exposure prompted more than 2,700 calls to U.S. poison control centers between 2008 and 2013.

Another telling example of industry-funded PBPK studies’ influence is formaldehyde. This chemical remains largely unrestricted in the United States, despite being a well-recognized respiratory and neurological toxicant linked to nasal cancer and leukemia, as well as to allergic reactions and skin irritation. The EPA’s toxicological review of formaldehyde, begun in 1990, remains incomplete, in no small part because of delays prompted by the introduction of studies—including PBPK models conducted by CIIT/Hamner—questioning its link to leukemia.

If that link is considered weak or uncertain, that means formaldehyde—or the companies that employ the sickened workers—won’t be held responsible for the disease. The chemical industry is well aware that “more people have leukemia … than have nasal tumors,” says recently retired NIEHS toxicologist James Huff.

Some of this CIIT/Hamner research was conducted between 2000 and 2005 with funding from an $18.75 million EPA grant. In 2010, Hamner received $5 million from Dow, a formaldehyde-product manufacturer, for toxicity testing, including PBPK modeling. The ACC, which opposes formaldehyde restriction, also supported this research.

Consequently, apart from a few state regulations and a pending EPA proposal to limit formaldehyde emissions from composite wood products like plywood, companies can still use the chemical—as in the FEMA trailers.

Cosmetics and personal-care products can also be sources of formaldehyde exposure. This made headlines in 2011 after hair salon workers using a smoothing product called Brazilian Blowout reported nausea, sore throats, rashes, chronic sinus infections, asthma-like symptoms, bloody noses, dizziness and other neurological effects. “You can’t see it … but you feel it in your eyes and it gives you a high,” salon owner and hair stylist Cortney Tanner tells In These Times. “They don’t teach this stuff in beauty school,” she says, and no one warns stylists about these products or even suggests using a respirator.

OSHA has issued a hazard alert for these products and the FDA has issued multiple warnings, most recently in September, but regulations prevent federal agencies from pulling the products from store shelves. So, for formaldehyde, as in the case of the paint strippers containing methylene chloride, exposures continue.

BPA rings alarm bells

The chemical currently at the center of the most heated debates about consumer exposure is BPA. The building block of polycarbonate plastics, BPA is used in countless products, including the resins that line food cans and coat the thermal receipt paper at cash registers and ATMs. While scientific evidence of adverse health effects from environmentally typical levels of BPA mounts, and many manufacturers and retailers have responded to public concern by changing their products, federal regulatory authorities still resist restricting the chemical’s use.

BPA does not produce immediate, acute effects, like those experienced by salon workers exposed to formaldehyde or machinists working with methylene chloride. But in laboratory tests on animals, BPA is a known endocrine disruptor. Structurally similar to natural hormones, endocrine disruptors can interfere with normal cellular processes and trigger abnormal biochemical responses. These can prompt numerous health problems, including cancer, infertility, and metabolic and neurological disorders. BPA has also been linked to increased risk of cardiovascular disease, diabetes and obesity.

To promote the idea that BPA is safe, the chemical industry routinely lobbies policymakers and “educates” consumers. What has not been widely discussed, however, is how industry has backed PBPK studies that marginalized research showing risks from environmentally typical levels of BPA. Many of these doubt-inducing studies have been conducted by researchers whose careers can be linked to the PBPK work done at Wright-Patterson. In published critiques, health effects researchers—among them Gail Prins and Wade Welshons—have detailed the many ways in which these PBPK models fail to accurately reflect BPA exposure.

PBPK and endocrine disruption

Over the past several decades, our evolving understanding of our bodies’ responses to chemicals has challenged previous toxicological assumptions— including those that are fed into PBPK models. This is particularly true of endocrine disruptors.

Cause-and-effect relationships between endocrine disruptors and health problems can be hard to pinpoint. We now know that early—even prenatal— exposure to endocrine disruptors can set the stage for adult disease. In addition, a pregnant woman’s exposures may affect not only her children but also her grandchildren. These transgenerational effects have been documented in animal experiments. The classic human evidence came from victims of DES, a drug prescribed in the 1940s, 1950s and 1960s to prevent miscarriages. Daughters of women who took the endocrine disruptor developed reproductive cancers, and preliminary research suggests their daughters may be at greater risk for cancer and other reproductive problems.

“The transgenerational work raises an incredible specter,” says Andrea Gore, who holds the Vacek Chair in Pharmacology at the University of Texas at Austin and edits the influential journal Endocrinology. “It’s not just what you’re exposed to now, it’s what your ancestors were exposed to.”

Complicating PBPK modeling further, hormone-mimicking chemicals, just like hormones, can have biological effects at concentrations as low as parts per trillion. In addition, environmental exposures most often occur as mixtures, rather than in isolation. And each individual may respond differently.

“PBPK doesn’t come close” to capturing the reality of endocrine disruption, the late developmental biologist Louis Guillette told In These Times, in part because modelers are “still asking questions about one chemical exposure with one route of exposure.” Even for health effects researchers, understanding of mixtures’ effects is in its infancy.

The debate over how endocrine disruption can be represented in PBPK models has intensified the unease between regulatory toxicologists and health effects researchers. That tension is particularly well-illustrated by a recent series of events that also reveal how some journal editors privilege the industry’s point of view.

A life-and-death debate

In February 2012 the World Health Organization (WHO) and the U.N. Environment Programme (UNEP) published a report intended to inform regulation worldwide. The authors were an international group of health effects researchers with long experience studying endocrine disruption.

“There is an increasing burden of disease across the globe in which [endocrine disruptors] are likely playing an important role, and future generations may also be affected,” said the report. These diseases, it continued, are being seen in humans and wildlife, and include male and female reproductive disorders, changes in the numbers of male and female babies born, thyroid and adrenal gland disorders, hormone-related cancers and neurodevelopmental diseases.

The backlash from toxicologists was immediate. Over the next few months—as the EU prepared to begin its regulatory decision-making on endocrine disruptors—the editors of 14 toxicology journals each published an identical commentary harshly criticizing the WHO/UNEP conclusions.

The commentary included a letter from more than 70 toxicologists urging the EU not to adopt the endocrine disruption framework. The letter said that the WHO/UNEP report could not be allowed to inform policy because its science is “contrary to all accumulated physiological understanding.”

This commentary was followed by further attacks. One critique, published in the journal Critical Reviews in Toxicology, was funded and vetted by the ACC.

These commentaries infuriated health effects researchers. Twenty endocrine journal editors, 28 associate editors and 56 other scientists—including several WHO/UNEP report authors—signed a statement in Endocrinology, saying in part:

The dismissive approach to endocrine disruption science put forth … is unfounded, as it is [not] based on the fundamental principles of how the endocrine system works and how chemicals can interfere with its normal function.

Endocrinology editor Andrea Gore tells In These Times that she and other health effects researchers don’t think the scientifically demonstrated dangers of endocrine disruptors are subject to debate. “There are fundamental differences between regulatory toxicologists and what I refer to as ‘people who understand the endocrine science.’ ”

The outcome of this debate and the structure of future regulatory toxicity testing in the United States and Europe are not yet clear. The EPA appears to be attempting to incorporate endocrine disruption into PBPK models, but many scientists are skeptical the process will produce reliable results, given the models’ limitations and the complexity of endocrine effects.

From science to activism

Although couched in complex language, these arguments are not academic; they have profound implications for public health. Disorders and diseases increasingly linked to exposure to endocrine disruptors—including metabolic, reproductive, developmental and neurological problems—are widespread and increasing. About 20 percent of U.S. adults show at least three of the five indicators of metabolic syndrome: obesity, diabetes, high blood pressure, high cholesterol and heart disease. Neurological problems, including behavioral and learning disabilities in children as well as Parkinson’s disease, are increasing rapidly. Fertility rates in both men and women are declining. Globally, the average sperm count has dropped 50 percent in the last 50 years.

Scientists typically shy away from activism, but many now believe it’s what’s needed to punch through the machinations and inertia regarding chemical regulation. Shanna Swan, Mount Sinai professor of preventive medicine, obstetrics, gynecology and reproductive medicine, notes that some of the biggest reductions in chemical exposures have happened in response to consumer pressure on both industry and policymakers. Or, as the University of California’s Bruce Blumberg says, “I think we need to take the fight to the people.”

The Endocrine Society stressed the urgency of addressing these public health impacts in a statement released September 28. Not surprisingly, industry disagreed, calling this science “unsupported” and “still-unproven.”

Meanwhile, PBPK studies continue to succeed in sowing doubt about the adverse health effects of endocrine disruptors. Their extremely narrow focus leads to narrow conclusions that often result in calls for more research before regulation. In regulatory decisions, “the assumption is that if we don’t know something, it won’t hurt us,” says University of Massachusetts Amherst professor of biology R. Thomas Zoeller. In other words, the burden of proof remains on health effects researchers to prove harm, not on industry to prove safety—and proving harm is difficult, especially when other scientists are seeding doubt.

But the clock is ticking. As Washington State University geneticist Pat Hunt told In These Times, “If we wait [to make regulatory decisions] for ‘proof’ in the form of compelling human data, it may be too late for us as a species.”

This investigation was supported by the Leonard C. Goodman Institute for Investigative Reporting and published originally in In These Times.

Earth Watch Report Banner photo FSPEarthWatchReport900x228Blogger_zps53ef6af0.jpg

 

Daily News

NEW YORK DAILY NEWS
Updated: Wednesday, November 25, 2015, 11:58 AM

“Kissing Bug” now in Florida and Georgia
WTEV – Jacksonville, FL
Don't let the kissing bugs bite.

Here’s another reason to stay in New York this holiday season — the “kissing bug” has now spread to 28 states.

Texas is the latest to report an outbreak of infections from the Latin American triatomine bug after the pest had been spotted in other southern and western states, including Georgia, Alabama and California, according to the Centers for Disease Control and Prevention.

The creepy crawler resembling a cockroach gets its colorful nickname because it likes to bite around the lips and eyes of people when they are asleep. More than half of the bugs carry a parasite that can cause Chagas disease in humans, dogs and other mammals.

The good news? To actually pass on the disease, the bug not only needs to bite you, but then defecate into the wound. The bad news? If the infection is left untreated, up to 30% of those infected will develop chronic problems such as difficulty breathing and heart and intestinal complications, which in extreme cases can be fatal.

There have been an estimated eight million cases in Mexico, Central America and South America, largely because of poorly constructed rural homes, according to the CDC.

To prevent an outbreak, the CDC recommends:

Sealing cracks and gaps around windows, walls, roofs, and doors.

Removing wood, brush, and rock piles near your house.

 

Read More Here

Earth Watch Report Banner photo FSPEarthWatchReport900x228Blogger_zps53ef6af0.jpg

……………………………………………………….

 

Floating cars, people in boats: Havoc as Qatar, Saudi Arabia ravaged by heavy rains (PHOTOS, VIDEOS)

© carolyn_redaelli
Cars floating in rivers that were once streets, water gushing through ceilings and people sailing to work on boats – that’s the current picture in Qatar and Saudi Arabia, both desert countries that are normally dry and sunny year-round.

Qatar’s capital Doha was apparently unprepared for the deluge and flooding that damaged many buildings in the city. The area near the capital’s Hamad International Airport was hammered with around 66mm of rain in just a few days, according to the Qatar Meteorology Department. For the record, Doha has 75mm of rain on average a year.

 Many buildings at the multibillion-dollar airport failed to hold up to the torrent: pictures and videos on social media show water flooding into the passenger terminal.

Cascades of water fell from a ceiling inside Ezdan Mall in Doha, Doha News reported.

“Inclement weather” prompted the closure of schools across the country as well as the US Embassy in Qatar on Wednesday.

 

Read More Here

Earth Watch Report Banner photo FSPEarthWatchReport900x228Blogger_zps53ef6af0.jpg

…………………………………………………………

 

Scientists get first glimpse of black hole eating star, ejecting high-speed flare

Date:
November 27, 2015
Source:
Johns Hopkins University
Summary:
An international team of astrophysicists has for the first time witnessed a star being swallowed by a black hole and ejecting a flare of matter moving at nearly the speed of light.

This artist’s impression shows a black hole consuming a star that has been torn apart by the black hole’s strong gravity. As a result of this massive “meal,” the black hole begins to launch a powerful jet that we can detect with radio telescopes.
Credit: NASA/Goddard Space Flight Center/Swift

An international team of astrophysicists led by a Johns Hopkins University scientist has for the first time witnessed a star being swallowed by a black hole and ejecting a flare of matter moving at nearly the speed of light.

The finding reported in the journal Science tracks the star — about the size of our sun — as it shifts from its customary path, slips into the gravitational pull of a supermassive black hole and is sucked in, said Sjoert van Velzen, a Hubble fellow at Johns Hopkins.

“These events are extremely rare,” van Velzen said. “It’s the first time we see everything from the stellar destruction followed by the launch of a conical outflow, also called a jet, and we watched it unfold over several months.”

Black holes are areas of space so dense that irresistible gravitational force stops the escape of matter, gas and even light, rendering them invisible and creating the effect of a void in the fabric of space. Astrophysicists had predicted that when a black hole is force-fed a large amount of gas, in this case a whole star, then a fast-moving jet of plasma — elementary particles in a magnetic field — can escape from near the black hole rim, or “event horizon.” This study suggests this prediction was correct, the scientists said.

“Previous efforts to find evidence for these jets, including my own, were late to the game,” said van Velzen, who led the analysis and coordinated the efforts of 13 other scientists in the United States, the Netherlands, Great Britain and Australia.

Supermassive black holes, the largest of black holes, are believed to exist at the center of most massive galaxies. This particular one lies at the lighter end of the supermassive black hole spectrum, at only about a million times the mass of our sun, but still packing the force to gobble a star.

The first observation of the star being destroyed was made by a team at the Ohio State University, using an optical telescope in Hawaii. That team announced its discovery on Twitter in early December 2014.

After reading about the event, van Velzen contacted an astrophysics team led by Rob Fender at the University of Oxford in Great Britain. That group used radio telescopes to follow up as fast as possible. They were just in time to catch the action.

By the time it was done, the international team had data from satellites and ground-based telescopes that gathered X-ray, radio and optical signals, providing a stunning “multi-wavelength” portrait of this event.

It helped that the galaxy in question is closer to Earth than those studied previously in hopes of tracking a jet emerging after the destruction of a star. This galaxy is about 300 million light years away, while the others were at least three times farther away. One light year is 5.88 trillion miles.
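The distances quoted above can be turned into absolute figures with the conversion factor the article gives. This is a minimal illustrative calculation, not part of the original story; the constant is the article's own rounded value of 5.88 trillion miles per light year.

```python
# Rough distance arithmetic using the figures quoted in the article.
MILES_PER_LIGHT_YEAR = 5.88e12   # article's rounded conversion factor
distance_ly = 300e6              # ~300 million light years to the host galaxy

distance_miles = distance_ly * MILES_PER_LIGHT_YEAR
print(f"Distance to host galaxy: {distance_miles:.3e} miles")
# The galaxies studied previously were at least three times farther away:
print(f"Previous candidates: at least {3 * distance_miles:.3e} miles")
```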

The first step for the international team was to rule out the possibility that the light was from a pre-existing expansive swirling mass called an “accretion disk” that forms when a black hole is sucking in matter from space. That helped to confirm that the sudden increase of light from the galaxy was due to a newly trapped star.

“The destruction of a star by a black hole is beautifully complicated, and far from understood,” van Velzen said. “From our observations, we learn the streams of stellar debris can organize and make a jet rather quickly, which is valuable input for constructing a complete theory of these events.”

Van Velzen last year completed his doctoral dissertation at Radboud University in the Netherlands, where he studied jets from supermassive black holes. In the last line of the dissertation, he expressed his hope to discover these events within four years. In the end, it took only a few months after his dissertation defense.

Van Velzen and his team were not the only ones to hunt for radio signals from this particular unlucky star. A group at Harvard observed the same source with radio telescopes in New Mexico and announced its results online. Both teams presented results at a workshop in Jerusalem in early November. It was the first time the two competing teams had met face to face.

“The meeting was an intense, yet very productive exchange of ideas about this source,” van Velzen said. “We still get along very well; I actually went for a long hike near the Dead Sea with the leader of the competing group.”


Story Source:

The above post is reprinted from materials provided by Johns Hopkins University. Note: Materials may be edited for content and length.


Journal Reference:

  1. S. van Velzen, G. E. Anderson, N. C. Stone, M. Fraser, T. Wevers, B. D. Metzger, P. G. Jonker, A. J. van der Horst, T. D. Staley, A. J. Mendez, J. C. A. Miller-Jones, S. T. Hodgkin, H. C. Campbell, R. P. Fender. A radio jet from the optical and X-ray bright stellar tidal disruption flare ASASSN-14li. Science, 2015; DOI: 10.1126/science.aad1182

Earth Watch Report Banner photo FSPEarthWatchReport900x228Blogger_zps53ef6af0.jpg

…………………………………………………………..
Maldives Independent

Addu City suffers worst floods in 40 years

Photos shared by the MRC show a foot of water inside some households.

Addu City floods


November 25 14:56 2015

Southern Addu City has suffered the worst storm damage in 40 years after 12 continuous hours of torrential rain left streets inundated and flooded some 200 households.

“This is the worst flooding I’ve seen in decades. The water is knee-deep in most areas, and a majority of houses are under a foot of water,” said Abdulla Thoyyib, the deputy mayor.

The Feydhoo and Maradhoo-Feydhoo wards suffered the most damage. According to the Maldives Red Crescent, some 32 houses in Feydhoo and 11 houses in Maradhoo-Feydhoo suffered major damage. A majority of household appliances were destroyed, a spokesperson said.

Residents are now worried about water contamination as sewers are full and overflowing. The city, home to some 20,000 people and the country’s second most populous region, has run out of chlorine, according to Thoyyib.

The Maldives National Defence Forces have set up water pumps in the three worst affected wards. Sand bags have been piled up to stop water from entering 17 houses in the Feydhoo ward.


The rain, which started at 3pm on Tuesday, continued for 12 hours. The department of meteorology recorded 228mm of rain, the heaviest rainfall in the Maldives in 40 years.

“This kind of rain is not common and it has damaged houses that are normally safe,” Thoyyib said.


Read More Here