Category Archives: Health/Environment

Record Earthquake Activity in Oklahoma: Experts Blame Fracking

Series of small earthquakes rock Oklahoma in record seismic activity

By Carey Gillam
Reuters: April 5, 2014

Earthquakes rattled residents in Oklahoma on Saturday, the latest in a series that have put the state on track for record quake activity this year, which some seismologists say may be tied to oil and gas exploration.

One earthquake recorded at 3.8 magnitude by the U.S. Geological Survey rocked houses in several communities around central Oklahoma at 7:42 a.m. local time. Another about two hours earlier in the same part of the state, north of Oklahoma City, was recorded at 2.9 magnitude, USGS said.

Those two were preceded by two more, at 2.6 magnitude, and 2.5 magnitude, that also rolled the landscape in central Oklahoma early Saturday morning. A 3.0 magnitude tremor struck late Friday night in that area as well, following a 3.4 magnitude hit Friday afternoon.

Austin Holland, a seismologist with the Oklahoma Geological Survey who tracks earthquake activity for the USGS, said the earthquake activity in the state is soaring.

[…]

“We have already crushed last year’s record for number of earthquakes,” Holland said.

Most earthquakes occur naturally. But scientists have long linked some small earthquakes to oil and gas work underground, which can alter pressure points and cause shifts in the earth.

Oil and gas exploration has increased in recent years across the country, spurred by U.S. efforts for energy independence. Modern hydraulic fracturing, or fracking, is one particularly controversial technique.

For bigger quakes, so far this year the state has recorded 106 at 3.0 magnitude and above, according to Holland. For all of last year the state had 109 at 3.0 and above.

In November 2011, Oklahoma suffered a 5.6 magnitude quake that damaged more than a dozen homes and several businesses.

Wastewater disposal related to fracking is suspected by many scientists to contribute to the earthquake activity. Millions of gallons of wastewater are typically trucked from a fracking site to wells where the water is injected thousands of feet underground into porous rock layers. That work, if done near a fault, can trigger larger quakes, according to several recent scientific studies.

Oklahoma recorded 278 earthquakes from 2008 through 2013 that have registered on the Richter scale at a magnitude of 3.0 or greater, a level that can shake objects inside a home.

Before that, from 1975 to 2008, the state recorded fewer than six earthquakes a year on average.

(Read the full article at Yahoo)

—-
Alternative Free Press – fair use –

23 Flawed Nuclear Reactors in the USA: Fukushima, General Electric & the Obama Administration

AlternativeFreePress.com

General Electric’s Mark 1 system has had known unacceptable safety risks for decades, and the nuclear industry has incredibly limited liability.

In 1972, Stephen H. Hanauer, then a safety official with the Atomic Energy Commission, recommended that the Mark 1 system be discontinued because it presented unacceptable safety risks. Among the concerns cited was the smaller containment design, which was more susceptible to explosion and rupture from a buildup in hydrogen — a situation that may have unfolded at the Fukushima Daiichi plant. Later that same year, Joseph Hendrie, who would later become chairman of the Nuclear Regulatory Commission, a successor agency to the atomic commission, said the idea of a ban on such systems was attractive. But the technology had been so widely accepted by the industry and regulatory officials, he said, that “reversal of this hallowed policy, particularly at this time, could well be the end of nuclear power.”

NY Times

On February 2, 1976, Gregory C. Minor, Richard B. Hubbard, and Dale G. Bridenbaugh “blew the whistle” on safety problems at nuclear power plants. The three engineers gained the attention of journalists, and their disclosures about the threats of nuclear power had a significant impact. They timed their statements to coincide with their resignations from responsible positions in General Electric’s nuclear energy division, and later established themselves as consultants on the nuclear power industry for state governments, federal agencies, and overseas governments… Bridenbaugh described design flaws of General Electric’s Mark 1 reactors, which account for five of the six reactors at the Fukushima 1 power plant. Bridenbaugh claimed that the design “did not take into account the dynamic loads that could be experienced with a loss of coolant” and that, despite efforts to retrofit the reactors, “the Mark 1 is still a little more susceptible to an accident that would result in a loss of containment.”

Wikipedia

A year after the disaster, Tepco was taken over by the Japanese government because it couldn’t afford the costs to get the damaged reactors under control. By June of 2012, Tepco had received nearly 50 billion dollars from the government.

The six reactors were designed by the U.S. company General Electric (GE). GE supplied the actual reactors for units one, two and six, while two Japanese companies built the others: Toshiba provided units three and five, and Hitachi unit four. These companies, as well as other suppliers, are exempted from liability or costs under Japanese law.

Many of them, including GE, Toshiba and Hitachi, are actually making money on the disaster by being involved in the decontamination and decommissioning, according to a report by Greenpeace International.

“The nuclear industry and governments have designed a nuclear liability system that protects the industry, and forces people to pick up the bill for its mistakes and disasters,” says the report, “Fukushima Fallout”.

“If nuclear power is as safe as the industry always claims, then why do they insist on liability limits and exemptions?” asked Shawn-Patrick Stensil, a nuclear analyst with Greenpeace Canada.

Nuclear plant owner/operators in many countries have liability caps on how much they would be forced to pay in case of an accident. In Canada, this liability cap is only 75 million dollars. In the United Kingdom, it is 220 million dollars. In the U.S., each reactor owner puts around 100 million dollars into a no-fault insurance pool. This pool is worth about 10 billion dollars.

“Suppliers are indemnified even if they are negligent,” Stensil told IPS.

IPS

NBC News has reported that there are 23 nuclear plants in the United States that use the GE Mark 1 BWR. Yes, 23. There are 23 nuclear plants in the United States where the used fuel rods are suspended, in a pond, 100 feet above the ground. Additionally, 12 more reactors in the USA have GE’s later Mark II or Mark III containment system.

”Jeffrey Immelt is the head of GE. He is also the head of the United States Economic Advisory Board. He was invited to join the board personally by President Obama in 2009 and took over as head when Paul Volcker stepped down in February 2011, just a month before the earthquake and tsunami that devastated Fukushima.

Paul Volcker was often seen as being at odds with the administration, and many of his ideas were not embraced by the government. The appointment of Immelt, a self-described Republican, was seen as a move to give Obama a leg up when dealing with the Republican majority in the House.

There have been calls from many organizations for GE to be held accountable for the design faults in the reactors that powered the Fukushima plant. The fact that they had been known for so long does seem to indicate that the company ignored and over-ruled advice from nuclear experts.

… Any admission that radiation has spread across the Pacific Ocean and contaminated American soil is an admission that the technology was flawed, and that same flawed technology is being used in the United States. The government does not want anyone looking closer at the situation. They don’t want people poking around asking questions about why the radiation got out in the first place…it’s too close to home.

Better to say that the radiation is within safe levels, and then if such a disaster happens here they can mourn those in the immediate fallout zone and maintain that the rest of the country is okay, just as it was after Fukushima.

The fact that the CEO of GE works for Obama just highlights the facts. There is no way that Immelt doesn’t know about all the warnings his company was given about the design flaws of the Mark 1; and if he knows, the government knows.”

The Daily Sheeple

Can we trust the Obama Administration?
Can we trust the 23 Mark 1 reactors in the United States?

Compiled by Alternative Free Press
Creative Commons License
23 Flawed Nuclear Reactors in the USA: Fukushima, General Electric & the Obama Administration by AlternativeFreePress.com is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

—-
Alternative Free Press – fair use –

Organic Standards Are Being Watered Down

Corporate Clout Chips Away at Organic Standards

By Alexis Baden-Mayer and Ronnie Cummins
Organic Consumers Association: April 2, 2014

The Organic Consumers Association has a long history of defending the integrity of organic standards.

Last September, the U.S. Department of Agriculture (USDA), under pressure from corporate interests represented by the Organic Trade Association, made our job harder.

They also made it more important than ever for consumers to do their homework, even when buying USDA certified organic products.

Without any input from the public, the USDA changed the way the National Organic Standards Board (NOSB) decides which non-organic materials are allowed in certified organic. The change all but guarantees that when the NOSB meets every six months, the list of non-organic and synthetic materials allowed in organic will get longer and longer.

The USDA’s new rule plays to the cabal of the self-appointed organic elite who want to degrade organic standards and undermine organic integrity. For consumers, farmers, co-ops and businesses committed to high organic standards, the USDA’s latest industry-friendly move is a clarion call to fight back against the corporate-led, government-sanctioned attack on organic standards.

Old rules, new rules

The NOSB, a federal advisory committee appointed by the Secretary of Agriculture, decides what is allowed on the National List of non-organic materials approved for use in organic. Prior to last September’s new ruling, each non-organic material on the list had to be reviewed every five years, using what’s called the “sunset process.” Under this process, five years after a non-organic material was added to the National List, it would be automatically removed, unless a two-thirds majority of the NOSB board voted to keep the material on the list.

The intent behind this process was clear. Maximize flexibility for the organic industry, minimize the use of non-organic materials in organic, and encourage continuous improvement of the organic standard.

But last September, the USDA reversed this process. Now, instead of automatically coming up for review after five years, each non-organic material will automatically—and indefinitely—stay on the National List unless a two-thirds majority of the NOSB board votes to remove it. And that’s not likely to happen, given that the 15-member board of the NOSB is stacked with industry reps who consistently vote with industry against consumers.

Labels help, but . . .

Should you just give up on the organic label? Absolutely not. With all its flaws, USDA Organic is still the only credible and comprehensive certification program in the natural foods marketplace. The new rules mean you’ll have to scrutinize labels more carefully than ever. But even then, you won’t get the whole picture when it comes to non-organic substances in organics.

Certain non-organic or synthetic materials can be used in up to 5 percent of a “USDA Organic” product, and in up to 30 percent of a “Made with Organic” product.

Under the new rules, the list of non-organic or synthetic ingredients allowed at those percentages will likely grow. But here’s something most consumers don’t realize: The National List isn’t just about synthetic and non-organic ingredients that are allowed in food. The list also governs every non-organic material or synthetic material used in the production of organic food, from farm to fork. (Here’s the complete list). As that list, too, grows, organic standards will continue to erode. And as a consumer, you’ll have a difficult time identifying those materials as they won’t be listed on the product’s label.

What non-organic materials should you look for on food labels? Here are a few of the worst ones:

Carrageenan, an additive linked to gastrointestinal inflammation and higher rates of colon cancer. More here: http://salsa3.salsalabs.com/o/50865/p/dia/action3/common/…

Synthetic nutrients, including DHA and ARA which have been linked to severe gastrointestinal distress, prolonged periods of vomiting and painful bloating.

Sausage casings made from processed intestines

What won’t you find on food labels, even though the NOSB has approved it for use in organic? And what, because of the new rule, is the board unlikely to revisit?

Synthetic methionine: In 2001, the NOSB approved the synthetic version of methionine, a sulfur-based essential amino acid, for use in livestock feed—but only, as the Rodale Institute points out, after organic poultry farmers realized the substance was already in the feed they were using. As long as synthetic methionine remains on the list of approved substances in organics, organic farmers can continue to keep chickens confined. Why? Because, again according to the Rodale Institute, synthetic methionine keeps confined chickens healthy. Take the synthetic out of the feed, and you have to allow the chickens access to outdoor pastures in order to maintain their health. But wouldn’t that be a good thing?

Genetically engineered vaccines: Genetically modified organisms, and the genetic engineering process itself, are not allowed in certified organic products. But there’s one exception. Genetically engineered vaccines can be used in organic livestock production, on the condition that the vaccines are included on the National List. So which genetically engineered vaccines did the NOSB approve for the National List? All of them. Instead of reviewing the safety of each vaccine individually, as the law clearly intends, the NOSB included all genetically engineered vaccines on the list, as a single group of “synthetic substances.” Now that the sunset process has been weakened, what are the chances of getting genetically engineered vaccines off of the list of approved substances? Next to none.

Antibiotics: Under organic standards, antibiotics can’t be used in animals. But there’s a little-known loophole, applicable only to poultry, that says the standard doesn’t take effect until “the second day of life.” So as it turns out, the eggs that hatch into organic chickens are routinely injected with an antibiotic called gentamicin, which is also used to treat bacterial skin infections in humans. Because of the loophole, the use of gentamicin in organic poultry production has never been subject to the NOSB’s sunset process. The process does, however, govern the use of antibiotics sprayed on apple and pear trees to control something called fire blight. Under the old rules, the NOSB voted to end the use of those antibiotics—tetracycline and streptomycin—as of October 21, 2014. But industry is fighting that ruling. If it succeeds, and the NOSB ever re-lists those antibiotics, the changes to the sunset process will make it more difficult than ever to get tetracycline and streptomycin off of the National List of approved substances.

Mutagenesis: There’s another loophole in the “no genetic engineering in organics” standard. It’s called mutagenesis. In 2011, the NOSB approved synthetic DHA and ARA for use in organics. As mentioned above, these synthetic nutrients, used in baby formula, are linked to side effects. But what you won’t learn from reading the labels on baby formula, or any other product containing DHA or ARA, is that these synthetic nutrients are derived from mutated microorganisms, created through a process called mutagenesis. We believe mutagenesis is a form of genetic engineering, and others (see http://www.gmo-compass.org/eng/search/) support our position. But when Martek Biosciences Corp., the manufacturer of synthetic DHA, argued that mutagenesis should be allowed because the process is nothing more than a form of classical seed breeding, the NOSB sided with the company. So while consumers can see DHA and ARA on product labels, few will know that they are produced using a technique that has dangers similar to genetic engineering.

Defending organic standards

The NOSB meets again April 29-May 2, 2014. For the first time, it will be operating under the new rule. The USDA didn’t give the public an opportunity to comment on its change to the sunset process, but that doesn’t mean the agency is immune to public outcry. Starting with President Obama and USDA Secretary Vilsack, we need to press USDA leadership to reverse this disastrous new rule.

Please sign and share our petition: here

(originally published at The Organic Consumers Association)

—-
Alternative Free Press – fair use –

EPA tested deadly pollutants on humans

Report: EPA tested deadly pollutants on humans to push Obama admin’s agenda

By Michael Bastasch
The Daily Caller: April 2, 2014

The Environmental Protection Agency has been conducting dangerous experiments on humans over the past few years in order to justify more onerous clean air regulations.

The agency conducted tests on people with health issues and the elderly, exposing them to high levels of potentially lethal pollutants, without disclosing the risks of cancer and death, according to a newly released government report.

These experiments exposed people, including those with asthma and heart problems, to dangerously high levels of toxic pollutants, including diesel fumes, reads an EPA inspector general report obtained by The Daily Caller News Foundation. The EPA also exposed people with health issues to levels of pollutants up to 50 times greater than the agency says is safe for humans.

The EPA conducted five experiments in 2010 and 2011 to look at the health effects of particulate matter, or PM, and diesel exhaust on humans. The IG’s report found that the EPA did get consent forms from 81 people in five studies. But the IG also found that “exposure risks were not always consistently represented.”

“Further, the EPA did not include information on long-term cancer risks in its diesel exhaust studies’ consent forms,” the IG’s report noted. “An EPA manager considered these long-term risks minimal for short-term study exposures” but “human subjects were not informed of this risk in the consent form.”

According to the IG’s report, “only one of five studies’ consent forms provided the subject with information on the upper range of the pollutant” they would be exposed to, but even more alarming is that only “two of five alerted study subjects to the risk of death for older individuals with cardiovascular disease.”

Three of the studies exposed people to high levels of PM and two of the studies exposed people to high levels of diesel exhaust and ozone. Diesel exhaust contains 40 toxic air contaminants, including 19 that are known carcinogens, and PM. The EPA has publicly warned of the dangers of PM, but seemed to downplay them in its scientific studies on humans.

“This lack of warning about PM,” the IG’s report notes, “is also different from the EPA’s public image about PM.”

The EPA has been operating under the assumption that PM is deadly for years now. The IG’s report points to a 2003 EPA document that says short-term exposure to PM can result in heart attacks and arrhythmias for people with heart disease — and long-term exposure can result in reduced lung function and even death. A 2006 review by the EPA presents even further links between short-term PM exposure and “mortality and morbidity.”

“Particulate matter causes premature death. It doesn’t make you sick. It’s directly causal to dying sooner than you should,” former EPA administrator Lisa Jackson told Congress on Sept. 22, 2011.

“If we could reduce particulate matter to healthy levels it would have the same impact as finding a cure for cancer in our country,” Jackson added.

PM is a “mixture of harmful solid and liquid particles” that the EPA regulates. PM that is 2.5 microns or less is known as PM2.5, which is about “1/30th the thickness of a human hair.” These small particles can get into people’s respiratory system and can harm human health and even lead to death after just short-term exposure.

The EPA set PM2.5 primary standards at 15 micrograms per cubic meter of air on an annual average basis, but the agency exposed test subjects to PM levels of 600 micrograms per cubic meter — 40 times what the EPA sets as an acceptable outdoor air standard.

But in five of the studies, people were subjected to levels higher than what they signed on for. The EPA IG found that one person was hit with “pollutant concentrations that reached 751 [micrograms per cubic meter], which exceeded the IRB-approved concentration target of 600 [micrograms per cubic meter].”
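As a rough arithmetic check, the exposure figures in this article can be compared against the EPA's own annual PM2.5 standard. The numbers below are taken only from the figures quoted in the article; this is a minimal sketch, not an analysis of the underlying report:

```python
# Compare the exposure levels quoted in the article against the
# EPA annual PM2.5 standard, also as quoted in the article.
STANDARD = 15.0   # EPA annual PM2.5 standard, micrograms per cubic meter
TARGET = 600.0    # IRB-approved exposure target used in the studies
PEAK = 751.0      # highest recorded exposure for one test subject

print(TARGET / STANDARD)            # 40.0 -- the "40 times" figure
print(round(PEAK / STANDARD, 1))    # 50.1 -- roughly the "50 times" cited earlier
```

The two ratios show why the article cites both "40 times" (the approved target) and "up to 50 times" (the recorded peak) relative to the same 15 µg/m³ standard.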

(Read the full article at The Daily Caller)

—-
Alternative Free Press

Four Years After Gulf Oil Spill, BP Is Recovering Faster Than Environment

Miyoko Sakashita
Huffington Post: March 26, 2014

Nearly four years after the BP Deepwater Horizon explosion dumped more than 200 million gallons of crude oil into the Gulf of Mexico, the slate has been largely cleared for BP — the EPA ban on federal contracts has been lifted, and the company is free once again to bid on federal oil and gas leases.

But as a new study published this week makes clear, we’re only beginning to understand the spill’s devastating long-term implications for the region’s sea life.

Released on the 25th anniversary of Alaska’s Exxon Valdez oil spill, the National Oceanic and Atmospheric Administration-led study found that young bluefin tuna and amberjack exposed to oil samples collected during the Gulf skimming operations after the spill are showing a range of disconcerting abnormalities.

The exposed fish suffered from finfold deformities, dramatic reductions in eye growth and, most concerning of all, heart defects likely to limit the open water food-catching abilities key to their survival.

The study’s findings are significant, but hardly surprising. There’s no doubt that for years to come we’ll be learning the spill’s true cost to wildlife in one of our world’s most important spawning grounds for imperiled bluefin tuna.

Four years after the spill, officials have no idea how many fish were killed or the extent of the long-term damage to the Gulf. But with only 25 percent of the spilled oil recovered and nearly 2 million gallons of toxic oil dispersants sprayed into the Gulf’s waters, we know the toll on wildlife will be measured in decades, not years.

An analysis projecting the true wildlife toll based on documented strandings suggests the spill likely harmed more than 80,000 birds, 25,000 marine mammals and 6,000 sea turtles, as well as doing untold damage to marine invertebrates such as corals, lobsters, crabs, oysters, clams, zooplankton and starfish.

Despite the ongoing carnage, little has been done to tighten oversight necessary to prevent similar spills in the future.

The name of the agency overseeing offshore drilling changed, but little else. Environmental review is still waived for many drilling projects in the Gulf. And though there are already nearly 4,000 offshore oil and gas operations in the Gulf, the expansion of risky deepwater wells continues.

(Read the full article at Huffington Post)

—-
Alternative Free Press – fair use –

School Science Project Reveals High Levels Of Fukushima Nuclear Radiation in Grocery Store Seafood

Michael Snyder
The Truth: March 27, 2014

A Canadian high school student named Bronwyn Delacruz never imagined that her school science project would make headlines all over the world. But that is precisely what has happened. Using a $600 Geiger counter purchased by her father, Delacruz measured seafood bought at local grocery stores for radioactive contamination.

What she discovered was absolutely stunning. Much of the seafood, particularly the products that were made in China, tested very high for radiation. So is this being caused by nuclear radiation from Fukushima? Is the seafood that we are eating going to give us cancer and other diseases? The American people deserve the truth, but as you will see below, the U.S. and Canadian governments are not even testing imported seafood for radiation. To say that this is deeply troubling would be a massive understatement.

In fact, what prompted Bronwyn Delacruz to conduct her science project was the fact that the Canadian government stopped testing imported seafood for radiation in 2012…

Alberta high-school student Bronwyn Delacruz loves sushi, but became concerned last summer after learning how little food inspection actually takes place on some of its key ingredients.

The Grade 10 student from Grande Prairie said she was shocked to discover that, in the wake of the 2011 Fukushima nuclear disaster in Japan, the Canadian Food Inspection Agency (CFIA) stopped testing imported foods for radiation in 2012.

And what should be a major red flag for authorities is the fact that the seafood with the highest radiation is coming from China…

Armed with a $600 Geiger counter bought by her dad, Delacruz studied a variety of seafoods – particularly seaweeds – as part of an award-winning science project that she will take to a national fair next month.

“Some of the kelp that I found was higher than what the International Atomic Energy Agency sets as radioactive contamination, which is 1,450 counts over a 10-minute period,” she said. “Some of my samples came up as 1,700 or 1,800.”

Delacruz said the samples that “lit up” the most were products from China that she bought in local grocery stores.
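For context, the readings Delacruz reports can be compared directly against the IAEA contamination threshold she cites. This is a minimal sketch using only the numbers quoted above (1,450 counts per 10-minute period as the threshold, and her reported samples of 1,700 and 1,800):

```python
# Compare the reported Geiger-counter readings against the
# contamination threshold quoted in the article.
THRESHOLD = 1450          # counts per 10 minutes (IAEA figure as quoted)
samples = [1700, 1800]    # readings Delacruz reports

for counts in samples:
    excess = counts - THRESHOLD
    pct = 100 * excess / THRESHOLD
    print(f"{counts} counts: {excess} over threshold ({pct:.0f}% above)")
```

On those figures, her samples ran roughly 17% and 24% above the stated threshold.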

It is inexcusable that the Canadian government is not testing this seafood. It isn’t as if they don’t know that it is radioactive. Back in 2012, the Vancouver Sun reported that cesium-137 was being found in a very high percentage of the fish that Japan was selling to Canada…

• 73 percent of the mackerel

• 91 percent of the halibut

• 92 percent of the sardines

• 93 percent of the tuna and eel

• 94 percent of the cod and anchovies

• 100 percent of the carp, seaweed, shark and monkfish

So why was radiation testing for seafood shut down in Canada in 2012?

Someone out there needs to answer some very hard questions.

Meanwhile, PBS reporter Miles O’Brien has pointed out the extreme negligence of the U.S. government when it comes to testing seafood for Fukushima radiation. The following comes from a recent EcoWatch article…

O’Brien also introduces us to scientists from the Woods Hole Oceanographic Institute who have been testing waters around the reactors—as well as around the Pacific Rim—to confirm the levels of Fukushima fallout, especially of cesium.

These scientists are dedicated and competent. But they are also being forced to do this investigation on their own, raising small amounts of money from independent sources. They were, explains lead scientist Ken Buesseler, turned down for even minimal federal support by five agencies key to our radiation protection. Thus, despite a deep and widespread demand for this information, no federal agency is conducting comprehensive, on-the-ground analyses of how much Fukushima radiation has made its way into our air and oceans.

In fact, very soon after Fukushima began to blow, President Obama assured the world that radiation coming to the U.S. would be minuscule and harmless. He had no scientific proof that this would be the case. And as O’Brien’s eight-minute piece shows all too clearly, the “see no evil, pay no damages” ethos is at work here. The government is doing no monitoring of radiation levels in fish, and information on contamination of the ocean is almost entirely generated by underfunded researchers like Buesseler.

It is the job of the authorities to keep us safe, and the Fukushima nuclear disaster was the worst nuclear disaster in human history.

So why aren’t they doing testing?

Why aren’t they checking to make sure that this radiation is not getting into our food chain?

(Read the full article at The Truth)

—-
AlternativeFreePress.com – fair use –

Ohio & New York 2014 Mumps Outbreaks Only Infect Vaccinated Population

AlternativeFreePress.com

The United States has already seen more than one outbreak of the mumps in 2014, but you can’t blame people who choose not to vaccinate. Outbreaks in both New York and Ohio have occurred on campuses with strict vaccine mandates.

At Fordham University in New York City all students are required to be vaccinated, including with the measles, mumps, and rubella (MMR) vaccine, but as of February 21st, 13 cases of the mumps had been reported, with 100% of those infected having already been vaccinated.

In Ohio, as of March 24th there were 63 reported cases & 97% of those infected had been vaccinated.

Officials acknowledged that the vaccine is only 80-90% effective.

Dr Tetyana Obukhanych is an immunologist who earned her PhD in Immunology at the Rockefeller University in New York and did postdoctoral training at Harvard Medical School in Boston and at Stanford University in California. In an interview with Catherine Frompovich, Dr Obukhanych explains why these outbreaks are occurring among the vaccinated population:

I think this is happening because vaccination does not engage the genuine mechanism of immunity. Vaccination typically engages the immune response—that is, everything that immunologists would theoretically “want” to see being engaged in the immune system. But apparently this is not enough to confer robust protection that matches natural immunity. Our knowledge of the immune system is far from being complete.

Dr Obukhanych describes natural immunity as “in a way, a tautological expression because immunity can only be acquired naturally at this point, only through the exposure to an infected individual, although occasionally such exposure would go asymptomatic while still establishing immunity. Nevertheless, because there is a common misconception that vaccines also confer immunity, it is sometimes necessary to use a qualifier ‘natural’ when referring to immunity, to distinguish it from vaccine-based protection.”

In the following video Dr Obukhanych explains how protective serum titers drop very quickly after the second MMR dose, meaning some vaccinated people do not receive any lasting protection from the MMR vaccine.

Here is a screenshot of the chart in the video highlighting the data, which shows the MMR booster is not very effective & provides, at best, leaky immunity.

The CDC says mumps typically begins with a fever, headache, muscle aches, fatigue and loss of appetite, followed by swollen salivary glands. People unfortunate enough to catch the mumps usually recover after a week or two, but occasionally the disease can cause serious complications.

Dr. Russell Blaylock, a board-certified neurosurgeon, author and lecturer who attended the LSU School of Medicine and completed his internship and neurosurgical residency at the Medical University of South Carolina, explains how herd immunity is only truly obtainable through natural immunity:

In the original description of herd immunity, the protection to the population at large occurred only if people contracted the infections naturally. The reason for this is that naturally-acquired immunity lasts for a lifetime. The vaccine proponents quickly latched onto this concept and applied it to vaccine-induced immunity. But, there was one major problem – vaccine-induced immunity lasted for only a relatively short period, from 2 to 10 years at most, and then this applies only to humoral immunity. This is why they began, silently, to suggest boosters for most vaccines, even the common childhood infections such as chickenpox, measles, mumps, and rubella.

Then they discovered an even greater problem, the boosters were lasting for only 2 years or less. This is why we are now seeing mandates that youth entering colleges have multiple vaccines, even those which they insisted gave lifelong immunity, such as the MMR. The same is being suggested for full-grown adults. Ironically, no one in the media or medical field is asking what is going on. They just accept that it must be done.

Alternative Free Press has reported previously that 40% of those infected in the recent measles outbreak in California were fully vaccinated. Ten people were identified as likely sources of the measles outbreak, having traveled to high-risk areas, but it was not disclosed whether any of the ten were vaccinated. At the time we speculated: "It's likely safe to assume that if the majority of the ten people who visited an outbreak area were not vaccinated, we would be hearing about it. I'd guess that most people who are not vaccinated would avoid travel to an area with a large outbreak & it seems reasonable to assume that many of the ten people identified were vaccinated." While we can only speculate about the measles outbreak, these recent mumps outbreaks are infecting vaccinated populations and were definitely not caused by people choosing to skip vaccinations.

Written by Alternative Free Press
Creative Commons License
Ohio & New York 2014 Mumps Outbreaks Only Infect Vaccinated Population by AlternativeFreePress.com is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Sources for this article:

1. Mumps outbreak spreads beyond Ohio State campus http://www.cnn.com/2014/03/24/health/ohio-mumps/

2. Fordham University mumps outbreak jumps campuses http://abclocal.go.com/wabc/story?id=9438450

3. Interview with PhD Immunologist, Dr Tetyana Obukhanych- part 1, by Catherine Frompovich http://www.vaccinationcouncil.org/2012/06/13/interview-with-phd-immunologist-dr-tetyana-obukhanych-by-catherine-frompovich/

4. Forced Vaccinations, Government, and the Public Interest http://www.thenhf.com/article.php?id=1975

5. Dr Tetyana Obukhanych Talk At Aligned Chiropractic Kelowna B.C. March 2013 http://www.youtube.com/watch?v=5Dts3ebwWlo

RELATED:
New Jersey mumps outbreak exclusively infecting vaccinated population

—-
Alternative Free Press

Chevron Bought a Newspaper to Mask its Record on Safety Abuses

Crude Journalism

By Thor Benson
Vice: March 26, 2014

Richmond is tucked into California's western tricep, a former wine town with a population just over 100,000. Under the administration of Mayor Gayle McLaughlin, Richmond is the largest city in the United States with a Green Party mayor. It's also an oil town—in 1901, Standard Oil set up an operation and tank farm, choosing the location for its easy access to the San Francisco Bay. Soon after, a western terminus of the Santa Fe Railroad was built in Richmond to handle the outflux of crude. Over the course of the 20th century, Standard Oil became the Standard Oil Company of California (SOCAL), and later, Chevron.

Throughout the 90s, the Richmond refinery was fined thousands of dollars for unsafe conditions, explosions, major fires, and chemical leaks, as the plant oozed chlorine and sulfur trioxide into Richmond's atmosphere. In August of 2012, the Richmond refinery exploded after Chevron ignored the local safety board's warnings about corroding pipes. The disaster was linked to aging pipes, which were simply clamped instead of replaced altogether. Some 15,000 residents in the surrounding area were forced to seek medical treatment, and Chevron's CEO, John Watson, got a $7.5 million raise.

Now that some time has passed, Chevron has decided to modernize the refinery, and has simultaneously sponsored the creation of the Richmond Standard, an online newspaper that is decidedly positive about anything the company does. The paper, whose name is a sly reference to the company that Chevron grew out of, covers minimally reported local stories on crime, public meetings, and sports. It also features a section called “Chevron Speaks,” which works as a place for the company to put forth its ideology. According to SF Gate, “the idea of the nation’s second-largest oil company funding a local news site harkens back to an era of journalism when business magnates often owned newspapers to promote their personal, financial, or political agendas. Now that mainstream newspapers are struggling to survive, online news sites are testing ways to fund their operations.”

The founding of the Standard coincides with a modernization initiative at the Richmond plant, which would allow the facility to process fuel with higher percentages of sulfur, the key to the corrosion that resulted in the 2012 plant explosion. “They’re planning on doubling the sulfur content of the crude,” Andres Soto, the Richmond Community Organizer at Communities For a Better Environment, told me.

According to Andres, Chevron wants to go from 1.5% sulfur content to 3%. Beyond the fear of another explosion, there is a serious environmental problem with the modernization and the refinery in general. "They're publicly claiming there will be no net increase in emissions," he says. "Our suspicion is they plan on releasing more greenhouse and particulate emissions, here in the local arena, in exchange for cap and trade."

For those of you unfamiliar with cap and trade, it's essentially when you let one of your refineries pollute above the federal limit in exchange for another refinery polluting below the federal limit. The differentials from "caps" are traded so that in the end everyone is supposed to be meeting requirements, on average. "It's the single largest refinery on the West Coast of the United States. As a facility, it's the single largest emitter of greenhouse gases in California," Andres says. He believes the refinery will cause serious pollution to the Bay Area, and that the company will start using the port of Richmond to export tar sands to China that aren't legal for fuel in the United States because of the sulfur content.

As you may have heard in relation to the Keystone Pipeline proposals, tar sands are semi-solid petroleum and sand mixtures being harvested in Alberta, Canada. The process of extracting the petroleum wastes a lot of water; a Yale article reported last year that one company extracting petroleum used 370 million cubic meters of fresh water in 2011. Chevron could export the sands to China for their needs and save Americans time and resources, but Andres points out that the emissions from shipping, along with the emissions China will create, will circle back to America with the winds.

Chevron is attempting to convince the public that the refinery is a good idea in the pages of the Richmond Standard, typically using the promise of more jobs and money. Andres says the company is buying every billboard in town. The billboards often depict people of color, likely in an effort to convince minorities that they can trust a multinational oil conglomerate.

(Read the full story complete with links to sources at Vice)

—-
AlternativeFreePress.com

BP oil spill in Lake Michigan

WHITING, Ind. (AP) — BP says it is assessing how much crude oil entered Lake Michigan following a malfunction at its northwestern Indiana refinery.

BP spokesman Scott Dean says crews have placed booms across a cove at the company’s Whiting refinery where workers discovered the oil spill Monday afternoon.

Dean says BP believes the oil released during an oil refining malfunction has been confined to that cove.

He says the oil entered the refinery’s cooling water system, which discharges into the lake about 20 miles southeast of downtown Chicago.

(View the full article at Yahoo)

Check out this related recent report: 25 Years After Exxon Valdez, BP Was the Hidden Culprit
—-
AlternativeFreePress.com