Dirty Air at Factory Farms

Researchers at Purdue University have found that the air quality at concentrated animal feeding operations (CAFOs) is often dirtier than the most polluted cities in the United States. The researchers measured concentrations and emissions of ammonia, hydrogen sulfide, particulates, and volatile organic compounds—all pollutants known to have health risks—at 15 livestock-confinement sites, nine livestock waste lagoons, and one dairy corral in nine states. The U.S. Environmental Protection Agency supervised the study.

According to the research, 11 of the sites tested in the study emitted more than 100 pounds of ammonia on an average day – an amount that would require EPA pollution-reporting in non-livestock industries. Furthermore, six of the farms released fine-particle pollution that was higher than the federal 24-hour exposure limit. Data showed that hydrogen-sulfide emissions exceeded 100 pounds per day at several of the large hog and dairy CAFOs, which must be reported under federal right-to-know laws in other industries.

Tarah Heinzen, an attorney with the Environmental Integrity Project (EIP), a nonpartisan, nonprofit organization that advocates the effective enforcement of environmental laws, says, “No other industry in the country would be allowed to pollute at these levels without triggering EPA emissions reporting laws that have applied to other large industries for decades.”

While the data shows that CAFOs are a significant source of pollution, these operations are not subject to the same environmental regulations that other industries face. This is due to a deal brokered by the Bush administration in 2008, which exempted CAFOs from federal pollution-reporting rules.

The EIP has released a report comparing air pollution levels at CAFOs with established health standards and reporting rules. Their report, released in March, is called “Hazardous Pollution from Factory Farms: An Analysis of EPA’s National Air Emissions Monitoring Study Data.” In this report, the EIP recommends several steps to fix the CAFO pollution problems, including establishing an independent committee to oversee the emission-estimating methodology process and creating the regulations necessary to use the Clean Air Act to protect public health from ammonia, volatile organic compounds, and other factory-farm pollution.

There are no current plans to conduct future studies using non-CAFO farms. Furthermore, there is currently no push for legislation requiring small-scale farms to follow the same EPA emissions reporting guidelines that the EIP believes should be mandatory for all factory farms.

Heinzen believes that smaller, more diversified farms are already doing the right thing when it comes to being stewards of the environment.

Source: Hobby Farms

The Decline of the Amazon River Dolphin

It is a story retold all around the world: humanity competing with nature’s greatest species for now-dwindling resources. Not surprisingly, people tend to dominate the competition. The same is true along the waters of the Amazon River basin, where pink dolphins compete with local fishermen for the day’s catch. Despite legal protections for the freshwater dolphins, the Amazon is far too expansive to police, and convincing fishermen to respect their rivals is a difficult prospect.

There are 40 species of dolphin around the world, and while the U.S. tends to view the species as a playful aquatic neighbor, the same cannot be said elsewhere. Many Japanese fishermen view schools of dolphins that pass through their fishing grounds as genuine pests. Fishermen in Taiji, Japan, and the Faroe Islands still hunt and eat dolphins, despite the high mercury levels known to be found in dolphin meat. The Chinese more or less disregarded the Yangtze river dolphin as they dammed, polluted, and over-fished the waterway, driving the dolphin into extinction in the wild. The Ganges river dolphin is not far behind. The pink dolphins of the Amazon find themselves in a similar predicament: they are competing with locals for food, and the locals are not particularly inclined to share.

To make matters worse, the dolphins are harpooned and used as bait to catch catfish. Supposedly, in just one day of fishing, two dead dolphins can provide enough bait to yield $2,400 in catfish sales. The prospect of such substantial returns, coupled with the ever-pressing need to catch enough to feed and provide for their families, makes the fishermen of the Amazon anything but allies of the pink dolphins.

Illegal dolphin hunting is on the rise in the Amazon and clearly demonstrates the great challenge of policing environmental law in a protected land. The wild and untouched character of the Amazon basin reflects both its immeasurable ecological value and the near-impossible task of patrolling the territory. Researchers believe that hundreds, if not thousands, of the estimated 30,000 remaining pink dolphins are killed each year by people. When you realize that 1,300 Brazilian environmental protection agents are responsible for looking after a territory larger than India, it is no surprise that the future of the Amazon river dolphin rests in the hands of the local fishermen who travel the waters each day.

The root cause of the dolphin’s decline in the Brazilian Amazon is the indifference of the people living alongside them toward their killing. Jars of oil from dolphin fat can regularly be found in open-air markets. Dolphin genitals are sold as good-luck charms for sex and love. There is no need to hide these illegally acquired products when the vendors know that no one from the environmental protection agency is coming to arrest them. People know they are not allowed to kill the dolphins, but protecting them is simply not a priority.

The pink Amazonian river dolphin, an iconic character in local lore, is in a state of decline. The species may very well find itself on the verge of extinction faster than anyone can predict. As has been the story time and time again, human indifference proves to be one of the most destructive forces the planet has ever seen.

Source: NYTimes

Environmentally Friendly Paint

When renovating a room or a piece of furniture, you want to pay attention to the potential environmental impact of the paint. While the carbon footprint of the product is important, in this case I am talking about choosing a less-toxic paint to allow for fresh, clean indoor air. Here are some tips and things to look for when choosing an eco-friendly paint:

Ideally, you’ll want to use paints that meet all three better-health requirements—low volatile organic compounds (VOCs), low biocides, and natural pigments. Many paints are labeled “low-VOC” to meet the Environmental Protection Agency’s minimum requirements—which call for no more than 250 grams per liter (gm/l) of VOCs in “low-VOC” latex paints and no more than 380 gm/l in “low-VOC” oil-based paints. Some paints are available with even lower VOC levels (0-100 gm/l). To locate the VOC level, check the paint can label or call the company to request a material safety data sheet.
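The label check described above amounts to comparing a number on the can against the EPA ceilings for each paint base. Here is a minimal sketch of that comparison (the helper name and category labels are mine; the thresholds are the ones quoted above):

```python
# EPA "low-VOC" ceilings quoted above, in grams per liter.
LOW_VOC_LIMITS = {"latex": 250, "oil": 380}

def voc_rating(paint_type, voc_gpl):
    """Classify a paint's VOC content against the low-VOC ceiling for its base."""
    limit = LOW_VOC_LIMITS[paint_type]
    if voc_gpl == 0:
        return "zero-VOC"
    if voc_gpl <= 100:
        return "very low VOC"
    if voc_gpl <= limit:
        return "low-VOC"
    return "exceeds low-VOC limit"

print(voc_rating("latex", 45))   # a paint in the 0-100 gm/l range
print(voc_rating("oil", 400))    # over the 380 gm/l oil-based ceiling
```

Note that a paint can legally carry a “low-VOC” label anywhere up to the ceiling, which is why it pays to read the actual number rather than trust the label alone.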

You’ll also need to base your eco-requirements on whether you’re searching for an exterior or an interior paint.

Exterior paint: All exterior paints contain fungicides, and low-biocide paints are not available for exteriors. The best choice for an exterior paint is one that contains zinc oxide as the fungicide. The next best choices are zero- to very-low-VOC paints, acrylic or latex paints, and recycled water-based paint. Try to avoid all oil-based paints because of their high VOC content. Furthermore, oil-based paints may come from old cans which contain mercury or lead.

Interior paint: The first choice for interior paint should be a milk paint or a natural paint. Natural paints come from substances such as citrus and balsam, as well as minerals. Though these paints are made with natural materials and are petroleum-free, they still contain terpenes, which are VOCs derived from plants.

Milk paint, made with lime and a milk protein called casein, is excellent for interiors and gives wood a rich, deep color, allowing the grain to show through.

Although latex paint can contain biocides and VOCs at low levels, it is much safer for the environment than oil-based paint. Still, latex paint needs to be used with great care because of its strong fumes. Other acceptable paints include acrylic and recycled latex paints, assuming they don’t contain mercury or lead. Try to avoid all oil- and solvent-based paints.

No matter which type of paint you choose to use, remember to keep the room well-ventilated. Never use old paint, which may contain lead; lead is extremely toxic to children and pets who may eat dry paint chips. Call a certified professional to inspect your home if you suspect that it contains lead-based paint.

Source: GreenAmerica

One-Third of Global Food Supply Is Never Eaten

While global populations may be edging closer to a food crisis, it is not due to a lack of food. According to a study conducted by the United Nations’ Food and Agriculture Organization, one-third of all the food produced around the world, approximately 1.3 billion tons a year, is never eaten. That is, one-third of the global food supply is lost or thrown away. The report indicated that the food waste is roughly split between developed and developing countries, though it is important to recognize that rich countries account for a small portion of the world’s population yet an equal share of the waste.

In developed countries, food waste is disproportionately the result of retailers and consumers who throw away “perfectly edible food.” This behavior can be described as nothing but wasteful. In developing countries, food waste is, for the most part, the unavoidable outcome of “poor infrastructure and low levels of technology in harvesting, processing and distribution.” The impacts of food scarcity on the developing world could be substantially reduced if the essential infrastructure were put into place to prevent this unnecessary waste.

In many ways, heightened food prices are more to blame for the prevalence of starvation than a scarcity of food. But even as food riots ignite throughout Africa, consumers in the world’s wealthiest countries continue to throw away a comparable quantity of food (222 million tonnes) as the entire net food production of sub-Saharan Africa (230 million tonnes).

Food waste represents not only the squandering of produce but the meaningless loss of valuable natural resources. Food production relies heavily on water resources, land, labor, and capital. Not to mention the enormous quantity of fossil fuels burned during planting, harvesting, and post-harvest transportation, adding unnecessary tonnes of CO2 into the atmosphere each year.

It is sad to consider how different these stats would be if Americans were willing to eat a bruised apple. I believe our migration away from the farm has distorted our understanding of the environment and disconnected us from where our food comes from. Changing consumer attitudes will be an uphill struggle in a culture so preoccupied with convenience. Our disposable society, begun by the consumer boom of the 1950s and 60s, will inevitably be the force that destabilizes the natural world.

Source: FAO via NYTimes

The Energy Costs of Modern Living

While turning off the lights when you leave the room and switching to energy-efficient fluorescent light bulbs may help, those habits hardly make a dent in your monthly energy bill. Due to higher fuel and energy costs, the average household will spend $2,350 on electricity and gas this year, up from $2,100 in 2007, according to the Alliance to Save Energy.

In order to make these bills more manageable and to cut down on the impacts of vampire energy, try going after the five biggest energy guzzlers in the home. Here are the five worst appliances and how to lower their costs:

1. HVAC System:

Your home’s heating, ventilation, and air conditioning (HVAC) system is probably the home’s worst offender, says Maria Vargas, a spokeswoman for the government’s Energy Star Program. This should come as no surprise, seeing as most households use some sort of climate control nearly 24 hours a day, seven days a week throughout both the hottest and coldest parts of the year. In reality, heating and cooling account for 50% of the average household’s annual energy bill.

How to cut your bill: Try programming the thermostat so that the HVAC system doesn’t work so hard while you are at work or asleep. By reducing the temperature by just two degrees during the winter or adjusting the air-conditioner two degrees higher during the summer, Energy Star estimates you’ll save $180 annually.

2. Water Heater:

The water heater works around the clock to provide enough hot water for showers, laundry, and dishes, among other things. The water heater represents nearly 13%—the second-biggest share—of your annual energy bill.

How to cut your bill: By dialing down the heater’s temperature to 120 degrees from the standard 140 degrees, you’ll reduce your annual bill by 6% to 10%. You can also opt to wash your clothing in cold water only, which can cut the energy bill by $73 a year, while keeping clothing just as clean.

3. Refrigerator:

The refrigerator runs at all hours of the day. It periodically cycles up to draw maximum watts and keep the temperature consistent. Each time the refrigerator door is opened for an extended period of time, the appliance must work harder to maintain the desired temperature. The fridge accounts for 5% of the annual energy bill. If you use an ancient, inefficient model for spare food storage in the garage or basement, you can expect to pay twice that amount.

How to cut your bill: Keep your machine clean. A refrigerator cycles on less frequently if the coils beneath and behind the unit remain clean and dust free. Also, by keeping the fridge at a moderate temperature (36-38 degrees, not lower), the refrigerator won’t have to use so much energy. You should regularly defrost the freezer to eliminate ice buildup on the interior coils. If you’ve got a second refrigerator that’s been around for more than a decade, you should look into recycling it. One bigger fridge is more efficient than two smaller ones.

4. Clothes dryer

Here’s an indication of how inefficient a clothes dryer really is: “A dryer can’t earn the Energy Star label right now,” says Vargas. While little separates one model from another in efficiency, one thing is certain: they’re all energy hogs. Clothes washers and dryers collectively account for 6% of the annual energy bill, with the bulk of that coming from the dryer.

How to cut your bill: Spend money on the most energy-efficient clothes washer you can afford, which will wring out more water from your clothes, cutting the drying time in half. If you can’t afford a new major appliance, be sure to use your dryer’s moisture sensor settings. Also, consider hang drying your clothes to virtually eliminate the cost of using a dryer.

5. Dishwasher

While using the dishwasher may be more efficient than hand-scrubbing dishes, it comes with a heavy convenience fee. Dishwasher use accounts for 2% of our annual energy bill.

How to cut your bill: If you must use the dishwasher, run it only when it is full. Try letting the dishes air dry instead of using the drying feature, which doubles the appliance’s power draw.
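Putting the five categories together, the percentages above imply a rough dollar breakdown of the average annual bill. This is just a back-of-the-envelope sketch using only the figures quoted in this article ($2,350 total and the per-appliance shares):

```python
ANNUAL_BILL = 2350  # average household electricity + gas bill cited above

# Share of the annual energy bill attributed to each category in the article.
SHARES = {
    "HVAC": 0.50,
    "water heater": 0.13,
    "washer + dryer": 0.06,
    "refrigerator": 0.05,
    "dishwasher": 0.02,
}

def dollar_breakdown(annual_bill, shares):
    """Convert percentage shares into approximate dollars per year."""
    return {name: annual_bill * share for name, share in shares.items()}

for name, dollars in dollar_breakdown(ANNUAL_BILL, SHARES).items():
    print(f"{name:15s} ~${dollars:,.0f}/yr")
```

The five appliances together account for 76% of the bill, which is why targeting them pays off far more than chasing light bulbs.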

Source: WalletPop

Easy Guide to Vermiculture

Vermiculture, or worm composting, allows you to compost your food rapidly, while also producing high quality compost and fertilizing liquid. Here are six easy steps to follow to create a compost system with worms:

1. Obtain a worm bin

This can be done by purchasing a bin from online vendors or hardware stores; however, you can also build your own compost bin using storage totes, galvanized tubs, wood, or plastic.

Material: Rubber totes are cheap, easy to use, and durable. Galvanized tubs cost more but will outlast rubber totes. If you choose to use wood, make sure it has not been chemically treated; treated wood can harm the worms and potentially leach harmful chemicals into the compost. Five-gallon plastic buckets, sold at most hardware stores, also work. Make sure to clean the bucket thoroughly and let it sit for at least a day before using it as a worm bin.

Ventilation: Your bin should be well-ventilated. Drill several 1/8-inch holes about 4 inches up from the bottom; otherwise the worms will stay at the bottom of the bin and potentially drown.

Size: The larger the container, the more worms it can hold. Follow this rule of thumb for stocking your container: 1 pound of worms for every square foot of surface area. The bin should be no more than 24 inches deep, because composting worms will not burrow any further down than that.
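That sizing rule — one pound of worms per square foot of surface, with depth beyond 24 inches adding nothing — can be sketched as a quick calculation (the function and dimension names are mine, purely for illustration):

```python
def worm_stocking(length_in, width_in, depth_in):
    """Return (surface sq ft, pounds of worms, usable depth in inches).

    Rule of thumb: 1 lb of worms per square foot of surface area;
    composting worms won't work below 24 inches, so deeper bins gain nothing.
    """
    surface_sqft = (length_in * width_in) / 144  # square inches -> square feet
    usable_depth = min(depth_in, 24)
    return surface_sqft, surface_sqft * 1.0, usable_depth

# A hypothetical 24" x 18" storage tote that happens to be 30" deep:
area, pounds, depth = worm_stocking(24, 18, 30)
print(f"{area:.1f} sq ft -> ~{pounds:.1f} lb of worms, usable depth {depth} in")
```

In other words, a wide, shallow bin beats a tall, narrow one: only surface area determines how many worms you can keep.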

Cover: It is important to have a cover on the composting bin to prevent light from getting in and drying out the compost. The lid should be removable in case the compost gets too wet.

2. Prepare the bin for worms

Fill your compost bin with thin strips of unbleached corrugated cardboard or shredded newspaper, straw, dry grass, or the like. This material will provide a source of fiber to the worms, while also keeping the bin well-ventilated. Place a handful of dirt on top of the material and thoroughly moisten. Allow the water to soak in for a day or so before adding the worms.

3. Obtain the worms

There are several varieties of worms bred and sold for vermicomposting; digging up earthworms from your own backyard is not recommended. Try searching the internet or asking a local gardening club to find a worm vendor. The most commonly used worms, Eisenia foetida (red wigglers), grow to about 4 inches long, are mainly red, and have yellow tails. Another variety, the European night crawler (Eisenia hortensis), does not reproduce at the same rate as the red wigglers, but these worms grow larger, handle coarser paper and cardboard better, and seem heartier overall. Remember: if you are bringing in non-native species, it is imperative not to let them reach the wild.

4. Maintain your compost bin

Try keeping the compost bin elevated off the ground, which will help speed up composting and keep the worms content. If the worms are kept fed and the compost is kept properly damp, the worms will not try to escape.

Sprinkle the surface of the compost with water each day. Try feeding the worms vegetable scraps at least once per week. About once a month, add more cardboard, shredded newspaper, hay, or other fibrous material.

5. Harvest the compost

There are a variety of techniques used to harvest your compost:

One technique involves moving any large un-composted vegetable matter to one side, and gently scooping a section of worms and compost mixture onto a brightly lit piece of newspaper or plastic wrap. Slowly scrape off the compost in layers, giving the worms time to burrow into the center of the mound. Eventually, a pile of compost will form next to a pile of worms. Return the worms to the bin.

If you prefer not to take a hands-on approach to gathering the compost, you could use a separator. Barrel separators can be expensive but are available on the internet. It is also possible to make your own homemade shaker box.

6. Make use of your compost

You now have your own compost to be used on plants, in gardens, or wherever you choose!

Source: WikiHow

Germany and Italy Rethink Nuclear Power

In the wake of the nuclear crisis in Japan, the German government announced this week that it is accelerating plans to close down its nuclear power plants. Italy is following suit with a one-year moratorium. These announcements show how quickly political views on nuclear energy have shifted. Italy had planned to push nuclear power by referendum, while Germany currently has 17 reactors that it plans to replace entirely with renewable energy.

Italian Premier Silvio Berlusconi’s Cabinet issued the one-year moratorium after having pushed nuclear energy as a way to reduce the country’s dependence on fossil-fuel imports.

German Chancellor Angela Merkel has declared the situation in Japan a “catastrophe of apocalyptic dimensions.” The German government has already shut down seven of its older nuclear reactors for extensive inspections. Now, the government plans to take all of its plants offline. Originally, Germany had planned to extend the plants’ lives by another 12 years. The plants produce 23% of the country’s power. By cutting out nuclear energy, Germany has put pressure on itself to accelerate its renewable energy and smart-grid technologies.

Germany has put forth the most aggressive effort when it comes to adopting renewable energy. It aims to run 40% of its grid on clean energy within 10 years. The cost of conversion is estimated at as little as 0.5 cents per kilowatt, but it is too early to have any realistic estimate of the ultimate cost, given the complexity of predicting equipment costs and grid upgrades.

The cost of nuclear energy is much higher now than the original estimates for construction alone. Japan’s 40-year-old Fukushima Dai-ichi nuclear power plant was still in operation because of the cost of building a new plant. However, all concerns about cost go out the window when serious accidents, such as the recent one in Japan, occur.

A Note to President Obama: Political Polarization

[The following is a paper I wrote for an American Politics class. I enjoyed writing it so I thought I’d post it.]

While Robert Kuttner’s observation that “the national agenda is looking more Democratic, both because the circumstances demand it and because Republican policies have so palpably failed”[1] speaks to a distinct turning-away from the policies of the Bush administration, it does not suggest that the Obama agenda is going to find support and success. Something is certainly amiss when protesters carry signs that read “Don’t steal from Medicare to support socialized medicine.” I do not mean to imply that only Republican protesters are prone to hypocrisy, but as the Tea Party takes center stage in the push-back against the Obama administration and the Democrats, such errors in logic and misinformation are important to understand. A mixing of personal beliefs and party rhetoric is to blame for such clearly incompatible declarations. The Gallup polls detailing the gaps in presidential approval ratings show a clear divide between the Democratic and Republican parties. This divide has widened between the first and second years of the Obama term.

Obama entered office with a message of optimism and progress, and yet only 23% of Republicans approve of his first year. When addressing the approval of the highest elected official in the U.S., a 65-point approval gap in the first year, long before policy changes had time to take effect, is not politics as usual. In this essay, I will suggest that, in the current political climate, approval ratings and public support are largely indiscriminate of the individual in office. While Obama’s race, oratory skills, and leadership have an amplifying effect on public opinion, any president would have an approval rating of over 70% within his/her political party and less than 30% within the opposing party. The increased approval gap from year one to year two is not a sign of Obama’s performance, but a byproduct of disenchanted Obama supporters and an opposition intent on undermining the presidency using partisan politics, racism, and biased media. Second, I will argue that Obama should continue to speak to right-wing voters but focus his attention on those voters who won him the election. Obama entered office because the majority of voters supported his message and his legislative agenda. To focus too much attention on compromise and the corralling of bipartisan support is a risky endeavor in the short term and will be judged harshly by Obama supporters if unsuccessful.

As observed by Doris Kearns Goodwin, the great American presidents used their leadership to first transform the public understanding of and attitude toward national challenges.[2] Once this has been established, the president is able to “break through impasses made up of congressional blockage, interest-group power, voter cynicism or passivity, and conventional wisdom.”[3] The Obama campaign convinced many that a new era of politics was on the horizon, but as a whole, the country is not yet prepared to move on to phase two of Goodwin’s vision. The popular understanding of national challenges remains too polarized to be effectively mobilized in support of Obama’s legislation.

At this point in American history, political parties do not reflect core principles or ideological convictions. The Democrats still bear the flag of liberalism but share few other values with their anti-Federalist founding. The Republicans are as business-friendly as ever, yet miles from the anti-slavery coalition that first brought them into power. More recently, “the right wing has abandoned its principled support of states’ rights in favor of a doctrine of opportunistic federal preemption.”[4] This gradual drift in ideology is natural for a political party responding to the issues of the day, but as we continue into the 21st century it must be realized that political parties now separate themselves on an issue-by-issue basis; the centrists and moderates willing to move across the aisle as their principles required are being replaced. Democrats say pro-choice, raise taxes, reform healthcare, and so forth, while Republicans take the correspondingly dissonant chord. In this way, people judge a politician based on his/her political affiliation first and look to party leaders for cues rather than forming logical beliefs on issues. The further polarization of Obama’s approval ratings in his second year reflects the successful efforts by right-wing conservatives to take advantage of the public’s malleability and undermine the legitimacy of the administration. Political pundits question Obama’s birthplace and his citizenship, compare his administration to the Third Reich and Communist Russia, and oppose his legislation because of his ethnicity and his not being a Republican.

In a truly American way, “people blamed their slowly worsening circumstances on themselves rather than coming together in a movement for political change.”[5] Poll studies have found that “belonging to the upper social class encourages one to believe that through one’s own effort success can be achieved, and government should be limited in its ability to spend tax money to aid those who have not been successful in life’s competition.”[6] Over the past three decades, the Republican Party has managed to pass this attitude on to even its middle- and low-income supporters, a significant portion of whom have been made to believe that medical reform, government regulation of industry, and tax cuts for the country’s wealthiest will somehow do them harm. Such an illogical conclusion hinges only on the belief that government does not work. “Reagan succeeded in transforming public assumptions from the general premise that governments should help to the idea that government was likely to make matters worse.”[7] Consequently, the more Obama tries to achieve, the more conservatives will cry foul, regardless of the principles (states’ rights, individual rights, government regulation, etc.) at stake.

This presumption of ineffectual government is the greatest obstacle that the Obama presidency will face in the pursuit of publicly supported legislation. It will require every last bit of oratory skill and inspirational speechwriting to move the country away from the cynicism that Reagan and subsequent Republican leaders have so skillfully shaped. For three decades, the American people have been told by Republican campaigns that government does not work. Once elected, these same Republicans have demonstrated precisely how incompetent government fails to respond to the needs of the populace. It is no surprise that a Democrat behind the desk of the Oval Office finds himself approved of by Democrats and adamantly opposed by Republicans. While I respect Obama’s attempts at bipartisan legislation, very little will come of it unless he can first change the way the public perceives the role of government in their lives. Great political leaders do not achieve success by simply being centrist and bipartisan. Rather, Obama must challenge his opposition, inspire our better selves, overcome cynicism, and take political risks in the pursuance of principles and policies that promote the common good.

A delicate balancing act is necessary to persuade those on the center-right, maintain the support of the youth and those on the center-left, and become the progressive leader the country needs in this era of economic hardship. The people have shown that they are ready to support a progressive president if he asks them to be a part of the solution. American youth are more politically active than they have been in the previous three decades. The age of Facebook and social media has made the younger generation easier to mobilize but quick to dissolve if left to idle. This is precisely what occurred after Obama’s election and continued from the first to the second year of his presidency; Obama dropped his supporters and allowed them to return to their politically inactive lives. The President needs to put his base back to work and continue the social-political movement begun by his campaign. If left to mass media, the messages of the Obama administration will be skewed, debated, and contradicted long before they reach the general public. When the White House can speak directly to the people through YouTube, Facebook, and Twitter, the information does not need to come second hand. Make high school, college, and post-grad youth the conveyors of the Obama legislative agenda, and the message of progress and optimism will outlive the next two or three decades of presidential turnover; send the Obama youth back to the streets with a clear purpose, the addresses of congressmen to pressure, and well-documented support regarding key issues of the day. People are prepared to defend their beliefs and convictions if Obama is prepared to call upon them, a give-and-take that has been seriously lacking from the administration.

In Barack Obama’s own words, “the true test of the American ideal is whether we’re able to recognize our failings and then rise together to meet the challenges of our time. Whether we allow ourselves to be shaped by events and history, or whether we act to shape them.” The Obama opposition is spearheaded by a group of faux-conservatives who mistake tax cuts for fiscal responsibility and corporate profits for economic stability. Because Republican policy is largely incapable of responding to the needs of the lower and middle class, bipartisan legislation is not a positive end-goal. To achieve success and growth, Obama must directly address the fallacies of right-wing policy and take on the role of a resolute progressive rather than a well-meaning post-partisan. I am not alone in feeling that Obama has done much less shaping-of-events since entering office. We live in a country shaped by decades of Democratic dominance in Congress and yet we remain convinced that government does not work. We need a president to influence events and engage the populace, “a president who profoundly alters American politics and the role of government in American life.”[8] To bridge the gap in approval ratings and pull the country back from the political poles, Obama must be the Democratic progressive we need, must reconnect with and energize his base, and must put to use the influence of the executive office. Obama has the advantage of being on the positive side of history and will be remembered as a great and influential president if he can continue the popular movement begun by his campaign.

[1] Kuttner, Robert. Obama’s Challenge: America’s Economic Crisis and the Power of a Transformative Presidency. White River Junction, VT: Chelsea Green Pub., 2008. Print. Pg 189.

[2] Kuttner, Robert. 2008. Pg 2.

[3] Kuttner, Robert. 2008. Pg 2.

[4] Kuttner, Robert. 2008. Pg 189.

[5] Kuttner, Robert. 2008. Pg 20.

[6] Erikson, Robert S. American Public Opinion: Its Origins, Content, and Impact. New York: Longman, 2002. Print. Pg 203.

[7] Kuttner, Robert. 2008. Pg 4.

[8] Kuttner, Robert. 2008. Pg 1.

A New Era For Drilling?

American companies involved in offshore oil drilling are creating a safety institute modeled after those already established by foreign oil companies and the nuclear power and chemical industries, according to the American Petroleum Institute.

The new safety board, to be known as the Center for Offshore Safety, will aim to improve the offshore industry's management and operations and will be financed by the companies, according to Jack Gerard, the president of the Petroleum Institute. The size, scope, and budget of the new board have yet to be determined.

In the aftermath of the BP Deepwater Horizon explosion and oil spill nearly a year ago, federal officials have tightened government oversight of the industry; the new center is intended to strengthen the industry's self-regulation alongside that oversight. Oil companies are not fully behind the new regulations, claiming that they have slowed the permitting process and depressed oil production.

The National Commission on the Deepwater Horizon Oil Spill and Offshore Drilling panel, which studied the BP accident, recommended the creation of an independent safety body that would review all phases of drilling operations, assuring that they meet the highest international standards. The panel has cited the Institute of Nuclear Power Operations, an industry-financed group created after the accident at the Three Mile Island nuclear plant, as a potential model.

To me, this sounds like a lot of hot gas, but perhaps I will choose to be positive and believe that the industry is remotely capable of regulating itself. As we continue to rely on riskier and riskier methods for acquiring fossil fuels (deep-sea drilling, hydraulic fracturing, and mountaintop mining), we will have to choose between the preservation of the environment and the future of our lifestyles. I'm willing to give up the technological lifestyle I have come to know, but I realize I speak for only a small fraction of the people stumbling across this article. In the end, we won't really have a choice. Fossil fuels will disappear and humanity will be forced to deal with the consequences.


Escalating Nuclear Crisis in Japan

During the night of March 16, 2011, four lead-lined military helicopters dumped nearly 30 tons of water on the No. 3 and No. 4 reactors at Japan's Fukushima Daiichi nuclear power plant. The seawater is meant to cool the reactors and replenish the water in the pools of spent fuel rods, which have reportedly boiled dry at reactor No. 4. Eleven water cannon trucks joined the effort. However, ABC has reported that officials have suspended further water drops by helicopter due to high levels of radiation.

The helicopters and water trucks were used because high radiation levels within the plant have made it extremely dangerous for workers to remain inside for extended periods. It is not clear whether the water dropped by the helicopters hit the reactors and helped them cool. The four helicopters flew the first mission even though radiation levels 300 feet above the plant measured 87.7 millisieverts per hour.

Gregory Jaczko, the U.S. Nuclear Regulatory Commission Chairman, told the House Energy and Commerce Committee that the commission believes the spent fuel pool at No. 4 has lost a significant amount, if not all, of its water. If the fuel rods are exposed, there is nothing preventing them from heating up and melting down. Many experts believe that a fuel rod fire poses an even greater risk than the reactors, because the spent fuel pools lack the protective containment that the reactors have.

The U.S. government has expanded its evacuation zone for American citizens currently in Japan to 50 miles around the plant. Chartered flights have been offered to citizens wishing to leave the country. Japanese officials still believe that their set evacuation zone of 12 miles is sufficient.

A Wikileaks report shows that Japan was warned more than two years ago about safety issues at its nuclear power plants. The International Atomic Energy Agency told officials in Japan that the plants would not be able to withstand earthquakes with a magnitude greater than 7.0. The earthquake that hit Japan on March 11, 2011 had a magnitude of 9.0.


Five Elements of Passive Solar

Passive solar technologies use sunlight to provide energy without active mechanical systems. Passive solar systems convert sunlight into usable heat and induce air movement for ventilation. In contrast to active solar systems, which use a significant amount of conventional energy to power pumps or fans, passive systems use only a small amount of conventional energy to control dampers, shutters, or other devices that enhance solar energy collection, storage, and use.

A home's windows, walls, and floors can be designed to collect, store, and distribute solar energy in the form of heat during the winter and to reject solar heat during the summer months. Some passive solar homes are heated almost entirely by the sun, while others have south-facing windows that provide some fraction of the heating load. The design is what distinguishes a passive solar home from a conventional one.

Designing a completely passive solar home requires incorporating the five elements of passive solar design.


The first element is the aperture: the large glass area, usually a window, through which sunlight enters the building. Typically, the aperture faces within 30 degrees of true south and should not be shaded by other buildings or trees between 9 a.m. and 3 p.m. each day during the heating season.

The second element is the absorber: a hard, darkened surface of the storage element. The surface sits in the direct path of the sunlight, which strikes it and is absorbed as heat.

The third element is the thermal mass: the materials that retain or store the heat produced by the sunlight. Unlike the absorber, which is in the direct path of the sunlight, the thermal mass is the material below or behind the absorber's surface.

The fourth element is the distribution: the method by which solar heat circulates from the collection and storage points to the different areas of the house. A strictly passive design uses the three natural heat transfer modes: conduction, convection, and radiation. In some applications, however, fans, ducts, and blowers help distribute the heat throughout the house.

The final element is the control. During the summer months, roof overhangs shade the aperture. Other elements used to control under- or overheating include electronic sensing devices, operable vents and dampers, low-emissivity blinds, and awnings.

While other elements go into the designing of a passive solar home, such as the window location or air sealing, these five elements constitute a complete passive solar home design. Each element serves its own function; however, all five must work together for the design to be successful.
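As a rough illustration of how the aperture drives the rest of the design, the instantaneous heat gain through a window can be estimated with the standard relation Q = A × SHGC × I. The glazing area, solar heat gain coefficient, and insolation below are illustrative assumptions, not figures from this article:

```python
# Illustrative passive solar aperture estimate (assumed values, not from the article).
area_m2 = 10.0      # south-facing glazing area, m^2
shgc = 0.6          # solar heat gain coefficient of the glass
insolation = 600.0  # W/m^2 striking the window on a clear winter midday

gain_watts = area_m2 * shgc * insolation  # Q = A * SHGC * I
print(f"{gain_watts / 1000:.1f} kW")      # -> 3.6 kW
```

A midday gain of a few kilowatts is exactly why the thermal mass matters: without storage, that heat would overheat the space at noon and be gone by nightfall.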

Source (Info and Images): US Department of Energy

Recycling Nuclear Fuel

The former chairman of the Nuclear Regulatory Commission claims that the failure of the U.S. to pursue a program for recycling spent nuclear fuel has put the country behind other nations and represents a missed opportunity to enhance the nation's energy security and international influence.

Dale Klein, Ph.D., Associate Vice Chancellor for Research at the University of Texas System, said that unfounded concerns about the reprocessing of spent fuel have prevented the U.S. from tapping into an extremely valuable resource. Klein notes that spent nuclear fuel is often referred to as waste.

Klein states that spent nuclear fuel is not a waste; the waste lies in failing to tap this valuable and abundant domestic source of clean energy in a systematic way. Compared to other fuels used in the production of electricity, the energy density of uranium is extraordinary. Furthermore, Klein stated that nearly 95% of the energy value in a bundle of spent nuclear fuel rods remains available for reuse.
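Klein's 95 percent figure implies a striking multiplier, sketched naively below. This ignores reprocessing losses and reactor design limits, so it is an upper bound on the idea rather than a prediction:

```python
# Naive reading of the 95% figure: energy still in spent fuel vs. energy already extracted.
fraction_remaining = 0.95                 # energy value left in spent fuel rods, per Klein
fraction_used = 1.0 - fraction_remaining  # energy captured by the once-through cycle
multiplier = fraction_remaining / fraction_used
print(f"~{multiplier:.0f}x the energy already extracted remains in the spent fuel")
```

Even a small realized fraction of that multiplier is what makes Klein call the material a resource rather than a waste.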

Critics of nuclear energy cite the potential for nuclear weapons proliferation as the largest reason to oppose nuclear fuel recycling. Klein acknowledges these critiques, but notes that while the plutonium in recycled nuclear fuel is fissionable, no country in the world has ever made a nuclear weapon from the low-grade plutonium recovered from recycled high burn-up nuclear fuel. Klein believes this route is not practical for either a strategic or a tactical nuclear weapon.

Other countries, including France, Japan, the United Kingdom, Russia, India, and China, have dedicated significant resources to their reprocessing programs. Reprocessing not only recovers significant energy value from spent fuel, it also reduces the volume and radiotoxicity of high-level nuclear waste.

In the U.S., utilities operating nuclear power plants store spent nuclear fuel rods on site in pools of water. Eventually, the fuel is moved into dry cask storage. While there is debate over whether the casks should be located in one central storage site, the practice is accepted as safe and secure.

Klein claims that establishing a program to recycle nuclear fuel would require a public-private partnership that operates outside normal Congressional appropriations and holds a charter to manage the fuel over a period of decades.


Composting Basics

Simply put, compost is decomposed organic material, either plant material or animal matter. Composting is a simple and natural process that occurs in nature, often without the assistance of mankind. Plants die at the end of the season and are consumed by organisms of all sizes, from larger mammals to microscopic ones. The result of this natural cycle is compost: a combination of digested and undigested material left on the earth that creates rich, soft, sweet-smelling soil.

Backyard or personal composting is the intentional and managed decomposition of organic materials to create compost. Anyone can manage the composting process effectively; the trick is to maximize decomposition while avoiding the unpleasant effects of a pile of decaying matter.

Compost is beneficial for home gardens because it improves the soil, which in turn supports healthier and more productive plants. Compost provides nearly all of the essential nutrients for healthy plant growth, and it releases those nutrients over time, giving plants a slow, steady, consistent supply of the elements necessary for growth.

A compost pile needs the right mixture of key ingredients, as well as air and water. The pile should consist of two classes of materials, referred to as "greens" and "browns." Green materials (fresh grass clippings, fresh manure, kitchen scraps, and weeds) are high in nitrogen, while brown materials (dry leaves, dried grass, straw, sand, and sawdust) are high in carbon. Too much of one ingredient or too little of another reduces the productivity of the microorganisms that break down the materials. The best combination is about four parts "brown" to one part "green."
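The four-to-one rule of thumb is easy to apply by volume; the helper below is a trivial sketch of it (real carbon-to-nitrogen balancing varies with the specific materials, so treat this as a starting point):

```python
BROWNS_PER_GREEN = 4  # rule of thumb from the text: four parts brown to one part green

def browns_needed(green_volume):
    """Volume of 'browns' (dry leaves, straw, sawdust) to balance a volume of 'greens'."""
    return BROWNS_PER_GREEN * green_volume

# Two buckets of fresh grass clippings call for about eight buckets of dry leaves.
print(browns_needed(2))  # -> 8
```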

To make a compost pile, some outdoor space needs to be dedicated to the process. The compost should be located close to the garden, as well as close to the source of raw materials (kitchen scraps, lawn clippings, etc.), without being an eyesore. Open bins and enclosed containers are both used for compost piles.

There are advantages and disadvantages to both types of compost piles; entire books have been written on the subject. Read up on composting to find which type best suits your needs.

Garden of OZ

Phosphorus Demand and the World Food Supply

Due to the ever-increasing global population, demand for food has risen greatly over the past few decades. Farmers around the world rely partly on phosphorus-based fertilizers to maintain and improve their crop yields. However, the overuse of phosphorus has led to freshwater pollution and a number of other problems, such as the growth of blue-green algae in lakes and the growing number of coastal 'dead zones.'

Phosphorus is also a non-renewable resource: it comes from phosphate rock, of which there are limited supplies. For the first time, a detailed map has been produced showing the imbalances in how phosphorus, an essential plant nutrient, is distributed and used around the world.

Graham MacDonald, a PhD student in the Department of Natural Resource Sciences at McGill University, who led the study, says, "Typically, people either worry about what happens when an excess of phosphorus finds its way into the water, or they focus on what happens when we run out of phosphorus." MacDonald says the study shows, on a global scale, that these are not two separate problems; the real issue is how to distribute the phosphorus we have.

The study used detailed agronomic information on how much phosphorus was applied to soils in fertilizers and manures for more than 100 different food, feed, and fiber crops produced around the world in 2000. The results show a large imbalance in phosphorus use, with overuse in some parts of the world and high deficits in others. While it is typically believed that phosphorus deficits exist only in poorer countries, the results proved this not to be the case. Phosphorus levels vary widely within most nations, with surpluses and deficits commonly occurring side by side within a single region.

Ukraine, long known as the Russian empire's 'bread basket,' is one area that suffers from phosphorus deficits. Eastern China and southern Brazil have become known as phosphorus 'hotspots.' Within these hotspots, the surplus of phosphorus from intensive fertilizer use is in danger of being lost as runoff from farmland pollutes freshwater supplies.

The study will help policy-makers make informed decisions at a national or global scale about the use of phosphorus.

McGill University via ScienceDaily

The Rising Cost of Settling the American Desert

In 1968, when Interior Secretary Stewart L. Udall replaced two dams in the Grand Canyon with an agreement to build a huge coal-fired power plant on the Arizona-Utah border near Page, he viewed the pact as one of the last steps in what he called “water statesmanship,” to resolve “the overwhelming political issue in Arizona.” When it was completed in 1974, the new 2,250-megawatt Navajo Generating Station (NGS) provided the power to draw 1.42 billion gallons of water a day out of Lake Havasu, along the border with California, and pump it 336 miles and nearly 3,000 feet uphill in the Central Arizona Project (CAP) canal all the way to Tucson.

Photo © Central Arizona Project: The Central Arizona Project canal system runs 336 miles from Lake Havasu, on the California border, to Tucson, providing water to nearly 80 percent of the state’s residents.

In almost every way conceivable, the power plant and the canal reflected the hubris of a rich nation at the height of its wealth, and determined to build in one of the driest regions on the continent energy-hungry and water-wasting cities that defied the laws of nature. Nearly four decades later, in an era marked by the warming climate, the increasing financial and environmental costs of generating power with coal, and declining reserves of fresh water in the West, the historically tenuous cords of legal agreement and civic support that have always defined the CAP are threatening to come unraveled.

At the core of the problem is the price of water, which is closely tied to the cost of operating the plant. Both could rise substantially if the Obama administration and the U.S. Environmental Protection Agency issue new rules to limit emissions of carbon dioxide and other haze-producing gases. There is also a drought that has persisted for more than a decade on the Colorado Plateau, raising a serious question about how much Colorado River water will be available to both cool the giant power plant, and also supply the CAP’s farm and business customers, and 80 percent of Arizona’s residents.

In other words, say executives of the CAP, the 36-year union of cheap energy and cheap water is about to end, and water prices for most Arizonans could increase drastically.
“We’re very concerned,” David Modeer, general manager of CAP, told Circle of Blue. “Given the atmosphere right now, we don’t think the owners would make that decision to invest in the plant. We’re looking at possible closure by 2019.”

During the past 10 weeks, in Choke Point: U.S., Circle of Blue has described in probing detail the collisions occurring in almost every region of the country between rising energy demand and declining reserves of fresh water. Next to agriculture, energy production withdraws and uses more water than any other sector of the American economy. Choke Point: U.S. has raised urgent questions about the capacity of the United States to meet a 40 percent increase in energy demand by mid-century—a projection developed by the Department of Energy—and not exhaust the country’s freshwater reserves. The collision between water and energy is most fierce in the fastest growing and most water-scarce regions of the nation, including California, the Southwest, the Rocky Mountain West and the Southeast.

Choke Point: U.S. identified the Colorado River as a particularly telling example of how energy demand and water scarcity are producing authentic threats to the American way of life. Lake Mead, the reservoir formed by Hoover Dam on the Colorado River near Las Vegas and one of the largest in the country, is 59 percent empty, and the water level is so low that the giant turbines in Hoover Dam could soon stop turning.

Upriver, where the NGS withdraws eight billion gallons of water annually from Lake Powell, the nation’s second largest reservoir, water levels have stabilized after falling to record low levels in 2005. As a hedge against future droughts, a deeper cooling water intake was completed earlier this year, allowing the plant to draw water below Lake Powell’s dead pool — the elevation where water levels are lower than the reservoir’s outlet pipes.

The newest threat to the power plant, say its managers, is the cost of controlling emissions, particularly carbon dioxide and haze-producing nitrogen oxides.

The closure of the NGS, they say, would be a brutal blow to CAP water prices because electricity from NGS is sold at cost. Farmers would be hit especially hard, since they currently pay for water at the cost of the energy to move it — a 60 percent discount as the result of a state compromise designed to end unsustainable groundwater pumping. Using energy price forecasts from Navigant Consulting Inc., and global engineering corporation Black & Veatch, CAP analysts estimate that water prices would rise by 50 to 300 percent if they had to purchase energy at market rates.

Photo © Central Arizona Project: The Picacho Pumping plant is one of 14 stations that lift Colorado River water 2,800 feet for delivery to central Arizona.

New Standards Mean More Green

Much of the energy produced in the United States is used to move water from its source to the treatment plant and then to the end user. Energy is also needed to dispose of billions of gallons of water every day. Moving water is typically the single largest electrical expense of the nation's cities and towns; nationally, four percent of electricity is used to move water from one place to another.

The U.S. Environmental Protection Agency, given new vigor under the Obama administration, announced in August 2009 that it was considering regulations to reduce nitrogen oxide emissions from NGS in order to improve visibility in the region, especially in Grand Canyon National Park, 12 miles southwest. Depending on what type of technology the EPA mandates, energy costs at NGS will rise between 1 and 34 percent, said engineers.

The nitrogen oxide regulations come at a time when carbon emissions are already under pressure to be priced or regulated in the United States. Though the Senate failed to pass a cap-and-trade bill this year, regional trading schemes are beginning to be floated and a national carbon price, while low on the political radar, is still possible. A carbon price of $20 per ton, which is in the middle of what the House passed in June 2009 and the Senate proposed this year, would increase CAP energy costs 71 percent. The price of an acre-foot of water (326,000 gallons) from the CAP for agricultural users would rise to $88 from $53 in 2011 prices.
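The acre-foot figures give a sense of the jump facing agricultural users; a quick check of the implied percentage increase, using only the prices quoted above:

```python
# Percentage increase implied by the article's acre-foot prices under a $20/ton carbon price.
price_2011 = 53.0    # $ per acre-foot, agricultural users, 2011 prices
price_carbon = 88.0  # $ per acre-foot with a $20/ton carbon price
increase_pct = (price_carbon - price_2011) / price_2011 * 100
print(f"{increase_pct:.0f}% increase")  # -> 66% increase
```

That is a roughly two-thirds increase for farmers from the carbon price alone, before any retrofit costs are passed through.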

The EPA also was granted authority by the U.S. Supreme Court in 2007 to regulate carbon emissions from big polluters under the Clean Air Act—which it plans to begin doing next year.

This adds up to a lot of uncertainty for CAP managers and Arizona water users. The CAP managers are not sure whether owners of the Navajo station will take on the higher costs from retrofits and the possibility of carbon prices, or shut the plant down. Executives of the consortium of utilities that own the Navajo station have been cautious in their statements. Installing new emissions control technology requires a unanimous vote of the owners. One owner, the Los Angeles Department of Water and Power, has already stated its intention to divest from coal-based energy sources by 2020.

Energy for Water

Water, at more than 8 pounds per gallon, is a heavy load when billions of gallons need to be lifted. As a result, the largest share of the energy used to move it goes toward pumping.

The longest, highest pumping systems in the United States are in the West, where water often travels great distances from source to user, nowhere more so than in California. The State Water Project sends Sierra Nevada snowmelt 444 miles to the state's southern half, and the Colorado River Aqueduct traverses 242 miles. Nearly 8 percent of California's electricity is invested in water movement, according to the California Public Utilities Commission (CPUC). Adding the energy end users spend heating water brings the total to 19 percent of the state's energy tied to water use. Meanwhile, the CPUC is looking into how much energy can be shifted away from peak demand hours by reducing water use.

"Energy costs play a big part in water rate increases," said Jon Lambeck, power resources manager at the Metropolitan Water District of Southern California.
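The scale of the Central Arizona Project lift can be checked with a back-of-envelope potential-energy estimate, using the withdrawal and lift figures quoted elsewhere in this article. This is an idealized sketch: it assumes perfectly efficient pumps and that every gallon travels the full lift:

```python
# Idealized pumping-energy estimate for the Central Arizona Project: E = m * g * h.
GAL_TO_KG = 3.785         # approximate mass of one US gallon of water, kg
FT_TO_M = 0.3048
G = 9.81                  # gravitational acceleration, m/s^2

gallons_per_day = 1.42e9  # CAP withdrawal from Lake Havasu, per the article
lift_m = 2800 * FT_TO_M   # total lift to central Arizona, per the article

joules_per_day = gallons_per_day * GAL_TO_KG * G * lift_m
mwh_per_year = joules_per_day / 3.6e9 * 365  # 3.6e9 J per MWh
print(f"{mwh_per_year / 1e6:.1f} million MWh per year")  # -> 4.6 million MWh per year
```

That is the same order as the roughly 4.5 million MWh per year the article implies CAP actually consumes (4.3 million MWh covering 95 percent of its needs), though real pumps are less than perfectly efficient and not all water travels the full distance.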

Photo © Salt River Project: The coal-fired Navajo Generating Station near Page, Ariz., which supplies 95 percent of the power for the Central Arizona Project, is facing possible emissions regulations from EPA that might cause the plant’s owners to shut it down. The power station is the nation’s fourth highest emitter of nitrogen oxides, gases that produce haze.

Across the West, water managers looking to increase supplies are facing significantly higher energy costs to access that water since all of the easily available sources are spoken for. A report from Western Resource Advocates, a research group, found that the energy intensity (the energy required to move a unit of water) of most proposed new supply projects is higher than current supplies because wells will have to be drilled deeper, canals extended farther and lower quality water treated more thoroughly.

Higher energy requirements for water supplies are coinciding with a national realization that past practices of electricity generation may not be viable in a water-constrained world.
The starkest example of how old water and energy practices are bumping up against new realities is the Central Arizona Project and the Navajo Generating Station.

CAP/NGS History

In 1968, the U.S. Congress authorized the Central Arizona Project, the country’s largest and most expensive aqueduct system, to bring Colorado River water to the state’s dry center. To that point, Arizona had been unable to effectively utilize its share of the river, but reaching the inland area required pumping the water nearly 3,000 vertical feet, an energy-intensive process that would turn CAP into the state’s top energy consumer.

To provide that electricity, the U.S. Bureau of Reclamation’s preferred option was to construct bookend dams at the margins of the Grand Canyon National Park. But those plans were scuttled after a Sierra Club advertising campaign comparing the proposal to flooding the Sistine Chapel for a better view of the ceiling prompted national outcry.
The resolution was the Navajo Generating Station: three coal-fired boilers with a total rated capacity of 2,250 megawatts. The coal would come from the Kayenta mine on the Navajo reservation, 92 miles away, providing the tribe $12 million a year, most of its revenue. The Navajo Nation, moreover, receives $137.5 million annually from employment in the plant and the mine, royalties, and other fees, according to the Central Arizona Project.

As the largest stakeholder in the venture, CAP receives 4.3 million megawatt-hours per year, almost a quarter of the plant’s electrical output. That is enough to meet 95 percent of its energy needs and generate an annual surplus of 1.5 million megawatt-hours that are sold on the open market to repay the federal government for construction of the canal, as well as pay for Indian water rights settlements.
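The "almost a quarter" claim can be sanity-checked against the plant's rated capacity. This is a rough check only, since actual output depends on the plant's capacity factor:

```python
# CAP's 4.3 million MWh entitlement as a share of NGS's theoretical maximum output.
capacity_mw = 2250
theoretical_max_mwh = capacity_mw * 8760  # running flat out all 8,760 hours of a year
cap_share_mwh = 4.3e6
share = cap_share_mwh / theoretical_max_mwh
print(f"{share:.0%} of theoretical maximum output")  # -> 22% of theoretical maximum output
```

Since no plant runs at 100 percent of capacity year-round, 22 percent of the theoretical maximum works out to roughly a quarter of actual generation at a realistic capacity factor, consistent with the article's figure.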

In 1999, EPA established a regional haze rule, requiring no manmade visibility impairment by 2064 around national parks and wilderness areas. NGS had already cleaned up its sulfur dioxide emissions with a retrofit in 1991 and is currently voluntarily installing low-nitrogen oxide burners at a cost of $46 million in order to reduce pollutants.

Photo © Central Arizona Project: As part of a deal to stop unsustainable groundwater pumping, farmers buy water from the Central Arizona Project at the cost of the energy to move it, or 40 percent the price other users pay.

While the EPA could rule that this is sufficient action, also on the table are two significantly more expensive technologies: selective catalytic reduction (SCR) and SCRs with polishing baghouses. The former would cost $550 million and increase energy costs 17 percent; the latter would cost $1.1 billion with a 34 percent rise in energy costs, according to Salt River Project, the plant’s operating partner.

The operators of NGS are in favor of the low-cost burners. They argue that the other options are too costly for marginal improvement that will not be humanly perceptible and that much of the haze comes from upstream sources outside the immediate area. The Navajo and Hopi tribes also support the operators’ position because they rely on coal mining and land leases for employment and tribal income. However, there is grassroots opposition within the tribal communities against NGS and Peabody Western Coal Company, which operates the mine, because of the health and environmental consequences of coal-mining, especially depletion of the aquifer system.

The National Park Service supports SCR technology and argues that Salt River Project overestimated the cost and underestimated the visibility improvements.
The EPA has not finished its analysis of control technologies and is still assessing submissions made during the open comment period, public affairs officer Margot Perez-Sullivan wrote in an email to Circle of Blue. There is no firm date yet on when a ruling will be made.

Modeer, CAP’s general manager, said that they are looking at alternative energy sources, but it would take 12 to 15 years to develop a base load source to replace the Navajo plant.
“The immediate concern of the Navajo Generating Station is a huge issue for Arizona,” Modeer added. “It’s a confluence of a lot of things that’s putting the station at risk. It’s not just the EPA. It’s carbon legislation, ozone issues, mercury issues. All operators are looking at the future.”

This article was originally posted on the Circle of Blue website.

Brett Walton is a Seattle-based reporter for Circle of Blue. Reach Walton at brett[at]circleofblue.org.

2010 Ties Record for World’s Warmest Year

Together with 1998 and 2005, 2010 ranked as the warmest year on record, according to the World Meteorological Organization. Data received by the WMO show no statistically significant difference among the global temperatures in 2010, 2005, and 1998.

In 2010, the global average temperature was 0.53°C (0.95°F) above the 1961-1990 mean. This value is 0.01°C (0.02°F) above the nominal temperature in 2005 and 0.02°C (0.04°F) above that in 1998. The differences between the three years are less than the margin of uncertainty (±0.09°C or ±0.16°F) in the comparison.
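The statistical tie follows directly from the numbers above: every pairwise difference among the three years is smaller than the stated ±0.09°C margin of uncertainty:

```python
# Pairwise differences among the 2010, 2005, and 1998 anomalies vs. the stated uncertainty.
a2010 = 0.53        # degrees C above the 1961-1990 mean
a2005 = a2010 - 0.01
a1998 = a2010 - 0.02
uncertainty = 0.09  # plus-or-minus margin of uncertainty, degrees C

pairs = [(a2010, a2005), (a2010, a1998), (a2005, a1998)]
print(all(abs(x - y) < uncertainty for x, y in pairs))  # -> True
```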

The statistics are based upon data sets maintained by the UK Meteorological Office Hadley Centre/Climatic Research Unit (HadCRU), the U.S. National Climatic Data Center (NCDC), and the U.S. National Aeronautics and Space Administration (NASA).

Arctic sea-ice cover in December 2010 was the lowest on record, with an average monthly extent of 12 million square kilometers, 1.35 million square kilometers below the 1979-2000 average for the month of December. This comes after the third-lowest minimum ice extent recorded in September 2010.

Between 2001 and 2010, global temperatures averaged 0.46°C (0.83°F) above the 1961-1990 average, the highest value recorded for any 10-year period since the beginning of instrumental climate records. Warming has been especially strong in Africa, parts of Asia, and parts of the Arctic, with many subregions registering temperatures 1.2 to 1.4°C (2.2 to 2.5°F) above the long-term average.

The information for 2010 is based on climate data from networks of land-based weather and climate stations, ships and buoys, and satellites. Data are continuously collected and disseminated by the National Meteorological and Hydrological Services (NMHSs) of the 189 Members of the WMO and several other collaborating research institutions.

The data feed three main global climate data and analysis centers, which develop and maintain homogeneous global climate datasets based on peer-reviewed methodologies. The WMO global temperature analysis is thus based on three complementary datasets: first, the combined dataset maintained by the Hadley Centre of the UK Met Office and the Climatic Research Unit of the University of East Anglia, United Kingdom; second, that of the National Oceanic and Atmospheric Administration (NOAA) under the United States Department of Commerce; and third, that of the Goddard Institute for Space Studies (GISS) operated by the National Aeronautics and Space Administration (NASA).