Feed aggregator

Two Simple Questions for Al Gore

Somewhat Reasonable - July 17, 2014, 11:52 AM

Al Gore is at it again. He was just in Australia (he should hire Weatherbell.com to help him avoid the “Gore effect” – Brisbane recorded its coldest temperature in 103 years during his stay) and told BBC, “This [climate change] is the biggest crisis our civilization faces.”

A statement like that, which echoes much of what Secretary of State John Kerry says, is very serious indeed, so perhaps Al Gore should answer a couple of basic questions.

Some points first. CO2 in the atmosphere is portrayed in proportions that distort it in the same way a picture of an ant under a microscope would distort its size in relation to the environment around it.

A pretty intimidating creature.

Next, CO2 shows no linkage with global temperature on the geological time scale, nor in recent times (as the Pacific started to cool, so did temperatures), as is plainly seen in the charts below.

Left: CO2 vs. temperatures; middle: model busts; right: close-ups of the latest temperatures

The CO2 graphics commonly shown are on a scale similar to putting an ant under a microscope and then claiming these monster creatures are taking over the earth. Since CO2 is measured in parts per million, the correct scale would run from zero to one million, rather than the zoomed-in axis most commonly used, which makes the increases look as if they could take over the world. The total is 400 parts per million, rising by 1.8 parts per million a year. Since a chart scaled in millions is impractical, and CO2 would not show up on it anyway because its contribution is so small, I will try to be more realistic and show CO2 in relation to the total atmosphere, as in the following chart. In this case, I will use the graphic of CO2 as a percentage of the atmosphere (.04%).

Another way of putting it: Here is Beaver Stadium at Penn. State filled to capacity with 107,000 people.

The current percentage of atmospheric CO2 is equivalent to picking 43 people out of those 107,000. The yearly increase from the U.S. adds roughly ¼ to 1/3 of a person to that crowd.

Consider this: The yearly increase in the level of CO2 from all sources is 1.8 ppm (parts per million). There are arguments as to how much of this is due to man. To make sure I give my opponents the benefit of the doubt, I will assume all of that increase is because of man.

Now remember, the heat capacity of the atmosphere is only 1/1000th of the ocean’s. That means when we are talking about man’s input of CO2 into the entire planetary climate system, the part we put into the air accounts for only 1/100th of the greenhouse effect, the primary greenhouse gas being water vapor. Water vapor makes up as much as 4% of the atmosphere, and the atmosphere has only 1/1000th the heat capacity of the ocean. Common sense reasoning shows that the effect of CO2 has to be boxed in by all this. But let’s continue, shall we?

The EPA estimates that the U.S. contributes about 1/5th of the CO2 man emits, which would be 0.20 x 1.8 ppm, or 0.36 ppm. I am not going to use smaller estimates of the U.S. contribution, which run as low as 10%. As I said, I am assuming all of the increase is from man, which is also arguable. But I want to consider the worst-case scenario.
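The proportions above can be checked with a few lines of back-of-envelope arithmetic. This is a minimal sketch using only the figures the article itself states (400 ppm total, a 1.8 ppm yearly rise, a 20% U.S. share, and the 107,000-seat stadium analogy):

```python
# Back-of-envelope check of the CO2 proportions cited in the article.
PPM = 1_000_000                   # parts per million
co2_total_ppm = 400               # current atmospheric CO2
annual_increase_ppm = 1.8         # yearly rise from all sources
us_share = 0.20                   # EPA estimate of the U.S. portion

# CO2 as a percentage of the atmosphere
co2_percent = co2_total_ppm / PPM * 100           # 0.04%

# U.S. contribution to the yearly increase (worst case: all man-made)
us_increase_ppm = us_share * annual_increase_ppm  # 0.36 ppm

# Stadium analogy: 400 ppm of a 107,000-person crowd
stadium = 107_000
people = co2_total_ppm / PPM * stadium            # ~43 people

print(f"CO2 share of atmosphere: {co2_percent:.2f}%")
print(f"U.S. yearly contribution: {us_increase_ppm:.2f} ppm")
print(f"Stadium equivalent of 400 ppm: {people:.0f} people")
```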

So let’s keep this short and sweet. Two questions for Mr. Gore:

1.) What is the perfect temperature for the planet?

2.) Do you really believe that the U.S. contribution of 0.36 parts per million of CO2 has any provably measurable effect on weather/climate?

Joe Bastardi is chief forecaster at WeatherBELL Analytics, a meteorological consulting firm.

Originally published at The Patriot Post. 

Categories: On the Blog

Heartland’s Science Director Breaks Ground to Rein in US EPA

Somewhat Reasonable - July 17, 2014, 9:46 AM

The U.S. Environmental Protection Agency announced on June 2 a power-plant proposal that seeks a 30% cut in carbon dioxide emissions from existing power plants by 2030, based on emission levels from 2005. With this proposal, the main piece of President Obama’s climate change agenda has been set in motion. Although the rule is scheduled to be completed one year from now and will give flexibility to the states, it will regulate carbon emissions from hundreds of fossil-fuel power plants across the U.S. The 600 U.S. coal plants will be hardest hit by the standard.

As reported by the International Energy Agency (IEA), the cost of decarbonization has increased by 22% since its last estimate two years ago, because the growth of coal power has far outpaced that of renewables. It would now cost $44 trillion to reduce carbon dioxide emissions from the world’s power generation industry to levels that would hold global warming to 3.6 degrees through 2050, a level international leaders have agreed is safe.
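The article does not give the earlier IEA figure, but the two numbers it does give (a current $44 trillion cost, up 22% from the prior estimate) imply it, as a line of hedged arithmetic shows:

```python
# If decarbonization now costs $44 trillion, and that is a 22% increase
# over the estimate of two years earlier, the earlier figure follows:
current_cost_tn = 44.0   # stated in the article, $ trillions
increase = 0.22          # stated 22% rise

previous_cost_tn = current_cost_tn / (1 + increase)
print(f"Implied earlier estimate: ${previous_cost_tn:.0f} trillion")  # ~$36 trillion
```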

Most of the warming in the past century occurred before 1940, before CO2 emissions could have been a major factor. Common sense alone should be enough to alert the Obama administration, and sound science confirms, that spending $44 trillion to limit the temperature rise to 3.6 degrees by 2050 wouldn’t amount to a hill of beans, other than to cause an economic disaster.

Between 1940 and 1970, temperatures fell even as CO2 levels increased. According to climate scientist Dr. Judith Curry and other notable scientists, the public debate is moving away from the 15-17 year pause in global warming and toward a period of global cooling. Why, then, would temperatures rise a few degrees by 2050? Because the “Sun sleeps,” according to Danish solar scientist Henrik Svensmark, who has declared that “global warming has stopped and cooling is beginning…enjoy global warming while it lasts.”

Nicolas Loris, an economist who focuses on energy, environmental, and regulatory issues at The Heritage Foundation, compares the EPA to a misbehaving child recklessly doing what it wants at the expense of others, without any supervision. He has proposed that Congress cut the EPA’s budget, just as parents might punish children by taking away their allowance. Loris suggests ten areas for Congress to cut that would prevent the EPA from implementing regulations “that will drive up living costs for American families for little, if any, environmental benefit.” He further recommends that the EPA eliminate programs that are “wasteful, duplicative, or simply not the role of the federal government,” and goes on to list programs that Congress should cut immediately.

Cuts could be made to rein in the EPA, as outlined in the plan put forth by Nicolas Loris. But Jay Lehr, Ph.D., science director for The Heartland Institute and one of the nation’s most respected and widely cited experts on air and water quality, climate change, and biotechnology, broke ground by revealing a comprehensive plan to reform the EPA in his remarks as a keynote breakfast speaker at Heartland’s recent International Conference on Climate Change, held in Las Vegas July 7-9. Lehr introduced a legislative plan to replace the United States Environmental Protection Agency with a Committee of the Whole of the state environmental protection agencies, using a phased five-year transition period.

Jay Lehr’s comprehensive EPA plan follows:

In 1968, when I was serving as the head of a ground water professional society, it became obvious to me and a handful of others that the United States had no serious focus on potential problems with its air quality, drinking water quality, surface water quality, and waste disposal, as well as contamination that could occur from mining and agriculture. I held the nation’s first Ph.D. in ground water hydrology, which gave me the insight to understand the problems. I was asked by the director of the Bureau of Water Hygiene in the U.S. Department of Health to serve on a panel studying the potential to expand its oversight into a full environmental protection organization.

We collectively spoke before dozens of congressional committees in both the House of Representatives and the U.S. Senate, calling attention to mounting environmental pollution problems. We called for the establishment of an Environmental Protection Agency, and in 1971 we succeeded. I was appointed to a variety of the new agency’s advisory councils, and over the next 10 years we helped to write a significant number of legislative bills which were to make up a true safety net for our environment. They included: THE WATER POLLUTION CONTROL ACT (later renamed THE CLEAN WATER ACT); THE SAFE DRINKING WATER ACT; THE RESOURCE CONSERVATION AND RECOVERY ACT; THE SURFACE MINING CONTROL AND RECLAMATION ACT (which surprisingly covered deep mines as well); THE CLEAN AIR ACT; THE FEDERAL INSECTICIDE, FUNGICIDE, AND RODENTICIDE ACT; and finally THE COMPREHENSIVE ENVIRONMENTAL RESPONSE, COMPENSATION, AND LIABILITY ACT, which we now know as Superfund.

All of these acts worked extremely well for the protection of the environment and the health of our citizens, with the exception of the Superfund law, which proved far too overreaching and wreaked havoc on American business: companies that had operated within the law were fined after the fact and required to pay huge sums to clean up waste disposal that had been legal at the time of the activity.

Beginning in 1981, activist groups recognized that the EPA could be used to alter our government by coming down heavily on all human activities, regardless of their impact on the environment. It is my strong opinion that no single law or regulation passed since 1981 has benefited either the environment or society.

The takeover of the EPA and all of its activities by environmental activists was slow and methodical over the past 30 years, to the point where the EPA is all but a wholly owned subsidiary of environmental activist groups, controlling 10% of the U.S. budget.

For over 20 years I have worked tirelessly to expose this story to the public beginning with my 1991 book, ‘Rational Readings on Environmental Concerns’, where 50 other environmental scientists joined together to describe the manner in which their own fields had been hijacked and distorted to allow fear mongering of an unconscionable nature.

I believe that the current structure of the U.S. EPA can and should be replaced now by a Committee of the Whole of the outstanding 50 state environmental protection agencies, which in nearly all cases long ago took over primary responsibility for implementing all environmental laws passed by Congress or simply handed down by the U.S. EPA as fiat rulings, without congressional vote or oversight.

When the agency was established in 1971, the federal government had no choice but to oversee all regulations of the initial seven safety-net laws. Rapidly, however, every state established an independent agency, which filed for and was granted primary control of the implementation of all the existing laws. With only rare exceptions, each state now has full control of its regulatory program. The states are continually harassed to ensure no one evades the heavy hand of the dozens of new regulations passed yearly over the past three decades, which have strengthened the initial laws but benefited neither our environment nor the health of our citizens. Instead, they deter our economy and the right of our citizens to make an honest living without endangering the environment.

With 30 years of experience, these 50 state environmental agencies are ready to collectively take over the entire management of the nation’s environment. Only the USEPA research laboratories should be left in place to answer continual scientific questions, which would no longer be under the heavy hand of Washington politics.

Eighty percent of what is now the USEPA’s budget could be eliminated, while 20% could be used to run the research labs and administer the Committee of the Whole of the 50 state agencies. A relatively small administrative structure would be needed to allow the collective states to refine all existing environmental laws in a manner more suitable to the primary requirement of protecting our environment, without thwarting national progress in industry and the development of our natural resources and energy supplies.

The USEPA could be phased out over five years, with a one-year preparation period followed by a four-year program in which 25% of the agency’s activities would be passed to the Committee of the Whole each year, beginning with those activities least critical to the nation. The Committee of the Whole would be made up of representatives from each state for each significant area of concern, and would be divided into subcommittees reflecting exactly how the USEPA is set up, although many programs and offices within the USEPA, for instance those whose primary purpose is oversight of the state agencies, may be eliminated at the will of the states.

The Committee of the Whole would quickly determine which regulations are actually mandated in law by Congress and which were established independently by the USEPA in the belief that legislation allowed it such latitude. Rules written clearly into legislation would be recommended for continuance, or Congress would be asked to take another look at any the Committee of the Whole deems unnecessary in their current form.

Regulations clearly not supported by writings within legislation would be considered by the applicable subcommittees and the whole committee for alteration by a two-thirds vote of the Committee of the Whole.

Until the Committee of the Whole acted upon each individual regulation, all regulations would remain in force.  Many regulations would give states latitude to act while others would be required of all states by a two-thirds vote of the Committee of the Whole.

Each state would be funded to increase their staff to include people whose primary jobs would be to serve on subcommittees of the Committee of the Whole overseeing the issues previously overseen by the current USEPA.

This phase-out of the USEPA could be done in an orderly manner within five years. Oversight of the existing USEPA research labs would eventually be ceded to a subcommittee of the whole.

When one considers how the USEPA was initially set up, along with the growth of the state agencies, this is actually a logical endpoint that could have begun 30 years ago.

The specific details of the five-year transfer from the Washington D.C. based USEPA and its 10 regional offices would be carried out as follows.

The Federal Budget for environmental protection will be reduced from $8.2 billion to $2 billion.

The manpower will be reduced from over 15,000 to 300, who will serve in the new headquarters, to be located centrally in Topeka, Kansas, to allow the closest contact with the individual states and reduce travel costs from the states to the central headquarters of the Committee of the Whole.

The 300 individuals working in Topeka, Kansas will come as six delegate/employees from each of the 50 states.  The personnel currently working at more than two dozen research centers will remain in place until the Committee of the Whole chooses to make changes.

The United States Environmental Protection Agency is presently divided into 14 Offices which include the following:

OFFICE OF THE ADMINISTRATOR

AMERICAN INDIAN ENVIRONMENTAL OFFICE

OFFICE OF INTERNATIONAL AND TRIBAL AFFAIRS

OFFICE OF POLICY

OFFICE OF ADMINISTRATION AND RESOURCES MANAGEMENT

OFFICE OF ENFORCEMENT AND COMPLIANCE ASSURANCE

OFFICE OF AIR AND RADIATION

OFFICE OF CHEMICAL SAFETY AND POLLUTION PREVENTION

OFFICE OF SOLID WASTE AND EMERGENCY RESPONSE

OFFICE OF WATER

OFFICE OF THE CHIEF FINANCIAL OFFICER

OFFICE OF GENERAL COUNSEL

OFFICE OF ENVIRONMENTAL INFORMATION

OFFICE OF RESEARCH AND DEVELOPMENT

Year One:

In the first year of transition, all employees of the USEPA will be informed of the five-year transition period, allowing them ample time to seek other employment opportunities, both from the Washington, D.C. offices and from the ten regional offices that to a large extent parallel the activities in Washington. Additionally, during year one the two offices relating to Indian issues (the American Indian Environmental Office and the Office of International and Tribal Affairs) will be transferred to THE UNITED STATES BUREAU OF INDIAN AFFAIRS, which should welcome this responsibility, along with about half of the monies budgeted for them within the USEPA. During this first year, all 300 employees relocating from our 50 states (six each) will begin work in the new Topeka, Kansas offices, to be established early in year one.

A chairman of the Committee of the Whole will be elected by the 300 delegate/employees to a three-year term early in the transition, after which the 300 delegate/employees will be assigned to subcommittees simulating the offices that exist in Washington, D.C.

Year Two:

During year two all activities of the Offices of Policy, Administration and Resource Management, as well as Enforcement and Compliance Assurance, will be transferred to Topeka from Washington, D.C. and the Regional Offices.

Years Three, Four, and Five:

In year three all activities of the Offices of Air and Radiation, as well as Chemical Safety and Pollution Prevention, will be transferred to Topeka. In year four the responsibilities of the Office of Water and the Office of Solid Waste and Emergency Response will move to Topeka. In the final year the remaining Offices of the Chief Financial Officer, General Counsel, and Environmental Information, along with the Office of the Administrator, will have their responsibilities moved to Topeka.
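The year-by-year transfer described above can be summarized as a simple schedule. The sketch below (office names abbreviated from the list of 14 offices earlier in the plan) also checks that every office is accounted for, with Research and Development remaining in place to run the labs:

```python
# Sketch of the five-year transfer schedule described in the plan.
schedule = {
    1: ["American Indian Environmental Office",   # to Bureau of Indian Affairs
        "International and Tribal Affairs"],      # to Bureau of Indian Affairs
    2: ["Policy",
        "Administration and Resources Management",
        "Enforcement and Compliance Assurance"],
    3: ["Air and Radiation",
        "Chemical Safety and Pollution Prevention"],
    4: ["Water",
        "Solid Waste and Emergency Response"],
    5: ["Chief Financial Officer",
        "General Counsel",
        "Environmental Information",
        "Administrator"],
}

transferred = [office for offices in schedule.values() for office in offices]
remaining = ["Research and Development"]   # labs stay in place

# All 14 offices are either transferred or left running the research labs.
assert len(transferred) + len(remaining) == 14
print(f"{len(transferred)} offices transferred over 5 years; "
      f"{len(remaining)} remains with the research labs")
```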

During each year of transition members of the Topeka staff will be assigned for periods of time to both the Washington D.C. Offices and the Regional Offices to study the activities of the existing units. It is quite likely that as the office responsibilities are transferred to Topeka, the Committee of the Whole of the 50 state agencies will choose to totally eliminate some of them.

It is also anticipated that if some D.C. offices experience an early excessive attrition of employees relocating before the phase out of their office, an earlier transfer of responsibility to Topeka may be required.

As monies are freed up in the transition from 15,000 federal employees to 300, each state will be allocated $20 million to enhance its new independent responsibilities and to replace the six employees transferred to Topeka. In addition to that use of one billion dollars (50 x $20 million), it is anticipated that the management of the Topeka offices and the continuation of the research and development program will require a second billion dollars, allowing the permanent reduction of an $8.2 billion federal outlay for environmental protection down to a total of $2 billion.
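The budget arithmetic above can be verified with a minimal sketch using only the figures the plan states:

```python
# Budget arithmetic for the proposed transition, per the plan's figures.
current_budget_bn = 8.2        # current federal environmental outlay, $ billions
states = 50
per_state_grant_bn = 0.020     # $20 million per state
topeka_and_research_bn = 1.0   # Topeka offices plus the R&D program

state_grants_bn = states * per_state_grant_bn            # $1 billion
new_budget_bn = state_grants_bn + topeka_and_research_bn # $2 billion
savings_bn = current_budget_bn - new_budget_bn           # $6.2 billion

print(f"New budget: ${new_budget_bn:.1f} billion (was ${current_budget_bn} billion)")
print(f"Annual savings: ${savings_bn:.1f} billion")
```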

Not only will large sums of money be saved by this transition, but the efficiency and quality of environmental protection will be enhanced by placing power and responsibility in the hands of the individual states.  It is well known that government closest to the location of the governed is the best for all.  Most states will embrace this plan, as it gives them the authority they have always sought and the funding to carry it out.

At this point it should be recognized that it is absurd to think that 50 outstanding state environmental protection agencies with over 30 years of experience require the oversight of 15,000 federal employees. It made sense in the agency’s first decade, the 1970s; it has made no sense since then. Today the EPA is an agency whose value has long since disappeared. The new oversight by the collective Committee of the Whole of the 50 state agencies will carry out the needs of the nation more effectively and more efficiently, at dramatically reduced cost, in line with how our republic was established to operate.

Jay Lehr, Ph.D. was a keynote speaker on Wednesday, July 9, at 8:00 AM PDT. Listen to Lehr’s remarks here.

[Originally published at Illinois Review]

Categories: On the Blog

Down the Memory Hole

Somewhat Reasonable - July 17, 2014, 9:00 AM

In his novel Nineteen Eighty-Four, George Orwell described the “memory hole,” a chute leading to a vast incinerator into which all unwanted documents were cast. The memory hole served as the ultimate form of state censorship, destroying any trace of information deemed to pose a threat to the regime. Thanks to a ruling in May by the European Court of Justice, a genuine digital memory hole has come online.

The ruling concerned the so-called “right to be forgotten,” wherein individuals and firms have the right to have old stories and information published online about them taken down at their request. Ostensibly a way to remove out-of-date or spurious information from readers’ online searches, the reality has taken on a much darker, quite Orwellian turn.

Several businesses have begun the process of “scrubbing” their records. In other words, they file requests with Google to remove access to certain articles. This has resulted in many pieces of genuine journalism suddenly becoming impossible to access. The implication is clear: citing their right to be forgotten, firms and individuals can muzzle the press.

This poses a very sticky issue for defenders of individual rights. Many liberty-conscious individuals have invested in technologies that allow them to search the internet anonymously and to avoid the prying eyes of governments and businesses alike. Privacy is an ever scarcer commodity in this Information Age and people have belatedly come alive to the risks they face in it.

The memory hole is a threat to everyone. While it may purport to be a protection of people’s privacy, it is in fact a mechanism by which some individuals and groups can manipulate internet searches without any means of appeal. When there is a legal right to be forgotten, it does not much matter whether there are multiple internet service providers to choose from; they are all muzzled by the net non-neutrality created by the ECJ’s ruling. If the ruling spreads to America and farther afield, the latter-day renaissance of the free press enjoyed by web users will be brought to an abrupt close.

While it may be embarrassing to have damaging information floating forever online, such openness is a necessary component of the transparency demanded by the people born into this era. The press cannot be silenced simply because what it says is embarrassing. People have a right to access information made public online, and journalists have a right to publish their work without fear of its being shut down for the sake of those they would expose.

Categories: On the Blog

New York Times Commits Rare Act of Objective Climate Journalism, Salon Pounces in Outrage

Somewhat Reasonable - July 17, 2014, 8:06 AM

NY Times photo of Dr. John Christy. Salon would like to see him a bit less comfortable.

On Wednesday afternoon, Salon’s Lindsay Abrams — who declined to cover Heartland’s latest climate conference — upbraided the New York Times for recently profiling “skeptic” climate scientist John Christy and not sufficiently denouncing him and his views.

It’s a weird thing. Against all odds, the Times committed an act of objective journalism — or what passes for that in today’s politicized MSM — with a classic contrarian piece titled “Though Scorned by Colleagues, a Climate-Change Skeptic Is Unbowed.” Maybe it was an accident on the part of the Times, but Abrams didn’t want it to go unpunished. Can’t step out of line, you see.

I read both the Times piece and Abrams’ critique of it at Salon titled “New York Times’ Climate Skeptic Debacle: How a New Profile Sets Back Science.” The subhead at Salon makes sure (literally) that no one misses the point:

Paper of Record Issues Bizarrely Sympathetic Treatment of Prominent Skeptic John Christy, Totally Misses the Point.

Having recently returned from working with the media for Heartland’s Ninth International Conference on Climate Change in Las Vegas — and having already written two rebuttals to prominent journalists to set the record straight about what happened there — I couldn’t resist sending an email to Ms. Abrams. Enjoy it, below:

Lindsay,

Interesting article today. I had hoped to simply put aside your distaste for the New York Times committing the journalistic sin of reflecting the tiniest bit of sympathy for the esteemed Dr. John Christy and get right to critiquing your essay. But that is impossible because your objections to what was a reasonable and fair New York Times piece seem so small and baseless to me. What is so objectionable — what “missed the mark” — about the following?

[Paraphrase] A soft lead that opens with Christy relating how scientists who disagree with his analysis of the climate data refuse to shake his hand.

That is certainly true, and a great way to frame for readers of the Times how Christy is willing to listen to and be cordial towards “the other side” of the climate debate. Should the Times have not mentioned this because it portrays Christy as the better man? You seem to be saying: Yes. That is rather uncharitable. Indeed, Christy was quite the gentleman for not embarrassing the rude, contrarian scientist by revealing his name.

But in speeches, congressional testimony and peer-reviewed articles in scientific journals, he argues that predictions of future warming have been greatly overstated and that humans have weathered warmer stretches without perishing.

This is demonstrably true in the scientific record. See the presentation by Greenpeace co-founder Dr. Patrick Moore at Heartland’s latest climate conference.

Christy’s willingness to publicize his views, often strongly, has also hurt his standing among scientists who tend to be suspicious of those with high profiles.

This is laughable. To use just one high-profile alarmist scientist as an example, James Hansen has leveraged his former position at NASA to publicize himself without shame. Hansen has certainly testified more frequently in front of Congress than Christy has, and even got himself arrested on purpose to draw attention to his view that human CO2 emissions have doomed future generations to eco-calamity.

For these offenses, Hansen was punished with more attention, including a high-profile “Ted” lecture. I think it’s safe to say Hansen’s antics didn’t “hurt his standing” among his alarmist peers.

The Times continues, quoting Christy:

“I detest words like ‘contrarian’ and ‘denier,’ ” [Christy] said. “I’m a data-driven climate scientist. Every time I hear that phrase, ‘The science is settled,’ I say I can easily demonstrate that that is false, because this is the climate — right here. The science is not settled.”

Dr. Christy was pointing to a chart comparing seven computer projections of global atmospheric temperatures based on measurements taken by satellites and weather balloons. The projections traced a sharp upward slope; the actual measurements, however, ticked up only slightly.

Such charts — there are others, sometimes less dramatic but more or less accepted by the large majority of climate scientists — are the essence of the divide between that group on one side and Dr. Christy and a handful of other respected scientists on the other.

What is the objection? That the New York Times allowed Christy to deny that he is a “climate denier”? That’s the truth, as the Times story acknowledges by noting the difference between the observed climate data and the “sharp upward slope” of the climate models that have been wrong for decades. (BTW: The scores of “other” published alarmist models were wrong at least 95 percent of the time.) For a larger discussion of these dynamics, see presentations by Dr. Roy Spencer and Dr. Patrick Michaels at Heartland’s latest climate conference.

“Where the disagreement comes is that Dr. Christy says the climate models are worthless and that there must be something wrong with the basic model, whereas there are actually a lot of other possibilities,” Dr. Mears said. Among them, he said, are natural variations in the climate and rising trade winds that have helped funnel atmospheric heat into the ocean.

So there is a dispute. Dr. Christy says there is “something wrong with the basic model” the “consensus” scientists use — they are nearly universally and dramatically wrong, after all — and other scientists say the heat they predicted is hiding in the ocean. The larger burden of proof in this dispute is on the “consensus” scientists, who have yet to find the hidden heat in the oceans. Why? Because they can’t find compelling data to support it.

Did the Times “miss the mark” by merely suggesting to an informed reader like yourself that Christy might have a point? (BTW, many “skeptics” at Heartland’s latest climate conference are big “natural variation” advocates, so even the critical source the Times found to counter Christy concedes a key “skeptic” point.)

I’ll stop breaking down the Times’ profile of Christy with this last excerpt:

Dr. Christy has been dismissed in environmental circles as a pawn of the fossil-fuel industry who distorts science to fit his own ideology. (“I don’t take money from industries,” he said.)

He says he worries that his climate stances are affecting his chances of publishing future research and winning grants. The largest of them, a four-year Department of Energy stipend to investigate discrepancies between climate models and real-world data, expires in September.

“There’s a climate establishment,” Dr. Christy said. “And I’m not in it.”

Ask any of the scores of scientists who have presented at Heartland’s climate conferences: Finding the alarmist climate models wanting puts one’s career in jeopardy, and affects one’s “chances of publishing future research and winning grants.” Or, you could simply watch Pat Michaels’ presentation at our climate conference last week. He makes the point, from experience, quite emphatically.

Your objection to this last bit from the Times is this:

Wines dismisses that in a parenthetical comment from Christy (“I don’t take money from industries”), and leaves it at that.

This, again, plays right into Christy’s desire to be seen as misunderstood — he’s been careful to avoid associations not just with polluting industries, but with most of the groups dedicated to spreading climate denial. He doesn’t attend the Heartland Institute’s annual climate denial conferences, he told the Times’ Andrew Revkin several years back, because he wants to avoid “guilt by association.”

Christy is welcome at our conferences, because there is no “guilt by association.” As I explained to Slate’s Will Oremus the other day — and we explain in more detail here — Heartland is not funded by fossil fuel corporations. Indeed, no corporate gifts amount to more than 5 percent of Heartland’s annual budget, and no corporate gifts finance our climate conferences or the Climate Change Reconsidered series of scientific reports by the Nongovernmental International Panel on Climate Change (NIPCC).

(BTW: I just have to note that the foreground of the photo that accompanies the Times piece on Christy features the latest NIPCC report, “Climate Change Reconsidered II: Biological Impacts.”)

You write:

Yet Christy’s perspective on global warming — that the effects will be mild, and potentially even beneficial — is more or less aligned with those voiced by the participants in Heartland’s most recent conference, which took place last week. Aside from a few remaining loonies, most deniers have by now conceded the two most basic facts of climate change: that the climate is changing, and that man-made emissions of greenhouse gases are at least partially responsible.

Putting aside the “loonies” crack, this passage is accurate! Most scientists and policy experts who have presented at Heartland’s conferences haven’t recently “conceded” that truth; they have stated it plainly for years. As I explained to Oremus, if you had come to our conference yourself, you would have heard that from scientists in the hall between sessions and been disabused of your wrong-headed thinking about scientific climate “skeptics.” You can still watch the video presentations yourself.

Speaking of Oremus, you write:

Christy’s not special in this regard. Instead, he’s part of a growing movement that Will Oremus, writing in Slate, describes as an effort to rebrand climate denial as “climate optimism”: the idea that climate change, while real, isn’t something worth worrying about — and certainly not worth making an effort to mitigate. In some ways, this is even more dangerous than flat-out denial, which is at least easy to shut down; climate optimism, instead, conflates science with conservative political ideology …

That “rebrand” was Oremus’ doing, not ours, but it has a nice ring to it: “climate optimism.” It accurately reflects the majority of viewpoints expressed by scientists — none of whom deny that climate change is “real” — who presented at Heartland’s latest climate conference, and the previous eight. To quickly sum up:

1. Global temperatures rose in the 20th century.

2. CO2 is a “greenhouse gas,” and it’s likely that human emissions of CO2 were responsible for some of the warming of the 20th century.

3. Global temperatures have not risen in the 21st century (and for nearly 18 years) according to observable data.

4. That fact suggests human CO2 emissions are not a primary driver of global temperatures, especially since even the warming of the 20th century is within natural variation — with natural factors being a much stronger force.

5. So we should think twice about re-ordering the world’s energy economy — and depriving the developing world of much-needed cheap energy to raise them from poverty — without stronger scientific justification.

For a fuller rebuttal of the myths climate alarmists falsely attribute to skeptics, a one-stop summary is provided in this excellent and humorous presentation by Christopher Monckton at our latest conference.

Here’s a bonus remarkable view by many climate “skeptics”: We are living in a world that is starving for even more CO2 in the atmosphere! Again, Greenpeace co-founder Patrick Moore tackles that controversial subject rather well in his presentation.

But that’s the thing about “skeptic” scientists, such as Dr. Christy: They continually challenge each other with science from multiple disciplines to foster a greater understanding about what is really happening to our climate. To do otherwise — to think the “science is settled” — is what it really means to “deny science.”

Best,

Jim

Categories: On the Blog

Promoting Parasitic Power Producers

Somewhat Reasonable - July 17, 2014, 8:06 AM

Letter to the Editor
Somewhat Reasonable

Wind and solar are parasitic power producers, unable to survive in a modern electricity grid without the back-up of stand-alone electricity generators such as hydro, coal or gas. And like all parasites, they weaken their hosts, causing increased operating and transmission costs and reduced profits for all participants in the grid.

Without subsidies, few large wind/solar plants would be built, and without mandated targets, few would get connected to the grid.

Green zealots posing as energy engineers should be free to play with their green energy toys at their own expense, on their own properties, but the rest of us should not be saddled with their costs and unreliability.

We should stop promoting parasitic power producers. As a first step, all green energy subsidies and targets should be abolished.

Categories: On the Blog

Internet sales tax opposed by voters in 6 states, poll finds

Out of the Storm News - July 16, 2014, 4:28 PM

From Law360:

Polling released Friday by free-market think tank R Street Institute and conservative lobbying group National Taxpayers Union indicates that a majority of residents in six states including Pennsylvania, Minnesota and Wisconsin oppose efforts floating in Congress to impose new sales tax collection laws on Internet retailers.

R Street and NTU are conducting a national polling campaign that the groups say is aimed at measuring the public’s support of the Marketplace Fairness Act. So far, a majority of respondents in Pennsylvania, Minnesota, Wisconsin, North Carolina, Virginia and…

Rewriting History

Blog - Education - July 16, 2014, 3:07 PM

The same man who was one of the lead architects in creating the Common Core State Standards Initiative, David Coleman, has now redesigned the Advanced Placement United States History (APUSH) course and exam. Aside from the fact that these huge changes have received almost no media coverage (possibly because the Common Core advocates at the Gates Foundation are now funding education coverage by NBC) and they greatly remove control from parents, teachers, and students, the revision aims to teach a biased version of American history that largely focuses on the supposed faults of our country rather than our accomplishments.

While multiple practice exams for the course were previously available to the public online, the College Board, the organization responsible for AP tests, will now only release a single practice exam to teachers of the course. If a teacher discloses the content of the sample exam, he will be penalized and possibly stripped of his right to teach Advanced Placement courses. This lack of transparency is an attempt to silence the public as well as to foster reliance on the state for education. Since students can no longer self-prep for the exam, they must take the course to discover what material will be covered on the exam.

Additionally, the redesigned course severely limits individual teachers’ flexibility in teaching the material. Previously, APUSH teachers received a five-page overview of how to teach an AP course; now the College Board has published a 142-page framework dictating specific topics teachers must cover as well as the manner in which they must be covered. Unfortunately, teachers are forced to use this framework if they want to give their students a chance to succeed on the exam and receive college credit for the course.

Along with this grand encroachment on teachers’ freedom in the classroom, the material presented in the framework is clearly biased to portray American history from a Leftist point of view. The course morphs the discipline of history into a subject that more closely resembles sociology. The framework does not emphasize student knowledge of important figures and events that shaped our great nation but rather the development of “historical thinking skills,” with much emphasis on the changing roles of race and ethnicity, gender, social classes, and power relations throughout our country’s history. In fact, the required themes and objectives are “Work, Exchange, and Technology; Identity; Ideas, Beliefs, and Culture; America in the World; Environment and Geography; Politics and Power; Peopling,” with no mention of figures and events.

The philosophy behind the revision of APUSH is flawed in itself, but the view of history it seeks to present is historically dishonest and utterly dangerous to the future of our country. The most blatant inaccuracy is evident in the framework’s discussion of the Founding of our nation. Incredibly, George Washington is mentioned only in passing reference to his Farewell Address, not in regard to his heroic sacrifices as a general and as the first leader of our country. Furthermore, no other Founding Fathers are mentioned, nor are any of the events or principles that led to the American War for Independence.

Most amazingly, neither the Declaration of Independence nor the Constitution is mentioned in great detail or even listed as suggested reading (though Betty Friedan’s The Feminine Mystique is suggested reading, placing it and other biased sources above some of the most important documents to the American people). The theories and principles behind these two founding works are never discussed, and neither is how our government is organized. The lack of import placed on these essential documents does a great disservice to the future voters of this country and further promotes a reliance on the state for educating voters with what it deems necessary rather than allowing individuals to think for themselves.

What the framework does mention regarding the Founding of our nation (and in every historical period following) is the apparent tension and inequality among various minority groups. This topic is a major theme of the new course at the expense of the study of influential people and events that formed and maintained the United States. While it may be important to study the trends and conflicts between groups in America, it is academically dishonest to slant history to overwhelmingly focus on these aspects without, for example, mentioning major battles or political conflicts during the Civil War, the Gettysburg Address, the fact that Lincoln was assassinated, key details and motives in World War I, or even Hitler’s existence and prominence in World War II.

The new APUSH model is far from objective in what it chooses to cover (discussing Wilson and FDR more than any other presidents or important figures, for instance) as well as the manner in which it is covered. States’ rights and capitalism are continually criticized and linked with inequality, while the federal government is portrayed as the champion of social justice through policies from the New Deal to the Great Society. While negative aspects of limited federal regulation are highlighted and even blamed for everything from the South’s belief in slavery to the Great Depression, the counterargument to big government is never discussed.

The revision to the Advanced Placement United States History course claims to promote “historical thinking skills,” but in reality it simply fosters indoctrination and historical inaccuracy. If David Coleman and the College Board were truly interested in “thinking skills” they would have designed a transparent course that presents the facts and allows students to draw their own conclusions. Instead the College Board has chosen to tell students what they must think about American history and to withhold some of the most important aspects of our nation’s history from the next set of leaders in this country.

 

Image originally published at http://vyturelis.com/emanuel-leutze-washington-crossing-the-delaware-describes-the-historical-event-when.htm

 

Categories: On the Blog

You Cannot Rewrite Laws to Achieve Your Political Agenda

Somewhat Reasonable - July 16, 2014, 2:02 PM
Now that the dust has settled on the Supreme Court’s 2014 session, we can look at the decisions and conclude that the Administration received a serious smackdown. Two big cases got most of the news coverage: Hobby Lobby and the National Labor Relations Board’s (NLRB) recess appointments. In both cases, the Administration lost. At the core of both is the issue of the Administration’s overreach. Among the cases the Supreme Court heard, one had to do with energy—and it, too, offered a rebuke.

You likely haven’t heard about Utility Air Regulatory Group (UARG) v. Environmental Protection Agency (EPA)—and may think you don’t care. But with the session over, UARG v. EPA makes clear the Court’s trend to trim overreach.

The UARG v. EPA decision came down on June 23. None of the major news networks covered it. Reviews of the 2014 cases since the end of the session haven’t mentioned it either. The decision was mixed, with both sides claiming victory. Looking closely, there is cause for optimism for all who question the president’s authority to rewrite laws.

A portion of the UARG v. EPA case was about the EPA’s “Tailoring Rule” in which it “tailored” a statutory provision in the Clean Air Act—designed to regulate traditional pollutants such as particulate matter—to make it work for CO2. In effect, the EPA wanted to rewrite the law to achieve its goals. The decision, written by Justice Antonin Scalia for the majority, stated:

“Were we to recognize the authority claimed by EPA in the Tailoring Rule, we would deal a severe blow to the Constitution’s separation of powers… The power of executing laws…does not include a power to revise clear statutory terms that turn out not to work in practice.”

Had the EPA gotten everything it wanted, it could have regulated hundreds of thousands of new sources of CO2—in addition to the already-regulated major industrial sources of pollutants. These new sources would include office buildings and stores that do not emit other pollutants—but that do, for example, through the use of natural gas for heating, emit 250 tons or more of CO2 a year.

The Supreme Court did allow the EPA to regulate CO2 emissions from sources that already require permits due to other pollutants—and therefore allowed the EPA and environmentalists pushing for increased CO2 reductions to claim victory because the decision reaffirmed the EPA does have the authority to regulate CO2 emissions. However, at the same time, the decision restricted the EPA’s expansion of authority. Reflecting the mixed decision, the Washington Post said the decision was: “simultaneously very significant and somewhat inconsequential.”

It is the “very significant” portion of the decision that is noteworthy in light of the new rules the EPA announced on June 2.

Currently, the Clean Air Act is the only vehicle available to the Administration to regulate CO2 from power plant and factory emissions. However, the proposed rules that severely restrict allowable CO2 emissions from existing power plants, and will result in the closure of hundreds of coal-fueled power plants, bear some similarities to what the Supreme Court just invalidated: both involve an expansive interpretation of the Clean Air Act.

It is widely believed that the proposed CO2 regulations for existing power plants will face legal challenges.

Tom Wood, a partner at Stoel Rives LLP who specializes in air quality and hazardous waste permitting and compliance, explains: “Although the EPA’s Section 111 (d) proposals cannot be legally challenged until they are finalized and enacted, such challenges are a certainty.” With that in mind, the UARG v. EPA decision sets an important precedent. “Ultimately,” Wood says, “the Supreme Court decision seems to give more ammunition to those who want to challenge an expansive view of 111 (d).” Wood sees it as a rebuke to the EPA—a warning that in the coming legal battles, the agency should not presume that its efforts will have the Supreme Court’s backing.

In his review of the UARG v. EPA decision, Nathan Richardson, a Resident Scholar at Resources For the Future, says: “In strict legal terms, this decision has no effect on EPA’s plans to regulate new or existing power plants with performance standards. … However, if EPA is looking for something to worry about, it can find it in this line from Scalia:”

When an agency claims to discover in a long-extant statute an unheralded power to regulate “a significant portion of the American economy” . . . we typically greet its announcement with a measure of skepticism. We expect Congress to speak clearly if it wishes to assign an agency decisions of vast “economic and political significance.” 

Cato’s Andrew Grossman adds: “The Court’s decision may be a prelude of more to come. Since the Obama Administration issued its first round of greenhouse gas regulations, it has become even more aggressive in wielding executive power so as to circumvent the need to work with Congress on legislation. That includes … new regulations for greenhouse gas emissions by power plants …that go beyond traditional plant-level controls to include regulation of electricity usage and demand—that is, to convert EPA into a nationwide electricity regulator.” Grossman suggests: “this won’t be the last court decision throwing out Obama Administration actions as incompatible with the law.”

Philip A. Wallach, a Brookings fellow in Governance Studies, agrees. He called the UARG v. EPA case “something of a sideshow,” and sees “the main event” as EPA’s power plant emissions controls, which have “much higher practical stakes.”

The UARG v. EPA decision is especially important when added to the more widely known Hobby Lobby and NLRB cases, which is aptly summed up in the statement by the American Fuel & Petrochemical Manufacturers’ General Counsel Rich Moskowitz: “We are pleased that the Court has placed appropriate limits on EPA’s authority to regulate greenhouse gases under the Clean Air Act. By doing so, the Court makes clear that an agency cannot rewrite the law to advance its political goals.”

Justice Scalia’s opinion invites Congress to “speak clearly” on agency authority. It is now up to our elected representatives to rise to the occasion and pass legislation that leaves “decisions of vast ‘economic and political significance’” in its hands alone. Such action could rein in many agency abuses including the heavy-handed application of the Endangered Species Act and public lands management.

It would seem that the UARG v. EPA decision—while “somewhat inconsequential”—is, in fact, “very significant.” With this decision the Supreme Court has outlined the first legislation for the new, reformatted, post-2014-election Congress.

 

The author of Energy Freedom, Marita Noon serves as the executive director for Energy Makes America Great Inc. and its companion educational organization, the Citizens’ Alliance for Responsible Energy (CARE). Together they work to educate the public and influence policy makers regarding energy, its role in freedom, and the American way of life. Combining energy, news, politics, and the environment through public events, speaking engagements, and media, the organizations’ combined efforts serve as America’s voice for energy.

Categories: On the Blog

Free Market Capitalism vs. Crony Capitalism

Somewhat Reasonable - July 16, 2014, 1:36 PM

In the minds of many people around the world, including in the United States, the term “capitalism” carries the idea of unfairness, exploitation, undeserved privilege and power, and immoral profit making. What is often difficult to get people to understand is that this misplaced conception of “capitalism” has nothing to do with real free markets and economic liberty, and laissez-faire capitalism, rightly understood.

During the dark days of Nazi collectivism in Europe, the German economist Wilhelm Röpke (1899-1966) used the haven of neutral Switzerland to write and lecture on the moral and economic principles of the free society.

“Collectivism,” he warned, “was the fundamental and moral danger of the West.” The triumph of collectivism meant, “nothing less than political and economic tyranny, regimentation, centralization of every department of life, the destruction of personality, totalitarianism and the rigid mechanization of human society.”

If the Western world were to be saved, Röpke said, it would require a “renaissance of [classical] liberalism” springing “from an elementary longing for freedom and for the resuscitation of human individuality.”

 

What is the Meaning of Capitalism?

At the same time, such a renaissance was inseparable from the establishment of a capitalist economy. But what is capitalism? “Now here at once we are faced with a difficulty,” Röpke lamented, because “capitalism contains so many ambiguities that it is becoming ever less adapted for an honest spiritual currency.”

As a solution, Röpke suggested that we “make a sharp distinction between the principle of a market economy as such . . . and the actual development which during the nineteenth and twentieth centuries has led to the historical foundation of market economy.”

Röpke went on, “If the word ‘Capitalism’ is to be used at all this should be with due reserve and then at most only to designate the historical form of market economy . . . Only in this way are we safe from the danger . . . of making the principle of the market economy responsible for things which are to be attributed to the whole historical combination  . . . of economic, social, legal, moral and cultural elements  . . . in which it [capitalism] appeared in the nineteenth century.”

In more recent times it has become common to use the term “crony capitalism,” implying a “capitalism” that is used, abused, and manipulated by those in political power to benefit and serve well-connected special interest groups desiring to obtain wealth, revenues, and “market share” that they could not successfully acquire on an open, free, and competitive market by offering better and less expensive goods and services to consumers than their rivals.

 

Corrupted Capitalism vs. Free Market Capitalism

This facet of a corrupted capitalism is, unfortunately, not new. Even as the classical liberal philosophy of political freedom and economic liberty was growing in influence in Europe and America in the nineteenth century, many of the reforms moving society in that freer direction happened within a set of ideas, institutions, and policies that undermined the establishment of a truly free society.

Thus, the historical development of modern capitalism was “deformed” in certain essential aspects virtually from the start. Before all the implications and requirements of a free-market economy could be fully appreciated and implemented in the nineteenth century, it was being opposed and subverted by the residues of feudal privilege and mercantilist ideology.

Even as many of the proponents of free market capitalism and individualist liberalism were proclaiming their victory over oppressive and intrusive government in the middle decades of the nineteenth century, new forces of collectivist reaction were arising in the form of nationalism and socialism.

Three ideas in particular undermined the establishment of the true principles of the free market economy, and as a result, historical capitalism contained elements totally inconsistent with the ideal of laissez-faire capitalism – a free, competitive capitalism completely severed from the collectivist and power-lusting state.

 

The Ideas of “National Interest” and “Public Policy.”

In the seventeenth and eighteenth centuries, the emergence of the modern nation-state in Western Europe produced the idea of a “national interest” superior to the interests of the individual and to which he should be subservient. The purpose of “public policy” was to define what served the interests of the state, and to confine and direct the actions of individuals into those channels and forms that would serve and advance this presumed “national interest.”

In spite of the demise of the notion of the divine right of kings and the rise of the idea of the rights of (individual) man, and in spite of the refutation of mercantilism by the free-market economists of the eighteenth and nineteenth centuries, democratic governments continued to retain the conception of a “national interest.”

Instead of being defined as serving the interests of the king, it was now postulated as serving the interests of “the people” of the nation as a whole. In the twentieth century, public policy came to be assigned the tasks of government guaranteed “full employment,” targeted levels of economic growth, “fair” wages and “reasonable” profits for “labor” and “management,” and the politically influenced direction of investment and resource uses into those activities considered to foster the economic development viewed as advantageous to “the nation” in the eyes of those designing and implementing “public policy.”

Capitalism, therefore, was considered compatible with, and indeed even to require, activist government. In nineteenth century America this often took the form of what were then called “internal improvements” – government-funded and subsidized “public works” projects to build roads, canals, and railways, all of which transferred taxpayers’ money into the hands of business interests eager to win the government’s business rather than that of consumers in the marketplace.

It also manifested itself through trade protectionism meant to artificially foster “infant industries” behind high tariff walls. Selected businesses ran to the government insisting that they could never grow and prosper unless they were protected from foreign competition, at the expense, of course, of the consumers who would then have fewer choices at higher prices.

Today, it still includes public works projects, but also manipulation of investment patterns through fiscal policies designed to target “start-up” companies considered environmentally desirable or essential to “national security.” It also takes the form of pervasive economic regulation that controls and dictates methods of manufacturing, types and degrees of competition, and the associations and relationships that are permitted in the arena of commerce and exchange both domestically and in international trade.

Despite the misplaced use of the phrase “American free market capitalism,” there is little that occurs in any corner of society that does not involve the long arm of the highly interventionist state, all with the intended purpose – and resulting unintended consequences – of political power being applied to benefit some at the expense of many others.

Perversely, the interventionist state in the evolution of historical capitalism has come to mean in too many people’s eyes the inescapable prerequisite for the maintenance of the market economy in the service of an ever-changing meaning of the “national interest.”

 

Central Banking as Monetary Central Planning

Whether in Europe or the United States, the application and practice of the principles of a free market economy were compromised from the start with the existence of monetary central planning in the form of central banking.

First seen as a device for assuring a steady flow of cheap money to finance the operations of government in excess of what those governments could extract from their subjects and citizens directly through taxation, monopolistic central banks were soon rationalized as the essential monetary institution for economic stability.

But as the German economist Gustav Stolper clearly explained many decades ago in his book, This Age of Fable (1942), the government’s control of money undermines the very notion of a real free market economy:

“Hardly ever do the advocates of free capitalism realize how utterly their ideal was frustrated at the moment the state assumed control of the monetary system . . . A ‘free’ capitalism with governmental responsibility for money and credit has lost its innocence. From that point on it is no longer a matter of principle but one of expediency how far one wishes or permits governmental interference to go. Money control is the supreme and most comprehensive of all governmental controls short of expropriation.”

Once government controls the supply of money, it has the capacity to redistribute wealth; create inflations and cause economic depressions and recessions; distort the structure of relative prices and wages so they no longer reflect the values and choices of the buyers and sellers in the market; and generate misallocations of labor and capital throughout the economy that bring about imbalances of resource use inconsistent with a market-based pattern of consumer demands for alternative goods and services.

Then, in the face of the market instabilities and distortions caused by the government’s mismanagement of the money supply and the banking system, the political authorities rationalize even more government intervention to “fix” the consequences of the boom-bust cycles their own earlier monetary central planning policies created.

 

The “Cruelty” of Capitalism and the Welfare State

The privileged classes of pre-capitalist society hated the market, which freed the individual from subservience and obedience to the nobility, the aristocracy, and the landed interests.

For these privileged groups, a free market meant the loss of cheap labor, the disappearance of “proper respect” from their “inferiors,” and the economic uncertainty of changing market-generated circumstances.

For the socialists of the nineteenth and twentieth centuries, capitalism was viewed as the source of exploitation and economic insecurity for “the working class” who were considered dependent for their livelihood upon the apparent whims of the “capitalist class.”

The welfare state became the “solution” to capitalism’s supposed cruelty, a solution that created a vast and bloated welfare bureaucracy, made tens of millions of people perpetual wards of a paternalistic state, and drained society of the idea that freedom meant self-responsibility and mutual help through voluntary association and human benevolence.

A “capitalist” system with a welfare state is no longer a free society. It penalizes the industrious and the productive for their very success by punishing them through taxes and other redistributive burdens under the rationale of the “victimhood” of others in society who are claimed to have not received their “fair” due.

It weakens and then threatens to destroy the spirit and the reality of individual accomplishment, and spreads a mentality of “entitlement” to what others have honestly produced. And it restores the fearful idea that the state should be not the protector of each citizen’s individual rights but the compulsory arbiter who determines through force what each one is considered to “rightfully” deserve.

Peaceful and harmonious free market competition in the pursuit of excellence and creative improvement is replaced by the coerced game of mutual political plunder as individuals and groups in society attempt to grab what others have through a redistributive system of government force.

 

Free Market Capitalism was Hampered and Distorted

The ideal and the principle of the free market economy, of capitalism rightly understood, were never fulfilled. What is called “capitalism” today is a distorted, twisted and deformed system of increasingly limited market relationships, as well as market processes hampered and repressed by state controls and regulations.

And overlaying the entire system of interventionist “crony” capitalism are the ideologies of eighteenth century mercantilism, nineteenth century socialism and nationalism, and twentieth century paternalistic welfare statism.

In this warped development and evolution of “historical capitalism,” as Wilhelm Röpke called it, the institutions for a truly free-market economy have either been undermined or prevented from emerging.

At the same time, the principles and actual meaning of a free-market economy have become increasingly misunderstood and lost. But it is the principles and the meaning of a free-market economy that must be rediscovered if liberty is to be saved and the burden of “historical capitalism” is to be overcome.

The socialists and “progressives” twisted and stole the good and worthy concept of liberalism as a political philosophy of individual rights and freedom, respect and protection of honestly acquired private property, and peaceful and voluntary industry, production and trade. It was usurped and made into the “modern” notion of liberalism as paternalistic Big Brother government controlling every aspect of life in the name of the “social good.”

 

Restoring the Ideal of Free Market Capitalism

The word “capitalism” was used as a term of abuse by the socialists almost from the beginning. But it also meant a system of creative and productive enterprise and industry by free and self-guiding individuals, each pursuing his peaceful self-interest through honest work, saving, and investment. The “self-made” man of capitalism was an ideal and model for the youth of America: the man who, motivated by his own independent, self-responsible vision, built something new, better, and greater as a reflection of the potential of the reasoning and acting human being who sets his mind to work.

His wealth, if successfully accumulated, was honorably earned in the marketplace of ideas and industry, not plundered and stolen by force and political power. No individual is robbed or exploited on the truly free market, since all trade is voluntary and no man could be forced into an exchange or association not to his liking and consent.

Free competition sees to it that everyone tends to receive and earn a wage that reflects the estimation of his productive worth to others in society. Each individual is free to improve his talents and abilities to make his services more valuable to others over time, and earn the commensurate higher wages from possessing more marketable skills.

Wealth accumulated enables investment and capital formation for the production of new, better and more goods and services wanted by the consuming public, the majority of whom are the very wage-earning workers employed in the production and manufacture of those goods under the market-determined guiding hands of successful businessmen and entrepreneurs.

Free market capitalism makes the consumer “king” of the marketplace, determining whether businessmen earn profits or suffer losses based on what consumers decide to buy and how much they are willing to pay.

It is free market capitalism that helps make each man and woman a “captain” of their own fate, with the freedom to choose what work and employment to pursue, and the liberty to spend the income they earn in their own way, living the life they value and want, the life that gives them meaning and purpose.

No person need put up with humiliation, abuse or disrespect from a bureaucrat or political official who has control over their fate through the power of government planning, regulation and redistribution.

Free market capitalism offers people opportunities and choices as consumers, workers and producers, with the liberty to change course whenever the benefits from doing so seem to outweigh the costs in the eyes of the individual.

Free market, or laissez-faire, capitalism makes this all possible because it rests on a deeper political philosophical foundation based on the idea and ideal of the right of the individual to his own life, to be lived as he desires and chooses, as long as he respects the equal right of others to do the same.

Free market capitalism insists that there is no higher “national interest” above the individual interests of the separate citizens of a free society. In a system of free market capitalism, government should no more control money and the banking system than it should control the production and sale of shoes, soap, or salami.

And free market capitalism calls for each individual’s peacefully earned property and income to be respected and protected from plunder and theft, and that includes any created rationale and attempted justification to rob Peter to redistribute to Paul through the coercive power of government.

The good name of “capitalism” has to be recaptured and restored, just as the good name and concept of “liberalism,” rightly understood, should be returned to the advocates of individual liberty and free enterprise.

But this task requires friends of freedom to explain and make clear to others that what we live under today is not “capitalism” as it could and should be, and as the term properly understood really means.

The reality of that “historical capitalism,” about which Wilhelm Röpke spoke, is the “crony capitalism” that must be rejected and opposed so that free men may some day live under and benefit from the truly free market capitalism that is the only economic system consistent with a society of human liberty.

Originally published at EpicTimes. 

Categories: On the Blog

Transportation reform debate is out of gas

Out of the Storm News - July 16, 2014, 12:52 PM

With the impending shortage in the U.S. Highway Trust Fund, claws have come out across the political spectrum. Grassroots activists decry what they see as wasteful spending on many highway projects, while governors on both sides of the aisle, as well as union and business interests, fear what they believe would happen if the federal dollars begin to dry up.

Harsh rhetoric abounds, with Jay Timmons, president of the National Association of Manufacturers, calling the grassroots activists threatening Republican lawmakers:

…fringe elements who are using intolerant social propaganda and distorting the records of honorable men and women, driving them into the wilderness of defeat.

Meanwhile, Dan Holler at Heritage Action responds:

America is not facing ‘a transportation government shutdown’ and lawmakers should stop trying to create an artificial crisis which they can use as an excuse to raise taxes or increase spending.

The fight hinges mainly on the gas tax, which many lawmakers are keen to raise to address the shortfall. The House passed legislation to plug the hole yesterday, and the Senate will take up the issue later this week. Alternate plans are continually bandied about, including Rep. Kerry Bentivolio’s Repairing Our Aging Roads Act (the ROAR Act, unfortunately introduced without any references to bringing our roads “roaring” back). The opposing sides have dug in, with many interest groups favoring an increase, while conservative activists press lawmakers to refuse until all wasteful spending is rooted out.

As in many political fights in D.C., it’s quite plausible that both sides are correct: while we should be fighting unnecessary spending and artificially inflated costs, it may also be that a gas tax increase is necessary to modernize our nation’s infrastructure. The American Society of Civil Engineers rated America’s roads a “D” and our bridges a “C+.” The Federal Highway Administration estimates that $170 billion is needed annually to improve road performance, but the gas tax falls short. While these studies should be taken with a grain of salt, even Richard Geddes of the conservative American Enterprise Institute questions the ability of the gas tax to bring in the amount necessary to fix the problems.

On July 8, Americans for Prosperity released a coalition letter signed by 17 conservative and libertarian organizations laying out a set of principles to address the issue. These principles include limiting fuel tax revenue to fund federal roads and bridges only; giving control over state interests back to the states; and reforming regulations like the Davis-Bacon Act’s “prevailing wage” requirements and redundant environmental impact studies that increase costs. Each of these principles has merit, and should be considered seriously by any fiscally responsible lawmaker also interested in improving America’s infrastructure.

However, it could be true that even if all those principles were adhered to, an increase in the gas tax may still be necessary to deal with our nation’s aging infrastructure. Unfortunately, the heated nature of these debates obscures any real discussion over our country’s needs and how to best address them. For unions, the money is a sacred pot in an age of declining membership and opportunity. For business, the specter of aging roads and failing transportation networks incites deep fears. And for politicians, the money represents real dollars for their districts.

But for government watchdogs, the spending is rightfully another example of waste and abuse. In today’s age of bitter partisanship, thoughtful conversation seems unlikely, which is unfortunate, as it will likely result in more dollars wasted and less actual infrastructure improvement. We should be considering a wide variety of alternatives, as fuel efficiency increases and Americans drive less. In this vein, Geddes and Brookings’ Clifford Winston have put forward several innovative solutions. But with elections pending, Congress’ ability to consider real alternatives seems to be out of gas.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

The Case for Six Californias

Somewhat Reasonable - July 16, 2014, 11:04 AM

It began as the idea of one eccentric entrepreneur, but it now has 1.3 million signatories backing it: the case for breaking California up into six separate states is gathering steam. When the Six Californias campaign began, most serious commentators thought it was a crackpot scheme, a pipe dream of a few people that had no hope of gaining traction. They have been proved wrong, to an extent anyway.

The idea driving Six Californias is that the state is too large and its politics too disparate to be managed by the incompetent and venal state government in Sacramento. Anyone who knows anything about California knows it is choked with regulations to the point where running a business, let alone starting one, is desperately difficult. Indeed, California has recently been ranked among the top three least friendly states for small businesses. For a state that relies on start-ups to stay afloat, that is a pretty bad sign for things to come.

And it’s not just business that suffers. Public utilities are being stretched to the limit thanks to grossly inefficient investments by the state government. Other public services, like education, have deteriorated in recent decades to being among the worst in the nation.

Conceived and bankrolled by billionaire Timothy Draper, who has been described as one of the world’s most successful venture capitalists, Six Californias is seeking to radically alter the status quo. Draper is famous for making big bets on new technologies, and clearly his betting nature is turning political. His stated aim is to break California into six states that would be better administered and more politically harmonious in their internal affairs.

California is a massive state. With 38 million citizens and the 8th largest economy in the world, California has come to be ungovernable in the traditional model of states. This has not been helped by Sacramento’s attempts to micromanage the affairs of Californians.

Six Californias argues that six smaller states would be far more representative of and responsive to their constituents. That is music to the ears of any supporter of liberty. After all, the larger and more centralized a government, the less accountable to its citizens it is. The breakup would divide California into states somewhat closer in size to the other states of the union, which would no doubt be much easier for the new state governments to manage.

The project has succeeded in gaining ballot access. The 1.3 million signatures recorded far exceed the 808,000 that were necessary to trigger a statewide referendum. The vote will likely be scheduled for 2016.

What would happen if Californians voted for the breakup? That is a knotty constitutional question already being addressed by scholars and politicians. The Constitution does not allow for the instantaneous creation of new states carved out of old ones, so some suggest that each successor state of California would have to petition to be admitted to the union as a full state. However, there is a degree of precedent, albeit a rather old one. During the Civil War, part of Virginia refused to secede from the United States, declaring itself West Virginia in 1861; it was recognized by the federal government as a full state in 1863. Such a process might lie in the future for California.

Other sticky issues persist. The questions of how debt would be divided and the rights over public works and resources would all be disputed by the successor governments. Such disagreements will no doubt be extremely rancorous, probably carrying on for years after the referendum.

The question of what to do in the event of a breakup may, however, be moot since it seems, at least for now, that voters would not choose to break up their home state. For all its flaws, California is still considered home to millions of people, and many of them do identify with the state as a real polity of which they are a part. To sever those bonds and to shatter a state is an exceptionally difficult thing to accomplish. In all likelihood the referendum will fail.

But the prospect of failure to create six Californias does not make the project a waste of time. Indeed, it is extremely valuable whether it succeeds or not. There is clearly an appetite among many Californians for government that is more decentralized and more responsive to the needs of citizens. That can be accomplished without anything so radical as breaking the state apart. Devolution of power to regions, counties, and cities would go a long way toward creating the accountability and better, leaner government Six Californias is after.

The momentum from the Six Californias project should be carried through, no matter what the referendum results in. If the state is to continue to be an important part of the nation’s economy it must be willing to change.

Categories: On the Blog

Supreme Court to Obama Administration: Congress Writes Laws, You Don’t!

Somewhat Reasonable - July 16, 2014, 10:21 AM

Now that the dust has settled on the Supreme Court’s 2014 session, we can look at the decisions and conclude that the Administration received a serious smackdown. Two big cases got most of the news coverage: Hobby Lobby and the National Labor Relations Board’s (NLRB) recess appointments. In both cases, the Administration lost. At the core of both is the issue of the Administration’s overreach.

Within the cases the Supreme Court heard, one had to do with energy—and it, too, offered a rebuke.

You may not have heard about Utility Air Regulatory Group (UARG) v. Environmental Protection Agency (EPA).

The UARG v. EPA decision came down on June 23. The decision was mixed—with both sides claiming victory. Looking closely, there is cause for optimism from all who question the president’s authority to rewrite laws.

A portion of the UARG v. EPA case was about the EPA’s “Tailoring Rule” in which it “tailored” a statutory provision in the Clean Air Act—designed to regulate traditional pollutants such as particulate matter—to make it work for CO2. In effect, the EPA wanted to rewrite the law to achieve its goals. The decision, written by Justice Antonin Scalia for the majority, stated:

“Were we to recognize the authority claimed by EPA in the Tailoring Rule, we would deal a severe blow to the Constitution’s separation of powers… The power of executing laws…does not include a power to revise clear statutory terms that turn out not to work in practice.”

Had the EPA gotten everything it wanted, it could have regulated hundreds of thousands of new sources of CO2—in addition to the already-regulated major industrial sources of pollutants. These new sources would include office buildings and stores that do not emit other pollutants—but that do, for example through the use of natural gas for heating, emit 250 tons or more of CO2 a year.

The Supreme Court did allow the EPA to regulate CO2 emissions from sources that already require permits due to other pollutants—and therefore allowed the EPA and environmentalists to claim victory because the decision reaffirmed the EPA does have the authority to regulate CO2 emissions. However, at the same time, the decision restricted the EPA’s expansion of authority. Reflecting the mixed decision, the Washington Post said the decision was: “simultaneously very significant and somewhat inconsequential.”

It is the “very significant” portion of the decision that is noteworthy in light of the new rules the EPA announced on June 2.

Currently, the Clean Air Act is the only vehicle available to the Administration to regulate CO2 from power plant and factory emissions. However, the proposed rules that severely restrict allowable CO2 emissions from existing power plants bear some similarities to what the Supreme Court just invalidated: both involve an expansive interpretation of the Clean Air Act.

Tom Wood, a partner at Stoel Rives LLP who specializes in air quality and hazardous waste permitting and compliance, explains: “Although the EPA’s Section 111 (d) proposals cannot be legally challenged until they are finalized and enacted, such challenges are a certainty.” With that in mind, the UARG v. EPA decision sets an important precedent. “Ultimately,” Wood says, “the Supreme Court decision seems to give more ammunition to those who want to challenge an expansive view of 111 (d).” Wood sees it as a rebuke to the EPA—a warning that in the coming legal battles, the agency should not presume that its efforts will have the Supreme Court’s backing.

Philip A. Wallach, a Brookings fellow in Governance Studies, called the UARG v. EPA case “something of a sideshow,” and sees “the main event” as EPA’s power plant emissions controls, which have “much higher practical stakes.”

In his review of the UARG v. EPA decision, Nathan Richardson, a Resident Scholar at Resources For the Future, says: “In strict legal terms, this decision has no effect on EPA’s plans to regulate new or existing power plants with performance standards. … However, if EPA is looking for something to worry about, it can find it in this line from Scalia:”

When an agency claims to discover in a long-extant statute an unheralded power to regulate “a significant portion of the American economy” . . . we typically greet its announcement with a measure of skepticism. We expect Congress to speak clearly if it wishes to assign an agency decisions of vast “economic and political significance.”

The UARG v. EPA decision is especially important when added to the more widely known Hobby Lobby and NLRB cases, which is aptly summed up in the statement by the American Fuel & Petrochemical Manufacturers’ General Counsel Rich Moskowitz: “We are pleased that the Court has placed appropriate limits on EPA’s authority to regulate greenhouse gases under the Clean Air Act. By doing so, the Court makes clear that an agency cannot rewrite the law to advance its political goals.”

Justice Scalia’s opinion invites Congress to “speak clearly” on agency authority. It is now up to our elected representatives to rise to the occasion and pass legislation that leaves “decisions of vast ‘economic and political significance’” in its hands alone. Such action could rein in many agency abuses including the heavy-handed application of the Endangered Species Act and public lands management.

The decision—while “somewhat inconsequential”—is, in fact, “very significant.” The Supreme Court has, perhaps, outlined the first legislation of the new, reformatted, post-2014 election Congress.

Categories: On the Blog

The Climate Change Truth in Vegas

Somewhat Reasonable - July 15, 2014, 1:38 PM

Having recently returned from The Heartland Institute’s 9th International Conference on Climate Change, held in Las Vegas July 7–9 under the theme “Don’t Just Wonder About Global Warming, Understand It,” I was privileged to hear some of the world’s leading climate scientists and researchers discuss the latest state of global warming science, all of whom question whether manmade global warming will be harmful to plants, animals, or human welfare. Eight hundred participants were on hand to hear 64 speakers from 12 different countries (14 if you count the moon, represented by Astronaut Walter Cunningham, and Washington, D.C.) despite the fierce heat of Las Vegas in July. At one point 4,000 individuals were listening to the conference as it was streamed live from the conference website.

This year’s speakers showed how the myths of the climate alarmists are false, shattering the often-quoted 97 percent consensus figure for those who believe most of the warming since 1950 was man-made. On the contrary, only 0.5 percent of the authors of 11,944 scientific papers on climate and related topics over the past 21 years agreed that most of the warming since 1950 was man-made. Furthermore, according to the RSS (Remote Sensing Systems) satellite record, there has been no global warming for 17 years and 10 months.

Obama’s statements conflict with scientific findings:

The above conclusions conflict with statements made by President Obama on Tuesday, May 6, during a series of interviews with national and local television meteorologists, when he warned that “people’s lives are at risk” because of man-made climate change. “Not only is climate change a problem in the future, it’s already affecting Americans,” Obama told CBS News, warning that the phenomenon was “increasing the likelihood” of floods, droughts, storms and hurricanes.

Even the U.N.’s International Panel on Climate Change (IPCC) has said in its last two reports that there has been no particular change in the frequency or severity of floods worldwide. Neither are droughts getting worse (the fraction of the world’s land under drought has fallen for 30 years), nor are hurricanes getting worse (combined frequency, severity and duration has been at or near the lowest in the 35-year satellite record).

There was an element of truth to be found in President Obama’s remarks that day, but as happens time and again, Obama’s spoken version of the truth amounted to fantasy. Far from putting people’s “lives at risk” by failing to take drastic measures to curb CO2, millions of people are dying because Western policies seem more interested in carbon-dioxide levels than in life itself. Such was the topic of the final panel discussion, “Panel 21: Global Warming as a Social Movement,” held Wednesday afternoon before adjournment of Heartland’s 9th International Conference on Climate Change. The distinguished panelists included E. Calvin Beisner, Ph.D., founder and national spokesman of the Cornwall Alliance; Paul Driessen, J.D., senior policy advisor with the Committee For A Constructive Tomorrow and the Center for the Defense of Free Enterprise; and Peter Ferrara, J.D., general counsel of the American Civil Rights Union and senior fellow at The Heartland Institute. The moderator was Pat Garofalo, a Republican member of the Minnesota House of Representatives representing District 58B.

Panelists Beisner, Driessen and Ferrara laid out a convincing case that climate alarmists, as environmentalists, view people primarily as polluters and consumers who use up Earth’s resources and poison the planet in the process, rather than as good stewards. It might even be said that environmentalism is the new face of an anti-human, “pro-death” agenda. Through the bogus “crisis” of man-made global warming, affordable and reliable energy and other modern blessings are being denied to the developing world, despite the $3.5 billion spent around the world to combat climate change. Worth reading is an opinion piece by Caleb S. Rossiter, updated May 4, 2014, “Sacrificing Africa for Climate Change.”

Social Impacts of Reducing Carbon Emissions:

  • 90% of the people living in sub-Saharan Africa do not have electricity. They lack light to study and work by, refrigeration to prevent food spoilage, and power to operate equipment that could multiply their productivity, yet environmentalists oppose building large power plants and electric grids. Each American accounts for 20 times the emissions of each African, and with 15% of the world’s population, Africa produces less than 5% of carbon-dioxide emissions. Shouldn’t real years added to real lives trump the minimal impact that African carbon emissions could have on a theoretical catastrophe?
  • Because of the lack of electricity, two to three million women and children around the world die annually from lung disease caused by burning wood and dried dung to cook their food or heat their huts.
  • Another one to two million people die annually from malaria since the banning of DDT.
  • Where energy is available, regulation of greenhouse gases and other environmental regulations drive up the cost of basic necessities such as food, fuel and electricity, stifling economic growth and costing jobs.
  • America’s ethanol policy alone is estimated to cause nearly 200,000 premature deaths every year in the developing world by limiting the amount of corn for human consumption, which, in turn, raises its purchase price.
  • Golden rice, genetically engineered to supply Vitamin A, could end Vitamin A deficiency in millions of children. Even so, eight million children have died since the invention of this life-saving rice because of fears of genetically enhanced food.
  • Proposed caps on emissions, and so-called renewable energy mandates, would cost our nation millions of jobs and hundreds of billions of dollars per year. Even though Americans are wealthy by world standards, poor and single-income families in the U.S. would be hardest hit, while much poorer people around the world would suffer even more if required to restrain greenhouse gas emissions.
  • A carbon tax or cap-and-trade scheme is a regressive tax that would hit hardest the poor among us, who already pay a higher proportion of their income for energy. State mandates for wind and solar power would likewise plunder the poor by raising energy costs above those of the power plants now under fire from the EPA for CO2 emissions linked to global warming.
  • Wealth increases more when the overall global temperature is warmer, and wealth correlates with happiness, better health, and longevity. The more we do to fight global warming, the worse off the poor in poorer nations will be, with higher rates of disease and death.
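The population and emissions shares in the first bullet imply a rough per-capita comparison. A minimal sketch of that arithmetic, treating the article’s rounded “15%” and “less than 5%” figures as point estimates (an illustrative assumption, not additional data):

```python
# Back-of-the-envelope check of the per-capita claim implied above.
# Assumed inputs (the article's rounded figures):
africa_pop_share = 0.15        # Africa: ~15% of world population
africa_emissions_share = 0.05  # upper bound: "less than 5%" of CO2 emissions

# Per-capita emissions relative to the world average equal the ratio
# of the emissions share to the population share.
relative_per_capita = africa_emissions_share / africa_pop_share

print(round(relative_per_capita, 2))  # prints 0.33
```

So even taking the 5% figure as an upper bound, the average African emits at most about one-third of the world per-capita average, consistent with the article’s point that African emissions are comparatively small.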

For Reflection: 

If this nation really cared about the poor, our government would stay off the global warming bandwagon and redirect the billions currently being spent complying with EPA fuel emissions standards, which have negligible effect on climate, to where they would do the most good: fighting disease and poverty. Building fossil fuel plants and a grid to provide electricity to all the homes around the globe where dung and wood are still burned would cost half a billion dollars a year less than compliance with the EPA’s fuel emissions standards.

It is evident that those who control carbon control our lives. Shutting down power plants could carry some health benefits by reducing the risk of asthma and heart attacks in areas near the plants, but will cutting carbon emissions from existing power plants by about 25% from 2012 levels by 2020 make the planet healthier? Greenhouse gases would still escape into the atmosphere from the rest of the world. Meanwhile, cutting carbon emissions would be a drag on this nation’s economy. See this article by Sally Deneen for National Geographic, “One Key Question on Obama’s Push Against Climate Change: Will It Matter,” for further clarification.

Global Warming could rightly be called a social movement, a big-green and big-government movement, not unlike the "Population Bomb" scare, which warned in the 1970s and 1980s of mass human starvation due to overpopulation and advocated immediate action to limit population growth.

The emphasis on Climate Change as an urgent threat, propagated by President Obama and carried out through the EPA, is in actuality a weapon of mass destruction and a war on women and children.  Alarmists use threats as a way to justify their power to decide how much energy is available for use by humanity throughout the world.  As such, big green, with its eco-friendly measures, appears callous to human destruction.

In Conclusion: 

It is not denied that global temperatures have risen over the last 150 years or more, but the rise is mostly a natural occurrence, and certainly within the range of natural climate variability over the centuries; consider the Medieval Warm Period, an interval from approximately AD 1000 to AD 1300.  During that time many places around the world exhibited conditions that seem warm compared to today. Heartland and the scientists it works with have never promoted "denial of a changing climate."  The climate is always changing. The question is whether man's contribution to climate change rises above statistical noise and whether it is a crisis.

The issue of Climate Change is the greatest moral and ethical battle of our time.  We must stand up against the tyranny that results from the seizure of that which powers our civilization: sufficient energy production at an affordable cost.  Without that availability, the global death toll will rise before it decreases, owing to the dark forces of a Climate Change fantasy.

View here videos of all Speakers and Panel Discussions at Heartland’s 9th International Conference on Climate Change.

Categories: On the Blog

The twisting tale of the CVT

Out of the Storm News - July 15, 2014, 10:18 AM

Would you think that an unexpected ban on a disruptive technology, particularly a ban imposed 20 years ago by a private race car administrative organization, could delay consumers' timely access to sophisticated, high-performance automotive technology? Please hold that question while I digress for a few moments.

Car culture lives at the intersection of innovation, industry and regulation. From this spot, enthusiasts develop preferences and prejudices alike.

A kernel of axiomatic truth among many who drive their cars exuberantly is that, when possible, a manual transmission is essential. Three elements undergird this preference. The first is a belief that manual transmissions provide a driver with greater control over the drivetrain. The second is that, until very recently, manual transmissions tended to be lighter and more efficient, yielding cars that accelerate faster and get slightly better mileage.

The third element is not mechanical at all, it is cultural. In an era in which manual transmissions represent only a small fraction of all vehicles sold in the United States, a buyer’s preference for manual transmissions oft arises from a condition we will call “throwback authenticity.” This makes a very satisfying and low-budget snobbery available to anybody who chooses to drive a manual.

As a self-styled car enthusiast, one with a snobby history of seeking out manual transmissions whenever possible, I was dead-set on continuing to select my own gears. Then, last week, a sudden need for a new vehicle emerged. After ticking through my mental shortlist of desirable vehicles, I took a ride to my local Subaru dealership. Upon arrival, I was delighted to see, sitting front and center, the blue 2015 WRX for which I had made the trip.

I experienced utter disappointment upon finding that it was not a manual. Worse…it was not just any automatic, it was a nearly universally despised form of automatic known as CVT (continuously variable transmission).

My snob sense went off the chart and I became peevish. CVTs are known for being slow, unresponsive, dull and generally antithetical to all things performance. Still, I was coaxed into test driving the vehicle by the person who had given me a ride to the lot.

The test was brief but transformational. Impossibly, I was forced to reconcile myself to a new reality when, as I accelerated out of a corner, the transmission responded to inputs from the steering wheel-mounted paddle shifters as fast as my fingers could muster a tug. When I hopped out of the car I was left wondering how in the world such a powerful anti-CVT narrative could ever have taken hold in my head.

Back to the intersection point of “innovation, industry and regulation”:

The innovation: CVTs do not have gears. Instead, inside a CVT, a drive belt runs between a pair of variable-diameter pulleys. The significance of this is that there is an infinite number of power-delivery settings between the two pulleys, hence the name "continuously variable." Because the transmission never has to change gears, there is less parasitic loss between the engine and the tires, and the associated savings can manifest as increased miles per gallon. Further, because there are no set gear ratios, a CVT is capable of keeping an engine operating in a specific manner (be it for economy or performance) all of the time, allowing the engine to run at peak efficiency for whatever purpose it is serving at that moment.
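
The efficiency logic here can be sketched numerically. Below is a toy Python comparison in which the five gear ratios, the CVT pulley range, and the 6,000 RPM power peak are all hypothetical figures chosen for illustration (none come from the article):

```python
# Toy comparison: engine RPM at various wheel speeds with a 5-speed
# gearbox vs. a CVT that can pick any ratio within its pulley range.
# All ratios and the 6000 RPM power peak are illustrative assumptions.

PEAK_POWER_RPM = 6000
FIXED_RATIOS = [3.6, 2.1, 1.4, 1.0, 0.8]   # hypothetical overall drive ratios
CVT_RANGE = (0.5, 3.6)                      # continuously variable between these

def rpm_for(ratio, wheel_rpm):
    """Engine speed implied by a given overall drive ratio."""
    return ratio * wheel_rpm

def best_fixed_gear(wheel_rpm):
    """Pick the discrete gear whose engine speed lands closest to the power peak."""
    return min(FIXED_RATIOS, key=lambda r: abs(rpm_for(r, wheel_rpm) - PEAK_POWER_RPM))

def cvt_ratio(wheel_rpm):
    """A CVT can choose the exact ratio that holds the engine at peak power,
    clamped to the physical limits of its pulleys."""
    ideal = PEAK_POWER_RPM / wheel_rpm
    lo, hi = CVT_RANGE
    return max(lo, min(hi, ideal))

for wheel_rpm in (900, 1500, 2500, 4000):
    fixed = rpm_for(best_fixed_gear(wheel_rpm), wheel_rpm)
    cvt = rpm_for(cvt_ratio(wheel_rpm), wheel_rpm)
    print(f"wheel {wheel_rpm:4d} rpm: best gear -> {fixed:6.0f} rpm, CVT -> {cvt:6.0f} rpm")
```

With a conventional gearbox, engine speed jumps around the power peak as the car changes speed; the CVT holds the engine exactly at its peak whenever the ratio range allows. That is precisely the effect the Formula 1 teams discussed below set out to exploit.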

Like many novel technologies, CVTs were temperamental in their initial applications. Early adopters were beset by frequent drive-belt failures, because the belts were made from rubber. Later adopters found the CVT both reliable and economical, but unrewarding to drive, because of its prevalence in low-powered vehicles. Once the CVT was paired with the Prius, the transmission became a lodestar of enthusiast disdain.

Industry’s role: CVTs found their first automotive application in a small Dutch make named DAF, an abbreviation of Van Doorne’s Aanhangwagen Fabriek (Van Doorne’s Trailer Factory). DAF’s car business was later absorbed by Volvo, which allowed the technology to gain widespread exposure.

In the early 1990s, CVTs came to the attention of teams competing at the highest level of racing in the world, Formula 1. Teams recognized that, since an engine is constantly accelerating and decelerating, it is rarely operating at its full potential. For an engine to operate at its full potential, it is necessary for it to hold its speed at the peak of its power – a feat that a CVT is uniquely suited to accomplish. To this end, a number of well-financed Formula 1 teams began to develop CVT transmissions with a belt strong enough to withstand the phenomenal power loads of a Formula 1 engine.

Racing regulation: By 1993, a number of teams were testing CVTs in their cars under race conditions. Unsurprisingly, because the engines were not wasting time or power revving up and down the unprofitable parts of their power curves, the cars were fast…several seconds a lap faster than cars with traditional transmissions.

The CVT cars were arguably too fast. Not because the cars or the drivers could not sustain the pace, but because teams without CVTs could no longer seriously compete. For this reason, to preserve competitive balance, Formula 1's governing body decided to ban the use of CVTs.

Banning the use of CVTs at the highest level of racing competition retarded the development of the technology. Formula 1 teams enjoy an unparalleled level of factory funding and support because the cars are excellent platforms from which speculative technologies may be proven and refined. Arguably, without a fair trial in the crucible of motorsports, CVTs were unable to realize their potential until decades later. The intervening decades of mediocrity spawned a legion of detractors, hence the existence of an anti-CVT narrative among the automotive press corps and the enthusiast crowd for whom they write.

The generally applicable lesson to be drawn from the CVT story is that the shadow cast by regulation, even by non-governmental bodies, can be long and profound. By prohibiting a particular technology outright, as opposed to introducing regulations designed to shape outcomes more globally, Formula 1 sought parity in an overbroad and ineffective way (fittingly, Team Williams, the first team to develop a racing CVT, enjoyed an uninterrupted period of dominance even without the transmission).

An enthusiast, I remain. But, now I proudly drive a technology that was able to overcome the heavy hand of regulatory shortsightedness. I drive a CVT.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

Blowing Our Dollars in the Wind

Somewhat Reasonable - July 15, 2014, 9:12 AM

Wind energy produces costly, intermittent, unpredictable electricity. But government subsidies and mandates have encouraged a massive gamble on wind investments in Australia: over $7 billion has already been spent and another $30 billion is proposed. This expenditure is justified by the claim that using wind energy will reduce the carbon dioxide emitted to the atmosphere, which will help to prevent dangerous global warming.

Incredibly, this claim is not supported by any credible cost-benefit analysis – a searching enquiry is well overdue. Here is a summary of things that should be included in the analysis.

Firstly, no one knows how much global warming is related to carbon dioxide and how much is due to natural variability. However, the historical record shows that carbon dioxide is not the most important factor, and no one knows whether climate feedbacks are positive or negative. Also, in many ways, the biosphere and humanity would benefit from more warmth, carbon dioxide and moisture in the atmosphere.

However, let’s assume that reducing man’s production of carbon dioxide is a sensible goal and consider whether wind power is likely to achieve it. To do this we need to look at the whole life cycle of a wind tower.

Wind turbines are not just big simple windmills – they are massive complex machines whose manufacture and construction consume much energy and many expensive materials.  These include steel for the tower, concrete for the footings, fibre glass for the nacelle, rare metals for the electro-magnets, steel and copper for the machinery, high quality lubricating oils for the gears, fibre glass or aluminium for the blades, titanium and other materials for weather-proof paints, copper, aluminium and steel for the transmission lines and support towers, and gravel for the access roads.

There is a long production chain for each of these materials. Mining and mineral extraction rely on diesel power for mobile equipment and on electrical power for haulage, hoisting, crushing, grinding, milling, smelting and refining. These processes need 24/7 reliable electric power which, in Australia, is most likely to come from coal.

These raw materials then have to be transported to many specialised manufacturing plants, again using large quantities of energy, generating more carbon dioxide.

Then comes the construction phase, starting with building a network of access roads, clearance of transmission routes, and excavation of the massive footings for the towers. Almost all of this energy will come from diesel fuel, with increased production of carbon dioxide. Moreover, every bit of land cleared results in the production of carbon dioxide as the plant material dozed out of the way rots or is burnt, and the exposed soil loses its humus to oxidation.

Once the turbine starts operating, the many towers, transmission lines and access roads need more maintenance and repair than a traditional power plant that produces concentrated energy from one small plot of land using a small number of huge, well-tested, well protected machines. Turbines usually operate in windy, exposed, isolated locations. Blades need to be cleaned using large specialised cranes; towers and machinery need regular inspection and maintenance; and mobile equipment and manpower needs to be on standby for lightning strikes, fires or accidents. All of these activities require diesel powered equipment which produces more carbon dioxide.

Even when they do produce energy, wind towers often produce it at times when demand is low – at night, for example. There is no benefit in this unwanted production, but it is usually counted as saving carbon fuels.

Every wind farm also needs backup power to cover the roughly 65% of wind generating capacity that is lost because the wind is not blowing, or is blowing such a gale that the turbines have to shut down.

In Australia, most backup is provided by coal or gas plants which are forced to operate intermittently to offset the erratic winds. Coal plants and many gas plants cannot switch on and off quickly but must maintain steam pressure and “spinning reserve” in order to swing in quickly when the fickle wind drops. This causes grid instability and increases the carbon dioxide produced per unit of electricity. This waste should be debited to the wind farm that caused it.

Wind turbines also consume energy from the grid when they are idle – for lubrication, heating, cooling, lights, metering, hydraulic brakes, energising the electro-magnets, even keeping the blades turning lazily (to prevent warping) and maintaining line voltage when there is no wind. A one-month study of the Wonthaggi wind farm in Australia found that the facility consumed more electricity than it produced for 16% of the period studied. A detailed study in the USA showed that 8.3% of total wind energy produced was consumed by the towers themselves. This is not usually counted in the carbon equation.
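
Combining these figures gives a rough sense of the net energy picture. Here is a back-of-envelope Python sketch, in which the 100 MW farm size is a hypothetical assumption, while the roughly 35% availability and the 8.3% self-consumption figure come from the numbers cited above:

```python
# Back-of-envelope net output of a hypothetical wind farm over one year.
# The nameplate rating is an assumption; the 35% capacity factor and
# 8.3% self-consumption reflect the figures cited in the article.

HOURS_PER_YEAR = 8760

nameplate_mw = 100            # hypothetical farm size
capacity_factor = 0.35        # ~65% of capacity lost to calm or storm shutdowns
self_consumption = 0.083      # share of gross output consumed by the towers

gross_mwh = nameplate_mw * HOURS_PER_YEAR * capacity_factor
net_mwh = gross_mwh * (1 - self_consumption)

print(f"gross: {gross_mwh:,.0f} MWh, net: {net_mwh:,.0f} MWh "
      f"({net_mwh / (nameplate_mw * HOURS_PER_YEAR):.1%} of nameplate)")
```

On these assumptions the farm delivers roughly 32% of its nameplate rating over a year, before any of the life-cycle, maintenance or backup costs discussed above are counted against it.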

The service life of wind towers is far shorter than traditional power plants. Already many European wind farms have reached the end of their life and contractors are now gearing up for a new boom in the wind farm demolition and scrap removal business. This phase is likely to pose dangers for the environment and require much diesel powered equipment producing yet more carbon dioxide.

Most estimates of carbon dioxide “saved” by using wind power look solely at the carbon dioxide that would be produced by a coal-fired station producing the rated capacity of the wind turbine. They generally ignore all the other ways in which wind power increases carbon energy usage, and they ignore the fact that wind farms seldom produce name-plate capacity.

When all the above factors are taken into account over the life of the wind turbine, only a very few turbines in good wind locations are likely to save any carbon dioxide. Most will be either break-even or carbon-negative – the massive investment in wind may achieve zero climate “benefits” at great cost.

Entrepreneurs or consumers who choose wind power should be free to do so but taxpayers and electricity consumers should not be forced to subsidise their choices for questionable reasons. People who claim climate sainthood for wind energy should be required to prove this by detailed life-of-project analysis before getting legislative support and subsidies.

Otherwise we are just blowing our dollars in the wind.

Categories: On the Blog

Virginia residents, businesses split over Internet sales tax

Out of the Storm News - July 14, 2014, 5:33 PM

From WVTF Public Radio:

Both sides of the issue want to sway Virginia Congressman Bob Goodlatte, who chairs the House Judiciary Committee, to their side. The R Street Institute’s Andrew Moylan says a new poll reveals that most Virginians don’t want their goods purchased through sites such as eBay to be taxed. He also says the process is burdensome because on-line businesses would be required to pay varying taxes based on each state and purchase point. He says there’s another option.

“The sort of technical term for it is origin sourcing, but in practice what that means is allowing online retailers to utilize the same collection scheme that brick and mortar retailers use, which is based on where they are physically present – where they are physically located. Then they only have to know one sales tax code – they only have to be accountable to one revenue agency.”

The New Rent Seeking?

Somewhat Reasonable - July 14, 2014, 3:33 PM

According to data released this week, Samsung and Apple make up the majority of the top 20 global smartphone models sold in the first quarter of 2014. While that success demonstrates the robust market prowess of these smartphone manufacturers, the real winners are the customers, getting more services, better products and lower prices. Almost the exact opposite happens when companies resort to lawsuits to gain market advantage, a sort of rent seeking via the courts.

Apple’s long-running lawsuits against Samsung continue despite, or perhaps because of, their mixed results. In the first Apple-Samsung lawsuit, Samsung was forced to pay its rival nearly $1 billion in damages, and an International Trade Commission exclusion order imposed an import ban. Apple also sought a full sales ban on the Samsung products in question, which a judge ultimately blocked.

In a second trial, Apple sought sky-high damages of $40 per device for all Samsung devices sold in the U.S. that were named in the lawsuit, and sought to block the sales of Samsung products. While substantial, that dollar figure likely paled in comparison to the opportunity costs incurred by this fixation on legal action.

In the end, the jury decided that Samsung relied on some of Apple’s patented technology, but Apple too was caught using Samsung’s patented technology. The jury awarded Apple financial damages but not nearly the amount the company sought, which was further offset by an award to Samsung. The decision has been appealed both by Apple and Samsung, with Apple still trying to block Samsung sales, and with Samsung appealing the verdict in total.

If this litigious acrimony continues unabated, consumers, mobile innovation, and perhaps even the companies themselves will suffer. One sign that such damage has already occurred is that technology industry news increasingly seems to be about litigation rather than about new technological advances. And according to observers, Apple innovation may already be flagging.

Moreover, courtroom victories do not necessarily translate into benefits for consumers because they could drastically limit competition in the mobile marketplace. Instead of gaming the courts for potential advantages or trying to ban certain products, mobile device makers should compete in the open marketplace.

The ongoing dispute also raises broader questions about damages awarded in patent cases, particularly for design patents, and especially when the infringement is unknowing. In real time, the courts are actively issuing new rulings on what is and is not patentable, such as abstract ideas tied to computer systems. Are awards so large that they hamper a company's ability to compete good for consumers or the marketplace? Are absolute bans on products in the marketplace best for the free market?

When disputes do arise, companies should put their customers first by negotiating in good faith with their rivals, going to court only as a last resort. Of course, legitimate disputes, including important claims such as intellectual property infringement, may still need a judicial remedy, just not as a first option to hamper one’s competition.

[Originally published at The Institute for Policy Innovation]

Categories: On the Blog

Media Ignorance Is Worse When It’s Intentional

Somewhat Reasonable - July 14, 2014, 1:57 PM

I hope you all took time to read Mollie Hemingway’s piece this week concerning the problem of media ignorance. The really troublesome aspect of it, as I see it, is not when people are unintentionally ignorant of the matters they cover, which is of course excusable. No one is expected to be an expert on everything they write about, and in practice, it just serves to foster the Gell-Mann Amnesia effect, which you have surely experienced regularly if you are an expert in something and a consumer of media. Yes, it’s a problem when those youngsters in media who got promoted because they are really good at the Instagram don’t know about something because it’s on the second page of the Google results. But leaving something you didn’t know out of a story is more excusable than asserting something inaccurate out of ignorance, which is still more excusable than purposefully putting on blinders and ignoring anything that conflicts with your thesis because you’d rather not engage it. It’s one thing to not know another perspective exists – it’s another to purposefully pretend it doesn’t exist.

I know this is a minor complaint in the scheme of things, but if you want an example of this in practice, I’d draw your attention to the recent staff changes at the Washington Post’s Wonkbook, which has been dramatically reduced in usefulness since Ezra Klein pulled a great deal of their talent into Vox. To his credit, Klein has always understood that even media in pursuit of an ideological agenda gets boring very quickly if it’s entirely one-sided. Good political media requires conflict – it needs someone to take the other position in a debate, which is why his criticisms of Paul Ryan would be followed with an interview with the subject and the like. The overall effect was to provide people with a fairly consistent look at what the major Washington think tanks were doing, and while the reporters obviously leaned in a direction, I’d argue they rarely pretended conservative views didn’t exist or lacked legitimacy.

Unfortunately, ever since Klein, Evan Soltas, and others departed Wonkbook, replaced by Puneet Kollipara, Matt O’Brien, and a new crop of writers, the once-useful morning email has very obviously felt the impact. It has drastically reduced the number of right-leaning links, diminishing them to the point of nonexistence or only featuring critiques of conservative views. It regularly reaches the point of laughability in the context of multi-day debates, in which you can only learn the existence of a perspective through the frame of a liberal critique, or only learn of something gone wrong with Obamacare through a piece explaining why it doesn’t matter.

To pick one recent example: On the day the reform conservatives released their Room to Grow agenda at AEI (a development of significance whatever you think of the actual agenda), Wonkbook didn't link a single thing about it – not one op-ed or post in favor of it, nor any of the source materials. Over the course of the next few days, they gave a few scant nods to pieces in favor of it, while linking a litany of pieces from liberals reacting to the proposals, criticizing something whose existence Wonkbook hadn't even acknowledged.

A purposefully cloistered attitude, where the only good conservative is the one making the case for lefty ideas, is a real disservice to debate. If most of your links are to a conservative making the case for a universal wage subsidy or a carbon tax or immigration reform, it’s simply not an accurate depiction of where the other side is. And it leads to your site and email sounding less like a fair-minded left-leaning traditional media outlet and more like, well, ThinkProgress.

The impression you get is of a place with an ideological perspective that overwhelms its ability to fairly depict policy debates. The other day, after the GOP announced that it would hold its 2016 convention in Cleveland, Wonkbook sent out their afternoon update with the subject line and first piece headlined “How the Republican platform fails Cleveland”, which to me sounds more like a DNC press release header than an evenhanded evaluation. The piece has since been renamed. But the first title is a more accurate reflection of their perspective, which is disappointing to say the least.

As a postscript: it’s not as if you need to be a younger writer to play pretend and ignore the legitimacy of a different perspective. Back in 2012, I had a particularly frustrating interaction with Post fact-checker Glenn Kessler in which he outright refused to consider an alternate perspective on a question. Kessler gave “Four Pinocchios” to then-Mississippi Gov. Haley Barbour for some testimony the governor gave about Medicaid fraud in his state, noting that people were driving BMWs while claiming they couldn’t afford copays. Kessler’s rationale was so twisted that I still can’t believe he advanced it: his view was that Barbour had to be lying, because BMWs are too expensive for people who qualify for Medicaid to afford. I’m serious – he even did the Cars.com search. Despite citing a half dozen news stories to him from that very week of people being arrested for Medicaid fraud who owned flashy cars and McMansions, and pointing out that people can easily go onto Medicaid after buying BMWs earlier in life, Kessler refused to consider a world in which it is possible for Medicaid fraud or downward social mobility to exist, and got more than a little testy when challenged with the idea there was any gap in his logic.

Perhaps now that his own publication has run a piece about someone driving a Mercedes to pick up food stamps, he’ll reconsider his perspective. But I doubt it. Blinders can be awfully effective once you wear them long enough.

Update: Kessler has since offered a mea culpa.

Subscribe to Ben’s daily newsletter, The Transom.

 

[Originally published at The Federalist]

Categories: On the Blog

Lyft’s insurance move a first step toward regular TNC coverage

Out of the Storm News - July 14, 2014, 12:55 PM

Even as it faces new regulatory headaches in New York, transportation network company Lyft is making news with a major announcement today that should quiet at least some of its vocal critics: the company has begun offering primary commercial auto insurance coverage for its drivers.

Lyft already provided a $1 million excess liability policy, generally designed to kick in once a driver's private passenger auto policy limits were exhausted, typically at $50,000 of coverage. However, given that some standard private passenger policies may exclude coverage for a driver acting in a commercial capacity (or, at least, given general legal uncertainty about whether such coverage would be upheld in a dispute), Lyft's policy had a unique structure that allowed it to "drop down" and cover the first dollar of loss in case the primary policy did not respond.
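
The layering can be illustrated with a small payout calculation. The following Python sketch is a hypothetical simplification of the "excess with drop-down" structure, not Lyft's actual contract language; the $50,000 and $1 million limits are the figures mentioned above, and the `split_loss` helper is mine:

```python
# How a single loss might split between policies under an "excess with
# drop-down" structure. A simplification for illustration only.

PERSONAL_LIMIT = 50_000       # typical private passenger policy limit
EXCESS_LIMIT = 1_000_000      # the excess liability policy

def split_loss(loss, personal_policy_responds):
    """Return (personal_pays, excess_pays) for a given loss amount.

    If the personal policy excludes commercial activity, the excess
    policy 'drops down' and covers from the first dollar."""
    if personal_policy_responds:
        personal = min(loss, PERSONAL_LIMIT)
        excess = min(max(loss - PERSONAL_LIMIT, 0), EXCESS_LIMIT)
    else:
        personal = 0
        excess = min(loss, EXCESS_LIMIT)
    return personal, excess

print(split_loss(200_000, True))    # personal pays 50k, excess pays 150k
print(split_loss(200_000, False))   # drop-down: excess pays all 200k
```

Converting the policy from excess to primary, as Lyft has now done, removes the ambiguity over whether the personal policy responds at all: the commercial policy simply pays first during the covered period.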

But given concerns from regulators in places like Virginia, New York, California and Seattle (largely egged on by local taxi associations) that even this “drop-down” coverage wasn’t sufficient, Lyft is just going all the way to offering primary coverage:

In response to that feedback from leaders in markets such as New York, California and Seattle, Lyft has voluntarily converted its policy from excess to be primary to a driver’s personal policy during the period from the time a driver accepts a ride request until the time the ride has ended in the app. This major change is part of our continued effort to set the highest standard for trust and safety in transportation.

The coverage is provided by James River Insurance Co., a Richmond, Va.-based surplus lines writer that is ultimately owned by Bermuda-based Franklin Holdings Ltd. According to statutory filings, the company had $165.0 million of policyholder surplus as of the end of the first quarter, and it holds an A- financial strength rating from A.M. Best Co.

It makes sense that this new kind of risk would require looking to the surplus lines market for a solution. But over the longer term, if car-sharing does indeed take hold as a major transportation option across a broad swath of American cities, we would expect admitted market insurers will be able to craft their own products to meet this emerging consumer need.

Though “hybrid” products could come either from personal lines or commercial lines insurers, full-scale commercial auto insurance policies would likely be overkill for the limited amount of commercial activity in which most car-sharing drivers engage. A far simpler solution would be for personal lines insurers to come forward with riders or endorsements that cover a nominal amount of commercial activity, provided they could appropriately price the coverage.

That’s where it is crucial that insurance regulators remain flexible to permit insurers to innovate and bring new products to market in a reasonable time frame.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.