Feed aggregator

Progressives Want Everything Locally Grown Except Government

Somewhat Reasonable - August 24, 2014, 8:53 AM

Peter Blair has a piece at The American Interest on the anti-statist alliance that wasn’t. It’s worth a read:

Perhaps sensing an opportunity, social conservatives have used these libertarian fears to make the case for strengthening local communities. They argue that the disappearance of strong neighborhoods and towns created a void into which the overbearing hand of the state has entered. In this way, a roughly overlapping vocabulary of concerns has developed between the two groups. … Social conservatives are right to point out that traditional forms of social trust and community have eroded. There is a real sense of loss and tragedy to all this. But one shouldn’t let a desire to construct a bigger tent for Americans concerned about the growing power of the state lead one to the unwise conflation of two different views of human flourishing.

I am not convinced by Blair’s argument, which strikes me as premature. The overlap of civil libertarianism and the modern libertarian and federalist perspective on limited government is not merely due to shared vocabulary, but to an increasingly shared perspective on the encroachments of life under the ever-expanding scope of a nationalized and unrepresentative government. People with a shared understanding of government’s outsized power can absolutely be at odds over what a good American life looks like. City mice and country mice can both love liberty and understand the appeal of local government over the rule of distant administrators.

Today, progressives want everything locally grown except government. Why is this? In part, it’s because self-government proves the failure of the Administrative State. As a rule, responsible self-governing neighborhoods and communities thrive and foster virtues that lead to close-knit communities and strong civil institutions. Ideally, they turn into little Lake Wobegons, where all the women are strong, all the men are good looking, and all the children are above average. The urban elites may mock the little pink houses with their picket fences, the soccer moms making grocery runs in their SUVs, the Vacation Bible Schools … but these are the communities where people actually live good, wholesome American lives. The stronger these communities are, the more they form a hedge against the encroachments of government.

Consider the death of the suburbs, which has been predicted eagerly (and inaccurately) by the left for decades. Today just one out of ten Americans lives in a square mile containing more than 10,000 people. And while just a third of Millennials are the heads of their own households and only 1 in 4 are married, they overwhelmingly say that they want to get married and have a family – 70% for men and 78% for women. The desire of Millennials to own their own home is actually stronger than that of prior generations. And having a kid accelerates that desire even more, with the cohort with young children the likeliest to move to the suburbs or to small towns. Why is this? Because of the incentives of economy and community – the obvious benefits of neighborhood and homeownership and good schools and churches and community. Many Millennials won’t choose this life, but many more will. The presence of Yelp does not eliminate the lure of this approach.

The progressive left and the technocratic right want the whole world to look like the political machines they know and love. They cannot tolerate the idea that self-governing communities outside their approach to dealmaking and spoils-centered politics could give people an attractive alternative. That’s why it’s so essential to them that the major decisions about our economy, policing, health care, education, transportation funding, and more be made in Washington, where the order they seek to impose on people’s lives can be dictated by administrators beholden to no one except the stakeholders, the political and business interests who have a seat at the table.

The appeal of the self-governing community, which can decide for itself what is encouraged or discouraged, what is banned or made legal, is one directly at odds with this approach. The issue is not whether we all agree about the definition of the life well-lived. The issue remains, as Ronald Reagan framed it a half-century ago, “whether we believe in our capacity for self-government or whether we abandon the American Revolution and confess that a little intellectual elite in a far-distant capital can plan our lives for us better than we can plan them ourselves.”

[First published at The Federalist.]

Categories: On the Blog

The Sinestro Theory of The Administrative State

Somewhat Reasonable - August 23, 2014, 9:39 PM

Over the past decade, we’ve witnessed a decline in the level of trust in government, and a rise in distrust, to levels unprecedented in American history. But to think this is an entirely new phenomenon is a mistake: trust in government has steadily declined since the Great Society and the Vietnam War under Lyndon Johnson.

This graph from Pew with data running through the fall of 2013 shows how people answer the question: “How much of the time do you trust the government in Washington?” The answer is pretty clear: not much at all.


There are many reasons this phenomenon has accelerated despite the promises of one presidential candidate after another to restore our faith in government. But there is one reason in particular which runs through the political stories of the past year – the IRS scandals, the unanticipated failure of Obamacare’s implementation, the broad expansion of administrative discretion and “the-secretary-shall” lawmaking, executive actions on amnesty and DACA, deleted emails right and left, and, most recently, the indictment of Texas Gov. Rick Perry for vetoing funding for a unit of legal bureaucrats after it became obvious they were headed by a drunk driver who refused to resign. On the whole, it presents a picture of how far the Administrative State is willing to go in blatantly ignoring any checks on its ability to enact its whims.

Government has always been frustrated by any checks on its power. The Founders believed that you could check that power with the mob, whether democratic or anarchic, for only so long before government would turn to despotism. So their solution was to deliberately balance the forces of power against each other, and to tie the governors’ hands via the limited, enumerated powers of the national government. The Constitution made the action of government – in nearly all areas outside of waging war – deliberately difficult. It was supposed to be hard to pass new laws, not because the Founders were opposed to new laws, but because they wanted to make it impossible for those laws to oppress the people.

What the Founders did not anticipate was the degree to which those invested by the Constitution with the power to make law would find it politically advantageous, over the course of a century, to steadily cede their power to unelected governmental bodies of vast size and ever-enlarging mission. Representative government, it turns out, is very difficult. Better and wiser to shift the responsibility for such decisions to someone else – to tell the frustrated citizen that it is beyond your control to address their concern, and isn’t there an agency for that? This new unchecked branch of government has seized the power it wants along the way: the power to reward friends and grant waivers and special privileges to the people and firms it likes or that play by its rules, the power to punish those it doesn’t, and the power to live large and cover up mistakes without that difficult legislative process.

That’s why they are so willing to go to such great lengths to hold on to their jobs. When bad things happen in the real world, heads roll. But in the world of the Administrative State, resignation is the worst possible thing you can demand of someone. So political appointees are given taxpayer-funded vacations, cops who break the law are put on leave, and district attorneys who drive out-of-their-minds drunk demand to keep their jobs. Life in the bureaucracy is too sweet to lose, no matter what – and those who hold those positions know how good a deal they have and won’t give it up under any conditions.

What we’re talking about here is really just human nature, of course. Government employees want just what everyone else wants. The only difference is they think they have the power to make good on their whims. Consider this the right’s corollary to the Green Lantern theory of the presidency. In the era of the Administrative State, big government has been giving out too many rings to too many would-be Sinestros. And when it comes to trust in Washington, it’s the fact that this power is centralized in the Administrative State, rather than localized via federalism, which creates the special class of modern ringbearers. It allows them to work together in common purpose, as the progressives intended, as opposed to balancing and checking each other, as the Founders always understood to be essential.

The progressive view that checks and balances must be eliminated, that things should be organized by wise neutral administrators and we should just make things easier for the government to get things done (even Nuclear Option style), is motivated by the belief that the government can then make things a lot better for the people. But of course, in reality, the Administrative State is most interested in making things better for the people in government. And that’s where little things like trust start to break down.

[First published at The Federalist.]

Categories: On the Blog

1,461 Days of Summer

Stuff We Wish We Wrote - Homepage - August 22, 2014, 3:30 PM
President Obama golfs with former NBA star Alonzo Mourning in Martha’s Vineyard / AP
By Matthew Continetti, August 22, 2014, 5:00 am
The headline was brutal.…

Mouth cancer facts

Out of the Storm News - August 22, 2014, 10:58 AM

Baseball star Curt Schilling says he has mouth cancer that was caused by chewing tobacco. His announcement has generated considerable interest in mouth cancer, its frequency and causes.

What is mouth cancer?

Mouth cancer typically appears in the lining of the mouth; it may start as an ulcer or red area that is discovered in a dental or medical exam. The phrase is often used incorrectly to include cancers of the throat.

Schilling did not disclose the location of his cancer, but he did say that he found a lump in his neck. This indicates that the tumor had spread to a lymph node, a condition that more likely suggests a tumor of the throat, rather than the mouth.

How common is mouth cancer?

It is very rare. Mouth cancer occurs with higher frequency in people who have the risk factors I describe below, but it is possible for someone with no risk factors to develop this disease. As I described previously, among 100,000 men age 40 and over, perhaps three or four with no risk factors will develop mouth/throat cancer each year; only one or two of those cases will be mouth cancer.

What causes mouth cancer?

The most common cause of mouth cancer is smoking, which can increase risk tenfold; smokers who drink alcohol have even higher odds. Alcohol abuse raises the odds about fourfold.

Another recognized risk factor is infection with human papillomavirus, a sexually transmitted disease discussed previously. HPV is considered by some experts to be a significant cause of mouth cancer, but precise estimates of risk elevation are not available.

Schilling attributes his cancer to chewing tobacco. There are numerous studies of the risks related to smokeless tobacco. The odds of developing mouth cancer if you use chewing tobacco or moist snuff are about the same as if you didn’t smoke, drink or have HPV. In other words, one or two users out of 100,000 will develop mouth cancer.
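To put those figures side by side, here is a minimal back-of-the-envelope sketch in Python. The baseline incidence and the risk multipliers are simply the rough numbers quoted above, treated as illustrative approximations rather than clinical data.

# Rough illustration of the incidence figures discussed in this article.
# Baseline and multipliers are approximate numbers from the text, not clinical data.

BASELINE_PER_100K = 1.5  # roughly one or two mouth-cancer cases per 100,000 men over 40 per year

risk_multipliers = {
    "no risk factors": 1,    # baseline
    "smokeless tobacco": 1,  # about the same as the baseline, per the text
    "alcohol abuse": 4,      # roughly fourfold
    "smoking": 10,           # up to tenfold
}

for group, multiplier in risk_multipliers.items():
    expected = BASELINE_PER_100K * multiplier
    print(f"{group:18s} ~{expected:4.1f} expected cases per 100,000 per year")

The point of the arithmetic is simply that even a tenfold increase over a very small baseline still yields a small absolute number of cases.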

Smoking and drinking can produce a cancer anywhere in the mouth, esophagus, voicebox and lungs. HPV is generally linked to cancers of the throat. In contrast, the most common location, by far, for mouth cancer in a smokeless tobacco user is at or very close to where the tobacco is placed, normally between the cheek and gum.

While rare, every case of mouth cancer is unfortunate and potentially avoidable. Have your dentist or physician perform a thorough head and neck exam every year.

Urgent Call for citizen action to rein in EPA

Somewhat Reasonable - August 22, 2014, 9:57 AM

On June 2nd of this year the Obama administration announced new regulations from the Environmental Protection Agency (EPA) with a goal of reducing carbon emissions over the next 15 years. The goals, as outlined in the EPA’s Clean Power Plan, impose significant restrictions on power plants already in existence, even natural gas plants. Power plants are cited by the EPA as the largest source of carbon pollution in the U.S., accounting for roughly one-third of all domestic greenhouse gas emissions.

The reasons given by the EPA for its new regulations are in keeping with blaming man’s emission of CO2 as the cause of runaway global warming, yet the global temperature has remained flat over the past 17 years. History tells us that natural climate variability has occurred over the centuries; the Medieval Warm Period, for example, was warmer than today. In 2009 some prominent alarmists went on record saying that if temperatures remained flat for 15-20 years, the scientific community would need to re-evaluate the theory. Three years later, in 2012, a report in the UK Daily Mail revealed the following from a quietly released Met Office report:

The figures reveal that from the beginning of 1997 until August 2012 there was no discernible rise in aggregate global temperatures

  • This means that the ‘pause’ in global warming has now lasted for about the same time as the previous period when temperatures rose, 1980 to 1996

It’s 2014 and we’re now near the twenty-year time frame to re-evaluate the theory of global warming, but will it be possible to stop the new EPA regulations with the powerful force of government behind them?

According to the EPA, its Clean Power Plan proposal will put Americans to work, make the U.S. electricity system less polluting and our homes and businesses more efficient, and shrink electricity bills by 8% in 2030. If this sounds too good to be true, it is! According to an article by Katie Pavlich, “What you need to know about Obama’s new EPA regulations,” the regulations will negatively impact the economy in the following ways:

-New regulations will kill 226,000 American jobs.

-New regulations will cost the U.S. economy $51 billion per year.

-Even with new regulations, carbon is only expected to decrease by 1.8 percent by 2030.

-Despite EPA claims that the new regulations will reduce the cost of power, electricity rates are expected to go up as coal-fired plants, the dominant source of cheap power, shut down in response to environmental regulations that also make new coal power plants impossible to build.

Later this year the EPA will decide whether it should tighten the air-quality standard for ground level ozone. In 2008 the EPA set ozone standards for air quality at 75 parts per billion (ppb).  Even before states have fully implemented the 2008 standards, the EPA is expected to propose revising it to as low as 60 ppb.  A new study for the National Association of Manufacturers by NERA Economic Consulting, finds that “the new ozone standard could cost Americans $170 billion annually, put millions of jobs at risk, and drastically increase energy prices for consumers and manufacturers.”

The Heartland Institute is circulating a “Citizen’s Petition to Rein in the Environmental Protection Agency.” You can read and sign the online petition below, download a hard copy of the petition, or contact The Heartland Institute at 312/377-4000 and ask that a petition be sent to sign, or for multiple copies to distribute to your family and friends.

Thorner first learned about Heartland’s call-to-action petition to rein in the EPA when receiving it as a handout at The Ninth International Conference on Climate Change, which took place July 7-9, 2014 in Las Vegas. Some 650 scientists, economists, policy experts, and guests attended the ICCC9 conference, all willing to question whether man-made global warming is a problem worth addressing. The ICCC9 schedule can be viewed here. Videos (and PowerPoint presentations, when available) from every presenter can be seen here.

The nine statements in The Heartland Institute’s petition are capsules of knowledge gained through hard scientific data and observations as documented in published reports by the NIPCC (Nongovernmental International Panel on Climate Change), which counter the reports put forth by the IPCC (Intergovernmental Panel on Climate Change). Given the devastating consequences of the proposed EPA regulation, an urgency exists for deep cuts in the size, power and cost of the EPA. Below are the nine petition statements, each of which is preceded by “Whereas”:

1.  For decades an unprecedented campaign has been waged to scare the American people into believing their health, safety and even survival was at stake because of manmade global warming.

2.  We were told that CO2, which makes up less than one-half of one-tenth of one percent of our atmosphere, is acting like a blanket, keeping the heat in, and it was going to cause the baking of the Earth.

3.  The hard scientific data now show that the predicted rise in global temperatures just plainly did not happen.

4.  Sea ice in the Arctic is rebounding and continues to expand in the Antarctic, despite predictions of the opposite, polar bears are thriving, and sea levels are not rising.

5.  Deaths due to extreme weather are radically declining and global tropical cyclone activity is at near record lows.

6.  It is now clear that the global warming alarmists are wrong and a large majority of Americans now understand it was a hoax.  It’s false that man reducing CO2 emissions — especially the U.S. virtually alone — will have any effect on the global temperature.

7.  Regulators at the EPA, having lost their war to scare America into giving them legislation that would allow them to seize control of virtually all energy production and use, are perverting the Clean Air law to give themselves unprecedented powers to regulate American society.

8.  The toll the EPA is now taking on this country is staggering, putting hundreds of thousands of Americans out of work.

9.  The only way a new president or a new Congress will undo the terrible damage being inflicted by the EPA after the election is if we make it a major issue NOW, BEFORE THE ELECTION, so that after the election even those who support the EPA’s reckless regulations will understand it is political suicide to support them.

First revealed at The Ninth International Conference on Climate Change in Las Vegas in July was a plan by Jay Lehr, Ph.D., science director at The Heartland Institute, to replace the EPA. Dr. Lehr’s plan, “Replacing the Environmental Protection Agency,” was posted on July 15 at Heartland as a policy document. Click to read the full text of this document.

In Congress, Rep. Vicky Hartzler (R-Mo.) has called for legislation that would ban the EPA from issuing rules through the agency’s proposed Waters of the United States program, which would wreak havoc on farmers and possibly drive up food prices.

The EPA says the new rules are needed to clarify which bodies of water it must oversee under the federal Clean Water Act.

The EPA has outlived its usefulness. Don’t delay: download and sign The Heartland Institute petition as a citizen concerned about the overreach of the EPA and its proposals, which point to economic disaster for this nation.

Categories: On the Blog

EPA Pesticide Bans Threaten You and the Economy

Somewhat Reasonable - August 22, 2014, 9:46 AM

When Rachel Carson’s book, “Silent Spring,” was published, filled with totally false claims about DDT, the Environmental Protection Agency looked it over and concluded she had used manipulated data and that DDT should not be banned. But the agency’s first administrator, William Ruckelshaus, overruled its findings and imposed a ban.

Ruckelshaus was a lawyer, not a scientist. He was also politically connected enough to hold a variety of government positions. He got the nod for the EPA job from John Mitchell, Nixon’s Attorney General, who later went to jail for his participation in the Watergate cover-up.

Wikipedia says, “With the formation of EPA, authority over pesticides was transferred to it from the Department of Agriculture. The fledgling EPA’s first order of business was whether to issue a ban of DDT. Judge Edmund Sweeney was appointed to examine the case and held testimony hearings for seven months. His conclusion was that DDT “is not a carcinogenic hazard to man” and that “there is a present need for the essential uses of DDT”. However, Ruckelshaus (who had not attended the hearings or read the report himself) overruled Sweeney’s decision and issued the ban nevertheless, claiming that DDT was a ‘potential human carcinogen.’” In 2008, having returned to the practice of law, he endorsed Barack Obama.

I cite this history from the 1970s because most people believe that the EPA operates on the basis of science and, from the beginning, that could hardly have been less true. It has evolved over the years into a totally rogue government agency issuing thousands of regulations with the intent to control virtually every aspect of life in America, from agriculture to manufacturing, and, in the case of pesticides, the effort to ban them all, always claiming that it was to protect public health.

Not killing pests, insects and rodents is a great way to put everyone’s health in jeopardy. In May, New York City announced a new war against rats and will spend $600,000 to hire new inspectors to deal with an increased population. Lyme disease and West Nile Fever are just two of the diseases that require serious insect pest control. A wide variety of pests spread diseases from Salmonella to Hantavirus. Termites do billions of dollars in property damage every year.

Thanks to the EPA ban on DDT, and to the nations that followed the U.S. action, an estimated 60 million people have died from malaria since 1970, because DDT was and is the most effective way to control the mosquitoes that spread it, particularly in Africa. In the West, malaria had been eliminated thanks to the use of DDT before the ban.

In the 1980s I worked with the company that produced an extraordinary pesticide, Ficam, that was applied with nothing more than water. Although it had gone through the costly process of securing EPA registration, the agency told the manufacturer it would have to do so again. Because the cost could not justify re-registration, it was taken off the market in the USA, but it continues to be used successfully for malaria control in more than sixteen nations in Sub-Saharan Africa and against the spread of Chagas, a tropical parasitic disease, in Latin American nations. Ficam can be used to control a wide variety of insect pests. But not in the USA.

In 2000, during the Clinton-Gore administration, the EPA announced “a major step to improve safety for all Americans from the health risks posed by pesticides. We are eliminating virtually all home and garden use of Dursban—the most used household pesticide in the United States.” It was widely used because it did a great job of controlling a wide variety of insect pests, but the EPA preferred the pests to the human species it allegedly was “protecting.” The ban was directed against chlorpyrifos, which the EPA noted was “the most commonly used pesticide in homes, buildings, and schools.” It was used in some 800 pest control products.

Recently I have been receiving notices from Friends of the Earth (FOE) announcing “a new effort to help save bees.” “We need to ban bee-killing pesticides now!” says one of their emails, claiming that “A growing body of science shows that neonicotinoid (neonic) pesticides are a key contributor to bee declines.” This is an outright lie. As always, FOE’s claims are accompanied by a request for a donation.

Dr. Henry I. Miller, a physician and molecular biologist and a fellow at Stanford University’s Hoover Institution, was a founding director of the FDA’s Office of Biotechnology. Recently he disputed the White House’s creation of a Pollinator Health Task Force and a directive to the EPA to “assess the effect of pesticides, including neonicotinoids, on bee and other pollinator health and take action, as appropriate.” This is the next step—a totally political one—toward denying farmers the use of one of the most important pesticides for protecting crops. “This would have disastrous effects on modern farming and food prices,” warns Dr. Miller.

“Crafted to target pests that destroy crops, while minimizing toxicity to other species, neonics,” said Dr. Miller, “are much safer for humans and other vertebrates than previous pesticides…there is only circumstantial or flawed experimental evidence of harm to bees by neonics.”

“The reality is that honeybee populations are not in decline,” noted Dr. Miller, citing U.N. Food and Agricultural Organization statistics. If anything is affecting bee populations worldwide, it is the increasingly cold weather that has been occurring for the past 17 years as the result of a natural cooling cycle driven by less solar radiation from the Sun. The other threat to bees is Varroa mites and the “lethal viruses they vector into bee colonies.”

“A ban on neonics would not benefit bees, because they are not the chief source of bee health problems today.”

But the Friends of the Earth, who are no friends of the humans that live on it, want to ban neonics, and it is clear that the White House and the EPA are gearing up to do just that. Such a ban would induce a major reduction in crops such as Florida’s citrus, an industry already besieged by the Asian citrus psyllid, an insect that spreads a devastating disease of citrus trees. Other food crops are similarly affected by insect pests, and the end result of a ban would severely damage the U.S. economy.

In every way possible the environmentalists—Greens—continue to attack the nation’s and the world’s food supply, and the result will kill off a lot of humans. The EPA’s pesticide bans are not about protecting health. They are an insidious way of increasing sickness from an ancient enemy of mankind: insect and rodent pests.

© Alan Caruba, 2014

Categories: On the Blog

The New York Times Has Zero Idea How the Internet Works – Or Is Lying Its Masthead Off

Somewhat Reasonable - August 22, 2014, 9:34 AM

It takes a special man to cram so much wrong into a mere 342 words.  Or an Old Grey Lady.

The New York Times’ utterly ridiculous Editorial Board recently addressed, as one, Title II Internet regulatory Reclassification and Network Neutrality – and they did so in utterly ridiculous fashion.

They either have absolutely no idea what any of this is – or they are lying through their printing presses.

The Times calls for the federal government to illegally commandeer control of the entirety of the World Wide Web – so as to then impose Net Neutrality.  Guess with whom they are in agreement?

The hardcore Media Marxist Left wants President (Barack) Obama’s Federal Communications Commission to unilaterally change – for the worse – how the government regulates the Internet. Which would be an egregious violation of existing law – the 1996 Telecommunications Act.

This law classified the Internet as Title I – a very light-touch regulatory regime. As happens when the government largely leaves something alone, the Internet has become a free speech, free market Xanadu. Arguably no endeavor in human history has grown so big, so well, so fast.

If ever there was an example of “if it ain’t broke, don’t fix it” – this is it. Yet the perpetually broken government is listening to these Leftist loons – and considering the move to Title II.

Title II is the uber-regulatory superstructure with which we have strangled landline phones – you know, that bastion of technological and economic innovation. Which do you find more impressive – your desktop dialer or your iPhone?

Title II regulations date back to the 1930s – so you know they’ll be a perfect fit for the ultra-modern, incredibly dynamic, expanding-like-the-universe World Wide Web.

This would be the most detrimental of all Information Superhighway road blocks. Rather than the omni-directional, on-the-fly innovation that now constantly occurs, Title II is a Mother-May-I-Innovate, top-down traffic congest-er.

Imagine taking a 16-lane Autobahn down to just a grass shoulder.

The Times editorial wrongness begins in their title.

President Obama: No Internet Fast Lanes

There will be no “fast lanes.”  There will be what there have been since just about the Internet’s inception – innovative ways to make uber-bandwidth hogs, like video merchants, easier to deliver.  Which keeps the traffic for everyone flowing smoothly.

The Web would have long ago ground to a halt had not these innovations been developed and continuously enhanced.  The bandwidth hogs are looking to have the government mandate that the Internet Service Providers (ISPs) build, maintain and grow these delivery networks – and give the hogs free, unlimited access.

Guess who would then get to pick up that gi-normous, ever-growing, ongoing tab?  Hint: You saw him or her this morning brushing your teeth.

The Times then gets the first half of their very first sentence wrong.

The Federal Communications Commission, which could soon allow phone and cable companies to block or interfere with Internet content,….

Actually, ISPs have always and forever been able to do that.  But they haven’t.  Why?  Because they are in the customer service business – if they intentionally fail to service their customers, they will no longer have customers.

It’s called the free market, Times.  You should look into it – instead of looking to end it.

And there are already existing laws and an existing government entity – the Federal Trade Commission (FTC) – to address this if it ever does happen.  Which it won’t.

More Times inanity:

The F.C.C. is trying to decide whether telecommunications companies should be able to strike deals with powerful firms like Netflix and Amazon for faster delivery of videos and other data to consumers.

As Amazon grew and its package tally exponentially increased, it didn’t demand the various delivery services keep their shipping rates exactly the same.  That would be absurd.

And I’m sure the government-run Postal Service would have been very accommodating of that request.

Small and young businesses will not be able to compete against established companies if they have to pay fees to telephone and cable companies to get content to users in a timely manner.

Small and young businesses don’t and won’t have to pay – because they are but a blip on the Internet radar screen.  Netflix and Google’s YouTube are at peak times more than half of all U.S. Internet traffic – they should pay a little something, you know, for the effort.

If you leave the grocery store with twenty steaks, you pay more than if you walk around a little and leave with nothing – is that so complicated?

Tom Wheeler, the chairman of the F.C.C. who was appointed by Mr. Obama, has proposed troubling rules that would allow cable and phone firms to enter into specials with companies like Facebook and Google as long as the contracts are “commercially reasonable.”

Apparently it is, in fact, too complicated for the Times.  Which wants to mandate grocery stores allow someone to fill up an eighteen-wheeler with steaks – and pay the store nothing.  Which isn’t “commercially reasonable” – it is supermarket death by government.

Which is exactly what the Media Marxists want for the Internet.

“(T)he ultimate goal is to get rid of the media capitalists in the phone and cable companies and to divest them from control.”

How very Hugo Chavez of them.  And, of course, the New York Times.

[Originally published at NewsBusters]

Categories: On the Blog

Scary armored vehicles aren’t the biggest danger of police militarization

Out of the Storm News - August 21, 2014, 12:30 PM

The problem of police militarization has been in the news for more than a week, as the city of Ferguson, Mo. continues to deal with the aftermath of the police shooting of 18-year-old Michael Brown.

Much of the debate and scrutiny of the police response to the Ferguson protests has focused on the Pentagon’s 1033 program. Created in 1997, the program allows state and local law enforcement to stock up on excess military equipment free of charge. The equipment transferred includes armored vehicles, assault rifles, aircraft and other military surplus.

Some of these transfers, like the distribution of an MRAP to the Ohio State University Police Department, truly are bizarre, but it is important to note that not all of the equipment transferred from the Department of Defense to local and state police is lethal. Items as mundane as office equipment also are transferred under the 1033 program.

The real problem of police militarization is not, or not primarily, the DOD equipment police can acquire. Many of the arguments against the 1033 program, in fact, sound rather suspiciously like arguments of gun control advocates, in that they presume restrictions on inanimate objects will cause crime – or, in this case, police brutality – to decrease.

In fact, it’s paramilitary-style policing tactics such as “stop and frisk” that really contribute to the distrust of police in minority communities. As I wrote last week in a piece at Rare, police militarization is an attitude in policing that sees the police as at war with the people they’re supposed to serve and the community of which they are a part.

We have seen some of this attitude on display in Ferguson. Police have arrested and threatened journalists covering the protests. In some of the most infamous pictures of the violence, police officers wearing paramilitary gear confront protesters and deploy snipers against them. In one instance, a police officer pointed his assault rifle at unarmed protesters and threatened to kill them. When asked for his name and badge number, the officer allegedly replied with profanities. As of this writing, that officer has been suspended, pending investigation.

Another example of the mindset can be found in a Washington Post op-ed written by a veteran Los Angeles cop with the provocative headline: “I’m a cop. If you don’t want to get hurt, don’t challenge me.” Such inflammatory rhetoric from peace officers only serves to separate the police from the people they’re supposed to serve, to make ordinary citizens afraid of the police.

Militarized policing also has led to overuse of paramilitary SWAT teams. The Cato Institute has this interactive map displaying all of the botched SWAT raids that have been conducted over the past three decades. “Botched raids” include raiding the wrong house, killing a non-violent offender or killing an innocent person.

According to Radley Balko at the Washington Post, it’s estimated there are more than 50,000 SWAT raids in the United States every year. However, only one state, Maryland, requires law enforcement to record when SWAT teams are used and for what purposes. Balko has found that in Maryland, 90 percent of all SWAT raids are used to serve search warrants and that half of all SWAT raids are used in cases where the alleged offenses were non-violent. More states should follow Maryland’s lead and require police agencies keep records on when and why SWAT is deployed.

Many SWAT raids are to enforce “no-knock” warrants, in which a judge allows police to force their way into a residence without knocking or otherwise announcing their presence. No-knock warrants are supposed to be issued only when police believe announcing their presence would result in the destruction of contraband or put their lives in danger. They have led to tragic consequences, both for police officers and people inside the homes.

In May 2014, a SWAT team raid in Habersham County, Ga. – on what turned out to be the wrong house – left a two-year-old in a coma with a hole in his chest, caused by a flashbang grenade that landed in his crib. Incredibly, the county refuses to pay the medical bills of the child who was injured.

No-knock raids can also be deadly for the cops who execute them, as homeowners sometimes confuse them for intruders. In December 2013, Henry Magee’s home in Burleson County, Texas was subject to a no-knock raid by the county sheriff’s office. Magee mistook one of the deputies for a burglar, shooting and killing him. In February 2014, a grand jury declined to indict Magee for murder.

No-knock and SWAT raids need to be reserved for instances where an officer’s life genuinely would be endangered by serving a warrant conventionally. If the raid is botched or an innocent house is raided, there need to be consequences.

Ultimately, it doesn’t matter how police are equipped, so long as they use the proper tactics and have the proper mindset to serve the public, protect their rights and fight crime. An armed officer with just a revolver and a shotgun can be as abusive as an officer wearing the latest in paramilitary gear and armed with an assault rifle. In the end, what we need most of all is to rebuild the broken trust between the public and the police.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

Letter to FCC on Comcast-Time Warner merger

Out of the Storm News - August 21, 2014, 11:25 AM

 

 

Federal Communications Commission
445 12th Street SW
Washington, DC 20554
VIA ELECTRONIC COMMENT FILING SYSTEM

Re: Comcast – Time Warner Cable, MB Docket 14-57

Aug. 21, 2014

Dear Commission Members:

On behalf of the R Street Institute, a Washington-based free-market think tank with offices in Sacramento, Calif., Austin, Texas, Columbus, Ohio, and Tallahassee, Fla., I write in support of approving the proposed merger between Comcast and Time Warner Cable (TWC). Our analysis of this merger is that it is a natural response to changing market conditions, offers significant potential benefits for consumers in both the residential and business markets and that potential harms are either minimal or mitigated by other existing regulations or market dynamics.

The proposed $45 billion merger takes place in an environment characterized by two trends that have hit cable television providers particularly hard in recent years – a shrinking subscriber base for pay-television services and the rising cost of content acquisition.

Comcast has been losing video customers, on net, for at least five consecutive years, down nearly 10 percent from 24.8 million at year-end 2007 to 22.5 million at the end of the second quarter of 2014. TWC has also lost net video subscribers in each of the past five years, falling more than 17 percent from 13.3 million at year-end 2007 to 11.0 million at mid-year 2014.

Cable companies also have seen rapidly escalating costs to acquire content, driven in part by competition from a profusion of video on-demand services like Netflix, Amazon Prime and Hulu, of which Comcast is a part-owner. Intense negotiations for content – including a 2013 dispute between TWC and CBS – also have led to a number of service blackouts, which unquestionably harm consumers. Reflecting trends across the industry, TWC has seen its per-subscriber content costs rise 24 percent since 2010, while Comcast has seen a 20 percent jump over the past two years.

Currently, nine companies – AMC, CBS, Discovery, Disney, Fox, Scripps, Time Warner Inc., Viacom and Comcast itself – control about 90 percent of the $45 billion market for television content. While the content creation market is not itself a monopoly, growing demand has contributed to higher prices. The market for sports content – provided by the likes of Comcast’s own NBC Sports, as well as CBS Sports, Fox Sports, Time Warner’s TNT and TBS and, especially, Disney’s ESPN – has proven particularly thorny for cable companies. The trend toward “cord cutting,” in which consumers eschew any pay-television service in favor of streaming video on-demand, has raised the stakes for cable companies to retain consumers of live broadcasts, tilting leverage further toward providers of sports content.

According to SNL Kagan, fees paid by distributors to carry cable channels are expected to grow from $31.7 billion in 2013 to $40.8 billion in 2016. The market is led by ESPN, which takes in about $5.54 per month per subscriber, compared to about $1 per month per subscriber paid to broadcast network affiliates for retransmission consent, another rapidly growing cost driver. SNL Kagan projects the broadcast networks – including Comcast’s NBC and Telemundo – will pull in about $3 billion in retransmission consent fees in 2015, with the networks themselves taking roughly a $1.3 billion cut and network-owned affiliates getting the remaining $1.7 billion.

The additional negotiating power wielded by a combined Comcast-TWC could potentially serve as a check on rising content acquisition costs, both in carriage fees and retransmission consent agreements. It should be noted that the extent to which this would reverse the prevailing trend is uncertain and may depend partially on whether the combination spurs further media consolidation in response. To the extent that the combined company can negotiate across any of these markets to reduce fixed costs, it could translate into consumer benefits in the form of lower service bills.

Consumers also should benefit from operating efficiencies that reduce costs without reducing output, and from network upgrades, in particular to TWC’s relatively older and slower service. Comcast has said it expects the combination initially to yield about $400 million in capital expenditure efficiencies and to save about $1.5 billion in operating expenses within three years. The company also has announced it will accelerate TWC’s planned migration of at least 75 percent of its service footprint to all-digital service.

One under-appreciated consumer benefit of a combined Comcast-TWC is the role the larger company could play in the business services sector. While both Comcast and TWC have a modest presence in the market to provide broadband and voice service to small business, the firms are only marginal players in the market to serve large commercial enterprises. Because of the need for a large national service footprint, the business services market traditionally has been dominated by telecoms like Verizon and AT&T. A combined Comcast-TWC, with at least some footprint in all of the 50 largest markets, could for the first time become competitive, with benefits redounding to business services consumers.

Some have raised concerns that a combined company would have undue market power to discriminate in both the video and broadband markets, for instance by privileging its own content over that of competitors. Some of these concerns are relevant to the commission’s own separate industry-wide deliberations on net neutrality regulation, a subject on which R Street has not taken any formal position. However, it is incumbent on those who raise such concerns to demonstrate why a combined Comcast-TWC presents any new issues, or heightens any existing issues, that did not already exist with the companies operating separately.

Comcast is already bound by the FCC’s program carriage rules not to privilege its own content. The company also has already pledged that the seven-year net neutrality agreement it consented to when it purchased NBCUniversal in 2011 would also apply to TWC. What’s more, any incentive a combined Comcast-TWC would have to discriminate against particular content providers operating on its platform would, by necessity, be balanced against consumer demand for that same content. This is a lesson already learned the hard way by TWC, which lost 300,000 customers during its blackout dispute with CBS.

Were it the case that a combined company would leave consumers with fewer choices, concerns about discriminatory treatment of content would have more force. But Comcast and TWC already do not compete with one another for customers in any market in the country. Moreover, Comcast also has stipulated as part of the terms of the agreement that it will divest 3.9 million residential video subscribers to Charter Communications. The combined Comcast-TWC would remain the largest provider of pay-television services, but it would control less than 30 percent of the market, with DirecTV and Dish Network – both of which do compete directly with Comcast and TWC — having 20 percent and 14 percent, respectively. Other services, including the telephone providers that also compete directly with cable and satellite, comprise the remaining 36 percent.

As believers in pragmatic, free-market solutions, we believe antitrust action should be limited in scope and focus on demonstrable harm to consumers. We do not believe the issues raised by the proposed Comcast-Time Warner Cable merger meet that threshold. We ask that you allow it to go forward without undue delay.

Respectfully submitted,

 

R.J. Lehmann
Senior Fellow
The R Street Institute

 

Eli Lehrer
President
The R Street Institute

Don’t Tax the Internet into Oblivion

Somewhat Reasonable - August 21, 2014, 9:56 AM

Original photo by Alexander Anton.

Most of us who have grown up with the Internet have watched it do amazing work. Companies that are idea-driven and whose sources of income come from “users” have even started going public with major IPOs. Whatever has been going on seems to be working. Freedom surges online; anyone can start a Facebook page, a Tumblr blog, a PayPal account, a Twitter, or a YouTube account and gain millions of followers with as little capital as a phone or a PC. Even the much ridiculed Justin Bieber was found by Usher on YouTube, where the young star is now worth $200 million—all in just seven short years.

For much of the time that this extreme growth occurred, the government was careful not to hamper innovation. The recent debate has been about the Internet Tax Freedom Act (ITFA), which is set to expire on November 1st. First enacted in 1998, it has been renewed three times, preventing state and local taxation of Internet access and electronic commerce. It also allows the seven states that were taxing Internet access prior to the law to be grandfathered in so they can continue unabated.

It seems at first glance that the federal government should not be telling states and municipalities what they can and can’t tax, but undue taxes naturally hinder the flow of the marketplace to begin with. Looking at current cell phone taxes provides some insight into how Internet access taxes could look if the law expires. The national average on cell phone taxes is over 16.3%! My $80/month “unlimited everything” plan from Sprint sounds good in a Spotify ad or on a billboard, but it’s deceiving. It’s actually closer to $100 with all the extra taxes. A similar phenomenon could happen and put accessing the Internet out of reach for people who can barely afford it now.
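As a rough illustration of that arithmetic, here is a minimal sketch assuming the 16.3% national average cited above and a purely hypothetical higher combined rate for a high-tax city; the local figure is an assumption for illustration, not any specific carrier’s or city’s actual rate.

# How an advertised $80/month plan grows once wireless taxes and fees are added.
# The 16.3% figure is the national average cited above; the 22% "high local rate"
# is a hypothetical example, not any specific carrier's or city's actual rate.

advertised_price = 80.00   # $80/month "unlimited everything" plan
national_avg_tax = 0.163   # ~16.3% average combined wireless taxes and fees
high_local_tax = 0.22      # hypothetical combined rate in a high-tax locality

print(f"At the national average: ${advertised_price * (1 + national_avg_tax):.2f} per month")
print(f"In a high-tax locality:  ${advertised_price * (1 + high_local_tax):.2f} per month")

At the national average the bill comes to about $93, and with a higher local rate it pushes toward the $100 figure mentioned above.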

A bipartisan group in the Senate has been working on a proposal called the Marketplace and Internet Tax Fairness Act, which would ban Internet access taxes for another 10 years, but allow states to collect sales taxes from out-of-state companies, and still allow grandfathered states to charge Internet access taxes. It is unclear whether or not sales taxes from Internet purchases would also be up for expiration after 10 years like the Internet access tax ban would be, but it seems unlikely.

Since the Internet itself has no one “location,” it would be difficult to create a simple set of tax rules for items bought and sold. Rather than make it complex and add to the mix of confusing tax policies that already dominate American life, we should continue to shop and sell free of government interference. Because the Internet can facilitate private marketplace transactions easily and efficiently, profits can be spent on investments that can lead humanity further into the future. Imagine if the government taxed the Internet. It could tax individual apps, how much data you use, the words you type in an email or a blog… Considering there are state taxes that apply to altered bagels in New York, toilet-flushing in Maryland, or holiday decorations in Texas, it isn’t far-fetched that certain states would come up with odd ways to suck as much money as they could out of the Internet…and for counties and cities that have home rule, and thus their own taxes, multiple taxes by multiple bodies of government could begin appearing.

This proposal is ludicrous, especially since we still operate under federalism. If the Internet is inherently borderless and knows no location, then it should not have any state or local tax applied to its use. Imagine the arguments between states over who should be able to tax what. Should the states that house the servers for the seller’s website get any of the revenue? How about the state where the seller lives? What about the state that the seller’s items get shipped out of? These questions, along with many other variables, suggest that Internet sales taxes are simply not a workable option.

When taxes are first explained to us—probably when most of us were young—they are made to sound all well and good. “Taxes are used for roads and schools and to defend our country!” We repeatedly hear these statements throughout our lives. However, as we grow up, we realize how wasteful governments of all sizes are. The surplus of Social Security income is not kept in a nice “pot” for us all to go back and draw from (plus, it’s a forced retirement plan), and the tolls we pay in most states (with exorbitant fines if you don’t pay, often marked up 1,000% or more over the original price)—contrary to popular belief—do not all go to repairing and building new highways. The documented waste, tax increases, and new tax proposals keep growing and growing, while the good service we’re promised for our schools and our roads and our defense continually seems neglected. Do not add Internet taxes of any kind—sales or access—to this list; they would only fuel the government fire of irresponsibility and waste for decades to come.

Categories: On the Blog

As competition between Lyft and Uber grows, questions linger about disruption

Out of the Storm News - August 21, 2014, 8:00 AM

As Americans become more familiar with the concept of “ridesharing,” things are heating up in what the Wall Street Journal last week dubbed the “fiercest battle in the tech capital,” between Uber and its largest competitor, Lyft.

The Journal piece portrays a “bitter war,” featuring “two heavily financed upstarts plotting the demise of the taxi industry—and each other.” The campaign is mostly being waged in the marketplace, with the two firms competing over price, pick-up times, drivers and services offered. But there are also some allegations of dirty tricks:

A Lyft spokeswoman said Monday that representatives from Uber have abused its service in the past several months with the goal of poaching drivers and slowing down its network. Passengers who identify themselves as working for Uber frequently order a Lyft and then ride for only a few blocks, sometimes repeating this process dozens of times a day, she said…A spokeswoman for Uber denied the company is intentionally ordering Lyft rides to add congestion to its competitor’s service.

Competition is at the heart of capitalism, but some might question the wisdom of devoting so much energy to fighting one another when a common set of opponents lurk: regulators, lawmakers and the special interests who have their ear.

In city after city and state after state—from Pittsburgh to Seattle, and Nevada to Virginia—municipal taxi authorities and public-transit commissions have been cracking down and shutting down ridesharing services with claims that they violate rules governing the licensing, insurance, vehicle types, payment systems and handicapped accessibility required of for-hire taxi or limousine services. In some places, the services have managed to carve out at least temporary accommodation, but much work needs to be done if transportation network companies like Uber, Lyft, Sidecar and smaller upstarts like Summon and Wingz are to grow and thrive.

The first and most important question will be the TNCs’ contention that they are “information content providers” (in other words, publishers) and thus should be held immune from most liability under Section 230 of the Communications Decency Act of 1996. The argument is that, like dating sites, the TNCs merely match potential riders and available drivers.

It’s still not certain if the courts will see it that way. Uber already has been sued in a case charging vicarious liability for the behavior of one of its drivers, a charge that usually only would apply in an employer-employee relationship. More recently, an UberX driver was arrested following a fatal New Year’s Eve accident in San Francisco, a case that has become a centerpiece of the California regulatory debate this year.

Among the questions courts will have to weigh is the extent to which the TNC transactions are held at arm’s length. Uber provides a centralized pricing algorithm for its drivers, including the well-publicized “surge pricing” intended to draw more drivers to areas experiencing service shortages. This contrasts with Sidecar, which allows drivers to set their own prices and lets consumers choose among nearby drivers. Lyft has implemented its own version of surge pricing, but more recently has experimented with the reverse: “happy hour” pricing with cut-rate fares when a surplus of drivers are on the road. Add to these considerations that Lyft and Sidecar formally regard payments to their drivers as “donations” that are always optional and negotiable.

Why does this matter? Because the more “tools of the trade” the services provide to their drivers—whether pricing algorithms or GPS devices or even the pink moustaches that adorn the fronts of cars operated by Lyft drivers—the more they potentially undermine their Section 230 defense. This may extend even to steps the firms already have taken to accommodate safety and insurance concerns, including beefing up their screening and background-check processes and purchasing commercial insurance to cover their drivers’ liability.

These are the kinds of thorny issues that could torpedo progress on the regulatory front. It’s obviously essential that TNC firms continue to offer the best services at the best prices, which is the only way to build a constituency who will demand regulators allow the companies to operate. But it also would probably be wise for the nascent industry to begin thinking about best practices that demonstrate they can agree to at least some common solutions.

Toward that end, it has been good to see the emergence of Peers, a nonprofit dedicated to taking on issues common to the sharing economy. Lyft also has taken the lead by founding the Peer-to-Peer Ridesharing Insurance Coalition, which could provide a needed dialogue with the insurance industry to develop new products that better fit the kinds of risks that ridesharing presents.

As my colleague Andrew Moylan and I argue in a recent paper, the peer-production economy holds the potential to free billions in trapped and underutilized capital and spur economic growth. But even as these innovative firms look to best each other in the market, they also must work together to keep regulators from strangling their industry while it’s still in the cradle.

Debunking Consumerist Bogus Claim Mobile Data Does Not Compete with Cable

Somewhat Reasonable - August 21, 2014, 6:13 AM

Pro-regulation interests often resort to highly misleading arguments to advance their cause. Fortunately that kind of deception ultimately exposes the weakness of their underlying argument and public policy position.

To promote Netflix’ “strong” version of net neutrality regulation and to oppose the Comcast-TWC acquisition, Consumerist just framed a very deceptive whopper competition argument: “Comcast says mobile data is competitive, but it costs $2k to stream Breaking Bad over LTE.”

Consumerist: “Since Netflix is the driver of so much internet traffic, and the center of so many of the conversations around home broadband, TV seemed to be the way to go. The question we decided to answer is: How much will you pay for the data it takes to watch the entire series run of Breaking Bad in one month?” Consumerist’s contrived calculation was $1,200 – $2,200 for a billing cycle.

Consumerist cynically uses a classic deceptive straw-man argument, hoping that most people will not catch the bogus premise: that Comcast faces no broadband competition from four national mobile LTE OTT competitors simply because it is far more expensive to binge-watch premium video programming on a mobile LTE plan than it is on cable.

Let’s deconstruct this clearly unreasonable straw man argument.

First, no good deed goes unpunished. Consumerist turns a great benefit of Comcast-TWC’s high-bandwidth, high-usage broadband offerings, the ability to binge-watch premium programming, into a problem!

Second, one can easily buy all seasons of Breaking Bad on DVD at Best Buy and other retail outlets, or on iTunes and Amazon.

Third, one can binge-watch Breaking Bad on cable or DBS — with or without a DVR. These technologies are designed, in infrastructure and economic model, to enable mass binge-watching economically.

Fourth, almost no one uses mobile LTE the way Consumerist’s straw man assumes. That’s because people routinely use free WiFi to watch video on their LTE phones. Most LTE providers enable and encourage offloading high-bandwidth video content like Breaking Bad onto WiFi so subscribers can manage their data usage more economically. In addition, cable broadband providers offer their subscribers free WiFi at hundreds of thousands of hotspots in America, so subscribers also could binge-watch in those widely available locations if they wanted to.

Fifth, in what common-sense world is it bad for people to binge-watch video programming on technologies actually designed for delivering mass volumes of video to millions of people? Any engineer will tell Consumerist that distributing large volumes of video programming daily to many millions of people is highly economical and efficient via over-the-air broadcast, cable, or DBS technologies.

Sixth, Consumerist’s straw-man argument rests on an even more bogus core assumption: that competitors must have nearly identical offerings in order to be considered competitive substitutes. That is not a consumer-focused view of competition, because consumers have a diversity of wants, needs, demands, and means that must be matched with a diversity of technologies, infrastructures, services, amounts, and prices. Diversity of choice, availability of innovative offerings, and robust investment are all hallmarks of dynamic market competition, and all are generously present in America’s world-leading broadband and video distribution markets.

Lastly, Consumerist is implicitly promoting the Netflix argument for maximal broadband regulation and for blocking the Comcast-TWC acquisition by assuming that, for antitrust purposes, the high-end markets offering the highest prices for the most cutting-edge or market-leading services can be separated from the underlying basic mass market. The fallacy here is similar to thinking the market for luxury cars is separate from the market for non-luxury cars, when in reality the distinction of which cars or features count as a “luxury” is highly fluid, with easily disputed market boundaries.

In sum, Netflix, Consumerist, and other broadband-regulation maximalists apparently need to resort to deceptive straw-man arguments to try to justify their extreme position of maximal broadband regulation and blocking the Comcast-TWC merger.

Importantly, one of the things that makes the FCC an expert agency is that it can see through fallacious straw-man arguments, and the courts that review the FCC certainly can as well.

[First published at the Precursor blog.]

Categories: On the Blog

Obama, ISIS, and Being on the Right Side of History Between Tee Times

Somewhat Reasonable - August 21, 2014, 12:44 AM

President Obama on Wednesday slightly delayed his afternoon tee time to speak about the monstrous beheading of American journalist James Foley by ISIS. It was an underwhelming address from the Leader of the Free World, who finds the crown so heavy and bothersome that he puts it down beside the putting green.

In his address, Obama did well in the “sympathy-in-chief” role. I do believe that Obama is horrified and saddened, as all Americans are, about the tragic fate of James Foley. But Obama failed in his actual job — that of a leader who must express genuine and righteous anger about this act of barbarism against all people who cherish liberty.

Obama has displayed more passion and employed sharper rhetoric when talking about Republicans in Congress — who, last I heard, are not in the business of sawing off heads to make their point clear. Maybe we’ll get a better performance from our president if ISIS makes fun of the Obamacare website.

Read the whole transcript of Obama’s remarks here, but this is the excerpt that matters to me:

People like this ultimately fail. They fail because the future is won by those who build and not destroy. The world is shaped by people like Jim Foley and the overwhelming majority of humanity who are appalled by those who killed him.

Obama’s phrasing — “people like this ultimately fail” — is passive and weak. It’s akin to Obama’s frequent rhetorical tic about anyone in America who opposes his agenda being on the “wrong side of history.” It’s a throw-away line. It’s meaningless, especially from him. Our semi-retired president just doesn’t get it.

An ideology, a movement, or a nation ultimately fails because someone puts it on the wrong side of history. The history-writers are the ideological victors — almost always via war. The people of those nations sacrificed many lives and much treasure to write the “right side of history,” to ensure that “people like this ultimately fail.” At the end of the 20th Century, a history in favor of liberty was written by the West, the inheritors of the Enlightenment. Despite Obama’s rhetoric, such results did not, and will not, happen passively — and certainly not because The One merely states it.

It took the West’s leadership and action to ensure the Nazis would “ultimately fail.” It took the West’s leadership and action to ensure Soviet Communism would “ultimately fail.” It was the West’s reluctance to seek total victory in Korea that allowed the Kim clan to write its own “right side of history,” one that has starved and enslaved millions of innocents. The Democrats’ betrayal of their South Vietnamese allies allowed the murderous communists to write their own “right side of history.” So far, at least, many of the world’s oppressors and murderers have gone decades without being counted among those who will “ultimately fail.”

And if the West doesn’t fully rise to the challenge of Islamic Fascism, centuries of Enlightenment progress for the betterment of liberty will be wiped away in a new history written by these Islamist Fascist monsters.

Categories: On the Blog

Don’t Fear the Data-Reaper?

Somewhat Reasonable - August 20, 2014, 3:47 PM

Two articles today show how the Internet economy tends to be like the overall economy, but much, much faster. Innovation is faster, the rise of new companies is faster, and the maturing and death of those firms are likewise faster than in the industrial and service sectors that preceded the Internet economy and remain in place beside it.

The incredibly rapid rise of sales of smartphone apps, for example, has topped out and is likely headed for a precipitous decline, the Financial Times reports:

Almost a third of smartphone users do not download any apps for their devices in a typical month, according to a report by Deloitte that predicts the volume of app store sales is hitting a ceiling.

The average number of apps downloaded on a monthly basis has decreased considerably in 2014, the firm found in a survey of people in the UK. As smartphones saturate mobile markets in the US and Europe, developers must rely on customers continuing to download new apps for their businesses to grow.

A variety of problems—including concentration of sales among a few companies and the difficulty of marketing apps in an overcrowded market—indicate that the smartphone app sector has matured and is heading down:

The number of smartphone users who do not download any apps has reached 31 per cent, a steep increase from less than a fifth in a similar survey last year. For those that have, the mean number of apps downloaded has fallen to just 1.82, from 2.32 last year.

“Each additional new smartphone [owner] has less inclination to download apps, either out of apathy or, at a more global level, affordability,” Mr Lee said.

Turning to a Web phenomenon that is still rising rapidly and hastening the pace of change in the sector, Gene Marks notes in Forbes that a new web-based “sharing economy” is already having a huge effect and is poised to grow much bigger, with significantly greater consequences for our lives, for both good and ill:

[T]his sharing economy is going to change the world. It’s not cars or tasks or rooms or groceries. It’s data. Today’s cloud based software companies are building enormous troves of data. And the smarter ones are doing this because they see the future. And their future is sharing.

Your location is being tracked. Your purchase history is being stored. Your user profile has been collected. You are prompted to save your passwords. You are asked to confirm your personal details. You must submit an email address. You are required to provide your mother’s maiden name. Every hour more bits of data about you are being gathered, stored, categorized, and archived.

For the forward thinking software service, it’s not just about user licenses. It’s not just about selling boxes or books or tablets or shoes. It’s not merely a mobile app that lets you just buy concert tickets, listen to music or take a note. It’s about the data that’s being collected. It’s why Facebook purchased WhatsApp or why Amazon recently announced it was going into the mobile payments business to compete with the likes of Square and PayPal. The current fees from these services are not what’s important. In the long term, the data is what’s important.

Noting that we are currently in only “the very early days of the data sharing economy,” Marks tells readers they shouldn’t worry about this accumulation of information about them:

We are entering a world where the trillions of terabytes of data collected by software companies will soon be put to beneficial use. This world will make them a lot of money. And in return, the world will be a better place for you and me.

Companies will use this information, Marks says, to make life easier for you, and the fact that they will make money from the process should not bother us, he argues. There is some merit to that. After all, that’s what businesses do: give you things you want in exchange for things they want (usually money); it’s a mutually beneficial process. As an example of how it works, Marks quotes David Barrett, CEO of Expensify (a mobile expense reporting service):

“When you forward a travel itinerary to us, we identify it as such and build a travel profile with flight status updates and other features. With this we know not just your past preferences, but your future plans.” . . .

“Imagine you add in another player to this scenario: the OpenTable API (Application Programming Interface),” Barrett continued. “With this we could say ‘hey there, it’s dinnertime and you’re in a town you don’t know. There’s a great Thai food restaurant next to your hotel and Bob, another one of our users who traveled to this town, ranked this Thai restaurant 5 stars saying ‘So hot I cried!’ Do you want me to make a reservation?’ Or perhaps with GrubHub, we just order your favorite dish and have it waiting for you at the hotel. Or add in the Uber API: ‘I see you just landed, would you like a ride to the hotel? Or maybe a detour to this great Thai restaurant first?’”

“Then imagine you added a simple star-rating and review system onto the expense when you submit it. Now we have a business-travel focused Yelp, except “authenticated” via the credit card purchase (e.g. no reviewing someplace you didn’t go) and “weighted” by how much you spent there (someone who spends $100 should be more trusted than someone who spent $10).”
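A toy sketch of that spend-weighted, purchase-authenticated rating idea might look like the following. The data structures and function names are hypothetical stand-ins, not the actual Expensify, OpenTable, or GrubHub APIs; the sketch only illustrates the kind of join Barrett is describing.

```python
# Illustrative sketch of the data mash-up Barrett describes. Every class
# and function here is a hypothetical stand-in, not a real vendor API.

from dataclasses import dataclass

@dataclass
class Receipt:
    venue: str
    amount: float   # dollars spent, "authenticated" by the card charge
    rating: int     # 1-5 stars supplied when the expense is submitted

def weighted_rating(receipts: list[Receipt], venue: str) -> float:
    """Spend-weighted average rating: a $100 diner counts more than a $10 one."""
    relevant = [r for r in receipts if r.venue == venue]
    total_spend = sum(r.amount for r in relevant)
    if total_spend == 0:
        return 0.0
    return sum(r.rating * r.amount for r in relevant) / total_spend

def suggest_dinner(peer_receipts: list[Receipt], nearby_venues: list[str]) -> str:
    """Recommend the nearby venue with the highest spend-weighted peer rating."""
    return max(nearby_venues, key=lambda v: weighted_rating(peer_receipts, v))

# Example: Bob spent $100 at the Thai place and rated it 5 stars, so it
# outranks a cheaper, lower-rated option.
peers = [Receipt("Thai Palace", 100.0, 5), Receipt("Burger Stop", 10.0, 3)]
print(suggest_dinner(peers, ["Thai Palace", "Burger Stop"]))  # Thai Palace
```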

Later in the article, Marks quotes Barrett extending the example further:

“Want to get really crazy? How about: ‘Hey, this is a bit weird, but there’s another business traveler nearby in town for a bit who loves Thai food as much as you do, and I see from her calendar that she’s free; would you like to meet up at Bob’s favorite Thai place for some curry? If this isn’t your thing, let me know and I’ll never mention it again.’”

As that somewhat tongue-in-cheek but ultimately serious and plausible example shows, the personal-data-sharing economy is beginning to hit its economic prime years, as the technology could indeed be powerfully useful to consumers. We’ve gotten used to amazon.com and other online retailers making recommendations based on our past buying and browsing habits, and there is nothing different in kind about the advanced sort of data-sharing Marks and Barrett describe.

There is a big difference, however, between this kind of economy and conventional economic transactions. In the latter, the exchange is a straightforward trade of goods and services for money, which is merely a means of purchasing the former. In a personal-data-sharing economy, the exchange is, on the surface, strictly of information: the consumer lets the business collect his data, and the business then sells that data to other businesses or uses it itself to generate more business from that consumer and others. The consumer then gets information—about restaurants, transportation options, and even potential companions—in return for letting their data be mined.

The exchange, however, seems to give a much greater amount of power to the business than to the consumer, as is evident in the example of the dating recommendation cited above. When strangers are able to make dating recommendations without even being asked, they clearly have a good deal more power than the individual consumer in such a transaction. Such power could, as Marks notes, do much good. I am far from convinced, however, that most independent-minded people would prefer to live in such a world.

It appears, of course, that we probably won’t have much choice in the matter. Nonetheless, the example of the smartphone apps, noted above, may provide some hope for those who prefer privacy over convenience. In addition, widespread problems tend to bring forth commercially available solutions, which seems likely to happen with the personal-data-sharing economy. To paraphrase Mark Twain, the reports of the death of privacy may have been greatly exaggerated.

 

Categories: On the Blog

Digital Learning Makes Rewards Fun, Effective

Blog - Education - August 20, 2014, 9:26 AM

[NOTE: The following is excerpted from a chapter of the next Heartland Institute book titled Rewards: How to use rewards to help children learn — and why teachers don’t use them well.  Read the first part of this series here. This piece was first published at The American Thinker.]

Children today are much more comfortable using information technology than are those of previous generations.  Many grow up playing video games offering strong visual and audio stimulation, instant feedback on decisions, and nonfinancial rewards for achievement, such as winning competitions, accumulating points, and being able to move to the next level of a game.  The popularity of such games confirms what parents and good teachers know instinctively: children can acquire knowledge and learn new skills at seemingly phenomenal speeds when they are fully engaged in the learning experience.

Technology applied to learning, also known as digital learning or online adaptive instruction, has vast potential to transform schooling.  Either by itself or “blended” with traditional classroom teaching, digital learning is building a record of results substantially superior to traditional teaching and potentially far cheaper when used on a large scale.

Online adaptive instruction can provide in one package the goals, activities, tests, and incentives needed to accelerate student learning.  Students receive feedback as they move through a set of activities that the program customizes to their individual abilities.  Many programs utilize algorithms grounded in psychological research on common errors students have made in face-to-face settings.  Such research makes it possible to offer detailed cues for what to do next and prompt the user to move on to more difficult levels, or to repeat a lesson, perhaps from another perspective, when appropriate.
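As a rough illustration of the feedback loop described above, consider the minimal sketch below. The mastery thresholds and the advance/repeat/practice rule are assumptions chosen for illustration; commercial adaptive programs use far richer student models.

```python
# Minimal sketch of an adaptive-instruction step. The thresholds and the
# advance/repeat/practice rule are illustrative assumptions, not any
# particular product's algorithm.

def next_step(recent_scores: list[float], current_level: int,
              advance_at: float = 0.8, repeat_below: float = 0.5) -> tuple[str, int]:
    """Decide whether the student advances, repeats the lesson, or keeps practicing."""
    if not recent_scores:
        return ("practice", current_level)
    accuracy = sum(recent_scores) / len(recent_scores)
    if accuracy >= advance_at:
        # Immediate, nonfinancial reward: unlock the next level.
        return ("advance", current_level + 1)
    if accuracy < repeat_below:
        # Re-teach the same material, ideally from another perspective.
        return ("repeat", current_level)
    return ("practice", current_level)

# Example: 4 of 5 recent answers correct clears the 80 percent mastery bar.
print(next_step([1, 1, 1, 0, 1], current_level=3))  # ('advance', 4)
```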

While there are obstacles to the spread of digital learning, cost is not one of them.  The per-pupil costs of online schooling, which requires fewer teachers, have only recently been compared to those of traditional classroom instruction.  According to a study by the Thomas B. Fordham Institute, full online learning on average costs about $4,300 less annually than traditional schooling, while the blended model saves about $1,100 per student per year [1].  These cost savings are likely to increase over time as the technology improves and as educators gain experience in its use.  Requiring nine rather than 12 years of schooling would reduce costs substantially more.

Best Practices

Digital learning is spreading quickly as parents, students, and educators recognize its transformative potential.  Some obstacles need to be overcome, such as certification requirements that block entry into the teaching profession by talented and motivated individuals, seat-time and class-size requirements that make school schedules rigid and unable to accommodate computer lab sessions, and opposition from teachers’ unions [2].  A rapidly growing community of educators with experience using digital learning tools, along with a growing literature describing best practices, is available to reformers who want to accelerate this progress.

The Digital Learning Council, a nonprofit organization launched in 2010 to integrate current and future technological innovations into public education, has produced a series of publications (all of them available online) to help parents, educators, and policymakers find and use the best practices for digital learning.  The council has proposed “10 Elements of High Quality Digital Learning,” which it describes as “actions that need to be taken by lawmakers and policymakers to foster a high-quality, customized education for all students.  This includes technology-enhanced learning in traditional schools, online and virtual learning, and blended learning that combines online and onsite learning.”

In 2011, the American Legislative Exchange Council (ALEC), a respected membership organization for state legislators, adopted a model resolution endorsing the “ten elements” approach.  In 2012, ALEC created and endorsed model legislation, the Statewide Online Education Act, that provides a detailed template for states to follow to remove roadblocks to expanding digital learning.  The National Conference of State Legislatures (NCSL), another organization of state legislators, also has endorsed expanding the use of digital learning and provides case studies of its successful implementation [3].

The Clayton Christensen Institute for Disruptive Innovation, formerly the Innosight Institute, is another good source of best practices.  The nonprofit think-tank was founded by Harvard professor Clayton M. Christensen, author of the 2008 bestseller Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns.  The organization conducts original research on the cutting edge of digital learning, consults with elected officials, and provides speakers for public events.  Researchers affiliated with the organization have created a “blended-learning taxonomy” that distinguishes among the various ways of blending digital learning with traditional schooling, such as Station Rotation, Lab Rotation, Flipped Classroom, Flex, A La Carte, Enriched Virtual, and Individual Rotation models.

Conclusion

Digital learning – the combination of online adaptive testing and instruction made possible by new technologies, software, and the internet – is beginning to transform K-12 education.  It accelerates learning for a number of reasons, but an important one is that it makes rewards for learning more accurate, timely, and attuned to the interests and abilities of students.  It promises to deliver the “creative destruction” required to substantially improve America’s failing elementary and high-school system.

ClassDojo, Goalbook, and Funnix are three examples of the rapidly growing number of software programs available to educators to bring digital learning into the classroom.  Rocketship Education, Khan Academy, Coursera, and Udacity illustrate the variety of new institutions that are using digital learning to transform traditional teaching methods.  Given the pace at which software is improving and institutions are evolving, these examples may seem out of date in a few years.

Research shows substantial positive achievement effects of online education in pre-internet days and larger effects in recent years.  More advanced technologies used on a much wider scale promise even larger achievement effects, lower costs, and a greater variety of incentives, curricula, and teaching methods from which parents, students, and educators can choose.  Obstacles in the path to increased use of digital learning can be removed by parents and policymakers working together to adopt the policies recommended by pioneering leaders in the field, the Digital Learning Council, and other groups supporting this disruptive innovation – which will likely lead to far more effective education.

Herbert J. Walberg and Joseph L. Bast are chairman and president, respectively, of The Heartland Institute and authors of Rewards: How to use rewards to help children learn – and why teachers don’t use them well (October 1, 2014; ISBN 978-1-934791-38-7).  This article is excerpted from Chapter 10, “Rewards and Digital Learning.”

[First published at the American Thinker.]

Notes

[1] Tamara Butler Battaglino, Matt Haldeman, and Eleanor Laurans, “The Costs of Online Learning,” in Chester E. Finn, Jr. and Daniela R. Fairchild, eds., Education Reform for the Digital Era (Washington, DC: Thomas B. Fordham Institute, 2012), pp. 55–76.

[2] Chester E. Finn, Jr. and Daniela R. Fairchild, “Overcoming the Obstacles to Digital Learning”; Paul T. Hill, “School Finance in the Digital-Learning Era”; and John E. Chubb, “Overcoming the Governance Challenge in K-12 Online Learning,” all in Chester E. Finn, Jr. and Daniela R. Fairchild, eds., ibid., pp. 1–11, 77–98, and 99–134.

[3] Sunny Deyé, “K-12 Online Learning Options,” National Conference of State Legislatures, Legisbriefs 21, no. 16 (April 2013).


Don’t intervene in Azeroth: Thieves in MMOs shouldn’t face criminal charges

Out of the Storm News - August 20, 2014, 9:04 AM

While the world reels from the Ebola epidemic, the instability in Gaza and the intensifying tensions between Russia and Ukraine, one British politician seems to have put his finger on a crisis that no one noticed. Mike Weatherley, MP, proposed criminal penalties for (wait for it) the theft of items in the virtual world of Azeroth, home to the popular World of Warcraft role-playing game.

As Weatherley describes it, the law would mean that people “who steal online items in video games with a real-world monetary value receive the same sentences as criminals who steal real-world items of the same monetary value.”

It’s tempting to dismiss this idea as the ramblings of an easily wounded WoW player who just happens to have the power to legislate at his back. But Weatherley does point to a significant issue that has arisen in the game world, one that does seem to implicate virtual items as real-world property, even if his proposed means of responding to it is wrongheaded. Given that this policy area may become more relevant as online gaming (especially the sort powered in part by real money) grows in popularity, it behooves us to take a look at Weatherley’s proposal.

Let’s start with what Weatherley’s idea gets right, which is that items in video games like WoW do have real-world monetary value. This is true even when they aren’t sold directly for real money, because the virtual currencies used in these games often have a calculable exchange value relative to the dollar, and that value can be exploited. The practice of “gold farming” in WoW for the purpose of selling virtual gold in online marketplaces for real money has become infamous. So has the case of Anshe Chung, a Second Life character whose real-life player, Ailin Graef, became the first “virtual millionaire” by turning her virtual prostitute character into a real estate mogul within the game.

What’s more, the practice of buying and selling rare items with real money has a long history in video game culture. This author, for instance, recalls the thriving trade in “Rings of Jordan” in the Diablo II multi-player community, a trade that exploded with the introduction of World of Warcraft. At this stage, it is beyond doubt that high-value items within online games do have actual real-world value. For instance, an “epic mount” in WoW can fetch as much as $500. Faced with sums like this, the question has to be raised: Why shouldn’t theft of such goods be treated as a criminal act?

Well, because despite their value, these virtual items still exist within a virtual ecosystem whose rules don’t quite line up with the real world’s, an ecosystem that already has its own mechanisms for enforcing property rights and in which the law is singularly ill-equipped to operate. A metaphor that treats the theft of one player’s epic mount as equivalent to stealing a real thoroughbred horse looks persuasive on its face, but when you actually think about how it would be prosecuted, the metaphor breaks down quickly.

In an online game, identity can be far more fluid than in real life. WoW ties players’ identities to their email addresses, meaning that a player who finds himself banned on one account can easily create a new one with a new email address. This isn’t a complete barrier to enforcement of criminal laws in games like WoW, which operate on a “pay to play” basis and thus give law enforcement an additional means to track players through their credit cards, although there are privacy concerns and systemic identity-theft problems associated with doing so. But in free massively multi-player games like the Guild Wars series, email addresses might be the only clue to the criminal’s identity. It’s also quite plausible that a parent whose credit card was used in WoW might find themselves on the hook for “crimes” their children committed.

Players who do steal or otherwise violate in-game rules are usually held accountable by a much more efficient system than a legal regime — the moderators of the game itself. These moderators are usually successful at weeding out problematic players by banning their accounts or using other technical solutions that would run afoul of the Constitution if employed by law enforcement. Companies like Blizzard, the maker of WoW, are allowed to be more draconian in ways that are more appropriately tailored to the setting, and thus more effective, than the law enforcement system.

Recognizing video game items as legally protected “property” raises a whole host of troubling legal questions about what is actually happening in games like WoW. For instance, along with selling items, players also sometimes sell epic-level characters to other players (this author even bought one once), which serve as a means for less-experienced players to gain access to perks within the game that would otherwise be off-limits. But if the items are real, then does that make transactions of this nature slavery? Do players who kill each other in-game, especially in games where no resurrection is allowed, face charges of murder or destruction of property? And are “griefers” and trolls to be treated as terrorists? Just how close do the rules of virtual life and real life have to be before the police force gets involved?

Having had my only set of unique armor in Diablo II stolen at age 13, I understand the impulse to seek an authority who can clap an online tormentor in very real irons. However, in this case, the remedy is worse than the disease. A discussion of civil penalties might be more appropriate, but when it comes to the application of criminal law to the world of Azeroth or EVE Online, the only winning move is not to play.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

For insurers, California’s ‘diverse procurement’ is well-meaning and wrong

Out of the Storm News - August 19, 2014, 2:10 PM

Under Section 927.2 of the California Insurance Code, insurers writing $100 million or more in annual premiums in the state are required to submit reports to the Department of Insurance on what efforts they’ve made to procure business from firms owned by racial minorities, women and disabled veterans.

On the face of it, this “diverse business procurement” policy is laudable, because it seeks to empower historically marginalized communities while being minimally invasive and nominally disruptive to insurers. Deeper analysis yields a different conclusion, especially when one considers the unique degree of government involvement such a policy requires.

The degree to which government should be involved in business judgments should be predicated on the degree of separation between government and private conduct. The closer the private conduct is to the operation of government, or the more dependent that conduct is upon government sanction, the better the government’s case for substituting its goals for those of the private entity in question.

In the case of California’s diverse business procurement efforts, three tiers of involvement have developed. At the first tier is the state government itself, which is well within its proper scope of authority when it establishes its own procurement framework. Second are quasi-governmental bodies and third are heavily regulated private bodies. The distinction between the final two tiers is important for policy makers to keep in mind when they consider which praiseworthy objectives are to be achieved through disruptive means.

In California, in the second tier, state government has long obligated private actors to seek to redress social ills by substituting its policy preferences for private business judgment. Supplier diversity programs are designed to improve the financial standing of historically marginalized communities by channeling specific business expenditures (paper supplies, for instance) to firms that are owned by women, minorities and disabled veterans. The channeling function is created by obligating regulated parties to publicly report the extent to which their business procurement is “diverse.” Since 1986, firms that are subject to regulation by the California Public Utilities Commission have been made to report the extent to which they have achieved “diverse business procurement.”

Public utilities were the first candidates for adoption of diverse supplier requirements because of their close regulatory relationship with the state. Utilities are granted quasi-monopoly status because they meter out public services broadly thought of as essential goods. As a result, public utilities do not function in highly competitive markets. As protectors of the public utilities’ monopolies on public goods, policymakers saw little harm in directing them to seek out diverse suppliers at the admittedly nominal expense of ratepayers.

More recently, however, California’s highly regulated insurers have been swept into similar treatment. Quickly drawn comparisons between public utilities and insurers give the mistaken impression that the two are alike. For instance, like public utilities, insurers are heavily regulated. Like utility services, insurance products are purchased by virtually all Californians. Where the similarities end and the meaningful distinctions between the two begin is the point at which consumer choice enters the equation.

The government-protected quasi-monopoly status enjoyed by public utilities and the competitive marketplace in which insurers operate require that the two approach their consumers differently. While public utilities rely on tangible materials subject to physical scarcity or decay, insurers offer services of a non-physical nature, which are capable of rapid change and innovation to meet consumer demand. This has led to a voluntary insurance market in which market advantages are sought by attempting to maintain low rates through innovation and hard-fought proprietary advantage – not through state protection.

Were policymakers interested, they would find that their heavy-handedness is working at cross purposes with their stated objective. By imposing their own priorities on the insurance market, California’s policymakers have not only prevented policyholders, many of whom are themselves members of historically marginalized communities, from realizing their lowest possible rates, but they have also encumbered the robust procurement diversity programs insurers already operate at a national level by subjecting them to state requirements.

 

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.