Out of the Storm News

Free markets. Real solutions.

Don't take the Texas GOP's crazy platform too seriously

June 25, 2014, 9:00 AM

In recent weeks, more than a few groups – including one on whose board I serve – have denounced various aspects of the Texas Republican Party’s platform.

Indeed, there’s a lot to criticize. In addition to its absurd call for so-called “reparative therapy” for homosexuals and bans on pornography, the platform draft includes nativist language on immigration and an attack on vaccination. There’s also some conspiracy theory garbage opposing Sharia law and the United Nations’ Agenda 21.

A few parts of the platform seem downright sloppy: one provision calls for the repeal of all laws “regarding the production, distribution or consumption of food.” I’m sympathetic to what the writers of this were probably thinking, but taken literally, the provision would make it legal to label jars of baby food as containing “carrots and peas,” even if what they really contained was fermented gerbil vomit.

It also includes some foreign policy planks that somebody must care about but that seem quite out of place for what is, after all, a state party.

For all its real flaws, however, the platform is a pretty decent summation of current streams of thought among the populist, socially conservative right. The current draft calls for the outright repeal of the Patriot Act as well as the National Defense Authorization Act provisions that allow the use of military tribunals for trying terrorists. Both provisions would probably get more votes in the Democratic caucus than the Republican one. Previous iterations of the Texas GOP platform have also called for usury laws—government price controls on interest rates—and mandatory labeling of genetically modified food.

Frankly, much of the platform’s weirdness and its strong populist flavor come from the unusual way that Texas drafts its platform. The platform comes from a drafting committee, just as most other state platforms do. But where most party platforms typically are written by insiders for media consumption, the Texas GOP platform is debated and rewritten by anybody who takes time off and pays the fee to attend the party convention. The result is that it’s a true “grass roots” platform that reflects the feelings of the party’s activists, rather than its officeholders.

People with a pet issue can usually get it in, so long as it isn’t too contentious a topic. And I know this for a fact. In 2012, the Heartland Institute’s then-Texas director, attending the convention in her own private capacity, got some language into the platform on property insurance that I helped her write.

This method of writing the platform serves to tell office-holders what their grassroots are really thinking, rather than serving as a manifesto written by those officeholders. The platform may well pull Texas office-holders in a populist, social conservative direction, but it doesn’t necessarily prove much about how they would govern.

A group of Democrats coming together and voting in the same way would likely call for vastly higher taxes; a straightforward government takeover of health care; a forced conversion to “green energy” that would wreck the economy; an end to secret ballot elections for unions throughout the country; imposition of racial quotas on private employers; government bans on “unhealthy” food; outright confiscation of guns; laws against “hate speech”; Internet censorship; denial of broadcast licenses to “unbalanced” (read: conservative) media; new restrictions on prayer in public; and taxpayer funding for partial-birth abortion. And there would probably be a long Noam Chomsky-inspired rant against corporations thrown in somewhere as well.

Only a handful of Democratic politicians currently in office publicly support any of these things, and most would probably oppose them if asked. Many Democratic voters would probably oppose them too. But all are popular with certain parts of the Democratic base, and a left-wing populist convention would probably write a platform containing all of them.

My point isn’t that the Texas Republican platform is irrelevant: it does reflect the views of a certain portion of the Republican Party and may influence their views. But nobody should mistake it for a manifesto on how Republican officeholders in Texas or anywhere else plan to govern.


This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

Insurance by/at the pound?

June 25, 2014, 8:00 AM

It has never been more of a dog’s world. Consider: 45 percent of households own dogs.

With such a high rate of cohabitation, insurers have seen dog-related claims rise. According to the Centers for Disease Control and Prevention (CDC), approximately 4.5 million Americans are bitten by dogs each year. Of those 4.5 million, 885,000 are wounded seriously enough to require medical attention.

It is undeniable that dogs injure an enormous number of Americans. Having a dog in the home is associated with a higher likelihood of being bitten, and people with two or more dogs in their homes are five times more likely to be bitten than those without any dogs.

The Insurance Information Institute reports that one of every three dollars paid out in homeowners’ liability claims is the result of a dog-related claim, and that the average cost paid out for such a claim was $27,862.

Whether or not the associated risks and costs of dog ownership are recognized, the popularity of dogs persists.

Insurers, as their business demands, assess the cost of the risk that dogs represent, and have compiled information on dog-related claims. From this information, some insurers have chosen to adjust their underwriting practices to account for the risk profiles of different breeds. This activity sometimes leads to a policy showdown.

Two states, Pennsylvania and Michigan, have chosen to prevent insurers from distinguishing between the risks that different breeds represent. To forestall what proponents of such bans describe as “breed discrimination”, both states have prohibited underwriting practices that are sensitive to breed. There are two rationales behind such bans: one is emotive, the other policy-based.

First, such bans are, to a dog-loving populace, intuitively attractive. Let’s face it: though legally property, dogs are so much more to those who love them. For those who love dogs, the term “breed discrimination” holds a great deal of rhetorical power. Semantically, “breed discrimination” sounds odious. And politically, fighting against discrimination is almost always good … right?

Well, not really. Risk classification of all types is, in a literal sense, the “quality or power of finely distinguishing.” In other words, legal discrimination. In spite of the legality and inevitability of risk classification, the issue remains sensitive.

Second, even without access to the proprietary data that insurers may have, underwriting according to breed – in the strict sense – is problematic. Though the CDC attempts to measure the health risks posed by one breed versus another, it concedes the shortcomings of such an approach. Data about attacks are self-reported and prone to inaccuracy. Further, the majority of dogs in U.S. households are not purebred. The existence of mutts and customized cross-breeds (for instance, any dog known as a “fill-in-the-blank”-doodle) complicates easy classification.

From the perspective of insurers, who are interested in pricing their products competitively so that clients posing a low risk pay a lower premium, forbidding the use of dog breed data is problematic. In every state, insurers are required to underwrite on an “actuarially justified” basis, meaning that only legitimate cost factors may be taken into account. Breed data, though imprecise, meets this threshold because breed serves as a proxy for risk factors that are statistically correlated with claims.

More specifically, it is known that male dogs bite more frequently than female dogs; that non-neutered dogs are more likely to bite than neutered dogs; and, that chained dogs are more likely to bite than unchained dogs. Further, it is known that larger dogs are capable of causing greater injury than are smaller dogs. Thus, to the extent that some breeds are more likely to possess any number of the enumerated characteristics, some insurers have come to believe that there is a meaningful correlation between that breed and heightened claim risk.
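The actuarial reasoning here is simple arithmetic: a risk-based loading is just the difference in expected claim cost (frequency times severity) between rating groups. The sketch below uses the $27,862 average severity cited above, but the claim frequencies are invented for illustration and are not real insurer data.

```python
# Hypothetical sketch of an actuarially justified premium loading.
# Only the $27,862 average severity comes from the article; the claim
# frequencies below are invented for illustration.

def expected_annual_claim_cost(claim_frequency, average_severity):
    """Expected liability cost per policy per year: frequency x severity."""
    return claim_frequency * average_severity

# Suppose non-dog households file dog-related liability claims at 0.1%
# per year and dog-owning households at 0.4% per year.
base_cost = expected_annual_claim_cost(0.001, 27_862)  # about $27.86
dog_cost = expected_annual_claim_cost(0.004, 27_862)   # about $111.45

# The justified loading is the difference in expected cost; no normative
# judgment about dogs or breeds enters the calculation.
loading = dog_cost - base_cost
print(f"Expected extra cost per policy: ${loading:.2f}")
```

On these invented numbers, the loading works out to roughly $84 per policy per year; a breed-sensitive insurer would simply compute a separate frequency for each breed group.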

To be clear, insurers that underwrite based on breed are not guided by a normative judgment.

Still, many insurance customers will find such reasoning unsatisfactory because, without access to the data that undergirds it, the correlation will, in their view, not rise to a sufficient level of statistical relevance. Fortunately, there is recourse available to those customers and it lies in the realm of the free market.

Some companies, like State Farm, forgo breed-sensitive underwriting in favor of adjusting premiums after a dog has demonstrated itself to be a risk. It is not inconceivable that State Farm, by accepting some losses it might otherwise have avoided through breed-sensitive underwriting, has gained a reputational advantage that could well offset those costs and increase its market share and profits. State legislatures, instead of succumbing to the temptation to ban breed-sensitive underwriting, should recognize that the market has a solution to unpopular underwriting practices.

With 885,000 people a year wounded seriously enough by dogs to require medical attention, insurers are justified in classifying dog owners of all types as a higher risk than those who do not own dogs.

If legislatures must pass laws to save dogs and their owners from the effects of breed discrimination, perhaps they should allow insurers a rating exception as they pass those laws. The exception would be a straightforward approach to assessing the risk a dog embodies: simply weigh the dog. We know that the larger the dog, the more damage it can do. Can scales predict what breed is not allowed to?


This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

Checkmate for E3's critics

June 24, 2014, 9:00 AM

Imagine the following: A new form of entertainment enters the market, which engages its players in intense competition, sometimes to the exclusion of their social lives, and which seems to many to be a waste of their energy when they could be accomplishing more useful things.

In fact, several argue the intense competition and warlike elements of this new form of entertainment make its participants prone to violence, and urge that it be removed from polite society in favor of the older, more respectable forms of social entertainment.

I am referring, of course, to chess, a game which is regarded today as an eminently respectable pastime requiring great reservoirs of strategic skill. Chess champions are international celebrities, and often take up (and bolster) political causes, as in the case of Garry Kasparov’s advocacy for the Magnitsky Act. Yet, as io9 documents, there was a time when statements like the following were written without irony about the game:

The great interest taken in this warlike game — the importance attached to a victory — and the disgrace attending defeat, are exemplified in numerous instances handed down to us by various writers, of which the most worthy of notice are the following….

Richlet, in his Dictionary, article Echec, writes, “It is said, that the Devil, in order to make poor Job lose his patience, had only to engage him at a game at Chess.”

Would that we could look at similar accusations against video games with similar derision, especially given that the video game industry recently held its annual event showcasing the coming year in technology and games, known commonly as E3.

Unfortunately, it seems that every time the video game industry dares to show its face in public, a hyperventilating article is never far behind. This year’s E3 was no exception, as the New York Times‘ Nick Bilton fretted about the convention:

But it is hard to argue that there isn’t some level of desensitization after a day spent at E3. At the main entrance of the Los Angeles Convention Center, where the conference was held, people lined up to play the new game Payday 2. In this game, you team up with friends to rob a bank. Killing police is a big part of succeeding.

As I watched people picking off cops and security guards with sniper rifles and handguns, news broke that a real-life shooting in Las Vegas had resulted in the death of two police officers and three civilians (including the two shooters).

I asked Almir Listo, manager of investor relations at Starbreeze Studios, which makes Payday 2, if he felt in any way uncomfortable about making a game that promotes shooting police.

“If you look hard enough, you can find an excuse for everything; I don’t think there is a correlation,” he said. “In Sweden, where I am from, you don’t see that stuff happen, and we play the same video games there.”

After the Sandy Hook shootings in Connecticut, when it became clear that Adam Lanza was a fan of first-person shooters, including the popular military game Call of Duty, President Obama said Congress should find out once and for all if there was a connection between games and gun violence.

“Congress should fund research on the effects violent video games have on young minds,” he said. “We don’t benefit from ignorance. We don’t benefit from not knowing the science.” Yet more than a year later, we don’t conclusively know if there is a link.

In the event that Payday 2 is ever played competitively at the same level that chess is (or, for that matter, that StarCraft is in South Korea), one can only hope that articles like Bilton’s will be held up to a similar level of ridicule.

Where to begin? Perhaps with the fact that Adam Lanza’s favorite game – the one that he was, in fact, said to be “obsessed with” in the Sandy Hook Crime Report – was the thoroughly non-violent Dance Dance Revolution. Or that Bilton offers no evidence the shooters in Vegas had ever heard of video games, let alone Payday 2. Or perhaps the fact that, like so much of what President Obama suggests Congress fund, there is no need for research on the effects of violent video games on young minds. There have been scores of such studies already, and every one not funded by anti-video game activists, or so vague in its conclusions as to be meaningless, shows no significant effect of video games, violent or otherwise, on young minds (or, for that matter, adult ones).

That Bilton, whose coverage of the rest of E3 was relatively balanced, should fall for the chicanery that video games can cause violence – or, as the new and far less menacing cries of alarm would have it, “decrease empathy” – is sad, but not unexpected. Video games are a convenient bogeyman for a society that labels even the Western canon with “trigger warnings” and frets about whether even a single sexist joke could somehow lead to mass acceptance of rape.

Video games are unapologetic in their gore, their violence and their transgressiveness. Despite all that, they are and remain harmless. In this era of oversensitivity, the imperviousness of video games to the mindless censoriousness of our politically correct, morally panicky current culture is a refreshing checkmate in what often looks like a losing battle for free speech.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

R Street urges state and local governments to take steps on e-cigarettes

June 24, 2014, 8:05 AM

WASHINGTON (June 24, 2014) – State and local governments should take steps to curb cigarette use and promote e-cigarettes as a real driver of tobacco harm reduction, said the R Street Institute in a paper released today.

Authored by Dr. Joel Nitzkin, R Street senior fellow and public health expert, “E-cigarette primer for state and local lawmakers” lays out the benefits associated with using e-cigarettes as a means to quitting traditional cigarettes, while promoting controls through regulation to keep all tobacco products out of the hands of minors. 

“Adding a tobacco harm reduction component to current tobacco-control programming is the only policy option likely to substantially reduce tobacco-attributable illness and death in the United States over the next 20 years,” said Nitzkin. “Sensible FDA regulation will be needed if e-cigarette makers and vendors are to present the level of risk posed by these products honestly. However, any regulation must be evidence-based, practical and reasonably streamlined in a way that will protect and advance public health.”

As draft FDA regulations begin to make their way through the process, Nitzkin outlines several steps that state and local governments can take in the meantime.

First, state and local governments should fully enforce age restrictions on the purchase of all tobacco products, and consider raising the age restriction from 18 to 21 to remove cigarettes from the high school environment. Second, to encourage users to switch, governments should heavily tax cigarettes, but only lightly tax lower-risk products. Third, governments should consider implementing non-pharmaceutical smoking-cessation protocols that could prove more effective for long-term abstinence. Finally, governments should urge tobacco-control leaders to open a dialogue with those in tobacco-related industries who endorse e-cigarettes as the solution to curbing cigarette use and would welcome the opportunity to partner with the public health community in pursuit of shared objectives.

Simultaneously, governments should urge the FDA to sensibly regulate e-cigarettes and other lower-risk tobacco products by prohibiting sales to minors, restricting marketing and assuring quality and consistency of manufacture. They should urge the FDA not to impose restrictions on flavoring or nicotine content that would make those products unpalatable to smokers who otherwise would switch. 



E-cigarette primer for state and local lawmakers

June 24, 2014, 8:00 AM

Cigarettes kill an estimated 480,000 Americans each year. An estimated 46 million Americans smoke cigarettes, the most hazardous and most addictive of tobacco products. Despite our best efforts, these numbers have been consistent, year to year, for more than a decade. Switching from cigarettes to a smokeless tobacco product or an e-cigarette can reduce a smoker’s risk of potentially fatal tobacco-attributable cancer, heart and lung disease by 98 percent or better. This approach is called “tobacco harm reduction” (THR). Adding a THR component to current tobacco-control programming is the only policy option likely to substantially reduce tobacco-attributable illness and death in the United States over the next 20 years. The e-cigarette family of products offers the most promising set of harm reduction methods because of their relative safety compared to cigarettes, their efficacy in helping smokers cut down or quit and their unattractiveness to teens and other non-smokers. They also promise to be less addictive than cigarettes and easier to quit.

This primer provides evidence in favor of e-cigarettes as a THR modality and a review of the arguments against them. Many in tobacco control oppose any consideration of e-cigarettes because of their dislike of the “tobacco industry”; because they fear that THR will attract large numbers of teens to nicotine addiction; because the case in favor of e-cigarettes has not been proven to their satisfaction; and possibly because of likely harm to the major pharmaceutical firms that now support much tobacco-control research and programming. This primer closes with recommendations for actions state and local lawmakers should and should not consider with respect to THR and e-cigarettes.

The Burkean case for immigration reform

June 23, 2014, 9:00 AM

Demos blogger Matt Bruenig, in an apparently Burkean mood, writes:

The biggest factor in production is not nature, labor, or capital, but in fact accumulated technology and knowledge that comes to us as an unearned inheritance from the past. The marginal productivity of that unearned inheritance accounts for the majority of our economic output. Imagine you held everything else equal in the economy, but then ticked off electricity technology (which nobody alive has produced). By how much would the economy shrink? A ton.

I say Burkean, of course, because of Edmund Burke’s famous passage:

We are afraid to put men to live and trade each on his own private stock of reason; because we suspect that this stock in each man is small, and that he would do better to avail himself of the general bank and capital of nations, and of ages.

Bruenig correctly points out that it is the bank and capital of nations and of ages that account for nearly all of our economic activity. We owe our know-how, our “lower-level knowledge” as Amar Bhide puts it, to those who came before us. We live off of their accomplishments, and can only aspire to add something meaningful to them.

A drastically more open immigration policy makes a great deal of sense, from this perspective. Global wealth is greatly hindered by the fact that nearly all of humanity is stuck in places that do not have a lot of capital in the bank of their nations, so to speak. From a broad point of view, allowing as many people as possible to move to areas where they can participate in greater “accumulated technology and knowledge” enriches the world. From a humanitarian point of view, it most directly lifts the poorest people on Earth out of poverty, and from a selfish point, it is highly likely to enrich the average American.

The great innovators of the 19th and 20th centuries in this country were largely either immigrants or the children or grandchildren of immigrants. Who believes that America would have been better off without a Ford or a Carnegie?

Some fear that opening the door to the bank and capital of our nation will do violence to those institutions, but history has not given us reason to give much credibility to this concern. It certainly did not happen when we had drastically more open borders in the 19th century. Technology and lower-level knowledge are accumulated by the sweat of our brows, and the more people we have to get to work pushing the frontier further, the better off we will all be, to say nothing of those who will inherit our legacy.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

Patents? Where we're going, we don't need patents

June 23, 2014, 9:00 AM

A little more than a week ago, had you walked through the lobby of Tesla Motors – not that I ever have – you would have admired a wall displaying hundreds of patents belonging to the company.

This week, however, it’s bare.

That’s because Tesla founder and serial entrepreneur Elon Musk has removed them “in the spirit of the open source movement, for the advancement of electric vehicle technology.” In a June 12 blog post, Musk declared that Tesla would no longer enforce protection on its patented technologies. Instead, the company plans to open its doors in hopes that other firms will enter, foster innovation and grow the electric vehicle market.

There are those who have scoffed at this news, believing Musk was embarking on a high-profile publicity stunt. It’s certainly true that Tesla and Musk have gotten the media’s attention. But more importantly, the move has garnered attention for the open source movement – a movement that has been stifled by patent trolls and hungry patent attorneys.

Musk elaborates on that point by writing that patents these days serve only to “stifle progress, entrench the positions of giant corporations and enrich those in the legal profession, rather than the actual inventors.” He goes on to say that receiving a patent “really just means that you bought a lottery ticket to a lawsuit.” A lottery that runs to millions of dollars in attorney fees and court costs is no winning one, and one I certainly wouldn’t want to be a part of.

Indeed, as the dust settles on this news it does appear that Tesla’s attempt at promoting electric car programs is paying off. It was reported by the Financial Times that BMW and Nissan, two of Tesla’s biggest rivals, are interested in pairing with the company to expand its network of charging stations throughout the United States. Indeed, investors seem to agree with the company that the network effect from having more electric cars on the road, and thus more charging stations, is more important than the monopoly rights granted by the patents. Tesla’s stock price soared to its highest point in months over the past week and a half.

Obviously, it’s going to take a lot more than one high-end car company standing up to say that they’re tired of the way the patent system works, especially on the infringement front, but Tesla is hopefully paving the way and opening the door for more major players to see that the open source movement can be a winning strategy for everyone involved. After all, has history ever proven that patents were a positive for innovation?

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

Current design patent laws stifle innovation and competition, R Street study finds

June 20, 2014, 3:01 PM

WASHINGTON (June 20, 2014) – Current design patent law provides incentive for frivolous lawsuits and abuse, said the R Street Institute in a policy paper released today.

Authored by Ned Andrews, the paper, “Is interactive design becoming unpatentable?,” lays out recommendations for modernizing the design patent system to allow smaller companies to enter the technological market.

“In order to have the kind of ornamental status that could be the subject of a design patent, an object must possess either some entirely nonfunctional feature or be the result of workmanship that does not contribute in any way to its function,” wrote Andrews. “Current definitions falsely equate the aesthetic merit of functionality with that of applied ornamentation. Thus, some inventors seek design protection for aspects of an object that are, in fact, functional.”

Andrews writes that the system creates an incentive for companies to acquire the patent rights for designs that are as aesthetically or conceptually simple as possible. They then wait for another company to develop a product that resembles the original and then file a claim of infringement, hoping that a manufacturer-defendant will agree to an early settlement.

“The parties that tend to come out on top are the biggest players – the Apples and Samsungs,” he wrote. “This interferes with smaller players’ ability to make headway on a useable portion of their own applications, because they can’t afford to risk a lawsuit from or pay the fees demanded by the trolls or big firms.”

Andrews recommends modernizing the design patent system in a variety of ways. First, impose a simple test: if the device would be less functional if the claimed aspect of the design were absent, the claim in question fails the non-functionality test. Second, courts should limit the findings of design infringement to cases in which the similar aspects of the article’s design perform an ornamental purpose, rather than a functional purpose. Third, both the U.S. Patent and Trade Office and the courts should renew their attention to the criteria of novelty and non-obviousness.

Finally, courts should make standard the practice, in “exceptional cases” of bad faith or misconduct, of awarding reasonable attorney’s fees to the prevailing party in a civil case.



Murray Rothbark

June 20, 2014, 12:28 PM

Murray Rothbark is R Street’s distinguished visiting office dog and director of canine policy.

The case in favor of e-cigarettes for tobacco harm reduction

June 20, 2014, 10:06 AM

This paper has been accepted for publication in the International Journal of Environmental Research and Public Health.

A carefully structured Tobacco Harm Reduction (THR) initiative, with e-cigarettes as a prominent THR modality, added to current tobacco control programming, is the most feasible policy option likely to substantially reduce tobacco-attributable illness and death in the United States over the next 20 years. E-cigarettes and related vapor products are the most promising harm reduction modalities because of their acceptability to smokers.

There are about 46 million smokers in the United States, and an estimated 480,000 deaths per year attributed to cigarette smoking. These numbers have been essentially stable since 2004. Currently recommended pharmaceutical smoking cessation protocols fail in about 90% of smokers who use them as directed, even under the best of study conditions, when results are measured at six to twelve months.

E-cigarettes have not been attractive to non-smoking teens or adults. Limited numbers of non-smokers have experimented with them, but hardly any have continued their use. The vast majority of e-cigarette use is by current smokers using them to cut down or quit cigarettes. E-cigarettes, even when used in no-smoking areas, pose no discernible risk to bystanders. Finally, addition of a THR component to current tobacco control programming will likely reduce costs by reducing the need for counseling and drugs.

R Street commends committee for passing TRIA reform

June 20, 2014, 9:26 AM

WASHINGTON (June 20, 2014) – The R Street Institute welcomed today’s passage of H.R. 4871, the TRIA Reform Act of 2014, by the House Financial Services Committee.

The measure, sponsored by Rep. Randy Neugebauer, R-Texas, calls for a five-year extension of the federal Terrorism Risk Insurance Program, a $100 billion reinsurance backstop originally passed in the wake of the Sept. 11, 2001 terrorist attacks. However, the bill includes important taxpayer-protection provisions that gradually shrink the size of the federal program.

“Rep. Neugebauer’s bill strikes the proper balance between ensuring that sufficient capacity exists for U.S. businesses to insure against catastrophic terrorism, while also guarding against government subsidies that would unjustly enrich insurance companies and major commercial real estate developers,” R Street Senior Fellow R.J. Lehmann said.

Under terms of the TRIA Reform Act, the trigger level for conventional terrorism attacks would be raised gradually from the current $100 million to $500 million by the end of 2019. For attacks involving nuclear, chemical, biological and radiological events, all of which must be covered by law under workers’ compensation policies, the program’s current terms would remain intact.

“Reinsurance broker Guy Carpenter recently issued a report finding that multiline terrorism reinsurance capacity is about $2.5 billion per program for conventional terrorism and about $1 billion per program for coverages that include NBCR,” Lehmann said. “Given those figures, and the continuing growth of capacity thanks to the influx of alternative sources of capital, we think the adjustments called for in the House bill are perfectly reasonable.”

The industry also would be asked to increase its co-payment share of conventional terrorist attacks from the current 15 percent to 20 percent, while individual company deductibles would remain at 20 percent of prior year premiums in a particular line of business. The industry would be asked to repay taxpayers 150 percent of funds expended, up from 133 percent currently, up to a floating retention level calculated by adding the aggregate amount of individual company deductibles.
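As a rough illustration of the cost-sharing mechanics described above, the arithmetic for a single insurer can be sketched as follows. The rates match the bill as described (a deductible of 20 percent of prior-year premium, a 20 percent industry co-payment above it); the premium and loss figures are hypothetical, and the sketch ignores the program trigger, the $100 billion cap and taxpayer recoupment.

```python
# Simplified, hypothetical sketch of TRIA-style loss sharing for one
# insurer. Rates follow the bill as described in the text; the dollar
# figures are invented, and the program trigger, aggregate cap and
# recoupment are all ignored for clarity.

def federal_share(insured_loss, prior_year_premium,
                  deductible_rate=0.20, copay_rate=0.20):
    """Federal payment: the excess of the loss over the insurer's
    deductible, reduced by the industry co-payment share."""
    deductible = deductible_rate * prior_year_premium
    excess = max(0.0, insured_loss - deductible)
    return (1.0 - copay_rate) * excess

# An insurer with $1 billion in prior-year premium retains a $200 million
# deductible; of a $700 million loss, the $500 million excess is split
# 80/20 between the program and the insurer.
print(federal_share(700e6, 1e9))
```

Raising the co-payment from the current 15 percent to 20 percent, as the bill does, shrinks the federal share of that excess from 85 to 80 percent.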

Lehmann also praised a provision calling on the non-partisan U.S. Government Accountability Office to conduct a study on the feasibility of charging companies an upfront premium for TRIP’s reinsurance coverage.

“Much like the federal Riot Reinsurance Program of the 1970s, the way forward for federal terrorism reinsurance ultimately is to charge companies an actuarially adequate premium,” Lehmann said. “We can never know how much capacity the private reinsurance sector might be willing to commit to terrorism coverage so long as the government provides it for free.”

Is interactive design becoming unpatentable?

June 20, 2014, 8:00 AM

If 20th Century design was inspired by American architect Louis Sullivan’s 1896 pronouncement that “form ever follows function,” the key realization thus far of the 21st Century has been that this is merely a necessary – rather than a sufficient – condition for quality designs to flourish.

We have learned, and the market has confirmed, that an object should be designed in accordance not only with how it functions, but moreover with how it should function. Especially in the case of interactive technology, a category that has grown to encompass just about anything, an object should function the way its user expects it to function.

As technology has become more powerful and flexible, the task of matching function and expectations has undergone a change akin to the philosopher Immanuel Kant’s metaphorical Copernican Revolution. For older generations of technology – in which scarce resources limited both what functions were available and the maximum complexity of users’ commands – the steps necessary for users to extract and refine what they could do with a device were explained in thick manuals. The prevailing strategy for more recent generations of technology has been to meet users halfway, competing to efficiently perform functions and effectively implement concepts that users have been led to expect.

Today’s designs, however, are increasingly able to cut out the middleman, more and more closely conforming to their users’ preexisting intuitions and thought processes and less and less asking users to make those thought processes conform to products’ capabilities.

In other words, the key to success in modern interactive design does not lie in “creating” the best design possible. Rather, it begins with doing the best possible job of stripping designs down to concepts and procedures with which the user is already familiar, preferably through everyday use. Where there is no alternative but to require more input from a user, his or her options are laid out in terms the user already can be expected to know. While the fusion of design and utility has not yet been perfectly realized, industry has become more fully aware of both parts of this process and continues to pursue integration in earnest.

This coevolution of design standards and procedures has clashed, and continues to clash, with the structure of U.S. patent law. The first problem is the potential uncertainty that surrounds the scope and strength of a design patent’s protections. Even in the paradigm case of a design feature that has been aesthetically improved beyond what was required to give the feature its functional attributes, there remains the potential for overly broad claims about what aspects of a design qualify under the law as “ornamental.”

Under section 284 of the U.S. Code’s Title 35, triers of fact may award “non-statutory” damages for infringement of a design patent. But those same triers of fact also may err in determining how much of an object’s value comes from the aesthetic appeal of its ornamental features and how much comes from other sources of value, whether ornamental or functional, and whether patented or unpatented.

The risk of error at each stage of the process – from the initial design patent application to the ultimate test of infringement in court – creates at least some incentive for a designer to overstate his or her case. Fortunately, these incentives are similar to the temptations to make overly broad claims about other grounds for patentability. Regardless of what grounds are at issue, the remedy inevitably is better training for examiners and judges in traditional design standards and greater vigilance on their part about those standards’ application.

Ned Andrews

June 19, 2014, 4:53 PM

Ned Andrews is an assistant public defender with the Virginia Indigent Defense Commission and an associate fellow of the R Street Institute. He is a graduate of the University of Virginia School of Law, where he served on the managing board of the Journal of Law and Politics and the Virginia Journal of Law and Technology. He previously received a bachelor’s in philosophy from Yale University. Andrews was the 1994 Scripps National Spelling Bee champion and he is author of the 2011 book “A Champion’s Guide to Success in Spelling Bees: Fundamentals of Spelling Bee Competition and Preparation.”

The fight over TNCs in California

June 19, 2014, 4:39 PM

Legislation regulating so-called “transportation network companies” in California passed the state Senate Energy, Utilities and Commerce Committee by a unanimous 8-0 vote earlier this week, amid a row between the TNCs, insurance companies and traditional taxicabs.

This bill, A.B. 2293, stems from a tragic New Year’s Eve incident in San Francisco in which a six-year-old girl was struck and killed in a crosswalk by a driver who was an UberX contractor.

Under the bill, which now moves to the Senate Insurance Committee, TNCs would be required to provide primary insurance for any driver currently logged in to use their service. The measure codifies the California Public Utilities Commission’s proposed minimum of $1 million in coverage.

The measure is sponsored by Assemblywoman Susan Bonilla, D-Concord, who said she presented the bill to fill a “gap” in insurance coverage and consumer protections, create clear definitions of when commercial and personal insurance coverage is primary, and ensure all drivers are adequately covered during all periods of TNC services.

Bonilla’s legislation defines three distinct “periods” of TNC service, spanning the time from when a driver turns the application on until it is turned off.

  • Period 1: The driver turns the app on and waits for a passenger match
  • Period 2: A match is accepted, but the passenger is not yet picked up
  • Period 3: The passenger is in the vehicle
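The dispute over these periods can be summarized in a short sketch. The $1 million figure is the CPUC proposed minimum codified in the bill; the lower Period 1 amount under the TNCs’ position is purely a hypothetical placeholder, since the bill leaves that number contested.

```python
# Sketch of the A.B. 2293 coverage dispute by service period. The $1M
# minimum comes from the CPUC proposal; the lower Period 1 figure under
# the TNCs' position is a hypothetical placeholder, not from the bill.

PERIODS = {
    1: "app on, waiting for a passenger match",
    2: "match accepted, passenger not yet picked up",
    3: "passenger in the vehicle",
}

def required_coverage(period, position="insurers"):
    """Primary coverage a TNC would owe, per each side's position."""
    if period not in PERIODS:
        raise ValueError("TNC service is defined in three periods")
    if position == "insurers":
        return 1_000_000  # insurers: primary $1M in all three periods
    # TNC position: a merely logged-in driver (Period 1) is not "active,"
    # so some lower limit would apply -- 100_000 is an illustrative guess.
    return 1_000_000 if period >= 2 else 100_000
```

Framed this way, the legislative fight reduces to a single question: does the commercial obligation attach when the app is switched on, or only once a match is accepted?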

Insurance industry groups argue that the TNCs must be required to provide primary coverage for all three periods. For their part, the TNCs argue that during Period 1, a driver is not active and should not be required to carry the higher coverage standard demanded during the other two periods.

In addition to insurers, the bill also is supported by consumer attorneys, the California Airports Council and the San Francisco International Airport, who each argued for the measure on public safety grounds.

TNCs like Uber and Lyft opposed the $1 million minimum as too high and insisted the bill is anything but a compromise. Many Lyft and Uber drivers showed up to voice their opposition, most arguing that A.B. 2293 would shut down the TNCs and, with them, their livelihoods.

The other source of opposition came from taxicab associations, which decried the bill for codifying TNCs as distinct from cab companies and subject to different regulations.

Committee Chairman Alex Padilla, D-Los Angeles, expressed support for the measure overall, although he felt it should be subject to further negotiation. He said he feared requiring $1 million coverage during all periods might be too high, but felt that was a matter on which the insurance committee should rule.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

Despite setback in Senate, there’s no reason to give up on patent reform

June 19, 2014, 11:23 AM

The last few weeks have brought both good and bad news to supporters of patent reform looking to reduce system abuse.

Hopes for legislative action were dashed when a major bipartisan reform bill that enjoyed the endorsement of President Barack Obama was pulled from the Senate calendar.

Still, there are opportunities for the executive branch to intervene directly, and the courts have recently grown tougher on plaintiffs pursuing claims based on suspect patents or outright frivolous theories. The setback also provides an opportunity to expand the conversation to international trade.

A patent troll generally has one of two goals: to extract a dubious royalty payment or to block market entrance by a potential competitor.

In regard to the former, the troll often attacks a small business or start-up, claiming to hold the original patent on the product or process its target is selling. The start-up, lacking the resources for a long court fight, settles out of court because it’s the better of two bad options. The cost, nonetheless, is passed on to consumers in the form of higher prices.

This practice – albeit slightly different – has spilled over into international trade.

France Brevets, Taiwan’s Industrial Technology Research Institute, Innovation Network Corporation of Japan and South Korea’s Intellectual Discovery are all examples of state-sponsored patent pools. Over time, they have all accumulated thousands of patents, many for products that never made it to market. Their aim is to use weaknesses in patent law to favor companies within their own countries while taking legal action against foreign competitors.

Call this a latter-day version of protectionism: If a product from a foreign company threatens your domestic player, sue, ideally in a place where there are legal weaknesses to exploit.

That results in convoluted litigation such as ITRI’s infringement suit against South Korea’s LG Corp. (and its U.S. subsidiaries) in the U.S. District Court for the Eastern District of Texas – a preferred venue among trolls because it ranks among the highest in the United States in upholding patent claims. This is a further inducement for defendants to settle, even when they have a strong case that the suit is frivolous. And, at the end of the day, consumers pay.

With legislative patent reform dead for the time being, it seems for now the movement to curb this abuse will have to rely on the courts. This means slower movement toward general reform, as court cases often focus on one aspect of the wide range of patent law. But each new ruling in favor of defendants adds to the weight of case law and jurisprudence.

The U.S. Supreme Court dealt a setback to trolling in an early June ruling, when it vacated an appeals court decision in an infringement suit brought by Biosig Instruments against Nautilus Inc., an exercise equipment maker, over heart rate monitors. The Supreme Court said the appeals court, by disallowing only patents that were “insolubly ambiguous,” still left the door open for claims based on vague or indefinite specifications.

For trolls, who in lawsuits often try to stretch broad definitions and descriptions as far as possible, this is a setback. The Supreme Court essentially tightened the standard for claim definiteness beyond mere ambiguity, which will force plaintiffs to be more specific about the definitions and functions of a product or device to make an infringement charge stick.

As these and other court decisions add up, the attractiveness of the United States as a venue for patent trolling suits may diminish.

At the same time, the White House can be more assertive in condemning state-sponsored patent pooling. The practice is questionable under current trade agreements, as it involves governments taking an ownership stake in commercial intellectual property – a conflict of interest, since the same government regulates the markets in which those patents are asserted. The U.S. Trade Representative’s Office and the Department of Commerce need to be more vigilant in protesting these practices.

Legislation may be on hold for now, but that doesn’t mean pressure for patent law reform should stop. Otherwise trolls will continue to abuse weaknesses in patent law and be a drain on domestic and international economic growth.

R Street welcomes Supreme Court decision on patents

June 19, 2014, 10:05 AM

WASHINGTON (June 19, 2014) - The R Street Institute welcomed today’s unanimous decision by the Supreme Court in Alice Corp. v. CLS Bank International, which held that Alice Corp.’s patents, because they claimed only abstract ideas, are invalid.

“While this decision was narrower than some had hoped, in that it does not fundamentally overturn the notion that software is patentable, we are encouraged that the court expanded the reasoning it used in Bilski v. Kappos to rule a broader category of abstract ideas are not eligible for patent protection,” R Street Senior Fellow R.J. Lehmann said.

“The U.S. Constitution offers limited monopoly rights to inventors to provide incentive for progress in sciences and the useful arts, but too often in recent years, overly broad and inappropriately granted patents have been used as the basis of litigation that has choked the courts and stifled innovation,” Lehmann added. “This decision offers a step in the direction of reform, but comprehensive action by Congress is still needed to address these problems.”


Stroke risk same for menthol and nonmenthol smokers

June 18, 2014, 4:27 PM

Dr. Nicholas Vozoris in 2012 reported that “smokers of mentholated cigarettes, and in particular women and non–African Americans, have significantly increased odds of stroke compared with nonmentholated cigarette smokers.” He used National Health and Nutrition Examination Surveys (NHANES) data to calculate odds ratios of 2.3 for all menthol smokers, 3.3 for women and 3.5 for non-African Americans. Dr. Vozoris found no increased risk for hypertension, heart attack, congestive heart failure or emphysema among menthol smokers.

If the Vozoris findings on menthol and stroke were independently confirmed, they could serve as a basis for a ban on menthol cigarettes. Instead, an analysis of NHANES data by Dr. Brian Rostron comes to an entirely different conclusion.

Dr. Rostron found that African-American menthol smokers had a significantly lower risk for stroke (OR = 0.52, 95% confidence interval = 0.28 – 0.99) than nonmenthol smokers. Dr. Rostron writes:

It is not clear to me how Vozoris obtained his findings, given that I cannot replicate his general results for stroke using the NHANES data and analyses that he specified. Moreover, the absence of observed differences in stroke prevalence among NHANES menthol smokers would suggest that methodological or analytical issues may have affected his results.

Dr. Rostron earlier showed that menthol smokers have lower lung cancer risks than nonmenthol smokers.
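For readers unfamiliar with the notation, an odds ratio and its confidence interval are computed from a 2x2 exposure-by-outcome table. The sketch below uses the standard log-odds-ratio approximation with made-up counts; it is not a reconstruction of either author’s NHANES analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Made-up counts for illustration only. A CI that excludes 1.0 (as
# Rostron's 0.28-0.99 interval barely does) indicates a statistically
# significant difference at the 95% level.
or_, lo, hi = odds_ratio_ci(10, 90, 20, 80)
```

An upper bound of 0.99 sitting just below 1.0, as in the Rostron result, is exactly the kind of borderline finding that makes independent replication, rather than a single analysis, the right basis for policy.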

Promoted by the Archives of Internal Medicine, the Vozoris study was covered by major news media. The journal, which changed its name to JAMA Internal Medicine in January 2013, is not publicizing the study that corrects the deficient 2012 analysis.

Although medical journals cannot avoid publishing erroneous reports, editors should take all measures to correct the types of mistakes discovered by Rostron.

The dangerous proliferation of the ‘right to be forgotten’

June 18, 2014, 3:54 PM

Following on last month’s ruling by the European Court of Justice that search engines could be forced to take down links that individuals believe infringe their privacy rights, a new ruling from the Supreme Court of British Columbia has ordered Google to remove content not just from its Canadian site, but from all of its global web properties.

While the Canadian case deals with the sale of counterfeit products, rather than privacy, it relies on the same logic to make its sweeping censorship demands.

At the least, the case may confirm widespread fears that the ruling by Europe’s highest court, which applies even to links that are factual and in the public record, could spread a newfound “right to be forgotten” across the globe, opening the door for disgraced politicians, sex offenders and malpractice-burdened doctors to wipe their slates clean.

But in some ways, the British Columbia decision poses a threat even more dangerous than the fear that courts would pave the way for a fragmented, balkanized Web and lay the foundation for what the Wall Street Journal editorial board warned would be “an Internet with borders.” The Canadian case arguably establishes a precedent of national and even local courts handing down dictums that affect freedom of speech around the entire world.

University of Ottawa law professor Michael Geist explains on his blog:

The ruling in Equustek Solutions Inc. v. Jack is unusual since its reach extends far beyond Canada. Rather than ordering the company to remove certain links from the search results available through Google.ca, the order intentionally targets the entire database, requiring the company to ensure that no one, anywhere in the world, can see the search results. Note that this differs from the European right to be forgotten ruling, which is limited to Europe.

The Canadian court’s ruling favorably cites the ECJ ruling in Google v. González, which I wrote about in detail here, and relies heavily on it in its effort to assert jurisdiction. The court reasoned that, because Google does business in Canada by advertising there, the Canadian court system has jurisdiction over Google’s global search results.

Of course, the court isn’t simply asking Google to remove specific links. Having determined that “deletion of individual URLs is ineffective” like “an endless game of ‘whac-a-mole,’” the court instructed Google to prevent similar content from showing up anywhere for any reason — indefinitely.

But what happens when a takedown order in one country conflicts with the law in another? This happened in the French court case of Yahoo! Inc. v. LICRA. The case concerned the sale of Nazi artifacts on Yahoo’s auction site, which ran afoul of a French law banning the display of Nazi paraphernalia.

When the French court made the bold claim that it had authority over Yahoo’s servers in the United States, Yahoo asked a U.S. court to block the ruling on grounds it conflicted with its First Amendment rights and to “confirm that a non-U.S. court does not have the authority to tell a U.S. company how to operate.”

While the U.S. court agreed with Yahoo, the company also had substantial business activities in France that could be brought to a halt if it didn’t comply. The end result was that Yahoo banned the sale of these objects, even though the French court’s decision was ultimately reversed on other grounds.

This case foreshadows some of the strange scenarios and chilling effects that could emerge when regional courts claim jurisdiction over the whole Internet. Geist imagines a few of these:

The implications are enormous since if a Canadian court has the power to limit access to information for the globe, presumably other courts would as well. While the court does not grapple with this possibility, what happens if a Russian court orders Google to remove gay and lesbian sites from its database? Or if Iran orders it remove Israeli sites from the database? The possibilities are endless since local rules of freedom of expression often differ from country to country.

As western democracies find ways to limit content online, it gives more heavy-handed governments like Russia an excuse to jump on the global Internet censorship bandwagon.

The Russian Public Chamber has already submitted a recommendation to the Russian Parliament calling for the introduction of a right to be forgotten that would affect not only Russian search engines, but also foreign ones like Google and Yahoo. Countries like China and Korea are also seeking to assert their right to censor the global Internet.

It’s hard to imagine why Internet censorship would be the first, best option in any legal dispute, particularly where other legal remedies seem obvious.

People depend on search engines like Google and Yahoo to be an accurate and open gateway to the web. Allowing any country to leverage privacy or other legal claims to limit search engine content globally will leave us with the lowest common denominator for free speech rights. And that could do a lot of damage to our free and open Internet.

Four other ethnic groups who could challenge pro team trademarks

June 18, 2014, 2:35 PM

Today’s decision by the U.S. Patent and Trademark Office’s Trademark Trial and Appeal Board to cancel six of the Washington Redskins’ trademark registrations on grounds that they are “disparaging to Native Americans” does more than just raise the temperature in a nasty public relations fight between owner Daniel Snyder and various public officials (Senate Majority Leader Harry Reid perhaps most notable among them).

The ruling could also set the stage for other ethnic and national groups to challenge long-standing professional sports trademarks that heretofore have largely flown under the radar.

The fight over the Redskins’ name, which some take to be a racially derogatory slur, is just the most high-profile of what has been a decades-long battle over team mascots. Understandably, given how common Native American names and imagery have been in the American sports landscape, those have been the primary focus, going back to when Stanford and Syracuse universities both dropped their Indian mascots in the 1970s.

The issue truly came to the forefront at the college level in the early 1990s (at least one version of the Redskins trademark case, similarly, dates from 1992). In the past quarter-century, NCAA colleges that have shifted away from Indian-related names include Eastern Michigan, Marquette, St. John’s, St. Bonaventure, Miami of Ohio, Quinnipiac and, most recently and perhaps most contentiously, the University of North Dakota.

Of course, many Native American names and mascots persist. At the pro level, in addition to the Redskins, the National Football League also has the Kansas City Chiefs. In addition, there’s Major League Baseball’s Cleveland Indians and Atlanta Braves, the National Hockey League’s Chicago Blackhawks and the National Basketball Association’s Golden State Warriors. The teams differ in the degrees to which they continue to employ iconography that evokes Native American stereotypes, ranging from the Indians’ highly questionable, grinning, red-faced “Chief Wahoo” logo to the Warriors, which essentially dropped any Indian symbolism way back in 1971.

But Indians are not the only ethnic group that has been made into team mascots. Indeed, the practice was once shockingly common. Unsurprisingly, Negro League teams were filled with references to their players’ skin color, with myriad “black” variants of other established team names: Black Barons, Black Bears, Black Bronchos, Black Caps, Black Colonels, Black Eagles, Black Lookouts, Black Senators, Black Sox, Black Spiders, Black Tyrites, Black Yankees and, of course, in a double-dose of offensiveness, Black Indians.

Then there were the Negro League team names that went above and beyond, into new levels of cringe-inducing tastelessness, like the Colored House of David, the New York Cubans, the Jersey City Colored Athletics, the Ethiopian Clowns, the Zulu Cannibal Giants and, perhaps with a hint of irony, the Atlanta Black Crackers.

In basketball’s pre-NBA days, one of the dominant teams was the all-Jewish Philadelphia “Sphas,” which took its name from the South Philadelphia Hebrew Association. The Sphas, who generally were known colloquially as the Philadelphia Hebrews, played in a pro and semi-pro circuit that included a number of all-Jewish teams, including the perhaps even more questionably named Cleveland Rosenblums.

The era of naming teams for ethnic groups has, by and large, faded into an increasingly embarrassing past. It’s safe to say the Anti-Defamation League wouldn’t cotton to a new team called the “Rosenblums,” to say nothing of what the NAACP would think of the “Zulu Cannibal Giants.” But there are still some teams in the major professional North American sports whose monikers originated as ethnic terms. They mostly go unnoticed due either to the obscurity of the term or to the apathy (or even positive embrace) of the affected groups. Nonetheless, in the wake of the Redskins decision, here are four ethnic groups who could conceivably earn their day in court.

The Irish: The best-known and perhaps most obvious of ethnic-related pro sports names happens to belong to the NBA’s most storied franchise, the Boston Celtics. According to possibly apocryphal team lore, founder Walter Brown insisted upon the name, over the objections of public relations staff, declaring: “Boston is full of Irishmen. We’ll put them in green uniforms and call them the Boston Celtics!”

The team’s logo is a slightly watered-down caricature of longstanding stereotypes perpetuated by bigoted anti-Catholic Irish bashers, from the shillelagh to the clog shoes to the derby hat to the pipe and potbelly and leering wink. All that’s missing from the equation are implications that the character is a violent alcoholic, perhaps because that would run too close to violating the trademark of college football’s most-celebrated team, the University of Notre Dame Fightin’ Irish.

The Dutch: The New York Knickerbockers, better known as the Knicks, trace their name to a term used to describe the descendants of the original Dutch colony of Nieuw-Nederlandt, whose progeny would eventually include the American presidents Martin Van Buren and Franklin and Teddy Roosevelt. The term was popularized by Washington Irving’s satirical 1809 novel “A History of New York from the Beginning of the World to the End of the Dutch Dynasty,” written under the pseudonym Diedrich Knickerbocker.

Though there is evidence the term was actually drawn from a real Knickerbacker family (originally Kinnekerbacker), the popular image of the Knickerbocker became inextricably tied with knickered pants that roll up at the knee. The stereotyped character of “Father Knickerbocker,” with his knickers, cotton wig, tricorn hat and buckled shoes, persisted into the 20th Century. Indeed, for their first 18 years of existence, the Knicks’ logo was Father Knickerbocker dribbling a basketball.

English-Americans: It can be argued that the name of the Knicks’ cross-town compatriots, the New York Yankees, is actually the direct inverse of Knickerbocker. Though the precise origin of the term Yankee remains under dispute, by 1758, it was recorded as a slur that native Brits would use to describe their cousins in New England.

Most linguists agree that Yankee is likely of Dutch origin, noting its similarity to the Dutch diminutive name “Janke,” or “Johnny,” and would have been used by the New York Dutch to describe English settlers in neighboring Connecticut. H.L. Mencken famously argued the name was derived from the Dutch Jan Kaas, literally meaning “John Cheese.” Basically, Mencken’s claim was that the term originated as a slur the Dutch used to call the English a bunch of “cheese heads.”

Canadians: Among the National Hockey League’s six Canadian-based teams, there are not one, but two teams named for ethnic terms for our friends from the Great White North: the Vancouver Canucks and the Montreal Canadiens. Though both terms are generally used today to describe pretty much any resident of Canada, they both originated more specifically as terms to describe French-Canadians. The term “Canadiens” predates the nation of Canada, and originally referred to any French-speaking North American.

Canuck is of more recent vintage, and its precise origins (like those of Yankee) are hotly disputed. What is in less dispute is that Canuck long has been used as a slur. Its first published appearance with its current spelling came in 1849, in James Edward Alexander’s “L’Acadie,” to describe “a lusty fellow in a forest road with a keg of whisky slung round him.” By the turn of the 20th Century, the character of Johnny Canuck had emerged as a rival to Uncle Sam, except portrayed as a dim-witted lumberjack. Johnny Canuck (complete with toque, the stereotypical national hat) also served as the Canucks’ original mascot.

Meanwhile, if it turns out that Washington’s football team does need to change its name, the Washington Gerrymanders sounds about right to me.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.

R Street encouraged by House bill on terrorism insurance

June 17, 2014, 10:34 AM

WASHINGTON (June 12, 2014) - The R Street Institute welcomes today’s introduction of the TRIA Reform Act of 2014, legislation that would make significant, positive changes to the federal Terrorism Risk Insurance Program.

Sponsored by Rep. Randy Neugebauer, R-Texas, the bill would extend the program for five years, but with greater industry liability and less involvement on the part of the taxpayers. The House Financial Services Committee has scheduled a mark-up session for the measure the morning of June 19.

The bill would raise the program trigger for all events other than nuclear, biological, radiological or chemical (NBCR) attacks from $100 million to $500 million by the end of 2019. It also would increase the rate at which taxpayers are repaid for federal outlays from 133 percent to 150 percent. The bill addresses insurance industry concerns by establishing a fixed, 90-day timeline for certification of acts of terrorism.

“This legislation, while it doesn’t go as far as we may have liked, is an improvement over the Senate bill in terms of shifting more risk from taxpayers onto the private sector,” said R.J. Lehmann, R Street senior fellow. “Among other items, it would raise the trigger level for conventional terrorism events to $500 million. These are the sorts of events and the levels of coverage for which private reinsurance is already available, and this change will encourage more firms to enter the market.”

The Senate Banking Committee passed its version of the legislation on June 4; that bill would extend the program for seven years. Both bills reduce the federal share of losses from 85 percent to 80 percent.

Lehmann cautioned that even more changes would benefit the program and taxpayers considerably.

“As the House moves forward to consider this legislation, we still would encourage members to consider other changes. In particular, we feel the Terrorism Risk Insurance Program could withdraw coverage for commercial liability insurance,” he said. “As it stands, current policy in that area amounts to a federal subsidy for firms to be reckless in their preparation for terrorism events.”