Out of the Storm News
The United States faces severe earthquake risk in many areas of the country, yet consumers routinely choose not to purchase insurance products to cover this risk. Low earthquake insurance take-up rates create a scenario in which a major event could result in significant mortgage defaults. The problem is real and serious, although understanding its precise magnitude will require more research. However, there is ample reason to believe the insurance and reinsurance markets are sufficiently well-capitalized to address the issue. Confronting the risk of mortgage default will require changes to product offerings, mitigation efforts and mortgage-loan underwriting standards.
There’s an old saying in business: you get what you pay for. Yet all too often, government policy seems based on the hope that you can make costs go away by shifting them to someone else.
Take Texas windstorm insurance. The state-run Texas Windstorm Insurance Association provides cut-rate insurance against wind damage from hurricanes and other storms. While originally justified as a provider of last resort for families and businesses who couldn’t get insurance in the private market, TWIA now covers approximately 60 percent of residents in a 14-county Texas coastal region.
But TWIA’s artificially low insurance rates come at a cost. Without actuarially sound rates, TWIA risks being unable to pay out claims when Texas is next hit by a major storm. Such concerns aren’t hypothetical. In 2013, TWIA briefly considered going into receivership in order to stem the tide of claims made after 2008’s Hurricane Ike. Since then, TWIA has considered a number of alternatives to help put the insurer on a firmer financial footing. And even though TWIA’s financial position has improved somewhat, it still has $77 billion in liability.
Inevitably, when an organization like TWIA doesn’t charge enough to meet its liabilities, it has to find the money from somewhere else. Recently, the Texas Department of Insurance issued rules providing for a surcharge on property and auto insurance policies in the coastal region if other sources of funding for TWIA are exhausted.
The new rules have drawn criticism from coastal residents, on grounds that the surcharge would apply only in the 14-county coastal region, rather than in the whole state. Yet it’s hard to see why residents of Lubbock should be required to subsidize windstorm insurance for folks with beachfront property along the Gulf Coast. The real problem with the surcharge is not that it applies to too few people, but that it applies to too many. Under the rules, coastal residents who maintain private windstorm insurance would end up paying to bail out TWIA along with everyone else.
The Business and Commerce Committee of the Texas Senate is currently looking at ways to reform TWIA. Instead of using indirect and complicated mechanisms to shift costs around, TWIA’s funding problem should be solved by resetting premium rates on an actuarially sound basis. That will help to restore not only TWIA’s long-run viability, but also its status as a true provider of last resort. Artificially low insurance rates may seem appealing, but ultimately, the only way for Texans to ensure that windstorm insurance will be there for them when they need it is for them to pay for it.
This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.
The American public often rails about bureaucracy. It is not difficult to fathom why. Who among us has not fumed while standing in a long line at an understaffed post office? And how many of us have thrown up our hands in frustration at the complexity of income tax instructions and outsourced the work to an accountant?
The public tends to explain bureaucratic behaviors by attributing ill motives to the bureaucrats. Civil servants, they allege, are arrogant and lazy. Scholars, such as the late James Q. Wilson, have provided us with social scientific evidence of what many individuals suspect: bureaucracies, especially government ones, tend to be slow to perform tasks, resist change and frequently creep beyond their missions.
But the blame should not be attributed to bad bureaucrats. Rather, research indicates that most of the problems spring from the very nature of government bureaucracy.
Agencies cannot run like businesses because they cannot do what private sector entities do: choose their lines of business and organize themselves accordingly. Instead, bureaucracies’ work is assigned through legislation, usually enacted over decades. The result is a progressive layering of policy duties, which often conflict with one another. And elected officials also tend to impose operational constraints on bureaucracies. For example, instead of allowing agencies to hire whomever they think is best for the job and pay them accordingly, elected officials force bureaucracies to follow byzantine hiring practices and dictate the permissible compensation packages. Thus it is that bureaucracies, as Wilson observed, tend “to be driven by the constraints” on them rather than “the tasks of the organization.”
And thanks to Adam Eckerd, a professor at Virginia Tech, we know there is an additional reason that bureaucracies get the stink-eye from the public: some of them are not good listeners. Eckerd looked at three federal agencies that recently embarked on significant public works. As required by the National Environmental Policy Act of 1969, the agencies prepared environmental impact statements in consultation with the public. Eckerd analyzed the comments submitted by the public and the agencies’ responses.
The results of the study, published in the latest issue of Public Administration Review, are dispiriting. Eckerd found little evidence of “meaningful dialogue” between agencies and the public. The two sides talked past one another, especially on the subject of proposed projects’ risk to the environment. “[P]ublic managers tend to take a more aggregate and technical view that risk is something to manage, while citizens focus on risks specific to themselves, consider the fairness of the distribution of risk, and come from a viewpoint that risk is best avoided.”
Eckerd further observed that “the administrators involved in the three cases were usually technical or project management specialists who were well versed in the details of the particular projects but likely had no training in public relations or political engagement.” Hence, agencies’ responses were often tin-eared, merely acknowledging receipt of the comments.
An Environmental Protection Agency administrator, who reviewed Eckerd’s study, wrote a response that was published in the same issue of the journal. “My experience,” he wrote, “bears out Eckerd’s conclusion that citizen involvement often has little impact in government decision making.” The administrator further validated the study’s findings by unabashedly proclaiming, “Regulatory agencies are charged with the informed, expert implementation of their organic statutes and resultant regulations…. The contrary attitude [amongst the public] stems from a lack of sophistication about the underlying technical issues, or… a predisposition to object to any proposal on the grounds that it risks changes to the status quo.”
These findings are distressing. Most agencies have public comment policies. These policies are intended to improve bureaucracies’ decision-making by providing them with additional information. The adversarial process also forces agencies to think twice about what they propose doing. But public input serves an additional critical purpose: fostering a sense of democratic legitimacy. Bureaucrats are unelected and often tenured for life. As such, they are inherently suspect to Americans. So getting public comment right is critical to having their exercise of authority accepted as legitimate by the public.
For the citizen feeling unheard, Congress is the place to turn. It created bureaucracies and funds them. Congress, accordingly, should make it a regular part of oversight to direct agencies to review their public input processes. And agencies should ask themselves, “Were I John Q. Public, would I feel my voice has been heard and taken seriously?”
The R Street Institute, a non-profit, free market think tank with offices in Washington, D.C., Florida, California, Ohio, and Texas, is ramping up its efforts in the Lone Star State, hiring a new Texas state director and setting its sights on an ambitious agenda for the 2015 legislative session that includes insurance reform, environmental and energy issues, and the interaction between regulation and newly emerging technologies, among other priorities.
Breitbart Texas conducted exclusive interviews with R Street President Eli Lehrer, Florida State Director Christian Cámara, and the newly hired Texas State Director, Josiah Neeley, who joins R Street from the Texas Public Policy Foundation’s Center for Energy and the Environment, where he was a policy analyst. “I’m thrilled to be joining R Street,” Neeley told Breitbart Texas. “I’ve followed the organization since its founding a few years ago, and appreciate its fresh conservative approach to policy.”
According to Lehrer, R Street has had a history in Texas, both as R Street and through its predecessor organization, the Heartland Institute, working mostly in the area of insurance reform, which he described as its “bread and butter” issue. Cámara discussed the political similarities between Texas and his state of Florida, both being “generally business-friendly” and facing similar challenges from risks like hurricanes and tornadoes, while noting there is room for improvement in the regulatory systems of both states. Florida’s term-limited legislature also causes some frustration for policy work in a complicated field like insurance, with Florida’s legislators limited to eight consecutive years in either the House or Senate. Although a number of legislators do run for a Senate seat after serving their full House term, or return after taking a one-term break (which restarts the term-limit clock), the end result, according to Cámara, is that the Texas legislature is “much more knowledgeable” and “not as clueless” on insurance issues.
Another key difference is that although both states face risk of hurricane damage along their coasts, the insurance issue is more regional in Texas, partly due to different political dynamics. Whereas in Florida, the Republicans currently have a lock on the Legislature and the statewide offices are up for grabs (polls show a neck-and-neck race between current Republican Governor Rick Scott and former Republican/former Independent/current Democrat and former Governor Charlie Crist), in Texas, the statewide races are looking like cakewalks for the Republican candidates and the legislative races are more variable.
A significant part of R Street’s efforts in the upcoming legislative session will be expanding their environmental work. Lehrer characterized their goal as to show that the conservative side “can be greener,” quoting Barry Goldwater that “there’s nothing more conservative than conservation.”
Neeley will start at R Street in early November, and his background in environmental policy work made him a good fit for R Street’s strategy. “Texas is blessed with an abundance of energy resources, and R Street has shown that you can protect the environment in a conservative way,” said Neeley. “The markets are the best means of environmental protection. You can protect the environment in a way that doesn’t threaten the energy lifeblood of the Texas economy.”
Another area where R Street will focus its energies is on the “sharing economy,” newly developing technologies like Uber, Lyft, Bitcoin, HomeAway, Airbnb, and so on. There is a “huge opportunity made possible by the internet to remove intermediaries from commerce and unlock otherwise dormant capital,” said Lehrer, and these technologies demonstrate the power of the “disintermediation of commerce” and “capital unlocking.” Lehrer noted how the rapid growth of ride sharing companies like Uber and Lyft show that there’s “no need for a taxi company between a driver and the customer,” and also pointed to HomeAway, the country’s second largest space sharing company, based here in Austin.
What makes R Street interested in these businesses is determining the appropriate role of regulation. “There’s no reason that renting out a room [through Airbnb or HomeAway] should be a tax-free transaction,” said Lehrer. “The question is, should these activities be allowed and under what circumstances?” Local and state governments have struggled with how to approach this question, and oftentimes end up pursuing reactionary regulation that many customers of these businesses view as overly restrictive. The California legislature has debated bills that would significantly increase the costs for Uber and Lyft, district attorneys in Los Angeles and San Francisco have labeled them “a continuing threat to consumers and the public,” and the Houston City Council voted to allow them to operate, but with restrictions. As Breitbart Texas previously reported, Dallas will take up the issue later this fall.
Lehrer told Breitbart Texas that getting involved in these issues presents a “golden opportunity for the political right to attract people who otherwise are culturally not connected to the right…this whole sharing economy is an enormous opportunity for conservatives.” Neeley echoed this sentiment, characterizing the situation as one “where 21st century technologies [are] butting up against 20th century regulatory systems.” Lehrer also observed that what these companies — and their customers — want as far as a regulatory environment goes is “very much aligned with the right,” but it may take some adjustment on the part of conservative elected officials, who “need a modern culturally relevant approach to conservatism.” The key, said Lehrer, is to find a way where they are “not giving up any core principles of conservatism, but rather applying them,” looking past the pink-mustachioed Lyft cars and their fist-bumping drivers, and seeing the opportunities for economic development and innovation.
Swedish researchers from several institutions document that snus use is not associated with atrial fibrillation (commonly known as AFib), the most common heart arrhythmia (irregular heartbeat) and a risk factor for stroke. The same group previously reported that snus use conferred no significant risk for heart attack and stroke.
Led by Maria-Pia Hergens, researchers analyzed data on Swedish men who were subjects in several studies. While snus users showed no elevated risk of AFib, smokers had a small but significantly elevated risk (hazard ratio = 1.16; 95% confidence interval = 1.01-1.33).
Although smokeless tobacco cannot be proven to be absolutely safe, this study adds important evidence that any cardiovascular effect is very minor.
Given the number of institutions represented by its authors, the report is an important development for tobacco harm reduction. In contrast are the biased 1990s and 2000s studies from the Karolinska Institute. There, a small group of KI researchers had access to the construction workers’ cohort and refused to share the data. Instead, they manipulated the information to fabricate some health risks and amplify others in snus users, a fact which I documented in numerous blog posts (examples here, here, and here) and in letters to journal editors.
Too many Americans suffer from inadequate access to health care. The political left offers to construct government programs or expand the ones we already have.
America’s education system is falling behind its global counterparts. Liberals suggest that we send more tax dollars to the same government systems to finally get it right.
We need to protect our environment and steward our natural resources. The left’s first inclination is to cede state authority to federal regulators hundreds or thousands of miles away.
Government management is the liberal answer to the right’s free marketplace. The left rails against the corporate tycoon as out of touch with the common man and the uncertainties of the market as too unreliable to meet the needs of the poor and uneducated.
At first blush, it actually seems to make sense. If capitalism produces an every-man-for-himself society subject to the uncaring and ever-changing laws of supply and demand, government control feels like a natural safeguard. The left holds government out as the remedy to the wrongs our free society creates and a source of stability in a tumultuous economy.
First, dispense with the notion that the political left is somehow run by the “common man.” The same elitism the left vilifies in the corporate boardroom is a virtual requirement in the upper echelons of government. In spite of the fact that the left is run by their own elite, they profess that their system of control produces better outcomes for society’s poor and vulnerable. In other words, they are elitists who care.
Again, the narrative feels good and the left spins it well. Americans pay their taxes and government addresses poverty, education, healthcare, and the environment and protects us from a host of bad things, like Ebola and terrorism. For the liberal, more money to the government means more of the good things and less of the bad things.
The mythic aspiration for government is shockingly disconnected from reality. The main problem is that the government behaves more like a slot machine than a vending machine. Buying more of it does not guarantee better outcomes. It is far from stable and equally unpredictable at producing the desired outcomes for those the left hopes to aid.
Sometimes good policies and practices actually take shape. The Clean Water Act is a perfect example. In many respects, the original legislation worked as intended. Constraining the ability of industry to dump toxic waste into our national waters cleaned up the environment considerably.
Other times, and particularly recently, the results are shockingly incompetent. Consider the Centers for Disease Control & Prevention’s response to the Ebola outbreak. The CDC appears to have offered guidance that could have actually helped spread Ebola by assuring a nurse, now being treated for Ebola, that she could fly after coming into contact with an infected patient. Problems with the Affordable Care Act website, issues at the IRS and failures at the Secret Service are other troubling examples.
The danger of the political left’s myth of government provision is that it creates the feeling that we are solving society’s problems without much accountability for results. We continue to suffer the effects of poverty, declining education outcomes, and high costs of healthcare even as government has grown.
The myth operates as a soothing balm for the liberal conscience. If we pay our taxes, we need not worry too much about society’s most vulnerable. Government will take care of them…except that it often does not.
Capitalism will not cure all of society’s ailments, but it was never designed to do so. It will only ever be as good and altruistic as the people participating in it. The left has responded with a myth that government will effectively and predictably fill in the gaps where the marketplace fails. The question we must answer is whether we would rather feel better believing the myth or face the reality that government might actually be more elitist and unpredictable than the free marketplace liberals find so inadequate.
The U.S. labor market may finally be picking up steam, but when it comes to our structural labor problems, big questions about stagnant wages and underemployed workers remain.
Will labor market tightening do anything to accelerate wage growth for middle- and lower-class workers? Will it ease the difficulties manufacturers and other skills-based employers face in finding an adequately trained workforce? Almost certainly not.
While today’s college-educated workers stand ready to step in to managerial and professional roles, those with only a high school degree or some college frequently lack the skills to provide adequate value to employers. The result has been to accelerate the shift toward mechanization, further decreasing the opportunities for those who need them most.
Whatever answer policy-makers pursue will require big changes in how we think about providing services and training to those in need. Countless important battles are being fought at the state and local level over the future of K-12 education. Many more are being fought over higher education – the rise of student loan debt; the high drop-out rate, particularly for those in community college; the questionable value provided by many institutions and degrees.
But one crucial piece of the puzzle has been missing. As Tamar Jacoby pointed out in The Atlantic last week, the United States fails to provide worker training for many roles in manufacturing and other traditional blue-collar trades. Offering this sort of training used to be the role of unions. Their decline during the last half-century, combined with the push for college-for-all, has left the country with a skills gap.
Jacoby’s piece highlights the success of the German model, where the national government, labor and businesses have collaborated to create a series of standardized apprenticeship programs, beginning at age ten. They provide workers with a combination of traditional education; skills needed for a specific industry; and the further intangible skills learned through experience on the job.
Jacoby worries that such a model would never work here. She highlights as sticking points the Germans’ top-down, standardized approach and the coordination between business and labor. But a first question is whether the German approach would be either necessary or desirable.
America already is home to a plethora of local job-training programs – public, private and somewhere in between. For example, after moving its facilities to South Carolina, Boeing worked with local technical schools through the state’s ReadySC program to take work-ready locals and teach them skills to work in Boeing facilities. As the aerospace industry moves more jobs to the South, many of the incentive packages offered by states include provisions for worker training, including a $52 million facility in Alabama where workers are trained at state expense; $136 million in North Carolina; and $30 million from South Carolina.
Many partnerships between manufacturers and local job boards and state governments look to prepare high-school educated workers for positions in local plants. Some, such as the Wisconsin Regional Training Partnership, go even further. The WRTP allows even those with only a 10th grade equivalency to access the program, which provides participants a 27 percent wage premium two years after graduation.
Career Academies start even younger, taking high school students and putting them to work in local industry. Graduates receive an earnings premium of 11 percent over non-participants. Unfortunately, programs like these tend to be the “alternative” for at-risk students, rather than a path prized on its own merits.
These sorts of partnerships provide incentive for industry to see their workers as investments. Indeed, the German employers and educators Jacoby spoke with highlighted this as the most important aspect of the model. Being able to look, in their words, “beyond ROI” and see the longer-term benefits for the company is difficult in today’s corporate culture, but necessary if we want to close the skills gap.
To be sure, this piecemeal approach has flaws. Large areas are left unaddressed and the lack of national consensus allows the myth that college is the answer for all to persist. But it also allows experimentation, flexibility and local response to changing circumstances. Given the backdrop of rapid technological change, innovation in education is a key goal that tends to be stifled by top-down national programs.
The last thing the United States needs is to turn worker training into the adult version of K-12 education, where by most accounts we continue to train students in subpar ways for a world that no longer exists. Training programs need to stay nimble and businesses must have the right incentives to respond to changing circumstances.
It would be impractical and unwise to expect business to shoulder all of the burden of this kind of training, given the free-rider problems that inevitably crop up when workers take their skills to the competition. Rather than pursuing a German-style, top-down model, what we need is a bottom-up movement to take worker training seriously. We need to invest in our workforce, particularly those who have been hurt by the dramatic shifts in the need for skilled workers. Federal funds for worker training should flow to these programs, but the federal role should stop there.
Even if a national program were desirable, the political will to create one is close to nil. It is a large leap for conservatives to get on board with such an idea. However, many conservative politicians are looking for solutions for the poor and middle class, and redesigning our federal funding to support local programs such as those in Wisconsin or South Carolina would be a step in that direction.
Americans need to get past the idea that it’s college or bust. For too many twenty-somethings, the “bust” has left them at a competitive disadvantage in the international labor market, and without a clear path forward. The average annual salary for an aerospace machinist in Charleston, S.C. is $45,500, almost 200 percent of the poverty line for a family of four. As Boeing’s General Manager Jack Jones said in a 2013 interview:
It’s a prestige job. And you say you’re from Boeing or you work at Boeing, and there’s a prestige within the community.
Here are five potential consequences of Republicans capturing the Senate majority in the upcoming elections:
- Democrats’ only tool to stop GOP-crafted legislation from landing on President Obama’s desk will be their refusal to end debate on legislation. Senate Majority Leader Harry Reid, D-Nev., has largely shielded President Barack Obama by declining to hold votes on House-passed legislation. In a GOP-controlled Senate, Democrats might try to protect the president by blocking cloture, but doing so will force more moderate Democrats to take difficult political votes.
- When Democrats are unable to derail legislation in the Senate, Republicans will likely force the president to repeatedly veto legislation or compromise on signature political victories for Democrats. The optics of passing legislation through the House and Senate only to have it constantly vetoed by the president could give Republicans a powerful political narrative going into 2016.
- Republican control over the House and Senate will also change the battle over federal spending. Currently, the split in the House and the Senate has short-circuited the normal appropriations process. Republicans could restore the “normal” appropriations measures and break the funding of the federal government up into 12 separate appropriations bills. This would enable them to use the funding process to target some of the president’s executive actions on issues like immigration and energy policy without threatening a full government shutdown.
- While Republican majorities might harass a veto-happy President Obama for obstructing the legislative process, Senate Republicans would be able to stand in the way of his political appointments. If Republicans are successful at the polls, President Obama will likely push to have his next attorney general confirmed in December, before the new Congress arrives in January. Republicans could then force President Obama to moderate his subsequent appointments, including important federal judgeships.
- As unlikely as it seems, a Senate and House in Republican hands could be a recipe for tactical compromise. Whether it is the Keystone XL oil pipeline or corporate tax reforms, both Democrats and Republicans may be looking to show their ability to work across the aisle to a nation fed up with partisan political gridlock. Republicans have their eye on the White House in 2016, and Democrats are optimistic about their chances of retaking the Senate should it fall to Republicans in 2014.
Republicans still have a long way to go to turn an increasing political probability into a reality, but their success could create an interesting new political dynamic in Washington.
The latest wrinkle in the ongoing debate over network neutrality comes from Rep. Henry Waxman, D-Calif., ranking member of the House Committee on Energy and Commerce, who suggests the Federal Communications Commission adopt a “third way” in regulating the way last-mile Internet service providers like Comcast, AT&T and Verizon handle traffic from major video content providers like Netflix, Amazon and Google’s YouTube.
Waxman’s proposal was detailed in a filing in the FCC’s latest inquiry into ways it can effectively enforce network neutrality. While current FCC net neutrality guidelines allow ISPs the freedom to take “reasonable” steps to prioritize or manage traffic to improve quality or protect other applications, Waxman is among those who want the government to take a larger, more active role in managing the Internet ecosystem. Waxman states he wants a “bright-line” rule against any ISP blocking, throttling or paid prioritization.
Waxman agrees with many activists who want the FCC to change the regulatory classification of ISPs as written in the Telecommunications Act of 1996.
Regulations for ISPs are spelled out under Title I of the act. Here, with the intention of fostering a growing industry, Congress intentionally kept regulation minimal and the compliance burden low. Title II of the act spells out regulations for older telephone companies and derives heavily from the 1930s-era monopoly regime that ended 30 years ago. Dated as it is today, the Telecom Act was a bipartisan effort, spurred by the arrival of both the Internet and wireless phone service, to transition the telecom industry from monopoly to competitive business.
Waxman is among those who want to reverse this 30-year course, and reclassify competitive wireline and wireless ISPs as Title II common carriers subject to a host of rules and regulations on rates and services.
Trouble is, even those who favor network neutrality are divided as to whether reclassification is the correct remedy. Even as he asks for public comment, FCC Chairman Thomas Wheeler has expressed some reservations about reclassification. Wheeler has stated “we don’t want to put rules in place that would dis-incentivize companies from making…continued investment.”
His predecessor, Julius Genachowski, had similar reservations; his own “third way” stopped short of this step. Both might worry that reclassification would impose egregious pricing and service conditions on ISPs that would be counterproductive in the long run. Others have pointed out that despite all the new rules reclassification would add, it would not prohibit paid prioritization. The U.S. Post Office, after all, is a regulated common carrier, yet it can offer Express Mail.
Waxman’s solution — boiled down — is to trust the government to restrain itself. Waxman points to another part of the Telecom Act, Section 706, and suggests that it be woven into the reclassification.
Section 706 is something of an elastic clause in the Telecom Act. It allows the FCC to investigate whether advanced communications technologies — the definition of which includes the Internet — are being deployed in a reasonable and timely manner, and to “take immediate action” to remedy any perceived problem.
What Waxman hopes is that the FCC can reclassify ISPs as Title II carriers, then use Section 706 to ban ISP prioritization, but disregard, or forbear, all of the other Title II requirements. Not only does this open all sorts of legal questions, beginning with whether the FCC can alter definitions set by congressional legislation, it also counts on the FCC to deliberately restrain itself from using the additional regulatory power Waxman’s proposal would give it.
Given the attempts the FCC has already made to expand its purview, Waxman is asking Americans to take a big leap of faith. Through the past two administrations, the FCC has continually sought to increase its regulatory scope beyond broadcasting and into cable TV and Internet. Just in the past week Wheeler has suggested the FCC regulate subscription-based Internet video services and use its indecency rules to stop sports reporters from saying the Washington Redskins team name on the air.
Waxman’s solution is weak because it tries to jam new realities — on-demand Web video, wireless Internet and changing consumer viewing habits — into regulatory silos that are three decades old. Rather than trying to redefine FCC scope through a sloppy cut-and-paste of outmoded law, Congress should revisit the Telecom Act in its entirety — modernizing it to fit the Internet ecosystem of our time.
It’s doubtful the digital economy will be served by tempting the FCC to take a power trip through this evolving landscape. There is still too much of a chance for unintended consequences if the FCC is allowed unchecked discretion in overhauling the underlying economics of the Internet industry.
For most of my life, the Detroit area has been the epicenter of the domestic automotive world. Unless you are the sort of person who experienced a fainting spell when the National Corvette Museum in Kentucky dropped eight historically precious Corvettes into a big sinkhole last winter (including the one-millionth model ever produced) you probably still get most of your interesting domestic car news from Michigan.
I have a friend who has been saving for a while for a Chevy pickup, her dream vehicle. Her dad and other family members drove Chevys, and it’s embedded in the family culture. A lot of my neighbors without a firm brand allegiance lust instead after the new Tesla electric vehicle that changed most everybody’s idea about what an electric car could provide in the way of thrills more traditional to the driving culture than the cultures of saving money or saving the environment.
Several years ago, I was seated at a business event next to the CEO of a company that made car parts. His opinion was that American companies were the slowest to innovate of all the major car companies in the world. That was largely before cars were built in pieces all over the world, which began the erosion of the “domestic content” rules that now are mostly impeding foreign sales by American solar energy industries. But that’s a topic for another day.
The Tesla is very innovative. It may even be a game changer at some point. It offers style, exciting driving and a low impact on the environment, all in one. Oh yes, and it comes with a new direct-to-consumer marketing strategy as well! Oops, maybe not in Michigan.
The forces aligned against disruptive ways of satisfying consumer demand have gotten, in the space of a couple of days, an amendment into a bill that passed both houses and is sitting on the governor’s desk. The amendment prohibits Tesla — in the largest state in which the company doesn’t yet have a sales office — from selling directly to its prospective customers. The company has said it had been discussing an approach to sales in Michigan with political leaders in the legislature and appropriate government agencies, until the amendment was adopted and voted on with great alacrity by the General Assembly.
There was never any public debate, and news reports suggest that most members of the legislature were unaware of the impact until the auto dealers started thanking them for their votes. That is, for voting to protect Michigan customers from having more choices about what to drive.
We’ve seen a lot of this lately. It’s just plain bad faith by lawmaking bodies to not at least host some discussion on a law that puts a death sentence on what appears to be a perfectly reasonable business model. If signed by Gov. Rick Snyder, sales of a shiny new consumer product will be impossible in Michigan unless the company decides to sell through traditional dealerships. It is difficult not to notice that these middlemen are third parties that are very active politically, in what was supposed to be a pretty close election for governor this time.
When I worked in the legislature years ago, it used to be considered normal that if your side had the votes on a bill, you were at least accommodating enough to let the opposition make their case before you enacted new requirements or prohibitions. The public is well served by debate. Yet we have seen less and less of it on most of the things that matter. Even the famous cut-and-paste job that has become the federal health insurance law was accompanied by extensive public debate before the final version was sewn together.
Apparently the only part of the process that Tesla was able to engage was a conversation with the governor’s office after the bill passed. The intended election year pressure is now fully upon the governor. We will see next week whether he decides to stick with his philosophy on how markets work best for his constituents, or is forced to bow to special interest politics.
I doubt if this innovative company, named for one of the greatest scientists who ever lived, can be stamped out by the rearguard action of Michigan dealers. But it can surely be inhibited by the wrong decision from the state’s leadership.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.
The history of the American administrative state has been filled with attempts to introduce “rationality” and “discipline” to circumstances in which markets, left to their own devices, allegedly lead to socially or politically undesirable outcomes.
Brimming with good will and well-meaning intention, champions of state intervention seek to invent systems to accommodate and serve those least able to care for themselves. Californians can even directly participate in such efforts through the initiative process.
In 1988, a slim majority of California voters passed Proposition 103, a landmark initiative they were told would reform the state’s auto insurance market for the better. What they did not count on was that, more than 25 years later, Prop 103’s architects would still be standing protectively, to thwart any attempt to update the initiative to reflect changes in the market, technology and the world at-large.
Within a few years, experts now predict, self-driving and autonomous vehicles will radically change the risk profile of vehicles on California’s roads. A recent study by the Casualty Actuarial Society found that many of the risks associated with automated vehicles can not only be identified but also quantified, and therefore eliminated. But while that’s encouraging from a health and safety perspective, thanks in part to Prop 103, reducing accidents by such an overwhelming margin could have perverse impacts on drivers who can’t afford to upgrade to cars with more advanced safety technology.
Prop 103′s process for approving auto insurance rates is rigid. There’s a statutory hierarchy, with three mandatory rating factors that insurers must take into account above all others, the first of which is accident history. Self-driving and autonomous vehicles will experience far fewer accidents as a result of driver error, which currently is and historically always has been the cause of most accidents.
Since operators of self-driving and autonomous vehicles will experience fewer accidents, their premiums will decrease. But the benefits conferred to early adopters will be balanced out by higher rates for those who continue to drive conventional vehicles. Under Prop 103, insurance rates are a zero-sum game.
The law requires insurers to increase or decrease the significance of cost factors to comply with the rating-factor hierarchy. This “pumping and tempering” occurs independent of the actual cost associated with a rating factor. Thus, even though other rating factors will be of greater significance to self-driving and autonomous vehicles, Prop 103’s inflexibility will continue to require accident history to be the single most important determinant in the rate-making process.
The result will be that other members of the “class plan” will have to make up the reduced cost of early adopters. In a state split between vehicle operators who bear transportation risk and others who do not, Prop 103 will burden those least able to foot the bill.
If traditional drivers are made to cross-subsidize operators of self-driving and autonomous vehicles, it is likely Prop 103 will run afoul of its own judicially constructed purpose. In a 2005 state appeals court decision, Foundation for Taxpayer and Consumer Rights v. Garamendi, a discussion of the appropriateness of a legislatively enacted change to Prop 103 led the court to find that the initiative’s purpose was to protect the uninsured from arbitrary rates, so that insurance could be “fair, available and affordable.” In the context of a persistency rating factor that the court decided subsidized the insured at the expense of the uninsured, that purpose was deemed violated.
But what happens when existing language, faced with novel circumstances, leads to an identical result? Left unchanged, Prop 103 will violate its own purpose.
Perhaps, in the name of “fairness,” and to forestall the inevitable dismantling of their ugly creation, the drafters of Prop 103 will seek a ban on autonomous vehicles. For now, fortunately, they are satisfied to ignore reality and claim that “California is a long, long way from the so-called ‘autonomous vehicle.’”
To avoid problems like these in the future, the Prop 103 model should be scrapped entirely. Failing that, to properly address the risk presented by self-driving and autonomous vehicles, different rating factors will need to be introduced and given hierarchical flexibility. Contextual factors will be of particular importance, since they likely will present the greatest hurdles to self-guidance programs. For instance, a region’s weather and the quality of its road network could both predictively relate to an autonomous vehicle’s risk profile.
Increasingly and inevitably, Prop 103 is proof that the regulatory rationale of one moment can grow stale in the next. Prop 103 has swollen into a useless and counterproductive government intervention. More distressing still, it now may injure the very same economically deprived people it was passed to help.
In 1972, Congress enacted the Clean Water Act to “restore and maintain the chemical, physical and biological integrity of the nation’s waters.” The federal government’s legal authority to regulate water is largely derived from the U.S. Constitution’s Commerce Clause, which theoretically limits the government’s jurisdiction to the type of navigable waterways where such commerce occurs. Sadly, the federal government is rarely content with any limitation placed on its regulatory authority.
Although the CWA defines “navigable waters” as “the waters of the United States, including the territorial seas,” federal regulators further define the “waters of the United States” to cover traditional navigable waters and all other waters that could affect interstate or foreign commerce. The current regulatory definition opens up more waters to CWA coverage but still attempts to track Congress’ Commerce Clause authority.
Now the EPA and Army Corps of Engineers are taking advantage of a particularly unclear Supreme Court ruling in Rapanos v. United States. Narrowly interpreted, the 2006 Rapanos decision gives the EPA regulatory authority over wetlands “with a continuous surface connection” to navigable waterways. Read broadly, the Rapanos decision gives the EPA authority under the CWA to regulate water with a mere “significant nexus” to navigable waterways.
As a result of the Supreme Court’s lack of clarity, the EPA and Army Corps of Engineers have proposed to expand the coverage of the CWA in an exceptionally far-reaching manner. Many conservatives, agriculture groups and even the U.S. Small Business Administration are calling for the proposed rule to be withdrawn.
On October 8, 2014, Alabama Attorney General Luther Strange joined other state attorneys general opposing the new definition. The joint letter noted that many of “the waters and lands covered [by the proposed rule] are entirely outside of Congress’ authority under the Commerce Clause, such as non-navigable intrastate waters that lack any significant nexus to a core water, trenching upon state authority, including in areas of non-economic activity.”
The Alabama Farmers Federation has also raised concerns about the impact of the proposed rule on Alabama. “The government overreach from this rule would extend beyond farms to affect businesses, homes, schools, churches – any place built on land where water runs through after a heavy rain,” said ALFA president Jimmy Parnell. “This was never the intent of the Clean Water Act, and this bypassing of Congress should not be allowed.”
Baldwin County farmer Hope Cassebaum echoed Parnell’s concerns. “Farmers don’t need any more regulations than what we have now,” said Cassebaum. “It’s sad because we already try to be the best stewards of the land that we can be.”
Elmore County farmer Richard Edgar proudly highlighted that his family has worked with the USDA’s Natural Resources Conservation Service for generations. “We have some of the first, and still well-maintained, parallel terraces which are best for the environment,” said Edgar.
At the same time, Edgar considers the EPA’s move to be more about federal control than true environmental concern. “My children are the sixth generation on this land,” he said. “We’re going to take care of our farmland because we have a legacy and hope for the future. We don’t need government telling us how to take care of it.”
The CWA has been a major success in cleaning up our national waters, but federal authority under the act is not without limit. The EPA and Army Corps of Engineers are pressing the boundaries of their federal jurisdiction to their breaking point, and Alabamians would be wise to pay attention.
I have documented for several years a continuous decline in smoking rates among American teens. Rates of smoking and use of other tobacco products among teens are so low that they no longer provide a valid basis for the draconian anti-tobacco policy prescriptions favored by the FDA and CDC.
A fresh National Survey on Drug Use and Health summary confirms low tobacco use by teens. The chart below shows that the smoking rate continued its free-fall through 2013. Cigar use also declined over the past decade to 2.3 percent in 2013, while smokeless tobacco use was flat at about 2 percent over the entire period.
These figures aren’t underestimates. As I’ve discussed previously, NSDUH estimates tend to be robust, because they include any product use over the prior 30 days.
Other NSDUH data (in the chart at bottom) point to the population that should be targeted by the FDA and CDC – those aged 18-34. The sharp jump in smoking prevalence from 11 percent at ages 16-17, to 27 percent at ages 18-20, underscores that the latter group is where the real problem starts.
Anti-tobacco forces know that problematic behaviors in adults don’t stimulate support for prohibitionist policies, so they continue to inaccurately suggest the existence of a youth-tobacco crisis.
Ever since the advent of Internet-based, user generated mass media, Hollywood has been fighting a losing game of whack-a-mole against ordinary online users who share copyright-protected material in ways its creator didn’t explicitly authorize.
In particular, sites like YouTube, Vimeo and Dailymotion have enabled the proliferation of covers of popular songs, parodies, mash-ups of clips from movies and/or television shows (sometimes combined with music) and even, in some cases, people actually posting full movies online.
This kind of activity has sparked aggressive lobbying for ever harsher and harsher anti-piracy and pro-copyright laws by content creators. As anyone who watched the battle over the Stop Online Piracy Act in 2012 can attest, large segments of the content-producing world were seeking a government-enforced “blacklist” system, in which Internet service providers would be forced to treat practically any alleged copyright infringement as grounds to render a site unviewable.
Which is ironic, because the would-be “victims” who argued the hardest for SOPA have made more than $1 billion from the very proliferation of creativity they once tried to kill. That’s the tally, the Financial Times reports, for revenues derived by more than 5,000 companies (including all of the major U.S. television and film studios) who participate in YouTube’s “Content ID” program.
Under the program, creators are entitled to revenue streams from advertising sold on user-generated content that makes use of copyrighted material. Since it was established in 2007, most creators have opted to monetize the content, rather than block it.
The end result is mutually beneficial for creators, users and, ultimately, consumers. In some cases, it can open new revenue streams associated, for instance, with films that might otherwise have languished in a proverbial vault of commercial flops. Users who want to get discovered for their covers of popular songs, or simply want a space to perform, need not fear take-down notices for the crime of doing nothing but singing.
Which is not to say that Content ID itself is always perfect. It doesn’t provide users with much recourse if their video is unjustly flagged as a copyright violation, even if a court might determine it falls under fair use. Disputes are referred to the rights-holder, and some video game critics raised concerns late last year that they ended up in a Sisyphean struggle over their reviews, which clearly fall under the “criticism” exception of fair use, with quibbles over such minutiae as music that was licensed for use in game trailers, but not for YouTube users.
Ultimately, however, these issues represent not a problem with Content ID, but with the legal copyright regime it is required to enforce. While that fight is still in progress, these sorts of market-oriented compromises at least allow access to art that otherwise would be strangled by short-sighted rights-holders who don’t always see the financial rewards right in front of them.
In 1925, a music industry professional complained to the New York Times that the new medium of radio was destroying the industry’s business model by making songs too widely available to the public for free. The Times quoted the unnamed professional saying:
The public will not buy songs it can hear almost at will by a brief manipulation of the radio dials. If we could control completely the broadcasting of our compositions we would endeavor to prevent this saturation of the radio listeners with any particular song.[…]
We are striving to have the copyright law, which protects us from the free use of our compositions by the makers of phonograph records and music rolls, construed to cover broadcasting. The law specifies that we must be compensated if any of our songs are used by some one else for profit to them. We contend that the radio station is an enterprise founded for gain. It is not controlled by those purveying sets or parts, it is employed by organizations which use it as a medium of institutional advertising.
The music industry professional got his wish as far as copyright, and has turned out to be right in another way as well, though surely not in a way he would have expected. Radio is treated as free advertising – and primarily for music producers! This is, in fact, the main reason why terrestrial radio stations are presently statutorily exempt from paying royalties.
Today, the story of radio’s transition from content industry bête noire to one of its core advertisers is being replayed in the case of another medium that content industry professionals once lambasted as nothing but a gateway for pirates: search engines.
In a report released today by Google, the company lays out the case that search engines aren’t a major driver of piracy. The report claims that search is responsible for just 16 percent of the traffic to sites that host pirated content. By contrast, studies have shown that 64 percent of the traffic to legitimate sites comes from search engines.
To take one example, “katy perry” gets searched for 200,000 more times on Google than “free katy perry mp3.” What’s more, under new changes to the company’s search algorithm, legitimate sources of Katy Perry’s music will show up first on both searches, at no cost to Perry herself (though individual content sellers such as Apple, Amazon or Spotify can pay to have their digital storefronts advertised prominently).
Starting in 2012, Google began “downranking” sites that receive a high volume of Digital Millennium Copyright Act take-down complaints, meaning that those results automatically are ranked lower in Google’s search algorithm. The new tweaks will go further to prioritize results for legitimate sources of film, music and other copyrighted content, as well as offering users multiple sources from which that content can be purchased, rented or streamed. This would apply even for obvious piracy-oriented searches, such as “the big lebowski torrent.”
In short, for content producers, search engines serve as a form of free advertising, paid for either by middlemen online retailers, or by the search engine itself. As the Google report puts it:
Piracy often arises when consumer demand goes unmet by legitimate supply. As services ranging from Netflix to Spotify to iTunes have demonstrated, the best way to combat piracy is with better and more convenient legitimate services. The right combination of price, convenience and inventory will do far more to reduce piracy than enforcement can.
Consumers have a huge appetite for content, and there’s evidence that they’re willing to pay a lot for it. A report from May 2013 found that the most frequent consumers of pirated digital files actually spend 300 percent more on content than so-called “honest” consumers. This tendency for piracy itself to serve as a form of free advertising is why savvy media producers, such as the makers of HBO’s Game of Thrones, find high piracy rates flattering, rather than alarming. Once HBO’s new stand-alone streaming service launches, these users of pirated content easily could turn into legitimate consumers.
Search engines thus have a huge opportunity to exploit a market with an above-average appetite for content and expose it to more ways to purchase that content. This benefits search engines like Google, but it also benefits the content industry itself.
Of course, as the 1925 Times quote demonstrates, the content industry hasn’t always been eager to embrace innovation. The disruptive effects of the Internet have shaken a lot of established content industry business models, which has led some of those industries into efforts at outright censorship, both through abuse of the DMCA’s take-down system and through attempts to bake censorship into the Internet itself, via new legislation and trade agreements.
Google’s report provides some truly lurid examples of what that “abuse” looks like, such as a film company allegedly trying to get a major newspaper’s review of their film blocked in search results. Techdirt has also outlined some truly ridiculous examples of DMCA takedown requests. In view of these shameless attempts at censorship, Google’s decision to protect against DMCA abuse from both directions is prudent. It remains to be seen whether these safeguards will continue to hold, but the proliferation of information about fair use on the Internet suggests reason for optimism.
Distinguishing between fighting piracy as an industry, and fighting individual pirates, who are rarely the hardened criminals that content industry advocates paint them as, could be a major step toward a better Internet both for consumers and producers.
When the U.S. Supreme Court decided the fate of the Affordable Care Act’s individual mandate in 2012, the nation was gripped by an intimate and personal political drama worthy of Broadway or, at least, General Hospital.
Debates about insurance mandates and subsidies were elevated from obscurity to the top of the national discourse. Suddenly, having an opinion about the appropriate tools for the federal government to implement its health-care law was crucial. Partisans entrenched themselves into “pro-mandate” and “anti-mandate” camps. When Chief Justice John Roberts handed down the court’s opinion, he left both camps confused.
Mandates are seldom popular, but the legacy of that decision was to focus public vitriol at mandates generally, even in areas where a mandate might make sense. As a case in point, to stimulate and increase California’s earthquake insurance take-up rate, some form of a mandate might be necessary to adequately secure billions of dollars in mortgage loans currently backed by taxpayers.
California’s largest provider of earthquake insurance, the California Earthquake Authority, has a policy take-up rate of only around 10 percent. The nature of the policies themselves might be partly to blame for that low take-up rate. The coverage is expensive and the deductibles are quite high. Given public perception that the frequency of seismic events is low compared to other covered perils, and that most earthquake damage is unlikely to pierce the deductible, many find it hard to imagine the insurance is worth the expense. Whatever the reason, something must be done to broaden the pool of insureds.
One alternative is an earthquake-insurance mandate; the other is a taxpayer subsidy of earthquake risk. Each approach promises to bring more of the public into the earthquake insurance risk-pool, but they would achieve that in markedly different ways.
An earthquake-insurance mandate would achieve lower earthquake insurance premiums by expanding the pool of those insured. If needed, pre-funded catastrophe risk instruments would allow for an immediate infusion of contractually designated private capital after an event occurs.
Beyond the practical benefits of an earthquake insurance mandate, it is also desirable because the conduct mandated is limited and subject to private choice. Property owners would be required to obtain coverage as a condition for certain loans. Taxpayers are not placed at risk, or even implicated by an earthquake insurance mandate. Unlike the ACA mandate, in which the absence of coverage triggers a penalty, an earthquake insurance mandate would be triggered only by dealings with a public mortgage entity. This would have the added benefit of reducing mortgage-default risk that is currently shouldered by taxpayers.
Subsidies, in the form of government guarantees of principal and interest on post-event bonds, serve the function of transferring risk from one group (insurers and insureds) to another (taxpayers) to achieve lower premiums. While a bond issued after an event has the advantage of being tailored to that event’s specific cost, subsidizing earthquake risk by incurring debt after an event potentially places all taxpayers on the hook for those most vulnerable and least responsive to the risk.
A post-event bond approach that implicates Californians who maintain only a nominal relationship to the risk of earthquake loss is particularly problematic. While it spreads risk, it does so in a manner that abrogates private decision-making processes. There is no reason for homeowners in Siskiyou County to pay a surcharge on their insurance because homeowners elsewhere chose to forgo earthquake coverage.
Given the shortcomings of a subsidy approach, the perception of the political hurdle posed by a mandate must be staggering. And yet, mandates and subsidies are not that dissimilar. Broadly speaking, insofar as it compels economic behavior without a relationship to the risk of loss, a subsidy is itself a mandate of a different kind.
It is ironic, then, that the earthquake-insurance subsidy, which will command broad-scale public involvement, does not carry the stigma of a mandate. Perhaps the largest hurdle to increasing the earthquake insurance take-up rate, while respecting the agency of taxpayers and property owners, is a semantic one. Claiming that a subsidy is the only way forward because a mandate is a political nonstarter disguises the true philosophical distinction between a mortgage requirement and a taxpayer backstop – individual agency vs. public obligation.
Last week, U.S. Sen. Ron Wyden, D-Ore., hosted a forum in Silicon Valley on NSA spying as a means to drum up support for proposed reform legislation that has been stalled in the Senate.
Attended by executives from Google, Microsoft, Facebook and other tech companies, the forum found a receptive audience, as these companies are worried about their prospects of doing business abroad. A 2013 report warned that American companies could lose up to $180 billion in technology sales as a result of the NSA spying allegations. A report in August of last year found that American cloud computing services alone could lose up to $35 billion a year in overseas sales as a result of the revelations.
These estimates have turned into reality, according to the Associated Press:
A few companies, including Cisco Systems Inc. and Qualcomm Inc., have said they believe they lost some deals in China and other emerging markets because of concerns about U.S. spying. Germany did cancel a contract with Verizon this summer, citing a fear that it may provide customer phone records to the NSA. Some tech startups and telecommunications companies in France and Switzerland have claimed an increase in sales to customers who are wary of U.S. providers.
Not only are foreign companies shying away from doing business with American companies, but foreign governments are as well. According to tech executives, there are concerns that foreign governments, especially European Union member states such as Germany, may place protectionist restrictions on U.S. tech companies over the NSA’s spy programs. Such protectionist policies would break the Internet, according to Google executive chairman Eric Schmidt.
Among the protectionist policies that U.S. tech executives fear are requirements that all European-based data only be stored on European-based systems. The goal would be to ensure the data handling remains compliant with EU privacy protections, rather than those of the United States. In April, the European Court of Justice threw out the European Union’s similar metadata collection program.
Such mandates would require American companies to build EU-specific Internet infrastructure, based in Europe. This would cost untold amounts of money for American tech companies. The alternative would find American companies shut out of the European market. The Information Technology & Innovation Foundation has estimated U.S. tech firms could lose as much as 20 percent of their revenue to foreign companies that capitalize on the fears of U.S. spying.
There are also fears that such requirements could increase the likelihood of trade wars that spill out beyond the tech sector. This could cause massive economic damage, both domestically and globally.
The solution that Sen. Wyden would like to see enacted is passage of the USA Freedom Act, which would place strict restrictions on the NSA’s ability to conduct warrantless surveillance of Americans’ e-mail and Internet communications. The bill would also end the practice of the bulk collection of Americans’ phone records. The bill is supported by most in the tech industry and by a bipartisan coalition in Congress.
However, the House-passed version of the legislation was watered down at the request of leadership of both parties. The White House is opposed to the bill and the bill has not even made the Senate calendar. Senate Judiciary Committee Chairman Patrick Leahy, D-Vt., has proposed a substitute NSA reform bill that Wyden and others charge doesn’t go far enough.
The NSA spying program needs to be reined in for two very important reasons. It violates the civil liberties and the privacy of the American people. And, as we are finding, it has damaged the American economy and may damage the global economy as well. We hope that Congress will act on this issue in the upcoming lame-duck session after the elections.

This work is licensed under a Creative Commons Attribution-NoDerivs 3.0 Unported License.
As ride-sharing services like Uber and Lyft continue to expand in cities around the country, the services have been dogged by questions about whether their drivers are fully insured.
Major auto insurers like State Farm, Geico and Progressive have made clear that personal policies exclude coverage for rides when a driver is paid. Some insurers will go so far as to reject any applicant who reports driving for a ride-sharing company.
Meanwhile, the kinds of commercial policies carried by full-time limo drivers can be as much as 10 times as expensive as personal auto policies, rendering them largely unaffordable for drivers who may do ride-sharing for only a couple of hours per week.
The major ride-sharing services have responded by buying specialized coverage for themselves and their drivers’ liability, although some regulators and the taxi lobby have charged that even this coverage contains gaps.
But the debate does raise the question: Why don’t insurers simply create new products to accommodate this new and growing risk?
It isn’t that personal auto insurers are completely unwilling to write commercial coverage. According to data from SNL Financial, among the top 20 writers of personal auto insurance coverage in 2013, 15 also wrote commercial auto insurance. Nine companies would count among the top 20 in both lines of business.
But so far, such products have not come to market. Discussions with major auto insurance underwriters suggest four major reasons why:
It’s still unclear whether courts will find the major ride-sharing companies liable for accidents involving their drivers. The companies have a reasonable case that they are simply acting as match-makers between independent drivers and potential riders.
Even if they are ultimately found liable, courts also will be asked to weigh in on whether a driver who is logged in to a ride-sharing app, but not transporting a rider, will be held to the heightened standards of care traditionally applied to common carriers like taxicabs and limousines.
Insurers have decades of data on personal driving behavior and other relevant factors — from credit scores to ZIP codes — on which to base rates. Meanwhile, commercial auto insurers have their own underwriting data, as well as relying on local licensing requirements to screen out the worst drivers.
By contrast, ride-sharing — that is, professional transportation offered by unlicensed, amateur drivers — is a new risk for which credible data does not yet exist. The elements that make one a safe driver for the purposes of a personal auto insurance policy may not be identical to those required of a driver for hire.
Some states do not allow companies to insure commercial activity as a part of a personal policy. Even where it is allowed, states generally ask insurers to produce data that justifies the rates and terms they set — data that largely doesn’t yet exist.
In fact, some states, like California, will only allow insurers to consider factors that are expressly permitted by law, which confounds insurers’ ability to bring innovative products to market quickly.
It’s still unknown just how large the ride-sharing market will be. Polling conducted in August found that, even among residents of urban areas where Uber or Lyft are already operating, just 14 percent had ever used a smartphone application to order a ride.
Moreover, depending on the legal and regulatory framework that evolves, it may prove more cost-effective for ride-sharing services to purchase master policies that cover all of their drivers, rather than having each driver procure his or her own coverage.
Insurers could certainly still roll out new products in the coming months and years to better fit the ride-sharing drivers. If history is any guide, such products likely will originate in the surplus lines market, where underwriters do not face the same regulations of forms and rates.
Alternatively, ride-sharing drivers could form their own mutual insurance company, the way taxi drivers did when they formed the Public Service Mutual Casualty Insurance Corp. in 1925.
But for the major auto insurers to buy into ride-sharing in a big way — crafting new hybrid products and offering special riders or endorsements on existing personal policies — there are still a number of looming questions that need to be answered.
Yesterday’s presentation by the U.S. Treasury was a comical spectacle—at least for those of us with sardonic senses of humor. The good news? The deficit for FY2014 (which ended Sept. 30) was 29 percent lower than the deficit was in FY2013. Increased corporate tax receipts drove much of the deficit reduction.
The bad news? The actual deficit was $483 billion, or nearly half a trillion dollars. Only in Washington, D.C., can such a massive gap between spending and revenues be celebrated.
Despite a government shutdown and sequestration, government outlays did not go down. In fact, the Treasury reported, spending went up by $50 billion, to $3.5 trillion from $3.45 trillion the previous year.
Rather than admit that the federal government still has a spending problem, Treasury Secretary Jacob Lew and OMB Director Shaun Donovan oozed spin:
The president’s policies and a strengthening U.S. economy have resulted in a reduction of the U.S. budget deficit of approximately two-thirds – the fastest sustained deficit reduction since World War II.
Of course, as BusinessWeek pointed out, Lew’s baseline was, yes, FY2009, when the federal deficit reached its highest level ever: $1.4 trillion. This is a bit like a baseball player batting .166 in FY2014 and bragging that he raised his average by two-thirds over the past five years. To put FY2014’s $483 billion deficit in perspective, the deficit in FY2008 (when the economy was still in the tank) was $455 billion.
Meanwhile, the accumulated deficits continue to climb. FY2014’s half-trillion-dollar deficit boosts the federal debt held by the public to a frightful $12.8 trillion. And the nation remains on a fiscally unsustainable path. CBO predicts—absent any congressionally enacted changes to current law—“by 25 years from now, rising budget deficits would push federal debt held by the public to more than 100 percent of GDP, a level seen only once before in U.S. history, just after World War II. To put the budget on a sustainable path, lawmakers will need to cut benefits from some large programs relative to current law, raise tax revenue above its historical percentage of GDP to pay for the rising cost of those programs, or adopt a combination of those approaches. Moreover, the changes in federal spending or revenues that would be required to achieve certain possible objectives for federal debt are substantial.”
One can only hope that the new Congress, which arrives in late January, will take seriously the nation’s fiscal mess. And act.
“We have a substantial understanding of how and why e-cigarettes can help smokers switch to a far lower-risk product,” said Dr. Joel Nitzkin, who testifies around the country for the industry.