Richard Epstein at Hoover

The National Labor Relations Act of 1935 (NLRA) introduced a major revolution in labor law in the United States. Its reverberations are still acutely felt today, especially after the recent, ill-thought-out decision in the matter of Browning-Ferris. There, the three Democratic members of the National Labor Relations Board overturned well-established law over the fierce dissent of its two Republican members. If allowed to stand, this decision could reshape the face of American labor law for the worse by the simple expedient of giving a broad definition to the statutory term “employer.”

Right now, that term covers firms that hire their own workers, and the NLRB subjects those firms to the collective bargaining obligations under the NLRA. Under its new definition of employer, the NLRB majority expands that term to cover any firm that outsources the hiring and management of employees to a second firm, over which it retains some oversight function. In its decision, the NLRB refers to such firms and those to whom they outsource the hiring as “joint employers.”

Just that happened when a Browning-Ferris subsidiary contracted out some of its recycling work to an independent business, Leadpoint. Under traditional labor law, Browning-Ferris would not be considered the “employer” of Leadpoint’s employees—but the Board’s decision overturns that traditional definition. No longer, its majority says, must the employer’s control be exercised “directly and immediately.” Now “control exercised indirectly—such as through an intermediary—may establish joint-employer status.”

By this one move, the Board ensures that unions will now have multiple targets for their organizing efforts. A union can sue the usual employer who hires and fires, and it may well be able to sue one or more independent firms that have outsourced some of their work to that firm. The exact standards by which this is done are not easy to determine in the abstract. Instead, the new rules depend on some case-by-case assessment of the role that the second firm has in setting the parameters for hiring workers, determining their compensation, and supervising their work…

And then there are the minimum wage hikes: well meant, but job killers.

The city councils in Seattle, San Francisco, and Los Angeles have already voted to raise their minimum wages to $15 an hour over several years. For large employers in Seattle, the first increase, to $11 from $9.47, took effect in April. In San Francisco a hike to $12.25 from $10.74 began in May. Los Angeles rolled out a minimum wage for hotel workers of $15.37 in July.

It’s still too early to know how the hikes are affecting the job market, but the preliminary data aren’t good. Mark Perry of the American Enterprise Institute, Adam Ozimek of Moody’s Analytics and Stephen Bronars of Edgewood Economics reported last month that the restaurant and hotel industries have lost jobs in all three cities. Mr. Bronars crunched the numbers and discovered that the “first wave of minimum wage increases appears to have led to the loss of over 1,100 food service jobs in the Seattle metro division and over 2,500 restaurant jobs in the San Francisco metro division.” That is a conservative estimate, he notes, as the data include areas outside city limits, where the minimum wage didn’t increase.

This comes as no surprise. In 2014 the Congressional Budget Office found that increasing the federal minimum wage to $10.10 an hour would reduce employment by roughly 500,000 jobs nationally. Moreover, less than 20% of the earnings benefits would flow to people living below the poverty line, as University of California, Irvine economist David Neumark has pointed out.