ROBERT B. REICH, Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies, was Secretary of Labor in the Clinton administration. Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written thirteen books, including the best sellers “Aftershock” and “The Work of Nations.” His latest, “Beyond Outrage,” is now out in paperback. He is also a founding editor of the American Prospect magazine and chairman of Common Cause. His new film, “Inequality for All,” is now available on Netflix, iTunes, DVD, and On Demand.


COLBERT REPORT, NOVEMBER 2013

WITH BILL MOYERS, SEPTEMBER 2013

DAILY SHOW, SEPTEMBER 2013, PART 1

DAILY SHOW, SEPTEMBER 2013, PART 2

DEMOCRACY NOW, SEPTEMBER 2013

INTELLIGENCE SQUARED DEBATES, SEPTEMBER 2012

DAILY SHOW, APRIL 2012, PART 1

DAILY SHOW, APRIL 2012, PART 2

COLBERT REPORT, OCTOBER 2010

WITH CONAN O'BRIEN, JANUARY 2010

DEBATING RON PAUL, JANUARY 2010

DAILY SHOW, OCTOBER 2008

DAILY SHOW, APRIL 2005

DAILY SHOW, JUNE 2004

  • Why Ordinary People Bear Economic Risks and Donald Trump Doesn’t


    Sunday, September 21, 2014

    Thirty years ago, on its opening day in 1984, Donald Trump stood in a dark topcoat on the casino floor at Atlantic City’s Trump Plaza, celebrating his new investment as the finest building in Atlantic City and possibly the nation.

    Last week, the Trump Plaza folded and the Trump Taj Mahal filed for bankruptcy, leaving some 1,000 employees without jobs.

    Trump, meanwhile, was on Twitter claiming he had “nothing to do with Atlantic City,” and praising himself for his “great timing” in getting out of the investment.

    In America, people with lots of money can easily avoid the consequences of bad bets and big losses by cashing out at the first sign of trouble.

    The laws protect them through limited liability and bankruptcy.

    But workers who move to a place like Atlantic City for a job, invest in a home there, and build their skills, have no such protection. Jobs vanish, skills are suddenly irrelevant, and home values plummet.

    They’re stuck with the mess.

    Bankruptcy was designed so people could start over. But these days, the only ones starting over are big corporations, wealthy moguls, and Wall Street.

    Corporations are even using bankruptcy to break contracts with their employees. When American Airlines went into bankruptcy three years ago, it voided its labor agreements and froze its employee pension plan.

    After it emerged from bankruptcy last year and merged with U.S. Airways, American’s creditors were fully repaid, its shareholders came out richer than they went in, and its CEO got a severance package valued at $19.9 million.

    But American’s former employees got shafted.

    Wall Street doesn’t worry about failure, either. As you recall, the Street almost went belly up six years ago after risking hundreds of billions of dollars on bad bets.

    A generous bailout from the federal government kept the bankers afloat. And since then, most of the denizens of the Street have come out just fine.

    Yet more than 4 million American families have so far lost their homes. They were caught in the downdraft of the Street’s gambling excesses.

    They had no idea the housing bubble would burst, and didn’t read the fine print in the mortgages the bankers sold them.

    But they weren’t allowed to declare bankruptcy and try to keep their homes. 

    When some members of Congress tried to amend the law to allow homeowners to use bankruptcy, the financial industry blocked the bill.

    There’s no starting over for millions of people laden with student debt, either.

    Student loan debt has more than doubled since 2006, from $509 billion to $1.3 trillion. It now accounts for 40 percent of all personal debt – more than credit card debts and auto loans.

    But the bankruptcy law doesn’t cover student debts. The student loan industry made sure of that.

    If former students can’t meet their payments, lenders can garnish their paychecks. (Some borrowers, still behind by the time they retire, have even found chunks taken out of their Social Security checks.)

    The only way borrowers can reduce their student debt burdens is to prove in a separate lawsuit that repayment would impose an “undue hardship” on them and their dependents.

    This is a stricter standard than bankruptcy courts apply to gamblers trying to reduce their gambling debts.

    You might say those who can’t repay their student debts shouldn’t have borrowed in the first place. But they had no way of knowing just how bad the jobs market would become. Some didn’t know the diplomas they received from for-profit colleges weren’t worth the paper they were written on.

    A better alternative would be to allow former students to use bankruptcy where the terms of the loans are clearly unreasonable (including double-digit interest rates, for example), or the loans were made to attend schools whose graduates have very low rates of employment after graduation.

    Economies are risky. Some industries rise and others implode, like housing. Some places get richer and others decline, like Atlantic City. Some people get new jobs that pay better; many lose their jobs or their wages.

    The basic question is who should bear these risks. As long as the laws shield large investors while putting the risks on ordinary people, investors will continue to make big bets that deliver jackpots when they win but create losses for everyone else.

    Average working people need more fresh starts. Big corporations, banks, and Donald Trump need fewer. 

  • Harvard Business School’s Role in Widening Inequality


    Saturday, September 13, 2014

    No institution is more responsible for educating the CEOs of American corporations than Harvard Business School – inculcating in them a set of ideas and principles that have resulted in a pay gap between CEOs and ordinary workers that’s gone from 20-to-1 fifty years ago to almost 300-to-1 today.

    A survey of 1,947 Harvard Business School alumni, released on September 6, showed them far more hopeful about the future competitiveness of American firms than about the future of American workers.

    As the authors of the survey conclude, such a divergence is unsustainable. Without a large and growing middle class, Americans won’t have the purchasing power to keep U.S. corporations profitable, and global demand won’t fill the gap. Moreover, the widening gap eventually will lead to political and social instability. As the authors put it, “any leader with a long view understands that business has a profound stake in the prosperity of the average American.”

    Unfortunately, the authors neglected to include a discussion about how Harvard Business School should change what it teaches future CEOs with regard to this “profound stake.” HBS has made some changes over the years in response to earlier crises, but has not gone nearly far enough with courses that critically examine the goals of the modern corporation and the role that top executives play in achieving them.

    A half-century ago, CEOs typically managed companies for the benefit of all their stakeholders – not just shareholders, but also their employees, communities, and the nation as a whole.

    “The job of management,” proclaimed Frank Abrams, chairman of Standard Oil of New Jersey, in a 1951 address, “is to maintain an equitable and working balance among the claims of the various directly affected interest groups … stockholders, employees, customers, and the public at large. Business managers are gaining professional status partly because they see in their work the basic responsibilities [to the public] that other professional men have long recognized as theirs.” 

    This was a common view among chief executives of the time. Fortune magazine urged CEOs to become “industrial statesmen.” And to a large extent, that’s what they became. 

    For thirty years after World War II, as American corporations prospered, so did the American middle class. Wages rose and benefits increased. American companies and American citizens achieved a virtuous cycle of higher profits accompanied by more and better jobs.

    But starting in the late 1970s, a new vision of the corporation and the role of CEOs emerged – prodded by corporate “raiders,” hostile takeovers, junk bonds, and leveraged buyouts. Shareholders began to predominate over other stakeholders. And CEOs began to view their primary role as driving up share prices. To do this, they had to cut costs – especially payrolls, which constituted their largest expense.

    Corporate statesmen were replaced by something more like corporate butchers, with their nearly exclusive focus being to “cut out the fat” and “cut to the bone.”

    In consequence, the compensation packages of CEOs and other top executives soared, as did share prices. But ordinary workers lost jobs and wages, and many communities were abandoned. Almost all the gains from growth went to the top.

    The results were touted as being “efficient,” because resources were theoretically shifted to “higher and better uses,” to use the dry language of economics.

    But the human costs of this transformation have been substantial, and the efficiency benefits have not been widely shared. Most workers today are no better off than they were thirty years ago, adjusted for inflation. Most are less economically secure.

    So it would seem worthwhile for the faculty and students of Harvard Business School, as well as those at every other major business school in America, to assess this transformation, and ask whether maximizing shareholder value – a convenient goal now that so many CEOs are paid with stock options – continues to be the proper goal for the modern corporation.

    Can an enterprise be truly successful in a society becoming ever more divided between a few highly successful people at the top and a far larger number who are not thriving?

    For years, some of the nation’s most talented young people have flocked to Harvard Business School and other elite graduate schools of business in order to take up positions at the top rungs of American corporations, on Wall Street, or in management consulting.

    Their educations represent a substantial social investment; and their intellectual and creative capacities, a precious national and global resource.

    But given that so few in our society – or even in other advanced nations – have shared in the benefits of what our largest corporations and Wall Street entities have achieved, it must be asked whether the social return on such an investment has been worth it, and whether these graduates are making the most of their capacities in terms of their potential for improving human well-being.

    These questions also merit careful examination at Harvard and other elite universities. If the answer is not a resounding yes, perhaps we should ask whether these investments and talents should be directed toward “higher and better” uses.

    [This essay originally appeared in the Harvard Business Review’s blog, at http://blogs.hbr.org/2014/09/how-business-schools-can-help-reduce-inequality/]

  • Berkeley vs. Big Soda


    Monday, September 8, 2014

    I was phoned the other night in the middle of dinner by an earnest young man named Spencer, who said he was doing a survey.

    Rather than hang up I agreed to answer his questions. He asked me if I knew a soda tax would be on the ballot in Berkeley in November. When I said yes, he then asked whether I trusted the Berkeley city government to spend the revenues wisely.

    At that moment I recognized a classic “push poll,” which is part of a paid political campaign.

    So I asked Spencer a couple of questions of my own. Who was financing his survey? “Americans for Food and Beverage Choice,” he answered. Who was financing this group? “The American Beverage Association,” he said.

    Spencer was so eager to get off the phone I didn’t get to ask him my third question: Who’s financing the American Beverage Association? It didn’t matter. I knew the answer: Pepsico and Coca Cola.

    Welcome to Berkeley, California: Ground Zero in the Soda Wars.

    Fifty years ago this month, Berkeley was the epicenter of the Free Speech Movement. Now, Berkeley is moving against Big Soda.

    The new movement isn’t nearly as dramatic or idealistic as the old one, but the odds of victory were probably better fifty years ago. The Free Speech Movement didn’t challenge the profitability of one of the nation’s most powerful industries.

    Sugary drinks are blamed for increasing the rates of chronic disease and obesity in America. Yet efforts to reduce their consumption through taxes or other measures have gone nowhere. The beverage industry has spent millions defeating them.

    If on November 4 a majority of Berkeley voters say yes to a one-cent-per-fluid-ounce tax on distributors of sugary drinks, Berkeley could be the first city in the nation to pass a soda tax. (San Franciscans will be voting on a two-cent-per-ounce proposal that requires two-thirds approval; Berkeley needs a mere majority.)

    But if a soda tax can’t pass in the most progressive city in America, it can’t pass anywhere. Big Soda knows that, which is why it’s determined to kill it here.

    Taxing a product to reduce its consumption has been effective with cigarettes. According to the American Cancer Society, every 10 percent increase in the cost of a pack of cigarettes has caused a 4 percent decline in the rate of smoking.

    And for years cigarette manufacturers waged an all-out war to prevent any tax or regulation. They eventually lost, and today it’s hard to find anyone who proudly smokes.

    Maybe that’s the way the Soda Wars will end, too. Consumption of sugary soft drinks is already down somewhat from what it was ten years ago, but kids (and many adults) are still guzzling them.

    Berkeley’s Soda War pits a group of community organizations, city and school district officials, and other individuals (full disclosure: I’m one of them) against Big Soda’s own “grassroots” group, describing itself as “a coalition of citizens, local businesses, and community organizations” without identifying its members.

    Even though a Field Research poll released in February found 67 percent of California voters (and presumably a similar percentage of Berkeley voters) favor a soda tax if revenues are spent on healthy initiatives, it will be an uphill fight.

    Since 2009, some thirty special taxes on sugary drinks have been introduced in various states and cities, but none has passed. Not even California’s legislature, with Democratic majorities in both houses, could enact a proposal putting warning labels on sodas.

    Even New York City’s former and formidable mayor Michael Bloomberg – no slouch when it came to organizing – lost to Big Soda. He wanted to limit the size of sugary drinks sold in restaurants and other venues to 16 ounces.

    But the beverage industry waged a heavy marketing campaign against the proposal, including ads featuring the Statue of Liberty holding up a giant soda instead of a torch. It also fought it through the courts. Finally the state’s highest court ruled that the city’s Board of Health overstepped its authority by imposing the cap.

    Fifty years ago, Berkeley’s Free Speech Movement captured the nation’s attention and imagination. It signaled a fundamental shift in the attitudes of young Americans toward older forms of authority.

    Times have changed. Four years ago the Supreme Court decided corporations were people under the First Amendment, entitled to their own freedom of speech. Since then, Big Soda has poured a fortune into defeating ballot initiatives to tax or regulate sugared drinks.

    But have times changed all that much? In its battle with Big Soda, Berkeley may once again make history.

  • The Bankruptcy of Detroit and the Division of America


    Friday, September 5, 2014

    Detroit is the largest city ever to seek bankruptcy protection, so its bankruptcy is seen as a potential model for other American cities now teetering on the edge.

    But Detroit is really a model for how wealthier and whiter Americans escape the costs of public goods they’d otherwise share with poorer and darker Americans.

    Judge Steven W. Rhodes of the U.S. Bankruptcy Court for the Eastern District of Michigan is now weighing Detroit’s plan to shed $7 billion of its debts and restore some $1.5 billion of city services by requiring various groups of creditors to make sacrifices.

    Among those being asked to sacrifice are Detroit’s former city employees, now dependent on pensions and healthcare benefits the city agreed years before to pay, along with investors who bought $1.4 billion worth of bonds the city issued in 2005.

    Both groups claim the plan unfairly burdens them. Under it, the 2005 investors emerge with little or nothing, and Detroit’s retirees have their pensions cut 4.5 percent, lose some health benefits, and do without cost-of-living increases.

    No one knows whether Judge Rhodes will accept or reject the plan. But one thing is for certain. A very large and prosperous group close by won’t sacrifice a cent: They’re the mostly-white citizens of neighboring Oakland County.

    Oakland County is the fourth wealthiest county in the United States, of counties with a million or more residents.

    In fact, Greater Detroit, including its suburbs, ranks among America’s top financial centers, is one of its top four centers of high-technology employment, and is its second largest source of engineering and architectural talent.

    The median household in the county earned over $65,000 last year. The median household in Birmingham, Michigan, just across Detroit’s border, earned more than $94,000. In nearby Bloomfield Hills, still within the Detroit metropolitan area, the median was close to $105,000.

    Detroit’s upscale suburbs also have excellent schools, rapid-response security, and resplendent parks.

    Forty years ago, Detroit had a mixture of wealthy, middle class, and poor. But then its middle class and white residents began fleeing to the suburbs. Between 2000 and 2010, the city lost a quarter of its population.

    By the time it declared bankruptcy, Detroit was almost entirely poor. Its median household income was $26,000. More than half of its children were impoverished.

    That left it with depressed property values, abandoned neighborhoods, empty buildings, and dilapidated schools. Forty percent of its streetlights don’t work. More than half its parks closed within the last five years.

    Earlier this year, monthly water bills in Detroit were running 50 percent higher than the national average, and officials began shutting off the water to 150,000 households who couldn’t pay the bills.

    Official boundaries are often hard to see. If you head north on Woodward Avenue, away from downtown Detroit, you wouldn’t know exactly when you left the city and crossed over into Oakland County — except for a small sign that tells you.

    But boundaries can make all the difference. Had the official boundary been drawn differently to encompass both Oakland County and Detroit – creating, say, a “Greater Detroit” – Oakland’s more affluent citizens would have some responsibility to address Detroit’s problems, and Detroit would likely have enough money to pay all its bills and provide its residents with adequate public services.

    But because Detroit’s boundary surrounds only the poor inner city, those inside it have to deal with their compounded problems themselves. The whiter and more affluent suburbs (and the banks that serve them) are off the hook.

    Any hint they should take some responsibility has invited righteous indignation. “Now, all of a sudden, they’re having problems and they want to give part of the responsibility to the suburbs?” scoffs L. Brooks Patterson, the Oakland County executive. “They’re not gonna talk me into being the good guy. ‘Pick up your share?’ Ha ha.” 

    Buried within the bankruptcy of Detroit is a fundamental political and moral question: Who are “we,” and what are our obligations to one another?

    Are Detroit, its public employees, poor residents, and bondholders the only ones who should sacrifice when “Detroit” can’t pay its bills? Or does the relevant sphere of responsibility include Detroit’s affluent suburbs — to which many of the city’s wealthier residents fled as the city declined, along with the banks that serve them?

    Judge Rhodes won’t address these questions. But as Americans continue to segregate by income into places becoming either wealthier or poorer, the rest of us will have to answer questions like these, eventually. 

  • Back to College, the Only Gateway to the Middle Class


    Monday, September 1, 2014

    This week, millions of young people head to college and universities, aiming for a four-year liberal arts degree. They assume that degree is the only gateway to the American middle class.

    It shouldn’t be.

    For one thing, a four-year liberal arts degree is hugely expensive. Too many young people graduate laden with debts that take years if not decades to pay off.

    And too many of them can’t find good jobs when they graduate, in any event. So they have to settle for jobs that don’t require four years of college. They end up overqualified for the work they do, and underwhelmed by it.

    Others drop out of college because they’re either unprepared or unsuited for a four-year liberal arts curriculum. When they leave, they feel like failures. 

    We need to open other gateways to the middle class. 

    Consider, for example, technician jobs. They don’t require a four-year degree. But they do require mastery over a domain of technical knowledge, which can usually be obtained in two years.

    Technician jobs are growing in importance. As digital equipment replaces the jobs of routine workers and lower-level professionals, technicians are needed to install, monitor, repair, test, and upgrade all the equipment.

    Hospital technicians are needed to monitor ever more complex equipment that now fills medical centers; office technicians, to fix the hardware and software responsible for much of the work that used to be done by secretaries and clerks.

    Automobile technicians are in demand to repair the software that now powers our cars; manufacturing technicians, to upgrade the numerically controlled machines and 3-D printers that have replaced assembly lines; laboratory technicians, to install and test complex equipment for measuring results; telecommunications technicians, to install, upgrade, and repair the digital systems linking us to one another.

    Technology is changing so fast that knowledge about specifics can quickly become obsolete. That’s why so much of what technicians learn is on the job.

    But to be an effective on-the-job learner, technicians need basic knowledge of software and engineering, along with knowledge of the domain where the technology is applied – hospitals, offices, automobiles, manufacturing, laboratories, telecommunications, and so forth.

    Yet America isn’t educating the technicians we need. As our aspirations increasingly focus on four-year college degrees, we’ve allowed vocational and technical education to be downgraded and denigrated.

    Still, we have a foundation to build on. Community colleges offering two-year degree programs today enroll more than half of all college and university undergraduates. Many students are in full-time jobs, taking courses at night and on weekends. Many are adults.

    Community colleges are great bargains. They avoid the fancy amenities four-year liberal arts colleges need in order to lure the children of the middle class.

    Even so, community colleges are being systematically starved of funds. On a per-student basis, state legislatures direct most higher-education funding to four-year colleges and universities because that’s what their middle-class constituents want for their kids.

    American businesses, for their part, aren’t sufficiently involved in designing community college curricula and hiring their graduates, because their executives are usually the products of four-year liberal arts institutions and don’t know the value of community colleges. 

    By contrast, Germany provides its students the alternative of a world-class technical education that’s kept the German economy at the forefront of precision manufacturing and applied technology.

    The skills taught are based on industry standards, and courses are designed by businesses that need the graduates. So when young Germans get their degrees, jobs are waiting for them.

    We shouldn’t replicate the German system in full. It usually requires students and their families to choose a technical track by age 14. “Late bloomers” can’t get back on an academic track.

    But we can do far better than we’re doing now. One option: Combine the last year of high school with the first year of community college into a curriculum to train technicians for the new economy.

    Affected industries would help design the courses and promise jobs to students who finish successfully. Late bloomers can go on to get their associate degrees and even transfer to four-year liberal arts universities.

    This way we’d provide many young people who cannot or don’t want to pursue a four-year degree with the fundamentals they need to succeed, creating another gateway to the middle class.

    Too often in modern America, we equate “equal opportunity” with an opportunity to get a four-year liberal arts degree. It should mean an opportunity to learn what’s necessary to get a good job. 

  • Back to School, and to Widening Inequality


    Monday, August 25, 2014

    American kids are getting ready to head back to school. But the schools they’re heading back to differ dramatically by family income.

    Which helps explain the growing achievement gap between lower and higher-income children.

    Thirty years ago, the average gap on SAT-type tests between children of families in the richest 10 percent and bottom 10 percent was about 90 points on an 800-point scale. Today it’s 125 points.

    The gap in the mathematical abilities of American kids, by income, is one of the widest among the 65 countries participating in the Program for International Student Assessment.

    On their reading skills, children from high-income families score 110 points higher, on average, than those from poor families. This is about the same disparity that exists between average test scores in the United States as a whole and Tunisia.

    The achievement gap between poor kids and wealthy kids isn’t mainly about race. In fact, the racial achievement gap has been narrowing.

    It’s a reflection of the nation’s widening gulf between poor and wealthy families. And also about how schools in poor and rich communities are financed, and the nation’s increasing residential segregation by income.

    According to the Pew Research Center’s analysis of 2010 census tract and household income data, residential segregation by income has increased during the past three decades across the United States and in 27 of the nation’s 30 largest major metropolitan areas.

    This matters, because a large portion of the money to support public schools comes from local property taxes. The federal government provides only about 14 percent of all funding, and the states provide 44 percent, on average. The rest, roughly 42 percent, is raised locally.

    Most states do try to give more money to poor districts, but most cut way back on their spending during the recession and haven’t nearly made up for the cutbacks.

    Meanwhile, many of the nation’s local real estate markets remain weak, especially in lower-income communities. So local tax revenues are down.

    As we segregate by income into different communities, schools in lower-income areas have fewer resources than ever.

    The result is widening disparities in funding per pupil, to the direct disadvantage of poor kids.

    The wealthiest, highest-spending districts are now providing about twice as much funding per student as are the lowest-spending districts, according to a federal advisory commission report. In some states, such as California, the ratio is more than three to one.

    What are called “public schools” in many of America’s wealthy communities aren’t really “public” at all. In effect, they’re private schools, whose tuition is hidden away in the purchase price of upscale homes there, and in the corresponding property taxes.

    Even where courts have required richer school districts to subsidize poorer ones, large inequalities remain.

    Rather than pay extra taxes that would go to poorer districts, many parents in upscale communities have quietly shifted their financial support to tax-deductible “parents’ foundations” designed to enhance their own schools.

    About 12 percent of the more than 14,000 school districts across America are funded in part by such foundations. They’re paying for everything from a new school auditorium (Bowie, Maryland) to a high-tech weather station and language-arts program (Newton, Massachusetts).

    “Parents’ foundations,” observed the Wall Street Journal, “are visible evidence of parents’ efforts to reconnect their money to their kids.” And not, it should have been noted, to kids in another community, who are likely to be poorer.

    As a result of all this, the United States is one of only three, out of 34 advanced nations surveyed by the OECD, whose schools serving higher-income children have more funding per pupil and lower student-teacher ratios than do schools serving poor students (the two others are Turkey and Israel).

    Other advanced nations do it differently. Their national governments provide 54 percent of funding, on average, and local taxes account for less than half the portion they do in America. And they target a disproportionate share of national funding to poorer communities.

    As Andreas Schleicher, who runs the OECD’s international education assessments, told the New York Times, “the vast majority of OECD countries either invest equally into every student or disproportionately more into disadvantaged students. The U.S. is one of the few countries doing the opposite.”

    Money isn’t everything, obviously. But how can we pretend it doesn’t count? Money buys the most experienced teachers, less-crowded classrooms, high-quality teaching materials, and after-school programs.

    Yet we seem to be doing everything except getting more money to the schools that most need it.

    We’re requiring all schools to meet high standards, requiring students to take more and more tests, and judging teachers by their students’ test scores.

    But until we recognize we’re systematically hobbling schools serving disadvantaged kids, we’re unlikely to make much headway. 

  • The Disease of American Democracy


    Monday, August 18, 2014

    Americans are sick of politics. Only 13 percent approve of the job Congress is doing, a near record low. The President’s approval ratings are also in the basement.

    A large portion of the public doesn’t even bother voting. Only 57.5 percent of eligible voters cast their ballots in the 2012 presidential election. 

    Put simply, most Americans feel powerless, and assume the political game is fixed. So why bother? 

    A new study by Princeton’s Martin Gilens and Northwestern University’s Benjamin Page, scheduled to be published this fall, confirms our worst suspicions.

    Gilens and Page analyzed 1,799 policy issues in detail, determining the relative influence on them of economic elites, business groups, mass-based interest groups, and average citizens.

    Their conclusion: “The preferences of the average American appear to have only a miniscule, near-zero, statistically non-significant impact upon public policy.”

    Instead, lawmakers respond to the policy demands of wealthy individuals and monied business interests – those with the most lobbying prowess and deepest pockets to bankroll campaigns.

    Before you’re tempted to say “duh,” wait a moment. Gilens’ and Page’s data come from the period 1981 to 2002. This was before the Supreme Court opened the floodgates to big money in “Citizens United,” prior to SuperPACs, and before the Wall Street bailout.

    So it’s likely to be even worse now.

    But did the average citizen ever have much power? The eminent journalist and commentator Walter Lippmann argued in his 1922 book “Public Opinion” that the broad public didn’t know or care about public policy. Its consent was “manufactured” by an elite that manipulated it. “It is no longer possible … to believe in the original dogma of democracy,” Lippmann concluded.

    Yet American democracy seemed robust compared to other nations that in the first half of the twentieth century succumbed to communism or totalitarianism.

    Political scientists after World War II hypothesized that even though the voices of individual Americans counted for little, most people belonged to a variety of interest groups and membership organizations – clubs, associations, political parties, unions – to which politicians were responsive.

    “Interest-group pluralism,” as it was called, thereby channeled the views of individual citizens, and made American democracy function.

    What’s more, the political power of big corporations and Wall Street was offset by the power of labor unions, farm cooperatives, retailers, and smaller banks.

    Economist John Kenneth Galbraith approvingly dubbed it “countervailing power.” These alternative power centers ensured that America’s vast middle and working classes received a significant share of the gains from economic growth.

    Starting in 1980, something profoundly changed. It wasn’t just that big corporations and wealthy individuals became more politically potent, as Gilens and Page document. It was also that other interest groups began to wither.

    Grass-roots membership organizations shrank because Americans had less time for them. As wages stagnated, most people had to devote more time to work in order to make ends meet. That included the time of wives and mothers who began streaming into the paid workforce to prop up family incomes.

    At the same time, union membership plunged because corporations began sending jobs abroad and fighting attempts to unionize. (Ronald Reagan helped legitimize these moves when he fired striking air traffic controllers.)

    Other centers of countervailing power – retailers, farm cooperatives, and local and regional banks – also lost ground to national discount chains, big agribusiness, and Wall Street. Deregulation sealed their fates.

    Meanwhile, political parties stopped representing the views of most constituents. As the costs of campaigns escalated, parties morphed from state and local membership organizations into national fund-raising machines.

    We entered a vicious cycle in which political power became more concentrated in monied interests that used the power to their advantage – getting tax cuts, expanding tax loopholes, benefiting from corporate welfare and free-trade agreements, slicing safety nets, enacting anti-union legislation, and reducing public investments.

    These moves further concentrated economic gains at the top, while leaving out most of the rest of America.

    No wonder Americans feel powerless. No surprise we’re sick of politics, and many of us aren’t even voting.

    But if we give up on politics, we’re done for. Powerlessness is a self-fulfilling prophecy.

    The only way back toward a democracy and economy that work for the majority is for most of us to get politically active once again, becoming organized and mobilized.

    We have to establish a new countervailing power. 

    The monied interests are doing what they do best – making money. The rest of us need to do what we can do best – use our voices, our vigor, and our votes. 

  • The Rebirth of Stakeholder Capitalism?


    Saturday, August 9, 2014

    In recent weeks, the managers, employees, and customers of a New England chain of supermarkets called “Market Basket” have joined together to oppose the board of directors’ decision earlier in the year to oust the chain’s popular chief executive, Arthur T. Demoulas.

    Their demonstrations and boycotts have emptied most of the chain’s seventy stores.

    What was so special about Arthur T., as he’s known? Mainly, his business model. He kept prices lower than his competitors, paid his employees more, and gave them and his managers more authority.

    Late last year he offered customers an additional 4 percent discount, arguing they could use the money more than the shareholders.

    In other words, Arthur T. viewed the company as a joint enterprise from which everyone should benefit, not just shareholders. Which is why the board fired him.

    It’s far from clear who will win this battle. But, interestingly, we’re beginning to see the Arthur T. business model pop up all over the place.

    Patagonia, a large apparel manufacturer based in Ventura, California, has organized itself as a “B-corporation.” That’s a for-profit company whose articles of incorporation require it to take into account the interests of workers, the community, and the environment, as well as shareholders.

    The performance of B-corporations according to this measure is regularly reviewed and certified by a nonprofit entity called B Lab.

    To date, over 500 companies in sixty industries have been certified as B-corporations, including the household products firm “Seventh Generation.”

    In addition, 27 states have passed laws allowing companies to incorporate as “benefit corporations.” This gives directors legal protection to consider the interests of all stakeholders rather than just the shareholders who elected them.

    We may be witnessing the beginning of a return to a form of capitalism that was taken for granted in America sixty years ago.

    Then, most CEOs assumed they were responsible for all their stakeholders.

    “The job of management,” proclaimed Frank Abrams, chairman of Standard Oil of New Jersey, in 1951, “is to maintain an equitable and working balance among the claims of the various directly affected interest groups … stockholders, employees, customers, and the public at large.”

    Johnson & Johnson publicly stated that its “first responsibility” was to patients, doctors, and nurses, and not to investors.

    What changed? In the 1980s, corporate raiders began mounting unfriendly takeovers of companies that could deliver higher returns to their shareholders – if they abandoned their other stakeholders.

    The raiders figured profits would be higher if the companies fought unions, cut workers’ pay or fired them, automated as many jobs as possible or moved jobs abroad, shuttered factories, abandoned their communities, and squeezed their customers.  

    Although the law didn’t require companies to maximize shareholder value, shareholders had the legal right to replace directors. The raiders pushed them to vote out directors who wouldn’t make these changes and vote in directors who would (or else sell their shares to the raiders, who’d do the dirty work).

    Since then, shareholder capitalism has replaced stakeholder capitalism. Corporate raiders have morphed into private equity managers, and unfriendly takeovers are rare. But it’s now assumed corporations exist only to maximize shareholder returns.

    Are we better off? Some argue shareholder capitalism has proven more efficient. It has moved economic resources to where they’re most productive, and thereby enabled the economy to grow faster.

    By this view, stakeholder capitalism locked up resources in unproductive ways. CEOs were too complacent. Companies were too fat. They employed workers they didn’t need, and paid them too much. They were too tied to their communities.

    But maybe, in retrospect, shareholder capitalism wasn’t all it was cracked up to be. Look at the flat or declining wages of most Americans, their growing economic insecurity, and the abandoned communities that litter the nation.

    Then look at the record corporate profits, CEO pay that’s soared into the stratosphere, and Wall Street’s financial casino (along with its near meltdown in 2008 that imposed collateral damage on most Americans).

    You might conclude we went a bit overboard with shareholder capitalism. 

    The directors of “Market Basket” are now considering selling the company. Arthur T. has made a bid, but other bidders have offered more.

    Reportedly, some prospective bidders think they can squeeze more profits out of the company than Arthur T. did. 

    But Arthur T. may have known something about how to run a business that made it successful in a larger sense.

    Only some of us are corporate shareholders, and shareholders have won big in America over the last three decades.

    But we’re all stakeholders in the American economy, and many stakeholders have done miserably. 

    Maybe a bit more stakeholder capitalism is in order. 

  • Work and Worth


    Saturday, August 2, 2014

    What someone is paid has little or no relationship to what their work is worth to society. 

    Does anyone seriously believe hedge-fund mogul Steven A. Cohen is worth the $2.3 billion he raked in last year, despite being slapped with a $1.8 billion fine after his firm pleaded guilty to insider trading?

    On the other hand, what’s the worth to society of social workers who put in long and difficult hours dealing with patients suffering from mental illness or substance abuse? Probably higher than their average pay of $18.14 an hour, which translates into less than $38,000 a year.

    How much does society gain from personal-care aides who assist the elderly, convalescents, and persons with disabilities? Likely more than their average pay of $9.67 an hour, or just over $20,000 a year.

    What’s the social worth of hospital orderlies who feed, bathe, dress, and move patients, and empty their bedpans? Surely higher than their median wage of $11.63 an hour, or $24,190 a year.

    Or of child care workers, who get $10.33 an hour, $21,490 a year? And preschool teachers, who earn $13.26 an hour, $27,570 a year?

    Yet what would the rest of us do without these dedicated people?

    Or consider kindergarten teachers, who make an average of $53,590 a year.

    Before you conclude that’s generous, consider that a good kindergarten teacher is worth his or her weight in gold, almost.

    One study found that children with outstanding kindergarten teachers are more likely to go to college and less likely to become single parents than a random set of children similar to them in every way other than being assigned a superb teacher.

    And what of writers, actors, painters, and poets? Only a tiny fraction ever become rich and famous. Most barely make enough to live on (many don’t, and are forced to take paying jobs to pursue their art). But society is surely all the richer for their efforts.

    At the other extreme are hedge-fund and private-equity managers, investment bankers, corporate lawyers, management consultants, high-frequency traders, and top Washington lobbyists.

    They’re getting paid vast sums for their labors. Yet it seems doubtful that society is really that much better off because of what they do.

    I don’t mean to sound unduly harsh, but I’ve never heard of a hedge-fund manager whose job entails attending to basic human needs (unless you consider having more money a basic human need) or enriching our culture (except through the myriad novels, exposés, and movies made about greedy hedge-fund managers and investment bankers).

    They don’t even build the economy. 

    Most financiers, corporate lawyers, lobbyists, and management consultants are competing with other financiers, lawyers, lobbyists, and management consultants in zero-sum games that take money out of one set of pockets and put it into another.

    They’re paid gigantic amounts because winning these games can generate far bigger sums, while losing them can be extremely costly.

    It’s said that by moving money to where it can make more money, these games make the economy more efficient.

    In fact, the games amount to a mammoth waste of societal resources.

    They demand ever more cunning innovations but they create no social value. High-frequency traders who win by a thousandth of a second can reap a fortune, but society as a whole is no better off.

    Meanwhile, the games consume the energies of loads of talented people who might otherwise be making real contributions to society — if not by tending to human needs or enriching our culture then by curing diseases or devising new technological breakthroughs, or helping solve some of our most intractable social problems.  

    Graduates of Ivy League universities are more likely to enter finance and consulting than any other career. 

    For example, in 2010 (the most recent date for which we have data) close to 36 percent of Princeton graduates went into finance (down from the pre-financial crisis high of 46 percent in 2006). Add in management consulting, and it was close to 60 percent.

    The hefty endowments of such elite institutions are swollen with tax-subsidized donations from wealthy alumni, many of whom are seeking to guarantee their own kids’ admissions so they too can become enormously rich financiers and management consultants.

    But I can think of a better way for taxpayers to subsidize occupations with more social merit: Forgive the student debts of graduates who choose social work, child care, elder care, nursing, and teaching.  

  • The Increasing Irrelevance of Corporate Nationality


    Monday, July 28, 2014

    “You shouldn’t get to call yourself an American company only when you want a handout from the American taxpayers,” President Obama said Thursday.

    He was referring to American corporations now busily acquiring foreign companies in order to become non-American, thereby reducing their U.S. tax bill.

    But the President might as well have been talking about all large American multinationals. 

    Only about a fifth of IBM’s worldwide employees are American, for example, and only 40 percent of GE’s. Most of Caterpillar’s recent hires and investments have been made outside the US.

    In fact, since 2000, almost every big American multinational corporation has created more jobs outside the United States than inside. If you add in their foreign sub-contractors, the foreign total is even higher.

    At the same time, though, many foreign-based companies have been creating jobs in the United States. They now employ around 6 million Americans, and account for almost 20 percent of U.S. exports. Even a household brand like Anheuser-Busch, the nation’s best-selling beer maker, employing thousands of Americans, is foreign (part of Belgian-based beer giant InBev).

    Meanwhile, foreign investors are buying an increasing number of shares in American corporations, and American investors are buying up foreign stocks.

    Who’s us? Who’s them?

    Increasingly, corporate nationality is whatever a corporation decides it is.  

    So instead of worrying about who’s American and who’s not, here’s a better idea: Create incentives for any global company to do what we’d like it to do in the United States.

    For example, “American” corporations get generous tax credits and subsidies for research and development, courtesy of American taxpayers.

    But in reducing these corporations’ costs of R&D in the United States, those tax credits and subsidies can end up providing extra money for them to do more R&D abroad.

    3M is building research centers overseas at a faster clip than it’s expanding them in America. Its CEO explained this was “in preparation for a world where the West is no longer the dominant manufacturing power.”

    3M is hardly alone. Since the early 2000s, most of the growth in the number of R&D workers employed by U.S.-based multinational companies has been in their foreign operations, according to the National Science Board, the policy-making arm of the National Science Foundation.

    It would make more sense to limit R&D tax credits and subsidies to additional R&D done in the U.S. over and above current levels – and give them to any global corporation increasing its R&D in America, regardless of the company’s nationality.

    Or consider Ex-Im Bank subsidies – a topic of hot debate in Washington these days. These subsidies are intended to boost exports of American corporations from the United States.

    Tea Party Republicans call them “corporate welfare,” and Chamber-of-Commerce Republicans call them sensible investments. But regardless, they’re going to “American” multinationals that are making things all over the world.

    That means any subsidy that boosts their export earnings in the United States indirectly subsidizes their investments abroad – including, very possibly, their exports from foreign nations.

    GE, a major Ex-Im Bank beneficiary, has been teaming up with China to produce a new jetliner there that will compete with Boeing for global business. (Boeing, not incidentally, is another Ex-Im beneficiary.) In fact, GE is giving its Chinese partner the same leading-edge avionics technologies that operate Boeing’s 787 Dreamliner.

    Caterpillar, another Ex-Im Bank beneficiary, is providing engine funnels and hydraulics to Chinese firms that eventually will be exporting large moving equipment from China. Presumably they’ll be competing in global markets with Caterpillar itself.

    Rather than subsidize “American” exporters, it makes more sense to subsidize any global company – to the extent it’s adding to its exports from the United States.

    Which brings us back to American companies that are morphing into foreign companies in order to lower their U.S. tax bill.

    “I don’t care if it’s legal,” said the President. “It’s wrong.”

    It’s just as wrong for American corporations to hide their profits abroad – which many are doing simply by setting up foreign subsidiaries in low-tax jurisdictions, and then making it seem as if the foreign subsidiary is earning the money.

    Caterpillar, for example, saved $2.4 billion between 2000 and 2012 by funneling its global parts business through a Swiss subsidiary (a ruse so audacious that one of its tax consultants warned Caterpillar executives to “get ready to do some dancing” when called before Congress to justify it).  

    And what about American corporations that avoid U.S. taxes by never bringing home what they legitimately earn abroad – a sum now estimated to be in the order of $1.6 trillion?

    Rather than focus on the newly-fashionable tax-avoidance strategy of changing corporate nationality, it makes more sense to tax any global corporation on all income earned in the United States (with high penalties for shifting that income abroad), and no longer tax “American” corporations on revenues earned outside America. Most other nations already follow this principle. 

    In other words, let’s stop worrying about whether big global corporations are “American.” We can’t win that game. Focus instead on what we want global corporations of whatever nationality to do in America, and on how we can get them to do it.


