ROBERT B. REICH, Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies, was Secretary of Labor in the Clinton administration. Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written thirteen books, including the best sellers “Aftershock” and “The Work of Nations.” His latest, “Beyond Outrage,” is now out in paperback. He is also a founding editor of the American Prospect magazine and chairman of Common Cause. His new film, “Inequality for All,” is now available on Netflix, iTunes, DVD, and On Demand.



  • Why Government Spends More Per Pupil at Elite Private Universities than at Public Universities


    Monday, October 13, 2014

    Imagine a system of college education supported by high and growing government spending on elite private universities that mainly educate children of the wealthy and upper-middle class, and low and declining government spending on public universities that educate large numbers of children from the working class and the poor.

    You can stop imagining. That’s the American system right now.

    Government subsidies to elite private universities take the form of tax deductions for people who make charitable contributions to them. In economic terms, a tax deduction is the same as government spending: the revenue it forgoes has to be made up by other taxpayers.

    These tax subsidies are on the rise because in recent years a relatively small number of very rich people have accumulated far more money than they can possibly spend or even give away to their children. So they’re donating it to causes they believe in, such as the elite private universities that educated them or that they want their children to attend.

    Private university endowments are now around $550 billion, centered in a handful of prestigious institutions. Harvard’s endowment is over $32 billion, followed by Yale at $20.8 billion, Stanford at $18.6 billion, and Princeton at $18.2 billion.

    Each of these endowments increased last year by more than $1 billion, and these universities are actively seeking additional support. Last year Harvard launched a capital campaign for another $6.5 billion.

    Because of the charitable tax deduction, the government subsidy to these institutions amounts to about one out of every three dollars contributed.

    A few years back, Meg Whitman, now CEO of Hewlett-Packard, contributed $30 million to Princeton. In return she received a tax break estimated to be around $10 million.

    In effect, Princeton received $20 million from Whitman and $10 million from the U.S. Treasury – that is, from you and me and other taxpayers who made up the difference.

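    A minimal arithmetic sketch of that split, in Python, assuming a combined marginal tax rate of roughly one-third; the rate and the function name are illustrative assumptions chosen to match the Whitman estimate above, not figures from the column:

    ```python
    # Rough sketch of how a deductible gift is split between the donor and
    # other taxpayers. The ~33% marginal rate is an assumption chosen to match
    # the estimate cited above; actual rates vary by donor and year.

    def deduction_split(gift, marginal_rate=1/3):
        """Return (donor's net cost, forgone tax revenue) for a deductible gift."""
        forgone_revenue = gift * marginal_rate    # tax the Treasury gives up
        donor_net_cost = gift - forgone_revenue   # what the donor actually forgoes
        return donor_net_cost, forgone_revenue

    net, subsidy = deduction_split(30_000_000)
    print(f"Donor's net cost:        ${net:,.0f}")      # about $20,000,000
    print(f"Implicit public subsidy: ${subsidy:,.0f}")  # about $10,000,000
    ```
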
    Add in these endowments’ exemptions from taxes on capital gains and on the income they earn, and the total government expenditure is even larger.

    Divide by the relatively small number of students attending these institutions, and the amount of subsidy per student is huge.

    The annual government subsidy to Princeton University, for example, is about $54,000 per student, according to an estimate by economist Richard Vedder. Other elite privates aren’t far behind. 

    Public universities, by contrast, have little or no endowment income. They get almost all their funding from state governments. But these subsidies have been shrinking.

    State and local financing for public higher education came to about $76 billion last year, nearly 10 percent less than a decade before.

    Since more students attend public universities now than ten years ago, that decline represents a 30 percent drop per student.  

    That means the average annual government subsidy per student at a public university comes to less than $4,000, about one-tenth the per student government subsidy at the elite privates. 

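    A back-of-the-envelope sketch of that arithmetic, in Python; the roughly 28 percent enrollment growth is an assumption back-solved to be consistent with the 10 percent funding decline and 30 percent per-student drop described above, while the per-student amounts are the ones cited in the column:

    ```python
    # Back-of-the-envelope check of the funding arithmetic above. The ~28%
    # enrollment growth is an assumption back-solved to fit the column's figures;
    # the funding decline and per-student subsidies come from the text.

    def per_student_change(total_funding_change, enrollment_change):
        """Fractional change in funding per student, given total changes."""
        return (1 + total_funding_change) / (1 + enrollment_change) - 1

    drop = per_student_change(total_funding_change=-0.10, enrollment_change=0.28)
    print(f"Change in funding per student: {drop:.0%}")  # roughly -30%

    public_subsidy = 4_000      # average public university, approximate (per the column)
    princeton_subsidy = 54_000  # Vedder's estimate for Princeton (per the column)
    print(f"Public subsidy as a share of Princeton's: "
          f"{public_subsidy / princeton_subsidy:.0%}")   # compare the column's "about one-tenth"
    ```
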
    What justifies so much government spending per student in private elite universities relative to public ones?

    It’s not that the private elites educate more children from poor families. One way to know is to look at the percentage of their students receiving Pell Grants, which are available only to children from poor families. (The grants themselves are relatively modest, paying a maximum of $5,645.)

    In fact, the elite privates with large endowments educate a smaller percentage of poor students than universities with little or no endowment income.

    According to a survey by the National Association of College and University Business Officers, only 16 percent of students at highly endowed private universities receive Pell Grants, on average, compared with 59 percent at the lowest-endowed institutions.

    At Harvard, 11 percent of students receive Pell Grants; at Yale, it’s 14 percent; Princeton, 12 percent; Stanford, 17 percent.

    By contrast, 59 percent of students at the University of Texas at El Paso receive Pell Grants, 53 percent at the University of California at Riverside, and 33 percent at the University of California at Berkeley.

    Moreover, because public universities have many more students than elite private universities, their larger percentages of Pell students represent far greater numbers of students from poor families.

    For example, the University of California at Berkeley has more Pell-eligible students than the entire Ivy League put together.

    But perhaps the far higher per-student subsidies received by elite private universities are justified because they’re training more future leaders who will be in a position to reduce the nation’s widening inequality.

    Unfortunately, there’s not much evidence for that proposition. According to a study by sociologist Lauren Rivera, 70 percent of Harvard’s senior class submits résumés to Wall Street and consulting firms. In 2007, before the global financial meltdown, almost 50 percent of Harvard seniors (58 percent of the men, 43 percent of the women) took jobs on Wall Street.

    Among Harvard seniors who got jobs last spring, 3.5 percent were headed to government and politics, 5 percent to health-related fields, and 8.8 percent to any form of public service. The percentages at the other Ivies are not much larger.

    So what justifies the high per-student government subsidies at the elite private universities, and the low per-student subsidies in public universities?

    There is no justification.

    [I want to thank Dean Henry E. Brady of the Goldman School of Public Policy for the memorandum that provided the idea for this column and for helpful discussions about it.]

  • Why We Allow Big Pharma to Rip Us Off


    Sunday, October 5, 2014

    According to a new federal database put online last week, pharmaceutical companies and device makers paid doctors some $380 million in speaking and consulting fees over a five-month period in 2013.

    Some doctors received over half a million dollars each, and others got millions of dollars in royalties from products they helped develop.

    Doctors claim these payments have no effect on what they prescribe. But why would drug companies shell out all this money if it didn’t provide them a healthy return on their investment?

    America spends a fortune on drugs, more per person than any other nation on earth, even though Americans are no healthier than the citizens of other advanced nations.

    Of the estimated $2.7 trillion America spends annually on health care, drugs account for 10 percent of the total.

    Government pays some of this tab through Medicare, Medicaid, and subsidies under the Affordable Care Act, which means we pick up that part of it indirectly, through our taxes.

    We pay the rest of it directly, through higher co-payments, deductibles, and premiums.

    Drug company payments to doctors are a small part of a much larger strategy by Big Pharma to clean our pockets.

    Another technique is called “product hopping”: making small, insignificant changes to a drug whose patent is about to expire, so that it’s technically new.

    For example, last February, before the patent expired on Namenda, its widely used drug to treat Alzheimer’s, Forest Laboratories announced it would stop selling the existing tablet form in favor of new extended-release capsules called Namenda XR.

    The capsules were just a reformulated version of the tablet. But even the minor change prevented pharmacists from substituting generic versions of the tablet.

    Result: Higher profits for Forest Labs and higher costs for you and me.  

    Another technique is for drug companies to continue to aggressively advertise prescription brands long after their twenty-year patents have expired, so patients ask their doctors for them. Many doctors will comply.

    America is one of the few advanced nations that allow direct advertising of prescription drugs.

    A fourth tactic is for drug companies to pay the makers of generic drugs to delay their cheaper versions. These so-called “pay-for-delay” agreements generate big profits for both the proprietary manufacturers and the generics. But here again, you and I pay. The tactic costs us an estimated $3.5 billion a year.

    Europe doesn’t allow these sorts of payoffs, but they’re legal in the United States because the major drug makers and generics have fought off any legislative attempts to stop them.

    Finally, while other nations set wholesale drug prices, the law prohibits the U.S. government from using its considerable bargaining power under Medicare and Medicaid to negotiate lower drug prices. This was part of the deal Big Pharma extracted for its support of the Affordable Care Act of 2010.

    The drug companies say they need the additional profits to pay for researching and developing new drugs.

    But the government supplies much of the research Big Pharma relies on, through the National Institutes of Health.

    Meanwhile, Big Pharma is spending more on advertising and marketing than on research and development – often tens of millions to promote a single drug.

    And it’s spending hundreds of millions more every year lobbying. Last year alone, the lobbying tab came to $225 million, according to the Center for Responsive Politics.

    That’s more than the formidable lobbying expenditures of America’s military contractors.

    In addition, Big Pharma is spending heavily on political campaigns. In 2012, it shelled out over $36 million, making it the biggest political contributor of all American industries.

    Why do we put up with this? It’s too facile to say we have no choice given how much the industry is spending on politics. If the public were sufficiently outraged, politicians and regulatory agencies wouldn’t allow this giant ripoff.

    But the public isn’t outraged. That’s partly because much of this strategy is hidden from public view.

    But I think it’s also because we’ve bought the ideological claptrap of the “free market” being separate from and superior to government.

    And since private property and freedom of contract are the core of the free market, we assume drug companies have every right to charge what they want for the property they sell.

    Yet in reality the “free market” can’t be separated from government because government determines the rules of the game.

    It determines, for example, what can be patented and for how long, what side payoffs create unlawful conflicts of interest, what basic research should be subsidized, and when government can negotiate low prices.

    The critical question is not whether government should play a role in the market. Without such government decisions there would be no market, and no new drugs.

    The issue is how government organizes the market. So long as big drug makers have a disproportionate say in these decisions, the rest of us pay through the nose. 

  • Why the Economy is Still Failing Most Americans


    Sunday, September 28, 2014

    I was in Seattle, Washington, recently, to congratulate union and community organizers who helped Seattle enact the first $15 per hour minimum wage in the country.

    Other cities and states should follow Seattle’s example.

    Contrary to the dire predictions of opponents, the hike won’t cost Seattle jobs. In fact, it will put more money into the hands of low-wage workers who are likely to spend almost all of it in the vicinity. That will create jobs.

    Conservatives believe the economy functions better if the rich have more money and everyone else has less. But they’re wrong. It’s just the opposite. 

    The real job creators are not CEOs or corporations or wealthy investors. The job creators are members of America’s vast middle class and the poor, whose purchases cause businesses to expand and invest. 

    America’s wealthy are richer than they’ve ever been. Big corporations are sitting on more cash than they know what to do with. Corporate profits are at record levels. CEO pay continues to soar.

    But the wealthy aren’t investing in new companies. Between 1980 and 2014, the rate of new business formation in the United States dropped by half, according to a Brookings study released in May.

    Corporations aren’t expanding production or investing in research and development. Instead, they’re using their money to buy back their shares of stock.

    There’s no reason for them to expand or invest if customers aren’t buying.

    Consumer spending has grown more slowly in this recovery than in any previous one because consumers don’t have enough money to buy. 

    All the economic gains have been going to the top.

    The Commerce Department reported last Friday that the economy grew at a 4.6 percent annual rate in the second quarter of the year.

    So what? The median household’s income continues to drop.

    Median household income is now 8 percent below what it was in 2007, adjusted for inflation. It’s 11 percent below its level in 2000.

    It used to be that economic expansions improved the incomes of the bottom 90 percent more than the top 10 percent.

    But starting with the “Reagan” recovery of 1982 to 1990, the benefits of economic growth during expansions have gone mostly to the top 10 percent.

    Since the current recovery began in 2009, all economic gains have gone to the top 10 percent. The bottom 90 percent has lost ground.

    We’re in the first economic upturn on record in which 90 percent of Americans have become worse off.

    Why did the playing field start to tilt against the middle class in the Reagan recovery, and why has it tilted further ever since?

    Don’t blame globalization. Other advanced nations facing the same global competition have managed to preserve middle class wages. Germany’s median wage is now higher than America’s.

    One factor here has been a sharp decline in union membership. In the mid 1970s, 25 percent of the private-sector workforce was unionized.

    Then came the Reagan revolution. By the end of the 1980s, only 17 percent of the private workforce was unionized. Today, fewer than 7 percent of the nation’s private-sector workers belong to a union.

    This means most workers no longer have the bargaining power to get a share of the gains from growth.

    Another structural change is the drop in the minimum wage. In 1979, it was $9.67 an hour (in 2013 dollars). By 1990, it had declined to $6.84. Today it’s $7.25, well below where it was in 1979.

    Given that workers are far more productive now – computers have even increased the output of retail and fast food workers — the minimum wage should be even higher.

    By setting a floor on wages, a higher minimum helps push up other wages. It undergirds higher median household incomes.

    The only way to grow the economy in a way that benefits the bottom 90 percent is to change the structure of the economy. At the least, this requires stronger unions and a higher minimum wage.

    It also requires better schools for the children of the bottom 90 percent, better access to higher education, and a more progressive tax system.

    GDP growth is less and less relevant to the wellbeing of most Americans. We should be paying less attention to growth and more to median household income.

    If the median household’s income is heading upward, the economy is in good shape. If it’s heading downward, as it’s been for this entire recovery, we’re all in deep trouble.

     

  • Why Ordinary People Bear Economic Risks and Donald Trump Doesn’t


    Sunday, September 21, 2014

    Thirty years ago, on its opening day in 1984, Donald Trump stood in a dark topcoat on the casino floor at Atlantic City’s Trump Plaza, celebrating his new investment as the finest building in Atlantic City and possibly the nation.

    Last week, the Trump Plaza folded and the Trump Taj Mahal filed for bankruptcy, leaving some 1,000 employees without jobs.

    Trump, meanwhile, was on Twitter claiming he had “nothing to do with Atlantic City,” and praising himself for his “great timing” in getting out of the investment.

    In America, people with lots of money can easily avoid the consequences of bad bets and big losses by cashing out at the first sign of trouble.

    The laws protect them through limited liability and bankruptcy.

    But workers who move to a place like Atlantic City for a job, invest in a home there, and build their skills, have no such protection. Jobs vanish, skills are suddenly irrelevant, and home values plummet.

    They’re stuck with the mess.

    Bankruptcy was designed so people could start over. But these days, the only ones starting over are big corporations, wealthy moguls, and Wall Street.

    Corporations are even using bankruptcy to break contracts with their employees. When American Airlines went into bankruptcy three years ago, it voided its labor agreements and froze its employee pension plan.

    After it emerged from bankruptcy last year and merged with US Airways, American’s creditors were fully repaid, its shareholders came out richer than they went in, and its CEO got a severance package valued at $19.9 million.

    But American’s former employees got shafted.

    Wall Street doesn’t worry about failure, either. As you recall, the Street almost went belly up six years ago after risking hundreds of billions of dollars on bad bets.

    A generous bailout from the federal government kept the bankers afloat. And since then, most of the denizens of the Street have come out just fine.

    Yet more than 4 million American families have so far lost their homes. They were caught in the downdraft of the Street’s gambling excesses.

    They had no idea the housing bubble would burst, and didn’t read the fine print in the mortgages the bankers sold them.

    But they weren’t allowed to declare bankruptcy and try to keep their homes. 

    When some members of Congress tried to amend the law to allow homeowners to use bankruptcy, the financial industry blocked the bill.

    There’s no starting over for millions of people laden with student debt, either.

    Student loan debt has more than doubled since 2006, from $509 billion to $1.3 trillion. It now accounts for 40 percent of all personal debt – more than credit card debts and auto loans.

    But the bankruptcy law doesn’t cover student debts. The student loan industry made sure of that.

    If former students can’t meet their payments, lenders can garnish their paychecks. (Some borrowers, still behind by the time they retire, have even found chunks taken out of their Social Security checks.)

    The only way borrowers can reduce their student debt burdens is to prove in a separate lawsuit that repayment would impose an “undue hardship” on them and their dependents.

    This is a stricter standard than bankruptcy courts apply to gamblers trying to reduce their gambling debts.

    You might say those who can’t repay their student debts shouldn’t have borrowed in the first place. But they had no way of knowing just how bad the jobs market would become. Some didn’t know the diplomas they received from for-profit colleges weren’t worth the paper they were written on.

    A better alternative would be to allow former students to use bankruptcy where the terms of the loans are clearly unreasonable (including double-digit interest rates, for example), or the loans were made to attend schools whose graduates have very low rates of employment after graduation.

    Economies are risky. Some industries rise and others implode, like housing. Some places get richer and others, like Atlantic City, decline. Some people get new jobs that pay better; many others lose their jobs or their wages.

    The basic question is who should bear these risks. As long as the laws shield large investors while putting the risks on ordinary people, investors will continue to make big bets that deliver jackpots when they win but create losses for everyone else.

    Average working people need more fresh starts. Big corporations, banks, and Donald Trump need fewer. 

  • Harvard Business School’s Role in Widening Inequality


    Saturday, September 13, 2014

    No institution is more responsible for educating the CEOs of American corporations than Harvard Business School – inculcating in them a set of ideas and principles that have resulted in a pay gap between CEOs and ordinary workers that’s gone from 20-to-1 fifty years ago to almost 300-to-1 today.

    A survey of 1,947 Harvard Business School alumni, released on September 6, showed them far more hopeful about the future competitiveness of American firms than about the future of American workers.

    As the authors of the survey conclude, such a divergence is unsustainable. Without a large and growing middle class, Americans won’t have the purchasing power to keep U.S. corporations profitable, and global demand won’t fill the gap. Moreover, the widening gap eventually will lead to political and social instability. As the authors put it, “any leader with a long view understands that business has a profound stake in the prosperity of the average American.”

    Unfortunately, the authors neglected to include a discussion about how Harvard Business School should change what it teaches future CEOs with regard to this “profound stake.” HBS has made some changes over the years in response to earlier crises, but has not gone nearly far enough with courses that critically examine the goals of the modern corporation and the role that top executives play in achieving them.

    A half-century ago, CEOs typically managed companies for the benefit of all their stakeholders – not just shareholders, but also their employees, communities, and the nation as a whole.

    “The job of management,” proclaimed Frank Abrams, chairman of Standard Oil of New Jersey, in a 1951 address, “is to maintain an equitable and working balance among the claims of the various directly affected interest groups … stockholders, employees, customers, and the public at large. Business managers are gaining professional status partly because they see in their work the basic responsibilities [to the public] that other professional men have long recognized as theirs.” 

    This was a common view among chief executives of the time. Fortune magazine urged CEOs to become “industrial statesmen.” And to a large extent, that’s what they became.

    For thirty years after World War II, as American corporations prospered, so did the American middle class. Wages rose and benefits increased. American companies and American citizens achieved a virtuous cycle of higher profits accompanied by more and better jobs.

    But starting in the late 1970s, a new vision of the corporation and the role of CEOs emerged – prodded by corporate “raiders,” hostile takeovers, junk bonds, and leveraged buyouts. Shareholders began to predominate over other stakeholders. And CEOs began to view their primary role as driving up share prices. To do this, they had to cut costs – especially payrolls, which constituted their largest expense.

    Corporate statesmen were replaced by something more like corporate butchers, with their nearly exclusive focus being to “cut out the fat” and “cut to the bone.”

    In consequence, the compensation packages of CEOs and other top executives soared, as did share prices. But ordinary workers lost jobs and wages, and many communities were abandoned. Almost all the gains from growth went to the top.

    The results were touted as being “efficient,” because resources were theoretically shifted to “higher and better uses,” to use the dry language of economics.

    But the human costs of this transformation have been substantial, and the efficiency benefits have not been widely shared. Most workers today are no better off than they were thirty years ago, adjusted for inflation. Most are less economically secure.

    So it would seem worthwhile for the faculty and students of Harvard Business School, as well as those at every other major business school in America, to assess this transformation, and ask whether maximizing shareholder value – a convenient goal now that so many CEOs are paid with stock options – continues to be the proper goal for the modern corporation.

    Can an enterprise be truly successful in a society becoming ever more divided between a few highly successful people at the top and a far larger number who are not thriving?

    For years, some of the nation’s most talented young people have flocked to Harvard Business School and other elite graduate schools of business in order to take up positions on the top rungs of American corporations, on Wall Street, or in management consulting.

    Their educations represent a substantial social investment; and their intellectual and creative capacities, a precious national and global resource.

    But given that so few in our society – or even in other advanced nations – have shared in the benefits of what our largest corporations and Wall Street entities have achieved, it must be asked whether the social return on such an investment has been worth it, and whether these graduates are making the most of their capacities in terms of their potential for improving human well-being.

    These questions also merit careful examination at Harvard and other elite universities. If the answer is not a resounding yes, perhaps we should ask whether these investments and talents should be directed toward “higher and better” uses.

    [This essay originally appeared in the Harvard Business Review’s blog, at http://blogs.hbr.org/2014/09/how-business-schools-can-help-reduce-inequality/]

  • Berkeley vs. Big Soda


    Monday, September 8, 2014

    I was phoned the other night in the middle of dinner by an earnest young man named Spencer, who said he was doing a survey.

    Rather than hang up I agreed to answer his questions. He asked me if I knew a soda tax would be on the ballot in Berkeley in November. When I said yes, he then asked whether I trusted the Berkeley city government to spend the revenues wisely.

    At that moment I recognized a classic “push poll,” which is part of a paid political campaign.

    So I asked Spencer a couple of questions of my own. Who was financing his survey? “Americans for Food and Beverage Choice,” he answered. Who was financing this group? “The American Beverage Association,” he said.

    Spencer was so eager to get off the phone I didn’t get to ask him my third question: Who’s financing the American Beverage Association? It didn’t matter. I knew the answer: PepsiCo and Coca-Cola.

    Welcome to Berkeley, California: Ground Zero in the Soda Wars.

    Fifty years ago this month, Berkeley was the epicenter of the Free Speech Movement. Now, Berkeley is moving against Big Soda.

    The new movement isn’t nearly as dramatic or idealistic as the old one, but the odds of victory were probably better fifty years ago. The Free Speech Movement didn’t challenge the profitability of one of the nation’s most powerful industries.

    Sugary drinks are blamed for increasing the rates of chronic disease and obesity in America. Yet efforts to reduce their consumption through taxes or other measures have gone nowhere. The beverage industry has spent millions defeating them.

    If on November 4 a majority of Berkeley voters say yes to a one-cent-per-fluid-ounce tax on distributors of sugary drinks, Berkeley could be the first city in the nation to pass a soda tax. (San Franciscans will be voting on a two-cent-per-ounce proposal that requires two-thirds approval; Berkeley needs a mere majority.)

    But if a soda tax can’t pass in the most progressive city in America, it can’t pass anywhere. Big Soda knows that, which is why it’s determined to kill it here.

    Taxing a product to reduce its consumption has been effective with cigarettes. According to the American Cancer Society, every 10 percent increase in the cost of a pack of cigarettes has caused a 4 percent decline in the rate of smoking.

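    Expressed as a simple constant price elasticity, that relationship looks like the sketch below; treating the cited 10 percent / 4 percent figures as an elasticity of about -0.4, and extrapolating it to other price increases, is my simplification for illustration, not a claim from the American Cancer Society:

    ```python
    # Sketch of the price-responsiveness cited above: a 10% price increase
    # associated with a ~4% decline in smoking, i.e. an elasticity of roughly -0.4.
    # Treating this as constant across price changes is a simplification.

    ELASTICITY = -0.4  # % change in smoking per % change in price

    def expected_change(price_increase, elasticity=ELASTICITY):
        return elasticity * price_increase

    print(f"{expected_change(0.10):+.0%} change in smoking for a 10% price increase")  # -4%
    print(f"{expected_change(0.25):+.0%} change in smoking for a 25% price increase")  # -10%
    ```
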
    And for years cigarette manufacturers waged an all-out war to prevent any tax or regulation. They eventually lost, and today it’s hard to find anyone who proudly smokes.

    Maybe that’s the way the Soda Wars will end, too. Consumption of sugary soft drinks is already down somewhat from what it was ten years ago, but kids (and many adults) are still guzzling them.

    Berkeley’s Soda War pits a group of community organizations, city and school district officials, and other individuals (full disclosure: I’m one of them) against Big Soda’s own “grassroots” group, describing itself as “a coalition of citizens, local businesses, and community organizations” without identifying its members.

    Even though a Field Research poll released in February found 67 percent of California voters (and presumably a similar percentage of Berkeley voters) favor a soda tax if revenues are spent on healthy initiatives, it will be an uphill fight.

    Since 2009, some thirty special taxes on sugary drinks have been introduced in various states and cities, but none has passed. Not even California’s legislature, with Democratic majorities in both houses, could enact a proposal putting warning labels on sodas.

    Even New York City’s former and formidable mayor Michael Bloomberg – no slouch when it came to organizing – lost to Big Soda. He wanted to limit the size of sugary drinks sold in restaurants and other venues to 16 ounces.

    But the beverage industry waged a heavy marketing campaign against the proposal, including ads featuring the Statue of Liberty holding up a giant soda instead of a torch. It also fought it through the courts. Finally the state’s highest court ruled that the city’s Board of Health overstepped its authority by imposing the cap.

    Fifty years ago, Berkeley’s Free Speech Movement captured the nation’s attention and imagination. It signaled a fundamental shift in the attitudes of young Americans toward older forms of authority.

    Times have changed. Four years ago the Supreme Court decided corporations were people under the First Amendment, entitled to their own freedom of speech. Since then, Big Soda has poured a fortune into defeating ballot initiatives to tax or regulate sugared drinks.

    But have times changed all that much? In its battle with Big Soda, Berkeley may once again make history.

  • The Bankruptcy of Detroit and the Division of America


    Friday, September 5, 2014

    Detroit is the largest city ever to seek bankruptcy protection, so its bankruptcy is seen as a potential model for other American cities now teetering on the edge.

    But Detroit is really a model for how wealthier and whiter Americans escape the costs of public goods they’d otherwise share with poorer and darker Americans.

    Judge Steven W. Rhodes of the U.S. Bankruptcy Court for the Eastern District of Michigan is now weighing Detroit’s plan to shed $7 billion of its debts and restore some $1.5 billion of city services by requiring various groups of creditors to make sacrifices.

    Among those being asked to sacrifice are Detroit’s former city employees, now dependent on pensions and healthcare benefits the city agreed years before to pay, as well as investors who bought $1.4 billion worth of bonds the city issued in 2005.

    Both groups claim the plan unfairly burdens them. Under it, the 2005 investors emerge with little or nothing, and Detroit’s retirees have their pensions cut 4.5 percent, lose some health benefits, and do without cost-of-living increases.

    No one knows whether Judge Rhodes will accept or reject the plan. But one thing is for certain: a very large and prosperous group close by won’t sacrifice a cent. They’re the mostly white citizens of neighboring Oakland County.

    Oakland County is the fourth wealthiest county in the United States among counties with a million or more residents.

    In fact, Greater Detroit, including its suburbs, ranks among the nation’s top financial centers and top four centers of high-technology employment, and is the second largest source of engineering and architectural talent in America.

    The median household in the County earned over $65,000 last year. The median household in Birmingham, Michigan, just across Detroit’s border, earned more than $94,000. In nearby Bloomfield Hills, still within the Detroit metropolitan area, the median was close to $105,000.

    Detroit’s upscale suburbs also have excellent schools, rapid-response security, and resplendent parks.

    Forty years ago, Detroit had a mixture of wealthy, middle class, and poor. But then its middle class and white residents began fleeing to the suburbs. Between 2000 and 2010, the city lost a quarter of its population.

    By the time it declared bankruptcy, Detroit was almost entirely poor. Its median household income was $26,000. More than half of its children were impoverished.

    That left it with depressed property values, abandoned neighborhoods, empty buildings, and dilapidated schools. Forty percent of its streetlights don’t work. More than half its parks closed within the last five years.

    Earlier this year, monthly water bills in Detroit were running 50 percent higher than the national average, and officials began shutting off the water to 150,000 households who couldn’t pay the bills.

    Official boundaries are often hard to see. Heading north on Woodward Avenue, away from downtown Detroit, you wouldn’t know exactly when you had left the city and crossed into Oakland County, except for a small sign that tells you.

    But boundaries can make all the difference. Had the official boundary been drawn differently to encompass both Oakland County and Detroit – creating, say, a “Greater Detroit” – Oakland’s more affluent citizens would have some responsibility to address Detroit’s problems, and Detroit would likely have enough money to pay all its bills and provide its residents with adequate public services.

    But because Detroit’s boundary surrounds only the poor inner city, those inside it have to deal with their compounded problems themselves. The whiter and more affluent suburbs (and the banks that serve them) are off the hook.

    Any hint they should take some responsibility has invited righteous indignation. “Now, all of a sudden, they’re having problems and they want to give part of the responsibility to the suburbs?” scoffs L. Brooks Patterson, the Oakland County executive. “They’re not gonna’ talk me into being the good guy. ‘Pick up your share?’ Ha ha.”

    Buried within the bankruptcy of Detroit is a fundamental political and moral question: Who are “we,” and what are our obligations to one another?

    Are Detroit, its public employees, poor residents, and bondholders the only ones who should sacrifice when “Detroit” can’t pay its bills? Or does the relevant sphere of responsibility include Detroit’s affluent suburbs — to which many of the city’s wealthier residents fled as the city declined, along with the banks that serve them?

    Judge Rhodes won’t address these questions. But as Americans continue to segregate by income into places becoming either wealthier or poorer, the rest of us will have to answer questions like these, eventually. 

     

  • Back to College, the Only Gateway to the Middle Class


    Monday, September 1, 2014

    This week, millions of young people head to colleges and universities, aiming for a four-year liberal arts degree. They assume that degree is the only gateway to the American middle class.

    It shouldn’t be.

    For one thing, a four-year liberal arts degree is hugely expensive. Too many young people graduate laden with debts that take years if not decades to pay off.

    And too many of them can’t find good jobs when they graduate, in any event. So they have to settle for jobs that don’t require four years of college. They end up overqualified for the work they do, and underwhelmed by it.

    Others drop out of college because they’re either unprepared or unsuited for a four-year liberal arts curriculum. When they leave, they feel like failures. 

    We need to open other gateways to the middle class. 

    Consider, for example, technician jobs. They don’t require a four-year degree. But they do require mastery over a domain of technical knowledge, which can usually be obtained in two years.

    Technician jobs are growing in importance. As digital equipment replaces the jobs of routine workers and lower-level professionals, technicians are needed to install, monitor, repair, test, and upgrade all the equipment.

    Hospital technicians are needed to monitor ever more complex equipment that now fills medical centers; office technicians, to fix the hardware and software responsible for much of the work that used to be done by secretaries and clerks.

    Automobile technicians are in demand to repair the software that now powers our cars; manufacturing technicians, to upgrade the numerically controlled machines and 3-D printers that have replaced assembly lines; laboratory technicians, to install and test complex equipment for measuring results; telecommunications technicians, to install, upgrade, and repair the digital systems linking us to one another.

    Technology is changing so fast that knowledge about specifics can quickly become obsolete. That’s why so much of what technicians learn is on the job.

    But to be effective on-the-job learners, technicians need basic knowledge of software and engineering, along with knowledge of the domain where the technology is applied – hospitals, offices, automobiles, manufacturing, laboratories, telecommunications, and so forth.

    Yet America isn’t educating the technicians we need. As our aspirations increasingly focus on four-year college degrees, we’ve allowed vocational and technical education to be downgraded and denigrated.

    Still, we have a foundation to build on. Community colleges offering two-year degree programs today enroll more than half of all college and university undergraduates. Many students are in full-time jobs, taking courses at night and on weekends. Many are adults.

    Community colleges are great bargains. They avoid the fancy amenities four-year liberal arts colleges need in order to lure the children of the middle class.

    Even so, community colleges are being systematically starved of funds. On a per-student basis, state legislatures direct most higher-education funding to four-year colleges and universities because that’s what their middle-class constituents want for their kids.

    American businesses, for their part, aren’t sufficiently involved in designing community college curricula and hiring their graduates, because their executives are usually the products of four-year liberal arts institutions and don’t know the value of community colleges. 

    By contrast, Germany provides its students the alternative of a world-class technical education that’s kept the German economy at the forefront of precision manufacturing and applied technology.

    The skills taught are based on industry standards, and courses are designed by businesses that need the graduates. So when young Germans get their degrees, jobs are waiting for them.

    We shouldn’t replicate the German system in full. It usually requires students and their families to choose a technical track by age 14. “Late bloomers” can’t get back on an academic track.

    But we can do far better than we’re doing now. One option: Combine the last year of high school with the first year of community college into a curriculum to train technicians for the new economy.

    Affected industries would help design the courses and promise jobs to students who finish successfully. Late bloomers can go on to get their associate degrees and even transfer to four-year liberal arts universities.

    This way we’d provide many young people who cannot or don’t want to pursue a four-year degree with the fundamentals they need to succeed, creating another gateway to the middle class.

    Too often in modern America, we equate “equal opportunity” with an opportunity to get a four-year liberal arts degree. It should mean an opportunity to learn what’s necessary to get a good job. 

  • Back to School, and to Widening Inequality


    Monday, August 25, 2014

    American kids are getting ready to head back to school. But the schools they’re heading back to differ dramatically by family income.

    Which helps explain the growing achievement gap between lower and higher-income children.

    Thirty years ago, the average gap on SAT-type tests between children of families in the richest 10 percent and bottom 10 percent was about 90 points on an 800-point scale. Today it’s 125 points.

    The gap in the mathematical abilities of American kids, by income, is one of the widest among the 65 countries participating in the Program for International Student Assessment.

    On their reading skills, children from high-income families score 110 points higher, on average, than those from poor families. This is about the same disparity that exists between average test scores in the United States as a whole and Tunisia.

    The achievement gap between poor kids and wealthy kids isn’t mainly about race. In fact, the racial achievement gap has been narrowing.

    It’s a reflection of the nation’s widening gulf between poor and wealthy families. It also reflects how schools in poor and rich communities are financed, and the nation’s increasing residential segregation by income.

    According to the Pew Research Center’s analysis of 2010 census tract and household income data, residential segregation by income has increased during the past three decades across the United States and in 27 of the nation’s 30 largest metropolitan areas.

    This matters, because a large portion of the money to support public schools comes from local property taxes. The federal government provides only about 14 percent of all funding, and the states provide 44 percent, on average. The rest, roughly 42 percent, is raised locally.

    Most states do try to give more money to poor districts, but most states cut way back on their spending during the recession and haven’t nearly made up for the cutbacks.

    Meanwhile, many of the nation’s local real estate markets remain weak, especially in lower-income communities. So local tax revenues are down.

    As we segregate by income into different communities, schools in lower-income areas have fewer resources than ever.

    The result is widening disparities in funding per pupil, to the direct disadvantage of poor kids.

    The wealthiest highest-spending districts are now providing about twice as much funding per student as are the lowest-spending districts, according to a federal advisory commission report. In some states, such as California, the ratio is more than three to one.

    What are called “public schools” in many of America’s wealthy communities aren’t really “public” at all. In effect, they’re private schools, whose tuition is hidden away in the purchase price of upscale homes there, and in the corresponding property taxes.

    Even where courts have required richer school districts to subsidize poorer ones, large inequalities remain.

    Rather than pay extra taxes that would go to poorer districts, many parents in upscale communities have quietly shifted their financial support to tax-deductible “parents’ foundations” designed to enhance their own schools.

    About 12 percent of the more than 14,000 school districts across America are funded in part by such foundations. They’re paying for everything from a new school auditorium (Bowie, Maryland) to a high-tech weather station and language-arts program (Newton, MA).

    “Parents’ foundations,” observed the Wall Street Journal, “are visible evidence of parents’ efforts to reconnect their money to their kids.” And not, it should have been noted, to kids in another community, who are likely to be poorer.

    As a result of all this, the United States is one of only three, out of 34 advanced nations surveyed by the OECD, whose schools serving higher-income children have more funding per pupil and lower student-teacher ratios than do schools serving poor students (the two others are Turkey and Israel).

    Other advanced nations do it differently. Their national governments provide 54 percent of funding, on average, and local taxes account for less than half the portion they do in America. And they target a disproportionate share of national funding to poorer communities.

    As Andreas Schleicher, who runs the OECD’s international education assessments, told the New York Times, “the vast majority of OECD countries either invest equally into every student or disproportionately more into disadvantaged students. The U.S. is one of the few countries doing the opposite.”

    Money isn’t everything, obviously. But how can we pretend it doesn’t count? Money buys the most experienced teachers, less-crowded classrooms, high-quality teaching materials, and after-school programs.

    Yet we seem to be doing everything except getting more money to the schools that most need it.

    We’re requiring that all schools meet high standards, requiring students to take more and more tests, and judging teachers by their students’ test scores.

    But until we recognize we’re systematically hobbling schools serving disadvantaged kids, we’re unlikely to make much headway. 

  • The Disease of American Democracy


    Monday, August 18, 2014

    Americans are sick of politics. Only 13 percent approve of the job Congress is doing, a near record low. The President’s approval ratings are also in the basement.

    A large portion of the public doesn’t even bother voting. Only 57.5 percent of eligible voters cast their ballots in the 2012 presidential election. 

    Put simply, most Americans feel powerless, and assume the political game is fixed. So why bother? 

    A new study by Princeton’s Martin Gilens and Northwestern University’s Benjamin Page, scheduled to be published this fall, confirms our worst suspicions.

    Gilens and Page analyzed 1,799 policy issues in detail, determining the relative influence on them of economic elites, business groups, mass-based interest groups, and average citizens.

    Their conclusion: “The preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy.”

    Instead, lawmakers respond to the policy demands of wealthy individuals and monied business interests – those with the most lobbying prowess and deepest pockets to bankroll campaigns.

    Before you’re tempted to say “duh,” wait a moment. Gilens’ and Page’s data come from the period 1981 to 2002. This was before the Supreme Court opened the floodgates to big money in “Citizens United,” prior to SuperPACs, and before the Wall Street bailout.

    So it’s likely to be even worse now.

    But did the average citizen ever have much power? The eminent journalist and commentator Walter Lippmann argued in his 1922 book “Public Opinion” that the broad public didn’t know or care about public policy. Its consent was “manufactured” by an elite that manipulated it. “It is no longer possible … to believe in the original dogma of democracy,” Lippmann concluded.

    Yet American democracy seemed robust compared to other nations that in the first half of the twentieth century succumbed to communism or totalitarianism.

    Political scientists after World War II hypothesized that even though the voices of individual Americans counted for little, most people belonged to a variety of interest groups and membership organizations – clubs, associations, political parties, unions – to which politicians were responsive.

    “Interest-group pluralism,” as it was called, thereby channeled the views of individual citizens, and made American democracy function.

    What’s more, the political power of big corporations and Wall Street was offset by the power of labor unions, farm cooperatives, retailers, and smaller banks.

    Economist John Kenneth Galbraith approvingly dubbed it “countervailing power.” These alternative power centers ensured that America’s vast middle and working classes received a significant share of the gains from economic growth.

    Starting in 1980, something profoundly changed. It wasn’t just that big corporations and wealthy individuals became more politically potent, as Gilens and Page document. It was also that other interest groups began to wither.

    Grass-roots membership organizations shrank because Americans had less time for them. As wages stagnated, most people had to devote more time to work in order to make ends meet. That included the time of wives and mothers who began streaming into the paid workforce to prop up family incomes.

    At the same time, union membership plunged because corporations began sending jobs abroad and fighting attempts to unionize. (Ronald Reagan helped legitimize these moves when he fired striking air traffic controllers.)

    Other centers of countervailing power – retailers, farm cooperatives, and local and regional banks – also lost ground to national discount chains, big agribusiness, and Wall Street. Deregulation sealed their fates.

    Meanwhile, political parties stopped representing the views of most constituents. As the costs of campaigns escalated, parties morphed from state and local membership organizations into national fund-raising machines.

    We entered a vicious cycle in which political power became more concentrated in monied interests that used the power to their advantage – getting tax cuts, expanding tax loopholes, benefiting from corporate welfare and free-trade agreements, slicing safety nets, enacting anti-union legislation, and reducing public investments.

    These moves further concentrated economic gains at the top, while leaving out most of the rest of America.

    No wonder Americans feel powerless. No surprise we’re sick of politics, and many of us aren’t even voting.

    But if we give up on politics, we’re done for. Powerlessness is a self-fulfilling prophecy.

    The only way back toward a democracy and economy that work for the majority is for most of us to get politically active once again, becoming organized and mobilized.

    We have to establish a new countervailing power. 

    The monied interests are doing what they do best – making money. The rest of us need to do what we can do best – use our voices, our vigor, and our votes. 

     

     


