Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Open Democracy (11-28-08)
Yeltsin's 1993 Constitution, which is still valid today, has never been a particularly balanced basic law. The Russian political system is called ‘semi-presidential.' It has even been claimed that it follows the French model. In fact, the emergence of the Russian presidency in 1991 did not have much to do with international experiences, any more than did its strengthening through the constitution of 1993 and later on. These developments were rather a mutation of the Soviet model of governance. They were determined by the narrow interests of the major actors involved, at the respective points in time.
The Soviet model
Within the power structure that Stalin left to his successors, the General or First Secretary of the CPSU Central Committee was the central figure in the governance of Russia's Soviet empire. We in the West would have regarded the Soviet state's prime-minister as equivalent in status to a deputy minister with an especially wide range of responsibilities embracing economic, cultural and social affairs.
Not only did the Chairman of the Council of Ministers have little influence on such issues as foreign, military or security policies; all major decision-making power was ultimately located in the hands of the Politburo of the CPSU Central Committee - the de facto government of the USSR. The Council of Ministers was not a government proper, but merely the highest echelon of the state bureaucracy, directed by the CPSU apparatus.
Having dismantled much of the legitimacy of one-party rule in the late 1980s, General Secretary Mikhail Gorbachev needed to diversify his power base. In 1990, Gorbachev introduced the office of the President of the Soviet Union, to which he was duly elected by the Congress of People's Deputies.
Yeltsin's move to create the office of the RSFSR President with a popular mandate in 1991 was less a replication of France's power structure than a copy of the executive at the level of the Union. It was thus a somewhat transfigured replication of Soviet ruling patterns, but one based on a popular vote rather than totalitarian control and political terror. While the gradual collapse of the Union-state in 1991 completed the shift of power from the party to the state, many of the underlying concepts and the dominant habitus, or system, of Soviet governance survived.
Russia's post-Soviet autocracy
In much of the post-Soviet area, including the Russian Federation, the prime-minister remained a tertiary figure in the power-structure and, sometimes, a scapegoat when things went wrong.
Under both Yeltsin and Putin, certain deputy Prime Ministers as well as other officials particularly close to the President, like the Head of the Presidential Administration or Secretary of the Security Council, came second to the President and overshadowed the Chairman of the Council of Ministers in terms of their influence and authority.
In both Soviet and post-Soviet Russia, the use of such words as 'minister' or 'head of government' was, in fact, a game with words reminiscent of the misuse of such terms as 'democracy,' 'union' or 'federation' in Russia after 1917. With brief intermissions (1917-1920, 1924-1928, 1953-1954, 1989-1993, 1998-2000), the political concept that best described Russia's post-revolutionary polity was the same as before the October Revolution - autocracy: self-rule by one man.
The duumvirate's ambiguities
In a way, this changed this year when Putin stepped down as President, and created a duumvirate. Leaving the Kremlin and becoming Russia's new prime-minister, Putin took, informally, much of the authority and many prerogatives of the President with him to Moscow's White House, the seat of Russia's formal government. Such a power-shift has been useful in so far as it has transformed Russia's Council of Ministers into a proper ruling body. For the first time since Lenin's chairmanship of the Council of People's Commissars, most power is concentrated within Russia's official government.
However, Putin's move also entails risks. In establishing real semi-presidentialism in Russia, the new prime-minister has overdone it: he now overshadows his formal superior, President Dmitry Medvedev. This not only creates a lack of clarity about who is responsible for what in Russia. It has also devalued the office of the President - a dangerous development that continues Putin's earlier blurring of the distribution of power and accountability in Russia.
Putin inherited from Yeltsin a set of institutions already involved in an unhealthy competition for the formulation of policies. The Council of Ministers, Presidential Administration, and Security Council had conflicting responsibilities. After Putin took over in 2000, he not only reduced the influence of Russian civil society, political parties and the regions. He also added to the already cumbersome central power system new offices and bodies like the Presidential Plenipotentiaries in Russia's new Federal Districts, the State Council, and the Public Chamber. All these bodies now contend with the Governors, State Duma and Federation Council in exerting political influence at the federal level.
Such a diversity of institutions has, obviously, been designed to reduce or dilute all non-presidential power. The many formal office holders of the listed institutions neutralise one another, while all relevant decisions are made by the President and his buddies.
The personalisation of political rule
Under Putin's incumbency as President, political competition was still taking place in Russia. Yet it was happening within a narrow circle of power-hungry presidential cronies - often with a security service background - whose influence was determined less by their official functions than by their relative closeness to Putin.
After the introduction of the duumvirate this year, this configuration changed somewhat. The Presidential Office too has fallen victim to Putin's power grab. Whereas until 2008 Russian politics was centred on the Kremlin as the seat of the President, his aides and his administration, no such clear centre of power exists today.
Instead, Putin himself - as a private person rather than as an office holder - is the remaining locus of power. Today, Russia has a strikingly personalistic form of political rule that is unusual, if not anachronistic, for a highly developed country in the 21st century.
Recent changes to the constitution
It could be that Medvedev's recent proposals to change the Russian power structure are meant to solve this problem. The office of the President will be strengthened through an extension of the term from the current four years to six - a measure that may re-establish this institution as the main locus of power.
At the same time, the Russian political process will become less focused on the presidential office alone, for the term of the State Duma will be extended too, though only by one year, from four to five years. Thus parliamentary elections will in due course cease to be primarily run-ups to the presidential elections, as has been the case since 1995.
President Medvedev also announced, in his first presidential address of November 5th, that he wants to extend the State Duma's control functions, and he made a number of further specific proposals and announcements. These minor adaptations seem designed to increase society's influence on government, but taken together they lack a clear direction, leaving both Russian and Western observers guessing where the new President wants to go.
Medvedev's presidential address should not be taken to constitute a proper and coherent political programme. Rather, it can be seen as an expression of the elite's uncertainty about Russia's political future, and might be understood as a reflection of power struggles behind the scenes.
At least that would explain various contradictions in Medvedev's outline of future domestic and foreign policies. For example, his attack on the Russian statist tradition sits oddly with his proposal to expand the presidential term. Likewise, the rabidly anti-American rhetoric at the beginning of his speech is out of kilter with his denial that there is anti-Americanism in Russia at the end.
This explanation would also account for the awkwardness of some of Medvedev's proposals, such as his idea that parties that receive between 5% and 7% of the vote in a State Duma election should get one or two seats in parliament.
This strange formula looks as if it were the result of an uneasy compromise between those wishing to increase societal control (including, probably, Medvedev himself) and those wishing to preserve the insulation of power from the people.
Return of Kremlinology
Above all, it is unclear who will benefit from the extension of the presidential term: Medvedev himself or Putin? This illustrates what is, perhaps, the most frustrating aspect of Medvedev's recent reform plan - the way it has been announced and is now being, at least partly, implemented. There was some media discussion of Medvedev's various proposals after they were made. But the design, evaluation, specification and execution of the numerous changes to Russia's political power structure remain, as under Brezhnev and Putin, secluded domains with no public participation.
Pundits in both Russia and the West are returning to the uncertain discipline of Kremlinology in order to make sense of where Medvedev wants to go. This feeling of déjà-vu is, perhaps, the most disturbing experience one is left with when trying to make sense of where the world's largest country is heading in the new century.
Posted on: Thursday, December 4, 2008 - 19:35
SOURCE: Informed Comment (Blog run by Juan Cole) (12-4-08)
Why is tuition at state universities so high that the NYT is wondering whether families will be able to go on affording it?
As someone who has observed this rise in tuition over an academic career of 30 years, as graduate student and professor, I have some theories from an insider perspective.
State universities have had to raise tuition because state legislatures have cut back their funding year after year. Already in 2005, this article in Philanthropy News Digest explained:
'For example, less than 14 percent of the University of Oregon's total revenue came from state funds in 2003-04, compared to 32 percent in 1985-86, while tuition fees accounted for more than 33 percent of the university's budget, compared to 22 percent twenty years ago. Meanwhile, the University of Michigan has lost 12 percent of its state funding, or $43 million, over the past two years. According to UM spokeswoman Julie Peterson, state money now only accounts for 8 percent of the university's budget. "We can't rely on state funding alone," said Peterson. "It simply isn't enough."'
That statistic, whereby the University of Oregon went from getting 32 percent of its total revenue from state sources in 1985-86 to less than 14 percent in 2003-04, was typical of what happened throughout the whole country. The typical revenue streams for state universities used to be 1) state support, 2) tuition and fees, 3) Federal grants, and 4) alumni donations and the resulting endowment. At some state universities, the state contribution may now be the fourth largest source.
Now, why did states cut back the universities so much? It wasn't that the state legislators were bad people or anti-intellectuals for the most part.
The Reagan/Grover Norquist line that government is not the solution but the problem, together with the demand for lower taxes (especially on the wealthy), was influential in many states. So essentially the American big business class of about 3 million people was given the opportunity to quadruple its vast wealth through lower taxes (when you lower taxes on a particular segment of the public, that is wealth redistribution in their favor). Meanwhile the public functions of government are cut back and everybody else gets potholes, closed public libraries, underfunded state universities, etc.
I can remember when I started grad school at UCLA in 1979 I heard on the radio that because of steep cutbacks in property taxes, the state would no longer be able to afford to spray for mosquitoes. I thought to myself, lord, they'll end up giving themselves malaria to avoid a millage!
A big drain on state budgets is the penitentiary system. In just the decade 1980 to 1990, the prison and jail population in the US doubled. Since 1980, the prison population has quadrupled. By the end of 2006, over 2 million persons were in prison and another 5 million were on probation or on parole.
I remember reading in the Ann Arbor News in 1988 about a big debate at the statehouse in Lansing over funding for prisons versus funding for universities. The prisons won.
In these same nearly four decades, there have been substantial declines in violent crime and crimes against property in the US. The vastly increased prison population was produced by unreasonably long prison sentences for non-violent crime, by ridiculous 3-strikes-and-you-are-out life sentences, by the completely failed 'war on drugs,' and by mandatory sentencing guidelines imposed by legislatures on judges in drug cases. Half of prisoners in state prisons did not commit a violent crime, and 20% are drug offenders.
This vast expansion in the number of imprisoned Americans required states to build prisons and to pay large amounts of money to keep people in them.
The states had to put their money into prosecuting, trying, imprisoning and then supporting to the tune of like $20,000 a year a bunch of . . . potheads.
So obviously the states had no money to spend on state universities, which were cut loose, and had to raise tuition and hit their alumni up for contributions just to try to keep their heads above water.
Of course, universities faced increased costs at the same time. European monopolies drove up the costs of medical journals in ways that the European Union should look into. Digitalization has been a huge added cost that cannot be escaped (in many cases it means paying twice for books and other materials, once in hard copy and once in digital form.) Universities with medical schools face the high costs of acquiring increasingly high tech, state of the art medical equipment. Etc.
But I think the 'war on drugs' and the cost of prisons has deeply harmed state economies and has hurt access to state universities for working and middle class families.
Marijuana in particular may well have important medicinal properties, and it should just be legalized.
This is a conclusion a lot of frustrated law enforcement officials have come to, and they are campaigning for an end to prohibition. Reuters has more.
It is true that some proportion of the population may face addiction problems from marijuana. But it is not as if it isn't already a multi-billion dollar business and widely available. About 15 percent of Americans regularly use it. And about 1/5 of the population is susceptible to alcohol addiction, but that doesn't impel us to a second Prohibition. Use some of the tax receipts on the industry to fund treatment of those who can't handle it. A lot of the deleterious effects of being high come from people driving under the influence. But actually you could just mandate that the auto industry put in ignition switches that only a sober person has the reflexes to make work. Since we are likely to own the auto industry soon, we should be able to do what we want. And besides, green mass transit is much better than individuals driving around wreaking mayhem, and a pothead on a subway isn't much of a threat to anyone. We should move in that direction for all sorts of reasons.
I can remember reading an op-ed in the NYT years ago arguing that there are 60 million crimes a year in the United States, but only a tiny fraction of the perpetrators is ever actually prosecuted and a smaller fraction still brought to trial. I thought to myself, and a good thing too! How would we pay for 60 million prisoners? And, if you have a country of 280 million people committing 60 million crimes a year, you clearly just have way too many laws.
The baneful impact of Puritanism on the United States, driven in part by the Religious Right, has diverted our energies from educating ourselves and developing our society toward creating a Nanny State that employs people to make sure you only get high from alcohol, not from other substances. Bush even created an FBI porn squad. As if wealthy Republican hoteliers weren't the primary distributors of porn (via pay-per-view channels) in the country.
If the Religious Right could, it would just close down all the biology classes in the country (because after all they teach that wicked Darwin) and leave the development of biotech to the South Koreans, with Americans--denied the wealth that biotechnological innovation will bring in--turned into unemployed riffraff.
It turns out that if we had more personal freedoms, we'd have more state monies and could educate ourselves to develop our potential as free human beings.
In the past 40 years, the snowball has been going in the other direction-- fewer personal freedoms, a vast gulag of the incarcerated, and less and less state money for the development of the minds of the public. We've built ourselves a big ignorant prison, with a loud-mouthed fundamentalist preacher for a warden, and called it America.
We should legalize pot, and tax the resulting industry. We should repeal mandatory sentencing guidelines and develop rehabilitation strategies rather than putting the ill-behaved in expensive state hotels. And we should go back to having state-funded state universities.
Posted on: Thursday, December 4, 2008 - 19:19
SOURCE: Britannica Blog (12-3-08)
The election’s over and not much has happened. But that’s not the way democracy is supposed to work. Besides, doing nothing isn’t good enough in this environment. In fact, doing nothing in the presidency is never good enough; sometimes it’s dangerous. The 9/11 Commission made this point when it included a number of recommendations that addressed the security risks associated with presidential transitions. One implication of the Commission’s report was that the Bush Administration may have missed the warning signs of an impending terrorist attack simply because it was preoccupied with the transition - and that transition took place in peacetime, in a relatively placid economic environment.
Perhaps it’s time to rethink this structural defect in our Constitution. There’s no way to avoid some of the uncertainty and inefficiency of presidential transitions, but we can certainly reduce the risk. Maybe it’s time to do something about presidential transitions — let’s get rid of them.
The last presidential transition during a crisis was in 1932, when Herbert Hoover was defeated by FDR. For four long months (the inauguration was held on March 4 in those days), Herbert Hoover and Franklin Roosevelt danced around one another while the economy collapsed. Hoover frantically tried to persuade Roosevelt to sign on to a series of reforms and policy commitments before the Democrat took office. Roosevelt, fearing that he would be trapped into policies with which he did not agree, demurred.
It is not surprising, therefore, that the 1932 – 33 transition was the last that was held on March 4, the date being moved up in subsequent years to January 20th. Actually, the 20th or “Lame Duck” Amendment was “proposed” by Congress (approved by two-thirds of both Houses of Congress) on March 2, 1932 and ratified by seventeen states before the election that year. But it was only after the election, in January 1933, that the necessary three quarters of the states voted to make the Amendment part of the Constitution.
Obviously we have survived and prospered under the shortened transition period. There is no reason not to make the transition period even shorter.
In the 18th century, when the Constitution was written, a long transition was necessitated by the slow pace of communication. It took time after the election to tabulate the vote, to transmit the vote to the state capital, to bring the presidential electors to the state capitals to cast their votes, and then to transmit the results to Washington for Congress to approve. Clearly, a lot of this procedure has been speeded up or is now unnecessary. Consequently, I see no reason not to reduce the length of the transition to a couple of days or, at most, a week.
There are going to be plenty of objections to this proposal, so let me anticipate some of them and, later, respond to others as they come up.
What about the Electoral College? Besides the fact that it might be time for the Electoral College to go (a topic for another column), the Electoral College could fit into this scheme. In the modern era there is no need for a slate of electors to physically meet, or even to be selected. States can cast their electoral votes for one candidate or the other simply by certifying their election results. Under most circumstances this could happen within a day or two of the election. The added bonus would be that there would be no chance of a “faithless elector” (an elector who casts their vote for somebody other than the candidate to whom they are pledged).
What about Congress? Pursuant to the Constitution, Congress has to meet to certify the Electoral College results. A shorter transition would necessitate (probably) a lame duck session of Congress immediately following the general election. There is no reason that Congress couldn’t convene to receive the Electoral College results a couple of days after the election.
But what if there is a controversy and the election isn’t yet decided? In other words, what would happen in an election similar to the one in 2000? In that case, Congress would await the results and the incumbent President would stay in office as a caretaker until some date certain. In 2008, there were a couple of states that were still undecided a couple of days after the election. But as it turned out, the votes didn’t matter to the final outcome, so Congress could have declared a winner despite the fact that there was not a final determination in a particular state.
But what about the administrative transition? First of all, parliamentary systems get on perfectly well without lengthy transitions. Second, in a modern state there is a large, professional bureaucracy that serves regardless of the politics of any particular administration. Nevertheless, a new president, especially one from another party, would inherit a government staffed at the top by the previous administration. Could appointees from the previous administration be trusted to do their jobs? On balance, I think the answer is yes. There is always a certain amount of political hackery in the politics of presidential appointments, but most public officials, whether appointed or professional, are loyal Americans first. Of course, the new President will want to replace existing political appointees. But that can be done in an orderly fashion, without what we have now: a wholesale abandonment of the Executive Branch by the outgoing administration.
But what about the parade and all the parties? Look, these presidential transitions are a serious business. How about on the day of the inauguration a parade and a dinner hosted by the outgoing president for the incoming president? These inaugural festivities have begun to go too far. They are often an opportunity for corporations and individuals to buy influence off the books. Corporate sponsorships and candidate shakedowns are the order of the day. Enough already!!! Let’s get to the business of governing.
What are some of the other advantages? If the outgoing president isn’t sure who the incoming president will be until just before he/she leaves office, that will cut out a lot of the last minute shenanigans that have been going on during these transitions literally since the beginning. Will an outgoing president want to embarrass an incoming president of his own party with a wave of last minute pardons, appointments, regulatory reforms, and executive orders? Lame duck presidents still have a lot of power; let’s do our best to see that they exercise it responsibly (to the voters).
President-Elect Obama is right; it’s time for a change. Let’s start with the transition.
Posted on: Wednesday, December 3, 2008 - 23:04
SOURCE: Japan Focus (11-24-08)
Mid-September of 2008 abruptly blackened every rosy scenario concerning the American subprime crisis and its spillovers. Through the summer, optimists had continued to insist that an American recession was unlikely and that, even if there were to be one, it would almost certainly be short and shallow. They averted their eyes from the implications of the September 7 US Government takeover of Fannie Mae and Freddie Mac, with their USD 5 trillion-plus in debt obligations, declaring that markets would be stabilized by what was then the biggest intervention yet. Yet even effectively nationalizing Fannie and Freddie failed to put a damper on the panic welling up in the markets. And then on September 17, the long-simmering and periodically over-boiling pot of financial conundrums suddenly erupted into a full-blown crisis that continues to wreak havoc around the world. In a few short weeks, the blast eviscerated the investment-banking industry, brought extraordinary volatility to virtually all major stock markets, and induced something like the glaciation scenario of "The Day After Tomorrow" in global credit markets.
The state has been hurriedly ushered back in, as lender of only resort, to fund a steady stream of nationalizations of financial firms, capital injections, and other desperate measures. A spreading movement among European countries and elsewhere has been to guarantee all bank deposits. On October 3, in an election year, the U.S. Congress upped the ante of increasingly colossal interventions with the USD 700 billion Emergency Economic Stabilization Act. The Act allows the government, via the Troubled Assets Relief Program (TARP), to purchase mortgage-related securities from financial agencies as well as directly inject capital into them. Then on October 8, nearly all major central banks simultaneously instituted a rate cut. A November 24 Bloomberg report relates that, altogether, the US financial authorities have committed themselves to lend about USD 7.4 trillion in order to rescue banks and other financial entities in peril. This figure includes the over USD 320 billion incurred in late November when the USD 2.2 trillion Citigroup teetered on the cusp of a bailout or (unthinkably) falling into Lehman-land.
Back in an America that awaits whatever magic Obama can work, the financial industry and its hired guns in academe and elsewhere are fighting a rear-guard action against the return of the regulatory state. The real economy is quickly slumping into a long and deep recession even as taxpayers see their monies used to bail out the people who are costing them their jobs, homes and dreams of entering, or remaining in, the middle class. These same people have been taken to task even in the pages of the Wall Street Journal for making millions in fees, buyouts and bonuses in the midst of this catastrophe. The deeper the onrushing recession gets, the more potent the backlash against their irresponsible (and at times criminal) behavior will be. What is certain now is that there is no going back to the status quo before the word "subprime" entered the lexicon. In terms of the means deployed against this crisis, as well as the future of reform, what was unthinkable scant weeks ago is now deemed essential or simply unstoppable. For example, the Federal Deposit Insurance Corporation (FDIC), long a backwater in the financial world, is now crafting regulations that would allow it to force the likes of Citigroup and JP Morgan to open their books on agency request. Toxic assets would see the light of day.
And in spite of massive interventions, Japan's and other world markets are still riding a roller-coaster. On October 10, Japan's Nikkei stock-market average saw its biggest ever single-day drop -- nearly 10% -- and two significant firms - Yamato Life Insurance and NewCity Residence Investment - went bankrupt through contagion from America's financial innovations. On October 27, the Nikkei dropped nearly 500 points to 7,162, closing in on levels it had not seen for 26 years, i.e. several years before Japan's late-80s asset bubble even began building. Other Asian and European markets have been similarly roiled. By late October, for example, South Korea's Composite Stock Price Index had plunged over 35% in a month, well over the previous record 21.17% drop recorded in May 1998 during the Asian financial crisis. And US markets have been incredibly volatile as well. The Dow Jones Index fell 7.9% on October 15, its worst one-day drop since 1987. The Financial Times of November 17 tells us that the drop came during 9 consecutive days of over 8% intraday volatility that continued to October 16, exceeding the previous record from the height of the Crash of 1929. Just before the October 15 plunge, the October 11 Wall Street Journal noted that following the October 3 passage of the US Financial Stabilization Act, the Dow had the worst week ever in its 112-year history. The Dow's 22% weekly loss contributed to the USD 8.4 trillion paper loss on stock assets over the past year. On October 28, the Bank of England estimated that financial firms' losses alone, via mark-to-market write-downs, now total USD 2.8 trillion, about double the Bank's estimate last April. In late November, markets seem to have resumed their plunge, leaving the Standard & Poor's stock index a spectacular 50 percent below its peak. And the bottom of this mess seems yet quite far off.
Meanwhile, a disaster epic has been playing in the credit markets, essential for the transmission of finance into the pockets of the real economy. Put simply, credit markets have frozen up. With the likes of Bear Stearns and Lehman Brothers made bankrupt through trading in complex financial instruments, especially mortgage-backed securities, banks do not know which counterparties they can trust. No one knows who is holding how much in toxic assets and what any given bad asset might be worth. Size and reputation no longer imply safety. The Lehman bankruptcy was the largest in US history, exceeding USD 600 billion in assets, so even a long-established and enormous presence offers little indication of the level of a counterparty's potential risk. Banks are, moreover, keen to protect their own capital bases - in part since they cannot even assess their own level of risk from in-house toxic assets - by hoarding cash and limiting exposure to borrowers. That hoarding worsens the crunch on financing for investment and consumption in the overall economy. Add into the mix the fact that the shockwaves from the Lehman bankruptcy spread into even the ordinarily safe venue of money market funds, which are specifically designed and regulated to limit risks. The result of these and other shocks was that interbank and other lending almost stopped, and the flow of loans from these financial intermediaries to businesses and consumers dwindled to a trickle. The bulk of recent measures by state agencies in the developed world have been aimed at thawing out these conduits of credit and thus avoiding a precipitous decline into outright economic depression. Over the past few weeks, these measures seemed to be having some success, judging by standard measures of risk such as LIBOR (the London inter-bank lending rate) and the TED spread (the difference between 3-month Treasury Bills and the 3-month LIBOR); but even so, the flow of credit does not appear to be circulating beyond the big banks and into the broader economy. And now these risk measures are creeping up again, as financiers ponder the implications that a long and deep recession will have for asset values and the securities they are built on.
Ships at anchor
One of the most disturbing aspects of this credit crunch is its shock to the truly broad economy of global trade. According to an October 29 Bloomberg report, 90 percent of the USD 13.6 trillion global trade in goods relies on letters of credit exchanged between financial institutions to ensure payment of shipped goods. As trust contracts even within the once-cozy confines of Wall Street and other financial districts, its compression globally and, in particular, its effect on letters of credit is even more pronounced. Letters of credit have financed global trade for centuries, and especially in recent years. But financial intermediaries are now either refusing to honor these letters of credit or are hiking their premiums by multiples of what they were just months ago. Over the long chain of intermediaries involved in global trade, trust in repayment is dissipating like electrical current sent too far. Added to the problem is that plunging commodity prices and rapidly shifting exchange rates threaten to render the shipments themselves unprofitable en route to their destination. Moreover, shipping firms have an enormous debt burden due to a frenzied boom in buying new ships over the past several years in anticipation of continuing robust expansion in global trade. In consequence, the Baltic Dry Index (a measure of rates for shipping dry goods) has plunged by over 90 percent from its May 21, 2008, peak. The container-shipping index is expected to show even more spectacular declines as this shock wave rips through far-flung production chains in our borderless world.
The World Bank, IMF and other agencies are painfully aware of this crisis in global trade, and are working to extend short-term funding to low-income countries. But since global trade is, as we noted, a USD 13 trillion annual business, these agencies simply do not have the means to deal with a problem of this scale. Imagine the scenario early next year if this crisis within a crisis continues: goods are not being shipped, and hence those who export are being hurt; but those who import are being hurt as well, as they do not receive the raw materials and goods to make things with and to consume. The entire global supply chain is at risk, something that the World Trade Organization recognizes but is ill-equipped to address. Already, the East Asian port cities of Singapore, Hong Kong and Taiwan are witness to vast fleets of bulk carriers and other vessels riding at anchor in wait of better times. The latest projection for global growth is 1.7 percent, well below the 3 percent level that marks a global recession; but considering the momentum of current economic events, growth seems set to slide even further.
The credit freeze drove global finance to the brink of collapse in October, and it could quickly be driven back there and beyond by some blindsiding turn of events. This includes spillovers from the ongoing compression of global trade, the havoc pouring out of the currency markets and the ongoing implosion of the USD 2 trillion hedge fund industry. Or the accelerating downturn in American commercial property markets could undermine asset-backed securities enough to engender a new round of panic stretching across asset classes. The last redoubts of the optimists are the vast, unregulated universes of finely distributed risk seen in the USD 55 trillion credit default swap market, a subset of the nearly USD 670 trillion derivatives market. These derivatives markets are said to be impervious to the chaos swirling around them because they are largely insurance contracts that, overall, generally cancel each other out. Maybe so, but until recently subprime was regarded as too small an asset class to have systemic implications. Everyone should be concerned about the fact that derivatives are not traded in open markets with proper regulation on product design and capital-adequacy margins. Indeed, more than a few credible, specialist voices warn that the collapse of one or more large "reference entities" (i.e., the firm or other item being insured by these contracts) could trigger a catastrophe. The ostensible dispersion of risk through derivatives may have, in fact, produced an enormous systemic risk, akin to the explosive potential of a huge concentration of combustible dust awaiting a spark.
The details of this financial crisis - its enormous scale, the lack of trust among financial institutions, the freezing of the commercial paper markets and other evidence of an accelerating credit crunch - naturally make one think of Japan in the 1990s. And much specialist and media commentary has in fact been devoted to probing the lessons that Japan's post-bubble policy choices might offer as the US confronts the present meltdown. The main lesson would appear to be to act fast, firmly, and intelligently. This is a negative lesson rather than a roadmap of what to do. Japan's "10 lost years" in the 1990s saw a protracted string of policy failures and slow recognition of the need to use public money, especially via direct capital injections into the banks (such as the state's purchase of preferred stock and subordinated bonds), in order to shore up the banks' capital bases. Japan's stock and property bubbles collapsed largely at the start of the 1990s, but it took years for the authorities to recognize that their problems were systemic and to craft suitably comprehensive policy responses.
One of the main hurdles to effective action in Japan was the confidence that asset values would hit bottom and then recover after a few years, eliminating the need for serious public-sector action. This mistake was to some extent mirrored in the United States (and elsewhere) over the past year. The financial sector and financial authorities simply refused to recognize that toxic assets were taking a protracted, perhaps permanent, loss of their bubble-era value. Both effectively colluded in a “wait and see” game, counting on recovery, a game that left the financial system spectacularly vulnerable. While they did not waste as much time as in the Japanese case during the 1990s, the scale of the tsunami that has since ensued dwarfs any of the crises that roiled Japan from the late 1990s to the early 2000s.
There was also political resistance to action, largely because the authorities were unwilling to investigate and prosecute criminal behavior. Public sector funds, amounting to Yen 685 billion, were first used to rescue seven failed "jusen" housing loan firms in 1996. This bail-out caused a political backlash that further encouraged the authorities to sit on their hands or intervene only sporadically for as long as possible. In 1998, Japanese financial authorities injected a further Yen 1.8 trillion into the system, but the Long-Term Credit Bank of Japan and the Nippon Credit Bank went bust (they are now, respectively, the Shinsei Bank and the Aozora Bank). August 24 of the same year saw the Economic Strategy Council formally set up. In 1999, this council announced a three-year suspension of any inquiries into bank management responsibility along with a further Yen 7.5 trillion injection of public funds. But these controversial moves failed to resolve the crisis and a protracted and dangerous deflation ensued.
So perhaps another negative lesson for America from Japan is that injecting even massive public funds is not in itself sufficient. The proper assessment of toxic assets via a close scrutiny of their value is clearly required, as is proposed by the FDIC (and appears about to become law). But perhaps more important even than that, there must be serious inquiry into management wrongdoing. Systemic failures require systemic solutions, lest the symptoms continue to fester and manifest themselves in periodic and perhaps increasingly large crises. In other words, if one is going to use the state, then its pecuniary as well as punitive arms need to be used with a comprehensiveness and intelligence that deals not only with the technical aspects of the crisis but also with the natural political resistance to bailing out rich and irresponsible people whose actions contributed to the crisis. The latter clearly have to be made to pay, in the fiscal and legal senses of the term, lest the path to recovery be constricted by political fallout.
But to date, the American bailout and related policies are spectacular for their lack of transparency, accountability and consistency. As a result of various missteps and volte-faces, Treasury Secretary Henry Paulson's credibility is perhaps as low as that of the current President. At a September 23 Senate Banking Committee hearing, Paulson declared that "we need oversight...we need transparency," but he appears to have shifted his position considerably. As of late November, Paulson's Treasury had already spent about USD 300 billion worth of the TARP program. About USD 125 billion of this went to America's nine largest banks and investment banks, another USD 125 billion has gone to publicly traded regional banks (many almost certain to fail even with this support), and the remainder has been largely allocated to American International Group. According to some recent assessments, much of this money has been used by the banks to pay shareholders, with plenty also going to executive salaries and bonuses. Another chunk of it has gone for buybacks to increase the value of shares. These are all serious, and in some cases potentially illegal, departures from the original purposes of the fund.
Keep in mind that Paulson originally argued that, as the name implies, the “Troubled Assets Relief Program” would be used to purchase mortgage-backed securities. These assets were then to be sold through something resembling a reverse auction (buying them from the banks at the lowest prices they would be willing to sell them for). That plan was absurdly unworkable then, and remains so, but in a November 12 about-face Paulson abandoned the original plan and declared the Treasury would focus on capital injections and other measures to end the freeze in consumer credit markets. This abrupt shift shocked Wall Street and came with few concrete details of how the new approach would be implemented.
Paulson has in fact been the most salient example of mismanagement. Last July, he secured authority from Congress to aid Fannie Mae and Freddie Mac by promising that he would not need to use it. Then in early September he did have to use it. His above-noted shift on capital injections was more recently followed by a declaration that he would probably need to use the entire USD 700 billion of the TARP. This declaration came a week after he claimed he would need only about half the funds, and would leave the remainder to the Obama Administration. Paulson's serial reversals of earlier positions are certainly rooted in the incredible flux of events. But his job is to craft pro-active policy and seek to stabilize markets rather than further roil the latter.
Moreover, Paulson’s mismanagement of the bail-out and the TARP is hard to explain as merely poor foresight. It was clear last year, and certainly from September of this year, that injecting capital directly into the banks was absolutely essential. Paulson staunchly opposed this policy, for reasons that remain obscure. But there is speculation that Paulson, as a product of Wall Street (a former head of Goldman Sachs), simply was unable to admit that matters had come to such a crisis that capital injections were necessary. In other words, ideological and personal ties to Wall Street appear to have prevented Paulson from recognizing the obvious (behind that, one has to wonder how much lobbying there was on him from Wall Street not to opt for the interventionist strategy). Whatever the case, the policy failures with this program and his allowing funds to be used to pay dividends, bonuses and deferred compensation, as well as to make acquisitions, threaten to discredit the overall effort to fix the financial sector's crisis through direct public injection of capital into the banks. This interventionism is essential, but it is worrisome that Paulson may have succeeded in politicizing it to the point where a proper clean-up is either difficult to do or is done and then generates a strong backlash. This complacent incompetence, too, is reminiscent of post-bubble Japan’s policy failures.
We all learned from the Japanese experience that it is imperative to act fast, but it is also just as critical to act intelligently and in the collective interest. Wasting precious time allows the crisis to deepen and thus increases the eventual costs of bail-outs and economic losses. Politicized policy likewise risks generating a dangerous level of distrust concerning efforts to recapitalize the banks, providing a strong foothold for reactionary politics in the deepening recession. But thankfully, Paulson is running out of money, as once he hits the USD 350 billion mark, he must go to the US Congress to receive the next USD 350 billion. One can only hope that Paulson's failures force the Congress to investigate, and very aggressively investigate, wrongdoing in tandem with focusing on the smart targeting of public funds. One thing that is clear from Japan is that allowing the financial industry and its ideologically blinkered representatives to craft what they like is a recipe for costly failure, a failure whose burden compounds over the years in lost opportunities for millions in the real economy.
Posted on: Wednesday, December 3, 2008 - 19:43
SOURCE: NYT (11-29-08)
ON election night, Nov. 8, 1932, Herbert Hoover, in the company of friends and neighbors at his home on the Stanford campus, sifted through returns that were rendering a verdict on a presidency begun so hopefully on a March day in 1929. As she observed him — his eyes bloodshot, his face ashen, his expression registering disbelief and dismay — a little girl asked, “Mommy, what do they do to a president to make a man look like Mr. Hoover does?”
The campaign had been brutal. Detroit had to call out mounted police to protect the president from the fury of jobless auto workers chanting “Hang Hoover!”
“I’ve been traveling with presidents since Theodore Roosevelt’s time, and never before have I seen one actually booed with men running out into the streets to thumb their noses at him,” said a Secret Service agent. “It’s not a pretty sight.”
Even so, the returns on election night exceeded Hoover’s worst fears. The president suffered the greatest thrashing up to that point in a two-candidate race in the history of the Republican Party, and his opponent, Franklin Delano Roosevelt, became the first Democrat to enter the White House with a popular majority since Franklin Pierce 80 years earlier.
Two features made his defeat especially galling. One was that he knew the outcome was less an expression of approval for the challenger than a rejection of the incumbent. The other was that, despite being a pariah, he was expected to soldier on for nearly four more months — until March 4, 1933. The 20th Amendment, moving Inauguration Day to January, was close to being ratified but would not take effect until 1937. He was fated to be the last lame-duck president of the old order.
Hoover determined to exploit this interim to salvage his presidency. No sooner had the ballots been counted than he invited Governor Roosevelt to confer with him. The overture gave every appearance of being an exceptionally generous offer to share power with the man who had vanquished him. In fact, it was the first step of a scheme to undo the results of the election. Hoover acted, the historian Frank Freidel later wrote, “as though he felt it was his duty to save the nation, indeed the world, from the folly of the American voters.”
On Nov. 22, Hoover welcomed Roosevelt to the White House. Throughout the meeting, he treated his successor as though he were a thickheaded schoolboy who needed drilling on intransitive verbs. He sought to bully the president-elect into endorsing the administration’s policies at home and abroad, especially sustaining the gold standard at whatever cost. Alert to Hoover’s intent, Roosevelt smiled, nodded, smiled again, but made no commitment. A frustrated Hoover later vowed, “I’ll have my way with Roosevelt yet.”...
Posted on: Wednesday, December 3, 2008 - 04:11
SOURCE: CNN (12-1-08)
Many observers use historian Doris Kearns Goodwin's term, "team of rivals," to describe the cabinet that President-elect Barack Obama is assembling.
They use the term to characterize choices like former Obama opponent Sen. Hillary Clinton -- expected to be nominated Monday as Secretary of State -- and current Secretary of Defense Robert Gates, whom Obama is asking to stay on.
But a more useful term might be a team of centrists. The most striking characteristic of the current lineup is how the personalities reflect the centrist vision of the Democratic Party promoted by Bill Clinton and his colleagues at the Democratic Leadership Council in the 1990s.
Obama has called on experts who aggressively promoted globalization and deregulation on economic matters, pushed for welfare reform, and accepted the necessity of military force and a strong defense. There are exceptions, but overall thus far, it appears Obama will be advised from the center.
Some of Obama's core supporters are surprised and upset with his choices while others say his choices are a logical reaction to the crises facing his administration.
A close look at Obama's development since 2004 suggests centrism should have been expected. There is little evidence beyond his history as a community organizer to indicate Obama is left of center.
That's part of the irony of the attacks made by Sen. John McCain and Gov. Sarah Palin against Obama for his association with 1960s radicals and statements about progressive taxation.
When Obama was introduced to the national scene at the 2004 Democratic Convention, his keynote speech focused on the need to overcome political polarization and long-standing divisions. In the most famous part of the speech, Obama said, "there's not a liberal America and a conservative America -- there's the United States of America."
This is far from the rallying cries of Sen. Ted Kennedy who has enthusiastically defended the liberal tradition of his party.
During his presidential campaign in 2008, Obama's policy proposals were not at all radical. Indeed many of his key positions looked much more like those of Bill Clinton than Franklin Roosevelt or Lyndon Johnson....
Posted on: Tuesday, December 2, 2008 - 21:45
SOURCE: Oxford University Press blog (12-2-08)
Many Americans, and the rest of the world, wonder why so much time elapses between the U.S. presidential election in November and the inauguration on January 20. Why not reform the system and reduce the interval? The answer is we did reform it–the interregnum used to last twice as long.
Under the original Constitutional scheme, the new president took office on March 4, four months after the November elections. The new Congress would not convene until the first Monday in December, thirteen months after the election. This made sense to the framers in the eighteenth century, when transportation was slow and treacherous. The incoming president would call the Senate into special session for a week in March to confirm his cabinet, and then have the rest of the year to get his administration underway free from congressional interference.
By the twentieth century, the old system had grown obsolete. The second session of every Congress did not meet until after the next election had taken place, meaning that senators and representatives who had been defeated or retired came back as lame ducks. They proved especially susceptible to lobbyists, and since the short session had to end at midnight on March 3, they could easily filibuster to block needed legislation. George Norris, a progressive Republican from Nebraska who chaired the Senate Judiciary Committee, led the effort to amend the Constitution and move the presidential inauguration from March 4 to January 20, and the opening of Congress from December up to January 3. By staggering the closing dates of the terms of the president and Congress, the amendment also eliminated the need for outgoing presidents to spend their last night on Capitol Hill signing and vetoing last-minute legislation.
Beyond getting rid of most lame duck sessions, Norris’ amendment halved the transition between presidential administrations, from four months down to two. Transitions had grown increasingly awkward. During peaceful and prosperous times, the incoming president had to keep out of the way of his predecessor. Herbert Hoover, for instance, sailed off to South America after the 1928 election to avoid upstaging Calvin Coolidge’s final months in office. During periods of conflict and crisis, however, the interregnum cost the nation needed leadership. Outgoing presidents tried to coerce their successors into continuing their policies, as James Buchanan attempted with Abraham Lincoln in 1861, and Herbert Hoover did with Franklin D. Roosevelt in 1933. Lincoln and Roosevelt wisely avoided committing themselves to failed ideas, but these impasses did nothing to resolve the crises they faced, which grew worse by the time they took office.
The transition between Hoover and Roosevelt took place against a dramatic collapse of the American financial system, with the nation’s banking system shutting down, credit drying up, and unemployment soaring. Congress had passed the Twentieth Amendment in March 1932 and sent it to the states, but the necessary three quarters of the states did not ratify it until January 23, 1933, three days after the new date for inaugurations, making it too late for that year. The first inauguration on January 20 took place in 1937.
That last long interregnum convinced everyone that a shorter transition was preferable, but is the current system still too long? In a parliamentary system such as Great Britain’s, the new prime minister can move into 10 Downing Street the day after the election and the new cabinet can show up ready for work. The American system of separation of powers, however, makes no provision for a shadow cabinet in waiting. The president-elect needs time to select cabinet members and a host of other executive branch nominees who will be confirmed by the Senate. It may not do the new president any favor to shorten the interregnum further, although when times are tough the inauguration still looks awfully far away.
Posted on: Tuesday, December 2, 2008 - 19:11
SOURCE: Britannica Blog (12-2-08)
After all, she won’t be the first female Secretary of State (in fact, she will make it 3 out of the last 4 who have been women), and the office has not been, at least not recently, considered a suitable consolation prize for those who don’t quite make it to be president.
However, we have to wonder whether she hopes to take the office of Secretary of State back to the future, transforming it into a role in the U.S. system that it once occupied, and perhaps should again.
Over the last fifty years, the office of Secretary of State has been held by three types of public figures:
1) Foreign Policy “Experts” chosen from academia, the diplomatic corps, or the military because they are thought to have a special intellectual insight into the dynamics of foreign affairs (e.g. Condoleezza Rice, Colin Powell, Lawrence Eagleburger, Henry Kissinger, John Foster Dulles);
2) Senior Statesmen who are capping long careers by giving their services to the highest diplomatic office (e.g. Warren Christopher, Edmund Muskie, Dean Rusk);
3) Symbolic Figures who are given the office because the President plans to be his own chief diplomat and needs someone who can deliver his statements effectively on his behalf (e.g. Madeleine Albright, Christian Herter).
Hillary Clinton fits into none of these categories, unless we assume that she is looking upon this post as her capstone gift to the nation. I tend to doubt that she sees it that way.
More likely, she is looking back to a much earlier vision of the Secretary of State as the proving ground for past and future presidential aspirants.
Between 1801 and the Civil War, six of the eleven men elected to the presidency had previously served as Secretary of State (Jefferson, Madison, Monroe, John Quincy Adams, Martin Van Buren, and James Buchanan). At least five other future or former Secretaries of State had made major runs at the office (Henry Clay in 1824, 1832, and 1844; Lewis Cass in 1848; William Seward in 1856 and 1860; and John C. Calhoun and Daniel Webster, who were perennial presidential contenders although neither secured a major party nomination).
The most interesting, and I think most plausible, reason why Hillary Clinton wants to be Secretary of State now is that she plans to win the next open contest for the Democratic presidential nomination. In this regard, her decision to approach that goal in this way may help us reconsider where to look for presidential nominees. There are a number of very sound reasons why we might want to return to the idea that the Secretary of State is a fit training ground for presidential aspirants. The wide open race in 2008, with its nearly uncountable number of debates, demonstrated again and again why the most common modern incubators fail to provide the right types of experiences for the presidential office.
Senators. Nine 2008 aspirants performed their major public work as U.S. Senators: six Democrats (Clinton, Obama, Biden, Christopher Dodd, John Edwards, and Mike Gravel) and three Republicans (McCain, Fred Thompson, and Sam Brownback). Together they made up nearly half of the total field. But we are repeatedly told that Senators lack training as executives, with little need to manage personnel or administrative tasks, and some Senators (depending on committee assignments) may lack extensive familiarity with foreign policy.
Governors. Five 2008 aspirants performed their major public work as governors (Republicans Mitt Romney, Mike Huckabee, Tommy Thompson, and Jim Gilmore; and only one Democrat, Bill Richardson), and governors have tended to make up the major part of most presidential fields since the twentieth century. Four of the last seven presidential winners (Jimmy Carter, Ronald Reagan, Bill Clinton, and George W. Bush) trained as governors. Furthermore, we should not forget that John McCain chose a governor as his vice presidential nominee, and apart from his doomed hope to get Republicans to accept Joe Lieberman, he appears to have seriously considered only present and former governors for the post (Tom Ridge, Bobby Jindal, Charlie Crist, and Tim Pawlenty). Governors like to point out that they have executive and administrative experience, but by the very nature of holding state offices, they lack foreign policy experience (whether they can see foreign countries from their states or not).
Secretary of State, however, is a post that involves both significant executive and administrative experience and major responsibilities for crafting and executing the nation’s foreign policy. It is a perfect training ground for future presidents and perhaps should once again serve as our presidential incubator; in this regard, Senator Clinton’s appointment, whether or not she currently thinks of herself as looking toward 2016, may serve to put the focus back on a particularly appropriate vision of the office of Secretary of State.
However, we need to recognize that this vision also introduces tensions into the office that do not exist when it is occupied by a Madeleine Albright. Presidential aspirants have their own visions and their own agendas, which make them less likely to work with the sitting president without significant internal friction.
When Woodrow Wilson sought to install William Jennings Bryan, his primary rival within the Democratic party, at the State Department, things did not go well, and Bryan soon resigned over a policy dispute with Wilson, significantly raising doubts about the president’s foreign policy within his own party. As many commentators have noted, choosing a “team of rivals” is fraught with risk for a president who may be challenged from within, but if this arrangement works for both Obama and Clinton, it may well suggest a new vision of our oldest cabinet office and of the training of future presidents.
Posted on: Tuesday, December 2, 2008 - 18:39
SOURCE: Special to HNN (12-2-08)
For the last three months, as the American economy has gone into a free fall, economists and political leaders have parceled out the bad news in small, allegedly manageable proportions. Yesterday, the National Bureau of Economic Research finally confirmed what virtually every small business owner has known for some time: that the US economy is in a recession, and that it has been in recession since the fall of 2007!
But though everyone in political life now feels comfortable using the word "recession," I have yet to see any economist or political leader say what all the economic indicators suggest--that the US economy is headed into a Depression, with double-digit unemployment figures that may persist for several years.
Despite the bank bailout initiated by the Bush Administration, and the stimulus package which the Obama administration will undoubtedly launch, the US economy faces a toxic mix unprecedented in postwar US history: a brutal and still escalating squeeze on commercial and consumer credit which, coupled with rising unemployment, is producing a dramatic shrinkage of consumer demand, the major engine of economic growth in the US and the world since World War II.
While economists will be debating the causes of the unprecedented collapse of the US banking system for some time, the consequences are only now beginning to be felt on the ground. The largest US banks- Bank of America, Citigroup and JP Morgan Chase- and their smaller counterparts are in such fragile condition that they are becoming risk averse in their daily interactions with businesses and consumers. Not only are they making it much more difficult to borrow money, they are sharply raising the interest rates on the loans they do make, putting severe economic pressure on even long-term customers. The restrictive, some would say extortionate, conditions they are placing on lending are having a powerfully depressing effect on economic activity.
The following are the major ways the freezing of credit is forcing the economy into a depression:
1. The denial of business loans, and the withdrawal of lines of credit, are crushing small and medium size businesses throughout the United States. Not only do businesses require credit to upgrade their facilities or launch new product lines, many of them require lines of credit to pay off suppliers or get through lean periods in the business cycle. Shutting off credit, or dramatically raising the cost of lending money, is driving many small and medium size businesses into bankruptcy, forcing them to shed jobs and add to the rapidly rising unemployment figures, further depressing consumer demand.
2. Imposing much more rigid standards for home mortgages and car loans, while a necessary corrective to the loose lending practices of the last fifteen years, is making it much more difficult to purchase homes and cars, contributing to a huge glut of unsold homes, a devastating decline in new construction activity, and
a decline in new car sales so sharp that it has forced the three major US auto makers to the edge of bankruptcy. As construction companies shut down, car dealerships close, and auto manufacturers and parts companies fold, or engage in mass layoffs to stay afloat, hundreds of thousands, possibly millions of workers with relatively good incomes will lose their jobs.
3. As banks impose limits on credit card debt and ratchet up interest rates on credit card holders, which they are in the process of doing, consumer spending will take a huge hit, putting a large number of businesses, from restaurants to big box stores, in grave economic difficulty. How consumer demand will
survive such a dramatic contraction of personal credit in a time of rapidly rising unemployment no economist has satisfactorily explained. Where will the money come from to support stores that sell anything but food and clothing? The entire landscape of American commerce that was built around consumer spending- particularly the shopping mall- may now be in jeopardy.
4. As banks restrict access to student loans, or raise interest rates on those loans, a whole generation of students will be forced to postpone graduate and professional education, transfer to, or attend, less expensive schools, or be forced out of college altogether. This will lead to freezes in university hiring and devastating losses in cultural capital as talented students become unable to afford the training they need to become economic innovators.
The consequences I have just described, and the economic statistics that accompany them--unemployment rates of 11-15 percent, millions of unsold homes, new construction and new car sales at a fraction of previous rates--may be a bit abstract for most readers, so let me describe where we are heading in concrete physical terms.
1. You are going to see boarded-up stores, boarded-up car dealerships, and boarded-up shopping malls in every section of the nation. Just as the boarded-up factory was the symbol of the collapse of American manufacturing in the 1970s and 1980s, the boarded-up shopping mall is going to be the symbol of the credit crisis and the collapse of consumer demand.
2. You are going to see large numbers of abandoned homes and unoccupied high-rise apartment buildings in many sections of the country because those units cannot be sold or rented at market rates. How people forced out of homes and jobs will respond to huge stretches of abandoned or unoccupied commercial and residential space is anybody's guess. Perhaps there will be "squatter movements" or government programs to allow people to occupy abandoned space for nominal rents.
3. Large numbers of people who had thought they would have private living space will be living communally. As people lose jobs, or retirement income, they will be forced to move in with relatives or take in boarders to afford their current living space. Perhaps they will have to become boarders themselves. As for young people getting out of college, they will either have to move back with their parents or form communal apartments with friends and share housing costs and food.
This, my friends, is the world we will be living in during the next five years. And it is time for our political leaders to be honest about what we are up against.
Times this hard require emergency measures--as Paul Krugman has argued, whatever stimulus package is being planned should be doubled in size--but they also require patience and sacrifice from all of us, and a willingness to lend a helping hand to our neighbors and fellow citizens who are in worse shape than we are.
No matter what we do, a Depression is probably coming. Whether it lasts three years or ten depends on how intelligently we plan and how aggressively we act.
Posted on: Tuesday, December 2, 2008 - 17:32
SOURCE: TomDispatch.com (12-1-08)
On a December day in 1932, with the country prostrate under the weight of the Great Depression, ex-president Calvin Coolidge -- who had presided over the reckless stock market boom of the Jazz Age Twenties (and famously declaimed that "the business of America is business") -- confided to a friend: "We are in a new era to which I do not belong." He punctuated those words, a few weeks later, by dying.
A similar premonition grips the popular imagination today. A new era beckons. No person has been more responsible for arousing that expectation than President-elect Barack Obama. From beginning to end, his presidential campaign was borne aloft by invocations of the "fierce urgency of now," by "change we can believe in," by "yes, we can!" and by the obvious significance of his race and generation. Not surprisingly then, as the gravity of the national economic calamity has become terrifyingly clearer, yearnings for salvation have attached themselves ever more firmly to the incoming administration.
This is as it should be -- and as it once was. When in March 1933, a few months after Coolidge gave up the ghost, Franklin Delano Roosevelt was inaugurated president, people looked forward to audacious changes, even if they had little or no idea just what, in concrete terms, that might mean. If Coolidge, an iconic representative of the old order, knew that the ancien régime was dead, millions of ordinary Americans had drawn the same conclusion years earlier. Full of fear, depressed and disillusioned, they nonetheless had an appetite for the untried. Like Obama, FDR had, during his campaign, encouraged feverish hopes with no less vaporous references to a "new deal" for Americans.
Brain Trust vs Brainiacs
Yet today, something is amiss. Even if everyone is now using the Great Depression and the New Deal as benchmarks for what we're living through, Act I of the new script has already veered away from the original.
A suffocating political and intellectual provincialism has captured the new administration in embryo. Instead of embracing a sense of adventurousness, a readiness to break with the past so enthusiastically promoted during the campaign, Obama seems overcome with inhibitions and fears.
Practically without exception he has chosen to staff his government at its highest levels with refugees from the Clinton years. This is emphatically true in the realms of foreign and economic policy. It would, in fact, be hard to find an original idea among the new appointees being called to power in those realms -- some way of looking at the American empire abroad or the structure of power and wealth at home that departs radically from views in circulation a decade or more ago. A team photo of Obama's key cabinet and other appointments at Treasury, Health and Human Services, Commerce, the President's Economic Recovery Advisory Board, the State Department, the Pentagon, the National Security Council, and in the U.S. Intelligence Community, not to speak of senior advisory posts around the President himself, could practically have been teleported from perhaps the year 1995.
Recycled Clintonism is recycled neo-liberalism. This is change only the brainiacs from Hyde Park and Harvard Square could believe in. Only the experts could get hot under the collar about the slight differences between "behavioral economics" (the latest academic fad that fascinates some high level Obama-ites) and straight-up neo-liberal deference to the market. And here's the sobering thing: despite the grotesque extremism of the Bush years, neo-liberalism also served as its ideological magnetic north.
Is this parochialism, this timorousness and lack of imagination, inevitable in a period like our own, when the unknown looms menacingly and one natural reaction is certainly to draw back, to find refuge in the familiar? Here, the New Deal years can be instructive.
Roosevelt was no radical; indeed, he shared many of the conservative convictions of his class and times. He believed deeply in both balanced budgets and the demoralizing effects of relief on the poor. He tried mightily to rally the business community to his side. For him, the labor movement was terra incognita and -- though it may be hard to believe today -- played no role in his initial policy and political calculations. Nonetheless, right from the beginning, Roosevelt cobbled together a cabinet and circle of advisers strikingly heterogeneous in its views, one that, by comparison, makes Obama's inner sanctum, as it is developing today, look like a sectarian cult.
Heterogeneous does not mean radical. Some of FDR's early appointments -- as at the Treasury Department -- were die-hard conservatives. Jesse Jones, who ran the Reconstruction Finance Corporation, a Hoover administration creation, retained by FDR, that had been designed to rescue tottering banks, railroads, and other enterprises too big to fail, was a practitioner of business-friendly bailout capitalism before present Treasury Secretary Henry Paulson was even born.
But there was also Henry Wallace as Secretary of Agriculture, a Midwestern progressive who would become the standard bearer for the most left-leaning segments of the New Deal coalition. He was joined at the Agriculture Department -- far more important then than now -- by men like Mordecai Ezekiel, who was prepared to challenge the power of the country's landed oligarchs.
Then there were corporatists like Raymond Moley, Donald Richberg, and General Hugh Johnson. Moley was an original member of FDR's legendary "brain trust" (a small group of the President's most influential advisers who often held no official government position). Richberg and Johnson helped design and run the National Recovery Administration (the New Deal's first and failed attempt at industrial recovery). All three men were partial to the interests of the country's peak corporations. All three wanted them released from the strictures of the Sherman Anti-Trust Act so that they could collaborate in setting prices and wages to arrest the killing deflation that gripped the economy. But they also wanted these corporate behemoths and the codes of competition they promulgated subjected to government oversight and restraints.
Meanwhile, Felix Frankfurter (another confidant of FDR's and a future Supreme Court justice), aided by the behind-the-scenes efforts of Supreme Court Justice Louis Brandeis, fiercely contested the influence of the corporatists within the new administration, favoring anti-trust and then-new Keynesian approaches to economic recovery. Secretary of Labor Frances Perkins used her extensive ties to the social work community and the labor movement to keep an otherwise tone-deaf president apprised of portentous rumblings from that quarter. In this fashion, she eased the way for the passage of the Wagner Act that legislated the right to organize and bargain collectively, and that ended the reign of industrial autocracy in the workplace.
Roosevelt's "brain trust" also included Rexford Tugwell. He was an avid proponent of government economic planning. Another founding member of the "brain trust" was Adolph Berle, who had published a bestselling, scathing indictment of the financial and social irresponsibility of the corporate elite just before FDR assumed office.
People like Tugwell and others, including future Federal Reserve Board chairman Marriner Eccles, were believers in Keynesian deficit spending as the road to recovery and argued fiercely for this position within the inner councils of the administration, even while Roosevelt himself remained, until later in his presidency, an orthodox budget balancer.
All of these people -- the corporatists and the Keynesians, the planners and the anti-trusters -- were there at the creation. They often came to blows. A genuine administration of "rivals" didn't faze FDR. He was deft at borrowing all of, or pieces of, their ideas, then jettisoning some when they didn't work, and playing one faction against another in a remarkable display of political agility. Roosevelt's tolerance of real differences stands in stark contrast to the new administration's cloning of the Clinton-era brainiacs.
It was this openness to a variety of often untested solutions -- including at that point Keynesianism -- that helped give the New Deal the flexibility to adjust to shifts in the country's political chemistry in the worst of times. If the New Deal came to represent a watershed in American history, it was in part due to the capaciousness of its imagination, its experimental elasticity, and its willingness to venture beyond the orthodox. Many failures were born of this, but so, too, many enduring triumphs.
Beyond the Bailout State
Why, at least so far, is the Obama approach so different? Some of it no doubt has to do with the same native caution that caused FDR to navigate carefully in treacherous waters. But some of it may result from the fallout of history. Because the Great Depression and the New Deal happened, nothing can ever really be the same again.
We are accustomed to thinking of the Bush years -- maybe even the whole era from the presidency of Ronald Reagan on -- as a throwback to the 1920s or even the laissez-faire golden years of the Gilded Age of the late nineteenth century. In some respects, that's probably accurate, but in at least one critical way it's not. Back in those days, faced with a potentially terminal financial crisis, the government did nothing, simply letting the economy plunge into depression. This happened repeatedly until 1929, when it happened again.
Since the New Deal, however, inaction has ceased to be a viable option for Washington. State intervention to prevent catastrophe has become an unspoken axiom of political life in perilous times. Of course, thanks to regulatory mechanisms installed during the New Deal years, there was no need to engage in heroic rescues -- not, at least, until the triumph of deregulation in our own time.
Then crises began to erupt with ever greater frequency -- the stock market crash of 1987, the savings and loan collapse at the end of that decade, the massive Latin American debt defaults of the early 1990s, the collapse of the economies of the Asian "tigers" in the mid-1990s, the near bankruptcy of the then-huge hedge fund Long Term Capital Management later in that decade, the dot-com implosion at the turn of the century, climaxing with the general global collapse of the present moment. Beginning perhaps with the bailout of the Chrysler Corporation in the late 1970s, these recurring crises have been met with increasingly strenuous efforts to stop the bleeding by what some have called "the bailout state."
The Resolution Trust Corporation, created to rescue the savings and loan industry, first institutionalized what Kevin Phillips has since described as a new political economy of "financial mercantilism." Under this new order the state stands ready to backstop the private sector -- or at least the financial sub-sector which, for the past quarter century, has been the driving engine of economic growth -- whenever it undergoes severe stress.
Today, the starting point for all mainstream policymakers, even those who otherwise preach the virtues of the free market and the evils of big government, is the active intervention of the state to prevent the failure of private-sector institutions considered "too big to fail" (as most recently with Citigroup and the insurance company AIG). So, too, the tolerance level for deficit spending, not only for military purposes but, in extremis, to help stop ordinary people from going under, is infinitely higher than in 1932. Ronald Reagan was prepared to live with such spending, if necessary, even as he removed portraits of Thomas Jefferson and Harry S. Truman from the Cabinet Room and replaced them with a canvas of Calvin Coolidge.
The question for our "new era" -- not one our New Deal ancestors would have thought to ask -- has become: How do we get beyond the bailout state? This is one crucial realm where genuinely new thinking and new ideas are badly needed.
At the moment, as best we can make out, the bailout state is being managed in secret and apparently in the interests, above all, of those who run the financial institutions being "rescued." Often, we don't actually know who is getting what from the Federal Reserve and the Treasury, or on what terms, or even which institutions are being helped and which aren't, or often what our public monies are actually being used for.
What we do know, however, is anything but encouraging. It includes tax exemptions for merging banks, and prices for public-equity stakes in failing outfits that far exceed what is being paid by governments (or even private investors) abroad for similar holdings. Add to this a stark lack of accountability, aggravated by the fact that the U.S. government has neither voting rights nor even a voice on the boards of directors whose firms would be in bankruptcy court without Washington's aid.
Living in an Empire of Depression
Are we, then, witnessing the birth of some warped, exceedingly partial version of state capitalism -- partial, that is, to the resuscitation of the old order? If so, lurking within this string of bum deals might there not be a great opportunity? Putting the economy and country back together will require massive resources directed toward common purposes. There is no more suitable means of mobilizing and steering those resources than the institutions of democratic government.
Under the present dispensation, the bailout state makes the government the handmaiden of the financial sector. Under a new one, the tables might be turned. But who will speak for that option within the limited councils of the Obama team?
A real democratic nationalization of the banks -- good value for our money rather than good money to add to their value -- should be part of the policy agenda up for discussion in the Obama era. As things now stand, the public supplies the loans and the investment capital, but the key decisions about how they are to be deployed remain in private hands. A democratic version of nationalizing the financial system would transfer these critical decisions to new institutions created by the Congress and designed to pursue public, not private, objectives. How to subject the flow of credit and investment capital to public control ought to be on the drawing boards if we are to look beyond the old New Deal to a new one.
Or, for instance, if we are to bail out the auto industry, which we should -- millions of jobs, businesses, communities, and what's left of once powerful and proud unions are at stake -- then why not talk about its nationalization, too? Why not create a representative body of workers, consumers, environmentalists, suppliers, and other interested parties to supervise the industry's reorganization and retooling to produce, just as the president-elect says he wants, new green means of transportation -- and not just cars?
Why not apply the same model to the rehabilitation of the nation's infrastructure; indeed, why not to the reindustrialization of the country as a whole? If, as so many commentators are now claiming, what lies ahead is the kind of massive, crippling deflation characteristic of such crises, then why not consider creating democratic mechanisms to impose an incomes policy on wages and prices that works against that deflation?
Overseas, if everything isn't up for discussion -- and it most certainly isn't -- it ought to be. What happens there bears directly on our future here at home. After all, we live in the empire of depression. America's favorite export for more than a decade has been a toxic line-up of securitized debt. Having ingested it in lethal amounts, every economy in the world from Iceland's and Germany's to Russia's and Indonesia's is either folding up or threatening to fold up like an accordion under the pressure of economic disaster.
Until now, the American way of life, including its economy of mass consumption, has depended on maintaining the country's global preeminence by any means possible: economic, political, and, in the end, military. The news of the Bush years was that, in this mix, Washington reached for its six-guns so much more quickly.
A global depression will challenge that fundamental hierarchy in every conceivable way. The United States can try to recapture its imperiled hegemony by methods familiar to the Obama-Clinton-Bush (the father) foreign policy establishment, that is by using the country's waning but still intimidating economic and military muscle. But that's a devil's game played at exorbitant cost which will further imperil the domestic economy.
It might, of course, be possible, as in domestic affairs, to try something new, something that embraces the public redevelopment of America in concert with the global South. This would entail at a minimum a radical break with the "Washington Consensus" of the Clinton years, in which the United States insisted that the rest of the world conform to its free market model of economic behavior. It would establish multilateral mechanisms for regulating the flow of investment capital and severe penalties and restrictions on speculation in international markets. Most of all, it would mean lifting the strangulating grip of American military might that now girdles the globe.
All of this would require a capacity for re-imagining foreign affairs as something other than a zero-sum game. So far, nothing in Obama's line-up of foreign policy and national security mandarins suggests this kind of potential policy deviance. Again, no Rooseveltian "brain trust" is in sight, even though unorthodoxies are called for, not just because of the hopes Obama's victory has aroused, but because of the urgency of our present circumstances.
If original thinking doesn't find a home somewhere within this forming administration soon, it will be an omen of an even more troubled future to come, when options not even being considered today may be unavailable tomorrow. Certainly, Americans ought to expect something better than a trip down (the grimmest of) memory lanes into the failed neo-liberalism of yesteryear.
Posted on: Monday, December 1, 2008 - 19:22
SOURCE: Philadelphia Inquirer (12-1-08)
My family and I spent the first half of this year in Ghana, which is holding presidential elections on Sunday. By the time we left, the contest was already in full swing. Wherever you looked - street signs, storefronts, even schools - you saw posters of the two major candidates, John Evans Atta Mills and Nana Akufo-Addo, smiling next to their vice presidential running mates.
Since Nov. 4, however, a new face has appeared alongside Mills': Barack Obama's.
Obama isn't endorsing anyone in Ghana, of course. The Mills campaign simply changed its posters, replacing the picture of Mills' running mate with one of the American president-elect. So now, everywhere you go, you see them together.
Mills' party even has a new slogan: "Obama Nie, Atta Mills Nie." That means simply, "This is Obama, this is Mills."
The point is not just that Obama is black, although that's part of it. Mostly, Obama symbolizes democracy itself. Elected to the White House on themes of optimism and change, he has become international shorthand for hope.
Let's start with the matter of race, which is more complicated than you might guess. As the first free nation in sub-Saharan Africa, Ghana has long seen itself as a lodestar of black liberty and progress. That's why independence leader Kwame Nkrumah adorned the national flag with a black star, which echoed the name of the shipping company started by the Jamaican pan-Africanist Marcus Garvey.
Predictably, then, Obama's victory triggered an explosion of race pride in Ghana. "What an apotheosis for blacks in America and elsewhere!" columnist I.K. Gyasi proclaimed a week after Obama won. "Today, a Blackman is president and commander-in-chief of the Armed Forces of the United States."
To Ghanaians, though, Obama isn't just a "Blackman." He's also an African, because his father came from Kenya. So he wasn't a product of slavery, which remains a deeply sensitive subject across the continent. Ashamed of their role in perpetuating the slave trade, Africans are also proud that they weren't slaves themselves.
"Obama is not a descendant of our ancestors who were cruelly transported to the so-called New World and forced to do back-breaking labor in the sugar and cotton fields, and in the homes of white people," Gyasi added. "The new president will not have to change his name from Robinson, or Jackson. ... He already has a 'non-American' sounding one known as Barack Hussein Obama."
But Obama is an American, of course. And Ghanaians are ambivalent about America, too - celebrating its dynamism, but condemning its racism.
Last spring, as Obama began his surge in the primaries, a Ghanaian history teacher told my daughter's class that America would never allow a black man to become president.
"If he gets close, he'll be assassinated," the teacher declared flatly.
So when Obama won, Ghanaians had to look anew at Americans - and themselves. After all, commentators noted, Ghana also has a history of ethnic conflict.
In 1994, 1,000 were killed and an additional 150,000 had to flee during an outbreak of tribal violence. In February, shortly after we arrived in Ghana, about a dozen people died in ethnic fighting in the country's north.
"Maybe we should also find out if our ethnocentric views about those who come from the North will subside if someone is elected president from that area," another columnist wrote after Obama's victory. "Maybe Ghana can learn a lesson or two from this!"
Most of all, Ghanaians hope Obama's election can serve as an example of a peaceful, democratic transition. Despite its recent stability, Ghana has a history of military coups. When Kenya and Zimbabwe descended into election-related violence this year, many Ghanaians wondered if they might be next.
"Those who have invoked the Kenya/Zimbabwe models for Ghana must rethink their approach to politics," a third commentator wrote recently. "They must join us in helping Ghana to emulate the American example, and become an example for the Kenyas and Zimbabwes of our world."
That's a tall order, of course, and it certainly won't happen overnight. But by electing Barack Obama, the United States has created a new impetus for democratic change around the globe.
In Ghana especially, Obama isn't just a smiling face on a campaign poster. He is the face of a better future, for all of us.
Posted on: Monday, December 1, 2008 - 19:14
SOURCE: Informed Comment (Blog run by Juan Cole) (12-1-08)
Leaks to the Indian press by security officials in charge of interrogating the captured terrorist, Ajmal Amir Kamal (or Qasab?), are fleshing out the background of the attack on Mumbai and clarifying the evidence that it was an operation of the Lashkar-e Tayiba [the "Army of the Good"].
The Indian counterpart of the CIA, the Research and Analysis Wing (RAW), intercepted a cell phone call on November 18 to a number in Lahore, Pakistan, known to be that of a Lashkar-i Tayiba handler, saying that the caller was heading to Mumbai. They later found the phone itself on a hijacked Indian fishing boat, which the attackers had taken over to camouflage their approach to the port.
The sole captured LeT operative, Kamal, is said by the Indian press to be from Faridkot village near Dipalpur Tahsil in Okara District of Pakistani Punjab, southwest of Lahore [I saw one article, which I can no longer retrieve, in which the Indian press misspelled the tahsil or county as Gipalpur]. This is such a remote and little-known place that even Pakistani newspapers were having difficulty tracking it down.
Kamal is said to be telling Indian security that he and the others trained in camps in Pakistani Kashmir. (The original princely state of Kashmir, largely Muslim, is divided, with one third in Pakistani hands and two-thirds in Indian; India joined its portion to largely Hindu Jammu to create the province of Jammu and Kashmir.)
The Kashmir police have gotten good enough at counter-terrorism measures that elements of the LeT may have decided to go after a soft target such as Mumbai instead.
The story begins with the 1977 coup of Gen. Zia ul-Haqq, a Muslim fundamentalist who hanged his boss, PM Zulfikar Ali Bhutto, after overthrowing him. Zia favored Sunni fundamentalists and introduced discriminatory policies against Pakistani Shiites, secularists, etc.
Then in 1979 the Soviet Red Army came into Afghanistan to prop up a shaky Communist junta. Gen. Zia was suddenly America's man at the front lines of fighting the Soviets, and his Inter-Services Intelligence helped organize Afghan refugees in Pakistan to fight the Soviets. The ISI favored the most radical fundamentalists among the Mujahideen, such as Gulbadin Hikmatyar, who led the Hizb-i Islami. This model, of using private armies funded by black money (generated by illegal arms or drug sales) to "roll back" leftists, was being applied by Reagan in Nicaragua at the same time.
The military dictatorship was taking a lion's share of the Pakistani budget, and to whip up popular passions and make itself popular, it promoted the liberation of the rest of Muslim Kashmir from Hindu India as another major project alongside getting the Soviets out of Afghanistan. (This is the language of the military; actually India is a secular multicultural state, not a formally Hindu one; and in opinion polls Kashmiris do not say they want to join Pakistan, though they would like independence).
A lot of Pakistanis probably did not care so much about Kashmir, having other problems in life (and already worried about having to adopt 3 million Afghan refugees). But the military in Pakistan constantly played on the public's emotions on the issue, as a way of justifying military perquisites. (When British India was partitioned into Muslim Pakistan and Hindu India in 1947, Kashmir was the only Muslim-majority province to be successfully grabbed by India; Pakistan insisted it should have gone to the Muslim state; the UN insisted on a referendum, which was never held.)
The model that the Reagan administration pressed on the Pakistani military, of funding rightwing "Islamic" militias to kill Soviets, gradually became standard operating procedure. But then the Pakistani Religious Right began adopting the model for themselves. If it is all right to mobilize death squads in one righteous cause, why not in others?
Emboldened, lower middle class Sunni hate groups grew up in rural areas such as Jhang Siyal, where Shiite Sufi leaders had been given big estates by premodern rulers and so were big landlords. The Sipah-e Sahaba Pakistan (SSP), formed in 1985, was one such organization. It turned to violence, killing Shiites. Revivalist Deobandi clergy were important in its leadership. I don't think Zia much cared if they killed Shiites.
Others, including elements in the Pakistani military, began wondering why they should not apply the Reagan Jihad model to Kashmir. And they did. In the late 1980s, Hafiz Muhammad Said (once a professor of engineering at Punjab University) set up the Center for Mission and Guidance (Markaz al-Da'wa wa al-Irshad) in a huge compound at Muridke outside Lahore. The Center soon established the Lashkar-e Tayiba as its paramilitary. With the behind-the-scenes encouragement of elements in the Pakistani military, the LeT sent guerrillas into Indian Kashmir to attack Indian troops and facilities. The Lashkar prided itself on not killing civilians, on not targeting Shiites, and on keeping its focus on what they thought of as the Indian occupation forces. But they fought alongside Sipah-e Sahaba elements that also took time off from murdering Shiites to infiltrate into Indian Kashmir and stage attacks.
I saw this militarization of Pakistani civil society with my own eyes. I first went to the country in 1981, before you could just buy a Kalashnikov in the bazaar. When I was doing research there in 1988 and then again in 1990, the situation was completely different. Pakistan had never had a drug problem, but now there were a million addicts (the US encouraged the Afghan mujahidin to grow poppies for heroin to finance the anti-Soviet struggle, and the drugs spilled into Pakistan). Weapons were freely available. Karachi was having a kind of civil war. I remember that fanatics from the religious right attacked an art exhibition in Lahore, a city of the arts (graven images not allowed, etc.). Political figures were accused of cynically creating Islamic movements for personal and political gain. This deterioration of Pakistan was, in some important part, a direct result of Reagan-Bush policies. They used Pakistan, corrupted it with all those drugs, arms, and radical Muslim militias that they called 'freedom fighters,' and then threw it away when they did not need it any more. Reagan and the Saudis funneled billions to the Pakistani military. What did ordinary Pakistanis have to show for it?
When the Soviets withdrew in 1988-1989 from Afghanistan and the Mujahideen took over, the Pakistani military lost control of its northern neighbor. It therefore funded and promoted the Taliban (expatriate Afghan young men who had been through Deobandi seminaries in northern Pakistan) from 1994, enabling them to take over Afghanistan. The Taliban ran terrorist training camps, at which the Sipah-e Sahaba and the Lashkar-e Tayiba trained for missions in Kashmir. Afghanistan in essence was the boot camp for Pakistani Reaganism.
The SSP and the Lashkar-e Tayiba were joined by other Sunni militias, including the Movement of the Holy Warriors (Harakat ul Mujahidin). In 2000, Mawlana Massoud Azhar broke off from the latter to form the Jaish-e Muhammad or Army of Muhammad, a particularly violent group focusing on Kashmir. All these Pakistani organizations trained their fighters in the Taliban camps, some of which were actually run by al-Qaeda once Bin Laden allied with the Taliban in 1996. (It is said that the Inter-Services Intelligence made the introduction.)
The high dudgeon of Americans directed at the Pakistani military for this activity is the height of hypocrisy. The Reagan administration actively encouraged Islamabad to mount precisely such activities against the leftist government of Afghanistan (which, while dictatorial and brutally oppressive, was busily educating girls, admitting women to professions, spreading literacy, working against the vestiges of landlord feudalism, etc.). From a Pakistani point of view, Soviet-occupied Afghanistan and Indian-occupied Kashmir were morally equivalent.
In 2002, under pressure from Washington, military dictator Pervez Musharraf dissolved the Lashkar-e Tayiba and other similar groups and initially arrested many members. They were later released by the Pakistani courts on the grounds that they hadn't broken any Pakistani laws. The dissolution was a bit of a farce, since the groups just took other names. Someone who now has a prominent official position in the Pakistani government once wryly observed to me that the Musharraf government couldn't seem to find the Lashkar-e Tayiba headquarters at Muridke just outside Lahore, even though it was huge and a well-known landmark at which thousands gathered. And Lashkar went on raising money, supposedly for civilian relief works in Kashmir.
The Pakistani military is itself now suffering blowback for its past policies. Its name is mud in Pakistan. A Pakistani Taliban has emerged that often declines to be its puppet, and which has killed hundreds of Pakistani troops. The Marriott in Islamabad was blown up by the Pakistani Taliban.
The cell that hit Mumbai was probably a rogue splinter group. They completely disregarded the old Lashkar-e Tayiba concentration on hitting only Indian troops in Kashmir, targeting civilians instead. It is very unlikely that anyone in the Pakistani military put them up to this Mumbai operation specifically. The attack was much more likely to be blowback, which occurs when a covert operation produces unexpected consequences or when agents that were previously reliable go rogue.
The Mumbai attacks were not the first of this scale on an Indian target by the LeT.
If the Pakistani government does not give up this covert terrorist campaign in Kashmir and does not stop coddling the radical vigilantes who go off to fight there, South Asian terrorism will grow as a problem and very possibly provoke the world's first nuclear war (possible death toll: 20 million).
The civilian government that has recently taken over Pakistan is weak. If it puts too much pressure on the military too quickly, it risks another coup and destabilization. But the training camps in Azad Kashmir must be closed.
India, Pakistan, and the Obama administration need to do some serious diplomacy on Kashmir, and try to settle this major global fault line before the 10.0 earthquake finally hits.
Posted on: Monday, December 1, 2008 - 18:36
SOURCE: Pajamasmedia.com (11-29-08)
There is a strange symmetry between the Bush hatred that emanated from the Left and what the writer John Avlon calls “irrational Obama exuberance.” Barack Obama has not spent one day as President, yet his admirers speak and write as if he has not and will not do anything wrong. I agree with Avlon that Obama’s centrist Cabinet choices have encouraged confidence in his ability to tackle our country’s problems. But when President Obama steps into the Oval Office, like any other President who is a human being, he will call some shots incorrectly, and polls will reflect disillusionment among his followers.
If you consider Obama the closest man can get to God, you are probably among those who think that George W. Bush is the closest man can get to being the devil. As Canadian journalist Robert Fulford writes in The National Post, “liberal Americans who see the Republicans as the party of the devil have enjoyed eight years of intense self-righteousness.” These are about to end, thankfully. As Obama takes over our nation’s helm, hopefully more reasoned opinion will prevail on the question of George W. Bush’s legacy as President.
Speaking about this himself, the President told an interviewer that he would like to be known “as somebody who liberated 50 million people and helped achieve peace,” and as a person “who first and foremost, did not sell his soul to accommodate the political process.” He would like to be known as a leader who “rallied people to help their neighbor, that led an effort to help relieve HIV/AIDS and malaria on places like the continent of Africa; that helped elderly people get their prescription drugs and Medicare as part of the basic package.”
Whether or not Bush’s hopes are fulfilled will only be told by future historians. Today’s academy has already reached its own judgment. A year or so ago, the eminent historian Sean Wilentz wrote a cover story for Rolling Stone, in which he called Bush “the worst President in all American history.” Most of his colleagues readily agreed with his call.
Bush and his defenders have good reason to be angry at Wilentz’s premature verdict. As Fulford points out, the President created the $30 billion Emergency Plan for AIDS Relief, and extended it this year with $48 billion to expand the number of people being treated in Africa to three million and to train 140,000 health care workers who specialize in HIV prevention and treatment. Thus Bush changed our nation’s involvement in Africa in a positive fashion. The rock star and activist Bob Geldof openly acknowledged this, pointing out that Bush “has done more than any other president” for Africa. Yet, as Fulford writes, “it’s unlikely that one in a 100 of [Bush's] fellow Americans know about it.”
Most important of all will be acknowledging Bush’s role in keeping America free of terrorist attacks since 9/11. If we look at what has taken place in Britain, Spain and now India, we must realize that the various terrorist networks have certainly been trying, and had their efforts not been stopped, might have been successful.
Alexander Moens, a political scientist who has written a book titled The Foreign Policy of George W. Bush, told journalist Kevin Libin that “there is evidence…plots were disrupted and people were killed that indicate that other attacks were in the making.”
And then there is Iraq. To the Left and the antiwar activists, the very entry into the war and the toppling of Saddam Hussein was unnecessary, wrong and immoral. Evidently, leaving one of the greatest contemporary tyrants and butchers of his own people in office was not a problem. And although the intelligence turned out to be deeply flawed, the spurious charge that “Bush lied us into war” has no merit. Virtually every major Democrat saw the same intelligence as the Bush administration, and made the point over and over that Saddam had violated all United Nations sanctions, and was set to put WMDs into operation in the near future.
As Professor Moens points out, Bush’s “ability to actually stay convinced that Iraq had to be won, when nobody else in the world agreed with him…is an aspect of his strong leadership that people will respect more over time.” One can argue that when Bush leaves office, Iraq will be on the way to forging an actual democracy; the war will be effectively over and Al-Qaeda will have been defeated.
This is not to say that critics are incorrect when they attack the serious mishandling of Iraq after Saddam’s ouster. No one can justify Abu Ghraib, the excesses in interrogation techniques and the sanction of actual torture, or the problems at Guantanamo. Nor can one fail to be critical of the President’s inability to explain to the American public why they should support Operation Iraqi Freedom.
Future generations will have to assess the final outcome. If Iraq does emerge as a democracy in the Middle East, joining Israel as a state pledged to democracy in a sea of tyrannies, then future historians will see the President in a more favorable light than his contemporaries do today. Whatever their conclusions turn out to be, I have one prediction: Bush’s position in the rankings of American Presidents will have risen at least close to the center, if not higher. Check back with me in a decade.
Posted on: Monday, December 1, 2008 - 14:14