The Beer Theory of Credit Quality

by Bill Bonner – Bonner and

Here’s a firsthand report directly from one of our dear readers:

Greetings from Greek islands. Although news seems bad from reading papers etc., life here is rolling along. I am vaca with family and pulled out 500 euro from ATM last night (Sunday, June 28) on island Hydra. Restaurant accepted Amex. So far so good.

Yes, so far, so good.

But the steamroller is still rolling.

Americans aren’t really interested in what happens to the Greeks – unless they happen to be on “vaca” there. But the chief obstacle in Greece is the same one in China and in the United States: too much debt.

The Germans and Greeks can blab, hondle, and bluff all they want. It won’t go away.

According to financial services company Credit Suisse, Greece has total debt – including households, businesses, and government – equal to 353% of GDP.

But U.S. debt is even higher at 370%. Germany, that supposed paragon of financial virtue, is at 302%. And China, with its state-controlled economy, is at 250%.

All are in good shape compared to Britain. It has total debt equal to 546% of GDP. Japan is in an even worse state. Its total-debt-to-GDP is 646%.

And if the Credit Suisse numbers are correct, Ireland is off the charts with total debt equal to more than 1,000% of GDP.

But the Greeks are feeling the heat because they can’t service their public sector debt right now. They can’t pay it for the very same reason they got it in the first place – false pretenses.

First, they claimed they met the guidelines for entry into the euro zone. Then they claimed they could afford to live in the style to which they became accustomed. Then they claimed they would pay back the money they borrowed to make payments on the debt they couldn’t afford.

None of it was true.

Now, with their backs to the euro wall, they can’t “print their way out” of their predicament. Their creditors expect them to pay up. The Germans, in particular, see it as a moral responsibility.

“That’s the difference between beer drinkers and wine drinkers,” says a friend. “The beer drinkers pay.”


The Beer Theory of Credit Quality

Bond investors believed the euro promised stability and security. It was backed not by the wine drinkers, but by the beer drinkers.

We’re not sure how Ireland – a big beer-drinking country – fits into this story. But our friend notes that the countries of Northern Europe – where they also drink mostly beer – tend to repay their debts. Southern Europe – Spain, Italy, and Greece – are bad credit risks.

On the streets of London at this time of year, people stand on the sidewalks with pints of beer in their hands. And on the Fourth of July holiday, more Americans will raise glasses of beer than wine.

Still, we doubt the “Beer Theory of Credit Quality” will hold up under the pressure of a generalized credit contraction.

In Europe, the beer drinkers of the north sold automobiles, for example, to the wine drinkers of the south. Then, when the winos couldn’t pay, the beer swillers gave them more credit.

Now, when the Greeks still can’t pay, the Germans are getting huffy about it.

And everybody is nervous. If the Germans put the screws to the Greeks, they invite problems with the rest of the wine drinkers.

What the Greeks owe is peanuts compared to what the Italians and Spanish owe. And if the credit stops, who’s going to buy the Germans’ BMWs, Audis, and Mercedes?

Nobody wants the credit to stop.


Star-Crossed Debtors

That is also true of another pair of star-crossed debtors – the Chinese and the Americans.

Like the Greeks and Germans, the Chinese lent, and the Americans spent.

And now, what a surprise… it’s the Chinese who seem to be in trouble.

Wait, what do the Chinese drink?

We don’t know. But the Shanghai index fell 17% in the last 18 days. And it dropped another 5% yesterday. (More on that below in today’s Market Insight.)

According to the McKinsey Global Institute:

China’s debt has quadrupled since 2007. Fueled by real estate and shadow banking, China’s total debt has nearly quadrupled, rising to $28 trillion by mid-2014, from $7 trillion in 2007.

Three developments are potentially worrisome: half of all loans are linked, directly or indirectly, to China’s overheated real-estate market; unregulated shadow banking accounts for nearly half of new lending; and the debt of many local governments is probably unsustainable.

McKinsey says total world debt is now more than three times global GDP.

That is a “macro obstacle” about as big as they get. It is a steamroller.

And it is headed for us all… no matter what we drink.

Article originally posted at Bonner and

There Will Never Be Enough Good Jobs Again

by Paul Rosenberg


It’s over. Except for a short moment or a wild and self-exhausting governmental mandate (both of which are doubtful), there will never again be enough “good jobs” to go around. That model is gone and we need to root it out of our imaginations.

Sure, there will be some good jobs, but nowhere near enough.

About half of the Western world is already on the dole in one form or another. Some 93 million Americans lack a decent job and have no real hope of getting one. And so long as the current hierarchies remain, things won’t get substantially better.

I’m sorry to dump that on you, but it’s better to face it directly.

But please bear in mind that I’m a confirmed optimist. Just because there are no “good jobs” doesn’t mean that we’ll all languish in a meaningless existence. Far from it. Once we get over our addictions to status, hierarchy, and dominance, a glorious future awaits us.

Why It Won’t Get Better

The standard response to what I’ve noted above is to call it “the Luddite fallacy.” That line of argument says that in the past, innovation has not wiped out jobs, that new types of jobs were created and filled the gaps fairly well.

And that statement is true. Individual jobs were wiped out, but new jobs came along and (more or less) picked up the slack.

However, that is not happening this time, and for a very simple reason: Adaptation is now against the law. Previous rises in technology occurred while adaptation was still semi-legal.

Please take a look at this graph and remember a simple truth: Regulation forbids adaptation.

The US government is currently spending $60 billion, every year, to restrain business activity. (And the EU is worse.) On top of that, reasonable estimates show that US government regulations cost businesses nearly $2 trillion per year.

And let’s be honest about this: The primary purpose of regulation is to give the friends of congressmen a business advantage. Why else would they pay millions of dollars to lobbyists?

So, the new jobs that should be spawned, will not be. Mega-corps own Congress and they get the laws they pay for. And mega-corps do not like competition.

Furthermore, the political-corporate-bureaucratic complex will bite and claw to retain every scrap of power they have, and small businesses will be their first victims. (They already are.)

Trapped Between Hammer and Anvil

So, the people who are hoping and waiting for a “good job” to pop up are trapped between hammer and anvil. Robots are starting to roll into the workplace while the job creators (small entrepreneurs) are in regulatory and economic chains. They can’t come to the rescue.

In the 19th century, all sorts of possibilities were open to entrepreneurs. This remained at least partly true, even into the 1970s, when I watched the business heroes of my youth having a gas while making piles of money.

It used to be that a clever person could get ahead, independently, and have a ball doing so.

Those days, alas, are over.

These days, to get rich, you need to take government as a partner. If you do not, regulation and legislation are likely to destroy your business. At this point, many of us (myself included) have had businesses – good businesses that benefited everyone involved – crushed by legislation.

To avoid being crushed these days, you have to be smarter and fleeter of foot than everyone else. Not many of us can survive in that situation, and as regulations continue to rise, even that number grows smaller and smaller.

For the generation before mine, independent success required ambition, but it was reachable. For my generation, only those of us blessed with unusual talent had a chance at controlling our economic destinies. For the young generation of today, it’s nearly impossible. These days, if you want to jump ahead, you need to be part of something big… and you need to start as a sycophant.


So, if you’re looking for the proverbial good job, stop waiting for “The Hierarchy That Is” to sort things out and get everything back to normal. Good jobs get fewer and fewer every year, and those that are lost won’t be coming back.

But… if and when you’re ready to change your thinking – to seriously change your thinking – this is good news too: You can reclaim the parts of yourself that you were ready to sacrifice to the “good job.”

You see, the “good job” was nearly as much a curse as it was a blessing. Yes, I know, steady wages and benefits are a very comfortable thing, but they also play right into a ridiculous, predatory script.

You know the one: where you struggle to display your status to all the other worker-bees. You feel like you have to do what the ads tell you: Get the new car, the bigger truck, the video player in the back seat, the gigantic TV, the most “amazing” holiday parties, the expensive shoes, the designer bags, the organic veggies, etc., etc., etc.

I would like you, please, to consider this quote from the boss of Lehman Brothers, just as the World War I production surge was failing:

We must shift America from a needs, to a desires culture. People must be trained to desire, to want new things, even before the old had been entirely consumed. We must shape a new mentality in America. Man’s desires must overshadow his needs.

Would you agree that their plan worked?

As long as you follow their script, you’ll remain in a permanent deficit mentality. No matter how much you have, you’ll always feel like you need more. It’s life on a shiny gerbil wheel. The “good job” kept us from knowing ourselves; it allowed us to sleep-walk through life. We got a “good job” and never developed ourselves any further. Work, retire, die, ho hum.

Then What?

So, if we forget about having a “good job,” what happens?

Well, it might very well mean that you do what you’re already doing, but you stop feeling bad about it. It means that you get over the endless grasping after status… of letting ridiculous ads define what “success” looks like… of letting other people define your self-opinion.

Letting go of the “good job” delusion means that you stop pining for the days when you could blow a third of your money on status crap. It means that you start taking pleasure in growing your own food, developing new ventures, and improving yourself.

It means that rather than begging politicians to ride in on a white horse and fix your world, you ignore them and start paying attention to your actual life.

Fundamentally, this means that we start using our own initiative, without seeking permission, and start building better things.

Rather than going on, I’ll leave you with two quotes, both from Erich Fromm. I think they are worth close consideration:

Our society is run by a managerial bureaucracy, by professional politicians; people are motivated by mass suggestion, their aim is producing more and consuming more, as purposes in themselves. All activities are subordinated to economic goals, means have become ends; man is an automaton – well fed, well clad, but without any ultimate concern for that which is his peculiarly human quality and function.

The quest for certainty blocks the search for meaning. Uncertainty is the very condition to impel man to unfold his powers.

Paul Rosenberg

[Editor’s Note: Paul Rosenberg is the outside-the-Matrix author of, a site dedicated to economic freedom, personal independence and privacy. He is also the author of The Great Calendar, a report that breaks down our complex world into an easy-to-understand model. Click here to get your free copy.]

A Case for Monetary Independence

by Lucas M. Engelhardt – Mises Daily

“Sound money and free banking are not impossible — they are merely illegal. Freedom of money and freedom of banking are the principles that must guide our steps.” — Hans Sennholz

When I was asked to give the Hans Sennholz Memorial Lecture, I was uncertain what I should speak about. Should I give an inspirational, autobiographical talk about life as a young academic? Should I present cutting edge research? Should I advocate for better policy in some “hot” political topic? In the end, I looked at the title of the lecture — this was the Hans Sennholz Memorial Lecture. So, I decided that I should present something Sennholzian — especially since I am a Grove City College alumnus, though I was never a student of Sennholz — who had retired before I was a student here.

The only problem was that I knew embarrassingly little about Hans Sennholz. I had heard him speak — in the same room where I was going to speak — once. But, I only remembered two things about him. First, I remembered his German accent. Second, I remembered a brief story that he told about his experiences in academic publishing. Apparently, Harvard asked him to write an article — I don’t think he mentioned what — so he did, they published it, and he was paid $15 for his efforts. He thought this must be some mistake. Not much later, Harvard approached him again, so he wrote for them again — they printed it, and he received another $15. He decided to stop writing for Harvard. (Sennholz’s academic publishing experience is quite different from mine. I wrote an article that I sent to one of the American economic journals. They decided not to print it, and I paid them $100.)

Anyway, after realizing that I should discuss something Sennholzian, and realizing my own ignorance of Sennholz’s work, I hit the library and reserved every book by Sennholz in the state of Ohio’s library system. As I flipped through them (Age of Inflation; Debts and Deficits; The Great Depression: Will We Repeat It?; Money and Freedom), a central theme emerged, and it’s the theme in the quote that I began with: Sound money and free banking. So, I hope to present to you today what I call a case for monetary independence — that is, a case for the separation of money and State. To make this case, I will consider a number of different institutional arrangements for how the monetary system may be organized.


Fully Dependent National Central Banks

Let’s start with the worst case — a central bank that is fully dependent on the political system. In effect, in such a system, the Treasury would have the power to create money at will. Economists generally agree that such a system would lead to very high rates of inflation. Government spending is popular — the left loves their social welfare programs, while the right likes funding a large military. However, taxes are politically unpopular — especially with those that have to pay them. So, it is unsurprising that governments typically run deficits. If the government were given direct control over money creation, one can expect that deficits would be funded largely by the creation of new money, as the effects of money creation are much easier to hide than the effects of taxation or decreases in spending.

The end result that economists expect with this framework is that hyperinflation becomes a very real possibility. Historically, hyperinflations tend to occur when large deficits are funded with money creation. This isn’t shocking — a $1 bill costs just about $0.07 to print, so money production is quite profitable. It’s a cheap way of raising funds for the government, and zeros are cheap. So, as prices go up and the money loses value, the Treasury can maintain its profits simply by adding zeros. Eventually, we end up with a Zimbabwe scenario. I have 180 trillion Zimbabwe dollars that I bought on eBay for $15 — and that included protective plastic sleeves. I suspect the sleeves are more valuable than the money inside them, but the point is: zeros are cheap. That being the case, there is virtually no limit to the inflation that a Treasury could create if it were given the power to create money directly. For this reason, most economists now suggest that central banks should be independent.
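The “zeros are cheap” point is just seigniorage arithmetic. A minimal sketch, where the roughly $0.07 printing cost per note is the figure quoted above and the quantities are made-up examples:

```python
# Seigniorage arithmetic behind "zeros are cheap". The ~$0.07 printing
# cost per note is the figure quoted in the text; the quantities below
# are made-up examples.
PRINT_COST = 0.07  # dollars to print one note, roughly independent of face value

def seigniorage(face_value: float, notes: int) -> float:
    """Issuer's profit from printing `notes` bills of a given face value."""
    return notes * (face_value - PRINT_COST)

# Adding zeros raises revenue per note at almost no extra cost:
for face in (1, 100, 100_000_000_000_000):  # up to a 100-trillion note
    print(f"face ${face:,}: profit per note ~ ${seigniorage(face, 1):,.2f}")
```

The profit margin on a $1 note is already about 93 percent; on a 100-trillion note it is effectively the whole face value, which is why adding zeros is the path of least resistance for a money-printing Treasury.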


Independent National Central Banks

In some ways, the claim that money should be independent of the State is almost uncontroversial. Over the past twenty or thirty years, the mainstream economics literature has converged around the idea that central banks — which govern monetary policy — should be independent of the governments that they operate under. Alberto Alesina and Larry Summers (Summers is the former Treasury Secretary under President Clinton, and former Director of the National Economic Council) found that independent central banks have better inflation performance — without having higher unemployment or more economic instability than countries with central banks that are less independent. Even President Obama has been clear that he supports a “strong and independent Federal Reserve” — an odd statement given that he has appointed all five of the current members of the Board of Governors, and appears to be looking to appoint more.

And the reality is that the Federal Reserve is not very independent. Dincer and Eichengreen, in a paper in the International Journal of Central Banking, ranked the United States’s Federal Reserve System as one of the four least independent central banks in the world — along with India, Singapore, and Saudi Arabia.

Beyond the institutional connections, there are clear policy connections between the Federal Reserve and government spending. After controlling for the state of the economy, a $1 deficit appears to be funded by about $0.30 of additional monetary base. So, while the Fed is not funding the government dollar-for-dollar, there does appear to be a very close connection between the two. The reason is simple: the Fed, under its current ideology, targets interest rates. If the government borrows a lot, it will drive interest rates up. So, the Fed produces more money to put into loan markets to drive rates back down to their target levels. The end effect is that the Fed is funding a significant portion of the government’s deficits.
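The pass-through described above is simple arithmetic once the coefficient is granted. A sketch, where the 0.30 figure is the estimate quoted in the text and the deficit is an assumed example:

```python
# Sketch of the deficit-to-base pass-through described above. The 0.30
# coefficient is the text's empirical estimate; the deficit figure is an
# assumed example, not data.
PASS_THROUGH = 0.30  # dollars of new monetary base per dollar of deficit

def implied_base_creation(deficit: float) -> float:
    """Base money the Fed ends up creating while defending its interest-rate
    target against the upward rate pressure of government borrowing."""
    return PASS_THROUGH * deficit

# A $1 trillion deficit implies roughly $300 billion of new base money:
print(f"${implied_base_creation(1.0e12):,.0f}")
```

The causal chain runs through the rate target, not through any explicit decision to fund the Treasury: borrowing pushes rates up, and the open-market purchases that push them back down happen to monetize part of the deficit.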

So, is this any better than a fully dependent central bank? As many economists love to say — it depends. When the time comes, will the Fed decide to fight inflation rather than continue to fund government deficits? It is impossible to say for certain — though I will say two things. First, mainstream macroeconomists seem to have achieved a consensus that fighting inflation is a very important goal of monetary policy; perhaps the most important goal in most countries at most times. Second, the leadership of the Federal Reserve is convinced that, at the moment, inflation is not much of a concern. Whether they will change their minds in time, and have the political fortitude to stand up to a government that will, in all likelihood, still be deficit spending, is uncertain enough that I won’t speculate one way or the other.


Independent, Discretionary, International Central Banks

As we know, the Federal Reserve is not very independent. So, what does it take to make a central bank independent? Based on Nergiz Dincer and Barry Eichengreen’s research, the most independent central banks are mostly found in the Eurozone — where the European Central Bank is in control of monetary policy.

Is this international system a “better” one, though? Let’s take this to an extreme — an extreme which some people have suggested — and consider the benefits and drawbacks of such a system. Let us imagine that all central banks ceded their authority to the International Monetary Fund, which then acted as a single one-world central bank.

This system does, admittedly, have a number of very real benefits. Trade is certainly easier when there is a common currency. Decreased worries regarding exchange rate fluctuations encourage long-term investment projects across national boundaries, which can increase productivity by locating capital where it will be most productive, rather than where worries about currency stability are smallest. The IMF can be expected to be independent of any single government’s pressure to fund deficits — or at least more independent than a national central bank would be.

The drawbacks, however, are substantial. In his book The Tragedy of the Euro, Philipp Bagus suggests that the formation of the Eurozone created a tragedy of the commons in which weak economies — such as Greece, Portugal, and Spain — had incentives to run large budget deficits, funded, indirectly, by the European Central Bank. As the first recipients of newly created money, deficit-running economies can spend the money before it has its full impact on prices — thereby gaining at the expense of those countries that run more balanced budgets. This naturally creates an incentive for countries to run budget deficits — and, in fact, to compete for running the largest ones. This is a recipe for some combination of exceptionally high inflation, if the central bank were to accommodate the deficits, or exceptionally high interest rates, if it were to stand its ground.

While it may be that an international central bank could stand its ground more effectively than a national central bank could, recent experience in Europe raises questions about whether international central banks actually will stand their ground.

I want to make one last point about the danger of an entirely unified system: when doing risk management — and a lot of policy is really just risk management — one needs to pay attention to the worst case scenarios. As long as the central bank has discretion, the odds that — at some point in its history — the central bank is going to make a very large mistake are very high. The question then becomes: what is Plan B? We have seen in recent years that national-level hyperinflation, though terrible, has been fairly easy to recover from. The reason can best be seen by examining Zimbabwe. In its hyperinflationary episode in 2008, the internal economy of Zimbabwe was so disrupted that the gross national income per capita had fallen to its lowest level in forty years. However, since that time, gross national income per capita has more than doubled to its highest level since 1983. How did this happen? Zimbabweans abandoned their hyperinflated currency in favor of some combination of the euro, US dollar, and South African rand — all of which were stable when compared to the Zimbabwe dollar. The adoption of a currency that is more stable gave people confidence to engage in market transactions again — which freed up resources that had been largely unusable in a hyperinflationary environment.

This solution, though, required the existence of alternative currencies to switch to. What would happen if a single world central bank made a similar mistake? The answer is not at all clear, but I suggest that a worldwide hyperinflation, if it were to occur, would seriously disrupt the division of labor, and thereby lead to a collapse in the worldwide standard of living. The recovery would not be easy, as it would require the reintroduction of a new currency that is actually trusted by the people enough that they would accept it as a medium of exchange. Historically, some countries have succeeded at reintroducing a re-based form of their own currency — but there are also many cases, Zimbabwe among them — where the reintroduction failed.

Given, then, that there would be strong incentives toward hyperinflation, the odds of a hyperinflation actually occurring in a system with a single world central bank, at some point, are far from zero. In fact, given a sufficiently long time period, hyperinflation — or at least some form of serious monetary mismanagement — becomes highly likely. Are the advantages worth this risk? In my assessment, they probably are not.

Monetary Policy Rules

All that has been said thus far has assumed that money is produced by some human monetary policymaker that has some discretion about how much money they can produce. A popular alternative is a rule-based monetary policy. In this case, the political system sets up a monetary policy rule which, somehow, they are unable to alter. This rule then automatically decides what monetary policy should be.

There are several such rules that have been proposed. Milton Friedman’s constant money growth rule was one early — and remarkably simple — example. Friedman suggested that the money supply should grow at a constant rate in the range of 3 to 5 percent. Given that production, on average, grows at a similar rate, this rate of growth will lead to an overall level of prices that is basically stable over the long run. Since Friedman, a number of other rules have been proposed. John Taylor famously proposed his rule, which is based on a combination of recent inflation and the recent state of the economy relative to its long-term trend. Scott Sumner suggests what he calls Nominal GDP targeting — an idea not original to Sumner, nor does he claim it to be.
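For concreteness, the three rules can be sketched in their textbook forms. The coefficients below are the conventional parameterizations (e.g., Taylor’s 1993 coefficients), assumed here for illustration rather than taken from the lecture:

```python
# Textbook sketches of the three policy rules named above; all coefficients
# are the conventional ones, assumed for illustration.

def friedman_money_growth() -> float:
    """Constant money-growth rule: the same rate every year, no discretion."""
    return 4.0  # percent per year, inside Friedman's 3-5% range

def taylor_rate(inflation: float, output_gap: float,
                r_star: float = 2.0, pi_star: float = 2.0) -> float:
    """Taylor (1993) rule: prescribed nominal rate, all inputs in percent."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

def ngdp_money_growth(ngdp_growth: float, target: float = 5.0,
                      base_growth: float = 4.0) -> float:
    """Crude NGDP-targeting sketch: lean money growth against misses of the
    nominal-spending target."""
    return base_growth + (target - ngdp_growth)

# With 3% inflation and output 1% above trend, the Taylor rule prescribes
# 2 + 3 + 0.5*(3-2) + 0.5*1 = 6 percent.
print(taylor_rate(3.0, 1.0))
```

Note how each rule automates a different variable: Friedman fixes the money supply’s growth, Taylor fixes an interest-rate response, and NGDP targeting fixes total nominal spending. The criticisms that follow apply to all three regardless of which variable is chosen.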

Rather than criticizing each of these individually, I will suggest a few difficulties with this institutional arrangement — regardless of the specific content of the rules.

The primary difficulty, of course, is the political one. Any political system that is strong enough to establish a monetary policy rule is strong enough to modify it — or discard it. So, what would it take for the monetary policy rule to be established and then left alone? We know that there are times that policymakers are actually strong enough to implement a policy, but would not be strong enough to eliminate it. I think of Social Security as an example. In this case, the policy created an interest group — and a popular one — that would fight for the policy to continue. Everyone loves their Grandma, and everyone’s Grandma loves Social Security — so it is such a popular program today that no politician would be willing to seriously attempt to eliminate it. For us to do this with monetary policy, we’d need to have a monetary policy rule that created a popular interest group that would resist any changes to that rule. How to do that is not clear to me — but I may just be uncreative at coming up with political solutions.

Even if we were to solve the political problem, these rules all share in common certain economic problems — primarily one of measurement error. Any use of economic data must acknowledge that discussing data from a scientific standpoint, such as saying that the overall price level will rise if the money supply increases sufficiently quickly, is different from saying that a particular measurement of that variable will act in a specific way. The Consumer Price Index, Producer Price Index, and GDP Deflator all seek to communicate the “overall price level” — but they all have weaknesses.

That is: the statistics that we can actually measure don’t align perfectly with the scientific conceptions that they are designed to estimate. In short: in reality, there is error in any macroeconomic measurement. For scientific purposes, this is something we can deal with. As long as our statistics are reasonably well correlated with the underlying reality that we care about, errors can be expected to, in a sense, cancel out, on average. So, as long as actual prices, on average, act like the CPI, and as long as the true money supply, on average, acts like M2, then any statistical connection between CPI and M2 would be expected, on average, to reflect the actual relationship between money and price levels.

But, policymaking is an entirely different matter — it’s far closer to engineering than science. That being the case, the errors are, in a sense, exactly what matter. If our measure of the money supply is temporarily undermeasuring the true money supply, then we’ll end up creating too much money under a Friedman rule. Is this temporary? Yes, but in the world of economics, temporary things are exactly those things that create economic disruptions.
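The overshoot can be made concrete with a toy simulation. Everything here is an assumption for illustration: a 4 percent target and a statistic that understates true money growth by one percentage point, so a policymaker steering the measured aggregate to target pushes the true aggregate above it:

```python
# Toy simulation of the measurement-error problem under a constant-growth
# rule. All numbers are assumed for illustration.
TARGET_GROWTH = 0.04  # assumed Friedman-style target: 4% per year
UNDERCOUNT = 0.01     # assumed: the statistic understates true growth by 1 point

def money_path(years: int, undercount: float = 0.0) -> float:
    """True money supply (index, start = 100) after steering the *measured*
    growth rate to target. If the statistic understates growth, the
    policymaker keeps injecting until measured growth hits 4%, so true
    growth runs at 4% plus the undercount."""
    m = 100.0
    for _ in range(years):
        m *= 1 + TARGET_GROWTH + undercount
    return m

intended = money_path(10)            # the path the rule was meant to deliver
actual = money_path(10, UNDERCOUNT)  # the path a biased measure delivers
print(f"overshoot after 10 years: {actual / intended - 1:.1%}")
```

Even a one-point bias compounds: after a decade the true money supply sits roughly 10 percent above the intended path, and if the bias is temporary, its reversal produces exactly the kind of swings that the rule was supposed to prevent.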

An additional economic problem with these rules is that they assume that, in a sense, the world is, or should be, static. The Friedman Rule and Nominal GDP targeting both implicitly assume that overall price levels or total spending in the economy should not change. Why not? The Taylor Rule implicitly assumes that the equilibrium real interest rate in the economy should not change. Again, why not? The economic world is a dynamic one in which change is one of the very few constants. At its most fundamental level, economic activity is the use of resources to satisfy our preferences based on our technical know-how. But all three of these are in constant flux. We are continuously using, creating, exhausting, and discovering resources. We are continuously changing our preferences. Our technical know-how is continuously changing as we learn new things and unlearn others. Why then would we expect macroeconomic aggregates — even if we could measure them perfectly — to remain constant? So, rule-based policymaking has serious economic problems because of mismeasurement and the natural dynamism of the real world. Perhaps fortunately, we will likely never experience these problems, as the political problems with getting such rules established are likely to be insurmountable.


Market-Based Money

Our final stop in the spectrum of monetary independence is a truly independent currency — that is, a money that has no legal advantages or disadvantages when compared to other goods. In short: a free market in money where moneys are free to compete with one another to attain the favor of users. Anyone who wishes may introduce their own money — so I could print Engelhardt dollars in my basement — and try to convince people to use them. The only restriction would be that fraud would be banned — so no one else could mimic my Engelhardt dollars.

In such a system, I would expect that moneys would be governed by the normal, everyday actions of entrepreneurs that do so well satisfying so many of our desires. As they respond to demand and competition from other suppliers, the supply of money would grow at the pace that the market determines. If more of a particular money is demanded, that money will rise in value — increasing the profitability of producing it — leading those entrepreneurs that produce it to produce more, and drawing other entrepreneurs toward producing money that is similar — and therefore competitive — with that money.

As entrepreneurs respond to demand, one would expect that the value of a winning money is likely to be fairly stable over long periods of time — not perfectly stable, of course, as there is often a delay between a change in demand and changes in production to meet that demand. But, the market will reward those money producers that do the best job providing a money that people actually want to use.

As Sennholz observed in many of his writings, there’s something about gold that makes it a particularly good money. And that something is not just some undefinable “X Factor.” It’s a list of traits. As laid out in Sennholz’s Money and Freedom, gold is useful but unessential, easily divisible, highly durable, storable, and transportable. So it is no surprise that gold — in many cases operating alongside the remarkably similar, but somewhat less valuable, silver — was, historically, what emerged as money on the free market. Like Sennholz, I think it fairly likely that, if people were left to their own devices, they would again use gold as money.

The question then is: what would it take for us to establish a market-based money? When I first read Sennholz’s Inflation or Gold Standard?, I read his plan for reform — and on nearly every step, I said to myself “Well, we’ve already done that.” Only a couple points remained. When Sennholz wrote Money and Freedom in 1985, his original intent was just to update Inflation or Gold Standard? — but he realized that the world had changed enough in the ten or so years since Inflation or Gold Standard? was written that a new book was required. So, he laid out a new plan for reform. It turns out that very little has changed in the past thirty years — so Sennholz’s plan from 1985 is mostly still relevant to us today.

The first step: Legal tender laws must be repealed. Allow private debt contracts to be written so that they can be repaid in whatever form the borrower and lender find acceptable. As Sennholz notes — this move isn’t really particularly radical. If the federal government wishes to receive its own fiat currency in payment for taxes, nothing prevents it from continuing to do so. If it prefers to borrow and repay in its own fiat currency, that is also fine. Similarly, if any private business or individual wishes to continue using paper dollars exclusively, they are free to do so. The only difference is that people would also be free NOT to deal in paper currency. To some degree, we already have this freedom in most of our transactions. When selling goods and services, businesses are permitted to refuse — or require — payment in any form they like. Legally in the US, only debt falls under the legal tender provision. Again, the legal change we’re asking for is not radical.

A second step is what I call “Honesty in Minting.” The US mint produces gold and silver coins — which have a legal tender value that is a small fraction of their metal value. Under Gresham’s Law, these coins are hoarded while paper money — which is worth far more in exchange than the paper it is printed on — is used as money. This should stop. Rather than stamping a Silver Liberty with a phony legal tender value, simply stamp it with its weight and purity. The back of a Silver Liberty should say 1 oz fine silver. I’d note that it already does include this — it just appends the rather silly “ONE DOLLAR” designation as well. This creates confusion for any business that may want to accept gold or silver coins by suggesting that the coin is worth one dollar when its metal value is worth far more than that. Simply eliminating the one dollar designation would make these coins far more usable in transactions, by allowing them to be traded for their fair value.

In addition to honesty in minting, additional freedom in the banking system would also make the market for money more competitive. For example, free entry in banking should be allowed. Banks should be free to accept deposits and offer check-writing and debit-card services denominated in any currency, or any commodity, that depositors and banks find acceptable.

Technically, you can have deposits in the US that are denominated in foreign currencies, but the minimum deposits tend to be prohibitively high — I found one account that could be opened for a mere $50,000 or so. Allowing free entry for banks that specialize in foreign currencies would make the possibility of using alternative moneys real to more than just those who are exceptionally wealthy. In addition, banks should no longer be required to be members of the FDIC or Federal Reserve System. As with any organization, banks should be allowed to join if they believe that the benefits outweigh the costs, and not to join if they believe the costs outweigh the benefits.

Again, these are not radical moves. I am not calling for the end of the FDIC — though I confess that I would like to see it vanish. I am not calling for the abolition of the Federal Reserve — though, again, I am convinced that that would, on the whole, be a good thing. I am simply asking that these organizations be opened up to the normal market forces of competition from competitors who are free to enter or exit the market, producing innovative products that may operate alongside — or may replace — those products currently being provided by the Federal Reserve and FDIC.

I will close as I began, with Sennholz. The last paragraph of Money and Freedom declares to us:

Sound money and free banking are not impossible; they are merely illegal. This is why money must be deregulated. All financial institutions must be free again to issue their notes based on ordinary contract. In a free society, individuals are free to establish note-issuing banks and create private clearinghouses. In freedom, the money and banking industry can create sound and honest currencies, just as other free industries can provide efficient and reliable products. Freedom of money and freedom of banking, these are the principles that must guide our steps.

Article originally posted at

Drought and the Failure of Big Government in California

by Ryan McMaken – Mises Daily: government

California Governor Jerry Brown has announced that private citizens and small businesses — among others — will have their water usage restricted, monitored, and subject to heavy fines if state agents determine that too much water has been used. Noticeably absent from the list of those subject to restrictions are the largest users of water, the farmers.

Agriculture accounts for 80 percent of the state’s water consumption, but only 2 percent of the state’s economy. To spell it out a little more clearly: Under Jerry Brown’s water plan, it’s fine to use a gallon of subsidized water to grow a single almond in a desert, but if you take a shower that’s too long, prepare to be fined up to $500 per day.


There Is No Market Price for Water

The fact that the growers, who remain a powerful interest group in California, happen to be exempt from water restrictions reminds us that water is not allocated according to any functional market system, but is allocated through political means by politicians and government agents.

When pressed as to why the farmers got a free pass, Jerry Brown was quick to fall back on the old standbys: California farms are important to the economy, and California farms produce a lot of food. Thus, the rules don’t apply to them. If translated from politico-speak, however, what Brown really said was this: “I have unilaterally decided that agriculture is more important than other industries and consumers in California, including industries and households that may use water much more efficiently, and which may be willing to pay much more for water.”
In California, those who control the political system have ensured that water will not go to those who value it most highly. Instead, water will be allocated in purely arbitrary fashion based on who has the most lobbyists and the most political power.

Numerous economists have already pointed out that the true solution to water shortages lies in allowing a market price to determine allocation — and in allowing there to be a market in water — just as there is a market in energy, food, and other goods essential to life and health. Supporters of government-controlled water claim that billionaires will hoard all the water if this is allowed, although it remains unclear why the billionaires haven’t also hoarded all the oil, coal, natural gas, clothing, food, and shoes for themselves, since all of these daily essentials are traded using market prices, and all are used daily by people of ordinary means.


City Water vs. Agricultural Water

For a clue as to how divorced from reality is current water policy in California, we need only look at the government-determined “price” of water there. Even under current conditions, water remains very inexpensive in California, but for the record, city dwellers have historically paid much, much higher prices for water than growers.
For example, according to one study, water for residents of San Francisco rose by 50 percent from 2010 to 2014, but residents were still paying about 0.8 cents per gallon for water. In Los Angeles, the price growth was a little less over the same period, but the Los Angeles price was also low, coming in a little over 0.6 cents per gallon. City dwellers are told that water is incredibly scarce, but as Kathryn Shelton and Richard McKenzie have noted, the price of water is so low that virtually no one even knows the per-gallon price.

But how much do growers pay for their water? In a recent LA Times article contending that growers “aren’t the water enemy,” it was noted that growers are now paying $1,000 per acre foot. This is supposed to convince us that water prices are incredibly high. But how does this compare to city prices? An acre-foot is about 326,000 gallons of water, so if we do the arithmetic, we find that growers who pay $1,000 for an acre-foot are paying about 0.3 cents per gallon for their water. That’s a little less than half as much as the city users are paying.

Now, city water is treated potable water, so we might expect a premium for city water. Historically, however, the gap between city prices and agricultural prices was much, much greater. Bloomberg notes that as of 2014, the price of water had risen to $1,100 “from about $140 a year ago” in the Fresno area. Going back further, we find that in 2001 many farmers were paying $70 per acre-foot. Prices well below $1,000 are far more typical of the past several decades than the $1,000 to $3,000 per acre-foot many growers now say they pay. In fact, if we compute the per-gallon price for a $140 acre-foot of water, we find that a city dweller could use fifty gallons per day at a cost of about 64 cents per month, or a per-gallon price of 0.04 cents.
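The arithmetic in the last two passages can be checked in a few lines. This is only a sketch reproducing the article’s own figures; the 326,000-gallons-per-acre-foot conversion and the dollar prices come from the article, and the function name is my own.

```python
# Convert acre-foot water prices (from the article) into per-gallon figures.
GALLONS_PER_ACRE_FOOT = 326_000  # approximation used in the article

def cents_per_gallon(dollars_per_acre_foot: float) -> float:
    """Convert a price in dollars per acre-foot to cents per gallon."""
    return dollars_per_acre_foot / GALLONS_PER_ACRE_FOOT * 100

# Growers paying $1,000 per acre-foot:
print(round(cents_per_gallon(1000), 2))   # about 0.31 cents per gallon

# The earlier $140-per-acre-foot price:
low = cents_per_gallon(140)
print(round(low, 2))                      # about 0.04 cents per gallon

# A city dweller using 50 gallons/day at that price, over a 30-day month:
print(round(50 * 30 * low))               # about 64 cents per month
```

Both of the article’s figures check out: roughly 0.3 cents per gallon at $1,000 per acre-foot, versus the 0.6 to 0.8 cents city dwellers pay, and about 64 cents per month at the old $140 price.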

At prices like these, it is no wonder that there is now a shortage of water. The price of water has for years been sending the message that water is barely a scarce resource at all.

In many cases, the low prices are enabled by decades of taxpayer subsidization of water infrastructure. A year-round flow of water to both cities and growers is ensured in part by huge New Deal-era projects like Shasta Dam and Hoover Dam, which the State of California could not afford to build, but on which California today largely relies for water, courtesy of the US taxpayer.

In central and northern California, the primary beneficiaries of federal water projects are growers, although it’s the city dwellers, who use a relatively small amount of water, who get lectured about conserving water. Were an actual market in water allowed, however, growers would have to compete for water with city dwellers, whose industries are far more productive than agricultural enterprises and who are likely to be willing to pay higher prices. Even when the private sector owners of water are old farming families (a legacy of prior appropriation), the water would still tend to go to those who value it the most as reflected in the per-unit prices they are willing to pay. That is: city dwellers.


What Will We Do Without California Growers?

The fact remains that much California farmland is in a desert where it rains under twelve inches per year. Massive irrigation projects have made farming economical in the region, but it’s unlikely that the status quo can continue forever if California dries up and cities begin to compete for more water.

When crops like pecans, which are native to Louisiana where it rains over fifty inches per year, are being grown in central California, we will have to ask ourselves if there is true comparative advantage at work here, or if the industry is really sitting upon a shaky foundation of government-subsidized and -allocated resources.

The rhetoric that’s coming out of the growers, of course, is that California growers are essential to the American food supply. Some will even suggest that it’s a national security issue. Without California growers, we’re told, we’ll all starve in case of foreign embargo.

But let’s not kid ourselves. North America is in approximately zero danger of having too little farmland for staple crops. In fact, one can argue that some of the best farmland in the world — in Iowa for instance — is underutilized because policies like Jerry Brown’s farm favoritism send the message that California will prop up its desert agriculture no matter what.

No, if California farmland continues to go dry, this only means that Americans will have to turn to other parts of the US or to imports. After all, many of the crops grown in dry parts of California are much more economically grown in more humid environments, including citrus plants, avocadoes (which are native to Mexico), and various tree nuts. And of course, it’s these crops, which are already fairly expensive and water-intensive, that get mentioned when we’re told that California growers must be given what they want until the end of time. This will likely mean higher prices for some of these crops in the short run, but the correct response is not government favoritism; it is free trade, and letting comparative advantage work. In a world with market prices, it’s simply not economical to grow everything under the sun in the California desert. If markets were allowed to function, with real water prices and free trade, this would quickly become abundantly clear.

Article originally posted at

Voluntary Exchange vs. Government Mandates

by Patrick Barron – Mises Daily: voluntary exchange

The basic unit of all economic activity is the uncoerced, free exchange of one economic good for another. Moreover, the decision to engage in exchange is based upon the ordinally ranked subjective preferences of each party to the exchange. To achieve maximum satisfaction from the exchange, each party must have full ownership and control of the good that he wishes to exchange and may dispose of his property without interference from a third party, such as government.

The exchange will take place when each party values the good to be received more than the good that he gives up. The expected — but by no means guaranteed — result is higher total satisfaction for both parties. Any subsequent satisfaction or dissatisfaction with the exchange must accrue completely to the parties involved. The higher satisfaction that each party expects must not depend upon harming a third party in the process.


Third Parties Cannot Create Value by Forcing Exchange

Several observations can be deduced from the above explanation. It is not possible for a third party to direct this exchange in order to create a more satisfactory outcome. No third party has ownership of the goods to be exchanged; therefore, no third party can hold a legitimate subjective preference upon which to base an evaluation as to the higher satisfaction to be gained. Furthermore, the higher satisfaction of any exchange cannot be quantified in any cardinal way, for each party’s subjective preference is ordinal only.

This rules out all utilitarian measurements of satisfaction upon which interventions may be based. Each exchange is an economic world unto itself. Compiling statistics of the number and dollar amounts of many exchanges is meaningless for other than historical purposes, both because the dollars involved are not representative of the preferences and satisfactions of others not involved in the exchange, and because the volume and dollar amounts of future exchanges are independent of past exchanges.


One Example: The Case of Ethanol

Let us examine a recent, typical exchange that violates our definition of a true exchange yet is justified by government interventionists today: subsidized, protected, and mandated use of ethanol.

The use of ethanol is coerced; i.e., the government requires its mixture into gasoline. Government does not own the ethanol, so it cannot possibly hold a valid subjective preference. The parties forced to buy ethanol actually receive some dissatisfaction. Had they desired to purchase ethanol, no mandate would have been required.

Because those engaging in the forced exchange did not desire the ethanol in the first place, including the dollar value of ethanol sales in statistics purporting to measure the societal value of goods exchanged in our economy is meaningless. Yet the government includes all mandated exchanges as a source of “value” in its own calculations.

This is just one egregious example of many such measurements that are included in our GDP statistics purporting to convince us that we have “never had it so good.”


Another Example: The Soviet Economy

Our flawed view that governments can improve satisfaction caused us to misjudge the military threat of the Soviet Union for decades. Our CIA placed Western dollar values on Soviet production data to arrive at the conclusion that its economy was growing faster than that of the US and would surpass US GDP at some point in the not too distant future. Except for very small exceptions, all economic production resources in the Soviet Union were owned by the state. But even this did not mean that the state could hold valid subjective preferences, for those who occupied important offices in the state held them at the sufferance of what can only be described as gang lords, who themselves held office very tentatively.

State ownership is not real ownership. Those in positions of power with responsibility over resources hold their offices for a given period of time and have little or no ability to pass their office on to their heirs. Thus, the resources eventually succumb to the tragedy of the commons and are plundered to extinction. Nevertheless, the squandering of the Soviet Union’s commonly held resources was tallied by our CIA as meeting legitimate demand.

Professor Yuri Maltsev saw first-hand the total destruction of the Soviet economy. In Requiem for Marx he gives a heartbreaking portrayal of the suffering of the Russian populace through state-directed, irrational central planning that did not come close to meeting the people’s legitimate needs, while our CIA continued to crank out bogus statistics of the supposed strength of the Soviet economy, upon which the Reagan administration based its unprecedented peacetime military expansion.


Peaceful Exchange Allowed, Violent Exchange Redressed

With the proviso that no exchange may harm another, as explained so well in Dr. Thomas Patrick Burke’s book No Harm: Ethical Principles for a Free Market, we are led to the conclusion that no outside agency can create greater economic satisfaction than can a free and uncoerced exchange. The statistics that support such interventions are meaningless, because they cannot reflect the satisfaction obtained from true ordinally held subjective preferences. Once this understanding is acknowledged and embraced, the consequences for the improvement of our total satisfaction are tremendous. Our economy can be unshackled from government directed economic exchanges and regulations.

Article originally posted at

Bait & Switch: Economic Development in the States

by Jeff Scribner – Mises Daily: economic development

North Carolina recently offered Boeing $683 million in tax incentives to open a plant in North Carolina to build Boeing’s new 777X jetliner. The NC bid failed, as did those from some other states, when Boeing decided to build the 777X in its home state of Washington, where there is no state personal or corporate income tax.

More recently, North Carolina was prepared to offer Toyota up to $107 million worth of incentives to lure the automaker’s North American headquarters from Los Angeles to Charlotte, bringing 2,900 jobs with it. The Charlotte Observer reported that Charlotte lost out to Plano, Texas. The Texas offer was only $40 million, but Texas has no corporate or personal income tax and has direct flights to Japan.

Businesses do not locate in any one place solely because of the tax laws. However, as tax burdens climb, the tax treatment of the business itself, and of its higher-paid employees and executives, becomes a more important consideration. Thus the incentive packages, made up primarily of special tax abatements for a set period of time, are developed and used in recruiting new businesses.

It is apparent that the politicians — politicians as diverse as Governor Pat McCrory of North Carolina and Governor Andrew Cuomo of New York — who try to make use of these incentives are totally missing the point they are illustrating. If you have to bribe a company to locate in your state, or bribe one not to leave, your taxes and whatever else you are using to bribe them are too high or otherwise onerous. If this were not so, companies and entrepreneurs would move to your state without being bribed, and those already there would not be trying to leave. Low taxes and a favorable business climate, like those of Texas, bring in many companies from other places, like California, where the business climate and taxes are not favorable.

Ally Bank ran a commercial several months ago illustrating this concept. The point of the commercial was that it is wrong to treat new customers better than old ones. More importantly, state “incentives” for new businesses, or those planning to leave, may amount to failure to provide equal protection under the (tax) law and may actually be bad for the state’s economy.

In April, Governor McCrory proposed a public-private partnership that would take over the economic development functions of the North Carolina Commerce Department. It is not yet clear how the partnership’s marketing would work or whether it would still offer tax and other “incentives” to attract companies to relocate to North Carolina. The budget approved by the House and Senate has no credits, instead offering grants totaling $10 million. (The Department of Commerce is in the process of determining how the grants program will be structured.) As a point of comparison, under the current incentives program, the state gave out $61.2 million in credits in 2013.

New York is mounting an effort to attract new businesses and entice entrepreneurs to start new businesses through its “Start Up New York” program.

The Upstate New York economy is not good. Many of the little manufacturing companies that lived along the old Erie Canal are gone, and even some of the big ones, like Xerox, Kodak, IBM, and Bausch & Lomb, have left, are leaving, or are diversifying out of state. In his article “The Ghost of America Future,” Bob Lonsberry points out that New York has the highest combined state and local taxes, property taxes, and gasoline taxes in the country. Upstate New York is also losing population and representation in Congress. Is this a place where you would move to or start your business? Even if you get a tax break now, what happens when the time is up? Worse yet, if other businesses and population are leaving, will there be any local market for you?

North Carolina, on the other hand, has gained some of the people leaving New York. North Carolina’s traditional tobacco and furniture manufacturing businesses have shrunk. But North Carolina is home to Research Triangle Park (RTP), the biggest research park on the East Coast and home to several information-technology, communications, and biotech companies. Moreover, the influx of people moving in from New York and other high-tax northern states has boosted the North Carolina service and real estate sectors. So North Carolina is a better place to move your business to or start up a new business. Then why does Governor McCrory have to offer incentives? Because, even though North Carolina is much better than New York, it is still too highly taxed and regulated when compared to many other states.

In truth, the states should close their economic development offices, cut the size and expense of their governments, and reduce or eliminate the taxes levied on businesses. They should also cut the regulatory red tape required to start a business and then operate that business within their state. Personal income taxes should also be eliminated so that companies considering moving to take advantage of the good business climate will bring their headquarters and high earners along. If you really are a good place to do business and your current businesses are doing well, you will not have to bribe a company to move in. Just get out of the way. Look at Texas!

If you are a small businessman or CEO of a big public company like Boeing, where do you want to move or expand? If a state that you are considering will offer you a “bribe” to move there, how will they treat you when you become one of the “old” companies there? If you are in the economic development office of a state, why do you think you should offer a new company something you would not offer to those already there? If you are a businessman in any state whose government will offer “incentives” to a new company, you should consider suing for equal protection under the law!

Article originally posted at

A Portrait of the Classical Gold Standard

by Marcia Christoff-Kurapovna – Mises Daily: gold standard

“The world that disappeared in 1914 appeared, in retrospect, something like our picture of Paradise,” wrote the economist Cecil Hirsch in his June 1934 review of R.G. Hawtrey’s classic, The Art of Central Banking (1933). Hirsch bemoaned the loss of the far-sighted restraint that had once prevailed among the “bankers’ banks” of the West, concluding that modern times “had failed to attain the standard of wisdom and foresight that prevailed in the 19th century.”

That wisdom and foresight was once upon a time institutionalized throughout an international monetary culture — gold-based, wary of credit, and contemptuous of debt, public or private. This world’s central banks included the Bank of England, the Bank of France, the Swiss National Bank, the early Federal Reserve, the Imperial Bank of Austria-Hungary, and the German Reichsbank. The entrenched hard-money ideology of the time restrained all of them. The Bank of Russia, for example, which once required 50 percent to 100 percent gold backing of all notes issued, possessed the second largest gold reserves on the planet at the turn of the twentieth century.

“The countries that were tied together in the gold standard system represented to a not inconsiderable degree a community of interest in and responsibility for the maintenance of economic and financial stability throughout the world,” recounted Aldoph C. Miller, member of the Federal Reserve Board from 1914 to 1936, in The Proceedings of the Academy of Political Science, in May 1936. “The gold standard was the one outstanding symbol of unity and economic solidarity which the nineteenth century world had developed.”

It was a time when “automatic market forces,” as economists of the day referred to them, prevailed over monetary management. Redeemability of money in (fine) gold ensured, within limits, stability in foreign exchange rates. Credit was extended only as far as reserve ratios would allow, and central banks were required to keep fixed reserves of gold against notes-in-circulation and against demand deposits.


When Markets Dominated Monetary “Policy”

Gold flows regulated international price relationships through markets, which adjusted themselves accordingly: prices rose when there was an influx of gold — for example, when one country received a debt payment from another country (always in gold), or during such times as the California and Australian gold rushes of the mid-nineteenth century. These inflows meant credit expansion and a rise in prices. An outflow of gold meant credit was contracted and price deflation followed.

The efficiency of that standard was not impeded by the major central banks in such a way that “any disturbance of economic or financial character originating at any point in the world which might threaten the continued maintenance of economic equilibrium was quickly detected by foreign exchanges,” Miller, the Federal Reserve board member, noted in his paper. “In this way, the gold standard system became in a very real sense a regime or rule of economic health, a method of catching economic disturbances in the bud.”

The Bank of England, the grand master of them all, was the financial center of the universe, whose handling of its credit policies was so disciplined that it secured the top spot while not even holding the largest gold reserves. Consistent in its belief that protection of reserves was the chief, and only important, criterion of credit policy, England became the leading exporter of capital, the free market for gold, the international discount market, and international banker for the trade of other countries, as well as her own. The world was in this sense on the sterling standard.

The Bank of France, wisely admonished by its founder, Napoleon, to make sure France was always a creditor country, was so replete with reserves it made England a 500 million franc loan (in 1915 numbers) at the onset of World War I. Switzerland, perhaps the last “19th-century-style” hold-out today with unlimited-liability private bankers and strict debt-ceiling legislation, also required high standards of its National Bank, founded in 1907. By the 1930s that country had higher banking reserves than the US; the Swiss franc was never explicitly devalued, unlike nearly every other Western nation’s currency, and the country’s domestic price level remained the most stable in the world.

For a time, the disciplined mindset of these banks found its way across the Atlantic, where the idea of a central bank had been long the subject of hot debate in the US. The economist H. Parker Willis, writing about the controversy in The Journal of the Proceedings of the Academy of Political Science, October 1913, admonished: “The Federal Reserve banks are to be ‘bankers’ banks,’ and they are intended to do for the banker what he himself does for the public.”

At first, the advice was heeded: in September 1916, almost two years after its founding on December 23, 1913, the fledgling Fed worked out an amendment to its gold policy on the basis of a very conservative view of credit. This new policy sought to restrain “the undue and unnecessary expansion of credit,” wrote Fed board member Miller, in an article for The American Economic Review, in June 1921.

The Bank of Russia, during the second half of the nineteenth century, steered itself through the Crimean War, the Russo-Turkish War, the Russo-Japanese War, impending Balkan wars — not to mention all that was to follow — and managed to emerge with sound fiscal policies and massive gold reserves. According to The Economist of May 20, 1899, Russian holdings were 95 million pounds sterling of gold, while the Bank of France held 78 million sterling worth. (Austria-Hungary held 30 million sterling worth of gold and the Bank of England 30 million sterling worth of both gold and silver.) “Russia up to the very moment of rupture [with Japan, 1904–1905], was working imperturbably at the progressive consolidation of her finances,” reported Karl Helfferich of the University of Berlin, at a meeting of The Royal Economic Society [UK] in December 1904. “Even in years of industrial crises and defective harvest, her foreign trade showed an excess of exports over imports more than sufficient to compensate payments sent abroad. And, as guarantee of her monetary system, she has succeeded in amassing and maintaining a vast reserve of gold.”

These banks, in turn, drew on the medieval/Renaissance and Baroque-era banking traditions of the Hanseatic League, the Bank of Venice, and Amsterdam banks. Payment-on-demand “in good and heavy gold” was like a blood-oath binding the banker-client relationship. The transfer of credit “did not arise from any such substitution of credit for money,” noted Charles F. Dunbar, in The Quarterly Journal of Economics of April 1892, “but from the simple fact that the transfer in-bank saved the necessity of counting coin and manual delivery of every transaction.”

Bankers were forbidden to deal in certain commodities and could not make loans or create credit for the purchase of such commodities, and both foreigners and citizens were forbidden from buying silver on credit unless the same amount in cash was in the bank. According to Dunbar, a Venetian law of 1403 on reserve requirements became the basis of US banking law on the deposits of public securities in the late 1800s.

After the fall of bi-metallism in the 1870s, gold continued to perform monetary functions among the main countries of the Western world (and the well-administered Bank of Japan). It was the only medium of exchange and the only currency with unrestricted legal tender. It became the vaunted “measure of value.” Bank currency notes were simply used as auxiliary to gold and, in general, did not enjoy the privilege of legal tender.


The End of An Era

It was certainly not a flawless system, nor one without periodic crises. But central banks had to act in an exceptionally prudent manner given the widespread public distrust of paper money.

As economist Andrew Jay Frame of the University of Chicago, writing in The Journal of Political Economy, in January 1912, noted: “During panics in Britain in 1847 and 1866, when cash payments were suspended, the floodgates of cash were opened [by The Bank of England], the governor sent word to the street that solvent banks would be accommodated, and the panic was relieved.” Frame then adds: “However, this extra cash and the increased loans that went with it were very quickly put to an end to avoid credit expansion.”

The US was equally confident of its prudent attitude. Adolph Miller, writing of Federal Reserve policy, remarked: “The three chief elements of the policy of a central bank or system of reserve holding institutions are best disclosed in connection with the attitude towards 1) gold 2) currency 3) credit.” He noted proudly: “The federal reserve system has met [these] tests on the whole with remarkable success.”

But after World War I, a different international landscape was left behind. England had been displaced as the center of international finance; the US and France emerged as the chief post-war creditor countries. The mechanism of the gold standard to which depreciated currencies could be related no longer existed. Only the US was left with a full gold standard. England and France had a gold bullion standard and other countries (Germany, primarily) had a gold-exchange standard.

A matrix of unbalanced trade relationships began to saturate the international economy. Then, with so many foreign countries attendant upon its speculative boom, the US manipulated its own domestic credit policies to ease credit and exchange-standard controls. This eventually culminated in the international financial crisis of 1931. Under Bretton Woods (1944), the gold standard was effectively abandoned: domestic convertibility was illegal and the role of gold was very constrained in favor of the dollar.

“It was, at least in theory, simple enough in the old days,” wrote a wistful W. Randolph Burgess, head of the New York Federal Reserve, in 1938. “In the present strange new world, where the old gold portents have lost their former meaning, where is the radio beam which the central banker may follow? What is the equivalent of gold?”

The men of his era and of the late nineteenth century understood the meaning of such a question and, more importantly, why it is one that must be asked. But theirs was a different world, indeed — one without “QE,” “ZIRP,” or “Unknown Knowns” as fiscal policy. And there were no helicopters, either.


This is Why Your Wages Aren’t Rising

by Bill Bonner – Bonner and

We ended last week wondering what had gone wrong: How come the 21st century has turned out to be such a dud?

Where are the jaw-dropping new inventions? Where are the rising incomes? Where is the dynamic, sizzling economy we expected?

Back in about 1963, we recall trying to picture ourselves in the 21st century. The rate of progress then was so fantastic we had to stretch to imagine it.

Every year, Chevrolet, Ford and Chrysler put out a new and better automobile. In 50 more years, surely cars would be regularly flying through the air!

In 1969, Neil Armstrong walked on the moon. It was just a matter of time before we had a colony there… from which we could explore the solar system.

Then in 1970, the pocket electronic calculator appeared. Half a century later, imagine the condensed knowledge and computing power we would be able to carry around.


Aging Economies

The only one of those things that realized its apparent potential was the increase in computing power.

That has changed life on planet Earth. Now, instead of talking to your neighbors in the elevator, you can keep your head down and focus on your smartphone.

We’ve seen couples in restaurants who never talk to each other – each fiddling with their iPhone through the whole meal.

Is that progress or what?

Since the 21st century began, the average US household has lost income. Bummer.

Why has this happened?

One answer we proposed to readers of our new monthly publication, The Bill Bonner Letter, was that three of the leading economic zones – the US, Europe and Japan – have come to be dominated by old people.

But that explains only a part of it… and probably not the major part.


Stopping the Future from Happening

The other reason is that government is always reactionary.

It protects existing voters from those who haven’t been born yet… existing wealth businesses from entrepreneurs… and the past, generally, from the future.

Much of the blame for this flop of a century can be put on government and its cronies in the private sector.

At this suggestion, apologists for big government point out that government spending, as a percentage of GDP, is scarcely higher now than it was in the 20th century.

But today, much more of the private sector has been crony-ized.

Since 1960, the number of rules, regulations and taxes has soared.

As we showed last week, far fewer new businesses are being started now than were in the 1960s.

This is partly because a high wall of regulation, designed to keep out competitors, protects existing businesses.


Chock-a-Block with Cronies

The “security” industry is obviously a government affair – dominated by large, entrenched cronies.

But so are businesses in finance, health care, housing and education. They are not exactly married to the feds… but they are so close they spend almost every night in each other’s arms.

When you buy a house, for example, it is considered a private sector transaction.

Fannie Mae, Ginnie Mae and Freddie Mac mortgages don’t show up as government spending.

But the US government created and operates them. And these three “government-sponsored enterprises” are responsible for about 95% of mortgages issued in the last three years.

Banking, medicine and schooling – even at the university level – are so dependent on government rules and government money, and so chock-a-block with cronies, that they might as well be government itself.

And take a company such as GE. It is supposed to be in the business of power generators and airplane engines and other major industrial innovations.

But prior to the crisis of 2008, it worked fiddle and bow with the feds to play the government’s distorted yield curve… and then, when that gig was up, it was saved by more direct federal bailouts.

You can read the whole sordid story at David Stockman’s excellent website, Contra Corner. (Stockman was President Reagan’s budget director before quitting in disgust over the administration’s profligate spending.)

In short, after 1960, the economy came to be more controlled by people who were more interested in protecting current wealth than in producing more of it.

Central planning led to a decline in growth rates throughout the rest of the 20th century. The rate of innovation slowed.

What we are seeing in the 21st century is proof of our dictum: The real role of government is to look into the future and prevent it from happening.

Article originally posted at Bonner and

Government Loans: Risky Business for Taxpayers

by Matt Battaglioli – Mises Daily

Obtaining a loan from the government now seems perfectly normal to most Americans, be the loans for education, business, healthcare, or whatever else.

Examples include Small Business Administration loans, where a potential business owner goes to the government to get startup cash, and student loans, where a college student borrows money for tuition or even living expenses. These loans are often paid back, with interest, over the course of several decades.

Other examples include Federal Housing Administration (FHA), Veterans Administration (VA), and Rural Housing Services (RHS) loans, which differ from the former in that they are government-insured loans. Yet the fundamental principle behind them remains the same: government takes upon itself (via taxpayers) the risk behind making the loan.

Of course, private loans are also available, though those that do not employ government insurance or other subsidies usually come with higher interest rates. The higher interest rates in the purely-private sector come from the fact that the private entity making the loan must take on all the risk, instead of externalizing it to the taxpayers.

So, the reality of lower interest rates in government and government-subsidized loans means they are vitally necessary, right?

First of all, the government doesn’t “make money,” in the way that private entities do. There is only one way in which states initially accumulate revenue, and that is through taxation. This extorted wealth is originally made in the private sector. So, in order for a government to make a loan back to the private sector, that money must first be removed from the private sector via taxation.


Government Knows How To Best Spend Your Money

When a private entity makes a loan, however, it determines who qualifies and at what interest rate. In doing so, the firm is determining at what price (i.e., interest rate) it feels adequately compensated for the risk of lending out the money, and for giving up direct control over that money for the duration of the loan.

To claim, therefore, that the government should be in the business of making loans because private loans are generally too costly or too inaccessible for buyers, is no different than saying that government must take individuals’ money and use it in a way that the original owners (i.e., the taxpayers) themselves would determine to be reckless and irresponsible. While it is true that occasionally a government loan may be paid back with interest at the appropriate time, it would be absurd to suggest that politicians would be more knowledgeable about how a person’s money should be used than the person who originally created and owned the wealth in the first place.


But Government Should At Least Prevent Usury, Right?

Moreover, there are those who will say that private firms making loans should be restricted from charging “excessive” interest on their loans (i.e., usury). This is an example of a very well-meaning, but utterly damaging regulation. It is crucial to note the differences in time preference displayed by both the lender and the borrower. The lender’s time preference (in this case) is lower than the borrower’s, meaning that the lender prefers a larger sum of money in the future, and the borrower prefers a smaller sum now. To get money now, however, the borrower must pay for it in the form of interest.

This represents a healthy balance between lenders and borrowers. It is why loans are made. Laws passed that prohibit certain interest rates on loans are far more likely to hurt those who need the loans, than anyone else. As was previously stated, a firm or person making a loan must feel compensated for the risk of making the loan, and that compensation manifests itself in the interest rate. To restrict a firm from charging a certain percentage of interest on their loans will only reduce the amount of loans it gives out.
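The risk-compensation logic above can be made concrete with a stylized break-even calculation. This is only a sketch with illustrative numbers (the rates, default probabilities, and the function itself are assumptions, not figures from the article): assuming total loss on default, a lender must charge at least the rate at which the expected repayment matches the risk-free return.

```python
def break_even_rate(risk_free_rate: float, default_prob: float) -> float:
    """Smallest interest rate at which lending is worthwhile,
    assuming the lender recovers nothing on default.

    Expected repayment per dollar lent is (1 + r) * (1 - p).
    Setting this equal to risk-free growth (1 + rf) and solving
    for r gives the break-even lending rate."""
    return (1 + risk_free_rate) / (1 - default_prob) - 1

# Illustrative borrowers: a safe one (1% default risk)
# and a risky one (20% default risk), with a 2% risk-free rate.
safe = break_even_rate(0.02, 0.01)
risky = break_even_rate(0.02, 0.20)
print(f"safe borrower: {safe:.1%}, risky borrower: {risky:.1%}")
```

On these assumed numbers, the risky borrower's break-even rate is roughly 27%. A usury cap set anywhere below that level does not make the risky loan cheaper; it makes the loan uneconomical for the lender, so it simply is not offered, which is the point the passage goes on to make.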


Taking Away Your Choices

If a potential borrower who is determined to be a rather high risk asks for a private loan, then the interest on that loan will be quite high, but at least in that situation, the borrower has the choice of taking the loan or not taking it. In the end, the borrower will choose what he or she believes will most benefit him or her. Yes, the borrower might miscalculate and the loan might turn out to have been a bad idea, but at least the borrower had a choice.

On the other hand, if the amount of interest that could be charged on the loan were to be forced down via government regulation, then the firm or person making the loan would simply not offer the loan at all, as he or she would not feel their risk is justified by the legally-allowable interest rate.

Faced with a lack of loans, risky borrowers may then look to government and government-subsidized loans as an option, but we find here just another case of government offering itself as the (taxpayer-funded) solution to a problem it caused in the first place.


Why the Austrian Understanding of Money and Banks Is So Important

by Jörg Guido Hülsmann – Mises Daily

This article is adapted from the foreword to Finance Behind the Veil of Money: An Austrian Theory of Financial Markets by Eduard Braun.

The classical economists had rejected the notion that overall monetary spending — in current jargon: aggregate demand — is a driving force of economic growth. The true causes of the wealth of nations are non-monetary factors such as the division of labor and the accumulation of capital through savings. Money comes into play as an intermediary of exchange and as a store of value. Money prices are also fundamental for business accounting and economic calculation. But money delivers all these benefits irrespective of its quantity. A small money stock provides them just as well as a bigger one. It is therefore not possible to pull a society out of poverty, or to make it more affluent, by increasing the money stock. By contrast, such objectives can be achieved through technological progress, through increased frugality, and through a greater division of labor. They can be achieved through the liberalization of trade and the encouragement of savings.


The Austrians Are the True Heirs of Classical Economics

For more than a century, the Austrian school of economics has almost single-handedly upheld, defended, and refined these basic contentions. Initially Carl Menger and his disciples had perceived themselves, and were perceived by others, as critics of classical economics. That “revolutionary” perception was correct to the extent that the Austrians, initially, were chiefly engaged in correcting and extending the intellectual edifice of the classics. But in retrospect we see more continuity than rupture. The Austrian school did not aim at supplanting classical economics with a completely new science. Regarding the core message of the classics, the one pertaining to the wealth of nations, they have been their intellectual heirs. They did not seek to demolish the theory of Adam Smith root and branch, but to correct its shortcomings and to develop it.

The core message of the classics is today very much out of fashion — probably just as much as at the end of the eighteenth century. As the prevailing way of economic thinking has it, monetary spending is the lubricant and engine of economic activity. Savings are held to be a blight on the social economy, the selfish luxury of the ignorant or the evil, at the expense of the rest of humanity. To promote growth and to combat economic crises, it is crucial to maintain the present level of aggregate spending, and to increase it if possible.

This prevailing theory is precisely the one refuted by Smith and his disciples. Classical economics triumphed over that theory, which Smith called “mercantilism,” but its triumph was short-lived. Starting in the 1870s, at the very moment of the appearance of the Austrian school, mercantilism started its comeback, at first slowly, but then at ever-increasing speed.1 In the 1930s it triumphed under the leadership of Lord Keynes.


How Keynesianism Destroyed Economics

Neo-mercantilism, or Keynesianism, has ravaged the foundations of our monetary system. Whereas the classical economists and their intellectual heirs had tried to reduce the monetary role of the state as much as possible, even to the point of privatizing the production of money, the Keynesians set out to bring it under full government control. Most importantly, they sought to replace free-market commodity monies such as silver and gold with fiat money. As we know, these endeavors have been successful. Since 1971 the entire world economy has been on a fiat standard.

But Keynesianism has also vitiated economic thought. For the past sixty years, it has dominated the universities of the western world, at first under the names of “the new economics” or of Keynesianism, and then without any specific name, since it is pointless to single out and name a theory on which seemingly everyone agrees.


The Key Importance of Money and Banking

No other area has been more affected by this counter-revolution than the theory of banking and finance. It was but a small step from the notion that increases in aggregate demand tend to have, on the whole, salutary economic effects, to the related notion that the growth of financial markets — aka “financial deepening” — generally tends to spur economic growth.2 Whereas the classical tradition had stressed that “financing” an economy meant providing it with the real goods required to sustain human labor during the production process (which was called the wage fund or, alternatively, the subsistence fund), the Keynesian counter-revolution deflected attention from this real foundation of finance. In the eyes of its protagonists, finance was beneficial to the extent — and only to the extent — that it facilitated the creation and spending of money. Financial intermediation was useful because it prevented savings from remaining dormant in idle money hoards. But finance could do much more to maintain and increase aggregate demand. It could most notably rely on the ex nihilo creation of credit through commercial banks and central banks. It provided monetary authorities with new tools to manage inflation expectations, for example, through the derivatives markets. And financial innovation was likely to create ever new opportunities for recalcitrant money hoarders to finally spend their cash balances on attractive “financial products.”

The youthful and boastful neo-mercantilist movement of the 1930s and the early post-war period did not bother to refute the classical conceptions in any detail. The theory of the wage fund was brushed aside, rather than carefully analyzed and criticized, just as Keynes had brushed aside Say’s Law without even making the attempt to dissect it.3 As a consequence, the foundations of the theory of finance have remained in an unsatisfactory state for many decades. A newer vision of finance had supplanted the older one. But was the latter without merit? The new theory appeared to be new. But was it true?

Finance Behind the Veil of Money is one of the very first modern discussions that try to come to grips with these basic questions. Steeped in the tradition of the Austrian school, Dr. Eduard Braun delivers a sweeping and original essay on the foundations of finance. Relying on sources in three languages, and delving deep into the history of capital theory — most notably the neglected German-language literature of the 1920s and 1930s — his work sheds new light on a great variety of topics, in particular, on the history of the subsistence-fund theory, on the relation between monetary theory and capital theory, on economics and business accounting, on price theory and interest theory, on financial markets, on business cycle theory, and on economic history.

Two achievements stand out.

One, Braun resuscitates the theory of the subsistence fund out of the almost complete oblivion into which it had fallen after WWII. He argues that this theory has been neglected for no pertinent reason, and with dire consequences for theory and economic policy. In particular, without grasping the nature and significance of the subsistence fund, one cannot understand the upper turning points of the business cycle, nor the economic rationale of business accounting, nor the interdependence between the monetary side and the real side of the economy.

Two, the author reinterprets the role of money within the theory of finance. He revisits the theory of the purchasing power of money (PPM) and argues that a suitable definition of the PPM relates exclusively to consumer-good prices, not to capital-good prices. Dr. Braun argues that the PPM in that sense is the bridge between the theory of money and the classical theory of the subsistence fund.

His book shows that this is a fruitful approach and a promising framework for future research in a variety of contemporary fields, such as financial economics, finance, money and banking, and macroeconomics. The current crisis is a devastating testimony to the fact that mainstream thought in these fields is very deficient, and possibly deeply flawed. At the very moment when governments and central banks, with the encouragement of academic economists, set out to apply the conventional Keynesian policies with ever greater determination, Eduard Braun invites us to step back and reflect about the meaning of finance. This is time well spent, as Braun’s readers will find out.
