Market Logic

Is there a monetary solution to a structural problem?

Posted in economics, macroeconomics, monetary econ by mktlogic on March 12, 2011

Arnold Kling does a pretty good job summing up the monetary economics of Scott Sumner and John Taylor here. It’s a choice between more wrong and less wrong. As I commented there,

Sumner’s view seems to be that the damage of the recession is optional. It isn’t.

For the better part of a decade, labor, capital, and loanable funds were consumed producing houses that turned out to be worth less than the economic cost of producing them. No monetary rule or NGDP target will ever change the fact that the resources consumed in the housing sector have already been consumed. An NGDP target might well have prolonged the illusion that all of the homebuilding was actually an efficient use of resources and made things worse.

Employment won’t return until people identify new projects to invest in, and that’s not something that can be helped by boosting NGDP either. Boosting NGDP just makes it harder for people to identify profitable projects, because the monetary policy further distorts relative prices. A Taylor-rule approach would be less bad, since it would call for less debasement, but an even better approach would be no debasement.

What’s wrong with this?

Posted in behavioral econ, economics, microeconomics by mktlogic on September 6, 2010

Cost-benefit analysis takes market prices as accurate indicators of how much different things are valued. That is a valid assumption about market prices only if it is also assumed that people behave rationally. But if people are behaving rationally, they are already behaving in the way that is best for their welfare. So there can be no argument based on cost-benefit analysis to the conclusion that people should change their behavior in order to make themselves better off.

This “feels wrong” but I can’t say why.

My first reaction would be to name a counterexample like the tragedy of the commons. If multiple parties are fishing in the same pond, no one has any incentive to leave fish behind to breed, because the others will just take those fish. So I might claim, on the basis of a cost-versus-benefit argument, that the fishers using the pond should come up with a new arrangement, like a collectively enforced quota. Another option would be an auction where the winner buys the pond and the winning bid is distributed equally among all of the other people who fish in that pond. Then the winner could rent out fishing rights to the others, so there would be no major disruption.

But this is no counterexample.  Why?  Because we have to make some assumption about the rationality of the fishers.  If we assume that the fishers are not rational, then the costs of setting up an auction or a quota don’t properly reflect the value of the resources that would be used in setting up the auction or quota system.  Maybe it would still be a good idea but the results of any sort of cost/benefit analysis would be unreliable because the inputs to that analysis would be suspect.  On the other hand, if we assume that the fishers are rational, then the fact that they have not already implemented some solution to the commons problem indicates that the actual benefits of a new arrangement are not enough to overcome the actual costs.

Distinguishing structural from cyclical unemployment

Posted in economics, macroeconomics by mktlogic on April 18, 2010

Christina Romer’s recent comments are somewhat troubling.

Romer writes,

“I find it distressing that some observers talk about unemployment remaining high for an extended period with resignation, rather than with a sense of urgency to find ways to address the problem. Behind this fatalism, there seems to be a view that perhaps the high unemployment reflects structural changes or other factors not easily amenable to correction. High unemployment in this view is simply “the new normal.” I disagree.

Deficient Aggregate Demand Is Key. The high unemployment that the United States is experiencing reflects a severe shortfall of aggregate demand. Despite three quarters of growth, real GDP is approximately 6 percent below its trend path.”

Romer goes on to argue, based on something like Okun’s law, that because the current unemployment rate and growth in GDP (measured as income) are consistent with a historical relationship between unemployment and GDP (measured as output), current unemployment is best attributed to cyclical factors.
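For readers who don’t carry the rule of thumb around, here is a minimal sketch of that Okun-style arithmetic. The coefficient and natural rate below are my illustrative assumptions, not Romer’s estimates:

    # Okun's law rule of thumb: u - u_natural is roughly -0.5 times the output gap.
    # The coefficient and natural rate are assumed here for illustration only.
    okun_coefficient = 0.5   # percentage points of unemployment per percent of gap
    natural_rate = 5.0       # percent, assumed
    output_gap = -6.0        # percent below trend, Romer's figure

    implied_unemployment = natural_rate - okun_coefficient * output_gap
    print(implied_unemployment)  # 8.0, in the rough neighborhood of the observed rate

The point of her argument is that the observed unemployment rate is close to what the historical relationship would predict from the output shortfall alone.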

Some problems with this line of reasoning: First, it’s an incomplete argument to estimate the relationship between unemployment and output growth and then cite current-period values of these variables as evidence against structural unemployment. It may well be the case that the existing relationship was observed in periods where employment problems were structural. Romer seems to take it as given that the sample period used to determine the “normal” relationship between unemployment and GDP growth was characterized by purely cyclical employment effects.

Beyond that, Romer misses the relevant problem of observational equivalence. Here’s a simple analogy: If consumers want apple pies but producers gear up to make cream pies instead, apple growers will soon be converting their orchards to grazing fields to fill the sudden increase in orders for dairy cream to be sent to bakers. Workers who might have gone into botany will go into veterinary medicine to satisfy the demand for large-animal veterinarians. Once the cream pies hit the market, they will be received poorly by consumers and there will be a surplus of pies. Dairies will lay off workers to cope with falling revenues. To an aggregate demand theorist, this purely structural episode will look like a case of declining aggregate demand. And there is a reason for that: To an aggregate demand believer, everything looks like a case of fluctuating aggregate demand.

This is the ultimate problem with aggregate demand theories of macroeconomic fluctuations. They provide no way of distinguishing between changes caused by declining aggregate demand and changes caused by supply side errors.

Is data mining really the problem?

Posted in data mining, finance by mktlogic on August 12, 2009

In the WSJ, Jason Zweig writes,

“The stock market generates such vast quantities of information that, if you plow through enough of it for long enough, you can always find some relationship that appears to generate spectacular returns — by coincidence alone. This sham is known as “data mining.”

Every year, billions of dollars pour into data-mined investing strategies. No one knows if these techniques will work in the real world. Their results are hypothetical — based on “back-testing,” or a simulation of what would have happened if the manager had actually used these techniques in the past, typically without incurring any fees, trading costs or taxes.”

I think I agree with the spirit of what Zweig says, but articles like this always bug me for a handful of reasons.

First, any investing thesis is either based on past data or it is based on no data. There are problems with back testing versus other ways of using data, but the reliance on past data is not the problem.

Second, using the term “data mining” as some kind of slur for sloppy exploratory data analysis is misleading. Most of what Zweig criticizes isn’t strictly data mining and in fact his recommended alternatives are closer to actual data mining practice.

What Zweig actually seems to criticize are specification searching and parameter searching, and those really are problems (What are the odds of not getting a t-stat greater than 2 if you try 50 variations on a model?), but that’s not data mining. Zweig does recommend some alternatives, but it’s worth mentioning the alternative implied in textbook statistical analysis: Come up with an idea, test it under one specification, and let that be the end of it. I doubt that anyone actually does this. What people try to do is come up with an idea and test it under a handful of reasonable-seeming specifications. This has a high risk of devolving into a statistically sloppy specification search. Or you can actually do real data mining, i.e., exhaustively or nearly exhaustively testing lots of models and using cross-validation and out-of-sample analysis. So the choice is really between testing a handful of specifications or testing lots.
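To put numbers on that parenthetical, assuming 50 independent tests each with a 5 percent false-positive rate when the null is true (an idealization, but it makes the point):

    # Chance of seeing no t-stat above 2 across 50 independent specifications
    # when every null hypothesis is true and each test has a 5% false-positive rate.
    p_false_positive = 0.05
    n_specifications = 50

    p_no_hits = (1 - p_false_positive) ** n_specifications
    print(round(p_no_hits, 3))      # about 0.077
    print(round(1 - p_no_hits, 3))  # about 0.923 chance of at least one spurious "finding"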

Zweig actually recommends the use of out-of-sample analysis and giggle testing, a.k.a. asking “Does this idea make sense?”, but I have no idea why he mentions these as alternatives to data mining. Out-of-sample testing is a standard practice in data mining. Giggle testing can be used in conjunction with any other approach, but it really just amounts to asking “Is this idea compatible with what I believed yesterday?”
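For anyone who hasn’t seen it, the out-of-sample idea is nothing more exotic than choosing a rule’s parameters on one stretch of history and grading it on a later, untouched stretch. A minimal sketch, with simulated returns standing in for real data:

    # Minimal out-of-sample split; the simulated return series is a placeholder.
    import numpy as np

    rng = np.random.default_rng(0)
    returns = rng.normal(0.0, 0.01, size=1000)  # stand-in for a daily return series

    cutoff = int(len(returns) * 0.7)
    in_sample, out_of_sample = returns[:cutoff], returns[cutoff:]
    # Pick the strategy's parameters using in_sample only, then report
    # performance measured on out_of_sample exactly once.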

Zweig isn’t all wet. In fact most of what he criticizes as data mining is really worthy of criticism. It just isn’t data mining.

What should we test when we test technical trading rules?

Posted in economics, finance by mktlogic on May 13, 2009

I recently read Mebane Faber’s paper, A Quantitative Approach to Tactical Asset Allocation, which is apparently quite popular on SSRN.

Abstract:
The purpose of this paper is to present a simple quantitative method that improves the risk-adjusted returns across various asset classes. A simple moving average timing model is tested since 1900 on the United States equity market before testing since 1973 on other diverse and publicly traded asset class indices, including the Morgan Stanley Capital International EAFE Index (MSCI EAFE), Goldman Sachs Commodity Index (GSCI), National Association of Real Estate Investment Trusts Index (NAREIT), and United States government 10-year Treasury bonds. The approach is then examined in a tactical asset allocation framework where the empirical results are equity-like returns with bond-like volatility and drawdown.

Early in the paper Faber presents a simple and mechanical market timing procedure:

BUY RULE: Buy when monthly price > 10-month SMA.
SELL RULE: Sell and move to cash when monthly price < 10-month SMA.

The remainder of the paper describes the performance of a hypothetical portfolio that adheres to these two rules.

Faber’s procedure is, in fact, one instance in a class of procedures of the form “Buy (sell) when the price is above (below) the N-period SMA.” Whenever I see papers like his, I’m curious as to why the focus is on testing the instance rather than the class. That is, how would performance have been if the wrong lookback had been used?
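Here is a sketch of what testing the class rather than the instance might look like. This is my illustration of the idea, not Faber’s code; it assumes monthly prices in a pandas Series and ignores transaction costs and the return on cash:

    # Evaluate the whole family of "price vs. N-month SMA" rules, not just N = 10.
    # `prices` is assumed to be a monthly price Series; cash earns zero here.
    import pandas as pd

    def sma_timing_returns(prices, lookback):
        sma = prices.rolling(lookback).mean()
        in_market = (prices > sma).shift(1, fill_value=False)  # act on last month's signal
        monthly_returns = prices.pct_change()
        return monthly_returns.where(in_market, 0.0)           # flat (in cash) otherwise

    def sweep_lookbacks(prices, lookbacks=range(2, 25)):
        # terminal wealth per dollar invested, for each lookback in the class
        return pd.Series({n: (1 + sma_timing_returns(prices, n)).prod()
                          for n in lookbacks})

Tabulating that sweep shows how sensitive the result is to the choice of lookback, which is closer to the question I raise below.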

I’m sure that there are plenty of similar classes of procedures based on channel breakouts, trendlines, volatility bands and so on that outperform buy and hold over the same period with the right lookback. Such procedures would probably also work well enough with non-optimal lookbacks.

Another class of market timing procedure might be “Buy on Jan 1, 1900 and switch back and forth from assets to cash every N days.” I’m very sure that for the right values of N, this timing procedure can generate results better than buy and hold. I’m also confident that for the wrong values of N this procedure would work very poorly.

This isn’t a complaint about data mining for the right lookback to make a trading strategy seem better than it is. (To his credit, Faber’s paper includes out of sample tests of his procedure.) Rather, knowing how well a market timing procedure works for any given lookback period is just not that useful. Since it is impossible to know the optimal lookback ahead of time, the relevant question either for tests of market efficiency or for active management is “How well does the class of market timing procedures work when the lookback used is non-optimal?”

Another Great Depression would be cheaper

Posted in economics, macroeconomics by mktlogic on March 31, 2009

From Bloomberg: Financial Rescue Nears GDP as Pledges Top $12.8 Trillion. The article includes an itemized list.

                                  --- Amounts (Billions)---
                                   Limit          Current
===========================================================
Total                            $12,798.14     $4,169.71
-----------------------------------------------------------
 Federal Reserve Total            $7,765.64     $1,678.71
  Primary Credit Discount           $110.74        $61.31
  Secondary Credit                    $0.19         $1.00
  Primary dealer and others         $147.00        $20.18
  ABCP Liquidity                    $152.11         $6.85
  AIG Credit                         $60.00        $43.19
  Net Portfolio CP Funding        $1,800.00       $241.31
  Maiden Lane (Bear Stearns)         $29.50        $28.82
  Maiden Lane II  (AIG)              $22.50        $18.54
  Maiden Lane III (AIG)              $30.00        $24.04
  Term Securities Lending           $250.00        $88.55
  Term Auction Facility             $900.00       $468.59
  Securities lending overnight       $10.00         $4.41
  Term Asset-Backed Loan Facility   $900.00         $4.71
  Currency Swaps/Other Assets       $606.00       $377.87
  MMIFF                             $540.00         $0.00
  GSE Debt Purchases                $600.00        $50.39
  GSE Mortgage-Backed Securities  $1,000.00       $236.16
  Citigroup Bailout Fed Portion     $220.40         $0.00
  Bank of America Bailout            $87.20         $0.00
  Commitment to Buy Treasuries      $300.00         $7.50
-----------------------------------------------------------
  FDIC Total                      $2,038.50       $357.50
   Public-Private Investment*       $500.00          0.00
   FDIC Liquidity Guarantees      $1,400.00       $316.50
   GE                               $126.00        $41.00
   Citigroup Bailout FDIC            $10.00         $0.00
   Bank of America Bailout FDIC       $2.50         $0.00
-----------------------------------------------------------
 Treasury Total                   $2,694.00     $1,833.50
  TARP                              $700.00       $599.50
  Tax Break for Banks                $29.00        $29.00
  Stimulus Package (Bush)           $168.00       $168.00
  Stimulus II (Obama)               $787.00       $787.00
  Treasury Exchange Stabilization    $50.00        $50.00
  Student Loan Purchases             $60.00         $0.00
  Support for Fannie/Freddie        $400.00       $200.00
  Line of Credit for FDIC*          $500.00         $0.00
-----------------------------------------------------------
HUD Total                           $300.00       $300.00
  Hope for Homeowners FHA           $300.00       $300.00
-----------------------------------------------------------
The FDIC’s commitment to guarantee lending under the
Legacy Loan Program and the Legacy Asset Program includes a 
$500 billion line of credit from the U.S. Treasury.

I wonder what Bernanke, Geithner, et al. are using as their estimated decline in GDP under the scenario of no policy response.
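For what it’s worth, a crude back-of-the-envelope comparison is possible. The GDP figure and the Depression-scale decline below are my rough assumptions, not anything from the Bloomberg piece:

    # Crude comparison of the pledge total to a Depression-scale output loss.
    # All inputs are rough assumptions for illustration only.
    gdp = 14.0                 # trillions of dollars, roughly 2009 US GDP
    pledges = 12.8             # trillions, the "Limit" total above
    depression_decline = 0.30  # rough peak-to-trough real GDP drop, 1929-1933

    one_year_loss = gdp * depression_decline
    print(round(pledges / gdp, 2))            # pledges are about 0.9 of a year's GDP
    print(round(pledges / one_year_loss, 1))  # about 3 years of Depression-depth losses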

There was no credit crunch

Posted in economics, macroeconomics by mktlogic on March 27, 2009

The economic crisis may have been the result of any number of factors, but the frequently cited “credit crunch” narrative doesn’t square with reality.  Bank credit never stopped growing.  The following graphs from the Fed give year-on-year percent changes in total loans and leases at commercial banks, total bank credit and total real estate loans.  Whatever the causes of the current crisis may have been, it’s hard to believe that declining availability of credit was among them.

[Charts from the Fed: year-on-year percent change in total loans and leases at commercial banks, total bank credit, and real estate loans]
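The underlying series are all on FRED, so the charts are easy to reproduce. A sketch using pandas-datareader follows; the series codes are my guesses at the relevant aggregates and should be double-checked against the FRED catalog:

    # Year-over-year percent changes in bank credit aggregates from FRED.
    # The series IDs are assumptions; verify them before relying on the picture.
    import pandas_datareader.data as web

    series_ids = ["LOANS", "TOTBKCR", "REALLN"]  # loans & leases, total bank credit, real estate loans
    data = web.DataReader(series_ids, "fred", start="2000-01-01", end="2009-03-27")
    monthly = data.resample("M").last()
    yoy_pct_change = monthly.pct_change(12) * 100
    print(yoy_pct_change.tail())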

Goodbye, homo economicus. Hello, what?

Posted in behavioral econ, economics, philosophy by mktlogic on March 27, 2009

Concerning the recent financial crisis, Anatole Kaletsky writes

“Academic economists have thus far escaped much blame for the crisis. Public anger has focused on more obvious culprits: greedy bankers, venal politicians, sleepy regulators or reckless mortgage borrowers. But why did these scapegoats behave in the ways they did? Even the greediest bankers hate losing money so why did they take risks which with hindsight were obviously suicidal? The answer was beautifully expressed by Keynes 70 years ago: “Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.”

What the “madmen in authority” heard this time was the distant echo of a debate among academic economists begun in the 1970s about “rational” investors and “efficient” markets. This debate began against the backdrop of the oil shock and stagflation and was, in its time, a step forward in our understanding of the control of inflation. But, ultimately, it was a debate won by the side that happened to be wrong. And on those two reassuring adjectives, rational and efficient, the victorious academic economists erected an enormous scaffolding of theoretical models, regulatory prescriptions and computer simulations which allowed the practical bankers and politicians to build the towers of bad debt and bad policy.”

The whole article continues in a similar vein:

More challenging to the orthodoxy of academic economics have been approaches that rejected the principle that economic behaviour could be described by precise mathematical relationships at all. Benoit Mandelbrot, one of the great mathematicians of the 20th century, who pioneered the analysis of chaotic and complex systems, describes, in The (Mis)behaviour of Markets, how economists ignored 40 years of progress in the study of earthquakes, weather, ecology and other complex systems, partly because the non-Gaussian mathematics used to study chaos did not offer the precise answers of EMH. The fact that the answers provided by EMH were wrong seemed no deterrent to “scientific” economics.

This is fairly representative of the large number of recent articles and essays suggesting that the current economic distress calls for a rejection of some or all of what the author believes to be the mainstream approach to economics.

Some of these criticisms are valid, although the recent crisis doesn’t make them any more valid. But most miss their targets entirely. Eugene Fama was writing (with Mandelbrot!) about fat tails and non-Gaussian distributions from about the same time he was writing about efficient markets. Bad loans were originated mostly on the basis of data-driven credit scoring models, not general equilibrium models where the representative agents have rational expectations. And so on…

That doesn’t mean there aren’t things about economics that could use some fixing.  But the proposed solution has to offer some potential for improvement.  Theories based on animal spirits, irrational agents, etc. can explain anything but economists ought to be able to do more than point out that “anything is possible and anything can happen.”  Even after the fact, theories based on irrationality and animal spirits don’t add very much to the historical narrative.  There may come a time when economics will be improved by a deliberate effort to reconcile mainstream economics with the methods and findings of psychology and sociology.  I’ll wait until those calling for such a change can enumerate a set of new and improved theories and their concrete implications.

An exception to the tendency for actuarial methods to outperform clinical methods?

Posted in actuarial vs clinical prediction, statistics by mktlogic on January 7, 2009

On page 103 of “The Death of Economics” Paul Ormerod writes:

“In the same way, the macro-economic models are unable to produce forecasts on their own. The proprietors of the models interfere with their output before it is allowed to see the light of day. These ‘judgmental adjustments’ can be, and often are, extensive. Every model builder and model operator knows about the process of altering the output of a model, but this remains something of a twilight world, and is not well documented in the literature. One of the few academics to take an interest is Mike Artis of Manchester University, a former forecaster himself, and his study carried out for the Bank of England in 1982 showed definitively that the forecasting record of models, without such human intervention, would have been distinctly worse than it has been with the help of the adjustments, a finding which has been confirmed by subsequent studies.”

Setting up a caching proxy server in Ubuntu

Posted in computers, linux by mktlogic on November 25, 2008

Just a quick summary from a more detailed source:

  1. Get squid and squid-prefetch: sudo apt-get install squid squid-prefetch
  2. Edit the configuration file (sudo vi /etc/squid/squid.conf) so that the visible_hostname line reads visible_hostname localhost
  3. Tell squid where to listen by changing the http_port line (e.g. http_port a.b.c.d:3128) to use the IP address of the intended listening interface.
  4. After the set of lines beginning with acl, add acl squidusers src 192.168.1.0/255.255.255.0, or whatever network the proxy should serve.
  5. Find the set of lines that begin with http_access allow and add one more reading http_access allow squidusers, and still another line reading http_access deny all.
  6. Comment out the line that reads http_access deny !Safe_ports