Here's an issue about investment risk and return that has been bugging me since a conversation with a guy who really should know better.
Start with the basics. We take it for granted that investors do not like risk; that they will take risk only if compensated, and compensation means a higher rate of return. Oh, and for present purposes, risk = volatility.
So far, fine. Now, consider AnnCo, the pension manager surveying its record over the past 10 years. The risk-free rate is five percent; AnnCo took some risk and got eight percent. Oh wait, AnnCo actually got eight percent per year "with notably rare exceptions," i.e., years seven, eight, nine and ten, when it earned nothing. Did AnnCo do better than risk-free? Obviously not. Risk-free has accumulated $1.63 on a dollar over 10 years; AnnCo, just $1.59 (AnnCo's 10-year geometric mean is 4.7 percent).
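The arithmetic can be checked in a few lines, assuming (as the dollar figures imply) that the "notably rare exceptions" were zero-return years:

```python
# 5% risk-free every year vs. AnnCo: 8% in years 1-6, 0% in years 7-10
# (assumed from the $1.59 figure in the text).
risk_free = 1.05 ** 10
annco = 1.08 ** 6 * 1.00 ** 4

print(round(risk_free, 2))  # 1.63
print(round(annco, 2))      # 1.59

# AnnCo's annualized (geometric) mean over the decade:
print(round((annco ** 0.1 - 1) * 100, 1))  # 4.7
```

The point of the geometric mean: averaging 8, 8, 8, 8, 8, 8, 0, 0, 0, 0 arithmetically gives 4.8, but what compounds a dollar into $1.59 is the 4.7 percent figure.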
You knew that? Course you did. Grownups understand it easily enough, although it's surprising how often the point gets lost in discussion. But try this. PenCo finds it averaged 7.75 percent over the last 20 years; therefore it thinks it is justified in using 7.75 as its estimate going forward.
Is this a fair analysis? I think not. Seems to me the real question is "what was the risk-free rate of return over the last 20 years?" One way to answer is to ask, "what kind of return would you have earned if you had bought Treasuries in 1991?" The envelope, please--turns out that 10-year Treasuries were yielding over seven percent; 30s, over eight. So at 7.75 percent, PenCo isn't even beating risk-free. Does anybody think the next 20 years will give us 7.75 percent risk-free? Treasuries sure don't; Bloomberg tonight quotes 10s at 3.17 percent, 30s at 4.31 percent. So for a 20-year horizon, figure about half the 7.75 percent rate.
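A rough sketch of the comparison, using the approximate yields quoted above (eight percent for the 1991 long bond is a round-number stand-in, not an exact quote):

```python
# $1 compounded for 20 years: PenCo's record vs. a hypothetical
# buy-and-hold 30-year Treasury bought in 1991 at ~8%.
penco = 1.0775 ** 20
treasury_30 = 1.08 ** 20

print(round(penco, 2))        # 4.45
print(round(treasury_30, 2))  # 4.66
```

On these assumptions, the "risk-free" buyer finishes ahead of PenCo--which is the whole point: PenCo took risk and got paid less than nothing for it.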
H/T again to Ignoto without whom this post would have been far more trivial and less interesting.