November 30, 2009

The 7 fund stats that mislead investors the most

My friend Keith works for a big mutual fund company and assumes I hate mutual funds because, he says, I "always write about the things we do wrong."


He insists that fund companies don't actually do much wrong, because they follow the rules and regulations and they'd get in trouble if they violated those standards.


While he's right from a legal standpoint, Keith ignores the simple truth that the rules leave fund companies plenty of ways to fudge both the statistics and what the numbers mean. What's more, industry practices let fund companies and research firms hype red herrings: information that's attractive but not necessarily meaty and important.

In our recent discussion, I laid out for Keith what I considered the most misleading statistics and data in the fund world. The longer the conversation ran on, the more I realized that most fund investors don't necessarily know how this information can be used against them.
If these points factor into your investment decisions, you may want to look more closely at their meaning:


1. Past performance, Part I: The candy of the mutual fund world, past performance is where a fund "tastes great" and there are no consequences for indulging. Fund executives publish fine-print warnings that past results are not a reliable indicator of what to expect going forward, but that's always below the large-type hype using those results as a big reason why you should buy a fund now.

So long as investors use past performance to frame future expectations and make it the key reason for buying a fund, management will promote a statistic that they know is bad for you.

2. Past performance, Part II: Some funds achieve their record the old-fashioned way: through shenanigans and financial engineering. Fund companies routinely merge away their bad track records. If XYZ Growth is a laggard but XYZ Large-Cap -- run by the same management team -- has reasonable performance, the growth fund will get the axe and the strong record will survive. Never mind that many investors had a lesser experience -- or that management has shown an ability to underperform -- the snapshot view looks good.

Similarly, fund companies pitch new funds as their "best new ideas." What they don't say is that they "incubate" funds, creating a bunch of new issues using house money. The best performers -- and their newly minted track record -- go public. You get the fund that sticks as opposed to the one that stinks, but you might have judged the new fund differently if you knew it was merely the best of a bad crop.

3. Past performance, Part III: The long-term annualized average record looks good but ignores the question "What have you done for me lately?" Some funds live off great past performance; they haven't been solid for years, but big numbers produced in the distant past make them look ironclad.
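
To see how stale numbers can prop up a long-term average, here's a minimal sketch with made-up yearly returns: one huge year a decade ago keeps the 10-year annualized figure respectable even though the fund has gone nearly nowhere since.

```python
# Hypothetical yearly returns: one huge year long ago, little since.
yearly_returns = [0.80, 0.02, 0.01, 0.00, -0.03, 0.02, 0.01, 0.00, 0.02, 0.01]

growth = 1.0
for r in yearly_returns:
    growth *= 1 + r

# The 10-year annualized average still looks healthy...
annualized = growth ** (1 / len(yearly_returns)) - 1
print(f"10-year annualized return: {annualized:.1%}")  # about 6.7%

# ...even though the most recent five years were nearly flat.
recent_growth = 1.0
for r in yearly_returns[-5:]:
    recent_growth *= 1 + r
print(f"5-year annualized return:  {recent_growth ** (1 / 5) - 1:.1%}")  # about 1.2%
```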

4. Average cost: While there is no guarantee that cheap management is good management, costs matter. That's why many investors base their purchase on the average cost for the type of fund they want. The average expense ratio is roughly 1.3% for stock funds and about 1% for bond funds.

In general, investors think that "below-average" is sufficient. What they don't know is that the average is skewed dramatically by the way it is calculated. A "dollar-weighted average" -- calculated so that a fund with $10 billion in assets affects the average more than a fund with $10 million -- drops the "average" expense ratio significantly; the typical cost for investors in stock funds falls below 1%. (That's good, because it means investors gravitate to low-cost funds.)
Sadly, it's not just investors who don't look at dollar-weighted average expenses; it's fund directors too. By using the bloated numbers that overemphasize tiny and obscure funds, a board can keep costs high without, theoretically, letting them go much "above average."
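
To make the difference concrete, here's a minimal sketch with a made-up fund lineup; the fund sizes and expense ratios are hypothetical, but they show how an equal-weighted "average" can sit well above what investors' dollars actually pay.

```python
# Hypothetical fund lineup: (assets in dollars, expense ratio).
# The tiny funds carry high fees; the giant fund is cheap.
funds = [
    (10_000_000_000, 0.0070),   # $10 billion fund charging 0.70%
    (10_000_000,     0.0180),   # $10 million fund charging 1.80%
    (25_000_000,     0.0165),
    (50_000_000,     0.0150),
]

# Simple average: every fund counts the same, no matter how small.
simple_avg = sum(er for _, er in funds) / len(funds)

# Dollar-weighted average: each fund counts in proportion to its assets,
# which is closer to what the typical investor actually pays.
total_assets = sum(assets for assets, _ in funds)
weighted_avg = sum(assets * er for assets, er in funds) / total_assets

print(f"Simple average expense ratio:          {simple_avg:.2%}")    # about 1.41%
print(f"Dollar-weighted average expense ratio: {weighted_avg:.2%}")  # about 0.71%
```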

5. Returns aren't adjusted for taxes: The fund company doesn't pay Uncle Sam; you do. Funds tell you what they earned, when what's most important is what you get to keep.
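
A rough sketch of that gap, with made-up numbers and a deliberately simplified tax treatment (a single flat rate on an annual distribution, ignoring state taxes and the eventual sale of shares):

```python
# Rough illustration of the gap between reported and after-tax returns.
start_value = 10_000
reported_return = 0.08   # what the fund reports for the year
distribution = 0.05      # portion of the return paid out as a taxable distribution
tax_rate = 0.25          # hypothetical flat tax rate on that distribution

pre_tax_end = start_value * (1 + reported_return)
tax_bill = start_value * distribution * tax_rate
after_tax_end = pre_tax_end - tax_bill

print(f"Reported return:  {reported_return:.1%}")                       # 8.0%
print(f"After-tax return: {(after_tax_end / start_value) - 1:.2%}")     # 6.75%
```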

6. Time-weighted performance measurement: This boils down to "your mileage may vary." The typical pattern for a hot mutual fund is that assets flow in after a period of great performance; in other words, investors tend to buy at the high points. If the fund suffers thereafter and shareholders bail out, they have sold low.

Meanwhile, the average performance numbers can continue to look pretty good.
What the fund does after your money arrives is all that matters. Funds that have feast-or-famine performance can look good when performance is annualized or smoothed out over several years, but the real question is whether investors actually captured what the fund claimed to deliver. You'll need independent research -- Morningstar Inc. measures these "investor returns" -- to know for sure, because fund companies won't tell you.
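
Here's a minimal sketch of that divergence, with made-up returns and cash flows: the fund's time-weighted number treats each year equally, while a hypothetical investor who piles in after the hot year ends up underwater.

```python
# Made-up two-year example: the fund gains 50% in year 1, then loses 20%
# in year 2. Most of the investor's money arrives only after the hot year.
year1, year2 = 0.50, -0.20

# Time-weighted return: what the fund reports, independent of cash flows.
time_weighted = ((1 + year1) * (1 + year2)) ** 0.5 - 1
print(f"Fund's reported annualized return: {time_weighted:.1%}")  # about +9.5%

# Hypothetical investor: $1,000 at the start, $10,000 more after year 1
# (chasing the hot performance), then holds through year 2.
early_money = 1_000 * (1 + year1) * (1 + year2)   # $1,200
late_money = 10_000 * (1 + year2)                 # $8,000
total_in = 1_000 + 10_000
total_out = early_money + late_money

print(f"Investor put in ${total_in:,}, ended with ${total_out:,.0f}")
print(f"Investor's overall gain/loss: {total_out / total_in - 1:.1%}")  # about -16%
```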

7. Manager tenure: Studies show that managers with years on the job have better performance than their short-term brethren. Still, manager tenure has no effect on what happens next; it's not like a manager who has a decade at the helm automatically gets a 1% edge on future performance. Plenty of experienced managers got crushed in 2008; experience was no cushion.

Source: MarketWatch
