That might seem a strange comment for a person running a financial advice website to make. But it's true.
Setting up an SMSF won't automatically result in better returns (in fact, it will backfire badly for some), although SMSFs are a fantastic tool for the right person with the right facts and circumstances.
The reason I say this is because, if you read the media or watch TV, you'd be forgiven for believing that SMSFs outperform industry and retail funds, industry funds outperform retail funds and retail funds outperform, well, something I guess. That is, unless you're seeing the results of the 'research' where it's the other way around.
Perhaps it's just me, but it seems that every other day there's a new form of 'advertising dressed up as research' being released. It seems the finance industry is fast catching up to the volume pumped out by politicians.
The latest piece of research commissioned by someone who, in all likelihood, got exactly the result they were looking for, is a study released by National Australia Bank into the performance of SMSFs versus APRA-regulated super funds. I haven't been able to locate the study itself - it's always easier to find the headline 'results' of a study than the full report with all the assumptions and disclaimers - so I won't be too critical of this particular piece. But for anyone considering setting up a SMSF (or switching to an industry fund) it's important to understand the weaknesses of these types of reports and why they're really not of much use for anything other than marketing.
In brief, the key flaws:
1. The study period.
Anyone commissioning a report of this nature picks a favourable period. If you're paying for research you'd be crazy not to!
The NAB example covers the period 2005-2013, which obviously includes the GFC. It's often mentioned that SMSFs tend to hold more cash than external super funds. During the GFC, this was a good thing. But the other factor peculiar to SMSFs is the transitional arrangement, which ended on 30 June 2007, allowing people to contribute up to $1m.
This meant that there was a bit of a rush at this time by people to set up (rather large) SMSFs, many of which held lots of cash, just before the GFC hit. It also meant many were transferring existing assets into their SMSFs at lower prices than they might have otherwise.
I can't say what impact this has on performance (I suspect no-one can) but it's a specific example of a factor which pollutes any SMSF vs non-SMSF performance data. That's unless being overweight cash in a financial crisis is somehow a special feature of SMSFs.
On top of that, the periods typically studied in these reports are too short. In this example, eight years isn't enough to prove anything. One property downturn, for instance, could completely reverse the results.
2. Apples vs aguajefruit
You've probably never heard of an aguajefruit, but it's nothing like an apple. Nor is the average APRA-regulated super fund anything like the average SMSF (if an average fund of either description even exists - see below).
It's like comparing the performance of an AFL and NRL team. They both play what someone calls 'footy' but the exercise is pointless unless you've got some weird point to prove (and if you do, you'll probably find statistics to support it).
Anyone wanting to set up a SMSF should compare what they're planning to do with their alternative options. If you're trying to decide between Australian Super High Growth and a SMSF administered by Heffron running a portfolio based on the Intelligent Investor growth portfolio, then those are the two things to compare.
The 'average' SMSF includes funds wholly invested in leveraged property and the 'average' APRA-regulated funds include options like AMP Super Cash (essentially a bank deposit). Neither of these is likely to bear any resemblance to your own results.
3. Lies, damn lies and statistics, especially averages
There is a saying in statistics that 'plans based on averages are wrong, on average'. Average performance is great for advertising campaigns but in many cases it's a very poor representation of actual experience. For instance, the average performance of SMSFs will be heavily influenced by the very large funds (many set up in June 2007 - see above) with very low average costs, and investments that might not be available to the average investor. On the other hand, the average performance of APRA-regulated funds is heavily influenced by the savings of investors who haven't given a moment's thought to what they're invested in.
Whenever you look at averages it's important to remember that, not only are they often useless, but they can lead you right up the garden path. For instance, the average age in a classroom might be 17, which has you thinking 'Year 12'. But then you find out it's a really small kindergarten class with an 80-year-old teacher.
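To see how quickly a single outlier drags an average away from anything representative, here's that classroom in a few lines of Python (the ages are made up for the illustration):

```python
from statistics import mean, median

# Hypothetical kindergarten class: five 4-year-olds and one 80-year-old teacher
ages = [4, 4, 4, 4, 4, 80]

print(round(mean(ages), 1))  # about 16.7 - 'must be Year 12'
print(median(ages))          # 4 - the age of almost everyone in the room
```

The mean is skewed by one person; the median tells you what a typical member of the group actually looks like. The same distortion applies when a handful of very large, low-cost SMSFs sit inside an industry-wide average.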
The figures in this example might also contain other flaws. For instance, are some returns net of insurance premiums? Are adjustments made for the non-cash cost of time spent by SMSF trustees? But it really doesn't matter exactly what the flaws are. The key principle is that these types of reports have no bearing on individual decisions.
If you're thinking about a SMSF, an industry fund or retail fund, don't be sucked in by the marketing hype. Work out the options available to you, the costs and forecast returns. Then compare your options on an apples and apples basis.
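That apples-with-apples comparison can be sketched in a few lines. The numbers below are entirely made up for illustration, not forecasts for any real fund; the point is only that percentage fees, fixed fees and your forecast return should be plugged in for each of your actual options:

```python
def projected_balance(start, gross_return, pct_fees, fixed_fees, years):
    """Project a balance net of percentage and fixed annual fees (illustrative only)."""
    balance = start
    for _ in range(years):
        balance *= 1 + gross_return - pct_fees  # net of percentage-based fees
        balance -= fixed_fees                   # admin, audit and similar flat costs
    return balance

# Hypothetical inputs: same starting balance and gross return, different fee shapes
option_a = projected_balance(500_000, 0.07, 0.008, 0, 20)      # e.g. a pooled fund
option_b = projected_balance(500_000, 0.07, 0.002, 3_000, 20)  # e.g. a SMSF
```

Fixed fees punish small balances and percentage fees punish large ones, which is one reason the 'average' result for either structure says so little about yours.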
Leave the aguajefruit for the marketing teams.