Actually this question comes up repeatedly, and while in theory you can't tell the difference above 30fps, in practice you certainly can. Many moons ago I used to run an old first-person shooter at 30fps on average; when I upgraded to a newer machine it ran at about 65fps, and the difference was very clear, though by no means as great as the difference between, say, 25 and 30fps. So it's very much a case of diminishing returns.
I should point out that I get 150fps in favourable conditions; it can drop to 50-60fps when the screen gets pretty heavy.
I bought it at that spec for the next generation of sims, and because I didn't want to buy another machine for five years, by which time it'll no doubt be crawling at 5fps on then-current software.
Matt.