VoidHawk
FPS is a little more important than some realise.
It depends on how the developers set it all up, but here's an example of why high fps can be important.
A couple of years ago, people with high fps found they could jump higher than those with low fps, because the game's physics was tied to the frame rate!
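To show what "depends on how the developers set it up" means in practice, here's a rough Python sketch (made-up numbers, a simple semi-implicit Euler integrator, not any real game's code) of jump physics that is stepped once per rendered frame; the frame rate alone changes how high the jump goes:

```python
# Minimal sketch (not any actual game's code): if jump physics is integrated
# once per rendered frame, the simulated apex depends on the frame rate.
GRAVITY = 9.81        # m/s^2
JUMP_SPEED = 5.0      # m/s initial upward velocity (made-up number)

def jump_apex(fps):
    """Simulate a jump one frame at a time and return the highest point reached."""
    dt = 1.0 / fps
    y, vy, apex = 0.0, JUMP_SPEED, 0.0
    while y >= 0.0:
        vy -= GRAVITY * dt   # semi-implicit Euler: velocity first...
        y += vy * dt         # ...then position, once per rendered frame
        apex = max(apex, y)
    return apex

for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> apex {jump_apex(fps):.3f} m")
# With this integration order, higher frame rates land closer to the ideal
# v^2/(2g) ~= 1.274 m, so a high-fps player jumps slightly higher than a low-fps one.
```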
People will say anything above 25 to 30 frames is fine because the human eye supposedly can't detect the difference above that (the old persistence-of-vision argument),
but try telling that to serious gamers and they'll quickly tell you it's bull, and they're right.
PCs not only run at higher frame rates, they also run at higher resolutions. A good PC with a good graphics card will put a console to shame.
Never understood the "people can't see anything above 25-30 fps" thing... I can, and always have. Anything below 50 and I start to see the difference,
anything below 40 and it starts to annoy me. The difference, to my eyes and from a gameplay perspective, between 50+ and 30-40 is staggeringly noticeable (it's
like watching one of those very early black and white films where the projector was hand cranked: the speed and fluidity of the image looks
inconsistent and things start to jump around).
Don't forget things like screen tearing, where the fps is higher than the refresh rate of the monitor and you get frames where the top half of the screen is the
previous/current frame and the bottom half is the next frame. People seem to hate screen tearing; me, I don't care at all and in some ways
actually like seeing it, because it means what I'm playing is running at the best fps I can get. It's hardly off-putting, since you only really notice it when the view you're
seeing is changing rapidly, such as turning around or looking up very quickly... and in those situations you aren't able to properly focus on
what's on screen anyway, although that's probably why people notice it more then. This is what vsync is for: it synchronises the output of frames
with the monitor's refresh rate, which removes tearing but can throttle your FPS unnaturally, causing your performance to go down when it doesn't have to.
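Roughly what that looks like, as a sketch (a simplified model with a hypothetical render_frame() stand-in, not a real graphics API; actual drivers block the buffer swap on the vertical blank rather than sleeping, but the effect on frame timing is the same idea):

```python
import time

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def render_frame():
    """Stand-in for the real rendering work (hypothetical)."""
    time.sleep(0.005)  # pretend the GPU needs 5 ms per frame (~200 fps uncapped)

def present(vsync=True):
    start = time.perf_counter()
    render_frame()
    if vsync:
        # Wait for the next refresh boundary before swapping buffers, so the
        # monitor never shows half of one frame and half of the next (no tearing),
        # but you also never run faster than the refresh rate.
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, REFRESH_INTERVAL - elapsed))
    # With vsync off we'd swap immediately: more fps, but possible tearing.

for _ in range(5):
    t0 = time.perf_counter()
    present(vsync=True)
    print(f"frame took {(time.perf_counter() - t0) * 1000:.1f} ms")  # ~16.7 ms at 60 Hz
```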
For me, if the fps is a constant 60, or at the very least equal to your monitor's refresh rate, everything is fine; fps higher than that value is
pointless, since you aren't going to see the difference, because your monitor can't display it anyway. Which is why I laugh at people saying 'I get 100+
fps in X'. Sure, as a number saying my GPU/CPU can push the output of the graphics engine to this value, that's fine, but to say your play experience is
better because of it is completely false (the only real benefit is headroom: in the 100 fps on a 60 Hz monitor example you'd have
a 40 fps buffer zone for frame rate spikes, so the fps would have to drop by 40+ before you noticed anything but buttery smooth visuals).
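Putting numbers on that 100 fps / 60 Hz example (just the arithmetic):

```python
def what_you_actually_see(rendered_fps, refresh_hz=60):
    """Frames the monitor can actually display, plus the spare headroom."""
    displayed = min(rendered_fps, refresh_hz)     # the panel can't show more than its refresh
    headroom = max(0, rendered_fps - refresh_hz)  # how far fps can drop before you'd notice
    return displayed, headroom

for fps in (100, 75, 60, 45):
    shown, spare = what_you_actually_see(fps)
    print(f"GPU renders {fps} fps -> you see {shown} fps, {spare} fps of headroom")
# e.g. 100 fps rendered -> 60 shown, 40 fps of headroom;
#       45 fps rendered -> 45 shown, no headroom at all.
```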
The problem with consoles is that for a very long time they've been tied to TV resolutions... which, up until this decade, meant something in the region of 320x240... yes, your old
cathode ray TV had that small a resolution. With LCD TVs, HD and digital now the norm, higher resolutions matching the bottom end of PCs are
possible for consoles. The thing is, console GPUs, from my understanding, are nowhere near the level of a PC's (the same way a laptop's GPU is often
inferior to a desktop's), so for cost and manufacturing reasons the GPU in a console is weaker than a PC's, yet they're trying to push HD resolutions as the console
standard... and it just doesn't work very well. A lot of console games are still 720p (that's the vertical size of the image, i.e. 720 pixels
high), which works fine for most current consoles, but on a modern LCD with a larger vertical resolution, such as an HD TV at 1080p, it means
the TV has to upscale the image... which is just ugly.
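You can see why the upscale looks rough just from the arithmetic: 720p to 1080p is a non-integer 1.5x stretch. Here's a simplified nearest-neighbour view of what the TV's scaler has to do (real scalers interpolate instead, which trades blockiness for blur):

```python
# 720p -> 1080p is a non-integer 1.5x scale, so a naive (nearest-neighbour)
# scaler has to duplicate some source pixels and not others.
src_w, src_h = 1280, 720
dst_w, dst_h = 1920, 1080
print(f"scale factor: {dst_w / src_w}x horizontally, {dst_h / src_h}x vertically")  # 1.5x

# Which source row feeds each of the first few output rows:
for out_row in range(6):
    src_row = int(out_row * src_h / dst_h)
    print(f"output row {out_row} <- source row {src_row}")
# Rows come out as 0, 0, 1, 2, 2, 3, ... - an uneven 2-1-2-1 duplication pattern,
# which is why non-integer upscaling tends to look soft or blocky.
```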
As an aside, something I found out recently is that many console first-person shooters like COD, or anything that needs quick, precise targeting, have a degree of 'fuzzy' aim assist
built in, i.e. they register hits you'd otherwise miss if you'd played the same game on PC. They do this to get around the limitations of a console controller
compared to a PC's mouse and keyboard setup. Why would I want to play a game on a gaming system that 'adjusts' my gameplay to make up for its shortcomings?
One thing I've never experienced until now is micro stutter. I recently just got Skyrim (yeah, I'm a few years behind everyone else)... and boy is
that game taxing my patience. It's fun, but damn if it doesn't have the most demanding pre-setup to get looking good and working right of any game I've
ever played on a PC. I get regular stutters where the game plays normally (judging by the sound) but the frames pause for a quarter of a second
in some situations (i.e. anything with a humanoid model being preloaded into the game, or visible on screen for a while initially), creating a 'stutter'
in the visuals that makes playing, especially in combat, VERY hard (like watching a slide show with a quarter-second interval; the thing is, it's not like very
low FPS, which I can handle or avoid)... Supposedly it's because the FPS is higher than the refresh rate of the monitor, which is bollocks in my opinion,
since I've never had the issue with max+ FPS before in other games, but supposedly it's a fairly new issue with a lot of modern games... supposedly...
Given I get it even in situations where my fps is well below the 30s, I think the official explanation is wrong.
So anyway... Consoles are inferior to PCs, and always have been. They've bridged the gap somewhat the last few years, but they just can't catch up, and by
design they never really will (the recent resolution-wars debate is a good example). Modern consoles are trying to be PCs, and in doing so they have
stopped being consoles (and the good things that used to entail), while also trying to be a PC and a media centre... and they fail at all of it.
Consoles are/were for ease-of-use gaming... yet consoles in general are becoming more and more like PC games in that you have to preinstall the damn
games, patch them, etc. etc. Ultimately, why not just buy a PC with a decent GPU (heck, the current gen of GPUs are fairly cheap, run pretty much
everything at the moment perfectly, and will do so well enough for at least the next 3 or 4 years)? It's cheaper in the long run and can be used for
other things.
So yeah, sorry for all that... A PC with a half decent GPU (i.e. any released in the last few years) is and always will be better than a console.
The last console I owned (and still do) is a PS2, and I've played and/or owned every console that's come out since the Atari... I wouldn't touch any of
the modern ones personally.
John_Rodger_Cornman
reply to post by luciddream
What I am talking about is the lag to the television monitor, not the framerate difference between consoles and PC.
Example: PC/VGA card --> computer monitor
compared to...
Console/integrated VGA card --> television monitor
Ummmm... you're talking about tiny fractions of a second, nanoseconds even; it's an electrical signal running up a cable, and you aren't going to get
any sort of lag from that part of the setup unless your TV is a couple of hundred miles from where you're sitting... That's like asking what the lag is
between turning on a light with a wall switch versus pushing the button on a torch. Makes no sense to ask...
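For scale, a back-of-the-envelope sketch (assuming a signal travels through a cable at roughly 0.7x the speed of light, and a made-up couch-distance cable length):

```python
SPEED_OF_LIGHT = 3.0e8       # m/s
PROPAGATION_FACTOR = 0.7     # signals in a cable travel at roughly 0.6-0.8c (assumed)

def cable_delay_seconds(length_m):
    """Time for the signal to travel the length of the cable."""
    return length_m / (SPEED_OF_LIGHT * PROPAGATION_FACTOR)

for length in (2, 10, 300_000):  # couch setup, a long run, and ~200 miles of cable
    d = cable_delay_seconds(length)
    print(f"{length:>7} m of cable -> {d * 1e9:,.0f} ns ({d * 1000:.6f} ms)")
# 2 m works out to about 10 nanoseconds - you'd need a cable a couple of
# hundred kilometres long before the wire itself added even a millisecond.
```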
edit on 15-12-2013 by BigfootNZ because: (no reason given)