"Please, explain to me HOW this is bad optimisation, and not factors like physics and shadows. Please, entertain me. Better yet, explain to me how you know what the AI in an unreleased game is doing."
Simple: because factors like physics are not affected by the resolution. Ergo, if it has no effect below 1024x768, it's due to graphics issues, not the AI/physics engine. Physics and AI are done by skeletal points in the various critters, and those skeletal points don't change with resolution.
Shadows I'll grant, but the CPU can hand those off to the GPU.
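To make it concrete, here's a minimal sketch of how a typical game loop splits the work (made-up names, obviously not Valve's actual code): the simulation step never even sees the render resolution, so changing it can't affect what the physics or AI cost.

```cpp
// Minimal sketch of a typical game loop -- hypothetical names, not code
// from any real engine. The point: the simulation step never sees the
// render resolution, so changing it can't touch physics or AI cost.
#include <vector>

struct Bone { float x, y, z; };          // one skeletal point on a critter

struct Critter {
    std::vector<Bone> skeleton;          // what the physics/AI operate on
};

// Physics/AI: driven purely by elapsed simulation time.
// Note there's no width/height parameter anywhere in here.
void updateSimulation(std::vector<Critter>& critters, float dt) {
    for (auto& c : critters)
        for (auto& b : c.skeleton)
            b.z -= 9.8f * dt;            // toy gravity on each bone
}

// Rendering: the only place the resolution ever appears.
void render(const std::vector<Critter>& critters, int width, int height) {
    // project each skeleton into a width x height framebuffer...
}

int main() {
    std::vector<Critter> critters(10, Critter{std::vector<Bone>(20)});
    int width = 1024, height = 768;          // drop to 640x480 and render() gets cheaper
    for (int frame = 0; frame < 100; ++frame) {
        updateSimulation(critters, 0.016f);  // ...but this line costs the same either way
        render(critters, width, height);
    }
}
```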
And let's not even talk about how easy it is to obtain a Beta version of a game, first off. And secondly, the AI in HL set a new standard for AI in video games. The developers, not the marketing team, have talked about little besides the advancements they've been making to the AI.
So if the first point doesn't satisfy, the second point should.
"Put your resolution to 1600x1200 and try maxxing AA and AF. You will cripple your frame rate. If the GPU isn't doing enough, that's your fault. "
The improvement from 1280x1024 to 1600x1200 is minimal compared with the other problems it causes.
One of them being that any text is too small to read from the distance I sit at.
"Pixel Shader 2.0 didn't exist until DX9, and that IS new, graphics wise. But that's my point, Bump Mapping ISN'T new, but it still gets supported, unlike AA and AF."
You also can't force something like Bump Mapping globally and expect it to work. So do you blame the game developers, or the video cards' driver developers, for giving you the option?
"Well, 30fps is not a good frame rate for an FPS game. Myself, I can't play below 60."
Sorry to hear that, I can.
"So you'd like your desktop resolution to be your game resolution? I doubt that, but we'll pass on it for now."
Doubt it all you like; oddly enough, my desktop runs at the same resolution I play most games in...
"Considering the tricks that nVidia has pulled over the last year, what you see in your control panel doesn't tend to be what you see on screen. Not to mention that nVidia AA is *terrible*."
I see no difference in quality from ATI, only in performance. It's also not a hardware issue, it's a driver issue.
"That's because those games don't give the FX 2.0 pixel shaders. Developers have pretty much given up on the FX range as DX9 cards, and instead send them DX8 tasks to do."
That's funny, the image looks the same... The loss of 10-15fps doesn't faze me, and I still get the same image quality.
"Don't quote urban myth as fact. The human eye can distinguish between FPS faaaar above 30fps."
The eye is also heavily limited by the processor behind it: even when the eye takes in a higher framerate, the brain doesn't necessarily perceive it.
"What are you talking about? What the hell can an application do to get around the fact that the monitor can only refresh so fast???"
Reduce its dependency on framerate.
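That's not magic, either. The standard trick is a fixed simulation timestep, and here's a bare sketch of one (hypothetical names, just illustrating the idea): the logic ticks at a constant rate no matter how many frames the renderer pushes out.

```cpp
// Sketch of a fixed-timestep loop (hypothetical names): the game logic
// ticks in constant 10 ms steps no matter how fast frames come out,
// so the game plays the same at 30fps as it does at 100fps.
#include <chrono>

void stepGameLogic(double dtSeconds) { /* physics, AI, input... */ }
void renderFrame() { /* draw the current state... */ }

int main() {
    using clock = std::chrono::steady_clock;
    const double STEP = 0.010;               // one logic tick = 10 ms
    double accumulator = 0.0;
    auto previous = clock::now();

    for (int frame = 0; frame < 1000; ++frame) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // A fast machine renders several frames per logic tick; a slow one
        // runs several ticks per frame. Either way the simulation advances
        // at the same rate in real time.
        while (accumulator >= STEP) {
            stepGameLogic(STEP);
            accumulator -= STEP;
        }
        renderFrame();
    }
}
```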
"Global settings have never been a good idea, they never worked well, and they never will work well. Know why? Because no two games are alike. You seem quite content to have global settings... good for you, but that doesn't change the fact that some apps need AA, some don't, seem need more, some need less. Global settings do nothing but force AA onto an app, and that's just wasting frame rates. It's stupid."
That's the glory of global settings: you can change them whenever. Both ATI and nVidia have little tools that sit on your taskbar and let you change the AA/AF to whatever level you want with two clicks. I'm sorry if that's too much for you and all, but it's there. They come with every video card.
The point is it's not the developers' fault you have them. But if the developers can avoid doing something because you can do it manually, then they can save that time and put it into other things, like making the game better. Good Graphics don't make a Good Game. There are better things they can do with their time, especially when they're given deadlines. If I have to click two or three times more to enjoy a game better, so be it; it's not like it wastes any of my time.
And as for wasting framerates, framerates don't make the world go around. You can operate quite smoothly at lower framerates, and if you seriously have a problem with V-Sync, all I can say is get a better monitor or force the settings higher.
I mean, sheez, a nice ViewSonic monitor like the one I have needs to bust 100+ Hz in order to tear, and 60fps doesn't hit that. Or even 85, if you want some gusto. And there's no palpable reason, save for maybe Doom2, why you would want or need a game running at 90fps or higher.
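And capping a game to the refresh rate isn't exactly rocket science either. Here's a bare-bones frame-limiter sketch (hypothetical names; real v-sync waits on the monitor's vertical blank rather than a timer, but the capping idea is the same):

```cpp
// Bare-bones frame limiter (hypothetical names): hold each frame to the
// monitor's refresh interval, here 85 Hz. Real v-sync waits on the
// monitor's vertical blank instead of a timer, but the cap is the idea.
#include <chrono>
#include <thread>

int main() {
    using namespace std::chrono;
    const duration<double> frameInterval(1.0 / 85.0);   // ~11.8 ms per frame
    for (int frame = 0; frame < 1000; ++frame) {
        auto start = steady_clock::now();
        // ...simulate and render the frame here...
        auto elapsed = steady_clock::now() - start;
        if (elapsed < frameInterval)
            std::this_thread::sleep_for(frameInterval - elapsed);  // wait out the rest
    }
}
```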
You want to talk mind-boggling? I'm glad you see a huge difference between 35fps and 60fps. I, personally, don't. The game operates the same for me without any issue, and I don't find myself any more "skilled" just because I get more frames than my mind seems willing to take in; maybe the brain fills the rest in on its own, but it's not exactly amazing stuff.
"P.S. You have my sympathy for the FX."
I don't see why; the hardware on it is excellent. Yeah, it has driver issues, but they'll grind that stuff out. I'm not a big fan of ATI, so whatever. You won't, however, see me knockin' 'em. Both companies make a great product, and both have their strengths and weaknesses. I, personally, prefer the image I get out of my card, even though it doesn't run quite as fast (or cool!) as my roommate's.
Me.