Hi Whyme466,
Thank you for sharing your thoughts. I know we have had an exchange of emails over our differing opinions, and I think it would be helpful if we dug into exactly what it is you are after.
Let me quote some of the statements listed, and I'll share my view, and I'm hoping the community will share their ideas as well.
I believe that GG3D criteria need to be updated to reflect actual operational 3D rendering layer (Nvidia, DDD, iZ3D) performance - that is, how users actually play the games. This is how any system is evaluated by end users. I believe GG3D assessments should emphasize NORMAL (operational) rendering layer settings, not unique settings, unused in typical game play.
I think there is room to be more specific here. GG3D is not designed to tell you how you should play your game in 3D. Everyone's eyes are different, everyone has preferences about what they like and don't like, and the nature of the game plays a role as well. DDD, Nvidia, and iZ3D each have their own way of rating games, and those ratings aren't comparable with one another. "Excellent", "Good", or "Lousy" could all be describing the same game and the same game experience.
The idea behind GG3D is that it rates games according to the ability to have maximum eye candy and to play without visual anomalies. All games start with a score of 100% and receive deductions when eye candy has to be turned off and/or there are visual flaws. Each of these components has a score value, and what remains after the penalties are taken away is categorized into the Bronze-to-Platinum grades. There is also a secondary indicator that tells us whether or not the game is capable of out-of-screen effects, but that isn't handled as a penalty - just an indication of visual flexibility.
When members are making submissions for their games, we ask that they accurately submit according to the GG3D specifications. HOWEVER, members are always welcome to share their personal recommendations for best results in the comments section.
The current assessment criteria lack any evaluation of the dynamic performance of the rendering layer - how much interaction is required as a Dragon Age 2 facial closeup jumps to a panoramic scene, for example. Is manual interaction required? If so, how should this interaction be scored (penalized)? Is the rendering layer prone to severe (large penalty) or small, occasional (small penalty) autofocus adjustment, if typically used in normal game play?
Can you elaborate more on what you mean here?
If you are talking about having great 3D settings in one scene, and then suddenly having crazy separation or convergence levels in another where your eyes want to jump out of your head, that's called a camera angle problem, and it's listed as a "secondary anomaly" choice in GG3D.
We acknowledge that there are auto-convergence features in some drivers, and we ask that they be turned off for testing purposes. They prevent measurable S-3D flexibility, they cut down on game performance, and more often than not, they aren't even needed. If a game is well rendered, it shouldn't have camera angle problems in S-3D, and gamers shouldn't have to use an auto-convergence feature to get a positive result. Yes, there are instances when auto-convergence can benefit a game, and we ask that this commentary be left to the comments section as a personal recommendation. Auto-Convergence is a symptom of a problem that makes things difficult for driver developers - and the problem should be acknowledged in a measurable way.
The use of some effects in the 3D realm need reassessment, as well. For example, I agree with the author of this review about DOF:
http://www.behardware.com/articles/807- ... ef-3d.html. I find that disabling DOF improves 3D (depth) resolution, a key attribute of 3D immersion. So, if a game cannot disable DOF, it should be penalized (not the current scoring approach in GG3D, however).
Let me ask the question a different way. If you don't turn the DOF off, will the game have visual problems and anomalies that shouldn't be there because of a driver incompatibility? If it's a case of compatibility, then it should still be handled as a penalty. Remember, we aren't telling gamers they should or shouldn't turn DOF off, we are saying they have the option to. If the driver can't support DOF, then it's a penalty because they aren't giving maximum flexibility to the gamer - it's not a real choice.
I didn't see settings reductions scored in your Skyrim profile - just pictures. Are your settings being reduced because your GPU isn't fast enough, or is it because there is an incompatibility that is harming performance?
A good example is AA with some DDD profiles. Turn on anti-aliasing, and a 55 FPS game gets cut down to 5 FPS. That's not a problem with the GPU; it's an incompatibility between AA and this particular game, so we have to turn AA off (which is a minor penalty). If you are turning settings down just to compensate for natural speed loss on your own hardware, that shouldn't be scored as a penalty, because users with much faster GPUs won't know the flexibility the game can potentially offer them. That said, you should still include your personal settings recommendations in the comments section (as you have) so the score of the game doesn't get falsely reduced. However, don't hesitate to list required score reductions for compatibility reasons.
The scoring isn't about artistic choice, it's about visual flexibility and measurable compatibility.
Regards,
Neil