I know DLSS scales differently from game to game, but a 10% relative difference between DLSS Quality and Performance modes still seems low compared to my results below. All percentages are the relative uplift over the slower mode, i.e. (faster fps - slower fps) / slower fps; there's a quick calculation sketch after the list.
Alan Wake 2 - Path Tracing
DLAA (Native) 37fps vs 63fps DLSS Quality = 70% relative difference
DLSS Quality 63fps vs 84fps DLSS Performance = 33%
Black Myth Wukong - Path Tracing
DLAA 36fps vs 63fps DLSS Quality = 75%
DLSS Quality 63fps vs 86fps DLSS Performance = 36%
Cyberpunk - Path Tracing
DLAA 41fps vs 75fps DLSS Quality = 83%
DLSS Quality 75fps vs 111fps DLSS Performance = 48%
Cyberpunk - Ray Tracing Ultra
DLAA 68fps vs 119fps DLSS Quality = 75%
DLSS Quality 119fps vs 160fps DLSS Performance = 34%
Cyberpunk - Raster, tested at 4K because of CPU limit at 1440p
DLAA 58fps vs 103fps DLSS Quality = 77%
DLSS Quality 103fps vs 138fps DLSS Performance = 34%
Dead Space Remake - Raster, tested at 4K because of CPU limit at 1440p
DLAA 71fps vs 115fps DLSS Quality = 62%
DLSS Quality 115fps vs 155fps DLSS Performance = 34%
GTA Enhanced Edition - Ray Tracing, tested at 4K because of CPU limit at 1440p
DLAA 66fps vs 101fps DLSS Quality = 53%
DLSS Quality 101fps vs 124fps DLSS Performance = 22%
Silent Hill 2 Remake - Hardware Lumen / RT
DLAA 54fps vs 84fps DLSS Quality = 55%
DLSS Quality 84fps vs 102fps DLSS Performance = 21%
The Witcher 3 - Ray Tracing
DLAA 79fps vs 116fps DLSS Quality = 46%
DLSS Quality 116fps vs 139fps DLSS Performance = 20%
The Witcher 3 - Raster
DLAA 133fps vs 206fps DLSS Quality = 54%
DLSS Quality 206fps vs 251fps DLSS Performance = 21%
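For anyone who wants to double-check the math, here's a minimal sketch of the calculation. The fps values are copied from the list above; the `results` dict and `rel_diff` helper are just illustrative names, not output from any benchmarking tool.

```python
# Relative difference = (faster fps - slower fps) / slower fps, in percent.
# fps values copied from the list above; structure here is illustrative.

results = {
    "Alan Wake 2 (PT)":        {"DLAA": 37,  "Quality": 63,  "Performance": 84},
    "Black Myth Wukong (PT)":  {"DLAA": 36,  "Quality": 63,  "Performance": 86},
    "Cyberpunk 2077 (PT)":     {"DLAA": 41,  "Quality": 75,  "Performance": 111},
    "Cyberpunk 2077 (RT)":     {"DLAA": 68,  "Quality": 119, "Performance": 160},
    "Cyberpunk 2077 (raster)": {"DLAA": 58,  "Quality": 103, "Performance": 138},
    "Dead Space Remake":       {"DLAA": 71,  "Quality": 115, "Performance": 155},
    "GTA Enhanced (RT)":       {"DLAA": 66,  "Quality": 101, "Performance": 124},
    "Silent Hill 2 Remake":    {"DLAA": 54,  "Quality": 84,  "Performance": 102},
    "The Witcher 3 (RT)":      {"DLAA": 79,  "Quality": 116, "Performance": 139},
    "The Witcher 3 (raster)":  {"DLAA": 133, "Quality": 206, "Performance": 251},
}

def rel_diff(slower: float, faster: float) -> float:
    """Relative uplift of the faster mode over the slower one, in percent."""
    return (faster - slower) / slower * 100

for game, fps in results.items():
    q = rel_diff(fps["DLAA"], fps["Quality"])         # DLAA -> Quality
    p = rel_diff(fps["Quality"], fps["Performance"])  # Quality -> Performance
    print(f"{game:26s}  DLAA->Quality: +{q:.0f}%   Quality->Performance: +{p:.0f}%")
```

For context, at the default scaling factors DLSS Quality renders at 66.7% of output resolution per axis and Performance at 50%, so Performance pushes roughly 44% fewer pixels than Quality. How much of that gap shows up as fps depends on how GPU-bound each game is, which is part of why the Quality to Performance uplift varies so much between titles.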