It’s a touchy subject, since half of the game’s fans don’t care about the visuals and the other half do, but nevertheless the idea that CD Projekt Red “downgraded” the game to better suit the consoles has persisted. Bans were handed out on their forums, threads on /v/ about the subject dominated the front page, and comparison videos sprang up like weeds. Thankfully, Eurogamer spoke to CD Projekt Red’s Marcin Iwinski about the issue and received an answer.
To sum up, it seems that the demo we saw in 2013 took place in a small, enclosed area and had yet to be applied to the final open-world product. Upon realizing that the same level of detail couldn’t be sustained by their streaming technology in a larger setting (like that of Velen), they had to scale things back. They do say they could have kept that lighting and texture model, but very few PCs would have been able to handle it, and anyone running without DX12 would have been left out entirely. They also mention that the console versions provided the extra funding needed to even complete the game, and that without them they wouldn’t have earned enough revenue to make something so grand.
- “If you’re looking at the development process,” Iwinski begins, “we do a certain build for a tradeshow and you pack it, it works, it looks amazing. And you are extremely far away from completing the game. Then you put it in the open-world, regardless of the platform, and it’s like ‘oh shit, it doesn’t really work’. We’ve already showed it, now we have to make it work. And then we try to make it work on a huge scale. This is the nature of games development.”
- It was captured PC footage, not pre-rendered, Adam Badowski confirms, but a lot had to change. “I cannot argue – if people see changes, we cannot argue,” he says, “but there are complex technical reasons behind it.
- “Maybe it was our bad decision to change the rendering system,” he mulls, “because the rendering system after VGX was changed.” There were two possible rendering systems but one won out because it looked nicer across the whole world, in daytime and at night. The other would have required lots of dynamic lighting “and with such a huge world simply didn’t work”.
- It’s a similar story for environments, their texture sizes, and incidental objects. It was a trade-off between keeping that level of detail or their unique, handmade design, and the team chose the latter; the data-streaming system couldn’t handle everything while Geralt galloped around. The billowing smoke and roaring fire from the trailer? “It’s a global system and it will kill PC because transparencies – without DirectX 12 it doesn’t work good in every game.” So he killed it for the greater good, and focused instead on making sure the 5000 doors in Novigrad worked.
This reminds me of Christmas 1999, when Ultima 9 came out. Richard Garriott went far beyond what any PC was capable of and created an open-world CRPG that loaded the entire world at once…which resulted in horrendous frame drops. The game couldn’t run decently on a PC until 1GHz CPUs became the norm two years later. It ruined his reputation and his company, so in this writer’s opinion it was a good idea not to push so hard (visually) on The Witcher 3. Of course, debates will still rage over whether this is fair to high-end users.
As someone who is currently playing through the game (and reviewing it for our site), I can attest that it is very scalable and also very next-gen in appearance.