Pascal Gilcher, also known by his username Marty McFly, is arguably the most famous ReShade modder, mainly thanks to his breakthrough SSRTGI (Screen Space Ray Traced Global Illumination) shader. SSRTGI has been a staple of ReShade presets and graphics enhancement packs since its debut in May 2019, and we have often covered it here on Wccftech. It became so ubiquitous that even NVIDIA added a version of the ray tracing shader to its GeForce Experience Freestyle suite of post-processing effects.
However, 2023's new buzzword in graphics rendering is path tracing. It is the next step in the evolution from the limited ray tracing of select effects like ambient occlusion, shadows, or reflections to a near-complete abandonment of rasterized effects in favor of a path traced pipeline. PC gamers got their first taste of this technology this April with the Cyberpunk 2077 Overdrive Mode preview, which immediately established itself as the new graphical standard in video games.
The downside is that path tracing is extremely taxing on any hardware. Only the latest GeForce RTX 40 Series graphics cards can comfortably run Cyberpunk 2077 Overdrive mode through a combination of DLSS 2 (Super Resolution) and DLSS 3 (Frame Generation).
That made Pascal's unveiling of his new work-in-progress path tracing ReShade addon even more surprising, since it won't be able to leverage DLSS or even the RT Cores embedded in GeForce RTX GPUs. Even so, Pascal is adamant that performance will be better than with the regular triangle-based method.
I've had a long, interesting chat with him to discuss his path tracing approach, the ETA of its release to Patreon subscribers, and how his addon can also replace or inject many other effects. Skip past the video to read the full edited transcript.
To start, I know you are in touch with NVIDIA - they even implemented your RTGI in GeForce Experience's Freestyle a while ago. Did you talk with them about this yet?
Definitely, they would be interested in it, but since it's an ongoing thing, I'm not sure if I can really talk about it. So let's just say I'm interested in that possibility.
I understand that you've been working on this for a couple of years, right?
More or less. When the ReShade API came out, it allowed you to modify game calls, meaning that whenever the game tries to render something, it communicates that to ReShade, which now has an interface that says 'Ok, the game wants to do this thing. Do you have anything to say about it?'
With that, you can design addons that perform any sort of task based on what the game is going to do. Let's say the game tries to render a specific effect. Then you can say no, you're not going to render it that way. You can block, for example, a certain effect in the game from being rendered, or an object from being rendered. I started working on this before the ReShade API officially came out, but I just recently picked it up again.
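The interception mechanism Pascal describes can be pictured as a simple callback registry: the host raises an event for each rendering call, and a registered addon may veto it. The following is a hypothetical Python mock-up of that pattern, not the real ReShade addon API (which is a C++ interface); all names here are invented for illustration.

```python
# Hypothetical sketch of the event-interception pattern described above.
# This is NOT the real ReShade addon API; names are illustrative only.

class EventBus:
    """Dispatches rendering events to registered addon callbacks."""
    def __init__(self):
        self.callbacks = {}

    def register(self, event, callback):
        self.callbacks.setdefault(event, []).append(callback)

    def dispatch(self, event, payload):
        # Each callback may return True to block (skip) the game's call.
        for cb in self.callbacks.get(event, []):
            if cb(payload):
                return False  # call suppressed by an addon
        return True  # no objections: the game renders as normal

bus = EventBus()
# An addon that blocks one specific effect from being rendered:
bus.register("draw", lambda call: call.get("effect") == "vanilla_ssao")

print(bus.dispatch("draw", {"effect": "vanilla_ssao"}))  # False (blocked)
print(bus.dispatch("draw", {"effect": "character"}))     # True (allowed)
```

The key design point is that the addon never needs the game's source code; it only reacts to the calls the game makes.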
Was there any particular reason for you to pick it up again, or was it just scheduled in your workload?
I didn't have much experience with it at first, obviously because it was new. I also didn't have much experience with C++ as a whole, and I've released two other add-ons in the meantime that did other things. I've gotten much better at it now. I recently tried to make something work and I said 'Ok, this add-on can do this'. So I tried to update it with the new stuff and that broke it completely. So I said 'Ok, that's bad. I guess I need to take another look at it'.
And then, I looked over and saw that many of the problems that originally prevented me from progressing further were really obvious. It's like you're looking at your own exam from sixth grade and you realize how easy it is in retrospect. So, I looked at it and said I could do this much better now. I always wanted to pick it up again because it was a very good concept, so I started again, and now I've been working on it again for, I believe it's two months now, trying to carry over all the features.
As I understand it, your work follows a voxel-based approach. Digital Foundry likened it to SEUS path tracing for Minecraft. Is that assessment correct?
Yes and no. I guess you could call it voxels. Basically, that's just the word for what pixels would be if they were 3D. The way I do it, yes, it stores data rounded to the nearest cube in the game world, but it doesn't store all of them. There's a presentation from NVIDIA about ray traced ambient occlusion that you can look up; the technique is called spatial hashing. Basically, it means that whenever you see something in the game world, it tries to find the closest cube in which it falls and enters that into a data structure that cannot hold all of them. It would be like trying to put 100 tenants into a house that can only hold about 10 of them.
Because of that, I don't actually need to store the entire space. In a game like Minecraft, you would have every voxel that contains air, for example. It's technically unused, but it still consumes memory, meaning that storing all the data would be insanely memory-consuming. If I were to store all the voxels this infrastructure works with, I would consume 40-50 gigabytes of VRAM, around twice as much as the top GPUs have. What I'm using is called a sparse storage model, meaning I don't store every node; I only store the nodes that actually exist. It's about 2% or 3% of the total ones that could be there. That works pretty well, and I can store much, much more data than I would usually be able to.
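For readers curious what spatial hashing looks like in practice, here is a minimal, purely illustrative Python sketch of the idea: quantize a world position to a voxel cell, then hash that cell into a fixed-size table that cannot hold every possible cell. The hash constants, voxel size, and data layout below are assumptions for the sketch, not Pascal's actual implementation (which runs in shaders on the GPU).

```python
# Illustrative sparse spatial hash, loosely following the idea described.
# All names and constants are assumptions, not the addon's real code.

VOXEL_SIZE = 0.5          # world units per voxel cell (assumed)
TABLE_SIZE = 1 << 20      # fixed budget of hash slots

def voxel_key(x, y, z):
    """Round a world position down to its voxel cell coordinates."""
    return (int(x // VOXEL_SIZE), int(y // VOXEL_SIZE), int(z // VOXEL_SIZE))

def slot(key):
    """Map a voxel key into the fixed table (collisions are possible)."""
    x, y, z = key
    # A common 3D hash built from large primes.
    return ((x * 73856093) ^ (y * 19349663) ^ (z * 83492791)) % TABLE_SIZE

table = {}  # slot -> (key, radiance); a stand-in for a GPU buffer

def store(pos, radiance):
    key = voxel_key(*pos)
    table[slot(key)] = (key, radiance)  # newer data evicts older occupants

def lookup(pos):
    key = voxel_key(*pos)
    entry = table.get(slot(key))
    # Verify the stored key: a collision means this slot holds another voxel.
    return entry[1] if entry and entry[0] == key else None

store((1.0, 2.0, 3.0), (0.9, 0.1, 0.1))
print(lookup((1.1, 2.2, 3.1)))  # same voxel cell -> (0.9, 0.1, 0.1)
```

Because the table is a fixed size, stale or rarely seen voxels simply get overwritten, which is what keeps the memory footprint bounded no matter how large the game world is.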
You said that the stale data is removed. I guess that means when you move through the world, the system automatically purges the old data and gets the new data.
Correct. Let's say you have a completely black image with a few white pixels on it. If you had to store all the pixels and the image was very large, it would consume a lot of memory. But if you had a list of only the white pixels and were not storing the black ones, then you would only store like five pixels, which is a very small amount of data, and I can get away with storing way less. So the actual supported volume is up to 2,000 times larger or something like that, but if I were to store all of it, it would consume more VRAM than the RTX 4090 has. As it is, I get away with 300-400 megabytes.
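The sparse-versus-dense trade-off he describes is easy to sanity-check with back-of-envelope arithmetic. The grid resolution, bytes per voxel, and retained fraction below are assumptions chosen only to illustrate the orders of magnitude quoted in the interview, not known values from the addon.

```python
# Rough memory estimate: dense voxel volume vs. sparse storage.
# Every constant here is an assumption for illustration.

grid_voxels = 1000 ** 3   # potential cells in the supported volume (assumed)
bytes_per_voxel = 40      # key + radiance + bookkeeping (assumed)
retained = 0.01           # fraction actually kept in the hash table (assumed)

dense_gb = grid_voxels * bytes_per_voxel / 1e9
sparse_mb = grid_voxels * retained * bytes_per_voxel / 1e6

print(f"dense:  {dense_gb:.0f} GB")   # tens of GB: more VRAM than any consumer GPU
print(f"sparse: {sparse_mb:.0f} MB")  # hundreds of MB: easily fits
```

With these assumed numbers the dense layout lands at 40 GB while the sparse one needs 400 MB, which is the same order-of-magnitude gap Pascal quotes.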
Path tracing is usually very expensive, so people are wondering about the performance cost of this addon. Can you share anything about how much of a frame rate hit it's going to be, for example, in Skyrim?
The performance of the tracing itself is usually higher than triangle-based tracing. The thing with RTX Remix is they capture the geometry as the game tries to render it. They perform regular triangle-based path tracing and most of the time, they can only afford to shoot a single ray per pixel and try to make the most of it. In the Skyrim demo video that I presented, I shoot like three or four rays per pixel.
I have an RTX 4090 and get around 100 frames per second. With my new system, I get away with only one ray, but then I need more post-processing runtime due to the more expensive filter and also this NVIDIA model (ReSTIR GI) that increases the quality you get out of a single ray. So it still takes a certain amount of performance, but there is lots of headroom to optimize. I do several things in full resolution that I theoretically don't need to render in full resolution.
I just want to get the most out of it and then kind of optimize back, because most of the time you don't know if something looks bad. Is it because you've cut corners already, or is it because your approach isn't good? So I'm trying for maximum quality first and then trying to see how much I can cut back on performance without losing much visual quality.
I read in the Digital Foundry article that you might even implement hardware acceleration of BVH. Would that be far off?
It is a bit complicated because right now, I get away with the fact that everything works inside the shader. If I were to use a BVH, I would have to store the data structure on the CPU side. I would have to make my own interface, meaning that, for example, if the game is running on DX9 or DX10, I could not use the hardware acceleration.
To use the RTX library, I'd have to use DX12 or Vulkan, and I would have to make my own rendering device and then communicate the data back to the game. Also, it would cut out everyone who doesn't have a ray tracing capable video card. So I'm not sure if it's worth it spending that much effort on this. I also don't see anything wrong with the path tracing solution as it is. Obviously, triangle-based ray tracing is more accurate in the intersections between objects.
But I'm just interested in the light, meaning it doesn't really matter if my tracing can precisely tell you the position of a person's t-shirt. It suffices for it to know that it must be red, so the light must be too. At which point does the added precision become irrelevant for the use case? In the case of diffuse, even specular GI, it's totally fine. For short rays, though, I revert to a very high precision intersection method and only switch to spatially hashed data once I've traveled more than a couple of voxels' distance.
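The hybrid he describes, with precise intersections near the ray origin and coarser spatially hashed data farther out, can be sketched as a distance-based switch inside a simple ray march. The threshold, toy scene, and function names below are all illustrative assumptions.

```python
# Sketch of a distance-based hybrid intersection strategy.
# Threshold and scene are assumptions, not values from the addon.

VOXEL_SIZE = 0.5
NEAR_LIMIT = 2 * VOXEL_SIZE  # switch-over distance (assumed)

def trace(origin, direction, step, max_dist, precise_hit, hashed_hit):
    """March along the ray, choosing the intersection test by distance."""
    t = step
    while t < max_dist:
        p = tuple(o + d * t for o, d in zip(origin, direction))
        # Near the origin, use the expensive precise test; beyond the
        # limit, fall back to the coarse spatially hashed world cache.
        hit = precise_hit(p) if t < NEAR_LIMIT else hashed_hit(p)
        if hit is not None:
            return hit, t
        t += step
    return None, max_dist

# Toy scene: a grey wall at x >= 4, visible only to the coarse cache.
precise = lambda p: None
hashed = lambda p: (0.8, 0.8, 0.8) if p[0] >= 4 else None

hit, dist = trace((0, 0, 0), (1, 0, 0), 0.25, 10.0, precise, hashed)
print(hit, dist)  # -> (0.8, 0.8, 0.8) 4.0
```

The design rationale matches what he says: precision only matters close to the surface being shaded, while distant samples merely need roughly the right color and brightness.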
I also don't see any visual degradation, such as with the screen space approaches. They, of course, have significant degradation compared to world space.
'I think some people wouldn't recognize path tracing if it hit them in the head' - Pascal Gilcher
I don't know if you've read some of the reactions that emerged on Reddit or gaming forums like ResetEra, for example. Some people thought it was really cool. Others said, well, this doesn't look like path tracing to me. Could that be because your solution doesn't include dynamic light sources yet?
I think that most of these comments don't have any clue what they're talking about. In this case, yes, I don't have access to the light sources in the game, but the game already does that for me. It would be redundant to do that. My focus is on indirect illumination, meaning that the light source casts the light into the scene, and it is reflected back into areas that have been shaded.
In the video, there is a scene where you can see the bright windows casting light across the ground. This is something that only path tracing can do.
They don't really know what they're looking at. I mean, I've seen comments saying that it just makes stuff more contrasted. I think they wouldn't recognize path tracing if it hit them in the head.
Some of these commenters certainly do not possess enough technical knowledge. Speaking of dynamic light sources, did you consider possibly using the RTX Direct Illumination SDK? It could allow a lot of additional dynamic lights.
My path-tracing solution can handle many lights as well. It just needs to know where they are. This is one problem that RTX Remix has, too, since older games do not really have the concept of dynamic lighting and sometimes you can't even trust the game. RTX Remix tries to solve that by having creators manually place light sources in those places where they would be. Theoretically, I could just support that.
What I'm doing is transportable to every game you want. I just need to find a few things that almost every game has, figure out where they are, and then hook into the same places.
That is, of course, one of the major draws of your solution. But I was wondering whether you'll have to adapt the path tracing addon yourself for each game, or could modders do it themselves?
The way it works is the add-on basically has to be told 'Ok, you're going to wait for this specific rendering event. Then you look at this attached resource, this entry, and the data that is there.' The way I do it at the moment is I have it running and I watch all the data going back and forth so I know what each piece of data is. For example, let's say I need the specific data that transforms positions relative to the camera into positions in the game world. You need experience in game programming, not to mention knowing what these things look like. I can tell them apart, but someone who doesn't have any of that can't. In the ReShade community, maybe five people can do that. It's a very non-linear process, and I think it depends on what the game has for you, because the data might be different. I don't think that many people can do that, so I plan on having a split model. While I'm hunting for this data, the add-on records everything there is about the rendering at once and then gives me the data that I actually need.
At the moment, it's basically me and maybe a few of the people who are knowledgeable enough about this that can do the work on specific games and create configuration files that tell the add-on what the respective data are and how to interpret them. Then the end users would just use the final configuration files.
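As a purely hypothetical illustration of the split model he describes, a per-game configuration file might look something like the fragment below. Every field name here is invented for the sketch; the actual format has not been shown publicly.

```json
{
  "game": "SkyrimSE.exe",
  "events": {
    "depth_buffer": { "wait_for": "draw_indexed", "resource_slot": 0 },
    "view_to_world": {
      "constant_buffer": 2,
      "offset_bytes": 64,
      "layout": "column_major"
    }
  },
  "features": { "path_tracing": true, "replace_ssao": true }
}
```

The point of such a file is exactly the division of labor he outlines: a handful of experts do the data hunting once per game, and end users just drop in the finished configuration.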
One of the main comparisons that have been made (even you made it) is with NVIDIA's RTX Remix. Is it fair to say the two technologies are different and complementary, though, since NVIDIA's solution targets older games (mostly those using fixed function pipelines before shaders entered the picture), while your path tracing solution can support up to DX12 and Vulkan?
Of course, my solution is not as powerful as RTX Remix, which completely transforms the game. My solution is less invasive. That's why it's also more transportable. It can probably do less than RTX Remix and yes, it has a very different focus. The main focus of RTX Remix is not only to inject path tracing but also to replace 3D models, textures, or whatever is in the game.
My goal is only to, well, improve how the game looks. I think you could compare it more to, say, ENBSeries, which improves the graphics of many different games, and its features vary per game.
'Path Tracing is just one of many effects that can be added or replaced with this addon' - Pascal Gilcher
I guess we could call it ENBSeries with Path Tracing or something.
Yes, but path tracing is just one of many rendering effects that can be injected into games or replaced with this addon. Let's say that X game has a bad Y rendering effect; now I can replace it with something better. I announced it with path tracing because, let's face it, almost everyone knows me as the RTGI guy. It was the most obvious thing to do, but this is just one thing that can be improved. I've done much, much more.
Yeah, I've seen your volumetric clouds for Skyrim; they look really impressive. I know you said that you're still developing this addon. Do you have an ETA? Will it be a few months before this is released to Patreon subscribers?
That's actually a good question. Right now, the solution for Skyrim is not perfectly integrated. There are some missing transparent objects. And let's face it, no one plays Skyrim without ENB. Right now, I've made it compatible with the non-ENB version, so I probably need to integrate it with ENB. I also don't see a reason to re-implement everything ENB has done. If I were to focus on Skyrim, it could be ready in three months.
However, as I try to make it compatible with as many games as possible, I'd say six months to have a generous time frame where it's workable. This is actually very challenging because every game keeps doing things that I never anticipated. Every game I tried has something I didn't plan for, and then it's one week of work to fix things. But it's working better and better in every new game.
So I think it'll be around six months until I have a solution that I can just drop into a game, tweak it for five minutes, and then it's good to go. Of course, depending on the game, the feature set might vary completely.
Do you think your solution would also work well with isometric games, like Baldur's Gate 3 (which is really popular right now)?
In theory, yes. I don't see a reason why it wouldn't.
Nice, would love to see that. Will your path tracing add-on be compatible with PureDark's DLSS 3 Skyrim mod? It would be great to maximize both visuals and performance at the same time.
I haven't really looked into it, to be honest. At the moment, I'm using a pretty much unmodded Skyrim. As far as I understand, it's not compatible with ENBSeries, so people have a choice: they can use DLSS 3 and have high frame rates, or they can have good graphics.
However, my path-tracing solution doesn't have that compatibility issue, so they should work together in theory.
Thank you for your time.