There shouldn’t be any averaging! Just render the damn frame!
You can’t tell me we could get something like MGSV on previous-gen hardware at 60 fps, and that hardware with 9 times the processing power can only render a lower-resolution, noisy image which is then upscaled and denoised… at 30 fps.
“But raytracing!!!”
If these are the compromises that need to be made just to shoehorn that in, then current hardware isn’t really capable of real-time raytracing in the first place.
So, if a company decides to, for example, start using some MIT-licensed software, does that suddenly materialize extra responsibilities for that software’s dev?