Okay, I work as a programmer, and there is a reason projects work the opposite way. You first have to have a working product that passes whatever QA you have, then you optimise and build on it. If you have to optimise on day 1, nothing ever gets done. I should know; that's why I have a ton of personal projects stuck in development hell.
Why would games be different?
I mean, putting in a bit of thinking before you actually hit the keyboard can be an incredibly effective form of optimization, for example if you can get an O(n^2) approach down to O(log n) per operation. You'll even save time by not having to rework the thing later, and if you build on poor foundations, chances are you'll run into fundamental architectural problems down the road, which can be extremely costly in terms of development time.
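(A toy sketch of that kind of win, assuming the classic case of repeated lookups: pay for one sort up front and each query drops from O(n) to O(log n), so n queries go from O(n^2) overall to O(n log n). Function names here are made up for illustration.)

```cpp
#include <algorithm>
#include <vector>

// Naive: scan the whole list for every query.
// n queries against n entries -> O(n^2) overall.
bool ContainsNaive(const std::vector<int>& Ids, int Query)
{
    for (int Id : Ids)
        if (Id == Query)
            return true;
    return false;
}

// Pay O(n log n) once to sort the list, then every query is O(log n).
bool ContainsSorted(const std::vector<int>& SortedIds, int Query)
{
    return std::binary_search(SortedIds.begin(), SortedIds.end(), Query);
}
```

Five minutes of thinking about the data layout before typing, and the "optimization" is free.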
I work in games; the reason it works the opposite way for Epic is that Unreal Editor is itself a shipped product.
Sadly for most of us, the tools used to make the game (including the engine) are for internal use only, and most of the time there is no army of programmers available to do all of that work ahead of time. So it pays to wait and focus on the hot path used by the game you are shipping right now, not a hypothetical one you might ship later.
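(For what it's worth, Unreal already ships the instrumentation for finding that hot path. A minimal sketch using the engine's stat macros; the class and stat names are made up for illustration:)

```cpp
// In a .cpp file: declare a cycle counter in the built-in "Game" stat group.
DECLARE_CYCLE_STAT(TEXT("UpdateEnemyAI"), STAT_UpdateEnemyAI, STATGROUP_Game);

void AMyGameMode::UpdateEnemyAI(float DeltaSeconds)
{
    // Everything in this scope is timed and shows up under `stat Game`.
    SCOPE_CYCLE_COUNTER(STAT_UpdateEnemyAI);

    // ... the suspected hot path ...
}
```

Then you type `stat Game` in the console and optimise whatever actually dominates the frame, instead of guessing.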
I can build everything in one level at the start, or I can split it across multiple levels and stream them in.
That choice has to be made at the start.
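(If anyone's curious, the streaming route looks roughly like this in C++; `Dungeon_Part2` and the owning actor are placeholders, and the same calls exist as Blueprint nodes:)

```cpp
#include "Kismet/GameplayStatics.h"
#include "Engine/LatentActionManager.h"

void AMyLevelDirector::StreamInNextChunk()
{
    FLatentActionInfo LatentInfo;
    LatentInfo.CallbackTarget = this;
    LatentInfo.ExecutionFunction = FName("OnChunkLoaded"); // fired when the load completes
    LatentInfo.UUID = 1;
    LatentInfo.Linkage = 0;

    // Async-load the sublevel and make it visible once it's in memory.
    UGameplayStatics::LoadStreamLevel(
        this, FName("Dungeon_Part2"),
        /*bMakeVisibleAfterLoad=*/ true,
        /*bShouldBlockOnLoad=*/ false,
        LatentInfo);
}
```

UE5's World Partition automates a lot of this, but the point stands: retrofitting streaming onto one monolithic level late in production is painful, which is exactly why it's a day-one decision.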
And of course, if I'm targeting a 4090 and then hoping to slap DLSS on top of that, it's not going to work. I could pull a TI, turn AO off, and pretend UE5 is the problem, but it's really just a developer issue.
Every time I see that a game has great graphics and is made with UE5, I skip it now, because I know it'll run like ass on my PC. I'm not a billionaire who can afford a fancy computer with a high-end GPU.
What, you guys don’t have a personal nuclear reactor powering your 5090 GPU to run games on Super Ultra full RT settings at 480p upscaled to 4K?
A 3080 does most games at 4K/60+ fps. It's not that bad.
I think this is a case where the engine was developed for graphics cards that Epic thought would be common by the time UE5 shipped. They expected RT performance to improve not only in the highest pricing tier but also in the mid-range (as in the $500 price bracket), to the point where all the Lumen stuff would be trivial. But the Steam survey reveals that most people use x060, x050, and then some x070-class cards, where RT performance isn't that great. https://store.steampowered.com/hwsurvey/videocard/
So IMO there are several issues:
- There are obviously optimization issues with UE5, I’m not going to claim otherwise.
- Nvidia catering to the AI crowd, which is obviously more lucrative for them than improving RT performance for mid- and low-range cards.
- Epic being bad at forecasting the future state of graphics performance.
- Epic trying to offload work from development onto the end user's hardware (e.g. shifting from pre-baked lighting to realtime lighting; see the config sketch after this list for what opting back out looks like).
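(For projects that want the old-school route, Lumen can be switched off per project. A sketch of the relevant DefaultEngine.ini cvars; worth double-checking the values against your engine version:)

```ini
[/Script/Engine.RendererSettings]
; Global illumination: 0 = None, 1 = Lumen, 2 = Screen Space (SSGI)
r.DynamicGlobalIlluminationMethod=0
; Reflections: 0 = None, 1 = Lumen, 2 = Screen Space
r.ReflectionMethod=2
; Allow lightmaps so lighting can still be baked the pre-UE5 way
r.AllowStaticLighting=True
```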
The end result stays the same: we now play games that run so badly on current hardware that everything needs to be AI-upscaled with framegen, and they have this weird soft look that will seem super dated at some point, because RT just isn't there yet.
Stop 👏 Using 👏 Shitreal engine 👏
Why?
Just keep screeching at the void without an alternative. I’m sure folks will listen any day now!
On the subject of what a steaming fucking load UE5 is… Well, in short, RoboCop should be one of the best games I've ever played, but it's a barely functional disaster, and that's entirely the fault of Epic. The game itself is amazing, but the engine is on par with unpatched Fallout: New Vegas.