I've awoken from my slumber
All the way back in early 2021 (I can't believe it's been two years) I wrote the compute shader renderer for the Switch port of melonDS, as described in my previous post on it. If you don't know much about the compute shader renderer, I recommend checking that post out.

After more or less completing it for Switch (the port desperately needs an update, it will come, I promise), I didn't really touch the code much. Over the last couple of weeks this finally changed.

The renderer had to be ported from deko3D, the Switch homebrew GPU API, to OpenGL. Fortunately that wasn't too hard, because A. most of the complexity lies within the shaders, so there isn't that much buffer juggling, and B. Nvidia GPUs (or at least Maxwell) are somewhat of an OpenGL hardware implementation.

But let's come to the main attraction: besides some fixes, high resolution rendering is finally implemented. And it works wonderfully, with far fewer artefacts (often none at all) compared to the classic OpenGL renderer. Even on my integrated Intel UHD 620 I can reach 3x-4x resolution, depending on the game.

With local wireless there is now another reason you might want to use it over the software renderer: if you are short on CPU cores for all the melonDS instances, you can offload the rasterisation onto the GPU.

There are still a few things left to do. For some reason the shaders (which are all compiled on startup, so there is no stuttering while playing) seem to compile quite slowly on Windows for Intel and Nvidia GPUs. Bizarrely this seems to be related to the very large SSBOs; at least reducing their size seems to speed compilation up. So my plan is to replace the large buffers, which scale proportionally with the resolution, with buffers of unspecified size, or with image load and store. If I had to guess, the driver performs the layout calculation for every array entry. In case I don't get the compile times low enough, I will need to implement a shader binary cache.
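To illustrate the idea (this is only a sketch with made-up block and member names, not the actual melonDS shader code): a fixed-size SSBO array spells out its full length to the compiler, while a runtime-sized array leaves the length to whatever buffer is bound at draw time, which is what I hope avoids the per-entry layout work:

```glsl
// Illustrative only -- block and member names are invented.
// The two variants are alternatives, not meant to coexist in one shader.

// Variant 1: fixed-size array, dimensioned for the highest resolution.
// The full element count is known to the compiler.
layout (std430, binding = 0) buffer WorkBufFixed
{
    uint EntriesFixed[4 * 1024 * 1024];
};

// Variant 2: runtime-sized array. It must be the last member of the
// block; the length is implied by the size of the bound buffer.
layout (std430, binding = 0) buffer WorkBufDynamic
{
    uint EntriesDynamic[];
};
```

In GLSL a runtime-sized array is only allowed as the last member of a shader storage block, and if the shader needs the length it can query it with `.length()`.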

The outlines generated through edge marking (e.g. used by the Zelda games) are always only one pixel thick, which quickly becomes very thin at higher resolutions. Thus I want to add an option to counteract that (I am still not exactly sure how to do it).
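One possible approach, purely as a sketch (the names `uScale` and `PolygonID` are hypothetical, not melonDS code): instead of comparing each pixel against its direct neighbours, compare against neighbours `uScale` output pixels away, so the outline stays roughly one native pixel wide at any resolution:

```glsl
// Hypothetical sketch of widening edge-marked outlines.
uniform int uScale; // output resolution / native resolution (made up)

uint PolygonID(ivec2 pos); // hypothetical helper reading the polygon ID of a pixel

bool IsEdge(ivec2 pos)
{
    uint id = PolygonID(pos);
    // A pixel is an edge pixel if any neighbour uScale pixels away
    // belongs to a different polygon.
    return PolygonID(pos + ivec2(uScale, 0)) != id
        || PolygonID(pos - ivec2(uScale, 0)) != id
        || PolygonID(pos + ivec2(0, uScale)) != id
        || PolygonID(pos - ivec2(0, uScale)) != id;
}
```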

Another issue is that the compute shader renderer currently isn't integrated into the GUI at all; it just replaces the OpenGL renderer.

And as always there is still some clean-up to be done in the code. As a last note, the compute shader renderer already uses a texture cache (which, as part of this clean-up, should also be used by the OpenGL renderer). Implementing texture replacement on top of that is not hard and is on my list as well, but one step after the other.

And yes, it allows you to play Pokémon in higher resolutions with no black lines.
keisui says:
May 1st 2023
@Brankale, how did you test the build? I couldn't find it under the Actions tab on their GitHub
Brankale says:
May 2nd 2023
@keisui I built it myself from the compute-shader-renderer branch
keisui says:
May 2nd 2023
@Brankale ah gotcha, I've never had any luck with building it myself but I'll give it a try again
keisui says:
May 11th 2023
followup to this^ comment, 9 days later: I'm really not sure what to do, it just doesn't recognize the compute shader renderer branch, all it says is
"fatal: repository 'https://github.com/melonDS-emu/melonDS/tree/compute-shader-renderer/' not found
-bash: cd: melonDS: No such file or directory"

when I build following the exact steps in the "how to build" section it just builds the master branch. I really have no idea how to build from another branch, unless the solution is painfully obvious, in which case I'm just painfully blind
Generic aka RSDuck says:
May 11th 2023
follow the exact steps, except after the "cd melonDS" also do "git checkout compute-shader-renderer"
keisui says:
May 12th 2023
it built just fine, but I'm getting significantly more graphical issues on top of the previous graphical issues with the OpenGL renderer. If it makes any difference, during the last step I got around a hundred
"warning: 'offsetof' within non-standard-layout type 'ARM' is conditionally-supported [-Winvalid-offsetof]"
messages, despite following the exact steps multiple times. I tried having "git checkout compute-shader-renderer" both on the same line and on the line below "cd melonDS", with no luck at all in terms of the issues disappearing with either build method
Generic aka RSDuck says:
May 12th 2023
the warning is normal. Also do you have an AMD GPU?
keisui says:
May 12th 2023
ah alright. No, I have an Nvidia GPU, just updated it recently too. If you'd like I can upload some videos of what's happening too?
Generic aka RSDuck says:
May 13th 2023
yes please
keisui says:
May 13th 2023
https://www.youtube.com/watch?v=EWf4r76yJpI&feature=youtu.be
I just recorded some random gameplay that shows it best, throughout two games. There doesn't seem to be any consistency as to how or what causes it to glitch: some objects are fine sometimes, only for them to glitch out the next second, and some glitches depend on the camera angle. As showcased it mostly happens at higher resolutions, but at lower resolutions as well, and different resolutions cause different levels of bugginess
Generic aka RSDuck says:
May 14th 2023
that looks mostly like the tile cache overflowing. That's not that hard to fix; I just didn't notice, because I would probably run out of memory before this happens.

Though I can't really explain the issues at low resolutions, does it happen in all games?

Also please try building again from today's source. I fixed some bugs. The compute shader renderer is now also properly integrated into the GUI, so it is its own option in the video settings.
keisui says:
May 15th 2023
today's build still has the graphical bugs as well unfortunately, and yeah, it seems to happen in all games. I just tested a bunch of random games from different publishers and it happened with all of them, both at high and low resolutions
keisui says:
Jun 27th 2023
I think there should also be a toggle (for the compute shader) to disable the textures snapping to pixel coordinates, as even in high res some textures act as if you're playing at native res, while the rest of the game is upresed and looks fine
Generic aka RSDuck says:
Jun 28th 2023
I'm not really sure what you mean by textures snapping to a grid. There is an option to toggle between using the native resolution coordinates and using higher precision coordinates. The hires coordinates only have 4 bits of extra precision (i.e. 16 sub-pixel steps per native pixel), so if you play at a large enough resolution they will be "used up".

The thing with changes like this is that if we start increasing the resolution of some calculations, we're basically already going towards the classical OpenGL renderer, and things will start to break.
keisui says:
Jun 29th 2023
it's a different issue from the precision coordinates, that works just fine. I mean that textures display the same way whether using the software renderer or the compute shader renderer. The best way I can describe it is that they're snapping to the native DS res coordinates, while models still use high precision coordinates. They don't like to display diagonally; the textures align properly on the models, but the textures themselves don't, if that makes any sense