Iris Shaders

[Feature Request] Is it possible to support HDR content for Minecraft through Iris?

Ptilopsis01 opened this issue · 25 comments

commented

Is your feature request related to a problem? Please describe.
No

Describe the solution you'd like
Support HDR content for Minecraft

Describe alternatives you've considered
As mentioned above, it's a pity that Minecraft can't output HDR content with all those excellent shaders. I'm wondering if it's possible to support HDR content through Iris.

Additional context
Add any other context or screenshots about the feature request here.

commented

I've been looking at the code for a bit (I have no knowledge of Iris's inner workings, so this could be completely wrong), but I think because shaders return sRGB values, which then get converted to Rec2020 in Iris (for potential HDR), there are never HDR values to begin with. So, if my assumptions are correct, adding HDR support would require shaders to be rewritten to return a linear value, which can then be correctly colormapped in Iris.

Again, this could be completely wrong, so take this with a large grain of salt.
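For context on the sRGB-versus-linear point above, here is a minimal sketch (not Iris code; just the standard IEC 61966-2-1 math) of the sRGB decode that turns display-referred sRGB values back into the linear light an HDR colormapping step would need:

```python
# Standard sRGB EOTF (decode): display-referred sRGB signal -> linear light.
# Generic reference math only, not taken from Iris or any shader pack.
def srgb_to_linear(c: float) -> float:
    if c <= 0.04045:
        return c / 12.92          # linear toe segment
    return ((c + 0.055) / 1.055) ** 2.4  # power-law segment

print(round(srgb_to_linear(0.5), 3))  # mid-gray signal → ~0.214 linear
print(round(srgb_to_linear(1.0), 3))  # white stays 1.0
```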

commented

This is correct; however, many shader developers are willing to do this. It's specifically an issue of display managers not supporting it right now.

commented

@IMS212 Could it support HDR on Windows and Mac though, as an optional feature? And Linux is also slowly getting there with KDE and the Steam Deck + Valve + Gamescope.

commented

I have a beta version working on KDE for HDR, I’m not sure if I’ll release it.

Windows is currently impossible until we get a DXGI swapchain (basically, rendering the game with OpenGL and then drawing it with DX12).

commented

Was this issue closed because the maintainers don't want this implemented in the project, or because they don't want to contribute the changes required themselves?
If it's the latter, it'd be great to leave the issue open for tracking progress, in case someone contributes it in the future.

commented

The GLFW branch is on my personal repo; if you build it and attach it to MC, it will work automatically.

commented

And about Special K: I know it’s capable of AutoHDR, but that’s not what I want; I want to be able to output a direct scRGB 16-bit buffer.

commented

@IMS212 I was already talking in the Special K Discord about how HDR could be achieved for this project. The Oolite project managed to get HDR working with OpenGL on Windows (I haven't checked yet whether they also used a DXGI swapchain). And I don't know how easy it is to get the swapchain via LWJGL.

But I am using KDE HDR (fedora 40 kde spin) and I would really like to work on your HDR beta version. Maybe you can send it to me privately if you don’t want to push the branch publicly?

commented

> The GLFW branch is on my personal repo; if you build it and attach it to MC, it will work automatically.

Thank you, will give it a try this weekend!

> And about Special K: I know it’s capable of AutoHDR, but that’s not what I want; I want to be able to output a direct scRGB 16-bit buffer.

Don’t worry, it’s just what I use at the moment to have HDR (tonemapped from SDR) in Minecraft, and a lot of people in that Discord know a lot about rendering HDR, which is why I asked there.

commented

@IMS212 So for Windows: @AnotherCommander in the Oolite project managed to output HDR in OpenGL, without(!) DXGI.

To help other people, another_commander documented how they did it without DXGI here: https://discord.com/channels/778539700981071872/778542373486592003/1248574268992917598

I'll copy the information here:

vvvvvvvvvvvvvvv

The question whether OpenGL is capable of rendering HDR natively or not has come up in this server at least twice now. I thought it might be a good idea to consolidate the data I used as source in order to set up HDR in the OpenGL project I participate in to serve as a quick how-to, in case the question happens to come up again.
The short answer: yes it is capable. Longer answer and resources follow:

  • Go to slide # 53 of the "HDR Rendering on NVidia GPUs" presentation by T.J. True and read from there. The main points are clearly outlined. Just don't bother with the Fullscreen Exclusive stuff, this presentation is from the NVAPI era and some things have changed since. Actual OGL set up stuff is still relevant though. The presentation is at https://on-demand.gputechconf.com/gtc/2017/presentation/s7394-tom-true-programming-for-high-dynamic-range.pdf

  • In the project I worked on, we had to modify the SDL 1.2.13 dll to incorporate the desired changes for being compliant with the above. The source code patch to make SDL 1.2.13 HDR-capable is attached. Of course, you don't need SDL to make OpenGL work with HDR, you can take the ideas and apply them directly to your OGL project - it will still work.

The patch is applied to the SDL 1.2.13 original source code, available from here: https://sourceforge.net/projects/libsdl/files/SDL/1.2.13/SDL-1.2.13.tar.gz/download
In our case, all we had to do to enable a 16-bit scRGB backbuffer with the modified SDL.dll was

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_PIXEL_TYPE_FLOAT, 1);

We did not use any DXGI interop in our project, at least not consciously. Our game does trigger Hardware Composed: Independent Flip when launched with HDR enabled, but this is apparently done on the driver level without specific instructions or intervention from our code. It Just Works (TM).

SDL Patch:

diff -ruN SDL-1.2.13/include/SDL_video.h SDL-1.2.13_new/include/SDL_video.h
--- SDL-1.2.13/include/SDL_video.h	2007-12-31 06:48:36 +0200
+++ SDL-1.2.13_new/include/SDL_video.h	2022-09-06 08:32:20 +0300
@@ -203,6 +203,7 @@
     SDL_GL_GREEN_SIZE,
     SDL_GL_BLUE_SIZE,
     SDL_GL_ALPHA_SIZE,
+	SDL_GL_PIXEL_TYPE_FLOAT,
     SDL_GL_BUFFER_SIZE,
     SDL_GL_DOUBLEBUFFER,
     SDL_GL_DEPTH_SIZE,
diff -ruN SDL-1.2.13/src/video/SDL_sysvideo.h SDL-1.2.13_new/src/video/SDL_sysvideo.h
--- SDL-1.2.13/src/video/SDL_sysvideo.h	2007-12-31 06:48:14 +0200
+++ SDL-1.2.13_new/src/video/SDL_sysvideo.h	2022-09-06 08:32:28 +0300
@@ -281,6 +281,7 @@
 		int green_size;
 		int blue_size;
 		int alpha_size;
+		int pixel_type_rgba_float;
 		int depth_size;
 		int buffer_size;
 		int stencil_size;
diff -ruN SDL-1.2.13/src/video/SDL_video.c SDL-1.2.13_new/src/video/SDL_video.c
--- SDL-1.2.13/src/video/SDL_video.c	2007-12-31 06:48:14 +0200
+++ SDL-1.2.13_new/src/video/SDL_video.c	2022-09-06 10:08:09 +0300
@@ -221,6 +221,7 @@
 	video->gl_config.green_size = 3;
 	video->gl_config.blue_size = 2;
 	video->gl_config.alpha_size = 0;
+	video->gl_config.pixel_type_rgba_float = 0;
 	video->gl_config.buffer_size = 0;
 	video->gl_config.depth_size = 16;
 	video->gl_config.stencil_size = 0;
@@ -1442,6 +1443,9 @@
 		case SDL_GL_ALPHA_SIZE:
 			video->gl_config.alpha_size = value;
 			break;
+		case SDL_GL_PIXEL_TYPE_FLOAT:
+			video->gl_config.pixel_type_rgba_float = value;
+			break;
 		case SDL_GL_DOUBLEBUFFER:
 			video->gl_config.double_buffer = value;
 			break;
diff -ruN SDL-1.2.13/src/video/wincommon/SDL_wingl.c SDL-1.2.13_new/src/video/wincommon/SDL_wingl.c
--- SDL-1.2.13/src/video/wincommon/SDL_wingl.c	2007-12-31 06:48:02 +0200
+++ SDL-1.2.13_new/src/video/wincommon/SDL_wingl.c	2022-09-06 14:11:24 +0300
@@ -233,6 +233,11 @@
 		*iAttr++ = WGL_ALPHA_BITS_ARB;
 		*iAttr++ = this->gl_config.alpha_size;
 	}
+	
+	if ( this->gl_config.pixel_type_rgba_float ) {
+		*iAttr++ = WGL_PIXEL_TYPE_ARB;
+		*iAttr++ = WGL_TYPE_RGBA_FLOAT_ARB;
+	}
 
 	*iAttr++ = WGL_DOUBLE_BUFFER_ARB;
 	*iAttr++ = this->gl_config.double_buffer;
@@ -469,15 +474,26 @@
 		    case SDL_GL_MULTISAMPLESAMPLES:
 			wgl_attrib = WGL_SAMPLES_ARB;
 			break;
-		    case SDL_GL_ACCELERATED_VISUAL:
-			wgl_attrib = WGL_ACCELERATION_ARB;
-			this->gl_data->wglGetPixelFormatAttribivARB(GL_hdc, pixel_format, 0, 1, &wgl_attrib, value);
-			if ( *value == WGL_NO_ACCELERATION_ARB ) {
-				*value = SDL_FALSE;
-			} else {
-				*value = SDL_TRUE;
+		    case SDL_GL_ACCELERATED_VISUAL: {
+				wgl_attrib = WGL_ACCELERATION_ARB;
+				this->gl_data->wglGetPixelFormatAttribivARB(GL_hdc, pixel_format, 0, 1, &wgl_attrib, value);
+				if ( *value == WGL_NO_ACCELERATION_ARB ) {
+					*value = SDL_FALSE;
+				} else {
+					*value = SDL_TRUE;
+				}
+				return 0;
+			}
+			case SDL_GL_PIXEL_TYPE_FLOAT: {
+				wgl_attrib = WGL_PIXEL_TYPE_ARB;
+				this->gl_data->wglGetPixelFormatAttribivARB(GL_hdc, pixel_format, 0, 1, &wgl_attrib, value);
+				if ( *value == WGL_TYPE_RGBA_FLOAT_ARB ) {
+					*value = SDL_TRUE;
+				} else {
+					*value = SDL_FALSE;
+				}
+				return 0;
 			}
-			return 0;
 		    default:
 			return(-1);
 		}
@@ -549,6 +565,10 @@
 			return -1;
 		}
 		break;
+		case SDL_GL_PIXEL_TYPE_FLOAT:
+		// cannot query attribute unless WGL_ARB_pixel_format is 1
+		*value = -1;
+		break;
 	    default:
 		retval = -1;
 		break;
diff -ruN SDL-1.2.13/src/video/wincommon/SDL_wingl_c.h SDL-1.2.13_new/src/video/wincommon/SDL_wingl_c.h
--- SDL-1.2.13/src/video/wincommon/SDL_wingl_c.h	2007-12-31 06:48:02 +0200
+++ SDL-1.2.13_new/src/video/wincommon/SDL_wingl_c.h	2022-09-06 08:32:23 +0300
@@ -124,6 +124,7 @@
 #define WGL_SWAP_COPY_ARB              0x2029
 #define WGL_SWAP_UNDEFINED_ARB         0x202A
 #define WGL_TYPE_RGBA_ARB              0x202B
+#define WGL_TYPE_RGBA_FLOAT_ARB        0x21A0
 #define WGL_TYPE_COLORINDEX_ARB        0x202C
 #endif

Also, make sure you output linear data and you should be good.
Hope this helps someone in the future.

^^^^^^^^^^^^^^^

commented

@IMS212 I’d guess you mean this branch (on the official repo): https://github.com/IrisShaders/Iris/tree/HDR

Your personal repo has CSM and CSM2 branches, but they don’t seem to be the right ones.

The branch has references to local paths on your PC, but that’s not a problem; I will get to work 🏃‍♀️

commented

Ohhh you forked GLFW! Now I understand https://github.com/IMS212/glfw/tree/hdr

commented

I think one difference so far: in the GLFW fork you used PQ with Rec2020 primaries, while Oolite used linear scRGB with 709 primaries. The latter might be the easier way, since SDR values more or less stay SDR and you can simply go outside the 0..1 range to get HDR values (a lot of PC HDR games do this nowadays). You also don't need to change the primaries when switching between SDR and HDR; they can stay the same.

AFAICT frog color management supports linear scRGB + 709 as well.

(Note: linear scRGB + 709 only works on Windows 10+ with HDR enabled, so not on Windows 7, but I guess that's the target Windows environment anyway.)
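For illustration, a minimal sketch of the scRGB convention mentioned above (assuming the usual Windows definition that scRGB 1.0 equals 80 nits; not code from any of the forks):

```python
# scRGB is linear with Rec.709 primaries; 1.0 maps to SDR reference white
# (80 cd/m^2). HDR highlights simply exceed 1.0, so no primary switch is
# needed when toggling between SDR and HDR output.
SDR_WHITE_NITS = 80.0

def nits_to_scrgb(nits: float) -> float:
    """Convert an absolute luminance in cd/m^2 to a linear scRGB value."""
    return nits / SDR_WHITE_NITS

print(nits_to_scrgb(80.0))     # SDR white stays at 1.0
print(nits_to_scrgb(400.0))    # a 400-nit highlight becomes 5.0
```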

commented

The REC2020 primaries were just for testing, as they gave much more vibrant test colors. I was always planning to use rec709, so that works out.

commented

> but this is apparently done on the driver level without specific instructions or intervention from our code

I mean, that is probably triggering the automatic DXGI swapchain presentation though (you should be able to confirm this with PresentMon, I think?).
Which I guess is still very handy and saves you a bunch of code, but it's not universal (NVIDIA > 526.47 and AMD > 23.7.1), and I can't see it ever being better than "explicit" interop.

commented

Hello. Just to clarify the situation a bit: PresentMon confirms that the presentation for Oolite is DXGI, Hardware Composed: Independent Flip, when running in HDR mode, even with the driver's Vulkan/OpenGL Present Method setting set to "Prefer Native" and Optimizations for Windowed Games turned off in the Windows 11 display settings. Also, I'll note that the HDR feature in Oolite was developed with a driver version way, way older than 526.47 (which implemented the above setting). Sorry, I don't have the exact driver version I was running back then handy; I'll see if I can dig out any forgotten game logs from that time.

Edit: Found it, it was version 516.40.

commented

@mirh Thanks for the info! Maybe I understood it wrong and maybe @AnotherCommander can clarify this 😊

commented

That is interesting...
I can only guess in retrospect that the capability has been there for quite a long time (exactly because they had to support HDR games such as Doom Eternal and RDR2), and what happened in those newer drivers is just that the underlying mechanism was exposed generally for normal SDR applications too.

commented

This functionality is “Layering OpenGL on DXGI”, and is an option in NVIDIA control panel. This should not be enabled in Minecraft as it causes random presentation issues and crashes.

commented

@IMS212 AnotherCommander said it worked even with "Prefer Native", though of course the setting says "prefer" native, not "enforce" native.

Hm, I just directly tested this option, and Minecraft works fine with NVIDIA’s DXGI option! But admittedly it was only a short test.

But I know that I’ve played Minecraft with Special K’s OpenGL-IK, which should leverage NVIDIA’s DXGI layering, and that worked fine for hundreds of hours without a single crash. Were the issues maybe fixed by NVIDIA in the meantime?

Scrap that last part; I need to check whether it really leverages it. For Vulkan it does, but OpenGL might be a custom DXGI interop.

Confirmed: I’ve played Minecraft with Special K for hundreds of hours, indirectly using NVIDIA’s OpenGL interop code via their NVIDIA interop API:

      if (dx_gl_interop.d3d11.staging.colorBuffer.p != nullptr)
      {
        glGenRenderbuffers (1, &dx_gl_interop.gl.color_rbo);
        glGenFramebuffers  (1, &dx_gl_interop.gl.fbo);

        dx_gl_interop.d3d11.staging.hColorBuffer =
          wglDXRegisterObjectNV ( dx_gl_interop.d3d11.hInteropDevice,
                                  dx_gl_interop.d3d11.staging.colorBuffer,
                                  dx_gl_interop.gl.color_rbo, GL_RENDERBUFFER,
                                                              WGL_ACCESS_WRITE_DISCARD_NV );
      }
    }

In Special K: https://github.com/SpecialKO/SpecialK/blob/cab11d3bc1e92f465f9e84f1ef1217bcac14cd2a/src/render/gl/opengl.cpp#L2684

Additional Info: https://registry.khronos.org/OpenGL/extensions/NV/WGL_NV_DX_interop.txt

PS: And that API is supported by AMD and Intel as well: https://opengl.gpuinfo.org/listreports.php?extension=WGL_NV_DX_interop

commented

Yep; adding this manually works great and is planned at some point. It's only NVIDIA's driver-level variant that causes problems.

commented

Hi @IMS212, I tested your GLFW & Iris forks. After some fixes, they successfully triggered HDR mode under KWin. However, the tonemapping doesn't look very good: all white colors are shown at maximum luminance, even white in the sRGB color space (e.g. the GUI font). I guess it doesn't do any tonemapping and directly passes sRGB values to Rec2020.

I saw that there is also a fork of Iris which provides an hdrconfig uniform to shaders. Is there any shader that actually uses these uniforms to provide correct tonemapping?

commented

I successfully got HDR working with the iterationt 3.2 shader. Here is the patch:

commit d6d59c5aab8e2935f46646b1e153249eaf1bc57e
Author: myself <[email protected]>
Date:   Tue Oct 8 20:34:47 2024 +0000

    pq hdr

diff --git a/shaders/Lib/Programs/Final.glsl b/shaders/Lib/Programs/Final.glsl
index 0195b9b..ad66f6c 100644
--- a/shaders/Lib/Programs/Final.glsl
+++ b/shaders/Lib/Programs/Final.glsl
@@ -25,6 +25,13 @@ layout(location = 0) out vec4 compositeOutput1;
 	uniform float BiomeBasaltDeltasSmooth;
 #endif
 
+uniform int maxLuminance;
+
+const float PQ_M1 = 2610.0/4096 * 1.0/4;
+const float PQ_M2 = 2523.0/4096 * 128;
+const float PQ_C1 = 3424.0/4096;
+const float PQ_C2 = 2413.0/4096 * 32;
+const float PQ_C3 = 2392.0/4096 * 32;
 
 #include "/Lib/Uniform/GbufferTransforms.glsl"
 
@@ -78,7 +85,16 @@ vec3 AgX(vec3 color) {
 }
 
 vec3 None(vec3 color){
-	return pow(color, vec3(1.0 / 2.2));
+	return LinearToGamma(color);
+}
+
+vec3 pq(vec3 color){
+	color.rgb *= vec3(1.0/(10000 / maxLuminance));
+	color.rgb = pow(color.rgb, vec3(PQ_M1));
+	color.rgb = (vec3(PQ_C1) + vec3(PQ_C2) * color.rgb) / (vec3(1.0) + vec3(PQ_C3) * color.rgb);
+	color.rgb = pow(color.rgb, vec3(PQ_M2));
+
+	return color;
 }
 
 vec3 MergeBloom(vec3 color, float bloomGuide){
diff --git a/shaders/Lib/Settings.glsl b/shaders/Lib/Settings.glsl
index ea1e577..53f899e 100644
--- a/shaders/Lib/Settings.glsl
+++ b/shaders/Lib/Settings.glsl
@@ -420,7 +420,7 @@
   //#define GLARE_FLARE_SHADOWBASED
 
 //Color---------------------------------
-	#define TONEMAP_OPERATOR AgX // [AgX ACES None]
+	#define TONEMAP_OPERATOR AgX // [AgX ACES None pq]
 
 	#define AGX_EV 				13.0 // [8.0 8.5 9.0 9.5 10.0 10.5 11.0 11.5 12.0 12.5 13.0 13.5 14.0 14.5 15.0 15.5 16.0 16.5 17.0 17.5 18.0 24.0 32.0]
 

Select pq in the tonemap option and it works. The PQ tonemap shader code is taken from https://github.com/haasn/libplacebo.

A screenshot of the final result:

[screenshot]

It looks washed out because it doesn't have an ICC profile embedded.
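As a sanity check on the PQ math in the patch above, a small standalone sketch (not part of the patch) of the same SMPTE ST 2084 inverse EOTF, with the constants spelled out the same way as the GLSL `PQ_*` values:

```python
# SMPTE ST 2084 (PQ) inverse EOTF, mirroring the PQ_* constants in the GLSL patch.
PQ_M1 = 2610.0 / 4096 / 4        # ~0.1593
PQ_M2 = 2523.0 / 4096 * 128      # ~78.84
PQ_C1 = 3424.0 / 4096            # ~0.8359
PQ_C2 = 2413.0 / 4096 * 32       # ~18.85
PQ_C3 = 2392.0 / 4096 * 32       # ~18.69

def pq_encode(nits: float) -> float:
    """Map absolute luminance in cd/m^2 to a PQ signal in [0, 1]."""
    y = (nits / 10000.0) ** PQ_M1
    return ((PQ_C1 + PQ_C2 * y) / (1.0 + PQ_C3 * y)) ** PQ_M2

print(round(pq_encode(100.0), 3))  # 100 nits → ~0.508, a standard reference point
print(pq_encode(10000.0))          # PQ peak (10000 nits) → 1.0
```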

commented

It also works for SEUS_PTGI_HRR_Test_2.1_GFME_v1.16 with the patch below:

diff --git a/shaders/lib/Settings.inc b/shaders/lib/Settings.inc
index 1182dcf..1ad8217 100644
--- a/shaders/lib/Settings.inc
+++ b/shaders/lib/Settings.inc
@@ -71,7 +71,7 @@
 #define WATER_FOG_DENSITY 1.0 // [0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 1.1 1.2 1.3 1.4 1.5 1.6 1.7 1.8 1.9 2.0]
 
 // Postprocessing
-#define TONEMAP_OPERATOR SEUSTonemap // [SEUSTonemap ACESTonemap Uncharted2Tonemap BurgessTonemap ReinhardJodie ExponentialTonemap]
+#define TONEMAP_OPERATOR SEUSTonemap // [SEUSTonemap ACESTonemap Uncharted2Tonemap BurgessTonemap ReinhardJodie ExponentialTonemap pq]
 #define TONEMAP_CURVE 1.5 // [1.0 1.25 1.5 1.75 2.0 2.25 2.5 2.75 3.0 3.25 3.5 3.75 4.0 4.25 4.5 4.75 5.0 5.5 6.0 6.5 7.0 7.5 8.0 8.5 9.0 9.5 10.0]
 #define EXPOSURE 1.0 // [0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 1.1 1.2 1.3 1.4 1.5 1.6 1.7 1.8 1.9 2.0 2.1 2.2 2.3 2.4 2.5 2.6 2.7 2.8 2.9 3.0 3.1 3.2 3.3 3.4 3.5 3.6 3.7 3.8 3.9 4.0]
 #define SATURATION 1.15 // [0.0 0.05 0.1 0.15 0.2 0.25 0.3 0.35 0.4 0.45 0.5 0.55 0.6 0.65 0.7 0.75 0.8 0.85 0.9 0.95 1.0 1.05 1.1 1.15 1.2 1.25 1.3 1.35 1.4 1.45 1.5 1.6 1.7 1.8 1.9 2.0]
diff --git a/shaders/program/composite14.fsh.glsl b/shaders/program/composite14.fsh.glsl
index aa60c15..1298992 100644
--- a/shaders/program/composite14.fsh.glsl
+++ b/shaders/program/composite14.fsh.glsl
@@ -9,6 +9,13 @@
 
 in vec4 texcoord;
 
+uniform int maxLuminance;
+
+const float PQ_M1 = 2610.0/4096 * 1.0/4;
+const float PQ_M2 = 2523.0/4096 * 128;
+const float PQ_C1 = 3424.0/4096;
+const float PQ_C2 = 2413.0/4096 * 32;
+const float PQ_C3 = 2392.0/4096 * 32;
 
 const float overlap = 1.9;
 
@@ -37,6 +44,15 @@ const mat3 ACESOutputMat = mat3(
     -0.00327, -0.07276,  1.07602
 );
 
+vec3 pq(vec3 color){
+	color.rgb *= vec3(1.0/(10000 / maxLuminance));
+	color.rgb = pow(color.rgb, vec3(PQ_M1));
+	color.rgb = (vec3(PQ_C1) + vec3(PQ_C2) * color.rgb) / (vec3(1.0) + vec3(PQ_C3) * color.rgb);
+	color.rgb = pow(color.rgb, vec3(PQ_M2));
+
+	return color;
+}
+
 vec3 Uncharted2Tonemap(vec3 x)
 {
 	x *= 3.0;

[screenshot]

Note: you need to manually decrease the exposure (EXPOSURE=0.1) to give it enough headroom for HDR. Otherwise you are just playing an SDR game at max brightness.
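A rough numeric illustration of that headroom note (the 1000-nit display peak is an assumed example value, not something from the patches):

```python
# Why lowering EXPOSURE helps: the tonemapped 0..1 range is later PQ-encoded
# up to the display's peak (the maxLuminance uniform), so SDR content must sit
# well below 1.0 to leave room for highlights above it.
MAX_LUMINANCE = 1000   # assumed display peak in nits
EXPOSURE = 0.1         # the manually decreased shader exposure

sdr_white_nits = 1.0 * EXPOSURE * MAX_LUMINANCE  # SDR white lands near 100 nits
headroom = MAX_LUMINANCE / sdr_white_nits        # 10x range left for highlights
print(sdr_white_nits, headroom)
```

With EXPOSURE left at 1.0, SDR white would map straight to the display peak, which matches the "SDR game at max brightness" behavior described above.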

commented

@GeForceLegend can you review this patch as well?