
Qt Contributors Summit 2022 - Program/Color space in Qt - issues and possibilities



Session Summary

  • What are color spaces
    • Wide gamut color spaces
    • HDR color spaces
    • "Gamma" curves
      • Why is gamma correction difficult, especially with text
  • What can Qt currently do
    • QColorSpace and image transforms
    • Linear color spaces
    • Floating point image formats
  • What can't Qt currently do
    • Gamma correct blending
      • Non fast-path text rendering
      • Real (instead of platform-faked) gamma correction on the text fast path
      • Other gamma corrected text painting without linear color spaces
    • Automatic color correction
      • From QImage inputs
      • To QPA outputs (for instance Display P3 Macs)
  • What APIs can we add to help
    • QPainter API
    • QScreen/QSurface API
      • QPA integration
    • QFont API?
    • QColorSpace API
      • XYZ, xyY
      • Luminosity
      • BT.2100
      • Double table ICC formats
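As background for the '"Gamma" curves' item above: the sRGB transfer function is not a plain power law but a piecewise curve, roughly a 2.4 power with a small linear toe near black, which is part of why gamma handling is subtler than a single pow(). A minimal stdlib-only sketch (not Qt's implementation):

```cpp
#include <cmath>

// sRGB electro-optical transfer function (EOTF): encoded value -> linear light.
// Note the linear segment near black; the curve is only *approximately* gamma 2.2.
double srgbToLinear(double v) {
    return (v <= 0.04045) ? v / 12.92
                          : std::pow((v + 0.055) / 1.055, 2.4);
}

// Inverse (OETF): linear light -> encoded sRGB value.
double linearToSrgb(double v) {
    return (v <= 0.0031308) ? v * 12.92
                            : 1.055 * std::pow(v, 1.0 / 2.4) - 0.055;
}
```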

Session Owners

  • Allan Sandfeld Jensen

Notes

(Taken by CristianMaureiraFredes)

The session was based on slides that will be shared afterwards. These are notes on the discussion.

  • Comments & Questions
    • Friedemann: Note Qt was mentioned as one library implementing color spaces correctly at Meeting C++ https://meetingcpp.com/mcpp/online/talkview.php?tid=2
    • Nicolas Fella: For colors on Wayland there is https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14
    • Lars Knoll: There is some stuff available there that Laszlo has been working on. I'm using that for Qt MM.
    • Lars Knoll: But it's preview and doesn't work on all platforms.
    • Andy: There is a huge performance impact on software rendering for doing this. And image formats "lie" about the color space as well. It's hard to just expect it to work given a random image file. In Qt Quick 3D we do all the rendering in linear color space and tonemap back out to sRGB space. But all user input via textures/colors needs to be converted based on context. Qt Declarative just mixes everything.
    • Max: So we could implement some of that on the RHI level?
      • Sure, but we need to check if that would have an impact on performance
    • Lars: comments on the topic of multimedia:
      • Lars: We have to be very careful with QPainter (rendering in SW); things can become too expensive. What you want to do is to render images, but assume linear blending?
      • Allan: Yeah, that's considered. Maybe we need a setting to correct that... maybe a setting/tool to correct the images on load?
      • Lars: Other thing: some work on the output side. For some of the rendered colors you need built-in tone mapping support. We can't get into some colorspaces, and there are limits with HDR colors on screen; without the mapping it is saturated.
      • Allan: Ack, it's expensive, and we need to do it in HW I think.
      • Allan: For videos... yeah, in HW. Videos go up in luminosity by a factor like 5, 10, or 15.
      • Lars: A modern MacBook HDR display does 700 nits; if the display is at max brightness, then a luminosity of 3 you can render properly. For a low surrounding-light setting, maybe a factor of 8. You need to get that factor out of the OS and tonemap to it.
      • Allan: We also need the default value for SDR (on Windows). I'm using the same on all platforms...
      • Allan: Most screens have 200 nits; maximizing makes that blurry...
      • Lars: On macOS they tell you the SDR value, then the range that they support on top of that. So a luminosity of 1 is what max SDR gives you, and above that you need to cut it off.
    • (This will be followed up with the changes in multimedia, to gather more insights.)
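The luminosity limiting discussed here (compressing scene luminosity into what the display can currently show, instead of hard-clipping) can be sketched as a simple soft-knee curve. This is an illustrative assumption about the approach, not the actual shader; the real tonemapper is hdrtonemapper.glsl in qtmultimedia:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical sketch: limit scene luminosity to what the screen can render.
// maxLum is the factor above SDR white the display currently supports
// (e.g. ~2.5 at full macOS display brightness, ~16 at low brightness,
// per the discussion). Values below the knee pass through unchanged;
// values above are compressed asymptotically toward maxLum (no hard clip).
double tonemapLuminosity(double y, double maxLum) {
    const double knee = 0.75 * maxLum;  // where compression starts (assumption)
    if (y <= knee)
        return y;
    const double excess = y - knee;
    const double range  = maxLum - knee;
    // Smoothly map [knee, inf) into [knee, maxLum).
    return knee + range * (1.0 - std::exp(-excess / range));
}
```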

    • Alexandru Croitor: Are there commercial users asking for better color support? It feels like all improvements are done because the developers feel passionate about it, rather than paying users needing it? Which is surprising to me
      • Andy: I think that's correct. Currently customers just make it look good regardless of what we do. Doing it perfectly everywhere will be painful (especially in Widgets and 2D Quick, and painful both for us and for the users, in breakages of existing code plus performance penalties).
      • Fabian: Well, at least for font rendering, we had bug reports
      • Andy: Ideally we should do all (2D) rendering in linear color space as well and output to whatever is the ideal color space for the output, but there is a cost.
    • Shawn: sounds a bit like hidpi: let's plan for getting it really proper eventually, make sure it can be hardware-accelerated, and try not to have it take 10 years this time
    • Albert Astals Cid: Related to colorspaces but unrelated to this presentation, in KDE we'd super like it if QColorSpace had CMYK support. We're trying to support loading CMYK PSD files and it's not great that we have to solve it on our side :D https://invent.kde.org/frameworks/kimageformats/-/merge_requests/72
    • Lars: I think correct colorspace handling is needed for our users at least for photos and videos.
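To illustrate the gamma-correct-blending gap listed in the summary: a 50/50 blend computed directly on encoded sRGB values gives a different (darker) result than the same blend done in linear light. A minimal stdlib-only sketch (plain C++, no Qt, purely for illustration):

```cpp
#include <cmath>

// Piecewise sRGB transfer functions (decode/encode).
double srgbToLinear(double v) {
    return (v <= 0.04045) ? v / 12.92 : std::pow((v + 0.055) / 1.055, 2.4);
}
double linearToSrgb(double v) {
    return (v <= 0.0031308) ? v * 12.92 : 1.055 * std::pow(v, 1.0 / 2.4) - 0.055;
}

// Naive blend: average the encoded values directly (what 2D stacks
// traditionally do, and what many existing apps implicitly depend on).
double blendGammaSpace(double a, double b) { return (a + b) * 0.5; }

// Gamma-correct blend: decode to linear light, average, re-encode.
double blendLinearSpace(double a, double b) {
    return linearToSrgb((srgbToLinear(a) + srgbToLinear(b)) * 0.5);
}
```

For black (0.0) and white (1.0), the naive blend gives 0.5 while the linear-light blend gives roughly 0.735; the visible difference is one reason text rendering (thin antialiased coverage blends) is especially sensitive to this.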


Follow up conversation on chat

  • Lars: UI elements can I believe mostly be handled in linear color space and UX designers will tune the color values until it looks correct.
  • Allan: yes, but it might need to be optional.
  • Allan: though I would also prefer if we could handle srgb colors in linear space.
  • Giuseppe: yeah, we've had N requests in the last couple of years for handling CMYK in QImage / QPainter. "it's complicated"
  • Laszlo: @Allan What I was missing here was a bit more inclusive look at the full stack as things are in Qt 6. QPainter/QImage/QColorSpace is one small piece in the puzzle. Qt Quick has its own story. Then Qt Quick 3D for instance does implement lots of stuff mentioned here (work in linear, tonemap, floating point images without QImage, HDR ready, etc. because it is simply more essential for 3D content). Also there's the HDR work in Multimedia Lars mentioned, built on the experimental Windows/macOS HDR swapchain support in QRhi. In the end the full story is a lot more complicated, esp. with overlapping rendering technologies (e.g. think an image in Qt Quick scene embedded within a Qt Quick 3D scene in a Qt Quick scene in a QQuickWidget composited by the widget stack to a window..), with many challenges along the way. Anyway, there's definitely room for a lot of experimenting in this area.
  • Allan: right, I knew RHI had a new model for the backing store, which is what I hoped would be ready for the qtgui side of things as well
  • Lars: Agree with Laszlo. It's certainly not easy to make this consistent, esp. given that we don't want to break existing uses. Ideally, I believe we should do all our rendering in linear, and tonemap the output to whatever the underlying surface uses.
  • Allan: and part of the wish for the talk was to bridge some of these separate efforts together.
  • Lars: I think the work on an RHI enabled backing store might help here. That should at least make tonemapping possible without completely killing performance.
  • Laszlo: yes, one new thing in 6.4 for composited backingstores (if there's a QQuickWidget/QOpenGLWidget in the top-level) is that we use QRhi for that, not OpenGL directly. Which in turn enables some other interesting things using the platforms' best supported APIs (e.g. the high dpi scaling experiments), and can be useful for future color space stuff as well later on.
  • Andy: You shouldn't forget that every color value and texture/image needs to be converted to linear color space (when needed), which also has a cost. And that ignores the fact that it's difficult to know when to do so without the user being explicit about the color space of each image, since image formats don't always specify their color space correctly. Also don't ignore the fact that many existing applications depend on the current "wrong" behavior.
  • Lars: Btw, you can find the stuff I'm using in https://code.qt.io/cgit/qt/qtmultimedia.git/tree/src/multimedia/shaders. hdrtonemapper.glsl is the tonemapper that limits luminosity to the range the screen can render (to avoid hard clipping of values on the screen)
  • Allan: though in theory HDR formats should be using absolute luminosity; I am happy to ignore that, as not being able to adjust brightness is not good in my book
  • Morten: Should we add a new experimental "color correct" application mode? Then we would be free to make changes without breaking existing apps (which may have implemented custom color correction on top of Qt)
  • Lars: @andy right now, we know that our default 2D rendering basically assumes the output surface is sRGB with a gamma of ~2 - 2.2, and we do alpha composition linearly in that space. Should be possible to map that so the default rendering looks the same.
  • Andy: @Morten S +1
  • Lars: @allan from all I could see, that is only theory. All TVs and screens do a brightness adjustment.
  • Allan: mine does not :D and I hate it!
  • Lars: otherwise you could never properly watch an HDR movie unless you perfectly control the surrounding light.
  • Lars: mine does :P
  • Allan: yeah I have to draw the curtains to watch some movies..
  • Andy: I like how HDR content on my TV has stuff that is so bright its like looking at the sun :-p
  • Lars: Sure, that's the purpose of HDR... it should blind you :)
  • Andy: lol
  • Lars: but macOS does adjust things according to the surrounding light (or the brightness setting of the display). So a luminosity of 1 is full SDR brightness. And the max luminosity that can be rendered depends on the display brightness (ie. you get a larger range with low display brightness settings)
  • Allan: so how does it do tone mapping of things that end up too bright? A clamp, or a late curve?
  • Lars: If my display is at full brightness I can render luminosities from ~0 - 2.5. If it's at a very low brightness the range goes from ~0 - 16.
  • Lars: @allan https://code.qt.io/cgit/qt/qtmultimedia.git/tree/src/multimedia/shaders/hdrtonemapper.glsl
  • Lars: That's basically doing the mapping of the Y component in linear YUV space.
  • Allan: right, so that is what you based it on
  • Lars: Yes. RHI gives me the maxLum value.
  • Lars: see QRhiSwapChainHdrInfo::maxLuminance.
  • Allan: that makes it output specific though.. For QColorSpace, I would prefer something general, but maybe that should be kept out of it then
  • Lars: (and qvideowindow.cpp:425ff)
  • Lars: If you want to render things on screen you need some tonemapping. Ideally I think the backing store could do that.
  • Lars: Of course you don't want any tone mapping to happen if you want to process images or videos.
  • Allan: right, but if a user requests a conversion from sRGB to BT.2100 on an image, it would need some display-independent conversion too
  • Lars: Of course. Simply leave out the tonemapping.
  • Lars: I've got shaders for quite a bit of that as well in Qt MM by now.
  • Lars: although you still might need to do some tonemapping. Especially when converting from an HDR to an SDR format.
  • Laszlo: @Allan overall there are many bits and pieces in place here and there, which is great already. But in the end it's a difficult topic (for me at least) because (1) due to the many ways a Qt app can output something, sometimes different ways of rendering are combined with each other (because why not have e.g. HDR videos in a 2D scene within a 3D scene within a 2D scene in a widget! :) ), while each path needs its own, sometimes conflicting solutions, sometimes impossible due to compatibility etc. - so holistic solutions are hard - and (2) the platform / graphics API / related API support for some of these things (thinking mainly of HDR output here) is far from ideal, wildly varying among platforms. E.g. the HDR video stuff mentioned above is only focusing on macOS right now and only when targeting a QWindow directly (it won't be available with a VideoItem in Qt Quick). And just querying something seemingly as simple as the maximum luminance for the screen is pretty varied among platforms and APIs. The devil is in the details, hence I believe it is essential to have lots of direct experimentation on multiple platforms with all the different rendering tech in Qt just to get an understanding of what's really possible to support in practice.
  • Lars: But for all image/video processing I think the best option would be to do everything in a linear floating point color space (scRGB most likely), and then tonemap to whatever the output format is. And that's sort of independent of whether you render to the screen or as an image/video with a certain colorspace.
  • Allan: @Laszlo true. I was trying to experiment on Windows, when I found the 2D API we were using wasn't good enough.. I would have to rewrite it.
  • Lars: I think that's correct. Of course, if performance doesn't matter as much (ie. we're not rendering real time), we can do everything in a linear floating point space and then do a tonemapped copy to whatever the output surface is.
  • Lars: using platform dependent shaders.
  • Laszlo: when it comes to accelerated (so QRhi-based) output, one place where I believe we could get good results in the near future is Quick 3D, because that already does many things right when it comes to HDR and tonemapping. With the (undocumented) ability to opt a QQuickWindow in to a scRGB (FP16) or HDR10 (RGB10) swapchain on Windows, assuming one has an HDR-enabled screen. I was planning to do some demos to see what we are missing (surely we need to work on the tonemapping side of things, e.g.), but that work has not progressed lately.

(TBH I find Qt Quick hopeless here, with its shaders and data such as colors being 100% as-is exposed to the user - compatibility makes it impossible to really change stuff there)

  • Lars: I don't think we need to change QQ2D (or QPainter). Let's simply document what it does.
  • Lars: it's all about UX there anyway, and the only place those things matter in there is for photos. And we can simply have a 'renderImagesCorrectly' property for the QML image element and QPainter that would cause us to convert the image as well as we can.
  • Laszlo: although that's not to say we do not have some small improvements even in Quick, e.g. in 6.4 there is now https://doc-snapshots.qt.io/qt6-dev/qml-qtquick-shadereffectsource.html#format-prop (so one could e.g. do layer.enabled: true; layer.format: RGBA16F to render a subtree into an FP texture - granted this is not that useful for a Quick app, and mainly needed to support FP-backed View3Ds, but it's a step at least to make Quick more than just RGBA8 textures)
  • Allan: Btw. Keep in mind we have at least one other potential user of HDR in 2D content: Krita
  • Laszlo: true, but how exactly is Krita rendering? Wasn't there some OpenGL widget or something involved?
  • Allan: they have their own painting, but they need QWindow/QSurface AFAIK
  • Allan: so if you can do that for qtquick3d that should be possible for them too, just keep it in mind
  • Lars: If they can get any type of HDR enabled surface, they can probably handle the rest themselves.
  • Laszlo: ah, do they now? ok, the nice part of that is that they have full control then (since they get a window/native window and do what they want with it) so they are not restricted by what Qt is offering :)
  • Lars: Can we get a FP based window on Linux currently? I couldn't figure out how to do that.
  • Allan: Possibly with EGL, but not with X11 or Wayland directly
  • Laszlo: also, with which (Qt) technology? E.g. raster window or with QRhi targeting a GL or Vulkan QWindow?
  • Lars: I think you want RHI with GL or Vulkan for that.
  • Allan: Don't know how they've adapted to Qt 6 yet, but they used to have both OpenGL and raster backends
  • Allan: haven't talked to them since the last physical Akademy :D
  • Lars: As I said before, you will need to do some tonemapping, and you really don't want to do that in SW.
  • Nicolas Fella: they == krita? They haven't
  • Allan: okay, thanks :)
  • Laszlo: if it's QRhi, then (1) even though there are EGL extensions we do not support any of those for now, so you cannot get a HDR / floating point backed swapchain. If you use Vulkan however, that's cross platform. So the two supported modes (scRGB/HDR10) might work if the system/drivers support it. But what's supported on X11/Wayland level, no idea.
  • Nicolas: See https://phabricator.kde.org/T14170 for why it's hard for them
  • Laszlo: oh, but even if it works, there will be no maximum luminance and such available from the QRhi, since Vulkan and the WSI say nothing about it (unlike DXGI or Metal); it should be queried from elsewhere in the platform, whatever that elsewhere may be (and it may depend on the win.sys. as well). Messy! But it could be worth trying at some point if just launching a Vulkan-based QQuickWindow with an scRGB swapchain works on Linux or not.
  • Lars: It would be great if we can start supporting HDR/FP based surfaces on more platforms. But in any case, we (and krita) will need to be prepared to fall back to sRGB.
  • Lars: @laszlo It's going to be messy, but Qt's job is to try to get some order into that messiness :)
  • Nicolas: For Wayland there's ongoing work upstream for a color management protocol, so discussing our requirements with them would be good
  • Allan: Requirements: a) Read an optimal color space. b) Be able to set a colorspace on a surface and then let the backend handle it hardware-accelerated
  • Lars: for the wayland client, I'd simply like to be able to use a FP16/linear scRGB based surface for our windows.
  • Allan: right, and using FP16/FP32 formats..
  • Lars: plus give us a max luminosity value that can currently be rendered.
  • Nicolas: https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14 is the relevant protocols MR
  • Lars: or maybe we don't even need the max luminosity, if the wayland server does the tonemapping.