Hacker News

31 Comments:
viraptor said 9 days ago:

> This work is likely to take months still before there is a complete tentative protocol, and probably years until these features are available in your favourite Wayland desktop environments.

I really appreciate the realistic timelines and honesty here.

yepthatsreality said 9 days ago:

> If you are an expert on the topics of color management or HDR displays and content, or a developer interested in contributing to the project, you are warmly welcome to join the development.

kinleyd said 9 days ago:

Having just emerged from an exploratory dive into Wayland, this thread popping up here is surely a sign from the gods. :)

It's been a good experience: Weston, sway and Wayland itself have such tiny footprints while doing so much. I haven't tried qtwayland and some of the others mentioned and will give them a run out. I didn't have any problems on Arch, but for some reason Weston wouldn't build on GuixSD, which I am also exploring.

One fun part was having to use the flag --my-next-gpu-will-not-be-nvidia to invoke wayland against Nvidia drivers. It isn't supported and promptly crashes anyway. I was glad I had a Radeon gpu also installed, on which I continue a most satisfactory exploration.

SahAssar said 8 days ago:

> One fun part was having to use the flag --my-next-gpu-will-not-be-nvidia to invoke wayland against Nvidia drivers.

That's because of sway, not because of wayland. Sway's creator has a (justified) vendetta against Nvidia's Linux drivers.

kinleyd said 8 days ago:

It's certainly in keeping with a fine tradition!

"This kernel is tainted by Nvidia." is another I remember, as of course that notable middle finger that was thrust skyward in similar context.

cpach said 9 days ago:

Interesting. Does anyone know what kind of stake Collabora has in Wayland? So far it has seemed to me that there aren't many commercial backers of Wayland. But if there are, I guess it's good for the project. Creating a desktop environment of high quality takes lots of really hard work; just look at all the enormous resources that NeXT/Apple/Microsoft have spent on their desktop environments.

rektide said 9 days ago:

> So far it has seemed to me that there aren’t many commercial backers of Wayland.

A very decent percentage of automotive infotainment systems are GENIVI[1] or Automotive Grade Linux based, both of which have been Wayland based.

LG's webOS runs on Wayland.

Much of Collabora's interest comes from involvement with Chrome & ChromeOS. I originally thought ChromeOS had switched entirely to Wayland, but looking around I'm having a hard time confirming that. I believe this is true: ChromeOS has an "Aura" shell which leverages the Chrome infrastructure. Chrome has its own "Ozone" API for rendering. A lot of Collabora and other work has gone into Ozone's Wayland backend.

In the future, the Lacros[2] project intends to run Chrome the browser inside their Chrome-based Exo/Exosphere display server, which is implemented in Aura. Kind of a weird situation. Still not sure what Aura or Exo directly uses, whether it keeps a Mus or other special Ozone backend, or whether it too will run on Wayland, but the browser inside ChromeOS is going to be running on Wayland "soon".

One can also run Android apps under Spurv[3], which targets Wayland.

Some of the best, most important technology development happening on the planet. Join those pushing for a better, "brighter" future.

[1] https://www.genivi.org/about-genivi

[2] https://source.chromium.org/chromium/chromium/src/+/master:d...

[3] https://gitlab.collabora.com/spurv

pabs3 said 9 days ago:

There was a post from Igalia (another open source consultancy) about Wayland and Chromium yesterday:

https://blogs.igalia.com/msisov/2020/11/20/chrome-chromium-o... https://news.ycombinator.com/item?id=25167584

ognarb said 9 days ago:

Wayland is heavily used in the embedded world. But the embedded world is not using KWin, Mutter or Sway, but instead more minimal compositors like the Qt Wayland compositor or Weston.

O_H_E said 9 days ago:

Interesting. Nice to know. Never heard of Wayland outside the context of desktops.

sho_hn said 8 days ago:

Cars and TVs use Wayland extensively, e.g. LG's TVs.

skykooler said 9 days ago:

It's also used on at least two mobile OSes - Sailfish OS and Plasma Mobile.

thayne said 9 days ago:

Non-desktop use cases are actually a major driver of wayland adoption, because x11 is very much designed for the desktop, whereas wayland is intended to be more general.

ddingus said 9 days ago:

Doesn't this ultimately mean X11 is needed?

People keep arguing away the more advanced use cases. Others keep saying those use cases matter.

So...

Either Wayland eventually ends up satisfying those use cases, and a lot of the apps that go with them, or... ?

thayne said 8 days ago:

Well, the intention is that there are desktop-specific protocols (for example, xdg-shell) that would be implemented by desktop compositors (such as KDE, GNOME, and wlroots), but wouldn't need to be implemented in other environments. The problem is that currently there are several use cases where such protocols either don't exist or aren't standardized across multiple compositors.

ddingus said 8 days ago:

I want to be clear about something. I don't care how those use cases get addressed.

I do care very much about Unix remaining Unix. And the way X was implemented is multi-user graphical computing. That's got a lot of advantages and a lot of use cases that people put to good use.

X works over the network because Unix does things over the network. X is multi-user, like Unix is multi-user.

Same goes for multi-headed systems, multiple displays, all kinds of user interface devices. You name it.

The problem, quite simply, is that the new development isn't Unix-like in the way X was Unix-like.

The question is whether people will accept a lesser Unix than they know is currently possible.

thayne said 8 days ago:

Afaik all of those things work fine with wayland, at least with wlroots. It is true that wayland isn't network transparent, but it can be used over the network with things like wayvnc and waypipe. Now you can say that those aren't really mature yet, and I would agree with you. I think wayland's readiness is often overstated. However, I don't think any of the things you pointed out are compelling evidence that wayland isn't unix-like.
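(For the curious, waypipe is designed to wrap ssh, so a typical invocation looks roughly like the line below; the host name is just a placeholder and weston-terminal stands in for any Wayland client:

    # run the Wayland client on the remote machine, display it locally
    waypipe ssh user@remotehost weston-terminal

It proxies the Wayland protocol over the ssh connection, loosely analogous to what "ssh -X" does for X11.)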

Otoh, I do think wayland departs from Unix in how it requires the "compositor" to perform too many roles. While in X the compositor, window manager, etc. can be, and often are, separate processes from the X server, in wayland the compositor also needs to be the window manager, the keyboard layout manager, the xwayland server, etc. And at least according to some developers of a popular desktop environment, it is also the compositor's responsibility to perform other tasks such as screen capture, VNC server, taskbar, lockscreen, etc. It isn't possible to do something like use i3 as the window manager for KDE in wayland.

ddingus said 8 days ago:

Your "not unix like" comments ring true, and similar ones have been brought up with systemd.

My "I don't care how we get the use cases done" point of view applies to systems just as it does wayland.

I don't care so much, so long as we preserve how robust UNIX is.

Now to be fair, I think doing that gets a lot harder as these kinds of efforts continue, but the door is open. May end up in a good place.

Personally, I wish all the current Wayland users and devs would experience an SGI X environment. Great experience.

If, at the end of the day, stuff ends up working that well, few will complain.

ddingus said 8 days ago:

Again, does that not mean X11 is needed, until those things are addressed?

thayne said 8 days ago:

For some people, yes. But if you don't need those things, or your compositor and apps of choice support the same protocols, it is possible to switch to wayland.

ddingus said 8 days ago:

Well, sure.

But the greater frustration here seems to be lack of interest in multi user graphical computing.

Without the basic nature of X being a priority, the whole system is less. X was done the way Unix was.

People thought through general purpose multi user networked computing. Sometimes just multi user, serial tty style.

The gift of X was that same level of thought behind multi-user graphical computing. Again, sometimes networked, in the same fashion as above.

Current efforts do not have multi-user graphical computing as a focus.

It is said it can be.

Many who understand how to use X think it should be. Unix is a powerful OS, with some of that potential being sidelined to optimize for specific cases.

Long term, that may be a net loss.

We may also find it gets reinvented again too.

m0zg said 9 days ago:

Do you have any examples you could point to? A few years back I looked into this and the only really viable option was Qt Embedded, which is expensive for embedded use and requires royalties to be paid on a per-device basis. I wonder how much things have changed since then and whether there's now a viable FOSS GUI stack for embedded Linux.

kbumsik said 9 days ago:

Qt Embedded is a full-featured GUI framework, and you don't need it. You can just use QtWayland for free and run any GUI apps that target Wayland (GTK, Qt...).
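(Roughly speaking, once a Wayland compositor is running, pointing the toolkits at it is just an environment variable; the app names below are placeholders:

    # Qt apps: select the Wayland QPA plugin (requires qtwayland)
    QT_QPA_PLATFORM=wayland ./my-qt-app

    # GTK apps: prefer the Wayland backend
    GDK_BACKEND=wayland ./my-gtk-app

)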

One of the most widely used is probably Tizen, an embedded OS by Samsung for smartwatches (they don't use Google's Wear OS) and TVs.

m0zg said 8 days ago:

To answer my own (inexplicably downvoted) question: https://lvgl.io/ looks intriguing.

BlueTemplar said 9 days ago:

> Most of the desktop applications (well, literally all right now) are using Standard Dynamic Range (SDR).

Wait, does that mean that Firefox, GIMP, VLC... when run in X11, do not currently support HDR? What about gamuts other than sRGB?

Anyway, it sounds like this means that my old wide-gamut monitor should eventually be able to at least partially show the 'HDR gamuts' (and luminosities)? That would be great!

brudgers said 9 days ago:

Gamut is orthogonal to dynamic range in many three dimensional color spaces. A wide gamut CRT will tend to have less dynamic range than a wide gamut OLED. Dynamic range is a description of contrast. Gamut is a description of color.

BlueTemplar said 9 days ago:

In marketing-speak, the various 'HDR' standards refer to a bunch of things, not just to dynamic range: https://www.cnet.com/news/dolby-vision-hdr10-advanced-hdr-an...

IIRC most of those are using Rec.2020 ?

brudgers said 9 days ago:

Early drafts of my comment kept going off into the weeds of "high dynamic range" versus how people tend to use "HDR" to describe pictures made by reducing more bits into fewer bits while maintaining the perception of more bits, which in photography is basically every black-and-white print made from a well-exposed black-and-white negative of a scene with at least moderate contrast... obviously I am letting this comment go into the weeds.

More recently, "HDR" can also refer to mapping fewer bits in the source into more bits on the output in the case of displays.

Anyway, disentangling dynamic range from gamut just a bit seemed like it might add some light.

jiggawatts said 9 days ago:

Practically nothing in the PC world properly supports either HDR, wide gamut colour, or even just mapping to the physical display gamut in general.

There are about half a dozen things you can do, in some configurations, sometimes, where everything will "work". Anything else is a total shitshow.

It all goes back to generations of developers assuming that a pixel is an RGB triple of bytes in the range 0..255 in the sRGB colour space. Millions of lines of code have been written in which "images" are "byte[]" or "struct {r: byte; b: byte; g: byte} []" or the equivalent.

First of all, there's an implicit assumption by most programmers that this is a linear colour space. It isn't. The (127,127,127) pixel colour is not 50% grey! Not even close. Even Adobe Photoshop made this mistake.
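A quick sketch of that non-linearity in Python, using the standard sRGB transfer function (the function name is just for illustration):

    # Decode an 8-bit sRGB value to linear light (standard sRGB EOTF).
    def srgb_to_linear(c8):
        c = c8 / 255.0
        if c <= 0.04045:
            return c / 12.92
        return ((c + 0.055) / 1.055) ** 2.4

    print(srgb_to_linear(127))  # ~0.21 -- about 21% linear light, nowhere near 50%
    print(srgb_to_linear(188))  # ~0.50 -- the 8-bit value that actually is half intensity

Any code that averages or blends those raw byte values directly is implicitly doing its maths in the wrong (non-linear) space.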

Second, monitor manufacturers like to stretch colours to the maximum (native) display capability, because it makes them stand out more in the shop. Everyone wants the monitor that makes the colours "pop". Unfortunately, this means that with modern wide-gamut monitors, skin tones are so stretched that everyone looks like they're in clown make-up.

Third, Microsoft is the laziest company you can possibly imagine. They will not lift a finger to do anything other than the bare minimum, which they will then never touch again unless forced to at gunpoint. Hence, Windows had colour management technically added to it back around the year 2000, in the sense that it has a handful of colour management pieces that never do anything unless explicitly invoked by applications such as Adobe Photoshop. Nothing has changed since, other than a new "HDR mode" that blurs text, turns the brightness down, and is only suitable for gaming in a dark room.

Fourth, the 10-bit-per-channel displays required for HDR were sold only to professionals, so of course NVIDIA and ATI milked this market for all it was worth. Until very recently, 10-bit output was available on most consumer cards but disabled in the drivers. You had to buy the Quadro "professional" cards with identical chips, but different drivers. As you can imagine, practically nothing in Windows (or Linux) can output 10 bit, other than a handful of video viewing apps. Web browsers certainly can't. Even games that do compositing with 16-bit buffers output 8 bit only unless they were explicitly made for HDR. Even those still refuse to output 10 bit SDR or wide-gamut SDR even if it's available.

Fifth, because of the deliberate hardware limitations and the completely uncalibrated $150 budget displays most people buy, support for anything other than sRGB and SDR was pointless in file formats, so nobody bothered to lift a finger. Not the standards bodies, not Microsoft, not Linux, not anyone[1].

As of 2021, it is impossible to do any of the following in the general case:

- Send someone a HDR still image file.

- Send someone a HDR video and expect it to work in anything other than the YouTube app on a handful of platforms.

- Send someone a 10-bit or better file and expect it to actually deliver a benefit.

- Send a wider-than-sRGB file and expect it to display correctly, even on an sRGB monitor. Expect clowns or zombies, not proper skin tones.

(I wrote a similar rant back in 2020, and 2019, and 2018, back all the way to 2010. This will not change in 2022. Or 2023. Or...)

[1] Okay, I tell a lie: Apple did actually make all of the above work! Nobody else though. Right now, within the Apple ecosystem only, I can:

- Send someone a 10-bit, Display-P3 image and it'll display perfectly. Better colour, calibrated, smoother gradients.

- Send a wider-than-Display-P3 image and it'll also be correctly displayed, not stretched unpredictably like it would be on a PC.

- Send a HDR video in 10-bit Dolby Vision as a chat message and it'll work.

Etc...

Everywhere else though, no chance.

the8472 said 9 days ago:

WCG (wide color gamut) has been a thing for much longer than HDR, so support for it is also more established. Firefox and GIMP do support color management via ICC profiles.