Subject: Re: [Nano-devel] A patch for added color support
Date: Sun, 31 Dec 2017 21:50:24 +1030
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Thunderbird/52.5.0
Rise, rise from the grave! ;)
> On Tue, 21 Oct 2014 22:21:59 +0200 Erik Lundin wrote:
>> On Tue, 21 Oct 2014 21:30:58 +0200 Benno Schulenberg wrote:
>> By the way, Erik, what do you think of the way Dave Geering added
>> support for 256 colours in https://savannah.gnu.org/patch/index.php?6873 ?
>> (He didn't actually post a patch, so I attach effective diff here.)
> Adding support to define colors in more ways might be a good idea. The
> question is how this could easily be documented or shown by example.
> It might be harder to explain to a new user how to convert an RGB
> value into the color they are looking for than to direct them to the
> official xterm-256color palette. But my experience is that most users
> google color themes for their favorite editor and then just apply them
> directly. Most people who actually design new color themes are power
> users who know exactly what colors they are using (probably working
> from an example palette).
> So introducing the "nano color numbers" might be a good or bad idea.
> It's difficult to know. But changing this further down the track is a
> bad idea: then "some" of the Google-provided themes might work and some
> won't. It might be better to provide only one way to add the
> extended colors.
> The code supplied has several problems as I see it. Giving negative
> numbers like -5:-5:-5 will produce unexpected results. Clamping the
> color numbers will definitely frustrate a lot of users if they
> try to define real RGB values (0-255) for each channel. A warning when
> a color is beyond the possible limit would be great.
My take on this is that if you're going to do it at all, you should do it properly, so you don't have to do it again.
The average human eye is incapable of distinguishing between adjacent hues in a 24-bit palette (~16.7 million colours), whereas it can easily tell the difference between adjacent hues in a 216/256-colour palette. That being the case, there is almost no 'pressure' for anything greater than a 24-bit colour palette and, unless the human eye evolves to be more sensitive, nor will there be. A 24-bit palette should see us all to the grave. On that basis, I think it would be wise to treat 24-bit RGB (so 8 bits per channel) as the basis for added colour support.
Many terminal environments already support 24-bit 'truecolour' (a convenient list can be found here: https://gist.github.com/XVilka/8346728) and the trend towards truecolour will only continue. If nanorc syntax files skipped 256 colours entirely and went straight to 16.7M, there would be no reason to revisit this issue in a decade — perhaps ever. (At least not from a user's perspective.)
Since rgb(0..255,0..255,0..255) is already a commonplace notation for web developers, and web development is quite likely to be the entry portal that most future programmers end up stepping through, I suggest adopting it for colour statements in nanorc syntax files.
Such a move would make bounds-checking trivial and eliminate most of the complexity that leads to user mistakes in the first place. Giving users four or five different ways to enter colour codes paves the way for four or five times the number of mistakes.
I would thus prefer to see:
color cyan regex
superseded by something like one of the following:
color rgb(0,255,255) regex
color 0,255,255 regex
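As a rough illustration (this is my own sketch, not code from the patch; the function name is hypothetical), parsing and bounds-checking such an rgb() statement would be only a few lines of C, and would cleanly reject both negative channels and out-of-range values:

```c
#include <stdio.h>

/* Hypothetical parser for an "rgb(r,g,b)" colour token in a nanorc file.
 * Returns 1 and fills r/g/b on success; returns 0 on malformed input or
 * on any channel value outside 0..255 (including negatives like -5). */
int parse_rgb(const char *token, int *r, int *g, int *b)
{
    char trailing;

    /* The trailing %c catches junk after the ')', e.g. "rgb(1,2,3)x". */
    if (sscanf(token, "rgb(%d,%d,%d)%c", r, g, b, &trailing) != 3)
        return 0;

    if (*r < 0 || *r > 255 || *g < 0 || *g > 255 || *b < 0 || *b > 255)
        return 0;

    return 1;
}
```

On failure the caller could then emit the warning Erik asked for, rather than silently clamping.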
> Other than that, it doesn't check whether the terminal even supports the
> extended colors (through the COLOR variable or any other method). Some
> terminals support more than 8 and fewer than 256 colors. I personally
> think the only way to go is to revert it to no colors at all for those
The mapping of 24-bit nanorc syntax colours to the resolution supported by the terminal could, and I feel should, be done programmatically within nano.
I consider the nanorc files to be a different issue to the actual rendering within terminal-based applications. There's no reason (I can think of) why most users need to be exposed to the nitty-gritty of 'arcane colour systems'. Forcing them to do so won't make their (or anyone else's) lives 'better'.
If the terminal cannot handle 24-bit colours then having the user's 24-bit preferences mapped quietly, and behind-the-scenes, to whatever the terminal can actually handle is the way to go, I think. Assuming 24-bit could be passed straight through, only a handful of functions would logically be needed: 24->8 (truecolour to 256), 24->4 (truecolour to 16), 24->3 (truecolour to 8), and 24->1 (truecolour to monochrome).
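To make the 24->8 case concrete: a minimal sketch (mine, not nano's code; assuming the standard xterm layout of a 6x6x6 colour cube at indices 16..231) of snapping a truecolour triple onto the 256-colour palette might look like this:

```c
/* Hypothetical helper: snap one 0..255 channel onto the index of the
 * nearest xterm colour-cube level {0, 95, 135, 175, 215, 255}. */
static int cube_index(int v)
{
    if (v < 48)
        return 0;          /* nearest level is 0 */
    if (v < 115)
        return 1;          /* nearest level is 95 */
    return (v - 35) / 40;  /* levels 135..255 are 40 apart */
}

/* Hypothetical 24->8 mapping: reduce a 24-bit RGB triple to the nearest
 * entry in the xterm-256 colour cube (indices 16..231).  A fuller
 * version would also consider the greyscale ramp at 232..255. */
int rgb_to_256(int r, int g, int b)
{
    return 16 + 36 * cube_index(r) + 6 * cube_index(g) + cube_index(b);
}
```

The 24->4, 24->3, and 24->1 functions would follow the same pattern with coarser palettes, so the user's nanorc never needs to change.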
I'm new to the whole 'terminal colour' thing, and most of my C knowledge has rusted away over the decades, so I don't know what the optimal coding approach would be. I do, however, strongly feel that the complexity involved in (re-)training users to understand and use multiple colour systems would be a net negative for the project. Far better, I feel, to skip 256 colours completely and transition the user base straight to 24-bit RGB for syntax highlighting in nanorc files. Those who are familiar with (and competent in) these colour systems should be responsible for the functions that translate user-space preferences into system-space reality.
Thanks for listening, and I apologise for the length of this post.