
From: Ilya Zakharevich
Subject: bug#19993: 25.0.50; Unicode fonts defective on Windows
Date: Sun, 8 Mar 2015 00:38:05 -0800
User-agent: Mutt/1.5.21 (2010-09-15)

On Sat, Mar 07, 2015 at 10:18:25AM +0200, Eli Zaretskii wrote:
> > >   (set-fontset-font "fontset-default" '(#x1d400 . #x1d7ff) "Symbola")
> > 
> > I do not follow.  What is going on now?  Are you saying that it should
> > NOT work out-of-the-box?
> 
> On the slim chance that you'd like this to work for you, and didn't
> yet figure it out, I described what worked for me.

It would be nice if there were a recipe that works for everyone.
(After that, one could make it the default. ;-)
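
For what it's worth, the closest thing I have to such a recipe is only a
rough sketch built on the set-fontset-font call you quoted, and it assumes
the Symbola font is installed:

    ;; Sketch, not a tested universal recipe: fall back to Symbola for
    ;; the Mathematical Alphanumeric Symbols block, but only when the
    ;; font is actually available on this system.
    (when (find-font (font-spec :family "Symbola"))
      (set-fontset-font "fontset-default" '(#x1d400 . #x1d7ff) "Symbola"))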

But the major hurdle is that the semantics of fontsets are completely
undocumented.  After your suggestions, I think I have arrived at a
description which does not contradict anything I have seen:

=======================================================
When Emacs wants to show a character using a fontset:
  • Emacs looks in the fontset and finds the font specifications associated
    with this character.
  • Emacs checks which Unicode Subset contains the given character.
                (What if it is not unique???)
  • From the fonts matching the font specifications, Emacs picks those
    which have this Unicode Subset “identified” within the font.
  • From these, Emacs chooses one (which?).

Emacs uses this procedure for two fontsets: the currently enabled one and
the default fontset.  If neither of the two fonts obtained this way supports
the given character, a HEX representation is shown.
=======================================================

Is this similar to what actually happens?  (I’m not asking about the
implementation, just whether it is functionally equivalent.)
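
A crude way to probe part of this from Lisp (only a sketch; it shows which
font specification the default fontset associates with a character, and
whether Emacs believes it can display the character at all, not which
concrete font Emacs ends up choosing):

    ;; Ask what font spec the default fontset associates with
    ;; MATHEMATICAL BOLD CAPITAL A, and whether Emacs thinks it can
    ;; display that character with the current configuration.
    (let ((ch #x1d400))
      (list (fontset-font "fontset-default" ch)
            (char-displayable-p ch)))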

Ilya