Subject: Re: I'm looking for a method of converting a string's character encoding
Date: Sun, 29 Apr 2012 07:42:29 +0900
> Date: Sat, 28 Apr 2012 20:29:22 +0200
> From: Daniel Krueger <address@hidden>
> Cc: address@hidden, Sunjoong Lee <address@hidden>
> I think there shouldn't be any transcoding of Guile's strings, as
> strings are an internal representation of characters, no matter how
> they are encoded.  So the only time when encoding matters is when a
> string passes its "internal boundaries", i.e. when you write it to a
> port, read it from a port, or pass it to a foreign library.  For
> ports all transcoding is available, and, as said, the internal
> representation of Guile strings is UTF-8, which can't be changed.
> The only additional thing I forgot about is bytevectors, if you
> convert a string to an explicit representation, but AFAIK there you
> can also give the encoding to use.
> Am I wrong?

You are mostly right, but only "mostly".  Experience teaches that
sometimes you need to change encoding even inside "the boundaries".
One notable example is when the original encoding was determined
incorrectly, and the application wants to "re-decode" the string, when
its external origin is no longer available. Another example is an
application that wants to convert an encoded string into base-64 (or
similar) form -- you'll need to encode the string internally first.
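A minimal sketch of both use cases, written in Python for illustration (the string/bytes distinction is the same one being discussed for Guile; none of this is Guile API):

```python
import base64

# Use case 1: "re-decoding" a string whose encoding was guessed wrong.
# Here, UTF-8 bytes for "e-acute" were mistakenly decoded as Latin-1,
# yielding mojibake; the external origin is assumed to be gone.
garbled = b"\xc3\xa9".decode("iso-8859-1")   # two wrong characters

# Recover the original text by encoding the in-memory string back to
# the bytes it came from, then decoding with the correct encoding.
repaired = garbled.encode("iso-8859-1").decode("utf-8")   # "\u00e9"

# Use case 2: base-64 operates on bytes, so the in-memory string must
# be encoded internally before it can be converted.
b64 = base64.b64encode("\u00e9".encode("utf-8")).decode("ascii")
```

The point is that both operations happen entirely in memory, with no port or foreign-library boundary in sight.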
These kinds of rare, but still important, use cases are the reason why
Emacs Lisp has primitives for encoding and decoding in-memory strings;
as much as the Emacs maintainers would like to get rid of the related
need to support "unibyte strings", these use cases are not going to go
away any time soon.
IOW, Guile needs a way to represent a string encoded in something
other than UTF-8, and convert between UTF-8 and other encodings.
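Conceptually, such a representation is just the raw bytes paired with knowledge of their encoding, and conversion goes through the decoded in-memory string.  A sketch in Python (purely illustrative; not a proposal for Guile's API):

```python
# An "encoded string" in a non-UTF-8 encoding is simply its byte
# representation under that encoding.
text = "na\u00efve"   # "naive" with i-diaeresis

latin1_bytes = text.encode("iso-8859-1")   # one byte per character
utf8_bytes = text.encode("utf-8")          # the diaeresis takes two bytes

# Converting between encodings: decode from the source encoding into
# the abstract string, then encode into the target encoding.
converted = latin1_bytes.decode("iso-8859-1").encode("utf-8")
```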