Re: How should ObjC encode 'long'?
From: Ziemowit Laski
Subject: Re: How should ObjC encode 'long'?
Date: Tue, 27 Jan 2004 14:46:31 -0800
On 27 Jan, 2004, at 14.37, Kaelin Colclasure wrote:
On Jan 26, 2004, at 7:09 PM, Ziemowit Laski wrote:
Currently, ObjC encodes 'long' as 'l' (and 'unsigned long' as 'L'), but only if sizeof(long) == sizeof(int). On LP64 targets, where sizeof(long) == 2 * sizeof(int), 'long' and 'unsigned long' get encoded as 'q' and 'Q', respectively, instead.

Personally, I tend to think that this is broken, and that 'long' should always be 'l', regardless of its size. However, I can also see an ABI argument (esp. in the context of distributed objects) that would lead to the opposite conclusion. What do you all think?
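[For readers following along: the behavior under discussion can be observed directly with the @encode() directive. A minimal sketch, assuming an LP64 target and any ObjC compiler (e.g. clang); on ILP32 the first line would print "l" and "L" instead:

```objc
#include <stdio.h>

int main(void)
{
    /* On LP64, sizeof(long) == 8, so 'long' gets the quadword codes. */
    printf("long -> %s, unsigned long -> %s\n",
           @encode(long), @encode(unsigned long));

    /* 'long long' is encoded as 'q'/'Q' on every target. */
    printf("long long -> %s, unsigned long long -> %s\n",
           @encode(long long), @encode(unsigned long long));
    return 0;
}
```

The output strings are what the runtime embeds in method signatures, which is why the choice matters for distributed objects.]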
I would assume the whole point of "encoding" is to externalize the
relevant type information. And if it's for external consumption, it
needs to be fully self-describing. 'q' and 'Q' unambiguously denote a
signed / unsigned quadword... Seems like exactly the right thing.
Yes, this seems to be the consensus, and so it shall stay. :-)
--Zem
--------------------------------------------------------------
Ziemowit Laski 1 Infinite Loop, MS 301-2K
Mac OS X Compiler Group Cupertino, CA USA 95014-2083
Apple Computer, Inc. +1.408.974.6229 Fax .5477