

From: William Stein
Subject: [Axiom-developer] Re: [sage-devel] Re: doctest failures due to rounding errors on Solaris.
Date: Thu, 31 Dec 2009 21:16:43 -0800

On Thu, Dec 31, 2009 at 9:13 PM, Tim Daly <address@hidden> wrote:
> Dr. David Kirkby wrote:
>> rjf wrote:
>>
>>> On Dec 31, 11:15 am, "Dr. David Kirkby" <address@hidden>
>>> wrote:
>>>
>>>
>>>>> RJF
>>>>>
>>>> The point you are missing is that we want to compare the output that Sage
>>>> prints to a human.
>>>>
>>>>
>>> The point you are missing is that the following item, which presumably
>>> could be printed by Sage,
>>> is perfectly readable to a human:
>>>
>>> 6121026514868073 * 2^(-51).
>>>
>>> It exactly dictates the bits in an IEEE double-float, and does not
>>> require any conversion from binary
>>> to decimal. It does not need rounding.  This kind of representation
>>> does not have any hidden unprinted digits.  It does not ever need to
>>> be longer because of delicate edge conditions of certain numbers.
>>>
>>> It happens to evaluate to
>>> APPROXIMATELY   2.718281828459045
>>>
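
For concreteness, the exact form above is easy to check from a Python prompt,
assuming CPython's math.e is the double in question:

    import math

    # as_integer_ratio() recovers the exact binary value of the double,
    # in lowest terms, with a power-of-two denominator.
    num, den = math.e.as_integer_ratio()
    print(num)                    # 6121026514868073
    print(den == 2**51)           # True: math.e == 6121026514868073 * 2**(-51)
    print(num / den == math.e)    # True under Python 3's true division
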
>>
>> Sure, Sage could print that. It would also be worth printing the sign bit, 
>> so we
>> could verify the values of
>>
>> 1) Sign bit
>> 2) Significand
>> 3) Exponent.
>>
>> All of those could be correct. But there is still the software which does the
>> non-trivial task of converting that into the base 10 representation used by
>> humans. Then in addition to that, there is the software which takes a base 10
>> number, shows it with the Sage prompt, adding carriage returns etc. where
>> necessary. All of these can go wrong.
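
For what it's worth, all three of those fields can be read straight off the
bits with no decimal conversion at all; a minimal sketch, assuming the usual
IEEE 754 binary64 layout:

    import math
    import struct

    def double_fields(x):
        """Split a binary64 into (sign, biased exponent, fraction field)."""
        (bits,) = struct.unpack('>Q', struct.pack('>d', x))
        sign     = bits >> 63
        exponent = (bits >> 52) & 0x7FF      # biased by 1023
        fraction = bits & ((1 << 52) - 1)    # 52 stored bits; leading 1 is implicit
        return sign, exponent, fraction

    sign, exp, frac = double_fields(math.e)
    print(sign, exp - 1023, hex(frac))       # 0 1 0x5bf0a8b145769
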
>>
>> I would think in an almost ideal world, the test would be done at a higher
>> level, using hardware/software which checked what the monitor actually
>> displayed. That's not quite as easy to do though.
>>
>> Even better would be some way to scan the brain of the user to see what
>> he/she believes Sage is showing. Perhaps we use a font that is not very good,
>> so despite being displayed properly, it is misunderstood.
>>
>> Given that most of the time people want to see a base 10 representation of a
>> number, and not a base 2, base 16 or IEEE 754 representation, I believe most
>> testing should be done at the base 10 level.
>>
>> If there is a reason for testing the IEEE 754 representation as a first choice,
>> then you have yet to convince me of it.
>>
>>
>> Dave
>>
>>
>>
> Dave,
>
> Axiom has the same issues.
>
> My take on this is that what you check depends on the reason you are
> checking.
> If you are generating the output for human use (e.g. a table) then you
> want decimal.
> If you are generating the output for regression testing (e.g. checking the
> answers on multiple hardware platforms) then you probably want Fateman's
> solution.
>
> Tim
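
The contrast Tim draws can be sketched in a few lines of Python (the expected
values here are just for illustration):

    import math

    value = math.e

    # Doctest / human-facing style: compare the decimal string the platform
    # prints, which goes through its float-to-decimal conversion routine.
    print(repr(value))

    # Fateman-style regression check: compare the exact bits of the double,
    # independent of any decimal formatting.
    print(value.as_integer_ratio() == (6121026514868073, 2**51))   # True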

The output is used both for human consumption and for regression testing.  Its
primary use is human -- it's an example in the Sage reference manual:

   sage: float(e)
   2.7182818284590451

This is something a user will look at when reading the documentation
for some function.  It illustrates what happens when they convert the
symbolic constant e to float.
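
The decimal string is also exactly where the platform dependence creeps in:
the 17-digit form above and the shorter 2.718281828459045 both round-trip to
the same double, so a conversion routine that rounds the last printed digit
differently is presumably what the Solaris failures come down to. A quick
check (nothing Sage-specific):

    import math

    # Both decimal strings identify exactly the same double:
    print(float("2.7182818284590451") == math.e)   # True
    print(float("2.718281828459045") == math.e)    # True

    # 17 significant digits always round-trip for binary64, but the last
    # printed digit need not agree across platforms unless the conversion
    # to decimal is correctly rounded.
    print("%.17g" % math.e)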

William



