From: Linda Walsh
Subject: bug#17505: Interface inconsistency, use of intelligent defaults.
Date: Fri, 16 May 2014 14:20:42 -0700
User-agent: Thunderbird
Pádraig Brady wrote:
> On 05/16/2014 11:01 AM, Ruediger Meier wrote:
>> On Friday 16 May 2014, Pádraig Brady wrote:
>>> The attached patch changes the output to:
>>>   $ dd if=/dev/zero of=/dev/null bs=256M count=2
>>>   2+0 records in
>>>   2+0 records out
>>>   536870912 bytes (512 MiB) copied, 0.152887 s, 3.3 GiB/s
>> Thanks!
>> What about just "512 M", which looks IMO better, is a valid input
>> unit, and is explained in the man page.
> That would be less clear I think, since in standards notation
> 512M is 512000000. Also, adding the B removes any ambiguity as to
> whether this referred to bytes or blocks.
----
Since 'B' already refers (most commonly) to 2^3 bits of information, "KiB" = 1024 information bytes. What other type of bytes are there? I would acknowledge some ambiguity when using the prefixes with 'bits', but with 'bytes' their only usage is in reference to 'information'.

Note that in the information field, when referring to timings, milli, micro, nano all refer to an abstract, non-information quantity (time in 's'). When referring to non-computer units, SI prefixes would be the default. But for space, in 'bytes', they qualify an 'information unit' that has no physical basis for measurement.

I think the SI standard was too hastily pushed upon the nascent computer industry by established and more dominant companies that were used to talking about physical units that relate to concrete physical quantities. I'm beginning to wonder how one would go about correcting the SI standard so as not to introduce inaccuracies of measurement in the computer industry.
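The numeric gap the thread is arguing over can be made concrete with a short sketch (mine, not from the thread, in Python): the same "512" figure interpreted under decimal SI prefixes versus binary IEC prefixes, matching the 536870912 bytes (512 MiB) that dd printed above.

```python
# Sketch: the "512M" ambiguity from the thread.
# SI (standards notation): M = 10^6, so 512M = 512000000.
# IEC (binary prefixes):  Mi = 2^20, so 512 MiB = 536870912, dd's byte count.

SI = {"K": 10**3, "M": 10**6, "G": 10**9}
IEC = {"Ki": 2**10, "Mi": 2**20, "Gi": 2**30}

def to_bytes(value, prefix, table):
    """Convert a value with a unit prefix into a byte count."""
    return value * table[prefix]

print(to_bytes(512, "M", SI))    # 512000000
print(to_bytes(512, "Mi", IEC))  # 536870912 -- matches dd's output
```

The roughly 5% difference between the two readings (and up to 7% at the Gi level) is why the patch's "512 MiB" label is unambiguous where a bare "512 M" is not.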