Re: [Qemu-ppc] [PATCHv2 08/11] tests: Clean up IO handling in ide-test
From: David Gibson
Subject: Re: [Qemu-ppc] [PATCHv2 08/11] tests: Clean up IO handling in ide-test
Date: Thu, 20 Oct 2016 14:24:11 +1100
User-agent: Mutt/1.7.0 (2016-08-17)
On Wed, Oct 19, 2016 at 04:51:41PM +0200, Laurent Vivier wrote:
>
>
> On 19/10/2016 16:43, Laurent Vivier wrote:
> >
> >
> > On 19/10/2016 14:25, David Gibson wrote:
> >> ide-test uses many explicit inb() / outb() operations for its IO, which
> >> means it's not portable to non-x86 platforms. This cleans it up to use
> >> the libqos PCI accessors instead.
> >>
> >> Signed-off-by: David Gibson <address@hidden>
> >> ---
> >>  tests/ide-test.c | 179 ++++++++++++++++++++++++++++++++++++-------------------
> >>  1 file changed, 118 insertions(+), 61 deletions(-)
> >
> > Could you explain why you have swapped the le16_to_cpu() and cpu_to_le16()?
> >
> > For me, they were correct.
>
> And I have just finished testing your series on a BE host, and ide-test
> is broken:
>
> TEST: tests/ide-test... (pid=12472)
> /i386/ide/identify: **
> ERROR:/home/laurent/Projects/qemu/tests/ide-test.c:518:test_identify:
> assertion failed: (ret == 0)
> FAIL
Ah, thanks for testing this.
> You should not add the cpu_to_le16():
>
>      for (i = 0; i < 256; i++) {
> -        data = inb(IDE_BASE + reg_status);
> +        data = qpci_io_readb(dev, ide_base + reg_status);
>          assert_bit_set(data, DRDY | DRQ);
>          assert_bit_clear(data, BSY | DF | ERR);
>
> -        ((uint16_t*) buf)[i] = inw(IDE_BASE + reg_data);
> +        buf[i] = cpu_to_le16(qpci_io_readw(dev, ide_base + reg_data));
>      }
Urgh, the endianness here is doing my head in.
So, the cpu_to_le16() was supposed to counteract the implicit
conversion from LE inside qpci_io_readw. We're reading from the data
register here, and those are usually "streaming" style, meaning that
we want byte-order preserving rather than byte-significance
preserving.
But.. the IDENTIFY command describes most of the output in terms of
(16-bit) words, meaning I guess we do want to swap from LE in order to
interpret those. Except that the string portions seem to be encoded
strangely, which the later string_cpu_to_be16() calls are about.
In summary, I think you're right and the cpu_to_le16() shouldn't be
there. Seems to work on a BE host, anyway.
--
David Gibson | I'll have my music baroque, and my code
david AT gibson.dropbear.id.au | minimalist, thank you. NOT _the_ _other_
| _way_ _around_!
http://www.ozlabs.org/~dgibson