From: Cleber Rosa
Subject: Re: [Qemu-arm] [PATCH 2/2] BootLinuxConsoleTest: Test the SmartFusion2 board
Date: Thu, 23 May 2019 10:07:07 -0400 (EDT)


----- Original Message -----
> From: "Philippe Mathieu-Daudé" <address@hidden>
> To: address@hidden
> Cc: "Eduardo Habkost" <address@hidden>, "Caio Carrara" <address@hidden>,
>     "Alistair Francis" <address@hidden>, "Subbaraya Sundeep" <address@hidden>,
>     address@hidden, "Cleber Rosa" <address@hidden>,
>     "Peter Maydell" <address@hidden>, "Philippe Mathieu-Daudé" <address@hidden>
> Sent: Monday, May 20, 2019 6:06:35 PM
> Subject: [PATCH 2/2] BootLinuxConsoleTest: Test the SmartFusion2 board
> 
> Similar to the x86_64/pc test, it boots a Linux kernel on an
> Emcraft board and verifies that the serial console is working.
> 
> If ARM is a target being built, "make check-acceptance" will
> automatically include this test by the use of the "arch:arm" tags.
> 
> Alternatively, this test can be run using:
> 
>   $ avocado run -t arch:arm tests/acceptance
>   $ avocado run -t machine:emcraft-sf2 tests/acceptance
> 
> Based on the recommended test setup from Subbaraya Sundeep:
> https://lists.gnu.org/archive/html/qemu-devel/2017-05/msg03810.html
> 
> Signed-off-by: Philippe Mathieu-Daudé <address@hidden>
> ---
>  tests/acceptance/boot_linux_console.py | 27 ++++++++++++++++++++++++++
>  1 file changed, 27 insertions(+)
> 
> diff --git a/tests/acceptance/boot_linux_console.py b/tests/acceptance/boot_linux_console.py
> index f593f3858e..844cb80bb5 100644
> --- a/tests/acceptance/boot_linux_console.py
> +++ b/tests/acceptance/boot_linux_console.py
> @@ -178,6 +178,33 @@ class BootLinuxConsole(Test):
>          console_pattern = 'Kernel command line: %s' % kernel_command_line
>          self.wait_for_console_pattern(console_pattern)
>  
> +    def test_arm_emcraft_sf2(self):
> +        """
> +        :avocado: tags=arch:arm
> +        :avocado: tags=machine:emcraft-sf2
> +        :avocado: tags=endian:little
> +        """
> +        uboot_url = ('https://raw.githubusercontent.com/'
> +                     'Subbaraya-Sundeep/qemu-test-binaries/'
> +                     'fa030bd77a014a0b8e360d3b7011df89283a2f0b/u-boot')
> +        uboot_hash = 'abba5d9c24cdd2d49cdc2a8aa92976cf20737eff'
> +        uboot_path = self.fetch_asset(uboot_url, asset_hash=uboot_hash)
> +        spi_url = ('https://raw.githubusercontent.com/'
> +                   'Subbaraya-Sundeep/qemu-test-binaries/'
> +                   'fa030bd77a014a0b8e360d3b7011df89283a2f0b/spi.bin')
> +        spi_hash = '85f698329d38de63aea6e884a86fbde70890a78a'
> +        spi_path = self.fetch_asset(spi_url, asset_hash=spi_hash)
> +
> +        self.vm.set_machine('emcraft-sf2')
> +        self.vm.set_console()
> +        kernel_command_line = self.KERNEL_COMMON_COMMAND_LINE

The kernel_command_line variable is not needed...

> +        self.vm.add_args('-kernel', uboot_path,
> +                         '-append', kernel_command_line,

... and you can just use self.KERNEL_COMMON_COMMAND_LINE here.

> +                         '-drive', 'file=' + spi_path + ',if=mtd,format=raw',

Nitpick: it's more Pythonic to format strings than to concatenate
them.
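
Putting the two suggestions above together, the add_args() call could
end up looking something like this (just a rough sketch, untested):

    self.vm.add_args('-kernel', uboot_path,
                     '-append', self.KERNEL_COMMON_COMMAND_LINE,
                     '-drive', 'file=%s,if=mtd,format=raw' % spi_path,
                     '-no-reboot')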

> +                         '-no-reboot')
> +        self.vm.launch()
> +        self.wait_for_console_pattern('init started: BusyBox')

Another nitpick: given that the image is pinned down, maybe attempt to
match against the entire line?

  init started: BusyBox v1.24.1 (2017-05-15 09:57:00 IST)
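
In code, that would be roughly (untested, and assuming the banner
printed by the pinned image never changes):

    self.wait_for_console_pattern(
        'init started: BusyBox v1.24.1 (2017-05-15 09:57:00 IST)')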

> +
>      def test_s390x_s390_ccw_virtio(self):
>          """
>          :avocado: tags=arch:s390x
> --
> 2.19.1
> 
> 

Because of the other discussions about tests and timeouts, this is
what I get on a first run (having to download the images):

 (1/1) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (27.49 s)

And then, a pretty consistent < 8s mark using the resources from
cache:

 (01/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.55 s)
 (02/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.35 s)
 (03/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.40 s)
 (04/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.60 s)
 (05/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.62 s)
 (06/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.33 s)
 (07/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.35 s)
 (08/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (6.38 s)
 (09/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.39 s)
 (10/10) tests/acceptance/boot_linux_console.py:BootLinuxConsole.test_arm_emcraft_sf2: PASS (7.44 s)

It'd be nice to know the average test execution time on real/typical
hardware, and with that, start adjusting the expected timeouts.
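
For the record, once we have those numbers, Avocado lets a test class
define a timeout (in seconds) as a class attribute, so the adjustment
could be as simple as something like this (the value below is a
placeholder, not a real measurement):

    class BootLinuxConsole(Test):
        # placeholder; tune once typical run times are known
        timeout = 60

Anyway: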

Tested-by: Cleber Rosa <address@hidden>


