guile-user

"Pace is nothing without guile"


From: Neil Jerram
Subject: "Pace is nothing without guile"
Date: Sun, 13 Jul 2008 18:06:17 +0100

... That's a comment from coverage of the current England v South
Africa cricket match
(http://uk.cricinfo.com/talk/content/current/multimedia/360921.html).

But is Guile nothing without pace?

Well, obviously it isn't "nothing", but I think Guile is perceived,
among both Scheme implementations and free scripting languages, as
being a bit slow, and a large part of the reason for this is that we
have no systematic benchmarking.

So this email is about systematic performance data.  I was wondering
what benchmarks we could run to get good coverage of all of Guile's
functionality, and suddenly thought "of course, the test suite!"  The
test suite should, by definition, provide coverage of everything that
we care about.  Therefore I think that we should be able to start
collecting a lot of useful performance data by implementing a version
of "make check" that measures and stores the time that each test
takes to run.
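To make the idea concrete, here is a rough sketch of how a per-test
timer might look, using Guile's built-in internal timers.  The
`run-numbers-test` name is purely hypothetical — it stands in for
whatever per-test entry point the test suite actually exposes:

(use-modules (ice-9 format))

;; Run THUNK, print how long it took, and return its result.
;; A sketch only -- the real hook would live inside the test driver.
(define (timed-test name thunk)
  (let* ((start (get-internal-real-time))
         (result (thunk))
         (elapsed (/ (- (get-internal-real-time) start)
                     internal-time-units-per-second)))
    (format #t "~a: ~,3f seconds~%" name (exact->inexact elapsed))
    result))

;; e.g.  (timed-test "numbers.test" run-numbers-test)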

What I'd like input and advice on is exactly how we store and collate
such data.  I think the system should ideally support:

- arbitrary later analysis of the collected data

- correlation of the result for a specific test with the exact source
code of that test at the time it was run...

- ...and hence, being able to work out (later) that the results
changed because the content of the test changed

- anyone running the tests and uploading data, not just Guile core developers

- associating a set of results with the relevant information about the
machine that they were obtained on (CPUs, RAM) in such a way that the
information is trustworthy, but without invading the privacy of the
uploader.

So how do we do that?  Perhaps each test's content could be identified
by its Git (SHA-1) hash, together with the path of the repo containing
that version.  And I imagine that the results could take the form of a
file containing lines like:

("numbers.test" SHA1-HASH REPO-PATH DATE+TIME MACHINE-INFO MEASURED-DURATION)

That would allow sets of results to be concatenated for later
analysis.  But I'm not sure what the relevant MACHINE-INFO is and how
to represent that.
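As a starting point, here is a hedged sketch of emitting one such
result line from Guile, using uname output as a stand-in for
MACHINE-INFO (the real field set — CPUs, RAM and so on — is exactly
the open question above):

;; Sketch only: writes one result record in the proposed format.
;; The MACHINE-INFO field here is just (machine sysname) from uname;
;; a real implementation would need to settle on richer fields.
(define (write-result port test-file sha1 repo duration)
  (let ((un (uname)))
    (write (list test-file sha1 repo
                 (strftime "%Y-%m-%dT%H:%M:%S" (localtime (current-time)))
                 (list (utsname:machine un) (utsname:sysname un))
                 duration)
           port)
    (newline port)))

Because each record is a plain Scheme list written with `write`, result
files from different uploaders can be concatenated and read back with
`read` for later analysis.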

Any thoughts / comments / ideas?  Thanks for reading!

      Neil



