
Re: [Bug-gnubg] An old bug and possible neural nets input change


From: Øystein Schønning-Johansen
Subject: Re: [Bug-gnubg] An old bug and possible neural nets input change
Date: Thu, 6 Jun 2019 02:20:33 +0200

On Thu, Jun 6, 2019 at 12:21 AM Philippe Michel <address@hidden> wrote:
Interesting diagram. I suppose the hc_0 and hc_1 prefixes mean the
inputs are for the player on roll and for the other respectively?

Yes! hc is for "hand crafted" features; the cleaned dataset I used also has features
prefixed by "b" for "base inputs". 0 is for the player on roll, and 1 is for the opponent.
 
The sum of the values seems to be about 0.5. Should the sum be 1 and the
basic inputs account for the complement? If this is the case, do you
have the actual sum for the listed inputs?

No, the diagram is not normalized. The abscissa is the drop in error rate (I think
the metric is mean_absolute_error) when the samples of that feature are randomized.
I can see whether I can make a corresponding plot for the base inputs.
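
(What I do is essentially permutation feature importance. A minimal Python sketch of the
idea, assuming some trained model with a predict() method and arrays X, y; the names here
are hypothetical, not the actual training code:)

import numpy as np

def permutation_importance(model, X, y, n_repeats=5, seed=0):
    """Mean increase in MAE per feature when that feature's column is shuffled."""
    rng = np.random.default_rng(seed)
    baseline = np.mean(np.abs(model.predict(X) - y))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        scores = []
        for _ in range(n_repeats):
            X_perm = X.copy()
            rng.shuffle(X_perm[:, j])  # randomize only feature j
            scores.append(np.mean(np.abs(model.predict(X_perm) - y)))
        importances[j] = np.mean(scores) - baseline  # how much worse the net gets
    return importances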
 
PIPLOSS for one of the players is indeed at the top of the list, but
aggregated for both players MOBILITY seems to be about equal. The second
one, ENTER, is interesting: it is 0 when the player doesn't have a man on
the bar, so when it matters it may well be the most important input.

Sure. I'll keep that in mind.
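
(Aggregating over both players is straightforward once the importances are computed;
a small sketch with made-up numbers and a hypothetical hc_0_/hc_1_ naming scheme:)

from collections import defaultdict

# Illustrative per-feature importances; the values are made up,
# and the exact hc_0_/hc_1_ key format is an assumption.
importance = {
    "hc_0_PIPLOSS": 0.12, "hc_1_PIPLOSS": 0.02,
    "hc_0_MOBILITY": 0.06, "hc_1_MOBILITY": 0.06,
}

aggregated = defaultdict(float)
for name, value in importance.items():
    feature = name.split("_", 2)[2]  # drop the hc_0_/hc_1_ player prefix
    aggregated[feature] += value

for feature, total in sorted(aggregated.items(), key=lambda kv: -kv[1]):
    print(feature, round(total, 3))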
 
The first random ideas that come to mind:

- could it be worthwhile to add a few of the complex features like MOBILITY and
ENTER to the pruning nets?

- the paper by Berliner mentioned by Ian in another follow-up describes
his efforts to improve PIPLOSS from a simple but not that good algorithm
to more or less what we have. Since, according to your analysis, the
value of this input is very asymmetrical between the players, maybe a
simpler version could be used for at least one of them. That's assuming
one can come up with something approximate that is much faster but not
much less accurate.

Sure, I guess we can also look at all the different features, and maybe even come
up with some new ideas. Having a good dataset like the one you have gathered (thanks, Philippe!)
makes it really simple to train new neural networks, so we can try out a lot of features. I can
now train a contact neural network within a few minutes.
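
(For context, with the dataset in NumPy arrays the whole training run is a few lines.
A minimal Keras sketch; the file names, layer width and output shape below are
assumptions, not the real setup:)

import numpy as np
from tensorflow import keras

# Hypothetical files holding the cleaned dataset as arrays.
X = np.load("contact_inputs.npy")    # (n_positions, n_features)
y = np.load("contact_targets.npy")   # (n_positions, n_outputs), e.g. outcome probabilities

# A small one-hidden-layer net, roughly the shape of gnubg's evaluation nets.
model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),
    keras.layers.Dense(128, activation="sigmoid"),
    keras.layers.Dense(y.shape[1], activation="sigmoid"),
])
model.compile(optimizer="adam", loss="mean_absolute_error")
model.fit(X, y, batch_size=256, epochs=10, validation_split=0.1)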

Best regards,
-Øystein
