Saturday, July 30, 2011

Quiescence Search Trouble

While cleaning up my code base I realized that the quiescence search (qSearch) did not behave the way I intended. In qSearch I don't want to search captures that lose material. I generate all captures for a given position and perform a static exchange evaluation (SEE) on each of them. If the exchange reveals that the capture loses material, I delete that capture from the list.

But my implementation had an error.

for (i=0; i < ml->moveCnt; i++)
{
   if ((see(ml->moves[i]) < 0)) ml->deleteElementAt(i);
}
   
This code is problematic because the deleteElementAt(idx) function performs the deletion by copying the last element onto position idx and decrementing the moveCnt property.

So the last element of the list now sits at the position of the deleted losing capture, but the loop counter i is incremented right after that operation, so this element is never checked to see whether it is also losing. As a result the list may still contain losing captures, which makes qSearch examine more moves, and therefore more nodes, than intended.
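The corrected loop only advances the counter when nothing was deleted, so the element swapped into the current slot is re-examined. A minimal sketch (MoveList and see() are simplified stand-ins here, not the engine's real types):

```cpp
#include <vector>

// Simplified stand-in for the engine's move list: a move is represented
// by its SEE score only, which is all this sketch needs.
struct MoveList {
    std::vector<int> moves;
    int moveCnt = 0;
    void deleteElementAt(int idx) {   // swap-with-last deletion, as described
        moves[idx] = moves[--moveCnt];
    }
};

int see(int move) { return move; }    // stand-in: the move *is* its SEE value

void pruneLosingCaptures(MoveList* ml) {
    // Only advance i when slot i survived; after a deletion the element
    // swapped into slot i must be examined on the next iteration.
    for (int i = 0; i < ml->moveCnt; /* no increment here */) {
        if (see(ml->moves[i]) < 0)
            ml->deleteElementAt(i);   // re-check slot i next time around
        else
            ++i;
    }
}
```

An equivalent fix is to iterate from the back of the list, since a swap from the end then only touches slots that were already checked.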

I fixed that error, but to my surprise the fixed version played weaker than the buggy one. Apparently the inclusion of some losing captures (selected more or less at random) let the engine spot tactics it would otherwise have missed, and this effect more than compensated for the cost of the additional nodes.

I ran tests on positions where the fixed and the buggy version reported different results and tried to figure out which losing captures might be worth searching, but I did not come up with a good rule. Searching all losing captures also gave a weaker engine.

But I was not willing to keep the buggy code in the engine. So I did a complete rewrite of qSearch and also improved the static exchange evaluation to report more accurate results (it is now aware of pieces pinned to the king). This version is now as strong as the buggy one, but gives me a better feeling.


So this was obviously the rare exception to the "bugs hurt playing strength" rule. 

Tuesday, July 12, 2011

Probabilities and bad estimates

I'm somehow not good at estimating probabilities. I recently implemented some more caching in the engine, and for space reasons I only saved part of the 64-bit board identifier.

I thought that using 2^20 hash slots, which implicitly verify 20 bits of the key through the slot number alone, plus 20 more stored bits of the key would be enough to safely detect collisions, where two different boards map to the same hash slot. But I was wrong. In my very first verification run, a 12-ply search over 25 positions, an undetected collision crashed the engine: it returned a move from another board that was not legal on the actual board. It tried to capture its own pawn with a rook.


I knew there was a remote probability that this could happen (like once in a thousand years), but since it happened so quickly I must have greatly misjudged that probability.
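In hindsight this is just the birthday paradox: among n distinct positions checked against k verified key bits, the expected number of colliding pairs is roughly n(n-1)/2 divided by 2^k. A quick back-of-the-envelope estimate (the node counts below are illustrative guesses, not measured figures):

```cpp
#include <cmath>

// Birthday-paradox estimate: expected number of colliding pairs among
// n distinct positions when only `bits` key bits are verified.
double expectedCollisions(double n, int bits) {
    return n * (n - 1.0) / 2.0 / std::ldexp(1.0, bits);  // ldexp(1,k) == 2^k
}
```

A 12-ply search over 25 positions can easily visit tens of millions of nodes; with n = 5e7 and 40 verified bits this gives on the order of a thousand expected collisions, so hitting one almost immediately is no surprise at all. The "once in a thousand years" intuition only applies to a single pair of positions, not to millions of them.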

Looks like I need additional collision detection code for that.
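One cheap safeguard is to validate the retrieved move against the current board before trusting it, so a false key match can never produce an illegal move. A minimal sketch (the struct, the bit layout, and isLegalOnBoard are simplified placeholders, not the engine's actual code):

```cpp
#include <cstdint>
#include <functional>

// Placeholder hash entry: the 20 extra key bits that are stored,
// plus the cached move.
struct HashEntry {
    uint32_t partialKey;  // bits 20..39 of the full 64-bit key
    int move;
};

// Probe that treats a legality failure like a miss. isLegalOnBoard stands
// in for the engine's real move-legality check on the current position.
// Returns the cached move, or -1 on a miss.
int probe(const HashEntry& e, uint64_t fullKey,
          const std::function<bool(int)>& isLegalOnBoard) {
    uint32_t expect = (uint32_t)((fullKey >> 20) & 0xFFFFF);
    if (e.partialKey != expect) return -1;   // partial key mismatch: miss
    if (!isLegalOnBoard(e.move)) return -1;  // collision slipped through
    return e.move;                           // safe to use
}
```

The legality check costs a little time on every hit, but it turns a crash (or a corrupted search) into a harmless cache miss.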

Monday, July 4, 2011

New CCC Tournament

Now that Ice is public it is sometimes chosen by engine testers for their tournament setups. It usually plays in the lower or lowest league, but it's interesting for me anyway to see how it performs.

In the last special tournament it played against a lot of engines rated higher than itself (Ice is rated about 2200 Elo), but it played pretty well. It finished 2nd out of 16 engines and gained 50 Elo in that tournament. This makes me a bit proud and encourages me to enhance it further.

Final Standings

22.0 - Jazz 444
21.0 - Ice 0.1
19.5 - BikJump 2.01
19.0 - FireFly 2.5.10
18.0 - KMTChess 1.2.1 32-bit
17.5 - ECE 11.01
16.5 - Protej 0.5.7
15.0 - Prophet 2.0
14.5 - Carballo 0.5 32-bit
14.0 - Chesley r323
13.5 - Kurt 0.9.2 32-bit
13.0 - Parrot 07.07.22
13.0 - Clarabit 1.00
8.0 - PLP 1661571
8.0 - Bubble 1.5
7.5 - Jabba 1.0


http://talkchess.com/forum/viewtopic.php?topic_view=threads&p=410401&t=39210