1. Title: Tic-Tac-Toe Endgame Database

2. Source Information
   - Creator: David W. Aha (aha@cs.jhu.edu)
   - Donor: David W. Aha (aha@cs.jhu.edu)
   - Date: 19 August 1991

3. Known Past Usage:
   1. Matheus, C. J., & Rendell, L. A. (1989). Constructive induction on
      decision trees. In {\it Proceedings of the Eleventh International
      Joint Conference on Artificial Intelligence} (pp. 645-650).
      Detroit, MI: Morgan Kaufmann.
      - CITRE was applied to 100-instance training and 200-instance test
        sets in a series of studies using various amounts of
        domain-specific knowledge. Its highest average accuracy was 76.7%
        (using the final decision tree created for testing).
   2. Matheus, C. J. (1990). Adding domain knowledge to SBL through
      feature construction. In {\it Proceedings of the Eighth National
      Conference on Artificial Intelligence} (pp. 803-808).
      Boston, MA: AAAI Press.
      - Similar experiments with CITRE, using learning curves up to
        500-instance training sets but using _all_ of the remaining
        database for testing. Accuracies were greater than 90%, but
        specific values were not given. (See Chris Matheus's thesis for
        more details.)
   3. Aha, D. W. (1991). Incremental constructive induction: An
      instance-based approach. In {\it Proceedings of the Eighth
      International Workshop on Machine Learning} (pp. 117-121).
      Evanston, IL: Morgan Kaufmann.
      - Used 70% of the instances for training and 30% for testing,
        evaluated over 10 trials. Results reported for six algorithms:
        - NewID:   84.0%
        - CN2:     98.1%
        - MBRtalk: 88.4%
        - IB1:     98.1%
        - IB3:     82.0%
        - IB3-CI:  99.1%
      - Results were also reported after adding 10 irrelevant
        ternary-valued attributes; the _relative_ results were similar,
        except that IB1's performance degraded more quickly than the
        others'.

4. Relevant Information: This database encodes the complete set of
   possible board configurations at the end of tic-tac-toe games, where
   "x" is assumed to have played first. The target concept is "win for x"
   (i.e., true when "x" has achieved one of the 8 possible
   three-in-a-row configurations). Interestingly, this raw database gives
   a stripped-down decision tree algorithm (e.g., ID3) fits. However, the
   rule-learning algorithm CN2, the simple instance-based learning
   algorithm IB1, and the feature-constructing decision tree algorithm
   CITRE all perform well on it.

5. Number of Instances: 958 (legal tic-tac-toe endgame boards)

6. Number of Attributes: 9, each corresponding to one square of the
   tic-tac-toe grid

7. Attribute Information: (x = player x has taken this square,
                           o = player o has taken this square, b = blank)
   1. top-left-square: {x,o,b}
   2. top-middle-square: {x,o,b}
   3. top-right-square: {x,o,b}
   4. middle-left-square: {x,o,b}
   5. middle-middle-square: {x,o,b}
   6. middle-right-square: {x,o,b}
   7. bottom-left-square: {x,o,b}
   8. bottom-middle-square: {x,o,b}
   9. bottom-right-square: {x,o,b}
   10. Class: {positive,negative}

8. Missing Attribute Values: None

9. Class Distribution: About 65.3% are positive (i.e., wins for "x")
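The stated counts (958 legal endgame boards, about 65.3% of them wins for "x") can be checked by enumerating every legal game, with "x" moving first and play stopping as soon as one player completes a line or the board fills, and collecting the distinct final boards. The following Python sketch is illustrative only, not the original generator used to build the database:

```python
# Enumerate every legal tic-tac-toe game (x moves first) and collect the
# distinct final boards: play stops as soon as a player completes a line
# or the board is full.  Illustrative sketch, not the original generator.

# The 8 possible three-in-a-row lines, indexing the squares 0-8 in the
# top-left to bottom-right order used for the attributes.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'x' or 'o' if that player has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] != 'b' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def endgame_boards():
    """All distinct legal end-of-game boards, as 9-tuples over {x, o, b}."""
    finals = set()

    def play(board, player):
        if winner(board) or 'b' not in board:   # game over
            finals.add(tuple(board))
            return
        for i in range(9):
            if board[i] == 'b':
                board[i] = player
                play(board, 'o' if player == 'x' else 'x')
                board[i] = 'b'                  # undo the move

    play(['b'] * 9, 'x')                        # "x" plays first
    return finals

boards = endgame_boards()
positive = sum(1 for b in boards if winner(b) == 'x')
print(len(boards), positive)    # 958 boards, 626 (about 65.3%) positive
```

The enumeration reproduces both the instance count and the class distribution: 626/958 ≈ 65.3% of the endgame boards are positive.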
