TY - GEN
T1 - Consistency conditions for inductive inference of recursive functions
AU - Akama, Yohji
AU - Zeugmann, Thomas
PY - 2007
Y1 - 2007
N2 - A consistent learner is required to correctly and completely reflect in its current hypothesis all data received so far. Though this demand sounds quite plausible, it may lead to the unsolvability of the learning problem. Therefore, in the present paper several variations of consistent learning are introduced and studied. These variations allow a so-called δ-delay, relaxing the consistency demand to all but the last δ data. Additionally, we introduce the notion of coherent learning (again with δ-delay), requiring the learner to correctly reflect only the last datum seen (the (n−δ)th datum, respectively). Our results are threefold. First, it is shown that all models of coherent learning with δ-delay are exactly as powerful as their corresponding consistent learning models with δ-delay. Second, we provide characterizations for consistent learning with δ-delay in terms of complexity. Finally, we establish strict hierarchies for all consistent learning models with δ-delay depending on δ.
AB - A consistent learner is required to correctly and completely reflect in its current hypothesis all data received so far. Though this demand sounds quite plausible, it may lead to the unsolvability of the learning problem. Therefore, in the present paper several variations of consistent learning are introduced and studied. These variations allow a so-called δ-delay, relaxing the consistency demand to all but the last δ data. Additionally, we introduce the notion of coherent learning (again with δ-delay), requiring the learner to correctly reflect only the last datum seen (the (n−δ)th datum, respectively). Our results are threefold. First, it is shown that all models of coherent learning with δ-delay are exactly as powerful as their corresponding consistent learning models with δ-delay. Second, we provide characterizations for consistent learning with δ-delay in terms of complexity. Finally, we establish strict hierarchies for all consistent learning models with δ-delay depending on δ.
UR - http://www.scopus.com/inward/record.url?scp=34548101679&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=34548101679&partnerID=8YFLogxK
U2 - 10.1007/978-3-540-69902-6_22
DO - 10.1007/978-3-540-69902-6_22
M3 - Conference contribution
AN - SCOPUS:34548101679
SN - 3540699015
SN - 9783540699019
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 251
EP - 264
BT - New Frontiers in Artificial Intelligence - JSAI 2006 Conference and Workshops, Revised Selected Papers
PB - Springer Verlag
T2 - 20th Annual Conference of the Japanese Society for Artificial Intelligence, JSAI 2006 Conference and Workshops
Y2 - 5 June 2006 through 9 June 2006
ER -