Title: ON LEARNING DECISION TREES WITH LARGE OUTPUT DOMAINS
Authors: Bshouty, N.
Tamon, C.
Wilson, D.
Keywords: Computer Science
Issue Date: 1-May-1995
Abstract: For two disjoint sets of variables, X and Y, and a class of functions C, we define DT(X,Y,C) to be the class of all decision trees over X whose leaves are functions from C over Y. We study the learnability of DT(X,Y,C) using membership and equivalence queries. Boolean decision trees, DT(X,∅,{0,1}), were shown to be exactly learnable in [Bs93], but does this imply the learnability of decision trees with non-boolean leaves? A simple encoding of all possible leaf values works provided that the size of C is reasonable. Our investigation covers several cases where such an encoding is not feasible, i.e., when |C| is large. We show how to learn decision trees whose leaves are learnable concepts belonging to a class C, DT(X,Y,C), when the separation between the variables X and Y is known. A simple algorithm for decision trees whose leaves are constants, DT(X,∅,C), is also presented. Each of the above cases requires at least s separate executions of the algorithm from [Bs93], where s is the number of distinct leaves of the tree, but we show that if C is a bounded lattice, DT(X,∅,C) is learnable using only one execution of this algorithm.
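The object defined in the abstract, DT(X,Y,C), can be made concrete with a small sketch: internal nodes test variables from X, and each leaf holds a function over the disjoint variable set Y drawn from some class C. The code below is an illustration of the definition only, not of the learning algorithm from the paper; the choice of parities as the leaf class C is a hypothetical example.

```python
class Node:
    """A decision tree in DT(X, Y, C): internal nodes branch on a
    variable from X; leaves hold a function over Y drawn from C."""

    def __init__(self, var=None, low=None, high=None, leaf=None):
        self.var = var    # index into the X-assignment (internal node)
        self.low = low    # subtree taken when x[var] == 0
        self.high = high  # subtree taken when x[var] == 1
        self.leaf = leaf  # function y -> value from C (leaf node)

    def evaluate(self, x, y):
        # Walk down using the X-assignment, then apply the leaf
        # function to the Y-assignment.
        if self.leaf is not None:
            return self.leaf(y)
        child = self.high if x[self.var] else self.low
        return child.evaluate(x, y)


# Example tree: one test on x[0]; leaves are a parity over Y and a
# constant (both hypothetical members of C).
parity01 = lambda y: y[0] ^ y[1]
const1 = lambda y: 1
tree = Node(var=0, low=Node(leaf=const1), high=Node(leaf=parity01))

print(tree.evaluate([1], [1, 0]))  # x[0]=1 -> parity(1,0) = 1
print(tree.evaluate([0], [1, 0]))  # x[0]=0 -> constant leaf = 1
```

Note that when Y is empty and C = {0,1}, the leaf functions degenerate to boolean constants and this reduces to an ordinary boolean decision tree, the case DT(X,∅,{0,1}) handled in [Bs93].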
URI: http://hdl.handle.net/1880/45750
Appears in Collections:Bshouty, Nader

Files in This Item:
File: 1995-567-19.pdf (253.86 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.