Browsing by Author "Bshouty, N."
Item (Open Access): ON LEARNING DECISION TREES WITH LARGE OUTPUT DOMAINS (1995-05-01)
Authors: Bshouty, N.; Tamon, C.; Wilson, D.
Abstract: For two disjoint sets of variables, X and Y, and a class of functions C, we define DT(X,Y,C) to be the class of all decision trees over X whose leaves are functions from C over Y. We study the learnability of DT(X,Y,C) using membership and equivalence queries. Boolean decision trees, DT(X,∅,{0,1}), were shown to be exactly learnable in [Bs93], but does this imply the learnability of decision trees that have non-boolean leaves? A simple encoding of all possible leaf values will work provided that the size of C is reasonable. Our investigation involves several cases where simple encoding is not feasible, i.e., when |C| is large. We show how to learn decision trees whose leaves are learnable concepts belonging to a class C, DT(X,Y,C), when the separation between the variables X and Y is known. A simple algorithm for decision trees whose leaves are constants, DT(X,∅,C), is also presented. Each case above requires at least s separate executions of the algorithm from [Bs93], where s is the number of distinct leaves of the tree, but we show that if C is a bounded lattice, DT(X,∅,C) is learnable using only one execution of this algorithm.

Item (Open Access): ON THE FOURIER SPECTRUM OF MONOTONE FUNCTIONS (1995-04-01)
Authors: Bshouty, N.; Tamon, C.
Abstract: We study the Fourier spectrum of monotone boolean functions. Our main result is that any monotone boolean function of n inputs has most of its power spectrum on the coefficients of "degree" Õ(√n) under any product distribution. As a consequence, we derive a subexponential PAC learning algorithm over product distributions and a subexponential-size, sublinear-depth non-monotone circuit approximation for unrestricted monotone boolean functions.
Our other results include a new measure of complexity for distributions, called convex dimension, and several efficient PAC learning algorithms for subclasses of monotone boolean functions.

Item (Open Access): A TIGHT BOUND FOR APPROXIMATING THE SQUARE ROOT (1995-05-01)
Authors: Bshouty, N.; Mansour, Y.; Tiwari, P.; Schieber, B.
Abstract: We prove an Ω(log log(1/ε)) lower bound on the depth of any computation tree and any RAM program with operations {+, -, *, /, ⌊·⌋, not, and, or, xor}, unlimited power of answering YES/NO questions, and constants {0,1} that computes √x to accuracy ε, for all x ∈ [1,2]. Since Newton's method achieves such an accuracy in O(log log(1/ε)) depth, our bound is tight.
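The class DT(X,Y,C) from the first abstract can be made concrete with a toy evaluator. This is a hypothetical illustration of what an object in that class looks like, not the learning algorithm from [Bs93]; the particular tree and leaf functions are invented for the example.

```python
# A minimal sketch of a member of DT(X, Y, C): internal nodes branch on
# variables in X, and each leaf holds a function over the disjoint set Y.
# (Illustrative only; the tree and leaves below are made up.)

def make_leaf(f):
    """Wrap a leaf function over the Y-variables."""
    return ("leaf", f)

def make_node(x_index, left, right):
    """Internal node that branches on X-variable x_index."""
    return ("node", x_index, left, right)

def evaluate(tree, x, y):
    """Walk the tree using the X-assignment, then apply the reached leaf to Y."""
    while tree[0] == "node":
        _, i, left, right = tree
        tree = right if x[i] else left
    return tree[1](y)

# Example: C is the class of parities/conjunctions over Y, branching on x[0].
tree = make_node(0,
                 make_leaf(lambda y: y[0] ^ y[1]),   # parity of y0, y1
                 make_leaf(lambda y: y[0] & y[1]))   # AND of y0, y1

print(evaluate(tree, x=[0], y=[1, 0]))  # left leaf: 1 ^ 0 = 1
print(evaluate(tree, x=[1], y=[1, 0]))  # right leaf: 1 & 0 = 0
```

Note how the "output domain" of the tree is as large as the leaf class C, which is why simple leaf-value encoding breaks down when |C| is large.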
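The spectral concentration claimed in the second abstract can be checked by hand on a tiny monotone function. The sketch below computes the Fourier power spectrum of 3-bit majority under the uniform distribution (one product distribution); `maj3` and the brute-force coefficient routine are example choices, not part of the paper.

```python
from itertools import product

def fourier_coefficient(f, n, S):
    """f_hat(S) = E_x[f(x) * prod_{i in S} x_i] under the uniform distribution."""
    total = 0.0
    for x in product([-1, 1], repeat=n):
        chi = 1
        for i in S:
            chi *= x[i]
        total += f(x) * chi
    return total / 2**n

def maj3(x):
    """Majority of three ±1 inputs: a monotone boolean function."""
    return 1 if sum(x) > 0 else -1

n = 3
# Total power spectrum weight at each degree |S|.
weight = [0.0] * (n + 1)
for mask in range(2**n):
    S = [i for i in range(n) if mask >> i & 1]
    weight[len(S)] += fourier_coefficient(maj3, n, S) ** 2

print(weight)  # → [0.0, 0.75, 0.0, 0.25]
```

Here 75% of the power sits at degree 1, consistent with concentration on degrees up to about √n; by Parseval's identity the weights sum to 1 for a ±1-valued function.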
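The upper-bound side of the third abstract, that Newton's method reaches accuracy ε in O(log log(1/ε)) steps, is easy to see numerically: each iteration roughly doubles the number of correct digits. A small sketch (starting point and test values chosen for illustration):

```python
import math

def newton_sqrt(x, iterations):
    """Newton iteration z <- (z + x/z) / 2 for sqrt(x), with x in [1, 2]."""
    z = x  # any starting point in [1, 2] converges
    for _ in range(iterations):
        z = (z + x / z) / 2
    return z

# Quadratic convergence: the error is roughly squared at each step,
# so k = O(log log(1/eps)) iterations suffice for accuracy eps.
for k in range(1, 6):
    err = abs(newton_sqrt(2.0, k) - math.sqrt(2.0))
    print(k, err)
```

The lower bound in the abstract shows this doubling behaviour is optimal for the stated operation set: no computation tree or RAM program over {+, -, *, /, ⌊·⌋, not, and, or, xor} can do asymptotically better.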