Predicate selection for structural decision trees

K. S. Ng*, J. W. Lloyd

*Corresponding author for this work

    Research output: Contribution to journal › Conference article › peer-review

    Abstract

    We study predicate selection functions (also known as splitting rules) for structural decision trees and propose two improvements to existing schemes. The first is in classification learning, where we reconsider the use of accuracy as a predicate selection function and show that, on practical grounds, it is a better alternative to other commonly used functions. The second is in regression learning, where we consider the standard mean squared error measure and give a predicate pruning result for it.
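As a rough illustration of the accuracy criterion discussed in the abstract (this sketch is not code from the paper), a predicate's accuracy score can be taken as the fraction of training examples classified correctly if each branch of the split predicts its majority class. The partition below is a made-up example:

```python
from collections import Counter

def majority_accuracy(partitions):
    """Accuracy splitting criterion: fraction of examples correctly
    classified when each branch predicts its majority class."""
    total = sum(len(p) for p in partitions)
    correct = sum(Counter(p).most_common(1)[0][1] for p in partitions if p)
    return correct / total

# A candidate predicate splits the examples into two branches.
left = ["pos", "pos", "pos", "neg"]   # examples where the predicate holds
right = ["neg", "neg", "pos"]         # examples where it fails
print(majority_accuracy([left, right]))  # 5/7, i.e. about 0.714
```

Under this criterion, the predicate scoring the highest accuracy over the candidate set would be chosen to label the node.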

    Original language: English
    Pages (from-to): 264-278
    Number of pages: 15
    Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 3625
    DOIs
    Publication status: Published - 2005
    Event: 15th International Conference on Inductive Logic Programming, ILP 2005 - Bonn, Germany
    Duration: 10 Aug 2005 - 13 Aug 2005
