I don't have time to go through your program in detail; use the code below as a reference.

function D = CART(train_features, train_targets, params, region)
% Classify using classification and regression trees
% Inputs:
%   train_features - Train features
%   train_targets  - Train targets
%   params         - [Impurity type, Percentage of incorrectly assigned samples at a node]
%                    Impurity can be: Entropy, Variance (or Gini), or Misclassification
%   region         - Decision region vector: [-x x -y y number_of_points]
%
% Outputs:
%   D - Decision surface
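% Added for reference (standard definitions, not part of the original post):
% with p_i the fraction of samples of class i at a node,
%   Entropy:           -sum_i p_i*log2(p_i)
%   Gini:              1 - sum_i p_i^2   (the two-class variance impurity is p_1*p_2)
%   Misclassification: 1 - max_i p_i
% The actual impurity evaluation is delegated to the helper CARTfunctions (not shown here).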
[Ni, M] = size(train_features);

%Get parameters
[split_type, inc_node] = process_params(params);

%For the decision region
N = region(5);
mx = ones(N,1) * linspace(region(1), region(2), N);
my = linspace(region(3), region(4), N)' * ones(1,N);
flatxy = [mx(:), my(:)]';
%Preprocessing
[f, t, UW, m] = PCA(train_features, train_targets, Ni, region);
train_features = UW * (train_features - m*ones(1,M));
flatxy = UW * (flatxy - m*ones(1,N^2));
%Build the tree recursively
disp('Building tree')
tree = make_tree(train_features, train_targets, M, split_type, inc_node, region);

%Make the decision region according to the tree
disp('Building decision surface using the tree')
targets = use_tree(flatxy, 1:N^2, tree);

D = reshape(targets, N, N);
%END

function targets = use_tree(features, indices, tree)
%Classify recursively using a tree
if isnumeric(tree.Raction)
    %Reached an end node
    targets = zeros(1, size(features,2));
    targets(indices) = tree.Raction(1);
else
    %Reached a branching node
    %Find who goes where
    in_right = indices(find(eval(tree.Raction)));
    in_left  = indices(find(eval(tree.Laction)));

    Ltargets = use_tree(features, in_left, tree.left);
    Rtargets = use_tree(features, in_right, tree.right);

    %Each index is filled by exactly one branch, so the two vectors can simply be summed
    targets = Ltargets + Rtargets;
end
%END use_tree
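% Note (added): tree.Raction / tree.Laction hold either a numeric class label
% (leaf node) or a string such as 'features(2,indices) > 0.73' built by
% make_tree below; eval'ing that string against the current 'features' and
% 'indices' yields the logical mask used to route samples right or left.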
function tree = make_tree(features, targets, Dlength, split_type, inc_node, region)
%Build a tree recursively
if (length(unique(targets)) == 1)
    %There is only one class of targets, and this generates a warning downstream, so deal with it separately
    tree.right = [];
    tree.left = [];
    tree.Raction = targets(1);
    tree.Laction = targets(1);
    return
end
[Ni, M] = size(features);
Nt = unique(targets);
N = hist(targets, Nt);
if ((sum(N < Dlength*inc_node) == length(Nt) - 1) | (M == 1))
    %No further splitting is necessary
    tree.right = [];
    tree.left = [];
    if (length(Nt) ~= 1)
        MLlabel = find(N == max(N));
    else
        MLlabel = 1;
    end
    tree.Raction = Nt(MLlabel);
    tree.Laction = Nt(MLlabel);

else
    %Split the node according to the splitting criterion
    I = zeros(1,Ni);
    split_point = zeros(1,Ni);
    op = optimset('Display', 'off');
    for i = 1:Ni
        split_point(i) = fminbnd('CARTfunctions', region(i*2-1), region(i*2), op, features, targets, i, split_type);
        I(i) = feval('CARTfunctions', split_point(i), features, targets, i, split_type);
    end

    [m, dim] = min(I);
    loc = split_point(dim);

    %So, the split is to be on dimension 'dim' at location 'loc'
    indices = 1:M;
    tree.Raction = ['features(' num2str(dim) ',indices) > ' num2str(loc)];
    tree.Laction = ['features(' num2str(dim) ',indices) <= ' num2str(loc)];
    in_right = find(eval(tree.Raction));
    in_left  = find(eval(tree.Laction));

    if isempty(in_right) | isempty(in_left)
        %No possible split found
        tree.right = [];
        tree.left = [];
        if (length(Nt) ~= 1)
            MLlabel = find(N == max(N));
        else
            MLlabel = 1;
        end
        tree.Raction = Nt(MLlabel);
        tree.Laction = Nt(MLlabel);
    else
        %...It's possible to build new nodes
        tree.right = make_tree(features(:,in_right), targets(in_right), Dlength, split_type, inc_node, region);
        tree.left  = make_tree(features(:,in_left), targets(in_left), Dlength, split_type, inc_node, region);
    end

end
%END make_tree
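A rough usage sketch (my addition, not from the original post). The helpers process_params, PCA and CARTfunctions come from the same toolbox and are not shown here, and the exact encoding process_params expects for params is an assumption, so treat the call below as illustrative only:

    %Hypothetical two-class problem in 2-D
    train_features = [randn(2,50), randn(2,50)+2];    %2 x 100 training samples
    train_targets  = [zeros(1,50), ones(1,50)];       %class labels 0 and 1
    region = [-5 5 -5 5 100];                         %[-x x -y y number_of_points]
    params = {'Entropy', 0.05};                       %assumed format: impurity type and node fraction
    D = CART(train_features, train_targets, params, region);
    imagesc(linspace(-5,5,100), linspace(-5,5,100), D); axis xy   %view the decision surface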