声振论坛

Views: 1457 | Replies: 1

[Toolbox] Professor happy ~ please take a look at this K-MEANS clustering problem

Posted on 2006-4-26 12:00

Many thanks, Professor HAPPY!
The problem: the clustering results are random and do not follow the option I select!
(Screenshots of the correct output are in the attached archive.)
For example, when I select 1-cluster, the displayed result is sometimes that of cluster 2; when I select 2-cluster, it is sometimes that of cluster 3. The code below (marked in red in the original post) produces these random results, and the clustering version I modified myself (marked in green) behaves the same way. Strangely, though, the title displays correctly, so I don't think the code itself is wrong.
function apply_Callback(hObject, eventdata, handles)
........
for k = 1:nColors
    color = le;                     % le: the source RGB image (copied each pass)
    color(rgb_label ~= k) = 0;      % zero out every pixel not assigned to cluster k
    segmented_images{k} = color;    % keep the masked image for cluster k
end
%imshow(segmented_images{k}), title(['objects in cluster ', num2str(k)])
switch handles.clustername
    case '1-cluster'
        imshow(segmented_images{1}), title('objects in cluster 1');
    case '2-cluster'
        imshow(segmented_images{2}), title('objects in cluster 2');
    case '3-cluster'
        imshow(segmented_images{3}), title('objects in cluster 3');
end
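Side note: the three-way switch above can be collapsed into one branch-free display. A minimal sketch, assuming handles.clustername always has the form 'N-cluster' (that format is my assumption, inferred from the cases above):

k = sscanf(handles.clustername, '%d-cluster');   % extract N from 'N-cluster' (assumed format)
imshow(segmented_images{k});
title(['objects in cluster ', num2str(k)]);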
Is there anything wrong with this cluster callback function?
function cluster_Callback(hObject, eventdata, handles)
%%%%%
clusterstring  = get(handles.cluster, 'string');            % full list of popup entries (unused below)
clustervalue   = get(handles.cluster, 'value');             % index of the selected entry (unused below)
clustercontext = get(handles.cluster, {'value','string'});  % {index, entries} in one call
vvv1 = deblank(clustercontext{2}(clustercontext{1}));       % selected entry, trailing blanks removed
handles.clustername = vvv1{1};                              % store the selection, e.g. '2-cluster'
guidata(hObject, handles);                                  % write the updated handles back
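For reference, the randomness described above is consistent with how kmeans itself behaves: it starts from randomly chosen centroids, so the cluster that comes out numbered 1 in one run can come out numbered 2 in the next even when the segmentation is identical. Below is a minimal sketch of two common remedies; nColors is from the code above, while ab (the k-means input) and the call itself are my assumptions based on the standard MATLAB color-segmentation demo:

rng(1);                                           % fix the seed so runs repeat (older MATLAB: rand('state',0))
[cluster_idx, centers] = kmeans(ab, nColors, ...
                                'Replicates', 3); % keep the best of 3 random starts
[sorted_centers, order] = sortrows(centers);      % order clusters by a stable key (centroid values)
relabel = zeros(nColors, 1);
relabel(order) = 1:nColors;                       % map old label -> position in sorted order
cluster_idx = relabel(cluster_idx);               % now 'cluster 1' means the same thing every run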
Posted on 2006-4-30 21:04

Re: (菜肉包子) Professor happy ~ please take a look at this K-MEANS clustering pro...

I don't have that much time to look through your program carefully; use the following for reference.

function D = CART(train_features, train_targets, params, region)
% Classify using classification and regression trees
% Inputs:
%   train_features - Train features
%   train_targets  - Train targets
%   params         - [Impurity type, Percentage of incorrectly assigned samples at a node]
%                    Impurity can be: Entropy, Variance (or Gini), or Misclassification
%   region         - Decision region vector: [-x x -y y number_of_points]
%
% Outputs:
%   D - Decision surface
[Ni, M] = size(train_features);

%Get parameters
[split_type, inc_node] = process_params(params);

%For the decision region
N      = region(5);
mx     = ones(N,1) * linspace(region(1), region(2), N);
my     = linspace(region(3), region(4), N)' * ones(1,N);
flatxy = [mx(:), my(:)]';

%Preprocessing
[f, t, UW, m]  = PCA(train_features, train_targets, Ni, region);
train_features = UW * (train_features - m*ones(1,M));
flatxy         = UW * (flatxy - m*ones(1,N^2));

%Build the tree recursively
disp('Building tree')
tree = make_tree(train_features, train_targets, M, split_type, inc_node, region);

%Make the decision region according to the tree
disp('Building decision surface using the tree')
targets = use_tree(flatxy, 1:N^2, tree);

D = reshape(targets, N, N);
%END
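Side note on the decision-region grid built in CART above: the outer-product construction of mx and my is the same thing meshgrid does. A small sketch of the equivalence, with hypothetical region values:

region = [-3 3 -3 3 5]; N = region(5);            % made-up decision region
[mx2, my2] = meshgrid(linspace(region(1), region(2), N), ...
                      linspace(region(3), region(4), N));
flatxy2 = [mx2(:), my2(:)]';                      % 2 x N^2 list of grid points, same as flatxy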
function targets = use_tree(features, indices, tree)
%Classify recursively using a tree

if isnumeric(tree.Raction)
   %Reached an end node
   targets = zeros(1, size(features,2));
   targets(indices) = tree.Raction(1);
else
   %Reached a branching node
   %Find who goes where
   in_right = indices(find(eval(tree.Raction)));
   in_left  = indices(find(eval(tree.Laction)));

   Ltargets = use_tree(features, in_left, tree.left);
   Rtargets = use_tree(features, in_right, tree.right);

   targets = Ltargets + Rtargets;
end
%END use_tree
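The branching test in use_tree is stored as a string and dispatched with eval. A tiny standalone illustration of that mechanism, with made-up values:

features = [0.2 0.7 0.4; 1.0 0.1 0.9];        % 2 dims x 3 samples (hypothetical)
indices  = 1:3;
Raction  = 'features(1,indices) >  0.5';      % the kind of string make_tree stores
Laction  = 'features(1,indices) <= 0.5';
in_right = indices(find(eval(Raction)))       % -> 2
in_left  = indices(find(eval(Laction)))       % -> [1 3]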
function tree = make_tree(features, targets, Dlength, split_type, inc_node, region)
%Build a tree recursively

if (length(unique(targets)) == 1)
   %There is only one type of targets, and this generates a warning, so deal with it separately
   tree.right   = [];
   tree.left    = [];
   tree.Raction = targets(1);
   tree.Laction = targets(1);
   return   % 'break' outside a loop is an error; return from the function instead
end
[Ni, M] = size(features);
Nt      = unique(targets);
N       = hist(targets, Nt);
if ((sum(N < Dlength*inc_node) == length(Nt) - 1) | (M == 1))
   %No further splitting is necessary
   tree.right = [];
   tree.left  = [];
   if (length(Nt) ~= 1)
      MLlabel = find(N == max(N));
   else
      MLlabel = 1;
   end
   tree.Raction = Nt(MLlabel);
   tree.Laction = Nt(MLlabel);

else
   %Split the node according to the splitting criterion
   deltaI      = zeros(1,Ni);
   split_point = zeros(1,Ni);
   op = optimset('Display', 'off');
   for i = 1:Ni
      split_point(i) = fminbnd('CARTfunctions', region(i*2-1), region(i*2), op, features, targets, i, split_type);
      I(i) = feval('CARTfunctions', split_point(i), features, targets, i, split_type);
   end

   [m, dim] = min(I);
   loc = split_point(dim);

   %So, the split is to be on dimension 'dim' at location 'loc'
   indices = 1:M;
   tree.Raction = ['features(' num2str(dim) ',indices) >  ' num2str(loc)];
   tree.Laction = ['features(' num2str(dim) ',indices) <= ' num2str(loc)];
   in_right = find(eval(tree.Raction));
   in_left  = find(eval(tree.Laction));

   if isempty(in_right) | isempty(in_left)
      %No possible split found
      tree.right = [];
      tree.left  = [];
      if (length(Nt) ~= 1)
         MLlabel = find(N == max(N));
      else
         MLlabel = 1;
      end
      tree.Raction = Nt(MLlabel);
      tree.Laction = Nt(MLlabel);
   else
      %...It's possible to build new nodes
      tree.right = make_tree(features(:,in_right), targets(in_right), Dlength, split_type, inc_node, region);
      tree.left  = make_tree(features(:,in_left), targets(in_left), Dlength, split_type, inc_node, region);
   end

end
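A worked example of the stopping rule at the top of that block, with made-up numbers: with Dlength = 100 training samples and inc_node = 0.05, the threshold is 5 samples; if the class counts at a node are N = [40 3 2], all but one class fall below the threshold, so the node becomes a leaf labeled with the majority class.

Dlength = 100; inc_node = 0.05;   % hypothetical values; threshold = Dlength*inc_node = 5
N  = [40 3 2];                    % per-class sample counts at this node (made up)
Nt = [1 2 3];                     % the class labels those counts belong to
stop = (sum(N < Dlength*inc_node) == length(Nt) - 1)  % -> 1 (true): make a leaf
leaf_label = Nt(find(N == max(N)))                    % -> 1, the majority class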