声振论坛

Views: 7082 | Replies: 17

[Artificial Intelligence] [Example] SVM (Support Vector Machine) Applied to a Classification Problem

Posted on 2007-6-25 09:41

This is a demo program of a support vector machine applied to a classification problem, in two parts, linear and nonlinear. I hope it is useful to people doing research in this area.
(Install the SVM toolbox first, then run the script.)
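For readers who cannot obtain the MATLAB toolbox, the first, linear-kernel part of the demo can be sketched in Python with scikit-learn. This is only a stand-in of my own choosing, not the toolbox the original uses, but the data set and the upper bound C = 10 are taken straight from the listing:

```python
import numpy as np
from sklearn.svm import SVC

# The 11-point data set from the demo, with labels +1 / -1.
X = np.array([[2, 7], [3, 6], [2, 2], [8, 1], [6, 4], [4, 8],
              [9, 5], [9, 9], [9, 4], [6, 9], [7, 4]])
y = np.array([+1, +1, +1, +1, +1, -1, -1, -1, -1, -1, -1])

# Linear kernel with upper bound C = 10, as in the demo.
clf = SVC(kernel="linear", C=10.0)
clf.fit(X, y)

# The demo states this set is linearly separable,
# so the training accuracy should be 1.0.
print(clf.score(X, y))
print(clf.support_)  # indices of the support vectors
```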
  1. %%%%%%%%%%%%%%%%%%%%%%%
  2. function demsvm1()
  3. % DEMSVM1 - Demonstrate basic Support Vector Machine classification
  4. %
  5. %   DEMSVM1 demonstrates the classification of a simple artificial data
  6. %   set by a Support Vector Machine classifier, using different kernel
  7. %   functions.
  8. %
  9. %   See also
  10. %   SVM, SVMTRAIN, SVMFWD, SVMKERNEL, DEMSVM2
  11. %
  12. %
  13. % Copyright (c) Anton Schwaighofer (2001)
  14. % This program is released under the GNU General Public License.
  15. %
  16. X = [2 7; 3 6; 2 2; 8 1; 6 4; 4 8; 9 5; 9 9; 9 4; 6 9; 7 4];
  17. Y = [ +1;  +1;  +1;  +1;  +1;  -1;  -1;  -1;  -1;  -1;  -1];
  18. % define a simple artificial data set
  19. x1ran = [0 10];
  20. x2ran = [0 10];
  21. % range for plotting the data set and the decision boundary
  22. disp(' ');
  23. disp('This demonstration illustrates the use of a Support Vector Machine');
  24. disp('(SVM) for classification. The data is a set of 2D points, together');
  25. disp('with target values (class labels) +1 or -1.');
  26. disp(' ');
  27. disp('The data set consists of the points');
  28. ind = [1:length(Y)]';
  29. fprintf('X%2i = (%2i, %2i) with label Y%2i = %2i\n', [ind, X, ind, Y]');
  30. disp(' ')
  31. disp('Press any key to plot the data set');
  32. pause
  33. f1 = figure;
  34. plotdata(X, Y, x1ran, x2ran);
  35. title('Data from class +1 (squares) and class -1 (crosses)');
  36. fprintf('\n\n\n\n');
  37. fprintf('The data is plotted in figure %i, where\n', f1);
  38. disp('  squares stand for points with label Yi = +1');
  39. disp('  crosses stand for points with label Yi = -1');
  40. disp(' ')
  41. disp(' ');
  42. disp('Now we train a Support Vector Machine classifier on this data set.');
  43. disp('We use the most simple kernel function, namely the inner product');
  44. disp('of points Xi, Xj (linear kernel K(Xi,Xj) = Xi''*Xj )');
  45. disp(' ');
  46. disp('Press any key to start training')
  47. pause
  48. net = svm(size(X, 2), 'linear', [], 10);
  49. net = svmtrain(net, X, Y);
  50. f2 = figure;
  51. plotboundary(net, x1ran, x2ran);
  52. plotdata(X, Y, x1ran, x2ran);
  53. plotsv(net, X, Y);
  54. title(['SVM with linear kernel: decision boundary (black) plus Support' ...
  55.        ' Vectors (red)']);
  56. fprintf('\n\n\n\n');
  57. fprintf('The resulting decision boundary is plotted in figure %i.\n', f2);
  58. disp('The contour plotted in black separates class +1 from class -1');
  59. disp('(this is the actual decision boundary)');
  60. disp('The contour plotted in red shows the points at distance +1 from the');
  61. disp('decision boundary; the blue contour shows the points at distance -1.');
  62. disp(' ');
  63. disp('All examples plotted in red are found to be Support Vectors.');
  64. disp('Support Vectors are the examples at distance +1 or -1 from the ');
  65. disp('decision boundary and all the examples that cannot be classified');
  66. disp('correctly.');
  67. disp(' ');
  68. disp('The data set shown can be correctly classified using a linear');
  69. disp('kernel. This can be seen from the coefficients alpha associated');
  70. disp('with each example: The coefficients are');
  71. ind = [1:length(Y)]';
  72. fprintf('  Example %2i: alpha%2i = %5.2f\n', [ind, ind, net.alpha]');
  73. disp('The upper bound C for the coefficients has been set to');
  74. fprintf('C = %5.2f. None of the coefficients are at the bound,\n', ...
  75. net.c(1));
  76. disp('this means that all examples in the training set can be correctly');
  77. disp('classified by the SVM.')
  78. disp(' ');
  79. disp('Press any key to continue')
  80. pause
  81. X = [X; [4 4]];
  82. Y = [Y; -1];
  83. net = svm(size(X, 2), 'linear', [], 10);
  84. net = svmtrain(net, X, Y);
  85. f3 = figure;
  86. plotboundary(net, x1ran, x2ran);
  87. plotdata(X, Y, x1ran, x2ran);
  88. plotsv(net, X, Y);
  89. title(['SVM with linear kernel: decision boundary (black) plus Support' ...
  90.        ' Vectors (red)']);
  91. fprintf('\n\n\n\n');
  92. disp('Adding an additional point X12 with label -1 gives a data set');
  93. disp('that cannot be linearly separated. The SVM handles this case by');
  94. disp('allowing training points to be misclassified.');
  95. disp(' ');
  96. disp('Training the SVM on this modified data set we see that the points');
  97. disp('X5, X11 and X12 cannot be correctly classified. The decision');
  98. fprintf('boundary is shown in figure %i.\n', f3);
  99. disp('The coefficients alpha associated with each example are');
  100. ind = [1:length(Y)]';
  101. fprintf('  Example %2i: alpha%2i = %5.2f\n', [ind, ind, net.alpha]');
  102. disp('The coefficients of the misclassified points are at the upper');
  103. disp('bound C.');
  104. disp(' ')
  105. disp('Press any key to continue')
  106. pause

  107. fprintf('\n\n\n\n');
  108. disp('Adding the new point X12 has led to a more difficult data set');
  109. disp('that can no longer be separated by a simple linear kernel.');
  110. disp('We can now switch to a more powerful kernel function, namely');
  111. disp('the Radial Basis Function (RBF) kernel.');
  112. disp(' ')
  113. disp('The RBF kernel has an associated parameter, the kernel width.');
  114. disp('We will now show the decision boundary obtained from a SVM with');
  115. disp('RBF kernel for 3 different values of the kernel width.');
  116. disp(' ');
  117. disp('Press any key to continue')
  118. pause
  119. net = svm(size(X, 2), 'rbf', [8], 100);
  120. net = svmtrain(net, X, Y);
  121. f4 = figure;
  122. plotboundary(net, x1ran, x2ran);
  123. plotdata(X, Y, x1ran, x2ran);
  124. plotsv(net, X, Y);
  125. title(['SVM with RBF kernel, width 8: decision boundary (black)' ...
  126.        ' plus Support Vectors (red)']);
  127. fprintf('\n\n\n\n');
  128. fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
  129. f4);
  130. disp('with Radial Basis Function kernel, the kernel width has been');
  131. disp('set to 8.');
  132. disp('The SVM now interprets the new point X12 as evidence for a');
  133. disp('cluster of points from class -1, the SVM builds a small ''island''');
  134. disp('around X12.');
  135. disp(' ')
  136. disp('Press any key to continue')
  137. pause

  138. net = svm(size(X, 2), 'rbf', [1], 100);
  139. net = svmtrain(net, X, Y);
  140. f5 = figure;
  141. plotboundary(net, x1ran, x2ran);
  142. plotdata(X, Y, x1ran, x2ran);
  143. plotsv(net, X, Y);
  144. title(['SVM with RBF kernel, width 1: decision boundary (black)' ...
  145.        ' plus Support Vectors (red)']);
  146. fprintf('\n\n\n\n');
  147. fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
  148. f5);
  149. disp('with radial basis function kernel, kernel width 1.');
  150. disp('The decision boundary is now highly shattered, since a smaller');
  151. disp('kernel width allows the decision boundary to be more curved.');
  152. disp(' ')
  153. disp('Press any key to continue')
  154. pause

  155. net = svm(size(X, 2), 'rbf', [36], 100);
  156. net = svmtrain(net, X, Y);
  157. f6 = figure;
  158. plotboundary(net, x1ran, x2ran);
  159. plotdata(X, Y, x1ran, x2ran);
  160. plotsv(net, X, Y);
  161. title(['SVM with RBF kernel, width 36: decision boundary (black)' ...
  162.        ' plus Support Vectors (red)']);
  163. fprintf('\n\n\n\n');
  164. fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
  165. f6);
  166. disp('with radial basis function kernel, kernel width 36.');
  167. disp('This gives a decision boundary similar to the one shown in');
  168. fprintf('Figure %i for the SVM with linear kernel.\n', f2);

  169. fprintf('\n\n\n\n');
  170. disp('Press any key to end the demo')
  171. pause
  172. delete(f1);
  173. delete(f2);
  174. delete(f3);
  175. delete(f4);
  176. delete(f5);
  177. delete(f6);

  178. function plotdata(X, Y, x1ran, x2ran)
  179. % PLOTDATA - Plot 2D data set
  180. %
  181. hold on;
  182. ind = find(Y>0);
  183. plot(X(ind,1), X(ind,2), 'ks');
  184. ind = find(Y<0);
  185. plot(X(ind,1), X(ind,2), 'kx');
  186. text(X(:,1)+.2,X(:,2), int2str([1:length(Y)]'));
  187. axis([x1ran x2ran]);
  188. axis xy;

  189. function plotsv(net, X, Y)
  190. % PLOTSV - Plot Support Vectors
  191. %
  192. hold on;
  193. ind = find(Y(net.svind)>0);
  194. plot(X(net.svind(ind),1),X(net.svind(ind),2),'rs');
  195. ind = find(Y(net.svind)<0);
  196. plot(X(net.svind(ind),1),X(net.svind(ind),2),'rx');

  197. function [x11, x22, x1x2out] = plotboundary(net, x1ran, x2ran)
  198. % PLOTBOUNDARY - Plot SVM decision boundary on range X1RAN and X2RAN
  199. %
  200. hold on;
  201. nbpoints = 100;
  202. x1 = x1ran(1):(x1ran(2)-x1ran(1))/nbpoints:x1ran(2);
  203. x2 = x2ran(1):(x2ran(2)-x2ran(1))/nbpoints:x2ran(2);
  204. [x11, x22] = meshgrid(x1, x2);
  205. [dummy, x1x2out] = svmfwd(net, [x11(:),x22(:)]);
  206. x1x2out = reshape(x1x2out, size(x11));
  207. contour(x11, x22, x1x2out, [-0.99 -0.99], 'b-');
  208. contour(x11, x22, x1x2out, [0 0], 'k-');
  209. contour(x11, x22, x1x2out, [0.99 0.99], 'r-');
  210. %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
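The non-separable case from the middle of the demo (adding X12 = (4, 4) with label -1) and the switch to an RBF kernel can be sketched the same way in Python with scikit-learn. Note that scikit-learn's `gamma` is not the same parameterisation as the toolbox's kernel width, so the value below is a rough guess rather than a translation of the widths 8, 1, and 36 used above:

```python
import numpy as np
from sklearn.svm import SVC

# The demo's data plus the extra point X12 = (4, 4) labelled -1,
# which makes the set impossible to separate with a linear kernel.
X = np.array([[2, 7], [3, 6], [2, 2], [8, 1], [6, 4], [4, 8],
              [9, 5], [9, 9], [9, 4], [6, 9], [7, 4], [4, 4]])
y = np.array([+1, +1, +1, +1, +1, -1, -1, -1, -1, -1, -1, -1])

lin = SVC(kernel="linear", C=10.0).fit(X, y)
# A narrow RBF kernel can carve out an "island" around X12.
# gamma = 1.0 is an assumed value, not the toolbox's width parameter.
rbf = SVC(kernel="rbf", gamma=1.0, C=100.0).fit(X, y)

print(lin.score(X, y))  # below 1.0: some training points stay misclassified
print(rbf.score(X, y))  # the narrow RBF model fits the whole training set
```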


Posted on 2007-11-13 21:26
I'm still a bit confused after looking at it.
Posted on 2007-11-13 21:37
I only see the training samples plotted; where is the classification result?
Posted on 2007-12-20 13:01
Nice post, thanks. Good material to study!
Posted on 2007-12-24 09:59
Posted on 2008-1-8 09:06
This shows how important it is to write good comments in your code.
Posted on 2008-1-15 09:06
Originally posted by koihime on 2008-1-8 09:06:
This shows how important it is to write good comments in your code.


Reading someone else's code is often more tiring than writing your own. Comments are one factor, but differences in the way of thinking are also a big reason.
Posted on 2008-3-21 11:12
Copying it back to study.
Posted on 2008-4-23 10:10
Taking a look.
Posted on 2008-8-29 10:59
xjzuo, where did you download this SVM toolbox from? I really need it.
Posted on 2008-9-5 08:56
Originally posted by 若菱 on 2008-8-29 10:59:
xjzuo, where did you download this SVM toolbox from? I really need it.


http://www.chinavib.com/forum/viewthread.php?tid=36029
This thread has descriptions of, and links to, various SVM toolboxes.

I don't know which one xjzuo used.
Posted on 2008-10-29 22:36
Which toolbox are you using? Why does it error out as soon as training starts? It cannot find the statement
net = svm(size(X, 2), 'linear', [], 10);
Posted on 2008-11-11 16:48
I ran into the same problem as post #12. How do I solve it?
Posted on 2009-6-18 10:41

Mine reports an error too:

??? Undefined command/function 'svm'.

Error in ==> demsvm1 at 48
net = svm(size(X, 2), 'linear', [], 10);
Posted on 2010-3-31 11:09
I'll download it and try it before commenting.