function [svm_struct, svIndex] = svmtrain(training, groupnames, varargin)
%SVMTRAIN trains a support vector machine classifier
%
%   SVMStruct = SVMTRAIN(TRAINING,GROUP) trains a support vector machine
%   classifier using data TRAINING taken from two groups given by GROUP.
%   SVMStruct contains information about the trained classifier, including
%   the support vectors, that is used by SVMCLASSIFY for classification.
%   GROUP is a column vector of values of the same length as TRAINING that
%   defines two groups. Each element of GROUP specifies the group the
%   corresponding row of TRAINING belongs to. GROUP can be a numeric
%   vector, a string array, or a cell array of strings. SVMTRAIN treats
%   NaNs or empty strings in GROUP as missing values and ignores the
%   corresponding rows of TRAINING.
%
%   SVMTRAIN(...,'KERNEL_FUNCTION',KFUN) allows you to specify the kernel
%   function KFUN used to map the training data into kernel space. The
%   default kernel function is the dot product. KFUN can be one of the
%   following strings or a function handle:
%
%       'linear'      Linear kernel or dot product
%       'quadratic'   Quadratic kernel
%       'polynomial'  Polynomial kernel (default order 3)
%       'rbf'         Gaussian Radial Basis Function kernel
%       'mlp'         Multilayer Perceptron kernel (default scale 1)
%       function      A kernel function specified using @,
%                     for example @KFUN, or an anonymous function
%
%   A kernel function must be of the form
%
%       function K = KFUN(U, V)
%
%   The returned value, K, is a matrix of size M-by-N, where U and V have M
%   and N rows respectively. If KFUN is parameterized, you can use
%   anonymous functions to capture the problem-dependent parameters. For
%   example, suppose that your kernel function is
%
%       function k = kfun(u,v,p1,p2)
%       k = tanh(p1*(u*v')+p2);
%
%   You can set values for p1 and p2 and then use an anonymous function:
%   @(u,v) kfun(u,v,p1,p2).
%
%   SVMTRAIN(...,'RBF_SIGMA',SIGMA) allows you to specify the scaling
%   factor, sigma, in the radial basis function kernel.
%
%   SVMTRAIN(...,'POLYORDER',ORDER) allows you to specify the order of a
%   polynomial kernel. The default order is 3.
%
%   SVMTRAIN(...,'MLP_PARAMS',[P1 P2]) allows you to specify the
%   parameters of the Multilayer Perceptron (mlp) kernel. The mlp kernel
%   requires two parameters, P1 and P2, where K = tanh(P1*U*V' + P2) and
%   P1 > 0 and P2 < 0.

if ngroups > 2
    error('Bioinfo:svmtrain:TooManyGroups', ...
        ['SVMTRAIN only supports classification into two groups.\n' ...
         'GROUP contains %d different groups.'], ngroups)
end
% convert to 1, -1.
groupIndex = 1 - (2* (groupIndex-1));

% handle optional arguments
if numoptargs >= 1
    if rem(numoptargs,2) == 1
        error('Bioinfo:svmtrain:IncorrectNumberOfArguments', ...
            'Incorrect number of arguments to %s.', mfilename);
    end
    okargs = {'kernel_function','method','showplot','kfunargs', ...
              'quadprog_opts','polyorder','mlp_params', ...
              'boxconstraint','rbf_sigma','autoscale','smo_opts'};
    for j = 1:2:numoptargs
        pname = optargs{j};
        pval = optargs{j+1};
        k = find(strncmpi(pname, okargs, numel(pname)));
        if isempty(k)
            error('Bioinfo:svmtrain:UnknownParameterName', ...
                'Unknown parameter name: %s.', pname);
        elseif length(k) > 1
            error('Bioinfo:svmtrain:AmbiguousParameterName', ...
                'Ambiguous parameter name: %s.', pname);
        else
            switch(k)
                case 1 % kernel_function
                    if ischar(pval)
                        okfuns = {'linear','quadratic', ...
                                  'radial','rbf','polynomial','mlp'};
                        funNum = strmatch(lower(pval), okfuns);
                        if isempty(funNum)
                            funNum = 0;
                        end
                        switch funNum % maybe make this less strict in the future
                            case 1
                                kfun = @linear_kernel;
                            case 2
                                kfun = @quadratic_kernel;
                            case {3,4}
                                kfun = @rbf_kernel;
                                useSigma = true;
                            case 5
                                kfun = @poly_kernel;
                                usePoly = true;
                            case 6
                                kfun = @mlp_kernel;
                                useMLP = true;
                            otherwise
                                error('Bioinfo:svmtrain:UnknownKernelFunction', ...
                                    'Unknown Kernel Function %s.', pval);
                        end
                    elseif isa(pval, 'function_handle')
                        kfun = pval;
                    else
                        error('Bioinfo:svmtrain:BadKernelFunction', ...
                            ['The kernel function input does not appear to be ' ...
                             'a function handle\nor valid function name.'])
                    end
                case 2 % method
                    if strncmpi(pval,'qp',2)
                        if isempty(which('quadprog'))
                            warning('Bioinfo:svmtrain:NoOptim', ...
                                ['The Optimization Toolbox is required to use ' ...
                                 'the quadratic programming method.'])
                            optimMethod = 'SMO';
                        else
                            optimMethod = 'QP';
                        end
                    elseif strncmpi(pval,'ls',numel(pval))
                        optimMethod = 'LS';
                    elseif strncmpi(pval,'smo',numel(pval))
                        optimMethod = 'SMO';
                    else
                        error('Bioinfo:svmtrain:UnknownMethod', ...
                            ['Unknown method option %s. Valid methods are ' ...
                             '''SMO'', ''QP'' and ''LS''.'], pval);
                    end
                case 3 % display
                    plotflag = opttf(pval,okargs{k},mfilename);
                    if plotflag == true
                        if size(training,2) == 2
                            plotflag = true;
                        else
                            plotflag = false;
                            warning('Bioinfo:svmtrain:OnlyPlot2D', ...
                                'The display option can only plot 2D training data.')
                        end
                    end
                case 4 % kfunargs
                    if iscell(pval)
                        kfunargs = pval;
                    else
                        kfunargs = {pval};
                    end
                case 5 % quadprog_opts
                    if isstruct(pval)
                        qp_opts = optimset(qp_opts,pval);
                    elseif iscell(pval)
                        qp_opts = optimset(qp_opts,pval{:});
                    else
                        error('Bioinfo:svmtrain:BadQuadprogOpts', ...
                            'QUADPROG_OPTS must be an opts structure.');
                    end
                case 6 % polyorder
                    if ~isscalar(pval) || ~isnumeric(pval)
                        error('Bioinfo:svmtrain:BadPolyOrder', ...
                            'POLYORDER must be a scalar value.');
                    end
                    if pval ~= floor(pval) || pval < 1
                        error('Bioinfo:svmtrain:BadPolyOrder', ...
                            'The polynomial order must be a positive integer.');
                    end
                    kfunargs = {pval};
                    setPoly = true;
                case 7 % mlp_params
                    if pval(2) > 0
                        warning('Bioinfo:svmtrain:MLPBiasNotNegative', ...
                            'The bias for MLP kernel should be negative.')
                    end
                    kfunargs = {pval(1), pval(2)};
                    setMLP = true;
                case 8 % box constraint: it can be a positive numeric scalar
                       % or a numeric vector of the same length as there are
                       % data points
                    if isscalar(pval)
                        % scale the constraint by the relative size of each group
                        n1 = length(find(groupIndex==1));
                        n2 = length(find(groupIndex==-1));
                        c1 = 0.5 * pval * nPoints / n1;
                        c2 = 0.5 * pval * nPoints / n2;
                        boxconstraint(groupIndex==1) = c1;
                        boxconstraint(groupIndex==-1) = c2;
                    elseif isvector(pval)
                        boxconstraint = pval;
                    else
                        error('Bioinfo:svmtrain:BoxConstraintNotScalar', ...
                            'The box constraint must be a numeric scalar or vector > 0.');
                    end
                    % If boxconstraint == Inf then convergence will not
                    % happen so fix the value to 1/sqrt(eps).
                    boxconstraint = min(boxconstraint, ...
                        repmat(1/sqrt(eps(class(boxconstraint))), ...
                        size(boxconstraint)));
                case 9 % rbf sigma
                    if isscalar(pval)
                        kfunargs = {pval};
                        setSigma = true;
                    else
                        error('Bioinfo:svmtrain:RBFSigmaNotScalar', ...
                            'Sigma must be a numeric scalar.');
                    end
                case 10 % autoscale
                    autoScale = opttf(pval,okargs{k},mfilename);
                case 11 % smo_opts
                    smo_opts = pval;
            end
        end
    end
end

% warn if kernel-specific parameters were set for a different kernel
if setPoly && ~usePoly
    warning('Bioinfo:svmtrain:PolyOrderNotPolyKernel', ...
        'You specified a polynomial order, but not a polynomial kernel.');
end
if setMLP && ~useMLP
    warning('Bioinfo:svmtrain:MLPParamNotMLPKernel', ...
        'You specified MLP parameters, but not an MLP kernel.');
end
if setSigma && ~useSigma
    warning('Bioinfo:svmtrain:RBFParamNotRBFKernel', ...
        'You specified RBF sigma, but not an RBF kernel.');
end

% plot the data if requested
if plotflag
    [hAxis,hLines] = svmplotdata(training,groupIndex);
    legend(hLines,cellstr(groupString));
end

% autoscale data if required; we can't use the zscore function here,
% because we need the shift and scale values.
scaleData = [];
if autoScale
    scaleData.shift = -mean(training);
    stdVals = std(training);
    scaleData.scaleFactor = 1./stdVals;
    % leave zero-variance data unscaled:
    scaleData.scaleFactor(~isfinite(scaleData.scaleFactor)) = 1;
    % shift and scale columns of data matrix:
    for c = 1:size(training, 2)
        training(:,c) = scaleData.scaleFactor(c) * ...
            (training(:,c) + scaleData.shift(c));
    end
end

if strcmpi(optimMethod, 'SMO')
    % if we have a kernel that takes extra arguments we must define a new
    % kernel function handle to be passed to seqminopt
    if ~isempty(kfunargs)
        tmp_kfun = @(x,y) feval(kfun, x, y, kfunargs{:});
    else
        tmp_kfun = kfun;
    end
    [alpha, bias] = seqminopt(training, groupIndex, ...
        boxconstraint, tmp_kfun, smo_opts);
    svIndex = find(alpha > sqrt(eps));
    sv = training(svIndex,:);
    alphaHat = groupIndex(svIndex).*alpha(svIndex);
else % QP and LS both need the kernel matrix:
    % calculate kernel function and add additional term required
    % for two-norm soft margin
    try
        kx = feval(kfun,training,training,kfunargs{:});
        % ensure function is symmetric
        kx = (kx+kx')/2 + diag(1./boxconstraint);
    catch theException
        error('Bioinfo:svmtrain:KernelFunctionError', ...
            'Error calculating the kernel function:\n%s\n', theException.message);
    end
    % create Hessian
    H = ((groupIndex * groupIndex').*kx);
    if strcmpi(optimMethod, 'QP')
        % X = QUADPROG(H,f,A,b,Aeq,beq,LB,UB,X0,opts)
        [alpha, fval, exitflag, ...
            output, lambda] = quadprog(H, -ones(nPoints,1), ...
            [], [], groupIndex', 0, zeros(nPoints,1), ...
            Inf*ones(nPoints,1), ones(nPoints,1), ...
            qp_opts); %#ok
        if exitflag <= 0
            warning('Bioinfo:svmtrain:NoConvergence', ...
                'The quadratic programming solver did not converge.');
        end
    end

    svIndex = find(alpha > sqrt(eps));
    sv = training(svIndex,:);

    % calculate the parameters of the separating line from the support
    % vectors.
    alphaHat = groupIndex(svIndex).*alpha(svIndex);

    % Calculate the bias by applying the indicator function to the support
    % vector with largest alpha.
    [maxAlpha, maxPos] = max(alpha); %#ok
    bias = groupIndex(maxPos) - sum(alphaHat.*kx(svIndex,maxPos));
    % an alternative method is to average the values over all support vectors
    % bias = mean(groupIndex(sv)' - sum(alphaHat(:,ones(1,numSVs)).*kx(sv,sv)));
    % An alternative way to calculate support vectors is to look for zeros of
    % the Lagrangians.
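% ------------------------------------------------------------------
% Usage sketch (illustrative; not part of the original file). It shows
% training an SVM with the RBF kernel and classifying the training
% points. The dataset and parameter values below are assumptions chosen
% for the example; SVMCLASSIFY and the Bioinformatics Toolbox must be
% on the path.
%
%   load fisheriris
%   xdata = meas(51:end, 3:4);   % two classes, two features
%   group = species(51:end);
%   svmStruct = svmtrain(xdata, group, ...
%       'Kernel_Function', 'rbf', 'RBF_Sigma', 1, ...
%       'BoxConstraint', 1, 'Showplot', true);
%   classes = svmclassify(svmStruct, xdata);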