Central South University
Pattern Recognition and Machine Learning Lab Report
Class:        Student ID:        Name:        Instructor:

Programming Exercise 1: Linear Regression

Introduction

In this exercise, you will implement linear regression and get to see it work on data. Before starting on this programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

To get started with the exercise, you will need to download the starter code and unzip its contents to the directory where you wish to complete the exercise. If needed, use the cd command in Octave to change to this directory before starting this exercise. You can also find instructions for installing Octave on the "Octave Installation" page on the course website.

Files included in this exercise

ex1.m - Octave script that will help step you through the exercise
ex1_multi.m - Octave script for the later parts of the exercise
ex1data1.txt - Dataset for linear regression with one variable
ex1data2.txt - Dataset for linear regression with multiple variables
submit.m - Submission script that sends your solutions to our servers
[*] warmUpExercise.m - Simple example function in Octave
[*] plotData.m - Function to display the dataset
[*] computeCost.m - Function to compute the cost of linear regression
[*] gradientDescent.m - Function to run gradient descent
[$] computeCostMulti.m - Cost function for multiple variables
[$] gradientDescentMulti.m - Gradient descent for multiple variables
[$] featureNormalize.m - Function to normalize features
[$] normalEqn.m - Function to compute the normal equations

* indicates files you will need to complete
$ indicates extra credit exercises

Throughout the exercise, you will be using the scripts ex1.m and ex1_multi.m. These scripts set up the dataset for the problems and make calls to functions that you will write. You do not need to modify either of them. You are only required to modify functions in other files, by following the instructions in this assignment. For this programming exercise, you are only required to complete the first part of the exercise to implement linear regression with one variable. The second part of the exercise, which you may complete for extra credit, covers linear regression with multiple variables.

Based on the experiment requirements, the completed code is as follows:

(1) computeCost.m

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

res = X * theta - y;        % residual vector h(x) - y
J = (res' * res) / m * 0.5; % (1/2m) * sum of squared residuals

% =========================================================================
end

(2) plotData.m

function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
%   PLOTDATA(x, y) plots the data points and gives the figure axes labels
%   of population and profit.

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
%               "figure" and "plot" commands. Set the axes labels using
%               the "xlabel" and "ylabel" commands. Assume the
%               population and revenue data have been passed in
%               as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
%       appear as red crosses. Furthermore, you can make the
%       markers larger by using plot(..., 'rx', 'MarkerSize', 10);

figure; % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10);
xlabel('Population of City in 10,000s'); % x is population
ylabel('Profit in $10,000s');            % y is profit

% ============================================================
end

(3) warmUpExercise.m

function A = warmUpExercise()
%WARMUPEXERCISE Example function in Octave
%   A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix.
%               In Octave, we return values by defining which variables
%               represent the return values (at the top of the file)
%               and then setting them accordingly.

A = eye(5);

% ===========================================
end
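For reference, computeCost implements the standard squared-error cost; writing it out makes the vectorized res' * res expression explicit:

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( \theta^T x^{(i)} - y^{(i)} \right)^2 = \frac{1}{2m} (X\theta - y)^T (X\theta - y)

A minimal sanity check in Octave (the tiny dataset below is made up for illustration, not taken from ex1data1.txt):

% y equals x exactly, so theta = [0; 1] gives zero cost
X = [1 1; 1 2; 1 3];        % first column is the intercept term
y = [1; 2; 3];
computeCost(X, y, [0; 1])   % expected: 0
computeCost(X, y, [0; 0])   % expected: (1+4+9)/(2*3) = 2.3333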
featureNormalize.m

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.

mu = mean(X_norm);
sigma = std(X_norm);
X_norm = (X_norm - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);

% ============================================================
end

computeCostMulti.m

function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

res = X * theta - y;
J = (res' * res) / m * 0.5; % same vectorized form as computeCost

% =========================================================================
end

gradientDescent.m

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter
    %               vector theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.

    theta = theta - alpha / m * ((X * theta - y)' * X)';

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
end
end

gradientDescentMulti.m

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter
    %               vector theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.

    t = ((X * theta - y)' * X)'; % transpose so t matches theta's n x 1 shape
    theta = theta - alpha / m * t;

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);
end
end

normalEqn.m

function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X, y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed-form solution
%               to linear regression and put the result in theta.

theta = pinv(X' * X) * X' * y;

% ============================================================
end
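Both gradient-descent files implement the same vectorized update rule; writing it out makes the required matrix shapes explicit. With X an m x n design matrix and theta an n x 1 vector:

\theta := \theta - \frac{\alpha}{m} X^T (X\theta - y)

Note that (X\theta - y)^T X is a 1 x n row vector, so it must be transposed before it can be subtracted from theta. The normal equation in normalEqn.m is the closed-form stationary point of the same cost, obtained by setting the gradient to zero:

\nabla J(\theta) = \frac{1}{m} X^T (X\theta - y) = 0 \quad \Rightarrow \quad \theta = (X^T X)^{-1} X^T y

pinv is used in place of an explicit inverse so the computation also behaves sensibly when X^T X is singular.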
The overall run results are shown in the figure below. [figure omitted]

Programming Exercise 2: Logistic Regression

Introduction

In this exercise, you will implement logistic regression and apply it to two different datasets. Before starting on the programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

To get started with the exercise, you will need to download the starter code and unzip its contents to the directory where you wish to complete the exercise. If needed, use the cd command in Octave to change to this directory before starting this exercise. You can also find instructions for installing Octave on the "Octave Installation" page on the course website.

Files included in this exercise

ex2.m - Octave script that will help step you through the exercise
ex2_reg.m - Octave script for the later parts of the exercise
ex2data1.txt - Training set for the first half of the exercise
ex2data2.txt - Training set for the second half of the exercise
submitWeb.m - Alternative submission script
submit.m - Submission script that sends your solutions to our servers
mapFeature.m - Function to generate polynomial features
plotDecisionBoundary.m - Function to plot classifier's decision boundary
[*] plotData.m - Function to plot 2D classification data
[*] sigmoid.m - Sigmoid function
[*] costFunction.m - Logistic regression cost function
[*] predict.m - Logistic regression prediction function
[*] costFunctionReg.m - Regularized logistic regression cost

* indicates files you will need to complete

Throughout the exercise, you will be using the scripts ex2.m and ex2_reg.m. These scripts set up the dataset for the problems and make calls to functions that you will write. You do not need to modify either of them. You are only required to modify functions in other files, by following the instructions in this assignment.

Based on the experiment requirements, the completed code is as follows:

plotData.m

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(x, y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be an Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.

positive = find(y == 1);
negative = find(y == 0);
plot(X(positive, 1), X(positive, 2), 'k+', 'MarkerSize', 7, 'LineWidth', 2);
plot(X(negative, 1), X(negative, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

% =========================================================================
hold off;
end

sigmoid.m

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   J = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z)); % element-wise, so any shape of z works

% =============================================================
end

costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta

hypothesis = sigmoid(X * theta);
J = 1 / m * sum(-y .* log(hypothesis) - (1 - y) .* log(1 - hypothesis));

n = size(X, 2);
grad(1) = 1 / m * dot(hypothesis - y, X(:, 1));
for i = 2:n
    grad(i) = 1 / m * dot(hypothesis - y, X(:, i));
end

% =============================================================
end
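costFunction.m implements the unregularized logistic regression cost and gradient; the lambda penalty belongs in costFunctionReg.m, which takes lambda as an extra argument. The formulas being computed are:

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \left[ -y^{(i)} \log h_\theta(x^{(i)}) - (1 - y^{(i)}) \log\left(1 - h_\theta(x^{(i)})\right) \right]

\frac{\partial J}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

where h_\theta(x) = \mathrm{sigmoid}(\theta^T x). The regularized variant adds \frac{\lambda}{2m} \sum_j \theta_j^2 to J and \frac{\lambda}{m} \theta_j to each gradient component, excluding the intercept parameter in both cases.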
predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================

p = sigmoid(X * theta) >= 0.5; % logical comparison yields 0/1 predictions

% ============================================================
end
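A minimal sketch of how these pieces are typically driven together: this mirrors the usual ex2.m flow but is written from scratch here, so treat the option values as illustrative assumptions rather than settings taken from this report.

% Load the first training set and add the intercept column
data = load('ex2data1.txt');
X = data(:, 1:2); y = data(:, 3);
[m, n] = size(X);
X = [ones(m, 1) X];

% Minimize the cost with fminunc, supplying the analytic gradient
initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t) costFunction(t, X, y), initial_theta, options);

% Training-set accuracy from the learned parameters
p = predict(theta, X);
fprintf('Train accuracy: %f\n', mean(double(p == y)) * 100);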
