




Central South University — Pattern Recognition and Machine Learning Lab Report
Class:        Student ID:        Name:        Advisor:

Programming Exercise 1: Linear Regression

Introduction

In this exercise, you will implement linear regression and get to see it work on data. Before starting on this programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics. To get started with the exercise, you will need to download the starter code and unzip its contents to the directory where you wish to complete the exercise. If needed, use the cd command in Octave to change to this directory before starting this exercise. You can also find instructions for installing Octave on the "Octave Installation" page on the course website.

Files included in this exercise

ex1.m - Octave script that will help step you through the exercise
ex1_multi.m - Octave script for the later parts of the exercise
ex1data1.txt - Dataset for linear regression with one variable
ex1data2.txt - Dataset for linear regression with multiple variables
submit.m - Submission script that sends your solutions to our servers
[*] warmUpExercise.m - Simple example function in Octave
[*] plotData.m - Function to display the dataset
[*] computeCost.m - Function to compute the cost of linear regression
[*] gradientDescent.m - Function to run gradient descent
[$] computeCostMulti.m - Cost function for multiple variables
[$] gradientDescentMulti.m - Gradient descent for multiple variables
[$] featureNormalize.m - Function to normalize features
[$] normalEqn.m - Function to compute the normal equations

* indicates files you will need to complete
$ indicates extra credit exercises

Throughout the exercise, you will be using the scripts ex1.m and ex1_multi.m. These scripts set up the dataset for the problems and make calls to functions that you will write. You do not need to modify either of them. You are only required to modify functions in other files, by following the instructions in this assignment. For this programming exercise, you are only required to complete the first part of the exercise to implement linear regression with one variable. The second part of the exercise, which you may complete for extra credit, covers linear regression with multiple variables.

The code, completed according to the experiment requirements, is as follows:

(1) computeCost.m

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

res = X * theta - y;        % residuals h_theta(x) - y for all m examples
J = (res' * res) / (2 * m); % vectorized squared-error cost

% =========================================================================

end

(2) plotData.m

function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
%   PLOTDATA(x, y) plots the data points and gives the figure axes labels of
%   population and profit.

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
%               "figure" and "plot" commands. Set the axes labels using
%               the "xlabel" and "ylabel" commands. Assume the
%               population and revenue data have been passed in
%               as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
%       appear as red crosses. Furthermore, you can make the
%       markers larger by using plot(..., 'rx', 'MarkerSize', 10);

figure; % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10);
xlabel('Population of City in 10,000s'); % x holds the population data
ylabel('Profit in $10,000s');            % y holds the profit data

% ============================================================

end

(3) warmUpExercise.m

function A = warmUpExercise()
%WARMUPEXERCISE Example function in Octave
%   A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix
%               In Octave, we return values by defining which variables
%               represent the return values (at the top of the file)
%               and then set them accordingly.

A = eye(5);

% ===========================================

end
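For reference, computeCost.m evaluates the standard squared-error cost from the lectures, written in vectorized form (with hypothesis $h_\theta(x) = \theta^\top x$):

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)^2 = \frac{1}{2m} (X\theta - y)^\top (X\theta - y)

A quick hand-checkable sanity test (the tiny dataset below is made up purely for illustration and is not part of the assignment):

X = [1 1; 1 2; 1 3];      % m = 3 examples, first column is the intercept term
y = [1; 2; 3];
computeCost(X, y, [0; 1]) % perfect fit h(x) = x, expected cost: 0
computeCost(X, y, [0; 0]) % expected cost: (1 + 4 + 9) / (2*3) = 7/3, about 2.333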
featureNormalize.m

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%
mu = mean(X_norm);   % 1 x n vector of per-feature means
sigma = std(X_norm); % 1 x n vector of per-feature standard deviations
X_norm = (X_norm - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);

% ============================================================

end

computeCostMulti.m

function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

res = X * theta - y;        % the vectorized form is identical to computeCost
J = (res' * res) / (2 * m);

% =========================================================================

end

gradientDescent.m

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %
    theta = theta - alpha / m * ((X * theta - y)' * X)';

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end

gradientDescentMulti.m

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %
    t = ((X * theta - y)' * X)'; % n x 1 gradient (transposed so it matches theta)
    theta = theta - alpha / m * t;

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end

normalEqn.m

function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X, y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%

% ---------------------- Sample Solution ----------------------
theta = pinv(X' * X) * X' * y;
% -------------------------------------------------------------

% ============================================================

end
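Both gradient descent files implement the batch update from the lectures, and normalEqn.m evaluates the closed-form minimizer. For reference, the vectorized update and the normal equation are:

\theta := \theta - \frac{\alpha}{m} X^\top (X\theta - y)

\theta = (X^\top X)^{-1} X^\top y

Using pinv rather than inv keeps the normal-equation solution well defined even when $X^\top X$ is singular (for example, with redundant features).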
The overall results of running the scripts are shown in the figure. [Figure: overall run results]

Programming Exercise 2: Logistic Regression

Introduction

In this exercise, you will implement logistic regression and apply it to two different datasets. Before starting on the programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics. To get started with the exercise, you will need to download the starter code and unzip its contents to the directory where you wish to complete the exercise. If needed, use the cd command in Octave to change to this directory before starting this exercise. You can also find instructions for installing Octave on the "Octave Installation" page on the course website.

Files included in this exercise

ex2.m - Octave script that will help step you through the exercise
ex2_reg.m - Octave script for the later parts of the exercise
ex2data1.txt - Training set for the first half of the exercise
ex2data2.txt - Training set for the second half of the exercise
submitWeb.m - Alternative submission script
submit.m - Submission script that sends your solutions to our servers
mapFeature.m - Function to generate polynomial features
plotDecisionBoundary.m - Function to plot classifier's decision boundary
[*] plotData.m - Function to plot 2D classification data
[*] sigmoid.m - Sigmoid function
[*] costFunction.m - Logistic regression cost function
[*] predict.m - Logistic regression prediction function
[*] costFunctionReg.m - Regularized logistic regression cost

* indicates files you will need to complete

Throughout the exercise, you will be using the scripts ex2.m and ex2_reg.m. These scripts set up the dataset for the problems and make calls to functions that you will write. You do not need to modify either of them. You are only required to modify functions in other files, by following the instructions in this assignment.

The code, completed according to the experiment requirements, is as follows:

plotData.m

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(x, y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be a Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.
%
positive = find(y == 1); % row indices of the positive examples
negative = find(y == 0); % row indices of the negative examples
plot(X(positive, 1), X(positive, 2), 'k+', 'MarkerSize', 7, 'LineWidth', 2);
plot(X(negative, 1), X(negative, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

% =========================================================================

hold off;

end

sigmoid.m

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   J = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z)); % element-wise, so z may be a scalar, vector, or matrix

% =============================================================

end

costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
hypothesis = sigmoid(X * theta);
J = 1 / m * sum(-y .* log(hypothesis) - (1 - y) .* log(1 - hypothesis));
n = size(X, 2);
for i = 1:n
    grad(i) = 1 / m * dot(hypothesis - y, X(:, i));
end

% Note: grad should have the same dimensions as theta
%
% =============================================================

end
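For reference, costFunction.m implements the unregularized logistic regression cost and gradient from the lectures, with hypothesis $h_\theta(x) = g(\theta^\top x)$ and $g(z) = 1 / (1 + e^{-z})$:

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \Bigl[ -y^{(i)} \log h_\theta(x^{(i)}) - (1 - y^{(i)}) \log\bigl(1 - h_\theta(x^{(i)})\bigr) \Bigr]

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_j^{(i)}

The loop over features in the code computes exactly this gradient; an equivalent fully vectorized form is grad = X' * (hypothesis - y) / m.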
predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
p = sigmoid(X * theta) >= 0.5; % logical m x 1 vector: 1 where h(x) >= 0.5

% =========================================================================

end
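For context, a minimal sketch of how these pieces are typically driven (ex2.m follows roughly these steps; the snippet below is illustrative, not the full course script):

% Load the first training set and add the intercept column
data = load('ex2data1.txt');
X = data(:, 1:2);
y = data(:, 3);
[m, n] = size(X);
X = [ones(m, 1) X];

% Minimize the cost with Octave's fminunc, supplying our own gradient
initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t) costFunction(t, X, y), initial_theta, options);

% Evaluate training accuracy with the learned parameters
p = predict(theta, X);
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);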