
Big Data Systems and Technology

Lab Report

Experiment: MapReduce Basic Programming Practice

I. Objectives

(1) Master basic MapReduce programming methods through hands-on practice;

(2) Learn to solve common data-processing problems with MapReduce, including data deduplication, data sorting, and data mining.

II. Experiment Contents

1. Counting site visits by date

A social-networking site has millions of registered users, and the web server keeps a log of user logins: each time a user logs in, the log file records that user's e-mail address. The raw data file currently available covers a single day and contains 8 million lines of records.

The operator wants to obtain, at regular intervals, information on how many times users logged in on a given day, as base data for user-behavior analysis and for drawing up an effective operations plan.

After data cleaning, each record provides a user name and a visit date.

(1) Write a program that counts visits by date, producing the total number of user visits for each calendar day.

(2) Building on (1), write a program that sorts the results by visit count.

(3) Write a program that saves the output by month: the final results must be written to two separate files by month (January 2016 and February 2016), and the record counts for January and February must also be reported.

Heather,2016-02-09
Jakeem,2016-02-09
Patience,2016-02-09
Price,2016-02-09
Bell,2016-02-09
Nolan,2016-02-09
Davis,2016-02-09
Patience,2016-02-09
Patience,2016-02-09
...
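No code for part (3) survives in the readable pages. In a real Hadoop job the month split is done by routing January and February records to two reduce tasks: extend org.apache.hadoop.mapreduce.Partitioner, return 0 or 1 from getPartition(), and call job.setNumReduceTasks(2) so each partition becomes its own output file. The routing logic itself can be sketched in plain Java; the class and method names below are illustrative, not from the report:

```java
import java.util.*;

public class MonthSplitSketch {
    // Mirrors what a Hadoop Partitioner's getPartition() would do:
    // records dated 2016-01 go to partition 0, 2016-02 to partition 1.
    static int getPartition(String date) {
        return date.startsWith("2016-01") ? 0 : 1;
    }

    public static void main(String[] args) {
        String[] records = {
            "Heather,2016-02-09", "Price,2016-01-15", "Bell,2016-01-31",
            "Nolan,2016-02-09", "Davis,2016-02-09"
        };
        // Two "output files", one per reduce partition.
        List<List<String>> parts = Arrays.asList(new ArrayList<>(), new ArrayList<>());
        for (String rec : records) {
            String date = rec.split(",")[1];          // second field is the visit date
            parts.get(getPartition(date)).add(rec);   // route the record to its month
        }
        // Report the January and February record counts, as the task requires.
        System.out.println("January records: " + parts.get(0).size());
        System.out.println("February records: " + parts.get(1).size());
    }
}
```

With two reduce tasks, Hadoop would write the two partitions as part-r-00000 (January) and part-r-00001 (February).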

2. Compute each student's average score across subjects.

zhangsan 88
lisi 99
wangwu 66
zhaoliu 77
zhangsan 78
lisi 89
wangwu 96
zhaoliu 67
zhangsan 80
lisi 82
wangwu 84
zhaoliu 86
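No MapReduce code for this task appears in the readable pages. The job is a standard group-and-average: the Mapper emits (name, score) pairs and the Reducer averages the values collected for each name. A minimal plain-Java sketch of that grouping-and-averaging logic, assuming the name and score are whitespace-separated (class and method names are illustrative; a real job would use Hadoop's Mapper/Reducer as in task 1):

```java
import java.util.*;

public class AverageScoreSketch {
    // Simulates the shuffle + reduce: group scores by student, then average.
    static Map<String, Double> averages(List<String> lines) {
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (String line : lines) {
            String[] fields = line.split("\\s+");     // name, then score
            grouped.computeIfAbsent(fields[0], k -> new ArrayList<>())
                   .add(Integer.parseInt(fields[1]));
        }
        Map<String, Double> result = new TreeMap<>();
        for (Map.Entry<String, List<Integer>> e : grouped.entrySet()) {
            double sum = 0;
            for (int s : e.getValue()) sum += s;      // same summation a Reducer would do
            result.put(e.getKey(), sum / e.getValue().size());
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> data = Arrays.asList(
            "zhangsan 88", "lisi 99", "wangwu 66", "zhaoliu 77",
            "zhangsan 78", "lisi 89", "wangwu 96", "zhaoliu 67",
            "zhangsan 80", "lisi 82", "wangwu 84", "zhaoliu 86");
        averages(data).forEach((name, avg) ->
            System.out.printf("%s\t%.2f%n", name, avg));
    }
}
```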

3. Count the calls made to public-service numbers.

13718855152 112
18610117315 110
89451849 112
13718855153 110
13718855154 112
18610117315 114
18910117315 114
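The readable pages contain no code for this task either. Each record pairs a caller with a public-service number (110, 112, 114), so one reasonable reading is a word-count-style job keyed on the service number. A plain-Java sketch of that counting logic, assuming whitespace-separated fields (names illustrative):

```java
import java.util.*;

public class ServiceCallSketch {
    // Groups call records by the public-service number (the second field)
    // and counts calls per service number, as a Reducer summing 1s would.
    static Map<String, Integer> countByService(List<String> records) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String rec : records) {
            String service = rec.split("\\s+")[1];   // caller, then service number
            counts.merge(service, 1, Integer::sum);  // emit (service, 1), then sum
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> data = Arrays.asList(
            "13718855152 112", "18610117315 110", "89451849 112",
            "13718855153 110", "13718855154 112", "18610117315 114",
            "18910117315 114");
        countByService(data).forEach((svc, n) -> System.out.println(svc + "\t" + n));
    }
}
```

If per-caller detail is wanted instead, the Reducer could concatenate the callers for each service number rather than count them.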

III. Procedure

1. Counting site visits by date (problem statement as in Section II)

(1) Count visits by date: obtain the total number of user visits for each calendar day.

The code is as follows:

package mapreduce.util;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class dailyAccessCount {

    public static class MyMapper
            extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            // Split on the comma separator into an array
            String[] array = line.split(",");
            // Use the visit date (the second field) as the key
            String keyOutput = array[1];
            // Emit the (date, 1) key-value pair
            context.write(new Text(keyOutput), one);
        }
    }

    public static class MyReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Accumulator, initialized to 0
            int sum = 0;
            for (IntWritable val : values) {
                // Sum all the values belonging to the same key
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "DailyAccessCount");
        job.setJarByClass(dailyAccessCount.class);
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // All arguments but the last are input paths
        for (int i = 0; i < args.length - 1; ++i) {
            FileInputFormat.addInputPath(job, new Path(args[i]));
        }
        // The last argument is the output path
        FileOutputFormat.setOutputPath(job, new Path(args[args.length - 1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Export the jar package: in Eclipse, open the Runnable JAR File Export wizard, select the launch configuration "dailyAccessCount - hadoop", set the export destination to /usr/local/hadoop/myapp/dac.jar, choose "Extract required libraries into generated JAR", and click Finish.

Run the jar package:

hadoop@hadoop-VirtualBox:~$ cd /usr/local/hadoop
hadoop@hadoop-VirtualBox:/usr/local/hadoop$ ./bin/hdfs dfs -copyFromLocal ~/user_login.txt /user/root/
hadoop@hadoop-VirtualBox:/usr/local/hadoop$ ./bin/hadoop jar ./myapp/dac.jar /user/root/user_login.txt /user/root/AccessCount

2022-12-31 01:21:19 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2022-12-31 01:21:20 INFO impl.MetricsSystemImpl: JobTracker metrics system started
2022-12-31 01:21:22 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2022-12-31 01:21:23 INFO input.FileInputFormat: Total input files to process : 1
2022-12-31 01:21:23 INFO mapreduce.JobSubmitter: number of splits:1
2022-12-31 01:21:24 INFO mapreduce.Job: Running job: job_local1989386249_0001
2022-12-31 01:21:25 INFO mapreduce.Job: map 0% reduce 0%
...
2022-12-31 01:21:38 INFO mapreduce.Job: Job job_local1989386249_0001 completed successfully
2022-12-31 01:21:38 INFO mapreduce.Job: Counters: 35
	File System Counters
		FILE: Number of bytes read=172358040
		HDFS: Number of bytes read=24263613
	Map-Reduce Framework
		Map input records=1366920
		Map output records=1366920
		Reduce input groups=366
		Reduce input records=1366920
		Reduce output records=366
		Spilled Records=2733840
	File Input Format Counters
		Bytes Read=24263613
	File Output Format Counters
		Bytes Written=5856

The output is as follows:

hadoop@hadoop-VirtualBox:/usr/local/hadoop$ ./bin/hdfs dfs -cat /user/root/AccessCount/part-r-00000
2016-01-01	5038
2016-01-02	5378
2016-01-03	5341
2016-01-04	5304
2016-01-05	5258
2016-01-06	5136
2016-01-07	5300
2016-01-08	5281
2016-01-09	5367
2016-01-10	5282
...
2016-12-29	3091
2016-12-30	3095
2016-12-31	3106
(366 lines in total, one per day of 2016)

(2) Building on (1), sort the results by visit count.

The code is as follows:

package mapreduce.util;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class accessTimesSort {

    public static class MyMapper
            extends Mapper<Object, Text, IntWritable, Text> {

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String lines = value.toString();
            // Split on the tab separator into an array
            String[] array = lines.split("\t");
            // Use the visit count as the key, so the shuffle sorts by count
            int keyOutput = Integer.parseInt(array[1]);
            // Use the visit date as the value
            String valueOutput = array[0];
            context.write(new IntWritable(keyOutput), new Text(valueOutput));
        }
    }

    public static class MyReducer
            extends Reducer<IntWritable, Text, Text, IntWritable> {

        public void reduce(IntWritable key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            // Keys arrive in ascending count order; write each (date, count) pair
            for (Text value : values) {
                context.write(value, key);
            }
        }
    }

    // Driver, following the same pattern as dailyAccessCount
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "AccessTimesSort");
        job.setJarByClass(accessTimesSort.class);
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        job.setMapOutputKeyClass(IntWritable.class);
        job.setMapOutputValueClass(Text.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        for (int i = 0; i < args.length - 1; ++i) {
            FileInputFormat.addInputPath(job, new Path(args[i]));
        }
        FileOutputFormat.setOutputPath(job, new Path(args[args.length - 1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
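Because IntWritable keys sort in ascending natural order during the shuffle, this job emits the smallest visit counts first. If descending order is wanted, a real Hadoop job would register a decreasing-order comparator via job.setSortComparatorClass(...). The intended ordering can be sketched in plain Java on the first job's tab-separated output (class and method names are illustrative):

```java
import java.util.*;

public class DescendingSortSketch {
    // Sorts (date, count) lines from the first job's output by count, descending.
    // In a real job the same effect needs a decreasing-order RawComparator
    // registered with job.setSortComparatorClass(...); plain Java stands in here.
    static List<String> sortByCountDesc(List<String> lines) {
        List<String> out = new ArrayList<>(lines);
        out.sort(Comparator.comparingInt(
                (String s) -> Integer.parseInt(s.split("\t")[1])).reversed());
        return out;
    }

    public static void main(String[] args) {
        List<String> firstJobOutput = Arrays.asList(
            "2016-01-01\t5038", "2016-01-11\t3126", "2016-01-02\t5378");
        sortByCountDesc(firstJobOutput).forEach(System.out::println);
    }
}
```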
