
头歌 (EduCoder): Shared-Bicycle Data Analysis

This article presents reference solutions for the EduCoder (头歌) exercise "共享单车之数据分析" (shared-bicycle data analysis), one Java MapReduce-over-HBase job per level. I hope it is useful; if you find an error or an unhandled case, corrections are welcome.

Level 1: Compute the average daily usage time of shared bicycles

package com.educoder.bigData.sharedbicycle;
 
import java.io.IOException;
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.Collection;
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;
import org.apache.commons.lang3.time.DateFormatUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
 
import com.educoder.bigData.util.HBaseUtil;
 
/**
 * 統(tǒng)計(jì)共享單車每天的平均使用時(shí)間
 */
public class AveragetTimeMapReduce extends Configured implements Tool {
 
	public static final byte[] family = "info".getBytes();
 
	public static class MyMapper extends TableMapper<Text, BytesWritable> {
		@Override
		protected void map(ImmutableBytesWritable rowKey, Result result, Context context)
				throws IOException, InterruptedException {
			/********** Begin *********/
			// Trip start/end times are stored as epoch-millisecond strings
			long beginTime = Long.parseLong(Bytes.toString(result.getValue(family, "beginTime".getBytes())));
			long endTime = Long.parseLong(Bytes.toString(result.getValue(family, "endTime".getBytes())));
			// Key every trip by its start date; the ride duration is end - start
			String format = DateFormatUtils.format(beginTime, "yyyy-MM-dd", Locale.CHINA);
			long useTime = endTime - beginTime;
			BytesWritable bytesWritable = new BytesWritable(Bytes.toBytes(format + "_" + useTime));
			context.write(new Text("avgTime"), bytesWritable);
			/********** End *********/
		}
	}
 
	public static class MyTableReducer extends TableReducer<Text, BytesWritable, ImmutableBytesWritable> {
		@Override
		public void reduce(Text key, Iterable<BytesWritable> values, Context context)
				throws IOException, InterruptedException {
			/********** Begin *********/
			double sum = 0;
			int length = 0;
			// Accumulate total usage time per day: date -> sum of ride durations (ms)
			Map<String, Long> map = new HashMap<String, Long>();
			for (BytesWritable value : values) {
				String string = Bytes.toString(value.copyBytes());
				String[] split = string.split("_");
				if (map.containsKey(split[0])) {
					map.put(split[0], map.get(split[0]) + Long.parseLong(split[1]));
				} else {
					map.put(split[0], Long.parseLong(split[1]));
				}
			}
			// Average the per-day totals and convert milliseconds to seconds
			Collection<Long> values2 = map.values();
			for (Long i : values2) {
				length++;
				sum += i;
			}
			BigDecimal decimal = new BigDecimal(sum / length / 1000);
			BigDecimal setScale = decimal.setScale(2, RoundingMode.HALF_DOWN);
			Put put = new Put(Bytes.toBytes(key.toString()));
			put.addColumn(family, "avgTime".getBytes(), Bytes.toBytes(setScale.toString()));
			context.write(null, put);
			/********** End *********/
		}
 
	}
 
	public int run(String[] args) throws Exception {
		// Configure the job
		Configuration conf = HBaseUtil.conf;
		String arg1 = "t_shared_bicycle";
		String arg2 = "t_bicycle_avgtime";
		try {
			HBaseUtil.createTable(arg2, new String[] { "info" });
		} catch (Exception e) {
			// 創(chuàng)建表失敗
			e.printStackTrace();
		}
		Job job = configureJob(conf, new String[] { arg1, arg2 });
		return job.waitForCompletion(true) ? 0 : 1;
	}
 
	private Job configureJob(Configuration conf, String[] args) throws IOException {
		String tablename = args[0];
		String targetTable = args[1];
		Job job = new Job(conf, tablename);
		Scan scan = new Scan();
		scan.setCaching(300);
		scan.setCacheBlocks(false); // never enable scan block caching in a MapReduce job
		// Initialize the mapper
		TableMapReduceUtil.initTableMapperJob(tablename, scan, MyMapper.class, Text.class, BytesWritable.class, job);
		// Initialize the reducer
		TableMapReduceUtil.initTableReducerJob(targetTable, // output table
				MyTableReducer.class, // reducer class
				job);
		job.setNumReduceTasks(1);
		return job;
	}
}
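
Each solution class implements Hadoop's Tool interface, but the EduCoder grader supplies the entry point. To run a level standalone you would need a small driver. The sketch below is hypothetical (the class name AverageTimeDriver is mine, not the exercise's) and assumes HBaseUtil.conf points at a reachable HBase cluster, as it does in the course environment:

package com.educoder.bigData.sharedbicycle;

import org.apache.hadoop.util.ToolRunner;

public class AverageTimeDriver {
	public static void main(String[] args) throws Exception {
		// ToolRunner parses generic Hadoop options, then invokes run(args)
		int exitCode = ToolRunner.run(new AveragetTimeMapReduce(), args);
		System.exit(exitCode);
	}
}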

第2關(guān)?統(tǒng)計(jì)共享單車在指定地點(diǎn)的每天平均次數(shù)?

package com.educoder.bigData.sharedbicycle;

import java.io.IOException;
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;
import org.apache.commons.lang3.time.DateFormatUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.CompareOperator;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.Filter;
import org.apache.hadoop.hbase.filter.FilterList;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.filter.SubstringComparator;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;

import com.educoder.bigData.util.HBaseUtil;

/**
 * Average daily number of shared-bicycle trips from 河北省保定市雄县 to 韩庄村.
 */
public class AverageVehicleMapReduce extends Configured implements Tool {

    public static final byte[] family = "info".getBytes();

    public static class MyMapper extends TableMapper<Text, BytesWritable> {
        @Override
        protected void map(ImmutableBytesWritable rowKey, Result result, Context context)
                throws IOException, InterruptedException {
            /********** Begin *********/
            // Rows are pre-filtered by the scan, so every row seen here is one
            // qualifying trip; emit its start date under a single route key
            String beginTime = Bytes.toString(result.getValue(family, "beginTime".getBytes()));
            String format = DateFormatUtils.format(Long.parseLong(beginTime), "yyyy-MM-dd", Locale.CHINA);
            BytesWritable bytesWritable = new BytesWritable(Bytes.toBytes(format));
            context.write(new Text("河北省保定市雄县-韩庄村"), bytesWritable);
            /********** End *********/
        }
    }

    public static class MyTableReducer extends TableReducer<Text, BytesWritable, ImmutableBytesWritable> {
        @Override
        public void reduce(Text key, Iterable<BytesWritable> values, Context context)
                throws IOException, InterruptedException {
            /********** Begin *********/
            double sum = 0;
            int length = 0;
            // Count trips per day: date -> number of trips
            Map<String, Integer> map = new HashMap<String, Integer>();
            for (BytesWritable value : values) {
                String string = Bytes.toString(value.copyBytes());
                if (map.containsKey(string)) {
                    map.put(string, map.get(string) + 1);
                } else {
                    map.put(string, 1);
                }
            }
            // Average the per-day counts over the number of distinct days
            Collection<Integer> values2 = map.values();
            for (Integer i : values2) {
                length++;
                sum += i;
            }
            BigDecimal decimal = new BigDecimal(sum / length);
            BigDecimal setScale = decimal.setScale(2, RoundingMode.HALF_DOWN);
            Put put = new Put(Bytes.toBytes(key.toString()));
            put.addColumn(family, "avgNum".getBytes(), Bytes.toBytes(setScale.toString()));
            context.write(null, put);
            /********** End *********/
        }
    }

    public int run(String[] args) throws Exception {
        // Configure the job
        Configuration conf = HBaseUtil.conf;
        String arg1 = "t_shared_bicycle";
        String arg2 = "t_bicycle_avgnum";
        try {
            HBaseUtil.createTable(arg2, new String[] { "info" });
        } catch (Exception e) {
            // Table creation failed
            e.printStackTrace();
        }
        Job job = configureJob(conf, new String[] { arg1, arg2 });
        return job.waitForCompletion(true) ? 0 : 1;
    }

    private Job configureJob(Configuration conf, String[] args) throws IOException {
        String tablename = args[0];
        String targetTable = args[1];
        Job job = new Job(conf, tablename);
        Scan scan = new Scan();
        scan.setCaching(300);
        scan.setCacheBlocks(false); // never enable scan block caching in a MapReduce job
        /********** Begin *********/
        // Keep only trips departing from 河北省保定市雄县 whose destination contains 韩庄村
        ArrayList<Filter> listForFilters = new ArrayList<Filter>();
        Filter destinationFilter = new SingleColumnValueFilter(Bytes.toBytes("info"), Bytes.toBytes("destination"),
                CompareOperator.EQUAL, new SubstringComparator("韩庄村"));
        Filter departure = new SingleColumnValueFilter(Bytes.toBytes("info"), Bytes.toBytes("departure"),
                CompareOperator.EQUAL, Bytes.toBytes("河北省保定市雄县"));
        listForFilters.add(departure);
        listForFilters.add(destinationFilter);
        Filter filters = new FilterList(listForFilters);
        scan.setFilter(filters);
        /********** End *********/
        // Initialize the mapper
        TableMapReduceUtil.initTableMapperJob(tablename, scan, MyMapper.class, Text.class, BytesWritable.class, job);
        // Initialize the reducer
        TableMapReduceUtil.initTableReducerJob(targetTable, // output table
                MyTableReducer.class, // reducer class
                job);
        job.setNumReduceTasks(1);
        return job;
    }
}
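
A caveat worth knowing when adapting this filter setup: by default, SingleColumnValueFilter also passes rows in which the tested column is missing entirely. The exercise dataset appears to populate departure and destination on every row, so the solution above works as-is; on sparser data you would normally add setFilterIfMissing(true), as in this sketch:

SingleColumnValueFilter departure = new SingleColumnValueFilter(Bytes.toBytes("info"),
        Bytes.toBytes("departure"), CompareOperator.EQUAL, Bytes.toBytes("河北省保定市雄县"));
// Skip rows that have no departure column at all
departure.setFilterIfMissing(true);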

第3關(guān)?統(tǒng)計(jì)共享單車指定車輛每次使用的空閑平均時(shí)間?

package com.educoder.bigData.sharedbicycle;

import java.io.IOException;
import java.math.BigDecimal;
import java.math.RoundingMode;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.CompareOperator;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.Filter;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;

import com.educoder.bigData.util.HBaseUtil;

/**
 * Average idle time between uses of bicycle 5996.
 */
public class FreeTimeMapReduce extends Configured implements Tool {

    public static final byte[] family = "info".getBytes();

    public static class MyMapper extends TableMapper<Text, BytesWritable> {
        @Override
        protected void map(ImmutableBytesWritable rowKey, Result result, Context context)
                throws IOException, InterruptedException {
            /********** Begin *********/
            // The scan filter already restricts rows to bicycle 5996;
            // emit each trip's start and end timestamps under one key
            long beginTime = Long.parseLong(Bytes.toString(result.getValue(family, "beginTime".getBytes())));
            long endTime = Long.parseLong(Bytes.toString(result.getValue(family, "endTime".getBytes())));
            BytesWritable bytesWritable = new BytesWritable(Bytes.toBytes(beginTime + "_" + endTime));
            context.write(new Text("5996"), bytesWritable);
            /********** End *********/
        }
    }

    public static class MyTableReducer extends TableReducer<Text, BytesWritable, ImmutableBytesWritable> {
        @Override
        public void reduce(Text key, Iterable<BytesWritable> values, Context context)
                throws IOException, InterruptedException {
            /********** Begin *********/
            // Each value is "beginTime_endTime". The loop assumes records arrive
            // ordered from the most recent trip to the earliest, so the previous
            // trip's start minus the current trip's end is one idle gap.
            long freeTime = 0;
            long beginTime = 0;
            int length = 0;
            for (BytesWritable time : values) {
                String[] split = Bytes.toString(time.copyBytes()).split("_");
                if (beginTime == 0) {
                    beginTime = Long.parseLong(split[0]);
                    continue;
                } else {
                    freeTime = freeTime + beginTime - Long.parseLong(split[1]);
                    beginTime = Long.parseLong(split[0]);
                    length++;
                }
            }
            Put put = new Put(Bytes.toBytes(key.toString()));
            // Integer division converts milliseconds to whole hours before scaling
            BigDecimal decimal = new BigDecimal(freeTime / length / 1000 / 60 / 60);
            BigDecimal setScale = decimal.setScale(2, RoundingMode.HALF_DOWN);
            put.addColumn(family, "freeTime".getBytes(), Bytes.toBytes(setScale.toString()));
            context.write(null, put);
            /********** End *********/
        }
    }

    public int run(String[] args) throws Exception {
        // Configure the job
        Configuration conf = HBaseUtil.conf;
        String arg1 = "t_shared_bicycle";
        String arg2 = "t_bicycle_freetime";
        try {
            HBaseUtil.createTable(arg2, new String[] { "info" });
        } catch (Exception e) {
            // Table creation failed
            e.printStackTrace();
        }
        Job job = configureJob(conf, new String[] { arg1, arg2 });
        return job.waitForCompletion(true) ? 0 : 1;
    }

    private Job configureJob(Configuration conf, String[] args) throws IOException {
        String tablename = args[0];
        String targetTable = args[1];
        Job job = new Job(conf, tablename);
        Scan scan = new Scan();
        scan.setCaching(300);
        scan.setCacheBlocks(false); // never enable scan block caching in a MapReduce job
        /********** Begin *********/
        // Keep only rows whose bicycleId equals 5996
        Filter filter = new SingleColumnValueFilter(Bytes.toBytes("info"), Bytes.toBytes("bicycleId"),
                CompareOperator.EQUAL, Bytes.toBytes("5996"));
        scan.setFilter(filter);
        /********** End *********/
        // Initialize the mapper
        TableMapReduceUtil.initTableMapperJob(tablename, scan, MyMapper.class, Text.class, BytesWritable.class, job);
        // Initialize the reducer
        TableMapReduceUtil.initTableReducerJob(targetTable, // output table
                MyTableReducer.class, // reducer class
                job);
        job.setNumReduceTasks(1);
        return job;
    }
}
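
To see what the reducer's gap arithmetic computes, here is a small self-contained sketch (class name and timestamps are made up, not from the exercise) that applies the same loop to three trips listed newest-first:

import java.util.Arrays;
import java.util.List;

public class IdleGapSketch {
    public static void main(String[] args) {
        // Trips as "beginTime_endTime" in ms, newest first (hypothetical values)
        List<String> trips = Arrays.asList("9000_9500", "7000_7800", "4000_5200");
        long freeTime = 0, beginTime = 0;
        int gaps = 0;
        for (String trip : trips) {
            String[] split = trip.split("_");
            if (beginTime == 0) {
                beginTime = Long.parseLong(split[0]); // first record only seeds beginTime
                continue;
            }
            freeTime += beginTime - Long.parseLong(split[1]); // previous start - current end
            beginTime = Long.parseLong(split[0]);
            gaps++;
        }
        // Gaps: 9000-7800 = 1200 and 7000-5200 = 1800, so the average is 1500 ms
        System.out.println("average idle ms = " + freeTime / gaps);
    }
}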

第4關(guān)?統(tǒng)計(jì)指定時(shí)間共享單車使用次數(shù)

package com.educoder.bigData.sharedbicycle;

import java.io.IOException;
import java.util.ArrayList;
import org.apache.commons.lang3.time.FastDateFormat;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.CompareOperator;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.Filter;
import org.apache.hadoop.hbase.filter.FilterList;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;

import com.educoder.bigData.util.HBaseUtil;

/**
 * Count shared-bicycle uses within a specified time range.
 */
public class UsageRateMapReduce extends Configured implements Tool {

    public static final byte[] family = "info".getBytes();

    public static class MyMapper extends TableMapper<Text, IntWritable> {
        @Override
        protected void map(ImmutableBytesWritable rowKey, Result result, Context context)
                throws IOException, InterruptedException {
            /********** Begin *********/
            // Every row that survives the scan filter is one use; emit a 1
            IntWritable one = new IntWritable(1);
            context.write(new Text("departure"), one);
            /********** End *********/
        }
    }

    public static class MyTableReducer extends TableReducer<Text, IntWritable, ImmutableBytesWritable> {
        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            /********** Begin *********/
            // Sum the 1s emitted by the mapper to get the total number of uses
            int totalNum = 0;
            for (IntWritable num : values) {
                totalNum += num.get();
            }
            Put put = new Put(Bytes.toBytes(key.toString()));
            put.addColumn(family, "usageRate".getBytes(), Bytes.toBytes(String.valueOf(totalNum)));
            context.write(null, put);
            /********** End *********/
        }
    }

    public int run(String[] args) throws Exception {
        // Configure the job
        Configuration conf = HBaseUtil.conf;
        String arg1 = "t_shared_bicycle";
        String arg2 = "t_bicycle_usagerate";
        try {
            HBaseUtil.createTable(arg2, new String[] { "info" });
        } catch (Exception e) {
            // Table creation failed
            e.printStackTrace();
        }
        Job job = configureJob(conf, new String[] { arg1, arg2 });
        return job.waitForCompletion(true) ? 0 : 1;
    }

    private Job configureJob(Configuration conf, String[] args) throws IOException {
        String tablename = args[0];
        String targetTable = args[1];
        Job job = new Job(conf, tablename);
        ArrayList<Filter> listForFilters = new ArrayList<Filter>();
        FastDateFormat instance = FastDateFormat.getInstance("yyyy-MM-dd");
        Scan scan = new Scan();
        scan.setCaching(300);
        scan.setCacheBlocks(false); // never enable scan block caching in a MapReduce job
        /********** Begin *********/
        // Keep trips that start on or after 2017-08-01 and end on or before
        // 2017-09-01. Timestamps are stored as 13-digit epoch-millisecond
        // strings, so byte-wise string comparison agrees with numeric order here.
        try {
            Filter beginFilter = new SingleColumnValueFilter(Bytes.toBytes("info"), Bytes.toBytes("beginTime"),
                    CompareOperator.GREATER_OR_EQUAL,
                    Bytes.toBytes(String.valueOf(instance.parse("2017-08-01").getTime())));
            Filter endFilter = new SingleColumnValueFilter(Bytes.toBytes("info"), Bytes.toBytes("endTime"),
                    CompareOperator.LESS_OR_EQUAL,
                    Bytes.toBytes(String.valueOf(instance.parse("2017-09-01").getTime())));
            listForFilters.add(endFilter);
            listForFilters.add(beginFilter);
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
        Filter filters = new FilterList(listForFilters);
        scan.setFilter(filters);
        /********** End *********/
        // Initialize the mapper
        TableMapReduceUtil.initTableMapperJob(tablename, scan, MyMapper.class, Text.class, IntWritable.class, job);
        // Initialize the reducer
        TableMapReduceUtil.initTableReducerJob(targetTable, // output table
                MyTableReducer.class, // reducer class
                job);
        job.setNumReduceTasks(1);
        return job;
    }
}
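
Once the job completes, the total lands in row "departure", column info:usageRate of t_bicycle_usagerate. A quick way to verify the result from a plain HBase client is sketched below; the class name UsageRateCheck is hypothetical, and it assumes HBaseUtil.conf is a standard HBase Configuration:

package com.educoder.bigData.sharedbicycle;

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

import com.educoder.bigData.util.HBaseUtil;

public class UsageRateCheck {
    public static void main(String[] args) throws Exception {
        try (Connection conn = ConnectionFactory.createConnection(HBaseUtil.conf);
                Table table = conn.getTable(TableName.valueOf("t_bicycle_usagerate"))) {
            Result r = table.get(new Get(Bytes.toBytes("departure")));
            byte[] v = r.getValue(Bytes.toBytes("info"), Bytes.toBytes("usageRate"));
            // Prints the number of trips fully inside [2017-08-01, 2017-09-01]
            System.out.println("uses in range: " + Bytes.toString(v));
        }
    }
}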

?第5關(guān)?統(tǒng)計(jì)共享單車線路流量文章來源地址http://www.zghlxwxcb.cn/news/detail-829195.html

package com.educoder.bigData.sharedbicycle;
 
import java.io.IOException;
 
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
 
import com.educoder.bigData.util.HBaseUtil;
 
/**
 * 共享單車線路流量統(tǒng)計(jì)
 */
public class LineTotalMapReduce extends Configured implements Tool {
 
	public static final byte[] family = "info".getBytes();
 
	public static class MyMapper extends TableMapper<Text, IntWritable> {
		@Override
		protected void map(ImmutableBytesWritable rowKey, Result result, Context context)
				throws IOException, InterruptedException {
			/********** Begin *********/
			// Build a route key from start/stop coordinates plus the place names,
			// then emit a 1 per trip so the reducer can count trips per route
			String start_latitude = Bytes.toString(result.getValue(family, "start_latitude".getBytes()));
			String start_longitude = Bytes.toString(result.getValue(family, "start_longitude".getBytes()));
			String stop_latitude = Bytes.toString(result.getValue(family, "stop_latitude".getBytes()));
			String stop_longitude = Bytes.toString(result.getValue(family, "stop_longitude".getBytes()));
			String departure = Bytes.toString(result.getValue(family, "departure".getBytes()));
			String destination = Bytes.toString(result.getValue(family, "destination".getBytes()));
			IntWritable one = new IntWritable(1);
			context.write(new Text(start_latitude + "-" + start_longitude + "_" + stop_latitude + "-"
					+ stop_longitude + "_" + departure + "-" + destination), one);
			/********** End *********/
		}
	}
 
	public static class MyTableReducer extends TableReducer<Text, IntWritable, ImmutableBytesWritable> {
		@Override
		public void reduce(Text key, Iterable<IntWritable> values, Context context)
				throws IOException, InterruptedException {
			/********** Begin *********/
			// Sum the 1s per route key to get the traffic on that route
			int totalNum = 0;
			for (IntWritable num : values) {
				totalNum += num.get();
			}
			// The rowkey is the route key with the total appended
			Put put = new Put(Bytes.toBytes(key.toString() + totalNum));
			put.addColumn(family, "lineTotal".getBytes(), Bytes.toBytes(String.valueOf(totalNum)));
			context.write(null, put);
			/********** End *********/
		}
 
	}
 
	public int run(String[] args) throws Exception {
		// Configure the job
		Configuration conf = HBaseUtil.conf;
		String arg1 = "t_shared_bicycle";
		String arg2 = "t_bicycle_linetotal";
		try {
			HBaseUtil.createTable(arg2, new String[] { "info" });
		} catch (Exception e) {
			// 創(chuàng)建表失敗
			e.printStackTrace();
		}
		Job job = configureJob(conf, new String[] { arg1, arg2 });
		return job.waitForCompletion(true) ? 0 : 1;
	}
 
	private Job configureJob(Configuration conf, String[] args) throws IOException {
		String tablename = args[0];
		String targetTable = args[1];
		Job job = new Job(conf, tablename);
		Scan scan = new Scan();
		scan.setCaching(300);
		scan.setCacheBlocks(false); // never enable scan block caching in a MapReduce job
		// Initialize the mapper
		TableMapReduceUtil.initTableMapperJob(tablename, scan, MyMapper.class, Text.class, IntWritable.class, job);
		// Initialize the reducer
		TableMapReduceUtil.initTableReducerJob(targetTable, // output table
				MyTableReducer.class, // reducer class
				job);
		job.setNumReduceTasks(1);
		return job;
	}
}
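
All five levels lean on the course-provided com.educoder.bigData.util.HBaseUtil for the shared Configuration and for createTable. Its implementation is not shown in the exercise; a minimal stand-in using the standard HBase 2.x admin API might look like the sketch below. This is a hypothetical reconstruction under those assumptions, not the course's actual code:

package com.educoder.bigData.util;

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.ColumnFamilyDescriptorBuilder;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.TableDescriptorBuilder;

public class HBaseUtil {
	// Loads hbase-site.xml from the classpath; the course VM ships one
	public static final Configuration conf = HBaseConfiguration.create();

	public static void createTable(String name, String[] families) throws IOException {
		try (Connection conn = ConnectionFactory.createConnection(conf);
				Admin admin = conn.getAdmin()) {
			TableDescriptorBuilder builder = TableDescriptorBuilder.newBuilder(TableName.valueOf(name));
			for (String family : families) {
				builder.setColumnFamily(ColumnFamilyDescriptorBuilder.of(family));
			}
			admin.createTable(builder.build());
		}
	}
}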

That concludes the five levels of the EduCoder "共享单车之数据分析" exercise.
