Important commands for Hadoop:
SSH into the VM:
ssh root@127.0.0.1 -p 2222
Transfer files from the local machine to the VM's /tmp/data folder (as root):
scp -P 2222 /Users/Sam/Desktop/AdventureWorks\ 2012\ OLTP\ Script/* root@127.0.0.1:/tmp/data
Copy from the VM's local filesystem into HDFS:
hadoop fs -copyFromLocal /tmp/data* /user/hadoop/input
List all files in the HDFS input directory:
hadoop fs -ls /user/hadoop/input
Create the input directory in HDFS:
hadoop fs -mkdir -p /user/hadoop/input
Ways to run the program (jar path, main class, input path(s), output path); a sketch of a typical WordCount class follows these commands:
hadoop jar /tmp/Samples-0.0.1-SNAPSHOT.jar CloudWick.Samples.WordCount hdfs:/user/hadoop/input/data/StateProvince.csv hdfs:/user/hadoop/output
hadoop jar /tmp/Samples-0.0.1-SNAPSHOT.jar CloudWick.Samples.WordCount hdfs:/input/StateProvince.csv hdfs:/output
hadoop jar /tmp/data/Samples-0.0.1-SNAPSHOT.jar CloudWick.Samples.WordCount /user/hadoop/input/data/unique_tracks.txt /user/hadoop/input/unique_artists.txt /user/hadoop/output/aap1
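The WordCount class itself ships inside Samples-0.0.1-SNAPSHOT.jar and its source is not part of this file. For reference, a minimal Hadoop MapReduce word count in Java usually looks like the sketch below; the class and field names are illustrative, not the actual CloudWick.Samples.WordCount code, and it assumes the common two-argument form (input path, output path).

// Minimal sketch of a Hadoop (new-API) WordCount job -- illustrative, not the CloudWick source.
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emit (word, 1) for every token in an input line.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer (also usable as combiner): sum the counts for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input file or directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory (must not exist yet)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Packaged into a jar, such a class is launched exactly as above: hadoop jar <jar> <main class> <input> <output>. The output directory must not already exist, or the job aborts.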
List all files under /input:
hadoop fs -ls /input
Copy files from the VM into HDFS /input:
hadoop fs -copyFromLocal /tmp/data* hdfs:/input
Look at the input file:
hadoop fs -cat /input/StateProvince.csv
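Look at the job output (assuming the default output format, the reducer writes its results to part-r-* files under the output directory given on the command line):
hadoop fs -cat /user/hadoop/output/part-r-00000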