Flume Learning (Part 4)


Continuing from the previous chapter.

nginx configuration

```
worker_processes  1;

events {
    worker_connections  1024;
}

http {
    include       mime.types;
    default_type  application/octet-stream;

    log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                      '$status $body_bytes_sent "$http_referer" '
                      '"$http_user_agent" "$http_x_forwarded_for"';

    log_format  log_format  '$remote_addr^A$msec^A$http_host^A$request_uri';

    access_log  /home/hadoop/access.log  log_format;

    sendfile        on;
    keepalive_timeout  65;
    #include /etc/nginx/conf.d/*.conf;

    server {
        listen       80;
        server_name  hh 0.0.0.0;

        location ~ .*(BfImg)\.(gif)$ {
            default_type  image/gif;
            root  /usr/local/nginx/www/source;
        }
    }
}
```
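
The custom log_format joins the fields with ^A (ASCII 0x01), which keeps them easy to split downstream; $msec is the request timestamp in seconds with millisecond resolution, as seen in the log samples further below. A quick way to eyeball the fields (a shell sketch, not part of the original post):

```
# ^A is the control character 0x01; print remote address, host and request URI
awk 'BEGIN { FS = "\001" } { print $1, $3, $4 }' /home/hadoop/access.log
```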

Run nginx

```
sbin/nginx
```
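
Not part of the original post, but useful while iterating on the configuration: nginx can validate the config file and reload it without a full restart.

```
# validate nginx.conf before starting or reloading
sbin/nginx -t
# pick up configuration changes in a running instance
sbin/nginx -s reload
```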

Test nginx

```
# curl http://dev-hadoop-single.com:80
<!DOCTYPE html>
<html>
<head>
<title>Welcome to nginx!</title>
<style>
    body {
        width: 35em;
        margin-left: 100px;
        margin-top:100px;
        font-family: Tahoma, Verdana, Arial, sans-serif;
    }
</style>
</head>
<body>
<h1>Welcome to nginx! backup</h1>
<p>this is the first page</p>
<p>If you see this page, the nginx web server is successfully installed and
working. Further configuration is required.</p>
<p>For online documentation and support please refer to
<a href="http://nginx.org/">nginx.org</a>.<br/>
Commercial support is available at
<a href="http://nginx.com/">nginx.com</a>.</p>
<p><em>Thank you for using nginx.</em></p>
</body>
</html>
```
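
The welcome page only proves nginx is up. To exercise the image location from the config, a HEAD request against a matching URL can be used; this assumes a BfImg.gif file has been placed under /usr/local/nginx/www/source (if it has not, nginx answers 404, but the request is still appended to /home/hadoop/access.log because access_log is set at the http level).

```
# expect "Content-Type: image/gif" when the file exists
curl -sI "http://dev-hadoop-single.com/BfImg.gif"
```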

Start Flume and run a test
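
The agent configuration itself comes from the previous chapter. As a reminder, a minimal sketch that would produce the HDFS paths seen below might look like the following; the agent name a1, the file name nginx-access.conf, the exec source and the roll values are assumptions, not the original settings.

```
# nginx-access.conf -- hypothetical file name
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

# assumed source: tail the nginx access log written above
a1.sources.r1.type     = exec
a1.sources.r1.command  = tail -F /home/hadoop/access.log
a1.sources.r1.channels = c1

a1.channels.c1.type     = memory
a1.channels.c1.capacity = 1000

# HDFS sink; path and prefix match the files listed further down
a1.sinks.k1.type                   = hdfs
a1.sinks.k1.channel                = c1
a1.sinks.k1.hdfs.path              = hdfs://dev-hadoop-single.com:8020/flume/events-02/%Y-%m-%d
a1.sinks.k1.hdfs.filePrefix        = log-spool
a1.sinks.k1.hdfs.fileType          = DataStream
a1.sinks.k1.hdfs.useLocalTimeStamp = true
# roll settings decide when a .tmp file is closed and renamed (values are guesses)
a1.sinks.k1.hdfs.rollInterval = 30
a1.sinks.k1.hdfs.rollSize     = 10240
a1.sinks.k1.hdfs.rollCount    = 0
```

The agent is then started in the usual way:

```
bin/flume-ng agent --conf conf --conf-file conf/nginx-access.conf \
  --name a1 -Dflume.root.logger=INFO,console
```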

Open the following URL in a browser and request it several times:

http://dev-hadoop-single.com/a.gif?userid=xxx&ctime=xxxx&url=YYY
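
The same hits can also be generated from the shell; the query-string values are the same placeholders as in the URL above.

```
for i in 1 2 3; do
  curl -s "http://dev-hadoop-single.com/a.gif?userid=xxx&ctime=xxxx&url=YYY" > /dev/null
done
```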
Then check /home/hadoop/access.log:
```
# tail -f /home/hadoop/access.log
192.168.56.1^A1476876997.488^Adev-hadoop-single.com^A/a.gif
192.168.56.1^A1476876999.153^Adev-hadoop-single.com^A/a.gif
192.168.56.1^A1476877028.421^Adev-hadoop-single.com^A/a.gif
192.168.56.1^A1476877030.573^Adev-hadoop-single.com^A/a.gif
192.168.56.1^A1476877064.052^Adev-hadoop-single.com^A/a.gif?userid=xxx&ctime=xxxx&url=YYY
192.168.56.1^A1476877065.791^Adev-hadoop-single.com^A/a.gif?userid=xxx&ctime=xxxx&url=YYY
192.168.56.1^A1476877067.537^Adev-hadoop-single.com^A/a.gif?userid=xxx&ctime=xxxx&url=YYY

```

Check the Flume log. The HDFS sink writes incoming events to a .tmp file and renames it to its final name once the roll criteria are met:

```
16/10/19 19:43:21 INFO hdfs.BucketWriter: Closing hdfs://dev-hadoop-single.com:8020/flume/events-02/2016-10-19/log-spool.1476876160609.tmp
16/10/19 19:43:21 INFO hdfs.BucketWriter: Renaming hdfs://dev-hadoop-single.com:8020/flume/events-02/2016-10-19/log-spool.1476876160609.tmp to hdfs://dev-hadoop-single.com:8020/flume/events-02/2016-10-19/log-spool.1476876160609
16/10/19 19:43:22 INFO hdfs.BucketWriter: Creating hdfs://dev-hadoop-single.com:8020/flume/events-02/2016-10-19/log-spool.1476876160610.tmp

```

Check the HDFS directory; the .tmp files have been rolled and renamed:

```
$ hdfs dfs -ls /flume/events-02/2016-10-19
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.5.0-cdh5.3.6/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hbase-0.98.6-cdh5.3.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/10/19 19:46:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 33 items
-rw-r--r-- 1 hadoop supergroup 10397 2016-10-19 19:43 /flume/events-02/2016-10-19/log-spool.1476876160609
-rw-r--r-- 1 hadoop supergroup 10439 2016-10-19 19:43 /flume/events-02/2016-10-19/log-spool.1476876160610
-rw-r--r-- 1 hadoop supergroup 10439 2016-10-19 19:43 /flume/events-02/2016-10-19/log-spool.1476876160611
-rw-r--r-- 1 hadoop supergroup 10439 2016-10-19 19:44 /flume/events-02/2016-10-19/log-spool.1476876160612
```
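
To confirm the events really landed, one of the rolled files can be read back. With hdfs.fileType = DataStream (as in the sketch above) the file contains the raw log lines; with the default SequenceFile format the output would include binary headers.

```
hdfs dfs -cat /flume/events-02/2016-10-19/log-spool.1476876160609 | tail -5
```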
