Kafka configuration and deployment based on CDH (detailed, runs successfully)
1. Download
http://archive.cloudera.com/kafka/parcels/2.2.0/
wget http://archive.cloudera.com/kafka/parcels/2.2.0/KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
wget http://archive.cloudera.com/kafka/parcels/2.2.0/KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel.sha1
2. Verify the checksum
[hadoop@hadoop003 softwares]$ sha1sum KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
359509e028ae91a2a082adfad5f64596b63ea750  KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
[hadoop@hadoop003 softwares]$ cat KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel.sha1
359509e028ae91a2a082adfad5f64596b63ea750
The two checksums match, which means the file was not corrupted during download and can be used as-is.
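(Optional) The comparison can also be done in a single step; a minimal sketch, assuming the parcel and its .sha1 file sit in the current directory:
actual=$(sha1sum KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel | awk '{print $1}')   # checksum of the downloaded file
expected=$(cat KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel.sha1)                   # checksum published alongside the parcel
[ "$actual" = "$expected" ] && echo "checksum OK" || echo "checksum MISMATCH"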
3. Extract and create a symlink
[hadoop@hadoop003 softwares]$ tar -zxf KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel -C ~/app
[hadoop@hadoop003 app]$ ln -s /home/hadoop/app/KAFKA-2.2.0-1.2.2.0.p0.68/ /home/hadoop/app/kafka
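(Optional) To call the Kafka scripts without typing their full path (as done in the restart command further below), the bin directory can be put on the PATH; a minimal sketch, assuming a bash login shell and the symlink created above:
# add to ~/.bash_profile (or run directly for the current session)
export KAFKA_HOME=/home/hadoop/app/kafka
export PATH=$KAFKA_HOME/lib/kafka/bin:$PATH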
4. Important directories
[hadoop@hadoop003 kafka]$ pwd
/home/hadoop/app/kafka
[hadoop@hadoop003 kafka]$ ll
total 20
drwxr-xr-x 2 hadoop hadoop 4096 Jul 7 2017 bin
drwxr-xr-x 5 hadoop hadoop 4096 Jul 7 2017 etc
drwxr-xr-x 3 hadoop hadoop 4096 Jul 7 2017 lib
drwxr-xr-x 2 hadoop hadoop 4096 Jul 7 2017 meta

### Kafka configuration directory -- this is where we edit the configuration files
[hadoop@hadoop003 kafka]$ ll etc/kafka/conf.dist/
total 48
-rw-r--r-- 1 hadoop hadoop 906 Jul 7 2017 connect-console-sink.properties
-rw-r--r-- 1 hadoop hadoop 909 Jul 7 2017 connect-console-source.properties
-rw-r--r-- 1 hadoop hadoop 2760 Jul 7 2017 connect-distributed.properties
-rw-r--r-- 1 hadoop hadoop 883 Jul 7 2017 connect-file-sink.properties
-rw-r--r-- 1 hadoop hadoop 881 Jul 7 2017 connect-file-source.properties
-rw-r--r-- 1 hadoop hadoop 1074 Jul 7 2017 connect-log4j.properties
-rw-r--r-- 1 hadoop hadoop 2061 Jul 7 2017 connect-standalone.properties
-rw-r--r-- 1 hadoop hadoop 4369 Jul 7 2017 log4j.properties
-rw-r--r-- 1 hadoop hadoop 5679 Jun 1 01:24 server.properties
-rw-r--r-- 1 hadoop hadoop 1032 Jul 7 2017 tools-log4j.properties

### Kafka program directory
[hadoop@hadoop003 kafka]$ ll lib/kafka/
total 112
drwxr-xr-x 2 hadoop hadoop 4096 Jul 7 2017 bin
drwxr-xr-x 2 hadoop hadoop 4096 Jul 7 2017 cloudera
lrwxrwxrwx 1 hadoop hadoop 43 Jun 1 02:11 config -> /etc/kafka/conf    # note: this entry is shown in red
-rw-rw-r-- 1 hadoop hadoop 48428 Jun 1 02:17 KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
drwxr-xr-x 2 hadoop hadoop 12288 Jul 7 2017 libs
-rwxr-xr-x 1 hadoop hadoop 28824 Jul 7 2017 LICENSE
drwxrwxr-x 2 hadoop hadoop 4096 Jun 1 01:39 logs
-rwxr-xr-x 1 hadoop hadoop 336 Jul 7 2017 NOTICE
drwxr-xr-x 2 hadoop hadoop 4096 Jul 7 2017 site-docs

### The config symlink points by default to the gateway configuration, i.e. the Cloudera Manager client configuration. Since we are not using CM, /etc/kafka/conf was never generated, so the link is dangling and flashes red in the listing above.
### The bin directory holds the Kafka scripts, e.g. the server start/stop scripts and the consumer/producer launch scripts.
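To confirm which entries under lib/kafka are actually dangling, GNU find can list broken links directly; a quick sketch (the config link above should be the only hit):
# -xtype l matches symlinks whose target does not exist
[hadoop@hadoop003 kafka]$ find lib/kafka -maxdepth 1 -xtype l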
5. Edit the configuration file
# Step 1:
[hadoop@hadoop003 kafka]$ cd etc/kafka/conf.dist
# Step 2:
vim server.properties
# Step 3: (the parameters that mainly need changing)
broker.id=0    # broker identifier
log.dirs=/home/hadoop/app/kafka/logs    # where the data is stored
log.retention.hours=168    # data retention period (168 hours = 7 days)
zookeeper.connect=hadoop001:2181,hadoop002:2181,hadoop003:2181/kafka    # where Kafka's metadata is kept in ZooKeeper (the /kafka chroot)
delete.topic.enable=true    # allow created topics to be deleted
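After saving, it is worth reviewing the effective settings and making sure the data directory referenced by log.dirs exists; a small sketch:
# print the non-comment, non-empty lines of server.properties
grep -Ev '^(#|$)' server.properties
# make sure the directory configured in log.dirs exists
mkdir -p /home/hadoop/app/kafka/logs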
6. Start Kafka
[hadoop@hadoop003 kafka]$ lib/kafka/bin/kafka-server-start.sh /home/hadoop/app/kafka/etc/kafka/conf.dist/server.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/app/KAFKA-2.2.0-1.2.2.0.p0.68/lib/kafka/libs/slf4j-log4j12-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/KAFKA-2.2.0-1.2.2.0.p0.68/lib/kafka/libs/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
log4j:ERROR Could not read configuration file from URL [file:lib/kafka/bin/../config/log4j.properties].
java.io.FileNotFoundException: lib/kafka/bin/../config/log4j.properties (No such file or directory)
        at java.io.FileInputStream.open0(Native Method)
        at java.io.FileInputStream.open(FileInputStream.java:195)
        at java.io.FileInputStream.<init>(FileInputStream.java:138)
        at java.io.FileInputStream.<init>(FileInputStream.java:93)
        at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
        at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
        at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
        at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
        at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
        at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
        at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
        at org.apache.kafka.common.utils.Utils.<clinit>(Utils.java:59)
        at kafka.Kafka$.getPropsFromArgs(Kafka.scala:41)
        at com.cloudera.kafka.wrap.Kafka$.main(Kafka.scala:72)
        at com.cloudera.kafka.wrap.Kafka.main(Kafka.scala)
log4j:ERROR Ignoring configuration file [file:lib/kafka/bin/../config/log4j.properties].
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (kafka.server.KafkaConfig).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
The startup hits a problem: the configuration file cannot be found
java.io.FileNotFoundException: lib/kafka/bin/../config/log4j.properties
Because the symlink lrwxrwxrwx 1 hadoop hadoop 43 Jun 1 02:11 config -> /etc/kafka/conf points to a path that does not exist, the link has to be repointed to etc/kafka/conf.dist/ instead:
[hadoop@hadoop003 kafka]$ rm lib/kafka/config
[hadoop@hadoop003 kafka]$ ln -s /home/hadoop/app/kafka/etc/kafka/conf.dist/ /home/hadoop/app/kafka/lib/kafka/config
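A quick check that the link now resolves (it should point at etc/kafka/conf.dist and no longer show as broken):
ls -l lib/kafka/config     # shows the new link target
ls lib/kafka/config/       # resolving through the link should list the .properties files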
Restart Kafka
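Before restarting, make sure the directory that the nohup output below is redirected into exists, otherwise the redirection fails; a one-line sketch matching the path used in the command that follows:
mkdir -p /home/hadoop/app/kafka/server-logs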
[hadoop@hadoop003 kafka]$ nohup kafka-server-start.sh /home/hadoop/app/kafka/etc/kafka/conf.dist/server.properties > /home/hadoop/app/kafka/server-logs/kafka-server.log 2>&1 &
No more error messages appear.
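To confirm the broker is really serving requests, a minimal smoke test can be run. This is only a sketch: it assumes the /kafka ZooKeeper chroot configured above, the default broker port 9092, and a throwaway topic named test:
# the Kafka JVM should show up
jps | grep -i kafka
# create a test topic (this Kafka release still administers topics through ZooKeeper)
kafka-topics.sh --create --zookeeper hadoop001:2181,hadoop002:2181,hadoop003:2181/kafka --replication-factor 1 --partitions 1 --topic test
# console producer in one terminal...
kafka-console-producer.sh --broker-list hadoop003:9092 --topic test
# ...console consumer in another; messages typed into the producer should appear here
kafka-console-consumer.sh --bootstrap-server hadoop003:9092 --topic test --from-beginning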