@rickyChen
2017-07-11T11:32:33.000000Z
Tags: Spark, Yarn
Starting the NodeManager fails with "No space left on device"
Use the df -h command to check whether the disks that hold the NodeManager's runtime and startup logs have enough free space.
Reading data from a Kafka topic with pyspark fails with java.lang.NoClassDefFoundError: org/apache/kafka/common/message/KafkaLZ4BlockOutputStream
Before the change:
./bin/spark-submit --jars lib/spark-streaming-kafka_2.10-1.6.1.jar,lib/kafka_2.10-0.8.2.1.jar,lib/metrics-core-2.2.0.jar --deploy-mode client ./project/stream.py
After the change (the missing class is packaged in kafka-clients, so kafka-clients-0.8.2.1.jar has to be added to --jars as well):
./bin/spark-submit --jars lib/spark-streaming-kafka_2.10-1.6.1.jar,lib/kafka_2.10-0.8.2.1.jar,lib/metrics-core-2.2.0.jar,lib/kafka-clients-0.8.2.1.jar --deploy-mode client ./project/stream.py
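The stream.py script itself is not shown in the post; as a rough sketch, a receiver-based consumer on the Spark 1.6 Kafka API that exercises this code path could look like the following (the ZooKeeper address, consumer group, and topic name are placeholders, not values from the original job):

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

# Receiver-based Kafka DStream (Spark 1.6 API); the Kafka jars passed via
# --jars above supply the classes this call needs at runtime, including
# KafkaLZ4BlockOutputStream from kafka-clients.
sc = SparkContext(appName="kafka-stream")
ssc = StreamingContext(sc, 10)  # 10-second batch interval
messages = KafkaUtils.createStream(ssc, "zk-host:2181", "stream-group", {"my-topic": 1})
messages.map(lambda kv: kv[1]).pprint()  # print message values only
ssc.start()
ssc.awaitTermination()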
Running hdfs balancer prints "Another Balancer is running.. Exiting ..."
HDFS HA mode does not work with the rebalancer by default; as a workaround, point the balancer at a single namenode and clear dfs.nameservices:
bin/hdfs balancer -Dfs.defaultFS=hdfs://namenode:8020 -Ddfs.nameservices="" -threshold 10
Parsing IPs with MaxMind GeoIP fails with
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.node.ArrayNode.&lt;init&gt;(Lcom/fasterxml/jackson/databind/node/JsonNodeFactory;Ljava/util/List;)V
The com.maxmind.geoip2 version needs to be 2.5.0 to remain compatible with Spark itself (the error comes from a conflict with the Jackson version that Spark bundles):
<dependency>
<groupId>com.maxmind.geoip2</groupId>
<artifactId>geoip2</artifactId>
<version>2.5.0</version>
</dependency>