Compiling the Apache Kafka 0.8.1.1 Source Code
After nearly a month of work, we have largely finished migrating the source and sink plugins we originally wrote for Flume 0.9.4 over to Flume-ng 1.5.0, including the TailSource and TailDirSource plugins (along the way we added quite a few new features, such as failure recovery, resuming log transfers from a checkpoint, sending logs in blocks, and shipping logs on a fixed polling interval instead of waiting for one log file to finish before starting the next). The next step is to integrate Flume-ng 1.5.0 with the latest Kafka 0.8.1.1, so this article focuses on how to compile the Kafka 0.8.1.1 source code.
Before getting into the compilation steps, let's briefly look at what Kafka is:
"Kafka is a distributed, partitioned, replicated commit log service. It provides the functionality of a messaging system, but with a unique design." In other words, Kafka is a high-throughput distributed publish-subscribe messaging system with the following characteristics:
(1) Message persistence through an O(1) disk data structure, which maintains stable performance over long periods even with terabytes of stored messages.
(2) High throughput: even on very ordinary hardware, Kafka can handle hundreds of thousands of messages per second.
(3) Support for partitioning messages across Kafka servers and distributing consumption over a cluster of consumer machines.
(4) Support for parallel data loading into Hadoop.
The official documentation illustrates Kafka's distributed publish-subscribe architecture as follows:
[Figure: Kafka producer/consumer architecture diagram from the official documentation]
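To make the producer and consumer roles in the diagram concrete, here is a minimal sketch using the console tools that ship with Kafka 0.8.x. It assumes a broker is already running on localhost:9092 and ZooKeeper on localhost:2181; the topic name test is arbitrary:

    # create a topic with a single partition and no replication
    bin/kafka-topics.sh --create --zookeeper localhost:2181 \
        --replication-factor 1 --partitions 1 --topic test

    # publish messages typed on stdin to the topic
    bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

    # in another terminal, read the topic from the beginning
    bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning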
More information about Kafka is available at http://kafka.apache.org/. Now let's get to the point: how to compile Kafka 0.8.1.1. We can build it with the script that ships with the source, or we can build it with sbt; the sbt route is a bit more involved and is covered later in this article.
1. Compiling with the script bundled with Kafka
After downloading the Kafka source, you will find a gradlew wrapper script inside it, which we can use to compile the source:
    # tar -zxf kafka-0.8.1.1-src.tgz
    # cd kafka-0.8.1.1-src
    # ./gradlew releaseTarGz
Running the command above fails with the following error:
    :core:signArchives FAILED

    FAILURE: Build failed with an exception.

    Execution failed for task ':core:signArchives'.
    > Cannot perform signing task ':core:signArchives' because it
      has no configured signatory

    Run with --stacktrace option to get the stack trace. Run with
    --info or --debug option to get more log output.
This is a known bug (https://issues.apache.org/jira/browse/KAFKA-1297); you can skip the signing task and build with the following command instead:
    ./gradlew releaseTarGzAll -x signArchives
The build should now succeed (it prints a large amount of output along the way). We can also specify which Scala version to build against:
    ./gradlew -PscalaVersion=2.10.3 releaseTarGz -x signArchives
When the build finishes, a kafka_2.10-0.8.1.1.tgz file is generated under core/build/distributions/. It is the same as the archive you can download from the website and can be used directly.
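As a quick sanity check of the freshly built tarball, you can unpack it and start a single-node broker; this sketch assumes the default config files shipped inside the archive:

    # unpack the tarball produced by the build and enter it
    tar -zxf core/build/distributions/kafka_2.10-0.8.1.1.tgz
    cd kafka_2.10-0.8.1.1

    # start a local ZooKeeper, then the Kafka broker, using the bundled configs
    bin/zookeeper-server-start.sh config/zookeeper.properties &
    bin/kafka-server-start.sh config/server.properties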
2. Compiling with sbt
We can likewise build Kafka with sbt: check out the 0.8 branch of the Kafka repository, then run ./sbt update followed by ./sbt package. Abridged output is shown below:
    # git checkout -b 0.8 remotes/origin/0.8
    # ./sbt update
    [info] [SUCCESSFUL ] org.eclipse.jdt#core;3.1.1!core.jar (2243ms)
    [info] downloading http:
    [info] [SUCCESSFUL ] ant#ant;1.6.5!ant.jar (1150ms)
    [info] Resolving org.apache.hadoop#hadoop-core;0.20.2 ...
    [info] Resolving com.yammer.metrics#metrics-annotation;2.2.0 ...
    [info] Resolving com.yammer.metrics#metrics-annotation;2.2.0 ...
    [success] Total time: 168 s, completed Jun 18, 2014 6:51:38 PM
    # ./sbt package
    [info] Set current project to Kafka (in build file:/export1/spark/kafka/)
    Getting Scala 2.8.0 ...
    :: retrieving :: org.scala-sbt#boot-scala
    3 artifacts copied, 0 already retrieved (14544kB/27ms)
    [success] Total time: 1 s, completed Jun 18, 2014 6:52:37 PM
For Kafka 0.8 and later you also need to run the following command:
    # ./sbt assembly-package-dependency
    [info] Loading project definition from /export1/spark/kafka/project
    [warn] Multiple resolvers having different access mechanism configured with
    same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project
    resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
    [info] Set current project to Kafka (in build file:/export1/spark/kafka/)
    [warn] Credentials file /home/wyp/.m2/.credentials does not exist
    [info] Including slf4j-api-1.7.2.jar
    [info] Including metrics-annotation-2.2.0.jar
    [info] Including scala-compiler.jar
    [info] Including scala-library.jar
    [info] Including slf4j-simple-1.6.4.jar
    [info] Including metrics-core-2.2.0.jar
    [info] Including snappy-java-1.0.4.1.jar
    [info] Including zookeeper-3.3.4.jar
    [info] Including log4j-1.2.15.jar
    [info] Including zkclient-0.3.jar
    [info] Including jopt-simple-3.2.jar
    [warn] Merging 'META-INF/NOTICE' with strategy 'rename'
    [warn] Merging 'org/xerial/snappy/native/README' with strategy 'rename'
    [warn] Merging 'META-INF/maven/org.xerial.snappy/snappy-java/LICENSE' with strategy 'rename'
    [warn] Merging 'LICENSE.txt' with strategy 'rename'
    [warn] Merging 'META-INF/LICENSE' with strategy 'rename'
    [warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard'
    [warn] Strategy 'discard' was applied to a file
    [warn] Strategy 'rename' was applied to 5 files
    [success] Total time: 3 s, completed Jun 18, 2014 6:53:41 PM
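If you also want to exercise the code you just built, sbt's standard test task will run the project's unit test suite. This is plain sbt usage rather than anything specific to this article, and it can take a while:

    # run the unit tests on the checked-out 0.8 branch
    ./sbt test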
Of course, we can also specify the Scala version when building with sbt:
    ./sbt "++2.10.3 assembly-package-dependency"
Original article: 《Apache Kafka-0.8.1.1源碼編譯》, http://www.iteblog.com/archives/1044.