Kafka Learning Notes --- Implementing FlinkKafkaConsumer in Scala


pom.xml configuration:


<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>zetdata</groupId>
  <artifactId>FlinkKafkaConsumer</artifactId>
  <version>1.0-SNAPSHOT</version>
  <inceptionYear>2008</inceptionYear>
  <properties>
    <scala.version>2.11.7</scala.version>
  </properties>
  <repositories>
    <repository>
      <id>scala-tools.org</id>
      <name>Scala-Tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>scala-tools.org</id>
      <name>Scala-Tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
  </pluginRepositories>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.4</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs</groupId>
      <artifactId>specs</artifactId>
      <version>1.2.5</version>
      <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka-0.10_2.11 -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-core -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-core</artifactId>
      <version>1.3.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-clients_2.11 -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-clients_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-scala_2.11 -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-scala_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-scala_2.11 -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-scala_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
  </dependencies>
  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
          <args>
            <arg>-target:jvm-1.5</arg>
          </args>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-eclipse-plugin</artifactId>
        <configuration>
          <downloadSources>true</downloadSources>
          <buildcommands>
            <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
          </buildcommands>
          <additionalProjectnatures>
            <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
          </additionalProjectnatures>
          <classpathContainers>
            <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
            <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
          </classpathContainers>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <reporting>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
        </configuration>
      </plugin>
    </plugins>
  </reporting>
</project>

With the dependencies above added to the project, write the consumer code:

package zetdata

import java.util.Properties

import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010
import org.apache.flink.streaming.util.serialization.SimpleStringSchema
import org.apache.flink.api.scala._

/**
  * Created by ryan on 17-7-14.
  */
object FlinkKafkaCon {

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // Checkpoint every 5 seconds; Kafka offsets are committed on checkpoints.
    env.enableCheckpointing(5000)

    // Kafka / ZooKeeper connection settings (the HDP broker listens on 6667, see the note below).
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "192.168.1.81:6667")
    properties.setProperty("zookeeper.connect", "192.168.1.81:2181")

    // Consume the "test" topic, deserializing every record as a plain string.
    val sss = new SimpleStringSchema()
    val fkc = new FlinkKafkaConsumer010[String]("test", sss, properties)

    // Run the source with parallelism 4 and print each record to stdout.
    val stream = env.addSource(fkc)
    stream.setParallelism(4).print()

    env.execute("KafkaFlinkConsumer....")
  }
}

The Kafka cluster used here is the HDP (Hortonworks) distribution, so the broker listens on port 6667 rather than the default 9092.
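If the same job needs to run against both an HDP broker (port 6667) and a stock Kafka install (port 9092), the connection settings can be taken from program arguments instead of being hard-coded. Below is a minimal sketch using Flink's ParameterTool; the KafkaProps helper name and the argument keys are illustrative and not part of the original code:

package zetdata

import java.util.Properties

import org.apache.flink.api.java.utils.ParameterTool

object KafkaProps {
  // Build the consumer Properties, letting --bootstrap.servers / --zookeeper.connect
  // on the command line override the HDP defaults used in this post.
  def fromArgs(args: Array[String]): Properties = {
    val params = ParameterTool.fromArgs(args)
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", params.get("bootstrap.servers", "192.168.1.81:6667"))
    properties.setProperty("zookeeper.connect", params.get("zookeeper.connect", "192.168.1.81:2181"))
    properties
  }
}

The job could then be started with, for example, --bootstrap.servers somehost:9092 to point at a non-HDP broker.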

The Flink Kafka connector used is the 0.10 version (flink-connector-kafka-0.10_2.11).
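For a quick end-to-end check, the same 0.10 connector also provides FlinkKafkaProducer010, which can feed the test topic that the consumer above reads. A minimal producer sketch, assuming the same HDP broker address as above:

package zetdata

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

object FlinkKafkaProd {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // A few test messages; in a real job this could be any DataStream[String].
    val messages = env.fromElements("hello", "flink", "kafka")

    // Write each element to the "test" topic on the HDP broker (port 6667).
    messages.addSink(new FlinkKafkaProducer010[String](
      "192.168.1.81:6667", "test", new SimpleStringSchema()))

    env.execute("KafkaFlinkProducer")
  }
}

Run this producer once, then start the consumer job; each string should show up on the consumer's standard output.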

