Kafka Eagle: overview, installation, and usage

Kafka Eagle overview
Kafka Eagle is a monitoring and management tool that simplifies Kafka cluster maintenance for developers and operations engineers. It makes it easy to spot topics whose data is distributed unevenly across the cluster, or partitions that are spread unevenly over the brokers. It supports managing multiple clusters, replica election, partition reassignment, and topic creation, and it is also a convenient tool for quickly browsing the state of a cluster.
Environment and installation
1. Environment requirements
A JDK must be installed, and the ZooKeeper and Kafka services must be running.
2. Installation steps
1. Download the release package

Download the latest release package, kafka-eagle-bin-1.3.2.tar.gz.

Code repository:
https://github.com/smartloli/kafka-eagle/releases
2. Unpack
We install Kafka Eagle on the third node. Upload the kafka-eagle package to the /export/softwares directory on node03, then unpack it.
Run the following commands on node03:

cd /export/softwares/
tar -zxf kafka-eagle-bin-1.3.2.tar.gz -C /export/servers/
cd /export/servers/kafka-eagle-bin-1.3.2
tar -zxf kafka-eagle-web-1.3.2-bin.tar.gz

3、准备数据库
kafka-eagle需要使用一个数据库来保存一些元数据信息,我们这里直接使用msyql数据库来保存即可,在node03服务器执行以下命令创建一个mysql数据库即可
进入mysql客户端

mysql -uroot -p
create database eagle;
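Kafka Eagle will connect to this database over JDBC using the credentials configured in the next step. If the MySQL root account is only allowed to connect from localhost, the JDBC connection may be refused; a hedged example of opening access (the password 123456 matches the configuration shown later, and the grant is only needed if root cannot already connect from node03):

```sql
-- Illustrative grant, not part of the original tutorial;
-- adjust user, host, and password to your environment.
GRANT ALL PRIVILEGES ON eagle.* TO 'root'@'%' IDENTIFIED BY '123456';
FLUSH PRIVILEGES;
```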

4. Edit the Kafka Eagle configuration file
Run the following on node03 to edit the configuration file:

cd /export/servers/kafka-eagle-bin-1.3.2/kafka-eagle-web-1.3.2/conf
vim system-config.properties
kafka.eagle.zk.cluster.alias=cluster1
cluster1.zk.list=node01:2181,node02:2181,node03:2181
kafka.eagle.driver=com.mysql.jdbc.Driver
kafka.eagle.url=jdbc:mysql://node03:3306/eagle
kafka.eagle.username=root
kafka.eagle.password=123456

5. Configure environment variables
Kafka Eagle requires the KE_HOME environment variable. Run the following on node03 to configure it:
vim /etc/profile

export KE_HOME=/export/servers/kafka-eagle-bin-1.3.2/kafka-eagle-web-1.3.2
export PATH=$KE_HOME/bin:$PATH

To apply the change immediately, run:

source /etc/profile
6. Start Kafka Eagle
Run the following commands on node03 to start Kafka Eagle:
cd /export/servers/kafka-eagle-bin-1.3.2/kafka-eagle-web-1.3.2/bin
chmod u+x ke.sh
./ke.sh start

7. Main interface
Open Kafka Eagle in a browser:
http://node03:8048/ke/account/signin?/ke/
Default credentials:
Username: admin
Password: 123456

Real-time dashboard case

1. Requirements
From the order message queue, compute the order count and total sales amount for Double 11 (November 11) in near real time.

2. Architecture

Payment system + Kafka + Redis

1. The payment system sends order messages to the Kafka cluster; a consumer program reads the data from Kafka and computes the real-time order count and sales amount.

2. The computed results are stored in Redis.

3. External programs read the results from Redis and display them in real time.
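The "compute" in step 1 boils down to two Redis operations: INCR for counts and INCRBY for amounts. A minimal, stdlib-only sketch of that aggregation logic, using a HashMap as a stand-in for Redis (the class and the sample values are illustrative, not part of the original design):

```java
import java.util.HashMap;
import java.util.Map;

public class CounterSketch {
    // Stand-in for Redis: each key maps to a running counter
    private final Map<String, Long> counters = new HashMap<>();

    // Equivalent of Redis INCRBY: add a delta to the counter under this key
    public long incrBy(String key, long delta) {
        long updated = counters.getOrDefault(key, 0L) + delta;
        counters.put(key, updated);
        return updated;
    }

    // Equivalent of Redis INCR: add one
    public long incr(String key) {
        return incrBy(key, 1L);
    }

    public static void main(String[] args) {
        CounterSketch redis = new CounterSketch();
        // two hypothetical orders paying 300 and 200
        long total = redis.incrBy("itcast:order:total:price:date", 300);
        total = redis.incrBy("itcast:order:total:price:date", 200);
        long orders = redis.incr("itcast:order:total:num:date");
        orders = redis.incr("itcast:order:total:num:date");
        System.out.println(total + " " + orders); // 500 2
    }
}
```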


3. Order data model
Order ID, order time, payment ID, payment time, product ID, shop name, product price, promotion price, payment amount
4. Metric requirements
Platform-level operational metrics
Total platform sales amount
Redis key design: itcast:order:total:price:date
Number of users placing orders on the platform today
Redis key design: itcast:order:total:user:date
Number of items sold on the platform
Redis key design: itcast:order:total:num:date

Product-level sales metrics
Total sales amount per product
Redis key design: itcast:order:productId:price:date
Number of buyers per product
Redis key design: itcast:order:productId:user:date
Number of items sold per product
Redis key design: itcast:order:productId:num:date

Shop-level sales metrics
Total sales amount per shop
Redis key design: itcast:order:shopId:price:date
Number of buyers per shop
Redis key design: itcast:order:shopId:user:date
Number of items sold per shop
Redis key design: itcast:order:shopId:num:date
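In the key designs above, productId, shopId, and date are placeholders that get replaced by concrete values at runtime. A small sketch of how such keys can be assembled (the helper method and its formatting choices are assumptions for illustration, not part of the original design):

```java
import java.time.LocalDate;

public class RedisKeys {
    // Build a key like itcast:order:<scope>:<metric>:<day>,
    // where scope is "total", a productId, or a shopId,
    // and metric is "price", "user", or "num".
    static String key(String scope, String metric, LocalDate day) {
        return String.format("itcast:order:%s:%s:%s", scope, metric, day);
    }

    public static void main(String[] args) {
        LocalDate double11 = LocalDate.of(2015, 11, 11);
        System.out.println(key("total", "price", double11));
        System.out.println(key("p1001", "num", double11));
    }
}
```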

5. Creating the Kafka topic, and the mock message producer
1. Create the topic:

bin/kafka-topics.sh  --create --replication-factor 2 --topic itcast_order --zookeeper node01:2181,node02:2181,node03:2181 --partitions 5

2. Create a Maven project and add the required dependencies:

<dependencies>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.10.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>1.2.41</version>
    </dependency>
    <dependency>
        <groupId>redis.clients</groupId>
        <artifactId>jedis</artifactId>
        <version>2.9.0</version>
    </dependency>

    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
    </dependency>


</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>cn.itcast.storm.kafkaConsumer.KafkTopology</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
         <plugin>
                 <artifactId>maven-assembly-plugin</artifactId>
                 <configuration>
                      <descriptorRefs>
                           <descriptorRef>jar-with-dependencies</descriptorRef>
                      </descriptorRefs>
                      <archive>
                           <manifest>
                                <mainClass>cn.itcast.flumekafka.LoggerPrint</mainClass>
                           </manifest>
                      </archive>
                 </configuration>
                 <executions>
                      <execution>
                           <id>make-assembly</id>
                           <phase>package</phase>
                           <goals>
                                <goal>single</goal>
                           </goals>
                      </execution>
                 </executions>
            </plugin>
    </plugins>
</build>

6. Code implementation
Producer-side code
Step 1: create the order entity class

import com.alibaba.fastjson.JSONObject;

import java.io.Serializable;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Random;
import java.util.UUID;

public class PaymentInfo implements Serializable {

    private static final long serialVersionUID = -7958315778386204397L;
    private String orderId;        // order ID
    private Date createOrderTime;  // order creation time
    private String paymentId;      // payment ID
    private Date paymentTime;      // payment time
    private String productId;      // product ID
    private String productName;    // product name
    private long productPrice;     // product price
    private long promotionPrice;   // promotion price
    private String shopId;         // shop ID
    private String shopName;       // shop name
    private String shopMobile;     // shop phone number
    private long payPrice;         // amount actually paid for the order
    private int num;               // number of items in the order
    /**
     * <Province>19</Province>
     * <City>1657</City>
     * <County>4076</County>
     */
    private String province; // province
    private String city;     // city
    private String county;   // county
    // e.g. 102,144,114
    private String catagorys;
    public String getProvince() {
        return province;
    }
    public void setProvince(String province) {
        this.province = province;
    }

    public String getCity() {
        return city;
    }

    public void setCity(String city) {
        this.city = city;
    }

    public String getCounty() {
        return county;
    }

    public void setCounty(String county) {
        this.county = county;
    }

    public String getCatagorys() {
        return catagorys;
    }

    public void setCatagorys(String catagorys) {
        this.catagorys = catagorys;
    }

    public PaymentInfo() {
    }

    public PaymentInfo(String orderId, Date createOrderTime, String paymentId, Date paymentTime, String productId, String productName, long productPrice, long promotionPrice, String shopId, String shopName, String shopMobile, long payPrice, int num) {
        this.orderId = orderId;
        this.createOrderTime = createOrderTime;
        this.paymentId = paymentId;
        this.paymentTime = paymentTime;
        this.productId = productId;
        this.productName = productName;
        this.productPrice = productPrice;
        this.promotionPrice = promotionPrice;
        this.shopId = shopId;
        this.shopName = shopName;
        this.shopMobile = shopMobile;
        this.payPrice = payPrice;
        this.num = num;
    }

    public String getOrderId() {
        return orderId;
    }

    public void setOrderId(String orderId) {
        this.orderId = orderId;
    }

    public Date getCreateOrderTime() {
        return createOrderTime;
    }

    public void setCreateOrderTime(Date createOrderTime) {
        this.createOrderTime = createOrderTime;
    }

    public String getPaymentId() {
        return paymentId;
    }

    public void setPaymentId(String paymentId) {
        this.paymentId = paymentId;
    }

    public Date getPaymentTime() {
        return paymentTime;
    }

    public void setPaymentTime(Date paymentTime) {
        this.paymentTime = paymentTime;
    }

    public String getProductId() {
        return productId;
    }

    public void setProductId(String productId) {
        this.productId = productId;
    }

    public String getProductName() {
        return productName;
    }

    public void setProductName(String productName) {
        this.productName = productName;
    }

    public long getProductPrice() {
        return productPrice;
    }

    public void setProductPrice(long productPrice) {
        this.productPrice = productPrice;
    }

    public long getPromotionPrice() {
        return promotionPrice;
    }

    public void setPromotionPrice(long promotionPrice) {
        this.promotionPrice = promotionPrice;
    }

    public String getShopId() {
        return shopId;
    }

    public void setShopId(String shopId) {
        this.shopId = shopId;
    }

    public String getShopName() {
        return shopName;
    }

    public void setShopName(String shopName) {
        this.shopName = shopName;
    }

    public String getShopMobile() {
        return shopMobile;
    }

    public void setShopMobile(String shopMobile) {
        this.shopMobile = shopMobile;
    }

    public long getPayPrice() {
        return payPrice;
    }

    public void setPayPrice(long payPrice) {
        this.payPrice = payPrice;
    }

    public int getNum() {
        return num;
    }

    public void setNum(int num) {
        this.num = num;
    }

    @Override
    public String toString() {
        return "PaymentInfo{" +
                "orderId='" + orderId + '\'' +
                ", createOrderTime=" + createOrderTime +
                ", paymentId='" + paymentId + '\'' +
                ", paymentTime=" + paymentTime +
                ", productId='" + productId + '\'' +
                ", productName='" + productName + '\'' +
                ", productPrice=" + productPrice +
                ", promotionPrice=" + promotionPrice +
                ", shopId='" + shopId + '\'' +
                ", shopName='" + shopName + '\'' +
                ", shopMobile='" + shopMobile + '\'' +
                ", payPrice=" + payPrice +
                ", num=" + num +
                '}';
    }

    public String random() {
        this.orderId = UUID.randomUUID().toString().replaceAll("-", "");
        this.paymentId = UUID.randomUUID().toString().replaceAll("-", "");
        Random random = new Random();
        this.productPrice = random.nextInt(1000);
        this.promotionPrice = random.nextInt(500);
        this.payPrice = random.nextInt(480);
        this.shopId = random.nextInt(200000) + "";

        this.catagorys = random.nextInt(10000) + "," + random.nextInt(10000) + "," + random.nextInt(10000);
        this.province = random.nextInt(23) + "";
        this.city = random.nextInt(265) + "";
        this.county = random.nextInt(1489) + "";

        // Pin the order time to Double 11 so every mock order lands on that date
        String date = "2015-11-11 12:22:12";
        SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        try {
            this.createOrderTime = simpleDateFormat.parse(date);
        } catch (ParseException e) {
            e.printStackTrace();
        }
        // Serialize this object to a JSON string with fastjson
        return JSONObject.toJSONString(this);
    }


}
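Note that random() pins createOrderTime to a fixed Double 11 timestamp, so all mock orders fall on the target date. The parse/format round trip that produces that timestamp can be checked in isolation (this demo class is illustrative, not part of the project):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DatePinDemo {
    public static void main(String[] args) throws ParseException {
        // Same pattern and fixed timestamp as PaymentInfo.random()
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        Date pinned = fmt.parse("2015-11-11 12:22:12");
        // Formatting the parsed Date reproduces the original string
        System.out.println(fmt.format(pinned));
    }
}
```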

Step 2: define the log4j.properties configuration file
Create log4j.properties under the project's src/main/resources directory with the following contents:

### Root logger ###
log4j.rootLogger = debug,stdout,D,E

### Console output ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n

### Write DEBUG-and-above messages to /export/servers/orderLogs/orderinfo.log ###
log4j.appender.D = org.apache.log4j.DailyRollingFileAppender
log4j.appender.D.File = /export/servers/orderLogs/orderinfo.log
log4j.appender.D.Append = true
log4j.appender.D.Threshold = DEBUG
log4j.appender.D.layout = org.apache.log4j.PatternLayout
log4j.appender.D.layout.ConversionPattern = %m%n

### Write ERROR-and-above messages to /export/servers/orderLogs/ordererror.log ###
log4j.appender.E = org.apache.log4j.DailyRollingFileAppender
log4j.appender.E.File = /export/servers/orderLogs/ordererror.log
log4j.appender.E.Append = true
log4j.appender.E.Threshold = ERROR
log4j.appender.E.layout = org.apache.log4j.PatternLayout
log4j.appender.E.layout.ConversionPattern = %m%n

Step 3: write the log-producing code

import org.apache.log4j.Logger;

import java.text.ParseException;

public class LogOperate {

    private static Logger printLogger = Logger.getLogger("printLogger");

    public static void main(String[] args) throws ParseException, InterruptedException {
        PaymentInfo paymentInfo = new PaymentInfo();

        while (true) {
            // generate one mock order per second and write it to the log
            String random = paymentInfo.random();
            System.out.println(random);
            printLogger.info(random);
            Thread.sleep(1000);
        }
    }
}

Step 4: package the program and run it on the server

Package the program, upload it to node03, and run it there to start producing order logs.



Step 5: write the Flume configuration file to collect the data into Kafka

Run the following on node03 to create the Flume configuration file:

#Name the source, channel, and sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1
#Bind the source to the channel its events are sent to
a1.sources.r1.channels = c1
#Source collection strategy: tail the order log file
a1.sources.r1.type = TAILDIR
a1.sources.r1.positionFile = /var/log/flume/taildir_position.json
a1.sources.r1.filegroups = f1
a1.sources.r1.filegroups.f1 = /export/servers/orderLogs/orderinfo.log

#Use a memory channel: all buffered events are kept in memory
a1.channels.c1.type = memory
#Kafka sink: read events from channel c1 and write them to the itcast_order topic
a1.sinks.k1.channel = c1
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.topic = itcast_order
a1.sinks.k1.kafka.bootstrap.servers = node01:9092,node02:9092,node03:9092
a1.sinks.k1.kafka.flumeBatchSize = 20
a1.sinks.k1.kafka.producer.acks = 1
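The memory channel above runs with Flume's default sizing; under sustained load you may want to size it explicitly. An illustrative fragment (the values are assumptions, not from the original tutorial):

```properties
# Optional memory-channel sizing (illustrative values):
# total number of events the channel can buffer
a1.channels.c1.capacity = 10000
# events moved per transaction between source/sink and the channel
a1.channels.c1.transactionCapacity = 100
```

The agent can then be started with something like `bin/flume-ng agent -n a1 -c conf -f <your config file>`, where the config file path is wherever you saved the definition above.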

Step 6: start a Kafka console consumer to verify that data reaches Kafka
Run the following on node01 to consume the topic and confirm that data is flowing into Kafka:

cd /export/servers/kafka_2.11-1.0.0
bin/kafka-console-consumer.sh --bootstrap-server node01:9092,node02:9092,node03:9092 --topic itcast_order --from-beginning

Consumer-side code
Define a Redis utility class:

import redis.clients.jedis.Jedis;
import redis.clients.jedis.JedisPool;
import redis.clients.jedis.JedisPoolConfig;

/**
 * Utility class for obtaining Jedis client connections
 */
public class JedisUtils {

    private static JedisPool jedisPool;

    public static synchronized JedisPool getJedisPool() {
        if (null == jedisPool) {
            JedisPoolConfig jedisPoolConfig = new JedisPoolConfig(); // connection pool configuration
            jedisPoolConfig.setMaxTotal(20);       // maximum number of connections
            jedisPoolConfig.setMaxIdle(10);        // maximum idle connections
            jedisPoolConfig.setMinIdle(5);         // minimum idle connections
            jedisPoolConfig.setMaxWaitMillis(3000);

            jedisPool = new JedisPool(jedisPoolConfig, "node01", 6379);
        }
        return jedisPool;
    }

    public static void main(String[] args) {
        JedisPool jedisPool = getJedisPool();
        Jedis resource = jedisPool.getResource();
        resource.set("setkey", "setvalue");
        resource.close();
    }
}

Define the consumer class:

import com.alibaba.fastjson.JSONObject;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import redis.clients.jedis.Jedis;
import redis.clients.jedis.JedisPool;

import java.util.*;

public class MyKafkaConsumer {

    /**
     * Consume the data in the itcast_order topic and update the Redis counters
     */
    public static void main(String[] args) {

        Properties props = new Properties();
        // Kafka broker addresses
        props.put("bootstrap.servers", "node01:9092,node02:9092,node03:9092");
        // consumer group name
        props.put("group.id", "testGroup");
        // disable auto-commit: offsets are committed manually after each partition batch
        props.put("enable.auto.commit", "false");
        // key and value deserializers
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<String, String>(props);
        kafkaConsumer.subscribe(Arrays.asList("itcast_order"));

        JedisPool jedisPool = JedisUtils.getJedisPool();

        while (true) {
            // fetch the next batch of records from the topic
            ConsumerRecords<String, String> consumerRecords = kafkaConsumer.poll(3000);

            Set<TopicPartition> partitions = consumerRecords.partitions(); // partitions present in this batch
            for (TopicPartition topicPartition : partitions) {

                List<ConsumerRecord<String, String>> records = consumerRecords.records(topicPartition);
                if (records.isEmpty()) {
                    continue;
                }

                Jedis jedis = jedisPool.getResource(); // get a Jedis client from the pool
                for (ConsumerRecord<String, String> record : records) {

                    String value = record.value(); // JSON string produced by PaymentInfo.random()

                    // deserialize the JSON string back into a PaymentInfo object
                    PaymentInfo paymentInfo = JSONObject.parseObject(value, PaymentInfo.class);
                    long payPrice = paymentInfo.getPayPrice();

                    // Platform-level metrics; keys follow the itcast:order:... convention from section 4
                    // total platform sales amount
                    jedis.incrBy("itcast:order:total:price:date", payPrice);
                    // number of users placing orders today
                    jedis.incr("itcast:order:total:user:date");
                    // number of items sold (assume one item per order for simplicity)
                    jedis.incr("itcast:order:total:num:date");

                    // Product-level metrics: sales amount, buyers, items sold per product
                    jedis.incrBy("itcast:order:" + paymentInfo.getProductId() + ":price:date", payPrice);
                    jedis.incr("itcast:order:" + paymentInfo.getProductId() + ":user:date");
                    jedis.incr("itcast:order:" + paymentInfo.getProductId() + ":num:date");

                    // Shop-level metrics: sales amount, buyers, items sold per shop
                    jedis.incrBy("itcast:order:" + paymentInfo.getShopId() + ":price:date", payPrice);
                    jedis.incr("itcast:order:" + paymentInfo.getShopId() + ":user:date");
                    jedis.incr("itcast:order:" + paymentInfo.getShopId() + ":num:date");
                }
                jedis.close();

                // After each partition batch is processed, commit its offset.
                // Kafka expects the committed offset to be the next offset to read,
                // hence the +1.
                long offset = records.get(records.size() - 1).offset();
                kafkaConsumer.commitSync(
                        Collections.singletonMap(topicPartition, new OffsetAndMetadata(offset + 1)));
            }
        }
    }
}
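When committing offsets manually per partition, Kafka interprets the committed offset as the next offset to read, so the value to commit is the last processed offset plus one; committing the last processed offset itself re-delivers one record per partition after a restart. A stdlib-only sketch of that bookkeeping, with a plain Map standing in for the commit API (class and values are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class OffsetCommitSketch {
    public static void main(String[] args) {
        // committed offset per partition (stand-in for Kafka's __consumer_offsets)
        Map<Integer, Long> committed = new HashMap<>();

        long[] batch = {40L, 41L, 42L}; // offsets of records just processed on partition 0
        long lastProcessed = batch[batch.length - 1];

        // commit the "next offset to read", i.e. lastProcessed + 1
        committed.put(0, lastProcessed + 1);

        // after a restart, consumption would resume here
        System.out.println(committed.get(0)); // 43
    }
}
```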