1. filebeat collects logs (it supports multiple log input types: log, …)
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
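The JSON above is the tail of the response from querying the Elasticsearch root endpoint (e.g. `curl http://191.168.0.107:9200`, the host used later in this guide). A quick sketch of pulling a field out of that response; the saved response body below is a typical 7.12.0 example, not output captured from this cluster:

```shell
# Illustrative saved response from the Elasticsearch root endpoint (not captured from this cluster).
resp='{"version":{"number":"7.12.0","minimum_wire_compatibility_version":"6.8.0","minimum_index_compatibility_version":"6.0.0-beta1"},"tagline":"You Know, for Search"}'
# Extract the version number with python3 (stdlib only); against a live node you
# would pipe `curl -s http://191.168.0.107:9200` instead of echoing a saved string.
echo "$resp" | python3 -c 'import json,sys; print(json.load(sys.stdin)["version"]["number"])'
# prints: 7.12.0
```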
3.2 Install and start kibana
>3.2.1 Extract kibana
[root@ecs7 efk]# su elasticsearch
[elasticsearch@ecs7 efk]$ tar -zxvf kibana-7.12.0-linux-x86_64.tar.gz
>3.2.2 Configure kibana
[elasticsearch@ecs7 efk]$ cd kibana-7.12.0-linux-x86_64
[elasticsearch@ecs7 kibana-7.12.0-linux-x86_64]$ cd config/
Back up the original configuration file:
[elasticsearch@ecs7 config]$ cp kibana.yml kibana.yml.org
Full kibana.yml:
# Port
server.port: 5601
# Host
server.host: "0.0.0.0"
# Server name
server.name: "master"
# Elasticsearch cluster address
elasticsearch.hosts: ["191.168.0.107:9200"]

filebeat.yml (excerpt):
setup.template.settings:
  #index.codec: best_compression
  #_source.enabled: false
setup.kibana:
# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  # Elasticsearch address
  hosts: ["191.168.0.107:9200"]
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  # Normalize the log timestamp
  - timestamp:
      field: json.@timestamp
      timezone: Asia/Shanghai
      layouts:
        - '2006-01-02T15:04:05+08:00'
        - '2006-01-02T15:04:05.999+08:00'
      test:
        - '2019-06-22T16:33:51+08:00'
        - '2019-11-18T04:59:51.123+08:00'
  # Drop redundant fields
  - drop_fields:
      fields: ["json.@version", "json.level_value", "json.@timestamp"]
  # Rename fields
  - rename:
      fields:
        - from: "json.logName"
          to: "json.appName"
      ignore_missing: false
      fail_on_error: true
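The `drop_fields` and `rename` steps above can be sketched on a sample decoded event. The field names come from this config; the event itself is illustrative, not real filebeat output:

```shell
# Illustrative decoded event as filebeat would see it after JSON parsing (not real output).
event='{"json":{"@timestamp":"2019-06-22T16:33:51+08:00","@version":"1","level_value":20000,"logName":"demo","message":"hello"}}'
# Simulate drop_fields + rename with python3 (stdlib only).
echo "$event" | python3 -c '
import json, sys
e = json.load(sys.stdin)
for f in ("@version", "level_value", "@timestamp"):   # drop_fields
    e["json"].pop(f, None)
e["json"]["appName"] = e["json"].pop("logName")       # rename json.logName -> json.appName
print(json.dumps(e, sort_keys=True))
'
# prints: {"json": {"appName": "demo", "message": "hello"}}
```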
>3.3.3 Start filebeat
Run filebeat.exe from a cmd prompt.
3.4 Spring Boot logback configuration
Add the logstash-logback-encoder dependency to pom.xml; it writes each log record as a single JSON document, so multi-line records (such as stack traces) need no separate handling.

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>5.3</version>
</dependency>
<?xml version="1.0" encoding="UTF-8"?>
<!-- The element structure below is reconstructed around the original values;
     ${logName} is assumed to be defined elsewhere (e.g. via <springProperty>). -->
<configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
            <charset>UTF-8</charset>
        </encoder>
    </appender>
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/${logName}/${logName}.log</file>
        <append>true</append>
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>logs/${logName}/${logName}-%d{yyyy-MM-dd}.log.%i</fileNamePattern>
            <maxFileSize>64MB</maxFileSize>
            <maxHistory>30</maxHistory>
            <totalSizeCap>1GB</totalSizeCap>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>Asia/Shanghai</timeZone>
                </timestamp>
                <pattern>
                    <pattern>{"level": "%level","class": "%logger{40}","message": "%message","stack_trace": "%exception"}</pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="FILE"/>
    </root>
</configuration>
Start the Spring Boot service; its log output is then picked up by filebeat automatically and pushed to Elasticsearch.
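Because the pattern above emits each record as one JSON object per line, filebeat's JSON decoding can parse lines directly. A quick sanity check on a line in that shape (the class and message here are illustrative, not real application output):

```shell
# Illustrative line in the shape produced by the JSON log pattern (not real captured output).
line='{"level": "INFO","class": "c.e.demo.DemoApplication","message": "Started DemoApplication","stack_trace": ""}'
# One line == one JSON object, so a JSON parser handles it without multi-line stitching.
echo "$line" | python3 -c 'import json,sys; d=json.load(sys.stdin); print(d["level"], d["message"])'
# prints: INFO Started DemoApplication
```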