1. Download Flink
Download the Flink 1.17.1 binary (Scala 2.12 build):
- https://www.apache.org/dyn/closer.lua/flink/flink-1.17.1/flink-1.17.1-bin-scala_2.12.tgz
2. Unpack the Flink archive
tar -zxvf flink-1.17.1-bin-scala_2.12.tgz -C /module/
3. Configure environment variables
vim /etc/profile.d/flink_env.sh
export HADOOP_HOME=/opt/module/hadoop-3.1.3
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
# Required by Flink
export FLINK_HOME=/module/flink-1.17.1
export HADOOP_CLASSPATH=`hadoop classpath`
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
4. Activate the environment variables
source /etc/profile.d/flink_env.sh
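Note that shell assignments must not have spaces around `=` (a common source of errors in profile scripts). A minimal sanity check, re-using the paths from the steps above (the `HADOOP_CLASSPATH` line is left out here because it requires `hadoop` on the PATH):

```shell
# Re-create the exports from flink_env.sh (paths assumed from the steps above)
export HADOOP_HOME=/opt/module/hadoop-3.1.3
export FLINK_HOME=/module/flink-1.17.1
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

# Verify each variable resolved to a non-empty value
for v in HADOOP_HOME FLINK_HOME HADOOP_CONF_DIR; do
  eval val=\$$v
  if [ -n "$val" ]; then echo "$v=$val"; else echo "$v is unset"; fi
done
```

If any variable prints as unset, re-check the script and re-run `source /etc/profile.d/flink_env.sh`.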
5. Download the Iceberg Flink runtime jar
Download iceberg-flink-runtime-1.17-1.3.0.jar from the Iceberg releases page:
- https://iceberg.apache.org/releases/
6. Deploy the Iceberg Flink runtime jar
cp iceberg-flink-runtime-1.17-1.3.0.jar /module/flink-1.17.1/lib/
7. Modify the Flink configuration
Edit the configuration file conf/flink-conf.yaml:
classloader.check-leaked-classloader: false
taskmanager.numberOfTaskSlots: 4
state.backend: rocksdb
execution.checkpointing.interval: 30000   # checkpoint every 30 s
state.checkpoints.dir: hdfs://hadoop1:8020/ckps
state.backend.incremental: true           # incremental RocksDB checkpoints
Local mode
Edit the workers file; listing localhost three times starts three TaskManagers on this machine (with taskmanager.numberOfTaskSlots: 4, that gives 12 slots in total):
vim /module/flink-1.17.1/conf/workers
localhost
localhost
localhost
8. Start Flink
bin/start-cluster.sh
After startup, `jps` should show a StandaloneSessionClusterEntrypoint and three TaskManagerRunner processes, and the web UI is reachable at http://localhost:8081.
9. Start the Flink SQL client
bin/sql-client.sh embedded
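Once the SQL client is up, the integration can be exercised end-to-end. A minimal sketch using a Hadoop-type Iceberg catalog; the catalog name, database name, and warehouse path (`hdfs://hadoop1:8020/warehouse/iceberg`) are illustrative assumptions, not values from the steps above:

```sql
-- Register an Iceberg catalog backed by HDFS
CREATE CATALOG hadoop_catalog WITH (
  'type' = 'iceberg',
  'catalog-type' = 'hadoop',
  'warehouse' = 'hdfs://hadoop1:8020/warehouse/iceberg'
);

USE CATALOG hadoop_catalog;
CREATE DATABASE IF NOT EXISTS db;

-- Create an Iceberg table, write a row, and read it back
CREATE TABLE db.sample (id BIGINT, data STRING);
INSERT INTO db.sample VALUES (1, 'a');
SELECT * FROM db.sample;
```

If the INSERT job completes and the SELECT returns the row, the runtime jar and Hadoop classpath are wired up correctly.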
At this point Flink is successfully integrated with Iceberg.