
Flink Iceberg Connector

Bulut0907 · Published 2022-04-08 17:50:49

Contents
  • 1. Create the table
  • 2. Insert and query data

1. Create the table

In Flink's current catalog (default_catalog by default, of type GenericInMemoryCatalog), create a table that maps to the corresponding Iceberg table:
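Before running the statement below, the Iceberg Flink runtime jar has to be visible to the SQL client, otherwise the iceberg connector cannot be resolved. A minimal sketch, assuming you pass the jar when starting the client (the jar path and version are placeholders for your own environment; copying the jar into Flink's lib/ directory works as well):

$ ./bin/sql-client.sh embedded -j /path/to/iceberg-flink-runtime-<version>.jar shell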

Flink SQL> create temporary table default_catalog.default_database.my_user(
> user_id bigint,
> user_name string,
> birthday date,
> country string
> ) with (
> 'connector'='iceberg',
> 'catalog-type'='hadoop',
> 'catalog-name'='hadoop_catalog',
> 'catalog-database'='iceberg_db',
> 'catalog-table'='my_user',
> 'warehouse'='hdfs://nnha/user/iceberg/warehouse'
> );
[INFO] Execute statement succeed.

Flink SQL>
  • catalog-type: defaults to hive. If the type is hive, the table must also declare the property 'uri'='thrift://hive1:9083' (see the sketch after this list)
  • catalog-database: defaults to the Flink database you are currently in, e.g. default_database inside default_catalog
  • catalog-table: defaults to the table name used in the Flink create table statement, here my_user
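To illustrate the hive case above, a minimal sketch of the same mapping table against a Hive-backed Iceberg catalog, assuming the metastore runs at thrift://hive1:9083 (the catalog name and warehouse path here are illustrative assumptions, not values taken from this cluster):

Flink SQL> create temporary table default_catalog.default_database.my_user_hive(
> user_id bigint,
> user_name string,
> birthday date,
> country string
> ) with (
> 'connector'='iceberg',
> 'catalog-type'='hive',
> 'catalog-name'='hive_catalog',
> 'uri'='thrift://hive1:9083',
> 'catalog-database'='iceberg_db',
> 'catalog-table'='my_user',
> 'warehouse'='hdfs://nnha/user/iceberg/warehouse'
> );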
2. Insert and query data
Flink SQL> set 'sql-client.execution.result-mode' = 'tableau';
[INFO] Session property has been set.

Flink SQL>
Flink SQL> select * from default_catalog.default_database.my_user;
+----+----------------------+--------------------------------+------------+--------------------------------+
| op |              user_id |                      user_name |   birthday |                        country |
+----+----------------------+--------------------------------+------------+--------------------------------+
| +I |                    5 |                       zhao_liu | 2022-02-02 |                          japan |
| +I |                    2 |                      zhang_san | 2022-02-01 |                          china |
| +I |                    1 |                      zhang_san | 2022-02-01 |                          china |
+----+----------------------+--------------------------------+------------+--------------------------------+
Received a total of 3 rows

Flink SQL>
Flink SQL> insert into default_catalog.default_database.my_user(user_id, user_name, birthday, country) values(6, 'zhang_san', date '2022-02-01', 'china');
[INFO] Submitting SQL update statement to the cluster...
[INFO] SQL update statement has been successfully submitted to the cluster:
Job ID: 0b3e692d5100fb075668fc7a32d7f3e4


Flink SQL> select * from my_user;
+----+----------------------+--------------------------------+------------+--------------------------------+
| op |              user_id |                      user_name |   birthday |                        country |
+----+----------------------+--------------------------------+------------+--------------------------------+
| +I |                    6 |                      zhang_san | 2022-02-01 |                          china |
| +I |                    5 |                       zhao_liu | 2022-02-02 |                          japan |
| +I |                    2 |                      zhang_san | 2022-02-01 |                          china |
| +I |                    1 |                      zhang_san | 2022-02-01 |                          china |
+----+----------------------+--------------------------------+------------+--------------------------------+
Received a total of 4 rows

Flink SQL>