Commit 30b4cc2 — md文档更新 (Markdown documentation update)

zhuhp committed Jan 16, 2021 · 1 parent 87d9e6a
Showing 1 changed file (README.md) with 9 additions and 73 deletions.
@@ -5,16 +5,17 @@ https://xie.infoq.cn/article/1af0cb75be056fea788e6c86b


## 1. Introduction
- flink-streaming-platform-web is a visual web system built on Flink: users only need to configure SQL in the web UI to run a stream-computing task.
+ flink-streaming-platform-web is a lightweight, visual web system built on Flink: users only need to configure SQL in the web UI to run a stream-computing task.
Its main features include task configuration, task start/stop, alerting, and logging. The goal is to minimize development work and implement flink-sql stream-computing tasks entirely through SQL.

- Flink tasks support single-stream, dual-stream, and single-stream-with-dimension-table jobs, etc.
+ **Flink tasks support single-stream, dual-stream, and single-stream-with-dimension-table jobs, etc.**

- <font color=red size=5>Supports local mode, yarn-per mode, and STANDALONE mode</font>
+ **Supports local mode, yarn-per mode, and STANDALONE mode**


**Supports UDFs, custom connectors, etc., and is fully compatible with the official connectors**


**The Flink version has now been upgraded to 1.12**


@@ -163,6 +164,8 @@ cd /XXXX/flink-streaming-platform-web/bin
Log directory: /XXXX/flink-streaming-platform-web/logs/
~~~~


**Make absolutely sure to cd into the bin directory before running deploy.sh, otherwise it will not start**
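The warning above implies that deploy.sh resolves paths relative to the current working directory. A minimal sketch of a guard in that spirit — `check_cwd` is a hypothetical helper for illustration, not part of the project:

```shell
# Hypothetical guard (assumption: deploy.sh uses paths relative to the
# current working directory, which is why it must be run from bin/).
check_cwd() {
  case "$(pwd)" in
    # Working directory ends in /bin: safe to launch.
    */bin) echo "ok: running from bin" ;;
    # Anywhere else: refuse, matching the README's warning.
    *)     echo "error: cd into flink-streaming-platform-web/bin first" >&2
           return 1 ;;
  esac
}

# Usage sketch: cd /XXXX/flink-streaming-platform-web/bin && check_cwd && sh deploy.sh
```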


@@ -403,7 +406,7 @@ For a udf development demo, see [https://github.com/zhp8341/flink-streaming-udf](https://
```


## The syntax below was written for flink 1.10 and will be rewritten when time permits


[demo1 single-stream kafka written to mysql (reference)](https://github.com/zhp8341/flink-streaming-platform-web/tree/master/docs/sql_demo/demo_1.md)

@@ -415,80 +418,13 @@ For a udf development demo, see [https://github.com/zhp8341/flink-streaming-udf](https://

[demo5 sliding window](https://github.com/zhp8341/flink-streaming-platform-web/tree/master/docs/sql_demo/demo_5.md)

[demo6 JDBC CDC usage example](https://github.com/zhp8341/flink-streaming-platform-web/tree/master/docs/sql_demo/demo_6.md)

[demo7 datagen introduction](https://github.com/zhp8341/flink-streaming-platform-web/blob/master/docs/sql_demo/demo_datagen.md)

```sql
CREATE FUNCTION jsonHasKey AS 'com.xx.udf.JsonHasKeyUDF';

-- If a UDF is used, the UDF jar address must be configured first.

-- Kafka source (Flink 1.10 'connector.*' option style).
CREATE TABLE flink_test_6 (
  id BIGINT,
  day_time VARCHAR,
  amnount BIGINT,
  proctime AS PROCTIME()
)
WITH (
  'connector.type' = 'kafka',
  'connector.version' = 'universal',
  'connector.topic' = 'flink_test_6',
  'connector.startup-mode' = 'earliest-offset',
  'connector.properties.zookeeper.connect' = 'hadoop001:2181',
  'connector.properties.bootstrap.servers' = 'hadoop003:9092',
  'connector.properties.group.id' = 'flink_gp_test1',
  'update-mode' = 'append',
  'format.type' = 'json',
  'format.derive-schema' = 'true'
);

-- JDBC dimension (lookup) table.
CREATE TABLE flink_test_6_dim (
  id BIGINT,
  coupon_amnount BIGINT
)
WITH (
  'connector.type' = 'jdbc',
  'connector.url' = 'jdbc:mysql://127.0.0.1:3306/flink_web?characterEncoding=UTF-8',
  'connector.table' = 'test_dim',
  'connector.username' = 'flink_web_test',
  'connector.password' = 'flink_web_test_123',
  'connector.lookup.max-retries' = '3'
);

-- JDBC sink table.
CREATE TABLE sync_test_3 (
  day_time STRING,
  total_gmv BIGINT
)
WITH (
  'connector.type' = 'jdbc',
  'connector.url' = 'jdbc:mysql://127.0.0.1:3306/flink_web?characterEncoding=UTF-8',
  'connector.table' = 'sync_test_3',
  'connector.username' = 'flink_web_test',
  'connector.password' = 'flink_web_test_123'
);

-- Temporal (processing-time) join against the dimension table,
-- then aggregate per day into the sink.
INSERT INTO sync_test_3
SELECT
  day_time,
  SUM(amnount - coupon_amnount) AS total_gmv
FROM (
  SELECT
    a.day_time AS day_time,
    a.amnount AS amnount,
    b.coupon_amnount AS coupon_amnount
  FROM flink_test_6 AS a
  LEFT JOIN flink_test_6_dim FOR SYSTEM_TIME AS OF a.proctime AS b
    ON b.id = a.id
)
GROUP BY day_time;
```
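The README notes the platform has moved to Flink 1.12, while the demo above uses the 1.10-era `connector.*` option style. A hedged sketch of what the same Kafka source could look like in the newer flat-option style introduced in Flink 1.11+ (option keys per the Flink Kafka connector docs; topic, servers, and group id carried over from the demo above):

```sql
-- Sketch only: the Kafka source above rewritten with the Flink 1.11+/1.12
-- option style ('connector' = 'kafka' plus flat option keys).
CREATE TABLE flink_test_6 (
  id BIGINT,
  day_time VARCHAR,
  amnount BIGINT,
  proctime AS PROCTIME()
)
WITH (
  'connector' = 'kafka',
  'topic' = 'flink_test_6',
  'properties.bootstrap.servers' = 'hadoop003:9092',
  'properties.group.id' = 'flink_gp_test1',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json'
);
```

Note that in this style the ZooKeeper address, `update-mode`, and `format.derive-schema` options are no longer needed; the schema is derived from the column list.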



**Official releases and download links**
