Replies: 1 comment 1 reply
-
There is a bug: when the full snapshot data volume is large, the chunk splitter can produce a null range boundary. That causes a single query to read the entire table, which eventually leads to an out-of-memory error.
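A minimal sketch of the failure mode described above. This is not Flink CDC's actual code; the class and method names (`ChunkQuerySketch`, `buildSplitQuery`) are hypothetical. It only illustrates how a null chunk boundary can silently degrade a bounded split query into an unbounded full-table scan:

```java
// Hypothetical sketch, NOT Flink CDC source: shows how a null chunk
// boundary can turn a bounded snapshot split into a full-table scan.
public class ChunkQuerySketch {

    static String buildSplitQuery(String table, String key, Long low, Long high) {
        if (low == null || high == null) {
            // If the split planner hands back a null bound, the WHERE
            // clause is dropped and one query reads the whole table,
            // buffering far more rows than a single chunk should hold.
            return "SELECT * FROM " + table;
        }
        // Normal case: each chunk reads a bounded key range.
        return "SELECT * FROM " + table
                + " WHERE " + key + " >= " + low + " AND " + key + " < " + high;
    }

    public static void main(String[] args) {
        System.out.println(buildSplitQuery("orders", "id", 1L, 80961L));
        System.out.println(buildSplitQuery("orders", "id", null, null)); // unbounded scan
    }
}
```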
-
Parallelism is set to 1.
scan.incremental.snapshot.chunk.size: 80960
Flink CDC 3.0
It syncs chunk by chunk, and the logs show only a few tens of thousands of rows per fetch, so why does the snapshot phase consume so much memory?
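One thing to check: 80960 is ten times the connector's default chunk size of 8096 rows, and snapshot-phase memory grows with the number of rows buffered per chunk. A hedged starting point (exact option support depends on the connector and version) is to move the chunk size back toward the default and keep the per-query fetch size small:

```
# Default chunk size; 80960 buffers roughly 10x more rows per chunk
scan.incremental.snapshot.chunk.size: 8096
# Rows fetched per database round trip during the snapshot phase
scan.snapshot.fetch.size: 1024
```

With parallelism 1, a single task handles every chunk, so an oversized chunk (or a null-bounded one, as reported above) hits that one task's heap directly.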