sed: couldn't open temporary file /opt/flink/conf/sedfZedyP: Read-only file system
sed: couldn't open temporary file /opt/flink/conf/sedlTBBWK: Read-only file system
/docker-entrypoint.sh: line 73: /opt/flink/conf/flink-conf.yaml: Read-only file system
/docker-entrypoint.sh: line 89: /opt/flink/conf/flink-conf.yaml.tmp: Read-only file system
Starting kubernetes-application as a console application on host test-iceberg-sql-55494cd56-cb6x5.
ERROR StatusLogger Reconfiguration failed: No configuration found for '63947c6b' at 'null' in 'null'
ERROR StatusLogger Reconfiguration failed: No configuration found for '4e07b95f' at 'null' in 'null'
ERROR StatusLogger Reconfiguration failed: No configuration found for '4f7c0be3' at 'null' in 'null'
ERROR StatusLogger Reconfiguration failed: No configuration found for '2e647e59' at 'null' in 'null'
18:13:00.322 [flink-akka.actor.default-dispatcher-14] ERROR org.dinky.app.flinksql.Submitter - jobClient is empty, can not monitor job
18:13:00.329 [flink-akka.actor.default-dispatcher-14] ERROR org.apache.flink.runtime.entrypoint.ClusterEntrypoint - Fatal error occurred in the cluster entrypoint.
java.util.concurrent.CompletionException: org.apache.flink.client.deployment.application.ApplicationExecutionException: Could not execute application.
at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:292) ~[?:1.8.0_382]
at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:308) ~[?:1.8.0_382]
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:957) ~[?:1.8.0_382]
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940) ~[?:1.8.0_382]
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488) ~[?:1.8.0_382]
at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990) ~[?:1.8.0_382]
at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.runApplicationEntryPoint(ApplicationDispatcherBootstrap.java:337) ~[flink-dist-1.17.0.jar:1.17.0]
at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.lambda$runApplicationAsync$2(ApplicationDispatcherBootstrap.java:254) ~[flink-dist-1.17.0.jar:1.17.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_382]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_382]
at org.apache.flink.runtime.concurrent.akka.ActorSystemScheduledExecutorAdapter$ScheduledFutureTask.run(ActorSystemScheduledExecutorAdapter.java:171) ~[flink-rpc-akka_268bd73e-f252-4a85-9a09-d3bdeedddd6d.jar:1.17.0]
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68) ~[flink-rpc-akka_268bd73e-f252-4a85-9a09-d3bdeedddd6d.jar:1.17.0]
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$withContextClassLoader$0(ClassLoadingUtils.java:41) ~[flink-rpc-akka_268bd73e-f252-4a85-9a09-d3bdeedddd6d.jar:1.17.0]
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49) [flink-rpc-akka_268bd73e-f252-4a85-9a09-d3bdeedddd6d.jar:1.17.0]
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48) [flink-rpc-akka_268bd73e-f252-4a85-9a09-d3bdeedddd6d.jar:1.17.0]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) [?:1.8.0_382]
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) [?:1.8.0_382]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692) [?:1.8.0_382]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175) [?:1.8.0_382]
Caused by: org.apache.flink.client.deployment.application.ApplicationExecutionException: Could not execute application.
... 13 more
Caused by: org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Cannot initialize Catalog implementation org.apache.iceberg.hive.HiveCatalog: Cannot find constructor for interface org.apache.iceberg.catalog.Catalog
Missing org.apache.iceberg.hive.HiveCatalog [java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/NoSuchObjectException]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372) ~[flink-dist-1.17.0.jar:1.17.0]
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist-1.17.0.jar:1.17.0]
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:105) ~[flink-dist-1.17.0.jar:1.17.0]
at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.runApplicationEntryPoint(ApplicationDispatcherBootstrap.java:301) ~[flink-dist-1.17.0.jar:1.17.0]
... 12 more
Caused by: java.lang.IllegalArgumentException: Cannot initialize Catalog implementation org.apache.iceberg.hive.HiveCatalog: Cannot find constructor for interface org.apache.iceberg.catalog.Catalog
Missing org.apache.iceberg.hive.HiveCatalog [java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/NoSuchObjectException]
at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:224) ~[iceberg-flink-runtime-1.17-1.3.0.jar:?]
at org.apache.iceberg.flink.CatalogLoader$HiveCatalogLoader.loadCatalog(CatalogLoader.java:128) ~[iceberg-flink-runtime-1.17-1.3.0.jar:?]
at org.apache.iceberg.flink.FlinkCatalog.<init>(FlinkCatalog.java:114) ~[iceberg-flink-runtime-1.17-1.3.0.jar:?]
at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:166) ~[iceberg-flink-runtime-1.17-1.3.0.jar:?]
at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:139) ~[iceberg-flink-runtime-1.17-1.3.0.jar:?]
at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:414) ~[flink-table-api-java-uber-1.17.0.jar:1.17.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1466) ~[flink-table-api-java-uber-1.17.0.jar:1.17.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1212) ~[flink-table-api-java-uber-1.17.0.jar:1.17.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:765) ~[flink-table-api-java-uber-1.17.0.jar:1.17.0]
at org.dinky.executor.DefaultTableEnvironment.executeSql(DefaultTableEnvironment.java:300) ~[?:?]
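For context: the root cause above is reached when the submitted Flink SQL defines an Iceberg catalog backed by the Hive metastore. Below is a minimal sketch of such a statement; the catalog name, metastore URI, and warehouse path are hypothetical placeholders, not values taken from this issue.

```sql
-- Hypothetical Hive-backed Iceberg catalog definition (placeholders only).
-- Executing a statement like this goes through FlinkCatalogFactory.createCatalog
-- -> HiveCatalog, which reflectively constructs org.apache.iceberg.hive.HiveCatalog
-- and therefore needs the Hive metastore client classes on the classpath.
CREATE CATALOG iceberg_hive WITH (
  'type' = 'iceberg',
  'catalog-type' = 'hive',
  'uri' = 'thrift://hive-metastore:9083',
  'warehouse' = 'hdfs:///warehouse/iceberg'
);
```

The NoClassDefFoundError for org.apache.hadoop.hive.metastore.api.NoSuchObjectException points to the Hive metastore client classes, which are typically provided by a Hive connector or metastore jar rather than by iceberg-flink-runtime itself, so their absence from the job's classpath would produce exactly this failure.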
Hello @372242283, thank you for your feedback, but this issue will not be fixed. You can search for the relevant keywords in the issue list to find related discussions. Such issues are marked `Wont Fix`.
Search before asking
What happened
Version: 1.0.1
Description: When Dinky starts, the Iceberg core dependency iceberg-flink-runtime-1.17-1.3.0.jar is present in the ${DINKY_HOME}/extends/ directory.
The same Iceberg core dependency, iceberg-flink-runtime-1.17-1.3.0.jar, also exists in the ${FLINK_HOME}/lib/ directory of the image.
After submitting to Kubernetes, the logs report that Iceberg-related classes cannot be found.
What you expected to happen
Since the dependency already exists in the ${FLINK_HOME}/lib/ directory, I expect the job to run normally. One question: could it be that Dinky already loaded the Iceberg dependencies onto its classpath at startup, so the classes can no longer be loaded when the job is submitted to Kubernetes?
How to reproduce
Place iceberg-flink-runtime-1.17-1.3.0.jar in both ${DINKY_HOME}/extends/ and the image's ${FLINK_HOME}/lib/ directory, then submit a Flink SQL job from Dinky to Kubernetes; the job fails with the Iceberg class-loading error shown above.
Anything else
No response
Version
1.0.0
Are you willing to submit PR?
Code of Conduct