Great question!
Yes, Spring Cloud Task has a feature that supports the execution of remote batch partitions. More can be read about it here.
There is a sample application in Spring Cloud Task that you can use to test; it can be found here.
There is an open issue in Spring Cloud Deployer reporting that when pods are launched asynchronously, some of them fail to launch because they are given the same name. There is a PR for it, but it did not make it in before the code freeze. So for now, partitions can only be launched synchronously on Kubernetes.
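The remote-partitioning setup described above can be sketched on the manager side with Spring Cloud Task's `DeployerPartitionHandler`, loosely following the partitioned-batch-job sample. The class name, bean names, step names, Docker image URI, and worker count below are illustrative assumptions, not values from this thread:

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.partition.PartitionHandler;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.deployer.spi.task.TaskLauncher;
import org.springframework.cloud.task.batch.partition.DeployerPartitionHandler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;

// Hedged sketch of the manager-side configuration for remote partitioning.
// Each partition is launched as its own task, i.e. one pod per partition
// on Kubernetes. Image URI, names, and worker count are assumptions.
@Configuration
public class ManagerConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    // Launches the worker image once per partition via the task launcher.
    @Bean
    public PartitionHandler partitionHandler(TaskLauncher taskLauncher,
                                             JobExplorer jobExplorer,
                                             ResourceLoader resourceLoader) {
        // The same application image serves as the worker for every partition.
        Resource workerResource =
            resourceLoader.getResource("docker://myregistry/partitioned-batch-job:latest");

        DeployerPartitionHandler handler =
            new DeployerPartitionHandler(taskLauncher, jobExplorer, workerResource, "workerStep");
        handler.setMaxWorkers(4);                        // assumption: at most 4 worker pods
        handler.setApplicationName("partitionedJobTask"); // assumption: task name
        return handler;
    }

    // Splits the work into gridSize execution contexts; each one becomes a worker pod.
    @Bean
    public Partitioner partitioner() {
        return gridSize -> {
            Map<String, ExecutionContext> partitions = new HashMap<>(gridSize);
            for (int i = 0; i < gridSize; i++) {
                ExecutionContext context = new ExecutionContext();
                context.putString("partitionKey", "partition" + i);
                partitions.put("partition" + i, context);
            }
            return partitions;
        };
    }

    @Bean
    public Step managerStep(PartitionHandler partitionHandler) {
        return stepBuilderFactory.get("managerStep")
            .partitioner("workerStep", partitioner())
            .partitionHandler(partitionHandler)
            .build();
    }

    @Bean
    public Job partitionedJob(Step managerStep) {
        return jobBuilderFactory.get("partitionedJob")
            .start(managerStep)
            .build();
    }
}
```

The worker side would run the same image with a profile or property that activates Spring Cloud Task's step-execution handling for `workerStep`; see the linked sample for the complete pair of configurations.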
Hi,
We use SCDF to deploy Spring Batch apps to our internal Kubernetes environment, creating a task for each app. Normally a batch app runs on a single pod, but we would like to run batch apps on multiple pods.
I read the blog post below; it suggests that running a batch app as a partitioned job makes sense for multi-pod job execution.
https://spring.io/blog/2021/01/27/spring-batch-on-kubernetes-efficient-batch-processing-at-scale/
Is there any reference documentation or an architectural solution for this requirement?
How should we deploy a partitioned job to Kubernetes using Spring Cloud Data Flow?
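For the deployment side of the question, registering and launching the partitioned app through the SCDF shell might look like the following sketch. The app name, image URI, and deployer property values are illustrative assumptions:

```shell
# Register the batch application image as a task app (image URI is an assumption).
app register --name partitioned-batch-job --type task \
  --uri docker://myregistry/partitioned-batch-job:latest

# Create a task definition from the registered app.
task create partitioned-job --definition "partitioned-batch-job"

# Launch it; deployer.* properties tune the Kubernetes pods SCDF creates
# (the memory limit shown is an example value).
task launch partitioned-job \
  --properties "deployer.partitioned-batch-job.kubernetes.limits.memory=1024Mi"
```

The manager pod launched this way then starts one additional pod per partition through the task launcher, as described in the answer above.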