Summary
Jobs
Get version matrix based on changes in PR/commit
Run core tests (spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run Clickhouse tests (server=24.6.3.70-alpine, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run Greenplum tests (server=7.0.0, spark=3.2.4, pydantic=2, java=11, python=3.10, os=ubuntu-latest)
Run Hive tests (spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run Kafka tests (server=3.7.1, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run LocalFS tests (spark=3.5.0, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run MongoDB tests (server=7.0.12, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run MSSQL tests (server=2022-CU14-ubuntu-22.04, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run MySQL tests (server=9.0.1, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run Oracle tests (server=23.4-slim-faststart, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run Postgres tests (server=16.3-alpine, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run Teradata tests (spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run FTP tests (server=latest, pydantic=2, python=3.12, os=ubuntu-latest)
Run FTPS tests (server=latest, pydantic=2, python=3.12, os=ubuntu-latest)
Run HDFS tests (server=hadoop3-hdfs, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run S3 tests (server=2024.7.26, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest)
Run SFTP tests (server=9.6_p1-r0-ls154, pydantic=2, python=3.12, os=ubuntu-latest)
Run Samba tests (server=latest, pydantic=2, python=3.12, os=ubuntu-latest)
Run WebDAV tests (server=latest, pydantic=2, python=3.12, os=ubuntu-latest)
Tests done
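The first job above ("Get version matrix based on changes in PR/commit") implies that the matrices shown for each connector are computed dynamically from which files a PR touches. A minimal sketch of that selection logic, assuming a path-based mapping; the directory layout, connector keys, and the `select_matrices` helper are hypothetical, with version pins echoing the job list above:

```python
# Map each connector to the version matrix it is tested against.
# Entries are illustrative, copied from the job list above; the real
# workflow may store these elsewhere (e.g. in per-connector files).
MATRICES = {
    "clickhouse": {"server": "24.6.3.70-alpine", "spark": "3.5.1", "java": "20", "python": "3.12"},
    "greenplum": {"server": "7.0.0", "spark": "3.2.4", "java": "11", "python": "3.10"},
}

def select_matrices(changed_files: list[str]) -> dict[str, dict]:
    """Return only the matrices whose connector code was touched.

    The ``connection/<name>`` path convention here is an assumption,
    not the project's actual layout.
    """
    selected = {}
    for name, matrix in MATRICES.items():
        if any(f"connection/{name}" in path for path in changed_files):
            selected[name] = matrix
    return selected
```

A PR touching only `README.md` would then produce an empty matrix and skip all connector test jobs, while one touching ClickHouse code would run only the ClickHouse matrix.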
Job | Run time
(job name not captured) | 49s
(job name not captured) | 1m 33s
(job name not captured) | 1m 14s
Run Clickhouse tests (server=24.6.3.70-alpine, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 1m 20s
(job name not captured) | 1m 3s
Run Greenplum tests (server=7.0.0, spark=3.2.4, pydantic=2, java=11, python=3.10, os=ubuntu-latest) | 2m 0s
Run Kafka tests (server=3.7.1, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 3m 47s
(job name not captured) | 2m 25s
Run HDFS tests (server=hadoop3-hdfs, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 4m 55s
(job name not captured) | 2m 11s
Run MongoDB tests (server=7.0.12, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 1m 29s
Run MSSQL tests (server=2022-CU14-ubuntu-22.04, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 2m 34s
Run MySQL tests (server=9.0.1, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 1m 44s
Run Oracle tests (server=23.4-slim-faststart, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 2m 27s
Run Postgres tests (server=16.3-alpine, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 2m 51s
Run S3 tests (server=2024.7.26, spark=3.5.1, pydantic=2, java=20, python=3.12, os=ubuntu-latest) | 3m 9s
(job name not captured) | 1m 16s
(job name not captured) | 1m 10s
(job name not captured) | 1m 0s
(job name not captured) | 3m 33s
(job name not captured) | 21s
(job name not captured) | 42m 51s
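The run times above use GitHub's compact duration format ("49s", "1m 33s"). A small helper one might use to convert them to seconds for totaling or comparison; `parse_duration` is a hypothetical name, not part of any library:

```python
import re

def parse_duration(text: str) -> int:
    """Convert a GitHub-style duration like '1m 33s' or '49s' to seconds."""
    match = re.fullmatch(r"(?:(\d+)h\s*)?(?:(\d+)m\s*)?(?:(\d+)s)?", text.strip())
    if not match or not any(match.groups()):
        raise ValueError(f"unrecognized duration: {text!r}")
    hours, minutes, seconds = (int(g) if g else 0 for g in match.groups())
    return hours * 3600 + minutes * 60 + seconds

# Sample values from the table above.
durations = ["49s", "1m 33s", "2m 0s"]
total = sum(parse_duration(d) for d in durations)  # 49 + 93 + 120 = 262
```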