velero backup logs or describe is not working though backup create is working fine #8439
Comments
That's an internal network. The Velero CLI from which you execute logs or describe needs access to that cluster network. If the s3Url is not externally accessible, you won't be able to use the Velero CLI directly from outside the cluster. You can try a kubectl-based workaround. Pardon any typos, I'm currently replying on mobile.
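For reference, one possible kubectl-based workaround, offered as a sketch: assuming MinIO is exposed as a Service named minio in the velero namespace and the default BackupStorageLocation uses the velero-plugin-for-aws, you can port-forward MinIO to the CLI host and set the documented publicUrl config key so that the pre-signed URLs used by describe/logs point at an address the CLI can reach:

# forward the in-cluster MinIO endpoint to the machine running the velero CLI (assumed Service name)
kubectl -n velero port-forward svc/minio 9000:9000

# point pre-signed URLs at the forwarded address (BSL name "default" is an assumption)
kubectl -n velero patch backupstoragelocation default --type merge \
  -p '{"spec":{"config":{"publicUrl":"http://localhost:9000"}}}'

With the port-forward running, velero backup describe and velero backup logs should be able to fetch objects via localhost instead of the cluster-internal s3Url.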
Backup create does not rely on the client having s3Url access; logs and describe do. #6167 is planned so that you won't have to use the above kubectl workaround.
As this is expected behavior and a duplicate of #6167, closing. Feel free to keep commenting to troubleshoot.
Got it, it's working now. Further, I installed Velero with the below command in my bare-metal k8s cluster [NOT a cloud K8s]:

velero install

I then created a VolumeSnapshotClass and updated the VolumeSnapshotLocation as below:

Spec:

After this my backup is working, though it was previously PartiallyFailed while backing up the PV and PVC because volume snapshots were not configured. The snapshot now looks OK, but whatever data was written into the PVC filesystem after installing the application is not restored; it comes back at its defaults. I don't see a detailed guide in the Velero docs for a bare-metal setup, and I am unable to ensure there is no loss of data placed in the PVC/PV mounted inside the StatefulSet pod.
I'm assuming you are referring to the missing VolumeSnapshotClass label?

apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshotClass
metadata:
  name: test-snapclass
+ labels:
+   velero.io/csi-volumesnapshot-class: "true"

Per the docs, you need to label the VolumeSnapshotClass for the Velero CSI plugin to pick it up and use it to snapshot volumes.
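If the class already exists, the same label can be added with kubectl instead of editing the manifest (using the test-snapclass name from the snippet above):

kubectl label volumesnapshotclass test-snapclass velero.io/csi-volumesnapshot-class=true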
The VolumeSnapshotLocation (VSL) API is not used for CSI snapshots. There are several types of Velero volume backup methods; please select one and review that doc's "Installing Velero" section and backup examples.

For bare metal you are limited to the first three. There is no need to configure a VSL.
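For a bare-metal cluster without a CSI snapshot path, one of those methods is Velero's file system backup via the node agent. A minimal sketch, assuming Velero 1.10+ and a MinIO-backed object store; the flags are real velero options, but the bucket, plugin version, and URLs are placeholder values:

# install (or re-install) Velero with the node agent enabled and volume snapshots disabled
velero install \
  --provider aws \
  --plugins velero/velero-plugin-for-aws:v1.10.0 \
  --bucket velero \
  --secret-file ./credentials-velero \
  --backup-location-config region=minio,s3ForcePathStyle="true",s3Url=http://minio.velero.svc:9000 \
  --use-node-agent \
  --use-volume-snapshots=false

# back up a namespace, sending all pod volumes through file system backup
velero backup create app-backup --include-namespaces default --default-volumes-to-fs-backup --wait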
What steps did you take and what happened:
Velero gets installed in the velero namespace and I am able to take a backup.
velero backup create pod-ubuntu --include-namespaces default --include-resources pods --wait
Backup request "pod-ubuntu" submitted successfully.
Waiting for backup to complete. You may safely press ctrl-c to stop waiting - your backup will continue in the background.
velero backup describe pod-ubuntu
Backup Volumes:
<error getting backup volume info: Get "http://34.118.238.97:9000/velero/backups/pod-ubuntu/pod-ubuntu-volumeinfo.json.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=minio%2F20241121%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20241121T135556Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=4dc2e66215ba28e60be625bc9fcd7700659259558b72f7bd65187742e888e7f2": context deadline exceeded>
But if I do velero backup get, it works fine.
velero backup get
NAME STATUS ERRORS WARNINGS CREATED EXPIRES STORAGE LOCATION SELECTOR
minio-bkp Completed 0 0 2024-11-21 13:29:58 +0000 UTC 29d default
pod Completed 0 0 2024-11-21 13:35:21 +0000 UTC 29d default
pod-ubuntu Completed 0 0 2024-11-21 13:42:33 +0000 UTC 29d default
What did you expect to happen:
It should be able to describe and restore the backup.
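For completeness, the restore that was expected to work would be created from the same backup (a sketch using the pod-ubuntu backup name from above; restore describe/logs have the same s3Url reachability requirement as backup describe/logs):

velero restore create --from-backup pod-ubuntu
velero restore get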
The following information will help us better understand what's going on:
If you are using velero v1.7.0+:
Please use
velero debug --backup <backupname> --restore <restorename>
to generate the support bundle and attach it to this issue. For more options, please refer to velero debug --help
If you are using earlier versions:
Please provide the output of the following commands (Pasting long output into a GitHub gist or other pastebin is fine.)
kubectl logs deployment/velero -n velero
velero backup describe <backupname>
or kubectl get backup/<backupname> -n velero -o yaml
velero backup logs <backupname>
velero restore describe <restorename>
or kubectl get restore/<restorename> -n velero -o yaml
velero restore logs <restorename>
Anything else you would like to add:
Environment:
Velero version (use velero version):
Velero features (use velero client config get features):
Kubernetes version (use kubectl version):
OS (e.g. from /etc/os-release):
Vote on this issue!
This is an invitation to the Velero community to vote on issues; you can see the project's top-voted issues listed here.
Use the "reaction smiley face" up to the right of this comment to vote.