
operator/config/samples/karmada.yaml may be not valid #4128

Closed
chaosi-zju opened this issue Oct 13, 2023 · 7 comments · Fixed by #4315
Comments

@chaosi-zju
Member

chaosi-zju commented Oct 13, 2023

What happened:

When installing Karmada via the operator, Karmada provides a sample karmada.yaml.

We do the following:

cd path/to/karmada

helm install karmada-operator -n karmada-system  --create-namespace --dependency-update ./charts/karmada-operator --debug

kubectl apply -f https://raw.githubusercontent.com/karmada-io/karmada/release-1.7/operator/config/crds/operator.karmada.io_karmadas.yaml

kubectl apply -f operator/config/samples/karmada.yaml

The last command reports that karmada.yaml is invalid:

(screenshot: error output from kubectl apply)

I want to know whether others think this is a problem, or whether these configurations have some special purpose.

What you expected to happen:

Installation succeeds.

How to reproduce it (as minimally and precisely as possible):

see above
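
A server-side dry run might reproduce the same rejection without changing anything in the cluster (a sketch, assuming the operator CRD from the steps above is already installed):

kubectl apply --dry-run=server -f operator/config/samples/karmada.yaml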

Anything else we need to know?:

Environment:

  • Karmada version: latest
  • kubectl-karmada or karmadactl version (the result of kubectl-karmada version or karmadactl version): latest
  • Others:
@chaosi-zju chaosi-zju added the kind/bug Categorizes issue or PR as related to a bug. label Oct 13, 2023
@calvin0327

/assign

@zhzhuang-zju
Contributor

cc @calvin0327 Hi~ Have you made any progress? If not, may I take this?

@zhzhuang-zju
Contributor

@calvin0327 Hi~ Since you haven't replied yet, I'll take it first~ Of course, if you have any ideas or suggestions, feel free to share them here.
/assign

@zhzhuang-zju
Contributor

As can be seen from the error log, the Karmada CRD does not have the spec.components.etcd.local.volumeData.volumeClaim.metadata.name field. Looking at the code, the metadata in the struct definition does contain the name field, so the problem should be in the process of generating the CRD from the struct.

// ObjectMeta is metadata that all persisted resources must have, which includes all objects
// users must create.
type ObjectMeta struct {
    // Name must be unique within a namespace. Is required when creating resources, although
    // some resources may allow a client to request the generation of an appropriate name
    // automatically. Name is primarily intended for creation idempotence and configuration
    // definition.
    // Cannot be updated.
    // More info: https://kubernetes.io/docs/concepts/overview/working-with-objects/names#names
    // +optional
    Name string `json:"name,omitempty" protobuf:"bytes,1,opt,name=name"`

    // (remaining ObjectMeta fields omitted)
}

From the commit history, the above conversion is performed with the controller-gen tool via the command controller-gen crd paths=./operator/pkg/apis/operator/... output:crd:dir=./operator/config/crds.

Referring to kubernetes-sigs/controller-tools#557, the embedded ObjectMeta in the CRD gets properly generated if the generator option generateEmbeddedObjectMeta=true is passed, so we should change the command accordingly to add that option.
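
If that is the fix, the regeneration would look roughly like this (a sketch; the exact flags and the repo's Makefile wiring may differ), after which the CRD and the sample can be re-applied:

controller-gen crd:generateEmbeddedObjectMeta=true paths=./operator/pkg/apis/operator/... output:crd:dir=./operator/config/crds
kubectl apply -f operator/config/crds/operator.karmada.io_karmadas.yaml
kubectl apply -f operator/config/samples/karmada.yaml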

@chaosi-zju
Member Author

@zhzhuang-zju great job~

@chaosi-zju
Member Author

@zhzhuang-zju would you like to push a PR to fix it?

@zhzhuang-zju
Contributor

@zhzhuang-zju would you like to push a PR to fix it?

Of course, I'd love to~
