Namespace inheritance breaks strategic merge patches in Kustomize 5.8.0 #6031

@ikyasam18

Description

What happened?

I encountered the following error after upgrading Kustomize from version 5.7.0 to 5.8.0.

$ go run main.go build /Users/masayuki.kamoda/work/kustomize/kustomize-repro/overlay
Error: accumulating resources: accumulation err='accumulating resources from './hoge_v2': '/Users/masayuki.kamoda/work/kustomize/kustomize-repro/overlay/hoge_v2' must resolve to a file': recursed accumulation of path '/Users/masayuki.kamoda/work/kustomize/kustomize-repro/overlay/hoge_v2': no resource matches strategic merge patch "ConfigMap.v1.[noGrp]/hoge-schedule_job_1.[noNs]": no matches for Id ConfigMap.v1.[noGrp]/hoge-schedule_job_1.[noNs]; failed to find unique target for patch ConfigMap.v1.[noGrp]/hoge-schedule_job_1.[noNs]
exit status 1
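For context, the resource ID in the error encodes kind.version.[group]/name.[namespace], so `[noNs]` means Kustomize is looking for a patch target with no namespace. A strategic merge patch like the following (a hypothetical minimal example, not necessarily the repro repo's actual file) produces exactly that ID, because the patch itself declares no `metadata.namespace`:

```yaml
# Hypothetical patch file for illustration; the real one lives in the linked repro repo.
apiVersion: v1
kind: ConfigMap
metadata:
  name: hoge-schedule_job_1   # no namespace here, hence the [noNs] in the error ID
data:
  configs.yaml: |
    case_configs:
      - id: 1
        name: dummy1
        client: client1
```

Per the behavior described in this report, 5.7.0 still matched such a namespace-less patch against the ConfigMap that later receives `namespace: hoge-dev`, while 5.8.0 fails to find a target.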

What did you expect to happen?

  • In Kustomize 5.7.0: The patch is successfully applied to the ConfigMap
  • In Kustomize 5.8.0: The patch should continue to work, maintaining backward compatibility

How can we reproduce it (as minimally and precisely as possible)?

https://github.com/ikyasam18/kustomize-repro

kustomize build ~/work/kustomize/kustomize-repro/overlay 
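For readers who don't want to clone the repo, the paths in the error message suggest a layout roughly like the following (a sketch with hypothetical base path and file names, not the repository's exact contents):

```yaml
# overlay/kustomization.yaml (sketch)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
namespace: hoge-dev   # namespace is set at the outer overlay level
resources:
  - ./hoge_v2         # nested kustomization that applies the patch

# overlay/hoge_v2/kustomization.yaml (sketch)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base                   # hypothetical base defining the ConfigMaps and CronJobs
patches:
  - path: configmap_patch.yaml   # hypothetical name; strategic merge patch on the ConfigMap
```

The essential shape is that the namespace is declared in the outer kustomization while the strategic merge patch is applied in the inner one, so the patch's target ID carries no namespace at the point where the patch is resolved.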

Expected output

apiVersion: v1
data:
  configs.yaml: |
    case_configs:
      - id: 1
        name: dummy1
        client: client1
kind: ConfigMap
metadata:
  name: hoge-schedule_job_1
  namespace: hoge-dev
---
apiVersion: v1
data:
  configs.yaml: |
    case_configs:
      - id: 2
        name: dummy2
        client: client2
kind: ConfigMap
metadata:
  name: hoge-schedule_job_2
  namespace: hoge-dev
---
apiVersion: v1
data:
  configs.yaml: |
    case_configs:
      - id: 3
        name: dummy3
        client: client3
kind: ConfigMap
metadata:
  name: hoge-schedule_job_3
  namespace: hoge-dev
---
apiVersion: v1
data:
  configs.yaml: |
    case_configs:
      - id: 4
        name: dummy4
        client: client4
kind: ConfigMap
metadata:
  name: hoge-schedule_job_4
  namespace: hoge-dev
---
apiVersion: batch/v1
kind: CronJob
metadata:
  name: hoge-schedule_job_1
  namespace: hoge-dev
spec:
  concurrencyPolicy: Forbid
  jobTemplate:
    spec:
      activeDeadlineSeconds: 600
      template:
        metadata:
          annotations:
            cluster-autoscaler.kubernetes.io/safe-to-evict: "false"
        spec:
          containers:
          - args:
            - hoge
            - run-schedule_job_1
            command:
            - python
            - scripts/main.py
            image: TO_BE_SPECIFIED
            imagePullPolicy: Always
            name: hoge
            resources:
              limits:
                cpu: 1
                memory: 5Gi
              requests:
                cpu: 0.25
                memory: 2Gi
          nodeSelector:
            machine-type: highmem-2
          restartPolicy: Never
          serviceAccountName: hoge-batch
          tolerations:
          - effect: NoSchedule
            key: zeroScale
            operator: Equal
          volumes:
          - configMap:
              name: hoge-schedule_job_1
            name: configs
  schedule: 10 8 * * *
  startingDeadlineSeconds: 600
  timeZone: Asia/Tokyo
---
apiVersion: batch/v1
kind: CronJob
metadata:
  name: hoge-schedule_job_2
  namespace: hoge-dev
spec:
  concurrencyPolicy: Forbid
  jobTemplate:
    spec:
      activeDeadlineSeconds: 600
      template:
        metadata:
          annotations:
            cluster-autoscaler.kubernetes.io/safe-to-evict: "false"
        spec:
          containers:
          - args:
            - hoge
            - run-schedule_job_2
            command:
            - python
            - scripts/main.py
            image: TO_BE_SPECIFIED
            imagePullPolicy: Always
            name: hoge
            resources:
              limits:
                cpu: 1
                memory: 5Gi
              requests:
                cpu: 0.25
                memory: 2Gi
          nodeSelector:
            machine-type: highmem-2
          restartPolicy: Never
          serviceAccountName: hoge-batch
          tolerations:
          - effect: NoSchedule
            key: zeroScale
            operator: Equal
          volumes:
          - configMap:
              name: hoge-schedule_job_2
            name: configs
  schedule: 10 8 * * *
  startingDeadlineSeconds: 600
  timeZone: Asia/Tokyo
---
apiVersion: batch/v1
kind: CronJob
metadata:
  name: hoge-schedule_job_3
  namespace: hoge-dev
spec:
  concurrencyPolicy: Forbid
  jobTemplate:
    spec:
      activeDeadlineSeconds: 600
      template:
        metadata:
          annotations:
            cluster-autoscaler.kubernetes.io/safe-to-evict: "false"
        spec:
          containers:
          - args:
            - hoge
            - run-schedule_job_3
            command:
            - python
            - scripts/main.py
            image: TO_BE_SPECIFIED
            imagePullPolicy: Always
            name: hoge
            resources:
              limits:
                cpu: 1
                memory: 5Gi
              requests:
                cpu: 0.25
                memory: 2Gi
          nodeSelector:
            machine-type: highmem-2
          restartPolicy: Never
          serviceAccountName: hoge-batch
          tolerations:
          - effect: NoSchedule
            key: zeroScale
            operator: Equal
          volumes:
          - configMap:
              name: hoge-schedule_job_3
            name: configs
  schedule: 10 8 * * *
  startingDeadlineSeconds: 600
  timeZone: Asia/Tokyo
---
apiVersion: batch/v1
kind: CronJob
metadata:
  name: hoge-schedule_job_4
  namespace: hoge-dev
spec:
  concurrencyPolicy: Forbid
  jobTemplate:
    spec:
      activeDeadlineSeconds: 600
      template:
        metadata:
          annotations:
            cluster-autoscaler.kubernetes.io/safe-to-evict: "false"
        spec:
          containers:
          - args: TO_BE_SPECIFIED
            command:
            - python
            - scripts/main.py
            - hoge
            - run-schedule_job_4
            image: TO_BE_SPECIFIED
            imagePullPolicy: Always
            name: hoge
            resources:
              limits:
                cpu: 1
                memory: 5Gi
              requests:
                cpu: 0.25
                memory: 2Gi
          nodeSelector:
            machine-type: highmem-2
          restartPolicy: Never
          serviceAccountName: hoge-batch
          tolerations:
          - effect: NoSchedule
            key: zeroScale
            operator: Equal
          volumes:
          - configMap:
              name: hoge-schedule_job_4
            name: configs
  schedule: 10 8 * * *
  startingDeadlineSeconds: 600
  timeZone: Asia/Tokyo

Actual output

$ go run main.go build /Users/masayuki.kamoda/work/kustomize/kustomize-repro/overlay
Error: accumulating resources: accumulation err='accumulating resources from './hoge_v2': '/Users/masayuki.kamoda/work/kustomize/kustomize-repro/overlay/hoge_v2' must resolve to a file': recursed accumulation of path '/Users/masayuki.kamoda/work/kustomize/kustomize-repro/overlay/hoge_v2': no resource matches strategic merge patch "ConfigMap.v1.[noGrp]/hoge-schedule_job_1.[noNs]": no matches for Id ConfigMap.v1.[noGrp]/hoge-schedule_job_1.[noNs]; failed to find unique target for patch ConfigMap.v1.[noGrp]/hoge-schedule_job_1.[noNs]
exit status 1

Kustomize version

5.8.0

Operating system

Linux

    Labels

    kind/bug: Categorizes issue or PR as related to a bug.
    triage/accepted: Indicates an issue or PR is ready to be actively worked on.
