# Helm Chart Testing
## Why Helm Charts Need Their Own Tests
Helm charts add a templating layer on top of Kubernetes manifests. This introduces a new class of bugs: a template might produce valid YAML for one set of values but invalid YAML for another. A conditional block might silently omit a critical resource. A helper template might generate incorrect labels that break service discovery.
Testing Helm charts means validating not just the chart itself, but the rendered output across all supported value combinations.
## The Four Layers of Helm Testing
| Layer | What It Tests | Speed | Tools |
|---|---|---|---|
| Lint | Chart structure, syntax, metadata | Instant | helm lint |
| Template render + validate | Rendered YAML correctness | Seconds | helm template + kubeconform |
| Unit tests | Template logic, conditionals, defaults | Seconds | helm-unittest |
| Integration tests | Chart deploys and works in a real cluster | Minutes | helm test, ct (chart-testing) |
## Linting: The First Gate

`helm lint` catches structural issues in your chart before you render or deploy anything:

```shell
# Basic lint
helm lint ./charts/myapp/

# Lint with specific values (catches issues with value-dependent templates)
helm lint ./charts/myapp/ --values values-production.yaml

# Lint in strict mode (warnings become errors)
helm lint ./charts/myapp/ --strict

# Common issues helm lint catches:
# - Missing Chart.yaml required fields
# - Template syntax errors (unclosed {{ }})
# - Missing values referenced in templates
# - Deprecated API versions
```
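Linting goes further if the chart ships a `values.schema.json`: Helm validates supplied values against that JSON Schema during lint and install. A minimal sketch for this chart (the field names and constraints are illustrative, not part of the example chart above):

```json
{
  "$schema": "https://json-schema.org/draft-07/schema#",
  "type": "object",
  "required": ["image"],
  "properties": {
    "replicas": {
      "type": "integer",
      "minimum": 1
    },
    "image": {
      "type": "object",
      "required": ["repository", "tag"],
      "properties": {
        "repository": { "type": "string" },
        "tag": { "type": "string" }
      }
    }
  }
}
```

With this in place, `helm lint ./charts/myapp/ --values bad-values.yaml` fails fast on type errors (for example, a string `replicas`) before any template is rendered.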
### Chart.yaml Best Practices

```yaml
# Chart.yaml
apiVersion: v2
name: myapp
description: A Helm chart for the MyApp service
type: application
version: 1.5.0      # Chart version (SemVer)
appVersion: "2.3.1" # Application version
maintainers:
  - name: Platform Team
    email: platform@example.com
dependencies:
  - name: postgresql
    version: "13.x"
    repository: https://charts.bitnami.com/bitnami
    condition: postgresql.enabled
```
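Range constraints like `13.x` are resolved to an exact version in `Chart.lock` when you run `helm dependency update`; committing the lock file keeps dependency resolution reproducible across machines and CI. A sketch of what the generated file looks like (the resolved version, digest, and timestamp are illustrative):

```yaml
# Chart.lock -- generated by `helm dependency update`, commit it
dependencies:
  - name: postgresql
    repository: https://charts.bitnami.com/bitnami
    version: 13.2.24
digest: sha256:0000000000000000000000000000000000000000000000000000000000000000  # placeholder
generated: "2024-01-15T10:00:00Z"
```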
## Template Rendering and Validation

Rendering templates without deploying is the most important testing step. It catches issues that lint cannot detect, because lint does not evaluate template logic.

```shell
# Render templates with default values
helm template myapp ./charts/myapp/

# Render with production values
helm template myapp ./charts/myapp/ \
  --values values-production.yaml

# Pipe rendered output to kubeconform for K8s schema validation
helm template myapp ./charts/myapp/ \
  --values values-production.yaml | kubeconform -strict

# Test with multiple value files
helm template myapp ./charts/myapp/ \
  --values values-production.yaml \
  --values values-us-east-1.yaml | kubeconform -strict

# Render with value overrides to test specific scenarios
helm template myapp ./charts/myapp/ \
  --set replicas=1 \
  --set image.tag=latest \
  --set ingress.enabled=true | kubeconform -strict
```
### Testing All Value Combinations

```shell
#!/bin/bash
# scripts/test-helm-values.sh
# Test chart rendering with all supported value files
set -o pipefail  # a helm template failure must fail the pipeline, not just kubeconform

CHART_DIR="./charts/myapp"
FAILURES=0

# Test each environment's values
for values_file in values-*.yaml; do
  echo "Testing with $values_file..."
  if ! helm template myapp "$CHART_DIR" --values "$values_file" | kubeconform -strict; then
    echo "FAIL: $values_file produces invalid manifests"
    FAILURES=$((FAILURES + 1))
  fi
done

# Test with minimal values (defaults only)
echo "Testing with defaults only..."
if ! helm template myapp "$CHART_DIR" | kubeconform -strict; then
  echo "FAIL: Default values produce invalid manifests"
  FAILURES=$((FAILURES + 1))
fi

if [ "$FAILURES" -gt 0 ]; then
  echo "$FAILURES value combinations failed"
  exit 1
fi
echo "All value combinations passed"
```
## Unit Testing with helm-unittest

helm-unittest is a dedicated unit-testing framework for Helm charts. It lets you write assertions against rendered template output without deploying anything.

### Installation

```shell
# Install as a Helm plugin
helm plugin install https://github.com/helm-unittest/helm-unittest
```
### Test Structure

```
charts/myapp/
  Chart.yaml
  values.yaml
  templates/
    deployment.yaml
    service.yaml
    ingress.yaml
  tests/
    deployment_test.yaml
    service_test.yaml
    ingress_test.yaml
```
### Writing Unit Tests

```yaml
# tests/deployment_test.yaml
suite: Deployment Tests
templates:
  - deployment.yaml
tests:
  - it: should set resource limits
    asserts:
      - isNotNull:
          path: spec.template.spec.containers[0].resources.limits.cpu
      - isNotNull:
          path: spec.template.spec.containers[0].resources.limits.memory

  - it: should not run as root
    asserts:
      - equal:
          path: spec.template.spec.securityContext.runAsNonRoot
          value: true

  - it: should use the correct image tag
    set:
      image.tag: "2.3.1"
    asserts:
      - matchRegex:
          path: spec.template.spec.containers[0].image
          pattern: ":2\\.3\\.1$"

  - it: should set correct replica count for production
    values:
      - ../values-production.yaml
    asserts:
      - equal:
          path: spec.replicas
          value: 3

  - it: should set readiness probe
    asserts:
      - isNotNull:
          path: spec.template.spec.containers[0].readinessProbe
      - equal:
          path: spec.template.spec.containers[0].readinessProbe.httpGet.path
          value: /healthz

  - it: should set liveness probe
    asserts:
      - isNotNull:
          path: spec.template.spec.containers[0].livenessProbe

  - it: should drop all capabilities
    asserts:
      - contains:
          path: spec.template.spec.containers[0].securityContext.capabilities.drop
          content: "ALL"

  - it: should not use latest tag
    set:
      image.tag: "latest"
    asserts:
      - failedTemplate:
          errorMessage: "image.tag must not be 'latest'"
```
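The `failedTemplate` assertion only passes if the chart actually rejects `latest`. One way such a guard might look, using Helm's `fail` function at the top of a hypothetical `templates/deployment.yaml` (the value paths are assumed to match this chart):

```yaml
# templates/deployment.yaml -- illustrative validation guard
{{- if eq .Values.image.tag "latest" }}
{{- fail "image.tag must not be 'latest'" }}
{{- end }}
```

Guards like this turn silently wrong deployments into loud template errors, which is exactly what the unit test pins down.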
### Testing Conditional Resources

```yaml
# tests/ingress_test.yaml
suite: Ingress Tests
templates:
  - ingress.yaml
tests:
  - it: should not create ingress by default
    asserts:
      - hasDocuments:
          count: 0

  - it: should create ingress when enabled
    set:
      ingress.enabled: true
      ingress.host: "myapp.example.com"
    asserts:
      - hasDocuments:
          count: 1
      - equal:
          path: spec.rules[0].host
          value: "myapp.example.com"

  - it: should configure TLS when specified
    set:
      ingress.enabled: true
      ingress.host: "myapp.example.com"
      ingress.tls.enabled: true
      ingress.tls.secretName: "myapp-tls"
    asserts:
      - equal:
          path: spec.tls[0].secretName
          value: "myapp-tls"
      - contains:
          path: spec.tls[0].hosts
          content: "myapp.example.com"
```
### Running Unit Tests

```shell
# Run all tests
helm unittest ./charts/myapp/

# Run with debug output
helm unittest --debug ./charts/myapp/

# Output JUnit XML for CI
helm unittest -t JUnit -o test-results.xml ./charts/myapp/

# Run a specific test file
helm unittest -f 'tests/deployment_test.yaml' ./charts/myapp/
```
## Integration Testing with chart-testing (ct)

chart-testing (ct), maintained by the Helm project, validates charts by actually installing them in a cluster. Use it with kind (Kubernetes in Docker) in CI:

```yaml
# .github/workflows/helm-test.yml
name: Helm Chart Tests
on:
  pull_request:
    paths:
      - 'charts/**'
jobs:
  lint-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Set up Helm
        uses: azure/setup-helm@v4
      - name: Set up chart-testing
        uses: helm/chart-testing-action@v2
      - name: Lint charts
        run: ct lint --all
      - name: Create kind cluster
        uses: helm/kind-action@v1
      - name: Install and test charts
        run: ct install --all
```
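ct reads repository-level settings from a `ct.yaml` at the repo root, which keeps the workflow above free of flags. A minimal sketch (the values are illustrative for this repo layout):

```yaml
# ct.yaml -- chart-testing configuration
target-branch: main          # branch to diff against when detecting changed charts
chart-dirs:
  - charts                   # where charts live in this repo
check-version-increment: true  # require a chart version bump on every change
validate-maintainers: false
```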
## In-Chart Test Hooks

Helm supports test hooks -- pods that run after installation to verify the chart works:

```yaml
# templates/tests/test-connection.yaml
apiVersion: v1
kind: Pod
metadata:
  name: "{{ include "myapp.fullname" . }}-test-connection"
  labels:
    {{- include "myapp.labels" . | nindent 4 }}
  annotations:
    "helm.sh/hook": test
    "helm.sh/hook-delete-policy": hook-succeeded
spec:
  containers:
    - name: wget
      image: busybox:1.36
      command: ['wget']
      args: ['{{ include "myapp.fullname" . }}:{{ .Values.service.port }}/healthz']
  restartPolicy: Never
```

```shell
# Run chart tests after installation
helm test myapp --namespace production
```
## Best Practices Summary

- Always lint before rendering -- `helm lint --strict` catches structural issues instantly.
- Render with all value files -- A chart that works with defaults but breaks with production values is a deployment risk.
- Pipe rendered output to kubeconform -- This catches Kubernetes schema issues that Helm does not check.
- Write unit tests for conditionals -- Every `if`/`else` in your templates needs a test for both branches.
- Test value validation -- If certain value combinations are invalid, ensure templates fail with clear error messages.
- Use chart-testing in CI -- Actual installation tests catch issues that static analysis misses.
- Pin dependency versions -- Use exact or range versions in Chart.yaml, never `*`.