For full contributor guidance (setup, PR expectations), see contributing.md.
- Prepare an OpenSpec proposal before writing provider code. Start with `openspec-explore` when the problem or scope is still fuzzy and you want to investigate the codebase, compare approaches, or clarify requirements without implementing yet. Example: "Use `openspec-explore` to think through how dashboard schema alignment should work before we formalize the change." Use `openspec-propose` when you are ready to generate an implementation-ready change in one pass; it creates the change plus the key artifacts such as `proposal.md`, `design.md`, and `tasks.md`. Example: "Use `openspec-propose` to create `dashboard-api-schema-alignment` with proposal, design, and tasks." Use `openspec-new-change` when you want to scaffold the change first and then create artifacts incrementally. Example: "Use `openspec-new-change` to start `dashboard-api-schema-alignment` and show me the first artifact template." The goal of this step is an approved change under `openspec/changes/<change-name>/` that is ready to review.
- Open a proposal PR. Send the OpenSpec artifacts for review before implementation. In most cases this PR should contain only the proposal artifacts under `openspec/changes/<change-name>/`. This is the point to resolve scope, requirements, and design questions before code lands.
- Implement the approved proposal. Use `openspec-apply-change` to read the change context, work through the task list, make the code changes, and update task checkboxes as work completes. Example: "Use `openspec-apply-change` for `dashboard-api-schema-alignment` and implement the remaining tasks." Use `openspec-continue-change` if the change is not fully apply-ready yet, or if review or implementation feedback means you need to create the next artifact before continuing. Example: "Use `openspec-continue-change` for `dashboard-api-schema-alignment` and create the next required artifact." Use `openspec-implementation-loop` when you want a more automated end-to-end loop around a single approved change, including implementation, local review, push, and optional PR handling. Example: "Use `openspec-implementation-loop` for `dashboard-api-schema-alignment` in PR mode." During implementation, add or update acceptance tests for new behavior and bug fixes. For bugs, verify the new test fails first so it reproduces the original issue. Make small, reviewable changes. Keep generated artifacts up to date (docs and generated clients when applicable). Run the narrowest tests that prove correctness, then broaden as appropriate. The System User resource (see `internal/elasticsearch/security/system_user`, referenced from coding-standards.md) is the canonical example for new resources; follow it.
- Verify the implementation against the spec. Run `openspec-verify-change` to check completeness, correctness, and coherence against the approved change artifacts. Example: "Use `openspec-verify-change` for `dashboard-api-schema-alignment` and report any gaps before we open the implementation PR." Address any verification findings before moving on.
- Open the implementation PR. Once the change is implemented and verified, open a separate PR for the provider code and any related generated artifacts. Link back to the approved proposal change so reviewers can compare the implementation with the agreed requirements.
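For orientation, the artifact layout that the proposal step produces can be sketched as follows. The openspec tooling creates these files for you; the `mkdir`/`touch` calls below only recreate the expected tree for the example change name, purely as an illustration:

```shell
# Illustration only: the openspec commands generate this layout themselves.
change="openspec/changes/dashboard-api-schema-alignment"
mkdir -p "$change"
touch "$change/proposal.md" "$change/design.md" "$change/tasks.md"
ls "$change"
```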
The canonical list is the root Makefile, but the usual ones are:
- `make lint`
- `make test`
- `make testacc` (requires Docker and `TF_ACC=1`)
- `make docs-generate`
worktrunk manages feature worktrees for this repository so multiple branches can be developed in parallel without switching the main working tree.
Install the shell hook once to get the `wt` alias and tab completion:

```shell
wt config shell install
```

After reloading your shell profile, you can use `wt <branch>` to create or switch to a feature worktree, and `wt` commands will have tab completion.
Keep worktrees inside the bare repo by setting the worktree path template in `~/.config/worktrunk/config.toml`:

```toml
worktree-path = "{{ repo_path }}/{{ branch | sanitize }}"
```

Each feature worktree becomes a subdirectory of the bare repo directory, keeping related branches discoverable and avoiding scattered worktrees across the filesystem.
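As an illustration of how that template might expand (worktrunk performs the real rendering; the repo path below is invented, and `sanitize` is assumed here to replace `/` with `-`):

```shell
# Hypothetical rendering of the worktree-path template.
# repo_path and the sanitize rule are assumptions for illustration only.
repo_path="$HOME/src/terraform-provider-elasticstack.git"
branch="feature/dashboard-api"
sanitized=$(printf '%s' "$branch" | tr '/' '-')
echo "$repo_path/$sanitized"
```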
When a new worktree is created, the post-start hook (`.config/wt.toml`) automatically generates a `.env` from `.env.template` with per-worktree port variables derived deterministically from the branch name, plus the acceptance-test connection variables (`ELASTICSEARCH_ENDPOINTS`, `ELASTICSEARCH_USERNAME`, `KIBANA_ENDPOINT`, `KIBANA_USERNAME`). `TF_ACC` is intentionally not written so acceptance mode remains opt-in.
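The hook's actual port-derivation algorithm is not reproduced here; the following is only a sketch of the "stable per-branch ports" idea, hashing the branch name into a fixed offset (the base ports 19200 and 15601 are made up for illustration):

```shell
# Sketch only: derive a stable offset from the branch name via a checksum.
# The real post-start hook may use a different algorithm and base ports.
branch="dashboard-api-schema-alignment"
offset=$(printf '%s' "$branch" | cksum | awk '{print $1 % 1000}')
echo "ELASTICSEARCH_PORT=$((19200 + offset))"
echo "KIBANA_PORT=$((15601 + offset))"
```

Because the checksum depends only on the branch name, recreating the same worktree yields the same ports.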
The main checkout's `.env` may not contain port variables if it predates the worktrunk setup; port variables are generated only in worktrees created via `wt switch --create`.
Before running Makefile targets that talk directly to Elasticsearch or Kibana on localhost, or before running acceptance tests directly with `go test`, export the worktree's `.env` so the generated connection variables are visible in your shell:
```shell
set -a; . ./.env; set +a

# Then run port-dependent targets, for example:
make testacc-vs-docker
make set-kibana-password
make setup-synthetics
make create-es-api-key
make create-es-bearer-token
make setup-kibana-fleet

# Or run acceptance tests directly:
TF_ACC=1 go test -v ./internal/acctest -run '^TestAccExamples_planOnly$' -count=1
```

Targets that use docker compose (for example `make docker-elasticsearch`, `make docker-kibana`, and `make docker-fleet`) automatically read `.env` from the current directory, so they do not require the export step.
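The reason for `set -a` rather than plain sourcing: variables assigned while allexport is on are exported, so child processes such as `go test` and `make` can see them. A minimal standalone illustration, using a throwaway `demo.env` rather than the real `.env`:

```shell
# Variables sourced from a file are exported to child processes only while
# allexport (set -a) is enabled; plain `. ./demo.env` would set them for the
# current shell but hide them from subprocesses.
printf 'DEMO_PORT=19200\n' > demo.env
set -a; . ./demo.env; set +a
sh -c 'echo "DEMO_PORT is $DEMO_PORT"'   # the child process sees the variable
rm -f demo.env
```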
Alternatively, pass the variables explicitly on the command line:
```shell
make testacc-vs-docker ELASTICSEARCH_PORT=12345 KIBANA_PORT=16789
```

When a worktree is removed (`wt remove`), the pre-remove hook (`docker compose down --volumes`) automatically tears down the Docker Compose stack for that worktree.
These trees hold copy-paste-ready Terraform for this provider. Snippets may be surfaced on generated reference pages (`docs/resources/`, `docs/data-sources/`), in docs templates, or in guides. Not every `.tf` file is shown on every page, but each covered file participates in the validation below.
Regardless of how a file is surfaced, contributions must satisfy both of the following:
- Self-contained modules: a file must not depend on declarations that exist only in a sibling `.tf` in the same directory (locals, variables, resources, or data sources copied from another file).
- Plan-only acceptance coverage: `TestAccExamples_planOnly` in `internal/acctest/` plans every covered example in isolation against the provider (with `TF_ACC=1` and the usual Elasticsearch/Kibana environment variables used elsewhere in acceptance tests).
If you touch or add snippets, run the harness targeted at your change, for example:

```shell
TF_ACC=1 go test ./internal/acctest -run '^TestAccExamples_planOnly$' -count=1
```
Some paths are intentionally skipped in the harness (documented beside the harness); those remain rare exceptions.