Valtik Studios
Secrets Management · Updated 2026-04-17 · 26 min read

Secrets Management 2026: The Complete Guide to Vaults, Rotation, and Leaked-Credential Response

Every company has secrets. Nobody knows where. This is the complete 2026 secrets management guide. What counts as a secret. The 11 places they leak. Vault architecture shootout (HashiCorp Vault, AWS Secrets Manager, Azure Key Vault, Doppler, Infisical). Rotation cadence. Dynamic secrets. Incident response for exposed secrets. Anti-patterns that persist everywhere.

Tre Trebucchi · Founder, Valtik Studios. Penetration tester.

Founder of Valtik Studios. Pentester. Based in Connecticut, serving US mid-market.

Every company has secrets somewhere. Nobody knows where.

The exercise I run on the first day of any security engagement with a software organization is the same: ask for a list of every credential the production environment uses. API keys. Database passwords. TLS certificates. OAuth client secrets. Service account tokens. Every one of them, with owner, last rotation date, and storage location.

Nobody can produce this list. Not once, in any engagement. Some companies claim they can. Then we dig. There are credentials in Docker environment files on developer laptops. Credentials in CI/CD pipeline variables. Credentials in Terraform state. Credentials in Kubernetes ConfigMaps (not Secrets). Credentials in text messages between team members. Credentials in a shared Notion doc. Credentials in the CEO's 1Password.

This is the secrets management problem in 2026. It's not a technical problem. It's an organizational visibility problem. You can't protect what you can't see, and most organizations cannot see where their production secrets live. Which is why supply chain compromises, insider threats, and accidental exposures keep producing catastrophic breaches.

This post is the complete 2026 secrets management guide. What counts as a secret. Where they inevitably leak. The vault architectures that work. Rotation policies that matter. Detection and incident response when they leak anyway.

What counts as a secret

Broader than people think.

Definitely secrets:

  • API keys (external services, internal services)
  • Database passwords
  • Service account passwords
  • OAuth client secrets
  • Private keys (TLS, SSH, code signing, JWT signing)
  • Access tokens (GitHub tokens, cloud access keys)
  • Webhook secrets
  • Encryption keys
  • Authentication tokens

Often secrets but often forgotten:

  • Internal URLs that aren't public-facing
  • DNS configurations
  • Network addresses of internal services
  • Database connection strings
  • Employee directory data
  • Source code of proprietary algorithms

Not really secrets but frequently treated as such:

  • Public API keys (rotatable but not sensitive)
  • Non-sensitive configuration values

The fuzzy boundary is where most of the confusion lives. "The AWS account ID isn't really secret, right?" Sort of. It's discoverable if you look hard enough, but exposing it simplifies reconnaissance for attackers. Treat it as sensitive.

Where secrets leak

From engagements and breach post-mortems, the persistent leak locations:

1. Source code repositories

Hardcoded secrets in code. Committed, pushed, discovered later. Even after removal, git history retains the secret forever unless the history is rewritten.

Tools: TruffleHog, Gitleaks, GitHub Secret Scanning.
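To make the mechanism concrete, here is a toy pattern-based scanner in the spirit of those tools. The regexes are deliberately simplified and illustrative; real scanners ship hundreds of rules plus entropy analysis and verified-credential checks.

```python
import re

# Illustrative patterns only; Gitleaks and TruffleHog cover far more.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_pat": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, match) pairs found in a blob of text."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits
```

Run something like this over every commit in CI (and over git history, not just HEAD) and fail the build on any hit.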

2. Container images

Environment variables baked into Dockerfiles. Credentials copied into image layers. Exposed the moment the image is pulled from a public registry.

Tools: Trivy image scanning, Dive for layer inspection.

3. Environment variables

Set on local machines, in CI/CD pipelines, in production runtime. Visible to anyone with access to the process environment. Leaked through process dumps, error reports, log messages.

Mitigation: read secrets from a vault at runtime, don't persist to environment variables.
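A minimal sketch of that pattern: fetch on demand from a vault client and cache briefly in process memory, rather than exporting values into the environment. The client interface here is a hypothetical stand-in for whatever real SDK you use, not a specific vendor API.

```python
import time

class SecretCache:
    """Read secrets at runtime instead of persisting them to env vars.

    `client` is any object with a get(name) method (a stand-in for a real
    vault SDK). Values live only in process memory, with a short TTL so
    expiry forces a fresh read and rotation propagates quickly."""

    def __init__(self, client, ttl_seconds: float = 300.0):
        self._client = client
        self._ttl = ttl_seconds
        self._cache: dict[str, tuple[float, str]] = {}

    def get(self, name: str) -> str:
        entry = self._cache.get(name)
        if entry and time.monotonic() - entry[0] < self._ttl:
            return entry[1]
        value = self._client.get(name)  # network call to the vault in real life
        self._cache[name] = (time.monotonic(), value)
        return value
```

The point of the TTL is operational, not cryptographic: after a rotation, every consumer converges on the new value within one cache window without a redeploy.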

4. Kubernetes "Secrets"

As noted in our Helm secrets post, Kubernetes Secrets are base64-encoded plaintext. Anyone with get secrets permission reads them directly. So does anyone with access to etcd.
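Base64 is an encoding, not encryption; one standard-library call undoes it, which is the whole point:

```python
import base64

# A Secret manifest stores values base64-encoded. Decoding needs no key.
encoded = base64.b64encode(b"s3cr3t-db-password").decode()
decoded = base64.b64decode(encoded).decode()

print(encoded)  # what `kubectl get secret -o yaml` shows you
print(decoded)  # what anyone with read access trivially recovers
```

If you need actual confidentiality, you need etcd encryption at rest plus tight RBAC, or an external secrets operator backed by a real vault.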

5. Terraform state files

Our Terraform state post covers this. State files contain every sensitive value Terraform manages. Backups in over-permissive S3 buckets, committed to git, stored on shared drives.

6. Cloud service configuration

Lambda environment variables. CloudFormation parameters. Azure Functions app settings. Cloud Functions environment variables on GCP. Often visible to anyone with read access to the cloud service, which is a broader group than intended.

7. Log files

Secrets logged in error messages, request dumps, debugging output. Captured by log aggregators. Retained for months or years.

Mitigation: log scrubbing (redact known secret patterns before ingestion).
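One way to implement scrubbing at the application layer is a logging filter that redacts known patterns before any handler sees the record. This is a sketch using Python's stdlib logging; the patterns are illustrative and should mirror whatever your secret scanner matches.

```python
import logging
import re

# Illustrative redaction patterns; extend to match your scanner's ruleset.
REDACTIONS = [
    re.compile(r"\bAKIA[0-9A-Z]{16}\b"),             # AWS access key IDs
    re.compile(r"(?i)(?:password|token|secret)=\S+"),  # key=value leaks
]

class ScrubFilter(logging.Filter):
    """Redact known secret patterns before a record reaches any handler."""

    def filter(self, record: logging.LogRecord) -> bool:
        message = record.getMessage()
        for pattern in REDACTIONS:
            message = pattern.sub("[REDACTED]", message)
        record.msg, record.args = message, ()
        return True  # never drop the record, only rewrite it
```

Attach it with logger.addFilter(ScrubFilter()). Defense in depth still applies: scrub again at the aggregator, because some service will always log around your filter.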

8. Developer laptops

.env files, docker-compose.yml with secrets, personal scripts, saved passwords in local password managers. Lost laptops, stolen laptops, compromised laptops.

9. Shared documentation

Confluence pages, Notion docs, Google Drive documents containing credentials. Often visible to far more of the company than necessary.

10. Chat platforms

Slack DMs. Microsoft Teams chats. Email. Text messages. Credentials shared in chat for "convenience."

11. Ticketing systems

Zendesk tickets with credentials attached. Jira tickets with production DB connection strings. Retained for years.

Vault architectures

The technical answer to "where should secrets live" is "in a purpose-built vault." The options:

HashiCorp Vault

The incumbent. Community edition (BSL-licensed since 2023) plus enterprise. Most flexible. Most complex.

Pros:

  • Works everywhere (cloud, on-prem, hybrid)
  • Extensive auth method support
  • Dynamic secrets (credentials generated per request)
  • PKI, KMIP, transit encryption built-in
  • Strong audit logging

Cons:

  • Operational burden is real
  • Learning curve is steep
  • Self-hosting HA requires expertise
  • Enterprise license expensive

Best for: complex environments, multi-cloud, teams with dedicated platform engineering.

AWS Secrets Manager

Native AWS service. Integrates with IAM.

Pros:

  • Zero operational burden
  • Native IAM integration
  • Automatic rotation built-in for RDS, DocumentDB, Redshift
  • Cross-region replication
  • Secrets-level IAM policies

Cons:

  • AWS-only
  • Per-secret pricing adds up
  • Rotation for non-AWS services requires custom Lambda functions

Best for: AWS-native organizations.

Azure Key Vault

Azure's equivalent. Same positioning.

Pros: native Azure integration, managed service

Cons: Azure-only, less feature-rich than Vault

Best for: Azure-native organizations.

Google Cloud Secret Manager

GCP's equivalent.

Pros: native GCP integration

Cons: GCP-only, less mature than AWS/Azure

Best for: GCP-native organizations.

Doppler

Modern developer-first secrets platform.

Pros:

  • Excellent UX
  • SDK integration across all languages
  • Environment management built-in
  • Fast
  • Affordable for small teams

Cons:

  • Another vendor in the critical path
  • Less enterprise-specific feature set than Vault

Best for: mid-market SaaS that wants good DX.

1Password Secrets Automation

1Password-powered secrets for applications.

Pros: integrates with 1Password password management

Cons: newer product, smaller ecosystem

Best for: 1Password-centric organizations.

Infisical

Open source secrets platform.

Pros: open source, self-hostable, reasonable UX

Cons: newer, smaller community

Best for: teams that want open source but not Vault complexity.

Which vault for which team

Early-stage startup (< 20 engineers)

Doppler + GitHub Secrets + cloud provider IAM. Avoid Vault.

Mid-market (20-200 engineers)

Cloud provider secrets manager (AWS Secrets Manager / Azure Key Vault / Google Secret Manager) as source of truth. Pull into containers via SDK or External Secrets Operator for Kubernetes.

Consider Vault if hybrid/multi-cloud or if dynamic secrets are a real requirement.

Enterprise (200+ engineers)

Vault as central secrets platform. Cloud-native secrets managers where beneficial (AWS RDS rotation, for example). Strong federation with identity provider.

The rotation problem

Secrets should rotate. Static secrets that live for years are a persistent attack surface. Rotation limits the damage window of any compromise.

Rotation cadence by secret type:

  • Short-lived tokens: auto-generated per request
  • Database passwords: 30-90 days
  • Cloud access keys: 90 days (or eliminate in favor of workload identity)
  • TLS certificates: per certificate lifecycle (publicly trusted certs are capped at 398 days; ACME-issued certs typically 90)
  • API keys: 90-180 days depending on sensitivity
  • Code signing keys: rarely, but strong access controls
  • Root credentials: every incident + annual
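A cadence policy is only useful if something enforces it. A minimal sketch of an age check you could run nightly against your secret inventory, with illustrative limits taken from the list above (field names and types are assumptions, not any tool's schema):

```python
from datetime import date, timedelta

# Illustrative maximum ages in days; adjust to your own policy.
MAX_AGE_DAYS = {
    "database_password": 90,
    "cloud_access_key": 90,
    "api_key": 180,
}

def overdue(secret_type: str, last_rotated: date, today: date) -> bool:
    """True if a secret of this type has exceeded its rotation window."""
    limit = MAX_AGE_DAYS[secret_type]
    return today - last_rotated > timedelta(days=limit)
```

Wire the output to a ticket or an alert, not a dashboard nobody opens.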

Dynamic secrets

Better than static rotation. A service requests credentials when it needs them, uses them briefly, and they expire. No standing credentials to steal.

Vault's dynamic secrets engine does this for databases, cloud providers, PKI, SSH.

Cloud-native equivalent: workload identity (IRSA on AWS, Workload Identity Federation on GCP, Microsoft Entra Workload ID on Azure).

Automation

Rotation that requires manual engineer action doesn't happen. Rotation that requires engineering context switching doesn't happen. Automate everything.

  • Cloud provider secrets managers have native rotation Lambda/Function support
  • Vault dynamic secrets handle this automatically
  • Custom rotation for application-specific secrets requires engineering time

Access controls

Who can read each secret.

Principle of least privilege applies:

  • Service accounts only get access to secrets they need
  • Humans only get access to secrets they need for their role
  • Break-glass humans only access during documented emergencies
  • All access logged

Patterns:

  • IAM policy per secret (AWS Secrets Manager)
  • Vault policies scoped to paths
  • Per-secret RBAC in modern platforms

Audit logging

Every secret access logged. Minimum:

  • Timestamp
  • Principal (who accessed)
  • Secret identifier
  • Source (which system, which IP)
  • Success/failure

Logs go to SIEM. Alerting on:

  • Access from unexpected source
  • Access by principals not normally reading that secret
  • Mass access (many secrets read in short window)
  • Access outside business hours (for non-automated secrets)
  • Failed access attempts

Incident response for exposed secrets

When a secret leaks publicly (git, Pastebin, data dump), assume worst case and rotate immediately.

Phase 1. Detection

Sources:

  • Automated scanning of public repos (GitHub secret scanning, TruffleHog)
  • Threat intel (credential dumps, paste site monitoring)
  • Self-discovery during engineering work
  • External reporter (bug bounty, researcher)
  • Customer reports

Phase 2. Validation

Is the leak real?

  • Confirm the credential exists and is valid
  • Confirm the public exposure is accurate
  • Determine leak time (when was secret exposed?)

Phase 3. Containment

  • Rotate the secret immediately
  • Revoke the old value
  • If the leaked secret was used, audit what the attacker could have done

Phase 4. Investigation

  • Review audit logs for access patterns during the exposure window
  • Identify any suspicious activity
  • Check for downstream compromise (secrets that the leaked credential had access to)

Phase 5. Remediation

  • Fix the leak source (remove from git history, fix the misconfiguration)
  • Update any systems that used the old value
  • Monitor for future abuse

Phase 6. Post-incident

  • Root cause analysis
  • Process improvements
  • Detection rule updates

The metrics

  • Percentage of secrets in the vault (vs. scattered)
  • Mean time since last rotation (by secret type)
  • Audit log coverage (percentage of secrets with audit logging)
  • Leaked secret detection time (time from leak to detection)
  • Rotation time (from leak detection to rotation complete)
  • Static credential count (should trend toward zero)
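Most of these metrics fall out of the inventory directly. A sketch of the computation, with an assumed inventory schema (the field names are illustrative, not from any particular tool):

```python
def secrets_metrics(inventory):
    """Headline metrics from a secret inventory: a list of dicts with
    'in_vault', 'audited', and 'static' boolean fields (assumed schema)."""
    total = len(inventory)
    if total == 0:
        return {}
    return {
        "pct_in_vault": 100.0 * sum(s["in_vault"] for s in inventory) / total,
        "pct_audited": 100.0 * sum(s["audited"] for s in inventory) / total,
        "static_credential_count": sum(s["static"] for s in inventory),
    }
```

The denominators are the hard part: these numbers are only as honest as the inventory, which is exactly the visibility problem this post opened with.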

The anti-patterns

Everyone shares 1Password

1Password is a password manager. It's not a secrets management platform. Credentials shared via 1Password have no audit logs, no rotation, no access controls beyond the shared vault.

Fix: 1Password for employees' individual passwords. Secrets manager for production credentials.

.env.production committed to git

Not because someone is malicious. Because the developer who set up deployment wrote .env.production.example, then renamed to .env.production to test, then forgot to .gitignore it.

Fix: Pre-commit hooks. Gitleaks in CI. GitHub secret scanning with push protection.

Credentials shared over Slack for "convenience"

Slack logs are discoverable and retained. A "one-time share" is a permanent record.

Fix: Vault with per-request sharing. Or at minimum 1Password share links that expire.

Hardcoded secrets in Dockerfiles

The secret is in the image. The image is in the registry. The registry is pulled by anyone with pull access. The secret is everywhere.

Fix: Runtime secret injection. Never bake secrets into images.

No rotation ever

Same database password since 2019. Any breach involving any system that used that password compromises it today.

Fix: Mandatory rotation policy. Automated where possible.

Single-point-of-failure vault

Vault runs as a single cluster in one region. Vault goes down. Production can't authenticate to anything. The incident escalates.

Fix: Vault HA across regions, or graceful degradation patterns.

Working with us

We run secrets management engagements focused on:

  • Current-state assessment (inventory of where secrets actually live)
  • Vault architecture design
  • Migration planning from ad-hoc to vault-centric
  • Rotation policy design
  • Detection and monitoring integration
  • Incident response planning for exposed secrets

Valtik Studios, valtikstudios.com.

Tags: secrets management · vault · hashicorp vault · aws secrets manager · azure key vault · rotation · credential leak · complete guide

Want us to check your Secrets Management setup?

Our scanner detects this exact misconfiguration, plus dozens more across 38 platforms. Free website check available, no commitment required.
