Secrets, Keys, and Tokens: Securing Your Python Automation Scripts
Stop hardcoding API keys in your scripts. Learn how to manage secrets properly with environment variables, OS credential stores, token rotation, and secure CI/CD pipelines.

Somewhere in your codebase, there is an API key sitting in plain text. Maybe it is in a config file. Maybe it is hardcoded in a function. Maybe it is in a Jupyter notebook that got committed to Git six months ago.
This is how credentials leak. And leaked credentials are expensive — not just the direct damage, but the incident response, key rotation, and audit that follows.
This guide covers how to manage secrets properly in Python automation scripts. From the basics (environment variables and .env files) to production patterns (OS keyring storage, OAuth2 token refresh, CI/CD secrets, key rotation), every pattern here is immediately applicable.
# The Credential Lifecycle
```mermaid
flowchart LR
    G["Generate\n(API dashboard)"] --> S["Store\n(vault / env)"]
    S --> I["Inject\n(runtime)"]
    I --> U["Use\n(API calls)"]
    U --> R["Rotate\n(scheduled)"]
    R --> S
    S -.-> A["Audit\n(access logs)"]
```
After generation, a credential moves through four stages: storage, injection, usage, and rotation, with auditing running alongside. Most scripts handle usage but ignore everything else.
# What You Will Need
```bash
pip install python-dotenv keyring requests
```

- python-dotenv — load environment variables from `.env` files
- keyring — access OS credential storage
- requests — HTTP client for API calls
# The Problem: What Not to Do
Every one of these is a real pattern found in production scripts:
```python
# ❌ Hardcoded in source code
API_KEY = "sk-live-abc123def456"

# ❌ In a config dict
config = {
    "api_key": "sk-live-abc123def456",
    "db_password": "P@ssw0rd!",
}

# ❌ In a comment (yes, this happens)
# Old key: sk-live-old-key-here
# New key: sk-live-abc123def456
```
All of these end up in version control. And once a secret is in Git history, it is there forever — even if you delete the file.
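Before committing, it helps to scan for obvious key patterns yourself. A minimal sketch (the `sk-live-` prefix and both regexes are illustrative; dedicated scanners such as gitleaks or detect-secrets ship far more complete rule sets):

```python
import re

# Illustrative patterns only; real scanners ship hundreds of rules
SECRET_PATTERNS = [
    re.compile(r'sk-live-\w+'),  # example vendor key prefix
    re.compile(r'password\s*[:=]\s*\S+', re.IGNORECASE),
]

def find_secrets(text):
    """Return (line_number, match) pairs for lines that look like secrets."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pattern in SECRET_PATTERNS:
            match = pattern.search(line)
            if match:
                hits.append((lineno, match.group()))
    return hits

source = 'config = {\n    "api_key": "sk-live-abc123def456",\n}'
print(find_secrets(source))  # [(2, 'sk-live-abc123def456')]
```

Run against staged files in a pre-commit hook and fail the commit on any hit.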
# Step 1: Environment Variables (The Baseline)
The simplest improvement: move secrets out of code and into the environment.
# Setting Environment Variables
```bash
# Linux / macOS
export API_KEY="sk-live-abc123def456"
export DB_PASSWORD="secure-password-here"
```

```powershell
# Windows PowerShell
$env:API_KEY = "sk-live-abc123def456"
$env:DB_PASSWORD = "secure-password-here"
```
# Reading in Python
```python
import os

def get_required_env(name):
    """Get an environment variable or raise a clear error."""
    value = os.environ.get(name)
    if not value:
        raise EnvironmentError(
            f"Required environment variable '{name}' is not set. "
            f"Set it with: export {name}=your-value"
        )
    return value

# Usage
api_key = get_required_env("API_KEY")
db_password = get_required_env("DB_PASSWORD")
```
# Using .env Files (Local Development)
For local development, use a .env file — but never commit it.
```
# .env (add to .gitignore!)
API_KEY=sk-live-abc123def456
DB_PASSWORD=secure-password-here
SMTP_HOST=smtp.example.com
SMTP_PORT=587
```
```python
from dotenv import load_dotenv
import os

# Load .env file into environment
load_dotenv()

api_key = os.environ["API_KEY"]
db_password = os.environ["DB_PASSWORD"]
```
# The .gitignore Entry
```
# Secrets — never commit these
.env
.env.local
.env.production
*.pem
*.key
credentials.json
```
This is non-negotiable. Add the .gitignore entry before creating the .env file.
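One way to make this stick is a small guard that refuses to create a `.env` file until `.gitignore` covers it. A sketch under a simplifying assumption: it only checks for a literal `.env` entry, while `git check-ignore .env` remains the authoritative test:

```python
import tempfile
from pathlib import Path

def env_is_ignored(repo_root="."):
    """Return True if '.env' appears as a literal entry in .gitignore.

    Simplification: does not implement full gitignore pattern matching.
    """
    gitignore = Path(repo_root) / ".gitignore"
    if not gitignore.exists():
        return False
    entries = {line.strip() for line in gitignore.read_text().splitlines()}
    return ".env" in entries

# Demo in a throwaway directory
with tempfile.TemporaryDirectory() as repo:
    (Path(repo) / ".gitignore").write_text(".env\n*.pem\n")
    print(env_is_ignored(repo))  # True: safe to create .env here
```

Call the check at the top of any setup script that writes secrets to disk, and abort if it returns False.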
# Step 2: Credential Classes
Centralise credential access so every part of your pipeline gets secrets the same way:
```python
import os
from dataclasses import dataclass
from dotenv import load_dotenv

@dataclass
class PipelineCredentials:
    """Centralised credential management for pipeline scripts."""
    api_key: str
    db_host: str
    db_password: str
    smtp_host: str
    smtp_password: str

    @classmethod
    def from_environment(cls):
        """Load all credentials from environment variables."""
        load_dotenv()
        required = ["API_KEY", "DB_HOST", "DB_PASSWORD", "SMTP_HOST", "SMTP_PASSWORD"]
        missing = [var for var in required if not os.environ.get(var)]
        if missing:
            raise EnvironmentError(
                f"Missing required environment variables: {', '.join(missing)}"
            )
        return cls(
            api_key=os.environ["API_KEY"],
            db_host=os.environ["DB_HOST"],
            db_password=os.environ["DB_PASSWORD"],
            smtp_host=os.environ["SMTP_HOST"],
            smtp_password=os.environ["SMTP_PASSWORD"],
        )

    def __repr__(self):
        """Never print actual credential values."""
        return "PipelineCredentials(api_key=***, db_password=***, ...)"
```
# Using the Credential Class
```python
def run_pipeline():
    """Pipeline that loads credentials safely."""
    creds = PipelineCredentials.from_environment()

    # Pass credentials explicitly — no global state
    data = fetch_data(creds.api_key)
    save_to_db(data, creds.db_host, creds.db_password)
    send_report(creds.smtp_host, creds.smtp_password)
```
# Step 3: OS Keyring (Desktop Scripts)
For scripts that run on your machine (not servers), use the OS credential store:
```python
import keyring

# Store a credential (run once — better still, prompt with
# getpass.getpass() so the value never appears in a script at all)
keyring.set_password("my_pipeline", "api_key", "sk-live-abc123def456")
keyring.set_password("my_pipeline", "db_password", "secure-password")

# Retrieve in your script
api_key = keyring.get_password("my_pipeline", "api_key")
db_password = keyring.get_password("my_pipeline", "db_password")

if not api_key:
    raise ValueError("API key not found in keyring. Run setup first.")
```
This stores credentials in:
- macOS — Keychain
- Windows — Credential Manager
- Linux — Secret Service (GNOME Keyring / KWallet)
No plain text files. No environment variables visible in process listings.
# Step 4: OAuth2 Token Management
Many modern APIs use OAuth2. Tokens expire and need refreshing — handle this automatically.
```mermaid
flowchart LR
    C[Client] -->|Client ID + Secret| A[Auth Server]
    A -->|Access Token + Refresh Token| C
    C -->|Access Token| API[API Server]
    API -->|401 Expired| C
    C -->|Refresh Token| A
    A -->|New Access Token| C
```
```python
import requests
import time
import os
import json

class OAuth2TokenManager:
    """Manage OAuth2 tokens with automatic refresh."""

    def __init__(self, token_url, client_id, client_secret, token_file=".oauth_token.json"):
        self.token_url = token_url
        self.client_id = client_id
        self.client_secret = client_secret
        self.token_file = token_file
        self._token = None
        self._expires_at = 0

    def get_token(self):
        """Get a valid access token, refreshing if needed."""
        if self._token and time.time() < self._expires_at - 60:
            return self._token

        # Try loading cached token
        cached = self._load_cached_token()
        if cached and time.time() < cached.get("expires_at", 0) - 60:
            self._token = cached["access_token"]
            self._expires_at = cached["expires_at"]
            return self._token

        # Request new token
        return self._request_new_token()

    def _request_new_token(self):
        """Request a new access token from the auth server."""
        response = requests.post(
            self.token_url,
            data={
                "grant_type": "client_credentials",
                "client_id": self.client_id,
                "client_secret": self.client_secret,
            },
            timeout=30,
        )
        response.raise_for_status()
        data = response.json()
        self._token = data["access_token"]
        self._expires_at = time.time() + data.get("expires_in", 3600)

        # Cache token to disk
        self._save_token(data)
        return self._token

    def _save_token(self, token_data):
        """Cache token data (not the client secret)."""
        cache = {
            "access_token": token_data["access_token"],
            "expires_at": self._expires_at,
        }
        with open(self.token_file, "w") as f:
            json.dump(cache, f)

    def _load_cached_token(self):
        """Load cached token if available."""
        if not os.path.exists(self.token_file):
            return None
        with open(self.token_file, "r") as f:
            return json.load(f)

    def get_headers(self):
        """Get authorization headers with a valid token."""
        return {"Authorization": f"Bearer {self.get_token()}"}
```
# Using the Token Manager
```python
# Credentials from environment — secrets never in code
token_manager = OAuth2TokenManager(
    token_url="https://auth.example.com/oauth/token",
    client_id=os.environ["OAUTH_CLIENT_ID"],
    client_secret=os.environ["OAUTH_CLIENT_SECRET"],
)

# Token refresh is automatic
response = requests.get(
    "https://api.example.com/data",
    headers=token_manager.get_headers(),
    timeout=30,
)
```
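Proactive refresh covers expiry, but a server can still reject a token early (for example after a manual revocation), which is the 401 path in the diagram above. A generic retry wrapper, sketched with plain callables so it is not tied to any particular client class:

```python
def request_with_refresh(do_request, force_refresh):
    """Call do_request(); on a 401, refresh the token and retry once.

    do_request: callable returning a response object with .status_code
    force_refresh: callable that discards/renews the cached token
    """
    response = do_request()
    if response.status_code == 401:
        force_refresh()
        response = do_request()
    return response
```

With the token manager above, `do_request` would wrap the `requests.get` call; `force_refresh` could reset the cached expiry so the next `get_token()` fetches a fresh token (an assumption about its internals, not a public API).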
# Step 5: Secrets in CI/CD
Never put secrets in CI configuration files. Use the platform's secret storage.
# GitHub Actions
```yaml
# .github/workflows/pipeline.yml
name: Run Pipeline

on:
  schedule:
    - cron: "0 6 * * 1-5"

jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run pipeline
        env:
          API_KEY: ${{ secrets.API_KEY }}
          DB_PASSWORD: ${{ secrets.DB_PASSWORD }}
          SMTP_PASSWORD: ${{ secrets.SMTP_PASSWORD }}
        run: python pipeline.py
```
Secrets are injected as environment variables at runtime, and GitHub Actions masks registered secret values in log output. Masking is best-effort, so avoid printing secrets at all.
# Preventing Secret Leaks in Logs
```python
import logging
import re

class SecretFilter(logging.Filter):
    """Filter that redacts known secret patterns from log output."""

    PATTERNS = [
        (re.compile(r'(sk-live-)\w+'), r'\1****'),
        (re.compile(r'(Bearer )\S+'), r'\1****'),
        (re.compile(r'(password["\s:=]+)\S+', re.IGNORECASE), r'\1****'),
        (re.compile(r'(api[_-]?key["\s:=]+)\S+', re.IGNORECASE), r'\1****'),
    ]

    def filter(self, record):
        message = record.getMessage()
        for pattern, replacement in self.PATTERNS:
            message = pattern.sub(replacement, message)
        record.msg = message
        record.args = ()
        return True

# Apply to the pipeline logger (filters attach per logger or per handler)
logger = logging.getLogger("pipeline")
logger.addFilter(SecretFilter())
```
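It is worth exercising the redaction patterns directly before trusting them in production logging; the sample line here is made up:

```python
import re

bearer = re.compile(r'(Bearer )\S+')
line = "GET /data failed with Authorization: Bearer sk-live-abc123"
print(bearer.sub(r'\1****', line))
# GET /data failed with Authorization: Bearer ****
```

A few quick checks like this catch regexes that are too narrow (or greedy enough to eat the rest of the line) before they ship.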
# Step 6: Key Rotation
Credentials should be rotated regularly. Automate the process so it actually happens.
```python
import secrets
from datetime import datetime, timedelta

class KeyRotationTracker:
    """Track credential ages and alert when rotation is due."""

    def __init__(self, rotation_days=90):
        self.rotation_days = rotation_days
        self.credentials = {}

    def register(self, name, created_date):
        """Register a credential with its creation date."""
        self.credentials[name] = {
            "created": created_date,
            "expires": created_date + timedelta(days=self.rotation_days),
        }

    def check_all(self):
        """Check all credentials and return rotation status."""
        now = datetime.now()
        status = []
        for name, info in self.credentials.items():
            days_remaining = (info["expires"] - now).days
            if days_remaining < 0:
                urgency = "EXPIRED"
            elif days_remaining < 7:
                urgency = "URGENT"
            elif days_remaining < 30:
                urgency = "WARNING"
            else:
                urgency = "OK"
            status.append({
                "credential": name,
                "created": info["created"].strftime("%Y-%m-%d"),
                "expires": info["expires"].strftime("%Y-%m-%d"),
                "days_remaining": days_remaining,
                "status": urgency,
            })
        return status

    def generate_new_key(self, length=32):
        """Generate a cryptographically secure random key."""
        return secrets.token_urlsafe(length)
```
# Rotation Status Output
```
┌─────────────────┬────────────┬────────────┬───────────────┬─────────┐
│ Credential      │ Created    │ Expires    │ Days Left     │ Status  │
├─────────────────┼────────────┼────────────┼───────────────┼─────────┤
│ API Key (prod)  │ 2026-01-15 │ 2026-04-15 │ 2             │ URGENT  │
│ DB Password     │ 2026-02-20 │ 2026-05-21 │ 38            │ OK      │
│ SMTP Token      │ 2025-12-01 │ 2026-03-01 │ -31           │ EXPIRED │
│ OAuth Secret    │ 2026-03-01 │ 2026-05-30 │ 47            │ OK      │
└─────────────────┴────────────┴────────────┴───────────────┴─────────┘
```
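The thresholds behind that output can be checked in isolation. A self-contained sketch using the same cut-offs as `check_all` (below 0 days is EXPIRED, below 7 URGENT, below 30 WARNING):

```python
def urgency(days_remaining):
    """Classify rotation urgency with the tracker's thresholds."""
    if days_remaining < 0:
        return "EXPIRED"
    if days_remaining < 7:
        return "URGENT"
    if days_remaining < 30:
        return "WARNING"
    return "OK"

print([urgency(d) for d in (-31, 2, 20, 47)])
# ['EXPIRED', 'URGENT', 'WARNING', 'OK']
```

Note the boundary behaviour: exactly 7 days left is WARNING, not URGENT, because the comparisons are strict.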
# Security Checklist
| Check | How to verify | Risk if missing |
|---|---|---|
| No secrets in source code | `grep -rE "sk-live\|password=" *.py` | Credentials leak via Git |
| `.env` in `.gitignore` | Check the `.gitignore` file | Secrets committed accidentally |
| Environment variables used | All `os.environ` calls, no hardcoded values | Credentials tied to code |
| Secrets not in logs | Check log output for key patterns | Credentials in log files |
| Token refresh automated | OAuth2 tokens refresh before expiry | Pipeline fails unexpectedly |
| Keys rotated quarterly | Rotation tracker shows no EXPIRED status | Stale credentials are a target |
| CI secrets use platform storage | No secrets in workflow YAML files | Exposed in public repos |
# What This Replaces
| Insecure pattern | Secure equivalent |
|---|---|
| `API_KEY = "sk-live-..."` in code | `os.environ["API_KEY"]` |
| Credentials in config files | `.env` files excluded from Git |
| Passwords in plain text | OS keyring storage |
| Manual token refresh | Automatic OAuth2 token management |
| Never rotating keys | Scheduled rotation with tracking |
| Secrets visible in logs | Log filter that redacts patterns |
# Next Steps
Start with the baseline: move all hardcoded credentials to environment variables and add .env to .gitignore. That single change eliminates the most common credential leak vector.
Then add the credential class pattern so every part of your pipeline accesses secrets consistently. Add the log filter to prevent accidental leaks in output.
For building the pipelines that use these credentials, see How to Automate Data Workflows Using APIs and Python. For making those pipelines resilient to failures, see How to Build Self-Healing Data Pipelines.
Automation services include secure credential management and secrets infrastructure for production pipeline systems.
Get in touch to discuss securing your automation scripts.