
CWE-377: Insecure Temporary File - Python

Overview

Insecure temporary file creation in Python occurs when applications create temporary files with predictable names or insecure permissions, or place them in shared directories without proper protection. Python's tempfile module provides secure alternatives that should always be used for handling temporary files containing sensitive data.

Primary Defence: Use tempfile.NamedTemporaryFile() or tempfile.mkstemp() with secure defaults for unpredictable names, restricted permissions, and automatic cleanup.
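The primary defence in its minimal form - a short sketch of the secure default (the filename and written data are illustrative):

```python
import os
import tempfile

# Minimal form of the primary defence: unpredictable name, restrictive
# permissions, automatic deletion when the with-block exits.
with tempfile.NamedTemporaryFile(mode="w", suffix=".txt") as tmp:
    tmp.write("sensitive data")
    tmp.flush()            # make sure the bytes reach the disk file
    name = tmp.name        # usable by other code while the file is open

deleted = not os.path.exists(name)   # True: removed on context exit
```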

Common Vulnerable Patterns

Predictable filename in /tmp

import os

# VULNERABLE - Predictable filename using PID
def save_user_data(user_data):
    temp_file = f"/tmp/userdata_{os.getpid()}.txt"

    # Attackers can predict the PID and pre-create this file
    with open(temp_file, 'w') as f:
        f.write(user_data)  # Sensitive data exposed

    process_file(temp_file)
    # File not deleted - persists in /tmp

Fixed filename in shared directory

# VULNERABLE - Fixed filename that multiple processes might use
def export_credentials(api_key, secret):
    temp_file = "/tmp/credentials.txt"

    # Race condition: Another user could pre-create this file
    # World-readable by default (umask dependent)
    with open(temp_file, 'w') as f:
        f.write(f"API_KEY={api_key}\n")
        f.write(f"SECRET={secret}\n")

    return temp_file

Insecure file permissions

import os

# VULNERABLE - World-readable permissions (0644)
def save_session_data(session_token):
    temp_file = "/tmp/session_data.txt"

    # Create file with insecure permissions
    fd = os.open(temp_file, os.O_CREAT | os.O_WRONLY, 0o644)
    os.write(fd, session_token.encode())
    os.close(fd)

    # Any user on the system can read this file
    # ls -la shows: -rw-r--r-- (world-readable)

Using timestamp for filename

import time

# VULNERABLE - Timestamp-based filename is predictable
def create_temp_log():
    timestamp = int(time.time())
    temp_file = f"/tmp/log_{timestamp}.txt"

    # Attacker can predict the timestamp
    with open(temp_file, 'w') as f:
        f.write("Sensitive log data")

    return temp_file

Not cleaning up temporary files

# VULNERABLE - Temp files accumulate in /tmp
def process_sensitive_data(data):
    import random
    temp_file = f"/tmp/data_{random.randint(1000, 9999)}.tmp"

    with open(temp_file, 'w') as f:
        f.write(data)

    result = analyze(temp_file)
    # File never deleted - sensitive data persists
    return result

Secure Patterns

Using tempfile.NamedTemporaryFile with auto-deletion

import tempfile

def process_sensitive_data(data):
    """Secure temp file with automatic cleanup"""
    # Creates file with:
    # - Hard-to-guess random name, opened atomically (O_CREAT | O_EXCL)
    # - Restrictive permissions (0600 - owner only)
    # - Auto-deletion when closed
    with tempfile.NamedTemporaryFile(mode='w', delete=True, suffix='.txt') as temp_file:
        temp_file.write(data)
        temp_file.flush()  # Ensure data is written

        # Process the file while it's open
        result = process_file(temp_file.name)

    # File automatically deleted when context exits
    return result

Why this works:

  • Unpredictable names plus atomic creation: tempfile.NamedTemporaryFile() generates hard-to-guess random names and opens the file with O_CREAT | O_EXCL, so even a correctly guessed name cannot be pre-created and reused by an attacker
  • Automatic secure permissions: sets 0600 (owner read/write only) on Unix, preventing other users from accessing the file
  • Automatic deletion: delete=True (the default) ensures the file is removed when closed or when the context manager exits, even on exceptions
  • Guaranteed cleanup: the context manager (with statement) guarantees cleanup, eliminating the risk of leaving sensitive data in temp storage
  • Data integrity: flush() ensures buffered data is written to disk before processing, preventing races where the file is read before all data is written

When to use: Best default choice for Python - simple, secure, automatic cleanup. Perfect for most temp file scenarios.
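The permission and cleanup guarantees above can be checked directly. A small sketch, assuming a Unix system (permission bits are reported differently on Windows):

```python
import os
import stat
import tempfile

# Sketch checking two claims: the file is created with mode 0600, and
# it is deleted even when an exception escapes the with-block.
path = None
mode = None
try:
    with tempfile.NamedTemporaryFile(mode="w") as tmp:
        path = tmp.name
        mode = stat.S_IMODE(os.stat(path).st_mode)  # permission bits only
        tmp.write("secret")
        raise RuntimeError("simulated processing failure")
except RuntimeError:
    pass

cleaned_up = not os.path.exists(path)
```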

Using tempfile with explicit deletion

import tempfile
import os

def export_user_report(user_data):
    """Secure temp file with manual cleanup"""
    temp_file = None
    try:
        # Create with delete=False to manage lifecycle manually
        # Still gets secure name and permissions
        temp_file = tempfile.NamedTemporaryFile(
            mode='w',
            delete=False,
            prefix='user_report_',
            suffix='.csv'
        )

        temp_file.write(user_data)
        temp_file.close()

        # Process the closed file
        send_email_attachment(temp_file.name)

    finally:
        # Always clean up, even on error
        if temp_file and os.path.exists(temp_file.name):
            os.unlink(temp_file.name)

Why this works:

  • Secure creation, manual lifecycle: delete=False keeps tempfile's random name generation and secure 0600 permissions while letting you control when the file is removed
  • Needed for closed-file processing: some tools cannot read from an open file, and the file may need to outlive the function scope
  • Close before processing: closing the file releases locks and flushes all buffered data to disk
  • Guaranteed cleanup: the try-finally block deletes the file even if processing raises, and the os.path.exists() check avoids errors if the file was already deleted

Using tempfile.mkstemp for file descriptor

import tempfile
import os

def save_credentials_securely(api_key, secret):
    """Secure temp file using mkstemp"""
    # mkstemp returns (file_descriptor, path)
    # Creates file with mode 0600 (owner read/write only)
    fd, temp_path = tempfile.mkstemp(suffix='.txt', prefix='creds_')

    try:
        # Write using file descriptor (more secure)
        os.write(fd, f"API_KEY={api_key}\n".encode())
        os.write(fd, f"SECRET={secret}\n".encode())
        os.close(fd)

        # Process the file
        load_credentials(temp_path)

    finally:
        # Always delete the temp file
        if os.path.exists(temp_path):
            os.unlink(temp_path)

Why this works:

  • Atomic creation: tempfile.mkstemp() creates the file with O_CREAT | O_EXCL and mode 0600 (owner read/write only), eliminating the race between name generation and file creation that an attacker could exploit
  • Descriptor writes: writing through the returned file descriptor (os.write()) guarantees the data goes to the exact file mkstemp() created, not a symlink or substitute placed at the same path
  • Explicit close: closing the descriptor (os.close(fd)) before processing flushes the data and avoids file-lock issues
  • Guaranteed cleanup: the try-finally deletes the file even on errors, and the os.path.exists() check avoids errors if it was already removed
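The mkstemp() claims above can be verified with a short sketch (Unix assumption: permission bits reported as 0600):

```python
import os
import stat
import tempfile

# Sketch checking mkstemp(): the file exists with mode 0600 from the
# moment the call returns, and descriptor writes land in that exact file.
fd, path = tempfile.mkstemp()
try:
    mode = stat.S_IMODE(os.stat(path).st_mode)
    os.write(fd, b"token")
    os.close(fd)
    with open(path, "rb") as f:
        content = f.read()
finally:
    os.unlink(path)
```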

Using temporary directory with secure permissions

import tempfile
import os
import shutil

def process_multiple_files(files_data):
    """Create secure temporary directory for multiple files"""
    # Create temp directory with mode 0700 (owner only)
    with tempfile.TemporaryDirectory(prefix='secure_work_') as temp_dir:
        # temp_dir has permissions 0700 - only owner can access

        for filename, data in files_data.items():
            file_path = os.path.join(temp_dir, filename)

            # Files inherit directory's secure context
            with open(file_path, 'w') as f:
                f.write(data)

        # Process all files in secure directory
        result = batch_process(temp_dir)

    # Directory and all contents auto-deleted
    return result

Why this works:

  • Owner-only directory: tempfile.TemporaryDirectory() creates the directory with mode 0700, so only the owning user can access, list, or create files within it - isolation from other users on shared systems
  • Unpredictable name: the random directory name prevents prediction attacks
  • Inherited protection: even a world-readable file inside remains unreachable to users who cannot traverse the directory
  • Automatic recursive cleanup: the context manager deletes the directory and all its contents on exit, even on exceptions, so sensitive data does not persist
  • Grouped lifecycle: ideal for multiple related files - all are protected by the same directory and cleaned up together
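Both directory-level claims can be checked directly; a sketch assuming a Unix system:

```python
import os
import stat
import tempfile

# Sketch checking TemporaryDirectory(): mode 0700 on the directory,
# and recursive removal of its contents on exit.
with tempfile.TemporaryDirectory() as workdir:
    dir_mode = stat.S_IMODE(os.stat(workdir).st_mode)
    inner = os.path.join(workdir, "data.txt")
    with open(inner, "w") as f:
        f.write("sensitive")

removed = not os.path.exists(workdir) and not os.path.exists(inner)
```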

Explicit secure permissions with os.open

import os
import tempfile

def create_secure_temp_file(sensitive_data):
    """Create temp file with explicit secure permissions"""
    # Get secure temp directory
    temp_dir = tempfile.gettempdir()

    # Generate a candidate random filename (tempfile.mktemp() is
    # deprecated and racy on its own; it is safe here only because
    # O_EXCL below refuses to open a file created by anyone else first)
    temp_name = tempfile.mktemp(suffix='.dat', dir=temp_dir)

    try:
        # Create with O_EXCL (fail if exists) and mode 0600
        fd = os.open(
            temp_name,
            os.O_CREAT | os.O_EXCL | os.O_WRONLY,
            0o600  # Owner read/write only
        )

        # Write sensitive data
        os.write(fd, sensitive_data.encode())
        os.close(fd)

        # Verify permissions (should be 0600)
        stat_info = os.stat(temp_name)
        assert stat_info.st_mode & 0o777 == 0o600

        process_file(temp_name)

    finally:
        if os.path.exists(temp_name):
            os.unlink(temp_name)

Why this works:

  • Atomic creation: os.open() with O_CREAT | O_EXCL fails if the file already exists, eliminating TOCTOU race conditions - you always create a new file, never open one an attacker pre-created
  • Secure from the first instant: passing mode 0o600 at creation means the file never exists with insecure permissions, even for a microsecond
  • Descriptor-level control: os.write() on the raw descriptor avoids Python's buffering layers and gives maximum control
  • Verified permissions: the stat assertion detects any unexpected permission changes
  • When to use: maximum-security handling of extremely sensitive data such as cryptographic keys or credentials
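The O_EXCL protection can be demonstrated by simulating the attack described above (the fixed filename is hypothetical, chosen only for the demo):

```python
import os
import tempfile

# A file already sitting at the target path (as if pre-created by
# another user) makes O_EXCL creation fail instead of silently writing
# into the attacker's file.
path = os.path.join(tempfile.gettempdir(), "cwe377_demo_preexisting.dat")
with open(path, "w") as f:
    f.write("attacker content")

blocked = False
try:
    os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY, 0o600)
except FileExistsError:
    blocked = True
finally:
    os.unlink(path)
```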

User-specific temporary directory

import tempfile
import os
import atexit
import shutil

class SecureTempManager:
    """Manage user-specific temporary directory"""

    def __init__(self, app_name):
        self.app_name = app_name
        self.temp_dir = self._create_user_temp_dir()
        # Register cleanup on exit
        atexit.register(self.cleanup)

    def _create_user_temp_dir(self):
        """Create secure user-specific temp directory"""
        base_temp = tempfile.gettempdir()
        user_temp = os.path.join(base_temp, f"{self.app_name}-{os.getuid()}")

        try:
            # Create with restrictive permissions
            os.makedirs(user_temp, mode=0o700)
        except FileExistsError:
            # Refuse a pre-existing symlink or non-directory before
            # tightening permissions (os.chmod would follow a symlink)
            if os.path.islink(user_temp) or not os.path.isdir(user_temp):
                raise RuntimeError(f"Unsafe temp path: {user_temp}")
            os.chmod(user_temp, 0o700)

        return user_temp

    def create_temp_file(self, data, suffix='.tmp'):
        """Create temp file in user-specific directory"""
        fd, path = tempfile.mkstemp(
            suffix=suffix,
            dir=self.temp_dir
        )

        try:
            os.write(fd, data.encode())
        finally:
            os.close(fd)

        return path

    def cleanup(self):
        """Clean up all temporary files"""
        if os.path.exists(self.temp_dir):
            shutil.rmtree(self.temp_dir)

# Usage
temp_manager = SecureTempManager('myapp')
temp_file = temp_manager.create_temp_file("sensitive data")
# Cleanup happens automatically on exit

Why this works:

  • Per-user isolation: naming the directory appname-{uid} with os.getuid() gives every user a private temp space, preventing cross-user access even on shared systems
  • Two layers of protection: 0o700 on the directory means only the owner can enter it, and tempfile.mkstemp() inside it still creates files with 0600 permissions
  • Automatic cleanup: atexit.register() removes the directory when Python exits normally, even if the developer forgets to call cleanup()
  • Permission enforcement: os.chmod() brings an existing directory back to 0o700 if it was created with looser permissions
  • Best fit: long-running applications that create many temporary files

Using tempfile with custom directory

import tempfile
import os

def process_with_custom_temp_dir(data):
    """Use custom secure temporary directory"""
    # Create application-specific temp directory
    app_temp_dir = os.path.join(
        tempfile.gettempdir(),
        f"myapp-{os.getuid()}"
    )

    # Ensure the directory exists with owner-only permissions
    # (note: the mode is applied only when the directory is first created)
    os.makedirs(app_temp_dir, mode=0o700, exist_ok=True)

    # Create temp file in custom directory
    with tempfile.NamedTemporaryFile(
        mode='w',
        delete=True,
        dir=app_temp_dir,  # Use custom directory
        prefix='work_',
        suffix='.dat'
    ) as temp_file:
        temp_file.write(data)
        temp_file.flush()

        result = process_file(temp_file.name)

    return result

Why this works:

  • Application-scoped isolation: a myapp-{uid} subdirectory separates this application's temp files from others while staying under the system's standard temp location
  • Owner-only tree: os.makedirs() with mode=0o700 creates the directory with owner-only permissions, and exist_ok=True makes the call idempotent (safe to repeat); note the mode is applied only when the directory is first created
  • Hidden contents: even an attacker who can list the system temp directory cannot access the application's subdirectory
  • Security plus convenience: tempfile.NamedTemporaryFile(dir=...) keeps automatic cleanup while organizing temp files per application - particularly useful for multi-tenant applications where tenants' temp files must be isolated
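One caveat worth verifying: the mode argument of os.makedirs() takes effect only at creation time, so a later exist_ok=True call does not repair loose permissions on an existing directory. A small sketch (the directory names are illustrative):

```python
import os
import stat
import tempfile

# Calling makedirs again on an existing directory leaves its
# permissions untouched; mode applies only at creation.
base = tempfile.mkdtemp()                # scratch area for the demo
target = os.path.join(base, "appdir")

os.makedirs(target, mode=0o700)
os.chmod(target, 0o755)                  # loosened, e.g. by another tool
os.makedirs(target, mode=0o700, exist_ok=True)   # does NOT re-tighten

mode_after = stat.S_IMODE(os.stat(target).st_mode)

os.rmdir(target)
os.rmdir(base)
```

If permissions on an existing directory matter, set them explicitly with os.chmod() after verifying the path is safe.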

Secure temp file for Django file upload

from django.core.files.uploadedfile import UploadedFile
import tempfile
import os

def handle_file_upload(uploaded_file: UploadedFile):
    """Securely handle file upload with temp storage"""
    # Use NamedTemporaryFile for uploaded content
    with tempfile.NamedTemporaryFile(
        delete=False,
        suffix=os.path.splitext(uploaded_file.name)[1]
    ) as temp_file:
        # Write uploaded chunks securely
        for chunk in uploaded_file.chunks():
            temp_file.write(chunk)

        temp_path = temp_file.name

    try:
        # Virus scan or validate the file
        if not is_safe_file(temp_path):
            raise ValueError("File failed security check")

        # Process the validated file
        result = process_uploaded_file(temp_path)

    finally:
        # Always clean up
        if os.path.exists(temp_path):
            os.unlink(temp_path)

    return result

Why this works:

  • Streamed writes: Django's UploadedFile.chunks() yields the upload in small chunks (64 KB by default), preventing memory exhaustion from large uploads
  • Secure temp file: tempfile.NamedTemporaryFile(delete=False) provides 0600 permissions and an unpredictable name while allowing the file to be closed before processing (required by some validation tools)
  • Preserved extension: os.path.splitext() keeps the file-type suffix from the original filename, needed for validation
  • Validate before use: virus scan, magic-number, and size checks run before processing, so only safe files are used
  • Guaranteed cleanup: the try-finally removes the file even if validation fails or processing throws

Framework-Specific Guidance

Flask file upload with secure temp storage

from flask import Flask, request
import tempfile
import os

app = Flask(__name__)

@app.route('/upload', methods=['POST'])
def upload_file():
    """Securely handle file upload"""
    if 'file' not in request.files:
        return 'No file uploaded', 400

    uploaded_file = request.files['file']

    # Use secure temp file; Werkzeug's save() accepts the open file object
    with tempfile.NamedTemporaryFile(delete=False, suffix='.upload') as temp_file:
        uploaded_file.save(temp_file)
        temp_path = temp_file.name

    try:
        # Process the uploaded file
        result = process_upload(temp_path)
        return result, 200

    finally:
        # Clean up
        if os.path.exists(temp_path):
            os.unlink(temp_path)

FastAPI with secure temporary files

from fastapi import FastAPI, UploadFile
import tempfile
import os
import aiofiles

app = FastAPI()

@app.post("/upload")
async def upload_file(file: UploadFile):
    """Async secure file upload handling"""
    # Create secure temp file (unpredictable name, mode 0600)
    fd, temp_path = tempfile.mkstemp(suffix='.upload')
    # Close the descriptor right away: aiofiles reopens the path below,
    # and the secure permissions are already set on the file
    os.close(fd)

    try:
        # Write uploaded content asynchronously
        # (for very large uploads, read in chunks instead of all at once)
        async with aiofiles.open(temp_path, 'wb') as f:
            content = await file.read()
            await f.write(content)

        # Process the file
        result = await process_file_async(temp_path)
        return {"status": "success", "result": result}

    finally:
        if os.path.exists(temp_path):
            os.unlink(temp_path)

Additional Resources