CWE-327: Use of a Broken or Risky Cryptographic Algorithm
Overview
Use of weak or broken cryptographic algorithms occurs when applications rely on outdated, insecure, or improperly configured encryption methods that fail to protect data confidentiality, integrity, and authenticity. With modern computing power, algorithms such as MD5 and DES can be broken in practical time, allowing attackers to decrypt sensitive data, bypass integrity checks, forge digital signatures, or crack hashed passwords through brute force, collision attacks, cryptanalysis, or birthday attacks.
OWASP Classification
A04:2025 - Cryptographic Failures
Risk
Weak cryptographic algorithms create critical vulnerabilities:
- Data decryption: Attackers decrypt sensitive data (passwords, PII, financial records)
- Integrity bypass: Modify encrypted data without detection
- Authentication bypass: Forge digital signatures or certificates
- Password cracking: Crack hashed passwords from database dumps
- Compliance violations: Fails PCI-DSS, HIPAA, FIPS 140-2/140-3, and GDPR requirements
- Man-in-the-middle attacks: Downgrade attacks force weak cipher usage
- Session hijacking: Weak session token generation enables prediction
Modern computing power makes MD5 collisions findable in seconds and DES keys brute-forceable in about a day, not years.
Common Broken or Risky Algorithms
Weak cryptographic algorithms fall into several categories:
- Weak encryption: DES, 3DES, RC4, Blowfish (DES's 56-bit key is brute-forceable; 3DES and Blowfish use 64-bit blocks vulnerable to SWEET32 birthday attacks; RC4 has exploitable keystream biases)
- Weak hashing: MD5, SHA-1, CRC32 (collision vulnerabilities enable forgery)
- Weak key exchange: Anonymous DH, Export-grade ciphers, static RSA (vulnerable to downgrade attacks)
- Insufficient key lengths: RSA < 2048 bits, ECC < 256 bits (below current security margins against well-funded attackers)
- Improper modes: ECB mode (leaks patterns), unauthenticated encryption (padding oracle attacks)
Remediation Steps
Core principle: Do not use broken or deprecated cryptography; cryptographic algorithm selection must be centrally defined, server-controlled, and constrained to algorithms that remain cryptographically sound.
Locate the vulnerable cryptographic algorithm in your code
- Review the flaw details to identify the specific file, line number, and code pattern
- Identify which weak cryptographic algorithm is in use (DES, 3DES, MD5, SHA-1, etc.)
- Trace the data flow to understand what data is being protected and where it's used
- Determine the purpose: encryption, hashing, password hashing, key exchange, or TLS configuration
Replace with modern, approved cryptographic algorithms (Primary Defense)
Replace weak algorithms with strong, industry-standard alternatives:
ENCRYPTION (Symmetric):
- Use: AES-256-GCM (preferred) or AES-256-CBC with HMAC (encrypt-then-MAC)
- Avoid: DES, 3DES, RC4, Blowfish, AES-ECB
HASHING:
- Use: SHA-256, SHA-384, SHA-512 (SHA-2 family)
- Use: SHA-3 family (for new implementations)
- Avoid: MD5, SHA-1, MD4, CRC32
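As a minimal Python sketch of the swap above (standard-library hashlib; the function name is hypothetical), an integrity fingerprint moves from MD5 to SHA-256:

```python
import hashlib

def file_fingerprint(data: bytes) -> str:
    # SHA-256 instead of the collision-broken hashlib.md5 / hashlib.sha1
    return hashlib.sha256(data).hexdigest()

digest = file_fingerprint(b"example payload")
print(len(digest))  # 64 hex characters = 256 bits
```

The call site is a drop-in replacement; only stored digests (now 64 hex characters instead of MD5's 32) need re-computation.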
PASSWORD HASHING:
- Use: Argon2id, bcrypt, scrypt, PBKDF2 with SHA-256
- Avoid: Plain SHA/MD5, unsalted hashes, fast hashes
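Argon2id is the first choice but requires a third-party package; PBKDF2-HMAC-SHA256 is available in the Python standard library. A hedged sketch (helper names and the `iterations$salt$hash` record format are illustrative, not a standard):

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 600_000) -> str:
    # Random per-password salt defeats rainbow tables; high iteration
    # count slows offline brute force.
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"{iterations}${salt.hex()}${dk.hex()}"

def verify_password(password: str, stored: str) -> bool:
    iterations, salt_hex, dk_hex = stored.split("$")
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(),
                             bytes.fromhex(salt_hex), int(iterations))
    # Constant-time comparison prevents timing side channels
    return hmac.compare_digest(dk.hex(), dk_hex)

record = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", record))  # True
print(verify_password("wrong guess", record))                   # False
```

Storing the iteration count in the record lets you raise it later and transparently re-hash on the next successful login.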
KEY EXCHANGE:
- Use: ECDHE (Elliptic Curve Diffie-Hellman Ephemeral)
- Use: DHE with 2048+ bit parameters
- Avoid: Static RSA, Anonymous DH, Export ciphers
PUBLIC KEY ENCRYPTION:
- Use: RSA-3072 or RSA-4096, ECDSA with P-256 or higher
- Avoid: RSA < 2048 bits, DSA < 2048 bits
Apply additional cryptographic protections
- Use Authenticated Encryption: Combine encryption with integrity protection using AEAD modes (AES-GCM, ChaCha20-Poly1305, AES-CCM)
- Configure TLS/SSL Securely: Use TLS 1.2 minimum (TLS 1.3 preferred), disable weak protocols (TLS 1.0/1.1, SSL 2.0/3.0), use strong cipher suites only (ECDHE with AES-GCM)
- Use Vetted Cryptographic Libraries: Never implement cryptography yourself; use established libraries (Java JCA/Bouncy Castle, .NET System.Security.Cryptography, Python cryptography, Node.js crypto, OpenSSL, libsodium)
- Avoid Encryption Without Authentication: ECB mode leaks patterns, CBC without HMAC is vulnerable to padding oracle attacks
Implement migration strategy for existing encrypted data
- Add version metadata: Store which algorithm was used for each encrypted field
- Implement dual-read: Try new algorithm first, fall back to old algorithm if decryption fails
- Always encrypt with new algorithm: New/updated data uses strong encryption
- Batch migrate existing data: Decrypt with old algorithm, re-encrypt with new algorithm, update database
- Monitor migration progress: Track percentage of data migrated to new algorithm
- Remove old algorithm support: After 100% migration, remove legacy decryption code
Monitor and audit cryptographic usage
- Log cryptographic operations for security-sensitive data (encryption, decryption, key generation)
- Alert on usage of deprecated algorithms if dual-read is still active
- Track migration progress with database queries
- Review code for remaining instances of weak algorithms
Test the cryptographic changes thoroughly
- Verify the specific weak algorithm is no longer used
- Test encryption/decryption with new algorithm in staging environment
- Test dual-read strategy with production data copy
- Verify performance impact of new algorithm is acceptable (target 250-500ms for password hashing)
- Test rollback procedures and backup restoration
- Re-scan with security scanner to confirm the issue is resolved
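The 250-500ms password-hashing target above is host-dependent, so the iteration count should be measured rather than hard-coded. A rough calibration sketch (standard-library only; the helper name and cap are illustrative):

```python
import hashlib
import os
import time

def calibrate_pbkdf2(target_ms: float = 300.0, start: int = 100_000) -> int:
    # Double the iteration count until one PBKDF2 run costs at least
    # target_ms on this host (the cap prevents a runaway loop).
    salt, password = os.urandom(16), b"benchmark-only-password"
    iterations = start
    while True:
        t0 = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
        elapsed_ms = (time.perf_counter() - t0) * 1000.0
        if elapsed_ms >= target_ms or iterations >= 10_000_000:
            return iterations
        iterations *= 2

print(calibrate_pbkdf2() >= 100_000)  # True
```

Run this on production-class hardware, not a developer laptop, and re-check after hardware upgrades.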
Migration Considerations
CRITICAL: Changing encryption algorithms will make existing encrypted data unreadable unless you implement a migration strategy.
What Breaks
- Existing encrypted data becomes unreadable: Data encrypted with DES/3DES cannot be decrypted with AES-256
- Database columns with encrypted PII: Credit cards, SSNs, and addresses encrypted with the old algorithm become unreadable until re-encrypted
- Encrypted files unreadable: Backups, archived files, encrypted exports cannot be decrypted
- API integrations fail: Partners using your encryption keys for data exchange will break
- Session tokens invalid: Encrypted session data cannot be decrypted, logging out all users
- Encrypted credentials lost: Stored passwords for external services (DB, APIs) become unrecoverable
Migration Approach
Dual-Read Strategy (Recommended)
Support both old and new encryption algorithms during transition:
- Add version metadata: Store which encryption algorithm was used for each encrypted field
- Implement dual decryption:
  - Try decrypting with the new algorithm (AES-256-GCM) first
  - If that fails, try decrypting with the old algorithm (DES/3DES)
  - Track which algorithm successfully decrypted
- Always encrypt with new algorithm: New/updated data uses strong encryption
- Batch migrate existing data:
  - Decrypt with the old algorithm, re-encrypt with the new algorithm
  - Update the database with the new encrypted value and version metadata
  - Process in batches to avoid overwhelming the database
- Monitor migration progress: Track percentage of data migrated to the new algorithm
- Remove old algorithm support: After 100% migration, remove the DES decryption code
Implementation Steps:
- Backup all encrypted data before migration
- Deploy dual-read code (supports both algorithms)
- Verify dual-read works in production
- Run batch migration script during low-traffic period
- Monitor migration progress with database queries
- After 100% migration, remove legacy algorithm support
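The steps above can be sketched end to end with SQLite and deliberately trivial stand-in codecs. The base64/base85 calls below are NOT encryption; in production "legacy" would be DES/3DES and "v2" AES-256-GCM from a vetted library. The table and column names are hypothetical. Note the sketch routes reads by the stored version column rather than trial decryption, since only an authenticated cipher like AES-GCM fails cleanly when given the wrong algorithm's ciphertext:

```python
import base64
import sqlite3

# --- stand-in codecs (placeholders for real ciphers) ---
def legacy_decrypt(blob: bytes) -> bytes:
    return base64.b64decode(blob)          # stands in for DES/3DES decrypt

def v2_encrypt(data: bytes) -> bytes:
    return base64.b85encode(data)          # stands in for AES-256-GCM encrypt

def v2_decrypt(blob: bytes) -> bytes:
    return base64.b85decode(blob)          # stands in for AES-256-GCM decrypt

def dual_read(version: str, blob: bytes) -> bytes:
    # Route decryption by the stored version metadata
    return v2_decrypt(blob) if version == "v2" else legacy_decrypt(blob)

# --- table with per-row algorithm version metadata ---
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE secrets (id INTEGER PRIMARY KEY, algo TEXT, blob BLOB)")
db.executemany("INSERT INTO secrets (algo, blob) VALUES (?, ?)",
               [("legacy", base64.b64encode(b"card-%d" % i)) for i in range(10)])

# --- batch migration: small batches avoid long locks on the table ---
BATCH = 4
while True:
    rows = db.execute(
        "SELECT id, blob FROM secrets WHERE algo = 'legacy' LIMIT ?", (BATCH,)
    ).fetchall()
    if not rows:
        break
    for row_id, blob in rows:
        plaintext = legacy_decrypt(blob)
        db.execute("UPDATE secrets SET algo = 'v2', blob = ? WHERE id = ?",
                   (v2_encrypt(plaintext), row_id))
    db.commit()

remaining = db.execute(
    "SELECT COUNT(*) FROM secrets WHERE algo = 'legacy'").fetchone()[0]
print(remaining)  # 0
```

The `algo` column doubles as the migration-progress metric: `SELECT algo, COUNT(*) FROM secrets GROUP BY algo` gives the old/new distribution directly.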
Rollback Procedures
If migration causes data loss:
- Stop migration script immediately: Kill running batch process
- Restore from backup: Restore sensitive_data table from pre-migration backup
- Revert application code: Deploy previous version
- Verify old encryption works: Test decryption of sample records
- Emergency data recovery: Attempt decryption with both algorithms for corrupted records
Testing Recommendations
Pre-Migration Testing:
- Test dual-read with production data copy
- Verify old encrypted data still decrypts correctly
- Verify new encrypted data decrypts correctly
- Test migration script on sample dataset
- Verify batch processing doesn't overload database
- Test rollback procedure restores data correctly
- Load test: Ensure dual-read doesn't impact performance
Post-Migration Monitoring:
- Monitor application error rates
- Track decryption failures
- Verify no data corruption
- Monitor database performance
- Alert on crypto version distribution changes
Key Metrics:
- Total encrypted records
- Records using old algorithm
- Records using new algorithm
- Migration percentage
- Decryption error rate
Code Scan for Weak Algorithms
# Search for weak algorithm usage (word boundaries cut false positives like "DESCRIPTION")
grep -rEn '\b(DES|3DES|RC4|MD5|SHA-?1)\b' --include="*.java" --include="*.cs" --include="*.py" .
# Look for specific patterns
grep -r 'Cipher.getInstance.*DES' .
grep -r 'MessageDigest.getInstance.*MD5' .
grep -r 'HashAlgorithm.Create.*MD5' .
Expected result: No usage of weak algorithms in production code.
TLS Configuration Testing
# Test TLS configuration with SSL Labs
# https://www.ssllabs.com/ssltest/
# Or use testssl.sh
testssl.sh --severity MEDIUM https://yourapp.com
# Or use nmap
nmap --script ssl-enum-ciphers -p 443 yourapp.com
Expected result:
- A or A+ rating on SSL Labs
- No weak ciphers enabled
- TLS 1.2+ only
Verify Strong Algorithms
Verify in code:
- AES-256-GCM or AES-128-GCM for encryption
- SHA-256 or SHA-512 for hashing
- bcrypt, Argon2, or scrypt for passwords
- RSA ≥ 2048 bits or ECDSA P-256 for signatures
Dependency Scanning
# Check for cryptographic library vulnerabilities
npm audit                          # Node.js
pip-audit                          # Python
dotnet list package --vulnerable   # .NET
dependency-check --scan .          # OWASP Dependency-Check
Runtime Verification
- Monitor logs for cipher suite negotiation
- Verify strong ciphers are actually used in TLS handshakes
- Check database encryption uses approved algorithms
- Confirm password hashes use bcrypt/Argon2 (not SHA-256)
Dynamic Scan Guidance
For guidance on remediating this CWE when detected by dynamic (DAST) scanners:
- Dynamic Scan Guidance - Analyzing DAST findings and mapping to source code