CWE-918: Server-Side Request Forgery (SSRF) - Java
Overview
Server-Side Request Forgery (SSRF) occurs when an attacker can induce a server to issue requests (HTTP or other protocols such as file://) to destinations the attacker chooses. This can expose internal services and cloud metadata endpoints or bypass firewall boundaries. Always validate and sanitize URLs, use allowlists, and implement network-level protections.
Primary Defence: Validate URLs against an allowlist of permitted domains/IPs, block private IP ranges (RFC 1918, loopback, link-local), and restrict protocols to https:// only.
Common Vulnerable Patterns
Direct URL Usage from User Input
// VULNERABLE - Direct user input to HTTP request
import java.net.URL;
import java.io.InputStream;
public class ImageFetcher {
public byte[] fetchImage(String imageUrl) throws Exception {
// No validation - attacker can access internal resources!
URL url = new URL(imageUrl);
try (InputStream in = url.openStream()) {
return in.readAllBytes();
}
}
}
// Attack examples:
// http://localhost:8080/admin
// http://169.254.169.254/latest/meta-data/iam/security-credentials/
// file:///etc/passwd
Why this is vulnerable: Accepting user-provided URLs without validation allows attackers to make the server request internal resources (localhost, cloud metadata 169.254.169.254, internal IPs), bypass firewalls, or read local files via file:// protocol.
Unvalidated HttpClient Requests
// VULNERABLE - No URL validation
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
public class WebhookHandler {
public void sendWebhook(String webhookUrl, String data) throws Exception {
// No validation - SSRF vulnerability!
try (CloseableHttpClient client = HttpClients.createDefault()) {
HttpGet request = new HttpGet(webhookUrl);
client.execute(request);
}
}
}
// Attack: webhookUrl = "http://internal-api.local/sensitive-endpoint"
Why this is vulnerable: HttpClient makes requests to any URL without validation, enabling attackers to scan internal networks, access cloud metadata endpoints (AWS, Azure, GCP), or pivot attacks through the server.
Unvalidated Spring RestTemplate
// VULNERABLE - RestTemplate without validation
import org.springframework.web.client.RestTemplate;
import org.springframework.web.bind.annotation.*;
@RestController
public class ProxyController {
private final RestTemplate restTemplate = new RestTemplate();
@GetMapping("/proxy")
public String proxy(@RequestParam String url) {
// No validation - SSRF vulnerability!
return restTemplate.getForObject(url, String.class);
}
}
// Attack: /proxy?url=http://169.254.169.254/latest/meta-data/
Why this is vulnerable: Spring RestTemplate fetches any URL without validation, allowing SSRF attacks against internal services, cloud metadata APIs, or localhost-bound administrative interfaces.
URL Redirect Without Validation
// VULNERABLE - Open redirect leading to SSRF
@GetMapping("/redirect")
public ResponseEntity<Void> redirect(@RequestParam String target) {
// No validation - can redirect anywhere!
return ResponseEntity
.status(HttpStatus.FOUND)
.location(URI.create(target))
.build();
}
// Attack: /redirect?target=http://internal-service.local/
Why this is vulnerable: Unvalidated redirects can trigger server-side requests when combined with URL pre-fetching or link preview features, enabling SSRF alongside open redirect attacks.
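For contrast, here is a minimal hedged sketch of validating the redirect target against an allowlist before issuing the redirect; the class name and allowed hosts are illustrative assumptions, and fuller validation patterns follow in the Secure Patterns section.
// SKETCH - allowlist the redirect target before redirecting (hosts are placeholders)
import java.net.URI;
import java.util.Set;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
@RestController
public class SafeRedirectController {
    private static final Set<String> ALLOWED_REDIRECT_HOSTS =
        Set.of("www.example.com", "partner.example.org");
    @GetMapping("/redirect")
    public ResponseEntity<Void> redirect(@RequestParam String target) {
        // URI.create throws IllegalArgumentException on malformed input; map that to 400 in real code
        URI uri = URI.create(target);
        // Only https targets whose host is on the allowlist are redirected to
        if (!"https".equals(uri.getScheme())
                || uri.getHost() == null
                || !ALLOWED_REDIRECT_HOSTS.contains(uri.getHost().toLowerCase())) {
            return ResponseEntity.badRequest().build();
        }
        return ResponseEntity.status(HttpStatus.FOUND).location(uri).build();
    }
}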
Secure Patterns
URL Allowlist Validation
// SECURE - Validate URLs against allowlist
import java.net.URL;
import java.net.MalformedURLException;
import java.net.InetAddress;
import java.io.InputStream;
import java.util.Set;
public class SafeImageFetcher {
private static final Set<String> ALLOWED_HOSTS = Set.of(
"api.example.com",
"cdn.example.com",
"images.example.com"
);
private static final Set<String> ALLOWED_SCHEMES = Set.of("https");
public byte[] fetchImage(String imageUrl) throws Exception {
URL url = validateUrl(imageUrl);
try (InputStream in = url.openStream()) {
return in.readAllBytes();
}
}
private URL validateUrl(String urlString) throws SecurityException, MalformedURLException {
URL url = new URL(urlString);
// Validate scheme
if (!ALLOWED_SCHEMES.contains(url.getProtocol().toLowerCase())) {
throw new SecurityException("Invalid URL scheme: " + url.getProtocol());
}
// Validate host
String host = url.getHost().toLowerCase();
if (!ALLOWED_HOSTS.contains(host)) {
throw new SecurityException("Host not allowed: " + host);
}
// Block private IP ranges
if (isPrivateIp(host)) {
throw new SecurityException("Private IP addresses not allowed");
}
return url;
}
private boolean isPrivateIp(String host) {
try {
InetAddress addr = InetAddress.getByName(host);
return addr.isSiteLocalAddress() ||
addr.isLoopbackAddress() ||
addr.isLinkLocalAddress();
} catch (Exception e) {
// Fail closed: treat hosts that cannot be resolved as blocked
return true;
}
}
}
Why this works:
- Host allowlist: ALLOWED_HOSTS restricts outbound requests to trusted domains, so the server cannot be pointed at internal services, databases (Redis, Memcached), or admin panels
- Scheme validation: blocks file://, jar:// and other protocols that could read local resources or trigger class-loading vulnerabilities
- Built-in IP validation: InetAddress.getByName() resolves the hostname so isSiteLocalAddress() (RFC 1918), isLoopbackAddress() (127.x.x.x), and isLinkLocalAddress() (169.254.x.x) can reject private targets
- DNS rebinding caveat: resolving the host at validation time helps, but the HTTP client resolves again when it connects, so an attacker-controlled domain can still switch from a public to a private IP between the check and the request; check every resolved address and pin or re-validate at connection time (see the sketch after this list)
- Fail-closed: unresolvable hosts are treated as blocked, so a DNS failure cannot silently bypass the private-IP check
- Cloud metadata protection: the layered checks stop SSRF attempts against cloud metadata services (AWS IMDSv1/v2, GCP, Azure) and internal infrastructure probing
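A hedged sketch of the hardening step mentioned above: validate every address the hostname resolves to via InetAddress.getAllByName(). This narrows (but does not fully close) the DNS rebinding window, because the HTTP client may still resolve independently when it connects. The class and method names are illustrative.
// SKETCH - check every resolved address, not just the first one
import java.net.InetAddress;
import java.net.UnknownHostException;
public class ResolvedAddressCheck {
    public static void requirePublicAddresses(String host) {
        try {
            for (InetAddress addr : InetAddress.getAllByName(host)) {
                if (addr.isSiteLocalAddress() || addr.isLoopbackAddress()
                        || addr.isLinkLocalAddress() || addr.isAnyLocalAddress()) {
                    throw new SecurityException("Host resolves to a private address: " + host);
                }
            }
        } catch (UnknownHostException e) {
            // Fail closed: a host we cannot resolve is never requested
            throw new SecurityException("Unresolvable host: " + host);
        }
    }
}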
Apache HttpClient with Validation
// SECURE - HttpClient with URL validation
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.regex.Pattern;
public class SecureWebhookHandler {
private static final Pattern ALLOWED_URL_PATTERN =
Pattern.compile("^https://([a-z0-9-]+\\.)*example\\.com/.*$");
public void sendWebhook(String webhookUrl, String data) throws Exception {
// Validate URL
URI uri = validateWebhookUrl(webhookUrl);
try (CloseableHttpClient client = HttpClients.createDefault()) {
HttpGet request = new HttpGet(uri);
client.execute(request);
}
}
private URI validateWebhookUrl(String url) throws SecurityException {
if (url == null || url.isEmpty()) {
throw new SecurityException("URL cannot be empty");
}
// Check against allowlist pattern
if (!ALLOWED_URL_PATTERN.matcher(url).matches()) {
throw new SecurityException("URL not allowed: " + url);
}
try {
URI uri = new URI(url);
// Additional validation
if (!"https".equals(uri.getScheme())) {
throw new SecurityException("Only HTTPS allowed");
}
// Block private IPs
InetAddress addr = InetAddress.getByName(uri.getHost());
if (addr.isSiteLocalAddress() || addr.isLoopbackAddress()) {
throw new SecurityException("Private IP addresses not allowed");
}
return uri;
} catch (URISyntaxException | UnknownHostException e) {
throw new SecurityException("Invalid URL: " + e.getMessage());
}
}
}
Why this works:
- Strict domain matching: the regex ^https://([a-z0-9-]+\.)*example\.com/.*$ constrains the host and its subdomains, preventing lookalike domains (example-com.attacker.net) and path-based bypasses
- HTTPS-only: prevents downgrade attacks and protects data in transit
- Comprehensive IP blocking: isSiteLocalAddress() + isLoopbackAddress() covers the RFC 1918 private ranges (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) plus localhost (127.x.x.x)
- Internal access prevention: stops access to internal APIs, databases, and services bound to private interfaces
- Redirect consideration: Apache HttpClient follows redirects by default; use a RequestConfig with setRedirectsEnabled(false) to prevent redirect-based SSRF (see the sketch below)
- Targeted exception handling: catching only URISyntaxException and UnknownHostException keeps validation errors generic without leaking DNS or parser details to callers
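The redirect hardening from the last bullet, as a brief sketch (Apache HttpClient 4.x assumed; the factory class name is illustrative):
// SKETCH - disable automatic redirect following (Apache HttpClient 4.x)
import org.apache.http.client.config.RequestConfig;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
public class NoRedirectClientFactory {
    public static CloseableHttpClient create() {
        RequestConfig config = RequestConfig.custom()
            .setRedirectsEnabled(false) // a 3xx response is returned to the caller, not followed
            .build();
        return HttpClients.custom()
            .setDefaultRequestConfig(config)
            .build();
    }
}
HttpClientBuilder#disableRedirectHandling() is an equivalent alternative that removes redirect handling entirely.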
Spring RestTemplate with Validation
// SECURE - Spring RestTemplate with allowlist
import org.springframework.web.client.RestTemplate;
import org.springframework.web.bind.annotation.*;
import org.springframework.http.HttpStatus;
import org.springframework.web.server.ResponseStatusException;
import java.net.InetAddress;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.UnknownHostException;
import java.util.Set;
@RestController
public class SecureProxyController {
private final RestTemplate restTemplate = new RestTemplate();
private static final Set<String> ALLOWED_DOMAINS = Set.of(
"api.example.com",
"public-api.example.org"
);
@GetMapping("/proxy")
public String proxy(@RequestParam String url) {
URI validatedUri = validateUrl(url);
try {
return restTemplate.getForObject(validatedUri, String.class);
} catch (Exception e) {
throw new ResponseStatusException(
HttpStatus.BAD_REQUEST,
"Failed to fetch resource"
);
}
}
private URI validateUrl(String url) {
try {
URI uri = new URI(url);
// Only allow HTTPS
if (!"https".equals(uri.getScheme())) {
throw new ResponseStatusException(
HttpStatus.BAD_REQUEST,
"Only HTTPS URLs allowed"
);
}
// Check domain allowlist
String host = uri.getHost().toLowerCase();
if (!ALLOWED_DOMAINS.contains(host)) {
throw new ResponseStatusException(
HttpStatus.BAD_REQUEST,
"Domain not allowed"
);
}
// Prevent DNS rebinding attacks
InetAddress address = InetAddress.getByName(host);
if (isPrivateAddress(address)) {
throw new ResponseStatusException(
HttpStatus.BAD_REQUEST,
"Private IP addresses not allowed"
);
}
return uri;
} catch (URISyntaxException | UnknownHostException e) {
throw new ResponseStatusException(
HttpStatus.BAD_REQUEST,
"Invalid URL"
);
}
}
private boolean isPrivateAddress(InetAddress address) {
return address.isSiteLocalAddress() ||
address.isLoopbackAddress() ||
address.isLinkLocalAddress() ||
address.isAnyLocalAddress();
}
}
Why this works:
- Clean error responses: ResponseStatusException returns 400 Bad Request without exposing whether the rejection was caused by a domain mismatch, a private IP, or a DNS failure
- Case normalization: comparing the host with .toLowerCase() against ALLOWED_DOMAINS prevents case-variation bypasses (Example.COM vs example.com)
- Wildcard binding protection: isAnyLocalAddress() blocks 0.0.0.0, which typically routes to the local machine
- Redirect risk: RestTemplate follows redirects by default; configure a custom ClientHttpRequestFactory that disables redirects, for example by overriding SimpleClientHttpRequestFactory#prepareConnection to call setInstanceFollowRedirects(false) (see the sketch below)
- Generic error messages: "Failed to fetch resource" prevents probing of internal network topology
- Logging separation: Spring's exception handling (@ExceptionHandler) can log detailed errors server-side while returning sanitized messages to clients
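A sketch of that redirect configuration using the JDK-backed SimpleClientHttpRequestFactory (other request factories are configured through their own client settings; the class name here is illustrative):
// SKETCH - RestTemplate that does not follow redirects (JDK HttpURLConnection factory)
import java.io.IOException;
import java.net.HttpURLConnection;
import org.springframework.http.client.SimpleClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;
public class NoRedirectRestTemplate {
    public static RestTemplate create() {
        SimpleClientHttpRequestFactory factory = new SimpleClientHttpRequestFactory() {
            @Override
            protected void prepareConnection(HttpURLConnection connection, String httpMethod)
                    throws IOException {
                super.prepareConnection(connection, httpMethod);
                connection.setInstanceFollowRedirects(false); // surface 3xx instead of following it
            }
        };
        return new RestTemplate(factory);
    }
}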
URL Validation Utility Class
// SECURE - Reusable URL validator
import java.net.*;
import java.util.*;
public class UrlValidator {
private final Set<String> allowedSchemes;
private final Set<String> allowedHosts;
private final boolean blockPrivateIps;
public UrlValidator(Set<String> allowedSchemes,
Set<String> allowedHosts,
boolean blockPrivateIps) {
this.allowedSchemes = allowedSchemes;
this.allowedHosts = allowedHosts;
this.blockPrivateIps = blockPrivateIps;
}
public URL validate(String urlString) throws SecurityException {
try {
URL url = new URL(urlString);
// Validate scheme
if (!allowedSchemes.contains(url.getProtocol().toLowerCase())) {
throw new SecurityException(
"Scheme not allowed: " + url.getProtocol()
);
}
String host = url.getHost().toLowerCase();
// Validate host against allowlist
if (!isHostAllowed(host)) {
throw new SecurityException("Host not allowed: " + host);
}
// Block private IPs
if (blockPrivateIps && isPrivateIpOrRange(host)) {
throw new SecurityException(
"Private IP addresses not allowed"
);
}
// Block localhost variants
if (isLocalhost(host)) {
throw new SecurityException("Localhost not allowed");
}
return url;
} catch (MalformedURLException e) {
throw new SecurityException("Invalid URL: " + e.getMessage());
}
}
private boolean isHostAllowed(String host) {
// Exact match
if (allowedHosts.contains(host)) {
return true;
}
// Check wildcard subdomains (e.g., *.example.com)
for (String allowedHost : allowedHosts) {
if (allowedHost.startsWith("*.") &&
host.endsWith(allowedHost.substring(1))) {
return true;
}
}
return false;
}
private boolean isPrivateIpOrRange(String host) {
try {
InetAddress addr = InetAddress.getByName(host);
// Check for private IP ranges
return addr.isSiteLocalAddress() || // 10.x, 172.16-31.x, 192.168.x
addr.isLoopbackAddress() || // 127.x
addr.isLinkLocalAddress() || // 169.254.x
addr.isAnyLocalAddress() || // 0.0.0.0
isAwsMetadata(addr) || // 169.254.169.254
isDockerInternal(addr); // 172.17.x
} catch (UnknownHostException e) {
// Fail closed: unresolvable hosts are treated as private/blocked
return true;
}
}
private boolean isAwsMetadata(InetAddress addr) {
byte[] bytes = addr.getAddress();
return bytes.length == 4 &&
bytes[0] == (byte) 169 &&
bytes[1] == (byte) 254 &&
bytes[2] == (byte) 169 &&
bytes[3] == (byte) 254;
}
private boolean isDockerInternal(InetAddress addr) {
byte[] bytes = addr.getAddress();
return bytes.length == 4 &&
bytes[0] == (byte) 172 &&
bytes[1] == 17;
}
private boolean isLocalhost(String host) {
return host.equals("localhost") ||
host.equals("127.0.0.1") ||
host.equals("::1") ||
host.equals("0.0.0.0");
}
}
// Usage:
UrlValidator validator = new UrlValidator(
Set.of("https"),
Set.of("api.example.com", "*.cdn.example.com"),
true
);
URL safeUrl = validator.validate(userInput);
Why this works:
- Comprehensive encapsulation: the validator centralizes all SSRF defenses in one reusable component: scheme allowlist, host allowlist with wildcard subdomain support (*.cdn.example.com), DNS resolution, and private-IP checks
- Multi-layer IP validation: isSiteLocalAddress() / isLoopbackAddress() / isLinkLocalAddress() provide comprehensive private-range detection
- Cloud-specific detection: dedicated methods match the AWS/Azure metadata address (169.254.169.254) and the default Docker bridge network (172.17.x)
- Environment flexibility: the constructor parameters (allowedSchemes, allowedHosts, blockPrivateIps) let production enforce HTTPS-only while development configurations can be relaxed
- Immutable configuration: Set.of() produces unmodifiable sets, preventing post-initialization modification
- Defense-in-depth architecture: a single location for validation logic ensures consistent SSRF protection across all HTTP clients (HttpClient, OkHttp, RestTemplate); see the wiring sketch below
- Audit-friendly: security policy is consolidated in one class rather than scattered across service methods
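A brief sketch of the defense-in-depth point: register one UrlValidator (the class above) as a shared Spring bean so every outbound call site validates against the same policy. The configuration class name and allowlist entries are assumptions.
// SKETCH - one shared validator bean for all outbound HTTP call sites
import java.util.Set;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class OutboundHttpSecurityConfig {
    @Bean
    public UrlValidator outboundUrlValidator() {
        // Same policy everywhere: HTTPS only, explicit hosts, private IPs blocked
        return new UrlValidator(
            Set.of("https"),
            Set.of("api.example.com", "*.cdn.example.com"),
            true
        );
    }
}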
URL Parser Bypass Prevention
Different languages parse URLs differently, allowing bypasses:
Example bypass techniques:
- http://127.0.0.1@evil.com (parsed differently by Python vs Java)
- http://[::ffff:127.0.0.1]/ (IPv6 notation for IPv4)
- http://0x7f.0x0.0x0.0x1/ (hex-encoded IP)
- http://2130706433/ (decimal notation for 127.0.0.1)
- http://localhost%00.evil.com/ (null byte injection)
- http://evil.com#@127.0.0.1/ (fragment abuse)
Safe URL parsing with Apache Commons Validator:
// Java - proper URL parsing with Apache Commons Validator
import java.net.URL;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Set;
import org.apache.commons.validator.routines.InetAddressValidator;
public class SafeUrlParser {
private static final Set<String> ALLOWED_DOMAINS = Set.of(
"api.example.com",
"cdn.example.com"
);
public URL parseAndValidate(String urlString) throws SecurityException {
try {
URL url = new URL(urlString);
String host = url.getHost();
// Validate against allowlist
if (!ALLOWED_DOMAINS.contains(host)) {
throw new SecurityException("Domain not allowed");
}
// Check if it's an IP address (reject if private)
InetAddressValidator validator = InetAddressValidator.getInstance();
if (validator.isValid(host)) {
InetAddress addr = InetAddress.getByName(host);
if (addr.isSiteLocalAddress() || addr.isLoopbackAddress() ||
addr.isLinkLocalAddress()) {
throw new SecurityException("Private IP not allowed");
}
}
// Validate scheme
if (!url.getProtocol().equals("https") && !url.getProtocol().equals("http")) {
throw new SecurityException("Invalid protocol");
}
return url;
} catch (Exception e) {
throw new SecurityException("Invalid URL", e);
}
}
}
Maven Dependency for Apache Commons Validator:
<dependency>
<groupId>commons-validator</groupId>
<artifactId>commons-validator</artifactId>
<version>1.7</version>
</dependency>
Why this works:
- Parser discrepancy exploitation: attackers try to bypass URL.getHost() checks with http://127.0.0.1@evil.com (some parsers and validators disagree about whether the host is evil.com or 127.0.0.1), IPv6-wrapped IPv4 (http://[::ffff:127.0.0.1]/), hex (0x7f.0x0.0x0.0x1), or decimal IPs (2130706433 for 127.0.0.1); a userinfo-rejection sketch follows this list
- IP-literal detection: Apache Commons InetAddressValidator identifies hosts that are raw IP addresses so they are resolved and screened with isSiteLocalAddress() / isLoopbackAddress() / isLinkLocalAddress() instead of slipping past a domain-only allowlist
- Pre-resolution validation: the parsed host string is checked against the domain allowlist before any DNS resolution, which rejects null byte injection (localhost%00.evil.com) and fragment abuse (evil.com#@127.0.0.1) because the mangled host is simply not on the list
- Protocol restriction: strict scheme validation blocks jar://, file:// and other non-HTTP schemes that bypass hostname checks
- Layered defense: allowlist → IP validation → protocol check means a parser quirk must defeat every layer, not just one, to reach internal resources via alternate URL representations
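One extra hardening step suggested by the userinfo trick above (a minimal sketch, not part of the Commons Validator example; the helper class name is an assumption): java.net.URI exposes the userinfo component, so URLs that embed credentials can be rejected before any host check runs.
// SKETCH - reject URLs carrying a userinfo component (helper name is illustrative)
import java.net.URI;
import java.net.URISyntaxException;
public class UserInfoCheck {
    public static void rejectUserInfo(String urlString) throws URISyntaxException {
        URI uri = new URI(urlString);
        // "http://127.0.0.1@evil.com" parses with userInfo "127.0.0.1" and host "evil.com"
        boolean hasUserInfo = uri.getUserInfo() != null
                || (uri.getRawAuthority() != null && uri.getRawAuthority().contains("@"));
        if (hasUserInfo) {
            throw new SecurityException("URLs with embedded credentials are not allowed");
        }
    }
}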
Framework-Specific Guidance
Spring Boot with WebClient
// SECURE - Spring WebClient with validation
import org.springframework.stereotype.Service;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;
import java.util.Set;
@Service
public class SafeApiService {
private final WebClient webClient;
private final UrlValidator urlValidator;
public SafeApiService() {
this.webClient = WebClient.builder()
.baseUrl("https://api.example.com") // Base URL constraint
.build();
this.urlValidator = new UrlValidator(
Set.of("https"),
Set.of("api.example.com"),
true
);
}
public Mono<String> fetchData(String endpoint) {
// Validate endpoint before using
try {
String fullUrl = "https://api.example.com" + endpoint;
urlValidator.validate(fullUrl);
return webClient.get()
.uri(endpoint)
.retrieve()
.bodyToMono(String.class);
} catch (SecurityException e) {
return Mono.error(e);
}
}
}
JAX-RS (Jersey)
// SECURE - JAX-RS with URL validation
import javax.ws.rs.*;
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;
import java.net.URL;
import java.util.Set;
@Path("/api")
public class SecureProxyResource {
private final Client client = ClientBuilder.newClient();
private final UrlValidator urlValidator;
public SecureProxyResource() {
this.urlValidator = new UrlValidator(
Set.of("https"),
Set.of("external-api.example.com"),
true
);
}
@GET
@Path("/fetch")
public Response fetchResource(@QueryParam("url") String url) {
try {
URL validatedUrl = urlValidator.validate(url);
Response response = client.target(validatedUrl.toString())
.request()
.get();
return Response.ok(response.readEntity(String.class)).build();
} catch (SecurityException e) {
return Response.status(Response.Status.BAD_REQUEST)
.entity("Invalid URL")
.build();
}
}
}
Protecting Cloud Metadata Endpoints
// SECURE - Block AWS/Azure/GCP metadata endpoints
import java.net.InetAddress;
import java.net.URL;
import java.util.Set;
public class MetadataProtection {
private static final Set<String> BLOCKED_HOSTS = Set.of(
"169.254.169.254", // AWS and Azure metadata (same IP)
"metadata.google.internal" // GCP metadata
);
private static final Set<String> BLOCKED_PATHS = Set.of(
"/latest/meta-data",
"/latest/user-data",
"/latest/dynamic",
"/computeMetadata/v1"
);
public void validateNotMetadata(String url) throws SecurityException {
try {
URL parsedUrl = new URL(url);
String host = parsedUrl.getHost().toLowerCase();
String path = parsedUrl.getPath();
// Block metadata service IPs/hostnames
if (BLOCKED_HOSTS.contains(host)) {
throw new SecurityException("Access to metadata service blocked");
}
// Block metadata paths
for (String blockedPath : BLOCKED_PATHS) {
if (path.startsWith(blockedPath)) {
throw new SecurityException("Access to metadata endpoint blocked");
}
}
// Block link-local addresses (169.254.x.x)
InetAddress addr = InetAddress.getByName(host);
if (addr.isLinkLocalAddress()) {
throw new SecurityException("Link-local addresses blocked");
}
} catch (Exception e) {
throw new SecurityException("URL validation failed: " + e.getMessage());
}
}
}
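Usage sketch in the style of the UrlValidator example above (userSuppliedUrl stands in for request input):
// Usage:
MetadataProtection protection = new MetadataProtection();
protection.validateNotMetadata(userSuppliedUrl); // throws SecurityException for metadata targets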
Testing and Validation
SSRF vulnerabilities should be identified through:
- Static Analysis Tools: Use tools like SonarQube, Checkmarx, Fortify, or Veracode to identify potential SSRF sinks
- Dynamic Application Security Testing (DAST): Tools like OWASP ZAP, Burp Suite, or Acunetix can test for SSRF by manipulating URL parameters
- Manual Penetration Testing: Test with internal IP addresses (127.0.0.1, 192.168.x.x), cloud metadata endpoints (169.254.169.254), and the file:// protocol; the sketch after this list turns these payloads into automated regression checks
- Code Review: Ensure all HTTP client usage includes URL validation against an allowlist and blocks private IP ranges
- Network Monitoring: Monitor outbound requests to detect unexpected internal network access
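A hedged sketch that captures those manual payloads as automated regression tests, assuming JUnit 5 and the UrlValidator class defined earlier on this page; every payload is rejected by the scheme or host allowlist before any DNS lookup, so the test runs offline.
// SKETCH - regression tests pinning typical SSRF payloads (JUnit 5 assumed)
import static org.junit.jupiter.api.Assertions.assertThrows;
import java.util.Set;
import org.junit.jupiter.api.Test;
class UrlValidatorSsrfTest {
    private final UrlValidator validator = new UrlValidator(
        Set.of("https"),
        Set.of("api.example.com"),
        true
    );
    @Test
    void rejectsInternalAndMetadataTargets() {
        String[] payloads = {
            "http://127.0.0.1/admin",                    // loopback, wrong scheme
            "https://192.168.1.10/",                     // RFC 1918 address
            "https://169.254.169.254/latest/meta-data/", // cloud metadata endpoint
            "https://2130706433/",                       // decimal notation for 127.0.0.1
            "file:///etc/passwd"                         // non-HTTP scheme
        };
        for (String payload : payloads) {
            assertThrows(SecurityException.class, () -> validator.validate(payload));
        }
    }
}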