Vanilla is web forum software, offered both as a cloud-based community platform and as open-source, community-supported software.
| | |
| --- | --- |
| Versions affected | Vanilla Forums <= 3.3, SafeCurl <= 0.9.2 |
| CVE identifier | CVE-2020-36474 |
# Summary
An authenticated user can request `/api/v2/media/scrape` with a valid URL; Vanilla fetches that URL using its SafeCurl library and returns a summary of the page. Security checks and an enabled IP blacklist are meant to prevent requests to arbitrary internal URLs, but the validation is prone to a DNS rebinding attack.
# Vulnerability analysis
Vanilla uses a custom SafeCurl library to perform server-side requests. Every URL submitted to the media scrape endpoint must first pass a series of validations:
```php
public function validateUrl(string $url): array {
    if ("" === trim($url)) {
        throw new InvalidURLException("URL cannot be empty.");
    }
    $parts = parse_url($url);
    if (empty($parts)) {
        throw new InvalidURLException("Unable to parse URL.");
    }
    if (!array_key_exists("host", $parts)) {
        throw new InvalidURLException("No host found in URL.");
    }

    // If credentials are passed in, but we don't want them, raise an exception
    if (!$this->areCredentialsAllowed() && (array_key_exists("user", $parts) || array_key_exists("pass", $parts))) {
        throw new InvalidURLException("Credentials not allowed as part of the URL.");
    }

    // First, validate the scheme
    if (array_key_exists("scheme", $parts)) {
        $parts["scheme"] = $this->validateScheme($parts["scheme"]);
    } else {
        // Default to http
        $parts["scheme"] = "http";
    }

    // Validate the port
    if (array_key_exists("port", $parts)) {
        $parts["port"] = $this->validatePort($parts["port"]);
    }

    // Validate the host
    $host = $this->validateHost($parts["host"]);
    $parts["host"] = $host["host"];

    // Rebuild the URL
    $url = $this->buildUrl($parts);

    return array(
        "url" => $url,
        "host" => $host["host"],
        "ips" => $host["ips"],
    );
}
```
`validateUrl()` mostly checks the URL format (host, credentials, scheme, port). The interesting part is the host validation:
```php
// Now resolve to an IP and check against the IP lists
$ips = @gethostbynamel($host);
if (empty($ips)) {
    throw new InvalidURLException("Unable to resolve host.");
}

$whitelistedIPs = $this->whitelist->getIPs();
if (!empty($whitelistedIPs)) {
    $valid = false;
    foreach ($whitelistedIPs as $whitelistedIP) {
        foreach ($ips as $ip) {
            if ($this->cidrMatch($ip, $whitelistedIP)) {
                $valid = true;
                break 2;
            }
        }
    }
    if (!$valid) {
        throw new InvalidURLException("Host does not resolve to a whitelisted address.");
    }
}

$blacklitedIPs = $this->blacklist->getIPs();
if (!empty($blacklitedIPs)) {
    foreach ($blacklitedIPs as $blacklitedIP) {
        foreach ($ips as $ip) {
            if ($this->cidrMatch($ip, $blacklitedIP)) {
                throw new InvalidURLException("Host resolves to a blacklisted address.");
            }
        }
    }
}

return ["host" => $host, "ips" => $ips];
```
`validateHost()` resolves the hostname with `gethostbynamel()` and validates every returned IP against the whitelist and blacklist.
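The check logic above can be sketched in Python (illustrative only: `validate_host`, the injectable resolver, and the `ValueError` messages are stand-ins, not SafeCurl's API):

```python
import ipaddress

def validate_host(host, resolver, whitelist=(), blacklist=()):
    # Mirrors SafeCurl's validateHost(): resolve the host once, then
    # check every returned IP against the whitelist/blacklist CIDRs.
    ips = resolver(host)  # stands in for PHP's gethostbynamel()
    if not ips:
        raise ValueError("Unable to resolve host.")

    def matches(ip, nets):
        addr = ipaddress.ip_address(ip)
        return any(addr in ipaddress.ip_network(net) for net in nets)

    if whitelist and not any(matches(ip, whitelist) for ip in ips):
        raise ValueError("Host does not resolve to a whitelisted address.")
    for ip in ips:
        if matches(ip, blacklist):
            raise ValueError("Host resolves to a blacklisted address.")
    return {"host": host, "ips": ips}

# A fake resolver keeps the example offline and deterministic.
resolve = lambda host: ["203.0.113.7"]
print(validate_host("example.com", resolve, blacklist=["127.0.0.0/8"]))
```

Note that the resolved IPs are only *returned*, not pinned to the subsequent connection, which is where the bug lives.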
```php
private function getDefaultBlacklist(): UrlPartsList {
    $result = new UrlPartsList();
    $result->setIPs([
        "0.0.0.0/8",
        "10.0.0.0/8",
        "100.64.0.0/10",
        "127.0.0.0/8",
        "169.254.0.0/16",
        "172.16.0.0/12",
        "192.0.0.0/29",
        "192.0.2.0/24",
        "192.88.99.0/24",
        "192.168.0.0/16",
        "198.18.0.0/15",
        "198.51.100.0/24",
        "203.0.113.0/24",
        "224.0.0.0/4",
        "240.0.0.0/4",
    ]);
    return $result;
}
```
The blacklist defaults to this hardcoded set of loopback, private, link-local, and otherwise reserved IPv4 ranges.
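As a quick sanity check, the coverage of those ranges can be verified with Python's `ipaddress` module (an illustration, not Vanilla's `cidrMatch` implementation):

```python
import ipaddress

# SafeCurl's default blacklist, copied from getDefaultBlacklist() above.
BLACKLIST = [
    "0.0.0.0/8", "10.0.0.0/8", "100.64.0.0/10", "127.0.0.0/8",
    "169.254.0.0/16", "172.16.0.0/12", "192.0.0.0/29", "192.0.2.0/24",
    "192.88.99.0/24", "192.168.0.0/16", "198.18.0.0/15",
    "198.51.100.0/24", "203.0.113.0/24", "224.0.0.0/4", "240.0.0.0/4",
]

def is_blacklisted(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(net) for net in BLACKLIST)

print(is_blacklisted("127.0.0.1"))     # loopback -> True
print(is_blacklisted("192.168.1.5"))   # RFC 1918 -> True
print(is_blacklisted("93.184.216.34")) # public address -> False
```

The list is solid on its own terms; the bypass does not come from a gap in these ranges but from *when* the check runs.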
The hostname is resolved twice: once by `gethostbynamel()` during validation, and again by cURL when the actual request is made. If the first resolution returns a non-blacklisted IP, validation passes, and nothing forces cURL to connect to that same IP. An attacker-controlled DNS server can therefore answer the second lookup with an internal address, so a media scrape API call performing DNS rebinding bypasses the blacklist check.
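The time-of-check / time-of-use gap can be demonstrated with a toy resolver that, like a rebinding DNS server with a low TTL, answers differently on consecutive lookups (all names and IPs here are hypothetical; the blacklist is abridged):

```python
import ipaddress

BLACKLIST = ["127.0.0.0/8", "10.0.0.0/8", "192.168.0.0/16"]  # abridged

def is_blacklisted(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(net) for net in BLACKLIST)

# A rebinding resolver: public IP on the first lookup (passes the check),
# internal IP on the second (what cURL actually connects to).
answers = iter(["93.184.216.34", "127.0.0.1"])
resolve = lambda host: next(answers)

# 1) validateHost() resolves and checks -> validation passes.
checked_ip = resolve("attacker.example")
assert not is_blacklisted(checked_ip)

# 2) cURL resolves AGAIN for the connection -> internal address.
fetched_ip = resolve("attacker.example")
print(fetched_ip)  # 127.0.0.1 -- the request reaches a blacklisted host
```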
# Impact
An authenticated user (an account can be obtained via signup) can bypass URL blacklisting in the media scrape API through DNS rebinding, leading to partial-read SSRF.
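A common mitigation for this class of bug (not necessarily the fix Vanilla shipped) is to resolve the hostname exactly once, validate that IP, and pin the connection to it; cURL exposes such pinning via `CURLOPT_RESOLVE`. A minimal Python sketch with a fake resolver:

```python
import ipaddress

BLACKLIST = ["127.0.0.0/8", "10.0.0.0/8", "192.168.0.0/16"]  # abridged

def is_blacklisted(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(net) for net in BLACKLIST)

def pinned_target(host, resolver):
    # Resolve exactly once, validate, and return the SAME IP for the
    # HTTP client to connect to (cURL: CURLOPT_RESOLVE / --resolve),
    # so a second, malicious DNS answer is never consulted.
    ip = resolver(host)
    if is_blacklisted(ip):
        raise ValueError("Host resolves to a blacklisted address.")
    return ip

answers = iter(["93.184.216.34", "127.0.0.1"])
resolve = lambda host: next(answers)

ip = pinned_target("attacker.example", resolve)
print(ip)  # 93.184.216.34 -- the rebound 127.0.0.1 answer is never used
```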
# Timeline
13-02-2020 - Reported
17-02-2020 - Vendor confirmed
23-10-2020 - Fixed