# Files
A file field on a form lets visitors attach uploads. The submission lands as a normal payload row, plus a list of file records pointing at the actual bytes in private storage.
Files are a first-class part of every plan, but the rules differ.
## Upload from the form
In the builder, drop a file field, then under its config:
- `accept`: MIME types or extensions you allow (`["image/png", "image/jpeg", ".pdf"]`). Empty means anything goes.
- `multiple`: let visitors attach more than one file.
- `required`: at least one file must be attached.
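As a sketch, the three options might be captured in a config fragment like the following. The exact shape is an assumption (the builder UI is the source of truth), but the field names match the options above:

```json
{
  "type": "file",
  "name": "cv",
  "accept": ["image/png", "image/jpeg", ".pdf"],
  "multiple": true,
  "required": true
}
```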
For legacy forms (your own HTML), use a regular `<input type="file" name="cv" />`. We accept `multipart/form-data` POSTs at every form endpoint.
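For example, a scripted submission can attach a file with curl's `-F` flag, which builds the `multipart/form-data` body for you. The endpoint URL below is a placeholder; substitute your form's real endpoint:

```shell
# FORM_URL is hypothetical -- use the endpoint shown for your form.
FORM_URL="https://formspring.io/f/your-form-id"

# -F builds a multipart/form-data body; @ attaches the file's bytes,
# and ;type= sets its MIME type explicitly. --fail turns a 4xx
# rejection (wrong MIME type, oversized upload) into a non-zero exit.
curl --fail \
  -F "name=Ada" \
  -F "cv=@cv.pdf;type=application/pdf" \
  "$FORM_URL"
```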
## MIME and size restrictions
We enforce MIME at upload time. If accept is set, anything outside the list is rejected before the file is stored. Visitors see an inline error; the submission doesn't land.
Size caps are per file:
| Plan | Per-file cap | Total per submission |
|---|---|---|
| Free | 5 MB | 10 MB |
| Pro | 25 MB | 100 MB |
| Team | 100 MB | 500 MB |
A submission that exceeds the total cap fails with `413 Payload Too Large`. No partial uploads: either everything lands or nothing does.
We also block specific extensions outright, regardless of `accept`: `.exe`, `.bat`, `.cmd`, `.msi`, `.dll`, `.scr`, `.com`. If you genuinely need to accept executables, use a different upload service; we don't host them.
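The caps above can be checked client-side before you POST, so oversized submissions fail fast instead of with a `413`. A minimal sketch, assuming the Free-plan caps (5 MB per file, 10 MB per submission):

```shell
#!/bin/sh
# Client-side pre-check against the Free-plan caps. Adjust the two
# constants for Pro (25 MB / 100 MB) or Team (100 MB / 500 MB).
PER_FILE_CAP=$((5 * 1024 * 1024))
TOTAL_CAP=$((10 * 1024 * 1024))

check_files() {
  total=0
  for f in "$@"; do
    size=$(wc -c < "$f")              # size in bytes
    if [ "$size" -gt "$PER_FILE_CAP" ]; then
      echo "reject: $f exceeds the per-file cap"
      return 1
    fi
    total=$((total + size))
  done
  if [ "$total" -gt "$TOTAL_CAP" ]; then
    echo "reject: submission exceeds the total cap"
    return 1
  fi
  echo "ok: $total bytes across $# file(s)"
}
```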
## Where files live
Private object storage, under a per-team isolated bucket prefix. The bucket is private: there's no public read URL, and objects can't be reached by guessing paths. Bytes are reachable only via signed URLs we mint on demand.
## Signed download URLs
Click a file thumbnail in the inbox detail panel and we mint a signed URL with a 15-minute TTL. Anyone holding the URL can use it until it expires, but sharing it doesn't grant anyone outside your team the ability to mint a replacement once it does.
Signed URLs:
- 15-minute TTL by default.
- Encode the file's content-type so browsers render PDFs and images inline.
- Don't reveal the underlying storage path or bucket name.
- Can't be extended — generate a new one if 15 minutes isn't enough.
Via the API: `GET /api/v1/submissions/{submission}/files/{file}/url` returns a fresh signed URL. Via MCP: `get_submission_file`.
## Streaming download (for big files)
Don't hot-link a signed URL into a long-running script; by the time you've downloaded a 100 MB file over a slow link, the URL might have expired. Instead, mint the URL and start the download in one command:

```shell
curl -L "$(
  curl -s -H "Authorization: Bearer $TOKEN" \
    https://formspring.io/api/v1/submissions/01J/files/01K/url \
  | jq -r .url
)" --output cv.pdf
```

The inner `curl -s | jq` pulls a freshly minted URL; the outer `curl -L` follows any redirect and streams the file to disk. Because the URL is minted immediately before the download begins, you have the full 15-minute TTL to finish it.
## Retention
Files inherit their submission's retention policy. As long as the submission is in your data store, the file is too.
- Submission deleted → file deleted (within 24 hours).
- Submission expired by retention → file expired with it.
- Form archived past retention → all files force-deleted with the form.
We never keep file bytes longer than the submission row. If you need long-term archival, export and store yourself — a CSV-with-file-URLs export uses fresh signed URLs each time you regenerate it.
## Inspecting via API and MCP
REST: `GET /api/v1/submissions/{submission}` returns a `files` array on the submission; each entry has `id`, `original_name`, `mime`, `size_bytes`, and `signed_url` (15-minute TTL).
MCP: `get_submission` includes the `files` array; `get_submission_file` mints a fresh URL on demand.
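As a sketch of working with that payload, here's the `files` array pulled apart with `jq`. The JSON below is an illustrative sample, not real API output; only the field names match the response described above:

```shell
# Illustrative sample of a GET /api/v1/submissions/{submission} response.
response='{"id":"01J","files":[{"id":"01K","original_name":"cv.pdf","mime":"application/pdf","size_bytes":48211,"signed_url":"https://example.invalid/signed"}]}'

# One line per file: name, size, MIME type.
echo "$response" | jq -r '.files[] | "\(.original_name)\t\(.size_bytes) bytes\t\(.mime)"'
```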
## What's next
- Inbox → where you click to download
- Bulk actions → file deletes cascade with submissions
- Validation → gate uploads at submission time