"502 Bad Gateway" when visiting certain URLs, Part 3
Googlebot found another bad URL:
When viewing it on my hosted site, I get an Nginx 502 Bad Gateway error.
I've fixed this in the repository. I replaced the response with "Access Forbidden", though "Bad Request" might be more appropriate. The request requires certain parameters to work; it's part of the internal API.
Google sure is good at finding these URLs. Would it be prudent to start adding them to robots.txt?
Yes, the non-content URLs should not be crawled.
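As a sketch, assuming the internal API endpoints live under a path such as `/api/` (the actual path prefix on this site may differ), the robots.txt entry could look like:

```
User-agent: *
Disallow: /api/
```

Note that robots.txt only asks well-behaved crawlers like Googlebot not to fetch those URLs; it doesn't prevent access, so the server should still return a proper error status for requests with missing parameters.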