I put up a VPS with nginx, and the logs show dodgy requests within minutes. How do you guys deal with these?

Edit: Thanks for the tips everyone!

  • WasPentalive@lemmy.one · 1 year ago

    The legitimate web spiders (for example, the crawler Google uses to map the web for search) should pay attention to robots.txt. I think, though, that this only applies to web-based services.
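
    For illustration, a minimal robots.txt sketch might look like the following; the paths and the bot name are just placeholders, not anything specific to your server. Keep in mind it is purely advisory: well-behaved crawlers honor it, but the dodgy scanners filling your logs will simply ignore it.

        # Ask all crawlers to stay out of a private area and slow down
        User-agent: *
        Disallow: /admin/
        Crawl-delay: 10

        # Ask one specific (hypothetical) crawler to stay away entirely
        User-agent: BadBot
        Disallow: /

    The file goes at the web root (e.g. /robots.txt) so crawlers can fetch it before indexing the site.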