How to stop bots with robots.txt and nginx

Hey,

I edited /var/www/yunohost/apps/robots.txt and it contains:

User-agent: *
Disallow: /

I'm trying to disallow them.

I checked the robots.txt of every other app and they all have the same Disallow.

So I edited /etc/nginx/nginx.conf.
I found some good information on the web about this; if you want to check, just search for "nginx-how-to-block-exploits-sql-injections-file-injections-spam-user-agents-etc".
I added the "server { " entries with all the blocks for file injections, common exploits, spam, hoggers and hacking tools. I'm not using the one that stops the cron job scheduler.
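Roughly, the entries I added look something like this (a trimmed sketch of that recipe, placed inside the server block; the patterns are only examples, not my full list):

# Block SQL/file injection attempts carried in the query string
set $block_injections 0;
if ($query_string ~ "union.*select.*\(") {
    set $block_injections 1;
}
if ($query_string ~ "[a-zA-Z0-9_]=(\.\.//?)+") {
    set $block_injections 1;
}
if ($block_injections = 1) {
    return 403;
}

# Block a few well-known scanner/spam user agents (example list only)
if ($http_user_agent ~* (libwww-perl|nikto|sqlmap|grabber)) {
    return 403;
}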

But the bots keep coming… and I don't need them.

Is there another way to stop these accesses?

Or do I have to live with them?

My YunoHost is working great and is very solid, lovely system, I just wish there were more apps here ;)

If anyone knows an answer, please tell me. Thanks.

I don't think there is much more you can do regarding robots.txt. Bad robots are going to ignore it anyway and keep scanning your server.

Make sure your services and software are up to date, to prevent the exploitation of security holes, and use fail2ban to prevent brute-force attacks.

Thank you for your answer @vetetix. I always try my best to keep everything up to date, which isn't easy sometimes. I don't use fail2ban, only iptables. I found this blacklist, which is another alternative to add to my iptables for more security, and it is working well: https://github.com/trick77/ipset-blacklist

Yes, they will always keep scanning all the servers; I guess we have to live with them.

Are you sure that your nginx and SSOwat configuration allow the robots.txt files to be served to unauthenticated users?

Anyway, the best way to block robots is probably not to put a robots.txt file in each app, but to block them in the nginx conf. YunoHost does this in the web administration configuration:
https://github.com/YunoHost/yunohost/blob/unstable/data/templates/nginx/yunohost_admin.conf#L27-L32
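I don't remember the exact lines, but the idea there is to refuse requests whose User-Agent looks like a crawler; the pattern is roughly like this (an approximation, not the exact contents of that file):

# Return 403 to anything identifying itself as a bot/crawler (illustrative regex)
if ($http_user_agent ~* (bot|crawl|spider|slurp)) {
    return 403;
}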

You can also do something like this: https://bascht.com/tech/2013/06/20/emulate-robotstxt-with-simple-nginx-directive/
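The trick from that article is simply to answer the /robots.txt request directly from nginx, for example in the server block of your domain (a minimal sketch, to adapt to your setup):

location = /robots.txt {
    # Serve a deny-all robots.txt straight from nginx, no file on disk needed
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}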
But I guess you would have to unprotect domain.tld/robots.txt from SSOwat.

Thank you @tostaki.
Some apps have a robots.txt file by default when they are installed, in their folder /var/www/nameofapp, but they aren't working.
YunoHost has /var/www/yunohost/apps/robots.txt for all apps.
And I'm not sure that my nginx and SSOwat configuration allow the robots.txt files to be served. I'm using a nohost.me domain; I'm thinking that maybe this free service doesn't let me block bots using robots.txt, right?
The nginx.conf "server { " entries block some of the query strings, but not all; I will add more entries for the new ones I found.
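For example, entries like these (just an illustration of the kind of pattern I mean, not the complete list):

# Block base64-encoded payloads and script tags passed in the query string
if ($query_string ~ "base64_encode.*\(.*\)") {
    return 403;
}
if ($query_string ~* "(<|%3C).*script.*(>|%3E)") {
    return 403;
}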
I have those lines in yunohost_admin.conf uncommented (without the #), but they aren't doing what they are supposedly there for.
I have to check my logs more carefully and study this to find a solution.

Edit: here is my security.conf for nginx: https://pi.nohost.me/zerobin/?66106223ccdf0a47#Bez+GY2fHpovBySnFQFANBVxha5l8+YDPCTdBQDMYrA=

But it is not blocking normal bots.