I edited /var/www/yunohost/apps/robots.txt, and this is what it contains:
User-agent: *
Disallow: /
I'm trying to disallow them. I checked the robots.txt of every other app, and they all have Disallow set too.
So I edited /etc/nginx/nginx.conf. I found some good information on the web about this — if you want to check it, just search for "nginx-how-to-block-exploits-sql-injections-file-injections-spam-user-agents-etc".
Following that guide, I added entries to the "server {" blocks to block file injections, common exploits, spam, bandwidth hoggers and hacking tools. I'm not using the rule that blocks the cron job scheduler.
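For anyone curious, the rules from that kind of guide follow a common pattern: match suspicious query strings or user agents with `if` directives inside the `server {}` block and return 403. This is an abbreviated sketch of that pattern, not the guide's complete rule set:

```nginx
## Block SQL injections (abbreviated example patterns)
set $block_sql_injections 0;
if ($query_string ~ "union.*select.*\(") {
    set $block_sql_injections 1;
}
if ($query_string ~ "concat.*\(") {
    set $block_sql_injections 1;
}
if ($block_sql_injections = 1) {
    return 403;
}

## Block file injections (remote URLs passed as parameter values)
set $block_file_injections 0;
if ($query_string ~ "[a-zA-Z0-9_]=http://") {
    set $block_file_injections 1;
}
if ($block_file_injections = 1) {
    return 403;
}

## Block some known scanning / hacking user agents (shortened list)
if ($http_user_agent ~* "(libwww-perl|sqlmap|nikto)") {
    return 403;
}
```

Note that rules like these only match the patterns you list, so scanners probing with other strings still get through — which matches what I'm seeing.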
But they keep coming… and I don't want them.
Is there another way to stop this kind of access?
Or do I just have to live with them?
My YunoHost is working great and is very solid — lovely system, I just wish there were more apps here ;)
Thank you for your answer @vetetix. I always try my best to keep everything up to date — not easy sometimes. I don't use Fail2ban, only iptables. I found this blacklist as another alternative to add to my iptables for extra security, and it's working well: https://github.com/trick77/ipset-blacklist
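For anyone who wants to try it, the core idea of ipset-blacklist is to load thousands of bad IP ranges into a kernel `ipset` and hook it into iptables with a single rule (much faster than one iptables rule per address). Roughly, it boils down to something like this — the set name "blacklist" is the project's default, and the commands need root:

```sh
# Create a set that holds network ranges (hash:net allows CIDR entries)
ipset create blacklist hash:net

# One iptables rule drops any inbound packet whose source is in the set
iptables -I INPUT 1 -m set --match-set blacklist src -j DROP

# The project's update script refreshes the set from public blocklists
# via cron; you can also add an offending range by hand:
ipset add blacklist 192.0.2.0/24   # example range, not from my logs
```

The nice part is that updating the blacklist only touches the ipset, so the iptables ruleset itself never changes.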
Yes, they will always keep scanning all the servers; I guess we have to live with them.
Thank you @tostaki.
Some apps ship a robots.txt file by default when they are installed, in their folder /var/www/nameofapp, but those files aren't working.
YunoHost also has /var/www/yunohost/apps/robots.txt for all apps.
And I'm not sure my nginx and SSOwat configuration actually serves the robots.txt files. I'm using a no.host.me domain — maybe this free service doesn't let me block crawlers using robots.txt, right?
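One thing worth checking: crawlers only ever request /robots.txt at the root of the domain, so a robots.txt sitting inside an app folder is never fetched, and if SSOwat redirects unauthenticated requests to the portal, the crawler gets a redirect instead of the file. A hypothetical per-domain nginx snippet to rule this out might look like the following — the empty `access_by_lua_block` to bypass SSOwat's access phase is my assumption about how SSOwat hooks in, so verify against your actual config before relying on it:

```nginx
# Serve the shared robots.txt for this domain, before any app location matches
location = /robots.txt {
    alias /var/www/yunohost/apps/robots.txt;
    # Assumption: SSOwat runs in the Lua access phase; an empty block
    # here would skip its login redirect for this one URL.
    access_by_lua_block { }
}
```

You can test whether it is actually reachable with `curl -i https://yourdomain/robots.txt` and check that you get a 200 with the Disallow lines rather than a redirect.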
The "server {" entries in nginx.conf block some of the query strings, but not all of them; I will add more entries for the new ones I find.
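Adding an entry for a newly spotted probe follows the same pattern as the existing rules — just a new `if` on `$query_string` inside the `server {}` block. The pattern below is only an illustration, not one from my logs:

```nginx
# Hypothetical example: block a probe string seen in the access log
if ($query_string ~* "etc/passwd") {
    return 403;
}
```

Remember to escape regex special characters (like `.`) in the pattern and reload nginx after each change.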
I have those lines in yunohost_admin.conf uncommented (without #), but they aren't doing what they were supposedly created to do.
I have to check my logs more carefully and study them to find a solution for this.