The WebSite Checkup section shows the results of testing your website against the following essential parameters:
SSL Certificate - checks for the presence of a Secure Sockets Layer (SSL) certificate on a website. Information sent over the Internet passes from computer to computer on its way to the destination server. Any computer between a user and a server can intercept usernames, passwords, and other important details such as credit card numbers. An SSL certificate encrypts all transmitted information, including sensitive data.
Valid certificate - checks the validity of an SSL certificate (if present).
Server signature test - checks whether your server discloses its software version. If it does, the check is considered failed. A server signature is the public identity of a web server. It contains sensitive information that could be used to exploit a vulnerability. You should turn your server signature off to secure the server and avoid disclosing which software versions you are running.
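For example, on an Apache server the signature can be disabled with two directives in the main configuration file (the file location varies by distribution, e.g. httpd.conf or apache2.conf):

```apache
# Stop appending the server version footer to error pages and listings
ServerSignature Off
# Send only "Server: Apache" in response headers, with no version details
ServerTokens Prod
```

Other web servers have equivalent settings; consult your server's documentation for the exact directive.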
Have robots.txt - checks for the presence of a robots.txt file on a website. Robots.txt is a standard that websites use to communicate with crawlers and robots. It tells a robot which areas of the website it should or should not process or scan.
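A minimal robots.txt, served from the site root, might look like this (the paths and sitemap URL are illustrative):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```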
Have sitemap.xml - checks for the presence of a sitemap.xml file on a website. A sitemap lists the pages of a website that are accessible to crawlers or users.
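A minimal sitemap.xml in the standard Sitemaps protocol format could look like the following (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawlers to discover -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```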
IP canonicalization test - normally, a request to the server's IP address should redirect to the website's domain with a 301 (permanent redirect) response. If a different response code is returned, the test is considered failed.
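The pass/fail logic can be sketched as follows, assuming you have already requested the bare IP address and captured the status code and Location header (the function name and example values are our own):

```python
def ip_canonicalization_passes(status_code: int, location: str, domain: str) -> bool:
    """Pass only if the IP address issues a 301 redirect to the site's domain."""
    return status_code == 301 and domain in (location or "")

# A 301 pointing at the canonical domain passes; a direct 200 response does not
print(ip_canonicalization_passes(301, "https://example.com/", "example.com"))  # True
print(ip_canonicalization_passes(200, "", "example.com"))                      # False
```
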
Trash page test - checks whether non-existent URLs return a 404 response code. If they don't, the check is considered failed.
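The idea behind this check can be sketched in two small steps: build a URL that almost certainly does not exist, then verify the server answers it with 404 (the helper names are illustrative; fetching the URL is left out):

```python
import random
import string

def make_trash_url(domain: str) -> str:
    """Build a URL with a random path that almost certainly does not exist."""
    junk = "".join(random.choices(string.ascii_lowercase + string.digits, k=16))
    return f"https://{domain}/{junk}"

def trash_page_passes(status_code: int) -> bool:
    """Pass only if the server answered the non-existent URL with 404 Not Found."""
    return status_code == 404
```

In practice, a checker would fetch make_trash_url(domain) and feed the resulting status code to trash_page_passes; a server that answers 200 for garbage paths (a "soft 404") fails the test.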
Use the Site Auditor settings section to modify how our crawler scans your website.