Static websites store already-rendered content, which is why they don’t need to access any database, run complex scripts, or depend on a runtime engine whenever a user requests a page.

That translates into clear advantages in load times and security: static pages require much less server time and expose fewer vulnerabilities. Faster pages, in turn, tend to rank better in search engines than their dynamic equivalents.

SEO experts are turning to static content whenever they can, to better compete in a world where a fraction of a second can make the difference between success and failure. Static content deployment has become a buzzword among marketing strategists, and IT personnel appreciate having a smaller, less vulnerable surface to keep an eye on.

But beware: static sites are not 100% hack-proof, so if you plan to deploy static content on your website, there are some best practices you should follow to keep it secure.

Use HTTP Security Headers

Security headers are a subset of HTTP response headers (the metadata, caching rules, and other instructions the web server attaches to the content it serves) designed to tell the browser how to handle the content it receives. Not all browsers support all security headers, but there is a small, widely supported set that provides basic protection against common exploits.

X-Frame-Options: SAMEORIGIN

The X-Frame-Options header mitigates the risks of your pages being loaded inside iframes. Attackers can embed a legitimate page in an invisible iframe to hijack visitors' clicks and send them to any URL they want, a technique known as clickjacking. There are different ways to prevent this misuse of iframes.

The best practice recommended by OWASP (the Open Web Application Security Project) is to use this header with the SAMEORIGIN value, which allows pages to be framed only by pages from the same origin. Other values are DENY, which blocks framing entirely, and ALLOW-FROM, which permits framing only from specific URLs (ALLOW-FROM is obsolete in modern browsers).

Check out the implementation guide for Apache and Nginx.

X-XSS-Protection: 1; mode=block

The X-XSS-Protection header is designed to protect websites from reflected cross-site scripting (XSS). It can be set in two ways:

  • X-XSS-Protection: 1
  • X-XSS-Protection: 1; mode=block

The first is more permissive: the browser filters out the suspicious script but renders the page anyway. The second is more secure, since the browser blocks the whole page when a reflected cross-site scripting attack is detected. The second option is the OWASP-recommended best practice.

X-Content-Type-Options: nosniff

This header prevents MIME “sniffing”, a browser behavior that inspects the content and may handle it as a type different from the one declared by the server. When this header is present, the browser must use the declared content type instead of inferring one by “sniffing” the content.

If you apply this header, double-check that the content type is declared correctly on every page of your static website.

Content-Type: text/html; charset=utf-8

This header has been part of HTTP since version 1.0. On an HTML page it tells the browser that the response body is HTML encoded as UTF-8, so the markup is parsed and rendered rather than shown as plain text. Declaring the character set explicitly also keeps the browser from guessing the encoding.

Use TLS certificates

An SSL/TLS certificate is a must for any website because it allows the web server to encrypt the data it sends to the browser over the secure HTTPS protocol. That way, if the data is intercepted in transit, it will be unreadable, which is essential both for protecting user privacy and for securing the website. A static website doesn't store its visitors' personal information, but it is still important that the content they request cannot be read by eavesdroppers.

Most web browsers now flag sites that don't use encryption as “not secure”, and encryption is also expected of websites that seek to comply with the EU's General Data Protection Regulation (GDPR). The regulation does not specifically require an SSL/TLS certificate, but using one is the simplest way to meet its privacy requirements.

In terms of security, the certificate lets a certificate authority verify ownership of the website, which makes it harder for attackers to pass off fake versions of it. It also allows visitors to check the authenticity of the publisher and to be confident that no one can spy on their activity on the site.

The good news is that a certificate doesn't cost much. In fact, you can get one for free from ZeroSSL or buy a premium one from SSL Store.

Deploy DDoS protection

Distributed Denial of Service (DDoS) attacks are increasingly common. In this type of attack, a large number of distributed devices floods a server with requests until it is saturated and stops responding. Even if your website serves only static content, its web server can still fall victim to a DDoS attack if you don't take the necessary measures.

The easiest way to implement DDoS protection is to have a security service provider take care of cyber threats for you. Such a service typically provides intrusion detection, antivirus scanning, vulnerability scanning, and more, so most threats are handled without your involvement.

Such a comprehensive solution could be pricey, but there are also more focused solutions with lower costs, such as DDoS Protection as a Service (DPaaS). You should ask your hosting provider if it offers such a service.

Cloud-based DDoS protection services, such as those offered by Akamai, Sucuri, or Cloudflare, are another affordable option. These services provide early detection and analysis of DDoS attacks, as well as filtering and diversion, that is, rerouting the malicious traffic away from your site.

When considering an anti-DDoS solution, you should pay attention to its network capacity: this parameter indicates how much attack intensity the protection can withstand.

Avoid Vulnerable JavaScript Libraries

Even if your website only serves static content, it may rely on JavaScript libraries that introduce security risks. It is often estimated that around 20% of such libraries make a website more vulnerable. Fortunately, you can use the Vulnerability DB service to check whether a particular library is safe; its database contains detailed information and guidance for many known vulnerabilities.

Besides checking individual libraries for vulnerabilities, you can follow this list of best practices for JavaScript libraries to reduce the risks they pose:

  • Don’t use external library servers. Instead, store the libraries on the same server that hosts your website. If you must use external libraries, avoid blacklisted servers and periodically check the security of the external servers you rely on.
  • Use version management for JavaScript libraries and make sure you use the latest version of each one. If version management is not an option, at least use versions that are free from known vulnerabilities. You can use retire.js to detect vulnerable versions.
  • Regularly check whether your website is loading external libraries you don’t know about. This way, you will notice if a hacker has injected links to unwanted library providers. Injection attacks are unlikely on static websites, but it doesn’t hurt to run this check once in a while, as in the sketch below.

Implement a Backup Strategy

A static website should have its content backed up whenever it changes. The backup copies must be stored safely and be easily accessible in case you need to restore the website after a crash. There are many ways to back up a static website, but in general they fall into two categories: manual and automatic.

If your website content doesn’t change very often, a manual backup strategy can be adequate; you just need to remember to make a fresh backup whenever you change the content. If you manage your hosting account through a control panel, it is very likely to include an option for making backups. If not, you can always use an FTP client to download all the website content to a local device, where you can keep it safe and restore it if necessary.

Of course, the automatic option is preferable if you want to keep your website management tasks to a minimum. But automatic backups are usually offered as a premium feature by hosting providers, adding to the total cost of keeping your website secure.

You may consider using cloud object storage for the backup.

Use a Reliable Hosting Provider

A reliable web hosting service is necessary to guarantee that your website operates smoothly and swiftly, and also to reduce the risk of it being hacked. Most web hosting reviews show figures and comparisons for speed, uptime, and customer support, but when it comes to website security there are some aspects you should look at carefully and ask your provider about before signing up:

  • Software security: you should find out how software updates are handled; for example, if all the software is auto-updated or if each update is subjected to a testing process before it is deployed.
  • DDoS protection: in case this kind of protection is included with the hosting service, ask for details about how it is implemented, to verify if it meets your website requirements.
  • SSL availability and support: since in most cases the certificates are managed by the hosting provider, you should check what kind of certificate it offers and what its renewal policy is.
  • Backup and restore: many hosting providers offer an automated backup service, which is convenient because it practically lets you forget about making backups, storing them, and keeping them updated. But take into account the cost of such a service and weigh it against the effort of keeping your content backed up yourself.
  • Malware protection: a reliable hosting provider should protect its servers against malware by performing periodic malware scans and monitoring file integrity. In the case of shared hosting, it is desirable that the provider uses account isolation to prevent malware infections from spreading between neighboring websites.
  • Firewall protection: a hosting provider can increase the security level of the websites it hosts by deploying a firewall that keeps hostile traffic away.

Check out this reliable static site hosting platform.

Enforce a Strong Password Policy

Since a static site doesn’t have a database or a content management system, there are fewer usernames and passwords to manage. But you still have to enforce a password policy for the hosting or FTP accounts you use to update the static content.

Good practices for passwords include, among others:

  • Changing them periodically
  • Setting a minimum password length
  • Using combinations of uppercase and lowercase letters together with numbers and special characters
  • Avoiding sending them by email or text message

Also, the default password for administrative accounts must be changed from the very beginning; leaving it in place is a common error that hackers can easily exploit. Don’t be afraid of losing passwords: use a password manager to store them securely.

Let’s get static

A few years ago, dynamic content was the way to go: everything could be easily changed and updated, allowing for an entire website redesign within seconds. But then, speed became the top priority, and static content suddenly became cool again.

In that sense, all website security practices should be reevaluated: there are certainly fewer aspects to consider, but that is no reason to let your guard down. This list of best practices should help you build your own checklist to keep your static website safe and sound.