Search engine optimization
Search engine optimization (SEO) requires careful attention to good URLs and links, relevant content, and A/B testing. Fastly is a great place to exert fine control over all of these aspects of your site experience without compromising on performance.
Many companies spend a significant part of their marketing budget on Search Engine Marketing (SEM) and advertising, so anything you can do to increase organic traffic is worth its weight in gold. In the case of Google, tools like PageSpeed Insights and Search Console help you quickly identify which improvements your site needs.
Here we'll help you understand how Fastly can help boost your organic search performance.
Improving search ranking is rarely a matter of tuning a single property of your website, since many different signals contribute to good ranking. Several Fastly features can prove useful:
- Header manipulation: HTTP headers can affect the way a search crawler interprets your page, and are also important in creating redirect responses. Learn more about headers and Fastly.
- Cache policy: When cache directives are used correctly, they can dramatically improve your site's performance. Fastly respects cache policies set on responses by your servers, including `Surrogate-Control` as well as all `Cache-Control` directives. In VCL services, we also allow cache directives to be modified before a response is cached. Learn more about cache freshness.
- Stale serving: Serving stale allows content to be delivered even after it has passed its normal cache lifetime. Normally this is done while the content is refreshed in the background, or when the origin server is down.
- Image optimization: Turn on our image optimizer to adapt images to the correct size and format for each client device.
- Compression: Compression can dramatically reduce your response sizes, which crawlers often value, and may allow a crawler to visit each page more often. Fastly supports compressing content at the edge.
- HTML manipulation: Using Compute@Edge, you can write streaming transforms in any supported language to modify the content of a response as it passes through your app.
- Generated responses: In Compute@Edge services it's easy to construct responses from scratch, which is often valuable for creating redirects. In VCL services, the `synthetic` statement provides the same capability.
- Data at the edge: It can be handy to be able to configure datasets that are accessible to edge applications, to power redirects or content insertion, for example. Compute@Edge services have access to Config stores, while VCL services use Edge dictionaries.
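To make the combination of generated responses and edge data concrete, here is a minimal sketch in plain Rust, not Fastly SDK code: the table contents and function names are invented for illustration. In a real service, the mapping would come from a Config store or Edge dictionary, and the response would be built with the SDK:

```rust
use std::collections::HashMap;

// Hypothetical redirect table; in a real service this data would live in a
// Config store (Compute@Edge) or an Edge dictionary (VCL).
fn redirect_table() -> HashMap<&'static str, &'static str> {
    HashMap::from([
        ("/old-page", "/new-page"),
        ("/2019/sale", "/offers"),
    ])
}

// Build a minimal 301 response for a known legacy path, or return None to
// let the request continue to origin.
fn redirect_for(path: &str) -> Option<(u16, String)> {
    redirect_table()
        .get(path)
        .map(|dest| (301, format!("Location: {dest}")))
}

fn main() {
    assert_eq!(
        redirect_for("/old-page"),
        Some((301, "Location: /new-page".to_string()))
    );
    assert_eq!(redirect_for("/current"), None);
}
```

Because the lookup happens entirely at the edge, the redirect is served without any round trip to origin.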
Ideas and typical uses
Some of the specific things that Fastly customers do at the edge to improve SEO include:
- Redirects: Search engine algorithms are still strongly based on links between pages. These links help determine the visibility that each web page has within the search engine results pages (SERPs). Ensuring URLs never die is one of the most important aspects of a good SEO strategy, and the edge is the best place for redirects, so that they can be served as fast as possible.
- Increasing cache TTLs: Longer cache times mean that it's more likely a search crawler will get a HIT from the cache, instead of having to load a page from your origin servers. Remember that search crawlers don't act like normal users; they will visit far more long-tail content, which is much less likely to be in cache.
- Enabling stale serving: In some cases, it can make sense to serve a crawler with potentially out-of-date content very fast, rather than fetching up-to-date content and giving the crawler a slower experience.
- Modifying content: Mis-ordered headings, over-long text or lack of descriptive content can all cause poor ranking for pages. Consider generating additional content for product or listing pages at the edge using AI language models such as ChatGPT.
- Optimizing images: Serving images that are the correct size and format for the client makes pages smaller, faster and easier for crawlers to parse.
- Authenticating crawlers: Some behaviors may be triggered by knowing that the visitor is a crawler, such as allowing the crawler to see premium content instead of a paywall. This often makes use of the `User-Agent` header, but that's easily faked. A DNS lookup at the edge can help ensure that you are dealing with the real Googlebot.
- Canonical link headers: The `Link` header can be used to denote the URL that should be considered canonical for a piece of content. Where the same content is available from multiple URLs, sending a `Link: <...>; rel="canonical"` header will avoid search crawlers seeing your content as a duplicate.
- Data collection: It's important to know how crawlers are behaving on your site. Modifying response content at the edge can enable the insertion of `<script>` tags to load your own or third-party behavioral analytics libraries, and report their data back to your Fastly service for logging.
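The crawler-authentication idea above can be sketched in plain Rust. This is not Fastly SDK code: the function names are ours, and the forward-DNS resolver is passed in as a stub so the logic stays self-contained. Google's recommended check is forward-confirmed reverse DNS: the client IP's PTR name must fall under googlebot.com or google.com, and that name must resolve back to the same IP:

```rust
use std::net::IpAddr;

// Step 1: the PTR (reverse DNS) name of the client IP must fall under
// googlebot.com or google.com.
fn is_google_hostname(host: &str) -> bool {
    let h = host.trim_end_matches('.');
    h == "googlebot.com"
        || h == "google.com"
        || h.ends_with(".googlebot.com")
        || h.ends_with(".google.com")
}

// Step 2: forward-resolve that hostname and confirm it maps back to the
// original client IP. The resolver is injected so the check can be
// exercised without network access.
fn verify_crawler<F>(client_ip: IpAddr, ptr_name: &str, resolve: F) -> bool
where
    F: Fn(&str) -> Vec<IpAddr>,
{
    is_google_hostname(ptr_name) && resolve(ptr_name).contains(&client_ip)
}

fn main() {
    let ip: IpAddr = "66.249.66.1".parse().unwrap();
    // Stub standing in for a real forward DNS lookup.
    let resolve = |name: &str| -> Vec<IpAddr> {
        if name == "crawl-66-249-66-1.googlebot.com" {
            vec![ip]
        } else {
            vec![]
        }
    };
    assert!(verify_crawler(ip, "crawl-66-249-66-1.googlebot.com", &resolve));
    assert!(!verify_crawler(ip, "fake.example.com", &resolve));
}
```

Note that a suffix check alone is not enough: without the forward confirmation, anyone who controls their own reverse DNS could claim a googlebot.com name.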
The following demos, tutorials, starter kits, and code examples are relevant to this use case. Try them out!
Sites running on Fastly that you can browse, to see the principles working for an end user.
- Social sharing cards: Generate custom social sharing cards on the fly and serve them from the edge.
Step by step instructions. Build and learn.
- Redirects: Your servers often have to handle millions of requests for old and non-canonical URLs. This can cause unneeded load and make logs messier, and if you have recently changed your site's URL scheme, you might be redirecting a lot! Learn how to shift all your static redirects to the edge using an Edge dictionary.
Template Compute@Edge applications you can use to bootstrap your project.
- Optimizely Feature Experimentation, Optimizely Full Stack (legacy), and Optimizely Rollouts: A basic starter kit for Fastly's Compute@Edge with Optimizely built in.
Snippets of code ready to copy and paste.
- Add www. to apex hostname and subdomains: Detect requests that don't include a www. prefix, and redirect to the equivalent path on a hostname that starts with www., usually to make sure there's only one canonical location for your content.
- Create image transform presets: Use custom, predefined class names like large, medium, small, teaser, thumb, or article to control the Fastly Image Optimizer, and optionally prevent end-user access to native properties like `width`.
- Enrich image responses with EXIF metadata: Use the `exif` Rust crate to decorate a backend response with image metadata.
- Filter query string parameters: Add, remove, and sort query string parameters.
- Follow redirects at the edge: Protect clients from redirects by chasing them internally at the edge, then return the eventual non-redirect response.
- Perform redirects with wildcard patterns using an edge dictionary: Match URL prefixes and make use of configurable response status and query string preservation.
- Redirect old URLs at the edge: Use a dictionary of URL mappings to serve your redirects at lightning speed.
- Remove trailing slashes to normalize URLs: Treat URLs with and without trailing slashes as equivalent, or redirect URLs with slashes to the version without.
- Rewrite URL path: Receive a request for one path but request a different path from origin, without a redirect.
- Serve robots.txt from the edge: Serve the full text of robots.txt as a synthetic response to avoid requests hitting your origin.
- Serve stale to search crawlers for better ranking: Prioritize human traffic over search crawlers by serving stale content to crawlers.
- Transform a response while streaming it: Streaming transformations avoid buffering a response, reducing latency and memory consumption.
- Use dynamic backends to follow redirects: Create a dynamic backend from the redirect response, then get a response from the dynamic backend.
- Use microservices to divide up a domain: Send requests to different origin servers based on the URL path.
- Verify if a web crawler accessing your server really is Googlebot: An implementation of Google's recommended mechanism for verifying Googlebot.
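As a flavor of the URL-normalization snippets above, here is a minimal sketch of trailing-slash handling in plain Rust (the function name is invented for illustration). At the edge, a request whose path changes under normalization would typically be answered with a 301 redirect to the normalized path:

```rust
// Normalize a URL path by treating a trailing slash as equivalent to the
// bare path; the root path "/" is left alone.
fn strip_trailing_slash(path: &str) -> &str {
    if path.len() > 1 && path.ends_with('/') {
        &path[..path.len() - 1]
    } else {
        path
    }
}

fn main() {
    assert_eq!(strip_trailing_slash("/products/"), "/products");
    assert_eq!(strip_trailing_slash("/products"), "/products");
    assert_eq!(strip_trailing_slash("/"), "/");
}
```

Collapsing the two variants to one URL means crawlers see a single canonical location instead of duplicate pages.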