Akamai Content Delivery Network (CDN) FAQs

Last updated December 21, 2023

What is a CDN?

A "content delivery network" or CDN is a set of "edge" servers physically located around the globe, connected with a high-speed network (usually a private, faster-than-general-internet network). Web traffic connects to this network of servers instead of connecting to the "origin" servers directly (our UBCMS servers). The CDN helps accelerate, filter, and secure web traffic. We will be using CDN services from Akamai, a leading provider.

Why are we using a CDN?

A CDN helps us to:

  • Handle virtually unlimited surges of web traffic
  • Improve page load times, especially for users farther from our Buffalo datacenters
  • Block or absorb denial of service attacks and other security issues
  • Bridge over periods of unexpected downtime in our datacenter

Do we need to enable the Akamai CDN for our site?

The CDN is fully enabled for the entire UBCMS. No additional action is needed.

How does Akamai make pages faster?

Several methods improve page performance:

  • Content, especially infrequently changing JS, CSS, and images, is cached to the greatest extent possible within the Akamai network. Any content that can be served from the cache no longer involves UB's servers at all.
  • Content that does have to be routed to UBCMS origin servers uses a faster network than the general internet.
  • Connections between end-users (browsers) and Akamai's network automatically use the latest, most efficient standards. For example, HTTPS connections on compatible browsers are upgraded to HTTP/2 (see the sketch after this list).
  • Persistent connections between Akamai's network and UB's origin servers reduce network overhead.
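
If you want to verify the negotiated protocol yourself, the browser's Resource Timing API exposes it. Here is a minimal sketch, assuming a modern browser; paste it into the developer console on any UBCMS page:

```typescript
// PerformanceResourceTiming.nextHopProtocol reports the negotiated
// protocol: "h2" for HTTP/2, "http/1.1" for older connections.
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
for (const entry of resources) {
  console.log(`${entry.name}: ${entry.nextHopProtocol}`);
}

// The page itself appears as a "navigation" entry:
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
console.log(`page: ${nav.nextHopProtocol}`);
```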

Will pages take longer to activate?

Pages may be cached for up to 10 seconds in the Akamai network, so a change may take up to 10 seconds beyond the usual 30-60 seconds of replication to appear. In practice, this is not a meaningful delay.

New pages are not affected by this delay at all because there is no old version that will be cached.

Images and other static assets (CSS, JS) are also not affected by this delay because they use a URL fingerprinting technique that effectively creates a new URL whenever the content changes.
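
The fingerprinting idea itself is simple: a hash of the file's contents is embedded in the URL, so any change to the contents produces a URL no cache has seen before. A minimal sketch of the general technique (illustrative only, not UBCMS's actual implementation; paths and names are made up):

```typescript
import { createHash } from "node:crypto";

// Derive a fingerprinted URL from a file's contents. Any change to the
// contents changes the hash, so every cache treats the result as a
// brand-new resource and fetches the fresh version immediately.
function fingerprintUrl(path: string, contents: Buffer): string {
  const hash = createHash("sha256").update(contents).digest("hex").slice(0, 10);
  // e.g. /styles/site.css -> /styles/site.3f2a9c1b04.css
  return path.replace(/(\.[^./]+)$/, `.${hash}$1`);
}

const css = Buffer.from("body { color: #005bbb; }");
console.log(fingerprintUrl("/styles/site.css", css));
```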

Will different users see content changes at different times?

No. In theory, more servers that could each hold a different version of content are now involved in serving pages (6 publishers, 2 dispatchers, and now thousands of Akamai edge servers). However, Akamai makes only one request per URL to our servers and then synchronizes that content within its own network.

How is content cached in the Akamai network?

When a user connects to your site, they will be routed to the closest of Akamai's thousands of global edge servers. If the Akamai network has already seen a version of the page (or asset) that can be cached and is not expired, it will be served to the user without connecting them to our servers. If not, the Akamai network will request the page/asset from our servers.

The Akamai network acts as a caching reverse-proxy server. UBCMS has always had a caching reverse proxy (the "dispatcher" servers), so adding another layer like this should work very smoothly. In particular, rules about what can be cached and for how long are well established in the UBCMS dispatchers and will be extended to Akamai.
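
Conceptually, each caching layer (dispatcher or edge server) does the same thing: look the URL up in a local store, serve it if a fresh copy exists, and otherwise fetch from the next layer back and remember the result. A minimal sketch of that loop (illustrative only; real proxies add eviction, validation, and much more):

```typescript
interface CacheEntry {
  body: string;
  storedAt: number; // ms since epoch
  ttlMs: number;
}

const store = new Map<string, CacheEntry>();

// Serve a URL from the local cache when a fresh copy exists; otherwise
// fetch it from the next layer back (ultimately the origin) and remember it.
async function proxyGet(url: string, ttlMs: number): Promise<string> {
  const hit = store.get(url);
  if (hit && Date.now() - hit.storedAt < hit.ttlMs) {
    return hit.body; // cache hit: the origin never sees this request
  }
  const response = await fetch(url); // cache miss: forward the request
  const body = await response.text();
  store.set(url, { body, storedAt: Date.now(), ttlMs });
  return body;
}
```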

Here is a recap of the cache rules:

  • only GET and HEAD requests are cached (not POST and other methods associated with interactive and transactional content like submitting a form)
  • requests with -pw. or -pw/ in the URL (password-protected pages) are never cached
  • requests with a query string (a "?" in the URL) are never cached
  • pages marked as not cacheable in page properties are never cached
  • HTML pages that are cacheable are only cached by Akamai for up to 10 seconds
  • CSS, JS, and images are cached by our dispatchers, Akamai, and each browser indefinitely; however, a "URL fingerprinting" technique ensures that their URL changes when their content changes, so new versions are used immediately. This is not new with Akamai, but will be extra beneficial.

Note: caching HTML, even for only a very short time, adds a benefit: if our servers are down, the Akamai network can continue serving the expired copies (more than 10 seconds old) until our servers are reachable again.
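
Expressed as code, the rules above amount to a small decision function. This sketch illustrates the rules as documented here, not Akamai's actual configuration:

```typescript
type CachePolicy =
  | { cache: false }
  | { cache: true; ttlSeconds: number; serveStaleOnError: boolean };

function cachePolicy(method: string, url: string, pageCacheable: boolean): CachePolicy {
  // Only GET and HEAD are cached; POST and other methods are not.
  if (method !== "GET" && method !== "HEAD") return { cache: false };
  // Password-protected pages (-pw. or -pw/ in the URL) are never cached.
  if (url.includes("-pw.") || url.includes("-pw/")) return { cache: false };
  // Requests with a query string are never cached.
  if (url.includes("?")) return { cache: false };
  // Pages marked "not cacheable" in page properties are never cached.
  if (!pageCacheable) return { cache: false };
  // CSS, JS, and images are fingerprinted, so they can be cached indefinitely.
  if (/\.(css|js|png|jpe?g|gif|svg|webp)$/.test(url)) {
    return { cache: true, ttlSeconds: Infinity, serveStaleOnError: true };
  }
  // HTML pages: cached for at most 10 seconds, but an expired copy may
  // still be served if the origin is unreachable.
  return { cache: true, ttlSeconds: 10, serveStaleOnError: true };
}
```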

What kind of issues are expected when first enabling the CDN?

We do not expect any problems, but the kinds of potential issues we will be on highest alert for are:

  • delayed page activations even though replication agents are clear
  • errors logging in to or accessing Shibboleth-protected pages
  • content switching between different versions on different loads or for different users (since there are now more servers in the mix)
  • forms not submitting or validating properly

Will analytics continue to work with Akamai?

Yes. Because Google Tag Manager, Google Analytics, and similar services use JavaScript code in the browser to actively send telemetry, they will work unchanged.
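
As an illustration of why this telemetry is unaffected: it is sent by code running in the visitor's browser after the page loads, whether the HTML came from an Akamai edge cache or from our origin servers. A minimal sketch of the pattern (the endpoint URL is a placeholder, not a real service):

```typescript
// Browser-side telemetry fires after the page loads, regardless of
// where the page itself was served from.
window.addEventListener("load", () => {
  const payload = JSON.stringify({
    page: location.pathname,
    referrer: document.referrer,
    loadedAt: Date.now(),
  });
  // sendBeacon queues the request reliably, even during page unload.
  navigator.sendBeacon("https://analytics.example.com/collect", payload);
});
```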

Analytics that depend on server logs, however, will continue to reflect demand on the server, but that demand will no longer be directly correlated with user activity on our web pages. By design, many requests for UBCMS URLs will no longer reach on-premises UB servers and thus will not appear in on-premises log files.

How can we best take advantage of the CDN?

To get the most out of the Akamai CDN, we recommend the following best practices.

Make more pages cacheable

Cacheable pages will benefit from greater acceleration, and only cacheable pages will remain available if service in our UB datacenter is interrupted. We very strongly recommend that your home page and your most frequently used pages be cacheable. Full details on cache rules are available in this document.

Other techniques to make pages cacheable include:

  • Do not mark pages "not cacheable" in page properties. Dispatcher cache flush triggers can update your page when content elsewhere in UBCMS changes, and the "dynamic reference" option in shared content components can keep part of your page always up to date while the rest stays cached.
  • Write custom interactive applications and modules using client-side/front-end techniques (JavaScript) to replace external embed components that cannot be cached or that require full page reloads on each interaction (see the sketch after this list).
  • Consult with UC if you have important pages that you're not sure how to make cacheable.
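
As an illustration of the client-side approach mentioned above, a fully cached page can fetch just its frequently changing data after it loads and render that data in place. The endpoint and element ID here are placeholders:

```typescript
// The surrounding page stays fully cacheable; only this small JSON request
// goes back to a live endpoint (its query string also keeps it uncached).
async function renderEvents(): Promise<void> {
  const response = await fetch("/api/events.json?limit=5");
  const events: { title: string; date: string }[] = await response.json();
  const list = document.getElementById("upcoming-events");
  if (!list) return;
  list.innerHTML = events.map((e) => `<li>${e.date}: ${e.title}</li>`).join("");
}

document.addEventListener("DOMContentLoaded", renderEvents);
```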

Reduce dependency on non-accelerated services

If custom code on your site depends on external resources, make sure these resources are also loaded from high-speed, high-availability, high-security sources. For example, if you load scripts, CSS, images, or iframe content from your own server or another server at UB, you may want to move these files into UBCMS if possible (static files can be managed in the DAM via the web interface or WebDAV). If you link to popular third-party JS libraries outside of UBCMS, prefer CDN-backed sources (googleapis.com, cdnjs, jsdelivr, etc.).
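
When pulling a library from a public CDN, it is also worth pinning an exact version and a subresource-integrity hash so the file cannot change underneath you. A sketch of a loader that does this (the library URL is one real example from cdnjs; the integrity hash is a placeholder you would copy from the CDN's site):

```typescript
// Load a pinned, integrity-checked library from a CDN-backed source.
function loadScript(src: string, integrity: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const script = document.createElement("script");
    script.src = src;
    script.integrity = integrity;     // browser rejects the file if the hash differs
    script.crossOrigin = "anonymous"; // required for integrity checks cross-origin
    script.onload = () => resolve();
    script.onerror = () => reject(new Error(`failed to load ${src}`));
    document.head.appendChild(script);
  });
}

// Example usage (replace the placeholder hash with the real one from cdnjs):
loadScript(
  "https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.21/lodash.min.js",
  "sha512-<hash-from-cdnjs>",
).then(() => console.log("library loaded"));
```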