NGINX High-performance Caching 
Introduced by Andrew Alexeev 
Presented by Owen Garrett 
Nginx, Inc.
About this webinar 
Content Caching is one of the most effective ways to dramatically improve 
the performance of a web site. In this webinar, we’ll deep-dive into 
NGINX’s caching abilities and investigate the architecture used, debugging 
techniques and advanced configuration. By the end of the webinar, you’ll 
be well equipped to configure NGINX to cache content exactly as you need.
BASIC PRINCIPLES OF CONTENT CACHING
Basic Principles 
[Diagram: a client on the Internet sends GET /index.html to NGINX (N), which answers from its cache or forwards GET /index.html to the origin server] 
Used by: Browser Cache, Content Delivery Network and/or Reverse Proxy Cache
Mechanics of HTTP Caching 
• Origin server declares cacheability of content 
Expires: Tue, 06 May 2014 02:28:12 GMT 
Cache-Control: public, max-age=60 
X-Accel-Expires: 30 
Last-Modified: Tue, 29 Apr 2014 02:28:12 GMT 
ETag: "3e86-410-3596fbbc" 
• Requesting client honors cacheability 
– May issue conditional GETs
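As an illustration only, here is a minimal sketch of how an origin that itself runs NGINX might declare cacheability; the port, path and lifetime are assumptions, not part of the webinar: 
server { 
    listen 8080; 
    location /static/ { 
        root /var/www; 
        # 'expires 60s' emits both an Expires header and Cache-Control: max-age=60 
        expires 60s; 
        # ETag and Last-Modified are emitted automatically for static files 
    } 
} 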
What does NGINX cache? 
• Cache GET and HEAD with no Set-Cookie response 
• Uniqueness defined by raw URL or: 
proxy_cache_key $scheme$proxy_host$uri$is_args$args; 
• Cache time defined by 
– X-Accel-Expires 
– Cache-Control 
– Expires 
(See http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html)
NGINX IN OPERATION…
NGINX Config 
proxy_cache_path /tmp/cache keys_zone=one:10m levels=1:2 inactive=60m; 
server { 
listen 80; 
server_name localhost; 
location / { 
proxy_pass http://localhost:8080; 
proxy_cache one; 
} 
}
Caching Process 
[Flowchart: read request → check cache → HIT: respond from cache; MISS: wait for an in-flight fetch (cache_lock_timeout) or go to upstream → if the response is cacheable, stream it to disk while responding] 
NGINX can use stale content under the following circumstances: 
proxy_cache_use_stale error | timeout | invalid_header | 
updating | http_500 | http_502 | http_503 | http_504 | 
http_403 | http_404 | off
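As a sketch of how the flow above might be configured in one location block (the lock timeout value and upstream address are assumptions): 
location / { 
    proxy_pass http://localhost:8080; 
    proxy_cache one; 
    # collapse concurrent MISSes for the same key into a single upstream fetch 
    proxy_cache_lock on; 
    proxy_cache_lock_timeout 5s; 
    # serve stale content while updating or when upstream errors out 
    proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504; 
} 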
Caching is not just for HTTP 
• FastCGI 
– Functions much like HTTP 
• memcached 
– Retrieve content from a memcached server (must be prepopulated) 
• uwsgi and SCGI 
[Diagram: NGINX proxying and caching over HTTP, FastCGI, memcached, uwsgi and SCGI] 
NGINX is more than just a reverse proxy
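A hypothetical FastCGI caching sketch, analogous to the HTTP example earlier; the PHP-FPM address, cache path and sizes are assumptions: 
fastcgi_cache_path /tmp/fcgi_cache keys_zone=php:10m levels=1:2 inactive=60m; 

server { 
    listen 80; 
    location ~ \.php$ { 
        include       fastcgi_params; 
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; 
        fastcgi_pass  127.0.0.1:9000; 
        # same caching model as proxy_cache, using the fastcgi_* directives 
        fastcgi_cache       php; 
        fastcgi_cache_key   $scheme$request_method$host$request_uri; 
        fastcgi_cache_valid 200 10m; 
    } 
} 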
HOW TO UNDERSTAND WHAT’S GOING ON
Cache Instrumentation 
add_header X-Cache-Status $upstream_cache_status; 
MISS – response not found in cache; fetched from upstream. The response may have been saved to cache. 
BYPASS – proxy_cache_bypass triggered a fetch from upstream. The response may have been saved to cache. 
EXPIRED – the cache entry has expired; fresh content is returned from upstream. 
STALE – proxy_cache_use_stale takes control and serves stale content from cache because upstream is not responding correctly. 
UPDATING – stale content is served from cache because the entry is being refreshed and proxy_cache_use_stale updating (or a cache_lock timeout) takes control. 
REVALIDATED – proxy_cache_revalidate verified that the current cached content was still valid (If-Modified-Since). 
HIT – valid, fresh content is served directly from cache.
Cache Instrumentation 
map $remote_addr $cache_status { 
127.0.0.1 $upstream_cache_status; 
default ""; 
} 
server { 
location / { 
proxy_pass http://localhost:8002; 
proxy_cache one; 
add_header X-Cache-Status $cache_status; 
} 
}
Extended Status 
Check out: demo.nginx.com 
http://demo.nginx.com/status.html http://demo.nginx.com/status
HOW CONTENT CACHING FUNCTIONS IN NGINX
How it works... 
• NGINX uses a persistent disk-based cache 
– OS Page Cache keeps content in memory, with hints from 
NGINX processes 
• We’ll look at: 
– How is content stored in the cache? 
– How is the cache loaded at startup? 
– Pruning the cache over time 
– Purging content manually from the cache
How is cached content stored? 
proxy_cache_path /tmp/cache keys_zone=one:10m levels=1:2 
max_size=40m; 
• Define cache key: 
proxy_cache_key $scheme$proxy_host$uri$is_args$args; 
• Get the content into the cache, then check the md5 
$ echo -n "httplocalhost:8002/time.php" | md5sum 
6d91b1ec887b7965d6a926cff19379b4 - 
• Verify it’s there: 
$ cat /tmp/cache/4/9b/6d91b1ec887b7965d6a926cff19379b4
Loading cache from disk 
• Cache metadata stored in shared memory segment 
• Populated at startup from the on-disk cache by the cache loader 
proxy_cache_path path keys_zone=name:size 
[loader_files=number] [loader_threshold=time] [loader_sleep=time]; 
(defaults: loader_files=100, loader_threshold=200ms, loader_sleep=50ms) 
– Loads files in blocks of 100 
– Takes no longer than 200ms 
– Pauses for 50ms, then repeats
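If the defaults load the cache too slowly or too aggressively, the loader can be tuned; the values below are purely illustrative: 
proxy_cache_path /tmp/cache keys_zone=one:10m levels=1:2 
                 loader_files=200 loader_threshold=300ms loader_sleep=50ms; 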
Managing the disk cache 
• The Cache Manager runs periodically, purging files that 
have been inactive (irrespective of cache validity) and deleting 
files in LRU order if the cache grows too big 
proxy_cache_path path keys_zone=name:size 
[inactive=time] [max_size=size]; 
(default: inactive=10m) 
– Remove files that have not been used within 10m 
– Remove files if cache size exceeds max_size
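A concrete sketch with both parameters set; the path, window and size are assumptions to be matched to your disk: 
proxy_cache_path /tmp/cache keys_zone=one:10m levels=1:2 
                 inactive=60m max_size=10g; 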
Purging content from disk 
• Find it and delete it 
– Relatively easy if you know the key 
• NGINX Plus – cache purge capability 
$ curl -X PURGE -D - "http://localhost:8001/*" 
HTTP/1.1 204 No Content 
Server: nginx/1.5.12 
Date: Sat, 03 May 2014 16:33:04 GMT 
Connection: keep-alive 
X-Cache-Key: httplocalhost:8002/*
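A sketch of the NGINX Plus configuration that enables the curl command above; the proxy_cache_purge directive is documented for NGINX Plus, while ports and zone names here are assumptions: 
map $request_method $purge_method { 
    PURGE   1; 
    default 0; 
} 

server { 
    listen 8001; 
    location / { 
        proxy_pass  http://localhost:8002; 
        proxy_cache one; 
        # NGINX Plus only: purge cache entries matching the request's key 
        proxy_cache_purge $purge_method; 
    } 
} 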
CONTROLLING CACHING
Delayed caching 
proxy_cache_min_uses number; 
• Saves on disk writes for very cool (rarely requested) caches 
Cache revalidation 
proxy_cache_revalidate on; 
• Saves on upstream bandwidth and disk writes
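Both directives sit alongside proxy_cache in a location block; a minimal sketch (the threshold of 3 and the upstream address are assumptions): 
location / { 
    proxy_pass  http://localhost:8080; 
    proxy_cache one; 
    # only write an entry to disk after the 3rd request for the same key 
    proxy_cache_min_uses 3; 
    # refresh expired entries with conditional (If-Modified-Since) GETs 
    proxy_cache_revalidate on; 
} 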
Control over cache time 
proxy_cache_valid 200 302 10m; 
proxy_cache_valid 404 1m; 
• Priority is: 
– X-Accel-Expires 
– Cache-Control 
– Expires 
– proxy_cache_valid 
Set-Cookie response header 
means no caching
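If the origin's headers cannot be changed, NGINX can be told to ignore them; a sketch using proxy_ignore_headers (use with care, since it overrides the origin's stated intent, and the values shown are assumptions): 
location / { 
    proxy_pass  http://localhost:8080; 
    proxy_cache one; 
    # ignore origin headers that would otherwise control or prevent caching 
    proxy_ignore_headers Set-Cookie Cache-Control Expires; 
    proxy_cache_valid 200 302 10m; 
} 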
Cache / don’t cache 
proxy_cache_bypass string ...; 
proxy_no_cache string ...; 
• Bypass the cache – go to origin; may cache result 
• No_Cache – if we go to origin, don’t cache result 
proxy_no_cache $cookie_nocache $arg_nocache $http_authorization; 
• Typically used with a complex cache key, and only if the 
origin does not send appropriate cache-control responses
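A sketch combining both directives with a per-user cache key; the cookie and argument names are assumptions: 
location / { 
    proxy_pass  http://localhost:8080; 
    proxy_cache one; 
    proxy_cache_key    $scheme$proxy_host$uri$is_args$args$cookie_userid; 
    # go straight to origin when a 'nocache' cookie or argument is present... 
    proxy_cache_bypass $cookie_nocache $arg_nocache; 
    # ...and never store responses to bypassed or authorized requests 
    proxy_no_cache     $cookie_nocache $arg_nocache $http_authorization; 
} 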
Multiple Caches 
proxy_cache_path /tmp/cache1 keys_zone=one:10m levels=1:2 inactive=60s; 
proxy_cache_path /tmp/cache2 keys_zone=two:2m levels=1:2 inactive=20s; 
• Different cache policies for different tenants 
• Pin caches to specific disks 
• Temp-file considerations – put on same disk!: 
proxy_temp_path path [level1 [level2 [level3]]];
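A sketch of two virtual servers pinned to the different cache zones defined above; hostnames and upstream ports are assumptions: 
server { 
    server_name static.example.com; 
    location / { 
        proxy_pass  http://localhost:8080; 
        proxy_cache one;    # /tmp/cache1, 60s inactivity window 
    } 
} 
server { 
    server_name api.example.com; 
    location / { 
        proxy_pass  http://localhost:8081; 
        proxy_cache two;    # /tmp/cache2, 20s inactivity window 
    } 
} 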
QUICK REVIEW – WHY CACHE?
Why is page speed important? 
• We used to talk about the ‘N second rule’: 
– 10-second rule 
• (Jakob Nielsen, March 1997) 
– 8-second rule 
• (Zona Research, June 2001) 
– 4-second rule 
• (Jupiter Research, June 2006) 
– 3-second rule 
• (PhocusWright, March 2010) 
[Chart: acceptable page-load time in seconds, declining from about 10 seconds in 1997 to about 3 seconds by 2014]
Google changed the rules 
“We want you to be able to get 
from one page to another as 
quickly as you turn the page on 
a book” 
Urs Hölzle, Google
The costs of poor performance 
• Google: search enhancements added 0.5s to page load time 
– Ad CTR dropped 20% 
• Amazon: Artificially increased page load by 100ms 
– Customer revenue dropped 1% 
• Walmart, Yahoo, Shopzilla, Edmunds, Mozilla… 
– All reported similar effects on revenue 
• Google Pagerank – Page Speed affects Page Rank 
– Time to First Byte is what appears to count
NGINX Caching lets you 
Improve end-user performance 
Consolidate and simplify your web infrastructure 
Increase server capacity 
Insulate yourself from server failures
Closing thoughts 
• 38% of the world’s busiest websites use NGINX 
• Check out the blogs on nginx.com 
• Future webinars: nginx.com/webinars 
Try NGINX F/OSS (nginx.org) or NGINX Plus (nginx.com)


Editor's Notes

  • #5: Why cache – three reasons – performance improvements, capacity improvements, and resilience to failures in backends
  • #8: Cool because it is trivial to configure
  • #10: Error: an error occurred while establishing a connection with the server, passing a request to it, or reading the response header; Timeout: a timeout has occurred while establishing a connection with the server, passing a request to it, or reading the response header; invalid_header: a server returned an empty or invalid response; Updating – content is being refreshed and a lock is in place http_500: a server returned a response with the code 500; http_502: a server returned a response with the code 502; http_503: a server returned a response with the code 503; http_504: a server returned a response with the code 504; http_403: a server returned a response with the code 403; http_404: a server returned a response with the code 404; Off: disables passing a request to the next server.
  • #12: Complex. We make it really easy
  • #16: It uses same tech as static content that nginx is renowned for
  • #22: Get smart
  • #31: http://www.strangeloopnetworks.com/assets/images/infographic2.jpg, http://www.thinkwithgoogle.com/articles/the-google-gospel-of-speed-urs-hoelzle.html, http://moz.com/blog/how-website-speed-actually-impacts-search-ranking. What does performance really mean to you? Revenue, ad CTR, employee and partner satisfaction. What devices do your users use? What network conditions are they under?
  • #32: 1. Deliver all content at the speed of nginx 2. Compared to multiple point solutions 3. Cache for one second example 4. proxy_cache_use_stale