Hello!
On Thu, Jan 15, 2015 at 2:27 AM, Sam Wong wrote:
> I would like to test if proxy_cache is working or the page served is proxy
> cached
>
> how do i echo and append a <!-- proxy_cached --> to the end of the html
> content for files that are served from proxy_cached?
>
Easy.
You can use both header_filter_by_lua and body_filter_by_lua here. In
the header_filter_by_lua hook, do the following (a small sketch
follows the list):
1. Read the value of the builtin nginx variable $upstream_cache_status
[1] from within Lua (via the ngx.var API), so that you know whether
the response is served from the cache or not.
2. If the current response is from the cache, prepare the HTML
comment string there and save it to the ngx.ctx table (as
ngx.ctx.html_cmt_to_append, for instance).
3. Check whether the Content-Length response header is set. If it is,
add the length of your newly generated HTML comment string to the
existing value and update the Content-Length response header
accordingly.
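For example, here is a minimal sketch of the header filter part. The
"my_backend" upstream and "my_cache" zone are just placeholder names
on my side, and for simplicity it only treats the "HIT" status as
cached (you may also want to cover "STALE" and friends):

    location / {
        proxy_pass http://my_backend;
        proxy_cache my_cache;

        header_filter_by_lua '
            -- mark only responses actually served from the proxy cache
            if ngx.var.upstream_cache_status == "HIT" then
                ngx.ctx.html_cmt_to_append = "<!-- proxy_cached -->"

                -- keep Content-Length consistent with the appended bytes
                local len = tonumber(ngx.header.content_length)
                if len then
                    ngx.header.content_length =
                        len + #ngx.ctx.html_cmt_to_append
                end
            end
        ';
    }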
Then in the body_filter_by_lua hook, do the following (again, see the
sketch after the list):
1. Check if ngx.ctx.html_cmt_to_append (or something like that) is
set. If not, then return immediately.
2. Otherwise check if the "eof" flag (ngx.arg[2]) is set. If not, return immediately.
3. Otherwise you're seeing the last data chunk in the current
response, so append ngx.ctx.html_cmt_to_append to the current data
chunk string (ngx.arg[1]).
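Here is a matching sketch of the body filter part, to be put into the
same location block as above:

    body_filter_by_lua '
        local cmt = ngx.ctx.html_cmt_to_append
        -- ngx.arg[1] is the current data chunk, ngx.arg[2] is the "eof" flag
        if cmt and ngx.arg[2] then
            ngx.arg[1] = ngx.arg[1] .. cmt
        end
    ';

Then you can hit the location twice with curl -i and check whether
the second (cached) response ends with the <!-- proxy_cached -->
comment.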
This approach is quite efficient because it fully utilizes the
streaming processing capabilities of nginx's output filter mechanism;
the bookkeeping in the steps above (waiting for the "eof" flag, fixing
up Content-Length) is also the price you pay for that streaming
processing.
Furthermore, it works not only with ngx_proxy but also with any other
nginx upstream module like ngx_fastcgi or ngx_uwsgi (you name it :)).
This is because $upstream_cache_status is shared across all the
upstream modules.
Good luck!
Best regards,
-agentzh
[1] http://nginx.org/en/docs/http/ngx_http_upstream_module.html#var_upstream_cache_status