There's an old thread I kicked off asking about streaming the response back to the user while also saving it in cache:
"[openresty-en] clone response into a coroutine or post-process the entire response after ngx.say?"
I haven't implemented the recommended solution, though I do want to if I can ever find the time.
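As I understand it, the suggestion there was roughly to hook the body filter phase so each chunk streams to the client untouched while also being buffered for a cache write at EOF. A minimal sketch of that idea (the cache write itself is just a hypothetical placeholder):

body_filter_by_lua_block {
    -- each chunk passes through to the client unchanged;
    -- we just keep a copy in ngx.ctx as it goes by
    local chunk, eof = ngx.arg[1], ngx.arg[2]
    local buffered = ngx.ctx.buffered
    if not buffered then
        buffered = {}
        ngx.ctx.buffered = buffered
    end
    if chunk ~= "" then
        buffered[#buffered + 1] = chunk
    end
    if eof then
        local body = table.concat(buffered)
        -- store_in_cache(ngx.var.uri, body)  -- hypothetical cache writer
    end
}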
Right now I'm just hooking into
content_by_lua_block {
...
}
and then inside that block I'm loading and executing Lua libraries to process the request.
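Concretely, the content block does little more than this (the module name here is made up for illustration):

content_by_lua_block {
    -- "my_cache.handler" stands in for the real library module
    local handler = require "my_cache.handler"
    handler.serve()
}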
Within the Lua libraries, if I determine the request is a cache miss, I execute the following (what I do with the captured response is sketched below, after the location block):
local response = ngx.location.capture("/proxy" .. self.request.uri, {
    method = self:ngx_method(request.method),
    body = self.request.body,
    ctx = {},
})
which issues a subrequest handled by this internal location in the nginx configuration:
#
# Call the origin server. If ngx.ctx specifies
# if_none_match or if_modified_since, set the
# corresponding If-None-Match or If-Modified-Since
# headers on the request (overriding any existing
# ones).
#
location /proxy {
    internal;
    # strip off the leading /proxy added to
    # trigger the subrequest
    rewrite ^/proxy/+(.*) /$1 break;
    # clear the range header and add in validators
    # if specified by the proxy
    rewrite_by_lua_block {
        ngx.req.clear_header("range")
        if ngx.ctx.if_none_match then
            ngx.req.set_header("If-None-Match", ngx.ctx.if_none_match)
        end
        if ngx.ctx.if_modified_since then
            ngx.req.set_header("If-Modified-Since", ngx.ctx.if_modified_since)
        end
    }
    # always pass back the Date from the response
    proxy_pass_header "date";
    # call the backend
    proxy_pass http://$proxy_host;
}
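Back in Lua, ngx.location.capture returns the subrequest result as a table with status, header, and body fields, so the cache-miss path handles it roughly like this (cache_store is a hypothetical method on my handler object):

if response.status == ngx.HTTP_OK then
    -- store the origin response, then relay it to the client
    self:cache_store(self.request.uri, response.header, response.body)  -- hypothetical
    ngx.header["Content-Type"] = response.header["Content-Type"]
    ngx.print(response.body)
else
    -- pass origin errors straight through to the client
    ngx.status = response.status
    ngx.print(response.body)
end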
The $proxy_host variable is just a map from the incoming host to the corresponding backend host, e.g.,
# $proxy_host is the origin server to pass requests to
# when we have determined the request cannot be fulfilled
# completely out of cache.
map $http_host $proxy_host {
    default                 "localhost";
    test-cache.example.org  "test.example.org";
}
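One more detail: when a cached entry is stale rather than missing, the same location does the revalidation. The ctx table handed to ngx.location.capture becomes ngx.ctx inside the subrequest, which is exactly what the rewrite_by_lua_block above reads. A rough sketch (the cached.* fields are hypothetical names for whatever the cache stored):

local response = ngx.location.capture("/proxy" .. self.request.uri, {
    method = ngx.HTTP_GET,
    ctx = {
        if_none_match     = cached.etag,           -- hypothetical cached-entry fields
        if_modified_since = cached.last_modified,
    },
})
if response.status == ngx.HTTP_NOT_MODIFIED then
    -- origin says the cached body is still good; just refresh its metadata
end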
On Monday, April 10, 2017 at 3:00:10 PM UTC-7, Leandro Moreira wrote:
Cool, what are the hooks for pre caching (request) and post caching (response) in Lua?