Hello!
On Wed, Jun 18, 2014 at 1:02 PM, Alex Cline wrote:
> I'm setting up a high-traffic system incorporating openresty and would like
> to be able to cache the values returned from redis for the upstream servers.
> I've been able to successfully implement the dynamic routing illustrated
> here: http://openresty.org/#DynamicRoutingBasedOnRedis
>
> Would what I want to do violate openresty's one coroutine per request model?
> I'd like to be able to cache the requests for a few seconds.
>
You have various choices, depending on your actual use cases:
1. use the shared dictionary provided by ngx_lua:
https://github.com/openresty/lua-nginx-module#ngxshareddict
This cache is shared by all the worker processes, but you need to
serialize complicated Lua data structures yourself if the values are
not simple Lua values. It is very fast for primitive Lua values like
strings, booleans, and numbers.
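A minimal sketch of this approach, assuming a `lua_shared_dict my_cache 10m;` directive in nginx.conf; the dict name `my_cache`, the 5-second expiry, and the `query_redis` helper are illustrative, not from the dynamic-routing example:

```lua
-- requires: lua_shared_dict my_cache 10m;  (in the nginx http {} block)
local cache = ngx.shared.my_cache

local key = "upstream:" .. ngx.var.host
local upstream = cache:get(key)
if not upstream then
    -- cache miss: query redis as in the dynamic-routing example
    upstream = query_redis(key)  -- hypothetical helper
    local ok, err = cache:set(key, upstream, 5)  -- expire after 5 seconds
    if not ok then
        ngx.log(ngx.ERR, "failed to cache upstream: ", err)
    end
end
```

Because the dict only holds strings, booleans, numbers, and nil, anything more complex (a Lua table, say) must go through something like cjson before `set` and after `get`.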
2. use lua-resty-lrucache to do Lua-space caching:
https://github.com/openresty/lua-resty-lrucache#readme
This cache is per-worker. Ideal for caching complicated Lua values. No
serialization is needed.
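A sketch of the per-worker approach; the cache size, the 5-second TTL, and the `query_redis` helper are assumptions for illustration. Note the cache object is created at the Lua module level, so it is reused across requests within one worker:

```lua
-- in your own Lua module, loaded via init_by_lua* or require
local lrucache = require "resty.lrucache"

-- allow up to 200 cached items per worker
local c, err = lrucache.new(200)
if not c then
    error("failed to create the cache: " .. (err or "unknown"))
end

local function get_route(host)
    local route = c:get(host)
    if not route then
        route = query_redis(host)  -- hypothetical helper
        c:set(host, route, 5)      -- expire after 5 seconds
    end
    return route
end
```

Since the values never leave the Lua VM, they can be arbitrary Lua tables with no serialization cost.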
You can also cascade these two cache layers when caching complicated
Lua values. We use both of these caching layers in CloudFlare's Lua CDN
system, for example.
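The cascade might look like the sketch below: check the per-worker lrucache first, fall back to the shared dict, and only then hit redis. The `lua_shared_dict routes 10m;` directive, the `fetch_from_redis` helper, and the TTLs are assumptions:

```lua
local lrucache = require "resty.lrucache"

local l1 = lrucache.new(200)   -- per-worker, no serialization
local l2 = ngx.shared.routes   -- shared across workers (strings here)

local function lookup(key)
    local val = l1:get(key)    -- fastest: worker-local lookup
    if val then
        return val
    end

    val = l2:get(key)          -- next: shared-memory lookup
    if not val then
        val = fetch_from_redis(key)  -- hypothetical helper
        l2:set(key, val, 5)    -- expire after 5 seconds
    end

    l1:set(key, val, 5)        -- repopulate the worker-local layer
    return val
end
```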
For maximal performance, you should enable lua-resty-core (and
LuaJIT 2.1) in your Lua app so that all the Lua caching code can
be fully JIT compiled:
https://github.com/openresty/lua-resty-core#readme
The latest OpenResty release should work well here.
Best regards,
-agentzh