Hello!
On Thu, Oct 15, 2015 at 6:23 PM, 杨阳 wrote:
> Hi, agentzh
>
BTW, it's not good practice to address me by my nick when sending mails
to the mailing list ;)
> We want to fetch a large file as many small files in parallel in
> ngx_lua, for example, 10 parallel requests for 60 KB each. Can you give
> me some suggestions on how to fetch the 600 KB of data as fast as
> possible? Thanks.
>
Assuming you're not doing file I/O directly in nginx, and that the
backend serving the "files" supports very large numbers of concurrent
connections without sacrificing throughput (much), then using the
"light threads" mechanism provided by ngx_lua should help reduce the
total latency:
https://github.com/openresty/lua-nginx-module#ngxthreadspawn
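A minimal sketch of the "light threads" approach (assuming each 60 KB
piece can be fetched via an internal location, here called /part, which
proxies to your backend; that location name and the part-count are made
up for illustration):

```lua
-- Spawn one light thread per part so the 10 subrequests run
-- concurrently, then wait for all of them and concatenate.
local function fetch_part(i)
    local res = ngx.location.capture("/part/" .. i)
    if res.status ~= 200 then
        return nil, "part " .. i .. " returned status " .. res.status
    end
    return res.body
end

local threads = {}
for i = 1, 10 do
    threads[i] = ngx.thread.spawn(fetch_part, i)
end

local parts = {}
for i = 1, 10 do
    local ok, body, err = ngx.thread.wait(threads[i])
    if not ok or not body then
        ngx.log(ngx.ERR, "failed to fetch part ", i, ": ",
                err or tostring(body))
        return ngx.exit(500)
    end
    parts[i] = body
end

ngx.print(table.concat(parts))
```

For the special case of concurrent subrequests to internal locations,
ngx.location.capture_multi can also issue them in parallel with a single
call; light threads are the more general tool (e.g. when combined with
cosockets talking to the backend directly).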
Regards,
-agentzh