I think this question might also need some clarity regarding context. At its core OpenResty -is- Nginx, so if none of your connections require a *_by_lua directive, the 1GB limit seems like a red herring. At that point the question is essentially, 'how many concurrent connections can Nginx handle?'.
If the question is about content served or altered by a *_by_lua directive, then the answer will likely depend on how much memory each request needs in the Lua space to serve or alter that content.
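To make the distinction concrete, here is a minimal, hypothetical pair of locations (the paths and response body are made up for illustration): the first is served entirely by the Nginx core and never touches Lua, while the second runs a content_by_lua_block handler and therefore allocates from the per-worker LuaJIT memory space for the duration of the request.

    # Hypothetical nginx.conf fragment; paths and values are examples only.
    http {
        server {
            listen 80;

            # Served by the Nginx core alone: no *_by_lua directive,
            # so the 1GB-per-worker LuaJIT pool is never involved.
            location /static/ {
                root /var/www/html;
            }

            # Handled in Lua: each request briefly allocates from the
            # worker's LuaJIT memory space while the handler runs.
            location /api/hello {
                content_by_lua_block {
                    ngx.say("hello, ", ngx.var.remote_addr)
                }
            }
        }
    }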
alhakeem
Hello, does Brian Akins have the configuration for the 1M+ concurrent connections for the OpenResty instance? Cheers, Abdul Hakeem
uppal.samar
All my HTTP requests undergo some minor processing in Lua and return a result to the client, so it is basically a Lua API. In that case, according to what you say, the number of concurrent connections is limited by the 1GB allocated to the Lua space. That seems like a restrictive design. Shouldn't it be flexible, so that we can raise that value to increase concurrency? I wonder, does node.js have such a constraint as well?
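A practical way to see whether the 1GB LuaJIT pool is anywhere near being a limit is to expose the worker's current Lua memory usage. The sketch below is only an illustration: the /lua-mem path and the output format are made up, but collectgarbage("count") is the standard Lua call that reports Lua-managed memory in KB for the current worker.

    # Hypothetical status endpoint for checking per-worker Lua memory usage.
    location /lua-mem {
        content_by_lua_block {
            -- collectgarbage("count") returns the Lua-managed memory (in KB)
            -- of the worker process serving this request.
            local kb = collectgarbage("count")
            ngx.say(string.format("worker pid %d: %.1f MB of Lua memory in use",
                                  ngx.worker.pid(), kb / 1024))
        }
    }

In practice each request only holds a small amount of Lua memory while its handler runs, so the number reported here typically stays far below 1GB even under heavy load.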
On Tue, Jul 21, 2015 at 8:57 AM, Brian Akins
<br...@akins.org> wrote:
The 1GB limit is per worker (there are some workarounds to this, but it's usually not an issue). You generally run multiple workers per nginx instance. I've personally seen an OpenResty instance handle 1 million-plus concurrent connections on a single machine.
On Tue, Jul 21, 2015 at 6:49 AM, Samarjit Uppal
<uppal...@gmail.com> wrote:
Hi,
How many concurrent connections can OpenResty handle at most? I know it's limited by the 1GB memory pool allocated to the Lua process. So does that mean the number of concurrent connections depends on the size of each individual HTTP request? Isn't 1GB too small an allocation? If this 1GB could be raised, would OpenResty be able to handle much higher concurrency?
Thanks.
Samar
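As Brian notes above, the per-worker 1GB Lua limit is usually sidestepped simply by running several workers, and the overall connection ceiling is governed by the nginx worker and connection settings plus OS file-descriptor limits rather than by Lua. A rough sketch of the relevant knobs is below; the numbers are placeholders, not the configuration Brian used and not a tuning recommendation.

    # Hypothetical top-level nginx.conf fragment for high connection counts.
    worker_processes auto;           # one worker per CPU core; each worker has its own LuaJIT space
    worker_rlimit_nofile 1048576;    # raise the per-process file-descriptor limit

    events {
        worker_connections 1048576;  # maximum concurrent connections per worker
    }

The operating system's own limits (such as the system-wide file-descriptor limit) also need to be raised to actually reach connection counts in the millions.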
doujiang24
We don't have to worry about the 1GB limit; usually a single request needs only a few KB of memory in the LuaJIT space.
I recently tested OpenResty handling WebSocket requests: one nginx worker could handle 100K concurrent connections, with only 400M - 600M of memory used in the LuaJIT space [1].
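For reference, a long-lived WebSocket connection in OpenResty is typically handled with the lua-resty-websocket library. The sketch below is a minimal echo handler, not the setup used in the test above; the /ws path and the timeout values are arbitrary.

    location /ws {
        content_by_lua_block {
            -- Minimal WebSocket echo handler based on lua-resty-websocket.
            local server = require "resty.websocket.server"

            local wb, err = server:new{
                timeout = 60000,           -- ms to wait for each frame
                max_payload_len = 65535,
            }
            if not wb then
                ngx.log(ngx.ERR, "failed to create websocket server: ", err)
                return ngx.exit(444)
            end

            while true do
                local data, typ, err = wb:recv_frame()
                if not data then
                    -- timeout or client went away; close our side and stop
                    wb:send_close()
                    return
                elseif typ == "close" then
                    wb:send_close()
                    return
                elseif typ == "ping" then
                    wb:send_pong()
                elseif typ == "text" then
                    -- echo the frame back to the client
                    wb:send_text(data)
                end
            end
        }
    }

Each open connection keeps only a small Lua state alive (the wb object and a few locals), which is consistent with the 400M - 600M figure for 100K concurrent connections.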