How much memory might a worker need for a single query? And are requests queued when memory is scarce?
No, requests are not queued; they fail if memory is exhausted. However, the daemon adapts automatically when memory runs low:
- It buffers less. In general, everything is streamed, so memory usage stays low.
- For joins performed in ODAS, the build side (after filtering) must fit in memory. Allocate enough memory on the worker node to accommodate the smaller table; a sample calculation is provided in the second link below.
Refer to https://docs.okera.com/cluster-sizing#cluster-sizing-guide for high-level information.
The worker node, which does the heavy lifting, consumes memory and resources based on data shape and join processing. https://docs.okera.com/cluster-sizing#worker-node-worker provides additional detail on the worker's memory needs.
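To make the build-side sizing concrete, here is a minimal back-of-the-envelope sketch. It is not Okera's published formula (see the sample calculation in the linked docs for that); the `overhead_factor` and the example table sizes are assumptions for illustration only.

```python
def estimate_build_side_memory(rows, avg_row_bytes, selectivity=1.0,
                               overhead_factor=1.5):
    """Rough estimate (in bytes) of the memory needed to hold the
    post-filter build side of a hash join on a worker.

    rows            -- row count of the smaller (build) table
    avg_row_bytes   -- average serialized row size in bytes
    selectivity     -- fraction of rows surviving filters (0..1)
    overhead_factor -- hash-table/bookkeeping overhead (an assumed
                       multiplier, not an Okera-published constant)
    """
    return int(rows * selectivity * avg_row_bytes * overhead_factor)

# Hypothetical example: a 50M-row dimension table with 200-byte rows,
# where filters keep 10% of the rows.
needed = estimate_build_side_memory(50_000_000, 200, selectivity=0.1)
print(f"{needed / 1024**3:.2f} GiB")  # ~1.40 GiB for the build side
```

The key takeaway is that filters applied before the join shrink the build side, so memory is driven by the post-filter size of the smaller table, not its raw size.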