Information on the current system
I currently have two services: Service-A is the producer and Service-B is the consumer. Within Service-B:
- I receive the data object.
- The data object is queued.
- The queue is actively polled by a coroutine running in the background.
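To make the setup concrete, here is a minimal sketch of Service-B's consumer side. I'm using Python asyncio purely for illustration; `process` and `on_receive` are placeholder names, not my real API:

```python
import asyncio

def process(data_object) -> None:
    # Hypothetical processing step; stands in for the real handler.
    print(f"processed {data_object}")

async def on_receive(queue: asyncio.Queue, data_object) -> None:
    # Called when Service-A delivers a data object: it is queued as-is.
    await queue.put(data_object)

async def consumer_loop(queue: asyncio.Queue) -> None:
    # Background coroutine that actively polls the queue and
    # processes one data object at a time.
    while True:
        data_object = await queue.get()
        process(data_object)
        queue.task_done()
```

The key point is that `on_receive` enqueues unconditionally, so nothing bounds the queue yet.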
What is the problem?
As the queue grows, the memory usage of my service (B) grows with it. I would like to find a way to bound this memory usage.
My proposed solution
- A queue size cap/threshold is set to X.
- Once Queue.size == X, send a pause() request to the producer (Service-A exposes this backpressure mechanism).
- Consumption pauses only after the current batch of data objects has been received, so Queue.size can technically grow to X + delta: even after the producer receives a pause() request, it finishes sending the current batch before stopping.
- Queue polling continues as usual.
- Once Queue.size decreases sufficiently, send a resume() request to the producer.
My questions
- What is the broad topic of the problem I'm trying to solve? Is throttling the correct term?
- Am I moving in the right direction in limiting Queue size to approach the problem of memory usage?
- If yes, how do I decide when to resume consumption? Should I resume when Queue.size drops below X/2? X/4? (Where X is the predefined threshold.)
- I have looked into some throttling algorithms; however, in my case discarding data objects is not an option. Are there other preexisting algorithms I should read up on?