I have been struggling to understand Lambda memory usage. When I profile my functions locally, the max heap used is ~40 MB. My (unzipped) bundle is ~35 MB, so I would expect my function to use at most ~80 MB. However, I consistently max out a 192 MB container. Is there a way to simulate the memory limit of a function locally, or is there something obvious that I am missing when trying to figure out how much memory to allocate to my functions?
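For context, this is the kind of local cap I had in mind — a minimal sketch using Python's `resource` module (Linux only). Note this limits virtual address space (`RLIMIT_AS`), which is not the same number Lambda enforces (Lambda tracks something closer to resident memory), so it's only a rough approximation; the 192 MB figure is just my container size:

```python
import resource

def cap_memory(limit_bytes: int) -> None:
    """Cap this process's virtual address space via RLIMIT_AS (Linux).

    Roughly approximates a memory limit: allocations past the cap
    raise MemoryError instead of succeeding.
    """
    _soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))

# e.g. cap_memory(192 * 1024 * 1024) before invoking the handler locally
```

Alternatively, running the function in a container with `docker run --memory=192m` seems like it would be closer to the real limit, since cgroups constrain actual memory use rather than address space — but I'm not sure how faithful either approach is to Lambda's accounting.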
Thanks in advance.
Hi, I'm working with a Stan model, and I changed my computation workflow. I didn't add any new data or parameters, but I replaced a simple calculation in a parameter transformation with a considerably more complex one. This caused Stan to use several times more memory; in fact, it now grabs all available memory, starts swapping, and freezes.

Is there a description somewhere of how Stan uses memory, so that I can figure out what specifically causes this?