Here's my problem: I have an object, e.g.

```python
l = [1, 2, 3]
l.append(l)  # assume this appends a reference to l itself, not a copy of [1, 2, 3]
```
Now, when `l` is deleted, its reference count drops from 2 to 1. The runtime sees that the count is still nonzero and does nothing more: an obvious memory leak.
If I want to avoid memory leaks without explicitly using weak references (as one would in Python), how can I know when to delete the contents of the object first?
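For what it's worth, CPython itself handles exactly this case: plain reference counting is supplemented by a cycle detector (the `gc` module) that periodically finds groups of objects reachable only from each other. A minimal demonstration:

```python
import gc

# A self-referencing list forms a reference cycle: reference counting
# alone can never free it, because the list always holds one reference
# to itself.
l = [1, 2, 3]
l.append(l)

del l  # refcount drops from 2 to 1; the object survives

# The cycle detector finds the unreachable cycle and frees it.
unreachable = gc.collect()
print(unreachable >= 1)  # True: at least the list itself was collected
```

So the question is really: what does an implementation like this have to do under the hood, if I don't want to pay for a full tracing collector?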
The only way I could think of doing this: every time we want to delete a reference, tally up the number of references to `l` contained within `l`, recursively. In this case, that number is 1. Then, when we delete, we check whether the number of remaining references equals the number of references the object contains to itself. We decrement the count from 2 to 1 and find that `l` contains 1 reference to itself. Since 1 = 1, we know to go through `l` and delete its contents first, bringing the count down to 0 and avoiding the leak.
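The tally described above can be sketched like this (Python for brevity; `self_refs` is a hypothetical helper that handles only lists, using an id-based `seen` set to avoid recursing into the same container twice):

```python
def self_refs(container, target=None, seen=None):
    """Count references to `target` reachable from inside `container`."""
    if target is None:
        target = container          # by default, count self-references
    if seen is None:
        seen = set()
    if id(container) in seen:       # already visited this container
        return 0
    seen.add(id(container))
    count = 0
    for item in container:
        if item is target:
            count += 1              # found an internal reference
        if isinstance(item, list):
            count += self_refs(item, target, seen)  # recurse into nested lists
    return count

l = [1, 2, 3]
l.append(l)
print(self_refs(l))  # 1: l contains exactly one reference to itself
```

The deletion rule would then be: if the external reference count minus the internal count is zero, the object is only kept alive by itself and can be torn down.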
The problem I have with this is that for a large array, or one with many dimensions, flattening it and searching for references to the object every time a reference is deleted could be expensive.
In the (C++) project I'm working on, I've already made a custom version of `std::vector` that tracks whether it has been modified since the last time a value was calculated, which could work well here: the self-reference count would only need recomputing after a mutation. For `std::unordered_map`, though, I've just used it as-is, so I'd have to recount the references it contains to itself on every deletion, since the container (and therefore that count) could have changed.
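That caching idea can be mimicked in Python with a hypothetical `TrackedList` that invalidates a cached self-reference count whenever it is mutated (only `append` is overridden here as a sketch; a real version would cover every mutating method):

```python
class TrackedList(list):
    """A list that caches its internal self-reference count and
    invalidates the cache on mutation, mirroring a 'modified since
    last calculation' flag on a custom container."""

    def __init__(self, *args):
        super().__init__(*args)
        self._cached = None         # None means "needs recomputation"

    def append(self, item):
        self._cached = None         # mutation invalidates the cache
        super().append(item)

    def self_ref_count(self):
        if self._cached is None:    # recompute only when stale
            self._cached = self._count(self, set())
        return self._cached

    def _count(self, obj, seen):
        if id(obj) in seen:
            return 0
        seen.add(id(obj))
        n = 0
        for item in obj:
            if item is self:
                n += 1
            if isinstance(item, list):
                n += self._count(item, seen)
        return n

t = TrackedList([1, 2, 3])
t.append(t)
print(t.self_ref_count())  # 1
```

This amortizes the cost: the expensive recursive walk only happens after the container has actually changed.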
The only other things I can think of in the Python example:
- it somehow automatically makes the reference to `l` weak
- it evaluates lazily and is actually not appending a reference (which doesn't really help me)
Another example in Python is how `type`'s type is `type` itself, so `type` would also never get deleted unless there is extra code explicitly dedicated to cleaning it up.
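That self-reference is easy to confirm:

```python
# type is an instance of itself: a reference cycle built into the language.
print(type(type) is type)  # True
```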