BASH Scripting
One of the few courses remaining on my Bachelor’s degree sees me doing “UNIX System Administration”, which seems to primarily involve writing shell scripts in BASH on Linux hosts. I’ve dug into the first assignment as I’m pretty confident in the material, and got most of the boxes ticked, but one of the criteria for the assignment is “no wasted resources”, which got me looking at memory and…
… ugh, I have a memory leak in my script. Or do I? After cutting bits out of the script with judicious use of the return builtin, I honestly couldn't see where the heck the leak could be… I'm not doing anything wrong. I tried forcibly declaring a couple of variables local, even though that should be implied, and still nothing.
It all came down to this little test case:
#! /bin/bash
declare -a test=({a..e})

test_func() {
    echo "${!test[@]}" # leaks memory?
}

while true
do
    test_func
done
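If you want to watch it happen, one easy way (assuming the test case is saved as leaktest.sh, a name I'm making up here, and you're on a Linux host with watch available) is to background the script and keep an eye on its resident set size:

# leaktest.sh is just a stand-in name for the test case above
./leaktest.sh &
watch -n1 "ps -o rss= -p $!"

The RSS figure (in kilobytes) climbs steadily while the leak is present, and stays flat when it isn't.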
The idea is to expand the array into a list of its indexes. Under certain circumstances, such as the one above, that expansion seems to leak memory. Certain things mitigate it: if there's a sleep anywhere in the function, for instance, the memory gets cleaned up again.
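For reference (this isn't part of the assignment script, just a demo with made-up array names), "${!array[@]}" expands to an array's indexes rather than its values, which is mostly interesting when the array is sparse:

# dense array: the indexes are simply 0..4
declare -a dense=({a..e})
echo "${!dense[@]}"   # 0 1 2 3 4

# sparse array: indexes and values no longer line up trivially
declare -a sparse=([0]=a [3]=d [7]=h)
echo "${!sparse[@]}"  # 0 3 7
echo "${sparse[@]}"   # a d h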
Why was I using that? Well, I'd originally thought the array would be sparse, but it turned out it wasn't, so I replaced the expansion with a C-style for loop to provide the indexes, and the memory leak was gone. But was this a bug in BASH, or simply a gap in my understanding of it?
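For what it's worth, the replacement looked roughly like this. It's a minimal sketch rather than my actual assignment code, reusing the test array from the snippet above and assuming the array really is dense, so the indexes are simply 0 through length minus one:

test_func() {
    local i
    # walk indexes 0..N-1 arithmetically instead of expanding "${!test[@]}"
    for ((i = 0; i < ${#test[@]}; i++)); do
        echo "$i"
    done
}

The output differs slightly from the original (one index per line instead of a single word list), but the approach is the same: generate the indexes with arithmetic rather than asking BASH to expand them.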
It may be related to, or part of, another memory leak in arrays, which doesn't seem to have been patched. However, when I tried the test case on my FreeBSD machine with BASH 4.4.12, the leak wasn't present, so my care factor went out the window.