
I have a bunch of functions stored in ~/.bash_functions, which is sourced by ~/.bashrc on shell startup. The file exports all the functions like so:

# Find functions in this script based on a grep search, and export them.
grep ^'[[:alnum:]]' ~/.bash_functions |
  grep '()' |
  cut -d'(' -f1 | 
  while read function
do
  export -f "$function"
done
unset function

This works perfectly fine on a local shell, but not via SSH. None of the functions are actually exported (checked using declare -F). However, if I put echo "$function" into the loop, it prints all the function names, so I know the only part of the loop not working is the export line.

The functions get exported properly if I use export -f in the SSH session, or if I add an export -f line for each individual function in the file.

I'm using Ubuntu 14.04 with Bash 4.3.11, and the SSH client is Termux on Android.

Edit: Even if I add declare -F at the bottom of ~/.bash_functions, the functions show as not exported.

Edit: I just realized that in a local session, some of my functions aren't exported, seemingly at random, but I can't find any evidence of an error. I am doing more research...
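For reference, this is roughly the kind of check I'm doing (a minimal sketch; myfunc is just a placeholder name, not one of my actual functions):

# List all defined functions with their attributes; exported ones carry the
# "x" flag, e.g. "declare -fx myfunc" vs. "declare -f myfunc" if not exported.
declare -F

# A child shell only sees exported functions, so this is another way to check.
# (A non-interactive, non-login bash does not read ~/.bashrc, so anything it
# lists must have come through the environment.)
bash -c 'declare -F'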


1 Answer


Because the while loop is at the end of a pipeline, it runs in a subshell, so export -f only marks the functions inside that subshell; the parent shell never sees the export. If you log out and log back in, you will see that a local session is affected too (not just SSH). This can be fixed by capturing the list in a variable and switching to a for loop:

# Find functions in this script based on a grep search, and export them.
functions="$( grep ^'[[:alnum:]]' ~/.bash_functions |
  grep '()' |
  cut -d'(' -f1 
)"

for function in $functions; do
  export -f "$function"
done
unset -v function functions # The -v flag ensures this only unsets the variables, never a function of the same name.
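
If you would rather keep the while loop, feeding it with process substitution (a Bash feature) also avoids the subshell, so export -f takes effect in the current shell. A sketch of that variant:

# Same search as above, but the loop now runs in the current shell because
# its input comes from process substitution instead of a pipe.
while read -r function; do
  export -f "$function"
done < <(grep ^'[[:alnum:]]' ~/.bash_functions | grep '()' | cut -d'(' -f1)
unset -v function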