|Programmer Puzzle - Python Swallows A Global|
|Written by Mike James|
Page 1 of 2
Here's a teaser that poses a practical problem - one that crops up in everyday Python programming. See if you can work out the answer before looking at the solution, and at the pattern to follow to avoid the problem. The question in this case is: where did the global variable go?
Python executes code in the order that it is encountered and this includes the creation of variables.
For example, if you declare and initialize two variables and try to use one before it has been assigned a value, you will discover that you can't:
As the second print comes before the initialization of j, it throws a runtime error - a NameError.
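A minimal sketch of the idea (the names i and j follow the text; the values are assumptions, and the failing print is wrapped in try/except here so the program runs to completion):

```python
i = 10
print(i)        # fine - i has already been assigned a value

try:
    print(j)    # j has not been assigned a value yet
except NameError as e:
    print("runtime error:", e)

j = 20          # j only comes into existence at this point
```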
Some languages will create a variable and initialize it to some default value such as zero if you try to use a variable before you explicitly initialize it.
In Python assigning a value to a variable is all you need to do to create and initialize that variable.
The second fact you need to know is that if you use a variable within a function before you have initialized it, the Python interpreter will look for a valid global variable to use.
That is, when a variable is encountered within a function, the Python interpreter first looks for a valid local variable and, if it doesn't find one, looks for a global variable of the same name.
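For instance, a small sketch of the lookup rule (the names are assumptions):

```python
x = "global"

def uses_global():
    # no assignment to x inside this function, so the global x is found
    return x

def uses_local():
    x = "local"   # this assignment creates a local x that hides the global
    return x

print(uses_global())   # prints global
print(uses_local())    # prints local
```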
You start off with a nice easy function which does something or other and makes use of a global variable called i:
Where the print is supposed to represent some complicated processing involving the variable i. In this puzzle the code has been reduced to the absolute minimum to display the problem.
If you call the function after declaring and initializing the global variable:
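Put together, with the value 20 from the text (myfunction is an assumed name):

```python
i = 20

def myfunction():
    print(i)   # no local i, so the global i is used

myfunction()   # prints 20
```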
then everything works as you would expect and the print displays the value 20, i.e. the value of the global variable i.
Next you start to work on the function and develop it.
At some point you use, and hence add, a local variable called i, i.e. you assign a value to i.
This is fine because once you define a local variable with the same name as a global variable the local variable hides the global. It might not be good style or good planning but the program should work.
Up to the point where you use the local variable the function should be able to use the global variable just like it used to. The reasoning here is that until you use the local variable it doesn't exist and so the global variable will be used.
So, for example, the following version should work perfectly:
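Something along these lines, with range(10) matching the 0 to 9 output described in the text (myfunction is an assumed name, and the call is wrapped in try/except here so that the error can be displayed without stopping the program):

```python
i = 20

def myfunction():
    print(i)             # supposed to display the global i
    for i in range(10):  # a local i is introduced here
        print(i)

try:
    myfunction()
except UnboundLocalError as e:
    print("runtime error:", e)
```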
The new local variable i in the for loop hides the global variable i but only after its use within the for loop.
However, when you run the program, the print now causes a runtime error - an unbound local variable error (UnboundLocalError)!
If you take out the print, the for loop runs perfectly and displays 0 to 9, so clearly the local variable i is working as you would expect.
The fact that i is undefined at the start of the function makes no sense, because the variable i is more than defined - it is defined twice! Once as a global variable and once as a local variable. At the point the function starts, the global variable is most definitely defined and not hidden by the local variable - so what is going on?
The clue is that there is a basic error of reasoning about when variables are deemed to exist. The reasoning would work in some languages but not in Python.
Turn to the next page when you are ready to find out.