The problem was isolated to supposedly local variables leaking out into the global context.
You can see this if you try the following program:
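The original listing is missing from this copy. A minimal reconstruction consistent with the alerts described below might be the following, with the globals initialized to 100 as an assumption so you can see which ones change, and console.log standing in for alert so it runs under Node:

```javascript
// Assumed setup: three globals deliberately initialized to 100.
var a = 100, b = 100, c = 100;

function test() {
    var a = b = c = 0;                   // only a is declared local
    console.log(a + "," + b + "," + c);  // first alert: 0,0,0
}

test();
console.log(a + "," + b + "," + c);      // second alert: 100,0,0
```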
When the function executes, the first alert displays 0,0,0, which proves that the variables exist and have been initialized. However, the second alert displays 100,0,0, which means that the global variable a is untouched but the global variables b and c have been set to zero.
With this information you should now be able to see what is wrong.
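The snippet referred to next is also missing from this copy; given the text that follows, it was presumably something along these lines (console.log standing in for the original alert so it runs under Node):

```javascript
var a;
console.log(a = 10);  // an assignment is an expression whose value is the assigned value
```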
You will see 10 displayed, as this is the result of the assignment.
Now consider the evaluation of:

var a = b = c = 0;
The first part of the instruction is the var statement, which declares a to be a local variable. Next we have the initialization. When you write:

var a = expression;
the order of events is that a is declared, the expression is evaluated and the value is assigned to a. In this case the expression to evaluate is b = c = 0, and at this point things go wrong. The b variable isn't declared as local, so it is either identified with an existing global variable b or a new global variable b is created; the same happens for c. Finally the 0 is evaluated and stored in c, which provides the result to store in b, i.e. zero, and finally this is also stored in a.
All three variables end up being set to zero, but b and c have been used as global variables.
The root of the error is thinking of var as an operator with a higher precedence than assignment. If that were the case, var would first be applied to create three local variables and then the assignments would be evaluated. However, var is not an operator at all; it is a statement, and it evaluates the initializing expression before completing, declaring only one variable as local. That is:

var a = b = c = 0;

is equivalent to:

var a = (b = c = 0);

and the expression evaluated is b = c = 0.
You might argue that:

var a = b = c = 0;

should be interpreted to mean:

var a = 0; var b = 0; var c = 0;
OK, if this is the case, what about the programmer who wants to write:

var a = b = c + 1;

Does this mean:

var a = b; var b = c + 1; var c + 1;
No; clearly this cannot be handled as simple syntactic sugar.
Even if you throw in some semantics and say that you don't attempt to declare something that is an expression, i.e. the c + 1, you still have a potential problem with:

var a = b + 1, b = 0;

unless you interpret it as:

var b = 0, a = b + 1;

and this creates both b and a as local variables, with b set to zero and a set to 1. However, taken in the order written:

var a = b + 1, b = 0;

doesn't work, because b isn't defined when a is initialized.
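For the record, in actual JavaScript this doesn't raise an error: the var declarations are hoisted, so b exists but still holds undefined when a is initialized, and a ends up as NaN (a quick check of mine, run under Node):

```javascript
function demo() {
    var a = b + 1, b = 0;   // b is hoisted, but its value is still undefined here
    return [a, b];
}
console.log(demo());        // [ NaN, 0 ]
```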
Finally notice that:

var a, b, c = a = b = 0;

does work and creates three local variables initialized to zero.
Can you figure out why?
(answer at the end)
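The statement under discussion appears to be var a, b, c = a = b = 0; (reconstructed from the answer at the end), and you can verify its behaviour before reading on (console.log standing in for alert, run under Node):

```javascript
function demo() {
    var a, b, c = a = b = 0;   // why does nothing leak here?
    return [a, b, c];
}
console.log(demo());                        // [ 0, 0, 0 ]
console.log(typeof a, typeof b, typeof c);  // undefined undefined undefined - no leaks
```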
So how to avoid the problem?
The answer is not to tidy code to the point where its meaning becomes a matter of interpretation. In other words, if the result depends on what you expect of a little-used construct, don't use it.
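In practice that means writing the declarations out explicitly; either form below keeps all three variables local (my suggestion, not a listing from the original article):

```javascript
function init() {
    var a = 0;                // one declaration per variable, or...
    var b = 0;
    var c = 0;
    var x = 0, y = 0, z = 0;  // ...a single comma-separated var statement
    return [a, b, c, x, y, z];
}
console.log(init());  // [ 0, 0, 0, 0, 0, 0 ] - and nothing leaks to global scope
```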
Answer to final puzzle:

The reason that:

var a, b, c = a = b = 0;

works is that a and b are declared as local variables without initialization. Then c is declared as a local variable and set to the result of the expression, which also sets a and b to zero. That is, the statement is equivalent to:

var a; var b; var c = (a = b = 0);