Consider the following statement:
var a = 3;
There are actually two steps here, sketched below:
1. Declare a and initialize it to undefined (this part is hoisted to the top of the scope).
2. Assign the value 3 to a (this part runs where the statement is written).
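Roughly speaking, you can picture the engine rewriting var a = 3; like this (a minimal sketch, assuming the statement sits in the global scope):

var a;    // step 1: the declaration is hoisted; at this point a holds undefined
a = 3;    // step 2: the assignment executes on the line where it was written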
This leads to some seemingly "unbelievable" behavior: in JS a variable can be used before it is declared. Java does not allow this:
System.out.println(a);
int a = 1;
This fails to compile. JS, however, accepts the following:
alert(a);
var a;
The output is undefined, but no error is thrown, which shows that a really has been declared and simply holds undefined.
If we write only "alert(a)" with no "var a" anywhere, the JS engine does report an error:
alert(a);
Firefox, for example, reports it as a ReferenceError: a is not defined.
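Here is a small runnable sketch of the difference (the name notDeclared is purely illustrative and appears nowhere else):

var declared;             // hoisted, holds undefined
alert(declared);          // shows "undefined", no error
try {
    alert(notDeclared);   // reading a name that was never declared throws
} catch (e) {
    alert(e);             // something like: ReferenceError: notDeclared is not defined
}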
Although a variable can be used before it is declared, the assignment does not take effect until the line where it actually appears. For example:
alert(a);
var a = 1;
This time the output is still undefined rather than 1: only the declaration is hoisted, while the assignment stays on its original line.
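Roughly, the engine runs the snippet above in this order (again just a sketch of the hoisting behavior):

var a;       // the declaration is hoisted to the top of the scope
alert(a);    // undefined: the assignment below has not run yet
a = 1;       // the assignment only executes at this point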
Another example,
alert('a' in window); // true
var a;
Although the var statement is written after the alert, the engine still processes the declaration first; since a global var becomes a property of window, the output is true.
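A sketch of the equivalent order of execution (assuming the snippet runs in the global scope of a browser):

var a;                   // hoisted: window.a already exists, with the value undefined
alert('a' in window);    // true
alert(window.a);         // undefined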
With this in mind, it is not hard to predict the result of the following code:
if (!("a" in window)) {
    var a = 1;
}
alert(a);

The alert shows undefined: the declaration var a is hoisted, so "a" in window is already true before the if statement runs, the condition is false, and the assignment a = 1 never executes.
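As before, the hoisted form makes the result easy to read off:

var a;                       // hoisted: "a" in window is already true here
if (!("a" in window)) {      // the condition is false
    a = 1;                   // so this assignment never runs
}
alert(a);                    // undefined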