How does an increase in the confidence level lead to an increase in VaR?
Keep in mind that VaR is a measurement of the expected loss of value at a given confidence level, so the absolute value of the loss at the 1% significance level (99% confidence) will always be larger than the absolute value of the loss at the 5% significance level (95% confidence).
For example, if there is a 5% chance of losing at least 5 million, then the 5% VaR is 5 million. The 1% VaR only incorporates the largest 20% of the losses that fall within the 5% tail, so the magnitude of the loss at 1% must be greater than the magnitude of the loss at 5%.
In the parametric form, the VaR cutoff is: return − z × standard deviation (VaR is the absolute value of that loss).
z increases as the confidence level increases (i.e., 1.65 for 95% and 2.33 for 99%), so VaR increases as z increases.
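A minimal sketch of that formula in Python, using only the standard library (the portfolio mean and volatility below are hypothetical numbers chosen for illustration):

```python
from statistics import NormalDist


def parametric_var(mean: float, std: float, confidence: float) -> float:
    """One-period parametric VaR, expressed as a positive loss magnitude."""
    z = NormalDist().inv_cdf(confidence)  # z ≈ 1.645 at 95%, ≈ 2.326 at 99%
    return z * std - mean  # |return - z*std| when the cutoff return is negative


# Hypothetical portfolio: 0.1% expected daily return, 2% daily volatility
var_95 = parametric_var(mean=0.001, std=0.02, confidence=0.95)
var_99 = parametric_var(mean=0.001, std=0.02, confidence=0.99)

# Higher confidence -> larger z -> larger VaR
assert var_99 > var_95
```

Running this shows the 99% VaR exceeding the 95% VaR for the same portfolio, which is exactly the relationship described above.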
Confidence level vs. level of significance… be careful to keep them straight.
As the significance (probability) level goes down, VaR goes up.
As the confidence level goes up, VaR goes up.