\begin{align*} \hat{l}_1 &= l_1 + e_1 \\ \hat{l}_2 &= l_2 + e_2 \\ \operatorname{Var}(\hat{l}_1) &= \operatorname{Var}(e_1) = \sigma^2 \\ \operatorname{Var}(\hat{l}_2) &= \operatorname{Var}(e_2) = \sigma^2 \end{align*}
A better approach is to measure the sum and the difference of the two:
\begin{align*} \hat{m}_1 &= l_1 + l_2 + e_1 \\ \hat{m}_2 &= l_2 - l_1 + e_2 \\ \hat{l}_1 &= \frac{1}{2}(\hat{m}_1 - \hat{m}_2) = l_1 + \frac{1}{2}(e_1 - e_2) \\ \hat{l}_2 &= \frac{1}{2}(\hat{m}_1 + \hat{m}_2) = l_2 + \frac{1}{2}(e_1 + e_2) \\ \operatorname{Var}(\hat{l}_1) &= \frac{1}{4}\left[\operatorname{Var}(e_1) + \operatorname{Var}(e_2)\right] = \frac{1}{2}\sigma^2 \\ \operatorname{Var}(\hat{l}_2) &= \frac{1}{4}\left[\operatorname{Var}(e_1) + \operatorname{Var}(e_2)\right] = \frac{1}{2}\sigma^2 \end{align*}
In the above, the covariance of the two errors is zero, since $e_1$ and $e_2$ arise from independent measurements.
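The variance reduction above can be checked numerically. The sketch below is a minimal Monte Carlo simulation; the true lengths, error scale, and sample count are arbitrary values chosen for illustration.

```python
import random
import statistics

random.seed(0)
l1, l2 = 3.0, 5.0   # true lengths (hypothetical values)
sigma = 0.1         # standard deviation of each measurement error
n = 100_000

direct_l1, combined_l1 = [], []
for _ in range(n):
    e1 = random.gauss(0.0, sigma)
    e2 = random.gauss(0.0, sigma)
    # Direct scheme: measure l1 on its own.
    direct_l1.append(l1 + e1)
    # Combined scheme: measure the sum and the difference,
    # then recover l1 as half their difference.
    m1 = l1 + l2 + e1
    m2 = l2 - l1 + e2
    combined_l1.append(0.5 * (m1 - m2))

var_direct = statistics.variance(direct_l1)      # should be near sigma^2
var_combined = statistics.variance(combined_l1)  # should be near sigma^2 / 2
print(var_direct, var_combined)
```

Both schemes use two measurements per pair of lengths, yet the sample variance of the combined estimate comes out close to half that of the direct one, matching the derivation.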