April 25, 2022
This is a reading note on:
An Introduction to the Conjugate Gradient Method Without the Agonizing Pain
It is definitely the best paper for understanding the steepest descent and conjugate gradient methods, and it saved me a lot of time, thank god.
And it was written in 1994, pretty cool, right? 🧐
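The method the paper builds up to can be sketched in a few lines. This is a minimal, illustrative implementation of conjugate gradient for $Ax=b$ with $A$ symmetric positive definite (the paper's setting); the function name and tolerance are my own choices, not from the paper.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive definite A. Illustrative sketch."""
    x = np.zeros_like(b)
    r = b - A @ x               # residual
    d = r.copy()                # first search direction = residual
    rs_old = r @ r
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)      # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d  # A-conjugate direction update
        rs_old = rs_new
    return x
```

Unlike steepest descent, each new direction is made $A$-conjugate to the previous ones, which is why CG converges in at most $n$ steps in exact arithmetic.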
...
April 21, 2022
The least squares problem solves the following minimization:
$$
\operatorname{argmin}_x \|Ax-y\|_2^2
$$
It can be solved in closed form via the normal equations:
$$
\hat{x} = (A^TA)^{-1}A^Ty
$$
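A quick sketch of the closed-form solution in NumPy, with randomly generated data for illustration; in practice `np.linalg.lstsq` is preferred over explicitly forming $A^TA$, for numerical stability.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))   # tall matrix, full column rank
y = rng.standard_normal(10)

# Closed form: x_hat = (A^T A)^{-1} A^T y, computed via a linear solve
x_hat = np.linalg.solve(A.T @ A, A.T @ y)

# Reference solution from NumPy's least squares routine
x_ref, *_ = np.linalg.lstsq(A, y, rcond=None)
```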
...
December 25, 2020
For 2D total variation denoising, we have the following objective, where $x$ is the clean image we want to optimize, $y$ is the original noisy image, and $D_r$ and $D_c$ are doubly block circulant matrices representing two 2D convolutions.
In detail, both of $x$ and $y$ are vectorized into 1D vectors, and the total variation terms are expressed as two convolutions in Matrix-vector form.
$$
\operatorname{minimize} \enspace \frac{1}{2} \|x-y\|_{2}^{2} + \lambda\|D_r x\|_{1} + \lambda\|D_c x\|_{1}
$$
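The setup above can be made concrete on a tiny image. This sketch builds circular first-difference operators as explicit matrices via Kronecker products (one illustrative choice of $D_r$ and $D_c$; the helper names and the specific difference stencil are my own) and evaluates the objective on a vectorized image.

```python
import numpy as np

def diff_matrix(n):
    """Circular 1D first-difference matrix of size n x n (rows sum to zero)."""
    return -np.eye(n) + np.roll(np.eye(n), -1, axis=1)

def tv_objective(x, y, Dr, Dc, lam):
    """Evaluate 0.5 * ||x - y||_2^2 + lam * ||Dr x||_1 + lam * ||Dc x||_1."""
    return (0.5 * np.sum((x - y) ** 2)
            + lam * np.sum(np.abs(Dr @ x))
            + lam * np.sum(np.abs(Dc @ x)))

n = 4                        # a 4x4 image, vectorized to length 16
I = np.eye(n)
D1 = diff_matrix(n)
Dr = np.kron(I, D1)          # differences along one image axis
Dc = np.kron(D1, I)          # differences along the other axis
```

Because each row of the circular difference matrix sums to zero, a constant image has zero total variation, which is a useful sanity check on the operators.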
...