# Skin weight optimization (matrix form) (2)

### Matrix form (first attempt)

$$\sum\limits_{j \in \mathcal N(i)} (c_{ij} \nabla \mathbf s_i \mathbf S_i) \vec w_i - \sum\limits_{j \in \mathcal N(i)} (c_{ij} \nabla \mathbf s_i \mathbf S_j) \vec w_j = \vec b_i$$

$$\left (\sum\limits_{j \in \mathcal N(i)} c_{ij} \right ) (\nabla \mathbf s_i \mathbf S_i) \vec w_i - \sum\limits_{j \in \mathcal N(i)} (c_{ij} \nabla \mathbf s_i \mathbf S_j) \vec w_j = \vec b_i$$

$$\sum\limits_{j \in \mathcal N(i)} c_{ij} = c_i$$

$$\vec b_i = \nabla \mathbf s_i \sum\limits_{j \in \mathcal N(i)} { \frac{c_{ij}}{2}(\mathbf R_i + \mathbf R_j) ( \mathbf p_i - \mathbf p_j) }$$
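As a concrete reading of the formula for $$\vec b_i$$, here is a minimal sketch. All container names (`neighbors`, `c`, `R`, `p`, `grad_s`) are assumptions for illustration, not from the source:

```python
import numpy as np

# Sketch of evaluating the right-hand side b_i. Assumed data layout:
#   neighbors[i] : list of vertex indices in N(i)
#   c[i][j]      : edge weight c_ij
#   R[i]         : 3x3 rotation estimated at vertex i
#   p[i]         : rest position of vertex i (3-vector)
#   grad_s[i]    : |w_i| x 3 matrix for the gradient term (nabla s_i)
def rhs_b(i, neighbors, c, R, p, grad_s):
    acc = np.zeros(3)
    for j in neighbors[i]:
        # (c_ij / 2) (R_i + R_j) (p_i - p_j)
        acc += 0.5 * c[i][j] * (R[i] + R[j]) @ (p[i] - p[j])
    return grad_s[i] @ acc
```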

Dimensions:

• $$\mathbf A: |\vec w| \times |\vec w|$$
• $$\vec b : |\vec w|$$
• $$\nabla s_i^{|\vec w_i|\times 3} \mathbf S_i^{3 \times|\vec w_i|} = \mathbf M_i^{|\vec w_i| \times |\vec w_i|}$$
• $$\nabla s_i^{|\vec w_i|\times 3} \mathbf S_j^{3 \times|\vec w_j|} = \mathbf M_j^{|\vec w_i| \times |\vec w_j|}$$

$$\mathbf M_i = \begin{bmatrix} T_{i1} \mathbf{p_i} \\ \vdots \\ T_{in} \mathbf{p_i} \\ \end{bmatrix} \left [{ \begin{matrix} T_{i1} \mathbf p_i & \cdots & T_{in} \mathbf{p_i} \\ \end{matrix} } \right ] = \begin{bmatrix} (T_{i1} \mathbf{p_i}).(T_{i1} \mathbf{p_i}) & \cdots & (T_{i1} \mathbf{p_i}) . (T_{in} \mathbf{p_i}) \\ \vdots & \ddots & \vdots \\ (T_{in} \mathbf{p_i}).(T_{i1} \mathbf{p_i}) & \cdots & (T_{in} \mathbf{p_i}) . (T_{in} \mathbf{p_i}) \\ \end{bmatrix}$$

$$\mathbf M_j = \begin{bmatrix} T_{i1} \mathbf{p_i} \\ \vdots \\ T_{in} \mathbf{p_i} \\ \end{bmatrix} \left [{ \begin{matrix} T_{j1} \mathbf p_j & \cdots & T_{jn} \mathbf{p_j} \\ \end{matrix} } \right ] = \begin{bmatrix} (T_{i1} \mathbf{p_i}).(T_{j1} \mathbf{p_j}) & \cdots & (T_{i1} \mathbf{p_i}) . (T_{jn} \mathbf{p_j}) \\ \vdots & \ddots & \vdots \\ (T_{in} \mathbf{p_i}).(T_{j1} \mathbf{p_j}) & \cdots & (T_{in} \mathbf{p_i}) . (T_{jn} \mathbf{p_j}) \\ \end{bmatrix}$$
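Since the rows of $$\nabla \mathbf s_i$$ are the vectors $$T_{ik} \mathbf p_i$$, both blocks are products of two such stacked matrices. A minimal sketch, assuming `T[i]` is an `(n, 3, 3)` array of the linear parts $$T_{ik}$$ and `p[i]` the rest position (hypothetical names):

```python
import numpy as np

def grad_s(i, T, p):
    # n x 3 matrix whose k-th row is the vector T_ik p_i.
    return np.einsum('kab,b->ka', T[i], p[i])

def M(i, j, T, p):
    # M[k, l] = (T_ik p_i) . (T_jl p_j); M(i, i, ...) is the Gram matrix M_i
    # of the text, M(i, j, ...) the cross block M_j.
    return grad_s(i, T, p) @ grad_s(j, T, p).T
```

Note that $$\mathbf M_i$$ built this way is symmetric by construction, while the cross block is not.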

$$\begin{bmatrix} c_0 \mathbf M_0 & \cdots & -c_{0j} \mathbf M_j & \cdots & 0 \\ \vdots & \ddots & \vdots & & \vdots \\ 0 & \cdots & c_i \mathbf M_i & \cdots & -c_{ij} \mathbf M_j \\ \vdots & & \vdots & \ddots & \vdots \\ 0 & \cdots & -c_{vj} \mathbf M_j & \cdots & c_v \mathbf M_v \\ \end{bmatrix} \begin{bmatrix} \vec w_0 \\ \vdots \\ \vec w_i \\ \vdots \\ \vec w_v \\ \end{bmatrix} = \begin{bmatrix} \vec b_0 \\ \vdots \\ \vec b_i \\ \vdots \\ \vec b_v \end{bmatrix}$$
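The block structure above can be assembled directly. A sketch, assuming every vertex has the same number of influences `n` and hypothetical helper callables `M(i, j)` (returning the blocks) and `rhs_b(i)` (returning $$\vec b_i$$); a dense matrix is used for clarity, although in practice a sparse format would exploit the structure:

```python
import numpy as np

def assemble(n_verts, n, neighbors, c, M, rhs_b):
    A = np.zeros((n_verts * n, n_verts * n))
    b = np.zeros(n_verts * n)
    for i in range(n_verts):
        ci = sum(c[i][j] for j in neighbors[i])           # c_i = sum_j c_ij
        A[i*n:(i+1)*n, i*n:(i+1)*n] = ci * M(i, i)        # diagonal block c_i M_i
        for j in neighbors[i]:
            A[i*n:(i+1)*n, j*n:(j+1)*n] = -c[i][j] * M(i, j)  # -c_ij M_j
        b[i*n:(i+1)*n] = rhs_b(i)
    return A, b
```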

The matrix is sparse, but can we prove the system is diagonally dominant and use Jacobi iterations?

$$x^{(k+1)}_i = \frac{1}{a_{ii}} \left(b_i -\sum_{j\ne i}a_{ij}x^{(k)}_j\right) \qquad \text{(Jacobi iteration for one matrix row)}$$
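For reference, the per-row update translates directly into a plain Jacobi solver on a dense system (a sketch; convergence needs a condition such as diagonal dominance, which is exactly what is in question here):

```python
import numpy as np

def jacobi(A, b, iters=100):
    D = np.diag(A)              # diagonal entries a_ii
    R = A - np.diag(D)          # off-diagonal part of A
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = (b - R @ x) / D     # all rows updated from the previous iterate
    return x
```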

Note: experimentally the matrix does not seem to be diagonally dominant (quite a few rows violate the condition given below), nor does it seem to be symmetric.
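One way to run this experimental check: count the rows of a (dense) matrix that violate $$|a_{ii}| \geq \sum_{j\neq i} |a_{ij}|$$, and test symmetry at the same time. A small sketch:

```python
import numpy as np

def non_dominant_rows(A):
    # Number of rows violating |a_ii| >= sum_{j != i} |a_ij|.
    diag = np.abs(np.diag(A))
    off = np.abs(A).sum(axis=1) - diag
    return int(np.count_nonzero(diag < off))

def is_symmetric(A, tol=1e-9):
    return np.allclose(A, A.T, atol=tol)
```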

---------
The following remains to be checked
---------

Here we solve for several unknowns $$x$$ at once, i.e. per vertex we solve for the whole set of skinning weights $$\vec w_i$$:
$$\vec w^{(k+1)}_i = \frac{1}{c_i} \begin{bmatrix} (T_{i1} \mathbf{p_i})^{-2} \\ \vdots\\ (T_{in} \mathbf{p_i})^{-2}\\ \end{bmatrix} \circ \left( \vec b_i + \sum_{j \in \mathcal N(i)} c_{ij} \mathbf M_j \vec w^{(k)}_j - c_i \left( \mathbf M_i - \operatorname{diag}(\mathbf M_i) \right) \vec w^{(k)}_i \right)$$

where $$\circ$$ is the component-wise product, i.e. each component is divided by its diagonal entry $$a_{kk} = c_i (T_{ik} \mathbf p_i)^2$$.
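A sketch of this per-vertex update, dividing by the diagonal entries $$c_i (T_{ik} \mathbf p_i)^2$$ of the block $$c_i \mathbf M_i$$. The helpers `M(i, j)` and `b(i)` and the containers `neighbors`, `c` are hypothetical names for illustration:

```python
import numpy as np

def block_jacobi_step(i, w, neighbors, c, M, b):
    ci = sum(c[i][j] for j in neighbors[i])
    Mi = M(i, i)
    r = np.asarray(b(i), dtype=float).copy()
    for j in neighbors[i]:
        r += c[i][j] * (M(i, j) @ w[j])                    # + c_ij M_j w_j^(k)
    r -= ci * ((Mi - np.diag(np.diag(Mi))) @ w[i])         # block off-diagonal
    return r / (ci * np.diag(Mi))                          # divide by a_kk
```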

For the matrix to be diagonally dominant you need:

$$|a_{ii}| \geq \sum_{j\neq i} |a_{ij}| \quad\text{for all } i$$

This I can try to infer from my formulation of one Jacobi iteration above:

$$\begin{array}{lll} a_{ii} & = & c_i \, (T_{i1} \mathbf{p_i})^2 \\ \sum_{j \ne i} a_{ij} & = & -\sum_{j \in \mathcal N(i)} c_{ij} \left[ (T_{i1} \mathbf{p_i}).(T_{j1} \mathbf{p_j}) + \cdots + (T_{i1} \mathbf{p_i}) . (T_{jn} \mathbf{p_j}) \right] \\ & & -\, c_i \left[ (T_{i1} \mathbf{p_i}).(T_{i2} \mathbf{p_i}) + \cdots + (T_{i1} \mathbf{p_i}) . (T_{in} \mathbf{p_i}) \right] \end{array}$$

Although the above expression is a good starting point, it is wrong: the indices of matrix elements such as $$a_{ii}$$ do not match the indices of $$T_{ij}$$ on the right-hand side (there $$i$$ denotes the $$i$$th vertex and $$j$$ the $$j$$th bone influence, as opposed to the $$i$$th row and $$j$$th column of the whole matrix).