This week we made some major refactors to the NewtonSVM class. These include:

- Cleaning up all raw pointers and using SGVector and SGMatrix instead.
- Using linalg operations instead of working on SGVector and SGMatrix directly.
- Making NewtonSVM iterative.
- Calculating the bias and the weights separately; until now both were packed into a single matrix.
- Using the weights member of LinearMachine instead of a local member, which ensures the model is usable while training is paused (see the sketch after this list).
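Below is a minimal, self-contained sketch of the pattern behind the last two points. The class and method names are hypothetical, not Shogun's actual NewtonSVM/IterativeMachine API, and the update rule is a placeholder gradient step standing in for the real Newton step: the point is that the weights and bias live as separate members of the machine, and each iteration updates them in place, so training can stop after any iteration and the model is still usable.

```cpp
#include <Eigen/Dense>

// Illustrative sketch only -- not Shogun's IterativeMachine/NewtonSVM interface.
class IterativeLinearMachine
{
public:
    explicit IterativeLinearMachine(int dim, int max_iterations = 20)
        : m_w(Eigen::VectorXd::Zero(dim)), m_bias(0.0),
          m_max_iterations(max_iterations), m_current(0)
    {
    }

    // One unit of work: weights and bias are updated in place, so the
    // machine remains usable even if training is paused after this call.
    void iteration(const Eigen::MatrixXd& X, const Eigen::VectorXd& y)
    {
        // Placeholder update (gradient step on squared error), standing in
        // for NewtonSVM's actual Newton update.
        const double n = static_cast<double>(y.size());
        Eigen::VectorXd residual =
            X * m_w + Eigen::VectorXd::Constant(y.size(), m_bias) - y;
        m_w -= 0.01 * X.transpose() * residual / n;
        m_bias -= 0.01 * residual.mean();
        ++m_current;
    }

    void train(const Eigen::MatrixXd& X, const Eigen::VectorXd& y)
    {
        while (m_current < m_max_iterations)
            iteration(X, y);
    }

    double predict(const Eigen::VectorXd& x) const { return x.dot(m_w) + m_bias; }

private:
    Eigen::VectorXd m_w; // weights kept as a member (cf. LinearMachine's weights)
    double m_bias;       // bias computed and stored separately from the weights
    int m_max_iterations;
    int m_current;
};
```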
Next, we worked on implementing the pseudo-inverse in linalg.
Any m x n matrix A can be decomposed as A = U S Vt. If A is a self-adjoint positive semi-definite matrix, a self-adjoint eigensolver can be used to compute S and U from the eigendecomposition A = U S Ut, and the pseudo-inverse is A+ = U * inverse(S) * Ut, where inverse(S) inverts only the non-zero eigenvalues. We already have a symmetric eigensolver in linalg, so we used that here.
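As a sketch of this symmetric case, here is a small function written directly against Eigen (the function name and tolerance are our own, not Shogun's linalg API): the eigendecomposition gives U and S, and only the numerically non-zero eigenvalues are inverted.

```cpp
#include <Eigen/Dense>

// Pseudo-inverse of a self-adjoint positive semi-definite matrix A using its
// eigendecomposition A = U S U^T, so A+ = U S^+ U^T.
Eigen::MatrixXd pinv_self_adjoint(const Eigen::MatrixXd& A, double tol = 1e-12)
{
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> solver(A);
    Eigen::VectorXd s = solver.eigenvalues();   // S
    Eigen::MatrixXd U = solver.eigenvectors();  // U

    // Invert only the eigenvalues that are numerically non-zero.
    Eigen::VectorXd s_inv = Eigen::VectorXd::Zero(s.size());
    for (Eigen::Index i = 0; i < s.size(); ++i)
        if (s(i) > tol)
            s_inv(i) = 1.0 / s(i);

    return U * s_inv.asDiagonal() * U.transpose();
}
```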
For a general m x n matrix, we use the Singular Value Decomposition A = U S Vt to compute the pseudo-inverse as A+ = V * S+ * Ut, where S+ is obtained by transposing S and inverting its non-zero singular values. This needed to be implemented directly using the Eigen backend.
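For the general case, a hedged sketch built directly on Eigen's JacobiSVD (again with a function name and tolerance of our own choosing, not Shogun's actual implementation) looks like this:

```cpp
#include <Eigen/Dense>

// Moore-Penrose pseudo-inverse of a general m x n matrix A via the SVD:
// A = U S V^T  =>  A+ = V S^+ U^T, where S^+ inverts the non-zero
// singular values.
Eigen::MatrixXd pinv_svd(const Eigen::MatrixXd& A, double tol = 1e-12)
{
    Eigen::JacobiSVD<Eigen::MatrixXd> svd(
        A, Eigen::ComputeThinU | Eigen::ComputeThinV);
    Eigen::VectorXd s = svd.singularValues();

    Eigen::VectorXd s_inv = Eigen::VectorXd::Zero(s.size());
    for (Eigen::Index i = 0; i < s.size(); ++i)
        if (s(i) > tol)
            s_inv(i) = 1.0 / s(i);

    // A+ = V * S^+ * U^T
    return svd.matrixV() * s_inv.asDiagonal() * svd.matrixU().transpose();
}
```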
With this, all of the refactoring of NewtonSVM was completed.
We also worked on a systematic way to test all iterative machines. Since they all inherit from IterativeMachine, we can use ctags to find them and apply a common test to each of them, as sketched below.
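One way this could look, sketched with Google Test typed tests; the fake machine types below are placeholders standing in for the real IterativeMachine subclasses, which would be collected from the class hierarchy (e.g. with ctags) and substituted into the type list.

```cpp
#include <gtest/gtest.h>

// Stand-ins for real iterative machines; each real machine would expose a
// train() interface inherited from IterativeMachine.
struct FakeNewtonSVM { bool train() { return true; } };
struct FakePerceptron { bool train() { return true; } };

template <typename T>
class IterativeMachineTest : public ::testing::Test {};

using MachineTypes = ::testing::Types<FakeNewtonSVM, FakePerceptron>;
TYPED_TEST_SUITE(IterativeMachineTest, MachineTypes);

// The shared test body: every iterative machine must at least train
// successfully; the real tests would also cover pausing and resuming.
TYPED_TEST(IterativeMachineTest, trains_successfully)
{
    TypeParam machine;
    EXPECT_TRUE(machine.train());
}
```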