The original KAN paper used the L-BFGS algorithm, which is a second-order method.
Some of the "compare MLP to KAN" papers here (including the "fairer comparison," "Wav-KAN," and "KAN for time series analysis" papers) use the Adam optimizer (a first-order algorithm) to train both the KAN and the MLP. This is arguably unfair to KANs, because Adam is known to perform poorly on non-neural-network types of functions [1].
[1] https://parameterfree.com/2020/12/06/neural-network-maybe-evolved-to-make-adam-the-best-optimizer/
For reader/researcher convenience, it should be noted for each paper whether it uses first-order and/or second-order algorithm(s) for KAN training.
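To illustrate the practical difference between the two training setups, here is a minimal sketch in PyTorch. The tiny `nn.Sequential` model, the toy `sin` regression target, and all hyperparameters are illustrative assumptions, not taken from any of the papers; the point is only the structural difference between an Adam loop and an L-BFGS loop (L-BFGS requires a closure that re-evaluates the loss).

```python
import torch

torch.manual_seed(0)

# Hypothetical tiny model standing in for a KAN or MLP; purely illustrative.
model = torch.nn.Sequential(
    torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
)
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = torch.sin(3 * x)  # toy regression target
loss_fn = torch.nn.MSELoss()

init_loss = loss_fn(model(x), y).item()

# First-order training, as in the comparison papers: plain Adam loop.
adam = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    adam.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    adam.step()

# Second-order-style training, as in the original KAN paper: L-BFGS.
# PyTorch's LBFGS needs a closure that recomputes the loss, because the
# optimizer may evaluate the objective multiple times per step (line search).
lbfgs = torch.optim.LBFGS(model.parameters(), lr=1.0, max_iter=20)

def closure():
    lbfgs.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    return loss

for _ in range(10):
    lbfgs.step(closure)

final_loss = loss_fn(model(x), y).item()
```

On smooth, low-dimensional fitting problems like this toy example, L-BFGS typically converges in far fewer steps than Adam, which is part of why the choice of optimizer can materially change a KAN-vs-MLP comparison.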