Add an optimization algorithm to the paper description #83

Open
thesz opened this issue Aug 20, 2024 · 0 comments

The original KAN paper used the L-BFGS algorithm, which is a second-order method.

Some of the "compare MLP to KAN" papers listed here (including the "fairer comparison," "Wav-KAN," and "KAN for time series analysis" papers) use the Adam optimizer (a first-order algorithm) to train both the KAN and the MLP. This is rather unfair, because Adam is known to perform poorly on non-neural-network types of functions [1].

[1] https://parameterfree.com/2020/12/06/neural-network-maybe-evolved-to-make-adam-the-best-optimizer/

For reader/researcher convenience, each paper should state whether it uses first-order and/or second-order algorithm(s) for KAN training.
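To make the first-order vs. second-order distinction concrete, here is a minimal sketch (my own illustration, not code from any of the papers) that minimizes the Rosenbrock function, a standard non-convex test problem, with both SciPy's L-BFGS-B and a hand-rolled Adam loop. The hyperparameters (learning rate, step count) are assumptions chosen for the example:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic non-convex test function with minimum at (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

x0 = np.array([-1.2, 1.0])

# Second-order (quasi-Newton): L-BFGS approximates curvature from
# recent gradient history.
res = minimize(rosenbrock, x0, jac=rosenbrock_grad, method="L-BFGS-B")

# First-order: plain Adam, using only gradient moment estimates.
def adam(grad, x, lr=0.02, beta1=0.9, beta2=0.999, eps=1e-8, steps=5000):
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1**t)          # bias-corrected first moment
        v_hat = v / (1 - beta2**t)          # bias-corrected second moment
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

x_adam = adam(rosenbrock_grad, x0)
print("L-BFGS:", res.x, "f =", rosenbrock(res.x))
print("Adam:  ", x_adam, "f =", rosenbrock(x_adam))
```

On ill-conditioned objectives like this one, the quasi-Newton method typically reaches the minimum in far fewer function evaluations than the first-order loop, which is the kind of gap that makes optimizer choice material when comparing KANs to MLPs.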
