diff --git a/AoFAaltoCS.ipynb b/AoFAaltoCS.ipynb index 6bcf3a8..e78b6b9 100644 --- a/AoFAaltoCS.ipynb +++ b/AoFAaltoCS.ipynb @@ -56,7 +56,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 4, "id": "6142febe", "metadata": {}, "outputs": [ @@ -77,7 +77,69 @@ "Eero Hyvönen\n", "Perttu Hämäläinen\n", "Alex Jung\n", - "Juho Kannala\n" + "Juho Kannala\n", + "Petteri Kaski\n", + "Samuel Kaski\n", + "Sándor Kisfaludi-Bak\n", + "Maarit Korpi-Lagg\n", + "Juhi Kulshrestha\n", + "Russell W. F. Lai\n", + "Jouko Lampinen\n", + "Casper Lassenius\n", + "Jaakko Lehtinen\n", + "Janne Lindqvist\n", + "Harri Lähdesmäki\n", + "Lauri Malmi\n", + "Heikki Mannila\n", + "Pekka Marttinen\n", + "Ilkka Niemelä\n", + "Marko Nieminen\n", + "Pekka Orponen\n", + "Alexandru Paler\n", + "Jussi Rintanen\n", + "Juho Rousu\n", + "Jari Saramäki\n", + "Arno Solin\n", + "Jukka Suomela\n", + "Linh Truong\n", + "Jara Joel Olavi Uitto\n", + "Aki Vehtari\n", + "Johanna Viitanen\n", + "Petri Vuorimaa\n", + "Robin Welsch\n", + "Antti Ylä-Jääski\n", + "Bo Zhao\n", + "Mikko Kiviharju\n", + "Tero Ilmari Ojanperä\n", + "Nitin Sawhney\n", + "Talayeh Aledavood\n", + "Lachlan Gunn\n", + "Lassi Haaranen\n", + "Arto Hellas\n", + "Vesa Hirvisalo\n", + "Jaakko Hollmen\n", + "Wilhelmiina Hämäläinen\n", + "Tommi Junttila\n", + "Barbara Esther Keller\n", + "Ari Korhonen\n", + "Sari Kujala\n", + "Jorma Laaksonen\n", + "Riku Linna\n", + "Mika P. Nieminen\n", + "Kerttu Pollari-Malmi\n", + "Risto Sarvas\n", + "Otto Seppälä\n", + "Juha Sorva\n", + "Sanna Suoranta\n", + "Jari-Pekka Vanhanen\n", + "N Asokan\n", + "Jari Collin\n", + "Aristides Gionis\n", + "Petri Myllymäki\n", + "Marko Turpeinen\n", + "Tapio Lokki\n", + "Mikko Sams\n", + "Simo Särkkä\n" ] } ], @@ -116,7 +178,7 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 5, "id": "8c9e21c9", "metadata": {}, "outputs": [ @@ -126,7 +188,7 @@ "Text(0.5, 1.0, 'Distribution of Tax-Payer Money via Research Council of Finland')" ] }, - "execution_count": 9, + "execution_count": 5, "metadata": {}, "output_type": "execute_result" }, diff --git a/index.html b/index.html index 0658cb2..bae69d2 100755 --- a/index.html +++ b/index.html @@ -26,21 +26,32 @@
-New preprint on “Towards Model-Agnostic Federated Learning over Networks” available. click me
+New preprint on “Towards Model-Agnostic Federated Learning over Networks” available. click me
+
-I have prepared a starter kit for master thesis workers here
+I have prepared a starter kit for master's thesis workers here
+
-I have started to share recordinds of our group meetings on my Youtube channel. Playlist
+I have started to share recordings of our group meetings on my YouTube channel. Playlist
+
To capitalize on the information in local datasets and their network structure, we have recently proposed networked exponential families as a novel probabilistic model for big data over networks. Networked exponential families are appealing statistically and computationally. They allow us to adaptively pool local datasets with similar statistical properties as training sets to learn personalized predictions tailored to each local dataset. We can compute these personalized predictions using highly scalable distributed convex optimization methods. These methods are robust against various types of imperfections (statistically and computationally)
-and typically offer a high level of privacy protection.
-Relevant Publications:
+and typically offer a high level of privacy protection. A toy code sketch of this pooling idea follows the publication list below.
+
+Relevant Publications:
+
-A. Jung, “On the Duality Between Network Flows and Network Lasso,” in IEEE Signal Processing Letters, vol. 27, pp. 940-944, 2020, doi: 10.1109/LSP.2020.2998400.
+A. Jung, “On the Duality Between Network Flows and Network Lasso,” in IEEE Signal Processing Letters, vol. 27, pp. 940-944, 2020, doi: 10.1109/LSP.2020.2998400.
+
-A. Jung, “Networked Exponential Families for Big Data Over Networks,” in IEEE Access, vol. 8, pp. 202897-202909, 2020, doi: 10.1109/ACCESS.2020.3033817.
+A. Jung, “Networked Exponential Families for Big Data Over Networks,” in IEEE Access, vol. 8, pp. 202897-202909, 2020, doi: 10.1109/ACCESS.2020.3033817.
+
-A. Jung, A. O. Hero, III, A. C. Mara, S. Jahromi, A. Heimowitz and Y. C. Eldar, “Semi-Supervised Learning in Network-Structured Data via Total Variation Minimization,” in IEEE Transactions on Signal Processing, vol. 67, no. 24, pp. 6256-6269, Dec., 2019, doi: 10.1109/TSP.2019.2953593.
+A. Jung, A. O. Hero, III, A. C. Mara, S. Jahromi, A. Heimowitz and Y. C. Eldar, “Semi-Supervised Learning in Network-Structured Data via Total Variation Minimization,” in IEEE Transactions on Signal Processing, vol. 67, no. 24, pp. 6256-6269, Dec. 2019, doi: 10.1109/TSP.2019.2953593.
+
-A. Jung and N. Tran, “Localized Linear Regression in Networked Data,” in IEEE Signal Processing Letters, vol. 26, no. 7, pp. 1090-1094, July 2019, doi: 10.1109/LSP.2019.2918933.
+A. Jung and N. Tran, “Localized Linear Regression in Networked Data,” in IEEE Signal Processing Letters, vol. 26, no. 7, pp. 1090-1094, July 2019, doi: 10.1109/LSP.2019.2918933.
+
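To make the adaptive pooling described above concrete, here is a minimal sketch, assuming only NumPy: a network-Lasso-style objective on a toy chain graph, minimized with a plain subgradient method. The graph, cluster structure, regularization strength `lam`, and step size are illustrative choices of mine, not the setup used in the publications above.

```python
# Toy network-Lasso-style sketch (illustrative, not the papers' algorithm):
# minimize sum_i ||X_i w_i - y_i||^2 + lam * sum_{(i,j) in E} ||w_i - w_j||
# over per-node weight vectors w_i on a chain graph with two clusters.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_feat, m = 10, 3, 5                      # nodes, features, samples/node
edges = [(i, i + 1) for i in range(n_nodes - 1)]   # chain graph
# nodes 0-4 share one true weight vector, nodes 5-9 another
w_true = np.where(np.arange(n_nodes)[:, None] < 5, 1.0, -1.0) * np.ones((n_nodes, n_feat))
X = rng.normal(size=(n_nodes, m, n_feat))
y = np.einsum("nmf,nf->nm", X, w_true) + 0.1 * rng.normal(size=(n_nodes, m))

w = np.zeros((n_nodes, n_feat))
lam, step = 2.0, 0.01
for _ in range(2000):
    g = np.zeros_like(w)
    for i in range(n_nodes):                       # gradients of the local losses
        g[i] = 2 * X[i].T @ (X[i] @ w[i] - y[i])
    for i, j in edges:                             # subgradient of the TV penalty
        d = w[i] - w[j]
        nrm = np.linalg.norm(d)
        if nrm > 1e-12:
            g[i] += lam * d / nrm
            g[j] -= lam * d / nrm
    w -= step * g

print(np.round(w, 2))  # rows 0-4 cluster near +1, rows 5-9 near -1
```

The total-variation penalty pulls the weight vectors of neighboring nodes together, so each node effectively trains on the pooled data of its cluster; only local gradients and messages along edges are needed, which is what makes scalable distributed solvers possible.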
-predictions and user summaries. Thus, our method is model agnostic and can be used to compute explanations for different machine learning methods.
-Relevant Publications:
+predictions and user summaries. Thus, our method is model-agnostic and can be used to compute explanations for different machine learning methods. A toy code sketch of this idea follows the publication list below.
+
+Relevant Publications:
+
-A. Jung, “Explainable Empirical Risk Minimization”, arXiv eprint, 2020. weblink
+A. Jung, “Explainable Empirical Risk Minimization,” arXiv e-print, 2020. weblink
+
-A. Jung and P. H. J. Nardelli, “An Information-Theoretic Approach to Personalized Explainable Machine Learning,” in IEEE Signal Processing Letters, vol. 27, pp. 825-829, 2020, doi: 10.1109/LSP.2020.2993176.
+A. Jung and P. H. J. Nardelli, “An Information-Theoretic Approach to Personalized Explainable Machine Learning,” in IEEE Signal Processing Letters, vol. 27, pp. 825-829, 2020, doi: 10.1109/LSP.2020.2993176.
+
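The following minimal sketch illustrates the flavor of explainability-regularized learning under assumptions of mine: the projection-based penalty and the one-feature "user summary" U below are simplifications I chose for illustration, not the exact information-theoretic objective of the publications above.

```python
# Hypothetical sketch of explainability-regularized ERM (my simplification):
# min_w ||Xw - y||^2 + alpha*||w||^2 + rho*||(I - P_U) X w||^2,
# where P_U projects onto the span of the user summary U, so the last term
# penalizes the part of the predictions that U cannot explain.
import numpy as np

rng = np.random.default_rng(1)
n, d = 200, 5
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
U = X[:, :1]                      # user summary: the one feature a user inspects

alpha, rho = 0.1, 5.0
P = U @ np.linalg.pinv(U)         # (n, n) projection onto span(U)
M = X.T @ X + alpha * np.eye(d) + rho * X.T @ (np.eye(n) - P) @ X
w = np.linalg.solve(M, X.T @ y)   # closed-form minimizer of the objective
print(np.round(w, 3))             # larger rho shrinks w toward predictions explainable via U
```

Raising rho trades prediction accuracy against how well a user could anticipate the predictions from the summary they understand, which is the trade-off that the personalized-explainability formulation makes precise.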
Right from my start at Aalto in 2015, I have been responsible for the main machine
learning courses at Aalto University. Within three years, I re-designed the spearhead course Machine Learning: Basic Principles (MLBP).
This re-design was based on a careful analysis of feedback received from several thousand students. I have also started to
-prepare response letters to the student feedback, as it is customary in the
+prepare response letters to the student feedback, as is customary in the
review process of scientific journals. My final edition of MLBP in 2018 achieved the best student rating since the course was
-established at Aalto. These efforts were also recognized with the Teacher of the Year
-award, which I received in 2018 from the Department of Computer Science at Aalto University.
Machine learning methods are being popularized in virtually every field of science and technology. As a result, machine learning courses attract students from different study programs. Thus, a key challenge in teaching basic machine learning courses is the heterogeneity of student backgrounds. To cope with this challenge, I have developed a new teaching concept for machine learning. This teaching concept revolves around three main components of machine learning: data, models, and loss functions. By decomposing every machine learning method into specific design choices for data representation, model, and loss function, students learn to navigate the vast landscape of machine learning
-methods and applications. The three-component picture of machine learning is the main subject of my textbook Machine Learning: The Basics.
-
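A minimal code sketch of this decomposition: the split into data, model, and loss is the textbook's framing, while the specific dataset, hypothesis space, and gradient-descent fit below are illustrative choices of mine.

```python
# The three components of a machine learning method, made explicit in code.
import numpy as np

# component 1: data -- feature vectors and labels
X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.5], [4.0, 3.0]])
y = np.array([3.5, 2.0, 5.0, 7.5])

# component 2: model -- here, the space of linear hypotheses h(x) = x @ w
def predict(w, X):
    return X @ w

# component 3: loss -- here, average squared error
def loss(w, X, y):
    return np.mean((predict(w, X) - y) ** 2)

# fixing all three choices yields a method (linear regression),
# fit by gradient descent on the resulting empirical risk
w = np.zeros(2)
for _ in range(500):
    w -= 0.05 * (2 / len(y)) * X.T @ (predict(w, X) - y)
print(np.round(w, 2), round(loss(w, X, y), 4))
```

Swapping any one component, for instance the squared error for the logistic loss, or the linear model for a decision tree, yields a different method, which is exactly the navigation aid the three-component picture is meant to provide.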