diff --git a/AoFAaltoCS.ipynb b/AoFAaltoCS.ipynb
index 6bcf3a8..e78b6b9 100644
--- a/AoFAaltoCS.ipynb
+++ b/AoFAaltoCS.ipynb
@@ -56,7 +56,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": null,
+   "execution_count": 4,
    "id": "6142febe",
    "metadata": {},
    "outputs": [
@@ -77,7 +77,69 @@
      "Eero Hyvönen\n",
      "Perttu Hämäläinen\n",
      "Alex Jung\n",
-     "Juho Kannala\n"
+     "Juho Kannala\n",
+     "Petteri Kaski\n",
+     "Samuel Kaski\n",
+     "Sándor Kisfaludi-Bak\n",
+     "Maarit Korpi-Lagg\n",
+     "Juhi Kulshrestha\n",
+     "Russell W. F. Lai\n",
+     "Jouko Lampinen\n",
+     "Casper Lassenius\n",
+     "Jaakko Lehtinen\n",
+     "Janne Lindqvist\n",
+     "Harri Lähdesmäki\n",
+     "Lauri Malmi\n",
+     "Heikki Mannila\n",
+     "Pekka Marttinen\n",
+     "Ilkka Niemelä\n",
+     "Marko Nieminen\n",
+     "Pekka Orponen\n",
+     "Alexandru Paler\n",
+     "Jussi Rintanen\n",
+     "Juho Rousu\n",
+     "Jari Saramäki\n",
+     "Arno Solin\n",
+     "Jukka Suomela\n",
+     "Linh Truong\n",
+     "Jara Joel Olavi Uitto\n",
+     "Aki Vehtari\n",
+     "Johanna Viitanen\n",
+     "Petri Vuorimaa\n",
+     "Robin Welsch\n",
+     "Antti Ylä-Jääski\n",
+     "Bo Zhao\n",
+     "Mikko Kiviharju\n",
+     "Tero Ilmari Ojanperä\n",
+     "Nitin Sawhney\n",
+     "Talayeh Aledavood\n",
+     "Lachlan Gunn\n",
+     "Lassi Haaranen\n",
+     "Arto Hellas\n",
+     "Vesa Hirvisalo\n",
+     "Jaakko Hollmen\n",
+     "Wilhelmiina Hämäläinen\n",
+     "Tommi Junttila\n",
+     "Barbara Esther Keller\n",
+     "Ari Korhonen\n",
+     "Sari Kujala\n",
+     "Jorma Laaksonen\n",
+     "Riku Linna\n",
+     "Mika P. Nieminen\n",
+     "Kerttu Pollari-Malmi\n",
+     "Risto Sarvas\n",
+     "Otto Seppälä\n",
+     "Juha Sorva\n",
+     "Sanna Suoranta\n",
+     "Jari-Pekka Vanhanen\n",
+     "N Asokan\n",
+     "Jari Collin\n",
+     "Aristides Gionis\n",
+     "Petri Myllymäki\n",
+     "Marko Turpeinen\n",
+     "Tapio Lokki\n",
+     "Mikko Sams\n",
+     "Simo Särkkä\n"
     ]
    }
   ],
@@ -116,7 +178,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 9,
+   "execution_count": 5,
    "id": "8c9e21c9",
    "metadata": {},
    "outputs": [
@@ -126,7 +188,7 @@
       "Text(0.5, 1.0, 'Distribution of Tax-Payer Money via Research Council of Finland')"
      ]
     },
-     "execution_count": 9,
+     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    },
diff --git a/index.html b/index.html
index 0658cb2..bae69d2 100755
--- a/index.html
+++ b/index.html
@@ -26,21 +26,32 @@

Alexander Jung

alt text 
    -
  • Dipl.-Ing. Dr. techn. ("sub auspiciis")

    +
  • Dipl.-Ing. Dr. techn. ("sub auspiciis") +

  • -
  • Associate Professor (tenured) for Machine Learning, Aalto University

    +
  • Associate Professor (tenured) for Machine Learning, Aalto University
    +

  • -
  • Associate Editor for IEEE Signal Processing Letters (website)

    +
  • Associate Editor for IEEE Signal Processing Letters (website)
    +

  • -
  • Editorial Board Member, “Machine Learning” (Springer) (website)

    +
  • Editorial Board Member, “Machine Learning” (Springer) (website)
    +

  • -
  • Follow me on LinkedIn

    +
  • Follow me on LinkedIn +

  • -
  • Subscribe to my Machine Learning YouTube channel

    +
  • Subscribe to my Machine Learning YouTube channel +

  • -
  • Fork me on GitHub

    +
  • Fork me on GitHub +

  • -
  • Textbook: “Machine Learning - The Basics”, Springer, 2022 ("draft" )

    +
  • Textbook: “Machine Learning - The Basics”, Springer, 2022 ("draft" )
    +

    +
  • +
  • <a href="https://www.researchgate.net/profile/Alexander-Jung">Alexander Jung on ResearchGate</a> +

@@ -49,19 +60,23 @@

About Me

in 2008 and 2012, respectively. Currently, I am an Associate Professor (tenured) for Machine Learning at the Department of Computer Science of Aalto University. My research and teaching revolve around the mathematical foundations of trustworthy machine learning with an emphasis on application domains that generate networked data. These application domains -include numerical weather prediction, renewable energy networks, city planning or condition monitoring.

+include numerical weather prediction, renewable energy networks, city planning or condition monitoring. +

Updates

For (Prospective) Master Students

Research Highlight: Computational and Statistical Aspects of Total Variation Minimization for Federated Learning

@@ -71,22 +86,29 @@

Research Highlight: Computational and Statistical Aspects of Total Variation parameters. The statistical properties of local datasets are related via different network structures that reflect physical (“contact networks”), social or biological proximity. In general, local datasets are heterogeneous in the sense of having different statistical distributions. However, we can often approximate local datasets that form a tight-knit cluster by a common cluster-specific distribution. -

+
+

To capitalize on the information in local datasets and their network structure, we have recently proposed networked exponential families as a novel probabilistic model for big data over networks. Networked exponential families are appealing statistically and computationally. They allow us to adaptively pool local datasets with similar statistical properties as training sets to learn personalized predictions tailored to each local dataset. We can compute these personalized predictions using highly scalable distributed convex optimization methods. These methods are robust against various types of imperfections (statistically and computationally) -and typically offer a high level of privacy protection.

-

Relevant Publications:

+and typically offer a high level of privacy protection. +
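To make the idea above concrete, here is a minimal sketch (not the published method; every name, number and the toy graph below are my own assumptions): personalized local linear models over a small chain graph, coupled by a penalty on parameter differences between neighbouring nodes. For simplicity it uses a smooth squared coupling and plain gradient descent instead of the non-smooth total-variation term and the distributed convex solvers referred to in the text.

# Illustrative sketch only: pooling local datasets over a network via coupled local models.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_feat, n_local = 6, 3, 20

# toy "empirical graph": a chain of nodes, each holding a small local dataset
edges = [(i, i + 1) for i in range(n_nodes - 1)]
X = [rng.normal(size=(n_local, n_feat)) for _ in range(n_nodes)]
# two clusters of nodes with different true parameters (heterogeneous local data)
w_true = [np.ones(n_feat) if i < 3 else -np.ones(n_feat) for i in range(n_nodes)]
y = [X[i] @ w_true[i] + 0.1 * rng.normal(size=n_local) for i in range(n_nodes)]

w = [np.zeros(n_feat) for _ in range(n_nodes)]
lam, step = 0.5, 0.05
for _ in range(300):
    # gradient of the local squared-error losses
    grad = [X[i].T @ (X[i] @ w[i] - y[i]) / n_local for i in range(n_nodes)]
    # coupling term: pull the parameters of neighbouring nodes towards each other
    for i, j in edges:
        grad[i] += lam * (w[i] - w[j])
        grad[j] += lam * (w[j] - w[i])
    w = [w[i] - step * grad[i] for i in range(n_nodes)]

# nodes in the same cluster end up with similar weight vectors,
# i.e. their local datasets are effectively pooled
print(np.round(np.stack(w), 2))

The squared coupling is chosen only to keep the sketch short; the papers listed below use a total-variation penalty, which enforces (approximately) piecewise-constant parameters over the clusters of the empirical graph.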

+

Relevant Publications: +

Research Highlight: Personalized Explainable Machine Learning

@@ -96,33 +118,40 @@

Research Highlight: Personalized Explainable Machine Learning

entropy of the prediction given the summary that a particular user associates with data points. The user summary is used to characterise the background knowledge of the “explainee” in order to compute explanations that are tailored for her. To compute the explanations, our method only requires some training samples that consist of data points and their corresponding -predictions and user summaries. Thus, our method is model agnostic and can be used to compute explanations for different machine learning methods.

-

Relevant Publications:

+predictions and user summaries. Thus, our method is model agnostic and can be used to compute explanations for different machine learning methods. +
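As a rough illustration of this entropy-based idea (a toy sketch under my own assumptions, not the algorithm from the referenced papers; the data and names below are made up), one can empirically estimate the conditional entropy of a model's predictions given a candidate user summary and prefer the summary under which the predictions are least uncertain:

# Toy sketch: pick the candidate user summary with the lowest empirical H(prediction | summary).
from collections import Counter
import math

def cond_entropy(summaries, predictions):
    """Empirical conditional entropy H(prediction | summary) for discrete samples."""
    n = len(predictions)
    joint = Counter(zip(summaries, predictions))
    marg = Counter(summaries)
    h = 0.0
    for (s, _), c in joint.items():
        p_joint = c / n
        p_cond = c / marg[s]
        h -= p_joint * math.log2(p_cond)
    return h

# predictions of some black-box model on 8 data points, plus two candidate
# summaries a user might attach to the same data points (all values invented)
predictions = ["hi", "hi", "lo", "lo", "hi", "lo", "hi", "lo"]
summary_a   = ["A", "A", "B", "B", "A", "B", "A", "B"]   # tracks the predictions
summary_b   = ["X"] * 8                                  # uninformative summary

for name, s in [("summary_a", summary_a), ("summary_b", summary_b)]:
    print(name, round(cond_entropy(s, predictions), 3))
# summary_a gives conditional entropy 0.0, summary_b gives 1.0,
# so summary_a would be the preferred (more explanatory) summary.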

+

Relevant Publications: +

Teaching Highlight: Student Feedback-Driven Course Development

Right from my start at Aalto in 2015, I took care of the main machine learning courses at Aalto University. Within three years, I re-designed the spearhead course Machine Learning: Basic Principles (MLBP). This re-design was based on a careful analysis of feedback received from several thousand students. I have also started to -prepare response letters to the student feedback, as is customary in the +prepare response letters to the student feedback, as is customary in the review process of scientific journals. My final edition of MLBP in 2018 achieved the best student rating since the course was -established at Aalto. The efforts have also been acknowledged by the Teacher of the Year -award, which I received in 2018 from the Department of Computer Science at Aalto University.

+established at Aalto. The efforts have also been acknowledged by the Teacher of the Year +award, which I received in 2018 from the Department of Computer Science at Aalto University. +

Teaching Highlight: A Three-Component Picture of Machine Learning

Machine learning methods have been, and continue to be, popularized in virtually every field of science and technology. As a result, machine learning courses attract students from different study programs. Thus, a key challenge in teaching basic machine learning courses is the heterogeneity of student backgrounds. To cope with this challenge, I have developed a new teaching concept for machine learning. This teaching concept revolves around three main components of machine learning: data, models and loss functions. By decomposing every machine learning method into specific design choices for data representation, model and loss function, students learn to navigate the vast landscape of machine learning -methods and applications. The three-component picture of machine learning is the main subject of my textbook Machine Learning: The Basics.

-

+methods and applications. The three-component picture of machine learning is the main subject of my textbook Machine Learning: The Basics. +
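As a small illustration of this three-component view (my own toy example, not code from the textbook), the snippet below spells out one possible set of design choices and shows how swapping only the loss component changes the resulting method while the data and model components stay fixed:

# Illustrative sketch: a machine learning method as the triple (data, model, loss).
import numpy as np

# component 1: data — feature vectors x and labels y (toy values)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.1, 0.9, 2.1, 2.9])

# component 2: model — a linear hypothesis space h_w(x) = w * x
def model(w, X):
    return X @ w

# component 3: loss — squared error turns the triple into linear regression;
# swapping in absolute error yields a different method with the same data and model
def squared_loss(y_hat, y):
    return np.mean((y_hat - y) ** 2)

def absolute_loss(y_hat, y):
    return np.mean(np.abs(y_hat - y))

w = np.array([1.0])
for loss in (squared_loss, absolute_loss):
    print(loss.__name__, round(loss(model(w, X), y), 3))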

+

+

diff --git a/index.jemdoc b/index.jemdoc
index 1680517..eb92894 100755
--- a/index.jemdoc
+++ b/index.jemdoc
@@ -11,6 +11,7 @@
 - Subscribe to my {{Machine Learning YouTube}} channel
 - Fork me on {{GitHub}}
 - Textbook: "Machine Learning - The Basics", Springer, 2022 ({{"draft"}} ) \n
+- Alexander Jung on ResearchGate
 ~~~
 
 == About Me