Frontend slow with many concurrent users #509

Open

rth opened this issue Jan 22, 2021 · 4 comments

Comments


rth commented Jan 22, 2021

The server can be quite slow when accessed by multiple simultaneous users. For instance, here are load-test results for a few pages, obtained with drill.

benchmark.yml

---

concurrency: 30
base: 'https://ramp.studio'
iterations: 32
rampup: 5

plan:
  - name: Fetch homepage
    request:
      url: /


  - name: Fetch event
    request:
      url: /events/{{ item }}
    with_items:
      - air_passengers_dssp_14
      - air_passengers_m2mosef2020
      - air_passengers_py4ds2020

  - name: Fetch problems
    request:
      url: /problems

where

drill --benchmark benchmark.yml --stats

with 30 concurrent connections produces:

Fetch homepage            Total requests            32
Fetch homepage            Median time per request   155ms

Fetch event               Total requests            96
Fetch event               Median time per request   109ms

Fetch problems            Total requests            32
Fetch problems            Median time per request   2033ms

So the issue seems to be specifically with the /problems page, which I suspect is due to a DB query inside a double loop here.

The solutions could be to either:

  • refactor that code so it does not make so many DB queries (a sketch of this is below)
  • cache the whole page, either with nginx or with something like caddy + caddy-cache
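
For the first option, here is a minimal sketch of what such a refactor could look like. It assumes hypothetical SQLAlchemy models named Event and Submission with a Submission.event_id foreign key; the actual ramp-board model and column names may differ.

from sqlalchemy import func

# Submission stands for the project's SQLAlchemy submission model
# (hypothetical name; import it from wherever the models live).

def submission_counts_by_event(session):
    # Instead of issuing one COUNT query per event inside nested loops
    # (an N+1 pattern), fetch all per-event submission counts in a single
    # grouped query.
    rows = (
        session.query(Submission.event_id, func.count(Submission.id))
        .group_by(Submission.event_id)
        .all()
    )
    return dict(rows)  # {event_id: n_submissions}

# The /problems view can then look counts up in this dict inside its loops
# instead of hitting the database on every iteration.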

kegl commented Jan 22, 2021

I'd vote for caching.


agramfort commented Jan 23, 2021 via email


rth commented Jan 23, 2021

After looking into it in a bit more detail, caching might not necessarily be the simplest option, as the content of the /problems page is different for each user (it shows the events that user is signed up for). So I'm not actually sure it's possible to cache it efficiently, unless some of that information is removed (and if it is, this performance issue will disappear anyway).

Though caching would still be nice, particularly for static files in general.
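
As a minimal sketch of the static-file side (assuming the assets are served by the Flask app directly, with no reverse-proxy cache in front), Flask's standard SEND_FILE_MAX_AGE_DEFAULT setting can be used to send long-lived cache headers for /static files; a reverse proxy such as nginx or caddy could achieve the same from outside the app.

from datetime import timedelta
from flask import Flask

app = Flask(__name__)

# Ask browsers and proxies to cache files served from /static for 30 days.
# This only affects static assets, not dynamically rendered pages.
app.config["SEND_FILE_MAX_AGE_DEFAULT"] = timedelta(days=30)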


kegl commented Jan 23, 2021

The exact same thing happened with the leaderboard table. I got rid of all user-dependent features (e.g. adding links to the user's own submissions, even if the event is in competitive mode) and saved the table every time it was updated. The complexity is that we need to know all the events that change the page and use them to trigger recomputing the table; in this case that would be adding problems and events, changing their statuses, and user sign-ups.

Do we know which query is taking up the time? My suspicion is that it's the one that counts the number of submissions. If that's the case, we could take those stats off the landing page and put them on the event page, or add a statistics menu. Or just cache these numbers in the db, updating them when users sign up and submit.
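
A minimal sketch of that last idea, using a hypothetical denormalized counter column on an Event model (the names are illustrative, not the actual ramp-board schema):

from sqlalchemy import Column, Integer
from sqlalchemy.orm import declarative_base

Base = declarative_base()  # stand-in for the project's declarative base

class Event(Base):
    # Hypothetical: a cached counter on the Event model, so the landing page
    # can read a single column instead of counting submissions on every request.
    __tablename__ = "events"
    id = Column(Integer, primary_key=True)
    n_submissions = Column(Integer, default=0, nullable=False)

def record_submission(session, event):
    # Called from the submission code path; keeps the cached count in sync.
    event.n_submissions += 1
    session.commit()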
