UI: glider type filter greatly improved
Also:
Removed unused npm dependencies
Simplified and configured caching for the GET /api/models endpoint
lwitkowski committed Aug 1, 2024
1 parent fdd0192 commit 934acff
Showing 15 changed files with 162 additions and 862 deletions.
33 changes: 10 additions & 23 deletions README.md
@@ -1,6 +1,6 @@
# Aero-offers

This project aims at reviving www.aero-offers.com - invaluable source of price trends for gliders and other aircrafts, originally developed and maintained by @rthaenert
This project aims at reviving [aero-offers.com](aero-offers.com) - invaluable source of price trends for gliders and other aircrafts, originally developed and maintained by @rthaenert

## Development

@@ -26,14 +26,20 @@ Currently, the project is being onboarded to Azure Cloud (still WIP).
- [x] use Azure secrets for db credentials
- [x] setup cron triggers for crawlers, reclassifier and FX rates updater
- [x] human readable domain (aero-offers.pl)
- [x] fix aircraft type dropdown
- [ ] infra as code (biceps or terraform)
- [ ] document infra and env topology
- [ ] fix other spiders/crawlers
- [ ] redirect from aero-offers.com
- [ ] fix aircraft type dropdown
- [ ] fix & polish CSS in UI
- [ ] update/simplify legal subpage
- [ ] cookies info
- [ ] use https://github.com/weglide/GliderList
- [ ] crawler for Facebook Marketplace - do they have nice api?
- [ ] crawler for https://www.aircraft24.de
- [ ] crawler for http://www.airplanemart.com
- [ ] crawler for http://www.aeronave.de/1-luftfahrzeuge/listings.html
- [ ] crawler for https://plane-sale.com

### Running locally without Python nor NodeJS
`docker compose up --build` - starts postgres, python backend and UI apps (http://localhost:8080/)
@@ -70,29 +76,10 @@ Start UI (vue app):
cd ui
npm run dev
```
UI has its own, detailed README.md file.

Run crawlers/spiders & reclassifier:
```
cd backend
./run_spiders.sh && ./run_classifier.sh
```

## Further development / bug fixing (from Ralf)

- Model information
- Scale axes correctly (!)
- Euro on y-axis
- Top 10 aircraft offered per category
- Add more spiders
- Facebook Marketplace?
- https://www.aircraft24.de
- http://www.airplanemart.com
- http://www.aeronave.de/1-luftfahrzeuge/listings.html
- https://plane-sale.com

### Legal (from Ralf)
- Opt-out option / banner, pop-up due to analytics cookies
- Imprint generator: https://www.e-recht24.de/impressum-generator.html
- Data protection declaration generator: https://datenschutz-generator.de

Imprint and data protection declaration must be listed separately, but may refer to the same page (if desired).
```
4 changes: 1 addition & 3 deletions backend/Dockerfile
@@ -12,10 +12,8 @@ RUN \
python3 -m pip install -r requirements.txt --no-cache-dir && \
apk --purge del .build-deps

ENV FLASK_APP=./api/flask_app.py

COPY . .

EXPOSE 80

CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0", "--port=80"]
CMD [ "python3", "-m" , "flask", "--app=./api/flask_app.py", "run", "--host=0.0.0.0", "--port=80"]
42 changes: 6 additions & 36 deletions backend/api/flask_app.py
@@ -1,27 +1,18 @@
import datetime

from flask import Flask, jsonify, request, abort
from flask_headers import headers
from flask_cors import CORS
from classifier import classifier

import db
from my_logging import *

app = Flask(__name__)
# TODO verify security risks with this
CORS(app, resources={r"/*": {"origins": "*"}})
app.config['JSON_AS_ASCII'] = False

logger = logging.getLogger("api")


@app.route('/api/offers')
def offers():
logger.debug("Received request {0}".format(request))
    return jsonify(db.get_offers_dict(limit=request.args.get('limit'),
                                      order_by=request.args.get('orderBy'),
                                      offset=request.args.get('offset'),
                                      aircraft_type=request.args.get('aircraft_type')))
    return jsonify(db.get_offers_dict(aircraft_type=request.args.get('aircraft_type'),
                                      offset=request.args.get('offset'),
                                      limit=request.args.get('limit'),
                                      order_by='creation_datetime'))

@app.route("/api/model/<manufacturer>/<model>")
def model_information(manufacturer, model):
@@ -36,31 +27,10 @@ def model_information(manufacturer, model):
manufacturer_info["offers"] = db.get_offers_for_model(manufacturer, model)
return jsonify(manufacturer_info)


@app.route("/api/models")
@headers({'Cache-Control':'public, max-age=360'})
def aircraft_models():
"""
Returns list of matching models in the format:
[{
"manufacturer": "Alexander Schleicher",
"model": "K8b"
}]
:param search_str: manufacturer/model to look for
:return: list of matching models
"""
search_str = request.args.get('search')
logger.info("Search request: {0}".format(search_str))
all_models = classifier.get_all_models()
matching_models = []
for manufacturer in all_models:
for aircraft_type in all_models[manufacturer]["models"]:
for model in all_models[manufacturer]["models"][aircraft_type]:
if search_str in manufacturer + " " + model:
matching_models.append({
"manufacturer": manufacturer,
"model": model})
return jsonify(matching_models)

return jsonify(classifier.get_all_models())

if __name__ == '__main__':
app.run(host='127.0.0.1', port=8080, debug=True)
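For context on the reworked `/api/offers` route above: the server now fixes ordering to `creation_datetime` and accepts only `aircraft_type`, `offset` and `limit` as query parameters. A minimal sketch of calling it, assuming the local dev server started by `start_api.sh` on port 8081 (the `aircraft_type` value below is illustrative):

```python
import requests

# Assumes the API is running locally via start_api.sh (flask --app=... run --port=8081).
BASE_URL = "http://127.0.0.1:8081"

# Ordering is now fixed to creation_datetime on the server, so the old
# orderBy query parameter is no longer sent.
resp = requests.get(
    f"{BASE_URL}/api/offers",
    params={"aircraft_type": "glider", "offset": 0, "limit": 20},  # illustrative values
)
resp.raise_for_status()
for offer in resp.json():
    print(offer)
```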
10 changes: 0 additions & 10 deletions backend/api/uwsgi.ini

This file was deleted.

8 changes: 0 additions & 8 deletions backend/api/wsgi.py

This file was deleted.

15 changes: 0 additions & 15 deletions backend/classifier/classifier.py
@@ -20,21 +20,6 @@ def get_all_models():
manufacturers = json.load(json_file)
return manufacturers

def get_actual_and_predicted(test_data):
model_classifier = ModelClassifier()
y_actual = []
y_predicted = []
for item in test_data:
expect_manufacturer = "expect_manufacturer" in item and item["expect_manufacturer"] is True
detail_text = "" if "detail_text" not in item else item["detail_text"]
# actual vs predicted ("" marks None or unknown model)
predicted_class = model_classifier.classify(item["input"], expect_manufacturer, detail_text)
predicted_class = "" if predicted_class[0] is None else predicted_class[0] + " " + predicted_class[1]
actual_class = "" if item["manufacturer"] is None else item["manufacturer"] + " " + item["model"]
y_actual.append(actual_class)
y_predicted.append(predicted_class)
return y_actual, y_predicted


class ModelClassifier:
manufacturers = {}
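Because the server-side search was dropped from `/api/models` (see the `flask_app.py` diff above), matching a search string now happens on the client against the full, cacheable payload. As implied by the removed loop, `get_all_models()` returns a nested dict of manufacturer → `"models"` → aircraft type → list of model names. A hedged Python sketch of an equivalent client-side filter (the real UI does this in the Vue app; the function name and sample data are illustrative):

```python
def filter_models(all_models: dict, search: str) -> list[dict]:
    """Replicate the removed server-side filter over the /api/models payload.

    Expected payload shape (inferred from the removed loop in flask_app.py):
    { "<manufacturer>": { "models": { "<aircraft_type>": ["<model>", ...] } } }
    """
    matches = []
    for manufacturer, info in all_models.items():
        for models in info.get("models", {}).values():
            for model in models:
                # The removed server-side check was a plain substring match
                # against "<manufacturer> <model>".
                if search in f"{manufacturer} {model}":
                    matches.append({"manufacturer": manufacturer, "model": model})
    return matches


# Example with illustrative data:
sample = {"Alexander Schleicher": {"models": {"glider": ["K8b", "ASK 21"]}}}
print(filter_models(sample, "K8"))  # [{'manufacturer': 'Alexander Schleicher', 'model': 'K8b'}]
```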
1 change: 1 addition & 0 deletions backend/requirements.txt
@@ -6,6 +6,7 @@ Scrapy==2.11.2
nltk==3.8.1
Flask==3.0.3
Flask-Cors==4.0.1
Flask-Headers==1.0
uWSGI==2.0.26
defusedxml==0.7.1
requests==2.32.3
3 changes: 1 addition & 2 deletions backend/start_api.sh
@@ -1,4 +1,3 @@
export PYTHONPATH=$PYTHONPATH':./'
export FLASK_APP=./api/flask_app.py

flask run --port=8081
flask --app=./api/flask_app.py --debug run --port=8081
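With the Flask-Headers decorator added above, GET /api/models now advertises a short public cache (`Cache-Control: public, max-age=360`). A minimal sketch of verifying this against the local dev server started by `start_api.sh` (port 8081, as configured there):

```python
import requests

# start_api.sh runs the API on port 8081.
resp = requests.get("http://127.0.0.1:8081/api/models")
resp.raise_for_status()

# The response should carry the cache header configured via
# @headers({'Cache-Control': 'public, max-age=360'}) in flask_app.py.
print(resp.headers.get("Cache-Control"))  # expected: public, max-age=360
print(len(resp.json()))                   # number of manufacturers returned
```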