
Dev: added backend code linter (pylint) #25

Merged 3 commits on Aug 3, 2024
4 changes: 4 additions & 0 deletions .github/workflows/cd-backend.yaml
@@ -36,6 +36,10 @@ jobs:
         run: |
           pip3 install --quiet -r requirements.txt -r tests/requirements.txt

+      - name: 'Static analysis (Lint)'
+        working-directory: ./backend
+        run: ./run_lint.sh
+
       - name: 'Run tests and check coverage'
         working-directory: ./backend
         run: |
13 changes: 5 additions & 8 deletions backend/job_fetch_offers.py
@@ -1,7 +1,6 @@
 from datetime import datetime
 from scrapy.utils.project import get_project_settings
-from twisted.internet import reactor
-from scrapy.crawler import CrawlerRunner
+from scrapy.crawler import CrawlerProcess

 import pprint
 from my_logging import *
@@ -13,21 +12,19 @@
 if __name__ == '__main__':
     try:
         settings = get_project_settings()
-        runner = CrawlerRunner(settings)
+        process = CrawlerProcess(settings)

         spiders = {
             SoaringDeSpider.SoaringDeSpider: None,
             FlugzeugMarktDeSpider.FlugzeugMarktDeSpider: None,
             #PlaneCheckComSpider.PlaneCheckComSpider: None
         }
         for spider_cls in spiders.keys():
-            crawler = runner.create_crawler(spider_cls)
+            crawler = process.create_crawler(spider_cls)
             spiders[spider_cls] = crawler
-            runner.crawl(crawler)
+            process.crawl(crawler)

-        d = runner.join()
-        d.addBoth(lambda _: reactor.stop())
-        reactor.run()  # the script will block here until all crawling jobs are finished
+        process.start()  # the script will block here until all crawling jobs are finished

         stats_per_spider = {}
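The switch from `CrawlerRunner` (which requires managing the Twisted reactor by hand) to `CrawlerProcess` can be sketched as below. The function `crawl_all` and its argument name are hypothetical, but the `create_crawler`/`crawl`/`start` calls are the Scrapy API the diff uses:

```python
def crawl_all(spider_classes):
    """Run every spider in one CrawlerProcess.

    process.start() owns the Twisted reactor internally, so the manual
    runner.join() / reactor.stop() / reactor.run() dance is no longer needed.
    """
    # Imports are kept inside the function so the sketch loads without Scrapy.
    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings

    process = CrawlerProcess(get_project_settings())
    crawlers = {}
    for spider_cls in spider_classes:
        crawler = process.create_crawler(spider_cls)  # keep a handle per spider
        crawlers[spider_cls] = crawler
        process.crawl(crawler)
    process.start()  # blocks until all scheduled crawls finish
    return crawlers
```

Keeping the per-spider `Crawler` handles is presumably why the diff retains the `spiders` dict: each handle exposes that crawl's stats after `start()` returns.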
3 changes: 1 addition & 2 deletions backend/mailer.py
@@ -9,8 +9,7 @@ def send_mail(text=""):
     if not SEND_RESULT_MAIL:
         return
     msg = email.mime.text.MIMEText(text)
-    # TODO put your mail address here
-    # me = u'ralf.thaenert@googlemail.com'
+    me = 'dev@aerooffers.pl'
     msg['Subject'] = 'Aircraft Offers Crawling Result'
     msg['From'] = SMTP_USER
     msg['To'] = me
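The message assembly in `send_mail` uses only the standard library and can be reproduced standalone. `build_result_mail` and its parameter names are hypothetical, introduced here for illustration:

```python
import email.mime.text

def build_result_mail(text, sender, recipient='dev@aerooffers.pl'):
    """Assemble the plain-text crawl-result message the way mailer.py does."""
    msg = email.mime.text.MIMEText(text)
    msg['Subject'] = 'Aircraft Offers Crawling Result'
    msg['From'] = sender
    msg['To'] = recipient
    return msg
```

The resulting object serializes to a complete RFC 2822 message via `msg.as_string()`, ready to hand to `smtplib`.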
1 change: 1 addition & 0 deletions backend/requirements.txt
@@ -1,6 +1,7 @@
 coverage==7.6.0
 Twisted==24.3.0
 psycopg2-binary==2.9.9
+pylint==3.2.6
 SQLAlchemy==2.0.31
 price-parser==0.3.4
 Scrapy==2.11.2
5 changes: 5 additions & 0 deletions backend/run_lint.sh
@@ -0,0 +1,5 @@
+export PYTHONPATH=$PYTHONPATH':./'
+
+set -e
+
+pylint --fail-on=E --errors-only ./
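pylint's exit status is a bit-mask (1 = fatal, 2 = error, 4 = warning, 8 = refactor, 16 = convention, 32 = usage error), and `--fail-on=E` guarantees the error bit produces a nonzero exit even when other options would downgrade it. A small sketch decoding that status (the function name is made up for illustration):

```python
# pylint exit-status bits, per pylint's documented exit codes.
PYLINT_BITS = {
    1: "fatal",
    2: "error",
    4: "warning",
    8: "refactor",
    16: "convention",
    32: "usage error",
}

def decode_pylint_status(code):
    """Return the message categories encoded in a pylint exit status."""
    return [name for bit, name in PYLINT_BITS.items() if code & bit]
```

Combined with `--errors-only`, the script above only reports (and only fails on) fatal/error findings, which keeps the CI gate strict but quiet.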
9 changes: 3 additions & 6 deletions backend/run_tests.sh
@@ -1,9 +1,6 @@
 export PYTHONPATH=$PYTHONPATH':./'

-coverage run --source ./ -m xmlrunner -o ./test-results
+set -e

-if [[ $? -ne 0 ]]; then
-    exit 1
-else
-    coverage report --fail-under=75
-fi
+coverage run --source ./ --omit="tests/*" -m xmlrunner -o ./test-results
+coverage report --fail-under=65
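The rewrite can drop the explicit `$?` check because `set -e` aborts the script at the first failing command, so `coverage report --fail-under=65` only runs when the test run succeeded. A rough Python analogue of that fail-fast chaining (function and argument names are made up):

```python
import subprocess
import sys

def run_fail_fast(commands):
    """Run commands in order, stopping at the first nonzero exit status,
    which is roughly the behavior `set -e` gives a shell script."""
    for cmd in commands:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            sys.exit(result.returncode)  # propagate the failing status
```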