
Crawl buses, dining, and courses data #2

Workflow file for this run

name: Crawl buses, dining, and courses webpages and deploy to main branch
on:
  schedule:
    - cron: "0 * * * *" # run at minute 0 of every hour
  workflow_dispatch:
jobs:
  build_and_deploy:
    runs-on: ubuntu-latest
    env:
      TZ: Asia/Taipei
      DATA_FOLDER: data
    steps:
      - name: Checkout main branch
        uses: actions/checkout@v3
        with:
          ref: main
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: 3.13.2
          cache: "pip"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Run Python crawlers
        run: |
          # "|| true" lets the remaining crawlers run even if one of them fails
          python scripts/buses.py || true
          python scripts/dining.py || true
          python scripts/courses.py || true
      - name: Update Data in main branch
        run: |
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Actions"
          git add $DATA_FOLDER
          # Commit and push only when the staged data actually changed
          if ! git diff-index --quiet HEAD --; then
            git commit -m "📝 Scheduled update of JSON data"
            git push
          else
            echo "No changes to commit."
          fi
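
The three crawler scripts invoked above (scripts/buses.py, scripts/dining.py, scripts/courses.py) are not part of this workflow file. As a rough illustration only, a minimal sketch of what one of them might look like follows; the source URL, output filename, and JSON handling are assumptions for illustration, not the repository's actual implementation.

"""Minimal sketch of a crawler in the shape of scripts/buses.py (hypothetical)."""
import json
import os
from pathlib import Path

import requests

# Matches the DATA_FOLDER env var set in the workflow; defaults to "data".
DATA_DIR = Path(os.environ.get("DATA_FOLDER", "data"))

# Hypothetical source URL; the real buses endpoint is not shown in this run.
SOURCE_URL = "https://example.edu/buses/schedule.json"


def main() -> None:
    DATA_DIR.mkdir(parents=True, exist_ok=True)
    resp = requests.get(SOURCE_URL, timeout=30)
    resp.raise_for_status()
    # Assumes the upstream endpoint already returns JSON; a real crawler
    # might instead parse HTML here before serialising.
    payload = resp.json()
    out_path = DATA_DIR / "buses.json"
    out_path.write_text(
        json.dumps(payload, ensure_ascii=False, indent=2),
        encoding="utf-8",
    )
    print(f"Wrote {out_path}")


if __name__ == "__main__":
    main()

Each script is expected to write its output under the DATA_FOLDER directory (data/), which the final workflow step stages and commits only when the files have actually changed.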