A GitHub Scraping Tool developed with Ruby and the Nokogiri gem
This project is part of the Ruby module in the Microverse curriculum!
Explore the docs »
View Demo
·
Report Bug
·
Request Feature
This is a GitHub scraping tool built with Ruby. It was built as a capstone project for completing one of Microverse's Main Technical Curriculum sections.
To use this scraper, you need to:
- Have Ruby installed on your computer
- Download or clone this repo:
- Clone with SSH:
git@github.com:PhillipUg/github-scraper.git
- Clone with HTTPS:
https://github.com/PhillipUg/github-scraper.git
- `cd` into the `github-scraper` directory and run `bundle install` (a sketch of a matching Gemfile appears after this list)
- Finally, run `bin/main.rb` in your terminal.
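The project's actual Gemfile is not reproduced in this README, but given the gems listed under the technologies section (Nokogiri, Colorize, RSpec), a matching Gemfile would look roughly like the sketch below; treat it as an illustration, not the project's exact file.

```ruby
# Sketch of a Gemfile consistent with the gems named in this README.
# The real Gemfile may pin specific versions.
source 'https://rubygems.org'

gem 'nokogiri'  # HTML parsing for the scraper
gem 'colorize'  # colored terminal output
gem 'rspec'     # only needed if you want to run the test suite
```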
When you first run this GitHub scraping tool, it begins by showing you the summary info output format:
Github User
-------------------------------
Name: xxxxxx (xxxxxx)
Bio: xxxxxx
Work: xxxxxx
Location: xxxxxx
Website: xxxxxx
---------------------------------
Pinned Repositories
---------------------------------
1. xxxxxx
2. xxxxxx
3. xxxxxx
4. xxxxxx
5. xxxxxx
6. xxxxxx
--------------------------------
After this, you are prompted to enter a valid GitHub username. The tool then returns the above output format with all the information filled in, followed by a list of categories:
Categories
---------------------------------
repositories: xxxxxx
stars: xxxxxx
followers: xxxxxx
following: xxxxxx
----------------------------------
Then you will be prompted to enter a category name to see a full list of its contents. For instance, enter `repositories` or `stars` to get a list of those scraped categories. This will continue until you exit the program by typing 'q' in the terminal and pressing Enter.
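Under the hood, the scraping is done with the Nokogiri gem. The snippet below is a minimal, hedged sketch of how a profile summary could be fetched and parsed; the `ProfileScraper` class name and the CSS selectors are assumptions made for this example rather than the project's actual code, and GitHub's markup changes over time.

```ruby
# Illustrative sketch only: the class name and selectors are assumptions,
# not the project's real implementation.
require 'nokogiri'
require 'open-uri'

class ProfileScraper
  def initialize(username)
    @username = username
  end

  # Download the profile page and pull out a few summary fields.
  def summary
    doc = Nokogiri::HTML(URI.open("https://github.com/#{@username}"))
    {
      name: doc.at_css('.p-name')&.text&.strip,
      bio: doc.at_css('.p-note')&.text&.strip,
      location: doc.at_css('.p-label')&.text&.strip
    }
  end
end

puts ProfileScraper.new('PhillipUg').summary
```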
This project was built using these technologies.
- Ruby
- RSpec
- Nokogiri gem
- Colorize gem
If you wish to test it, install RSpec with `gem install rspec`. We used RSpec 3.9.1, but any version not older than 3.0 should work fine. Clone this repo to your local machine, `cd` into the `github-scraper` directory, and run `rspec`.
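The specs follow the standard RSpec style. Here is a small, self-contained example in that style; the `parse_name` helper is a stand-in written for this sketch, not a method from the project's codebase.

```ruby
# spec/example_spec.rb: illustrative only; parse_name is not part of the project.
require 'nokogiri'

# Tiny stand-in helper: extract a profile name from an HTML snippet.
def parse_name(html)
  Nokogiri::HTML(html).at_css('.p-name')&.text&.strip
end

RSpec.describe 'parse_name' do
  it 'extracts the profile name from a snippet of HTML' do
    html = '<span class="p-name">Phillip Musiime</span>'
    expect(parse_name(html)).to eq('Phillip Musiime')
  end
end
```

Save a file like this under `spec/` and running `rspec` from the project root will pick it up automatically.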
You can try it live on repl.it
Check out this video demonstration of how I built the scraper.
👤 Phillip Musiime
- LinkedIn: Phillip Musiime
- GitHub: PhillipUg
- Twitter: @Phillip_Ug
- E-mail: phillipmusiime@gmail.com
This project is MIT licensed.