The user web crawler is a website indexer built from the webpages that users' browsers navigate to.
The user web crawler works through volunteer users who install an extension in their browsers. When a user visits a webpage, the URL is anonymously added to an upstream database that holds all unique webpages.

Note: there is currently no centralized database that the data is pushed to. To start logging data, you will need to set up your own backend service:
```
go run server.go
```
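No reference backend ships with this section, so the following is only a minimal sketch of what such a `server.go` could look like: a single `/visit` endpoint that accepts a URL in the POST body and keeps an in-memory set of unique pages. The endpoint name, request format, and storage are all assumptions for illustration, not the project's actual API; a real service would persist to a database instead of a map.

```go
// server.go - minimal sketch of a backend the extension could report to.
// The /visit endpoint, the plain-text request body, and the in-memory set
// are assumptions, not the project's actual design.
package main

import (
	"io"
	"log"
	"net/http"
	"sync"
)

var (
	mu   sync.Mutex
	seen = make(map[string]struct{}) // set of unique URLs reported so far
)

func visitHandler(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodPost {
		http.Error(w, "POST only", http.StatusMethodNotAllowed)
		return
	}
	body, err := io.ReadAll(r.Body)
	if err != nil {
		http.Error(w, "bad request", http.StatusBadRequest)
		return
	}
	url := string(body)

	mu.Lock()
	_, dup := seen[url]
	if !dup {
		seen[url] = struct{}{} // only unique webpages are kept
	}
	mu.Unlock()

	if dup {
		w.WriteHeader(http.StatusOK) // already indexed
		return
	}
	w.WriteHeader(http.StatusCreated)
}

func main() {
	http.HandleFunc("/visit", visitHandler)
	log.Println("listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```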
Install the Tampermonkey browser extension.
Run the following Python 3 script when you want to push your locally logged data to the upstream database:
```
python3 ./commit.py
```
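`commit.py` itself is not reproduced here. Conceptually, the push amounts to an HTTP POST of each locally logged URL to the backend; the Go sketch below shows the equivalent requests against the hypothetical `/visit` endpoint from the server sketch above, assuming a newline-delimited local log file (both the endpoint and the file layout are assumptions, not the script's actual behavior).

```go
// Sketch of the push step: sends locally logged URLs upstream, one POST per
// URL. The log file name, the /visit endpoint, and the plain-text body are
// assumptions carried over from the server sketch above.
package main

import (
	"bufio"
	"log"
	"net/http"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("visited.log") // hypothetical newline-delimited URL log
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		url := strings.TrimSpace(scanner.Text())
		if url == "" {
			continue
		}
		// POST the raw URL; the server deduplicates, so resending is safe.
		resp, err := http.Post("http://localhost:8080/visit", "text/plain", strings.NewReader(url))
		if err != nil {
			log.Fatal(err)
		}
		resp.Body.Close()
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}
```

Because the server treats duplicate URLs as a no-op, a push can be rerun after a failure without corrupting the upstream set.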