
Weekly Report for GSoC 2020 FOSSology Fossdash


Community Bonding (04-May to 01-June)

  • Cloned the FOSSology repo and set it up to run locally.
  • Understood the terminology used by the FOSSology project by going through the documentation and reading the FOSSology wiki pages.
  • Learned more about Prometheus and InfluxDB as real-time data sources, and about Grafana.
  • Created two demo architectures for the dashboard (a minimal sketch of both flows follows this list):
    • Using Prometheus as a data source (pull-based architecture)
    • Using InfluxDB as a data source (push-based architecture)
  • Created a dashboard for each to showcase as a proof of concept of our idea.
  • Created a document on basic Prometheus and Grafana terminology for beginners (Link).
  • Link to commit: https://github.com/darshank15/GSoC_2020_FOSSOlogy/commit/121695cc6569b9f0a042d6b88f8bd8fc287633a7
  • Discussed the project and its goals with my mentor to understand them more clearly.
  • Studied bash scripting, as it is used heavily in many FOSSology scripts.
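
To make the pull-vs-push contrast concrete, below is a minimal Python sketch (not the actual POC code) of the two flows: in the push model the exporter POSTs line-protocol data to InfluxDB's HTTP /write endpoint, while in the pull model it only exposes a /metrics page that Prometheus scrapes on its own schedule. The URLs, port, and metric values here are placeholders.

import requests  # push model: the exporter actively sends data
from http.server import BaseHTTPRequestHandler, HTTPServer  # pull model: the exporter is scraped

# Push-based (InfluxDB 1.x): POST a line-protocol record to the /write endpoint.
def push_to_influxdb(influx_url, database, line):
    # e.g. line = "agents_count,instance=abc value=1"
    requests.post(f"{influx_url}/write", params={"db": database}, data=line)

# Pull-based (Prometheus): serve metrics and let Prometheus decide when to scrape them.
class MetricsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/metrics":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b'agents_count{instance="abc"} 1\n')
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 9100), MetricsHandler).serve_forever()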

Week_1 (01-June to 06-June)

  • Looked into some Docker commands and learned docker-compose.
  • Took a look at the FOSSology fossdash branch dev/fossdash-exporter.
  • Ran it locally to see the data generated by the Python script "fossdash-publish.py".
  • Solved an issue by modifying the above Python file.
  • Link to Issue: https://github.com/darshank15/GSoC_2020_FOSSOlogy/issues/1
Initially, the Python script generated data in the format below:
agents_count.copyright,instance=c7fe15ee-5c9c-4687-91d8-b1ba840e6b00 value=1 1591286501000000000
agents_count.ecc,instance=c7fe15ee-5c9c-4687-91d8-b1ba840e6b00 value=1 1591286501000000000


We wanted to modify the Python script to emit data as shown below, introducing a new tag for the agent type:
agents_count,instance=c7fe15ee-5c9c-4687-91d8-b1ba840e6b00,type=copyright value=1 1591246503000000000
agents_count,instance=c7fe15ee-5c9c-4687-91d8-b1ba840e6b00,type=ecc value=1 1591246503000000000
 
This lets us group by instance as well as by type; a minimal sketch of this conversion follows below.
  • Made a simple dashboard for the newly generated data as a showcase.
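
As an illustration of the change (a simplified sketch, not the exact fossdash-publish.py code), the dotted metric name can be split into a measurement plus a type tag before the line-protocol string is built:

def to_line_protocol(raw_name, instance, value, timestamp_ns):
    # Split a dotted name like "agents_count.copyright" into the measurement
    # "agents_count" and a tag type=copyright, so dashboards can group by
    # instance as well as by type.
    if "." in raw_name:
        measurement, agent_type = raw_name.split(".", 1)
        tags = f"instance={instance},type={agent_type}"
    else:
        measurement = raw_name
        tags = f"instance={instance}"
    return f"{measurement},{tags} value={value} {timestamp_ns}"

# Example:
# to_line_protocol("agents_count.copyright", "c7fe15ee-5c9c-4687-91d8-b1ba840e6b00", 1, 1591246503000000000)
# -> "agents_count,instance=c7fe15ee-5c9c-4687-91d8-b1ba840e6b00,type=copyright value=1 1591246503000000000"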

Week_2 (08-June to 13-June)

  • Looked into an issue about configuring fossdash to work both in Docker containers and in a source installation.
  • The issue is here: https://github.com/darshank15/GSoC_2020_FOSSOlogy/issues/2
  • Looked into the Makefile for configuration.
  • Changed the code in common-sysconfig.php to add an input box in the UI for configuring the FossDash URL and storing it in the sysconfig database table.
  • Wrote the script run_me.py, which reads the updated value from the database and rewrites the fossdash.conf file (a rough sketch follows this list).
  • Also started working on code to include VERSION info in InfluxDB.
  • The issue is here: https://github.com/darshank15/GSoC_2020_FOSSOlogy/issues/3
  • Currently, all version and build info is fetched from the VERSION file; later on, we can get this data from the database table.
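
A rough Python sketch of the run_me.py idea, assuming the UI stores the FossDash URL as a row in FOSSology's PostgreSQL sysconfig table; the variable name "FossdashEndpoint", the database credentials, the column names, and the fossdash.conf path and layout below are placeholders, not the real values:

import psycopg2  # FOSSology keeps its configuration in PostgreSQL

def sync_fossdash_conf(conf_path="/usr/local/etc/fossology/fossdash.conf"):
    # Read the FossDash URL that the UI saved into the sysconfig table
    # ("FossdashEndpoint" and the column names are assumed for this sketch).
    conn = psycopg2.connect(dbname="fossology", user="fossy", password="fossy", host="localhost")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT conf_value FROM sysconfig WHERE variablename = %s", ("FossdashEndpoint",))
        row = cur.fetchone()
    if row:
        # Rewrite fossdash.conf so the publisher script picks up the new URL.
        with open(conf_path, "w") as f:
            f.write("[fossdash]\n")
            f.write(f"url = {row[0]}\n")

if __name__ == "__main__":
    sync_fossdash_conf()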

Week_3 (15-June to 20-June)

Week_4 (22-June to 27-June)

Week_5 (29-June to 04-July)

  • Renamed the UUID on a FOSSology instance.
  • Completed the first GSoC evaluation.
  • Task-1: Cleaning up old fossdash report files.
    • Currently, report files are kept on disk after their data has been sent to InfluxDB. Since the fossdash script may run every day, these report files accumulate locally and consume more and more disk space over time.
    • Implemented this using the find command (-ctime, -maxdepth) to locate and delete older report files and save disk space (see the sketch after this list).
  • Task-2: Cron job configuration to schedule the interval for the fossdash script.
    • From the configuration, the user can change the cron job schedule interval for fossdash.
    • Done using the crontab command to update the cron job interval.
  • Task-3: Implemented an Enable/Disable button to control the fossdash functionality.
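
For reference, a minimal Python sketch of Task-1 and Task-2 that simply wraps the same find and crontab commands; the report directory, retention period, cron expression, and script path are placeholder values:

import subprocess

REPORT_DIR = "/tmp/fossdash_reports"                                 # placeholder report location
FOSSDASH_SCRIPT = "/usr/local/share/fossology/fossdash-publish.py"   # placeholder script path

def cleanup_old_reports(report_dir=REPORT_DIR, days=7):
    # Task-1: delete report files older than `days` days (top level only),
    # mirroring: find <dir> -maxdepth 1 -type f -ctime +N -delete
    subprocess.run(
        ["find", report_dir, "-maxdepth", "1", "-type", "f",
         "-ctime", f"+{days}", "-delete"],
        check=True,
    )

def update_cron_interval(cron_expr="0 * * * *", script=FOSSDASH_SCRIPT):
    # Task-2: replace any existing fossdash entry in the user's crontab
    # with one that runs the script on the new schedule.
    current = subprocess.run(["crontab", "-l"], capture_output=True, text=True)
    kept = [line for line in current.stdout.splitlines() if script not in line]
    kept.append(f"{cron_expr} python3 {script}")
    subprocess.run(["crontab", "-"], input="\n".join(kept) + "\n", text=True, check=True)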

Week_6 (06-July to 11-July)

Week_7 (13-July to 18-July)

Week_8 (20-July to 25-July)

Week_9 (26-July to 31-July)

Week_10 (03-August to 08-August)

Week_11 (10-August to 15-August)

Week_12 (17-August to 22-August)