Parametrize vroom size #3

Open: wants to merge 3 commits into main

README.md (7 changes: 5 additions & 2 deletions)

@@ -62,6 +62,9 @@

Option 2: Download the project and run it as follows...

```bash
git clone --recurse-submodules git@github.com:TRON-Bioinformatics/tronflow-copy-number-calling.git
$ nextflow run main.nf -profile conda --input_files <YOUR_INPUT_FILE> --reference <YOUR_REFERENCE_FASTA> --intervals <YOUR_TARGET_REGIONS_BED> --tools cnvkit,sequenza
```

> Collaborator (on the `git clone` line): Good point, it would be good to have this in every tronflow repo, but this is obviously very difficult to maintain. Maybe we could have generic instructions when possible in the documentation and keep README files to a minimum. Just thinking aloud... maybe we can discuss in the larger group next week.

@@ -87,13 +90,13 @@

Input:
Example input file:
name1 tumor_bam1 normal_bam1
name2 tumor_bam2 normal_bam2
- * reference: path to the FASTA genome reference
+ * reference: path to the FASTA genome reference. If sequenza is included, this file must not contain any alternative chromosomes and must be sorted in the same order as the BAM file (a quick order check is sketched below).
* intervals: path to the BED file with the targeted regions
* tools: tools to perform CN calling with (single or multiple entries possible, use ',' as delimiter) [ cnvkit, sequenza ]
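
As an aside (not part of the diff): one way to verify the new ordering requirement is to compare chromosome order between the reference index and the BAM header. A minimal sketch, assuming samtools is on the PATH and using placeholder file names:

```bash
# Sketch: compare chromosome order between reference FASTA and BAM header.
# reference.fa and tumor.bam are placeholders; requires samtools.
samtools faidx reference.fa                      # writes reference.fa.fai
cut -f1 reference.fa.fai > ref_order.txt         # chromosome names in FASTA order
samtools view -H tumor.bam \
  | awk '$1 == "@SQ" { sub(/^SN:/, "", $2); print $2 }' > bam_order.txt
diff ref_order.txt bam_order.txt && echo "chromosome order matches"
```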

Optional input:
* output: the folder where to publish output (default: output)
- * VROOM_CONNECTION_SIZE: value for the environment variable VROOM_CONNECTION_SIZE which sometimes causes trouble with sequenza (default: 500000000)
+ * VROOM_CONNECTION_SIZE: value for the environment variable VROOM_CONNECTION_SIZE, which sometimes causes trouble with sequenza and may need to be increased (default: 500000000); see the example after this list
* cpus: the number of CPUs used by each job (default: 1)
* memory: the amount of memory used by each job (default: 4g)
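
For illustration (not part of the diff), a sketch of overriding the new parameter at the command line; the placeholder inputs and the chosen buffer size are arbitrary examples:

```bash
# Sketch: run only sequenza with an enlarged vroom connection buffer.
# All <YOUR_...> placeholders must be replaced with real paths;
# the buffer value below is an arbitrary example, not a recommendation.
nextflow run main.nf \
  -profile conda \
  --input_files <YOUR_INPUT_FILE> \
  --reference <YOUR_REFERENCE_FASTA> \
  --intervals <YOUR_TARGET_REGIONS_BED> \
  --tools sequenza \
  --VROOM_CONNECTION_SIZE 1000000000
```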

local_modules/rsequenza/main.nf (4 changes: 2 additions & 2 deletions)

@@ -2,7 +2,7 @@ process R_SEQUENZA {
tag "$meta.id"
label 'process_medium'

- conda (params.enable_conda ? "conda-forge::r-base=4.2.2 bioconda::r-sequenza=3.0.0" : null)
+ conda (params.enable_conda ? "conda-forge::r-base=4.2.2 bioconda::r-sequenza=3.0.0 conda-forge::r-iotools=0.3-2" : null)
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/r-sequenza:3.0.0--r42h3342da4_5' :
'biocontainers/r-sequenza:3.0.0--r42h3342da4_5' }"

@@ -23,7 +23,7 @@

library(sequenza)

- Sys.setenv(VROOM_CONNECTION_SIZE = "131072000")
+ Sys.setenv(VROOM_CONNECTION_SIZE = "${params.VROOM_CONNECTION_SIZE}")

seqz <- sequenza.extract(file="${seqz}", verbose = FALSE)
#data.file <- system.file("extdata", "example.seqz.txt.gz", package = "sequenza")
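
For context (not part of the diff): with the README default, Nextflow renders the templated line above to a plain string. The same calls can be checked in a throwaway R session, assuming Rscript is available:

```bash
# Sketch: what the templated Sys.setenv line expands to with the default,
# checked in a standalone R session (assumes Rscript on PATH).
Rscript -e 'Sys.setenv(VROOM_CONNECTION_SIZE = "500000000"); cat(Sys.getenv("VROOM_CONNECTION_SIZE"), "\n")'
```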

tests/scripts/run_test_02.sh (3 changes: 2 additions & 1 deletion)

@@ -36,7 +36,8 @@

nextflow run main.nf \
--output ${output} \
--reference ${reference} \
--intervals ${intervals} \
- --tools ${tool}
+ --tools ${tool} \
+ --VROOM_CONNECTION_SIZE 1536870912

if [ $? -eq 1 ]
then