check if shm is necessary to mount as a volume for audio processing #6

palmoreck opened this issue Apr 28, 2020 · 0 comments

Running `docker run` with the nvcr.io/nvidia/tensorflow:19.03-py3 docker image, I got:

    The SHMEM allocation limit is set to the default of 64MB. This may be
    insufficient for TensorFlow. NVIDIA recommends the use of the following flags:
    nvidia-docker run --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 ...

Maybe I need to mount a volume like:

        volumeMounts:
          - name: efs-pvc
            mountPath: "/shared_volume"
          - name: dshm
            mountPath: /dev/shm
      volumes:
        - name: efs-pvc
          persistentVolumeClaim:
            claimName: efs
        - name: dshm
          emptyDir:
            medium: Memory

in

https://github.com/CONABIO/kube_sipecam/blob/master/deployments/audio/kale-jupyterlab-kubeflow_0.4.0_1.14.0_tf_cpu.yaml#L35

??
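
For reference, here is a minimal sketch of a standalone pod spec with that dshm volume wired in. The pod and container names are placeholders, and the sizeLimit just mirrors NVIDIA's --shm-size=1g suggestion; none of it is taken from the actual contents of kale-jupyterlab-kubeflow_0.4.0_1.14.0_tf_cpu.yaml:

    apiVersion: v1
    kind: Pod
    metadata:
      name: audio-shm-test              # placeholder name
    spec:
      containers:
        - name: tf-audio                # placeholder name
          image: nvcr.io/nvidia/tensorflow:19.03-py3
          volumeMounts:
            - name: efs-pvc
              mountPath: "/shared_volume"
            - name: dshm
              mountPath: /dev/shm       # replaces the 64MB default shm
      volumes:
        - name: efs-pvc
          persistentVolumeClaim:
            claimName: efs
        - name: dshm
          emptyDir:
            medium: Memory              # RAM-backed tmpfs instead of the default shm
            sizeLimit: 1Gi              # caps usage; mirrors NVIDIA's --shm-size=1g hint

Once the pod is running, df -h /dev/shm inside the container should report a tmpfs well above the 64M default, which is a quick way to check that the mount took effect.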
