diff --git a/.nojekyll b/.nojekyll new file mode 100644 index 000000000..e69de29bb diff --git a/404.html b/404.html new file mode 100644 index 000000000..51ee8c160 --- /dev/null +++ b/404.html @@ -0,0 +1,265 @@ + + + + + + + + + + + + + 저런! 잘못 오셨습니다! | PyTorchKR | 파이토치 한국 사용자 모임 + + + + + + + + + + + + + + + + + + + + + + + +
+
+ + +
+ + + + + + + + +
+ +
+
+ +
+
+
+
+ + + +
+ + +

저런!

+ +

잘못된 경로로 접근하셨거나 접근하신 경로에 문서가 없습니다.

+ +

+ 정상적인 경로로 접근하셨는데 문제가 계속된다면 홈페이지 저장소에 이슈를 남겨주세요. +

+ +

+ 또는, 여기를 눌러 첫 페이지로 가실 수 있습니다. +

+
+ + + +
+
+
+
+ +
+
+
+
+

PyTorchKorea @ GitHub

+

파이토치 한국 사용자 모임을 GitHub에서 만나보세요.

+ GitHub로 이동 +
+ +
+

한국어 튜토리얼

+

한국어로 번역 중인 파이토치 튜토리얼을 만나보세요.

+ 튜토리얼로 이동 +
+ +
+

커뮤니티

+

다른 사용자들과 의견을 나누고, 도와주세요!

+ 커뮤니티로 이동 +
+
+
+
+ + + +
+
+
+
+ + +
+
+
+ + +
+ + + + + + + + + + + + + + + \ No newline at end of file diff --git a/CNAME b/CNAME new file mode 100644 index 000000000..f69ce4fa0 --- /dev/null +++ b/CNAME @@ -0,0 +1 @@ +pytorch.kr \ No newline at end of file diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 000000000..17de6abee --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,179 @@ +# PyTorch 한국어 모델 허브 번역 기여하기 + +PyTorch 한국어 모델 허브 저장소에 방문해주셔서 감사합니다. 이 문서는 PyTorch 한국어 모델 허브에 기여하는 방법을 안내합니다. + + +## 기여하기 개요 + +[본 저장소](https://github.com/PyTorchKorea/hub-kr)는 [PyTorch 공식 허브](https://pytorch.org/hub/)를 번역하는 프로젝트를 위한 곳으로, +[Pytorch 공식 허브 저장소](https://github.com/pytorch/hub)의 내용을 비정기적으로 반영하고, 번역 및 개선합니다. + +크게 다음과 같은 기여 방법이 있습니다. + +* [1. 오탈자를 수정하거나 번역을 개선하는 기여](#1-오탈자를-수정하거나-번역을-개선하는-기여) + * [PyTorch 한국어 모델 허브 사이트](https://pytorch.kr/hub/)에서 발견한 오탈자를 [본 저장소](https://github.com/PyTorchKorea/hub-kr)에서 고치는 기여입니다. +* [2. 번역되지 않은 허브 모델을 번역하는 기여](#2-번역되지-않은-허브-모델을-번역하는-기여) + * [PyTorch 한국어 모델 허브 사이트](https://pytorch.kr/hub/)에 아직 번역되지 않은 모델 허브를 번역하는 기여입니다. +* [3. 2로 번역된 문서를 리뷰하는 기여](#3-2로-번역된-문서를-리뷰하는-기여) :star: + * [본 저장소에 Pull Request된 허브 문서](https://github.com/PyTorchKorea/hub-kr/pulls)를 리뷰하는 기여입니다. + +기여 및 리뷰 시 [행동 강령](https://github.com/PyTorchKorea/.github/blob/master/CODE_OF_CONDUCT.md)을 지켜주시면 감사하겠습니다. + +## 기여 결과물의 라이선스 동의 + +PyTorch 한국어 모델 허브는 [Pytorch 공식 허브 저장소](https://github.com/pytorch/hub)와 동일한 [BSD 3항 라이선스](https://github.com/PyTorchKorea/pytorch.kr/blob/master/LICENSE)를 따릅니다. \ +따라서 기여하신 모든 내용에 [BSD 3항 라이선스](https://github.com/PyTorchKorea/pytorch.kr/blob/master/LICENSE)가 적용됨을 인지하시고 동의하시는 경우에만 아래 문서 내용과 같이 기여해주세요. + + +## 기여하기 절차 + +모든 기여는 [본 저장소에 이슈](https://github.com/PyTorchKorea/hub-kr/issues)를 남긴 후 [Pull Request를 보내는 것](https://github.com/PyTorchKorea/hub-kr/pulls)으로 합니다. \ +이 과정을 통해 Pull Request를 위한 Commit을 만들기 전에 이슈를 통해 해당 내용에 기여가 필요한지 여부를 확인하고 협의하셔야 합니다. \ +(물론 이슈를 남기셨다고 해서 반드시 해당 문제를 개선하셔야 하는 것은 아니니, 마음 편히 이슈를 남겨주세요. :)) + +### Pull Request 만들기 + +#### Pull Request 만들기 전 : 주의사항 + +* 하나의 commit, branch, Pull Request(PR)에는 하나의 변경 사항만 담아주세요. + * 여러 수정사항에 대해서는 각각 다른 branch에서 작업하신 뒤, 새로운 PR을 만들어주세요. + * 새로운 branch가 아닌, 이미 PR를 만드셨던 branch에 추가 commit 시에는 이전 commit들과 함께 Pull Request가 생성됩니다. +* Pull Request를 만들기 전 문법 오류나 깨진 글자는 없는지 확인해주세요. + * 기본적인 문법은 Markdown 문법을 지키면서 작성해주세요. + * 이미 번역된 문서들을 참고하셔도 좋습니다. + * 번역 후에는 (내 컴퓨터에서) 빌드를 한 후, 문법 오류를 확인해주세요. +* 오류가 많거나 다른 PR의 commit이 섞여 있는 경우 해당 PR은 관리자가 닫을 수 있으니 주의해주세요. +* Commit 메시지 작성 규칙을 지켜주세요. + * 새로운 번역 작성 시는 "[번역]:ResNet 모델" + * 번역에 대한 수정을 반영할 때는 "[Fix]:오타 수정" + +#### Pull Request 만들기 : 생성하기 + +* `라이선스 동의` 체크하기 ✅ + * 기여해주신 내용을 더 많은 분이 참고 / 개선 / 변경할 수 있게 라이선스 적용에 동의해주세요. + * 동의를 거부하실 수 있으나, 이 경우 해당 PR의 내용의 자유로운 사용이 어렵기 때문에 리뷰 및 반영은 진행하지 않습니다. +* PR 내용에 관련 이슈 번호 적어주기 🔢 + * 논의된 내용이 있다면 참고할 수 있도록 어떠한 이슈로부터 생성한 PR인지 알려주세요. +* PR 종류 선택하기 + * 리뷰어에게 어떤 종류의 PR인지 알려주세요. +* PR 설명하기 + * 이 PR을 통해 어떠한 것들이 변경되는지 알려주세요. +* **Tip**: 만약 문서가 방대해서 중간 피드백이 필요하다면 Draft PR 기능을 사용할 수 있습니다. + * 자세한 내용은 [GitHub Blog](https://github.blog/2019-02-14-introducing-draft-pull-requests/)의 글을 참고해주세요. + +#### Pull Request 만든 후 : 리뷰를 받았을 때 + +* 리뷰 내용에 대한 추가 의견이 있을 경우 해당 리뷰에 댓글로 의견을 주고받습니다. + * 번역한 문서의 내용은 번역자가 가장 잘 알고 있으므로 리뷰어의 의견에 반드시 따라야 하는 것은 아닙니다. + * 하지만 번역 실수나 오류, 잘못된 Markdown 문법에 대한 내용은 가급적 반영해주시기를 부탁드립니다. + * 다른 문서들과의 일관성, 이해를 위해 추가로 요청드리는 내용들도 있을 수 있으니 감안해주세요. +* 변경 사항을 고치기로 하였다면, Pull Request를 만든 원본 저장소 / branch에 추가 commit을 합니다. + * 리뷰 결과를 반영한 경우 `Resolve Conversation` 버튼을 눌러 리뷰어에게 알립니다. + +### Pull Request 리뷰하기 + +* 리뷰 전 (TRANSLATION_GUIDE.md - TBD) 문서를 읽고 리뷰해주세요. 
+* 특히 다음의 내용들을 유의해주세요. + * 번역된 용어들이 용어집에 맞게 사용되었는지 확인합니다. + * 번역된 내용에 오탈자가 있는지 확인해 봅니다. + * 부자연스러운 내용이 있다면 좀 더 나은 번역으로 제안하여 봅니다. + * Markdown 문법에 맞게 잘 작성되어있는지 확인해 봅니다. +* 말하려는 내용이 이미 다른 댓글에 있다면 공감 이모지 눌러주세요. + + +## (기여 종류에 따른) 기여 방법 + +### 1. 오탈자를 수정하거나 번역을 개선하는 기여 +
+ 펼치기 + +[PyTorch 한국어 모델 허브 사이트](https://pytorch.kr/hub/)에서 발견한 오탈자를 고치는 기여 방법입니다. + +#### 1-1. 이슈 남기기 + +(매우 낮은 확률로) 해당 오탈자가 의도한 것일 수 있으니, 해당 문제점을 고친 Pull Request를 생성하기 전에 [본 저장소에 이슈](https://github.com/PyTorchKorea/hub-kr/issues)를 검색하거나 새로 남겨주세요. + +해당 문제점에 대한 개선 사항이 **이미 논의되었거나 진행 중인 Pull Request를 통해 해결 중일 수 있으니, 새로 이슈를 만드시기 전, 먼저 검색**을 해주시기를 부탁드립니다. + +이후, 새로 남겨주신 이슈에서 저장소 관리자 및 다른 방문자들이 함께 문제점에 대해 토의하실 수 있습니다. (또는 이미 관련 이슈가 존재하지만 해결 중이지 않은 경우에는 댓글을 통해 기여를 시작함을 알려주세요.) + +#### 1-2. 저장소 복제하기 + +오탈자를 수정하기 위해 저장소를 복제합니다. \ +저장소 복제가 처음이시라면 [GitHub의 저장소 복제 관련 도움말](https://help.github.com/en/github/getting-started-with-github/fork-a-repo)을 참조해주세요. + + +#### 1-3. 오탈자 수정하기 + +위에서 찾은 원본 허브 문서를 Markdown 문법에 맞춰 수정합니다. \ +Markdown 문법에 익숙하지 않은 경우, 다른 허브 문서의 원본 문서와 빌드 결과물을 비교해보면서 빌드 결과물을 예상할 수 있습니다. + +#### 1-4. (내 컴퓨터에서) 결과 확인하기 + +저장소의 최상위 경로에서 `preview_hub.sh` 명령어를 이용하면 코드 실행 없이 `http://127.0.0.1:4000/` 로컬 주소를 활용하여 빌드 결과물을 빠르게 확인하실 수 있습니다. + +빌드를 위한 자세한 과정은 [Window_build.md](https://github.com/PyTorchKorea/hub-kr/blob/master/Window_build.md)와 [파이토치 허브 README.md](https://github.com/PyTorchKorea/hub-kr)를 참고해주시길 바랍니다. + +#### 1-5. Pull Request 만들기 + +수정을 완료한 내용을 복제한 저장소에 Commit 및 Push하고, Pull Request를 남깁니다. \ +Pull Request를 만드시기 전에 이 문서에 포함된 [Pull Request 만들기](#Pull-Request-만들기) 부분을 반드시 읽어주세요. \ +만약 Pull Request 만들기가 처음이시라면 [GitHub의 Pull Request 소개 도움말](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests) 및 [복제한 저장소로부터 Pull Request 만들기 도움말](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork)을 참조해주세요. + +
+ +### 2. 번역되지 않은 허브 모델을 번역하는 기여 +
+ 펼치기 + +[PyTorch 한국어 모델 허브 사이트](https://pytorch.kr/hub/)에 아직 번역되지 않은 모델 허브을 번역하는 기여 방법입니다. + +#### 2-1. 이슈 남기기 + +(매우 낮은 확률로) 해당 허브가 번역 중일 수 있으니, 번역 전에 Pull Request를 생성하기 전에 [본 저장소에 이슈](https://github.com/PyTorchKorea/hub-kr/issues)를 검색하거나 새로 남겨주세요. + +해당 허브에 대한 **번역이 이미 논의되었거나 Pull Request를 통해 진행 중일 수 있으니, 새로 이슈를 만드시기 전, 먼저 검색**을 해주시기를 부탁드립니다. \ +이후, 새로 남겨주신 이슈에서 저장소 관리자 및 다른 방문자들이 함께 번역 진행에 대해 토의하실 수 있습니다. \ +(또는 이미 관련 이슈가 존재하지만 번역 중이지 않은 것처럼 보이는 경우에는 댓글을 통해 기여를 시작함을 알려주세요.) + +#### 2-2. 저장소 복제하기 + +신규 모델 허브을 번역하기 위해 저장소를 복제합니다. \ +저장소 복제가 처음이시라면 [GitHub의 저장소 복제 관련 도움말](https://help.github.com/en/github/getting-started-with-github/fork-a-repo)을 참조해주세요. + +#### 2-3. 원본 경로 / 문서 찾기 + +허브 모델 번역을 위해서는 [PyTorch 한국어 모델 허브 사이트](https://pytorch.kr/hub/)의 모델 주소로부터 원본 문서를 찾아야합니다. \ +모델 주소에서 `https://pytorch.kr/hub/` 뒷 부분이 문서 이름 입니다. 이 문서는 `https://github.com/PyTorchKorea/hub-kr`에 `.md` 확장자로 존재합니다. \ +예를 들어, 파이토치 허브의 YOLOv5 모델 경로가 'https://pytorch.kr/hub/ultralytics_yolov5/' 일 때, 'https://github.com/PyTorchKorea/hub-kr'에 있는 'ultralytics_yolov5.md' 파일이 원본 문서입니다. + +#### 2-4. 허브 번역하기 + +위에서 찾은 원본 허브 문서를 Markdown 문법에 맞춰 번역합니다. \ +번역 중 번역 용어에 대해서는 다른 모델 허브 문서를 참조하시거나, `2-1`에서 남긴 이슈의 댓글을 통해 토의하실 수 있습니다. \ +Markdown 문법에 익숙하지 않은 경우, 다른 허브 원본 문서와 빌드 결과물을 비교해보면서 빌드 결과물을 예상할 수 있습니다. + +#### 2-5. (내 컴퓨터에서) 결과 확인하기 + +저장소의 최상위 경로에서 `preview_hub.sh` 명령어를 이용하면 코드 실행 없이 `http://127.0.0.1:4000/` 로컬 주소를 활용하여 빌드 결과물을 빠르게 확인하실 수 있습니다. \ +이 과정에서 수정한 문서 상에서 발생하는 오류가 있다면 Markdown 문법을 참고하여 올바르게 고쳐주세요. \ + +#### 2-6. Pull Request 만들기 + +번역을 완료한 내용을 복제한 저장소에 Commit 및 Push하고, Pull Request를 남깁니다. \ +Pull Request를 만드시기 전에 이 문서에 포함된 [Pull Request 만들기](#Pull-Request-만들기) 부분을 반드시 읽어주세요. \ +만약 Pull Request 만들기가 처음이시라면 [GitHub의 Pull Request 소개 도움말](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests) 및 [복제한 저장소로부터 Pull Request 만들기 도움말](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork)을 참조해주세요 + +
+ +### 3. Pull Request에 대해 리뷰하는 기여 +
+ 펼치기 + +[본 저장소에 Pull Request된 허브 문서](https://github.com/PyTorchKorea/hub-kr/pulls)를 리뷰하는 기여입니다. + +Pull Request된 문서의 오탈자 수정, Markdown 문법 오류 또는 잘못 번역된 내용을 개선하는 기여로, 가장 기다리고 있는 기여 방식입니다. :pray: \ +Pull Request를 리뷰하시기 전에 이 문서에 포함된 [Pull Request 리뷰하기](#Pull-Request-리뷰하기) 부분을 반드시 읽어주세요. \ +만약 PR 리뷰가 익숙하지 않으시다면 [GitHub의 Pull Request 리뷰 관련 도움말](https://docs.github.com/en/free-pro-team@latest/github/collaborating-with-issues-and-pull-requests/about-pull-request-reviews)을 참조해주세요. + +
\ No newline at end of file
diff --git a/CONTRIBUTING_MODELS.md b/CONTRIBUTING_MODELS.md
new file mode 100644
index 000000000..63484de61
--- /dev/null
+++ b/CONTRIBUTING_MODELS.md
@@ -0,0 +1,93 @@
+# PyTorch Korea 에 모델 게시하기
+
+본 페이지는 사전 학습된 모델을 hub를 통해 배포하는 방법과 배포된 모델을 [PyTorch Korea](https://pytorch.kr/hub/)의 모델 안내 페이지에 추가하는 방법을 안내합니다.
+
+## 1. `torch.hub` 를 통해 모델 배포하기
+
+`torch.hub`는 연구 재현 및 사용을 용이하게 하기 위해 설계된 사전 훈련된 모델 저장소입니다.
+
+### 1.1. 모델 퍼블리싱
+
+PyTorch Hub는 `hubconf.py`를 추가하여 사전 훈련된 모델(모델 정의 및 사전 훈련된 가중치)을 깃허브 저장소에 게시할 수 있도록 지원합니다.
+
+`hubconf.py`는 여러 개의 엔트리 포인트(entry point)를 가질 수 있습니다. 각 엔트리 포인트는 파이썬 함수로 정의됩니다. (예를 들어, 사용자가 등록하고자 하는 사전 훈련된 모델)
+
+```python
+def entrypoint_name(*args, **kwargs):
+    # args와 kwargs는 선택 사항으로, 위치/키워드 인자를 받는 모델을 위한 것입니다.
+    ...
+```
+
+### 1.2. 구현 방법
+
+다음은 `pytorch/vision/hubconf.py`를 참고하여 작성한, `resnet18` 모델의 엔트리 포인트를 지정하는 코드입니다.
+대부분의 경우 `hubconf.py`에서 알맞은 함수를 가져오는(import) 것만으로 충분합니다.
+여기에서는 확장된 버전을 예시로 작동 방식을 보여드리겠습니다.
+전체 스크립트는 [pytorch/vision repo](https://github.com/pytorch/vision/blob/main/hubconf.py)에서 볼 수 있습니다.
+
+```python
+dependencies = ['torch']
+from torchvision.models.resnet import resnet18 as _resnet18
+
+
+# resnet18은 엔트리 포인트의 이름입니다.
+def resnet18(pretrained=False, **kwargs):
+    """Resnet18 모델
+    이 docstring은 hub.help() 안에서 보여집니다.
+    pretrained (bool): kwargs, 사전 학습된 가중치를 모델에 불러올지 여부
+    """
+    # 모델을 생성하고 사전 학습된 가중치를 불러옵니다.
+    model = _resnet18(pretrained=pretrained, **kwargs)
+    return model
+```
+
+`dependencies` 변수는 모델을 불러오는 데 필요한 패키지 이름을 담은 목록입니다. 이는 모델 학습에 필요한 종속성과는 약간 다를 수 있음을 주의하세요.
+`args`와 `kwargs`는 실제 호출 가능한 함수로 전달됩니다.
+함수의 docstring은 도움말 메시지의 역할을 합니다. 모델의 기능과 허용되는 위치/키워드 인자에 대해 설명하며, 여기에 몇 가지 예시를 추가하는 것이 좋습니다.
+엔트리 포인트 함수는 모델(nn.Module)을 반환하거나, 작업 흐름을 보다 매끄럽게 만들기 위한 보조 도구(예: tokenizer)를 반환할 수도 있습니다.
+이름이 밑줄(_)로 시작하는 호출 가능 객체(callable)는 torch.hub.list()에 표시되지 않는 도우미 함수로 간주됩니다.
+사전 훈련된 가중치는 깃허브 저장소에 로컬로 저장하거나 `torch.hub.load_state_dict_from_url()`을 사용해 불러올 수 있습니다. 크기가 2GB 미만일 경우 [project release](https://docs.github.com/en/repositories/working-with-files/managing-large-files/about-large-files-on-github)에 첨부하고 릴리스의 URL을 사용하는 것을 권장합니다. 위의 예시에서는 `torchvision.models.resnet.resnet18`이 `pretrained`를 처리하지만, 해당 로직을 엔트리 포인트 정의 안에 직접 넣을 수도 있습니다.
+
+```
+if pretrained:
+    # 체크포인트를 로컬 깃허브 저장소에 저장한 경우, 예를 들면 =weights/save.pth
+    dirname = os.path.dirname(__file__)
+    checkpoint = os.path.join(dirname, )
+    state_dict = torch.load(checkpoint)
+    model.load_state_dict(state_dict)
+
+    # 체크포인트가 다른 곳에 저장된 경우
+    checkpoint = 'https://download.pytorch.org/models/resnet18-5c106cde.pth'
+    model.load_state_dict(torch.hub.load_state_dict_from_url(checkpoint, progress=False))
+```
+
+**주의 사항**
+배포된 모델들은 적어도 하나의 branch/tag에 속해야 하며, 임의의 커밋을 가리켜서는 안 됩니다.
+
+## 2. [PyTorch Korea](https://pytorch.kr/hub/) 에 모델 안내 페이지 추가하기
+
+ - `torch.hub` 를 통해 모델을 배포한 이후, 해당 모델에 대한 안내 페이지를 pytorch.kr 에 추가합니다.
+
+### 2.1. 페이지 작성하기
+
+#### 2.1.1. 본 저장소 루트에 파일 생성하기
+
+```bash
+touch ${project_root}/sample.md
+```
+
+#### 2.1.2. [template.md](https://github.com/PyTorchKorea/hub-kr/blob/master/docs/template.md) 를 참조하여 파일 작성하기
+
+### 2.2. 결과 확인하기
+
+#### 2.2.1. [빌드하기](https://github.com/PyTorchKorea/hub-kr#빌드하기) 를 참조하여 홈페이지를 빌드하기
+
+#### 2.2.2. 이후에 http://127.0.0.1:4000/hub/ 에서 추가된 페이지를 확인하기
+
+다음과 같은 페이지를 확인하실 수 있습니다.
+
+![모델 페이지 안내](images/model_page.png)
+
+### 2.3. 기여하기
+
+추가한 페이지는 [기여하기](https://github.com/PyTorchKorea/hub-kr/blob/master/CONTRIBUTING.md) 를 참고하여 본 레포지터리에 기여해주세요!
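+
+참고: 위 1절처럼 `hubconf.py`를 게시한 뒤에는, 사용자가 다음과 같이 엔트리 포인트를 확인하고 모델을 불러올 수 있습니다.
+아래는 `pytorch/vision` 저장소를 예시로 든 간단한 스케치이며, 실제로는 본인이 게시한 `<소유자>/<저장소>` 이름을 사용하면 됩니다.
+
+```python
+import torch
+
+# 저장소에 게시된 엔트리 포인트 목록을 확인합니다. (밑줄로 시작하는 함수는 제외됩니다.)
+print(torch.hub.list('pytorch/vision'))
+
+# 특정 엔트리 포인트의 docstring(도움말)을 확인합니다.
+print(torch.hub.help('pytorch/vision', 'resnet18'))
+
+# 엔트리 포인트를 통해 사전 학습된 모델을 불러옵니다.
+model = torch.hub.load('pytorch/vision', 'resnet18', pretrained=True)
+model.eval()
+```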
diff --git a/Cub200Dataset.png b/Cub200Dataset.png new file mode 100644 index 000000000..ead780b0d Binary files /dev/null and b/Cub200Dataset.png differ diff --git a/GPT1.png b/GPT1.png new file mode 100644 index 000000000..425ea2e75 Binary files /dev/null and b/GPT1.png differ diff --git a/MEALV2.png b/MEALV2.png new file mode 100644 index 000000000..b4e8b2088 Binary files /dev/null and b/MEALV2.png differ diff --git a/MEALV2_method.png b/MEALV2_method.png new file mode 100644 index 000000000..02f7668d4 Binary files /dev/null and b/MEALV2_method.png differ diff --git a/MEALV2_results.png b/MEALV2_results.png new file mode 100644 index 000000000..947734e70 Binary files /dev/null and b/MEALV2_results.png differ diff --git a/README.md b/README.md new file mode 100644 index 000000000..162d286ef --- /dev/null +++ b/README.md @@ -0,0 +1,44 @@ +# PyTorch 한국어 모델 허브 + +## 소개 + +PyTorch에서 제공하는 모델 허브의 한국어 번역을 위한 저장소입니다.\ +번역의 결과물은 [https://pytorch.kr/hub](https://pytorch.kr/hub)에서 확인하실 수 있습니다. (번역을 진행하며 **불규칙적으로** 업데이트합니다.) +새로운 모델에 대한 반영은 [모델 허브의 공식 저장소](https://github.com/pytorch/hub)를 참고해주세요. + +## 빌드하기 + +파이토치 허브는 [파이토치 한국 사용자 모임 홈페이지의 일부](https://pytorch.kr/hub/)입니다. \ +빌드를 위해서는 [파이토치 한국 사용자 모임 홈페이지 빌드 환경](https://github.com/PyTorchKorea/pytorch.kr#%EB%B9%8C%EB%93%9C-%EC%A0%88%EC%B0%A8)이 준비되어야 합니다. \ +자세한 내용은 [PyTorchKorea/pytorch.kr 저장소의 README.md](https://github.com/PyTorchKorea/pytorch.kr#%EB%B9%8C%EB%93%9C-%EC%A0%88%EC%B0%A8)를 참고해주세요. +빌드 환경이 준비되었다면, 아래 명령어로 빌드 및 미리보기를 할 수 있습니다. +```sh + ./preview_hub.sh +``` + +Window 10의 경우 아래 명령어로 빌드 및 미리보기를 진행해주세요. +빌드 환경은 [Pytorch-hub-kr build in Window 10](./Window_build.md) 문서를 참고해주세요. +```sh + ./preview_hub_window.sh +``` + +## 기여하기 + +다음의 방법들로 기여하실 수 있습니다. + +1. 오탈자를 수정하거나 번역을 개선하는 기여 + * [한국어 모델 허브 사이트](https://pytorch.kr/hub)에서 발견한 오탈자를 [한국어 모델 허브 저장소](https://github.com/PyTorchKorea/hub-kr)에서 고치는 기여입니다. +2. 번역되지 않은 모델 소개를 번역하는 기여 + * [한국어 모델 허브 사이트](https://pytorch.kr/hub)에서 아직 번역되지 않은 모델 소개를 번역하는 기여입니다. +3. 2로 번역된 문서를 리뷰하는 기여 :star: + * [본 저장소에 Pull Request된 모델 소개](https://github.com/PyTorchKorea/hub-kr/pulls)의 번역이 적절한지 리뷰하는 기여입니다. \ + (많은 분들의 참여를 간절히 기다리고 있습니다. :pray:) + +## 원문 + +현재 [PyTorch v1.9 기준(pytorch/hub@552c779)](https://github.com/pytorch/hub/commit/552c779) 번역이 진행 중입니다. \ +최신 버전의 모델 소개(공식, 영어)는 [PyTorch 모델 허브 사이트](https://pytorch.org/hub) 및 [PyTorch 모델 허브 저장소](https://github.com/pytorch/hub)를 참고해주세요. + +-- +This is a project to translate [pytorch/hub@552c779](https://github.com/pytorch/hub/commit/552c779) into Korean. +For the latest version, please visit to the [official PyTorch model hub repo](https://github.com/pytorch/hub). \ No newline at end of file diff --git a/ResNeXtArch.png b/ResNeXtArch.png new file mode 100644 index 000000000..b75d41b64 Binary files /dev/null and b/ResNeXtArch.png differ diff --git a/SEArch.png b/SEArch.png new file mode 100755 index 000000000..a7fb8d047 Binary files /dev/null and b/SEArch.png differ diff --git a/TRANSLATION_GUIDE.md b/TRANSLATION_GUIDE.md new file mode 100644 index 000000000..e618652af --- /dev/null +++ b/TRANSLATION_GUIDE.md @@ -0,0 +1,38 @@ +# 일반 규칙 + +* 번역된 문서만으로도 내용을 이해할 수 있도록 문서를 번역해야 합니다. + * 기계적인 번역이나 피상적인 리뷰는 지양해주세요. + * 일반 명사와 Class 이름은 구분하여 번역을 하거나 원문을 표기합니다. + * 예시. 데이터셋과 Dataset + +* 허브 모델 문서의 meta description도 번역에 포함됩니다. + * 문서 상단의 meta description의 summary 내용도 가급적 번역합니다. + * 예시. + ``` + layout: hub_detail + background-class: hub-background + body-class: hub + title: MobileNet v2 + 👉summary: residual block을 통해 속도와 메모리에 최적화된 효율적인 네트워크 + ⋮ + ``` + +* 반드시 문서를 직역하지 않아도 됩니다. 
+ * 의미 없는 주어는 생략해 주세요. + * 예를 들어, `we`는 강조의 의미가 있지 않는 이상 번역하지 않고 생략합니다. + * 이해를 돕기 위한 약간의 의역이나 설명을 추가해도 좋습니다. + * 단, 원문의 의미가 다르게 해석될 여지가 있는 경우에는 자제해주세요. + +* 쉬운 유지보수를 위해 문장 단위는 가급적 지켜주시기 바랍니다. + * 하지만 문장이 여러 줄에 걸쳐 조각나 있는 경우 등에는 한 줄에 하나의 문장으로 모아주셔도 됩니다. + +* 번역된 문장만으로 의미를 전달하기 어려울 때에는 `한글(영어)`를 같이 작성합니다. + * 해당 용어가 **처음 출현**하였을 때 한글(영어)로 작성하고 그 이후에는 한글만 작성합니다. + +* 현재 허브의 문서에서 사용되는 용어는 [튜토리얼 용어집](https://github.com/PyTorchKorea/tutorials-kr/blob/master/TRANSLATION_GUIDE.md#용어-사용-규칙)을 기준으로 하고 있습니다. + +* 소스 코드, 논문 제목 등은 가급적 번역하지 않습니다. + * 단, 소스 코드에 포함된 주석은 가급적 번역합니다. +* 줄 바꿈 및 공백은 가급적 원문과 동일하게 유지합니다. + * 이후 원본 문서에 추가적인 변경이 발생할 때 유지보수를 돕습니다. + * 너무 긴 문장은 가독성을 고려해 줄 바꿈을 추가해도 좋습니다. \ No newline at end of file diff --git a/Window_build.md b/Window_build.md new file mode 100644 index 000000000..c2880bcbb --- /dev/null +++ b/Window_build.md @@ -0,0 +1,57 @@ +## Pytorch-hub-kr build in Window 10 + +- Author : [Taeyoung96](https://github.com/Taeyoung96) + +Window 10 환경에서 `Pytorch-hub-kr` 빌드를 위한 환경 구축을 하는 방법에 대해 소개해 드리겠습니다. + +환경 설정을 진행하면서 주관적으로 도움을 받았던 링크들을 공유하면서 글을 작성하도록 하겠습니다. + +[Git bash](https://git-scm.com/downloads)에서 Test를 진행했습니다. + +환경 설정 완료 후, `./preview_hub_window.sh`를 command 창에 입력해주시면 빌드 및 미리보기를 할 수 있습니다. + +### 환경 설정 +- Ruby +- node.js +- yarn +- Make command in Window 10 + +❗️ 환경 설정 시 각각의 버전을 아래 명시된 버전과 동일하게 맞추면 수월하게 빌드를 진행할 수 있습니다! + +1. Ruby 설치 + +[RubyInstaller](https://rubyinstaller.org/downloads/archives/)에서 `Ruby+Devkit 2.7.4-1 (x64)`를 다운 받아 설치합니다. +(`Ruby+Devkit 2.7.4-1 (x86)`의 경우 Test를 해보지는 않은 상태입니다. 다만 자신의 윈도우 운영체제 비트수에 맞추어 설치를 진행해야 합니다.) +`ruby -v`로 버전 확인이 가능합니다. `2.7.4` 버전을 설치해야 합니다. + +- ruby 설치 시 참고한 링크 : [[Ruby] 루비 설치하기(Windows 10/윈도우 10) / 예제 맛보기](https://junstar92.tistory.com/5) + [How to install RubyGems in Windows?](https://www.geeksforgeeks.org/how-to-install-rubygems-in-windows/) + +2. bundler 설치 + +git bash command 창에 `gem install bundler -v 2.3.13`로 명령어를 실행하여 bundler를 설치해주세요. +버전은 `2.3.13`으로 설치했습니다. + +3. node.js 설치 + +node.js 설치는 아래 링크를 참고하여 설치했습니다. 버전은 `16.13.2`입니다. +nvm 설치시 관리자의 권한으로 `git bash`를 실행해야 합니다. + +- node.js 설치 시 참고한 링크 : [윈도우 node.js 설치하기](https://kitty-geno.tistory.com/61) + [node.js와 npm 최신 버전으로 업데이트하기 (window 윈도우)](https://cheoltecho.tistory.com/15) + [Access Denied issue with NVM in Windows 10](https://stackoverflow.com/questions/50563188/access-denied-issue-with-nvm-in-windows-10) + +4. yarn 설치 + +git bash command 창에 `npm install --global yarn`로 yarn를 설치해주세요. 버전은 `1.22.19`입니다. + +5. Window 10에서 make 명령어 사용 + +[ezwinports](https://sourceforge.net/projects/ezwinports/)를 설치해야 합니다. 아래의 참고자료를 참고하여 설치를 진행해주세요. + +- 참고자료 : [MINGW64 "make build" error: "bash: make: command not found"](https://stackoverflow.com/questions/36770716/mingw64-make-build-error-bash-make-command-not-found) + +만약 ezwinports를 통해 make를 설치했음에도 불구하고 make 명령어를 사용할 수 없다면, +chocolatey 를 설치하여 make 명령어를 사용할 수 있습니다. + +- 참고자료 : [chocolatey 설치, 윈도우10에서 sudo,make 명령어 사용하기](https://jie0025.tistory.com/72) \ No newline at end of file diff --git a/about/index.html b/about/index.html new file mode 100644 index 000000000..5b39ac505 --- /dev/null +++ b/about/index.html @@ -0,0 +1,1111 @@ + + + + + + + + + + + + + 파이토치 한국 사용자 모임 소개 | PyTorchKR | 파이토치 한국 사용자 모임 + + + + + + + + + + + + + + + + + + + + + + +
+
+ + +
+ + + + + + + + +
+ +
+
+ + +
+ +
+
+

반갑습니다!

+ +

파이토치 한국 사용자 모임은 한국 사용자를 위한 비공식 사용자 모임으로, + 한국어를 사용하시는 많은 분들께 PyTorch를 소개하고 함께 배우며 성장하는 것을 목표로 하고 있습니다.

+ + + 시작하기 + +
+
+ + + + +
+
+
+ + +
+ + + +
+
+
+
  # import torch
+  import torch
+
+  # load model
+  model = torch.hub.load('datvuthanh/hybridnets', 'hybridnets', pretrained=True)
+
+  # inference
+  img = torch.randn(1, 3, 640, 384)
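+  # 'img' here is just a random example input; use a preprocessed image tensor for real inference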
+  features, regression, classification, anchors, segmentation = model(img)
+
+ +
+
+
+ +
+
+

파이토치(PyTorch) 소개

+

PyTorch(파이토치)는 FAIR(Facebook AI Research)에서 만든 연구용 프로토타입부터 상용 제품까지 빠르게 만들 수 있는 오픈 소스 머신러닝 프레임워크입니다.

+ +

PyTorch는 사용자 친화적인 프론트엔드(front-end)와 분산 학습, 다양한 도구와 라이브러리를 통해 빠르고 유연한 실험 및 효과적인 상용화를 가능하게 합니다.

+ +

PyTorch에 대한 더 자세한 소개는 공식 홈페이지에서 확인하실 수 있습니다.

+ +
+
+ +
+ +
+ + + +
+
+

파이토치 한국 사용자 모임 소개

+

파이토치 한국 사용자 모임은 2018년 중순 학습 목적으로 PyTorch 튜토리얼 문서를 한국어로 번역하면서 시작하였습니다.

+ +

PyTorch를 학습하고 사용하는 한국 사용자들이 시작한 사용자 커뮤니티로, 한국어를 사용하시는 많은 분들께 PyTorch를 소개하고 함께 배우며 성장하는 것을 목표로 합니다.

+ +

PyTorch를 사용하며 얻은 유용한 정보를 공유하고 싶으시거나 다른 사용자와 소통하고 싶으시다면 커뮤니티 공간에 방문해주세요!

+ +
+
+ +
+
+
+
  import torch
+
+  class MyModule(torch.nn.Module):
+    def __init__(self, N, M):
+      super(MyModule, self).__init__()
+      self.weight = torch.nn.Parameter(torch.rand(N, M))
+
+    def forward(self, input):
+      if input.sum() > 0:
+        output = self.weight.mv(input)
+      else:
+        output = self.weight + input
+      return output
+
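+  # compile the module to TorchScript and save it (run at module level, outside the class)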
+  my_script_module = torch.jit.script(MyModule(3, 4))
+  my_script_module.save("my_script_module.pt")
+
+ +
+
+
+ +
+ + +
+
+ +

운영진 소개 (Maintainers)

+

파이토치 한국 사용자 모임을 함께 만들어가고 계신 분들을 소개합니다.

+
+
+ + + + +
+
+ 9bow +
+

박정환

+

Lead Maintainer

+

@CoC, @PyTorchKorea

+ + + + + + + + + +
+
+
+ + + + +
+
+ adonisues +
+

황성수

+

Maintainer

+

@Advisory

+ + + + + + + + + +
+
+
+ + + + +
+
+ bongmo +
+

김봉모

+

Maintainer

+

@Advisory

+ + + + + + + + + +
+
+
+ + + + +
+
+ codingbowoo +
+

장보우

+

Maintainer

+

@CoC, @hub-kr, @discuss

+ + + + + + + + + + + + + +
+
+
+ + + + + + +
+
+ hyoyoung +
+

장효영

+

Maintainer

+

@CoC, @tutorials-kr

+ + + + + + + + + + + +
+
+
+ + + + +
+
+ des00 +
+

김현길

+

Maintainer

+

@tutorials-kr

+ + + + + + + + + +
+
+
+ + + + + + + + +
+
+ corazzon +
+

박조은

+

Maintainer

+

@CoC, @discuss

+ + + + + + + + + + + +
+
+
+ + + + + + +
+
+ j-min +
+

조재민

+

Maintainer

+

@Advisory

+ + + + + + + + + +
+
+
+ + + + + + + + +
+
+ convin305 +
+

박수민

+

Maintainer

+

@Contributor2024

+ + + + + + + + + +
+
+
+ + + + +
+
+ dudtheheaven +
+

송채영

+

Maintainer

+

@Contributor2024

+ + + + + + + + + +
+
+
+ + + + +
+
+ jenner9212 +
+

박재윤

+

Maintainer

+

@Contributor2024

+ + + + + + + + + +
+
+
+ + + + +
+
+ jih0-kim +
+

김지호

+

Maintainer

+

@Contributor2024

+ + + + + + + + + +
+
+
+ + + + +
+
+ jet981217 +
+

차승일

+

Maintainer

+

@Contributor2024

+ + + + + + + + + +
+
+
+ + + + +
+
+ falconlee236 +
+

이상윤

+

Maintainer

+

@Contributor2024

+ + + + + + + + + +
+
+
+ + + + +
+
+ hkim15 +
+

김홍석

+

Maintainer

+

@Advisory, @CoreSIG

+ + + + + + + + + +
+
+
+ +
+
+
+
+ + +
+
+

함께 하셨던 분들 (Alumni)

+

파이토치 한국 사용자 모임과 함께 하셨던 분들을 소개합니다.

+
+
+ + + + + + + + + + + + +
+
+ creduo +
+

박주혁

+

Inactive Maintainer

+

@Alumni

+ + + + + + + + + + + + + +
+
+
+ + + + + + + + +
+
+ hrxorxm +
+

이하람

+

Inactive Maintainer

+

@Alumni

+ + + + + + + + + +
+
+
+ + + + +
+
+ Taeyoung96 +
+

김태영

+

Inactive Maintainer

+

@Alumni

+ + + + + + + + + + + +
+
+
+ + + + + + +
+
+ codertimo +
+

김준성

+

Inactive Maintainer

+

@Alumni

+ + + + + + + + + + + + + +
+
+
+ + + + + + +
+
+ jimin.lee +
+

이지민

+

Inactive Maintainer

+

@Alumni

+ + + + + + + + + + + + + +
+
+
+ + + + +
+
+ nysunshine +
+

조형주

+

Inactive Maintainer

+

@Alumni

+ + + + + + + + + + + + + +
+
+
+ + + + + + + + + + + + + + + +
+
+
+
+ +
+
+
+ +
+
+
+
+ +

PyTorch 설치하기

+ +

+ 사용 환경을 선택하고 설치 명령을 복사해서 실행해 보세요. Stable 버전은 테스트 및 지원되고 있는 가장 최근의 PyTorch 버전으로, 대부분의 사용자에게 적합합니다. + Preview 버전은 아직 완전히 테스트나 지원이 되지 않는 최신 버전으로 매일 밤 업데이트됩니다. 사용 중인 패키지 매니저에 따라 아래의 사전 요구사항(예: numpy)이 충족되었는지 확인해 주세요. + 모든 의존성을 설치할 수 있는 Anaconda를 패키지 매니저로 추천합니다. 이전 버전의 PyTorch도 설치할 수 있습니다. + LibTorch는 C++에서만 지원합니다. +

+

+ 참고: + 최신 버전의 PyTorch를 사용하기 위해서는 Python 3.8 이상이 필요합니다. 자세한 내용은 Python 섹션을 참고해주세요. +

+
+
+
+
PyTorch 빌드
+
+
+
OS 종류
+
+
+
패키지 매니저
+
+
+
언어
+
+
+
플랫폼
+
+
+
이 명령을 실행하세요:
+
+
+ +
+
+
+
PyTorch 빌드
+
+
+
Stable (1.13.1)
+
+
+
Preview (Nightly)
+
+
+
+
+
OS 종류
+
+
+
Linux
+
+
+
Mac
+
+
+
Windows
+
+
+
+
+
패키지 매니저
+
+
+
Conda
+
+
+
Pip
+
+
+
LibTorch
+
+
+
Source
+
+
+
+
+
언어
+
+
+
Python
+
+
+
C++ / Java
+
+
+
+
+
플랫폼
+
+
+
CUDA 11.8
+
+
+
CUDA 12.1
+
+
+
CUDA 12.4
+
+
+
ROCm 5.2
+
+
+
CPU
+
+
+
+
+
이 명령을 실행하세요:
+
+
+
conda install pytorch torchvision -c pytorch
+
+
+
+
+ + + 이전 버전의 PyTorch + +
+ +
+

클라우드에서
빠르게 시작하기

+ +

많이 쓰이는 클라우드 플랫폼과 머신러닝 서비스를 통해서 PyTorch를 빠르게 시작해보세요.

+ +
+ + +
+
+
+ Google Cloud Platform +
+ + + + + +
+
+ +
+
+
+

Microsoft Azure

+
+ + +
+
+
+ + +
+
+
+
+ + + + + +
+
+
+
+

PyTorchKorea @ GitHub

+

파이토치 한국 사용자 모임을 GitHub에서 만나보세요.

+ GitHub로 이동 +
+ +
+

한국어 튜토리얼

+

한국어로 번역 중인 파이토치 튜토리얼을 만나보세요.

+ 튜토리얼로 이동 +
+ +
+

커뮤니티

+

다른 사용자들과 의견을 나누고, 도와주세요!

+ 커뮤니티로 이동 +
+
+
+
+ + + +
+
+
+
+ + +
+
+
+ + +
+ + + + + + + + + + + + + + + diff --git a/alexnet1.png b/alexnet1.png new file mode 100644 index 000000000..9a34bfe5d Binary files /dev/null and b/alexnet1.png differ diff --git a/alexnet2.png b/alexnet2.png new file mode 100644 index 000000000..8eb6b7465 Binary files /dev/null and b/alexnet2.png differ diff --git a/assets/blog/2023-04-18-accelerating-large-language-models/PyTorch_Better-Transformer_Chart-2.png b/assets/blog/2023-04-18-accelerating-large-language-models/PyTorch_Better-Transformer_Chart-2.png new file mode 100644 index 000000000..85ef6c5f8 Binary files /dev/null and b/assets/blog/2023-04-18-accelerating-large-language-models/PyTorch_Better-Transformer_Chart-2.png differ diff --git a/assets/blog/2023-04-18-accelerating-large-language-models/PyTorch_Better-Transformer_Figure-1.png b/assets/blog/2023-04-18-accelerating-large-language-models/PyTorch_Better-Transformer_Figure-1.png new file mode 100644 index 000000000..38ad2962b Binary files /dev/null and b/assets/blog/2023-04-18-accelerating-large-language-models/PyTorch_Better-Transformer_Figure-1.png differ diff --git a/assets/blog/2023-04-18-accelerating-large-language-models/causal_attention_step_1.png b/assets/blog/2023-04-18-accelerating-large-language-models/causal_attention_step_1.png new file mode 100644 index 000000000..bf1e93971 Binary files /dev/null and b/assets/blog/2023-04-18-accelerating-large-language-models/causal_attention_step_1.png differ diff --git a/assets/blog/2023-04-18-accelerating-large-language-models/causal_attention_step_2.png b/assets/blog/2023-04-18-accelerating-large-language-models/causal_attention_step_2.png new file mode 100644 index 000000000..315652cd9 Binary files /dev/null and b/assets/blog/2023-04-18-accelerating-large-language-models/causal_attention_step_2.png differ diff --git a/assets/blog/2023-04-18-accelerating-large-language-models/chart.png b/assets/blog/2023-04-18-accelerating-large-language-models/chart.png new file mode 100644 index 000000000..b585303ba Binary files /dev/null and b/assets/blog/2023-04-18-accelerating-large-language-models/chart.png differ diff --git a/assets/blog/2023-04-18-accelerating-large-language-models/tweet.png b/assets/blog/2023-04-18-accelerating-large-language-models/tweet.png new file mode 100644 index 000000000..7598441da Binary files /dev/null and b/assets/blog/2023-04-18-accelerating-large-language-models/tweet.png differ diff --git a/assets/blog/2023-05-03-announcing-docathon/docathon-cover.jpg b/assets/blog/2023-05-03-announcing-docathon/docathon-cover.jpg new file mode 100644 index 000000000..c9311fbb5 Binary files /dev/null and b/assets/blog/2023-05-03-announcing-docathon/docathon-cover.jpg differ diff --git a/assets/blog/docathon-cover.jpg b/assets/blog/docathon-cover.jpg new file mode 100644 index 000000000..c9311fbb5 Binary files /dev/null and b/assets/blog/docathon-cover.jpg differ diff --git a/assets/cookie-banner.js b/assets/cookie-banner.js new file mode 100644 index 000000000..d9b7acfab --- /dev/null +++ b/assets/cookie-banner.js @@ -0,0 +1,42 @@ +var cookieBanner = { + init: function() { + cookieBanner.bind(); + + var cookieExists = cookieBanner.cookieExists(); + + if (!cookieExists) { + cookieBanner.setCookie(); + cookieBanner.showCookieNotice(); + } + }, + + bind: function() { + $(".close-button").on("click", cookieBanner.hideCookieNotice); + }, + + cookieExists: function() { + var cookie = localStorage.getItem("returningPytorchUser"); + + if (cookie) { + return true; + } else { + return false; + } + }, + + setCookie: function() { 
+ localStorage.setItem("returningPytorchUser", true); + }, + + showCookieNotice: function() { + $(".cookie-banner-wrapper").addClass("is-visible"); + }, + + hideCookieNotice: function() { + $(".cookie-banner-wrapper").removeClass("is-visible"); + } +}; + +$(function() { + cookieBanner.init(); +}); diff --git a/assets/css/style.css b/assets/css/style.css new file mode 100644 index 000000000..1f9ba713d --- /dev/null +++ b/assets/css/style.css @@ -0,0 +1 @@ +/*! normalize.css v4.1.1 | MIT License | github.com/necolas/normalize.css */html{font-family:sans-serif;-ms-text-size-adjust:100%;-webkit-text-size-adjust:100%}body{margin:0}article,aside,details,figcaption,figure,footer,header,main,menu,nav,section{display:block}summary{display:list-item}audio,canvas,progress,video{display:inline-block}audio:not([controls]){display:none;height:0}progress{vertical-align:baseline}template,[hidden]{display:none !important}a{background-color:transparent}a:active,a:hover{outline-width:0}abbr[title]{border-bottom:none;text-decoration:underline;-webkit-text-decoration:underline dotted;text-decoration:underline dotted}b,strong{font-weight:inherit}b,strong{font-weight:bolder}dfn{font-style:italic}h1{font-size:2em;margin:0.67em 0}mark{background-color:#ff0;color:#000}small{font-size:80%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sub{bottom:-0.25em}sup{top:-0.5em}img{border-style:none}svg:not(:root){overflow:hidden}code,kbd,pre,samp{font-family:monospace, monospace;font-size:1em}figure{margin:1em 40px}hr{box-sizing:content-box;height:0;overflow:visible}button,input,select,textarea{font:inherit;margin:0}optgroup{font-weight:bold}button,input{overflow:visible}button,select{text-transform:none}button,html [type="button"],[type="reset"],[type="submit"]{-webkit-appearance:button}button::-moz-focus-inner,[type="button"]::-moz-focus-inner,[type="reset"]::-moz-focus-inner,[type="submit"]::-moz-focus-inner{border-style:none;padding:0}button:-moz-focusring,[type="button"]:-moz-focusring,[type="reset"]:-moz-focusring,[type="submit"]:-moz-focusring{outline:1px dotted ButtonText}fieldset{border:1px solid #c0c0c0;margin:0 2px;padding:0.35em 0.625em 0.75em}legend{box-sizing:border-box;color:inherit;display:table;max-width:100%;padding:0;white-space:normal}textarea{overflow:auto}[type="checkbox"],[type="radio"]{box-sizing:border-box;padding:0}[type="number"]::-webkit-inner-spin-button,[type="number"]::-webkit-outer-spin-button{height:auto}[type="search"]{-webkit-appearance:textfield;outline-offset:-2px}[type="search"]::-webkit-search-cancel-button,[type="search"]::-webkit-search-decoration{-webkit-appearance:none}::-webkit-input-placeholder{color:inherit;opacity:0.54}::-webkit-file-upload-button{-webkit-appearance:button;font:inherit}*{box-sizing:border-box}input,select,textarea,button{font-family:inherit;font-size:inherit;line-height:inherit}body{font-family:-apple-system,BlinkMacSystemFont,"Segoe UI",Helvetica,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol";font-size:14px;line-height:1.5;color:#24292e;background-color:#fff}a{color:#0366d6;text-decoration:none}a:hover{text-decoration:underline}b,strong{font-weight:600}hr,.rule{height:0;margin:15px 0;overflow:hidden;background:transparent;border:0;border-bottom:1px solid #dfe2e5}hr::before,.rule::before{display:table;content:""}hr::after,.rule::after{display:table;clear:both;content:""}table{border-spacing:0;border-collapse:collapse}td,th{padding:0}button{cursor:pointer;border-radius:0}[hidden][hidden]{display:none 
!important}details summary{cursor:pointer}details:not([open])>*:not(summary){display:none !important}h1,h2,h3,h4,h5,h6{margin-top:0;margin-bottom:0}h1{font-size:32px;font-weight:600}h2{font-size:24px;font-weight:600}h3{font-size:20px;font-weight:600}h4{font-size:16px;font-weight:600}h5{font-size:14px;font-weight:600}h6{font-size:12px;font-weight:600}p{margin-top:0;margin-bottom:10px}small{font-size:90%}blockquote{margin:0}ul,ol{padding-left:0;margin-top:0;margin-bottom:0}ol ol,ul ol{list-style-type:lower-roman}ul ul ol,ul ol ol,ol ul ol,ol ol ol{list-style-type:lower-alpha}dd{margin-left:0}tt,code{font-family:"SFMono-Regular",Consolas,"Liberation Mono",Menlo,Courier,monospace;font-size:12px}pre{margin-top:0;margin-bottom:0;font-family:"SFMono-Regular",Consolas,"Liberation Mono",Menlo,Courier,monospace;font-size:12px}.octicon{vertical-align:text-bottom}.anim-fade-in{-webkit-animation-name:fade-in;animation-name:fade-in;-webkit-animation-duration:1s;animation-duration:1s;-webkit-animation-timing-function:ease-in-out;animation-timing-function:ease-in-out}.anim-fade-in.fast{-webkit-animation-duration:300ms;animation-duration:300ms}@-webkit-keyframes fade-in{0%{opacity:0}100%{opacity:1}}@keyframes fade-in{0%{opacity:0}100%{opacity:1}}.anim-fade-out{-webkit-animation-name:fade-out;animation-name:fade-out;-webkit-animation-duration:1s;animation-duration:1s;-webkit-animation-timing-function:ease-out;animation-timing-function:ease-out}.anim-fade-out.fast{-webkit-animation-duration:0.3s;animation-duration:0.3s}@-webkit-keyframes fade-out{0%{opacity:1}100%{opacity:0}}@keyframes fade-out{0%{opacity:1}100%{opacity:0}}.anim-fade-up{opacity:0;-webkit-animation-name:fade-up;animation-name:fade-up;-webkit-animation-duration:0.3s;animation-duration:0.3s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-timing-function:ease-out;animation-timing-function:ease-out;-webkit-animation-delay:1s;animation-delay:1s}@-webkit-keyframes fade-up{0%{opacity:0.8;transform:translateY(100%)}100%{opacity:1;transform:translateY(0)}}@keyframes fade-up{0%{opacity:0.8;transform:translateY(100%)}100%{opacity:1;transform:translateY(0)}}.anim-fade-down{-webkit-animation-name:fade-down;animation-name:fade-down;-webkit-animation-duration:0.3s;animation-duration:0.3s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-timing-function:ease-in;animation-timing-function:ease-in}@-webkit-keyframes fade-down{0%{opacity:1;transform:translateY(0)}100%{opacity:0.5;transform:translateY(100%)}}@keyframes fade-down{0%{opacity:1;transform:translateY(0)}100%{opacity:0.5;transform:translateY(100%)}}.anim-grow-x{width:0%;-webkit-animation-name:grow-x;animation-name:grow-x;-webkit-animation-duration:0.3s;animation-duration:0.3s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-timing-function:ease;animation-timing-function:ease;-webkit-animation-delay:0.5s;animation-delay:0.5s}@-webkit-keyframes grow-x{to{width:100%}}@keyframes grow-x{to{width:100%}}.anim-shrink-x{-webkit-animation-name:shrink-x;animation-name:shrink-x;-webkit-animation-duration:0.3s;animation-duration:0.3s;-webkit-animation-fill-mode:forwards;animation-fill-mode:forwards;-webkit-animation-timing-function:ease-in-out;animation-timing-function:ease-in-out;-webkit-animation-delay:0.5s;animation-delay:0.5s}@-webkit-keyframes shrink-x{to{width:0%}}@keyframes 
shrink-x{to{width:0%}}.anim-scale-in{-webkit-animation-name:scale-in;animation-name:scale-in;-webkit-animation-duration:0.15s;animation-duration:0.15s;-webkit-animation-timing-function:cubic-bezier(0.2, 0, 0.13, 1.5);animation-timing-function:cubic-bezier(0.2, 0, 0.13, 1.5)}@-webkit-keyframes scale-in{0%{opacity:0;transform:scale(0.5)}100%{opacity:1;transform:scale(1)}}@keyframes scale-in{0%{opacity:0;transform:scale(0.5)}100%{opacity:1;transform:scale(1)}}.anim-pulse{-webkit-animation-name:pulse;animation-name:pulse;-webkit-animation-duration:2s;animation-duration:2s;-webkit-animation-timing-function:linear;animation-timing-function:linear;-webkit-animation-iteration-count:infinite;animation-iteration-count:infinite}@-webkit-keyframes pulse{0%{opacity:0.3}10%{opacity:1}100%{opacity:0.3}}@keyframes pulse{0%{opacity:0.3}10%{opacity:1}100%{opacity:0.3}}.anim-pulse-in{-webkit-animation-name:pulse-in;animation-name:pulse-in;-webkit-animation-duration:0.5s;animation-duration:0.5s}@-webkit-keyframes pulse-in{0%{transform:scale3d(1, 1, 1)}50%{transform:scale3d(1.1, 1.1, 1.1)}100%{transform:scale3d(1, 1, 1)}}@keyframes pulse-in{0%{transform:scale3d(1, 1, 1)}50%{transform:scale3d(1.1, 1.1, 1.1)}100%{transform:scale3d(1, 1, 1)}}.hover-grow{transition:transform 0.3s;-webkit-backface-visibility:hidden;backface-visibility:hidden}.hover-grow:hover{transform:scale(1.025)}.border{border:1px #e1e4e8 solid !important}.border-y{border-top:1px #e1e4e8 solid !important;border-bottom:1px #e1e4e8 solid !important}.border-0{border:0 !important}.border-dashed{border-style:dashed !important}.border-blue{border-color:#0366d6 !important}.border-blue-light{border-color:#c8e1ff !important}.border-green{border-color:#34d058 !important}.border-green-light{border-color:#a2cbac !important}.border-red{border-color:#d73a49 !important}.border-red-light{border-color:#cea0a5 !important}.border-purple{border-color:#6f42c1 !important}.border-yellow{border-color:#d9d0a5 !important}.border-gray-light{border-color:#eaecef !important}.border-gray-dark{border-color:#d1d5da !important}.border-black-fade{border-color:rgba(27,31,35,0.15) !important}.border-top{border-top:1px #e1e4e8 solid !important}.border-right{border-right:1px #e1e4e8 solid !important}.border-bottom{border-bottom:1px #e1e4e8 solid !important}.border-left{border-left:1px #e1e4e8 solid !important}.border-top-0{border-top:0 !important}.border-right-0{border-right:0 !important}.border-bottom-0{border-bottom:0 !important}.border-left-0{border-left:0 !important}.rounded-0{border-radius:0 !important}.rounded-1{border-radius:3px !important}.rounded-2{border-radius:6px !important}.rounded-top-0{border-top-left-radius:0 !important;border-top-right-radius:0 !important}.rounded-top-1{border-top-left-radius:3px !important;border-top-right-radius:3px !important}.rounded-top-2{border-top-left-radius:6px !important;border-top-right-radius:6px !important}.rounded-right-0{border-top-right-radius:0 !important;border-bottom-right-radius:0 !important}.rounded-right-1{border-top-right-radius:3px !important;border-bottom-right-radius:3px !important}.rounded-right-2{border-top-right-radius:6px !important;border-bottom-right-radius:6px !important}.rounded-bottom-0{border-bottom-right-radius:0 !important;border-bottom-left-radius:0 !important}.rounded-bottom-1{border-bottom-right-radius:3px !important;border-bottom-left-radius:3px !important}.rounded-bottom-2{border-bottom-right-radius:6px !important;border-bottom-left-radius:6px !important}.rounded-left-0{border-bottom-left-radius:0 
!important;border-top-left-radius:0 !important}.rounded-left-1{border-bottom-left-radius:3px !important;border-top-left-radius:3px !important}.rounded-left-2{border-bottom-left-radius:6px !important;border-top-left-radius:6px !important}@media (min-width: 544px){.border-sm-top{border-top:1px #e1e4e8 solid !important}.border-sm-right{border-right:1px #e1e4e8 solid !important}.border-sm-bottom{border-bottom:1px #e1e4e8 solid !important}.border-sm-left{border-left:1px #e1e4e8 solid !important}.border-sm-top-0{border-top:0 !important}.border-sm-right-0{border-right:0 !important}.border-sm-bottom-0{border-bottom:0 !important}.border-sm-left-0{border-left:0 !important}.rounded-sm-0{border-radius:0 !important}.rounded-sm-1{border-radius:3px !important}.rounded-sm-2{border-radius:6px !important}.rounded-sm-top-0{border-top-left-radius:0 !important;border-top-right-radius:0 !important}.rounded-sm-top-1{border-top-left-radius:3px !important;border-top-right-radius:3px !important}.rounded-sm-top-2{border-top-left-radius:6px !important;border-top-right-radius:6px !important}.rounded-sm-right-0{border-top-right-radius:0 !important;border-bottom-right-radius:0 !important}.rounded-sm-right-1{border-top-right-radius:3px !important;border-bottom-right-radius:3px !important}.rounded-sm-right-2{border-top-right-radius:6px !important;border-bottom-right-radius:6px !important}.rounded-sm-bottom-0{border-bottom-right-radius:0 !important;border-bottom-left-radius:0 !important}.rounded-sm-bottom-1{border-bottom-right-radius:3px !important;border-bottom-left-radius:3px !important}.rounded-sm-bottom-2{border-bottom-right-radius:6px !important;border-bottom-left-radius:6px !important}.rounded-sm-left-0{border-bottom-left-radius:0 !important;border-top-left-radius:0 !important}.rounded-sm-left-1{border-bottom-left-radius:3px !important;border-top-left-radius:3px !important}.rounded-sm-left-2{border-bottom-left-radius:6px !important;border-top-left-radius:6px !important}}@media (min-width: 768px){.border-md-top{border-top:1px #e1e4e8 solid !important}.border-md-right{border-right:1px #e1e4e8 solid !important}.border-md-bottom{border-bottom:1px #e1e4e8 solid !important}.border-md-left{border-left:1px #e1e4e8 solid !important}.border-md-top-0{border-top:0 !important}.border-md-right-0{border-right:0 !important}.border-md-bottom-0{border-bottom:0 !important}.border-md-left-0{border-left:0 !important}.rounded-md-0{border-radius:0 !important}.rounded-md-1{border-radius:3px !important}.rounded-md-2{border-radius:6px !important}.rounded-md-top-0{border-top-left-radius:0 !important;border-top-right-radius:0 !important}.rounded-md-top-1{border-top-left-radius:3px !important;border-top-right-radius:3px !important}.rounded-md-top-2{border-top-left-radius:6px !important;border-top-right-radius:6px !important}.rounded-md-right-0{border-top-right-radius:0 !important;border-bottom-right-radius:0 !important}.rounded-md-right-1{border-top-right-radius:3px !important;border-bottom-right-radius:3px !important}.rounded-md-right-2{border-top-right-radius:6px !important;border-bottom-right-radius:6px !important}.rounded-md-bottom-0{border-bottom-right-radius:0 !important;border-bottom-left-radius:0 !important}.rounded-md-bottom-1{border-bottom-right-radius:3px !important;border-bottom-left-radius:3px !important}.rounded-md-bottom-2{border-bottom-right-radius:6px !important;border-bottom-left-radius:6px !important}.rounded-md-left-0{border-bottom-left-radius:0 !important;border-top-left-radius:0 
!important}.rounded-md-left-1{border-bottom-left-radius:3px !important;border-top-left-radius:3px !important}.rounded-md-left-2{border-bottom-left-radius:6px !important;border-top-left-radius:6px !important}}@media (min-width: 1012px){.border-lg-top{border-top:1px #e1e4e8 solid !important}.border-lg-right{border-right:1px #e1e4e8 solid !important}.border-lg-bottom{border-bottom:1px #e1e4e8 solid !important}.border-lg-left{border-left:1px #e1e4e8 solid !important}.border-lg-top-0{border-top:0 !important}.border-lg-right-0{border-right:0 !important}.border-lg-bottom-0{border-bottom:0 !important}.border-lg-left-0{border-left:0 !important}.rounded-lg-0{border-radius:0 !important}.rounded-lg-1{border-radius:3px !important}.rounded-lg-2{border-radius:6px !important}.rounded-lg-top-0{border-top-left-radius:0 !important;border-top-right-radius:0 !important}.rounded-lg-top-1{border-top-left-radius:3px !important;border-top-right-radius:3px !important}.rounded-lg-top-2{border-top-left-radius:6px !important;border-top-right-radius:6px !important}.rounded-lg-right-0{border-top-right-radius:0 !important;border-bottom-right-radius:0 !important}.rounded-lg-right-1{border-top-right-radius:3px !important;border-bottom-right-radius:3px !important}.rounded-lg-right-2{border-top-right-radius:6px !important;border-bottom-right-radius:6px !important}.rounded-lg-bottom-0{border-bottom-right-radius:0 !important;border-bottom-left-radius:0 !important}.rounded-lg-bottom-1{border-bottom-right-radius:3px !important;border-bottom-left-radius:3px !important}.rounded-lg-bottom-2{border-bottom-right-radius:6px !important;border-bottom-left-radius:6px !important}.rounded-lg-left-0{border-bottom-left-radius:0 !important;border-top-left-radius:0 !important}.rounded-lg-left-1{border-bottom-left-radius:3px !important;border-top-left-radius:3px !important}.rounded-lg-left-2{border-bottom-left-radius:6px !important;border-top-left-radius:6px !important}}@media (min-width: 1280px){.border-xl-top{border-top:1px #e1e4e8 solid !important}.border-xl-right{border-right:1px #e1e4e8 solid !important}.border-xl-bottom{border-bottom:1px #e1e4e8 solid !important}.border-xl-left{border-left:1px #e1e4e8 solid !important}.border-xl-top-0{border-top:0 !important}.border-xl-right-0{border-right:0 !important}.border-xl-bottom-0{border-bottom:0 !important}.border-xl-left-0{border-left:0 !important}.rounded-xl-0{border-radius:0 !important}.rounded-xl-1{border-radius:3px !important}.rounded-xl-2{border-radius:6px !important}.rounded-xl-top-0{border-top-left-radius:0 !important;border-top-right-radius:0 !important}.rounded-xl-top-1{border-top-left-radius:3px !important;border-top-right-radius:3px !important}.rounded-xl-top-2{border-top-left-radius:6px !important;border-top-right-radius:6px !important}.rounded-xl-right-0{border-top-right-radius:0 !important;border-bottom-right-radius:0 !important}.rounded-xl-right-1{border-top-right-radius:3px !important;border-bottom-right-radius:3px !important}.rounded-xl-right-2{border-top-right-radius:6px !important;border-bottom-right-radius:6px !important}.rounded-xl-bottom-0{border-bottom-right-radius:0 !important;border-bottom-left-radius:0 !important}.rounded-xl-bottom-1{border-bottom-right-radius:3px !important;border-bottom-left-radius:3px !important}.rounded-xl-bottom-2{border-bottom-right-radius:6px !important;border-bottom-left-radius:6px !important}.rounded-xl-left-0{border-bottom-left-radius:0 !important;border-top-left-radius:0 !important}.rounded-xl-left-1{border-bottom-left-radius:3px 
!important;border-top-left-radius:3px !important}.rounded-xl-left-2{border-bottom-left-radius:6px !important;border-top-left-radius:6px !important}}.circle{border-radius:50% !important}.box-shadow{box-shadow:0 1px 1px rgba(27,31,35,0.1) !important}.box-shadow-medium{box-shadow:0 1px 5px rgba(27,31,35,0.15) !important}.box-shadow-large{box-shadow:0 1px 15px rgba(27,31,35,0.15) !important}.box-shadow-extra-large{box-shadow:0 10px 50px rgba(27,31,35,0.07) !important}.box-shadow-none{box-shadow:none !important}.bg-white{background-color:#fff !important}.bg-blue{background-color:#0366d6 !important}.bg-blue-light{background-color:#f1f8ff !important}.bg-gray-dark{background-color:#24292e !important}.bg-gray{background-color:#f6f8fa !important}.bg-gray-light{background-color:#fafbfc !important}.bg-green{background-color:#28a745 !important}.bg-green-light{background-color:#dcffe4 !important}.bg-red{background-color:#d73a49 !important}.bg-red-light{background-color:#ffdce0 !important}.bg-yellow{background-color:#ffd33d !important}.bg-yellow-light{background-color:#fff5b1 !important}.bg-purple{background-color:#6f42c1 !important}.bg-purple-light{background-color:#f5f0ff !important}.bg-shade-gradient{background-image:linear-gradient(180deg, rgba(27,31,35,0.065), rgba(27,31,35,0)) !important;background-repeat:no-repeat !important;background-size:100% 200px !important}.text-blue{color:#0366d6 !important}.text-red{color:#cb2431 !important}.text-gray-light{color:#6a737d !important}.text-gray{color:#586069 !important}.text-gray-dark{color:#24292e !important}.text-green{color:#28a745 !important}.text-orange{color:#a04100 !important}.text-orange-light{color:#e36209 !important}.text-purple{color:#6f42c1 !important}.text-white{color:#fff !important}.text-inherit{color:inherit !important}.text-pending{color:#b08800 !important}.bg-pending{color:#dbab09 !important}.link-gray{color:#586069 !important}.link-gray:hover{color:#0366d6 !important}.link-gray-dark{color:#24292e !important}.link-gray-dark:hover{color:#0366d6 !important}.link-hover-blue:hover{color:#0366d6 !important}.muted-link{color:#586069 !important}.muted-link:hover{color:#0366d6 !important;text-decoration:none}.details-overlay[open]>summary::before{position:fixed;top:0;right:0;bottom:0;left:0;z-index:80;display:block;cursor:default;content:" ";background:transparent}.details-overlay-dark[open]>summary::before{z-index:99;background:rgba(27,31,35,0.5)}.flex-row{flex-direction:row !important}.flex-row-reverse{flex-direction:row-reverse !important}.flex-column{flex-direction:column !important}.flex-wrap{flex-wrap:wrap !important}.flex-nowrap{flex-wrap:nowrap !important}.flex-justify-start{justify-content:flex-start !important}.flex-justify-end{justify-content:flex-end !important}.flex-justify-center{justify-content:center !important}.flex-justify-between{justify-content:space-between !important}.flex-justify-around{justify-content:space-around !important}.flex-items-start{align-items:flex-start !important}.flex-items-end{align-items:flex-end !important}.flex-items-center{align-items:center !important}.flex-items-baseline{align-items:baseline !important}.flex-items-stretch{align-items:stretch !important}.flex-content-start{align-content:flex-start !important}.flex-content-end{align-content:flex-end !important}.flex-content-center{align-content:center !important}.flex-content-between{align-content:space-between !important}.flex-content-around{align-content:space-around !important}.flex-content-stretch{align-content:stretch !important}.flex-auto{flex:1 1 
auto !important}.flex-shrink-0{flex-shrink:0 !important}.flex-self-auto{align-self:auto !important}.flex-self-start{align-self:flex-start !important}.flex-self-end{align-self:flex-end !important}.flex-self-center{align-self:center !important}.flex-self-baseline{align-self:baseline !important}.flex-self-stretch{align-self:stretch !important}.flex-item-equal{flex-grow:1;flex-basis:0}@media (min-width: 544px){.flex-sm-row{flex-direction:row !important}.flex-sm-row-reverse{flex-direction:row-reverse !important}.flex-sm-column{flex-direction:column !important}.flex-sm-wrap{flex-wrap:wrap !important}.flex-sm-nowrap{flex-wrap:nowrap !important}.flex-sm-justify-start{justify-content:flex-start !important}.flex-sm-justify-end{justify-content:flex-end !important}.flex-sm-justify-center{justify-content:center !important}.flex-sm-justify-between{justify-content:space-between !important}.flex-sm-justify-around{justify-content:space-around !important}.flex-sm-items-start{align-items:flex-start !important}.flex-sm-items-end{align-items:flex-end !important}.flex-sm-items-center{align-items:center !important}.flex-sm-items-baseline{align-items:baseline !important}.flex-sm-items-stretch{align-items:stretch !important}.flex-sm-content-start{align-content:flex-start !important}.flex-sm-content-end{align-content:flex-end !important}.flex-sm-content-center{align-content:center !important}.flex-sm-content-between{align-content:space-between !important}.flex-sm-content-around{align-content:space-around !important}.flex-sm-content-stretch{align-content:stretch !important}.flex-sm-auto{flex:1 1 auto !important}.flex-sm-shrink-0{flex-shrink:0 !important}.flex-sm-self-auto{align-self:auto !important}.flex-sm-self-start{align-self:flex-start !important}.flex-sm-self-end{align-self:flex-end !important}.flex-sm-self-center{align-self:center !important}.flex-sm-self-baseline{align-self:baseline !important}.flex-sm-self-stretch{align-self:stretch !important}.flex-sm-item-equal{flex-grow:1;flex-basis:0}}@media (min-width: 768px){.flex-md-row{flex-direction:row !important}.flex-md-row-reverse{flex-direction:row-reverse !important}.flex-md-column{flex-direction:column !important}.flex-md-wrap{flex-wrap:wrap !important}.flex-md-nowrap{flex-wrap:nowrap !important}.flex-md-justify-start{justify-content:flex-start !important}.flex-md-justify-end{justify-content:flex-end !important}.flex-md-justify-center{justify-content:center !important}.flex-md-justify-between{justify-content:space-between !important}.flex-md-justify-around{justify-content:space-around !important}.flex-md-items-start{align-items:flex-start !important}.flex-md-items-end{align-items:flex-end !important}.flex-md-items-center{align-items:center !important}.flex-md-items-baseline{align-items:baseline !important}.flex-md-items-stretch{align-items:stretch !important}.flex-md-content-start{align-content:flex-start !important}.flex-md-content-end{align-content:flex-end !important}.flex-md-content-center{align-content:center !important}.flex-md-content-between{align-content:space-between !important}.flex-md-content-around{align-content:space-around !important}.flex-md-content-stretch{align-content:stretch !important}.flex-md-auto{flex:1 1 auto !important}.flex-md-shrink-0{flex-shrink:0 !important}.flex-md-self-auto{align-self:auto !important}.flex-md-self-start{align-self:flex-start !important}.flex-md-self-end{align-self:flex-end !important}.flex-md-self-center{align-self:center !important}.flex-md-self-baseline{align-self:baseline 
!important}.flex-md-self-stretch{align-self:stretch !important}.flex-md-item-equal{flex-grow:1;flex-basis:0}}@media (min-width: 1012px){.flex-lg-row{flex-direction:row !important}.flex-lg-row-reverse{flex-direction:row-reverse !important}.flex-lg-column{flex-direction:column !important}.flex-lg-wrap{flex-wrap:wrap !important}.flex-lg-nowrap{flex-wrap:nowrap !important}.flex-lg-justify-start{justify-content:flex-start !important}.flex-lg-justify-end{justify-content:flex-end !important}.flex-lg-justify-center{justify-content:center !important}.flex-lg-justify-between{justify-content:space-between !important}.flex-lg-justify-around{justify-content:space-around !important}.flex-lg-items-start{align-items:flex-start !important}.flex-lg-items-end{align-items:flex-end !important}.flex-lg-items-center{align-items:center !important}.flex-lg-items-baseline{align-items:baseline !important}.flex-lg-items-stretch{align-items:stretch !important}.flex-lg-content-start{align-content:flex-start !important}.flex-lg-content-end{align-content:flex-end !important}.flex-lg-content-center{align-content:center !important}.flex-lg-content-between{align-content:space-between !important}.flex-lg-content-around{align-content:space-around !important}.flex-lg-content-stretch{align-content:stretch !important}.flex-lg-auto{flex:1 1 auto !important}.flex-lg-shrink-0{flex-shrink:0 !important}.flex-lg-self-auto{align-self:auto !important}.flex-lg-self-start{align-self:flex-start !important}.flex-lg-self-end{align-self:flex-end !important}.flex-lg-self-center{align-self:center !important}.flex-lg-self-baseline{align-self:baseline !important}.flex-lg-self-stretch{align-self:stretch !important}.flex-lg-item-equal{flex-grow:1;flex-basis:0}}@media (min-width: 1280px){.flex-xl-row{flex-direction:row !important}.flex-xl-row-reverse{flex-direction:row-reverse !important}.flex-xl-column{flex-direction:column !important}.flex-xl-wrap{flex-wrap:wrap !important}.flex-xl-nowrap{flex-wrap:nowrap !important}.flex-xl-justify-start{justify-content:flex-start !important}.flex-xl-justify-end{justify-content:flex-end !important}.flex-xl-justify-center{justify-content:center !important}.flex-xl-justify-between{justify-content:space-between !important}.flex-xl-justify-around{justify-content:space-around !important}.flex-xl-items-start{align-items:flex-start !important}.flex-xl-items-end{align-items:flex-end !important}.flex-xl-items-center{align-items:center !important}.flex-xl-items-baseline{align-items:baseline !important}.flex-xl-items-stretch{align-items:stretch !important}.flex-xl-content-start{align-content:flex-start !important}.flex-xl-content-end{align-content:flex-end !important}.flex-xl-content-center{align-content:center !important}.flex-xl-content-between{align-content:space-between !important}.flex-xl-content-around{align-content:space-around !important}.flex-xl-content-stretch{align-content:stretch !important}.flex-xl-auto{flex:1 1 auto !important}.flex-xl-shrink-0{flex-shrink:0 !important}.flex-xl-self-auto{align-self:auto !important}.flex-xl-self-start{align-self:flex-start !important}.flex-xl-self-end{align-self:flex-end !important}.flex-xl-self-center{align-self:center !important}.flex-xl-self-baseline{align-self:baseline !important}.flex-xl-self-stretch{align-self:stretch !important}.flex-xl-item-equal{flex-grow:1;flex-basis:0}}.position-static{position:static !important}.position-relative{position:relative !important}.position-absolute{position:absolute !important}.position-fixed{position:fixed !important}.top-0{top:0 
!important}.right-0{right:0 !important}.bottom-0{bottom:0 !important}.left-0{left:0 !important}.v-align-middle{vertical-align:middle !important}.v-align-top{vertical-align:top !important}.v-align-bottom{vertical-align:bottom !important}.v-align-text-top{vertical-align:text-top !important}.v-align-text-bottom{vertical-align:text-bottom !important}.v-align-baseline{vertical-align:baseline !important}.overflow-hidden{overflow:hidden !important}.overflow-scroll{overflow:scroll !important}.overflow-auto{overflow:auto !important}.clearfix::before{display:table;content:""}.clearfix::after{display:table;clear:both;content:""}.float-left{float:left !important}.float-right{float:right !important}.float-none{float:none !important}@media (min-width: 544px){.float-sm-left{float:left !important}.float-sm-right{float:right !important}.float-sm-none{float:none !important}}@media (min-width: 768px){.float-md-left{float:left !important}.float-md-right{float:right !important}.float-md-none{float:none !important}}@media (min-width: 1012px){.float-lg-left{float:left !important}.float-lg-right{float:right !important}.float-lg-none{float:none !important}}@media (min-width: 1280px){.float-xl-left{float:left !important}.float-xl-right{float:right !important}.float-xl-none{float:none !important}}.width-fit{max-width:100% !important}.width-full{width:100% !important}.height-fit{max-height:100% !important}.height-full{height:100% !important}.min-width-0{min-width:0 !important}.direction-rtl{direction:rtl !important}.direction-ltr{direction:ltr !important}@media (min-width: 544px){.direction-sm-rtl{direction:rtl !important}.direction-sm-ltr{direction:ltr !important}}@media (min-width: 768px){.direction-md-rtl{direction:rtl !important}.direction-md-ltr{direction:ltr !important}}@media (min-width: 1012px){.direction-lg-rtl{direction:rtl !important}.direction-lg-ltr{direction:ltr !important}}@media (min-width: 1280px){.direction-xl-rtl{direction:rtl !important}.direction-xl-ltr{direction:ltr !important}}.m-0{margin:0 !important}.mt-0{margin-top:0 !important}.mr-0{margin-right:0 !important}.mb-0{margin-bottom:0 !important}.ml-0{margin-left:0 !important}.mx-0{margin-right:0 !important;margin-left:0 !important}.my-0{margin-top:0 !important;margin-bottom:0 !important}.m-1{margin:4px !important}.mt-1{margin-top:4px !important}.mr-1{margin-right:4px !important}.mb-1{margin-bottom:4px !important}.ml-1{margin-left:4px !important}.mt-n1{margin-top:-4px !important}.mr-n1{margin-right:-4px !important}.mb-n1{margin-bottom:-4px !important}.ml-n1{margin-left:-4px !important}.mx-1{margin-right:4px !important;margin-left:4px !important}.my-1{margin-top:4px !important;margin-bottom:4px !important}.m-2{margin:8px !important}.mt-2{margin-top:8px !important}.mr-2{margin-right:8px !important}.mb-2{margin-bottom:8px !important}.ml-2{margin-left:8px !important}.mt-n2{margin-top:-8px !important}.mr-n2{margin-right:-8px !important}.mb-n2{margin-bottom:-8px !important}.ml-n2{margin-left:-8px !important}.mx-2{margin-right:8px !important;margin-left:8px !important}.my-2{margin-top:8px !important;margin-bottom:8px !important}.m-3{margin:16px !important}.mt-3{margin-top:16px !important}.mr-3{margin-right:16px !important}.mb-3{margin-bottom:16px !important}.ml-3{margin-left:16px !important}.mt-n3{margin-top:-16px !important}.mr-n3{margin-right:-16px !important}.mb-n3{margin-bottom:-16px !important}.ml-n3{margin-left:-16px !important}.mx-3{margin-right:16px !important;margin-left:16px !important}.my-3{margin-top:16px !important;margin-bottom:16px 
!important}.m-4{margin:24px !important}.mt-4{margin-top:24px !important}.mr-4{margin-right:24px !important}.mb-4{margin-bottom:24px !important}.ml-4{margin-left:24px !important}.mt-n4{margin-top:-24px !important}.mr-n4{margin-right:-24px !important}.mb-n4{margin-bottom:-24px !important}.ml-n4{margin-left:-24px !important}.mx-4{margin-right:24px !important;margin-left:24px !important}.my-4{margin-top:24px !important;margin-bottom:24px !important}.m-5{margin:32px !important}.mt-5{margin-top:32px !important}.mr-5{margin-right:32px !important}.mb-5{margin-bottom:32px !important}.ml-5{margin-left:32px !important}.mt-n5{margin-top:-32px !important}.mr-n5{margin-right:-32px !important}.mb-n5{margin-bottom:-32px !important}.ml-n5{margin-left:-32px !important}.mx-5{margin-right:32px !important;margin-left:32px !important}.my-5{margin-top:32px !important;margin-bottom:32px !important}.m-6{margin:40px !important}.mt-6{margin-top:40px !important}.mr-6{margin-right:40px !important}.mb-6{margin-bottom:40px !important}.ml-6{margin-left:40px !important}.mt-n6{margin-top:-40px !important}.mr-n6{margin-right:-40px !important}.mb-n6{margin-bottom:-40px !important}.ml-n6{margin-left:-40px !important}.mx-6{margin-right:40px !important;margin-left:40px !important}.my-6{margin-top:40px !important;margin-bottom:40px !important}.mx-auto{margin-right:auto !important;margin-left:auto !important}@media (min-width: 544px){.m-sm-0{margin:0 !important}.mt-sm-0{margin-top:0 !important}.mr-sm-0{margin-right:0 !important}.mb-sm-0{margin-bottom:0 !important}.ml-sm-0{margin-left:0 !important}.mx-sm-0{margin-right:0 !important;margin-left:0 !important}.my-sm-0{margin-top:0 !important;margin-bottom:0 !important}.m-sm-1{margin:4px !important}.mt-sm-1{margin-top:4px !important}.mr-sm-1{margin-right:4px !important}.mb-sm-1{margin-bottom:4px !important}.ml-sm-1{margin-left:4px !important}.mt-sm-n1{margin-top:-4px !important}.mr-sm-n1{margin-right:-4px !important}.mb-sm-n1{margin-bottom:-4px !important}.ml-sm-n1{margin-left:-4px !important}.mx-sm-1{margin-right:4px !important;margin-left:4px !important}.my-sm-1{margin-top:4px !important;margin-bottom:4px !important}.m-sm-2{margin:8px !important}.mt-sm-2{margin-top:8px !important}.mr-sm-2{margin-right:8px !important}.mb-sm-2{margin-bottom:8px !important}.ml-sm-2{margin-left:8px !important}.mt-sm-n2{margin-top:-8px !important}.mr-sm-n2{margin-right:-8px !important}.mb-sm-n2{margin-bottom:-8px !important}.ml-sm-n2{margin-left:-8px !important}.mx-sm-2{margin-right:8px !important;margin-left:8px !important}.my-sm-2{margin-top:8px !important;margin-bottom:8px !important}.m-sm-3{margin:16px !important}.mt-sm-3{margin-top:16px !important}.mr-sm-3{margin-right:16px !important}.mb-sm-3{margin-bottom:16px !important}.ml-sm-3{margin-left:16px !important}.mt-sm-n3{margin-top:-16px !important}.mr-sm-n3{margin-right:-16px !important}.mb-sm-n3{margin-bottom:-16px !important}.ml-sm-n3{margin-left:-16px !important}.mx-sm-3{margin-right:16px !important;margin-left:16px !important}.my-sm-3{margin-top:16px !important;margin-bottom:16px !important}.m-sm-4{margin:24px !important}.mt-sm-4{margin-top:24px !important}.mr-sm-4{margin-right:24px !important}.mb-sm-4{margin-bottom:24px !important}.ml-sm-4{margin-left:24px !important}.mt-sm-n4{margin-top:-24px !important}.mr-sm-n4{margin-right:-24px !important}.mb-sm-n4{margin-bottom:-24px !important}.ml-sm-n4{margin-left:-24px !important}.mx-sm-4{margin-right:24px !important;margin-left:24px !important}.my-sm-4{margin-top:24px !important;margin-bottom:24px 
!important}.m-sm-5{margin:32px !important}.mt-sm-5{margin-top:32px !important}.mr-sm-5{margin-right:32px !important}.mb-sm-5{margin-bottom:32px !important}.ml-sm-5{margin-left:32px !important}.mt-sm-n5{margin-top:-32px !important}.mr-sm-n5{margin-right:-32px !important}.mb-sm-n5{margin-bottom:-32px !important}.ml-sm-n5{margin-left:-32px !important}.mx-sm-5{margin-right:32px !important;margin-left:32px !important}.my-sm-5{margin-top:32px !important;margin-bottom:32px !important}.m-sm-6{margin:40px !important}.mt-sm-6{margin-top:40px !important}.mr-sm-6{margin-right:40px !important}.mb-sm-6{margin-bottom:40px !important}.ml-sm-6{margin-left:40px !important}.mt-sm-n6{margin-top:-40px !important}.mr-sm-n6{margin-right:-40px !important}.mb-sm-n6{margin-bottom:-40px !important}.ml-sm-n6{margin-left:-40px !important}.mx-sm-6{margin-right:40px !important;margin-left:40px !important}.my-sm-6{margin-top:40px !important;margin-bottom:40px !important}.mx-sm-auto{margin-right:auto !important;margin-left:auto !important}}@media (min-width: 768px){.m-md-0{margin:0 !important}.mt-md-0{margin-top:0 !important}.mr-md-0{margin-right:0 !important}.mb-md-0{margin-bottom:0 !important}.ml-md-0{margin-left:0 !important}.mx-md-0{margin-right:0 !important;margin-left:0 !important}.my-md-0{margin-top:0 !important;margin-bottom:0 !important}.m-md-1{margin:4px !important}.mt-md-1{margin-top:4px !important}.mr-md-1{margin-right:4px !important}.mb-md-1{margin-bottom:4px !important}.ml-md-1{margin-left:4px !important}.mt-md-n1{margin-top:-4px !important}.mr-md-n1{margin-right:-4px !important}.mb-md-n1{margin-bottom:-4px !important}.ml-md-n1{margin-left:-4px !important}.mx-md-1{margin-right:4px !important;margin-left:4px !important}.my-md-1{margin-top:4px !important;margin-bottom:4px !important}.m-md-2{margin:8px !important}.mt-md-2{margin-top:8px !important}.mr-md-2{margin-right:8px !important}.mb-md-2{margin-bottom:8px !important}.ml-md-2{margin-left:8px !important}.mt-md-n2{margin-top:-8px !important}.mr-md-n2{margin-right:-8px !important}.mb-md-n2{margin-bottom:-8px !important}.ml-md-n2{margin-left:-8px !important}.mx-md-2{margin-right:8px !important;margin-left:8px !important}.my-md-2{margin-top:8px !important;margin-bottom:8px !important}.m-md-3{margin:16px !important}.mt-md-3{margin-top:16px !important}.mr-md-3{margin-right:16px !important}.mb-md-3{margin-bottom:16px !important}.ml-md-3{margin-left:16px !important}.mt-md-n3{margin-top:-16px !important}.mr-md-n3{margin-right:-16px !important}.mb-md-n3{margin-bottom:-16px !important}.ml-md-n3{margin-left:-16px !important}.mx-md-3{margin-right:16px !important;margin-left:16px !important}.my-md-3{margin-top:16px !important;margin-bottom:16px !important}.m-md-4{margin:24px !important}.mt-md-4{margin-top:24px !important}.mr-md-4{margin-right:24px !important}.mb-md-4{margin-bottom:24px !important}.ml-md-4{margin-left:24px !important}.mt-md-n4{margin-top:-24px !important}.mr-md-n4{margin-right:-24px !important}.mb-md-n4{margin-bottom:-24px !important}.ml-md-n4{margin-left:-24px !important}.mx-md-4{margin-right:24px !important;margin-left:24px !important}.my-md-4{margin-top:24px !important;margin-bottom:24px !important}.m-md-5{margin:32px !important}.mt-md-5{margin-top:32px !important}.mr-md-5{margin-right:32px !important}.mb-md-5{margin-bottom:32px !important}.ml-md-5{margin-left:32px !important}.mt-md-n5{margin-top:-32px !important}.mr-md-n5{margin-right:-32px !important}.mb-md-n5{margin-bottom:-32px !important}.ml-md-n5{margin-left:-32px 
!important}.mx-md-5{margin-right:32px !important;margin-left:32px !important}.my-md-5{margin-top:32px !important;margin-bottom:32px !important}.m-md-6{margin:40px !important}.mt-md-6{margin-top:40px !important}.mr-md-6{margin-right:40px !important}.mb-md-6{margin-bottom:40px !important}.ml-md-6{margin-left:40px !important}.mt-md-n6{margin-top:-40px !important}.mr-md-n6{margin-right:-40px !important}.mb-md-n6{margin-bottom:-40px !important}.ml-md-n6{margin-left:-40px !important}.mx-md-6{margin-right:40px !important;margin-left:40px !important}.my-md-6{margin-top:40px !important;margin-bottom:40px !important}.mx-md-auto{margin-right:auto !important;margin-left:auto !important}}@media (min-width: 1012px){.m-lg-0{margin:0 !important}.mt-lg-0{margin-top:0 !important}.mr-lg-0{margin-right:0 !important}.mb-lg-0{margin-bottom:0 !important}.ml-lg-0{margin-left:0 !important}.mx-lg-0{margin-right:0 !important;margin-left:0 !important}.my-lg-0{margin-top:0 !important;margin-bottom:0 !important}.m-lg-1{margin:4px !important}.mt-lg-1{margin-top:4px !important}.mr-lg-1{margin-right:4px !important}.mb-lg-1{margin-bottom:4px !important}.ml-lg-1{margin-left:4px !important}.mt-lg-n1{margin-top:-4px !important}.mr-lg-n1{margin-right:-4px !important}.mb-lg-n1{margin-bottom:-4px !important}.ml-lg-n1{margin-left:-4px !important}.mx-lg-1{margin-right:4px !important;margin-left:4px !important}.my-lg-1{margin-top:4px !important;margin-bottom:4px !important}.m-lg-2{margin:8px !important}.mt-lg-2{margin-top:8px !important}.mr-lg-2{margin-right:8px !important}.mb-lg-2{margin-bottom:8px !important}.ml-lg-2{margin-left:8px !important}.mt-lg-n2{margin-top:-8px !important}.mr-lg-n2{margin-right:-8px !important}.mb-lg-n2{margin-bottom:-8px !important}.ml-lg-n2{margin-left:-8px !important}.mx-lg-2{margin-right:8px !important;margin-left:8px !important}.my-lg-2{margin-top:8px !important;margin-bottom:8px !important}.m-lg-3{margin:16px !important}.mt-lg-3{margin-top:16px !important}.mr-lg-3{margin-right:16px !important}.mb-lg-3{margin-bottom:16px !important}.ml-lg-3{margin-left:16px !important}.mt-lg-n3{margin-top:-16px !important}.mr-lg-n3{margin-right:-16px !important}.mb-lg-n3{margin-bottom:-16px !important}.ml-lg-n3{margin-left:-16px !important}.mx-lg-3{margin-right:16px !important;margin-left:16px !important}.my-lg-3{margin-top:16px !important;margin-bottom:16px !important}.m-lg-4{margin:24px !important}.mt-lg-4{margin-top:24px !important}.mr-lg-4{margin-right:24px !important}.mb-lg-4{margin-bottom:24px !important}.ml-lg-4{margin-left:24px !important}.mt-lg-n4{margin-top:-24px !important}.mr-lg-n4{margin-right:-24px !important}.mb-lg-n4{margin-bottom:-24px !important}.ml-lg-n4{margin-left:-24px !important}.mx-lg-4{margin-right:24px !important;margin-left:24px !important}.my-lg-4{margin-top:24px !important;margin-bottom:24px !important}.m-lg-5{margin:32px !important}.mt-lg-5{margin-top:32px !important}.mr-lg-5{margin-right:32px !important}.mb-lg-5{margin-bottom:32px !important}.ml-lg-5{margin-left:32px !important}.mt-lg-n5{margin-top:-32px !important}.mr-lg-n5{margin-right:-32px !important}.mb-lg-n5{margin-bottom:-32px !important}.ml-lg-n5{margin-left:-32px !important}.mx-lg-5{margin-right:32px !important;margin-left:32px !important}.my-lg-5{margin-top:32px !important;margin-bottom:32px !important}.m-lg-6{margin:40px !important}.mt-lg-6{margin-top:40px !important}.mr-lg-6{margin-right:40px !important}.mb-lg-6{margin-bottom:40px !important}.ml-lg-6{margin-left:40px !important}.mt-lg-n6{margin-top:-40px 
!important}.mr-lg-n6{margin-right:-40px !important}.mb-lg-n6{margin-bottom:-40px !important}.ml-lg-n6{margin-left:-40px !important}.mx-lg-6{margin-right:40px !important;margin-left:40px !important}.my-lg-6{margin-top:40px !important;margin-bottom:40px !important}.mx-lg-auto{margin-right:auto !important;margin-left:auto !important}}@media (min-width: 1280px){.m-xl-0{margin:0 !important}.mt-xl-0{margin-top:0 !important}.mr-xl-0{margin-right:0 !important}.mb-xl-0{margin-bottom:0 !important}.ml-xl-0{margin-left:0 !important}.mx-xl-0{margin-right:0 !important;margin-left:0 !important}.my-xl-0{margin-top:0 !important;margin-bottom:0 !important}.m-xl-1{margin:4px !important}.mt-xl-1{margin-top:4px !important}.mr-xl-1{margin-right:4px !important}.mb-xl-1{margin-bottom:4px !important}.ml-xl-1{margin-left:4px !important}.mt-xl-n1{margin-top:-4px !important}.mr-xl-n1{margin-right:-4px !important}.mb-xl-n1{margin-bottom:-4px !important}.ml-xl-n1{margin-left:-4px !important}.mx-xl-1{margin-right:4px !important;margin-left:4px !important}.my-xl-1{margin-top:4px !important;margin-bottom:4px !important}.m-xl-2{margin:8px !important}.mt-xl-2{margin-top:8px !important}.mr-xl-2{margin-right:8px !important}.mb-xl-2{margin-bottom:8px !important}.ml-xl-2{margin-left:8px !important}.mt-xl-n2{margin-top:-8px !important}.mr-xl-n2{margin-right:-8px !important}.mb-xl-n2{margin-bottom:-8px !important}.ml-xl-n2{margin-left:-8px !important}.mx-xl-2{margin-right:8px !important;margin-left:8px !important}.my-xl-2{margin-top:8px !important;margin-bottom:8px !important}.m-xl-3{margin:16px !important}.mt-xl-3{margin-top:16px !important}.mr-xl-3{margin-right:16px !important}.mb-xl-3{margin-bottom:16px !important}.ml-xl-3{margin-left:16px !important}.mt-xl-n3{margin-top:-16px !important}.mr-xl-n3{margin-right:-16px !important}.mb-xl-n3{margin-bottom:-16px !important}.ml-xl-n3{margin-left:-16px !important}.mx-xl-3{margin-right:16px !important;margin-left:16px !important}.my-xl-3{margin-top:16px !important;margin-bottom:16px !important}.m-xl-4{margin:24px !important}.mt-xl-4{margin-top:24px !important}.mr-xl-4{margin-right:24px !important}.mb-xl-4{margin-bottom:24px !important}.ml-xl-4{margin-left:24px !important}.mt-xl-n4{margin-top:-24px !important}.mr-xl-n4{margin-right:-24px !important}.mb-xl-n4{margin-bottom:-24px !important}.ml-xl-n4{margin-left:-24px !important}.mx-xl-4{margin-right:24px !important;margin-left:24px !important}.my-xl-4{margin-top:24px !important;margin-bottom:24px !important}.m-xl-5{margin:32px !important}.mt-xl-5{margin-top:32px !important}.mr-xl-5{margin-right:32px !important}.mb-xl-5{margin-bottom:32px !important}.ml-xl-5{margin-left:32px !important}.mt-xl-n5{margin-top:-32px !important}.mr-xl-n5{margin-right:-32px !important}.mb-xl-n5{margin-bottom:-32px !important}.ml-xl-n5{margin-left:-32px !important}.mx-xl-5{margin-right:32px !important;margin-left:32px !important}.my-xl-5{margin-top:32px !important;margin-bottom:32px !important}.m-xl-6{margin:40px !important}.mt-xl-6{margin-top:40px !important}.mr-xl-6{margin-right:40px !important}.mb-xl-6{margin-bottom:40px !important}.ml-xl-6{margin-left:40px !important}.mt-xl-n6{margin-top:-40px !important}.mr-xl-n6{margin-right:-40px !important}.mb-xl-n6{margin-bottom:-40px !important}.ml-xl-n6{margin-left:-40px !important}.mx-xl-6{margin-right:40px !important;margin-left:40px !important}.my-xl-6{margin-top:40px !important;margin-bottom:40px !important}.mx-xl-auto{margin-right:auto !important;margin-left:auto !important}}.p-0{padding:0 
!important}.pt-0{padding-top:0 !important}.pr-0{padding-right:0 !important}.pb-0{padding-bottom:0 !important}.pl-0{padding-left:0 !important}.px-0{padding-right:0 !important;padding-left:0 !important}.py-0{padding-top:0 !important;padding-bottom:0 !important}.p-1{padding:4px !important}.pt-1{padding-top:4px !important}.pr-1{padding-right:4px !important}.pb-1{padding-bottom:4px !important}.pl-1{padding-left:4px !important}.px-1{padding-right:4px !important;padding-left:4px !important}.py-1{padding-top:4px !important;padding-bottom:4px !important}.p-2{padding:8px !important}.pt-2{padding-top:8px !important}.pr-2{padding-right:8px !important}.pb-2{padding-bottom:8px !important}.pl-2{padding-left:8px !important}.px-2{padding-right:8px !important;padding-left:8px !important}.py-2{padding-top:8px !important;padding-bottom:8px !important}.p-3{padding:16px !important}.pt-3{padding-top:16px !important}.pr-3{padding-right:16px !important}.pb-3{padding-bottom:16px !important}.pl-3{padding-left:16px !important}.px-3{padding-right:16px !important;padding-left:16px !important}.py-3{padding-top:16px !important;padding-bottom:16px !important}.p-4{padding:24px !important}.pt-4{padding-top:24px !important}.pr-4{padding-right:24px !important}.pb-4{padding-bottom:24px !important}.pl-4{padding-left:24px !important}.px-4{padding-right:24px !important;padding-left:24px !important}.py-4{padding-top:24px !important;padding-bottom:24px !important}.p-5{padding:32px !important}.pt-5{padding-top:32px !important}.pr-5{padding-right:32px !important}.pb-5{padding-bottom:32px !important}.pl-5{padding-left:32px !important}.px-5{padding-right:32px !important;padding-left:32px !important}.py-5{padding-top:32px !important;padding-bottom:32px !important}.p-6{padding:40px !important}.pt-6{padding-top:40px !important}.pr-6{padding-right:40px !important}.pb-6{padding-bottom:40px !important}.pl-6{padding-left:40px !important}.px-6{padding-right:40px !important;padding-left:40px !important}.py-6{padding-top:40px !important;padding-bottom:40px !important}@media (min-width: 544px){.p-sm-0{padding:0 !important}.pt-sm-0{padding-top:0 !important}.pr-sm-0{padding-right:0 !important}.pb-sm-0{padding-bottom:0 !important}.pl-sm-0{padding-left:0 !important}.px-sm-0{padding-right:0 !important;padding-left:0 !important}.py-sm-0{padding-top:0 !important;padding-bottom:0 !important}.p-sm-1{padding:4px !important}.pt-sm-1{padding-top:4px !important}.pr-sm-1{padding-right:4px !important}.pb-sm-1{padding-bottom:4px !important}.pl-sm-1{padding-left:4px !important}.px-sm-1{padding-right:4px !important;padding-left:4px !important}.py-sm-1{padding-top:4px !important;padding-bottom:4px !important}.p-sm-2{padding:8px !important}.pt-sm-2{padding-top:8px !important}.pr-sm-2{padding-right:8px !important}.pb-sm-2{padding-bottom:8px !important}.pl-sm-2{padding-left:8px !important}.px-sm-2{padding-right:8px !important;padding-left:8px !important}.py-sm-2{padding-top:8px !important;padding-bottom:8px !important}.p-sm-3{padding:16px !important}.pt-sm-3{padding-top:16px !important}.pr-sm-3{padding-right:16px !important}.pb-sm-3{padding-bottom:16px !important}.pl-sm-3{padding-left:16px !important}.px-sm-3{padding-right:16px !important;padding-left:16px !important}.py-sm-3{padding-top:16px !important;padding-bottom:16px !important}.p-sm-4{padding:24px !important}.pt-sm-4{padding-top:24px !important}.pr-sm-4{padding-right:24px !important}.pb-sm-4{padding-bottom:24px !important}.pl-sm-4{padding-left:24px !important}.px-sm-4{padding-right:24px 
!important;padding-left:24px !important}.py-sm-4{padding-top:24px !important;padding-bottom:24px !important}.p-sm-5{padding:32px !important}.pt-sm-5{padding-top:32px !important}.pr-sm-5{padding-right:32px !important}.pb-sm-5{padding-bottom:32px !important}.pl-sm-5{padding-left:32px !important}.px-sm-5{padding-right:32px !important;padding-left:32px !important}.py-sm-5{padding-top:32px !important;padding-bottom:32px !important}.p-sm-6{padding:40px !important}.pt-sm-6{padding-top:40px !important}.pr-sm-6{padding-right:40px !important}.pb-sm-6{padding-bottom:40px !important}.pl-sm-6{padding-left:40px !important}.px-sm-6{padding-right:40px !important;padding-left:40px !important}.py-sm-6{padding-top:40px !important;padding-bottom:40px !important}}@media (min-width: 768px){.p-md-0{padding:0 !important}.pt-md-0{padding-top:0 !important}.pr-md-0{padding-right:0 !important}.pb-md-0{padding-bottom:0 !important}.pl-md-0{padding-left:0 !important}.px-md-0{padding-right:0 !important;padding-left:0 !important}.py-md-0{padding-top:0 !important;padding-bottom:0 !important}.p-md-1{padding:4px !important}.pt-md-1{padding-top:4px !important}.pr-md-1{padding-right:4px !important}.pb-md-1{padding-bottom:4px !important}.pl-md-1{padding-left:4px !important}.px-md-1{padding-right:4px !important;padding-left:4px !important}.py-md-1{padding-top:4px !important;padding-bottom:4px !important}.p-md-2{padding:8px !important}.pt-md-2{padding-top:8px !important}.pr-md-2{padding-right:8px !important}.pb-md-2{padding-bottom:8px !important}.pl-md-2{padding-left:8px !important}.px-md-2{padding-right:8px !important;padding-left:8px !important}.py-md-2{padding-top:8px !important;padding-bottom:8px !important}.p-md-3{padding:16px !important}.pt-md-3{padding-top:16px !important}.pr-md-3{padding-right:16px !important}.pb-md-3{padding-bottom:16px !important}.pl-md-3{padding-left:16px !important}.px-md-3{padding-right:16px !important;padding-left:16px !important}.py-md-3{padding-top:16px !important;padding-bottom:16px !important}.p-md-4{padding:24px !important}.pt-md-4{padding-top:24px !important}.pr-md-4{padding-right:24px !important}.pb-md-4{padding-bottom:24px !important}.pl-md-4{padding-left:24px !important}.px-md-4{padding-right:24px !important;padding-left:24px !important}.py-md-4{padding-top:24px !important;padding-bottom:24px !important}.p-md-5{padding:32px !important}.pt-md-5{padding-top:32px !important}.pr-md-5{padding-right:32px !important}.pb-md-5{padding-bottom:32px !important}.pl-md-5{padding-left:32px !important}.px-md-5{padding-right:32px !important;padding-left:32px !important}.py-md-5{padding-top:32px !important;padding-bottom:32px !important}.p-md-6{padding:40px !important}.pt-md-6{padding-top:40px !important}.pr-md-6{padding-right:40px !important}.pb-md-6{padding-bottom:40px !important}.pl-md-6{padding-left:40px !important}.px-md-6{padding-right:40px !important;padding-left:40px !important}.py-md-6{padding-top:40px !important;padding-bottom:40px !important}}@media (min-width: 1012px){.p-lg-0{padding:0 !important}.pt-lg-0{padding-top:0 !important}.pr-lg-0{padding-right:0 !important}.pb-lg-0{padding-bottom:0 !important}.pl-lg-0{padding-left:0 !important}.px-lg-0{padding-right:0 !important;padding-left:0 !important}.py-lg-0{padding-top:0 !important;padding-bottom:0 !important}.p-lg-1{padding:4px !important}.pt-lg-1{padding-top:4px !important}.pr-lg-1{padding-right:4px !important}.pb-lg-1{padding-bottom:4px !important}.pl-lg-1{padding-left:4px !important}.px-lg-1{padding-right:4px !important;padding-left:4px 
!important}.py-lg-1{padding-top:4px !important;padding-bottom:4px !important}.p-lg-2{padding:8px !important}.pt-lg-2{padding-top:8px !important}.pr-lg-2{padding-right:8px !important}.pb-lg-2{padding-bottom:8px !important}.pl-lg-2{padding-left:8px !important}.px-lg-2{padding-right:8px !important;padding-left:8px !important}.py-lg-2{padding-top:8px !important;padding-bottom:8px !important}.p-lg-3{padding:16px !important}.pt-lg-3{padding-top:16px !important}.pr-lg-3{padding-right:16px !important}.pb-lg-3{padding-bottom:16px !important}.pl-lg-3{padding-left:16px !important}.px-lg-3{padding-right:16px !important;padding-left:16px !important}.py-lg-3{padding-top:16px !important;padding-bottom:16px !important}.p-lg-4{padding:24px !important}.pt-lg-4{padding-top:24px !important}.pr-lg-4{padding-right:24px !important}.pb-lg-4{padding-bottom:24px !important}.pl-lg-4{padding-left:24px !important}.px-lg-4{padding-right:24px !important;padding-left:24px !important}.py-lg-4{padding-top:24px !important;padding-bottom:24px !important}.p-lg-5{padding:32px !important}.pt-lg-5{padding-top:32px !important}.pr-lg-5{padding-right:32px !important}.pb-lg-5{padding-bottom:32px !important}.pl-lg-5{padding-left:32px !important}.px-lg-5{padding-right:32px !important;padding-left:32px !important}.py-lg-5{padding-top:32px !important;padding-bottom:32px !important}.p-lg-6{padding:40px !important}.pt-lg-6{padding-top:40px !important}.pr-lg-6{padding-right:40px !important}.pb-lg-6{padding-bottom:40px !important}.pl-lg-6{padding-left:40px !important}.px-lg-6{padding-right:40px !important;padding-left:40px !important}.py-lg-6{padding-top:40px !important;padding-bottom:40px !important}}@media (min-width: 1280px){.p-xl-0{padding:0 !important}.pt-xl-0{padding-top:0 !important}.pr-xl-0{padding-right:0 !important}.pb-xl-0{padding-bottom:0 !important}.pl-xl-0{padding-left:0 !important}.px-xl-0{padding-right:0 !important;padding-left:0 !important}.py-xl-0{padding-top:0 !important;padding-bottom:0 !important}.p-xl-1{padding:4px !important}.pt-xl-1{padding-top:4px !important}.pr-xl-1{padding-right:4px !important}.pb-xl-1{padding-bottom:4px !important}.pl-xl-1{padding-left:4px !important}.px-xl-1{padding-right:4px !important;padding-left:4px !important}.py-xl-1{padding-top:4px !important;padding-bottom:4px !important}.p-xl-2{padding:8px !important}.pt-xl-2{padding-top:8px !important}.pr-xl-2{padding-right:8px !important}.pb-xl-2{padding-bottom:8px !important}.pl-xl-2{padding-left:8px !important}.px-xl-2{padding-right:8px !important;padding-left:8px !important}.py-xl-2{padding-top:8px !important;padding-bottom:8px !important}.p-xl-3{padding:16px !important}.pt-xl-3{padding-top:16px !important}.pr-xl-3{padding-right:16px !important}.pb-xl-3{padding-bottom:16px !important}.pl-xl-3{padding-left:16px !important}.px-xl-3{padding-right:16px !important;padding-left:16px !important}.py-xl-3{padding-top:16px !important;padding-bottom:16px !important}.p-xl-4{padding:24px !important}.pt-xl-4{padding-top:24px !important}.pr-xl-4{padding-right:24px !important}.pb-xl-4{padding-bottom:24px !important}.pl-xl-4{padding-left:24px !important}.px-xl-4{padding-right:24px !important;padding-left:24px !important}.py-xl-4{padding-top:24px !important;padding-bottom:24px !important}.p-xl-5{padding:32px !important}.pt-xl-5{padding-top:32px !important}.pr-xl-5{padding-right:32px !important}.pb-xl-5{padding-bottom:32px !important}.pl-xl-5{padding-left:32px !important}.px-xl-5{padding-right:32px !important;padding-left:32px !important}.py-xl-5{padding-top:32px 
!important;padding-bottom:32px !important}.p-xl-6{padding:40px !important}.pt-xl-6{padding-top:40px !important}.pr-xl-6{padding-right:40px !important}.pb-xl-6{padding-bottom:40px !important}.pl-xl-6{padding-left:40px !important}.px-xl-6{padding-right:40px !important;padding-left:40px !important}.py-xl-6{padding-top:40px !important;padding-bottom:40px !important}}.p-responsive{padding-right:16px !important;padding-left:16px !important}@media (min-width: 544px){.p-responsive{padding-right:40px !important;padding-left:40px !important}}@media (min-width: 1012px){.p-responsive{padding-right:16px !important;padding-left:16px !important}}.h1{font-size:26px !important}@media (min-width: 768px){.h1{font-size:32px !important}}.h2{font-size:22px !important}@media (min-width: 768px){.h2{font-size:24px !important}}.h3{font-size:18px !important}@media (min-width: 768px){.h3{font-size:20px !important}}.h4{font-size:16px !important}.h5{font-size:14px !important}.h6{font-size:12px !important}.h1,.h2,.h3,.h4,.h5,.h6{font-weight:600 !important}.f1{font-size:26px !important}@media (min-width: 768px){.f1{font-size:32px !important}}.f2{font-size:22px !important}@media (min-width: 768px){.f2{font-size:24px !important}}.f3{font-size:18px !important}@media (min-width: 768px){.f3{font-size:20px !important}}.f4{font-size:16px !important}@media (min-width: 768px){.f4{font-size:16px !important}}.f5{font-size:14px !important}.f6{font-size:12px !important}.f00-light{font-size:40px !important;font-weight:300 !important}@media (min-width: 768px){.f00-light{font-size:48px !important}}.f0-light{font-size:32px !important;font-weight:300 !important}@media (min-width: 768px){.f0-light{font-size:40px !important}}.f1-light{font-size:26px !important;font-weight:300 !important}@media (min-width: 768px){.f1-light{font-size:32px !important}}.f2-light{font-size:22px !important;font-weight:300 !important}@media (min-width: 768px){.f2-light{font-size:24px !important}}.f3-light{font-size:18px !important;font-weight:300 !important}@media (min-width: 768px){.f3-light{font-size:20px !important}}.text-small{font-size:12px !important}.lead{margin-bottom:30px;font-size:20px;font-weight:300;color:#586069}.lh-condensed-ultra{line-height:1 !important}.lh-condensed{line-height:1.25 !important}.lh-default{line-height:1.5 !important}.lh-0{line-height:0 !important}.text-right{text-align:right !important}.text-left{text-align:left !important}.text-center{text-align:center !important}@media (min-width: 544px){.text-sm-right{text-align:right !important}.text-sm-left{text-align:left !important}.text-sm-center{text-align:center !important}}@media (min-width: 768px){.text-md-right{text-align:right !important}.text-md-left{text-align:left !important}.text-md-center{text-align:center !important}}@media (min-width: 1012px){.text-lg-right{text-align:right !important}.text-lg-left{text-align:left !important}.text-lg-center{text-align:center !important}}@media (min-width: 1280px){.text-xl-right{text-align:right !important}.text-xl-left{text-align:left !important}.text-xl-center{text-align:center !important}}.text-normal{font-weight:400 !important}.text-bold{font-weight:600 !important}.text-italic{font-style:italic !important}.text-uppercase{text-transform:uppercase !important}.text-underline{text-decoration:underline !important}.no-underline{text-decoration:none !important}.no-wrap{white-space:nowrap !important}.ws-normal{white-space:normal !important}.wb-break-all{word-break:break-all 
!important}.text-emphasized{font-weight:600;color:#24292e}.list-style-none{list-style:none !important}.text-shadow-dark{text-shadow:0 1px 1px rgba(27,31,35,0.25),0 1px 25px rgba(27,31,35,0.75)}.text-shadow-light{text-shadow:0 1px 0 rgba(255,255,255,0.5)}.text-mono{font-family:"SFMono-Regular",Consolas,"Liberation Mono",Menlo,Courier,monospace}.user-select-none{-webkit-user-select:none !important;-moz-user-select:none !important;-ms-user-select:none !important;user-select:none !important}.d-block{display:block !important}.d-flex{display:flex !important}.d-inline{display:inline !important}.d-inline-block{display:inline-block !important}.d-inline-flex{display:inline-flex !important}.d-none{display:none !important}.d-table{display:table !important}.d-table-cell{display:table-cell !important}@media (min-width: 544px){.d-sm-block{display:block !important}.d-sm-flex{display:flex !important}.d-sm-inline{display:inline !important}.d-sm-inline-block{display:inline-block !important}.d-sm-inline-flex{display:inline-flex !important}.d-sm-none{display:none !important}.d-sm-table{display:table !important}.d-sm-table-cell{display:table-cell !important}}@media (min-width: 768px){.d-md-block{display:block !important}.d-md-flex{display:flex !important}.d-md-inline{display:inline !important}.d-md-inline-block{display:inline-block !important}.d-md-inline-flex{display:inline-flex !important}.d-md-none{display:none !important}.d-md-table{display:table !important}.d-md-table-cell{display:table-cell !important}}@media (min-width: 1012px){.d-lg-block{display:block !important}.d-lg-flex{display:flex !important}.d-lg-inline{display:inline !important}.d-lg-inline-block{display:inline-block !important}.d-lg-inline-flex{display:inline-flex !important}.d-lg-none{display:none !important}.d-lg-table{display:table !important}.d-lg-table-cell{display:table-cell !important}}@media (min-width: 1280px){.d-xl-block{display:block !important}.d-xl-flex{display:flex !important}.d-xl-inline{display:inline !important}.d-xl-inline-block{display:inline-block !important}.d-xl-inline-flex{display:inline-flex !important}.d-xl-none{display:none !important}.d-xl-table{display:table !important}.d-xl-table-cell{display:table-cell !important}}.v-hidden{visibility:hidden !important}.v-visible{visibility:visible !important}@media (max-width: 544px){.hide-sm{display:none !important}}@media (min-width: 544px) and (max-width: 768px){.hide-md{display:none !important}}@media (min-width: 768px) and (max-width: 1012px){.hide-lg{display:none !important}}@media (min-width: 1012px){.hide-xl{display:none !important}}.table-fixed{table-layout:fixed !important}.sr-only{position:absolute;width:1px;height:1px;padding:0;overflow:hidden;clip:rect(0, 0, 0, 0);word-wrap:normal;border:0}.show-on-focus{position:absolute;width:1px;height:1px;margin:0;overflow:hidden;clip:rect(1px, 1px, 1px, 
1px)}.show-on-focus:focus{z-index:20;width:auto;height:auto;clip:auto}.container{width:980px;margin-right:auto;margin-left:auto}.container::before{display:table;content:""}.container::after{display:table;clear:both;content:""}.container-md{max-width:768px;margin-right:auto;margin-left:auto}.container-lg{max-width:1012px;margin-right:auto;margin-left:auto}.container-xl{max-width:1280px;margin-right:auto;margin-left:auto}.columns{margin-right:-10px;margin-left:-10px}.columns::before{display:table;content:""}.columns::after{display:table;clear:both;content:""}.column{float:left;padding-right:10px;padding-left:10px}.one-third{width:33.333333%}.two-thirds{width:66.666667%}.one-fourth{width:25%}.one-half{width:50%}.three-fourths{width:75%}.one-fifth{width:20%}.four-fifths{width:80%}.centered{display:block;float:none;margin-right:auto;margin-left:auto}.col-1{width:8.3333333333%}.col-2{width:16.6666666667%}.col-3{width:25%}.col-4{width:33.3333333333%}.col-5{width:41.6666666667%}.col-6{width:50%}.col-7{width:58.3333333333%}.col-8{width:66.6666666667%}.col-9{width:75%}.col-10{width:83.3333333333%}.col-11{width:91.6666666667%}.col-12{width:100%}@media (min-width: 544px){.col-sm-1{width:8.3333333333%}.col-sm-2{width:16.6666666667%}.col-sm-3{width:25%}.col-sm-4{width:33.3333333333%}.col-sm-5{width:41.6666666667%}.col-sm-6{width:50%}.col-sm-7{width:58.3333333333%}.col-sm-8{width:66.6666666667%}.col-sm-9{width:75%}.col-sm-10{width:83.3333333333%}.col-sm-11{width:91.6666666667%}.col-sm-12{width:100%}}@media (min-width: 768px){.col-md-1{width:8.3333333333%}.col-md-2{width:16.6666666667%}.col-md-3{width:25%}.col-md-4{width:33.3333333333%}.col-md-5{width:41.6666666667%}.col-md-6{width:50%}.col-md-7{width:58.3333333333%}.col-md-8{width:66.6666666667%}.col-md-9{width:75%}.col-md-10{width:83.3333333333%}.col-md-11{width:91.6666666667%}.col-md-12{width:100%}}@media (min-width: 1012px){.col-lg-1{width:8.3333333333%}.col-lg-2{width:16.6666666667%}.col-lg-3{width:25%}.col-lg-4{width:33.3333333333%}.col-lg-5{width:41.6666666667%}.col-lg-6{width:50%}.col-lg-7{width:58.3333333333%}.col-lg-8{width:66.6666666667%}.col-lg-9{width:75%}.col-lg-10{width:83.3333333333%}.col-lg-11{width:91.6666666667%}.col-lg-12{width:100%}}@media (min-width: 1280px){.col-xl-1{width:8.3333333333%}.col-xl-2{width:16.6666666667%}.col-xl-3{width:25%}.col-xl-4{width:33.3333333333%}.col-xl-5{width:41.6666666667%}.col-xl-6{width:50%}.col-xl-7{width:58.3333333333%}.col-xl-8{width:66.6666666667%}.col-xl-9{width:75%}.col-xl-10{width:83.3333333333%}.col-xl-11{width:91.6666666667%}.col-xl-12{width:100%}}.gutter{margin-right:-16px;margin-left:-16px}.gutter>[class*="col-"]{padding-right:16px !important;padding-left:16px !important}.gutter-condensed{margin-right:-8px;margin-left:-8px}.gutter-condensed>[class*="col-"]{padding-right:8px !important;padding-left:8px !important}.gutter-spacious{margin-right:-24px;margin-left:-24px}.gutter-spacious>[class*="col-"]{padding-right:24px !important;padding-left:24px !important}@media (min-width: 544px){.gutter-sm{margin-right:-16px;margin-left:-16px}.gutter-sm>[class*="col-"]{padding-right:16px !important;padding-left:16px !important}.gutter-sm-condensed{margin-right:-8px;margin-left:-8px}.gutter-sm-condensed>[class*="col-"]{padding-right:8px !important;padding-left:8px !important}.gutter-sm-spacious{margin-right:-24px;margin-left:-24px}.gutter-sm-spacious>[class*="col-"]{padding-right:24px !important;padding-left:24px !important}}@media (min-width: 
768px){.gutter-md{margin-right:-16px;margin-left:-16px}.gutter-md>[class*="col-"]{padding-right:16px !important;padding-left:16px !important}.gutter-md-condensed{margin-right:-8px;margin-left:-8px}.gutter-md-condensed>[class*="col-"]{padding-right:8px !important;padding-left:8px !important}.gutter-md-spacious{margin-right:-24px;margin-left:-24px}.gutter-md-spacious>[class*="col-"]{padding-right:24px !important;padding-left:24px !important}}@media (min-width: 1012px){.gutter-lg{margin-right:-16px;margin-left:-16px}.gutter-lg>[class*="col-"]{padding-right:16px !important;padding-left:16px !important}.gutter-lg-condensed{margin-right:-8px;margin-left:-8px}.gutter-lg-condensed>[class*="col-"]{padding-right:8px !important;padding-left:8px !important}.gutter-lg-spacious{margin-right:-24px;margin-left:-24px}.gutter-lg-spacious>[class*="col-"]{padding-right:24px !important;padding-left:24px !important}}@media (min-width: 1280px){.gutter-xl{margin-right:-16px;margin-left:-16px}.gutter-xl>[class*="col-"]{padding-right:16px !important;padding-left:16px !important}.gutter-xl-condensed{margin-right:-8px;margin-left:-8px}.gutter-xl-condensed>[class*="col-"]{padding-right:8px !important;padding-left:8px !important}.gutter-xl-spacious{margin-right:-24px;margin-left:-24px}.gutter-xl-spacious>[class*="col-"]{padding-right:24px !important;padding-left:24px !important}}.offset-1{margin-left:8.3333333333% !important}.offset-2{margin-left:16.6666666667% !important}.offset-3{margin-left:25% !important}.offset-4{margin-left:33.3333333333% !important}.offset-5{margin-left:41.6666666667% !important}.offset-6{margin-left:50% !important}.offset-7{margin-left:58.3333333333% !important}.offset-8{margin-left:66.6666666667% !important}.offset-9{margin-left:75% !important}.offset-10{margin-left:83.3333333333% !important}.offset-11{margin-left:91.6666666667% !important}@media (min-width: 544px){.offset-sm-1{margin-left:8.3333333333% !important}.offset-sm-2{margin-left:16.6666666667% !important}.offset-sm-3{margin-left:25% !important}.offset-sm-4{margin-left:33.3333333333% !important}.offset-sm-5{margin-left:41.6666666667% !important}.offset-sm-6{margin-left:50% !important}.offset-sm-7{margin-left:58.3333333333% !important}.offset-sm-8{margin-left:66.6666666667% !important}.offset-sm-9{margin-left:75% !important}.offset-sm-10{margin-left:83.3333333333% !important}.offset-sm-11{margin-left:91.6666666667% !important}}@media (min-width: 768px){.offset-md-1{margin-left:8.3333333333% !important}.offset-md-2{margin-left:16.6666666667% !important}.offset-md-3{margin-left:25% !important}.offset-md-4{margin-left:33.3333333333% !important}.offset-md-5{margin-left:41.6666666667% !important}.offset-md-6{margin-left:50% !important}.offset-md-7{margin-left:58.3333333333% !important}.offset-md-8{margin-left:66.6666666667% !important}.offset-md-9{margin-left:75% !important}.offset-md-10{margin-left:83.3333333333% !important}.offset-md-11{margin-left:91.6666666667% !important}}@media (min-width: 1012px){.offset-lg-1{margin-left:8.3333333333% !important}.offset-lg-2{margin-left:16.6666666667% !important}.offset-lg-3{margin-left:25% !important}.offset-lg-4{margin-left:33.3333333333% !important}.offset-lg-5{margin-left:41.6666666667% !important}.offset-lg-6{margin-left:50% !important}.offset-lg-7{margin-left:58.3333333333% !important}.offset-lg-8{margin-left:66.6666666667% !important}.offset-lg-9{margin-left:75% !important}.offset-lg-10{margin-left:83.3333333333% !important}.offset-lg-11{margin-left:91.6666666667% !important}}@media 
(min-width: 1280px){.offset-xl-1{margin-left:8.3333333333% !important}.offset-xl-2{margin-left:16.6666666667% !important}.offset-xl-3{margin-left:25% !important}.offset-xl-4{margin-left:33.3333333333% !important}.offset-xl-5{margin-left:41.6666666667% !important}.offset-xl-6{margin-left:50% !important}.offset-xl-7{margin-left:58.3333333333% !important}.offset-xl-8{margin-left:66.6666666667% !important}.offset-xl-9{margin-left:75% !important}.offset-xl-10{margin-left:83.3333333333% !important}.offset-xl-11{margin-left:91.6666666667% !important}}.markdown-body{font-family:-apple-system,BlinkMacSystemFont,"Segoe UI",Helvetica,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol";font-size:16px;line-height:1.5;word-wrap:break-word}.markdown-body::before{display:table;content:""}.markdown-body::after{display:table;clear:both;content:""}.markdown-body>*:first-child{margin-top:0 !important}.markdown-body>*:last-child{margin-bottom:0 !important}.markdown-body a:not([href]){color:inherit;text-decoration:none}.markdown-body .absent{color:#cb2431}.markdown-body .anchor{float:left;padding-right:4px;margin-left:-20px;line-height:1}.markdown-body .anchor:focus{outline:none}.markdown-body p,.markdown-body blockquote,.markdown-body ul,.markdown-body ol,.markdown-body dl,.markdown-body table,.markdown-body pre{margin-top:0;margin-bottom:16px}.markdown-body hr{height:.25em;padding:0;margin:24px 0;background-color:#e1e4e8;border:0}.markdown-body blockquote{padding:0 1em;color:#6a737d;border-left:0.25em solid #dfe2e5}.markdown-body blockquote>:first-child{margin-top:0}.markdown-body blockquote>:last-child{margin-bottom:0}.markdown-body kbd{display:inline-block;padding:3px 5px;font-size:11px;line-height:10px;color:#444d56;vertical-align:middle;background-color:#fafbfc;border:solid 1px #c6cbd1;border-bottom-color:#959da5;border-radius:3px;box-shadow:inset 0 -1px 0 #959da5}.markdown-body h1,.markdown-body h2,.markdown-body h3,.markdown-body h4,.markdown-body h5,.markdown-body h6{margin-top:24px;margin-bottom:16px;font-weight:600;line-height:1.25}.markdown-body h1 .octicon-link,.markdown-body h2 .octicon-link,.markdown-body h3 .octicon-link,.markdown-body h4 .octicon-link,.markdown-body h5 .octicon-link,.markdown-body h6 .octicon-link{color:#1b1f23;vertical-align:middle;visibility:hidden}.markdown-body h1:hover .anchor,.markdown-body h2:hover .anchor,.markdown-body h3:hover .anchor,.markdown-body h4:hover .anchor,.markdown-body h5:hover .anchor,.markdown-body h6:hover .anchor{text-decoration:none}.markdown-body h1:hover .anchor .octicon-link,.markdown-body h2:hover .anchor .octicon-link,.markdown-body h3:hover .anchor .octicon-link,.markdown-body h4:hover .anchor .octicon-link,.markdown-body h5:hover .anchor .octicon-link,.markdown-body h6:hover .anchor .octicon-link{visibility:visible}.markdown-body h1 tt,.markdown-body h1 code,.markdown-body h2 tt,.markdown-body h2 code,.markdown-body h3 tt,.markdown-body h3 code,.markdown-body h4 tt,.markdown-body h4 code,.markdown-body h5 tt,.markdown-body h5 code,.markdown-body h6 tt,.markdown-body h6 code{font-size:inherit}.markdown-body h1{padding-bottom:0.3em;font-size:2em;border-bottom:1px solid #eaecef}.markdown-body h2{padding-bottom:0.3em;font-size:1.5em;border-bottom:1px solid #eaecef}.markdown-body h3{font-size:1.25em}.markdown-body h4{font-size:1em}.markdown-body h5{font-size:0.875em}.markdown-body h6{font-size:0.85em;color:#6a737d}.markdown-body ul,.markdown-body ol{padding-left:2em}.markdown-body ul.no-list,.markdown-body 
ol.no-list{padding:0;list-style-type:none}.markdown-body ul ul,.markdown-body ul ol,.markdown-body ol ol,.markdown-body ol ul{margin-top:0;margin-bottom:0}.markdown-body li{word-wrap:break-all}.markdown-body li>p{margin-top:16px}.markdown-body li+li{margin-top:.25em}.markdown-body dl{padding:0}.markdown-body dl dt{padding:0;margin-top:16px;font-size:1em;font-style:italic;font-weight:600}.markdown-body dl dd{padding:0 16px;margin-bottom:16px}.markdown-body table{display:block;width:100%;overflow:auto}.markdown-body table th{font-weight:600}.markdown-body table th,.markdown-body table td{padding:6px 13px;border:1px solid #dfe2e5}.markdown-body table tr{background-color:#fff;border-top:1px solid #c6cbd1}.markdown-body table tr:nth-child(2n){background-color:#f6f8fa}.markdown-body table img{background-color:transparent}.markdown-body img{max-width:100%;box-sizing:content-box;background-color:#fff}.markdown-body img[align=right]{padding-left:20px}.markdown-body img[align=left]{padding-right:20px}.markdown-body .emoji{max-width:none;vertical-align:text-top;background-color:transparent}.markdown-body span.frame{display:block;overflow:hidden}.markdown-body span.frame>span{display:block;float:left;width:auto;padding:7px;margin:13px 0 0;overflow:hidden;border:1px solid #dfe2e5}.markdown-body span.frame span img{display:block;float:left}.markdown-body span.frame span span{display:block;padding:5px 0 0;clear:both;color:#24292e}.markdown-body span.align-center{display:block;overflow:hidden;clear:both}.markdown-body span.align-center>span{display:block;margin:13px auto 0;overflow:hidden;text-align:center}.markdown-body span.align-center span img{margin:0 auto;text-align:center}.markdown-body span.align-right{display:block;overflow:hidden;clear:both}.markdown-body span.align-right>span{display:block;margin:13px 0 0;overflow:hidden;text-align:right}.markdown-body span.align-right span img{margin:0;text-align:right}.markdown-body span.float-left{display:block;float:left;margin-right:13px;overflow:hidden}.markdown-body span.float-left span{margin:13px 0 0}.markdown-body span.float-right{display:block;float:right;margin-left:13px;overflow:hidden}.markdown-body span.float-right>span{display:block;margin:13px auto 0;overflow:hidden;text-align:right}.markdown-body code,.markdown-body tt{padding:0.2em 0.4em;margin:0;font-size:85%;background-color:rgba(27,31,35,0.05);border-radius:3px}.markdown-body code br,.markdown-body tt br{display:none}.markdown-body del code{text-decoration:inherit}.markdown-body pre{word-wrap:normal}.markdown-body pre>code{padding:0;margin:0;font-size:100%;word-break:normal;white-space:pre;background:transparent;border:0}.markdown-body .highlight{margin-bottom:16px}.markdown-body .highlight pre{margin-bottom:0;word-break:normal}.markdown-body .highlight pre,.markdown-body pre{padding:16px;overflow:auto;font-size:85%;line-height:1.45;background-color:#f6f8fa;border-radius:3px}.markdown-body pre code,.markdown-body pre tt{display:inline;max-width:auto;padding:0;margin:0;overflow:visible;line-height:inherit;word-wrap:normal;background-color:transparent;border:0}.markdown-body .csv-data td,.markdown-body .csv-data th{padding:5px;overflow:hidden;font-size:12px;line-height:1;text-align:left;white-space:nowrap}.markdown-body .csv-data .blob-num{padding:10px 8px 9px;text-align:right;background:#fff;border:0}.markdown-body .csv-data tr{border-top:0}.markdown-body .csv-data th{font-weight:600;background:#f6f8fa;border-top:0}.highlight table td{padding:5px}.highlight table pre{margin:0}.highlight 
.cm{color:#999988;font-style:italic}.highlight .cp{color:#999999;font-weight:bold}.highlight .c1{color:#999988;font-style:italic}.highlight .cs{color:#999999;font-weight:bold;font-style:italic}.highlight .c,.highlight .cd{color:#999988;font-style:italic}.highlight .err{color:#a61717;background-color:#e3d2d2}.highlight .gd{color:#000000;background-color:#ffdddd}.highlight .ge{color:#000000;font-style:italic}.highlight .gr{color:#aa0000}.highlight .gh{color:#999999}.highlight .gi{color:#000000;background-color:#ddffdd}.highlight .go{color:#888888}.highlight .gp{color:#555555}.highlight .gs{font-weight:bold}.highlight .gu{color:#aaaaaa}.highlight .gt{color:#aa0000}.highlight .kc{color:#000000;font-weight:bold}.highlight .kd{color:#000000;font-weight:bold}.highlight .kn{color:#000000;font-weight:bold}.highlight .kp{color:#000000;font-weight:bold}.highlight .kr{color:#000000;font-weight:bold}.highlight .kt{color:#445588;font-weight:bold}.highlight .k,.highlight .kv{color:#000000;font-weight:bold}.highlight .mf{color:#009999}.highlight .mh{color:#009999}.highlight .il{color:#009999}.highlight .mi{color:#009999}.highlight .mo{color:#009999}.highlight .m,.highlight .mb,.highlight .mx{color:#009999}.highlight .sb{color:#d14}.highlight .sc{color:#d14}.highlight .sd{color:#d14}.highlight .s2{color:#d14}.highlight .se{color:#d14}.highlight .sh{color:#d14}.highlight .si{color:#d14}.highlight .sx{color:#d14}.highlight .sr{color:#009926}.highlight .s1{color:#d14}.highlight .ss{color:#990073}.highlight .s{color:#d14}.highlight .na{color:#008080}.highlight .bp{color:#999999}.highlight .nb{color:#0086B3}.highlight .nc{color:#445588;font-weight:bold}.highlight .no{color:#008080}.highlight .nd{color:#3c5d5d;font-weight:bold}.highlight .ni{color:#800080}.highlight .ne{color:#990000;font-weight:bold}.highlight .nf{color:#990000;font-weight:bold}.highlight .nl{color:#990000;font-weight:bold}.highlight .nn{color:#555555}.highlight .nt{color:#000080}.highlight .vc{color:#008080}.highlight .vg{color:#008080}.highlight .vi{color:#008080}.highlight .nv{color:#008080}.highlight .ow{color:#000000;font-weight:bold}.highlight .o{color:#000000;font-weight:bold}.highlight .w{color:#bbbbbb}.highlight{background-color:#f8f8f8} diff --git a/assets/external-links-new-tab.js b/assets/external-links-new-tab.js new file mode 100644 index 000000000..e111a7cea --- /dev/null +++ b/assets/external-links-new-tab.js @@ -0,0 +1,11 @@ +var links = document.links; + +for (var i = 0; i < links.length; i++) { + if (links[i].hostname == 'localhost' || links[i].hostname == '127.0.0.1') { + continue; + } + + if (!links[i].hostname.includes("pytorch.kr")) { + links[i].target = '_blank'; + } +} \ No newline at end of file diff --git a/assets/filter-hub-tags.js b/assets/filter-hub-tags.js new file mode 100644 index 000000000..65e59f033 --- /dev/null +++ b/assets/filter-hub-tags.js @@ -0,0 +1,103 @@ +var filterScript = $("script[src*=filter-hub-tags]"); +var listId = filterScript.attr("list-id"); +var displayCount = Number(filterScript.attr("display-count")); +var pagination = filterScript.attr("pagination"); + +var options = { + valueNames: ["github-stars-count-whole-number", { data: ["tags", "date-added", "title"] }], + page: displayCount +}; + +$(".next-news-item").on("click" , function(){ + $(".pagination").find(".active").next().trigger( "click" ); +}); + +$(".previous-news-item").on("click" , function(){ + $(".pagination").find(".active").prev().trigger( "click" ); +}); + +// Only the hub index page should have pagination + +if (pagination 
== "true") { + options.pagination = true; +} + +var hubList = new List(listId, options); + +function filterSelectedTags(cardTags, selectedTags) { + return cardTags.some(function(tag) { + return selectedTags.some(function(selectedTag) { + return selectedTag == tag; + }); + }); +} + +function updateList() { + var selectedTags = []; + + $(".selected").each(function() { + selectedTags.push($(this).data("tag")); + }); + + hubList.filter(function(item) { + var cardTags = item.values().tags.split(","); + + if (selectedTags.length == 0) { + return true; + } else { + return filterSelectedTags(cardTags, selectedTags); + } + }); +} + +$(".filter-btn").on("click", function() { + if ($(this).data("tag") == "all") { + $(this).addClass("all-tag-selected"); + $(".filter").removeClass("selected"); + } else { + $(this).toggleClass("selected"); + $("[data-tag='all']").removeClass("all-tag-selected"); + } + + // If no tags are selected then highlight the 'All' tag + + if (!$(".selected")[0]) { + $("[data-tag='all']").addClass("all-tag-selected"); + } + + updateList(); +}); + +//Scroll back to top of hub cards on click of next/previous page button + +$(document).on("click", ".page", function(e) { + e.preventDefault(); + $('html, body').animate( + {scrollTop: $("#pagination-scroll").position().top}, + 'slow' + ); +}); + +$("#sortLowLeft").on("click", function() { + hubList.sort("github-stars-count-whole-number", { order: "asc" }); +}); + +$("#sortHighLeft").on("click", function() { + hubList.sort("github-stars-count-whole-number", { order: "desc" }); +}); + +$("#sortDateNew").on("click", function() { + hubList.sort("date-added", { order: "desc" }); +}); + +$("#sortDateOld").on("click", function() { + hubList.sort("date-added", { order: "asc" }); +}); + +$("#sortTitleLow").on("click", function() { + hubList.sort("title", { order: "desc" }); +}); + +$("#sortTitleHigh").on("click", function() { + hubList.sort("title", { order: "asc" }); +}); diff --git a/assets/fonts/FreightSans/freight-sans-bold-italic.woff b/assets/fonts/FreightSans/freight-sans-bold-italic.woff new file mode 100755 index 000000000..e31724842 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-bold-italic.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-bold-italic.woff2 b/assets/fonts/FreightSans/freight-sans-bold-italic.woff2 new file mode 100755 index 000000000..cec2dc94f Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-bold-italic.woff2 differ diff --git a/assets/fonts/FreightSans/freight-sans-bold.woff b/assets/fonts/FreightSans/freight-sans-bold.woff new file mode 100755 index 000000000..de46625ed Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-bold.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-bold.woff2 b/assets/fonts/FreightSans/freight-sans-bold.woff2 new file mode 100755 index 000000000..dc05cd82b Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-bold.woff2 differ diff --git a/assets/fonts/FreightSans/freight-sans-book-italic.woff b/assets/fonts/FreightSans/freight-sans-book-italic.woff new file mode 100755 index 000000000..a50e5038a Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-book-italic.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-book-italic.woff2 b/assets/fonts/FreightSans/freight-sans-book-italic.woff2 new file mode 100755 index 000000000..fe284db66 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-book-italic.woff2 differ diff --git 
a/assets/fonts/FreightSans/freight-sans-book.woff b/assets/fonts/FreightSans/freight-sans-book.woff new file mode 100755 index 000000000..6ab8775f0 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-book.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-book.woff2 b/assets/fonts/FreightSans/freight-sans-book.woff2 new file mode 100755 index 000000000..2688739f1 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-book.woff2 differ diff --git a/assets/fonts/FreightSans/freight-sans-light-italic.woff b/assets/fonts/FreightSans/freight-sans-light-italic.woff new file mode 100755 index 000000000..beda58d4e Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-light-italic.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-light-italic.woff2 b/assets/fonts/FreightSans/freight-sans-light-italic.woff2 new file mode 100755 index 000000000..e2fa0134b Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-light-italic.woff2 differ diff --git a/assets/fonts/FreightSans/freight-sans-light.woff b/assets/fonts/FreightSans/freight-sans-light.woff new file mode 100755 index 000000000..226a0bf83 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-light.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-light.woff2 b/assets/fonts/FreightSans/freight-sans-light.woff2 new file mode 100755 index 000000000..6d8ff2c04 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-light.woff2 differ diff --git a/assets/fonts/FreightSans/freight-sans-medium-italic.woff b/assets/fonts/FreightSans/freight-sans-medium-italic.woff new file mode 100644 index 000000000..a42115d63 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-medium-italic.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-medium-italic.woff2 b/assets/fonts/FreightSans/freight-sans-medium-italic.woff2 new file mode 100644 index 000000000..16a7713a4 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-medium-italic.woff2 differ diff --git a/assets/fonts/FreightSans/freight-sans-medium.woff b/assets/fonts/FreightSans/freight-sans-medium.woff new file mode 100755 index 000000000..5ea34539c Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-medium.woff differ diff --git a/assets/fonts/FreightSans/freight-sans-medium.woff2 b/assets/fonts/FreightSans/freight-sans-medium.woff2 new file mode 100755 index 000000000..c58b6a528 Binary files /dev/null and b/assets/fonts/FreightSans/freight-sans-medium.woff2 differ diff --git a/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff b/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff new file mode 100644 index 000000000..cf37a5c50 Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff differ diff --git a/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff2 b/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff2 new file mode 100644 index 000000000..955a6eab5 Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff2 differ diff --git a/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff b/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff new file mode 100644 index 000000000..fc65a679c Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff differ diff --git a/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff2 b/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff2 new file mode 100644 index 000000000..c352e40e3 Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff2 differ diff --git 
a/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff b/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff new file mode 100644 index 000000000..7d63d89f2 Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff differ diff --git a/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff2 b/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff2 new file mode 100644 index 000000000..d0d7ded90 Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff2 differ diff --git a/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff b/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff new file mode 100644 index 000000000..1da7753cf Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff differ diff --git a/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff2 b/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff2 new file mode 100644 index 000000000..79dffdb85 Binary files /dev/null and b/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff2 differ diff --git a/assets/get-started-sidebar.js b/assets/get-started-sidebar.js new file mode 100644 index 000000000..ff1e70204 --- /dev/null +++ b/assets/get-started-sidebar.js @@ -0,0 +1,91 @@ +// Create the sidebar menus for each OS and Cloud Partner + +$([".macos", ".linux", ".windows"]).each(function (index, osClass) { + buildSidebarMenu(osClass, "#get-started-locally-sidebar-list"); +}); + +$([".alibaba", ".aws", ".microsoft-azure", ".google-cloud"]).each(function (index, cloudPartner) { + buildSidebarMenu(cloudPartner, "#get-started-cloud-sidebar-list"); +}); + +$(["macos", "linux", "windows"]).each(function (index, osClass) { + $("#" + osClass).on("click", function () { + showSidebar(osClass, ".get-started-locally-sidebar li"); + }); +}); + +// Show cloud partner side nav on click or hide side nav if already open +$(["alibaba", "aws", "microsoft-azure", "google-cloud"]).each(function (index, sidebarClass) { + $("#" + sidebarClass).click(function () { + showSidebar(sidebarClass, ".get-started-cloud-sidebar li"); + // alibaba filter for centering cloud module + if (sidebarClass == "alibaba") { + $(".article-wrapper").parent().removeClass("col-md-8 offset-md-1").addClass("col-md-12"); + $(".cloud-nav").hide(); + } else { + $(".article-wrapper").parent().removeClass("col-md-12").addClass("col-md-8 offset-md-1"); + $(".cloud-nav").show(); + } + if ( + $("#" + sidebarClass) + .parent() + .hasClass("open") + ) { + $(".get-started-cloud-sidebar li").hide(); + $(".cloud-nav").hide(); + $(".article-wrapper").parent().removeClass("col-md-8 offset-md-1").addClass("col-md-12"); + } + }); +}); + +function buildSidebarMenu(menuClass, menuItem) { + $(menuClass + " > h2," + menuClass + " > h3").each(function (index, element) { + menuClass = menuClass.replace(".", ""); + + // If the menu item is an H3 tag then it should be indented + var indentMenuItem = $(element).get(0).tagName == "H3" ? 
"subitem" : ""; + + // Combine the menu item classes + var menuItemClasses = [menuClass, indentMenuItem].join(" "); + + $(menuItem).append( + "" + ); + }); +} + +function showSidebar(selectedClass, menuItem) { + // Hide all of the menu items at first + // Then filter for the selected OS/cloud partner + $(menuItem) + .hide() + .filter(function () { + return $(this).attr("class").includes(selectedClass); + }) + .show(); +} + +$(".get-started-locally-sidebar li").on("click", function () { + removeActiveClass(); + addActiveClass(this); +}); + +function removeActiveClass() { + $(".get-started-locally-sidebar li a").each(function () { + $(this).removeClass("active"); + }); +} + +function addActiveClass(element) { + $(element).find("a").addClass("active"); +} + +if ($("#get-started-locally-sidebar-list").text() == "") { + $("#get-started-shortcuts-menu").hide(); +} diff --git a/assets/github-stars.js b/assets/github-stars.js new file mode 100644 index 000000000..8f2d99492 --- /dev/null +++ b/assets/github-stars.js @@ -0,0 +1,79 @@ +var githubStarsScript = $("script[src*=github-stars]"); +var starCountCallDate = githubStarsScript.attr("star-count-call-date"); +var starCountData = githubStarsScript.attr("star-count-data"); +var ecosystemStars = githubStarsScript.attr("ecosystem"); +var cloudfrontUrl = ""; + +if (ecosystemStars == "true") { + cloudfrontUrl = "https://d2ze5o8gurgoho.cloudfront.net/star-count"; +} +else { + cloudfrontUrl = "https://du4l4liqvfo92.cloudfront.net/star-count"; +} + +var today = new Date(); +var starCountCallDateParsed = new Date( + parseInt(localStorage.getItem(starCountCallDate), 10) +); + +if ( + Date.parse(today) > + starCountCallDateParsed.setDate(starCountCallDateParsed.getDate() + 7) || + localStorage.getItem(starCountCallDate) == null +) { + updateStarCount(); +} else { + useLocalStorageStarCount(); +} + +function updateStarCount() { + console.log("Updated star count fetched"); + $.getJSON(cloudfrontUrl, function (data) { + localStorage.setItem(starCountCallDate, Date.parse(today)); + localStorage.setItem(starCountData, JSON.stringify(data)); + + updateStarsOnPage(data); + }); +} + +function useLocalStorageStarCount() { + var data = JSON.parse(localStorage.getItem(starCountData)); + + updateStarsOnPage(data); +} + +// Loop through each card and add the star count +// Once each card has its star count then the pagination script is added + +function updateStarsOnPage(data) { + return new Promise(function (resolve, reject) { + for (var i = 0; i < data.length; i++) { + var starCount = data[i].stars; + if (starCount > 999) { + starCount = numeral(starCount).format("0.0a"); + } else if (starCount > 9999) { + starCount = numeral(starCount).format("0.00a"); + } + $("[data-id='" + data[i].id + "'] .github-stars-count-whole-number").html(data[i].stars); + $("[data-id='" + data[i].id + "'] .github-stars-count").html(starCount); + } + resolve( + $("#filter-script").html(addFilterScript()) + ); + }); +} + +function addFilterScript() { + var data = $("#filter-script").data(); + + var script = + ""; + + return script; +} diff --git a/assets/hub-buttons.js b/assets/hub-buttons.js new file mode 100644 index 000000000..ce7de5943 --- /dev/null +++ b/assets/hub-buttons.js @@ -0,0 +1,40 @@ +var numberOfCardsToShow = 3; + +$(".cards-left > .col-md-12, .cards-right > .col-md-12") + .filter(function() { + return $(this).attr("data-item-count") > numberOfCardsToShow; + }) + .hide(); + +$("#development-models").on("click", function() { + showCards(this, "#development-models-hide", 
".cards-right > .col-md-12"); +}); + +$("#development-models-hide").on("click", function() { + hideCards(this, "#development-models", ".cards-right > .col-md-12"); +}); + +$("#research-models").on("click", function() { + showCards(this, "#research-models-hide", ".cards-left > .col-md-12"); +}); + +$("#research-models-hide").on("click", function() { + hideCards(this, "#research-models", ".cards-left > .col-md-12"); +}); + +function showCards(buttonToHide, buttonToShow, cardsWrapper) { + $(buttonToHide).hide(); + $(buttonToShow) + .add(cardsWrapper) + .show(); +} + +function hideCards(buttonToHide, buttonToShow, cardsWrapper) { + $(buttonToHide).hide(); + $(buttonToShow).show(); + $(cardsWrapper) + .filter(function() { + return $(this).attr("data-item-count") > numberOfCardsToShow; + }) + .hide(); +} diff --git a/assets/hub-detail.js b/assets/hub-detail.js new file mode 100644 index 000000000..89496b5e1 --- /dev/null +++ b/assets/hub-detail.js @@ -0,0 +1,7 @@ +// Hide broken images that appear on the hub detail page. + +$(".featured-image").each(function() { + if ($(this).data("image-name") == "no-image") { + $(this).hide(); + } +}); diff --git a/assets/hub-search-bar.js b/assets/hub-search-bar.js new file mode 100644 index 000000000..5e9433ecc --- /dev/null +++ b/assets/hub-search-bar.js @@ -0,0 +1,49 @@ +docsearch({ + apiKey: "e3b73ac141dff0b0fd27bdae9055bc73", + indexName: "pytorch", + inputSelector: "#hub-search-input", + algoliaOptions: { facetFilters: ["tags:hub"] }, + debug: false // Set debug to true if you want to inspect the dropdown +}); + +$("#hub-search-icon").on("click", function() { + $(this).hide(); + $("#hub-icons").hide(); + $("#hub-close-search").fadeIn("slow"); + $(".hub-divider").addClass("active-hub-divider"); + $("#hub-search-input") + .show() + .css("background-color", "#CCCDD1") + .focus(); + $(".hub-search-wrapper, .hub-tags-container").addClass("active"); + $("#dropdown-filter-tags").hide(); +}); + +function hideHubSearch(searchIcon) { + $(searchIcon).hide(); + + $("#hub-search-icon, #dropdown-filter-tags").fadeIn("slow"); + $("#hub-icons").fadeIn("slow"); + $("#hub-search-input") + .fadeOut("slow") + .css("background-color", "#f3f4f7"); + $(".hub-divider").removeClass("active-hub-divider"); + $("#hub-search-input") + .removeClass("active-search-icon") + .val(""); + $(".hub-search-wrapper, .hub-tags-container").removeClass("active"); +} + +$("#hub-close-search").on("click", function() { + hideHubSearch(this); +}); + +$(document).click(function(event) { + $target = $(event.target); + if ( + !$target.closest(".hub-search-wrapper").length && + $(".hub-search-wrapper").is(":visible") + ) { + hideHubSearch("#hub-close-search"); + } +}); diff --git a/assets/hub-sort.js b/assets/hub-sort.js new file mode 100644 index 000000000..e7e117a5a --- /dev/null +++ b/assets/hub-sort.js @@ -0,0 +1,31 @@ +var $wrapper = $(".cards-right"); +var $leftWrapper = $(".cards-left"); + +$("#sortLow").on("click", function() { + sorter("low", $wrapper); +}); + +$("#sortHigh").on("click", function() { + sorter("high", $wrapper); +}); + +$("#sortLowLeft").on("click", function() { + sorter("low", $leftWrapper); +}); + +$("#sortHighLeft").on("click", function() { + sorter("high", $leftWrapper); +}); + +function sorter(type, wrapper) { + wrapper + .find(".col-md-12") + .sort(function(a, b) { + if (type == "high") { + return b.dataset.count - a.dataset.count; + } else { + return a.dataset.count - b.dataset.count; + } + }) + .appendTo(wrapper); +} diff --git a/assets/hub/CONTRIBUTING.ipynb 
b/assets/hub/CONTRIBUTING.ipynb new file mode 100644 index 000000000..363fcab7e --- /dev/null +++ b/assets/hub/CONTRIBUTING.ipynb @@ -0,0 +1,6 @@ +{ + "cells": [], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/CONTRIBUTING_MODELS.ipynb b/assets/hub/CONTRIBUTING_MODELS.ipynb new file mode 100644 index 000000000..363fcab7e --- /dev/null +++ b/assets/hub/CONTRIBUTING_MODELS.ipynb @@ -0,0 +1,6 @@ +{ + "cells": [], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/TRANSLATION_GUIDE.ipynb b/assets/hub/TRANSLATION_GUIDE.ipynb new file mode 100644 index 000000000..363fcab7e --- /dev/null +++ b/assets/hub/TRANSLATION_GUIDE.ipynb @@ -0,0 +1,6 @@ +{ + "cells": [], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/Window_build.ipynb b/assets/hub/Window_build.ipynb new file mode 100644 index 000000000..363fcab7e --- /dev/null +++ b/assets/hub/Window_build.ipynb @@ -0,0 +1,6 @@ +{ + "cells": [], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/datvuthanh_hybridnets.ipynb b/assets/hub/datvuthanh_hybridnets.ipynb new file mode 100644 index 000000000..9ad15b190 --- /dev/null +++ b/assets/hub/datvuthanh_hybridnets.ipynb @@ -0,0 +1,148 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "f39f490e", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# HybridNets\n", + "\n", + "*Author: Dat Vu Thanh*\n", + "\n", + "**HybridNets - End2End Perception Network**\n", + "\n", + "## Before You Start\n", + "\n", + "Start from a **Python>=3.7** environment with **PyTorch>=1.10** installed. To install PyTorch see [https://pytorch.org/get-started/locally/](https://pytorch.org/get-started/locally/). To install HybridNets dependencies:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f6255399", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install -qr https://raw.githubusercontent.com/datvuthanh/HybridNets/main/requirements.txt # install dependencies" + ] + }, + { + "cell_type": "markdown", + "id": "ee991072", + "metadata": {}, + "source": [ + "## Model Description\n", + " \n", + " \n", + "\n", + "HybridNets is an end2end perception network for multi-tasks. Our work focused on traffic object detection, drivable area segmentation and lane detection. 
HybridNets can run real-time on embedded systems, and obtains SOTA Object Detection, Lane Detection on BDD100K Dataset.\n", + "\n", + "### Results\n", + "\n", + "### Traffic Object Detection\n", + "\n", + "| Model | Recall (%) | mAP@0.5 (%) |\n", + "|:------------------:|:------------:|:---------------:|\n", + "| `MultiNet` | 81.3 | 60.2 |\n", + "| `DLT-Net` | 89.4 | 68.4 |\n", + "| `Faster R-CNN` | 77.2 | 55.6 |\n", + "| `YOLOv5s` | 86.8 | 77.2 |\n", + "| `YOLOP` | 89.2 | 76.5 |\n", + "| **`HybridNets`** | **92.8** | **77.3** |\n", + "\n", + "\n", + " \n", + "### Drivable Area Segmentation\n", + "\n", + "| Model | Drivable mIoU (%) |\n", + "|:----------------:|:-----------------:|\n", + "| `MultiNet` | 71.6 |\n", + "| `DLT-Net` | 71.3 |\n", + "| `PSPNet` | 89.6 |\n", + "| `YOLOP` | 91.5 |\n", + "| **`HybridNets`** | **90.5** |\n", + "\n", + "\n", + " \n", + "### Lane Line Detection\n", + "\n", + "| Model | Accuracy (%) | Lane Line IoU (%) |\n", + "|:----------------:|:------------:|:-----------------:|\n", + "| `Enet` | 34.12 | 14.64 |\n", + "| `SCNN` | 35.79 | 15.84 |\n", + "| `Enet-SAD` | 36.56 | 16.02 |\n", + "| `YOLOP` | 70.5 | 26.2 |\n", + "| **`HybridNets`** | **85.4** | **31.6** |\n", + "\n", + "\n", + " \n", + "\n", + " \n", + " \n", + "### Load From PyTorch Hub\n", + "\n", + "This example loads the pretrained **HybridNets** model and passes an image for inference." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "80ed688e", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "# load model\n", + "model = torch.hub.load('datvuthanh/hybridnets', 'hybridnets', pretrained=True)\n", + "\n", + "#inference\n", + "img = torch.randn(1,3,640,384)\n", + "features, regression, classification, anchors, segmentation = model(img)" + ] + }, + { + "cell_type": "markdown", + "id": "42aa441f", + "metadata": {}, + "source": [ + "### Citation\n", + "\n", + "If you find our [paper](https://arxiv.org/abs/2203.09035) and [code](https://github.com/datvuthanh/HybridNets) useful for your research, please consider giving a star and citation:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "09bc5be6", + "metadata": { + "attributes": { + "classes": [ + "BibTeX" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "@misc{vu2022hybridnets,\n", + " title={HybridNets: End-to-End Perception Network}, \n", + " author={Dat Vu and Bao Ngo and Hung Phan},\n", + " year={2022},\n", + " eprint={2203.09035},\n", + " archivePrefix={arXiv},\n", + " primaryClass={cs.CV}\n", + "}" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/facebookresearch_WSL-Images_resnext.ipynb b/assets/hub/facebookresearch_WSL-Images_resnext.ipynb new file mode 100644 index 000000000..a9a06489d --- /dev/null +++ b/assets/hub/facebookresearch_WSL-Images_resnext.ipynb @@ -0,0 +1,127 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "a481ccfa", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ResNext WSL\n", + "\n", + "*Author: Facebook AI*\n", + "\n", + "**ResNext models trained with billion scale weakly-supervised data.**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + 
"execution_count": null, + "id": "3baf1907", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('facebookresearch/WSL-Images', 'resnext101_32x8d_wsl')\n", + "# 또는\n", + "# model = torch.hub.load('facebookresearch/WSL-Images', 'resnext101_32x16d_wsl')\n", + "# 또는\n", + "# model = torch.hub.load('facebookresearch/WSL-Images', 'resnext101_32x32d_wsl')\n", + "# 또는\n", + "#model = torch.hub.load('facebookresearch/WSL-Images', 'resnext101_32x48d_wsl')\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "3225850d", + "metadata": {}, + "source": [ + "모든 사전 학습된 모델은 동일한 방식으로 정규화된 입력 이미지를 요구합니다.\n", + "즉, `H`와 `W`가 최소 `224`의 크기를 가지는 `(3 x H x W)`형태의 3채널 RGB 이미지의 미니배치가 필요합니다.\n", + "이미지는 [0, 1] 범위로 불러온 다음 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]`를 이용하여 정규화해야 합니다.\n", + "\n", + "다음은 실행 예시입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "bbcacfa4", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹사이트에서 예시 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "efa51334", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예시(torchvision이 요구됩니다.)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # create a mini-batch as expected by the model\n", + "\n", + "# GPU를 사용할 수 있다면, 속도 향상을 위해 입력과 모델을 GPU로 이동\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# Imagenet의 1000개 클래스에 대한 신뢰도 점수를 가진, shape이 1000인 텐서 출력\n", + "print(output[0])\n", + "# 출력값은 정규화되지 않은 형태입니다. Softmax를 실행하면 확률을 얻을 수 있습니다.\n", + "print(torch.nn.functional.softmax(output[0], dim=0))\n" + ] + }, + { + "cell_type": "markdown", + "id": "4c8c4572", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "제공되는 ResNeXt 모델들은 9억 4천만개의 공공 이미지를 weakly-supervised 방식으로 사전 학습한 후 ImageNet1K 데이터셋을 사용해 미세 조정(fine-tuning)합니다. 여기서 사용되는 공공 이미지들은 1000개의 ImageNet1K 동의어 집합(synset)에 해당하는 1500개의 해시태그를 가집니다. 모델 학습에 대한 세부 사항은 [“Exploring the Limits of Weakly Supervised Pretraining”](https://arxiv.org/abs/1805.00932)을 참고해주세요.\n", + "\n", + "서로 다른 성능을 가진 4개의 ResNeXt 모델이 제공되고 있습니다.\n", + "\n", + "| Model | #Parameters | FLOPS | Top-1 Acc. | Top-5 Acc. |\n", + "| ------------------ | :---------: | :---: | :--------: | :--------: |\n", + "| ResNeXt-101 32x8d | 88M | 16B | 82.2 | 96.4 |\n", + "| ResNeXt-101 32x16d | 193M | 36B | 84.2 | 97.2 |\n", + "| ResNeXt-101 32x32d | 466M | 87B | 85.1 | 97.5 |\n", + "| ResNeXt-101 32x48d | 829M | 153B | 85.4 | 97.6 |\n", + "\n", + "ResNeXt 모델을 사용하면 사전 학습된 모델을 사용하지 않고 처음부터 학습하는 경우에 비해 ImageNet 데이터셋에서의 학습 정확도가 크게 향상됩니다. 
ResNext-101 32x48d 모델은 ImageNet 데이터셋을 사용했을 때 85.4%에 달하는 최고 수준의 정확도를 달성했습니다.\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Exploring the Limits of Weakly Supervised Pretraining](https://arxiv.org/abs/1805.00932)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/facebookresearch_pytorch-gan-zoo_dcgan.ipynb b/assets/hub/facebookresearch_pytorch-gan-zoo_dcgan.ipynb new file mode 100644 index 000000000..ade454dd4 --- /dev/null +++ b/assets/hub/facebookresearch_pytorch-gan-zoo_dcgan.ipynb @@ -0,0 +1,91 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "e39d1750", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# DCGAN on FashionGen\n", + "\n", + "*Author: FAIR HDGAN*\n", + "\n", + "**64x64 이미지 생성을 위한 기본 이미지 생성 모델**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9afa7da7", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "use_gpu = True if torch.cuda.is_available() else False\n", + "\n", + "model = torch.hub.load('facebookresearch/pytorch_GAN_zoo:hub', 'DCGAN', pretrained=True, useGPU=use_gpu)" + ] + }, + { + "cell_type": "markdown", + "id": "63a41643", + "metadata": {}, + "source": [ + "모델에 입력하는 잡음(noise) 벡터의 크기는 `(N, 120)` 이며 여기서 `N`은 생성하고자 하는 이미지의 개수입니다. 데이터 생성은 `.buildNoiseData` 함수를 사용하여 데이터를 생성할 수 있습니다. 모델의 `.test` 함수를 사용하면 잡음 벡터를 입력받아 이미지를 생성합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2e2a944f", + "metadata": {}, + "outputs": [], + "source": [ + "num_images = 64\n", + "noise, _ = model.buildNoiseData(num_images)\n", + "with torch.no_grad():\n", + " generated_images = model.test(noise)\n", + "\n", + "# torchvision 과 matplotlib 를 사용하여 생성된 이미지들을 시각화합니다.\n", + "import matplotlib.pyplot as plt\n", + "import torchvision\n", + "plt.imshow(torchvision.utils.make_grid(generated_images).permute(1, 2, 0).cpu().numpy())\n", + "# plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "bd8285f3", + "metadata": {}, + "source": [ + "왼쪽에 있는 이미지와 유사하다는것을 볼 수 있습니다.\n", + "\n", + "만약 자기만의 DCGAN과 다른 GAN을 처음부터 학습시키고 싶다면, [PyTorch GAN Zoo](https://github.com/facebookresearch/pytorch_GAN_zoo) 를 참고하시기 바랍니다.\n", + "\n", + "### 모델 설명\n", + "\n", + "컴퓨터 비전 분야에서 생성 모델은 주어진 입력에 대한 이미지를 생성하도록 훈련된 네트워크(networks)입니다. 본 예제에서는 무작위 벡터와 실제 이미지 생성 간의 연결하는 방법을 배우는 GANs (Generative Adversarial Networks) 으로 특정 종류의 생성 네트워크를 살펴봅니다.\n", + "\n", + "DCGAN은 2015년 Radford 등이 설계한 모델 구조입니다. 상세한 내용은 [Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks](https://arxiv.org/abs/1511.06434) 논문에서 확인할 수 있습니다. 
모델은 GAN 구조이며 저해상도 이미지 (최대 64x64) 생성에 매우 간편하고 효율적입니다.\n", + "\n", + "\n", + "### 요구 사항\n", + "\n", + "- 현재는 오직 Python 3 에서만 지원됩니다.\n", + "\n", + "### 참고문헌\n", + "\n", + "- [Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks](https://arxiv.org/abs/1511.06434)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/facebookresearch_pytorch-gan-zoo_pgan.ipynb b/assets/hub/facebookresearch_pytorch-gan-zoo_pgan.ipynb new file mode 100644 index 000000000..42e1551ad --- /dev/null +++ b/assets/hub/facebookresearch_pytorch-gan-zoo_pgan.ipynb @@ -0,0 +1,103 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "a2cbe89f", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Progressive Growing of GANs (PGAN)\n", + "\n", + "*Author: FAIR HDGAN*\n", + "\n", + "**High-quality image generation of fashion, celebrity faces**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/pgan_mix.jpg) | ![alt](https://pytorch.org/assets/images/pgan_celebaHQ.jpg)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c223da6e", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "use_gpu = True if torch.cuda.is_available() else False\n", + "\n", + "# 이 모델은 유명인들의 고해상도 얼굴 데이터셋 \"celebA\"로 학습되었습니다.\n", + "# 아래 모델의 출력은 512 x 512 픽셀의 이미지입니다.\n", + "model = torch.hub.load('facebookresearch/pytorch_GAN_zoo:hub',\n", + " 'PGAN', model_name='celebAHQ-512',\n", + " pretrained=True, useGPU=use_gpu)\n", + "# 아래 모델의 출력은 256 x 256 픽셀의 이미지입니다.\n", + "# model = torch.hub.load('facebookresearch/pytorch_GAN_zoo:hub',\n", + "# 'PGAN', model_name='celebAHQ-256',\n", + "# pretrained=True, useGPU=use_gpu)" + ] + }, + { + "cell_type": "markdown", + "id": "eeab60d4", + "metadata": {}, + "source": [ + "모델의 입력은 `(N, 512)` 크기의 노이즈(noise) 벡터입니다. `N`은 생성하고자 하는 이미지의 개수를 뜻합니다.\n", + "이 노이즈 벡터들은 함수 `.buildNoiseData`를 통하여 생성 할 수 있습니다.\n", + "이 모델은 노이즈 벡터를 받아서 이미지를 생성하는 `.test` 함수를 가지고 있습니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "657c5d11", + "metadata": {}, + "outputs": [], + "source": [ + "num_images = 4\n", + "noise, _ = model.buildNoiseData(num_images)\n", + "with torch.no_grad():\n", + " generated_images = model.test(noise)\n", + "\n", + "# torchvision과 matplotlib를 이용하여 생성한 이미지들을 시각화 해봅시다.\n", + "import matplotlib.pyplot as plt\n", + "import torchvision\n", + "grid = torchvision.utils.make_grid(generated_images.clamp(min=-1, max=1), scale_each=True, normalize=True)\n", + "plt.imshow(grid.permute(1, 2, 0).cpu().numpy())\n", + "# plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "77e94a13", + "metadata": {}, + "source": [ + "왼쪽과 비슷한 이미지를 결과물로 확인할 수 있습니다.\n", + "\n", + "만약 자신만의 Progressive GAN 이나 다른 GAN 모델들을 직접 학습해 보고 싶다면 [PyTorch GAN Zoo](https://github.com/facebookresearch/pytorch_GAN_zoo)를 참고해 보시기 바랍니다.\n", + "\n", + "### 모델 설명\n", + "\n", + "컴퓨터 비전(Computer Vision)분야에서 생성 모델은 주어진 입력값으로 부터 이미지를 생성해 내도록 학습된 신경망입니다. 현재 다루는 모델은 생성 모델의 특정한 종류로서 무작위의 벡터에서 사실적인 이미지를 생성하는 법을 학습하는 GAN 모델입니다.\n", + "\n", + "GAN의 점진적인 증가(Progressive Growing of GANs)는 Karras와 그 외[1]가 2017년에 발표한 고해상도의 이미지 생성을 위한 방법론 입니다. 
이를 위하여 생성 모델은 여러 단계로 나뉘어서 학습됩니다. 제일 먼저 모델은 아주 낮은 해상도의 이미지를 생성하도록 학습이 되고, 어느정도 모델이 수렴하면 새로운 계층이 모델에 더해지고 출력 해상도는 2배가 됩니다. 이 과정을 원하는 해상도에 도달 할 때 까지 반복합니다.\n", + "\n", + "### 요구사항\n", + "\n", + "- 현재는 Python3 에서만 지원합니다.\n", + "\n", + "### 참고\n", + "\n", + "- [1] Tero Karras et al, [Progressive Growing of GANs for Improved Quality, Stability, and Variation](https://arxiv.org/abs/1710.10196)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/facebookresearch_pytorchvideo_resnet.ipynb b/assets/hub/facebookresearch_pytorchvideo_resnet.ipynb new file mode 100644 index 000000000..6c2e98aa3 --- /dev/null +++ b/assets/hub/facebookresearch_pytorchvideo_resnet.ipynb @@ -0,0 +1,281 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "0ea04021", + "metadata": {}, + "source": [ + "# 3D ResNet\n", + "\n", + "*Author: FAIR PyTorchVideo*\n", + "\n", + "**Resnet Style Video classification networks pretrained on the Kinetics 400 dataset**\n", + "\n", + "\n", + "### 사용 예시\n", + "\n", + "#### Imports\n", + "\n", + "모델 불러오기:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5c7b50a8", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "# `slow_r50` 모델 선택\n", + "model = torch.hub.load('facebookresearch/pytorchvideo', 'slow_r50', pretrained=True)" + ] + }, + { + "cell_type": "markdown", + "id": "44bc8049", + "metadata": {}, + "source": [ + "나머지 함수들 불러오기:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "eae75300", + "metadata": {}, + "outputs": [], + "source": [ + "import json\n", + "import urllib\n", + "from pytorchvideo.data.encoded_video import EncodedVideo\n", + "\n", + "from torchvision.transforms import Compose, Lambda\n", + "from torchvision.transforms._transforms_video import (\n", + " CenterCropVideo,\n", + " NormalizeVideo,\n", + ")\n", + "from pytorchvideo.transforms import (\n", + " ApplyTransformToKey,\n", + " ShortSideScale,\n", + " UniformTemporalSubsample\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "f6a68491", + "metadata": {}, + "source": [ + "#### 환경설정\n", + "\n", + "모델을 평가 모드로 설정하고 원하는 디바이스 방식을 선택합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "57d58e21", + "metadata": { + "attributes": { + "classes": [ + "python " + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "# GPU 또는 CPU 방식을 설정합니다.\n", + "device = \"cpu\"\n", + "model = model.eval()\n", + "model = model.to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "18f9b149", + "metadata": {}, + "source": [ + "토치 허브 모델이 훈련된 Kinetics 400 데이터셋에 대해 ID에서의 레이블과 맞는 정보를 다운로드합니다. 이는 예측된 클래스 ID에서 카테고리 레이블 이름을 가져오는데 사용됩니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "948c992c", + "metadata": {}, + "outputs": [], + "source": [ + "json_url = \"https://dl.fbaipublicfiles.com/pyslowfast/dataset/class_names/kinetics_classnames.json\"\n", + "json_filename = \"kinetics_classnames.json\"\n", + "try: urllib.URLopener().retrieve(json_url, json_filename)\n", + "except: urllib.request.urlretrieve(json_url, json_filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "84b66739", + "metadata": {}, + "outputs": [], + "source": [ + "with open(json_filename, \"r\") as f:\n", + " kinetics_classnames = json.load(f)\n", + "\n", + "# 레이블 이름과 맞는 ID 만들기\n", + "kinetics_id_to_classname = {}\n", + "for k, v in kinetics_classnames.items():\n", + " kinetics_id_to_classname[v] = str(k).replace('\"', \"\")" + ] + }, + { + "cell_type": "markdown", + "id": "6d140fd9", + "metadata": {}, + "source": [ + "#### 입력 형태에 대한 정의" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "83e14263", + "metadata": {}, + "outputs": [], + "source": [ + "side_size = 256\n", + "mean = [0.45, 0.45, 0.45]\n", + "std = [0.225, 0.225, 0.225]\n", + "crop_size = 256\n", + "num_frames = 8\n", + "sampling_rate = 8\n", + "frames_per_second = 30\n", + "\n", + "# 이 변환은 slow_R50 모델에만 해당됩니다.\n", + "transform = ApplyTransformToKey(\n", + " key=\"video\",\n", + " transform=Compose(\n", + " [\n", + " UniformTemporalSubsample(num_frames),\n", + " Lambda(lambda x: x/255.0),\n", + " NormalizeVideo(mean, std),\n", + " ShortSideScale(\n", + " size=side_size\n", + " ),\n", + " CenterCropVideo(crop_size=(crop_size, crop_size))\n", + " ]\n", + " ),\n", + ")\n", + "\n", + "# 입력 클립의 길이는 모델에 따라 달라집니다.\n", + "clip_duration = (num_frames * sampling_rate)/frames_per_second" + ] + }, + { + "cell_type": "markdown", + "id": "fc9d2230", + "metadata": {}, + "source": [ + "#### 추론 실행\n", + "\n", + "예제 영상을 다운로드합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "81c03aed", + "metadata": {}, + "outputs": [], + "source": [ + "url_link = \"https://dl.fbaipublicfiles.com/pytorchvideo/projects/archery.mp4\"\n", + "video_path = 'archery.mp4'\n", + "try: urllib.URLopener().retrieve(url_link, video_path)\n", + "except: urllib.request.urlretrieve(url_link, video_path)" + ] + }, + { + "cell_type": "markdown", + "id": "245cb466", + "metadata": {}, + "source": [ + "영상을 불러오고 이것을 모델에 필요한 입력 형식으로 변환합니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "62106b97", + "metadata": {}, + "outputs": [], + "source": [ + "# 시작 및 종료 구간을 지정하여 불러올 클립의 길이를 선택합니다.\n", + "# start_sec는 영상에서 행동이 시작되는 위치와 일치해야합니다.\n", + "start_sec = 0\n", + "end_sec = start_sec + clip_duration\n", + "\n", + "# EncodedVideo helper 클래스를 초기화하고 영상을 불러옵니다.\n", + "video = EncodedVideo.from_path(video_path)\n", + "\n", + "# 원하는 클립을 불러옵니다.\n", + "video_data = video.get_clip(start_sec=start_sec, end_sec=end_sec)\n", + "\n", + "# 비디오 입력을 정규화하기 위해 transform 함수를 적용합니다.\n", + "video_data = transform(video_data)\n", + "\n", + "# 입력을 원하는 디바이스로 이동합니다.\n", + "inputs = video_data[\"video\"]\n", + "inputs = inputs.to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "6cf21d80", + "metadata": {}, + "source": [ + "#### 예측값 구하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "971ec2af", + "metadata": {}, + "outputs": [], + "source": [ + "# 모델을 통해 입력 클립을 전달합니다.\n", + "preds = model(inputs[None, ...])\n", + "\n", + "# 예측된 클래스를 가져옵니다.\n", + "post_act = torch.nn.Softmax(dim=1)\n", + "preds = post_act(preds)\n", + "pred_classes = preds.topk(k=5).indices[0]\n", + "\n", + "# 예측된 클래스를 레이블 이름에 매핑합니다.\n", + "pred_class_names = [kinetics_id_to_classname[int(i)] for i in pred_classes]\n", + "print(\"Top 5 predicted labels: %s\" % \", \".join(pred_class_names))" + ] + }, + { + "cell_type": "markdown", + "id": "6ae463f9", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "모델 아키텍처는 Kinetics 데이터셋의 8x8 설정을 사용하여 사전 훈련된 가중치가 있는 참고문헌 [1]을 기반으로 합니다.\n", + "| arch | depth | frame length x sample rate | top 1 | top 5 | Flops (G) | Params (M) |\n", + "| --------------- | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- |\n", + "| Slow | R50 | 8x8 | 74.58 | 91.63 | 54.52 | 32.45 |\n", + "\n", + "\n", + "### 참고문헌\n", + "[1] Christoph Feichtenhofer et al, \"SlowFast Networks for Video Recognition\"\n", + "https://arxiv.org/pdf/1812.03982.pdf" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/facebookresearch_pytorchvideo_slowfast.ipynb b/assets/hub/facebookresearch_pytorchvideo_slowfast.ipynb new file mode 100644 index 000000000..994f4961c --- /dev/null +++ b/assets/hub/facebookresearch_pytorchvideo_slowfast.ipynb @@ -0,0 +1,307 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "0083f95b", + "metadata": {}, + "source": [ + "# SlowFast\n", + "\n", + "*Author: FAIR PyTorchVideo*\n", + "\n", + "**SlowFast networks pretrained on the Kinetics 400 dataset**\n", + "\n", + "\n", + "### 사용 예시\n", + "\n", + "#### 불러오기\n", + "\n", + "모델 불러오기:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c63a914b", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "# `slowfast_r50` 모델 선택\n", + "model = torch.hub.load('facebookresearch/pytorchvideo', 'slowfast_r50', pretrained=True)" + ] + }, + { + "cell_type": "markdown", + "id": "f7ad5b8d", + "metadata": {}, + "source": [ + "나머지 함수들 불러오기:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a6f0c7dc", + "metadata": {}, + "outputs": [], + "source": [ + "from typing import Dict\n", + "import json\n", + "import urllib\n", + "from torchvision.transforms import Compose, Lambda\n", + "from torchvision.transforms._transforms_video import (\n", + " CenterCropVideo,\n", + " NormalizeVideo,\n", + ")\n", + "from pytorchvideo.data.encoded_video import EncodedVideo\n", + "from pytorchvideo.transforms import (\n", + " ApplyTransformToKey,\n", + " 
ShortSideScale,\n", + " UniformTemporalSubsample,\n", + " UniformCropVideo\n", + ") " + ] + }, + { + "cell_type": "markdown", + "id": "17c77f6b", + "metadata": {}, + "source": [ + "#### 셋업\n", + "\n", + "모델을 평가 모드로 설정하고 원하는 디바이스 방식을 선택합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d6de266c", + "metadata": { + "attributes": { + "classes": [ + "python " + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "# GPU 또는 CPU 방식을 설정합니다.\n", + "device = \"cpu\"\n", + "model = model.eval()\n", + "model = model.to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "6596c96e", + "metadata": {}, + "source": [ + "토치 허브 모델이 훈련된 Kinetics 400 데이터셋을 위한 id-레이블 매핑 정보를 다운로드합니다. 이는 예측된 클래스 id에 카테고리 레이블 이름을 붙이는 데 사용됩니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5f34af31", + "metadata": {}, + "outputs": [], + "source": [ + "json_url = \"https://dl.fbaipublicfiles.com/pyslowfast/dataset/class_names/kinetics_classnames.json\"\n", + "json_filename = \"kinetics_classnames.json\"\n", + "try: urllib.URLopener().retrieve(json_url, json_filename)\n", + "except: urllib.request.urlretrieve(json_url, json_filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cfa1ef68", + "metadata": {}, + "outputs": [], + "source": [ + "with open(json_filename, \"r\") as f:\n", + " kinetics_classnames = json.load(f)\n", + "\n", + "# id-레이블 이름 매핑 만들기\n", + "kinetics_id_to_classname = {}\n", + "for k, v in kinetics_classnames.items():\n", + " kinetics_id_to_classname[v] = str(k).replace('\"', \"\")" + ] + }, + { + "cell_type": "markdown", + "id": "67828588", + "metadata": {}, + "source": [ + "#### 입력 변환에 대한 정의" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d5e40519", + "metadata": {}, + "outputs": [], + "source": [ + "side_size = 256\n", + "mean = [0.45, 0.45, 0.45]\n", + "std = [0.225, 0.225, 0.225]\n", + "crop_size = 256\n", + "num_frames = 32\n", + "sampling_rate = 2\n", + "frames_per_second = 30\n", + "slowfast_alpha = 4\n", + "num_clips = 10\n", + "num_crops = 3\n", + "\n", + "class PackPathway(torch.nn.Module):\n", + " \"\"\"\n", + " 영상 프레임을 텐서 리스트로 바꾸기 위한 변환.\n", + " \"\"\"\n", + " def __init__(self):\n", + " super().__init__()\n", + " \n", + " def forward(self, frames: torch.Tensor):\n", + " fast_pathway = frames\n", + " # Perform temporal sampling from the fast pathway.\n", + " slow_pathway = torch.index_select(\n", + " frames,\n", + " 1,\n", + " torch.linspace(\n", + " 0, frames.shape[1] - 1, frames.shape[1] // slowfast_alpha\n", + " ).long(),\n", + " )\n", + " frame_list = [slow_pathway, fast_pathway]\n", + " return frame_list\n", + "\n", + "transform = ApplyTransformToKey(\n", + " key=\"video\",\n", + " transform=Compose(\n", + " [\n", + " UniformTemporalSubsample(num_frames),\n", + " Lambda(lambda x: x/255.0),\n", + " NormalizeVideo(mean, std),\n", + " ShortSideScale(\n", + " size=side_size\n", + " ),\n", + " CenterCropVideo(crop_size),\n", + " PackPathway()\n", + " ]\n", + " ),\n", + ")\n", + "\n", + "# 입력 클립의 길이는 모델에 따라 달라집니다.\n", + "clip_duration = (num_frames * sampling_rate)/frames_per_second" + ] + }, + { + "cell_type": "markdown", + "id": "776f6aa8", + "metadata": {}, + "source": [ + "#### 추론 실행\n", + "\n", + "예제 영상을 다운로드합니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fba5e4da", + "metadata": {}, + "outputs": [], + "source": [ + "url_link = \"https://dl.fbaipublicfiles.com/pytorchvideo/projects/archery.mp4\"\n", + "video_path = 'archery.mp4'\n", + "try: urllib.URLopener().retrieve(url_link, video_path)\n", + "except: urllib.request.urlretrieve(url_link, video_path)" + ] + }, + { + "cell_type": "markdown", + "id": "004dd2bb", + "metadata": {}, + "source": [ + "영상을 불러오고 모델에 필요한 입력 형식으로 변환합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2851b80f", + "metadata": {}, + "outputs": [], + "source": [ + "# 시작 및 종료 구간을 지정하여 불러올 클립의 길이를 선택합니다.\n", + "# start_sec는 영상에서 행동이 시작되는 위치와 일치해야 합니다.\n", + "start_sec = 0\n", + "end_sec = start_sec + clip_duration\n", + "\n", + "# EncodedVideo helper 클래스를 초기화하고 영상을 불러옵니다.\n", + "video = EncodedVideo.from_path(video_path)\n", + "\n", + "# 원하는 클립을 불러옵니다.\n", + "video_data = video.get_clip(start_sec=start_sec, end_sec=end_sec)\n", + "\n", + "# 영상 입력을 정규화하기 위한 변환(transform 함수)을 적용합니다.\n", + "video_data = transform(video_data)\n", + "\n", + "# 입력을 원하는 디바이스로 이동합니다.\n", + "inputs = video_data[\"video\"]\n", + "inputs = [i.to(device)[None, ...] for i in inputs]" + ] + }, + { + "cell_type": "markdown", + "id": "95000af1", + "metadata": {}, + "source": [ + "#### 예측값 구하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1ec72161", + "metadata": {}, + "outputs": [], + "source": [ + "# 모델을 통해 입력 클립을 전달합니다.\n", + "preds = model(inputs)\n", + "\n", + "# 예측된 클래스를 가져옵니다.\n", + "post_act = torch.nn.Softmax(dim=1)\n", + "preds = post_act(preds)\n", + "pred_classes = preds.topk(k=5).indices[0]\n", + "\n", + "# 예측된 클래스를 레이블 이름에 매핑합니다.\n", + "pred_class_names = [kinetics_id_to_classname[int(i)] for i in pred_classes]\n", + "print(\"Top 5 predicted labels: %s\" % \", \".join(pred_class_names))" + ] + }, + { + "cell_type": "markdown", + "id": "d3ae4a54", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "SlowFast 모델 아키텍처는 Kinetics 데이터셋의 8x8 설정을 사용하여 사전 훈련된 가중치가 있는 [1]을 기반으로 합니다.\n", + "\n", + "| arch | depth | frame length x sample rate | top 1 | top 5 | Flops (G) | Params (M) |\n", + "| --------------- | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- |\n", + "| SlowFast | R50 | 8x8 | 76.94 | 92.69 | 65.71 | 34.57 |\n", + "| SlowFast | R101 | 8x8 | 77.90 | 93.27 | 127.20 | 62.83 |\n", + "\n", + "\n", + "### 참고문헌\n", + "[1] Christoph Feichtenhofer et al, \"SlowFast Networks for Video Recognition\"\n", + "https://arxiv.org/pdf/1812.03982.pdf" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/facebookresearch_pytorchvideo_x3d.ipynb b/assets/hub/facebookresearch_pytorchvideo_x3d.ipynb new file mode 100644 index 000000000..96c3806d9 --- /dev/null +++ b/assets/hub/facebookresearch_pytorchvideo_x3d.ipynb @@ -0,0 +1,297 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "25a2e458", + "metadata": {}, + "source": [ + "# X3D\n", + "\n", + "*Author: FAIR PyTorchVideo*\n", + "\n", + "**X3D networks pretrained on the Kinetics 400 dataset**\n", + "\n", + "\n", + "### 사용 예시\n", + "\n", + "#### Imports\n", + "\n", + "모델 불러오기:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e25e2113", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "# `x3d_s` 모델 선택\n", + "model_name = 'x3d_s'\n", + "model = torch.hub.load('facebookresearch/pytorchvideo', model_name, pretrained=True)" + ] + }, + { + "cell_type": 
"markdown", + "id": "81ce27ed", + "metadata": {}, + "source": [ + "나머지 함수들 불러오기:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5fabc37e", + "metadata": {}, + "outputs": [], + "source": [ + "import json\n", + "import urllib\n", + "from pytorchvideo.data.encoded_video import EncodedVideo\n", + "\n", + "from torchvision.transforms import Compose, Lambda\n", + "from torchvision.transforms._transforms_video import (\n", + " CenterCropVideo,\n", + " NormalizeVideo,\n", + ")\n", + "from pytorchvideo.transforms import (\n", + " ApplyTransformToKey,\n", + " ShortSideScale,\n", + " UniformTemporalSubsample\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "58a2a810", + "metadata": {}, + "source": [ + "#### 셋업\n", + "\n", + "모델을 평가 모드로 설정하고 원하는 디바이스 방식을 선택합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2b7f6c5c", + "metadata": {}, + "outputs": [], + "source": [ + "# GPU 또는 CPU 방식을 설정합니다.\n", + "device = \"cpu\"\n", + "model = model.eval()\n", + "model = model.to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "626eabac", + "metadata": {}, + "source": [ + "토치 허브 모델이 훈련된 Kinetics 400 데이터셋에 대해 ID에서의 레이블 매핑 정보를 다운로드합니다. 이는 예측된 클래스 ID에서 카테고리 레이블 이름을 가져오는데 사용됩니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0ad38632", + "metadata": {}, + "outputs": [], + "source": [ + "json_url = \"https://dl.fbaipublicfiles.com/pyslowfast/dataset/class_names/kinetics_classnames.json\"\n", + "json_filename = \"kinetics_classnames.json\"\n", + "try: urllib.URLopener().retrieve(json_url, json_filename)\n", + "except: urllib.request.urlretrieve(json_url, json_filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "68c35cc3", + "metadata": {}, + "outputs": [], + "source": [ + "with open(json_filename, \"r\") as f:\n", + " kinetics_classnames = json.load(f)\n", + "\n", + "# 레이블 이름 매핑에 대한 ID 만들기\n", + "kinetics_id_to_classname = {}\n", + "for k, v in kinetics_classnames.items():\n", + " kinetics_id_to_classname[v] = str(k).replace('\"', \"\")" + ] + }, + { + "cell_type": "markdown", + "id": "08f14496", + "metadata": {}, + "source": [ + "#### 입력 형태에 대한 정의" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b5934607", + "metadata": {}, + "outputs": [], + "source": [ + "mean = [0.45, 0.45, 0.45]\n", + "std = [0.225, 0.225, 0.225]\n", + "frames_per_second = 30\n", + "model_transform_params = {\n", + " \"x3d_xs\": {\n", + " \"side_size\": 182,\n", + " \"crop_size\": 182,\n", + " \"num_frames\": 4,\n", + " \"sampling_rate\": 12,\n", + " },\n", + " \"x3d_s\": {\n", + " \"side_size\": 182,\n", + " \"crop_size\": 182,\n", + " \"num_frames\": 13,\n", + " \"sampling_rate\": 6,\n", + " },\n", + " \"x3d_m\": {\n", + " \"side_size\": 256,\n", + " \"crop_size\": 256,\n", + " \"num_frames\": 16,\n", + " \"sampling_rate\": 5,\n", + " }\n", + "}\n", + "\n", + "# 모델에 맞는 변환 매개변수 가져오기\n", + "transform_params = model_transform_params[model_name]\n", + "\n", + "# 이 변환은 slow_R50 모델에 한정됩니다.\n", + "transform = ApplyTransformToKey(\n", + " key=\"video\",\n", + " transform=Compose(\n", + " [\n", + " UniformTemporalSubsample(transform_params[\"num_frames\"]),\n", + " Lambda(lambda x: x/255.0),\n", + " NormalizeVideo(mean, std),\n", + " ShortSideScale(size=transform_params[\"side_size\"]),\n", + " CenterCropVideo(\n", + " crop_size=(transform_params[\"crop_size\"], transform_params[\"crop_size\"])\n", + " )\n", + " ]\n", + " ),\n", + ")\n", + "\n", + "# 입력 클립의 길이는 모델에 따라 달라집니다.\n", + "clip_duration = 
(transform_params[\"num_frames\"] * transform_params[\"sampling_rate\"])/frames_per_second" + ] + }, + { + "cell_type": "markdown", + "id": "0e74d4a7", + "metadata": {}, + "source": [ + "#### 추론 실행\n", + "\n", + "예제 영상을 다운로드합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "be476ed1", + "metadata": {}, + "outputs": [], + "source": [ + "url_link = \"https://dl.fbaipublicfiles.com/pytorchvideo/projects/archery.mp4\"\n", + "video_path = 'archery.mp4'\n", + "try: urllib.URLopener().retrieve(url_link, video_path)\n", + "except: urllib.request.urlretrieve(url_link, video_path)" + ] + }, + { + "cell_type": "markdown", + "id": "878417f9", + "metadata": {}, + "source": [ + "영상을 불러오고 모델에 필요한 입력 형식으로 변환합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1c2123f0", + "metadata": {}, + "outputs": [], + "source": [ + "# 시작 및 종료 기간을 지정하여 불러올 클립의 기간을 선택합니다.\n", + "# start_sec는 영상에서 행동이 시작되는 위치와 일치해야합니다.\n", + "start_sec = 0\n", + "end_sec = start_sec + clip_duration\n", + "\n", + "# EncodedVideo helper 클래스를 초기화하고 영상을 불러옵니다.\n", + "video = EncodedVideo.from_path(video_path)\n", + "\n", + "# 원하는 클립을 불러옵니다.\n", + "video_data = video.get_clip(start_sec=start_sec, end_sec=end_sec)\n", + "\n", + "# 영상 입력을 정규화하기 위해 변형(transform) 함수를 적용합니다.\n", + "video_data = transform(video_data)\n", + "\n", + "# 입력을 원하는 디바이스로 이동합니다.\n", + "inputs = video_data[\"video\"]\n", + "inputs = inputs.to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "0ce0d9cd", + "metadata": {}, + "source": [ + "#### 예측값 구하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8791498f", + "metadata": {}, + "outputs": [], + "source": [ + "# 모델을 통해 입력클립을 전달합니다.\n", + "preds = model(inputs[None, ...])\n", + "\n", + "# 예측된 클래스를 가져옵니다.\n", + "post_act = torch.nn.Softmax(dim=1)\n", + "preds = post_act(preds)\n", + "pred_classes = preds.topk(k=5).indices[0]\n", + "\n", + "# 예측된 클래스를 레이블 이름에 매핑합니다.\n", + "pred_class_names = [kinetics_id_to_classname[int(i)] for i in pred_classes]\n", + "print(\"Top 5 predicted labels: %s\" % \", \".join(pred_class_names))" + ] + }, + { + "cell_type": "markdown", + "id": "312e5ddc", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "X3D 모델 아키텍처는 Kinetics 데이터셋에 대해 사전 훈련된 [1]을 기반으로 합니다.\n", + "\n", + "| arch | depth | frame length x sample rate | top 1 | top 5 | Flops (G) | Params (M) |\n", + "| --------------- | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- | ----------- |\n", + "| X3D | XS | 4x12 | 69.12 | 88.63 | 0.91 | 3.79 |\n", + "| X3D | S | 13x6 | 73.33 | 91.27 | 2.96 | 3.79 |\n", + "| X3D | M | 16x5 | 75.94 | 92.72 | 6.72 | 3.79 |\n", + "\n", + "\n", + "### 참고문헌\n", + "[1] Christoph Feichtenhofer, \"X3D: Expanding Architectures for\n", + " Efficient Video Recognition.\" https://arxiv.org/abs/2004.04730" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/facebookresearch_semi-supervised-ImageNet1K-models_resnext.ipynb b/assets/hub/facebookresearch_semi-supervised-ImageNet1K-models_resnext.ipynb new file mode 100644 index 000000000..c3c0b15e1 --- /dev/null +++ b/assets/hub/facebookresearch_semi-supervised-ImageNet1K-models_resnext.ipynb @@ -0,0 +1,162 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "218a404a", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware 
Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Semi-supervised and semi-weakly supervised ImageNet Models\n", + "\n", + "*Author: Facebook AI*\n", + "\n", + "**Billion scale semi-supervised learning for image classification 에서 제안된 ResNet, ResNext 모델**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cbd8d994", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "# === 해시태그된 9억 4천만개의 이미지를 활용한 Semi-weakly supervised 사전 학습 모델 ===\n", + "model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnet18_swsl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnet50_swsl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext50_32x4d_swsl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext101_32x4d_swsl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext101_32x8d_swsl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext101_32x16d_swsl')\n", + "# ================= YFCC100M 데이터를 활용한 Semi-supervised 사전 학습 모델 ==================\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnet18_ssl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnet50_ssl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext50_32x4d_ssl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext101_32x4d_ssl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext101_32x8d_ssl')\n", + "# model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext101_32x16d_ssl')\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "eb9a81a3", + "metadata": {}, + "source": [ + "사전에 학습된 모든 모델은 동일한 방식으로 정규화된 입력 이미지, 즉, `H` 와 `W` 는 최소 `224` 이상인 `(3 x H x W)` 형태의 3-채널 RGB 이미지의 미니 배치를 요구합니다. 이미지를 `[0, 1]` 범위에서 불러온 다음 `mean = [0.485, 0.456, 0.406]` 과 `std = [0.229, 0.224, 0.225]`를 통해 정규화합니다.\n", + "\n", + "실행 예시입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4d0d787f", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹사이트에서 예제 이미지를 다운로드합니다.\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "df7e9bc8", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예시입니다. 
(torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 미니배치를 생성합니다.\n", + "\n", + "# 가능하다면 속도를 위해 입력과 모델을 GPU로 옮깁니다.\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# 1000개의 ImageNet 클래스에 대한 신뢰도 점수(confidence score)를 가진 1000 크기의 Tensor\n", + "print(output[0])\n", + "# output엔 정규화되지 않은 신뢰도 점수가 있습니다. 확률값을 얻으려면 softmax를 실행하세요.\n", + "print(torch.nn.functional.softmax(output[0], dim=0))\n" + ] + }, + { + "cell_type": "markdown", + "id": "20701bda", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "본 문서에선 [Billion-scale Semi-Supervised Learning for Image Classification](https://arxiv.org/abs/1905.00546)에서 제안된 Semi-supervised, Semi-weakly supervised 방식의 ImageNet 분류 모델을 다룹니다.\n", + "\n", + "\"Semi-supervised\" 방식에서 대용량(hight-capacity)의 teacher 모델은 ImageNet1K 훈련 데이터로 학습됩니다. student 모델은 레이블이 없는 YFCC100M의 일부 이미지를 활용해 사전 학습하며, 이후 ImageNet1K의 훈련 데이터를 통해서 파인 튜닝합니다. 자세한 사항은 앞서 언급한 논문에서 확인할 수 있습니다.\n", + "\n", + "\"Semi-weakly supervised\" 방식에서 teacher 모델은 해시태그가 포함된 9억 4천만장의 이미지 일부를 활용해 사전 학습되며, 이후 ImageNet1K 훈련 데이터로 파인 튜닝됩니다. 활용된 해시태그는 1500개 정도이며 ImageNet1K 레이블의 동의어 집합(synsets)들을 모은 것입니다. 해시태그는 teacher 모델 사전 학습 과정에서만 레이블로 활용됩니다. student 모델은 teacher 모델이 사용한 이미지와 ImageNet1k 레이블로 사전 학습하며, 이후 ImageNet1K의 훈련 데이터를 통해서 파인 튜닝합니다.\n", + "\n", + "[Xie *et al*.](https://arxiv.org/pdf/1611.05431.pdf), [Mixup](https://arxiv.org/pdf/1710.09412.pdf), [LabelRefinery](https://arxiv.org/pdf/1805.02641.pdf), [Autoaugment](https://arxiv.org/pdf/1805.09501.pdf), [Weakly supervised](https://arxiv.org/pdf/1805.00932.pdf) 기법을 활용했을 때와 비교했을 때, Semi-supervised 및 Semi-weakly-supervised 방식은 ResNet, ResNext 모델의 ImageNet Top-1 검증 정확도를 크게 개선했습니다. 예시, **ResNet-50 구조로 ImageNet 검증 정확도를 81.2% 기록했습니다.**.\n", + "\n", + "\n", + "| Architecture | Supervision | #Parameters | FLOPS | Top-1 Acc. | Top-5 Acc. |\n", + "| ------------------ | :--------------:|:----------: | :---: | :--------: | :--------: |\n", + "| ResNet-18 | semi-supervised |14M | 2B | 72.8 | 91.5 |\n", + "| ResNet-50 | semi-supervised |25M | 4B | 79.3 | 94.9 |\n", + "| ResNeXt-50 32x4d | semi-supervised |25M | 4B | 80.3 | 95.4 |\n", + "| ResNeXt-101 32x4d | semi-supervised |42M | 8B | 81.0 | 95.7 |\n", + "| ResNeXt-101 32x8d | semi-supervised |88M | 16B | 81.7 | 96.1 |\n", + "| ResNeXt-101 32x16d | semi-supervised |193M | 36B | 81.9 | 96.2 |\n", + "| ResNet-18 | semi-weakly supervised |14M | 2B | **73.4** | 91.9 |\n", + "| ResNet-50 | semi-weakly supervised |25M | 4B | **81.2** | 96.0 |\n", + "| ResNeXt-50 32x4d | semi-weakly supervised |25M | 4B | **82.2** | 96.3 |\n", + "| ResNeXt-101 32x4d | semi-weakly supervised |42M | 8B | **83.4** | 96.8 |\n", + "| ResNeXt-101 32x8d | semi-weakly supervised |88M | 16B | **84.3** | 97.2 |\n", + "| ResNeXt-101 32x16d | semi-weakly supervised |193M | 36B | **84.8** | 97.4 |\n", + "\n", + "\n", + "## 인용\n", + "\n", + "저장소에 공개된 모델을 사용할 땐, 다음 논문을 인용해주세요. 
([Billion-scale Semi-Supervised Learning for Image Classification](https://arxiv.org/abs/1905.00546))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "062ccbd0", + "metadata": {}, + "outputs": [], + "source": [ + "@misc{yalniz2019billionscale,\n", + " title={Billion-scale semi-supervised learning for image classification},\n", + " author={I. Zeki Yalniz and Hervé Jégou and Kan Chen and Manohar Paluri and Dhruv Mahajan},\n", + " year={2019},\n", + " eprint={1905.00546},\n", + " archivePrefix={arXiv},\n", + " primaryClass={cs.CV}\n", + "}" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/huggingface_pytorch-transformers.ipynb b/assets/hub/huggingface_pytorch-transformers.ipynb new file mode 100644 index 000000000..0839d1852 --- /dev/null +++ b/assets/hub/huggingface_pytorch-transformers.ipynb @@ -0,0 +1,452 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "34c2fd56", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# PyTorch-Transformers\n", + "\n", + "*Author: HuggingFace Team*\n", + "\n", + "**PyTorch implementations of popular NLP Transformers**\n", + "\n", + "\n", + "# 모델 설명\n", + "\n", + "\n", + "PyTorch-Transformers (이전엔 `pytorch-pretrained-bert`으로 알려짐) 는 자연어 처리(NLP)를 위한 최신식 사전 학습된 모델들을 모아놓은 라이브러리입니다.\n", + "\n", + "라이브러리는 현재 다음 모델들에 대한 파이토치 구현과 사전 학습된 가중치, 사용 스크립트, 변환 유틸리티를 포함하고 있습니다.\n", + "\n", + "1. **[BERT](https://github.com/google-research/bert)** 는 Google에서 발표한 [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805) 논문과 함께 공개되었습니다. (저자: Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova)\n", + "2. **[GPT](https://github.com/openai/finetune-transformer-lm)** 는 OpenAI에서 발표한 [Improving Language Understanding by Generative Pre-Training](https://blog.openai.com/language-unsupervised/) 논문과 함께 공개되었습니다. (저자: Alec Radford, Karthik Narasimhan, Tim Salimans, Ilya Sutskever)\n", + "3. **[GPT-2](https://blog.openai.com/better-language-models/)** 는 OpenAI에서 발표한 [Language Models are Unsupervised Multitask Learners](https://blog.openai.com/better-language-models/) 논문과 함께 공개되었습니다. (저자: Alec Radford*, Jeffrey Wu*, Rewon Child, David Luan, Dario Amodei**, Ilya Sutskever**)\n", + "4. **[Transformer-XL](https://github.com/kimiyoung/transformer-xl)** 는 Google/CMU에서 발표한 [Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context](https://arxiv.org/abs/1901.02860) 논문과 함께 공개되었습니다. (저자: Zihang Dai*, Zhilin Yang*, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov)\n", + "5. **[XLNet](https://github.com/zihangdai/xlnet/)** 는 Google/CMU에서 발표한 [​XLNet: Generalized Autoregressive Pretraining for Language Understanding](https://arxiv.org/abs/1906.08237) 논문과 함께 공개되었습니다. (저자: Zhilin Yang*, Zihang Dai*, Yiming Yang, Jaime Carbonell, Ruslan Salakhutdinov, Quoc V. Le)\n", + "6. **[XLM](https://github.com/facebookresearch/XLM/)** 는 Facebook에서 발표한 [Cross-lingual Language Model Pretraining](https://arxiv.org/abs/1901.07291) 논문과 함께 공개되었습니다. (저자: Guillaume Lample, Alexis Conneau)\n", + "7. 
**[RoBERTa](https://github.com/pytorch/fairseq/tree/master/examples/roberta)** 는 Facebook에서 발표한 [Robustly Optimized BERT Pretraining Approach](https://arxiv.org/abs/1907.11692) 논문과 함께 공개되었습니다. (저자: Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, Veselin Stoyanov)\n", + "8. **[DistilBERT](https://github.com/huggingface/pytorch-transformers/tree/master/examples/distillation)** 는 HuggingFace에서 게시한 [Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT](https://medium.com/huggingface/distilbert-8cf3380435b) 블로그 포스팅과 함께 발표되었습니다. (저자: Victor Sanh, Lysandre Debut, Thomas Wolf)\n", + "\n", + "여기에서 사용되는 구성요소들은 `pytorch-transformers` 라이브러리에 있는 `AutoModel` 과 `AutoTokenizer` 클래스를 기반으로 하고 있습니다.\n", + "\n", + "# 요구 사항\n", + "\n", + "파이토치 허브에 있는 대부분의 다른 모델들과 다르게, BERT는 별도의 파이썬 패키지들을 설치해야 합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cbb1098e", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install tqdm boto3 requests regex sentencepiece sacremoses" + ] + }, + { + "cell_type": "markdown", + "id": "a5070905", + "metadata": {}, + "source": [ + "# 사용 방법\n", + "\n", + "사용 가능한 메소드는 다음과 같습니다:\n", + "- `config`: 지정한 모델 또는 경로에 해당하는 설정값(configuration)을 반환합니다.\n", + "- `tokenizer`: 지정한 모델 또는 경로에 해당하는 토크나이저(tokenizer)를 반환합니다.\n", + "- `model`: 지정한 모델 또는 경로에 해당하는 모델을 반환합니다.\n", + "- `modelForCausalLM`: 지정한 모델 또는 경로에 해당하는, 언어 모델링 헤드(language modeling head)가 추가된 모델을 반환합니다.\n", + "- `modelForSequenceClassification`: 지정한 모델 또는 경로에 해당하는, 시퀀스 분류기(sequence classifier)가 추가된 모델을 반환합니다.\n", + "- `modelForQuestionAnswering`: 지정한 모델 또는 경로에 해당하는, 질의 응답 헤드(question answering head)가 추가된 모델을 반환합니다.\n", + "\n", + "여기의 모든 메소드들은 다음 인자를 공유합니다: `pretrained_model_or_path` 는 반환할 인스턴스에 대한 사전 학습된 모델 또는 경로를 나타내는 문자열입니다. 각 모델에 대해 사용할 수 있는 다양한 체크포인트(checkpoint)가 있고, 자세한 내용은 아래에서 확인하실 수 있습니다:\n", + "\n", + "\n", + "\n", + "\n", + "사용 가능한 모델은 [pytorch-transformers 문서의 pre-trained models 섹션](https://huggingface.co/pytorch-transformers/pretrained_models.html)에 나열되어 있습니다.\n", + "\n", + "# 문서\n", + "\n", + "다음은 각 사용 가능한 메소드들의 사용법을 자세히 설명하는 몇 가지 예시입니다.\n", + "\n", + "\n", + "## 토크나이저\n", + "\n", + "토크나이저 객체로 문자열을 모델에서 사용할 수 있는 토큰으로 변환할 수 있습니다. 각 모델마다 고유한 토크나이저가 있고, 일부 토큰화 메소드는 토크나이저에 따라 다릅니다. 전체 문서는 [여기](https://huggingface.co/pytorch-transformers/main_classes/tokenizer.html)에서 확인해보실 수 있습니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "94ae94b0", + "metadata": { + "attributes": { + "classes": [ + "py" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "import torch\n", + "tokenizer = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', 'bert-base-uncased') # S3 및 캐시에서 어휘(vocabulary)를 다운로드합니다.\n", + "tokenizer = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', './test/bert_saved_model/') # `save_pretrained('./test/saved_model/')`를 통해 토크나이저를 저장한 경우에 로딩하는 예시입니다." + ] + }, + { + "cell_type": "markdown", + "id": "5e630684", + "metadata": {}, + "source": [ + "## 모델\n", + "\n", + "모델 객체는 `nn.Module` 를 상속하는 모델의 인스턴스입니다. 각 모델은 로컬 파일 혹은 디렉터리나 사전 학습할 때 사용된 설정값(앞서 설명한 `config`)으로부터 저장/로딩하는 방법이 함께 제공됩니다. 각 모델은 다르게 동작하며, 여러 다른 모델들의 전체 개요는 [여기](https://huggingface.co/pytorch-transformers/pretrained_models.html)에서 확인해보실 수 있습니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2d0f1642", + "metadata": { + "attributes": { + "classes": [ + "py" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'model', 'bert-base-uncased') # S3와 캐시로부터 모델과 설정값을 다운로드합니다.\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'model', './test/bert_model/') # `save_pretrained('./test/saved_model/')`를 통해 모델을 저장한 경우에 로딩하는 예시입니다.\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'model', 'bert-base-uncased', output_attentions=True) # 설정값을 업데이트하여 로딩합니다.\n", + "assert model.config.output_attentions == True\n", + "# 파이토치 모델 대신 텐서플로우 체크포인트 파일로부터 로딩합니다. (느림)\n", + "config = AutoConfig.from_json_file('./tf_model/bert_tf_model_config.json')\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'model', './tf_model/bert_tf_checkpoint.ckpt.index', from_tf=True, config=config)" + ] + }, + { + "cell_type": "markdown", + "id": "7d469102", + "metadata": {}, + "source": [ + "## 언어 모델링 헤드가 추가된 모델\n", + "\n", + "앞서 언급한, 언어 모델링 헤드가 추가된 `model` 인스턴스입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f658788b", + "metadata": { + "attributes": { + "classes": [ + "py" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('huggingface/transformers', 'modelForCausalLM', 'gpt2') # huggingface.co와 캐시로부터 모델과 설정값을 다운로드합니다.\n", + "model = torch.hub.load('huggingface/transformers', 'modelForCausalLM', './test/saved_model/') # `save_pretrained('./test/saved_model/')`를 통해 모델을 저장한 경우에 로딩하는 예시입니다.\n", + "model = torch.hub.load('huggingface/transformers', 'modelForCausalLM', 'gpt2', output_attentions=True) # 설정값을 업데이트하여 로딩합니다.\n", + "assert model.config.output_attentions == True\n", + "# 파이토치 모델 대신 텐서플로우 체크포인트 파일로부터 로딩합니다. (느림)\n", + "config = AutoConfig.from_pretrained('./tf_model/gpt_tf_model_config.json')\n", + "model = torch.hub.load('huggingface/transformers', 'modelForCausalLM', './tf_model/gpt_tf_checkpoint.ckpt.index', from_tf=True, config=config)" + ] + }, + { + "cell_type": "markdown", + "id": "0eaab51e", + "metadata": {}, + "source": [ + "## 시퀀스 분류기가 추가된 모델\n", + "\n", + "앞서 언급한, 시퀀스 분류기가 추가된 `model` 인스턴스입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0f4ed89d", + "metadata": { + "attributes": { + "classes": [ + "py" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForSequenceClassification', 'bert-base-uncased') # S3와 캐시로부터 모델과 설정값을 다운로드합니다.\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForSequenceClassification', './test/bert_model/') # `save_pretrained('./test/saved_model/')`를 통해 모델을 저장한 경우에 로딩하는 예시입니다.\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForSequenceClassification', 'bert-base-uncased', output_attention=True) # 설정값을 업데이트하여 로딩합니다.\n", + "assert model.config.output_attention == True\n", + "# 파이토치 모델 대신 텐서플로우 체크포인트 파일로부터 로딩합니다. (느림)\n", + "config = AutoConfig.from_json_file('./tf_model/bert_tf_model_config.json')\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForSequenceClassification', './tf_model/bert_tf_checkpoint.ckpt.index', from_tf=True, config=config)" + ] + }, + { + "cell_type": "markdown", + "id": "0460ec86", + "metadata": {}, + "source": [ + "## 질의 응답 헤드가 추가된 모델\n", + "\n", + "앞서 언급한, 질의 응답 헤드가 추가된 `model` 인스턴스입니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1afde5e4", + "metadata": { + "attributes": { + "classes": [ + "py" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForQuestionAnswering', 'bert-base-uncased') # S3와 캐시로부터 모델과 설정값을 다운로드합니다.\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForQuestionAnswering', './test/bert_model/') # `save_pretrained('./test/saved_model/')`를 통해 모델을 저장한 경우에 로딩하는 예시입니다.\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForQuestionAnswering', 'bert-base-uncased', output_attention=True) # 설정값을 업데이트하여 로딩합니다.\n", + "assert model.config.output_attention == True\n", + "# 파이토치 모델 대신 텐서플로우 체크포인트 파일로부터 로딩합니다. (느림)\n", + "config = AutoConfig.from_json_file('./tf_model/bert_tf_model_config.json')\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'modelForQuestionAnswering', './tf_model/bert_tf_checkpoint.ckpt.index', from_tf=True, config=config)" + ] + }, + { + "cell_type": "markdown", + "id": "465bb480", + "metadata": {}, + "source": [ + "## 설정값\n", + "\n", + "설정값은 선택 사항입니다. 설정값 객체는 모델에 관한 정보, 예를 들어 헤드나 레이어의 개수, 모델이 어텐션(attentions) 또는 은닉 상태(hidden states)를 출력해야 하는지, 또는 모델이 TorchScript에 맞게 조정되어야 하는지 여부에 대한 정보를 가지고 있습니다. 각 모델에 따라 다양한 매개변수를 사용할 수 있습니다. 전체 문서는 [여기](https://huggingface.co/pytorch-transformers/main_classes/configuration.html)에서 확인해보실 수 있습니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9fd58e51", + "metadata": { + "attributes": { + "classes": [ + "py" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "import torch\n", + "config = torch.hub.load('huggingface/pytorch-transformers', 'config', 'bert-base-uncased') # S3와 캐시로부터 모델과 설정값을 다운로드합니다.\n", + "config = torch.hub.load('huggingface/pytorch-transformers', 'config', './test/bert_saved_model/') # `save_pretrained('./test/saved_model/')`를 통해 모델을 저장한 경우에 로딩하는 예시입니다.\n", + "config = torch.hub.load('huggingface/pytorch-transformers', 'config', './test/bert_saved_model/my_configuration.json')\n", + "config = torch.hub.load('huggingface/pytorch-transformers', 'config', 'bert-base-uncased', output_attention=True, foo=False)\n", + "assert config.output_attention == True\n", + "config, unused_kwargs = torch.hub.load('huggingface/pytorch-transformers', 'config', 'bert-base-uncased', output_attention=True, foo=False, return_unused_kwargs=True)\n", + "assert config.output_attention == True\n", + "assert unused_kwargs == {'foo': False}\n", + "\n", + "# 설정값을 사용하여 모델을 로딩합니다.\n", + "config = torch.hub.load('huggingface/pytorch-transformers', 'config', 'bert-base-uncased')\n", + "config.output_attentions = True\n", + "config.output_hidden_states = True\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'model', 'bert-base-uncased', config=config)\n", + "# 모델은 이제 어텐션과 은닉 상태도 출력하도록 설정되었습니다.\n" + ] + }, + { + "cell_type": "markdown", + "id": "f2ad995c", + "metadata": {}, + "source": [ + "# 사용 예시\n", + "\n", + "다음은 입력 텍스트를 토큰화한 후 BERT 모델에 입력으로 넣어서 계산된 은닉 상태를 가져오거나, 언어 모델링 BERT 모델을 이용하여 마스킹된 토큰들을 예측하는 방법에 대한 예시입니다.\n", + "\n", + "## 먼저, 입력을 토큰화하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d16ad55b", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "tokenizer = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', 'bert-base-cased')\n", + "\n", + "text_1 = \"Who was Jim Henson ?\"\n", + "text_2 = \"Jim Henson was a puppeteer\"\n", + "\n", + "# 
주위에 특수 토큰이 있는 입력을 토큰화합니다. (BERT에서는 처음과 끝에 각각 [CLS]와 [SEP] 토큰이 있습니다.)\n", + "indexed_tokens = tokenizer.encode(text_1, text_2, add_special_tokens=True)" + ] + }, + { + "cell_type": "markdown", + "id": "6bd9d255", + "metadata": {}, + "source": [ + "## `BertModel`을 사용하여, 입력 문장을 마지막 레이어 은닉 상태의 시퀀스로 인코딩하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f5711441", + "metadata": {}, + "outputs": [], + "source": [ + "# 첫번째 문장 A와 두번째 문장 B의 인덱스를 정의합니다. (논문 참조)\n", + "segments_ids = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1]\n", + "\n", + "# 입력값을 PyTorch tensor로 변환합니다.\n", + "segments_tensors = torch.tensor([segments_ids])\n", + "tokens_tensor = torch.tensor([indexed_tokens])\n", + "\n", + "model = torch.hub.load('huggingface/pytorch-transformers', 'model', 'bert-base-cased')\n", + "\n", + "with torch.no_grad():\n", + " encoded_layers, _ = model(tokens_tensor, token_type_ids=segments_tensors)" + ] + }, + { + "cell_type": "markdown", + "id": "aa173122", + "metadata": {}, + "source": [ + "## `modelForMaskedLM`을 사용하여, BERT로 마스킹된 토큰 예측하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b4646c0f", + "metadata": {}, + "outputs": [], + "source": [ + "# `BertForMaskedLM`를 통해 예측할 토큰을 마스킹(마스크 토큰으로 변환)합니다.\n", + "masked_index = 8\n", + "indexed_tokens[masked_index] = tokenizer.mask_token_id\n", + "tokens_tensor = torch.tensor([indexed_tokens])\n", + "\n", + "masked_lm_model = torch.hub.load('huggingface/pytorch-transformers', 'modelForMaskedLM', 'bert-base-cased')\n", + "\n", + "with torch.no_grad():\n", + " predictions = masked_lm_model(tokens_tensor, token_type_ids=segments_tensors)\n", + "\n", + "# 예측된 토큰을 가져옵니다.\n", + "predicted_index = torch.argmax(predictions[0][0], dim=1)[masked_index].item()\n", + "predicted_token = tokenizer.convert_ids_to_tokens([predicted_index])[0]\n", + "assert predicted_token == 'Jim'" + ] + }, + { + "cell_type": "markdown", + "id": "0b3aa357", + "metadata": {}, + "source": [ + "## `modelForQuestionAnswering`을 사용하여, BERT로 질의 응답하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9824f27a", + "metadata": {}, + "outputs": [], + "source": [ + "question_answering_model = torch.hub.load('huggingface/pytorch-transformers', 'modelForQuestionAnswering', 'bert-large-uncased-whole-word-masking-finetuned-squad')\n", + "question_answering_tokenizer = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', 'bert-large-uncased-whole-word-masking-finetuned-squad')\n", + "\n", + "# 형식은 단락이 먼저 주어지고, 그 다음에 질문이 주어지는 형식입니다.\n", + "text_1 = \"Jim Henson was a puppeteer\"\n", + "text_2 = \"Who was Jim Henson ?\"\n", + "indexed_tokens = question_answering_tokenizer.encode(text_1, text_2, add_special_tokens=True)\n", + "segments_ids = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1]\n", + "segments_tensors = torch.tensor([segments_ids])\n", + "tokens_tensor = torch.tensor([indexed_tokens])\n", + "\n", + "# 시작 및 종료 위치에 대한 로짓(logits)을 예측합니다.\n", + "with torch.no_grad():\n", + " out = question_answering_model(tokens_tensor, token_type_ids=segments_tensors)\n", + "\n", + "# 가장 높은 로짓을 가진 예측을 가져옵니다.\n", + "answer = question_answering_tokenizer.decode(indexed_tokens[torch.argmax(out.start_logits):torch.argmax(out.end_logits)+1])\n", + "assert answer == \"puppeteer\"\n", + "\n", + "# 또는 시작 및 종료 위치에 대한 교차 엔트로피 손실의 총합을 가져옵니다. 
(이 코드가 학습 시에 사용되는 경우 미리 모델을 학습 모드로 설정해야 합니다.)\n", + "start_positions, end_positions = torch.tensor([12]), torch.tensor([14])\n", + "multiple_choice_loss = question_answering_model(tokens_tensor, token_type_ids=segments_tensors, start_positions=start_positions, end_positions=end_positions)" + ] + }, + { + "cell_type": "markdown", + "id": "215649c9", + "metadata": {}, + "source": [ + "## `modelForSequenceClassification`을 사용하여, BERT로 패러프레이즈(paraphrase) 분류하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "33348ea4", + "metadata": {}, + "outputs": [], + "source": [ + "sequence_classification_model = torch.hub.load('huggingface/pytorch-transformers', 'modelForSequenceClassification', 'bert-base-cased-finetuned-mrpc')\n", + "sequence_classification_tokenizer = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', 'bert-base-cased-finetuned-mrpc')\n", + "\n", + "text_1 = \"Jim Henson was a puppeteer\"\n", + "text_2 = \"Who was Jim Henson ?\"\n", + "indexed_tokens = sequence_classification_tokenizer.encode(text_1, text_2, add_special_tokens=True)\n", + "segments_ids = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1]\n", + "segments_tensors = torch.tensor([segments_ids])\n", + "tokens_tensor = torch.tensor([indexed_tokens])\n", + "\n", + "# 시퀀스 분류를 위한 로짓을 예측합니다.\n", + "with torch.no_grad():\n", + " seq_classif_logits = sequence_classification_model(tokens_tensor, token_type_ids=segments_tensors)\n", + "\n", + "predicted_labels = torch.argmax(seq_classif_logits[0]).item()\n", + "\n", + "assert predicted_labels == 0 # MRPC 데이터셋에서, 이는 두 문장이 서로 바꾸어 표현할 수 없다는 것을 뜻합니다.\n", + "\n", + "# 또는 시퀀스 분류에 대한 손실을 가져옵니다. (이 코드가 학습 시에 사용되는 경우 미리 모델을 학습 모드로 설정해야 합니다.)\n", + "labels = torch.tensor([1])\n", + "seq_classif_loss = sequence_classification_model(tokens_tensor, token_type_ids=segments_tensors, labels=labels)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/hustvl_yolop.ipynb b/assets/hub/hustvl_yolop.ipynb new file mode 100644 index 000000000..35ea7b738 --- /dev/null +++ b/assets/hub/hustvl_yolop.ipynb @@ -0,0 +1,164 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "3685cbff", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# YOLOP\n", + "\n", + "*Author: Hust Visual Learning Team*\n", + "\n", + "**YOLOP pretrained on the BDD100K dataset**\n", + "\n", + "\n", + "## 시작하기 전 참고 사항\n", + "YOLOP 종속 패키지를 설치하려면 아래 명령을 수행해주세요:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f4c69818", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install -qr https://github.com/hustvl/YOLOP/blob/main/requirements.txt # install dependencies" + ] + }, + { + "cell_type": "markdown", + "id": "66893e24", + "metadata": {}, + "source": [ + "## YOLOP: You Only Look Once for Panoptic driving Perception\n", + "\n", + "### 모델 설명\n", + "\n", + "\"YOLOP\n", + " \n", + "\n", + "- YOLOP는 자율 주행에서 중요한, 다음의 세 가지 작업을 공동으로 처리할 수 있는 효율적인 다중 작업 네트워크 입니다. 이 다중 네트워크는 물체 감지(object detection), 주행 영역 분할(drivable area segmentation), 차선 인식(lane detection)을 수행합니다. 
또한 YOLOP는 **BDD100K** 데이터셋에서 최신 기술(state-of-the-art)의 수준을 유지하면서 임베디드 기기에서 실시간성에 도달한 최초의 모델입니다.\n", + "\n", + "\n", + "### 결과\n", + "\n", + "#### 차량 객체(Traffic Object) 인식 결과\n", + "\n", + "| Model | Recall(%) | mAP50(%) | Speed(fps) |\n", + "| -------------- | --------- | -------- | ---------- |\n", + "| `Multinet` | 81.3 | 60.2 | 8.6 |\n", + "| `DLT-Net` | 89.4 | 68.4 | 9.3 |\n", + "| `Faster R-CNN` | 77.2 | 55.6 | 5.3 |\n", + "| `YOLOv5s` | 86.8 | 77.2 | 82 |\n", + "| `YOLOP(ours)` | 89.2 | 76.5 | 41 |\n", + "\n", + "#### 주행 가능 영역 인식 결과\n", + "\n", + "| Model | mIOU(%) | Speed(fps) |\n", + "| ------------- | ------- | ---------- |\n", + "| `Multinet` | 71.6 | 8.6 |\n", + "| `DLT-Net` | 71.3 | 9.3 |\n", + "| `PSPNet` | 89.6 | 11.1 |\n", + "| `YOLOP(ours)` | 91.5 | 41 |\n", + "\n", + "#### 차선 인식 결과\n", + "\n", + "| Model | mIOU(%) | IOU(%) |\n", + "| ------------- | ------- | ------ |\n", + "| `ENet` | 34.12 | 14.64 |\n", + "| `SCNN` | 35.79 | 15.84 |\n", + "| `ENet-SAD` | 36.56 | 16.02 |\n", + "| `YOLOP(ours)` | 70.50 | 26.20 |\n", + "\n", + "#### 조건 변화에 따른 모델 평가 1 (Ablation Studies 1): End-to-end v.s. Step-by-step\n", + "\n", + "| Training_method | Recall(%) | AP(%) | mIoU(%) | Accuracy(%) | IoU(%) |\n", + "| --------------- | --------- | ----- | ------- | ----------- | ------ |\n", + "| `ES-W` | 87.0 | 75.3 | 90.4 | 66.8 | 26.2 |\n", + "| `ED-W` | 87.3 | 76.0 | 91.6 | 71.2 | 26.1 |\n", + "| `ES-D-W` | 87.0 | 75.1 | 91.7 | 68.6 | 27.0 |\n", + "| `ED-S-W` | 87.5 | 76.1 | 91.6 | 68.0 | 26.8 |\n", + "| `End-to-end` | 89.2 | 76.5 | 91.5 | 70.5 | 26.2 |\n", + "\n", + "#### 조건 변화에 따른 모델 평가 2 (Ablation Studies 2): Multi-task v.s. Single task\n", + "\n", + "| Training_method | Recall(%) | AP(%) | mIoU(%) | Accuracy(%) | IoU(%) | Speed(ms/frame) |\n", + "| --------------- | --------- | ----- | ------- | ----------- | ------ | --------------- |\n", + "| `Det(only)` | 88.2 | 76.9 | - | - | - | 15.7 |\n", + "| `Da-Seg(only)` | - | - | 92.0 | - | - | 14.8 |\n", + "| `Ll-Seg(only)` | - | - | - | 79.6 | 27.9 | 14.8 |\n", + "| `Multitask` | 89.2 | 76.5 | 91.5 | 70.5 | 26.2 | 24.4 |\n", + "\n", + "**안내**:\n", + "\n", + "- 표 4에서 E, D, S, W는 인코더(Encoder), 검출 헤드(Detect head), 2개의 세그먼트 헤드(Segment heads) 와 전체 네트워크를 의미합니다. 그래서 이 알고리즘(이 알고리즘은 첫째, 인코더 및 검출 헤드만 학습합니다. 그 후, 인코더 및 검출 헤드를 고정하고 두 개의 분할(segmentation) 헤드를 학습합니다. 마지막으로, 전체 네트워크는 세 가지 작업 모두에 대해 함께 학습됩니다.)은 ED-S-W로 표기되며, 다른 알고리즘도 마찬가지입니다.\n", + "\n", + "### 시각화\n", + "\n", + "#### 차량 객체(Traffic Object) 인식 결과\n", + "\n", + "\"Traffic\n", + " \n", + "\n", + "#### 주행 가능 영역 인식 결과\n", + "\n", + "\"Drivable\n", + " \n", + "\n", + "#### 차선 인식 결과\n", + "\n", + "\"Lane\n", + " \n", + "\n", + "**안내**:\n", + "\n", + "- 차선 인식의 시각화 결과는 이차함수 형태로 근사하는 과정(quadratic fitting)을 통해 후처리(post processed) 되었습니다.\n", + "\n", + "### 배포\n", + "\n", + "YOLOP 모델은 이미지 캡쳐를 **Zed Camera**가 장착된 **Jetson Tx2**에서 실시간으로 추론할 수 있습니다. 속도 향상을 위해 **TensorRT**를 사용합니다. 모델의 배포와 추론을 위해 [github code](https://github.com/hustvl/YOLOP/tree/main/toolkits/deploy) 에서 코드를 제공합니다.\n", + "\n", + "\n", + "### 파이토치 허브로부터 모델 불러오기\n", + "이 예제는 사전에 학습된 **YOLOP** 모델을 불러오고 추론을 위한 이미지를 모델에 전달합니다." 
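 + "\n",
 + "아래 셀은 임의의 값으로 채워진 텐서(`torch.randn`)를 입력으로 사용합니다. 실제 이미지를 사용하려면 대략 다음과 같이 (1, 3, 640, 640) 크기의 텐서로 변환할 수 있습니다. 이는 개념을 보여주기 위한 간단한 예시로, 공식 저장소의 전처리(레터박스, 정규화 등)와는 다를 수 있으며, 파일 경로 `example.jpg` 는 임의의 예시입니다.\n",
 + "\n",
 + "```python\n",
 + "# 간단한 예시(가정): 실제 이미지를 (1, 3, 640, 640) 텐서로 변환합니다.\n",
 + "# 공식 저장소의 전처리 방식과는 차이가 있을 수 있습니다.\n",
 + "import cv2\n",
 + "import numpy as np\n",
 + "import torch\n",
 + "\n",
 + "img = cv2.cvtColor(cv2.imread('example.jpg'), cv2.COLOR_BGR2RGB)  # 'example.jpg'는 예시 경로입니다.\n",
 + "img = cv2.resize(img, (640, 640)).astype(np.float32) / 255.0\n",
 + "img_tensor = torch.from_numpy(img).permute(2, 0, 1).unsqueeze(0)\n",
 + "```"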
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4340a3b6", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "model = torch.hub.load('hustvl/yolop', 'yolop', pretrained=True)\n", + "\n", + "img = torch.randn(1,3,640,640)\n", + "det_out, da_seg_out,ll_seg_out = model(img)" + ] + }, + { + "cell_type": "markdown", + "id": "d0a94c7e", + "metadata": {}, + "source": [ + "### 인용(Citation)\n", + "\n", + "See for more detail in [github code](https://github.com/hustvl/YOLOP) and [arxiv paper](https://arxiv.org/abs/2108.11250).\n", + "\n", + "본 논문과 코드가 여러분의 연구에 유용하다고 판단되면, GitHub star를 주는 것과 본 논문을 인용하는 것을 고려해 주세요:" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/intelisl_midas_v2.ipynb b/assets/hub/intelisl_midas_v2.ipynb new file mode 100644 index 000000000..7f09c82ab --- /dev/null +++ b/assets/hub/intelisl_midas_v2.ipynb @@ -0,0 +1,269 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "64b341d3", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# MiDaS\n", + "\n", + "*Author: Intel ISL*\n", + "\n", + "**MiDaS models for computing relative depth from a single image.**\n", + "\n", + "\"alt\"\n", + "\n", + "\n", + "### 모델 설명\n", + "\n", + "[MiDaS](https://arxiv.org/abs/1907.01341)는 단일 이미지로부터 상대적 역 깊이(relative inverse depth)를 계산합니다. 본 저장소는 작지만 고속의 모델부터 가장 높은 정확도를 제공하는 매우 큰 모델까지 다양한 사례를 다루는 여러 모델을 제공합니다. 또한 모델은 광범위한 입력에서 높은 품질을 보장하기 위해 다목적(multi-objective) 최적화를 사용해 10개의 개별 데이터 셋에 대해 훈련되었습니다. \n", + "\n", + "### 종속 패키지 설치\n", + "\n", + "MiDas 모델은 [timm](https://github.com/rwightman/pytorch-image-models)을 사용합니다 아래 명령어를 통해 설치해 주세요." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "29ddb686", + "metadata": { + "attributes": { + "classes": [ + "shell" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "pip install timm" + ] + }, + { + "cell_type": "markdown", + "id": "a0ddeb0f", + "metadata": {}, + "source": [ + "### 사용 예시\n", + "\n", + "파이토치 홈페이지로부터 이미지를 다운로드합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a0395723", + "metadata": {}, + "outputs": [], + "source": [ + "import cv2\n", + "import torch\n", + "import urllib.request\n", + "\n", + "import matplotlib.pyplot as plt\n", + "\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "markdown", + "id": "fe2fcb9d", + "metadata": {}, + "source": [ + "모델을 로드합니다. 
(개요는 [https://github.com/intel-isl/MiDaS/#Accuracy](https://github.com/intel-isl/MiDaS/#Accuracy) 를 참조하세요.)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "60754ab4", + "metadata": {}, + "outputs": [], + "source": [ + "model_type = \"DPT_Large\" # MiDaS v3 - Large (highest accuracy, slowest inference speed)\n", + "#model_type = \"DPT_Hybrid\" # MiDaS v3 - Hybrid (medium accuracy, medium inference speed)\n", + "#model_type = \"MiDaS_small\" # MiDaS v2.1 - Small (lowest accuracy, highest inference speed)\n", + "\n", + "midas = torch.hub.load(\"intel-isl/MiDaS\", model_type)" + ] + }, + { + "cell_type": "markdown", + "id": "25bd0f7b", + "metadata": {}, + "source": [ + "GPU 사용이 가능한 환경이라면, 모델에 GPU를 사용합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5a923b40", + "metadata": {}, + "outputs": [], + "source": [ + "device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", + "midas.to(device)\n", + "midas.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "5f86f263", + "metadata": {}, + "source": [ + "여러가지 모델에 입력할 이미지를 크기 변경(resize)이나 정규화(normalize)하기 위한 변환(transform)을 불러옵니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e9747829", + "metadata": {}, + "outputs": [], + "source": [ + "midas_transforms = torch.hub.load(\"intel-isl/MiDaS\", \"transforms\")\n", + "\n", + "if model_type == \"DPT_Large\" or model_type == \"DPT_Hybrid\":\n", + " transform = midas_transforms.dpt_transform\n", + "else:\n", + " transform = midas_transforms.small_transform" + ] + }, + { + "cell_type": "markdown", + "id": "e98d6ac4", + "metadata": {}, + "source": [ + "이미지를 로드하고 변환(transforms)을 적용합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cfc393db", + "metadata": {}, + "outputs": [], + "source": [ + "img = cv2.imread(filename)\n", + "img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n", + "\n", + "input_batch = transform(img).to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "0d13fa5b", + "metadata": {}, + "source": [ + "기존 해상도로 예측 및 크기 변경합니다." 
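 + "\n",
 + "아래 셀을 실행해 얻는 `output` 은 상대적 역 깊이(relative inverse depth)로, 값의 범위(스케일)가 정해져 있지 않습니다. 이미지 파일로 저장하고 싶다면 예를 들어 다음과 같이 정규화할 수 있습니다. 간단한 예시이며, 파일 이름 `depth.png` 는 임의로 정한 것입니다.\n",
 + "\n",
 + "```python\n",
 + "# 간단한 예시(가정): 아래 셀에서 계산한 output을 0~255 범위로 정규화해 저장합니다.\n",
 + "import numpy as np\n",
 + "\n",
 + "depth_min, depth_max = output.min(), output.max()\n",
 + "depth_vis = (255 * (output - depth_min) / (depth_max - depth_min)).astype(np.uint8)\n",
 + "cv2.imwrite('depth.png', depth_vis)  # 'depth.png'는 예시 파일 이름입니다.\n",
 + "```"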
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "629ba39f", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " prediction = midas(input_batch)\n", + "\n", + " prediction = torch.nn.functional.interpolate(\n", + " prediction.unsqueeze(1),\n", + " size=img.shape[:2],\n", + " mode=\"bicubic\",\n", + " align_corners=False,\n", + " ).squeeze()\n", + "\n", + "output = prediction.cpu().numpy()" + ] + }, + { + "cell_type": "markdown", + "id": "82c834ea", + "metadata": {}, + "source": [ + "결과 출력" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ad8944d0", + "metadata": {}, + "outputs": [], + "source": [ + "plt.imshow(output)\n", + "# plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "d158e4a5", + "metadata": {}, + "source": [ + "### 참고문헌\n", + "[Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-shot Cross-dataset Transfer](https://arxiv.org/abs/1907.01341)\n", + "\n", + "[Vision Transformers for Dense Prediction](https://arxiv.org/abs/2103.13413)\n", + "\n", + "만약 MiDaS 모델을 사용한다면 본 논문을 인용해 주세요:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7c8db369", + "metadata": { + "attributes": { + "classes": [ + "bibtex" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "@article{Ranftl2020,\n", + "\tauthor = {Ren\\'{e} Ranftl and Katrin Lasinger and David Hafner and Konrad Schindler and Vladlen Koltun},\n", + "\ttitle = {Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-shot Cross-dataset Transfer},\n", + "\tjournal = {IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},\n", + "\tyear = {2020},\n", + "}" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "46dc343d", + "metadata": { + "attributes": { + "classes": [ + "bibtex" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "@article{Ranftl2021,\n", + "\tauthor = {Ren\\'{e} Ranftl and Alexey Bochkovskiy and Vladlen Koltun},\n", + "\ttitle = {Vision Transformers for Dense Prediction},\n", + "\tjournal = {ArXiv preprint},\n", + "\tyear = {2021},\n", + "}" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/mateuszbuda_brain-segmentation-pytorch_unet.ipynb b/assets/hub/mateuszbuda_brain-segmentation-pytorch_unet.ipynb new file mode 100644 index 000000000..1c80b8506 --- /dev/null +++ b/assets/hub/mateuszbuda_brain-segmentation-pytorch_unet.ipynb @@ -0,0 +1,117 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "1d064bc4", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# U-Net for brain MRI\n", + "\n", + "*Author: mateuszbuda*\n", + "\n", + "**U-Net with batch normalization for biomedical image segmentation with pretrained weights for abnormality segmentation in brain MRI**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a6a781f0", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('mateuszbuda/brain-segmentation-pytorch', 'unet',\n", + " in_channels=3, out_channels=1, init_features=32, pretrained=True)\n" + ] + }, + { + "cell_type": "markdown", + "id": "d019b7d4", + "metadata": {}, + 
"source": [ + "위 코드는 뇌 MRI 볼륨 데이터 셋 [kaggle.com/mateuszbuda/lgg-mri-segmentation](https://www.kaggle.com/mateuszbuda/lgg-mri-segmentation)의 이상 탐지를 위해 사전 학습된 U-Net 모델을 불러옵니다. \n", + "사전 학습된 모델은 첫 번째 계층에서 3개의 입력 채널, 1개의 출력 채널 그리고 32개의 특징을 가집니다.\n", + "\n", + "### 모델 설명\n", + "\n", + "U-Net 모델은 배치 정규화 및 ReLU 활성 함수를 가진 두 개의 합성곱 계층, 인코딩 과정의 맥스 풀링(max-pooling) 계층 그리고 디코딩 과정의 업 컨볼루셔널(up-convolutional) 계층을 포함한 네 가지 단계의 블록으로 구성됩니다.\n", + "각 블록의 합성곱 필터 수는 32, 64, 128, 256개입니다.\n", + "병목 계층(bottleneck layer)은 512개의 합성곱 필터를 가집니다.\n", + "인코딩 과정의 계층에서 얻은 특징을 이에 상응하는 디코딩 과정의 계층에 합치는 스킵 연결(skip connections)이 진행됩니다.\n", + "입력 이미지는 pre-contrast, FLAIR 및 post-contrast 과정에서 얻은 3-채널 뇌 MRI 슬라이스입니다.\n", + "출력은 입력 이미지와 동일한 크기를 가지고 1-채널의 이상 탐지 영역을 확률적으로 나타냅니다.\n", + "아래의 예시처럼 임계 값을 설정하면 출력 이미지를 이진 분할 마스크로 변환할 수 있습니다.\n", + "\n", + "### 예시\n", + "\n", + "사전 학습된 모델에 입력되는 이미지는 3개의 채널을 가져야 하며 256x256 픽셀로 크기가 조정되고 각 볼륨마다 z-점수로 정규화된 상태여야 합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4ace8a73", + "metadata": {}, + "outputs": [], + "source": [ + "# 예시 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/mateuszbuda/brain-segmentation-pytorch/raw/master/assets/TCGA_CS_4944.png\", \"TCGA_CS_4944.png\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "64f18ac3", + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "\n", + "input_image = Image.open(filename)\n", + "m, s = np.mean(input_image, axis=(0, 1)), np.std(input_image, axis=(0, 1))\n", + "preprocess = transforms.Compose([\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=m, std=s),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0)\n", + "\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model = model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "\n", + "print(torch.round(output[0]))" + ] + }, + { + "cell_type": "markdown", + "id": "9559c08e", + "metadata": {}, + "source": [ + "### 참고문헌\n", + "\n", + "- [Association of genomic subtypes of lower-grade gliomas with shape features automatically extracted by a deep learning algorithm](http://arxiv.org/abs/1906.03720)\n", + "- [U-Net: Convolutional Networks for Biomedical Image Segmentation](https://arxiv.org/abs/1505.04597)\n", + "- [Brain MRI segmentation dataset](https://www.kaggle.com/mateuszbuda/lgg-mri-segmentation)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nicolalandro_ntsnet-cub200_ntsnet.ipynb b/assets/hub/nicolalandro_ntsnet-cub200_ntsnet.ipynb new file mode 100644 index 000000000..442426aac --- /dev/null +++ b/assets/hub/nicolalandro_ntsnet-cub200_ntsnet.ipynb @@ -0,0 +1,118 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "0ecc07a0", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ntsnet\n", + "\n", + "*Author: Moreno Caraffini and Nicola Landro*\n", + "\n", + "**classify birds using 
this fine-grained image classifier**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "300fb97b", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('nicolalandro/ntsnet-cub200', 'ntsnet', pretrained=True,\n", + " **{'topN': 6, 'device':'cpu', 'num_classes': 200})" + ] + }, + { + "cell_type": "markdown", + "id": "77d35129", + "metadata": {}, + "source": [ + "### 사용 예제" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "104061cf", + "metadata": {}, + "outputs": [], + "source": [ + "from torchvision import transforms\n", + "import torch\n", + "import urllib\n", + "from PIL import Image\n", + "\n", + "transform_test = transforms.Compose([\n", + " transforms.Resize((600, 600), Image.BILINEAR),\n", + " transforms.CenterCrop((448, 448)),\n", + " # transforms.RandomHorizontalFlip(), # only if train\n", + " transforms.ToTensor(),\n", + " transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)),\n", + "])\n", + "\n", + "\n", + "model = torch.hub.load('nicolalandro/ntsnet-cub200', 'ntsnet', pretrained=True, **{'topN': 6, 'device':'cpu', 'num_classes': 200})\n", + "model.eval()\n", + "\n", + "url = 'https://raw.githubusercontent.com/nicolalandro/ntsnet-cub200/master/images/nts-net.png'\n", + "img = Image.open(urllib.request.urlopen(url))\n", + "scaled_img = transform_test(img)\n", + "torch_images = scaled_img.unsqueeze(0)\n", + "\n", + "with torch.no_grad():\n", + " top_n_coordinates, concat_out, raw_logits, concat_logits, part_logits, top_n_index, top_n_prob = model(torch_images)\n", + "\n", + " _, predict = torch.max(concat_logits, 1)\n", + " pred_id = predict.item()\n", + " print('bird class:', model.bird_classes[pred_id])" + ] + }, + { + "cell_type": "markdown", + "id": "529c033e", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "이 모델은 세분화된 조류 데이터셋인 CUB200 2011 데이터셋으로 사전 학습된 nts-net입니다.\n", + "\n", + "### 참조\n", + "[link](http://artelab.dista.uninsubria.it/res/research/papers/2019/2019-IVCNZ-Nawaz-Birds.pdf) - 여기에서 전체 내용을 읽을 수 있습니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "240466eb", + "metadata": { + "attributes": { + "classes": [ + "bibtex" + ], + "id": "" + } + }, + "outputs": [], + "source": [ + "@INPROCEEDINGS{Gallo:2019:IVCNZ,\n", + " author={Nawaz, Shah and Calefati, Alessandro and Caraffini, Moreno and Landro, Nicola and Gallo, Ignazio},\n", + " booktitle={2019 International Conference on Image and Vision Computing New Zealand (IVCNZ 2019)},\n", + " title={Are These Birds Similar: Learning Branched Networks for Fine-grained Representations},\n", + " year={2019},\n", + " month={Dec},\n", + "}" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nvidia_deeplearningexamples_efficientnet.ipynb b/assets/hub/nvidia_deeplearningexamples_efficientnet.ipynb new file mode 100644 index 000000000..2cbc73a3f --- /dev/null +++ b/assets/hub/nvidia_deeplearningexamples_efficientnet.ipynb @@ -0,0 +1,204 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "53df6efd", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# EfficientNet\n", + "\n", + "*Author: NVIDIA*\n", + "\n", + "**EfficientNets are a family of image classification models, which achieve state-of-the-art accuracy, being an order-of-magnitude smaller and faster. Trained with mixed precision using Tensor Cores.**\n", + "\n", + "\"alt\"\n", + "\n", + "\n", + "\n", + "### Model Description\n", + "\n", + "EfficientNet is an image classification model family. It was first described in [EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks](https://arxiv.org/abs/1905.11946). This notebook allows you to load and test the EfficientNet-B0, EfficientNet-B4, EfficientNet-WideSE-B0 and, EfficientNet-WideSE-B4 models.\n", + "\n", + "EfficientNet-WideSE models use Squeeze-and-Excitation layers wider than original EfficientNet models, the width of SE module is proportional to the width of Depthwise Separable Convolutions instead of block width.\n", + "\n", + "WideSE models are slightly more accurate than original models.\n", + "\n", + "This model is trained with mixed precision using Tensor Cores on Volta and the NVIDIA Ampere GPU architectures. Therefore, researchers can get results over 2x faster than training without Tensor Cores, while experiencing the benefits of mixed precision training. This model is tested against each NGC monthly container release to ensure consistent accuracy and performance over time.\n", + "\n", + "We use [NHWC data layout](https://pytorch.org/tutorials/intermediate/memory_format_tutorial.html) when training using Mixed Precision.\n", + "\n", + "### Example\n", + "\n", + "In the example below we will use the pretrained ***EfficientNet*** model to perform inference on image and present the result.\n", + "\n", + "To run the example you need some extra python packages installed. These are needed for preprocessing images and visualization." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "85415d8a", + "metadata": {}, + "outputs": [], + "source": [ + "!pip install validators matplotlib" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "28cd74f8", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "from PIL import Image\n", + "import torchvision.transforms as transforms\n", + "import numpy as np\n", + "import json\n", + "import requests\n", + "import matplotlib.pyplot as plt\n", + "import warnings\n", + "warnings.filterwarnings('ignore')\n", + "%matplotlib inline\n", + "\n", + "device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", + "print(f'Using {device} for inference')" + ] + }, + { + "cell_type": "markdown", + "id": "8fb7470e", + "metadata": {}, + "source": [ + "Load the model pretrained on IMAGENET dataset.\n", + "\n", + "You can choose among the following models:\n", + "\n", + "| TorchHub entrypoint | Description |\n", + "| :----- | :----- |\n", + "| `nvidia_efficientnet_b0` | baseline EfficientNet |\n", + "| `nvidia_efficientnet_b4` | scaled EfficientNet|\n", + "| `nvidia_efficientnet_widese_b0` | model with Squeeze-and-Excitation layers wider than baseline EfficientNet model |\n", + "| `nvidia_efficientnet_widese_b4` | model with Squeeze-and-Excitation layers wider than scaled EfficientNet model |\n", + "\n", + "There are also quantized version of the models, but they require nvidia container. See [quantized models](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/efficientnet#quantization)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7dbb5266", + "metadata": {}, + "outputs": [], + "source": [ + "efficientnet = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_efficientnet_b0', pretrained=True)\n", + "utils = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_convnets_processing_utils')\n", + "\n", + "efficientnet.eval().to(device)\n" + ] + }, + { + "cell_type": "markdown", + "id": "3b65bfd5", + "metadata": {}, + "source": [ + "Prepare sample input data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b4721e2c", + "metadata": {}, + "outputs": [], + "source": [ + "uris = [\n", + " 'http://images.cocodataset.org/test-stuff2017/000000024309.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000028117.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000006149.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000004954.jpg',\n", + "]\n", + "\n", + "batch = torch.cat(\n", + " [utils.prepare_input_from_uri(uri) for uri in uris]\n", + ").to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "2b1f1fd9", + "metadata": {}, + "source": [ + "Run inference. Use `pick_n_best(predictions=output, n=topN)` helper function to pick N most probable hypotheses according to the model." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e0564bf4", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " output = torch.nn.functional.softmax(efficientnet(batch), dim=1)\n", + " \n", + "results = utils.pick_n_best(predictions=output, n=5)" + ] + }, + { + "cell_type": "markdown", + "id": "8e0d0426", + "metadata": {}, + "source": [ + "Display the result." 
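 + "\n",
 + "As a side note, the same pipeline works with any of the entrypoints listed earlier; the brief sketch below (an assumption-based example) only changes the entrypoint name.\n",
 + "\n",
 + "```python\n",
 + "# Sketch (assumption): any other entrypoint from the table above can be loaded the same way.\n",
 + "efficientnet_widese_b4 = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub',\n",
 + "                                        'nvidia_efficientnet_widese_b4', pretrained=True)\n",
 + "efficientnet_widese_b4.eval().to(device)\n",
 + "```"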
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "49e9411b", + "metadata": {}, + "outputs": [], + "source": [ + "for uri, result in zip(uris, results):\n", + " img = Image.open(requests.get(uri, stream=True).raw)\n", + " img.thumbnail((256,256), Image.ANTIALIAS)\n", + " plt.imshow(img)\n", + " plt.show()\n", + " print(result)" + ] + }, + { + "cell_type": "markdown", + "id": "6d23a4d2", + "metadata": {}, + "source": [ + "### Details\n", + "For detailed information on model input and output, training recipies, inference and performance visit:\n", + "[github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/efficientnet)\n", + "and/or [NGC](https://ngc.nvidia.com/catalog/resources/nvidia:efficientnet_for_pytorch)\n", + "\n", + "### References\n", + "\n", + " - [EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks](https://arxiv.org/abs/1905.11946)\n", + " - [model on NGC](https://ngc.nvidia.com/catalog/resources/nvidia:efficientnet_for_pytorch)\n", + " - [model on github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/efficientnet)\n", + " - [pretrained model on NGC (efficientnet-b0)](https://ngc.nvidia.com/catalog/models/nvidia:efficientnet_b0_pyt_amp)\n", + " - [pretrained model on NGC (efficientnet-b4)](https://ngc.nvidia.com/catalog/models/nvidia:efficientnet_b4_pyt_amp)\n", + " - [pretrained model on NGC (efficientnet-widese-b0)](https://ngc.nvidia.com/catalog/models/nvidia:efficientnet_widese_b0_pyt_amp)\n", + " - [pretrained model on NGC (efficientnet-widese-b4)](https://ngc.nvidia.com/catalog/models/nvidia:efficientnet_widese_b4_pyt_amp)\n", + " - [pretrained, quantized model on NGC (efficientnet-widese-b0)](https://ngc.nvidia.com/catalog/models/nvidia:efficientnet_widese_b0_pyt_amp)\n", + " - [pretrained, quantized model on NGC (efficientnet-widese-b4)](https://ngc.nvidia.com/catalog/models/nvidia:efficientnet_widese_b4_pyt_amp)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nvidia_deeplearningexamples_resnet50.ipynb b/assets/hub/nvidia_deeplearningexamples_resnet50.ipynb new file mode 100644 index 000000000..f745ad64e --- /dev/null +++ b/assets/hub/nvidia_deeplearningexamples_resnet50.ipynb @@ -0,0 +1,190 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "eff94f0a", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ResNet50\n", + "\n", + "*Author: NVIDIA*\n", + "\n", + "**ResNet50 model trained with mixed precision using Tensor Cores.**\n", + "\n", + "\"alt\"\n", + "\n", + "\n", + "\n", + "### 모델 설명\n", + "\n", + "***ResNet50 v1.5***모델은 [original ResNet50 v1 model](https://arxiv.org/abs/1512.03385)의 수정된 버전입니다.\n", + "\n", + "v1과 v1.5의 차이점은 다운샘플링이 필요한 병목 블록에서 v1은 첫 번째 1x1 컨볼루션에서 스트라이드 = 2를 갖는 반면 v1.5는 3x3 컨볼루션에서 스트라이드 = 2를 갖는다는 것입니다.\n", + "\n", + "이러한 차이는 ResNet50 v1.5를 v1보다 조금 더 정확하게 만들지만(\\~0.5% top1) 약간의 성능적인 단점(\\~5% imgs/sec)이 있습니다.\n", + "\n", + "모델은 [Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification](https://arxiv.org/pdf/1502.01852.pdf)에 설명된 대로 초기화됩니다.\n", + "\n", + "이 모델은 Volta, Turing 및 NVIDIA Ampere GPU 아키텍처의 Tensor 코어를 사용하여 혼합 정밀도(mixed precision)로 학습됩니다. 
따라서 연구자들은 혼합 정밀 교육의 이점을 경험하면서 Tensor Core 없이 학습하는 것보다 2배 이상 빠른 결과를 얻을 수 있습니다. 이 모델은 시간이 지남에 따라 일관된 정확성과 성능을 보장하기 위해 각 NGC 월별 컨테이너 릴리스에 대해 테스트됩니다.\n", + "\n", + "ResNet50 v1.5 모델은 TorchScript, ONNX Runtime 또는 TensorRT를 실행 백엔드로 사용하여 [NVIDIA Triton Inference Server](https://github.com/NVIDIA/trtis-inference-server)에서 추론을 위해 배치될 수 있습니다. 자세한 내용은 [NGC](https://ngc.nvidia.com/catalog/resources/nvidia:resnet_for_triton_from_pytorch)를 확인하십시오.\n", + "\n", + "### 예시 사례\n", + "\n", + "아래 예제에서는 사전 훈련된 ***ResNet50 v1.5*** 모델을 사용하여 ***이미지***에 대한 추론을 수행 하고 결과를 제시할 것입니다.\n", + "\n", + "예제를 실행하려면 몇 가지 추가 파이썬 패키지가 설치되어 있어야 합니다. 이는 이미지를 전처리하고 시각화하는 데 필요합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ec94df5a", + "metadata": {}, + "outputs": [], + "source": [ + "!pip install validators matplotlib" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2c42ab2d", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "from PIL import Image\n", + "import torchvision.transforms as transforms\n", + "import numpy as np\n", + "import json\n", + "import requests\n", + "import matplotlib.pyplot as plt\n", + "import warnings\n", + "warnings.filterwarnings('ignore')\n", + "%matplotlib inline\n", + "\n", + "device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", + "print(f'Using {device} for inference')" + ] + }, + { + "cell_type": "markdown", + "id": "b8e45a02", + "metadata": {}, + "source": [ + "IMAGENET 데이터셋에서 사전 훈련된 모델을 로드합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4775d70f", + "metadata": {}, + "outputs": [], + "source": [ + "resnet50 = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_resnet50', pretrained=True)\n", + "utils = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_convnets_processing_utils')\n", + "\n", + "resnet50.eval().to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "c132b8ac", + "metadata": {}, + "source": [ + "샘플 입력 데이터를 준비합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6b72a6fb", + "metadata": {}, + "outputs": [], + "source": [ + "uris = [\n", + " 'http://images.cocodataset.org/test-stuff2017/000000024309.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000028117.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000006149.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000004954.jpg',\n", + "]\n", + "\n", + "batch = torch.cat(\n", + " [utils.prepare_input_from_uri(uri) for uri in uris]\n", + ").to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "75037055", + "metadata": {}, + "source": [ + "추론을 실행합니다. `pick_n_best(predictions=output, n=topN)` helper 함수를 사용하여 모델에 따라 가장 가능성이 높은 가설을 N개 선택합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a759088d", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " output = torch.nn.functional.softmax(resnet50(batch), dim=1)\n", + " \n", + "results = utils.pick_n_best(predictions=output, n=5)" + ] + }, + { + "cell_type": "markdown", + "id": "2967a99f", + "metadata": {}, + "source": [ + "결과를 표시합니다." 
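 + "\n",
 + "참고로, `utils` 헬퍼를 사용하지 않고 가장 확률이 높은 클래스의 인덱스만 확인하고 싶다면 아래와 같이 할 수 있습니다. 간단한 예시입니다.\n",
 + "\n",
 + "```python\n",
 + "# 간단한 예시(가정): 헬퍼 함수 없이 top-1 클래스 인덱스와 확률만 확인합니다.\n",
 + "top1_prob, top1_idx = torch.max(output, dim=1)\n",
 + "print(top1_idx.cpu().numpy(), top1_prob.cpu().numpy())\n",
 + "```"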
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8fff331e", + "metadata": {}, + "outputs": [], + "source": [ + "for uri, result in zip(uris, results):\n", + " img = Image.open(requests.get(uri, stream=True).raw)\n", + " img.thumbnail((256,256), Image.ANTIALIAS)\n", + " plt.imshow(img)\n", + " plt.show()\n", + " print(result)\n" + ] + }, + { + "cell_type": "markdown", + "id": "0198dd40", + "metadata": {}, + "source": [ + "### 세부사항\n", + "모델 입력 및 출력, 학습 방법, 추론 및 성능 등에 대한 더 자세한 정보는 [github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/resnet50v1.5) 및 and/or [NGC](https://ngc.nvidia.com/catalog/resources/nvidia:resnet_50_v1_5_for_pytorch)에서 볼 수 있습니다.\n", + "\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Original ResNet50 v1 paper](https://arxiv.org/abs/1512.03385)\n", + " - [Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification](https://arxiv.org/pdf/1502.01852.pdf)\n", + " - [model on github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/resnet50v1.5)\n", + " - [model on NGC](https://ngc.nvidia.com/catalog/resources/nvidia:resnet_50_v1_5_for_pytorch)\n", + " - [pretrained model on NGC](https://ngc.nvidia.com/catalog/models/nvidia:resnet50_pyt_amp)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nvidia_deeplearningexamples_resnext.ipynb b/assets/hub/nvidia_deeplearningexamples_resnext.ipynb new file mode 100644 index 000000000..c786c709c --- /dev/null +++ b/assets/hub/nvidia_deeplearningexamples_resnext.ipynb @@ -0,0 +1,202 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "0e67b1fb", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ResNeXt101\n", + "\n", + "*Author: NVIDIA*\n", + "\n", + "**ResNet with bottleneck 3x3 Convolutions substituted by 3x3 Grouped Convolutions, trained with mixed precision using Tensor Cores.**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/ResNeXtArch.png) | ![alt](https://pytorch.org/assets/images/classification.jpg)\n", + "\n", + "\n", + "\n", + "### 모델 설명\n", + "\n", + "***ResNeXt101-32x4d***는 [Aggregated Residual Transformations for Deep Neural Networks](https://arxiv.org/pdf/1611.05431.pdf) 논문에 소개된 모델입니다.\n", + "\n", + "이 모델은 일반적인 ResNet 모델에 기반을 두고 있으며 ResNet의 3x3 그룹 합성곱(Grouped Convolution) 계층을 병목 블록(Bottleneck Block) 내부의 3x3 합성곱 계층으로 대체합니다.\n", + "\n", + "ResNeXt101 모델은 Volta, Turing 및 NVIDIA Ampere 아키텍처에서 Tensor Core를 사용하여 혼합 정밀도(Mixed Precision) 방식[1]으로 학습됩니다. 따라서 연구자들은 혼합 정밀도 학습의 장점을 경험하는 동시에 Tensor Cores를 사용하지 않을 때보다 결과를 3배 빠르게 얻을 수 있습니다. 이 모델은 시간이 지남에도 지속적인 정확도와 성능을 유지하기 위해 월별 NGC 컨테이너 출시에 대해 테스트되고 있습니다.\n", + "\n", + "혼합 정밀도 학습에는 [NHWC 데이터 레이아웃](https://pytorch.org/tutorials/intermediate/memory_format_tutorial.html)이 사용됩니다. \n", + "\n", + "ResNeXt101-32x4d 모델은 추론을 위해 TorchScript, ONNX Runtime 또는 TensorRT를 실행 백엔드로 사용하고 [NVIDIA Triton Inference Server](https://github.com/NVIDIA/trtis-inference-server)에 배포할 수 있습니다. 자세한 내용은 [NGC](https://catalog.ngc.nvidia.com/orgs/nvidia/resources/resnext_for_triton_from_pytorch)에서 확인하세요. 
\n", + "\n", + "#### 모델 구조\n", + "\n", + "![ResNextArch](https://pytorch.org/assets/images/ResNeXtArch.png)\n", + "\n", + "_이미지 출처: Aggregated Residual Transformations for Deep Neural Networks](https://arxiv.org/pdf/1611.05431.pdf)_\n", + "\n", + "위의 이미지는 ResNet 모델의 병목 블록과 ResNeXt 모델의 병목 블록의 차이를 나타냅니다.\n", + "\n", + "ResNeXt101-32x4d 모델의 카디널리티(Cardinality)는 32이고 병목 블록의 Width는 4입니다.\n", + "### 예시\n", + "\n", + "아래 예시에서 사전 학습된 ***ResNeXt101-32x4d***모델을 사용하여 이미지들에 대한 추론을 진행하고 결과를 제시합니다.\n", + "\n", + "예시를 실행하려면 추가적인 파이썬 패키지들이 설치되어야 합니다. 이 패키지들은 이미지 전처리 및 시각화에 필요합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8bd3d621", + "metadata": {}, + "outputs": [], + "source": [ + "!pip install validators matplotlib" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "306383d9", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "from PIL import Image\n", + "import torchvision.transforms as transforms\n", + "import numpy as np\n", + "import json\n", + "import requests\n", + "import matplotlib.pyplot as plt\n", + "import warnings\n", + "warnings.filterwarnings('ignore')\n", + "%matplotlib inline\n", + "\n", + "device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", + "print(f'Using {device} for inference')" + ] + }, + { + "cell_type": "markdown", + "id": "ed998e81", + "metadata": {}, + "source": [ + "IMAGENET 데이터셋으로 사전 학습된 모델을 불러옵니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4060f110", + "metadata": {}, + "outputs": [], + "source": [ + "resneXt = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_resneXt')\n", + "utils = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_convnets_processing_utils')\n", + "\n", + "resneXt.eval().to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "d2930b6d", + "metadata": {}, + "source": [ + "샘플 입력 데이터를 준비합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c1a8ba33", + "metadata": {}, + "outputs": [], + "source": [ + "uris = [\n", + " 'http://images.cocodataset.org/test-stuff2017/000000024309.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000028117.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000006149.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000004954.jpg',\n", + "]\n", + "\n", + "\n", + "batch = torch.cat(\n", + " [utils.prepare_input_from_uri(uri) for uri in uris]\n", + ").to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "f67d35e8", + "metadata": {}, + "source": [ + "추론을 시작합니다. 헬퍼 함수 `pick_n_best(predictions=output, n=topN)`를 사용해 모델에 대한 N개의 가장 가능성이 높은 가설을 선택합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e7461e48", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " output = torch.nn.functional.softmax(resneXt(batch), dim=1)\n", + " \n", + "results = utils.pick_n_best(predictions=output, n=5)" + ] + }, + { + "cell_type": "markdown", + "id": "ff767dd9", + "metadata": {}, + "source": [ + "결과를 출력합니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "335c65df", + "metadata": {}, + "outputs": [], + "source": [ + "for uri, result in zip(uris, results):\n", + " img = Image.open(requests.get(uri, stream=True).raw)\n", + " img.thumbnail((256,256), Image.ANTIALIAS)\n", + " plt.imshow(img)\n", + " plt.show()\n", + " print(result)\n" + ] + }, + { + "cell_type": "markdown", + "id": "5fbbc9b1", + "metadata": {}, + "source": [ + "### 세부사항\n", + "모델 입력 및 출력, 학습 방법, 추론 및 성능에 대한 자세한 내용은 [github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/resnext101-32x4d)이나 [NGC](https://catalog.ngc.nvidia.com/orgs/nvidia/resources/resnext_for_pytorch)에서 확인할 수 있습니다.\n", + "\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Aggregated Residual Transformations for Deep Neural Networks](https://arxiv.org/pdf/1611.05431.pdf)\n", + " - [model on github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/resnext101-32x4d)\n", + " - [model on NGC](https://ngc.nvidia.com/catalog/resources/nvidia:resnext_for_pytorch)\n", + " - [pretrained model on NGC](https://ngc.nvidia.com/catalog/models/nvidia:resnext101_32x4d_pyt_amp)\n", + "\n", + "\n", + " [1]: 빠르고 효율적인 처리를 위해 16비트 부동소수점과 32비트 부동소수점을 함께 사용해 학습하는 방식." + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nvidia_deeplearningexamples_se-resnext.ipynb b/assets/hub/nvidia_deeplearningexamples_se-resnext.ipynb new file mode 100644 index 000000000..23fc2fc5f --- /dev/null +++ b/assets/hub/nvidia_deeplearningexamples_se-resnext.ipynb @@ -0,0 +1,201 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "f06821f0", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# SE-ResNeXt101\n", + "\n", + "*Author: NVIDIA*\n", + "\n", + "**ResNeXt with Squeeze-and-Excitation module added, trained with mixed precision using Tensor Cores.**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/SEArch.png) | ![alt](https://pytorch.org/assets/images/classification.jpg)\n", + "\n", + "\n", + "\n", + "### Model Description\n", + "\n", + "The ***SE-ResNeXt101-32x4d*** is a [ResNeXt101-32x4d](https://arxiv.org/pdf/1611.05431.pdf)\n", + "model with added Squeeze-and-Excitation module introduced\n", + "in the [Squeeze-and-Excitation Networks](https://arxiv.org/pdf/1709.01507.pdf) paper.\n", + "\n", + "This model is trained with mixed precision using Tensor Cores on Volta, Turing, and the NVIDIA Ampere GPU architectures. Therefore, researchers can get results 3x faster than training without Tensor Cores, while experiencing the benefits of mixed precision training. 
This model is tested against each NGC monthly container release to ensure consistent accuracy and performance over time.\n", + "\n", + "We use [NHWC data layout](https://pytorch.org/tutorials/intermediate/memory_format_tutorial.html) when training using Mixed Precision.\n", + "\n", + "#### Model architecture\n", + "\n", + "![SEArch](https://pytorch.org/assets/images/SEArch.png)\n", + "\n", + "_Image source: [Squeeze-and-Excitation Networks](https://arxiv.org/pdf/1709.01507.pdf)_\n", + "\n", + "Image shows the architecture of SE block and where is it placed in ResNet bottleneck block.\n", + "\n", + "\n", + "Note that the SE-ResNeXt101-32x4d model can be deployed for inference on the [NVIDIA Triton Inference Server](https://github.com/NVIDIA/trtis-inference-server) using TorchScript, ONNX Runtime or TensorRT as an execution backend. For details check [NGC](https://catalog.ngc.nvidia.com/orgs/nvidia/resources/se_resnext_for_triton_from_pytorch).\n", + "\n", + "### Example\n", + "\n", + "In the example below we will use the pretrained ***SE-ResNeXt101-32x4d*** model to perform inference on images and present the result.\n", + "\n", + "To run the example you need some extra python packages installed. These are needed for preprocessing images and visualization." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ef57b4a1", + "metadata": {}, + "outputs": [], + "source": [ + "!pip install validators matplotlib" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "425ea5e9", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "from PIL import Image\n", + "import torchvision.transforms as transforms\n", + "import numpy as np\n", + "import json\n", + "import requests\n", + "import matplotlib.pyplot as plt\n", + "import warnings\n", + "warnings.filterwarnings('ignore')\n", + "%matplotlib inline\n", + "\n", + "device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")\n", + "print(f'Using {device} for inference')" + ] + }, + { + "cell_type": "markdown", + "id": "abe9637a", + "metadata": {}, + "source": [ + "Load the model pretrained on IMAGENET dataset." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7248ff13", + "metadata": {}, + "outputs": [], + "source": [ + "resneXt = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_se_resnext101_32x4d')\n", + "utils = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_convnets_processing_utils')\n", + "\n", + "resneXt.eval().to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "3b28255c", + "metadata": {}, + "source": [ + "Prepare sample input data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f8579abf", + "metadata": {}, + "outputs": [], + "source": [ + "uris = [\n", + " 'http://images.cocodataset.org/test-stuff2017/000000024309.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000028117.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000006149.jpg',\n", + " 'http://images.cocodataset.org/test-stuff2017/000000004954.jpg',\n", + "]\n", + "\n", + "\n", + "batch = torch.cat(\n", + " [utils.prepare_input_from_uri(uri) for uri in uris]\n", + ").to(device)" + ] + }, + { + "cell_type": "markdown", + "id": "9385dfaa", + "metadata": {}, + "source": [ + "Run inference. Use `pick_n_best(predictions=output, n=topN)` helper function to pick N most probable hypotheses according to the model." 
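 + "\n",
 + "As a side note, a minimal sketch of the Squeeze-and-Excitation operation described above could look like the following. This is for illustration only, under simplified assumptions, and is not the exact implementation used in this model.\n",
 + "\n",
 + "```python\n",
 + "# Illustrative sketch (assumption): a minimal Squeeze-and-Excitation block.\n",
 + "import torch.nn as nn\n",
 + "\n",
 + "class SEBlock(nn.Module):\n",
 + "    def __init__(self, channels, reduction=16):\n",
 + "        super().__init__()\n",
 + "        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: global average pooling\n",
 + "        self.fc = nn.Sequential(\n",
 + "            nn.Linear(channels, channels // reduction),\n",
 + "            nn.ReLU(inplace=True),\n",
 + "            nn.Linear(channels // reduction, channels),\n",
 + "            nn.Sigmoid(),\n",
 + "        )\n",
 + "\n",
 + "    def forward(self, x):\n",
 + "        b, c, _, _ = x.shape\n",
 + "        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)\n",
 + "        return x * w  # excitation: channel-wise re-weighting\n",
 + "```"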
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "314a4b4e", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " output = torch.nn.functional.softmax(resneXt(batch), dim=1)\n", + " \n", + "results = utils.pick_n_best(predictions=output, n=5)" + ] + }, + { + "cell_type": "markdown", + "id": "a8210d8b", + "metadata": {}, + "source": [ + "Display the result." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "974dc50b", + "metadata": {}, + "outputs": [], + "source": [ + "for uri, result in zip(uris, results):\n", + " img = Image.open(requests.get(uri, stream=True).raw)\n", + " img.thumbnail((256,256), Image.ANTIALIAS)\n", + " plt.imshow(img)\n", + " plt.show()\n", + " print(result)\n" + ] + }, + { + "cell_type": "markdown", + "id": "6a517b69", + "metadata": {}, + "source": [ + "### Details\n", + "For detailed information on model input and output, training recipies, inference and performance visit:\n", + "[github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/se-resnext101-32x4d)\n", + "and/or [NGC](https://catalog.ngc.nvidia.com/orgs/nvidia/resources/se_resnext_for_pytorch).\n", + "\n", + "\n", + "### References\n", + "\n", + " - [Squeeze-and-Excitation Networks](https://arxiv.org/pdf/1709.01507.pdf)\n", + " - [model on github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Classification/ConvNets/se-resnext101-32x4d)\n", + " - [model on NGC](https://catalog.ngc.nvidia.com/orgs/nvidia/resources/se_resnext_for_pytorch)\n", + " - [pretrained model on NGC](https://catalog.ngc.nvidia.com/orgs/nvidia/models/seresnext101_32x4d_pyt_amp)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nvidia_deeplearningexamples_ssd.ipynb b/assets/hub/nvidia_deeplearningexamples_ssd.ipynb new file mode 100644 index 000000000..e8a90ad10 --- /dev/null +++ b/assets/hub/nvidia_deeplearningexamples_ssd.ipynb @@ -0,0 +1,255 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "3be398c5", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# SSD\n", + "\n", + "*Author: NVIDIA*\n", + "\n", + "**Single Shot MultiBox Detector model for object detection**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/ssd_diagram.png) | ![alt](https://pytorch.org/assets/images/ssd.png)\n", + "\n", + "\n", + "\n", + "### Model Description\n", + "\n", + "SSD300 모델은 \"단일 심층 신경망을 사용하여 이미지에서 물체를 감지하는 방법\"을 설명하는 [SSD: Single Shot MultiBox Detector](https://arxiv.org/abs/1512.02325) 논문을 기반으로 합니다. 입력 크기는 300x300으로 고정되어 있습니다.\n", + "\n", + "이 모델과 논문에 설명된 모델의 큰 차이점은 백본(backbone)에 있습니다. 특히, 논문에서 사용한 VGG 모델은 더 이상 사용되지 않으며 ResNet-50 모델로 대체되었습니다.\n", + "\n", + "[Speed/accuracy trade-offs for modern convolutional object detectors](https://arxiv.org/abs/1611.10012) 논문에서, 백본에 대해 다음과 같은 개선이 이루어졌습니다.\n", + "\n", + "* conv5_x, avgpool, fc 및 softmax 레이어는 기존의 분류 모델에서 제거되었습니다.\n", + "* conv4_x의 모든 strides는 1x1로 설정됩니다.\n", + "\n", + "백본 뒤에는 5개의 합성곱 레이어가 추가됩니다. 
또한 합성곱 레이어 외에도 6개의 detection heads를 추가했습니다.\n", + "The backbone is followed by 5 additional convolutional layers.\n", + "In addition to the convolutional layers, we attached 6 detection heads:\n", + "* 첫 번째 detection head는 마지막 conv4_x 레이어에 연결됩니다.\n", + "* 나머지 5개의 detection head는 추가되는 5개의 합성곱 레이어에 부착됩니다.\n", + "\n", + "Detector heads는 논문에서 언급된 것과 유사하지만, 각각의 합성곱 레이어 뒤에 BatchNorm 레이어를 추가함으로써 성능이 향상됩니다.\n", + "\n", + "### Example\n", + "\n", + "아래 예에서는 사전에 학습된 SSD 모델을 사용하여 샘플 이미지에서 객체를 탐지하고 결과를 시각화합니다.\n", + "\n", + "예제를 실행하려면 몇 가지 추가적인 파이썬 패키지가 설치되어 있어야 합니다. 이는 이미지 전처리 및 시각화에 필요합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "417c8c58", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install numpy scipy scikit-image matplotlib" + ] + }, + { + "cell_type": "markdown", + "id": "65f13326", + "metadata": {}, + "source": [ + "COCO 데이터셋에 대해 사전에 학습된 SSD 모델과, 모델의 입력 및 출력에 대한 편리하고 포괄적인 형식 지정을 위한 유틸리티를 불러옵니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a473c8a3", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "ssd_model = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_ssd')\n", + "utils = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_ssd_processing_utils')" + ] + }, + { + "cell_type": "markdown", + "id": "9fee547e", + "metadata": {}, + "source": [ + "추론을 위해 불러온 모델을 준비합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "381363ae", + "metadata": {}, + "outputs": [], + "source": [ + "ssd_model.to('cuda')\n", + "ssd_model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "713c4c15", + "metadata": {}, + "source": [ + "객체 탐지를 위한 입력 이미지를 준비합니다. \n", + "(아래 예제 링크는 COCO 데이터셋의 처음 몇 개의 테스트 이미지에 해당하지만, 로컬 이미지에 대한 경로를 지정할 수도 있습니다.)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a5067560", + "metadata": {}, + "outputs": [], + "source": [ + "uris = [\n", + " 'http://images.cocodataset.org/val2017/000000397133.jpg',\n", + " 'http://images.cocodataset.org/val2017/000000037777.jpg',\n", + " 'http://images.cocodataset.org/val2017/000000252219.jpg'\n", + "]" + ] + }, + { + "cell_type": "markdown", + "id": "0993ebfa", + "metadata": {}, + "source": [ + "네트워크 입력에 맞게 이미지를 포맷하고 텐서로 변환합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "465c1d23", + "metadata": {}, + "outputs": [], + "source": [ + "inputs = [utils.prepare_input(uri) for uri in uris]\n", + "tensor = utils.prepare_tensor(inputs)" + ] + }, + { + "cell_type": "markdown", + "id": "393d7bdc", + "metadata": {}, + "source": [ + "객체를 탐지하기 위해 SSD 네트워크를 실행합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2ebef0c9", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " detections_batch = ssd_model(tensor)" + ] + }, + { + "cell_type": "markdown", + "id": "4f289622", + "metadata": {}, + "source": [ + "SSD 네트워크의 기본 출력값은 객체의 위치를 식별하는 8732개의 box와 클래스 확률 분포를 담고 있습니다.\n", + "보다 의미있는 결과(신뢰도>40%)만 필터링 해 보겠습니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0128e095", + "metadata": {}, + "outputs": [], + "source": [ + "results_per_input = utils.decode_results(detections_batch)\n", + "best_results_per_input = [utils.pick_best(results, 0.40) for results in results_per_input]" + ] + }, + { + "cell_type": "markdown", + "id": "c786167f", + "metadata": {}, + "source": [ + "이 모델은 COCO 데이터셋에 대해 학습되었고, 클래스 ID를 (사람이 식별할 수 있는) 객체 이름으로 바꾸기 위해 coco 데이터셋에 접근이 필요합니다.\n", + "처음에 다운로드할 때는 시간이 걸릴 수 있습니다." 
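+    ,
+    "\n",
+    "참고로, 아래는 `classes_to_labels`가 클래스 이름들의 리스트라고 가정한 간단한 확인용 예시입니다. (뒤의 시각화 코드에서처럼 탐지 결과의 클래스 ID는 1부터 시작하므로 `클래스 ID - 1`로 인덱싱합니다.)\n",
+    "\n",
+    "```python\n",
+    "# 가정: get_coco_object_dictionary()가 COCO 클래스 이름의 리스트를 반환한다고 가정합니다.\n",
+    "classes_to_labels = utils.get_coco_object_dictionary()\n",
+    "print(len(classes_to_labels))   # COCO 객체 카테고리 개수\n",
+    "print(classes_to_labels[0])     # 첫 번째 카테고리 이름\n",
+    "```"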
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "25e993e2", + "metadata": {}, + "outputs": [], + "source": [ + "classes_to_labels = utils.get_coco_object_dictionary()" + ] + }, + { + "cell_type": "markdown", + "id": "db154279", + "metadata": {}, + "source": [ + "끝으로, 탐지한 결과를 시각화해 보겠습니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "28f54304", + "metadata": {}, + "outputs": [], + "source": [ + "from matplotlib import pyplot as plt\n", + "import matplotlib.patches as patches\n", + "\n", + "for image_idx in range(len(best_results_per_input)):\n", + " fig, ax = plt.subplots(1)\n", + " # Show original, denormalized image...\n", + " image = inputs[image_idx] / 2 + 0.5\n", + " ax.imshow(image)\n", + " # ...with detections\n", + " bboxes, classes, confidences = best_results_per_input[image_idx]\n", + " for idx in range(len(bboxes)):\n", + " left, bot, right, top = bboxes[idx]\n", + " x, y, w, h = [val * 300 for val in [left, bot, right - left, top - bot]]\n", + " rect = patches.Rectangle((x, y), w, h, linewidth=1, edgecolor='r', facecolor='none')\n", + " ax.add_patch(rect)\n", + " ax.text(x, y, \"{} {:.0f}%\".format(classes_to_labels[classes[idx] - 1], confidences[idx]*100), bbox=dict(facecolor='white', alpha=0.5))\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "77fa79a7", + "metadata": {}, + "source": [ + "### Details\n", + "모델 입력 및 출력, 학습 방법, 추론 및 성능 등에 대한 더 자세한 정보는 [github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Detection/SSD) 및 [NGC](https://ngc.nvidia.com/catalog/resources/nvidia:ssd_for_pytorch)에서 볼 수 있습니다.\n", + "\n", + "### References\n", + "\n", + " - [SSD: Single Shot MultiBox Detector](https://arxiv.org/abs/1512.02325) paper\n", + " - [Speed/accuracy trade-offs for modern convolutional object detectors](https://arxiv.org/abs/1611.10012) paper\n", + " - [SSD on NGC](https://ngc.nvidia.com/catalog/resources/nvidia:ssd_for_pytorch)\n", + " - [SSD on github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/Detection/SSD)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nvidia_deeplearningexamples_tacotron2.ipynb b/assets/hub/nvidia_deeplearningexamples_tacotron2.ipynb new file mode 100644 index 000000000..2e7139cba --- /dev/null +++ b/assets/hub/nvidia_deeplearningexamples_tacotron2.ipynb @@ -0,0 +1,213 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "cbb5d9fe", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Tacotron 2\n", + "\n", + "*Author: NVIDIA*\n", + "\n", + "**The Tacotron 2 model for generating mel spectrograms from text**\n", + "\n", + "\"alt\"\n", + "\n", + "\n", + "\n", + "### 모델 설명\n", + "\n", + "Tacotron 2 및 WaveGlow 모델은 추가 운율 정보 없이 원본 텍스트에서 자연스러운 음성을 합성할 수 있는 텍스트 음성 변환 시스템을 만듭니다. Tacotron 2 모델은 인코더-디코더 아키텍처를 사용하여 입력 텍스트에서 멜 스펙트로그램(mel spectrogram)을 생성합니다. WaveGlow (torch.hub를 통해서도 사용 가능)는 멜 스펙트로그램을 사용하여 음성을 생성하는 흐름 기반(flow-based) 모델입니다.\n", + "\n", + "사전 훈련된 Tacotron 2 모델은 논문과 다르게 구현되었습니다. 
여기서 제공하는 모델에서는 LSTM 레이어를 정규화하기 위해 Zoneout 대신 Dropout을 사용합니다.\n", + "\n", + "### 예시 사례\n", + "\n", + "아래 예제에서는:\n", + "- 사전 훈련된 Tacotron2 및 Waveglow 모델은 torch.hub에서 가져옵니다.\n", + "- Tacotron2는 (\"Hello world, I miss you so much\")와 같은 입력 텍스트의 텐서 표현이 주어지면 그림과 같은 멜 스펙트로그램을 생성합니다. \n", + "- Waveglow는 멜 스펙트로그램에서 사운드를 생성합니다.\n", + "- 출력 사운드는 'audio.wav' 파일에 저장됩니다.\n", + "\n", + "이 예제를 실행하려면 몇 가지 추가 파이썬 패키지가 설치되어 있어야 합니다.\n", + "이는 텍스트 및 오디오를 전처리하는 것은 물론 디스플레이 및 입출력 전처리에도 필요합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8908c09d", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install numpy scipy librosa unidecode inflect librosa\n", + "apt-get update\n", + "apt-get install -y libsndfile1" + ] + }, + { + "cell_type": "markdown", + "id": "d2f6834d", + "metadata": {}, + "source": [ + "[LJ Speech dataset](https://keithito.com/LJ-Speech-Dataset/) 데이터셋에서 사전 훈련된 Tacotron2 모델을 불러오고 추론을 준비합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d83ac243", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "tacotron2 = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_tacotron2', model_math='fp16')\n", + "tacotron2 = tacotron2.to('cuda')\n", + "tacotron2.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "7bb6f3a2", + "metadata": {}, + "source": [ + "사전 훈련된 WaveGlow 모델 불러오기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "43d761ab", + "metadata": {}, + "outputs": [], + "source": [ + "waveglow = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_waveglow', model_math='fp16')\n", + "waveglow = waveglow.remove_weightnorm(waveglow)\n", + "waveglow = waveglow.to('cuda')\n", + "waveglow.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "9564006b", + "metadata": {}, + "source": [ + "모델이 다음과 같이 말하게 합시다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2f209aac", + "metadata": {}, + "outputs": [], + "source": [ + "text = \"Hello world, I missed you so much.\"" + ] + }, + { + "cell_type": "markdown", + "id": "1aea5e8b", + "metadata": {}, + "source": [ + "유틸리티 메서드를 사용하여 입력 형식을 지정합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7c7abc4f", + "metadata": {}, + "outputs": [], + "source": [ + "utils = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_tts_utils')\n", + "sequences, lengths = utils.prepare_input_sequence([text])" + ] + }, + { + "cell_type": "markdown", + "id": "0514cda4", + "metadata": {}, + "source": [ + "연결된 모델을 실행합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5e9e84c1", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " mel, _, _ = tacotron2.infer(sequences, lengths)\n", + " audio = waveglow.infer(mel)\n", + "audio_numpy = audio[0].data.cpu().numpy()\n", + "rate = 22050" + ] + }, + { + "cell_type": "markdown", + "id": "16857233", + "metadata": {}, + "source": [ + "파일로 저장하여 들어볼 수 있습니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0ddf428b", + "metadata": {}, + "outputs": [], + "source": [ + "from scipy.io.wavfile import write\n", + "write(\"audio.wav\", rate, audio_numpy)" + ] + }, + { + "cell_type": "markdown", + "id": "36118230", + "metadata": {}, + "source": [ + "또는 IPython이 있는 노트북에서 바로 들어볼 수 있습니다." 
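+    ,
+    "\n",
+    "(참고) 합성에 사용된 멜 스펙트로그램 자체를 확인하고 싶다면 다음과 같이 시각화해 볼 수도 있습니다. matplotlib이 설치되어 있고, `mel`이 `(1, n_mels, T)` 형태의 텐서라고 가정한 간단한 예시입니다.\n",
+    "\n",
+    "```python\n",
+    "import matplotlib.pyplot as plt\n",
+    "\n",
+    "# mel: Tacotron2가 생성한 멜 스펙트로그램 (배치 크기 1 가정)\n",
+    "plt.imshow(mel[0].float().cpu().numpy(), aspect='auto', origin='lower')\n",
+    "plt.xlabel('frame')\n",
+    "plt.ylabel('mel bin')\n",
+    "plt.show()\n",
+    "```"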
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "83a944dc", + "metadata": {}, + "outputs": [], + "source": [ + "from IPython.display import Audio\n", + "Audio(audio_numpy, rate=rate)" + ] + }, + { + "cell_type": "markdown", + "id": "6aea62b0", + "metadata": {}, + "source": [ + "### 세부사항\n", + "모델 입력 및 출력, 학습 방법, 추론 및 성능 등에 대한 더 자세한 정보는 [github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/SpeechSynthesis/Tacotron2) 및 and/or [NGC](https://ngc.nvidia.com/catalog/resources/nvidia:tacotron_2_and_waveglow_for_pytorch)에서 볼 수 있습니다.\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Natural TTS Synthesis by Conditioning WaveNet on Mel Spectrogram Predictions](https://arxiv.org/abs/1712.05884)\n", + " - [WaveGlow: A Flow-based Generative Network for Speech Synthesis](https://arxiv.org/abs/1811.00002)\n", + " - [Tacotron2 and WaveGlow on NGC](https://ngc.nvidia.com/catalog/resources/nvidia:tacotron_2_and_waveglow_for_pytorch)\n", + " - [Tacotron2 and Waveglow on github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/SpeechSynthesis/Tacotron2)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/nvidia_deeplearningexamples_waveglow.ipynb b/assets/hub/nvidia_deeplearningexamples_waveglow.ipynb new file mode 100644 index 000000000..f930ba0f9 --- /dev/null +++ b/assets/hub/nvidia_deeplearningexamples_waveglow.ipynb @@ -0,0 +1,228 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "02285c4a", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# WaveGlow\n", + "\n", + "*Author: NVIDIA*\n", + "\n", + "**WaveGlow model for generating speech from mel spectrograms (generated by Tacotron2)**\n", + "\n", + "\"alt\"\n", + "\n", + "\n", + "\n", + "### Model Description\n", + "\n", + "The Tacotron 2 and WaveGlow model form a text-to-speech system that enables user to synthesise a natural sounding speech from raw transcripts without any additional prosody information. The Tacotron 2 model (also available via torch.hub) produces mel spectrograms from input text using encoder-decoder architecture. WaveGlow is a flow-based model that consumes the mel spectrograms to generate speech.\n", + "\n", + "### Example\n", + "\n", + "In the example below:\n", + "- pretrained Tacotron2 and Waveglow models are loaded from torch.hub\n", + "- Tacotron2 generates mel spectrogram given tensor represantation of an input text (\"Hello world, I missed you so much\")\n", + "- Waveglow generates sound given the mel spectrogram\n", + "- the output sound is saved in an 'audio.wav' file\n", + "\n", + "To run the example you need some extra python packages installed.\n", + "These are needed for preprocessing the text and audio, as well as for display and input / output." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "22d17453", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install numpy scipy librosa unidecode inflect librosa\n", + "apt-get update\n", + "apt-get install -y libsndfile1" + ] + }, + { + "cell_type": "markdown", + "id": "f01b0ea6", + "metadata": {}, + "source": [ + "Load the WaveGlow model pre-trained on [LJ Speech dataset](https://keithito.com/LJ-Speech-Dataset/)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2c98a776", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "waveglow = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_waveglow', model_math='fp32')" + ] + }, + { + "cell_type": "markdown", + "id": "97838af5", + "metadata": {}, + "source": [ + "Prepare the WaveGlow model for inference" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f42b807d", + "metadata": {}, + "outputs": [], + "source": [ + "waveglow = waveglow.remove_weightnorm(waveglow)\n", + "waveglow = waveglow.to('cuda')\n", + "waveglow.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "7af3cbdf", + "metadata": {}, + "source": [ + "Load a pretrained Tacotron2 model" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "249b66ab", + "metadata": {}, + "outputs": [], + "source": [ + "tacotron2 = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_tacotron2', model_math='fp32')\n", + "tacotron2 = tacotron2.to('cuda')\n", + "tacotron2.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "1fa500c3", + "metadata": {}, + "source": [ + "Now, let's make the model say:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6b113837", + "metadata": {}, + "outputs": [], + "source": [ + "text = \"hello world, I missed you so much\"" + ] + }, + { + "cell_type": "markdown", + "id": "776b806a", + "metadata": {}, + "source": [ + "Format the input using utility methods" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ed2cbfdb", + "metadata": {}, + "outputs": [], + "source": [ + "utils = torch.hub.load('NVIDIA/DeepLearningExamples:torchhub', 'nvidia_tts_utils')\n", + "sequences, lengths = utils.prepare_input_sequence([text])" + ] + }, + { + "cell_type": "markdown", + "id": "32688dcb", + "metadata": {}, + "source": [ + "Run the chained models" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "aa31519e", + "metadata": {}, + "outputs": [], + "source": [ + "with torch.no_grad():\n", + " mel, _, _ = tacotron2.infer(sequences, lengths)\n", + " audio = waveglow.infer(mel)\n", + "audio_numpy = audio[0].data.cpu().numpy()\n", + "rate = 22050" + ] + }, + { + "cell_type": "markdown", + "id": "412fa651", + "metadata": {}, + "source": [ + "You can write it to a file and listen to it" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b662bfc9", + "metadata": {}, + "outputs": [], + "source": [ + "from scipy.io.wavfile import write\n", + "write(\"audio.wav\", rate, audio_numpy)" + ] + }, + { + "cell_type": "markdown", + "id": "eb814a80", + "metadata": {}, + "source": [ + "Alternatively, play it right away in a notebook with IPython widgets" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0022134d", + "metadata": {}, + "outputs": [], + "source": [ + "from IPython.display import Audio\n", + "Audio(audio_numpy, rate=rate)" + ] + }, + { + "cell_type": "markdown", + "id": "1cf994b6", + "metadata": {}, + "source": [ + "### Details\n", + "For 
detailed information on model input and output, training recipies, inference and performance visit: [github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/SpeechSynthesis/Tacotron2) and/or [NGC](https://ngc.nvidia.com/catalog/resources/nvidia:tacotron_2_and_waveglow_for_pytorch)\n", + "\n", + "### References\n", + "\n", + " - [Natural TTS Synthesis by Conditioning WaveNet on Mel Spectrogram Predictions](https://arxiv.org/abs/1712.05884)\n", + " - [WaveGlow: A Flow-based Generative Network for Speech Synthesis](https://arxiv.org/abs/1811.00002)\n", + " - [Tacotron2 and WaveGlow on NGC](https://ngc.nvidia.com/catalog/resources/nvidia:tacotron_2_and_waveglow_for_pytorch)\n", + " - [Tacotron2 and Waveglow on github](https://github.com/NVIDIA/DeepLearningExamples/tree/master/PyTorch/SpeechSynthesis/Tacotron2)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_fairseq_roberta.ipynb b/assets/hub/pytorch_fairseq_roberta.ipynb new file mode 100644 index 000000000..b256a753e --- /dev/null +++ b/assets/hub/pytorch_fairseq_roberta.ipynb @@ -0,0 +1,180 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "e2b48886", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# RoBERTa\n", + "\n", + "*Author: Facebook AI (fairseq Team)*\n", + "\n", + "**BERT를 강력하게 최적화하는 사전 학습 접근법, RoBERTa**\n", + "\n", + "\n", + "\n", + "### 모델 설명\n", + "\n", + "Bidirectional Encoder Representations from Transformers, [BERT][1]는 텍스트에서 의도적으로 숨겨진 부분을 예측하는 뛰어난 자기지도 사전 학습(self-supervised pretraining) 기술입니다. 특히 BERT가 학습한 표현은 다운스트림 태스크(downstream tasks)에 잘 일반화되는 것으로 나타났으며, BERT가 처음 공개된 2018년에 수많은 자연어처리 벤치마크 데이터셋에 대해 가장 좋은 성능을 기록했습니다.\n", + "\n", + "[RoBERTa][2]는 BERT의 언어 마스킹 전략(language masking strategy)에 기반하지만 몇 가지 차이점이 존재합니다. 다음 문장 사전 학습(next-sentence pretraining objective)을 제거하고 훨씬 더 큰 미니 배치와 학습 속도로 훈련하는 등 주요 하이퍼파라미터를 수정합니다. 또한 RoBERTa는 더 오랜 시간 동안 BERT보다 훨씬 많은 데이터에 대해 학습되었습니다. 이를 통해 RoBERTa의 표현은 BERT보다 다운스트림 태스크에 더 잘 일반화될 수 있습니다.\n", + "\n", + "\n", + "### 요구 사항\n", + "\n", + "추가적인 Python 의존성이 필요합니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c8b05422", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install regex requests hydra-core omegaconf" + ] + }, + { + "cell_type": "markdown", + "id": "d93d8cfd", + "metadata": {}, + "source": [ + "### 예시\n", + "\n", + "##### RoBERTa 불러오기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "082e4326", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "roberta = torch.hub.load('pytorch/fairseq', 'roberta.large')\n", + "roberta.eval() # 드롭아웃 비활성화 (또는 학습 모드 비활성화)" + ] + }, + { + "cell_type": "markdown", + "id": "01efd92b", + "metadata": {}, + "source": [ + "##### 입력 텍스트에 Byte-Pair Encoding (BPE) 적용하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1579bdc5", + "metadata": {}, + "outputs": [], + "source": [ + "tokens = roberta.encode('Hello world!')\n", + "assert tokens.tolist() == [0, 31414, 232, 328, 2]\n", + "assert roberta.decode(tokens) == 'Hello world!'" + ] + }, + { + "cell_type": "markdown", + "id": "23ff9486", + "metadata": {}, + "source": [ + "##### RoBERTa에서 특징(feature) 추출" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "03629826", + "metadata": {}, + "outputs": [], + "source": [ + "# 마지막 계층의 특징 추출\n", + "last_layer_features = roberta.extract_features(tokens)\n", + "assert last_layer_features.size() == torch.Size([1, 5, 1024])\n", + "\n", + "# 모든 계층의 특징 추출\n", + "all_layers = roberta.extract_features(tokens, return_all_hiddens=True)\n", + "assert len(all_layers) == 25\n", + "assert torch.all(all_layers[-1] == last_layer_features)" + ] + }, + { + "cell_type": "markdown", + "id": "8cfdde4a", + "metadata": {}, + "source": [ + "##### 문장 관계 분류(sentence-pair classification) 태스크에 RoBERTa 사용하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5074dab2", + "metadata": {}, + "outputs": [], + "source": [ + "# MNLI에 대해 미세조정된 RoBERTa 다운로드\n", + "roberta = torch.hub.load('pytorch/fairseq', 'roberta.large.mnli')\n", + "roberta.eval() # 평가를 위해 드롭아웃 비활성화\n", + "\n", + "with torch.no_grad():\n", + " # 한 쌍의 문장을 인코딩하고 예측\n", + " tokens = roberta.encode('Roberta is a heavily optimized version of BERT.', 'Roberta is not very optimized.')\n", + " prediction = roberta.predict('mnli', tokens).argmax().item()\n", + " assert prediction == 0 # contradiction\n", + "\n", + " # 다른 문장 쌍을 인코딩하고 예측\n", + " tokens = roberta.encode('Roberta is a heavily optimized version of BERT.', 'Roberta is based on BERT.')\n", + " prediction = roberta.predict('mnli', tokens).argmax().item()\n", + " assert prediction == 2 # entailment" + ] + }, + { + "cell_type": "markdown", + "id": "c7f975dc", + "metadata": {}, + "source": [ + "##### 새로운 분류층 적용하기" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f6c10892", + "metadata": {}, + "outputs": [], + "source": [ + "roberta.register_classification_head('new_task', num_classes=3)\n", + "logprobs = roberta.predict('new_task', tokens) # tensor([[-1.1050, -1.0672, -1.1245]], grad_fn=)" + ] + }, + { + "cell_type": "markdown", + "id": "d3ae12b4", + "metadata": {}, + "source": [ + "### 참고\n", + "\n", + "- [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding][1]\n", + "- [RoBERTa: A Robustly Optimized BERT Pretraining Approach][2]\n", + "\n", + "\n", + "[1]: https://arxiv.org/abs/1810.04805\n", + "[2]: https://arxiv.org/abs/1907.11692" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git 
a/assets/hub/pytorch_fairseq_translation.ipynb b/assets/hub/pytorch_fairseq_translation.ipynb new file mode 100644 index 000000000..af6bde6bf --- /dev/null +++ b/assets/hub/pytorch_fairseq_translation.ipynb @@ -0,0 +1,197 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "3b2f7672", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Transformer (NMT)\n", + "\n", + "*Author: Facebook AI (fairseq Team)*\n", + "\n", + "**영어-프랑스어 번역과 영어-독일어 번역을 위한 트랜스포머 모델**\n", + "\n", + "\n", + "\n", + "### 모델 설명\n", + "\n", + "논문 [Attention Is All You Need][1]에 소개되었던 트랜스포머(Transformer)는 \n", + "강력한 시퀀스-투-시퀀스 모델링 아키텍처로 최신 기계 신경망 번역 시스템을 가능하게 합니다.\n", + "\n", + "최근, `fairseq`팀은 역번역된 데이터를 활용한 \n", + "트랜스포머의 대규모 준지도 학습을 통해 번역 수준을 기존보다 향상시켰습니다.\n", + "더 자세한 내용은 [블로그 포스트][2]를 통해 찾으실 수 있습니다.\n", + "\n", + "\n", + "### 요구사항\n", + "\n", + "전처리 과정을 위해 몇 가지 python 라이브러리가 필요합니다:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9c34eb44", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install bitarray fastBPE hydra-core omegaconf regex requests sacremoses subword_nmt" + ] + }, + { + "cell_type": "markdown", + "id": "fcd2706d", + "metadata": {}, + "source": [ + "### 영어 ➡️ 프랑스어 번역\n", + "\n", + "영어를 프랑스어로 번역하기 위해 [Scaling\n", + "Neural Machine Translation][3] 논문의 모델을 활용합니다:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b10d20be", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "# WMT'14 data에서 학습된 영어 ➡️ 프랑스어 트랜스포머 모델 불러오기:\n", + "en2fr = torch.hub.load('pytorch/fairseq', 'transformer.wmt14.en-fr', tokenizer='moses', bpe='subword_nmt')\n", + "\n", + "# GPU 사용 (선택사항):\n", + "en2fr.cuda()\n", + "\n", + "# beam search를 통한 번역:\n", + "fr = en2fr.translate('Hello world!', beam=5)\n", + "assert fr == 'Bonjour à tous !'\n", + "\n", + "# 토큰화:\n", + "en_toks = en2fr.tokenize('Hello world!')\n", + "assert en_toks == 'Hello world !'\n", + "\n", + "# BPE 적용:\n", + "en_bpe = en2fr.apply_bpe(en_toks)\n", + "assert en_bpe == 'H@@ ello world !'\n", + "\n", + "# 이진화:\n", + "en_bin = en2fr.binarize(en_bpe)\n", + "assert en_bin.tolist() == [329, 14044, 682, 812, 2]\n", + "\n", + "# top-k sampling을 통해 다섯 번역 사례 생성:\n", + "fr_bin = en2fr.generate(en_bin, beam=5, sampling=True, sampling_topk=20)\n", + "assert len(fr_bin) == 5\n", + "\n", + "# 예시중 하나를 문자열로 변환하고 비토큰화\n", + "fr_sample = fr_bin[0]['tokens']\n", + "fr_bpe = en2fr.string(fr_sample)\n", + "fr_toks = en2fr.remove_bpe(fr_bpe)\n", + "fr = en2fr.detokenize(fr_toks)\n", + "assert fr == en2fr.decode(fr_sample)" + ] + }, + { + "cell_type": "markdown", + "id": "b09b37cf", + "metadata": {}, + "source": [ + "### 영어 ➡️ 독일어 번역\n", + "\n", + "역번역에 대한 준지도학습은 번역 시스템을 향상시키는데 효율적인 방법입니다.\n", + "논문 [Understanding Back-Translation at Scale][4]에서,\n", + "추가적인 학습 데이터로 사용하기 위해 2억개 이상의 독일어 문장을 역번역합니다. 이 다섯 모델들의 앙상블은 [WMT'18 English-German news translation competition][5]의 수상작입니다.\n", + "\n", + "[noisy-channel reranking][6]을 통해 이 접근법을 더 향상시킬 수 있습니다. \n", + "더 자세한 내용은 [블로그 포스트][7]에서 볼 수 있습니다. 
\n", + "이러한 노하우로 학습된 모델들의 앙상블은 [WMT'19 English-German news\n", + "translation competition][8]의 수상작입니다.\n", + "\n", + "앞서 소개된 대회 수상 모델 중 하나를 사용하여 영어를 독일어로 번역해보겠습니다:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "46bbaee9", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "# WMT'19 data에서 학습된 영어 ➡️ 독일어 트랜스포머 모델 불러오기:\n", + "en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de.single_model', tokenizer='moses', bpe='fastbpe')\n", + "\n", + "# 기본 트랜스포머 모델에 접근\n", + "assert isinstance(en2de.models[0], torch.nn.Module)\n", + "\n", + "# 영어 ➡️ 독일어 번역\n", + "de = en2de.translate('PyTorch Hub is a pre-trained model repository designed to facilitate research reproducibility.')\n", + "assert de == 'PyTorch Hub ist ein vorgefertigtes Modell-Repository, das die Reproduzierbarkeit der Forschung erleichtern soll.'" + ] + }, + { + "cell_type": "markdown", + "id": "4159e73b", + "metadata": {}, + "source": [ + "교차번역으로 같은 문장에 대한 의역을 만들 수도 있습니다:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0642a212", + "metadata": {}, + "outputs": [], + "source": [ + "# 영어 ↔️ 독일어 교차번역:\n", + "en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de.single_model', tokenizer='moses', bpe='fastbpe')\n", + "de2en = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.de-en.single_model', tokenizer='moses', bpe='fastbpe')\n", + "\n", + "paraphrase = de2en.translate(en2de.translate('PyTorch Hub is an awesome interface!'))\n", + "assert paraphrase == 'PyTorch Hub is a fantastic interface!'\n", + "\n", + "# 영어 ↔️ 러시아어 교차번역과 비교:\n", + "en2ru = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-ru.single_model', tokenizer='moses', bpe='fastbpe')\n", + "ru2en = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.ru-en.single_model', tokenizer='moses', bpe='fastbpe')\n", + "\n", + "paraphrase = ru2en.translate(en2ru.translate('PyTorch Hub is an awesome interface!'))\n", + "assert paraphrase == 'PyTorch is a great interface!'" + ] + }, + { + "cell_type": "markdown", + "id": "01f0c462", + "metadata": {}, + "source": [ + "### 참고 문헌\n", + "\n", + "- [Attention Is All You Need][1]\n", + "- [Scaling Neural Machine Translation][3]\n", + "- [Understanding Back-Translation at Scale][4]\n", + "- [Facebook FAIR's WMT19 News Translation Task Submission][6]\n", + "\n", + "\n", + "[1]: https://arxiv.org/abs/1706.03762\n", + "[2]: https://code.fb.com/ai-research/scaling-neural-machine-translation-to-bigger-data-sets-with-faster-training-and-inference/\n", + "[3]: https://arxiv.org/abs/1806.00187\n", + "[4]: https://arxiv.org/abs/1808.09381\n", + "[5]: http://www.statmt.org/wmt18/translation-task.html\n", + "[6]: https://arxiv.org/abs/1907.06616\n", + "[7]: https://ai.facebook.com/blog/facebook-leads-wmt-translation-competition/\n", + "[8]: http://www.statmt.org/wmt19/translation-task.html" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_alexnet.ipynb b/assets/hub/pytorch_vision_alexnet.ipynb new file mode 100644 index 000000000..4e72abb75 --- /dev/null +++ b/assets/hub/pytorch_vision_alexnet.ipynb @@ -0,0 +1,149 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "e45ea396", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + 
"----------------------------------------------------------------------\n", + "\n", + "# AlexNet\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**The 2012 ImageNet winner achieved a top-5 error of 15.3%, more than 10.8 percentage points lower than that of the runner up.**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/alexnet1.png) | ![alt](https://pytorch.org/assets/images/alexnet2.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c6e24a0e", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'alexnet', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "dfdbed6a", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 동일한 방식으로 정규화된 입력 이미지, 즉 N이 이미지 수이고, H와 W는 최소 224픽셀인 (N, 3, H, W)형태의 3채널 RGB 이미지의 미니 배치를 요구합니다. 이미지를 [0, 1] 범위로 로드한 다음 mean = [0.485, 0.456, 0.406] 및 std = [0.229, 0.224, 0.225]를 사용하여 정규화해야 합니다.\n", + "\n", + "이미지는 `[0, 1]`의 범위에서 로드되어야 하고 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]` 으로 정규화해야합니다.\n", + "\n", + "다음은 실행 예제입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8eda9cc3", + "metadata": {}, + "outputs": [], + "source": [ + "# PyTorch 웹사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f05b220c", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예제 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 형식인 미니 배치 생성\n", + "\n", + "# 빠른 실행을 위해 GPU 사용 가능 시 모델과 입력값을 GPU를 사용하도록 설정\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# Imagenet 1000개 클래스의 신뢰 점수를 나타내는 텐서\n", + "\n", + "print(output[0])\n", + "\n", + "# 결과는 비정규화된 점수입니다. softmax으로 돌리면 확률값을 얻을 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9bcebfd1", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fd2296e7", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지별 확률값 상위 카테고리 출력\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "15245352", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "AlexNet은 2012년도 ImageNet Large Scale Visual Recognition Challenge (ILSVRC)에 참여한 모델입니다. 
이 네트워크는 15.3%의 top-5 에러율을 달성했고, 이는 2위보다 10.8%P 낮은 수치입니다. 원 논문의 주요 결론은 높은 성능을 위해 모델의 깊이가 필수적이라는 것이었습니다. 이는 계산 비용이 많이 들지만, 학습 과정에서 GPU의 사용으로 가능해졌습니다.\n", + "\n", + "사전 훈련된 모델이 있는 ImageNet 데이터셋의 1-crop 에러율은 다음 표와 같습니다.\n", + "\n", + "| 모델 구조 | Top-1 에러 | Top-5 에러 |\n", + "| --------------- | ----------- | ----------- |\n", + "| alexnet | 43.45 | 20.91 | -->\n", + "\n", + "### 참고문헌\n", + "\n", + "1. [One weird trick for parallelizing convolutional neural networks](https://arxiv.org/abs/1404.5997)." + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_deeplabv3_resnet101.ipynb b/assets/hub/pytorch_vision_deeplabv3_resnet101.ipynb new file mode 100644 index 000000000..51550eab0 --- /dev/null +++ b/assets/hub/pytorch_vision_deeplabv3_resnet101.ipynb @@ -0,0 +1,158 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "c21e55bb", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Deeplabv3\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**DeepLabV3 models with ResNet-50, ResNet-101 and MobileNet-V3 backbones**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/deeplab1.png) | ![alt](https://pytorch.org/assets/images/deeplab2.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "57b67a45", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'deeplabv3_resnet50', pretrained=True)\n", + "# 또는 아래 중 하나\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'deeplabv3_resnet101', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'deeplabv3_mobilenet_v3_large', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "3cf0a74b", + "metadata": {}, + "source": [ + "사전 훈련된 모든 모델들은 동일한 방식으로 정규화된 입력 이미지를 기대합니다.\n", + "즉, `(N, 3, H, W)` 모양의 3채널 RGB 이미지의 미니 배치, 여기서 `N` 은 이미지의 개수, `H` 와 `W`은 각각 최소 `224` 픽셀들로 이루어진 것으로 기대합니다.\n", + "이미지는 `[0, 1]` 범위로 로드한 다음 `mean = [0.485, 0.456, 0.406]` 과 `std = [0.229, 0.224, 0.225]`\n", + "를 사용하여 정규화를 진행합니다.\n", + "\n", + "모델은 입력 Tensor와 높이와 너비가 같지만 21개의 클래스가 있는 두 개의 텐서가 있는 `OrderedDict`를 반환합니다.\n", + "`output['out']` 의미론적 마스크를 포함하고 있고, `output['aux']`에는 픽셀 당 보조 손실(auxiliary loss) 값을 포함하고 있습니다. 추론 모드에서는, `output['aux']`는 유용하지 않습니다.\n", + "따라서, `output['out']`은 `(N, 21, H, W)`과 같은 모양을 가집니다. 좀 더 자세한 정보는 [이곳](https://pytorch.org/vision/stable/models.html#semantic-segmentation)에서 확인할 수 있습니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e5949368", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹사이트에서 예시 이미지를 다운로드합니다.\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/deeplab1.png\", \"deeplab1.png\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "955c5727", + "metadata": {}, + "outputs": [], + "source": [ + "# 샘플을 실행합니다. 
(torchvision이 필요합니다.)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "input_image = input_image.convert(\"RGB\")\n", + "preprocess = transforms.Compose([\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델이 원하는 미니 배치를 만듭니다.\n", + "\n", + "# 가능한 경우 속도를 빠르게 하기 위해 입력 및 모델을 GPU로 이동합니다.\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)['out'][0]\n", + "output_predictions = output.argmax(0)" + ] + }, + { + "cell_type": "markdown", + "id": "a0811dd0", + "metadata": {}, + "source": [ + "여기서 출력은 `(21, H, W)` 형태이며, 각 위치에서는 클래스마다 예측에 해당하는 정규화되지 않은 확률이 있습니다.\n", + "각 클래스의 최대 예측값을 얻은 다음 다운스트림 작업에 사용하려면, `output_predictions = output.argmax(0)`를 수행합니다.\n", + "\n", + "다음은 각각 클래스마다 색상이 할당된 예측을 나타내는 작은 조각입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b6cdbce7", + "metadata": {}, + "outputs": [], + "source": [ + "# 색상 팔레트를 만들고 각 클래스의 색상을 선택합니다.\n", + "palette = torch.tensor([2 ** 25 - 1, 2 ** 15 - 1, 2 ** 21 - 1])\n", + "colors = torch.as_tensor([i for i in range(21)])[:, None] * palette\n", + "colors = (colors % 255).numpy().astype(\"uint8\")\n", + "\n", + "# 각 색상에서 21개 클래스의 의미론적 분할 예측을 플로팅합니다.\n", + "r = Image.fromarray(output_predictions.byte().cpu().numpy()).resize(input_image.size)\n", + "r.putpalette(colors)\n", + "\n", + "import matplotlib.pyplot as plt\n", + "plt.imshow(r)\n", + "# plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "8d5e6d05", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "Deeplabv3-ResNet은 ResNet-50 또는 ResNet-101 백본이 있는 Deeplabv3 모델로 구성되어 있습니다.\n", + "Deeplabv3-MobileNetV3-Large는 MobileNetV3 large 백본이 있는 DeepLabv3 모델로 구성되어 있습니다.\n", + "사전 훈련된 모델은 Pascal VOC 데이터 세트에 있는 20개 카테고리에 대해 COCO train2017의 일부분 데이터 셋에 대해 훈련되었습니다.\n", + "\n", + "COCO val2017 데이터 셋에서 평가된 사전 훈련된 모델의 정확도는 다음과 같습니다.\n", + "\n", + "| Model structure | Mean IOU | Global Pixelwise Accuracy |\n", + "| ---------------------------- | ----------- | --------------------------|\n", + "| deeplabv3_resnet50 | 66.4 | 92.4 |\n", + "| deeplabv3_resnet101 | 67.4 | 92.4 |\n", + "| deeplabv3_mobilenet_v3_large | 60.3 | 91.2 |\n", + "\n", + "### 참조\n", + "\n", + " - [Rethinking Atrous Convolution for Semantic Image Segmentation](https://arxiv.org/abs/1706.05587)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_densenet.ipynb b/assets/hub/pytorch_vision_densenet.ipynb new file mode 100644 index 000000000..6b1e0aa2d --- /dev/null +++ b/assets/hub/pytorch_vision_densenet.ipynb @@ -0,0 +1,155 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "809975a2", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Densenet\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**Dense Convolutional Network (DenseNet), connects each layer to every other layer in a feed-forward fashion.**\n", + "\n", + "_ | _\n", + "- | 
-\n", + "![alt](https://pytorch.org/assets/images/densenet1.png) | ![alt](https://pytorch.org/assets/images/densenet2.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b808b710", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'densenet121', pretrained=True)\n", + "# or any of these variants\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'densenet169', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'densenet201', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'densenet161', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "c75f2754", + "metadata": {}, + "source": [ + "사전에 학습된 모든 모델은 동일한 방식으로 정규화된 입력 이미지,\n", + "즉, `H` 와 `W` 는 최소 `224` 이상인 `(3 x H x W)` 형태의 3-채널 RGB 이미지의 미니 배치를 요구합니다.\n", + "이미지를 `[0, 1]` 범위에서 로드한 다음 `mean = [0.485, 0.456, 0.406]`\n", + "과 `std = [0.229, 0.224, 0.225]` 를 통해 정규화합니다.\n", + "\n", + "실행 예시입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4ec41f0f", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "09f4c970", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예시 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구되는 미니배치 생성\n", + "\n", + "# 가능하다면 속도를 위해 입력과 모델을 GPU로 옮깁니다\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# shape이 1000이며 ImageNet의 1000개 클래스에 대한 신뢰도 점수가 있는 텐서\n", + "print(output[0])\n", + "# 출력에 정규화되지 않은 점수가 있습니다. 확률을 얻으려면 소프트맥스를 실행하세요.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "03ffd128", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "92928b6f", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지 별 Top5 카테고리 조회\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "782d13be", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "Dense Convolutional Network (DenseNet)는 순전파(feed-forward) 방식으로 각 레이어를 다른 모든 레이어과 연결합니다. L 계층의 기존 합성곱 신경망이 L개의 연결 - 각 층과 다음 층 사이의 하나 - 인 반면 우리의 신경망은 L(L+1)/2 직접 연결을 가집니다. 
각 계층에, 모든 선행 계층의 (feature-map)형상 맵은 입력으로 사용되며, 자체 형상 맵은 모든 후속 계층에 대한 입력으로 사용됩니다. DenseNets는 몇 가지 강력한 장점을 가집니다: 그레디언트가 사라지는 문제를 완화시키고, 특징 전파를 강화하며, 특징 재사용을 권장하며, 매개 변수의 수를 크게 줄입니다.\n", + "\n", + "사전 학습된 모델을 사용한 imagenet 데이터셋의 1-crop 오류율은 다음 표와 같습니다.\n", + "\n", + "| Model structure | Top-1 error | Top-5 error |\n", + "| --------------- | ----------- | ----------- |\n", + "| densenet121 | 25.35 | 7.83 |\n", + "| densenet169 | 24.00 | 7.00 |\n", + "| densenet201 | 22.80 | 6.43 |\n", + "| densenet161 | 22.35 | 6.20 |\n", + "\n", + "### 참고 자료\n", + "\n", + " - [Densely Connected Convolutional Networks](https://arxiv.org/abs/1608.06993)." + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_fcn_resnet101.ipynb b/assets/hub/pytorch_vision_fcn_resnet101.ipynb new file mode 100644 index 000000000..0d7ee9c6e --- /dev/null +++ b/assets/hub/pytorch_vision_fcn_resnet101.ipynb @@ -0,0 +1,147 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "c149136e", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# FCN\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**Fully-Convolutional Network model with ResNet-50 and ResNet-101 backbones**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/deeplab1.png) | ![alt](https://pytorch.org/assets/images/fcn2.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b813b212", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'fcn_resnet50', pretrained=True)\n", + "# or\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'fcn_resnet101', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "d7845f97", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 동일한 방식으로 정규화된 입력 이미지, 즉 `N`이 이미지 수이고, `H`와 `W`는 최소 `224`픽셀인 `(N, 3, H, W)`형태의 3채널 RGB 이미지의 미니 배치를 요구합니다. \n", + "이미지를 `[0, 1]` 범위로 로드한 다음 `mean = [0.485, 0.456, 0.406]` 및 `std = [0.229, 0.224, 0.225]`를 사용하여 정규화해야 합니다.\n", + "모델은 입력 텐서와 높이와 너비는 같지만 클래스가 21개인 텐서를 가진 `OrderedDict`를 반환합니다. `output['out']`에는 시멘틱 마스크가 포함되며 `output['aux']`에는 픽셀당 보조 손실 값이 포함됩니다. 추론 모드에서는 `output['aux']`이 유용하지 않습니다.\n", + "그래서 `output['out']`의 크기는 `(N, 21, H, W)`입니다. 추가 설명서는 [여기](https://pytorch.org/vision/stable/models.html#object-detection-instance-segmentation-and-person-keypoint-detection)에서 찾을 수 있습니다." 
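+    ,
+    "\n",
+    "다음은 위 설명을 확인해 보는 간단한 예시입니다. (아래 예제처럼 전처리된 미니 배치 `input_batch`와 불러온 `model`이 준비되어 있다고 가정합니다.)\n",
+    "\n",
+    "```python\n",
+    "with torch.no_grad():\n",
+    "    out = model(input_batch)    # OrderedDict 반환\n",
+    "print(out['out'].shape)         # torch.Size([N, 21, H, W])\n",
+    "pred = out['out'].argmax(1)     # (N, H, W) 형태, 픽셀별 클래스 인덱스\n",
+    "```"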
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0a20840f", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/deeplab1.png\", \"deeplab1.png\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7d67e334", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예시 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "input_image = input_image.convert(\"RGB\")\n", + "preprocess = transforms.Compose([\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구되는 미니배치 생성\n", + "\n", + "# 가능하다면 속도를 위해 입력과 모델을 GPU로 옮깁니다\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)['out'][0]\n", + "output_predictions = output.argmax(0)" + ] + }, + { + "cell_type": "markdown", + "id": "58cd0374", + "metadata": {}, + "source": [ + "여기서의 출력 형태는 `(21, H, W)`이며, 각 위치에는 각 클래스의 예측에 해당하는 정규화되지 않은 확률이 있습니다. 각 클래스의 최대 예측을 가져온 다음 이를 다운스트림 작업에 사용하려면 `output_propertions = output.slmax(0)`를 수행합니다. 다음은 각 클래스에 할당된 각 색상과 함께 예측을 표시하는 작은 토막글 입니다(왼쪽의 시각화 이미지 참조)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e81e20d4", + "metadata": {}, + "outputs": [], + "source": [ + "# 각 클래스에 대한 색상을 선택하여 색상 팔레트를 만듭니다.\n", + "palette = torch.tensor([2 ** 25 - 1, 2 ** 15 - 1, 2 ** 21 - 1])\n", + "colors = torch.as_tensor([i for i in range(21)])[:, None] * palette\n", + "colors = (colors % 255).numpy().astype(\"uint8\")\n", + "\n", + "# 각 색상의 21개 클래스의 시멘틱 세그멘테이션 예측을 그림으로 표시합니다.\n", + "r = Image.fromarray(output_predictions.byte().cpu().numpy()).resize(input_image.size)\n", + "r.putpalette(colors)\n", + "\n", + "import matplotlib.pyplot as plt\n", + "plt.imshow(r)\n", + "# plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "ba512637", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "FCN-ResNet은 ResNet-50 또는 ResNet-101 백본을 사용하여 완전 컨볼루션 네트워크 모델로 구성됩니다. 
사전 훈련된 모델은 Pascal VOC 데이터 세트에 존재하는 20개 범주에 대한 COCO 2017의 하위 집합에 대해 훈련 되었습니다.\n", + "\n", + "COCO val 2017 데이터셋에서 평가된 사전 훈련된 모델의 정확성은 아래에 나열되어 있습니다.\n", + "\n", + "| Model structure | Mean IOU | Global Pixelwise Accuracy |\n", + "| --------------- | ----------- | --------------------------|\n", + "| fcn_resnet50 | 60.5 | 91.4 |\n", + "| fcn_resnet101 | 63.7 | 91.9 |\n", + "\n", + "### Resources\n", + "\n", + " - [Fully Convolutional Networks for Semantic Segmentation](https://arxiv.org/abs/1605.06211)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_ghostnet.ipynb b/assets/hub/pytorch_vision_ghostnet.ipynb new file mode 100644 index 000000000..178d5a14f --- /dev/null +++ b/assets/hub/pytorch_vision_ghostnet.ipynb @@ -0,0 +1,156 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "ea95842c", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# GhostNet\n", + "\n", + "*Author: Huawei Noah's Ark Lab*\n", + "\n", + "**Efficient networks by generating more features from cheap operations**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "36de2a11", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('huawei-noah/ghostnet', 'ghostnet_1x', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "70160078", + "metadata": {}, + "source": [ + "모든 사전 학습된 모델들은 입력 이미지가 동일한 방식으로 정규화 되는 것을 요구합니다. \n", + "다시 말해 `H`와 `W`가 적어도 `224`이고, `(3 x H x W)`의 shape를 가지는 3채널 RGB 이미지들의 미니배치를 말합니다.\n", + "이 이미지들은 `[0, 1]`의 범위로 로드되어야 하고, `mean = [0.485, 0.456, 0.406]`\n", + "과 `std = [0.229, 0.224, 0.225]`를 사용하여 정규화되어야 합니다.\n", + "\n", + "여기서부터는 예시 코드 입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "391149a0", + "metadata": {}, + "outputs": [], + "source": [ + "# pytorch 웹사이트에서 예시 이미지를 다운로드 합니다.\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c68273ca", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예시 코드 (torchvision 필요합니다)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에 맞추어 미니배치를 생성 합니다.\n", + "\n", + "# 연산속도를 위해 input과 모델을 GPU에 로드 합니다\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# ImageNet 1000개의 클래스의 신뢰점수를 포함하는 (1000,) 의 텐서를 return 합니다.\n", + "print(output[0])\n", + "# output은 정규화되지 않은 신뢰 점수로 얻어집니다. 
확률을 얻기 위해 소프트맥스를 사용할 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6f929317", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet의 라벨을 다운로드 합니다.\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "660d75a5", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리를 읽어옵니다.\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지 마다 확률값이 가장 높은 범주 출력 합니다.\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "f5eb9821", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "고스트넷 아키텍처는 다양한 특징 맵을 효율적인 연산으로 생성하는 고스트 모듈 구조로 이루어집니다. \n", + "합성곱 신경망에서의 학습 과정에서 추론에 중요한 중복되는 고유 특징맵(고스트 맵)들이 다수 생성되는 현상에 기반하여 설계 되었습니다. 고스트넷에서는 더 효율적인 연산으로 고스트 맵들을 생성합니다.\n", + "벤치마크에서 수행된 실험을 통해 속도와 정확도의 상충 관계에 관한 고스트넷의 우수성을 보여줍니다.\n", + "\n", + "사전 학습된 모델을 사용한 ImageNet 데이터셋에 따른 정확도는 아래에 나열되어 있습니다.\n", + "\n", + "| Model structure | FLOPs | Top-1 acc | Top-5 acc |\n", + "| --------------- | ----------- | ----------- | ----------- |\n", + "| GhostNet 1.0x | 142M | 73.98 | 91.46 |\n", + "\n", + "\n", + "### 참고\n", + "\n", + "다음 [링크](https://arxiv.org/abs/1911.11907)에서 논문의 전체적인 내용에 대하여 읽을 수 있습니다.\n", + "\n", + ">@inproceedings{han2019ghostnet,\n", + "> title={GhostNet: More Features from Cheap Operations},\n", + "> author={Kai Han and Yunhe Wang and Qi Tian and Jianyuan Guo and Chunjing Xu and Chang Xu},\n", + "> booktitle={CVPR},\n", + "> year={2020},\n", + ">}" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_googlenet.ipynb b/assets/hub/pytorch_vision_googlenet.ipynb new file mode 100644 index 000000000..8d42b7145 --- /dev/null +++ b/assets/hub/pytorch_vision_googlenet.ipynb @@ -0,0 +1,146 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "b357620e", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# GoogLeNet\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**GoogLeNet was based on a deep convolutional neural network architecture codenamed \"Inception\" which won ImageNet 2014.**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/googlenet1.png) | ![alt](https://pytorch.org/assets/images/googlenet2.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8223a8c1", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'googlenet', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "f793ae4a", + "metadata": {}, + "source": [ + "사전 훈련된 모델들을 사용할 때는 동일한 방식으로 정규화된 이미지를 입력으로 넣어야 합니다.\n", + "즉, 미니 배치(mini-batch)의 3-채널 RGB 이미지들은 `(3 x H x W)`의 형태를 가지며, 해당 `H`와 `W`는 최소 `224` 이상이어야 합니다.\n", + "각 이미지는 `[0, 1]`의 범위 내에서 불러와야 하며, `mean = [0.485, 0.456, 0.406]` 과 `std = [0.229, 0.224, 0.225]`을 이용해 
정규화되어야 합니다.\n", + "다음은 실행 예제 입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ff6387b6", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "57c9d1e5", + "metadata": {}, + "outputs": [], + "source": [ + "# 예시 코드 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 가정하는대로 미니배치 생성\n", + "\n", + "# gpu를 사용할 수 있다면, 속도를 위해 입력과 모델을 gpu로 옮김\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# output은 shape가 [1000]인 Tensor 자료형이며, 이는 ImageNet 데이터셋의 1000개의 각 클래스에 대한 모델의 확신도(confidence)를 나타냄\n", + "print(output[0])\n", + "# output은 정규화되지 않았으므로, 확률화하기 위해 softmax 함수를 처리\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8af25363", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 데이터셋 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "185e8846", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리(클래스) 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 각 이미지에 대한 top 5 카테고리 출력\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "1e2fda3f", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "GoogLeNet은 코드네임 \"Inception\"으로 불리는 신경망 아키텍처에 기반한 깊은 합성곱 신경망입니다. 이 모델은 ImageNet Large-Scale Visual Recognition Challenge 2014 (ILSVRC 2014) 에서 새로운 SOTA(state of the art)를 달성했습니다. 
사전 훈련된 모델로 ImageNet 데이터셋에서의 단일-크롭 방식으로 오류 비율을 측정한 결과는 아래와 같습니다.\n", + "\n", + "| 모델 구조 | Top-1 오류 | Top-5 오류 |\n", + "| --------------- | ----------- | ----------- |\n", + "| googlenet | 30.22 | 10.47 |\n", + "\n", + "\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Going Deeper with Convolutions](https://arxiv.org/abs/1409.4842)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_hardnet.ipynb b/assets/hub/pytorch_vision_hardnet.ipynb new file mode 100644 index 000000000..5be74b2d0 --- /dev/null +++ b/assets/hub/pytorch_vision_hardnet.ipynb @@ -0,0 +1,156 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "7c63b336", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# HarDNet\n", + "\n", + "*Author: PingoLH*\n", + "\n", + "**Harmonic DenseNet pre-trained on ImageNet**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/hardnet.png) | ![alt](https://pytorch.org/assets/images/hardnet_blk.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6f42d746", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('PingoLH/Pytorch-HarDNet', 'hardnet68', pretrained=True)\n", + "# or any of these variants\n", + "# model = torch.hub.load('PingoLH/Pytorch-HarDNet', 'hardnet85', pretrained=True)\n", + "# model = torch.hub.load('PingoLH/Pytorch-HarDNet', 'hardnet68ds', pretrained=True)\n", + "# model = torch.hub.load('PingoLH/Pytorch-HarDNet', 'hardnet39ds', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "c214fe5a", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 동일한 방식으로 정규화된 입력 이미지를 요구합니다.\n", + "즉, `H`와 `W`가 최소 `224`의 크기를 가지는 `(3 x H x W)`형태의 3채널 RGB 이미지의 미니배치가 필요합니다. \n", + "이미지를 [0, 1] 범위로 불러온 다음 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]`를 이용하여 정규화해야 합니다.\n", + "\n", + "다음은 실행예시입니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cef5dc30", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1480b62e", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행예시 (torchvision이 요구됩니다)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 미니배치 생성\n", + "\n", + "# GPU 사용이 가능한 경우 속도를 위해 입력과 모델을 GPU로 이동\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# ImageNet의 1000개 클래스에 대한 신뢰도 점수를 가진 1000 형태의 Tensor 출력\n", + "print(output[0])\n", + "# 출력은 정규화되어있지 않습니다. 소프트맥스를 실행하여 확률을 얻을 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f0ddf4b7", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "28cf90af", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽어오기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지마다 상위 카테고리 5개 보여주기\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "dd587a5d", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "HarDNet(Harmonic DenseNet)은 낮은 메모리 트래픽을 가지는 CNN 모델로 빠르고 효율적입니다.\n", + "기본 개념은 계산 비용과 메모리 접근 비용을 동시에 최소화하는 것입니다. 따라서 HarDNet 모델은 동일한 정확도를 가진 ResNet 모델에 비해 GPU에서 실행되는 속도가 35% 더 빠릅니다. 
(MobileNet과 비교하기 위해 설계된 두 DS 모델은 제외)\n", + "\n", + "아래에는 각각 깊이별 분리 가능한 Conv 레이어가 있거나 없는 39, 68, 85개의 레이어를 포함한 4가지 버전의 HardNet 모델이 있습니다.\n", + "사전 훈련된 모델에 대해 ImageNet 데이터셋의 1-crop 오류율은 아래에 나열되어 있습니다.\n", + "\n", + "| Model structure | Top-1 error | Top-5 error |\n", + "| --------------- | ----------- | ----------- |\n", + "| hardnet39ds | 27.92 | 9.57 |\n", + "| hardnet68ds | 25.71 | 8.13 |\n", + "| hardnet68 | 23.52 | 6.99 |\n", + "| hardnet85 | 21.96 | 6.11 |\n", + "\n", + "### 참고문헌\n", + "\n", + " - [HarDNet: A Low Memory Traffic Network](https://arxiv.org/abs/1909.00948)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_ibnnet.ipynb b/assets/hub/pytorch_vision_ibnnet.ipynb new file mode 100644 index 000000000..363fcab7e --- /dev/null +++ b/assets/hub/pytorch_vision_ibnnet.ipynb @@ -0,0 +1,6 @@ +{ + "cells": [], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_inception_v3.ipynb b/assets/hub/pytorch_vision_inception_v3.ipynb new file mode 100644 index 000000000..f990587f2 --- /dev/null +++ b/assets/hub/pytorch_vision_inception_v3.ipynb @@ -0,0 +1,143 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "ac647325", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Inception_v3\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**Also called GoogleNetv3, a famous ConvNet trained on Imagenet from 2015**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "791894f7", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'inception_v3', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "1ca306f4", + "metadata": {}, + "source": [ + "사전 훈련된 모델들을 사용할 때는 동일한 방식으로 정규화된 이미지를 입력으로 넣어야 합니다.\n", + "즉, 미니 배치(mini-batch)의 3-채널 RGB 이미지들은 `(3 x H x W)`의 형태를 가지며, 해당 `H`와 `W`는 최소 `224` 이상이어야 합니다.\n", + "각 이미지는 `[0, 1]`의 범위 내에서 불러와야 하며, `mean = [0.485, 0.456, 0.406]` 과 `std = [0.229, 0.224, 0.225]`을 이용해 정규화되어야 합니다.\n", + "다음은 실행 예제 입니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "066680af", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9716d227", + "metadata": {}, + "outputs": [], + "source": [ + "# 예시 코드 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(299),\n", + " transforms.CenterCrop(299),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 가정하는대로 미니배치 생성\n", + "\n", + "# gpu를 사용할 수 있다면, 속도를 위해 입력과 모델을 gpu로 옮김\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# output은 shape가 [1000]인 Tensor 자료형이며, 이는 Imagenet 데이터셋의 1000개의 각 클래스에 대한 모델의 확신도(confidence)를 나타냄\n", + "print(output[0])\n", + "# output은 정규화되지 않았으므로, 확률화하기 위해 softmax 함수를 처리\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d46ed0f8", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 데이터셋 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "59802809", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리(클래스) 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 각 이미지에 대한 top 5 카테고리 출력\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "7ce3bfd3", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "Inception v3는 합성곱 연산을 적절히 분해하고 적극적인 정규화를 통해 추가된 계산을 가능한 한 효율적으로 활용하는 것을 목표로 네트워크를 확장하는 방법에 대한 탐색을 기반으로 합니다. ILSVRC 2012 (ImageNet) 분류 문제에서 본 논문은 당시 기준의 SOTA(state of the art) 모델보다 상당한 성능 향상을 얻었고, 단일 프레임 평가에서 21.2%의 top-1 오류와 5.6%의 top-5 오류를 달성했습니다. 이 결과는 2500만개 이하의 파라미터와 단일 추론 당 50억번의 곱셈-덧셈 연산의 계산 비용으로 달성되었습니다. 또한 4개 모델의 앙상블(ensemble)과 다중-크롭 평가(multi-crop evaluation)을 이용하여, 17.3%의 top-1 오류와 3.6%의 top-5 오류를 평가(validation) 데이터셋에서 달성합니다.\n", + "사전 훈련된 모델로 ImageNet 데이터셋에서의 단일-크롭 방식으로 오류 비율을 측정한 결과는 아래와 같습니다.\n", + "\n", + "| 모델 구조 | Top-1 오류 | Top-5 오류 |\n", + "| --------------- | ----------- | ----------- |\n", + "| inception_v3 | 22.55 | 6.44 |\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Rethinking the Inception Architecture for Computer Vision](https://arxiv.org/abs/1512.00567)." 
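+ "\n",
+ "참고로, 본문에서 말하는 '합성곱 연산의 분해(factorization)'가 어떤 의미인지 개념적으로만 보여주는 간단한 예시입니다. 아래의 채널 수(192)는 설명을 위해 임의로 가정한 값이며, 실제 Inception v3 내부 구현과 정확히 일치하지는 않을 수 있습니다.\n",
+ "\n",
+ "```python\n",
+ "import torch.nn as nn\n",
+ "\n",
+ "# 하나의 7x7 합성곱\n",
+ "full = nn.Conv2d(192, 192, kernel_size=7, padding=3)\n",
+ "# 이를 1x7, 7x1 합성곱 두 개로 분해한 형태\n",
+ "factorized = nn.Sequential(\n",
+ "    nn.Conv2d(192, 192, kernel_size=(1, 7), padding=(0, 3)),\n",
+ "    nn.Conv2d(192, 192, kernel_size=(7, 1), padding=(3, 0)),\n",
+ ")\n",
+ "# 분해된 형태가 훨씬 적은 매개변수를 사용함을 확인할 수 있습니다.\n",
+ "print(sum(p.numel() for p in full.parameters()), sum(p.numel() for p in factorized.parameters()))\n",
+ "```"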
+ ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_meal_v2.ipynb b/assets/hub/pytorch_vision_meal_v2.ipynb new file mode 100644 index 000000000..5bb2068ff --- /dev/null +++ b/assets/hub/pytorch_vision_meal_v2.ipynb @@ -0,0 +1,196 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "c6c1e08f", + "metadata": {}, + "source": [ + "### This notebook requires a GPU runtime to run.\n", + "### Please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# MEAL_V2\n", + "\n", + "*Author: Carnegie Mellon University*\n", + "\n", + "**Boosting Tiny and Efficient Models using Knowledge Distillation.**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/MEALV2_method.png) | ![alt](https://pytorch.org/assets/images/MEALV2_results.png)\n", + "\n", + "\n", + "`timm` 종속 패키지 설치가 필요합니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2cc79914", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "!pip install timm" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "53f9b7cc", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "# 모델 종류: 'mealv1_resnest50', 'mealv2_resnest50', 'mealv2_resnest50_cutmix', 'mealv2_resnest50_380x380', 'mealv2_mobilenetv3_small_075', 'mealv2_mobilenetv3_small_100', 'mealv2_mobilenet_v3_large_100', 'mealv2_efficientnet_b0'\n", + "# 사전에 학습된 \"mealv2_resnest50_cutmix\"을 불러오는 예시입니다.\n", + "model = torch.hub.load('szq0214/MEAL-V2','meal_v2', 'mealv2_resnest50_cutmix', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "ed53b9cc", + "metadata": {}, + "source": [ + "사전에 학습된 모든 모델은 동일한 방식으로 정규화된 입력 이미지, 즉, `H` 와 `W` 는 최소 `224` 이상인 `(3 x H x W)` 형태의 3-채널 RGB 이미지의 미니 배치를 요구합니다. 이미지를 `[0, 1]` 범위에서 불러온 다음 `mean = [0.485, 0.456, 0.406]` 과 `std = [0.229, 0.224, 0.225]` 를 통해 정규화합니다.\n", + "\n", + "실행 예시입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2b5ef94d", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0d5183a9", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예시 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 미니배치 생성\n", + "\n", + "# 가능하다면 속도를 위해 입력과 모델을 GPU로 옮깁니다.\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# 1000개의 ImageNet 클래스에 대한 신뢰도 점수(confidence score)를 가진 1000 크기의 Tensor\n", + "print(output[0])\n", + "# output엔 정규화되지 않은 신뢰도 점수가 있습니다. 
확률값을 얻으려면 softmax를 실행하세요.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "177ca315", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d56197f7", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지별 Top5 카테고리 조회\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "20028bed", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "MEAL V2 모델들은 [MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks](https://arxiv.org/pdf/2009.08453.pdf) 논문에 기반합니다. \n", + "\n", + "MEAL V2의 주요 관점은 distillation 과정에 One-Hot 레이블을 사용하지 않는다는 것입니다. MEAL V2는 판별자를 이용한 knowledge distillation 앙상블 기법인 [MEAL](https://arxiv.org/abs/1812.02425)에 기초하며, MEAL을 단순화하기 위해 다음의 수정을 거쳤습니다. 1) 판별자 입력, 유사도 손실 계산에 최종 출력만을 활용합니다. 2) 모든 teacher들의 예측 확률을 평균 내어 distillation에 활용합니다. 이를 통해 MEAL V2는 어떠한 트릭 사용 없이도 ResNet-50의 ImageNet Top-1 정확도를 80% 이상 기록할 수 있습니다. (트릭 : 1) 모델 구조 변경; 2) ImageNet 외 추가 데이터 활용; 3) autoaug/randaug; 4) cosine learning rate; 5) mixup/cutmix; 6) label smoothing; etc)\n", + "\n", + "| Models | Resolution| #Parameters | Top-1/Top-5 |\n", + "| :---: | :-: | :-: | :------:| :------: | \n", + "| [MEAL-V1 w/ ResNet50](https://arxiv.org/abs/1812.02425) | 224 | 25.6M |**78.21/94.01** | [GitHub](https://github.com/AaronHeee/MEAL#imagenet-model) |\n", + "| MEAL-V2 w/ ResNet50 | 224 | 25.6M | **80.67/95.09** | \n", + "| MEAL-V2 w/ ResNet50| 380 | 25.6M | **81.72/95.81** | \n", + "| MEAL-V2 + CutMix w/ ResNet50| 224 | 25.6M | **80.98/95.35** | \n", + "| MEAL-V2 w/ MobileNet V3-Small 0.75| 224 | 2.04M | **67.60/87.23** | \n", + "| MEAL-V2 w/ MobileNet V3-Small 1.0| 224 | 2.54M | **69.65/88.71** | \n", + "| MEAL-V2 w/ MobileNet V3-Large 1.0 | 224 | 5.48M | **76.92/93.32** | \n", + "| MEAL-V2 w/ EfficientNet-B0| 224 | 5.29M | **78.29/93.95** | \n", + "\n", + "### 참조\n", + "\n", + "자세한 사항은 [MEAL V2](https://arxiv.org/pdf/2009.08453.pdf), [MEAL](https://arxiv.org/pdf/1812.02425.pdf)을 통해 확인할 수 있습니다." 
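+ "\n",
+ "아래는 MEAL V2의 핵심 아이디어, 즉 여러 teacher의 예측 확률을 평균 내어 soft target으로 사용하는 distillation 손실을 개념적으로만 나타낸 스케치입니다. 공식 구현과는 다를 수 있으며, `student_logits`와 `teacher_logits_list`는 설명을 위해 가정한 이름입니다.\n",
+ "\n",
+ "```python\n",
+ "import torch\n",
+ "import torch.nn.functional as F\n",
+ "\n",
+ "def meal_v2_style_loss(student_logits, teacher_logits_list):\n",
+ "    # 모든 teacher의 예측 확률을 평균 내어 soft target을 만듭니다 (One-Hot 레이블 미사용).\n",
+ "    teacher_probs = torch.stack([F.softmax(t, dim=1) for t in teacher_logits_list]).mean(dim=0)\n",
+ "    # student의 최종 출력과 soft target 사이의 KL 발산을 최소화합니다.\n",
+ "    return F.kl_div(F.log_softmax(student_logits, dim=1), teacher_probs, reduction='batchmean')\n",
+ "```"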
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3ce61ccc", + "metadata": {}, + "outputs": [], + "source": [ + "@article{shen2020mealv2,\n", + " title={MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks},\n", + " author={Shen, Zhiqiang and Savvides, Marios},\n", + " journal={arXiv preprint arXiv:2009.08453},\n", + " year={2020}\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "7032db77", + "metadata": {}, + "source": [ + "@inproceedings{shen2019MEAL,\n", + "\t\ttitle = {MEAL: Multi-Model Ensemble via Adversarial Learning},\n", + "\t\tauthor = {Shen, Zhiqiang and He, Zhankui and Xue, Xiangyang},\n", + "\t\tbooktitle = {AAAI},\n", + "\t\tyear = {2019}\n", + "\t}" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_mobilenet_v2.ipynb b/assets/hub/pytorch_vision_mobilenet_v2.ipynb new file mode 100644 index 000000000..c33d07857 --- /dev/null +++ b/assets/hub/pytorch_vision_mobilenet_v2.ipynb @@ -0,0 +1,146 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "a4acfa6f", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# MobileNet v2\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**잔차 블록에 기반한 속도와 메모리에 최적화된 효율적인 네트워크**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/mobilenet_v2_1.png) | ![alt](https://pytorch.org/assets/images/mobilenet_v2_2.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "83b9f453", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'mobilenet_v2', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "05553227", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델들은 동일한 방식으로 정규화된 이미지를 입력으로 사용합니다.\n", + "즉, 미니 배치의 3-채널 RGB 이미지들은 `(3 x H x W)`의 형태를 가지며, 해당 `H`와 `W`는 최소 `224` 이상이어야 합니다.\n", + "각 이미지는 `[0, 1]`의 범위 내에서 불러와야 하며, `mean = [0.485, 0.456, 0.406]` 과 `std = [0.229, 0.224, 0.225]`을 이용해 정규화되어야 합니다.\n", + "\n", + "다음은 실행 예제 입니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7a94ae2b", + "metadata": {}, + "outputs": [], + "source": [ + "# pytorch 웹사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "bea6f4d8", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행 예제 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 형태의 미니배치 생성\n", + "\n", + "# 사용 가능한 경우 속도를 위해 입력 데이터와 모델을 GPU로 이동\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# output은 1000개의 Tensor 형태이며, 이는 Imagenet 데이터 셋의 1000개 클래스에 대한 신뢰도 점수를 나타내는 결과\n", + "print(output[0])\n", + "# output 결과는 정규화되지 않은 결과. 확률을 얻기 위해선 softmax를 거쳐야 함.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "29c02890", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 라벨 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "30b68ed4", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽어들이기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지 별 상위 카테고리 표시\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "5a4f0f0a", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "MobileNet v2 구조는 잔차 블록의 입력 및 출력이 얇은 병목 계층 형태인 반전된 잔차 구조를 기반으로 합니다. 반전된 잔차 구조는 입력단에서 확장된 표현을 사용하는 기존의 잔차 모델과 반대되는 구조입니다. MobileNet v2는 경량화된 depthwise 합성곱을 사용하여 중간 확장 계층의 특징들을 필터링합니다. 
또한, 표현력 유지를 위해 좁은 계층의 비선형성은 제거되었습니다.\n", + "\n", + "| 모델 구조 | Top-1 오류 | Top-5 오류 |\n", + "| --------------- | ----------- | ----------- |\n", + "| mobilenet_v2 | 28.12 | 9.71 |\n", + "\n", + "\n", + "### 참고문헌\n", + "\n", + " - [MobileNetV2: Inverted Residuals and Linear Bottlenecks](https://arxiv.org/abs/1801.04381)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_proxylessnas.ipynb b/assets/hub/pytorch_vision_proxylessnas.ipynb new file mode 100644 index 000000000..231c1ff6c --- /dev/null +++ b/assets/hub/pytorch_vision_proxylessnas.ipynb @@ -0,0 +1,158 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "9d82d376", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ProxylessNAS\n", + "\n", + "*Author: MIT Han Lab*\n", + "\n", + "**Proxylessly specialize CNN architectures for different hardware platforms.**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5cbd2779", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "target_platform = \"proxyless_cpu\"\n", + "# proxyless_gpu, proxyless_mobile, proxyless_mobile14도 사용할 수 있습니다.\n", + "model = torch.hub.load('mit-han-lab/ProxylessNAS', target_platform, pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "629a9dc8", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 동일한 방식으로 정규화된 입력 이미지를 요구합니다.\n", + "즉, `H`와 `W`가 최소 `224`의 크기를 가지는 `(3 x H x W)`형태의 3채널 RGB 이미지의 미니배치가 필요합니다. \n", + "이미지를 [0, 1] 범위로 불러온 다음 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]`를 이용하여 정규화해야 합니다.\n", + "\n", + "다음은 실행예시입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a5be9b16", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3ac19396", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행예시 (torchvision이 요구됩니다.)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 미니배치 생성\n", + "\n", + "# GPU 사용이 가능한 경우 속도를 위해 입력과 모델을 GPU로 이동\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# ImageNet 1000개 클래스에 대한 신뢰도 점수를 가진 1000 형태의 Tensor 출력\n", + "print(output[0])\n", + "# 출력은 정규화되어있지 않습니다. 
소프트맥스를 실행하여 확률을 얻을 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e594006a", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "37a95b8c", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽어오기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지마다 상위 카테고리 5개 보여주기\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "721b7a55", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "ProxylessNAS 모델은 [ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware](https://arxiv.org/abs/1812.00332) 논문에서 제안되었습니다.\n", + "\n", + "일반적으로, 사람들은 *모든 하드웨어 플랫폼*에 대해 *하나의 효율적인 모델*을 설계하는 경향이 있습니다. 하지만 하드웨어마다 특성이 다릅니다. 예를 들어 CPU는 더 높은 주파수를 가지지만 GPU는 병렬화에 더 뛰어납니다. 따라서 모델을 일반화하기보다는 하드웨어 플랫폼에 맞게 CNN 아키텍처를 **전문화**해야 합니다. 아래에서 볼 수 있듯이, 전문화는 세 가지 플랫폼 모두에서 상당한 성능 향상을 제공합니다.\n", + "\n", + "| Model structure | GPU Latency | CPU Latency | Mobile Latency\n", + "| --------------- | ----------- | ----------- | ----------- |\n", + "| proxylessnas_gpu | **5.1ms** | 204.9ms | 124ms |\n", + "| proxylessnas_cpu | 7.4ms | **138.7ms** | 116ms |\n", + "| proxylessnas_mobile | 7.2ms | 164.1ms | **78ms** |\n", + "\n", + "사전 훈련된 모델에 해당하는 Top-1 정확도는 아래에 나열되어 있습니다.\n", + "\n", + "| Model structure | Top-1 error |\n", + "| --------------- | ----------- |\n", + "| proxylessnas_cpu | 24.7 |\n", + "| proxylessnas_gpu | 24.9 |\n", + "| proxylessnas_mobile | 25.4 |\n", + "| proxylessnas_mobile_14 | 23.3 |\n", + "\n", + "### 참고문헌\n", + "\n", + " - [ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware](https://arxiv.org/abs/1812.00332)." 
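+ "\n",
+ "참고로, 현재 사용 중인 환경에서 모델의 추론 지연 시간을 대략적으로 비교해보고 싶다면 아래와 같이 간단히 측정해볼 수 있습니다. 위 표의 수치를 재현하기 위한 공식적인 측정 방법이 아닌, 개념적인 예시일 뿐입니다.\n",
+ "\n",
+ "```python\n",
+ "import time\n",
+ "import torch\n",
+ "\n",
+ "model.eval()\n",
+ "device = next(model.parameters()).device\n",
+ "dummy = torch.randn(1, 3, 224, 224, device=device)\n",
+ "with torch.no_grad():\n",
+ "    for _ in range(10):   # 워밍업(warm-up)\n",
+ "        model(dummy)\n",
+ "    start = time.time()\n",
+ "    for _ in range(100):\n",
+ "        model(dummy)\n",
+ "    # GPU에서 정밀하게 측정하려면 torch.cuda.synchronize() 호출이 추가로 필요할 수 있습니다.\n",
+ "    print('평균 지연 시간: %.1f ms' % ((time.time() - start) / 100 * 1000))\n",
+ "```"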
+ ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_resnest.ipynb b/assets/hub/pytorch_vision_resnest.ipynb new file mode 100644 index 000000000..fbe173384 --- /dev/null +++ b/assets/hub/pytorch_vision_resnest.ipynb @@ -0,0 +1,151 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "f5e0ac02", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ResNeSt\n", + "\n", + "*Author: Hang Zhang*\n", + "\n", + "**A new ResNet variant.**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f7a3782a", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "# 모델 목록 불러오기\n", + "torch.hub.list('zhanghang1989/ResNeSt', force_reload=True)\n", + "# 예시로 ResNeSt-50을 사용하여 사전 훈련된 모델을 가져오기\n", + "model = torch.hub.load('zhanghang1989/ResNeSt', 'resnest50', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "2fb02a70", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 동일한 방식으로 정규화된 입력 이미지를 요구합니다.\n", + "즉, `H`와 `W`가 최소 `224`의 크기를 가지는 `(3 x H x W)`형태의 3채널 RGB 이미지의 미니배치가 필요합니다. \n", + "이미지를 [0, 1] 범위로 불러온 다음 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]`를 이용하여 정규화해야 합니다.\n", + "\n", + "다음은 실행예시입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ef0b1f90", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "11d507e2", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행예시 (torchvision이 요구됩니다.)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # create a mini-batch as expected by the model\n", + "\n", + "# GPU 사용이 가능한 경우 속도를 위해 입력과 모델을 GPU로 이동\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# ImageNet의 1000개 클래스에 대한 신뢰도 점수를 가진 1000 형태의 Tensor 출력\n", + "print(output[0])\n", + "# 출력은 정규화되어있지 않습니다. 
소프트맥스를 실행하여 확률을 얻을 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7ed9923a", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a7f12eaa", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽어오기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지마다 상위 카테고리 5개 보여주기\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "6ece453d", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "ResNeSt 모델은 [ResNeSt: Split-Attention Networks](https://arxiv.org/pdf/2004.08955.pdf) 논문에서 제안되었습니다.\n", + "\n", + "최근 이미지 분류 모델이 계속 발전하고 있지만 객체 감지 및 의미 분할과 같은 대부분의 다운스트림 애플리케이션(downstream applications)은 간단하게 모듈화된 구조로 인해 여전히 ResNet 변형을 백본 네트워크(backbone network)로 사용합니다. 기능 맵 그룹 전반에 걸쳐 주의를 기울일 수 있는 Split-Attention 블록을 제시합니다. 이러한 Split-Attention 블록을 ResNet 스타일로 쌓아서 ResNeSt라고 하는 새로운 ResNet 변형을 얻습니다. ResNeSt 모델은 유사한 모델 복잡성을 가진 다른 네트워크보다 성능이 우수하며 객체 감지, 인스턴스 분할(instance segmentation) 및 의미 분할을 포함한 다운스트림 작업을 지원합니다.\n", + "\n", + "| | crop size | PyTorch |\n", + "|-------------|-----------|---------|\n", + "| ResNeSt-50 | 224 | 81.03 |\n", + "| ResNeSt-101 | 256 | 82.83 |\n", + "| ResNeSt-200 | 320 | 83.84 |\n", + "| ResNeSt-269 | 416 | 84.54 |\n", + "\n", + "### 참고문헌\n", + "\n", + " - [ResNeSt: Split-Attention Networks](https://arxiv.org/abs/2004.08955)." + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_resnet.ipynb b/assets/hub/pytorch_vision_resnet.ipynb new file mode 100644 index 000000000..f109f46f6 --- /dev/null +++ b/assets/hub/pytorch_vision_resnet.ipynb @@ -0,0 +1,156 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "470e17f0", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ResNet\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**Deep residual networks pre-trained on ImageNet**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8052cbd4", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet18', pretrained=True)\n", + "# or any of these variants\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet34', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet50', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet101', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet152', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "0758e90a", + "metadata": {}, + "source": [ + "All pre-trained models expect input images normalized in the same way,\n", + "i.e. 
mini-batches of 3-channel RGB images of shape `(3 x H x W)`, where `H` and `W` are expected to be at least `224`.\n", + "The images have to be loaded in to a range of `[0, 1]` and then normalized using `mean = [0.485, 0.456, 0.406]`\n", + "and `std = [0.229, 0.224, 0.225]`.\n", + "\n", + "Here's a sample execution." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0366f5b8", + "metadata": {}, + "outputs": [], + "source": [ + "# Download an example image from the pytorch website\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e39b302e", + "metadata": {}, + "outputs": [], + "source": [ + "# sample execution (requires torchvision)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # create a mini-batch as expected by the model\n", + "\n", + "# move the input and model to GPU for speed if available\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# Tensor of shape 1000, with confidence scores over Imagenet's 1000 classes\n", + "print(output[0])\n", + "# The output has unnormalized scores. 
To get probabilities, you can run a softmax on it.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8b441f01", + "metadata": {}, + "outputs": [], + "source": [ + "# Download ImageNet labels\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "616d05e2", + "metadata": {}, + "outputs": [], + "source": [ + "# Read the categories\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# Show top categories per image\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "da128355", + "metadata": {}, + "source": [ + "### Model Description\n", + "\n", + "Resnet models were proposed in \"Deep Residual Learning for Image Recognition\".\n", + "Here we have the 5 versions of resnet models, which contains 18, 34, 50, 101, 152 layers respectively.\n", + "Detailed model architectures can be found in Table 1.\n", + "Their 1-crop error rates on imagenet dataset with pretrained models are listed below.\n", + "\n", + "| Model structure | Top-1 error | Top-5 error |\n", + "| --------------- | ----------- | ----------- |\n", + "| resnet18 | 30.24 | 10.92 |\n", + "| resnet34 | 26.70 | 8.58 |\n", + "| resnet50 | 23.85 | 7.13 |\n", + "| resnet101 | 22.63 | 6.44 |\n", + "| resnet152 | 21.69 | 5.94 |\n", + "\n", + "### References\n", + "\n", + " - [Deep Residual Learning for Image Recognition](https://arxiv.org/abs/1512.03385)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_resnext.ipynb b/assets/hub/pytorch_vision_resnext.ipynb new file mode 100644 index 000000000..0c7a892ba --- /dev/null +++ b/assets/hub/pytorch_vision_resnext.ipynb @@ -0,0 +1,150 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "34493d17", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ResNext\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**Next generation ResNets, more efficient and accurate**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "78885be3", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'resnext50_32x4d', pretrained=True)\n", + "# or\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'resnext101_32x8d', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "2ca08831", + "metadata": {}, + "source": [ + "사전 훈련된 모델들을 사용할 때는 동일한 방식으로 정규화된 이미지를 입력으로 넣어야 합니다.\n", + "즉, 미니 배치(mini-batch)의 3-채널 RGB 이미지들은 `(3 x H x W)`의 형태를 가지며, 해당 `H`와 `W`는 최소 `224` 이상이어야 합니다.\n", + "각 이미지는 `[0, 1]`의 범위 내에서 불러와야 하며, `mean = [0.485, 0.456, 0.406]` 과 `std = [0.229, 0.224, 0.225]`을 이용해 정규화되어야 합니다.\n", + "다음은 실행 예제 입니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "59f830f3", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1be1b40a", + "metadata": {}, + "outputs": [], + "source": [ + "# 예시 코드 (torchvision 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 가정하는 대로 미니 배치 생성\n", + "\n", + "# gpu를 사용할 수 있다면, 속도를 위해 입력과 모델을 gpu로 옮김\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# output은 shape가 [1000]인 Tensor 자료형이며, 이는 ImageNet 데이터셋의 1000개의 각 클래스에 대한 모델의 확신도(confidence)를 나타냄.\n", + "print(output[0])\n", + "# output은 정규화되지 않았으므로, 확률화하기 위해 softmax 함수를 처리\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "086263f3", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 데이터셋 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7cce2074", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리(클래스) 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "\n", + "# 각 이미지에 대한 top 5 카테고리 출력\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "a2ebe364", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "Resnext 모델은 논문 [Aggregated Residual Transformations for Deep Neural Networks]에서 제안되었습니다. 
(https://arxiv.org/abs/1611.05431).\n", + "여기서는 50개의 계층과 101개의 계층을 가지는 2개의 resnet 모델을 제공하고 있습니다.\n", + "resnet50과 resnext50의 아키텍처 차이는 논문의 Table 1을 참고하십시오.\n", + "ImageNet 데이터셋에 대한 사전훈련된 모델의 에러(성능)은 아래 표와 같습니다.\n", + "\n", + "| 모델 구조 | Top-1 오류 | Top-5 오류 |\n", + "| ----------------- | ----------- | ----------- |\n", + "| resnext50_32x4d | 22.38 | 6.30 |\n", + "| resnext101_32x8d | 20.69 | 5.47 |\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Aggregated Residual Transformations for Deep Neural Networks](https://arxiv.org/abs/1611.05431)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_shufflenet_v2.ipynb b/assets/hub/pytorch_vision_shufflenet_v2.ipynb new file mode 100644 index 000000000..8cb812c40 --- /dev/null +++ b/assets/hub/pytorch_vision_shufflenet_v2.ipynb @@ -0,0 +1,146 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "adbc55eb", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# ShuffleNet v2\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**An efficient ConvNet optimized for speed and memory, pre-trained on Imagenet**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/shufflenet_v2_1.png) | ![alt](https://pytorch.org/assets/images/shufflenet_v2_2.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d13a7218", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'shufflenet_v2_x1_0', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "bb315d85", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 동일한 방식으로 정규화된 입력 이미지를 요구합니다.\n", + "즉, `H`와 `W`가 최소 `224`의 크기를 가지는 `(3 x H x W)`형태의 3채널 RGB 이미지의 미니배치가 필요합니다. \n", + "이미지를 [0, 1] 범위로 불러온 다음 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]`를 이용하여 정규화해야 합니다.\n", + "\n", + "다음은 실행예시입니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7b9b37c1", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "36a8e8e7", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행예시 (torchvision이 요구됩니다.)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 미니배치 생성\n", + "\n", + "# GPU 사용이 가능한 경우 속도를 위해 입력과 모델을 GPU로 이동\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# Imagnet의 1000개 클래스에 대한 신뢰도 점수를 가진 1000 형태의 텐서 출력\n", + "print(output[0])\n", + "# 출력은 정규화되어있지 않습니다. 소프트맥스를 실행하여 확률을 얻을 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5bc90acf", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b0fe9982", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽어오기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지마다 상위 카테고리 5개 보여주기\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "16ae89d1", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "이전에는 신경망 아키텍처 설계는 주로 FLOP와 같은 계산 복잡성의 간접 측정 기준에 따라 진행되었습니다. 그러나 속도와 같은 직접적인 측정은 메모리 액세스 비용 및 플랫폼 특성과 같은 다른 요소에도 의존합니다. 일련의 통제된 실험을 기반으로, 이 작업은 효율적인 네트워크 설계를 위한 몇 가지 실용적인 지침을 도출합니다. 따라서 ShuffleNet V2라는 새로운 아키텍처가 제시됩니다. 
조건 변화에 따른 모델 평가를 통해 속도와 정확도 트레이드오프 측면에서 최고 수준임을 확인했습니다.\n", + "\n", + "| Model structure | Top-1 error | Top-5 error |\n", + "| --------------- | ----------- | ----------- |\n", + "| shufflenet_v2 | 30.64 | 11.68 |\n", + "\n", + "\n", + "### 참고문헌\n", + "\n", + " - [ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design](https://arxiv.org/abs/1807.11164)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_squeezenet.ipynb b/assets/hub/pytorch_vision_squeezenet.ipynb new file mode 100644 index 000000000..d15b19d70 --- /dev/null +++ b/assets/hub/pytorch_vision_squeezenet.ipynb @@ -0,0 +1,152 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "58a0f1c1", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# SqueezeNet\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**Alexnet-level accuracy with 50x fewer parameters.**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "57a72350", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'squeezenet1_0', pretrained=True)\n", + "# 또는\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'squeezenet1_1', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "3969d19e", + "metadata": {}, + "source": [ + "사전에 훈련된 모델은 모두 같은 방식으로 정규화(normalize)한 이미지를 입력으로 받습니다.\n", + "\n", + "예를 들어, `(3 x H x W)` 포맷의 3채널 rgb 이미지들의 미니 배치의 경우 H 와 W 의 크기는 224 이상이어야 합니다.\n", + "이 때 모든 픽셀들은 0과 1 사이의 값을 가지도록 변환한 이후 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]` 로 정규화해야 합니다.\n", + "\n", + "실행 예제는 아래와 같습니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d33d07f8", + "metadata": {}, + "outputs": [], + "source": [ + "# pytorch에서 웹사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "762b1858", + "metadata": {}, + "outputs": [], + "source": [ + "# 예제 (토치비전 필요)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 형식인 mini batch 형태로 변환\n", + "\n", + "# 빠르게 실행하기 위해 가능한 경우 model 과 input image 를 gpu 를 사용하도록 설정\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# ImageNet 1000개 범주에 대한 신뢰 점수를 나타내는 텐서 반환\n", + "print(output[0])\n", + "# 해당 신뢰 점수는 softmax를 취해 확률값으로 변환가능합니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "432e9d13", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 라벨 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0a3b4ad6", + "metadata": {}, + "outputs": [], + "source": [ + "# 범주 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지 별로 확률값이 가장 높은 범주 출력\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "ec011051", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "`squeezenet1_0` 모델은 [SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size](https://arxiv.org/pdf/1602.07360.pdf) 논문을 구현한 것입니다.\n", + "\n", + "`squeezenet1_1` 모델은 [official squeezenet repo](https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1) 에서 왔습니다.\n", + "`squeezenet1_0` 수준의 정확도를 유지하며 2.4배 계산이 덜 필요하고, `squeezenet1_0`보다 매개변수의 수가 적습니다.\n", + "\n", + "ImageNet 데이터셋 기준으로 훈련된 모델들의 1-crop 에러율은 아래와 같습니다.\n", + "\n", + "| 모델 | Top-1 에러 | Top-5 에러 |\n", + "| --------------- | ----------- | ----------- |\n", + "| squeezenet1_0 | 41.90 | 19.58 |\n", + "| squeezenet1_1 | 41.81 | 19.38 |\n", + "\n", + "### 참조\n", + "\n", + " - [Squeezenet: Alexnet-level accuracy with 50x fewer parameters and <0.5MB model size](https://arxiv.org/pdf/1602.07360.pdf)." 
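+ "\n",
+ "본문에서 언급한 두 버전의 매개변수 수 차이는 아래와 같이 직접 세어 확인해볼 수 있습니다 (간단한 예시).\n",
+ "\n",
+ "```python\n",
+ "import torch\n",
+ "m0 = torch.hub.load('pytorch/vision:v0.10.0', 'squeezenet1_0', pretrained=True)\n",
+ "m1 = torch.hub.load('pytorch/vision:v0.10.0', 'squeezenet1_1', pretrained=True)\n",
+ "# 각 모델의 전체 매개변수 수를 비교합니다.\n",
+ "print('squeezenet1_0:', sum(p.numel() for p in m0.parameters()))\n",
+ "print('squeezenet1_1:', sum(p.numel() for p in m1.parameters()))\n",
+ "```"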
+ ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_vgg.ipynb b/assets/hub/pytorch_vision_vgg.ipynb new file mode 100644 index 000000000..b52d206cb --- /dev/null +++ b/assets/hub/pytorch_vision_vgg.ipynb @@ -0,0 +1,163 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "7d13fd06", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# vgg-nets\n", + "\n", + "*Author: Pytorch Team*\n", + "\n", + "**Award winning ConvNets from 2014 Imagenet ILSVRC challenge**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e6c8278f", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg11', pretrained=True)\n", + "# 추가로 아래와 같이 변형된 구조의 모델들이 있습니다\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg11_bn', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg13', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg13_bn', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg16', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg16_bn', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg19', pretrained=True)\n", + "# model = torch.hub.load('pytorch/vision:v0.10.0', 'vgg19_bn', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "a44bcda2", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 훈련때와 같은 방식으로 정규화된 입력 이미지를 주어야합니다.\n", + "즉, `(3 x H x W)` 모양의 3채널 RGB 이미지의 미니배치에서 `H`와 `W`는 최소 `224`가 될 것으로 예상됩니다.\n", + "이미지는 `[0, 1]` 범위로 로드한 다음(RGB 채널마다 0~255값으로 표현되므로 이미지를 255로 나눔) `mean = [0.485, 0.456, 0.406]`과 `std = [0.229, 0.224, 0.225]` 값을 사용하여 정규화해야 합니다.\n", + "\n", + "다음은 샘플 실행입니다." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "405e695d", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹사이트에서 예제 이미지를 다운로드 합니다\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "43b98639", + "metadata": {}, + "outputs": [], + "source": [ + "# 샘플 실행 (torchvision이 필요합니다)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델의 입력값에 맞춘 미니 배치 생성\n", + "\n", + "# 가능하면 속도를 위해서 입력과 모델을 GPU로 이동 합니다\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# Imagenet의 1000개 클래스에 대한 신뢰도 점수가 있는 1000개의 Tensor입니다.\n", + "print(output[0])\n", + "# 출력에 정규화되지 않은 점수가 있습니다. 확률을 얻으려면 소프트맥스를 실행할 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1780c8c8", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 라벨 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6a2a4ab1", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# Show top categories per image\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "2a3e87dd", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "각 구성 및 bachnorm 버전에 대해서 [Very Deep Convolutional Networks for Large-Scale Image Recognition](https://arxiv.org/abs/1409.1556)에서 제안한 모델에 대한 구현이 있습니다.\n", + "\n", + "예를 들어, 논문에 제시된 구성 `A`는 `vgg11`, `B`는 `vgg13`, `D`는 `vgg16`, `E`는 `vgg19`입니다.\n", + "batchnorm 버전은 `_bn`이 접미사로 붙어있습니다.\n", + "\n", + "사전 훈련된 모델이 있는 imagenet 데이터 세트의 1-crop 오류율은 아래에 나열되어 있습니다.\n", + "\n", + "| Model structure | Top-1 error | Top-5 error |\n", + "| --------------- | ----------- | ----------- |\n", + "| vgg11 | 30.98 | 11.37 |\n", + "| vgg11_bn | 26.70 | 8.58 |\n", + "| vgg13 | 30.07 | 10.75 |\n", + "| vgg13_bn | 28.45 | 9.63 |\n", + "| vgg16 | 28.41 | 9.62 |\n", + "| vgg16_bn | 26.63 | 8.50 |\n", + "| vgg19 | 27.62 | 9.12 |\n", + "| vgg19_bn | 25.76 | 8.15 |\n", + "\n", + "### 참조\n", + "\n", + "- [Very Deep Convolutional Networks for Large-Scale Image Recognition](https://arxiv.org/abs/1409.1556)." 
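+ "\n",
+ "참고로, torchvision의 VGG 모델은 합성곱 부분(`features`)과 완전 연결 부분(`classifier`)으로 나뉘어 있어 특징 추출기(feature extractor)로도 자주 사용됩니다. 아래는 위에서 만든 `input_batch`를 그대로 사용한다고 가정한 간단한 예시입니다.\n",
+ "\n",
+ "```python\n",
+ "with torch.no_grad():\n",
+ "    # 완전 연결 계층을 거치기 전의 합성곱 특징 맵을 얻습니다.\n",
+ "    features = model.features(input_batch)\n",
+ "print(features.shape)  # 224x224 입력 기준 예: torch.Size([1, 512, 7, 7])\n",
+ "```"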
+ ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/pytorch_vision_wide_resnet.ipynb b/assets/hub/pytorch_vision_wide_resnet.ipynb new file mode 100644 index 000000000..812330aef --- /dev/null +++ b/assets/hub/pytorch_vision_wide_resnet.ipynb @@ -0,0 +1,154 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "f9ae7374", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Wide ResNet\n", + "\n", + "*Author: Sergey Zagoruyko*\n", + "\n", + "**Wide Residual Networks**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ff727f52", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "# load WRN-50-2:\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'wide_resnet50_2', pretrained=True)\n", + "# or WRN-101-2\n", + "model = torch.hub.load('pytorch/vision:v0.10.0', 'wide_resnet101_2', pretrained=True)\n", + "model.eval()" + ] + }, + { + "cell_type": "markdown", + "id": "c752d116", + "metadata": {}, + "source": [ + "모든 사전 훈련된 모델은 동일한 방식으로 정규화된 입력 이미지를 요구합니다.\n", + "즉, `H`와 `W`가 최소 `224`의 크기를 가지는 `(3 x H x W)`형태의 3채널 RGB 이미지의 미니배치가 필요합니다. \n", + "이미지를 [0, 1] 범위로 불러온 다음 `mean = [0.485, 0.456, 0.406]`, `std = [0.229, 0.224, 0.225]`를 이용하여 정규화해야 합니다.\n", + "\n", + "다음은 실행예시입니다." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "60ed1a5a", + "metadata": {}, + "outputs": [], + "source": [ + "# 파이토치 웹 사이트에서 예제 이미지 다운로드\n", + "import urllib\n", + "url, filename = (\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n", + "try: urllib.URLopener().retrieve(url, filename)\n", + "except: urllib.request.urlretrieve(url, filename)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "29eb9f18", + "metadata": {}, + "outputs": [], + "source": [ + "# 실행예시 (torchvision이 요구됩니다.)\n", + "from PIL import Image\n", + "from torchvision import transforms\n", + "input_image = Image.open(filename)\n", + "preprocess = transforms.Compose([\n", + " transforms.Resize(256),\n", + " transforms.CenterCrop(224),\n", + " transforms.ToTensor(),\n", + " transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n", + "])\n", + "input_tensor = preprocess(input_image)\n", + "input_batch = input_tensor.unsqueeze(0) # 모델에서 요구하는 미니배치 생성\n", + "\n", + "# GPU 사용이 가능한 경우 속도를 위해 입력과 모델을 GPU로 이동\n", + "if torch.cuda.is_available():\n", + " input_batch = input_batch.to('cuda')\n", + " model.to('cuda')\n", + "\n", + "with torch.no_grad():\n", + " output = model(input_batch)\n", + "# ImageNet 1000개 클래스에 대한 신뢰도 점수를 가진 1000 형태의 Tensor 출력\n", + "print(output[0])\n", + "# 출력은 정규화되어있지 않습니다. 
소프트맥스를 실행하여 확률을 얻을 수 있습니다.\n", + "probabilities = torch.nn.functional.softmax(output[0], dim=0)\n", + "print(probabilities)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7febdd7c", + "metadata": {}, + "outputs": [], + "source": [ + "# ImageNet 레이블 다운로드\n", + "!wget https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a92511f2", + "metadata": {}, + "outputs": [], + "source": [ + "# 카테고리 읽어오기\n", + "with open(\"imagenet_classes.txt\", \"r\") as f:\n", + " categories = [s.strip() for s in f.readlines()]\n", + "# 이미지마다 상위 카테고리 5개 보여주기\n", + "top5_prob, top5_catid = torch.topk(probabilities, 5)\n", + "for i in range(top5_prob.size(0)):\n", + " print(categories[top5_catid[i]], top5_prob[i].item())" + ] + }, + { + "cell_type": "markdown", + "id": "19ecc9ca", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "Wide Residual 네트워크는 ResNet에 비해 단순히 채널 수가 증가했습니다. \n", + "이외의 아키텍처는 ResNet과 동일합니다. \n", + "병목(bottleneck) 블록이 있는 심층 ImageNet 모델은 내부 3x3 합성곱 채널 수를 증가 시켰습니다.\n", + "\n", + "`wide_resnet50_2` 및 `wide_resnet101_2` 모델은 [Warm Restarts가 있는 SGD(SGDR)](https://arxiv.org/abs/1608.03983)를 사용하여 혼합 정밀도(Mixed Precision) 방식으로 학습되었습니다.\n", + "체크 포인트는 크기가 작은 경우 절반 정밀도(batch norm 제외)의 가중치를 가지며 FP32 모델에서도 사용할 수 있습니다.\n", + "| Model structure | Top-1 error | Top-5 error | # parameters |\n", + "| ----------------- | :---------: | :---------: | :----------: |\n", + "| wide_resnet50_2 | 21.49 | 5.91 | 68.9M |\n", + "| wide_resnet101_2 | 21.16 | 5.72 | 126.9M |\n", + "\n", + "### 참고문헌\n", + "\n", + " - [Wide Residual Networks](https://arxiv.org/abs/1605.07146)\n", + " - [Deep Residual Learning for Image Recognition](https://arxiv.org/abs/1512.03385)\n", + " - [Mixed Precision Training](https://arxiv.org/abs/1710.03740)\n", + " - [SGDR: Stochastic Gradient Descent with Warm Restarts](https://arxiv.org/abs/1608.03983)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/sigsep_open-unmix-pytorch_umx.ipynb b/assets/hub/sigsep_open-unmix-pytorch_umx.ipynb new file mode 100644 index 000000000..6639a10ed --- /dev/null +++ b/assets/hub/sigsep_open-unmix-pytorch_umx.ipynb @@ -0,0 +1,120 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "2a4ab6e8", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Open-Unmix\n", + "\n", + "*Author: Inria*\n", + "\n", + "**Reference implementation for music source separation**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "911eb870", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "# assuming you have a PyTorch >=1.6.0 installed\n", + "pip install -q torchaudio" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4e38fd80", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "# loading umxhq four target separator\n", + "separator = torch.hub.load('sigsep/open-unmix-pytorch', 'umxhq')\n", + "\n", + "# generate random audio\n", + "# ... with shape (nb_samples, nb_channels, nb_timesteps)\n", + "# ... 
and with the same sample rate as that of the separator\n", + "audio = torch.rand((1, 2, 100000))\n", + "original_sample_rate = separator.sample_rate\n", + "\n", + "# make sure to resample the audio to models' sample rate, separator.sample_rate, if the two are different\n", + "# resampler = torchaudio.transforms.Resample(original_sample_rate, separator.sample_rate)\n", + "# audio = resampler(audio)\n", + "\n", + "estimates = separator(audio)\n", + "# estimates.shape = (1, 4, 2, 100000)" + ] + }, + { + "cell_type": "markdown", + "id": "6c750c6d", + "metadata": {}, + "source": [ + "### Model Description\n", + "\n", + "__Open-Unmix__ provides ready-to-use models that allow users to separate pop music into four stems: __vocals__, __drums__, __bass__ and the remaining __other__ instruments. The models were pre-trained on the freely available [MUSDB18](https://sigsep.github.io/datasets/musdb.html) dataset.\n", + "\n", + "Each target model is based on a three-layer bidirectional deep LSTM. The model learns to predict the magnitude spectrogram of a target source, like vocals, from the magnitude spectrogram of a mixture input. Internally, the prediction is obtained by applying a mask on the input. The model is optimized in the magnitude domain using mean squared error.\n", + "\n", + "A `Separator` meta-model (as shown in the code example above) puts together multiple _Open-unmix_ spectrogram models for each desired target, and combines their output through a multichannel generalized Wiener filter, before application of inverse STFTs using `torchaudio`.\n", + "The filtering is differentiable (but parameter-free) version of [norbert](https://github.com/sigsep/norbert).\n", + "\n", + "### Pre-trained `Separator` models\n", + "\n", + "* __`umxhq` (default)__ trained on [MUSDB18-HQ](https://sigsep.github.io/datasets/musdb.html#uncompressed-wav) which comprises the same tracks as in MUSDB18 but un-compressed which yield in a full bandwidth of 22050 Hz.\n", + "\n", + "* __`umx`__ is trained on the regular [MUSDB18](https://sigsep.github.io/datasets/musdb.html#compressed-stems) which is bandwidth limited to 16 kHz due to AAC compression. 
This model should be used for comparison with other (older) methods for evaluation in [SiSEC18](sisec18.unmix.app).\n", + "\n", + "Furthermore, we provide a model for speech enhancement trained by [Sony Corporation](link)\n", + "\n", + "* __`umxse`__ speech enhancement model is trained on the 28-speaker version of the [Voicebank+DEMAND corpus](https://datashare.is.ed.ac.uk/handle/10283/1942?show=full).\n", + "\n", + "All three models are also available as spectrogram (core) models, which take magnitude spectrogram inputs and ouput separated spectrograms.\n", + "These models can be loaded using `umxhq_spec`, `umx_spec` and `umxse_spec`.\n", + "\n", + "### Details\n", + "\n", + "For additional examples, documentation and usage examples, please visit this [the github repo](https://github.com/sigsep/open-unmix-pytorch).\n", + "\n", + "Furthermore, the models and all utility function to preprocess, read and save audio stems, are available in a python package that can be installed via" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0bf79aa4", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install openunmix" + ] + }, + { + "cell_type": "markdown", + "id": "6e790d91", + "metadata": {}, + "source": [ + "### References\n", + "\n", + "- [Open-Unmix - A Reference Implementation for Music Source Separation](https://doi.org/10.21105/joss.01667)\n", + "- [SigSep - Open Ressources for Music Separation](https://sigsep.github.io/)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/snakers4_silero-models_stt.ipynb b/assets/hub/snakers4_silero-models_stt.ipynb new file mode 100644 index 000000000..c16c1e3b9 --- /dev/null +++ b/assets/hub/snakers4_silero-models_stt.ipynb @@ -0,0 +1,107 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "44733657", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Silero Speech-To-Text Models\n", + "\n", + "*Author: Silero AI Team*\n", + "\n", + "**A set of compact enterprise-grade pre-trained STT Models for multiple languages.**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/silero_stt_model.jpg) | ![alt](https://pytorch.org/assets/images/silero_imagenet_moment.png)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ae6dc344", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "# PyTorch의 적절한 버전이 이미 설치되어 있다고 가정합니다.\n", + "pip install -q torchaudio omegaconf soundfile" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "afbf3f52", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "import zipfile\n", + "import torchaudio\n", + "from glob import glob\n", + "\n", + "device = torch.device('cpu') # gpu에서도 잘 돌아가지만, cpu에서도 충분히 빠릅니다.\n", + "\n", + "model, decoder, utils = torch.hub.load(repo_or_dir='snakers4/silero-models',\n", + " model='silero_stt',\n", + " language='en', # 'de', 'es'도 사용 가능\n", + " device=device)\n", + "(read_batch, split_into_batches,\n", + " read_audio, prepare_model_input) = utils # 자세한 내용은 함수 시그니처(function signature)를 참조하세요.\n", + "\n", + "# TorchAudio와 호환되는 형식(사운드 파일 백엔드)중 하나의 파일 다운로드\n", + 
"torch.hub.download_url_to_file('https://opus-codec.org/static/examples/samples/speech_orig.wav',\n", + " dst ='speech_orig.wav', progress=True)\n", + "test_files = glob('speech_orig.wav')\n", + "batches = split_into_batches(test_files, batch_size=10)\n", + "input = prepare_model_input(read_batch(batches[0]),\n", + " device=device)\n", + "\n", + "output = model(input)\n", + "for example in output:\n", + " print(decoder(example.cpu()))" + ] + }, + { + "cell_type": "markdown", + "id": "8557ee0f", + "metadata": {}, + "source": [ + "### 모델 설명\n", + "\n", + "Silero Speech-To-Text 모델은 일반적으로 사용되는 여러 언어에 대해 소형 폼 팩터 형태로 엔터프라이즈급 STT를 제공합니다. 기존 ASR 모델과 달리 다양한 방언, 코덱, 도메인, 노이즈, 낮은 샘플링 속도에 강인합니다(단순화를 위해 오디오는 16kHz로 다시 샘플링해야 함). 모델은 샘플 형태의 정규화된 오디오(즉, [-1, 1] 범위로의 정규화를 제외한 어떤 전처리 없이)와 토큰 확률이 있는 출력 프레임을 사용합니다. 단순화를 위해 디코더 도구를 제공합니다. 모델 자체에 포함할 수 있지만 자막이 결합된 모듈은, 특정한 내보내기 상황에서 레이블같은 모델의 생성물을 저장할 때 문제가 있었습니다.\n", + "\n", + "Speech에서 Open-STT와 Silero Models에 대한 노력이 ImageNet 같은 순간에 다가가길 바랍니다.\n", + "\n", + "### 지원되는 언어 및 형식\n", + "\n", + "지원되는 언어는 다음과 같습니다.\n", + "\n", + "- English\n", + "- German\n", + "- Spanish\n", + "\n", + "항상 최신 지원 언어 목록을 보려면 [repo](https://github.com/snakers4/silero-models)를 방문하여 사용 가능한 체크포인트에 대한 `yml` [file](https://github.com/snakers4/silero-models/blob/master/models.yml)을 확인하십시오 .\n", + "To see the always up-to-date language list, please visit our [repo](https://github.com/snakers4/silero-models) and see the `yml` [file](https://github.com/snakers4/silero-models/blob/master/models.yml) for all available checkpoints.\n", + "\n", + "### 추가 예제 및 벤치마크\n", + "\n", + "추가 예제 및 기타 모델 형식을 보려면 이 [link](https://github.com/snakers4/silero-models)를 방문하십시오. 품질 및 성능 벤치마크는 [wiki](https://github.com/snakers4/silero-models/wiki)를 참조하십시오. 관련 자료는 수시로 업데이트됩니다.\n", + "\n", + "### 참고문헌\n", + "\n", + "- [Silero Models](https://github.com/snakers4/silero-models)\n", + "- [Alexander Veysov, \"Toward's an ImageNet Moment for Speech-to-Text\", The Gradient, 2020](https://thegradient.pub/towards-an-imagenet-moment-for-speech-to-text/)\n", + "- [Alexander Veysov, \"A Speech-To-Text Practitioner’s Criticisms of Industry and Academia\", The Gradient, 2020](https://thegradient.pub/a-speech-to-text-practitioners-criticisms-of-industry-and-academia/)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/snakers4_silero-models_tts.ipynb b/assets/hub/snakers4_silero-models_tts.ipynb new file mode 100644 index 000000000..9ce7bd203 --- /dev/null +++ b/assets/hub/snakers4_silero-models_tts.ipynb @@ -0,0 +1,99 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "67211164", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Silero Text-To-Speech Models\n", + "\n", + "*Author: Silero AI Team*\n", + "\n", + "**A set of compact enterprise-grade pre-trained TTS Models for multiple languages**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "51ba364d", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "# this assumes that you have a proper version of PyTorch already installed\n", + "pip install -q torchaudio omegaconf" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cfdc1330", + 
"metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "language = 'en'\n", + "speaker = 'lj_16khz'\n", + "device = torch.device('cpu')\n", + "model, symbols, sample_rate, example_text, apply_tts = torch.hub.load(repo_or_dir='snakers4/silero-models',\n", + " model='silero_tts',\n", + " language=language,\n", + " speaker=speaker)\n", + "model = model.to(device) # gpu or cpu\n", + "audio = apply_tts(texts=[example_text],\n", + " model=model,\n", + " sample_rate=sample_rate,\n", + " symbols=symbols,\n", + " device=device)" + ] + }, + { + "cell_type": "markdown", + "id": "9a20fe22", + "metadata": {}, + "source": [ + "### Model Description\n", + "\n", + "Silero Text-To-Speech models provide enterprise grade TTS in a compact form-factor for several commonly spoken languages:\n", + "\n", + "- One-line usage\n", + "- Naturally sounding speech\n", + "- No GPU or training required\n", + "- Minimalism and lack of dependencies\n", + "- A library of voices in many languages\n", + "- Support for `16kHz` and `8kHz` out of the box\n", + "- High throughput on slow hardware. Decent performance on one CPU thread\n", + "\n", + "### Supported Languages and Formats\n", + "\n", + "As of this page update, the speakers of the following languages are supported both in 8 kHz and 16 kHz:\n", + "\n", + "- Russian (6 speakers)\n", + "- English (1 speaker)\n", + "- German (1 speaker)\n", + "- Spanish (1 speaker)\n", + "- French (1 speaker)\n", + "\n", + "To see the always up-to-date language list, please visit our [repo](https://github.com/snakers4/silero-models) and see the `yml` [file](https://github.com/snakers4/silero-models/blob/master/models.yml) for all available checkpoints.\n", + "\n", + "### Additional Examples and Benchmarks\n", + "\n", + "For additional examples and other model formats please visit this [link](https://github.com/snakers4/silero-models). For quality and performance benchmarks please see the [wiki](https://github.com/snakers4/silero-models/wiki). 
These resources will be updated from time to time.\n", + "\n", + "### References\n", + "\n", + "- [Silero Models](https://github.com/snakers4/silero-models)\n", + "- [High-Quality Speech-to-Text Made Accessible, Simple and Fast](https://habr.com/ru/post/549482/)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/snakers4_silero-vad_language.ipynb b/assets/hub/snakers4_silero-vad_language.ipynb new file mode 100644 index 000000000..f55efa514 --- /dev/null +++ b/assets/hub/snakers4_silero-vad_language.ipynb @@ -0,0 +1,89 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "9dec7534", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Silero Language Classifier\n", + "\n", + "*Author: Silero AI Team*\n", + "\n", + "**Pre-trained Spoken Language Classifier**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "76ff27d3", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "# this assumes that you have a proper version of PyTorch already installed\n", + "pip install -q torchaudio soundfile" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c39a793b", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "torch.set_num_threads(1)\n", + "from pprint import pprint\n", + "# download example\n", + "torch.hub.download_url_to_file('https://models.silero.ai/vad_models/de.wav', 'de_example.wav')\n", + "\n", + "model, utils = torch.hub.load(repo_or_dir='snakers4/silero-vad',\n", + " model='silero_lang_detector',\n", + " force_reload=True)\n", + "\n", + "get_language, read_audio, *_ = utils\n", + "\n", + "files_dir = torch.hub.get_dir() + '/snakers4_silero-vad_master/files'\n", + "\n", + "wav = read_audio('de_example.wav')\n", + "language = get_language(wav, model)\n", + "\n", + "pprint(language)" + ] + }, + { + "cell_type": "markdown", + "id": "cf75ec58", + "metadata": {}, + "source": [ + "### Model Description\n", + "\n", + "Silero VAD: pre-trained enterprise-grade Voice Activity Detector (VAD), Number Detector and Language Classifier (95 languages). Enterprise-grade Speech Products made refreshingly simple (see our STT models). **Each model is published separately**.\n", + "\n", + "Currently, there are hardly any high quality / modern / free / public voice activity detectors except for WebRTC Voice Activity Detector (link). WebRTC though starts to show its age and it suffers from many false positives.\n", + "\n", + "**(!!!) Important Notice (!!!)** - the models are intended to run on CPU only and were optimized for performance on 1 CPU thread. 
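Returning to the classification cell above, a short sketch (not part of the original notebook) that reuses the already-loaded `read_audio`, `get_language` and `model` on a second clip. The English example URL is borrowed from the Silero VAD notebook elsewhere in this change and is an assumption here:

```python
# Hedged sketch: classify a second sample with the utilities loaded above.
torch.hub.download_url_to_file('https://models.silero.ai/vad_models/en.wav', 'en_example.wav')

wav_en = read_audio('en_example.wav')
pprint(get_language(wav_en, model))   # compare with the German example above
```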
Note that the model is quantized.\n", + "\n", + "### Additional Examples and Benchmarks\n", + "\n", + "For additional examples and other model formats please visit this [link](https://github.com/snakers4/silero-vad) and please refer to the extensive examples in the Colab format (including the streaming examples).\n", + "\n", + "### References\n", + "\n", + "Language classifier model architecture is based on similar STT architectures.\n", + "\n", + "- [Silero VAD](https://github.com/snakers4/silero-vad)\n", + "- [Alexander Veysov, \"Toward's an ImageNet Moment for Speech-to-Text\", The Gradient, 2020](https://thegradient.pub/towards-an-imagenet-moment-for-speech-to-text/)\n", + "- [Alexander Veysov, \"A Speech-To-Text Practitioner’s Criticisms of Industry and Academia\", The Gradient, 2020](https://thegradient.pub/a-speech-to-text-practitioners-criticisms-of-industry-and-academia/)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/snakers4_silero-vad_number.ipynb b/assets/hub/snakers4_silero-vad_number.ipynb new file mode 100644 index 000000000..539b354c7 --- /dev/null +++ b/assets/hub/snakers4_silero-vad_number.ipynb @@ -0,0 +1,106 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "4f3f6442", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Silero Number Detector\n", + "\n", + "*Author: Silero AI Team*\n", + "\n", + "**Pre-trained Spoken Number Detector**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3ccc16d4", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "# this assumes that you have a proper version of PyTorch already installed\n", + "pip install -q torchaudio soundfile" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c07d630e", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "torch.set_num_threads(1)\n", + "from pprint import pprint\n", + "torch.hub.download_url_to_file('https://models.silero.ai/vad_models/en_num.wav', 'en_number_example.wav')\n", + "\n", + "model, utils = torch.hub.load(repo_or_dir='snakers4/silero-vad',\n", + " model='silero_number_detector',\n", + " force_reload=True)\n", + "\n", + "(get_number_ts,\n", + " _, read_audio,\n", + " *_) = utils\n", + "\n", + "files_dir = torch.hub.get_dir() + '/snakers4_silero-vad_master/files'\n", + "\n", + "wav = read_audio(f'en_number_example.wav')\n", + "# full audio\n", + "# get number timestamps from full audio file\n", + "number_timestamps = get_number_ts(wav, model)\n", + "\n", + "pprint(number_timestamps)" + ] + }, + { + "cell_type": "markdown", + "id": "af1c86d4", + "metadata": {}, + "source": [ + "### Model Description\n", + "\n", + "Silero VAD: pre-trained enterprise-grade Voice Activity Detector (VAD), Number Detector and Language Classifier. Enterprise-grade Speech Products made refreshingly simple (see our STT models). **Each model is published separately**.\n", + "\n", + "Currently, there are hardly any high quality / modern / free / public voice activity detectors except for WebRTC Voice Activity Detector (link). 
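Before continuing, a hedged sketch of interpreting the `number_timestamps` printed above. The assumptions (not stated in this notebook) are that each entry is a dict with integer `start`/`end` sample offsets, as in the Silero VAD examples, and that `read_audio` returned 16 kHz audio; check the `pprint` output before relying on this:

```python
# Hedged sketch: convert the detected number spans from samples to seconds.
SAMPLE_RATE = 16000  # assumed sampling rate of the model input

for ts in number_timestamps:
    start_s = ts['start'] / SAMPLE_RATE
    end_s = ts['end'] / SAMPLE_RATE
    print(f"spoken number between {start_s:.2f}s and {end_s:.2f}s")
```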
WebRTC though starts to show its age and it suffers from many false positives.\n", + "\n", + "Also in some cases it is crucial to be able to anonymize large-scale spoken corpora (i.e. remove personal data). Typically personal data is considered to be private / sensitive if it contains (i) a name (ii) some private ID. Name recognition is a highly subjective matter and it depends on locale and business case, but Voice Activity and Number Detection are quite general tasks.\n", + "\n", + "**(!!!) Important Notice (!!!)** - the models are intended to run on CPU only and were optimized for performance on 1 CPU thread. Note that the model is quantized.\n", + "\n", + "\n", + "### Supported Languages\n", + "\n", + "As of this page update, the following languages are supported:\n", + "\n", + "- Russian\n", + "- English\n", + "- German\n", + "- Spanish\n", + "\n", + "To see the always up-to-date language list, please visit our [repo](https://github.com/snakers4/silero-vad).\n", + "\n", + "### Additional Examples and Benchmarks\n", + "\n", + "For additional examples and other model formats please visit this [link](https://github.com/snakers4/silero-vad) and please refer to the extensive examples in the Colab format (including the streaming examples).\n", + "\n", + "### References\n", + "\n", + "Number detector model architecture is based on similar STT architectures.\n", + "\n", + "- [Silero VAD](https://github.com/snakers4/silero-vad)\n", + "- [Alexander Veysov, \"Toward's an ImageNet Moment for Speech-to-Text\", The Gradient, 2020](https://thegradient.pub/towards-an-imagenet-moment-for-speech-to-text/)\n", + "- [Alexander Veysov, \"A Speech-To-Text Practitioner’s Criticisms of Industry and Academia\", The Gradient, 2020](https://thegradient.pub/a-speech-to-text-practitioners-criticisms-of-industry-and-academia/)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/snakers4_silero-vad_vad.ipynb b/assets/hub/snakers4_silero-vad_vad.ipynb new file mode 100644 index 000000000..ba4011c6a --- /dev/null +++ b/assets/hub/snakers4_silero-vad_vad.ipynb @@ -0,0 +1,95 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "2b8cdbc2", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# Silero Voice Activity Detector\n", + "\n", + "*Author: Silero AI Team*\n", + "\n", + "**Pre-trained Voice Activity Detector**\n", + "\n", + "\"alt\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "88abb568", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "# this assumes that you have a proper version of PyTorch already installed\n", + "pip install -q torchaudio" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1e6d453d", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "torch.set_num_threads(1)\n", + "\n", + "from IPython.display import Audio\n", + "from pprint import pprint\n", + "# download example\n", + "torch.hub.download_url_to_file('https://models.silero.ai/vad_models/en.wav', 'en_example.wav')\n", + "\n", + "model, utils = torch.hub.load(repo_or_dir='snakers4/silero-vad',\n", + " model='silero_vad',\n", + " force_reload=True)\n", + "\n", + 
"(get_speech_timestamps,\n", + " _, read_audio,\n", + " *_) = utils\n", + "\n", + "sampling_rate = 16000 # also accepts 8000\n", + "wav = read_audio('en_example.wav', sampling_rate=sampling_rate)\n", + "# get speech timestamps from full audio file\n", + "speech_timestamps = get_speech_timestamps(wav, model, sampling_rate=sampling_rate)\n", + "pprint(speech_timestamps)" + ] + }, + { + "cell_type": "markdown", + "id": "2d2d4994", + "metadata": {}, + "source": [ + "### Model Description\n", + "\n", + "Silero VAD: pre-trained enterprise-grade Voice Activity Detector (VAD). Enterprise-grade Speech Products made refreshingly simple (see our STT models). **Each model is published separately**.\n", + "\n", + "Currently, there are hardly any high quality / modern / free / public voice activity detectors except for WebRTC Voice Activity Detector (link). WebRTC though starts to show its age and it suffers from many false positives.\n", + "\n", + "**(!!!) Important Notice (!!!)** - the models are intended to run on CPU only and were optimized for performance on 1 CPU thread. Note that the model is quantized.\n", + "\n", + "\n", + "### Additional Examples and Benchmarks\n", + "\n", + "For additional examples and other model formats please visit this [link](https://github.com/snakers4/silero-vad) and please refer to the extensive examples in the Colab format (including the streaming examples).\n", + "\n", + "### References\n", + "\n", + "VAD model architectures are based on similar STT architectures.\n", + "\n", + "- [Silero VAD](https://github.com/snakers4/silero-vad)\n", + "- [Alexander Veysov, \"Toward's an ImageNet Moment for Speech-to-Text\", The Gradient, 2020](https://thegradient.pub/towards-an-imagenet-moment-for-speech-to-text/)\n", + "- [Alexander Veysov, \"A Speech-To-Text Practitioner’s Criticisms of Industry and Academia\", The Gradient, 2020](https://thegradient.pub/a-speech-to-text-practitioners-criticisms-of-industry-and-academia/)" + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/hub/ultralytics_yolov5.ipynb b/assets/hub/ultralytics_yolov5.ipynb new file mode 100644 index 000000000..e6164f4b1 --- /dev/null +++ b/assets/hub/ultralytics_yolov5.ipynb @@ -0,0 +1,142 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "b2239840", + "metadata": {}, + "source": [ + "### This notebook is optionally accelerated with a GPU runtime.\n", + "### If you would like to use this acceleration, please select the menu option \"Runtime\" -> \"Change runtime type\", select \"Hardware Accelerator\" -> \"GPU\" and click \"SAVE\"\n", + "\n", + "----------------------------------------------------------------------\n", + "\n", + "# YOLOv5\n", + "\n", + "*Author: Ultralytics*\n", + "\n", + "**YOLOv5 in PyTorch > ONNX > CoreML > TFLite**\n", + "\n", + "_ | _\n", + "- | -\n", + "![alt](https://pytorch.org/assets/images/ultralytics_yolov5_img1.jpg) | ![alt](https://pytorch.org/assets/images/ultralytics_yolov5_img2.png)\n", + "\n", + "\n", + "## Before You Start\n", + "\n", + "Start from a **Python>=3.8** environment with **PyTorch>=1.7** installed. To install PyTorch see [https://pytorch.org/get-started/locally/](https://pytorch.org/get-started/locally/). 
To install YOLOv5 dependencies:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fb11983c", + "metadata": {}, + "outputs": [], + "source": [ + "%%bash\n", + "pip install -qr https://raw.githubusercontent.com/ultralytics/yolov5/master/requirements.txt # install dependencies" + ] + }, + { + "cell_type": "markdown", + "id": "de7a1775", + "metadata": {}, + "source": [ + "## Model Description\n", + "\n", + "\"YOLOv5\n", + " \n", + "\n", + "[YOLOv5](https://ultralytics.com/yolov5) 🚀 is a family of compound-scaled object detection models trained on the COCO dataset, and includes simple functionality for Test Time Augmentation (TTA), model ensembling, hyperparameter evolution, and export to ONNX, CoreML and TFLite.\n", + "\n", + "|Model |size
(pixels) |mAPval
0.5:0.95 |mAPtest
0.5:0.95 |mAPval
0.5 |Speed
V100 (ms) | |params
(M) |FLOPS
640 (B)\n", + "|--- |--- |--- |--- |--- |--- |---|--- |---\n", + "|[YOLOv5s6](https://github.com/ultralytics/yolov5/releases) |1280 |43.3 |43.3 |61.9 |**4.3** | |12.7 |17.4\n", + "|[YOLOv5m6](https://github.com/ultralytics/yolov5/releases) |1280 |50.5 |50.5 |68.7 |8.4 | |35.9 |52.4\n", + "|[YOLOv5l6](https://github.com/ultralytics/yolov5/releases) |1280 |53.4 |53.4 |71.1 |12.3 | |77.2 |117.7\n", + "|[YOLOv5x6](https://github.com/ultralytics/yolov5/releases) |1280 |**54.4** |**54.4** |**72.0** |22.4 | |141.8 |222.9\n", + "|[YOLOv5x6](https://github.com/ultralytics/yolov5/releases) TTA |1280 |**55.0** |**55.0** |**72.0** |70.8 | |- |-\n", + "\n", + "
\n", + " Table Notes (click to expand)\n", + "\n", + " * APtest denotes COCO [test-dev2017](http://cocodataset.org/#upload) server results, all other AP results denote val2017 accuracy.\n", + " * AP values are for single-model single-scale unless otherwise noted. **Reproduce mAP** by `python test.py --data coco.yaml --img 640 --conf 0.001 --iou 0.65`\n", + " * SpeedGPU averaged over 5000 COCO val2017 images using a GCP [n1-standard-16](https://cloud.google.com/compute/docs/machine-types#n1_standard_machine_types) V100 instance, and includes FP16 inference, postprocessing and NMS. **Reproduce speed** by `python test.py --data coco.yaml --img 640 --conf 0.25 --iou 0.45`\n", + " * All checkpoints are trained to 300 epochs with default settings and hyperparameters (no autoaugmentation).\n", + " * Test Time Augmentation ([TTA](https://github.com/ultralytics/yolov5/issues/303)) includes reflection and scale augmentation. **Reproduce TTA** by `python test.py --data coco.yaml --img 1536 --iou 0.7 --augment`\n", + "\n", + "
\n", + "\n", + "

\n", + "\n", + "
\n", + " Figure Notes (click to expand)\n", + "\n", + " * GPU Speed measures end-to-end time per image averaged over 5000 COCO val2017 images using a V100 GPU with batch size 32, and includes image preprocessing, PyTorch FP16 inference, postprocessing and NMS.\n", + " * EfficientDet data from [google/automl](https://github.com/google/automl) at batch size 8.\n", + " * **Reproduce** by `python test.py --task study --data coco.yaml --iou 0.7 --weights yolov5s6.pt yolov5m6.pt yolov5l6.pt yolov5x6.pt`\n", + "\n", + "
\n", + "\n", + "## Load From PyTorch Hub\n", + "\n", + "\n", + "This example loads a pretrained **YOLOv5s** model and passes an image for inference. YOLOv5 accepts **URL**, **Filename**, **PIL**, **OpenCV**, **Numpy** and **PyTorch** inputs, and returns detections in **torch**, **pandas**, and **JSON** output formats. See our [YOLOv5 PyTorch Hub Tutorial](https://github.com/ultralytics/yolov5/issues/36) for details." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4d043c7a", + "metadata": {}, + "outputs": [], + "source": [ + "import torch\n", + "\n", + "# Model\n", + "model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)\n", + "\n", + "# Images\n", + "imgs = ['https://ultralytics.com/images/zidane.jpg'] # batch of images\n", + "\n", + "# Inference\n", + "results = model(imgs)\n", + "\n", + "# Results\n", + "results.print()\n", + "results.save() # or .show()\n", + "\n", + "results.xyxy[0] # img1 predictions (tensor)\n", + "results.pandas().xyxy[0] # img1 predictions (pandas)\n", + "# xmin ymin xmax ymax confidence class name\n", + "# 0 749.50 43.50 1148.0 704.5 0.874023 0 person\n", + "# 1 433.50 433.50 517.5 714.5 0.687988 27 tie\n", + "# 2 114.75 195.75 1095.0 708.0 0.624512 0 person\n", + "# 3 986.00 304.00 1028.0 420.0 0.286865 27 tie" + ] + }, + { + "cell_type": "markdown", + "id": "b4cf0589", + "metadata": {}, + "source": [ + "## Citation\n", + "\n", + "[![DOI](https://zenodo.org/badge/264818686.svg)](https://zenodo.org/badge/latestdoi/264818686)\n", + "\n", + "\n", + "## Contact\n", + "\n", + "\n", + "**Issues should be raised directly in https://github.com/ultralytics/yolov5.** For business inquiries or professional support requests please visit [https://ultralytics.com](https://ultralytics.com) or email Glenn Jocher at [glenn.jocher@ultralytics.com](mailto:glenn.jocher@ultralytics.com).\n", + "\n", + "\n", + " " + ] + } + ], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/assets/images/2023-03-22-batchsizescaling.svg b/assets/images/2023-03-22-batchsizescaling.svg new file mode 100644 index 000000000..1fa09c7ad --- /dev/null +++ b/assets/images/2023-03-22-batchsizescaling.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/2023-03-22-inferencespeedup.svg b/assets/images/2023-03-22-inferencespeedup.svg new file mode 100644 index 000000000..db16bdba2 --- /dev/null +++ b/assets/images/2023-03-22-inferencespeedup.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/2023-03-22-torchbenchtraining.svg b/assets/images/2023-03-22-torchbenchtraining.svg new file mode 100644 index 000000000..566999611 --- /dev/null +++ b/assets/images/2023-03-22-torchbenchtraining.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/2023-03-22-trainingspeedup.svg b/assets/images/2023-03-22-trainingspeedup.svg new file mode 100644 index 000000000..bc0873a04 --- /dev/null +++ b/assets/images/2023-03-22-trainingspeedup.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/2023-04-11-accelerated-generative-diffusion-models1.png b/assets/images/2023-04-11-accelerated-generative-diffusion-models1.png new file mode 100644 index 000000000..27f6ba1cd Binary files /dev/null and b/assets/images/2023-04-11-accelerated-generative-diffusion-models1.png differ diff --git a/assets/images/2023-04-11-accelerated-generative-diffusion-models2.png b/assets/images/2023-04-11-accelerated-generative-diffusion-models2.png new file mode 100644 index 000000000..260fcfe31 Binary 
files /dev/null and b/assets/images/2023-04-11-accelerated-generative-diffusion-models2.png differ diff --git a/assets/images/2023-04-11-accelerated-generative-diffusion-models3.png b/assets/images/2023-04-11-accelerated-generative-diffusion-models3.png new file mode 100644 index 000000000..e2c056a4d Binary files /dev/null and b/assets/images/2023-04-11-accelerated-generative-diffusion-models3.png differ diff --git a/assets/images/2023-04-11-accelerated-generative-diffusion-models4.png b/assets/images/2023-04-11-accelerated-generative-diffusion-models4.png new file mode 100644 index 000000000..7a3cfadc8 Binary files /dev/null and b/assets/images/2023-04-11-accelerated-generative-diffusion-models4.png differ diff --git a/assets/images/404_sign.png b/assets/images/404_sign.png new file mode 100644 index 000000000..2c2ae05fb Binary files /dev/null and b/assets/images/404_sign.png differ diff --git a/assets/images/Bert_HF.png b/assets/images/Bert_HF.png new file mode 100644 index 000000000..e98d06f74 Binary files /dev/null and b/assets/images/Bert_HF.png differ diff --git a/assets/images/Captum 1.jpg b/assets/images/Captum 1.jpg new file mode 100644 index 000000000..fec68a92e Binary files /dev/null and b/assets/images/Captum 1.jpg differ diff --git a/assets/images/Captum 2.png b/assets/images/Captum 2.png new file mode 100644 index 000000000..9691bba50 Binary files /dev/null and b/assets/images/Captum 2.png differ diff --git a/assets/images/Caveats.jpg b/assets/images/Caveats.jpg new file mode 100644 index 000000000..698424f32 Binary files /dev/null and b/assets/images/Caveats.jpg differ diff --git a/assets/images/Cub200Dataset.png b/assets/images/Cub200Dataset.png new file mode 100644 index 000000000..ead780b0d Binary files /dev/null and b/assets/images/Cub200Dataset.png differ diff --git a/assets/images/GPT1.png b/assets/images/GPT1.png new file mode 100644 index 000000000..425ea2e75 Binary files /dev/null and b/assets/images/GPT1.png differ diff --git a/assets/images/Hackathon_Facebook_Cover.png b/assets/images/Hackathon_Facebook_Cover.png new file mode 100644 index 000000000..e4d446207 Binary files /dev/null and b/assets/images/Hackathon_Facebook_Cover.png differ diff --git a/assets/images/MEALV2.png b/assets/images/MEALV2.png new file mode 100644 index 000000000..b4e8b2088 Binary files /dev/null and b/assets/images/MEALV2.png differ diff --git a/assets/images/MEALV2_method.png b/assets/images/MEALV2_method.png new file mode 100644 index 000000000..02f7668d4 Binary files /dev/null and b/assets/images/MEALV2_method.png differ diff --git a/assets/images/MEALV2_results.png b/assets/images/MEALV2_results.png new file mode 100644 index 000000000..947734e70 Binary files /dev/null and b/assets/images/MEALV2_results.png differ diff --git a/assets/images/PTD2-social-asset.png b/assets/images/PTD2-social-asset.png new file mode 100644 index 000000000..37ba3c990 Binary files /dev/null and b/assets/images/PTD2-social-asset.png differ diff --git a/assets/images/PTEDPostEventHeader.png b/assets/images/PTEDPostEventHeader.png new file mode 100644 index 000000000..4e41c9ad3 Binary files /dev/null and b/assets/images/PTEDPostEventHeader.png differ diff --git a/assets/images/PTE_lockup_PRIMARY.svg b/assets/images/PTE_lockup_PRIMARY.svg new file mode 100644 index 000000000..f992e9d50 --- /dev/null +++ b/assets/images/PTE_lockup_PRIMARY.svg @@ -0,0 +1 @@ + diff --git a/assets/images/PyTorch_XLA Future Stack.svg b/assets/images/PyTorch_XLA Future Stack.svg new file mode 100644 index 000000000..f573882ae --- 
/dev/null +++ b/assets/images/PyTorch_XLA Future Stack.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/Q_AID_architecture.png b/assets/images/Q_AID_architecture.png new file mode 100644 index 000000000..538b9df6a Binary files /dev/null and b/assets/images/Q_AID_architecture.png differ diff --git a/assets/images/ResNeXtArch.png b/assets/images/ResNeXtArch.png new file mode 100644 index 000000000..b75d41b64 Binary files /dev/null and b/assets/images/ResNeXtArch.png differ diff --git a/assets/images/SEArch.png b/assets/images/SEArch.png new file mode 100755 index 000000000..a7fb8d047 Binary files /dev/null and b/assets/images/SEArch.png differ diff --git a/assets/images/Summer_hackathon.png b/assets/images/Summer_hackathon.png new file mode 100644 index 000000000..16f925f35 Binary files /dev/null and b/assets/images/Summer_hackathon.png differ diff --git a/assets/images/about-background.jpg b/assets/images/about-background.jpg new file mode 100644 index 000000000..92a8433a1 Binary files /dev/null and b/assets/images/about-background.jpg differ diff --git a/assets/images/alexnet1.png b/assets/images/alexnet1.png new file mode 100644 index 000000000..9a34bfe5d Binary files /dev/null and b/assets/images/alexnet1.png differ diff --git a/assets/images/alexnet2.png b/assets/images/alexnet2.png new file mode 100644 index 000000000..8eb6b7465 Binary files /dev/null and b/assets/images/alexnet2.png differ diff --git a/assets/images/alibaba-logo.svg b/assets/images/alibaba-logo.svg new file mode 100644 index 000000000..039e39c3a --- /dev/null +++ b/assets/images/alibaba-logo.svg @@ -0,0 +1,40 @@ + + + +Created by potrace 1.15, written by Peter Selinger 2001-2017 + + + + + + + diff --git a/assets/images/allennlp.png b/assets/images/allennlp.png new file mode 100644 index 000000000..ee27ea664 Binary files /dev/null and b/assets/images/allennlp.png differ diff --git a/assets/images/amd_rocm_blog.png b/assets/images/amd_rocm_blog.png new file mode 100644 index 000000000..49d58c4d4 Binary files /dev/null and b/assets/images/amd_rocm_blog.png differ diff --git a/assets/images/arrow-right-with-tail-white.svg b/assets/images/arrow-right-with-tail-white.svg new file mode 100644 index 000000000..be46c6c2e --- /dev/null +++ b/assets/images/arrow-right-with-tail-white.svg @@ -0,0 +1,19 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + + + + + + diff --git a/assets/images/arrow-right-with-tail.svg b/assets/images/arrow-right-with-tail.svg new file mode 100644 index 000000000..5843588fc --- /dev/null +++ b/assets/images/arrow-right-with-tail.svg @@ -0,0 +1,19 @@ + + + + Page 1 + Created with Sketch. 
+ + + + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/arrows-icon.svg b/assets/images/arrows-icon.svg new file mode 100644 index 000000000..690eb9718 --- /dev/null +++ b/assets/images/arrows-icon.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/aws-logo.svg b/assets/images/aws-logo.svg new file mode 100644 index 000000000..348530eb5 --- /dev/null +++ b/assets/images/aws-logo.svg @@ -0,0 +1,51 @@ + + + + + + + + + diff --git a/assets/images/bert1.png b/assets/images/bert1.png new file mode 100644 index 000000000..af404522a Binary files /dev/null and b/assets/images/bert1.png differ diff --git a/assets/images/bert2.png b/assets/images/bert2.png new file mode 100644 index 000000000..73d9ae425 Binary files /dev/null and b/assets/images/bert2.png differ diff --git a/assets/images/blockfiltering.png b/assets/images/blockfiltering.png new file mode 100644 index 000000000..6a656bb35 Binary files /dev/null and b/assets/images/blockfiltering.png differ diff --git a/assets/images/blog-background.jpg b/assets/images/blog-background.jpg new file mode 100644 index 000000000..4af9a199f Binary files /dev/null and b/assets/images/blog-background.jpg differ diff --git a/assets/images/chevron-down-black.svg b/assets/images/chevron-down-black.svg new file mode 100644 index 000000000..9fffa7726 --- /dev/null +++ b/assets/images/chevron-down-black.svg @@ -0,0 +1,17 @@ + + + + Created with Sketch. + + + + + + + + + + + + + diff --git a/assets/images/chevron-down-orange.svg b/assets/images/chevron-down-orange.svg new file mode 100644 index 000000000..8a3c27cc7 --- /dev/null +++ b/assets/images/chevron-down-orange.svg @@ -0,0 +1,17 @@ + + + + Created with Sketch. + + + + + + + + + + + + + diff --git a/assets/images/chevron-down-white.svg b/assets/images/chevron-down-white.svg new file mode 100644 index 000000000..e7bd84de3 --- /dev/null +++ b/assets/images/chevron-down-white.svg @@ -0,0 +1,17 @@ + + + + Created with Sketch. + + + + + + + + + + + + + diff --git a/assets/images/chevron-left-grey.svg b/assets/images/chevron-left-grey.svg new file mode 100644 index 000000000..c3bc0130b --- /dev/null +++ b/assets/images/chevron-left-grey.svg @@ -0,0 +1,14 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + \ No newline at end of file diff --git a/assets/images/chevron-left-orange.svg b/assets/images/chevron-left-orange.svg new file mode 100644 index 000000000..f005a743b --- /dev/null +++ b/assets/images/chevron-left-orange.svg @@ -0,0 +1,11 @@ + + + + Group + Created with Sketch. + + + + + + \ No newline at end of file diff --git a/assets/images/chevron-right-grey.svg b/assets/images/chevron-right-grey.svg new file mode 100644 index 000000000..16eca9899 --- /dev/null +++ b/assets/images/chevron-right-grey.svg @@ -0,0 +1,14 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + \ No newline at end of file diff --git a/assets/images/chevron-right-orange.svg b/assets/images/chevron-right-orange.svg new file mode 100644 index 000000000..7033fc93b --- /dev/null +++ b/assets/images/chevron-right-orange.svg @@ -0,0 +1,17 @@ + + + + +Page 1 +Created with Sketch. + + + + + + + + + + diff --git a/assets/images/chevron-right-white.svg b/assets/images/chevron-right-white.svg new file mode 100644 index 000000000..dd9e77f26 --- /dev/null +++ b/assets/images/chevron-right-white.svg @@ -0,0 +1,17 @@ + + + + +Page 1 +Created with Sketch. 
+ + + + + + + + + + \ No newline at end of file diff --git a/assets/images/chip-icon.svg b/assets/images/chip-icon.svg new file mode 100644 index 000000000..b46477ee3 --- /dev/null +++ b/assets/images/chip-icon.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/clacheck.png b/assets/images/clacheck.png new file mode 100644 index 000000000..c6076ebea Binary files /dev/null and b/assets/images/clacheck.png differ diff --git a/assets/images/clafb.png b/assets/images/clafb.png new file mode 100644 index 000000000..1aa5a0126 Binary files /dev/null and b/assets/images/clafb.png differ diff --git a/assets/images/classification.jpg b/assets/images/classification.jpg new file mode 100644 index 000000000..eb1e20641 Binary files /dev/null and b/assets/images/classification.jpg differ diff --git a/assets/images/coc-background.jpg b/assets/images/coc-background.jpg new file mode 100644 index 000000000..e440cbf73 Binary files /dev/null and b/assets/images/coc-background.jpg differ diff --git a/assets/images/colab-logo.svg b/assets/images/colab-logo.svg new file mode 100644 index 000000000..5f4cf28f7 --- /dev/null +++ b/assets/images/colab-logo.svg @@ -0,0 +1,37 @@ + + + + Created with Sketch. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/compact-hub-icon-selected.svg b/assets/images/compact-hub-icon-selected.svg new file mode 100644 index 000000000..4f2d4ef35 --- /dev/null +++ b/assets/images/compact-hub-icon-selected.svg @@ -0,0 +1,61 @@ + + + + 071519_Airlift_PyTorchOrg_HubCompactTemplate_v2 + Created with Sketch. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/compact-hub-icon.svg b/assets/images/compact-hub-icon.svg new file mode 100644 index 000000000..af633ea67 --- /dev/null +++ b/assets/images/compact-hub-icon.svg @@ -0,0 +1,64 @@ + + + + Group 449 + Created with Sketch. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/cursor-icon.svg b/assets/images/cursor-icon.svg new file mode 100644 index 000000000..8a186d4ef --- /dev/null +++ b/assets/images/cursor-icon.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/custom-rnn-chunk.png b/assets/images/custom-rnn-chunk.png new file mode 100644 index 000000000..d1dda2893 Binary files /dev/null and b/assets/images/custom-rnn-chunk.png differ diff --git a/assets/images/custom-rnn-improve.png b/assets/images/custom-rnn-improve.png new file mode 100644 index 000000000..22b111ca4 Binary files /dev/null and b/assets/images/custom-rnn-improve.png differ diff --git a/assets/images/dcgan_dtd.jpg b/assets/images/dcgan_dtd.jpg new file mode 100644 index 000000000..1923a4e7a Binary files /dev/null and b/assets/images/dcgan_dtd.jpg differ diff --git a/assets/images/dcgan_fashionGen.jpg b/assets/images/dcgan_fashionGen.jpg new file mode 100644 index 000000000..ac852ae0e Binary files /dev/null and b/assets/images/dcgan_fashionGen.jpg differ diff --git a/assets/images/deep-learning-thank-you-background.jpg b/assets/images/deep-learning-thank-you-background.jpg new file mode 100644 index 000000000..acc25e1f6 Binary files /dev/null and b/assets/images/deep-learning-thank-you-background.jpg differ diff --git a/assets/images/deep-learning-thumbnail.png b/assets/images/deep-learning-thumbnail.png new file mode 100644 index 000000000..0ce580120 Binary files /dev/null and b/assets/images/deep-learning-thumbnail.png differ diff --git a/assets/images/deeplab1.png b/assets/images/deeplab1.png new file mode 100644 index 000000000..740093882 Binary files /dev/null and b/assets/images/deeplab1.png differ diff --git a/assets/images/deeplab2.png b/assets/images/deeplab2.png new file mode 100644 index 000000000..872b505eb Binary files /dev/null and b/assets/images/deeplab2.png differ diff --git a/assets/images/densenet1.png b/assets/images/densenet1.png new file mode 100644 index 000000000..013598aa8 Binary files /dev/null and b/assets/images/densenet1.png differ diff --git a/assets/images/densenet2.png b/assets/images/densenet2.png new file mode 100644 index 000000000..f6cd657a5 Binary files /dev/null and b/assets/images/densenet2.png differ diff --git a/assets/images/dog.jpg b/assets/images/dog.jpg new file mode 100644 index 000000000..12f0e0dd1 Binary files /dev/null and b/assets/images/dog.jpg differ diff --git a/assets/images/executorch-arrows.svg b/assets/images/executorch-arrows.svg new file mode 100644 index 000000000..2febe67c7 --- /dev/null +++ b/assets/images/executorch-arrows.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/external-link-icon.svg b/assets/images/external-link-icon.svg new file mode 100644 index 000000000..d0f7a53b5 --- /dev/null +++ b/assets/images/external-link-icon.svg @@ -0,0 +1,27 @@ + + + + Created with Sketch. 
+ + + + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/fairseq_logo.png b/assets/images/fairseq_logo.png new file mode 100644 index 000000000..75472cbb5 Binary files /dev/null and b/assets/images/fairseq_logo.png differ diff --git a/assets/images/fcn2.png b/assets/images/fcn2.png new file mode 100644 index 000000000..f4f1de9ac Binary files /dev/null and b/assets/images/fcn2.png differ diff --git a/assets/images/features-background.jpg b/assets/images/features-background.jpg new file mode 100644 index 000000000..57402bb85 Binary files /dev/null and b/assets/images/features-background.jpg differ diff --git a/assets/images/feedback-flag.svg b/assets/images/feedback-flag.svg new file mode 100644 index 000000000..ad1a9ffdb --- /dev/null +++ b/assets/images/feedback-flag.svg @@ -0,0 +1,13 @@ + + + + + + + + + + + + + diff --git a/assets/images/filter-arrow.svg b/assets/images/filter-arrow.svg new file mode 100644 index 000000000..5c4ac42d2 --- /dev/null +++ b/assets/images/filter-arrow.svg @@ -0,0 +1,13 @@ + + + + + + + + + + + + + diff --git a/assets/images/flashattention-3/fg1.png b/assets/images/flashattention-3/fg1.png new file mode 100644 index 000000000..3e73398cc Binary files /dev/null and b/assets/images/flashattention-3/fg1.png differ diff --git a/assets/images/flashattention-3/fg2.png b/assets/images/flashattention-3/fg2.png new file mode 100644 index 000000000..6b3b1da13 Binary files /dev/null and b/assets/images/flashattention-3/fg2.png differ diff --git a/assets/images/flashattention-3/fg3.png b/assets/images/flashattention-3/fg3.png new file mode 100644 index 000000000..5d95157d3 Binary files /dev/null and b/assets/images/flashattention-3/fg3.png differ diff --git a/assets/images/flashattention-3/fg4.png b/assets/images/flashattention-3/fg4.png new file mode 100644 index 000000000..bbaba22ed Binary files /dev/null and b/assets/images/flashattention-3/fg4.png differ diff --git a/assets/images/flashattention-3/fg5.png b/assets/images/flashattention-3/fg5.png new file mode 100644 index 000000000..a5378413d Binary files /dev/null and b/assets/images/flashattention-3/fg5.png differ diff --git a/assets/images/flashattention-3/fg6.png b/assets/images/flashattention-3/fg6.png new file mode 100644 index 000000000..65a105bf3 Binary files /dev/null and b/assets/images/flashattention-3/fg6.png differ diff --git a/assets/images/flashattention-3/fg6a.png b/assets/images/flashattention-3/fg6a.png new file mode 100644 index 000000000..de5659c83 Binary files /dev/null and b/assets/images/flashattention-3/fg6a.png differ diff --git a/assets/images/flashattention-3/fg7.png b/assets/images/flashattention-3/fg7.png new file mode 100644 index 000000000..9ea9e6b92 Binary files /dev/null and b/assets/images/flashattention-3/fg7.png differ diff --git a/assets/images/flashattention-3/fg8.png b/assets/images/flashattention-3/fg8.png new file mode 100644 index 000000000..d757bd0f7 Binary files /dev/null and b/assets/images/flashattention-3/fg8.png differ diff --git a/assets/images/flashattention-3/fg9.png b/assets/images/flashattention-3/fg9.png new file mode 100644 index 000000000..f7ef51912 Binary files /dev/null and b/assets/images/flashattention-3/fg9.png differ diff --git a/assets/images/full-hub-icon-selected.svg b/assets/images/full-hub-icon-selected.svg new file mode 100644 index 000000000..d47d9f439 --- /dev/null +++ b/assets/images/full-hub-icon-selected.svg @@ -0,0 +1,18 @@ + + + + + + + + + + + + + + + + + + diff --git a/assets/images/full-hub-icon.svg 
b/assets/images/full-hub-icon.svg new file mode 100644 index 000000000..249fcf276 --- /dev/null +++ b/assets/images/full-hub-icon.svg @@ -0,0 +1,18 @@ + + + + + + + + + + + + + + + + + + diff --git a/assets/images/geomloss.jpg b/assets/images/geomloss.jpg new file mode 100644 index 000000000..1b0506b36 Binary files /dev/null and b/assets/images/geomloss.jpg differ diff --git a/assets/images/get-started-background.jpg b/assets/images/get-started-background.jpg new file mode 100644 index 000000000..66cb142bd Binary files /dev/null and b/assets/images/get-started-background.jpg differ diff --git a/assets/images/ghostnet.png b/assets/images/ghostnet.png new file mode 100644 index 000000000..b91337e2a Binary files /dev/null and b/assets/images/ghostnet.png differ diff --git a/assets/images/github-star.svg b/assets/images/github-star.svg new file mode 100644 index 000000000..6a262594f --- /dev/null +++ b/assets/images/github-star.svg @@ -0,0 +1,17 @@ + + + + + + + + + + + + + + + + + diff --git a/assets/images/google-cloud-logo.svg b/assets/images/google-cloud-logo.svg new file mode 100644 index 000000000..ae6b9734d --- /dev/null +++ b/assets/images/google-cloud-logo.svg @@ -0,0 +1,40 @@ + + + Created with Sketch. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/googlenet1.png b/assets/images/googlenet1.png new file mode 100644 index 000000000..fa50a220c Binary files /dev/null and b/assets/images/googlenet1.png differ diff --git a/assets/images/googlenet2.png b/assets/images/googlenet2.png new file mode 100644 index 000000000..ae52c1f34 Binary files /dev/null and b/assets/images/googlenet2.png differ diff --git a/assets/images/hardnet.png b/assets/images/hardnet.png new file mode 100644 index 000000000..33e118fac Binary files /dev/null and b/assets/images/hardnet.png differ diff --git a/assets/images/hardnet_blk.png b/assets/images/hardnet_blk.png new file mode 100644 index 000000000..0aad0a8bc Binary files /dev/null and b/assets/images/hardnet_blk.png differ diff --git a/assets/images/home-background.jpg b/assets/images/home-background.jpg new file mode 100644 index 000000000..6a42b8dcb Binary files /dev/null and b/assets/images/home-background.jpg differ diff --git a/assets/images/home-footer-background.jpg b/assets/images/home-footer-background.jpg new file mode 100644 index 000000000..c541fca8e Binary files /dev/null and b/assets/images/home-footer-background.jpg differ diff --git a/assets/images/horse2zebra.gif b/assets/images/horse2zebra.gif new file mode 100644 index 000000000..b9b5f627e Binary files /dev/null and b/assets/images/horse2zebra.gif differ diff --git a/assets/images/hub-background.jpg b/assets/images/hub-background.jpg new file mode 100644 index 000000000..6e9ba75fa Binary files /dev/null and b/assets/images/hub-background.jpg differ diff --git a/assets/images/hub-blog-header-1.png b/assets/images/hub-blog-header-1.png new file mode 100644 index 000000000..e894e6181 Binary files /dev/null and b/assets/images/hub-blog-header-1.png differ diff --git a/assets/images/hub-blog-pwc.png b/assets/images/hub-blog-pwc.png new file mode 100644 index 000000000..c35a2c0b4 Binary files /dev/null and b/assets/images/hub-blog-pwc.png differ diff --git a/assets/images/hugging_face_transformers.svg b/assets/images/hugging_face_transformers.svg new file mode 100644 index 000000000..091946ae5 --- /dev/null +++ b/assets/images/hugging_face_transformers.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/huggingface-logo.png 
b/assets/images/huggingface-logo.png new file mode 100644 index 000000000..cbd884e30 Binary files /dev/null and b/assets/images/huggingface-logo.png differ diff --git a/assets/images/hybridnets.jpg b/assets/images/hybridnets.jpg new file mode 100644 index 000000000..ee053ce4f Binary files /dev/null and b/assets/images/hybridnets.jpg differ diff --git a/assets/images/ibnnet.png b/assets/images/ibnnet.png new file mode 100644 index 000000000..d6c0ce600 Binary files /dev/null and b/assets/images/ibnnet.png differ diff --git a/assets/images/icon-close.svg b/assets/images/icon-close.svg new file mode 100644 index 000000000..348964e79 --- /dev/null +++ b/assets/images/icon-close.svg @@ -0,0 +1,21 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/icon-menu-dots-dark.svg b/assets/images/icon-menu-dots-dark.svg new file mode 100644 index 000000000..fa2ad044b --- /dev/null +++ b/assets/images/icon-menu-dots-dark.svg @@ -0,0 +1,42 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/icon-menu-dots.svg b/assets/images/icon-menu-dots.svg new file mode 100644 index 000000000..fc0318e62 --- /dev/null +++ b/assets/images/icon-menu-dots.svg @@ -0,0 +1,44 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/inception_v3.png b/assets/images/inception_v3.png new file mode 100644 index 000000000..c1bc7af71 Binary files /dev/null and b/assets/images/inception_v3.png differ diff --git a/assets/images/inplace_abn.png b/assets/images/inplace_abn.png new file mode 100644 index 000000000..1f281b2d1 Binary files /dev/null and b/assets/images/inplace_abn.png differ diff --git a/assets/images/install-matrix.png b/assets/images/install-matrix.png new file mode 100644 index 000000000..3313d6421 Binary files /dev/null and b/assets/images/install-matrix.png differ diff --git a/assets/images/int8/pytorch_quant_x86_1.jpg b/assets/images/int8/pytorch_quant_x86_1.jpg new file mode 100644 index 000000000..ac506ee7e Binary files /dev/null and b/assets/images/int8/pytorch_quant_x86_1.jpg differ diff --git a/assets/images/int8/pytorch_quant_x86_2.jpg b/assets/images/int8/pytorch_quant_x86_2.jpg new file mode 100644 index 000000000..0c2ab43ce Binary files /dev/null and b/assets/images/int8/pytorch_quant_x86_2.jpg differ diff --git a/assets/images/int8/pytorch_quant_x86_3.jpg b/assets/images/int8/pytorch_quant_x86_3.jpg new file mode 100644 index 000000000..bb3d9efa6 Binary files /dev/null and b/assets/images/int8/pytorch_quant_x86_3.jpg differ diff --git a/assets/images/intel-gpus-pytorch-2-4.jpg b/assets/images/intel-gpus-pytorch-2-4.jpg new file mode 100644 index 000000000..a1264401c Binary files /dev/null and b/assets/images/intel-gpus-pytorch-2-4.jpg differ diff --git a/assets/images/intel-logo.png b/assets/images/intel-logo.png new file mode 100644 index 000000000..2d022a97c Binary files /dev/null and b/assets/images/intel-logo.png differ diff --git a/assets/images/intel-new-logo.svg b/assets/images/intel-new-logo.svg new file mode 100644 index 000000000..5133faa15 --- /dev/null +++ b/assets/images/intel-new-logo.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/logo-dark.svg b/assets/images/logo-dark.svg new file mode 100644 index 000000000..9b4c1a56a --- /dev/null +++ 
b/assets/images/logo-dark.svg @@ -0,0 +1,30 @@ + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/logo-detectron.svg b/assets/images/logo-detectron.svg new file mode 100644 index 000000000..19f88bc81 --- /dev/null +++ b/assets/images/logo-detectron.svg @@ -0,0 +1,16 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/logo-elf.svg b/assets/images/logo-elf.svg new file mode 100644 index 000000000..fb9684639 --- /dev/null +++ b/assets/images/logo-elf.svg @@ -0,0 +1,16 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/logo-facebook-dark.svg b/assets/images/logo-facebook-dark.svg new file mode 100644 index 000000000..cff17915c --- /dev/null +++ b/assets/images/logo-facebook-dark.svg @@ -0,0 +1,8 @@ + + + + + + diff --git a/assets/images/logo-github.svg b/assets/images/logo-github.svg new file mode 100644 index 000000000..8471876ef --- /dev/null +++ b/assets/images/logo-github.svg @@ -0,0 +1,12 @@ + + + + + + diff --git a/assets/images/logo-icon.svg b/assets/images/logo-icon.svg new file mode 100644 index 000000000..575f6823e --- /dev/null +++ b/assets/images/logo-icon.svg @@ -0,0 +1,12 @@ + + + + + + + + + diff --git a/assets/images/logo-ko-dark.svg b/assets/images/logo-ko-dark.svg new file mode 100644 index 000000000..079dee540 --- /dev/null +++ b/assets/images/logo-ko-dark.svg @@ -0,0 +1,21 @@ + + + + + + + + + + + + + 이토치 + + + + + + + + diff --git a/assets/images/logo-ko-square-dark.svg b/assets/images/logo-ko-square-dark.svg new file mode 100644 index 000000000..fcd1d6cf7 --- /dev/null +++ b/assets/images/logo-ko-square-dark.svg @@ -0,0 +1,18 @@ + + + + + + + + + + + + + + + + + + diff --git a/assets/images/logo-ko-square.svg b/assets/images/logo-ko-square.svg new file mode 100644 index 000000000..72d626a34 --- /dev/null +++ b/assets/images/logo-ko-square.svg @@ -0,0 +1,18 @@ + + + + + + + + + + + + + + + + + + diff --git a/assets/images/logo-ko.svg b/assets/images/logo-ko.svg new file mode 100644 index 000000000..493a8b8e5 --- /dev/null +++ b/assets/images/logo-ko.svg @@ -0,0 +1,23 @@ + + + + + + + + + + + 한국 + + + 사용자 + + + 이토치 + + + + + + diff --git a/assets/images/logo-parlai.svg b/assets/images/logo-parlai.svg new file mode 100644 index 000000000..4c46bce65 --- /dev/null +++ b/assets/images/logo-parlai.svg @@ -0,0 +1,16 @@ + + + + Page 1 + Created with Sketch. + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/logo-slack.svg b/assets/images/logo-slack.svg new file mode 100644 index 000000000..4b02a4fdf --- /dev/null +++ b/assets/images/logo-slack.svg @@ -0,0 +1,16 @@ + + + + slack + Created with Sketch. + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/logo-twitter-dark.svg b/assets/images/logo-twitter-dark.svg new file mode 100644 index 000000000..1572570f8 --- /dev/null +++ b/assets/images/logo-twitter-dark.svg @@ -0,0 +1,16 @@ + + + + + + + + diff --git a/assets/images/logo-twitter-grey.svg b/assets/images/logo-twitter-grey.svg new file mode 100644 index 000000000..33039c37f --- /dev/null +++ b/assets/images/logo-twitter-grey.svg @@ -0,0 +1,16 @@ + + + + + + + + diff --git a/assets/images/logo-wav2letter.svg b/assets/images/logo-wav2letter.svg new file mode 100644 index 000000000..9ad1e5124 --- /dev/null +++ b/assets/images/logo-wav2letter.svg @@ -0,0 +1,12 @@ + + + + Page 1 + Created with Sketch. 
+ + + + + + + \ No newline at end of file diff --git a/assets/images/logo-white.svg b/assets/images/logo-white.svg new file mode 100644 index 000000000..26faf2f00 --- /dev/null +++ b/assets/images/logo-white.svg @@ -0,0 +1,31 @@ + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/logo-youtube-dark.svg b/assets/images/logo-youtube-dark.svg new file mode 100644 index 000000000..e3cfedd79 --- /dev/null +++ b/assets/images/logo-youtube-dark.svg @@ -0,0 +1,21 @@ + + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/logo.svg b/assets/images/logo.svg new file mode 100644 index 000000000..f8d44b984 --- /dev/null +++ b/assets/images/logo.svg @@ -0,0 +1,31 @@ + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/maintainers/9bow.png b/assets/images/maintainers/9bow.png new file mode 100644 index 000000000..c5acddc1d Binary files /dev/null and b/assets/images/maintainers/9bow.png differ diff --git a/assets/images/maintainers/Taeyoung96.png b/assets/images/maintainers/Taeyoung96.png new file mode 100644 index 000000000..9e4dfcd7f Binary files /dev/null and b/assets/images/maintainers/Taeyoung96.png differ diff --git a/assets/images/maintainers/adonisues.png b/assets/images/maintainers/adonisues.png new file mode 100644 index 000000000..4ea6672ea Binary files /dev/null and b/assets/images/maintainers/adonisues.png differ diff --git a/assets/images/maintainers/bongmo.png b/assets/images/maintainers/bongmo.png new file mode 100644 index 000000000..1d7c31000 Binary files /dev/null and b/assets/images/maintainers/bongmo.png differ diff --git a/assets/images/maintainers/codertimo.png b/assets/images/maintainers/codertimo.png new file mode 100644 index 000000000..93cf72b61 Binary files /dev/null and b/assets/images/maintainers/codertimo.png differ diff --git a/assets/images/maintainers/codingbowoo.png b/assets/images/maintainers/codingbowoo.png new file mode 100644 index 000000000..a150c2e31 Binary files /dev/null and b/assets/images/maintainers/codingbowoo.png differ diff --git a/assets/images/maintainers/convin305.png b/assets/images/maintainers/convin305.png new file mode 100644 index 000000000..3092558c7 Binary files /dev/null and b/assets/images/maintainers/convin305.png differ diff --git a/assets/images/maintainers/corazzon.png b/assets/images/maintainers/corazzon.png new file mode 100644 index 000000000..a39d034aa Binary files /dev/null and b/assets/images/maintainers/corazzon.png differ diff --git a/assets/images/maintainers/creduo.png b/assets/images/maintainers/creduo.png new file mode 100644 index 000000000..8bb9c8e29 Binary files /dev/null and b/assets/images/maintainers/creduo.png differ diff --git a/assets/images/maintainers/des00.png b/assets/images/maintainers/des00.png new file mode 100644 index 000000000..f4c06520f Binary files /dev/null and b/assets/images/maintainers/des00.png differ diff --git a/assets/images/maintainers/dudtheheaven.png b/assets/images/maintainers/dudtheheaven.png new file mode 100644 index 000000000..d1e7ce867 Binary files /dev/null and b/assets/images/maintainers/dudtheheaven.png differ diff --git a/assets/images/maintainers/falconlee236.png b/assets/images/maintainers/falconlee236.png new file mode 100644 index 000000000..e27c18104 Binary files /dev/null and b/assets/images/maintainers/falconlee236.png differ diff --git a/assets/images/maintainers/hkim15.png b/assets/images/maintainers/hkim15.png new file mode 100644 index 000000000..b601c5536 Binary files /dev/null and b/assets/images/maintainers/hkim15.png differ diff 
--git a/assets/images/maintainers/hrxorxm.png b/assets/images/maintainers/hrxorxm.png new file mode 100644 index 000000000..58c33ad69 Binary files /dev/null and b/assets/images/maintainers/hrxorxm.png differ diff --git a/assets/images/maintainers/hyoyoung.png b/assets/images/maintainers/hyoyoung.png new file mode 100644 index 000000000..1a502faf0 Binary files /dev/null and b/assets/images/maintainers/hyoyoung.png differ diff --git a/assets/images/maintainers/j-min.png b/assets/images/maintainers/j-min.png new file mode 100644 index 000000000..440e07834 Binary files /dev/null and b/assets/images/maintainers/j-min.png differ diff --git a/assets/images/maintainers/jenner9212.png b/assets/images/maintainers/jenner9212.png new file mode 100644 index 000000000..e64631b14 Binary files /dev/null and b/assets/images/maintainers/jenner9212.png differ diff --git a/assets/images/maintainers/jet981217.png b/assets/images/maintainers/jet981217.png new file mode 100644 index 000000000..bd401c6e6 Binary files /dev/null and b/assets/images/maintainers/jet981217.png differ diff --git a/assets/images/maintainers/jih0-kim.png b/assets/images/maintainers/jih0-kim.png new file mode 100644 index 000000000..523ffef5e Binary files /dev/null and b/assets/images/maintainers/jih0-kim.png differ diff --git a/assets/images/maintainers/jimin.lee.png b/assets/images/maintainers/jimin.lee.png new file mode 100644 index 000000000..240253895 Binary files /dev/null and b/assets/images/maintainers/jimin.lee.png differ diff --git a/assets/images/maintainers/nysunshine.png b/assets/images/maintainers/nysunshine.png new file mode 100644 index 000000000..5b79f64e7 Binary files /dev/null and b/assets/images/maintainers/nysunshine.png differ diff --git a/assets/images/microsoft-azure-logo.svg b/assets/images/microsoft-azure-logo.svg new file mode 100644 index 000000000..89e45a912 --- /dev/null +++ b/assets/images/microsoft-azure-logo.svg @@ -0,0 +1,57 @@ + + + Created with Sketch. 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/midas_samples.png b/assets/images/midas_samples.png new file mode 100644 index 000000000..921e290ed Binary files /dev/null and b/assets/images/midas_samples.png differ diff --git a/assets/images/mobile-icon.svg b/assets/images/mobile-icon.svg new file mode 100644 index 000000000..ba5cbebfb --- /dev/null +++ b/assets/images/mobile-icon.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/mobilenet_v2_1.png b/assets/images/mobilenet_v2_1.png new file mode 100644 index 000000000..55fd2a18b Binary files /dev/null and b/assets/images/mobilenet_v2_1.png differ diff --git a/assets/images/mobilenet_v2_2.png b/assets/images/mobilenet_v2_2.png new file mode 100644 index 000000000..bc184fa04 Binary files /dev/null and b/assets/images/mobilenet_v2_2.png differ diff --git a/assets/images/model_page.png b/assets/images/model_page.png new file mode 100644 index 000000000..35fc96966 Binary files /dev/null and b/assets/images/model_page.png differ diff --git a/assets/images/ncf_diagram.png b/assets/images/ncf_diagram.png new file mode 100644 index 000000000..ff27ccbfd Binary files /dev/null and b/assets/images/ncf_diagram.png differ diff --git a/assets/images/netlify.png b/assets/images/netlify.png new file mode 100644 index 000000000..513fcd2b9 Binary files /dev/null and b/assets/images/netlify.png differ diff --git a/assets/images/no-image b/assets/images/no-image new file mode 100644 index 000000000..e69de29bb diff --git a/assets/images/nswapytorch2.jpg b/assets/images/nswapytorch2.jpg new file mode 100644 index 000000000..491a10e18 Binary files /dev/null and b/assets/images/nswapytorch2.jpg differ diff --git a/assets/images/nswapytorch6.png b/assets/images/nswapytorch6.png new file mode 100644 index 000000000..e7483dae6 Binary files /dev/null and b/assets/images/nswapytorch6.png differ diff --git a/assets/images/nswapytorch8.png b/assets/images/nswapytorch8.png new file mode 100644 index 000000000..3b1ba9e8b Binary files /dev/null and b/assets/images/nswapytorch8.png differ diff --git a/assets/images/nts-net.png b/assets/images/nts-net.png new file mode 100644 index 000000000..b7bd97b1e Binary files /dev/null and b/assets/images/nts-net.png differ diff --git a/assets/images/nvidia-logo.png b/assets/images/nvidia-logo.png new file mode 100644 index 000000000..194e1bad8 Binary files /dev/null and b/assets/images/nvidia-logo.png differ diff --git a/assets/images/nvidia_logo.png b/assets/images/nvidia_logo.png new file mode 100644 index 000000000..41caa39c7 Binary files /dev/null and b/assets/images/nvidia_logo.png differ diff --git a/assets/images/nvidiafp16onv100.png b/assets/images/nvidiafp16onv100.png new file mode 100644 index 000000000..46d29522a Binary files /dev/null and b/assets/images/nvidiafp16onv100.png differ diff --git a/assets/images/nvidiafp32onv100.jpg b/assets/images/nvidiafp32onv100.jpg new file mode 100644 index 000000000..b3e0d03ed Binary files /dev/null and b/assets/images/nvidiafp32onv100.jpg differ diff --git a/assets/images/openmined-pytorch.png b/assets/images/openmined-pytorch.png new file mode 100644 index 000000000..610799477 Binary files /dev/null and b/assets/images/openmined-pytorch.png differ diff --git a/assets/images/org-features-background.jpg b/assets/images/org-features-background.jpg new file mode 100644 index 000000000..645edc9f1 Binary files /dev/null and 
b/assets/images/org-features-background.jpg differ diff --git a/assets/images/org-get-started-background.jpg b/assets/images/org-get-started-background.jpg new file mode 100644 index 000000000..dda2e6eeb Binary files /dev/null and b/assets/images/org-get-started-background.jpg differ diff --git a/assets/images/org-home-background.jpg b/assets/images/org-home-background.jpg new file mode 100644 index 000000000..3f020015d Binary files /dev/null and b/assets/images/org-home-background.jpg differ diff --git a/assets/images/packed_sequence.png b/assets/images/packed_sequence.png new file mode 100644 index 000000000..b32a8953e Binary files /dev/null and b/assets/images/packed_sequence.png differ diff --git a/assets/images/paris-tech-logo.png b/assets/images/paris-tech-logo.png new file mode 100644 index 000000000..c2c77bf3b Binary files /dev/null and b/assets/images/paris-tech-logo.png differ diff --git a/assets/images/pgan_celebaHQ.jpg b/assets/images/pgan_celebaHQ.jpg new file mode 100644 index 000000000..9fbcc4291 Binary files /dev/null and b/assets/images/pgan_celebaHQ.jpg differ diff --git a/assets/images/pgan_mix.jpg b/assets/images/pgan_mix.jpg new file mode 100644 index 000000000..91959af4a Binary files /dev/null and b/assets/images/pgan_mix.jpg differ diff --git a/assets/images/pganlogo.png b/assets/images/pganlogo.png new file mode 100644 index 000000000..1b623f792 Binary files /dev/null and b/assets/images/pganlogo.png differ diff --git a/assets/images/probpackages.png b/assets/images/probpackages.png new file mode 100644 index 000000000..82b3ec115 Binary files /dev/null and b/assets/images/probpackages.png differ diff --git a/assets/images/proxylessnas.png b/assets/images/proxylessnas.png new file mode 100644 index 000000000..50fb554a5 Binary files /dev/null and b/assets/images/proxylessnas.png differ diff --git a/assets/images/pytorch-2.0-feature-img.png b/assets/images/pytorch-2.0-feature-img.png new file mode 100644 index 000000000..57f90a187 Binary files /dev/null and b/assets/images/pytorch-2.0-feature-img.png differ diff --git a/assets/images/pytorch-2.0-img10.png b/assets/images/pytorch-2.0-img10.png new file mode 100644 index 000000000..4c408b78d Binary files /dev/null and b/assets/images/pytorch-2.0-img10.png differ diff --git a/assets/images/pytorch-2.0-img11.png b/assets/images/pytorch-2.0-img11.png new file mode 100644 index 000000000..1e737c529 Binary files /dev/null and b/assets/images/pytorch-2.0-img11.png differ diff --git a/assets/images/pytorch-2.0-img12.png b/assets/images/pytorch-2.0-img12.png new file mode 100644 index 000000000..55d1dde5b Binary files /dev/null and b/assets/images/pytorch-2.0-img12.png differ diff --git a/assets/images/pytorch-2.0-img2.png b/assets/images/pytorch-2.0-img2.png new file mode 100644 index 000000000..92bbc3ad6 Binary files /dev/null and b/assets/images/pytorch-2.0-img2.png differ diff --git a/assets/images/pytorch-2.0-img3.gif b/assets/images/pytorch-2.0-img3.gif new file mode 100644 index 000000000..36a503541 Binary files /dev/null and b/assets/images/pytorch-2.0-img3.gif differ diff --git a/assets/images/pytorch-2.0-img4.jpg b/assets/images/pytorch-2.0-img4.jpg new file mode 100644 index 000000000..ffebd754b Binary files /dev/null and b/assets/images/pytorch-2.0-img4.jpg differ diff --git a/assets/images/pytorch-2.0-img5.png b/assets/images/pytorch-2.0-img5.png new file mode 100644 index 000000000..b63079389 Binary files /dev/null and b/assets/images/pytorch-2.0-img5.png differ diff --git a/assets/images/pytorch-2.0-img6.png 
b/assets/images/pytorch-2.0-img6.png new file mode 100644 index 000000000..d30c3c9fe Binary files /dev/null and b/assets/images/pytorch-2.0-img6.png differ diff --git a/assets/images/pytorch-2.0-img7.png b/assets/images/pytorch-2.0-img7.png new file mode 100644 index 000000000..d5f1143f9 Binary files /dev/null and b/assets/images/pytorch-2.0-img7.png differ diff --git a/assets/images/pytorch-2.0-img8.png b/assets/images/pytorch-2.0-img8.png new file mode 100644 index 000000000..fe75d443f Binary files /dev/null and b/assets/images/pytorch-2.0-img8.png differ diff --git a/assets/images/pytorch-2.0-img9.png b/assets/images/pytorch-2.0-img9.png new file mode 100644 index 000000000..cb1d40571 Binary files /dev/null and b/assets/images/pytorch-2.0-img9.png differ diff --git a/assets/images/pytorch-ecosystem.png b/assets/images/pytorch-ecosystem.png new file mode 100644 index 000000000..d453c5b26 Binary files /dev/null and b/assets/images/pytorch-ecosystem.png differ diff --git a/assets/images/pytorch-edge-arrows.svg b/assets/images/pytorch-edge-arrows.svg new file mode 100644 index 000000000..f25e6771b --- /dev/null +++ b/assets/images/pytorch-edge-arrows.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/pytorch-hub-arrow.svg b/assets/images/pytorch-hub-arrow.svg new file mode 100644 index 000000000..d5c383a2d --- /dev/null +++ b/assets/images/pytorch-hub-arrow.svg @@ -0,0 +1,12 @@ + + + + + + + < + + + + + diff --git a/assets/images/pytorch-kr-logo-sm.png b/assets/images/pytorch-kr-logo-sm.png new file mode 100644 index 000000000..5d3911c68 Binary files /dev/null and b/assets/images/pytorch-kr-logo-sm.png differ diff --git a/assets/images/pytorch-kr-logo.png b/assets/images/pytorch-kr-logo.png new file mode 100644 index 000000000..c0f444ddd Binary files /dev/null and b/assets/images/pytorch-kr-logo.png differ diff --git a/assets/images/pytorch-logo.png b/assets/images/pytorch-logo.png new file mode 100644 index 000000000..bad49bf30 Binary files /dev/null and b/assets/images/pytorch-logo.png differ diff --git a/assets/images/pytorch-mobile.png b/assets/images/pytorch-mobile.png new file mode 100644 index 000000000..3f0d2340c Binary files /dev/null and b/assets/images/pytorch-mobile.png differ diff --git a/assets/images/pytorch-profiler-bottleneck.png b/assets/images/pytorch-profiler-bottleneck.png new file mode 100644 index 000000000..33bba0cf7 Binary files /dev/null and b/assets/images/pytorch-profiler-bottleneck.png differ diff --git a/assets/images/pytorch-profiler-vscode-launch.png b/assets/images/pytorch-profiler-vscode-launch.png new file mode 100644 index 000000000..5fa0299f3 Binary files /dev/null and b/assets/images/pytorch-profiler-vscode-launch.png differ diff --git a/assets/images/pytorch-profiler-vscode.png b/assets/images/pytorch-profiler-vscode.png new file mode 100644 index 000000000..4a7c47ac3 Binary files /dev/null and b/assets/images/pytorch-profiler-vscode.png differ diff --git a/assets/images/pytorch-timeline-ko.svg b/assets/images/pytorch-timeline-ko.svg new file mode 100644 index 000000000..09703667a --- /dev/null +++ b/assets/images/pytorch-timeline-ko.svg @@ -0,0 +1,80 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/pytorch-timeline.svg b/assets/images/pytorch-timeline.svg new file mode 100644 index 000000000..6997d2833 --- /dev/null +++ b/assets/images/pytorch-timeline.svg @@ -0,0 +1 @@ + \ No newline at 
end of file diff --git a/assets/images/pytorch-x.svg b/assets/images/pytorch-x.svg new file mode 100644 index 000000000..74856ea9f --- /dev/null +++ b/assets/images/pytorch-x.svg @@ -0,0 +1,10 @@ + + + + + + + diff --git a/assets/images/pytorch1.6.png b/assets/images/pytorch1.6.png new file mode 100644 index 000000000..2c173d9a7 Binary files /dev/null and b/assets/images/pytorch1.6.png differ diff --git a/assets/images/pytorch_bg_purple.jpg b/assets/images/pytorch_bg_purple.jpg new file mode 100644 index 000000000..2f48b15de Binary files /dev/null and b/assets/images/pytorch_bg_purple.jpg differ diff --git a/assets/images/pytorchmobile.png b/assets/images/pytorchmobile.png new file mode 100644 index 000000000..8f5c87142 Binary files /dev/null and b/assets/images/pytorchmobile.png differ diff --git a/assets/images/pytorchwebdataset1.png b/assets/images/pytorchwebdataset1.png new file mode 100644 index 000000000..5c61a7780 Binary files /dev/null and b/assets/images/pytorchwebdataset1.png differ diff --git a/assets/images/qaid.gif b/assets/images/qaid.gif new file mode 100644 index 000000000..4815089d0 Binary files /dev/null and b/assets/images/qaid.gif differ diff --git a/assets/images/rebellions-logo.svg b/assets/images/rebellions-logo.svg new file mode 100644 index 000000000..200a62c80 --- /dev/null +++ b/assets/images/rebellions-logo.svg @@ -0,0 +1,20 @@ + + + + + + + + + + + + + + + + + + + + diff --git a/assets/images/resnest.jpg b/assets/images/resnest.jpg new file mode 100644 index 000000000..994dc6ff0 Binary files /dev/null and b/assets/images/resnest.jpg differ diff --git a/assets/images/resnet.png b/assets/images/resnet.png new file mode 100644 index 000000000..81b782967 Binary files /dev/null and b/assets/images/resnet.png differ diff --git a/assets/images/resnext.png b/assets/images/resnext.png new file mode 100644 index 000000000..f74c4eb90 Binary files /dev/null and b/assets/images/resnext.png differ diff --git a/assets/images/salesforce.png b/assets/images/salesforce.png new file mode 100644 index 000000000..22bf99e04 Binary files /dev/null and b/assets/images/salesforce.png differ diff --git a/assets/images/seamless.png b/assets/images/seamless.png new file mode 100644 index 000000000..ad0b9d8f2 Binary files /dev/null and b/assets/images/seamless.png differ diff --git a/assets/images/search-icon-orange.svg b/assets/images/search-icon-orange.svg new file mode 100644 index 000000000..0e66c2c77 --- /dev/null +++ b/assets/images/search-icon-orange.svg @@ -0,0 +1,11 @@ + + + + + + + + + + + diff --git a/assets/images/search-icon-white.svg b/assets/images/search-icon-white.svg new file mode 100644 index 000000000..70b2f6e5b --- /dev/null +++ b/assets/images/search-icon-white.svg @@ -0,0 +1,11 @@ + + + + + + + + + + + diff --git a/assets/images/search-icon.svg b/assets/images/search-icon.svg new file mode 100644 index 000000000..ade68ef1a --- /dev/null +++ b/assets/images/search-icon.svg @@ -0,0 +1,11 @@ + + + + + + + + + + + diff --git a/assets/images/sentiment.png b/assets/images/sentiment.png new file mode 100644 index 000000000..27f3ed12b Binary files /dev/null and b/assets/images/sentiment.png differ diff --git a/assets/images/shufflenet_v2_1.png b/assets/images/shufflenet_v2_1.png new file mode 100644 index 000000000..2e6750b41 Binary files /dev/null and b/assets/images/shufflenet_v2_1.png differ diff --git a/assets/images/shufflenet_v2_2.png b/assets/images/shufflenet_v2_2.png new file mode 100644 index 000000000..44960faac Binary files /dev/null and 
b/assets/images/shufflenet_v2_2.png differ diff --git a/assets/images/sigsep_logo_inria.png b/assets/images/sigsep_logo_inria.png new file mode 100644 index 000000000..066ea8861 Binary files /dev/null and b/assets/images/sigsep_logo_inria.png differ diff --git a/assets/images/sigsep_umx-diagram.png b/assets/images/sigsep_umx-diagram.png new file mode 100644 index 000000000..9cb5c4a35 Binary files /dev/null and b/assets/images/sigsep_umx-diagram.png differ diff --git a/assets/images/silero_imagenet_moment.png b/assets/images/silero_imagenet_moment.png new file mode 100644 index 000000000..faa16dc5c Binary files /dev/null and b/assets/images/silero_imagenet_moment.png differ diff --git a/assets/images/silero_logo.jpg b/assets/images/silero_logo.jpg new file mode 100644 index 000000000..0ced1942a Binary files /dev/null and b/assets/images/silero_logo.jpg differ diff --git a/assets/images/silero_stt_model.jpg b/assets/images/silero_stt_model.jpg new file mode 100644 index 000000000..2e67c11c2 Binary files /dev/null and b/assets/images/silero_stt_model.jpg differ diff --git a/assets/images/silero_vad_performance.png b/assets/images/silero_vad_performance.png new file mode 100644 index 000000000..9d1d9f4f1 Binary files /dev/null and b/assets/images/silero_vad_performance.png differ diff --git a/assets/images/slowfast.png b/assets/images/slowfast.png new file mode 100644 index 000000000..c5f542a1f Binary files /dev/null and b/assets/images/slowfast.png differ diff --git a/assets/images/snowflake-logo.svg b/assets/images/snowflake-logo.svg new file mode 100644 index 000000000..479b911b7 --- /dev/null +++ b/assets/images/snowflake-logo.svg @@ -0,0 +1,26 @@ + + + + Group + Created with Sketch. + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/assets/images/spectrograms.png b/assets/images/spectrograms.png new file mode 100644 index 000000000..48f6a1ef5 Binary files /dev/null and b/assets/images/spectrograms.png differ diff --git a/assets/images/squares-icon.svg b/assets/images/squares-icon.svg new file mode 100644 index 000000000..2b9a75a33 --- /dev/null +++ b/assets/images/squares-icon.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/squeezenet.png b/assets/images/squeezenet.png new file mode 100644 index 000000000..f98b670f4 Binary files /dev/null and b/assets/images/squeezenet.png differ diff --git a/assets/images/ssd.png b/assets/images/ssd.png new file mode 100644 index 000000000..a7bdff94e Binary files /dev/null and b/assets/images/ssd.png differ diff --git a/assets/images/ssd_diagram.png b/assets/images/ssd_diagram.png new file mode 100644 index 000000000..cbb0e69bc Binary files /dev/null and b/assets/images/ssd_diagram.png differ diff --git a/assets/images/ssl-image.png b/assets/images/ssl-image.png new file mode 100644 index 000000000..0fa72e245 Binary files /dev/null and b/assets/images/ssl-image.png differ diff --git a/assets/images/stanford-university.png b/assets/images/stanford-university.png new file mode 100644 index 000000000..c18454477 Binary files /dev/null and b/assets/images/stanford-university.png differ diff --git a/assets/images/stopwatch-icon.svg b/assets/images/stopwatch-icon.svg new file mode 100644 index 000000000..eed1bb869 --- /dev/null +++ b/assets/images/stopwatch-icon.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/images/summer_hackathon_2020.jpeg b/assets/images/summer_hackathon_2020.jpeg new file mode 100644 index 000000000..6eaff0ce2 Binary files /dev/null and 
b/assets/images/summer_hackathon_2020.jpeg differ diff --git a/assets/images/swapytorch1.png b/assets/images/swapytorch1.png new file mode 100644 index 000000000..784e30f87 Binary files /dev/null and b/assets/images/swapytorch1.png differ diff --git a/assets/images/swapytorch10.png b/assets/images/swapytorch10.png new file mode 100644 index 000000000..0b292face Binary files /dev/null and b/assets/images/swapytorch10.png differ diff --git a/assets/images/swapytorch2.png b/assets/images/swapytorch2.png new file mode 100644 index 000000000..9a4150da8 Binary files /dev/null and b/assets/images/swapytorch2.png differ diff --git a/assets/images/swapytorch3.jpg b/assets/images/swapytorch3.jpg new file mode 100644 index 000000000..6189bd88c Binary files /dev/null and b/assets/images/swapytorch3.jpg differ diff --git a/assets/images/swapytorch4.png b/assets/images/swapytorch4.png new file mode 100644 index 000000000..a64656629 Binary files /dev/null and b/assets/images/swapytorch4.png differ diff --git a/assets/images/swapytorch5.png b/assets/images/swapytorch5.png new file mode 100644 index 000000000..cc3c1f8d4 Binary files /dev/null and b/assets/images/swapytorch5.png differ diff --git a/assets/images/swapytorch6.png b/assets/images/swapytorch6.png new file mode 100644 index 000000000..0be145c26 Binary files /dev/null and b/assets/images/swapytorch6.png differ diff --git a/assets/images/swapytorch7.png b/assets/images/swapytorch7.png new file mode 100644 index 000000000..3e70edeb9 Binary files /dev/null and b/assets/images/swapytorch7.png differ diff --git a/assets/images/swapytorch8.jpg b/assets/images/swapytorch8.jpg new file mode 100644 index 000000000..6dc85f1f3 Binary files /dev/null and b/assets/images/swapytorch8.jpg differ diff --git a/assets/images/swapytorch8.png b/assets/images/swapytorch8.png new file mode 100644 index 000000000..3b1ba9e8b Binary files /dev/null and b/assets/images/swapytorch8.png differ diff --git a/assets/images/swapytorch9.png b/assets/images/swapytorch9.png new file mode 100644 index 000000000..64cd95291 Binary files /dev/null and b/assets/images/swapytorch9.png differ diff --git a/assets/images/t-vs-eager-mode.svg b/assets/images/t-vs-eager-mode.svg new file mode 100644 index 000000000..f56363d3b --- /dev/null +++ b/assets/images/t-vs-eager-mode.svg @@ -0,0 +1,80 @@ + + + + + + + + ~1.5X + ~1.5X + ~1.7X + ~2.3X + 1X + eager-mode + Eager Mode + Eager Mode + + + DistillGPT2 + TorchInductor + CamemBert + T5Small + \ No newline at end of file diff --git a/assets/images/tacotron2_diagram.png b/assets/images/tacotron2_diagram.png new file mode 100644 index 000000000..6efb12f93 Binary files /dev/null and b/assets/images/tacotron2_diagram.png differ diff --git a/assets/images/tensorboard_model.png b/assets/images/tensorboard_model.png new file mode 100644 index 000000000..e4222bc9f Binary files /dev/null and b/assets/images/tensorboard_model.png differ diff --git a/assets/images/tochvisionmobile.png b/assets/images/tochvisionmobile.png new file mode 100644 index 000000000..bf84a4667 Binary files /dev/null and b/assets/images/tochvisionmobile.png differ diff --git a/assets/images/torch_stack1.png b/assets/images/torch_stack1.png new file mode 100644 index 000000000..2a5c50908 Binary files /dev/null and b/assets/images/torch_stack1.png differ diff --git a/assets/images/torchchat.png b/assets/images/torchchat.png new file mode 100644 index 000000000..2018dfd1e Binary files /dev/null and b/assets/images/torchchat.png differ diff --git a/assets/images/torchcsprng.png 
b/assets/images/torchcsprng.png new file mode 100644 index 000000000..9d9273a07 Binary files /dev/null and b/assets/images/torchcsprng.png differ diff --git a/assets/images/torchvision_0.3_headline.png b/assets/images/torchvision_0.3_headline.png new file mode 100644 index 000000000..270ed5f3c Binary files /dev/null and b/assets/images/torchvision_0.3_headline.png differ diff --git a/assets/images/training-moes/fg1.png b/assets/images/training-moes/fg1.png new file mode 100644 index 000000000..70242caaa Binary files /dev/null and b/assets/images/training-moes/fg1.png differ diff --git a/assets/images/training-moes/fg2.png b/assets/images/training-moes/fg2.png new file mode 100644 index 000000000..0543d9062 Binary files /dev/null and b/assets/images/training-moes/fg2.png differ diff --git a/assets/images/training-moes/fg3.png b/assets/images/training-moes/fg3.png new file mode 100644 index 000000000..525e2c110 Binary files /dev/null and b/assets/images/training-moes/fg3.png differ diff --git a/assets/images/training-moes/fg4.png b/assets/images/training-moes/fg4.png new file mode 100644 index 000000000..40b6187ad Binary files /dev/null and b/assets/images/training-moes/fg4.png differ diff --git a/assets/images/training-moes/fg5.png b/assets/images/training-moes/fg5.png new file mode 100644 index 000000000..0b8c79fd4 Binary files /dev/null and b/assets/images/training-moes/fg5.png differ diff --git a/assets/images/transformer.png b/assets/images/transformer.png new file mode 100644 index 000000000..a84f91f1b Binary files /dev/null and b/assets/images/transformer.png differ diff --git a/assets/images/tutorialhomepage.png b/assets/images/tutorialhomepage.png new file mode 100644 index 000000000..ebe3e7fea Binary files /dev/null and b/assets/images/tutorialhomepage.png differ diff --git a/assets/images/udacity.png b/assets/images/udacity.png new file mode 100644 index 000000000..413f42e84 Binary files /dev/null and b/assets/images/udacity.png differ diff --git a/assets/images/ultralytics_yolov5_img0.jpg b/assets/images/ultralytics_yolov5_img0.jpg new file mode 100644 index 000000000..bffe5837e Binary files /dev/null and b/assets/images/ultralytics_yolov5_img0.jpg differ diff --git a/assets/images/ultralytics_yolov5_img1.jpg b/assets/images/ultralytics_yolov5_img1.jpg new file mode 100644 index 000000000..cc42fff74 Binary files /dev/null and b/assets/images/ultralytics_yolov5_img1.jpg differ diff --git a/assets/images/ultralytics_yolov5_img2.png b/assets/images/ultralytics_yolov5_img2.png new file mode 100644 index 000000000..59c1792b9 Binary files /dev/null and b/assets/images/ultralytics_yolov5_img2.png differ diff --git a/assets/images/unet_brain_mri.png b/assets/images/unet_brain_mri.png new file mode 100644 index 000000000..397719f37 Binary files /dev/null and b/assets/images/unet_brain_mri.png differ diff --git a/assets/images/unet_tcga_cs_4944.png b/assets/images/unet_tcga_cs_4944.png new file mode 100644 index 000000000..d7e556675 Binary files /dev/null and b/assets/images/unet_tcga_cs_4944.png differ diff --git a/assets/images/vgg.png b/assets/images/vgg.png new file mode 100644 index 000000000..f7a03d160 Binary files /dev/null and b/assets/images/vgg.png differ diff --git a/assets/images/visdom.png b/assets/images/visdom.png new file mode 100644 index 000000000..2234ef271 Binary files /dev/null and b/assets/images/visdom.png differ diff --git a/assets/images/waveglow_diagram.png b/assets/images/waveglow_diagram.png new file mode 100644 index 000000000..3ea45444a Binary files /dev/null 
and b/assets/images/waveglow_diagram.png differ diff --git a/assets/images/webdataset1.png b/assets/images/webdataset1.png new file mode 100644 index 000000000..9bf62cf89 Binary files /dev/null and b/assets/images/webdataset1.png differ diff --git a/assets/images/webdataset2.png b/assets/images/webdataset2.png new file mode 100644 index 000000000..b660928e3 Binary files /dev/null and b/assets/images/webdataset2.png differ diff --git a/assets/images/webdataset3.png b/assets/images/webdataset3.png new file mode 100644 index 000000000..f77d7108e Binary files /dev/null and b/assets/images/webdataset3.png differ diff --git a/assets/images/wide_resnet.png b/assets/images/wide_resnet.png new file mode 100644 index 000000000..21ab12edb Binary files /dev/null and b/assets/images/wide_resnet.png differ diff --git a/assets/images/wsl-image.png b/assets/images/wsl-image.png new file mode 100644 index 000000000..0da85d8dd Binary files /dev/null and b/assets/images/wsl-image.png differ diff --git a/assets/images/x3d.png b/assets/images/x3d.png new file mode 100644 index 000000000..7f86e44b7 Binary files /dev/null and b/assets/images/x3d.png differ diff --git a/assets/images/yolop.png b/assets/images/yolop.png new file mode 100644 index 000000000..1a6088452 Binary files /dev/null and b/assets/images/yolop.png differ diff --git a/assets/main-menu-dropdown.js b/assets/main-menu-dropdown.js new file mode 100644 index 000000000..3da2d3220 --- /dev/null +++ b/assets/main-menu-dropdown.js @@ -0,0 +1,15 @@ +$("[data-toggle='resources-dropdown']").hover(function() { + toggleDropdown($(this).attr("data-toggle")); +}); + +function toggleDropdown(menuToggle) { + var showMenuClass = "show-menu"; + var menuClass = "." + menuToggle + "-menu"; + + if ($(menuClass).hasClass(showMenuClass)) { + $(menuClass).removeClass(showMenuClass); + } else { + $("[data-toggle=" + menuToggle + "].show-menu").removeClass(showMenuClass); + $(menuClass).addClass(showMenuClass); + } +} diff --git a/assets/main.css b/assets/main.css new file mode 100644 index 000000000..ba0214037 --- /dev/null +++ b/assets/main.css @@ -0,0 +1,6 @@ +/*! + * Bootstrap v4.3.1 (https://getbootstrap.com/) + * Copyright 2011-2019 The Bootstrap Authors + * Copyright 2011-2019 Twitter, Inc. 
+ * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE) + */@import url("https://fonts.googleapis.com/css?family=Nanum+Gothic:400,700");@import url("https://fonts.googleapis.com/css?family=Nanum+Gothic+Coding:400,700");:root{--blue: #007bff;--indigo: #6610f2;--purple: #6f42c1;--pink: #e83e8c;--red: #dc3545;--orange: #fd7e14;--yellow: #ffc107;--green: #28a745;--teal: #20c997;--cyan: #17a2b8;--white: #fff;--gray: #6c757d;--gray-dark: #343a40;--primary: #007bff;--secondary: #6c757d;--success: #28a745;--info: #17a2b8;--warning: #ffc107;--danger: #dc3545;--light: #f8f9fa;--dark: #343a40;--breakpoint-xs: 0;--breakpoint-sm: 576px;--breakpoint-md: 768px;--breakpoint-lg: 992px;--breakpoint-xl: 1200px;--font-family-sans-serif: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji";--font-family-monospace: SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace}*,*::before,*::after{box-sizing:border-box}html{font-family:sans-serif;line-height:1.15;-webkit-text-size-adjust:100%;-webkit-tap-highlight-color:transparent}article,aside,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}body{margin:0;font-family:-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,"Noto Sans",sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol","Noto Color Emoji";font-size:1rem;font-weight:400;line-height:1.5;color:#212529;text-align:left;background-color:#fff}[tabindex="-1"]:focus{outline:0 !important}hr{box-sizing:content-box;height:0;overflow:visible}h1,h2,h3,h4,h5,h6{margin-top:0;margin-bottom:.5rem}p{margin-top:0;margin-bottom:1rem}abbr[title],abbr[data-original-title]{text-decoration:underline;-webkit-text-decoration:underline dotted;text-decoration:underline dotted;cursor:help;border-bottom:0;-webkit-text-decoration-skip-ink:none;text-decoration-skip-ink:none}address{margin-bottom:1rem;font-style:normal;line-height:inherit}ol,ul,dl{margin-top:0;margin-bottom:1rem}ol ol,ul ul,ol ul,ul ol{margin-bottom:0}dt{font-weight:700}dd{margin-bottom:.5rem;margin-left:0}blockquote{margin:0 0 1rem}b,strong{font-weight:bolder}small{font-size:80%}sub,sup{position:relative;font-size:75%;line-height:0;vertical-align:baseline}sub{bottom:-.25em}sup{top:-.5em}a{color:#007bff;text-decoration:none;background-color:transparent}a:hover{color:#0056b3;text-decoration:underline}a:not([href]):not([tabindex]){color:inherit;text-decoration:none}a:not([href]):not([tabindex]):hover,a:not([href]):not([tabindex]):focus{color:inherit;text-decoration:none}a:not([href]):not([tabindex]):focus{outline:0}pre,code,kbd,samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,"Liberation Mono","Courier New",monospace;font-size:1em}pre{margin-top:0;margin-bottom:1rem;overflow:auto}figure{margin:0 0 1rem}img{vertical-align:middle;border-style:none}svg{overflow:hidden;vertical-align:middle}table{border-collapse:collapse}caption{padding-top:.75rem;padding-bottom:.75rem;color:#6c757d;text-align:left;caption-side:bottom}th{text-align:inherit}label{display:inline-block;margin-bottom:.5rem}button{border-radius:0}button:focus{outline:1px dotted;outline:5px auto 
-webkit-focus-ring-color}input,button,select,optgroup,textarea{margin:0;font-family:inherit;font-size:inherit;line-height:inherit}button,input{overflow:visible}button,select{text-transform:none}select{word-wrap:normal}button,[type="button"],[type="reset"],[type="submit"]{-webkit-appearance:button}button:not(:disabled),[type="button"]:not(:disabled),[type="reset"]:not(:disabled),[type="submit"]:not(:disabled){cursor:pointer}button::-moz-focus-inner,[type="button"]::-moz-focus-inner,[type="reset"]::-moz-focus-inner,[type="submit"]::-moz-focus-inner{padding:0;border-style:none}input[type="radio"],input[type="checkbox"]{box-sizing:border-box;padding:0}input[type="date"],input[type="time"],input[type="datetime-local"],input[type="month"]{-webkit-appearance:listbox}textarea{overflow:auto;resize:vertical}fieldset{min-width:0;padding:0;margin:0;border:0}legend{display:block;width:100%;max-width:100%;padding:0;margin-bottom:.5rem;font-size:1.5rem;line-height:inherit;color:inherit;white-space:normal}progress{vertical-align:baseline}[type="number"]::-webkit-inner-spin-button,[type="number"]::-webkit-outer-spin-button{height:auto}[type="search"]{outline-offset:-2px;-webkit-appearance:none}[type="search"]::-webkit-search-decoration{-webkit-appearance:none}::-webkit-file-upload-button{font:inherit;-webkit-appearance:button}output{display:inline-block}summary{display:list-item;cursor:pointer}template{display:none}[hidden]{display:none !important}h1,h2,h3,h4,h5,h6,.h1,.h2,.h3,.h4,.h5,.h6{margin-bottom:.5rem;font-weight:500;line-height:1.2}h1,.h1{font-size:2.5rem}h2,.h2{font-size:2rem}h3,.h3{font-size:1.75rem}h4,.h4{font-size:1.5rem}h5,.h5{font-size:1.25rem}h6,.h6{font-size:1rem}.lead{font-size:1.25rem;font-weight:300}.display-1{font-size:6rem;font-weight:300;line-height:1.2}.display-2{font-size:5.5rem;font-weight:300;line-height:1.2}.display-3{font-size:4.5rem;font-weight:300;line-height:1.2}.display-4{font-size:3.5rem;font-weight:300;line-height:1.2}hr{margin-top:1rem;margin-bottom:1rem;border:0;border-top:1px solid rgba(0,0,0,0.1)}small,.small{font-size:80%;font-weight:400}mark,.mark{padding:.2em;background-color:#fcf8e3}.list-unstyled{padding-left:0;list-style:none}.list-inline{padding-left:0;list-style:none}.list-inline-item{display:inline-block}.list-inline-item:not(:last-child){margin-right:.5rem}.initialism{font-size:90%;text-transform:uppercase}.blockquote{margin-bottom:1rem;font-size:1.25rem}.blockquote-footer{display:block;font-size:80%;color:#6c757d}.blockquote-footer::before{content:"\2014\00A0"}.img-fluid{max-width:100%;height:auto}.img-thumbnail{padding:.25rem;background-color:#fff;border:1px solid #dee2e6;border-radius:.25rem;max-width:100%;height:auto}.figure{display:inline-block}.figure-img{margin-bottom:.5rem;line-height:1}.figure-caption{font-size:90%;color:#6c757d}code{font-size:87.5%;color:#e83e8c;word-break:break-word}a>code{color:inherit}kbd{padding:.2rem .4rem;font-size:87.5%;color:#fff;background-color:#212529;border-radius:.2rem}kbd kbd{padding:0;font-size:100%;font-weight:700}pre{display:block;font-size:87.5%;color:#212529}pre code{font-size:inherit;color:inherit;word-break:normal}.pre-scrollable{max-height:340px;overflow-y:scroll}.container{width:100%;padding-right:15px;padding-left:15px;margin-right:auto;margin-left:auto}@media (min-width: 576px){.container{max-width:540px}}@media (min-width: 768px){.container{max-width:720px}}@media (min-width: 992px){.container{max-width:960px}}@media (min-width: 
1200px){.container{max-width:1140px}}.container-fluid{width:100%;padding-right:15px;padding-left:15px;margin-right:auto;margin-left:auto}.row{display:flex;flex-wrap:wrap;margin-right:-15px;margin-left:-15px}.no-gutters{margin-right:0;margin-left:0}.no-gutters>.col,.no-gutters>[class*="col-"]{padding-right:0;padding-left:0}.col-1,.col-2,.col-3,.col-4,.col-5,.col-6,.col-7,.col-8,.col-9,.col-10,.col-11,.col-12,.col,.col-auto,.col-sm-1,.col-sm-2,.col-sm-3,.col-sm-4,.col-sm-5,.col-sm-6,.col-sm-7,.col-sm-8,.col-sm-9,.col-sm-10,.col-sm-11,.col-sm-12,.col-sm,.col-sm-auto,.col-md-1,.col-md-2,.col-md-3,.col-md-4,.col-md-5,.col-md-6,.col-md-7,.col-md-8,.col-md-9,.col-md-10,.col-md-11,.col-md-12,.col-md,.col-md-auto,.col-lg-1,.col-lg-2,.col-lg-3,.col-lg-4,.col-lg-5,.col-lg-6,.col-lg-7,.col-lg-8,.col-lg-9,.col-lg-10,.col-lg-11,.col-lg-12,.col-lg,.col-lg-auto,.col-xl-1,.col-xl-2,.col-xl-3,.col-xl-4,.col-xl-5,.col-xl-6,.col-xl-7,.col-xl-8,.col-xl-9,.col-xl-10,.col-xl-11,.col-xl-12,.col-xl,.col-xl-auto{position:relative;width:100%;padding-right:15px;padding-left:15px}.col{flex-basis:0;flex-grow:1;max-width:100%}.col-auto{flex:0 0 auto;width:auto;max-width:100%}.col-1{flex:0 0 8.3333333333%;max-width:8.3333333333%}.col-2{flex:0 0 16.6666666667%;max-width:16.6666666667%}.col-3{flex:0 0 25%;max-width:25%}.col-4{flex:0 0 33.3333333333%;max-width:33.3333333333%}.col-5{flex:0 0 41.6666666667%;max-width:41.6666666667%}.col-6{flex:0 0 50%;max-width:50%}.col-7{flex:0 0 58.3333333333%;max-width:58.3333333333%}.col-8{flex:0 0 66.6666666667%;max-width:66.6666666667%}.col-9{flex:0 0 75%;max-width:75%}.col-10{flex:0 0 83.3333333333%;max-width:83.3333333333%}.col-11{flex:0 0 91.6666666667%;max-width:91.6666666667%}.col-12{flex:0 0 100%;max-width:100%}.order-first{order:-1}.order-last{order:13}.order-0{order:0}.order-1{order:1}.order-2{order:2}.order-3{order:3}.order-4{order:4}.order-5{order:5}.order-6{order:6}.order-7{order:7}.order-8{order:8}.order-9{order:9}.order-10{order:10}.order-11{order:11}.order-12{order:12}.offset-1{margin-left:8.3333333333%}.offset-2{margin-left:16.6666666667%}.offset-3{margin-left:25%}.offset-4{margin-left:33.3333333333%}.offset-5{margin-left:41.6666666667%}.offset-6{margin-left:50%}.offset-7{margin-left:58.3333333333%}.offset-8{margin-left:66.6666666667%}.offset-9{margin-left:75%}.offset-10{margin-left:83.3333333333%}.offset-11{margin-left:91.6666666667%}@media (min-width: 576px){.col-sm{flex-basis:0;flex-grow:1;max-width:100%}.col-sm-auto{flex:0 0 auto;width:auto;max-width:100%}.col-sm-1{flex:0 0 8.3333333333%;max-width:8.3333333333%}.col-sm-2{flex:0 0 16.6666666667%;max-width:16.6666666667%}.col-sm-3{flex:0 0 25%;max-width:25%}.col-sm-4{flex:0 0 33.3333333333%;max-width:33.3333333333%}.col-sm-5{flex:0 0 41.6666666667%;max-width:41.6666666667%}.col-sm-6{flex:0 0 50%;max-width:50%}.col-sm-7{flex:0 0 58.3333333333%;max-width:58.3333333333%}.col-sm-8{flex:0 0 66.6666666667%;max-width:66.6666666667%}.col-sm-9{flex:0 0 75%;max-width:75%}.col-sm-10{flex:0 0 83.3333333333%;max-width:83.3333333333%}.col-sm-11{flex:0 0 91.6666666667%;max-width:91.6666666667%}.col-sm-12{flex:0 0 
100%;max-width:100%}.order-sm-first{order:-1}.order-sm-last{order:13}.order-sm-0{order:0}.order-sm-1{order:1}.order-sm-2{order:2}.order-sm-3{order:3}.order-sm-4{order:4}.order-sm-5{order:5}.order-sm-6{order:6}.order-sm-7{order:7}.order-sm-8{order:8}.order-sm-9{order:9}.order-sm-10{order:10}.order-sm-11{order:11}.order-sm-12{order:12}.offset-sm-0{margin-left:0}.offset-sm-1{margin-left:8.3333333333%}.offset-sm-2{margin-left:16.6666666667%}.offset-sm-3{margin-left:25%}.offset-sm-4{margin-left:33.3333333333%}.offset-sm-5{margin-left:41.6666666667%}.offset-sm-6{margin-left:50%}.offset-sm-7{margin-left:58.3333333333%}.offset-sm-8{margin-left:66.6666666667%}.offset-sm-9{margin-left:75%}.offset-sm-10{margin-left:83.3333333333%}.offset-sm-11{margin-left:91.6666666667%}}@media (min-width: 768px){.col-md{flex-basis:0;flex-grow:1;max-width:100%}.col-md-auto{flex:0 0 auto;width:auto;max-width:100%}.col-md-1{flex:0 0 8.3333333333%;max-width:8.3333333333%}.col-md-2{flex:0 0 16.6666666667%;max-width:16.6666666667%}.col-md-3{flex:0 0 25%;max-width:25%}.col-md-4{flex:0 0 33.3333333333%;max-width:33.3333333333%}.col-md-5{flex:0 0 41.6666666667%;max-width:41.6666666667%}.col-md-6{flex:0 0 50%;max-width:50%}.col-md-7{flex:0 0 58.3333333333%;max-width:58.3333333333%}.col-md-8{flex:0 0 66.6666666667%;max-width:66.6666666667%}.col-md-9{flex:0 0 75%;max-width:75%}.col-md-10{flex:0 0 83.3333333333%;max-width:83.3333333333%}.col-md-11{flex:0 0 91.6666666667%;max-width:91.6666666667%}.col-md-12{flex:0 0 100%;max-width:100%}.order-md-first{order:-1}.order-md-last{order:13}.order-md-0{order:0}.order-md-1{order:1}.order-md-2{order:2}.order-md-3{order:3}.order-md-4{order:4}.order-md-5{order:5}.order-md-6{order:6}.order-md-7{order:7}.order-md-8{order:8}.order-md-9{order:9}.order-md-10{order:10}.order-md-11{order:11}.order-md-12{order:12}.offset-md-0{margin-left:0}.offset-md-1{margin-left:8.3333333333%}.offset-md-2{margin-left:16.6666666667%}.offset-md-3{margin-left:25%}.offset-md-4{margin-left:33.3333333333%}.offset-md-5{margin-left:41.6666666667%}.offset-md-6{margin-left:50%}.offset-md-7{margin-left:58.3333333333%}.offset-md-8{margin-left:66.6666666667%}.offset-md-9{margin-left:75%}.offset-md-10{margin-left:83.3333333333%}.offset-md-11{margin-left:91.6666666667%}}@media (min-width: 992px){.col-lg{flex-basis:0;flex-grow:1;max-width:100%}.col-lg-auto{flex:0 0 auto;width:auto;max-width:100%}.col-lg-1{flex:0 0 8.3333333333%;max-width:8.3333333333%}.col-lg-2{flex:0 0 16.6666666667%;max-width:16.6666666667%}.col-lg-3{flex:0 0 25%;max-width:25%}.col-lg-4{flex:0 0 33.3333333333%;max-width:33.3333333333%}.col-lg-5{flex:0 0 41.6666666667%;max-width:41.6666666667%}.col-lg-6{flex:0 0 50%;max-width:50%}.col-lg-7{flex:0 0 58.3333333333%;max-width:58.3333333333%}.col-lg-8{flex:0 0 66.6666666667%;max-width:66.6666666667%}.col-lg-9{flex:0 0 75%;max-width:75%}.col-lg-10{flex:0 0 83.3333333333%;max-width:83.3333333333%}.col-lg-11{flex:0 0 91.6666666667%;max-width:91.6666666667%}.col-lg-12{flex:0 0 
100%;max-width:100%}.order-lg-first{order:-1}.order-lg-last{order:13}.order-lg-0{order:0}.order-lg-1{order:1}.order-lg-2{order:2}.order-lg-3{order:3}.order-lg-4{order:4}.order-lg-5{order:5}.order-lg-6{order:6}.order-lg-7{order:7}.order-lg-8{order:8}.order-lg-9{order:9}.order-lg-10{order:10}.order-lg-11{order:11}.order-lg-12{order:12}.offset-lg-0{margin-left:0}.offset-lg-1{margin-left:8.3333333333%}.offset-lg-2{margin-left:16.6666666667%}.offset-lg-3{margin-left:25%}.offset-lg-4{margin-left:33.3333333333%}.offset-lg-5{margin-left:41.6666666667%}.offset-lg-6{margin-left:50%}.offset-lg-7{margin-left:58.3333333333%}.offset-lg-8{margin-left:66.6666666667%}.offset-lg-9{margin-left:75%}.offset-lg-10{margin-left:83.3333333333%}.offset-lg-11{margin-left:91.6666666667%}}@media (min-width: 1200px){.col-xl{flex-basis:0;flex-grow:1;max-width:100%}.col-xl-auto{flex:0 0 auto;width:auto;max-width:100%}.col-xl-1{flex:0 0 8.3333333333%;max-width:8.3333333333%}.col-xl-2{flex:0 0 16.6666666667%;max-width:16.6666666667%}.col-xl-3{flex:0 0 25%;max-width:25%}.col-xl-4{flex:0 0 33.3333333333%;max-width:33.3333333333%}.col-xl-5{flex:0 0 41.6666666667%;max-width:41.6666666667%}.col-xl-6{flex:0 0 50%;max-width:50%}.col-xl-7{flex:0 0 58.3333333333%;max-width:58.3333333333%}.col-xl-8{flex:0 0 66.6666666667%;max-width:66.6666666667%}.col-xl-9{flex:0 0 75%;max-width:75%}.col-xl-10{flex:0 0 83.3333333333%;max-width:83.3333333333%}.col-xl-11{flex:0 0 91.6666666667%;max-width:91.6666666667%}.col-xl-12{flex:0 0 100%;max-width:100%}.order-xl-first{order:-1}.order-xl-last{order:13}.order-xl-0{order:0}.order-xl-1{order:1}.order-xl-2{order:2}.order-xl-3{order:3}.order-xl-4{order:4}.order-xl-5{order:5}.order-xl-6{order:6}.order-xl-7{order:7}.order-xl-8{order:8}.order-xl-9{order:9}.order-xl-10{order:10}.order-xl-11{order:11}.order-xl-12{order:12}.offset-xl-0{margin-left:0}.offset-xl-1{margin-left:8.3333333333%}.offset-xl-2{margin-left:16.6666666667%}.offset-xl-3{margin-left:25%}.offset-xl-4{margin-left:33.3333333333%}.offset-xl-5{margin-left:41.6666666667%}.offset-xl-6{margin-left:50%}.offset-xl-7{margin-left:58.3333333333%}.offset-xl-8{margin-left:66.6666666667%}.offset-xl-9{margin-left:75%}.offset-xl-10{margin-left:83.3333333333%}.offset-xl-11{margin-left:91.6666666667%}}.table{width:100%;margin-bottom:1rem;color:#212529}.table th,.table td{padding:.75rem;vertical-align:top;border-top:1px solid #dee2e6}.table thead th{vertical-align:bottom;border-bottom:2px solid #dee2e6}.table tbody+tbody{border-top:2px solid #dee2e6}.table-sm th,.table-sm td{padding:.3rem}.table-bordered{border:1px solid #dee2e6}.table-bordered th,.table-bordered td{border:1px solid #dee2e6}.table-bordered thead th,.table-bordered thead td{border-bottom-width:2px}.table-borderless th,.table-borderless td,.table-borderless thead th,.table-borderless tbody+tbody{border:0}.table-striped tbody tr:nth-of-type(odd){background-color:rgba(0,0,0,0.05)}.table-hover tbody tr:hover{color:#212529;background-color:rgba(0,0,0,0.075)}.table-primary,.table-primary>th,.table-primary>td{background-color:#b8daff}.table-primary th,.table-primary td,.table-primary thead th,.table-primary tbody+tbody{border-color:#7abaff}.table-hover .table-primary:hover{background-color:#9fcdff}.table-hover .table-primary:hover>td,.table-hover .table-primary:hover>th{background-color:#9fcdff}.table-secondary,.table-secondary>th,.table-secondary>td{background-color:#d6d8db}.table-secondary th,.table-secondary td,.table-secondary thead th,.table-secondary 
tbody+tbody{border-color:#b3b7bb}.table-hover .table-secondary:hover{background-color:#c8cbcf}.table-hover .table-secondary:hover>td,.table-hover .table-secondary:hover>th{background-color:#c8cbcf}.table-success,.table-success>th,.table-success>td{background-color:#c3e6cb}.table-success th,.table-success td,.table-success thead th,.table-success tbody+tbody{border-color:#8fd19e}.table-hover .table-success:hover{background-color:#b1dfbb}.table-hover .table-success:hover>td,.table-hover .table-success:hover>th{background-color:#b1dfbb}.table-info,.table-info>th,.table-info>td{background-color:#bee5eb}.table-info th,.table-info td,.table-info thead th,.table-info tbody+tbody{border-color:#86cfda}.table-hover .table-info:hover{background-color:#abdde5}.table-hover .table-info:hover>td,.table-hover .table-info:hover>th{background-color:#abdde5}.table-warning,.table-warning>th,.table-warning>td{background-color:#ffeeba}.table-warning th,.table-warning td,.table-warning thead th,.table-warning tbody+tbody{border-color:#ffdf7e}.table-hover .table-warning:hover{background-color:#ffe8a1}.table-hover .table-warning:hover>td,.table-hover .table-warning:hover>th{background-color:#ffe8a1}.table-danger,.table-danger>th,.table-danger>td{background-color:#f5c6cb}.table-danger th,.table-danger td,.table-danger thead th,.table-danger tbody+tbody{border-color:#ed969e}.table-hover .table-danger:hover{background-color:#f1b0b7}.table-hover .table-danger:hover>td,.table-hover .table-danger:hover>th{background-color:#f1b0b7}.table-light,.table-light>th,.table-light>td{background-color:#fdfdfe}.table-light th,.table-light td,.table-light thead th,.table-light tbody+tbody{border-color:#fbfcfc}.table-hover .table-light:hover{background-color:#ececf6}.table-hover .table-light:hover>td,.table-hover .table-light:hover>th{background-color:#ececf6}.table-dark,.table-dark>th,.table-dark>td{background-color:#c6c8ca}.table-dark th,.table-dark td,.table-dark thead th,.table-dark tbody+tbody{border-color:#95999c}.table-hover .table-dark:hover{background-color:#b9bbbe}.table-hover .table-dark:hover>td,.table-hover .table-dark:hover>th{background-color:#b9bbbe}.table-active,.table-active>th,.table-active>td{background-color:rgba(0,0,0,0.075)}.table-hover .table-active:hover{background-color:rgba(0,0,0,0.075)}.table-hover .table-active:hover>td,.table-hover .table-active:hover>th{background-color:rgba(0,0,0,0.075)}.table .thead-dark th{color:#fff;background-color:#343a40;border-color:#454d55}.table .thead-light th{color:#495057;background-color:#e9ecef;border-color:#dee2e6}.table-dark{color:#fff;background-color:#343a40}.table-dark th,.table-dark td,.table-dark thead th{border-color:#454d55}.table-dark.table-bordered{border:0}.table-dark.table-striped tbody tr:nth-of-type(odd){background-color:rgba(255,255,255,0.05)}.table-dark.table-hover tbody tr:hover{color:#fff;background-color:rgba(255,255,255,0.075)}@media (max-width: 575.98px){.table-responsive-sm{display:block;width:100%;overflow-x:auto;-webkit-overflow-scrolling:touch}.table-responsive-sm>.table-bordered{border:0}}@media (max-width: 767.98px){.table-responsive-md{display:block;width:100%;overflow-x:auto;-webkit-overflow-scrolling:touch}.table-responsive-md>.table-bordered{border:0}}@media (max-width: 991.98px){.table-responsive-lg{display:block;width:100%;overflow-x:auto;-webkit-overflow-scrolling:touch}.table-responsive-lg>.table-bordered{border:0}}@media (max-width: 
1199.98px){.table-responsive-xl{display:block;width:100%;overflow-x:auto;-webkit-overflow-scrolling:touch}.table-responsive-xl>.table-bordered{border:0}}.table-responsive{display:block;width:100%;overflow-x:auto;-webkit-overflow-scrolling:touch}.table-responsive>.table-bordered{border:0}.form-control{display:block;width:100%;height:calc(1.5em + .75rem + 2px);padding:.375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#495057;background-color:#fff;background-clip:padding-box;border:1px solid #ced4da;border-radius:.25rem;transition:border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out}@media (prefers-reduced-motion: reduce){.form-control{transition:none}}.form-control::-ms-expand{background-color:transparent;border:0}.form-control:focus{color:#495057;background-color:#fff;border-color:#80bdff;outline:0;box-shadow:0 0 0 .2rem rgba(0,123,255,0.25)}.form-control::-moz-placeholder{color:#6c757d;opacity:1}.form-control:-ms-input-placeholder{color:#6c757d;opacity:1}.form-control::-ms-input-placeholder{color:#6c757d;opacity:1}.form-control::placeholder{color:#6c757d;opacity:1}.form-control:disabled,.form-control[readonly]{background-color:#e9ecef;opacity:1}select.form-control:focus::-ms-value{color:#495057;background-color:#fff}.form-control-file,.form-control-range{display:block;width:100%}.col-form-label{padding-top:calc(.375rem + 1px);padding-bottom:calc(.375rem + 1px);margin-bottom:0;font-size:inherit;line-height:1.5}.col-form-label-lg{padding-top:calc(.5rem + 1px);padding-bottom:calc(.5rem + 1px);font-size:1.25rem;line-height:1.5}.col-form-label-sm{padding-top:calc(.25rem + 1px);padding-bottom:calc(.25rem + 1px);font-size:.875rem;line-height:1.5}.form-control-plaintext{display:block;width:100%;padding-top:.375rem;padding-bottom:.375rem;margin-bottom:0;line-height:1.5;color:#212529;background-color:transparent;border:solid transparent;border-width:1px 0}.form-control-plaintext.form-control-sm,.form-control-plaintext.form-control-lg{padding-right:0;padding-left:0}.form-control-sm{height:calc(1.5em + .5rem + 2px);padding:.25rem .5rem;font-size:.875rem;line-height:1.5;border-radius:.2rem}.form-control-lg{height:calc(1.5em + 1rem + 2px);padding:.5rem 1rem;font-size:1.25rem;line-height:1.5;border-radius:.3rem}select.form-control[size],select.form-control[multiple]{height:auto}textarea.form-control{height:auto}.form-group{margin-bottom:1rem}.form-text{display:block;margin-top:.25rem}.form-row{display:flex;flex-wrap:wrap;margin-right:-5px;margin-left:-5px}.form-row>.col,.form-row>[class*="col-"]{padding-right:5px;padding-left:5px}.form-check{position:relative;display:block;padding-left:1.25rem}.form-check-input{position:absolute;margin-top:.3rem;margin-left:-1.25rem}.form-check-input:disabled ~ .form-check-label{color:#6c757d}.form-check-label{margin-bottom:0}.form-check-inline{display:inline-flex;align-items:center;padding-left:0;margin-right:.75rem}.form-check-inline .form-check-input{position:static;margin-top:0;margin-right:.3125rem;margin-left:0}.valid-feedback{display:none;width:100%;margin-top:.25rem;font-size:80%;color:#28a745}.valid-tooltip{position:absolute;top:100%;z-index:5;display:none;max-width:100%;padding:.25rem .5rem;margin-top:.1rem;font-size:.875rem;line-height:1.5;color:#fff;background-color:rgba(40,167,69,0.9);border-radius:.25rem}.was-validated .form-control:valid,.form-control.is-valid{border-color:#28a745;padding-right:calc(1.5em + .75rem);background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3e%3cpath 
fill='%2328a745' d='M2.3 6.73L.6 4.53c-.4-1.04.46-1.4 1.1-.8l1.1 1.4 3.4-3.8c.6-.63 1.6-.27 1.2.7l-4 4.6c-.43.5-.8.4-1.1.1z'/%3e%3c/svg%3e");background-repeat:no-repeat;background-position:center right calc(.375em + .1875rem);background-size:calc(.75em + .375rem) calc(.75em + .375rem)}.was-validated .form-control:valid:focus,.form-control.is-valid:focus{border-color:#28a745;box-shadow:0 0 0 .2rem rgba(40,167,69,0.25)}.was-validated .form-control:valid ~ .valid-feedback,.was-validated .form-control:valid ~ .valid-tooltip,.form-control.is-valid ~ .valid-feedback,.form-control.is-valid ~ .valid-tooltip{display:block}.was-validated textarea.form-control:valid,textarea.form-control.is-valid{padding-right:calc(1.5em + .75rem);background-position:top calc(.375em + .1875rem) right calc(.375em + .1875rem)}.was-validated .custom-select:valid,.custom-select.is-valid{border-color:#28a745;padding-right:calc((1em + .75rem) * 3 / 4 + 1.75rem);background:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 4 5'%3e%3cpath fill='%23343a40' d='M2 0L0 2h4zm0 5L0 3h4z'/%3e%3c/svg%3e") no-repeat right .75rem center/8px 10px,url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3e%3cpath fill='%2328a745' d='M2.3 6.73L.6 4.53c-.4-1.04.46-1.4 1.1-.8l1.1 1.4 3.4-3.8c.6-.63 1.6-.27 1.2.7l-4 4.6c-.43.5-.8.4-1.1.1z'/%3e%3c/svg%3e") #fff no-repeat center right 1.75rem/calc(.75em + .375rem) calc(.75em + .375rem)}.was-validated .custom-select:valid:focus,.custom-select.is-valid:focus{border-color:#28a745;box-shadow:0 0 0 .2rem rgba(40,167,69,0.25)}.was-validated .custom-select:valid ~ .valid-feedback,.was-validated .custom-select:valid ~ .valid-tooltip,.custom-select.is-valid ~ .valid-feedback,.custom-select.is-valid ~ .valid-tooltip{display:block}.was-validated .form-control-file:valid ~ .valid-feedback,.was-validated .form-control-file:valid ~ .valid-tooltip,.form-control-file.is-valid ~ .valid-feedback,.form-control-file.is-valid ~ .valid-tooltip{display:block}.was-validated .form-check-input:valid ~ .form-check-label,.form-check-input.is-valid ~ .form-check-label{color:#28a745}.was-validated .form-check-input:valid ~ .valid-feedback,.was-validated .form-check-input:valid ~ .valid-tooltip,.form-check-input.is-valid ~ .valid-feedback,.form-check-input.is-valid ~ .valid-tooltip{display:block}.was-validated .custom-control-input:valid ~ .custom-control-label,.custom-control-input.is-valid ~ .custom-control-label{color:#28a745}.was-validated .custom-control-input:valid ~ .custom-control-label::before,.custom-control-input.is-valid ~ .custom-control-label::before{border-color:#28a745}.was-validated .custom-control-input:valid ~ .valid-feedback,.was-validated .custom-control-input:valid ~ .valid-tooltip,.custom-control-input.is-valid ~ .valid-feedback,.custom-control-input.is-valid ~ .valid-tooltip{display:block}.was-validated .custom-control-input:valid:checked ~ .custom-control-label::before,.custom-control-input.is-valid:checked ~ .custom-control-label::before{border-color:#34ce57;background-color:#34ce57}.was-validated .custom-control-input:valid:focus ~ .custom-control-label::before,.custom-control-input.is-valid:focus ~ .custom-control-label::before{box-shadow:0 0 0 .2rem rgba(40,167,69,0.25)}.was-validated .custom-control-input:valid:focus:not(:checked) ~ .custom-control-label::before,.custom-control-input.is-valid:focus:not(:checked) ~ .custom-control-label::before{border-color:#28a745}.was-validated .custom-file-input:valid ~ 
.custom-file-label,.custom-file-input.is-valid ~ .custom-file-label{border-color:#28a745}.was-validated .custom-file-input:valid ~ .valid-feedback,.was-validated .custom-file-input:valid ~ .valid-tooltip,.custom-file-input.is-valid ~ .valid-feedback,.custom-file-input.is-valid ~ .valid-tooltip{display:block}.was-validated .custom-file-input:valid:focus ~ .custom-file-label,.custom-file-input.is-valid:focus ~ .custom-file-label{border-color:#28a745;box-shadow:0 0 0 .2rem rgba(40,167,69,0.25)}.invalid-feedback{display:none;width:100%;margin-top:.25rem;font-size:80%;color:#dc3545}.invalid-tooltip{position:absolute;top:100%;z-index:5;display:none;max-width:100%;padding:.25rem .5rem;margin-top:.1rem;font-size:.875rem;line-height:1.5;color:#fff;background-color:rgba(220,53,69,0.9);border-radius:.25rem}.was-validated .form-control:invalid,.form-control.is-invalid{border-color:#dc3545;padding-right:calc(1.5em + .75rem);background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' fill='%23dc3545' viewBox='-2 -2 7 7'%3e%3cpath stroke='%23dc3545' d='M0 0l3 3m0-3L0 3'/%3e%3ccircle r='.5'/%3e%3ccircle cx='3' r='.5'/%3e%3ccircle cy='3' r='.5'/%3e%3ccircle cx='3' cy='3' r='.5'/%3e%3c/svg%3E");background-repeat:no-repeat;background-position:center right calc(.375em + .1875rem);background-size:calc(.75em + .375rem) calc(.75em + .375rem)}.was-validated .form-control:invalid:focus,.form-control.is-invalid:focus{border-color:#dc3545;box-shadow:0 0 0 .2rem rgba(220,53,69,0.25)}.was-validated .form-control:invalid ~ .invalid-feedback,.was-validated .form-control:invalid ~ .invalid-tooltip,.form-control.is-invalid ~ .invalid-feedback,.form-control.is-invalid ~ .invalid-tooltip{display:block}.was-validated textarea.form-control:invalid,textarea.form-control.is-invalid{padding-right:calc(1.5em + .75rem);background-position:top calc(.375em + .1875rem) right calc(.375em + .1875rem)}.was-validated .custom-select:invalid,.custom-select.is-invalid{border-color:#dc3545;padding-right:calc((1em + .75rem) * 3 / 4 + 1.75rem);background:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 4 5'%3e%3cpath fill='%23343a40' d='M2 0L0 2h4zm0 5L0 3h4z'/%3e%3c/svg%3e") no-repeat right .75rem center/8px 10px,url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' fill='%23dc3545' viewBox='-2 -2 7 7'%3e%3cpath stroke='%23dc3545' d='M0 0l3 3m0-3L0 3'/%3e%3ccircle r='.5'/%3e%3ccircle cx='3' r='.5'/%3e%3ccircle cy='3' r='.5'/%3e%3ccircle cx='3' cy='3' r='.5'/%3e%3c/svg%3E") #fff no-repeat center right 1.75rem/calc(.75em + .375rem) calc(.75em + .375rem)}.was-validated .custom-select:invalid:focus,.custom-select.is-invalid:focus{border-color:#dc3545;box-shadow:0 0 0 .2rem rgba(220,53,69,0.25)}.was-validated .custom-select:invalid ~ .invalid-feedback,.was-validated .custom-select:invalid ~ .invalid-tooltip,.custom-select.is-invalid ~ .invalid-feedback,.custom-select.is-invalid ~ .invalid-tooltip{display:block}.was-validated .form-control-file:invalid ~ .invalid-feedback,.was-validated .form-control-file:invalid ~ .invalid-tooltip,.form-control-file.is-invalid ~ .invalid-feedback,.form-control-file.is-invalid ~ .invalid-tooltip{display:block}.was-validated .form-check-input:invalid ~ .form-check-label,.form-check-input.is-invalid ~ .form-check-label{color:#dc3545}.was-validated .form-check-input:invalid ~ .invalid-feedback,.was-validated .form-check-input:invalid ~ .invalid-tooltip,.form-check-input.is-invalid ~ .invalid-feedback,.form-check-input.is-invalid ~ 
.invalid-tooltip{display:block}.was-validated .custom-control-input:invalid ~ .custom-control-label,.custom-control-input.is-invalid ~ .custom-control-label{color:#dc3545}.was-validated .custom-control-input:invalid ~ .custom-control-label::before,.custom-control-input.is-invalid ~ .custom-control-label::before{border-color:#dc3545}.was-validated .custom-control-input:invalid ~ .invalid-feedback,.was-validated .custom-control-input:invalid ~ .invalid-tooltip,.custom-control-input.is-invalid ~ .invalid-feedback,.custom-control-input.is-invalid ~ .invalid-tooltip{display:block}.was-validated .custom-control-input:invalid:checked ~ .custom-control-label::before,.custom-control-input.is-invalid:checked ~ .custom-control-label::before{border-color:#e4606d;background-color:#e4606d}.was-validated .custom-control-input:invalid:focus ~ .custom-control-label::before,.custom-control-input.is-invalid:focus ~ .custom-control-label::before{box-shadow:0 0 0 .2rem rgba(220,53,69,0.25)}.was-validated .custom-control-input:invalid:focus:not(:checked) ~ .custom-control-label::before,.custom-control-input.is-invalid:focus:not(:checked) ~ .custom-control-label::before{border-color:#dc3545}.was-validated .custom-file-input:invalid ~ .custom-file-label,.custom-file-input.is-invalid ~ .custom-file-label{border-color:#dc3545}.was-validated .custom-file-input:invalid ~ .invalid-feedback,.was-validated .custom-file-input:invalid ~ .invalid-tooltip,.custom-file-input.is-invalid ~ .invalid-feedback,.custom-file-input.is-invalid ~ .invalid-tooltip{display:block}.was-validated .custom-file-input:invalid:focus ~ .custom-file-label,.custom-file-input.is-invalid:focus ~ .custom-file-label{border-color:#dc3545;box-shadow:0 0 0 .2rem rgba(220,53,69,0.25)}.form-inline{display:flex;flex-flow:row wrap;align-items:center}.form-inline .form-check{width:100%}@media (min-width: 576px){.form-inline label{display:flex;align-items:center;justify-content:center;margin-bottom:0}.form-inline .form-group{display:flex;flex:0 0 auto;flex-flow:row wrap;align-items:center;margin-bottom:0}.form-inline .form-control{display:inline-block;width:auto;vertical-align:middle}.form-inline .form-control-plaintext{display:inline-block}.form-inline .input-group,.form-inline .custom-select{width:auto}.form-inline .form-check{display:flex;align-items:center;justify-content:center;width:auto;padding-left:0}.form-inline .form-check-input{position:relative;flex-shrink:0;margin-top:0;margin-right:.25rem;margin-left:0}.form-inline .custom-control{align-items:center;justify-content:center}.form-inline .custom-control-label{margin-bottom:0}}.btn{display:inline-block;font-weight:400;color:#212529;text-align:center;vertical-align:middle;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;background-color:transparent;border:1px solid transparent;padding:.375rem .75rem;font-size:1rem;line-height:1.5;border-radius:.25rem;transition:color 0.15s ease-in-out,background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out}@media (prefers-reduced-motion: reduce){.btn{transition:none}}.btn:hover{color:#212529;text-decoration:none}.btn:focus,.btn.focus{outline:0;box-shadow:0 0 0 .2rem rgba(0,123,255,0.25)}.btn.disabled,.btn:disabled{opacity:.65}a.btn.disabled,fieldset:disabled a.btn{pointer-events:none}.btn-primary{color:#fff;background-color:#007bff;border-color:#007bff}.btn-primary:hover{color:#fff;background-color:#0069d9;border-color:#0062cc}.btn-primary:focus,.btn-primary.focus{box-shadow:0 0 0 .2rem 
rgba(38,143,255,0.5)}.btn-primary.disabled,.btn-primary:disabled{color:#fff;background-color:#007bff;border-color:#007bff}.btn-primary:not(:disabled):not(.disabled):active,.btn-primary:not(:disabled):not(.disabled).active,.show>.btn-primary.dropdown-toggle{color:#fff;background-color:#0062cc;border-color:#005cbf}.btn-primary:not(:disabled):not(.disabled):active:focus,.btn-primary:not(:disabled):not(.disabled).active:focus,.show>.btn-primary.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(38,143,255,0.5)}.btn-secondary{color:#fff;background-color:#6c757d;border-color:#6c757d}.btn-secondary:hover{color:#fff;background-color:#5a6268;border-color:#545b62}.btn-secondary:focus,.btn-secondary.focus{box-shadow:0 0 0 .2rem rgba(130,138,145,0.5)}.btn-secondary.disabled,.btn-secondary:disabled{color:#fff;background-color:#6c757d;border-color:#6c757d}.btn-secondary:not(:disabled):not(.disabled):active,.btn-secondary:not(:disabled):not(.disabled).active,.show>.btn-secondary.dropdown-toggle{color:#fff;background-color:#545b62;border-color:#4e555b}.btn-secondary:not(:disabled):not(.disabled):active:focus,.btn-secondary:not(:disabled):not(.disabled).active:focus,.show>.btn-secondary.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(130,138,145,0.5)}.btn-success{color:#fff;background-color:#28a745;border-color:#28a745}.btn-success:hover{color:#fff;background-color:#218838;border-color:#1e7e34}.btn-success:focus,.btn-success.focus{box-shadow:0 0 0 .2rem rgba(72,180,97,0.5)}.btn-success.disabled,.btn-success:disabled{color:#fff;background-color:#28a745;border-color:#28a745}.btn-success:not(:disabled):not(.disabled):active,.btn-success:not(:disabled):not(.disabled).active,.show>.btn-success.dropdown-toggle{color:#fff;background-color:#1e7e34;border-color:#1c7430}.btn-success:not(:disabled):not(.disabled):active:focus,.btn-success:not(:disabled):not(.disabled).active:focus,.show>.btn-success.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(72,180,97,0.5)}.btn-info{color:#fff;background-color:#17a2b8;border-color:#17a2b8}.btn-info:hover{color:#fff;background-color:#138496;border-color:#117a8b}.btn-info:focus,.btn-info.focus{box-shadow:0 0 0 .2rem rgba(58,176,195,0.5)}.btn-info.disabled,.btn-info:disabled{color:#fff;background-color:#17a2b8;border-color:#17a2b8}.btn-info:not(:disabled):not(.disabled):active,.btn-info:not(:disabled):not(.disabled).active,.show>.btn-info.dropdown-toggle{color:#fff;background-color:#117a8b;border-color:#10707f}.btn-info:not(:disabled):not(.disabled):active:focus,.btn-info:not(:disabled):not(.disabled).active:focus,.show>.btn-info.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(58,176,195,0.5)}.btn-warning{color:#212529;background-color:#ffc107;border-color:#ffc107}.btn-warning:hover{color:#212529;background-color:#e0a800;border-color:#d39e00}.btn-warning:focus,.btn-warning.focus{box-shadow:0 0 0 .2rem rgba(222,170,12,0.5)}.btn-warning.disabled,.btn-warning:disabled{color:#212529;background-color:#ffc107;border-color:#ffc107}.btn-warning:not(:disabled):not(.disabled):active,.btn-warning:not(:disabled):not(.disabled).active,.show>.btn-warning.dropdown-toggle{color:#212529;background-color:#d39e00;border-color:#c69500}.btn-warning:not(:disabled):not(.disabled):active:focus,.btn-warning:not(:disabled):not(.disabled).active:focus,.show>.btn-warning.dropdown-toggle:focus{box-shadow:0 0 0 .2rem 
rgba(222,170,12,0.5)}.btn-danger{color:#fff;background-color:#dc3545;border-color:#dc3545}.btn-danger:hover{color:#fff;background-color:#c82333;border-color:#bd2130}.btn-danger:focus,.btn-danger.focus{box-shadow:0 0 0 .2rem rgba(225,83,97,0.5)}.btn-danger.disabled,.btn-danger:disabled{color:#fff;background-color:#dc3545;border-color:#dc3545}.btn-danger:not(:disabled):not(.disabled):active,.btn-danger:not(:disabled):not(.disabled).active,.show>.btn-danger.dropdown-toggle{color:#fff;background-color:#bd2130;border-color:#b21f2d}.btn-danger:not(:disabled):not(.disabled):active:focus,.btn-danger:not(:disabled):not(.disabled).active:focus,.show>.btn-danger.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(225,83,97,0.5)}.btn-light{color:#212529;background-color:#f8f9fa;border-color:#f8f9fa}.btn-light:hover{color:#212529;background-color:#e2e6ea;border-color:#dae0e5}.btn-light:focus,.btn-light.focus{box-shadow:0 0 0 .2rem rgba(216,217,219,0.5)}.btn-light.disabled,.btn-light:disabled{color:#212529;background-color:#f8f9fa;border-color:#f8f9fa}.btn-light:not(:disabled):not(.disabled):active,.btn-light:not(:disabled):not(.disabled).active,.show>.btn-light.dropdown-toggle{color:#212529;background-color:#dae0e5;border-color:#d3d9df}.btn-light:not(:disabled):not(.disabled):active:focus,.btn-light:not(:disabled):not(.disabled).active:focus,.show>.btn-light.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(216,217,219,0.5)}.btn-dark{color:#fff;background-color:#343a40;border-color:#343a40}.btn-dark:hover{color:#fff;background-color:#23272b;border-color:#1d2124}.btn-dark:focus,.btn-dark.focus{box-shadow:0 0 0 .2rem rgba(82,88,93,0.5)}.btn-dark.disabled,.btn-dark:disabled{color:#fff;background-color:#343a40;border-color:#343a40}.btn-dark:not(:disabled):not(.disabled):active,.btn-dark:not(:disabled):not(.disabled).active,.show>.btn-dark.dropdown-toggle{color:#fff;background-color:#1d2124;border-color:#171a1d}.btn-dark:not(:disabled):not(.disabled):active:focus,.btn-dark:not(:disabled):not(.disabled).active:focus,.show>.btn-dark.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(82,88,93,0.5)}.btn-outline-primary{color:#007bff;border-color:#007bff}.btn-outline-primary:hover{color:#fff;background-color:#007bff;border-color:#007bff}.btn-outline-primary:focus,.btn-outline-primary.focus{box-shadow:0 0 0 .2rem rgba(0,123,255,0.5)}.btn-outline-primary.disabled,.btn-outline-primary:disabled{color:#007bff;background-color:transparent}.btn-outline-primary:not(:disabled):not(.disabled):active,.btn-outline-primary:not(:disabled):not(.disabled).active,.show>.btn-outline-primary.dropdown-toggle{color:#fff;background-color:#007bff;border-color:#007bff}.btn-outline-primary:not(:disabled):not(.disabled):active:focus,.btn-outline-primary:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-primary.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(0,123,255,0.5)}.btn-outline-secondary{color:#6c757d;border-color:#6c757d}.btn-outline-secondary:hover{color:#fff;background-color:#6c757d;border-color:#6c757d}.btn-outline-secondary:focus,.btn-outline-secondary.focus{box-shadow:0 0 0 .2rem 
rgba(108,117,125,0.5)}.btn-outline-secondary.disabled,.btn-outline-secondary:disabled{color:#6c757d;background-color:transparent}.btn-outline-secondary:not(:disabled):not(.disabled):active,.btn-outline-secondary:not(:disabled):not(.disabled).active,.show>.btn-outline-secondary.dropdown-toggle{color:#fff;background-color:#6c757d;border-color:#6c757d}.btn-outline-secondary:not(:disabled):not(.disabled):active:focus,.btn-outline-secondary:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-secondary.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(108,117,125,0.5)}.btn-outline-success{color:#28a745;border-color:#28a745}.btn-outline-success:hover{color:#fff;background-color:#28a745;border-color:#28a745}.btn-outline-success:focus,.btn-outline-success.focus{box-shadow:0 0 0 .2rem rgba(40,167,69,0.5)}.btn-outline-success.disabled,.btn-outline-success:disabled{color:#28a745;background-color:transparent}.btn-outline-success:not(:disabled):not(.disabled):active,.btn-outline-success:not(:disabled):not(.disabled).active,.show>.btn-outline-success.dropdown-toggle{color:#fff;background-color:#28a745;border-color:#28a745}.btn-outline-success:not(:disabled):not(.disabled):active:focus,.btn-outline-success:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-success.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(40,167,69,0.5)}.btn-outline-info{color:#17a2b8;border-color:#17a2b8}.btn-outline-info:hover{color:#fff;background-color:#17a2b8;border-color:#17a2b8}.btn-outline-info:focus,.btn-outline-info.focus{box-shadow:0 0 0 .2rem rgba(23,162,184,0.5)}.btn-outline-info.disabled,.btn-outline-info:disabled{color:#17a2b8;background-color:transparent}.btn-outline-info:not(:disabled):not(.disabled):active,.btn-outline-info:not(:disabled):not(.disabled).active,.show>.btn-outline-info.dropdown-toggle{color:#fff;background-color:#17a2b8;border-color:#17a2b8}.btn-outline-info:not(:disabled):not(.disabled):active:focus,.btn-outline-info:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-info.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(23,162,184,0.5)}.btn-outline-warning{color:#ffc107;border-color:#ffc107}.btn-outline-warning:hover{color:#212529;background-color:#ffc107;border-color:#ffc107}.btn-outline-warning:focus,.btn-outline-warning.focus{box-shadow:0 0 0 .2rem rgba(255,193,7,0.5)}.btn-outline-warning.disabled,.btn-outline-warning:disabled{color:#ffc107;background-color:transparent}.btn-outline-warning:not(:disabled):not(.disabled):active,.btn-outline-warning:not(:disabled):not(.disabled).active,.show>.btn-outline-warning.dropdown-toggle{color:#212529;background-color:#ffc107;border-color:#ffc107}.btn-outline-warning:not(:disabled):not(.disabled):active:focus,.btn-outline-warning:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-warning.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(255,193,7,0.5)}.btn-outline-danger{color:#dc3545;border-color:#dc3545}.btn-outline-danger:hover{color:#fff;background-color:#dc3545;border-color:#dc3545}.btn-outline-danger:focus,.btn-outline-danger.focus{box-shadow:0 0 0 .2rem 
rgba(220,53,69,0.5)}.btn-outline-danger.disabled,.btn-outline-danger:disabled{color:#dc3545;background-color:transparent}.btn-outline-danger:not(:disabled):not(.disabled):active,.btn-outline-danger:not(:disabled):not(.disabled).active,.show>.btn-outline-danger.dropdown-toggle{color:#fff;background-color:#dc3545;border-color:#dc3545}.btn-outline-danger:not(:disabled):not(.disabled):active:focus,.btn-outline-danger:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-danger.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(220,53,69,0.5)}.btn-outline-light{color:#f8f9fa;border-color:#f8f9fa}.btn-outline-light:hover{color:#212529;background-color:#f8f9fa;border-color:#f8f9fa}.btn-outline-light:focus,.btn-outline-light.focus{box-shadow:0 0 0 .2rem rgba(248,249,250,0.5)}.btn-outline-light.disabled,.btn-outline-light:disabled{color:#f8f9fa;background-color:transparent}.btn-outline-light:not(:disabled):not(.disabled):active,.btn-outline-light:not(:disabled):not(.disabled).active,.show>.btn-outline-light.dropdown-toggle{color:#212529;background-color:#f8f9fa;border-color:#f8f9fa}.btn-outline-light:not(:disabled):not(.disabled):active:focus,.btn-outline-light:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-light.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(248,249,250,0.5)}.btn-outline-dark{color:#343a40;border-color:#343a40}.btn-outline-dark:hover{color:#fff;background-color:#343a40;border-color:#343a40}.btn-outline-dark:focus,.btn-outline-dark.focus{box-shadow:0 0 0 .2rem rgba(52,58,64,0.5)}.btn-outline-dark.disabled,.btn-outline-dark:disabled{color:#343a40;background-color:transparent}.btn-outline-dark:not(:disabled):not(.disabled):active,.btn-outline-dark:not(:disabled):not(.disabled).active,.show>.btn-outline-dark.dropdown-toggle{color:#fff;background-color:#343a40;border-color:#343a40}.btn-outline-dark:not(:disabled):not(.disabled):active:focus,.btn-outline-dark:not(:disabled):not(.disabled).active:focus,.show>.btn-outline-dark.dropdown-toggle:focus{box-shadow:0 0 0 .2rem rgba(52,58,64,0.5)}.btn-link{font-weight:400;color:#007bff;text-decoration:none}.btn-link:hover{color:#0056b3;text-decoration:underline}.btn-link:focus,.btn-link.focus{text-decoration:underline;box-shadow:none}.btn-link:disabled,.btn-link.disabled{color:#6c757d;pointer-events:none}.btn-lg,.btn-group-lg>.btn{padding:.5rem 1rem;font-size:1.25rem;line-height:1.5;border-radius:.3rem}.btn-sm,.btn-group-sm>.btn{padding:.25rem .5rem;font-size:.875rem;line-height:1.5;border-radius:.2rem}.btn-block{display:block;width:100%}.btn-block+.btn-block{margin-top:.5rem}input[type="submit"].btn-block,input[type="reset"].btn-block,input[type="button"].btn-block{width:100%}.fade{transition:opacity 0.15s linear}@media (prefers-reduced-motion: reduce){.fade{transition:none}}.fade:not(.show){opacity:0}.collapse:not(.show){display:none}.collapsing{position:relative;height:0;overflow:hidden;transition:height 0.35s ease}@media (prefers-reduced-motion: reduce){.collapsing{transition:none}}.dropup,.dropright,.dropdown,.dropleft{position:relative}.dropdown-toggle{white-space:nowrap}.dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:.3em solid;border-right:.3em solid transparent;border-bottom:0;border-left:.3em solid transparent}.dropdown-toggle:empty::after{margin-left:0}.dropdown-menu{position:absolute;top:100%;left:0;z-index:1000;display:none;float:left;min-width:10rem;padding:.5rem 0;margin:.125rem 0 
0;font-size:1rem;color:#212529;text-align:left;list-style:none;background-color:#fff;background-clip:padding-box;border:1px solid rgba(0,0,0,0.15);border-radius:.25rem}.dropdown-menu-left{right:auto;left:0}.dropdown-menu-right{right:0;left:auto}@media (min-width: 576px){.dropdown-menu-sm-left{right:auto;left:0}.dropdown-menu-sm-right{right:0;left:auto}}@media (min-width: 768px){.dropdown-menu-md-left{right:auto;left:0}.dropdown-menu-md-right{right:0;left:auto}}@media (min-width: 992px){.dropdown-menu-lg-left{right:auto;left:0}.dropdown-menu-lg-right{right:0;left:auto}}@media (min-width: 1200px){.dropdown-menu-xl-left{right:auto;left:0}.dropdown-menu-xl-right{right:0;left:auto}}.dropup .dropdown-menu{top:auto;bottom:100%;margin-top:0;margin-bottom:.125rem}.dropup .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:0;border-right:.3em solid transparent;border-bottom:.3em solid;border-left:.3em solid transparent}.dropup .dropdown-toggle:empty::after{margin-left:0}.dropright .dropdown-menu{top:0;right:auto;left:100%;margin-top:0;margin-left:.125rem}.dropright .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:.3em solid transparent;border-right:0;border-bottom:.3em solid transparent;border-left:.3em solid}.dropright .dropdown-toggle:empty::after{margin-left:0}.dropright .dropdown-toggle::after{vertical-align:0}.dropleft .dropdown-menu{top:0;right:100%;left:auto;margin-top:0;margin-right:.125rem}.dropleft .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:""}.dropleft .dropdown-toggle::after{display:none}.dropleft .dropdown-toggle::before{display:inline-block;margin-right:.255em;vertical-align:.255em;content:"";border-top:.3em solid transparent;border-right:.3em solid;border-bottom:.3em solid transparent}.dropleft .dropdown-toggle:empty::after{margin-left:0}.dropleft .dropdown-toggle::before{vertical-align:0}.dropdown-menu[x-placement^="top"],.dropdown-menu[x-placement^="right"],.dropdown-menu[x-placement^="bottom"],.dropdown-menu[x-placement^="left"]{right:auto;bottom:auto}.dropdown-divider{height:0;margin:.5rem 0;overflow:hidden;border-top:1px solid #e9ecef}.dropdown-item{display:block;width:100%;padding:.25rem 1.5rem;clear:both;font-weight:400;color:#212529;text-align:inherit;white-space:nowrap;background-color:transparent;border:0}.dropdown-item:hover,.dropdown-item:focus{color:#16181b;text-decoration:none;background-color:#f8f9fa}.dropdown-item.active,.dropdown-item:active{color:#fff;text-decoration:none;background-color:#007bff}.dropdown-item.disabled,.dropdown-item:disabled{color:#6c757d;pointer-events:none;background-color:transparent}.dropdown-menu.show{display:block}.dropdown-header{display:block;padding:.5rem 1.5rem;margin-bottom:0;font-size:.875rem;color:#6c757d;white-space:nowrap}.dropdown-item-text{display:block;padding:.25rem 1.5rem;color:#212529}.btn-group,.btn-group-vertical{position:relative;display:inline-flex;vertical-align:middle}.btn-group>.btn,.btn-group-vertical>.btn{position:relative;flex:1 1 auto}.btn-group>.btn:hover,.btn-group-vertical>.btn:hover{z-index:1}.btn-group>.btn:focus,.btn-group>.btn:active,.btn-group>.btn.active,.btn-group-vertical>.btn:focus,.btn-group-vertical>.btn:active,.btn-group-vertical>.btn.active{z-index:1}.btn-toolbar{display:flex;flex-wrap:wrap;justify-content:flex-start}.btn-toolbar 
.input-group{width:auto}.btn-group>.btn:not(:first-child),.btn-group>.btn-group:not(:first-child){margin-left:-1px}.btn-group>.btn:not(:last-child):not(.dropdown-toggle),.btn-group>.btn-group:not(:last-child)>.btn{border-top-right-radius:0;border-bottom-right-radius:0}.btn-group>.btn:not(:first-child),.btn-group>.btn-group:not(:first-child)>.btn{border-top-left-radius:0;border-bottom-left-radius:0}.dropdown-toggle-split{padding-right:.5625rem;padding-left:.5625rem}.dropdown-toggle-split::after,.dropup .dropdown-toggle-split::after,.dropright .dropdown-toggle-split::after{margin-left:0}.dropleft .dropdown-toggle-split::before{margin-right:0}.btn-sm+.dropdown-toggle-split,.btn-group-sm>.btn+.dropdown-toggle-split{padding-right:.375rem;padding-left:.375rem}.btn-lg+.dropdown-toggle-split,.btn-group-lg>.btn+.dropdown-toggle-split{padding-right:.75rem;padding-left:.75rem}.btn-group-vertical{flex-direction:column;align-items:flex-start;justify-content:center}.btn-group-vertical>.btn,.btn-group-vertical>.btn-group{width:100%}.btn-group-vertical>.btn:not(:first-child),.btn-group-vertical>.btn-group:not(:first-child){margin-top:-1px}.btn-group-vertical>.btn:not(:last-child):not(.dropdown-toggle),.btn-group-vertical>.btn-group:not(:last-child)>.btn{border-bottom-right-radius:0;border-bottom-left-radius:0}.btn-group-vertical>.btn:not(:first-child),.btn-group-vertical>.btn-group:not(:first-child)>.btn{border-top-left-radius:0;border-top-right-radius:0}.btn-group-toggle>.btn,.btn-group-toggle>.btn-group>.btn{margin-bottom:0}.btn-group-toggle>.btn input[type="radio"],.btn-group-toggle>.btn input[type="checkbox"],.btn-group-toggle>.btn-group>.btn input[type="radio"],.btn-group-toggle>.btn-group>.btn input[type="checkbox"]{position:absolute;clip:rect(0, 0, 0, 0);pointer-events:none}.input-group{position:relative;display:flex;flex-wrap:wrap;align-items:stretch;width:100%}.input-group>.form-control,.input-group>.form-control-plaintext,.input-group>.custom-select,.input-group>.custom-file{position:relative;flex:1 1 auto;width:1%;margin-bottom:0}.input-group>.form-control+.form-control,.input-group>.form-control+.custom-select,.input-group>.form-control+.custom-file,.input-group>.form-control-plaintext+.form-control,.input-group>.form-control-plaintext+.custom-select,.input-group>.form-control-plaintext+.custom-file,.input-group>.custom-select+.form-control,.input-group>.custom-select+.custom-select,.input-group>.custom-select+.custom-file,.input-group>.custom-file+.form-control,.input-group>.custom-file+.custom-select,.input-group>.custom-file+.custom-file{margin-left:-1px}.input-group>.form-control:focus,.input-group>.custom-select:focus,.input-group>.custom-file .custom-file-input:focus ~ .custom-file-label{z-index:3}.input-group>.custom-file .custom-file-input:focus{z-index:4}.input-group>.form-control:not(:last-child),.input-group>.custom-select:not(:last-child){border-top-right-radius:0;border-bottom-right-radius:0}.input-group>.form-control:not(:first-child),.input-group>.custom-select:not(:first-child){border-top-left-radius:0;border-bottom-left-radius:0}.input-group>.custom-file{display:flex;align-items:center}.input-group>.custom-file:not(:last-child) .custom-file-label,.input-group>.custom-file:not(:last-child) .custom-file-label::after{border-top-right-radius:0;border-bottom-right-radius:0}.input-group>.custom-file:not(:first-child) .custom-file-label{border-top-left-radius:0;border-bottom-left-radius:0}.input-group-prepend,.input-group-append{display:flex}.input-group-prepend 
.btn,.input-group-append .btn{position:relative;z-index:2}.input-group-prepend .btn:focus,.input-group-append .btn:focus{z-index:3}.input-group-prepend .btn+.btn,.input-group-prepend .btn+.input-group-text,.input-group-prepend .input-group-text+.input-group-text,.input-group-prepend .input-group-text+.btn,.input-group-append .btn+.btn,.input-group-append .btn+.input-group-text,.input-group-append .input-group-text+.input-group-text,.input-group-append .input-group-text+.btn{margin-left:-1px}.input-group-prepend{margin-right:-1px}.input-group-append{margin-left:-1px}.input-group-text{display:flex;align-items:center;padding:.375rem .75rem;margin-bottom:0;font-size:1rem;font-weight:400;line-height:1.5;color:#495057;text-align:center;white-space:nowrap;background-color:#e9ecef;border:1px solid #ced4da;border-radius:.25rem}.input-group-text input[type="radio"],.input-group-text input[type="checkbox"]{margin-top:0}.input-group-lg>.form-control:not(textarea),.input-group-lg>.custom-select{height:calc(1.5em + 1rem + 2px)}.input-group-lg>.form-control,.input-group-lg>.custom-select,.input-group-lg>.input-group-prepend>.input-group-text,.input-group-lg>.input-group-append>.input-group-text,.input-group-lg>.input-group-prepend>.btn,.input-group-lg>.input-group-append>.btn{padding:.5rem 1rem;font-size:1.25rem;line-height:1.5;border-radius:.3rem}.input-group-sm>.form-control:not(textarea),.input-group-sm>.custom-select{height:calc(1.5em + .5rem + 2px)}.input-group-sm>.form-control,.input-group-sm>.custom-select,.input-group-sm>.input-group-prepend>.input-group-text,.input-group-sm>.input-group-append>.input-group-text,.input-group-sm>.input-group-prepend>.btn,.input-group-sm>.input-group-append>.btn{padding:.25rem .5rem;font-size:.875rem;line-height:1.5;border-radius:.2rem}.input-group-lg>.custom-select,.input-group-sm>.custom-select{padding-right:1.75rem}.input-group>.input-group-prepend>.btn,.input-group>.input-group-prepend>.input-group-text,.input-group>.input-group-append:not(:last-child)>.btn,.input-group>.input-group-append:not(:last-child)>.input-group-text,.input-group>.input-group-append:last-child>.btn:not(:last-child):not(.dropdown-toggle),.input-group>.input-group-append:last-child>.input-group-text:not(:last-child){border-top-right-radius:0;border-bottom-right-radius:0}.input-group>.input-group-append>.btn,.input-group>.input-group-append>.input-group-text,.input-group>.input-group-prepend:not(:first-child)>.btn,.input-group>.input-group-prepend:not(:first-child)>.input-group-text,.input-group>.input-group-prepend:first-child>.btn:not(:first-child),.input-group>.input-group-prepend:first-child>.input-group-text:not(:first-child){border-top-left-radius:0;border-bottom-left-radius:0}.custom-control{position:relative;display:block;min-height:1.5rem;padding-left:1.5rem}.custom-control-inline{display:inline-flex;margin-right:1rem}.custom-control-input{position:absolute;z-index:-1;opacity:0}.custom-control-input:checked ~ .custom-control-label::before{color:#fff;border-color:#007bff;background-color:#007bff}.custom-control-input:focus ~ .custom-control-label::before{box-shadow:0 0 0 .2rem rgba(0,123,255,0.25)}.custom-control-input:focus:not(:checked) ~ .custom-control-label::before{border-color:#80bdff}.custom-control-input:not(:disabled):active ~ .custom-control-label::before{color:#fff;background-color:#b3d7ff;border-color:#b3d7ff}.custom-control-input:disabled ~ .custom-control-label{color:#6c757d}.custom-control-input:disabled ~ 
.custom-control-label::before{background-color:#e9ecef}.custom-control-label{position:relative;margin-bottom:0;vertical-align:top}.custom-control-label::before{position:absolute;top:.25rem;left:-1.5rem;display:block;width:1rem;height:1rem;pointer-events:none;content:"";background-color:#fff;border:#adb5bd solid 1px}.custom-control-label::after{position:absolute;top:.25rem;left:-1.5rem;display:block;width:1rem;height:1rem;content:"";background:no-repeat 50% / 50% 50%}.custom-checkbox .custom-control-label::before{border-radius:.25rem}.custom-checkbox .custom-control-input:checked ~ .custom-control-label::after{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3e%3cpath fill='%23fff' d='M6.564.75l-3.59 3.612-1.538-1.55L0 4.26 2.974 7.25 8 2.193z'/%3e%3c/svg%3e")}.custom-checkbox .custom-control-input:indeterminate ~ .custom-control-label::before{border-color:#007bff;background-color:#007bff}.custom-checkbox .custom-control-input:indeterminate ~ .custom-control-label::after{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 4 4'%3e%3cpath stroke='%23fff' d='M0 2h4'/%3e%3c/svg%3e")}.custom-checkbox .custom-control-input:disabled:checked ~ .custom-control-label::before{background-color:rgba(0,123,255,0.5)}.custom-checkbox .custom-control-input:disabled:indeterminate ~ .custom-control-label::before{background-color:rgba(0,123,255,0.5)}.custom-radio .custom-control-label::before{border-radius:50%}.custom-radio .custom-control-input:checked ~ .custom-control-label::after{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%23fff'/%3e%3c/svg%3e")}.custom-radio .custom-control-input:disabled:checked ~ .custom-control-label::before{background-color:rgba(0,123,255,0.5)}.custom-switch{padding-left:2.25rem}.custom-switch .custom-control-label::before{left:-2.25rem;width:1.75rem;pointer-events:all;border-radius:.5rem}.custom-switch .custom-control-label::after{top:calc(.25rem + 2px);left:calc(-2.25rem + 2px);width:calc(1rem - 4px);height:calc(1rem - 4px);background-color:#adb5bd;border-radius:.5rem;transition:transform 0.15s ease-in-out,background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out}@media (prefers-reduced-motion: reduce){.custom-switch .custom-control-label::after{transition:none}}.custom-switch .custom-control-input:checked ~ .custom-control-label::after{background-color:#fff;transform:translateX(.75rem)}.custom-switch .custom-control-input:disabled:checked ~ .custom-control-label::before{background-color:rgba(0,123,255,0.5)}.custom-select{display:inline-block;width:100%;height:calc(1.5em + .75rem + 2px);padding:.375rem 1.75rem .375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#495057;vertical-align:middle;background:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 4 5'%3e%3cpath fill='%23343a40' d='M2 0L0 2h4zm0 5L0 3h4z'/%3e%3c/svg%3e") no-repeat right .75rem center/8px 10px;background-color:#fff;border:1px solid #ced4da;border-radius:.25rem;-webkit-appearance:none;-moz-appearance:none;appearance:none}.custom-select:focus{border-color:#80bdff;outline:0;box-shadow:0 0 0 .2rem 
rgba(0,123,255,0.25)}.custom-select:focus::-ms-value{color:#495057;background-color:#fff}.custom-select[multiple],.custom-select[size]:not([size="1"]){height:auto;padding-right:.75rem;background-image:none}.custom-select:disabled{color:#6c757d;background-color:#e9ecef}.custom-select::-ms-expand{display:none}.custom-select-sm{height:calc(1.5em + .5rem + 2px);padding-top:.25rem;padding-bottom:.25rem;padding-left:.5rem;font-size:.875rem}.custom-select-lg{height:calc(1.5em + 1rem + 2px);padding-top:.5rem;padding-bottom:.5rem;padding-left:1rem;font-size:1.25rem}.custom-file{position:relative;display:inline-block;width:100%;height:calc(1.5em + .75rem + 2px);margin-bottom:0}.custom-file-input{position:relative;z-index:2;width:100%;height:calc(1.5em + .75rem + 2px);margin:0;opacity:0}.custom-file-input:focus ~ .custom-file-label{border-color:#80bdff;box-shadow:0 0 0 .2rem rgba(0,123,255,0.25)}.custom-file-input:disabled ~ .custom-file-label{background-color:#e9ecef}.custom-file-input:lang(en) ~ .custom-file-label::after{content:"Browse"}.custom-file-input ~ .custom-file-label[data-browse]::after{content:attr(data-browse)}.custom-file-label{position:absolute;top:0;right:0;left:0;z-index:1;height:calc(1.5em + .75rem + 2px);padding:.375rem .75rem;font-weight:400;line-height:1.5;color:#495057;background-color:#fff;border:1px solid #ced4da;border-radius:.25rem}.custom-file-label::after{position:absolute;top:0;right:0;bottom:0;z-index:3;display:block;height:calc(1.5em + .75rem);padding:.375rem .75rem;line-height:1.5;color:#495057;content:"Browse";background-color:#e9ecef;border-left:inherit;border-radius:0 .25rem .25rem 0}.custom-range{width:100%;height:calc(1rem + .4rem);padding:0;background-color:transparent;-webkit-appearance:none;-moz-appearance:none;appearance:none}.custom-range:focus{outline:none}.custom-range:focus::-webkit-slider-thumb{box-shadow:0 0 0 1px #fff,0 0 0 .2rem rgba(0,123,255,0.25)}.custom-range:focus::-moz-range-thumb{box-shadow:0 0 0 1px #fff,0 0 0 .2rem rgba(0,123,255,0.25)}.custom-range:focus::-ms-thumb{box-shadow:0 0 0 1px #fff,0 0 0 .2rem rgba(0,123,255,0.25)}.custom-range::-moz-focus-outer{border:0}.custom-range::-webkit-slider-thumb{width:1rem;height:1rem;margin-top:-.25rem;background-color:#007bff;border:0;border-radius:1rem;-webkit-transition:background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out;transition:background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out;-webkit-appearance:none;appearance:none}@media (prefers-reduced-motion: reduce){.custom-range::-webkit-slider-thumb{-webkit-transition:none;transition:none}}.custom-range::-webkit-slider-thumb:active{background-color:#b3d7ff}.custom-range::-webkit-slider-runnable-track{width:100%;height:.5rem;color:transparent;cursor:pointer;background-color:#dee2e6;border-color:transparent;border-radius:1rem}.custom-range::-moz-range-thumb{width:1rem;height:1rem;background-color:#007bff;border:0;border-radius:1rem;-moz-transition:background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out;transition:background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out;-moz-appearance:none;appearance:none}@media (prefers-reduced-motion: 
reduce){.custom-range::-moz-range-thumb{-moz-transition:none;transition:none}}.custom-range::-moz-range-thumb:active{background-color:#b3d7ff}.custom-range::-moz-range-track{width:100%;height:.5rem;color:transparent;cursor:pointer;background-color:#dee2e6;border-color:transparent;border-radius:1rem}.custom-range::-ms-thumb{width:1rem;height:1rem;margin-top:0;margin-right:.2rem;margin-left:.2rem;background-color:#007bff;border:0;border-radius:1rem;-ms-transition:background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out;transition:background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out;appearance:none}@media (prefers-reduced-motion: reduce){.custom-range::-ms-thumb{-ms-transition:none;transition:none}}.custom-range::-ms-thumb:active{background-color:#b3d7ff}.custom-range::-ms-track{width:100%;height:.5rem;color:transparent;cursor:pointer;background-color:transparent;border-color:transparent;border-width:.5rem}.custom-range::-ms-fill-lower{background-color:#dee2e6;border-radius:1rem}.custom-range::-ms-fill-upper{margin-right:15px;background-color:#dee2e6;border-radius:1rem}.custom-range:disabled::-webkit-slider-thumb{background-color:#adb5bd}.custom-range:disabled::-webkit-slider-runnable-track{cursor:default}.custom-range:disabled::-moz-range-thumb{background-color:#adb5bd}.custom-range:disabled::-moz-range-track{cursor:default}.custom-range:disabled::-ms-thumb{background-color:#adb5bd}.custom-control-label::before,.custom-file-label,.custom-select{transition:background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out}@media (prefers-reduced-motion: reduce){.custom-control-label::before,.custom-file-label,.custom-select{transition:none}}.nav{display:flex;flex-wrap:wrap;padding-left:0;margin-bottom:0;list-style:none}.nav-link{display:block;padding:.5rem 1rem}.nav-link:hover,.nav-link:focus{text-decoration:none}.nav-link.disabled{color:#6c757d;pointer-events:none;cursor:default}.nav-tabs{border-bottom:1px solid #dee2e6}.nav-tabs .nav-item{margin-bottom:-1px}.nav-tabs .nav-link{border:1px solid transparent;border-top-left-radius:.25rem;border-top-right-radius:.25rem}.nav-tabs .nav-link:hover,.nav-tabs .nav-link:focus{border-color:#e9ecef #e9ecef #dee2e6}.nav-tabs .nav-link.disabled{color:#6c757d;background-color:transparent;border-color:transparent}.nav-tabs .nav-link.active,.nav-tabs .nav-item.show .nav-link{color:#495057;background-color:#fff;border-color:#dee2e6 #dee2e6 #fff}.nav-tabs .dropdown-menu{margin-top:-1px;border-top-left-radius:0;border-top-right-radius:0}.nav-pills .nav-link{border-radius:.25rem}.nav-pills .nav-link.active,.nav-pills .show>.nav-link{color:#fff;background-color:#007bff}.nav-fill .nav-item{flex:1 1 auto;text-align:center}.nav-justified .nav-item{flex-basis:0;flex-grow:1;text-align:center}.tab-content>.tab-pane{display:none}.tab-content>.active{display:block}.navbar{position:relative;display:flex;flex-wrap:wrap;align-items:center;justify-content:space-between;padding:.5rem 1rem}.navbar>.container,.navbar>.container-fluid{display:flex;flex-wrap:wrap;align-items:center;justify-content:space-between}.navbar-brand{display:inline-block;padding-top:.3125rem;padding-bottom:.3125rem;margin-right:1rem;font-size:1.25rem;line-height:inherit;white-space:nowrap}.navbar-brand:hover,.navbar-brand:focus{text-decoration:none}.navbar-nav{display:flex;flex-direction:column;padding-left:0;margin-bottom:0;list-style:none}.navbar-nav 
.nav-link{padding-right:0;padding-left:0}.navbar-nav .dropdown-menu{position:static;float:none}.navbar-text{display:inline-block;padding-top:.5rem;padding-bottom:.5rem}.navbar-collapse{flex-basis:100%;flex-grow:1;align-items:center}.navbar-toggler{padding:.25rem .75rem;font-size:1.25rem;line-height:1;background-color:transparent;border:1px solid transparent;border-radius:.25rem}.navbar-toggler:hover,.navbar-toggler:focus{text-decoration:none}.navbar-toggler-icon{display:inline-block;width:1.5em;height:1.5em;vertical-align:middle;content:"";background:no-repeat center center;background-size:100% 100%}@media (max-width: 575.98px){.navbar-expand-sm>.container,.navbar-expand-sm>.container-fluid{padding-right:0;padding-left:0}}@media (min-width: 576px){.navbar-expand-sm{flex-flow:row nowrap;justify-content:flex-start}.navbar-expand-sm .navbar-nav{flex-direction:row}.navbar-expand-sm .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-sm .navbar-nav .nav-link{padding-right:.5rem;padding-left:.5rem}.navbar-expand-sm>.container,.navbar-expand-sm>.container-fluid{flex-wrap:nowrap}.navbar-expand-sm .navbar-collapse{display:flex !important;flex-basis:auto}.navbar-expand-sm .navbar-toggler{display:none}}@media (max-width: 767.98px){.navbar-expand-md>.container,.navbar-expand-md>.container-fluid{padding-right:0;padding-left:0}}@media (min-width: 768px){.navbar-expand-md{flex-flow:row nowrap;justify-content:flex-start}.navbar-expand-md .navbar-nav{flex-direction:row}.navbar-expand-md .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-md .navbar-nav .nav-link{padding-right:.5rem;padding-left:.5rem}.navbar-expand-md>.container,.navbar-expand-md>.container-fluid{flex-wrap:nowrap}.navbar-expand-md .navbar-collapse{display:flex !important;flex-basis:auto}.navbar-expand-md .navbar-toggler{display:none}}@media (max-width: 991.98px){.navbar-expand-lg>.container,.navbar-expand-lg>.container-fluid{padding-right:0;padding-left:0}}@media (min-width: 992px){.navbar-expand-lg{flex-flow:row nowrap;justify-content:flex-start}.navbar-expand-lg .navbar-nav{flex-direction:row}.navbar-expand-lg .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-lg .navbar-nav .nav-link{padding-right:.5rem;padding-left:.5rem}.navbar-expand-lg>.container,.navbar-expand-lg>.container-fluid{flex-wrap:nowrap}.navbar-expand-lg .navbar-collapse{display:flex !important;flex-basis:auto}.navbar-expand-lg .navbar-toggler{display:none}}@media (max-width: 1199.98px){.navbar-expand-xl>.container,.navbar-expand-xl>.container-fluid{padding-right:0;padding-left:0}}@media (min-width: 1200px){.navbar-expand-xl{flex-flow:row nowrap;justify-content:flex-start}.navbar-expand-xl .navbar-nav{flex-direction:row}.navbar-expand-xl .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-xl .navbar-nav .nav-link{padding-right:.5rem;padding-left:.5rem}.navbar-expand-xl>.container,.navbar-expand-xl>.container-fluid{flex-wrap:nowrap}.navbar-expand-xl .navbar-collapse{display:flex !important;flex-basis:auto}.navbar-expand-xl .navbar-toggler{display:none}}.navbar-expand{flex-flow:row nowrap;justify-content:flex-start}.navbar-expand>.container,.navbar-expand>.container-fluid{padding-right:0;padding-left:0}.navbar-expand .navbar-nav{flex-direction:row}.navbar-expand .navbar-nav .dropdown-menu{position:absolute}.navbar-expand .navbar-nav .nav-link{padding-right:.5rem;padding-left:.5rem}.navbar-expand>.container,.navbar-expand>.container-fluid{flex-wrap:nowrap}.navbar-expand .navbar-collapse{display:flex !important;flex-basis:auto}.navbar-expand 
.navbar-toggler{display:none}.navbar-light .navbar-brand{color:rgba(0,0,0,0.9)}.navbar-light .navbar-brand:hover,.navbar-light .navbar-brand:focus{color:rgba(0,0,0,0.9)}.navbar-light .navbar-nav .nav-link{color:rgba(0,0,0,0.5)}.navbar-light .navbar-nav .nav-link:hover,.navbar-light .navbar-nav .nav-link:focus{color:rgba(0,0,0,0.7)}.navbar-light .navbar-nav .nav-link.disabled{color:rgba(0,0,0,0.3)}.navbar-light .navbar-nav .show>.nav-link,.navbar-light .navbar-nav .active>.nav-link,.navbar-light .navbar-nav .nav-link.show,.navbar-light .navbar-nav .nav-link.active{color:rgba(0,0,0,0.9)}.navbar-light .navbar-toggler{color:rgba(0,0,0,0.5);border-color:rgba(0,0,0,0.1)}.navbar-light .navbar-toggler-icon{background-image:url("data:image/svg+xml,%3csvg viewBox='0 0 30 30' xmlns='http://www.w3.org/2000/svg'%3e%3cpath stroke='rgba(0,0,0,0.5)' stroke-width='2' stroke-linecap='round' stroke-miterlimit='10' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e")}.navbar-light .navbar-text{color:rgba(0,0,0,0.5)}.navbar-light .navbar-text a{color:rgba(0,0,0,0.9)}.navbar-light .navbar-text a:hover,.navbar-light .navbar-text a:focus{color:rgba(0,0,0,0.9)}.navbar-dark .navbar-brand{color:#fff}.navbar-dark .navbar-brand:hover,.navbar-dark .navbar-brand:focus{color:#fff}.navbar-dark .navbar-nav .nav-link{color:rgba(255,255,255,0.5)}.navbar-dark .navbar-nav .nav-link:hover,.navbar-dark .navbar-nav .nav-link:focus{color:rgba(255,255,255,0.75)}.navbar-dark .navbar-nav .nav-link.disabled{color:rgba(255,255,255,0.25)}.navbar-dark .navbar-nav .show>.nav-link,.navbar-dark .navbar-nav .active>.nav-link,.navbar-dark .navbar-nav .nav-link.show,.navbar-dark .navbar-nav .nav-link.active{color:#fff}.navbar-dark .navbar-toggler{color:rgba(255,255,255,0.5);border-color:rgba(255,255,255,0.1)}.navbar-dark .navbar-toggler-icon{background-image:url("data:image/svg+xml,%3csvg viewBox='0 0 30 30' xmlns='http://www.w3.org/2000/svg'%3e%3cpath stroke='rgba(255,255,255,0.5)' stroke-width='2' stroke-linecap='round' stroke-miterlimit='10' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e")}.navbar-dark .navbar-text{color:rgba(255,255,255,0.5)}.navbar-dark .navbar-text a{color:#fff}.navbar-dark .navbar-text a:hover,.navbar-dark .navbar-text a:focus{color:#fff}.card{position:relative;display:flex;flex-direction:column;min-width:0;word-wrap:break-word;background-color:#fff;background-clip:border-box;border:1px solid rgba(0,0,0,0.125);border-radius:.25rem}.card>hr{margin-right:0;margin-left:0}.card>.list-group:first-child .list-group-item:first-child{border-top-left-radius:.25rem;border-top-right-radius:.25rem}.card>.list-group:last-child .list-group-item:last-child{border-bottom-right-radius:.25rem;border-bottom-left-radius:.25rem}.card-body{flex:1 1 auto;padding:1.25rem}.card-title{margin-bottom:.75rem}.card-subtitle{margin-top:-.375rem;margin-bottom:0}.card-text:last-child{margin-bottom:0}.card-link:hover{text-decoration:none}.card-link+.card-link{margin-left:1.25rem}.card-header{padding:.75rem 1.25rem;margin-bottom:0;background-color:rgba(0,0,0,0.03);border-bottom:1px solid rgba(0,0,0,0.125)}.card-header:first-child{border-radius:calc(.25rem - 1px) calc(.25rem - 1px) 0 0}.card-header+.list-group .list-group-item:first-child{border-top:0}.card-footer{padding:.75rem 1.25rem;background-color:rgba(0,0,0,0.03);border-top:1px solid rgba(0,0,0,0.125)}.card-footer:last-child{border-radius:0 0 calc(.25rem - 1px) calc(.25rem - 
1px)}.card-header-tabs{margin-right:-.625rem;margin-bottom:-0.75rem;margin-left:-.625rem;border-bottom:0}.card-header-pills{margin-right:-.625rem;margin-left:-.625rem}.card-img-overlay{position:absolute;top:0;right:0;bottom:0;left:0;padding:1.25rem}.card-img{width:100%;border-radius:calc(.25rem - 1px)}.card-img-top{width:100%;border-top-left-radius:calc(.25rem - 1px);border-top-right-radius:calc(.25rem - 1px)}.card-img-bottom{width:100%;border-bottom-right-radius:calc(.25rem - 1px);border-bottom-left-radius:calc(.25rem - 1px)}.card-deck{display:flex;flex-direction:column}.card-deck .card{margin-bottom:15px}@media (min-width: 576px){.card-deck{flex-flow:row wrap;margin-right:-15px;margin-left:-15px}.card-deck .card{display:flex;flex:1 0 0%;flex-direction:column;margin-right:15px;margin-bottom:0;margin-left:15px}}.card-group{display:flex;flex-direction:column}.card-group>.card{margin-bottom:15px}@media (min-width: 576px){.card-group{flex-flow:row wrap}.card-group>.card{flex:1 0 0%;margin-bottom:0}.card-group>.card+.card{margin-left:0;border-left:0}.card-group>.card:not(:last-child){border-top-right-radius:0;border-bottom-right-radius:0}.card-group>.card:not(:last-child) .card-img-top,.card-group>.card:not(:last-child) .card-header{border-top-right-radius:0}.card-group>.card:not(:last-child) .card-img-bottom,.card-group>.card:not(:last-child) .card-footer{border-bottom-right-radius:0}.card-group>.card:not(:first-child){border-top-left-radius:0;border-bottom-left-radius:0}.card-group>.card:not(:first-child) .card-img-top,.card-group>.card:not(:first-child) .card-header{border-top-left-radius:0}.card-group>.card:not(:first-child) .card-img-bottom,.card-group>.card:not(:first-child) .card-footer{border-bottom-left-radius:0}}.card-columns .card{margin-bottom:.75rem}@media (min-width: 576px){.card-columns{-moz-column-count:3;column-count:3;-moz-column-gap:1.25rem;column-gap:1.25rem;orphans:1;widows:1}.card-columns .card{display:inline-block;width:100%}}.accordion>.card{overflow:hidden}.accordion>.card:not(:first-of-type) .card-header:first-child{border-radius:0}.accordion>.card:not(:first-of-type):not(:last-of-type){border-bottom:0;border-radius:0}.accordion>.card:first-of-type{border-bottom:0;border-bottom-right-radius:0;border-bottom-left-radius:0}.accordion>.card:last-of-type{border-top-left-radius:0;border-top-right-radius:0}.accordion>.card .card-header{margin-bottom:-1px}.breadcrumb{display:flex;flex-wrap:wrap;padding:.75rem 1rem;margin-bottom:1rem;list-style:none;background-color:#e9ecef;border-radius:.25rem}.breadcrumb-item+.breadcrumb-item{padding-left:.5rem}.breadcrumb-item+.breadcrumb-item::before{display:inline-block;padding-right:.5rem;color:#6c757d;content:"/"}.breadcrumb-item+.breadcrumb-item:hover::before{text-decoration:underline}.breadcrumb-item+.breadcrumb-item:hover::before{text-decoration:none}.breadcrumb-item.active{color:#6c757d}.pagination{display:flex;padding-left:0;list-style:none;border-radius:.25rem}.page-link{position:relative;display:block;padding:.5rem .75rem;margin-left:-1px;line-height:1.25;color:#007bff;background-color:#fff;border:1px solid #dee2e6}.page-link:hover{z-index:2;color:#0056b3;text-decoration:none;background-color:#e9ecef;border-color:#dee2e6}.page-link:focus{z-index:2;outline:0;box-shadow:0 0 0 .2rem rgba(0,123,255,0.25)}.page-item:first-child .page-link{margin-left:0;border-top-left-radius:.25rem;border-bottom-left-radius:.25rem}.page-item:last-child .page-link{border-top-right-radius:.25rem;border-bottom-right-radius:.25rem}.page-item.active 
.page-link{z-index:1;color:#fff;background-color:#007bff;border-color:#007bff}.page-item.disabled .page-link{color:#6c757d;pointer-events:none;cursor:auto;background-color:#fff;border-color:#dee2e6}.pagination-lg .page-link{padding:.75rem 1.5rem;font-size:1.25rem;line-height:1.5}.pagination-lg .page-item:first-child .page-link{border-top-left-radius:.3rem;border-bottom-left-radius:.3rem}.pagination-lg .page-item:last-child .page-link{border-top-right-radius:.3rem;border-bottom-right-radius:.3rem}.pagination-sm .page-link{padding:.25rem .5rem;font-size:.875rem;line-height:1.5}.pagination-sm .page-item:first-child .page-link{border-top-left-radius:.2rem;border-bottom-left-radius:.2rem}.pagination-sm .page-item:last-child .page-link{border-top-right-radius:.2rem;border-bottom-right-radius:.2rem}.badge{display:inline-block;padding:.25em .4em;font-size:75%;font-weight:700;line-height:1;text-align:center;white-space:nowrap;vertical-align:baseline;border-radius:.25rem;transition:color 0.15s ease-in-out,background-color 0.15s ease-in-out,border-color 0.15s ease-in-out,box-shadow 0.15s ease-in-out}@media (prefers-reduced-motion: reduce){.badge{transition:none}}a.badge:hover,a.badge:focus{text-decoration:none}.badge:empty{display:none}.btn .badge{position:relative;top:-1px}.badge-pill{padding-right:.6em;padding-left:.6em;border-radius:10rem}.badge-primary{color:#fff;background-color:#007bff}a.badge-primary:hover,a.badge-primary:focus{color:#fff;background-color:#0062cc}a.badge-primary:focus,a.badge-primary.focus{outline:0;box-shadow:0 0 0 .2rem rgba(0,123,255,0.5)}.badge-secondary{color:#fff;background-color:#6c757d}a.badge-secondary:hover,a.badge-secondary:focus{color:#fff;background-color:#545b62}a.badge-secondary:focus,a.badge-secondary.focus{outline:0;box-shadow:0 0 0 .2rem rgba(108,117,125,0.5)}.badge-success{color:#fff;background-color:#28a745}a.badge-success:hover,a.badge-success:focus{color:#fff;background-color:#1e7e34}a.badge-success:focus,a.badge-success.focus{outline:0;box-shadow:0 0 0 .2rem rgba(40,167,69,0.5)}.badge-info{color:#fff;background-color:#17a2b8}a.badge-info:hover,a.badge-info:focus{color:#fff;background-color:#117a8b}a.badge-info:focus,a.badge-info.focus{outline:0;box-shadow:0 0 0 .2rem rgba(23,162,184,0.5)}.badge-warning{color:#212529;background-color:#ffc107}a.badge-warning:hover,a.badge-warning:focus{color:#212529;background-color:#d39e00}a.badge-warning:focus,a.badge-warning.focus{outline:0;box-shadow:0 0 0 .2rem rgba(255,193,7,0.5)}.badge-danger{color:#fff;background-color:#dc3545}a.badge-danger:hover,a.badge-danger:focus{color:#fff;background-color:#bd2130}a.badge-danger:focus,a.badge-danger.focus{outline:0;box-shadow:0 0 0 .2rem rgba(220,53,69,0.5)}.badge-light{color:#212529;background-color:#f8f9fa}a.badge-light:hover,a.badge-light:focus{color:#212529;background-color:#dae0e5}a.badge-light:focus,a.badge-light.focus{outline:0;box-shadow:0 0 0 .2rem rgba(248,249,250,0.5)}.badge-dark{color:#fff;background-color:#343a40}a.badge-dark:hover,a.badge-dark:focus{color:#fff;background-color:#1d2124}a.badge-dark:focus,a.badge-dark.focus{outline:0;box-shadow:0 0 0 .2rem rgba(52,58,64,0.5)}.jumbotron{padding:2rem 1rem;margin-bottom:2rem;background-color:#e9ecef;border-radius:.3rem}@media (min-width: 576px){.jumbotron{padding:4rem 2rem}}.jumbotron-fluid{padding-right:0;padding-left:0;border-radius:0}.alert{position:relative;padding:.75rem 1.25rem;margin-bottom:1rem;border:1px solid 
transparent;border-radius:.25rem}.alert-heading{color:inherit}.alert-link{font-weight:700}.alert-dismissible{padding-right:4rem}.alert-dismissible .close{position:absolute;top:0;right:0;padding:.75rem 1.25rem;color:inherit}.alert-primary{color:#004085;background-color:#cce5ff;border-color:#b8daff}.alert-primary hr{border-top-color:#9fcdff}.alert-primary .alert-link{color:#002752}.alert-secondary{color:#383d41;background-color:#e2e3e5;border-color:#d6d8db}.alert-secondary hr{border-top-color:#c8cbcf}.alert-secondary .alert-link{color:#202326}.alert-success{color:#155724;background-color:#d4edda;border-color:#c3e6cb}.alert-success hr{border-top-color:#b1dfbb}.alert-success .alert-link{color:#0b2e13}.alert-info{color:#0c5460;background-color:#d1ecf1;border-color:#bee5eb}.alert-info hr{border-top-color:#abdde5}.alert-info .alert-link{color:#062c33}.alert-warning{color:#856404;background-color:#fff3cd;border-color:#ffeeba}.alert-warning hr{border-top-color:#ffe8a1}.alert-warning .alert-link{color:#533f03}.alert-danger{color:#721c24;background-color:#f8d7da;border-color:#f5c6cb}.alert-danger hr{border-top-color:#f1b0b7}.alert-danger .alert-link{color:#491217}.alert-light{color:#818182;background-color:#fefefe;border-color:#fdfdfe}.alert-light hr{border-top-color:#ececf6}.alert-light .alert-link{color:#686868}.alert-dark{color:#1b1e21;background-color:#d6d8d9;border-color:#c6c8ca}.alert-dark hr{border-top-color:#b9bbbe}.alert-dark .alert-link{color:#040505}@-webkit-keyframes progress-bar-stripes{from{background-position:1rem 0}to{background-position:0 0}}@keyframes progress-bar-stripes{from{background-position:1rem 0}to{background-position:0 0}}.progress{display:flex;height:1rem;overflow:hidden;font-size:.75rem;background-color:#e9ecef;border-radius:.25rem}.progress-bar{display:flex;flex-direction:column;justify-content:center;color:#fff;text-align:center;white-space:nowrap;background-color:#007bff;transition:width 0.6s ease}@media (prefers-reduced-motion: reduce){.progress-bar{transition:none}}.progress-bar-striped{background-image:linear-gradient(45deg, rgba(255,255,255,0.15) 25%, transparent 25%, transparent 50%, rgba(255,255,255,0.15) 50%, rgba(255,255,255,0.15) 75%, transparent 75%, transparent);background-size:1rem 1rem}.progress-bar-animated{-webkit-animation:progress-bar-stripes 1s linear infinite;animation:progress-bar-stripes 1s linear infinite}@media (prefers-reduced-motion: reduce){.progress-bar-animated{-webkit-animation:none;animation:none}}.media{display:flex;align-items:flex-start}.media-body{flex:1}.list-group{display:flex;flex-direction:column;padding-left:0;margin-bottom:0}.list-group-item-action{width:100%;color:#495057;text-align:inherit}.list-group-item-action:hover,.list-group-item-action:focus{z-index:1;color:#495057;text-decoration:none;background-color:#f8f9fa}.list-group-item-action:active{color:#212529;background-color:#e9ecef}.list-group-item{position:relative;display:block;padding:.75rem 1.25rem;margin-bottom:-1px;background-color:#fff;border:1px solid rgba(0,0,0,0.125)}.list-group-item:first-child{border-top-left-radius:.25rem;border-top-right-radius:.25rem}.list-group-item:last-child{margin-bottom:0;border-bottom-right-radius:.25rem;border-bottom-left-radius:.25rem}.list-group-item.disabled,.list-group-item:disabled{color:#6c757d;pointer-events:none;background-color:#fff}.list-group-item.active{z-index:2;color:#fff;background-color:#007bff;border-color:#007bff}.list-group-horizontal{flex-direction:row}.list-group-horizontal 
.list-group-item{margin-right:-1px;margin-bottom:0}.list-group-horizontal .list-group-item:first-child{border-top-left-radius:.25rem;border-bottom-left-radius:.25rem;border-top-right-radius:0}.list-group-horizontal .list-group-item:last-child{margin-right:0;border-top-right-radius:.25rem;border-bottom-right-radius:.25rem;border-bottom-left-radius:0}@media (min-width: 576px){.list-group-horizontal-sm{flex-direction:row}.list-group-horizontal-sm .list-group-item{margin-right:-1px;margin-bottom:0}.list-group-horizontal-sm .list-group-item:first-child{border-top-left-radius:.25rem;border-bottom-left-radius:.25rem;border-top-right-radius:0}.list-group-horizontal-sm .list-group-item:last-child{margin-right:0;border-top-right-radius:.25rem;border-bottom-right-radius:.25rem;border-bottom-left-radius:0}}@media (min-width: 768px){.list-group-horizontal-md{flex-direction:row}.list-group-horizontal-md .list-group-item{margin-right:-1px;margin-bottom:0}.list-group-horizontal-md .list-group-item:first-child{border-top-left-radius:.25rem;border-bottom-left-radius:.25rem;border-top-right-radius:0}.list-group-horizontal-md .list-group-item:last-child{margin-right:0;border-top-right-radius:.25rem;border-bottom-right-radius:.25rem;border-bottom-left-radius:0}}@media (min-width: 992px){.list-group-horizontal-lg{flex-direction:row}.list-group-horizontal-lg .list-group-item{margin-right:-1px;margin-bottom:0}.list-group-horizontal-lg .list-group-item:first-child{border-top-left-radius:.25rem;border-bottom-left-radius:.25rem;border-top-right-radius:0}.list-group-horizontal-lg .list-group-item:last-child{margin-right:0;border-top-right-radius:.25rem;border-bottom-right-radius:.25rem;border-bottom-left-radius:0}}@media (min-width: 1200px){.list-group-horizontal-xl{flex-direction:row}.list-group-horizontal-xl .list-group-item{margin-right:-1px;margin-bottom:0}.list-group-horizontal-xl .list-group-item:first-child{border-top-left-radius:.25rem;border-bottom-left-radius:.25rem;border-top-right-radius:0}.list-group-horizontal-xl .list-group-item:last-child{margin-right:0;border-top-right-radius:.25rem;border-bottom-right-radius:.25rem;border-bottom-left-radius:0}}.list-group-flush .list-group-item{border-right:0;border-left:0;border-radius:0}.list-group-flush .list-group-item:last-child{margin-bottom:-1px}.list-group-flush:first-child .list-group-item:first-child{border-top:0}.list-group-flush:last-child 
.list-group-item:last-child{margin-bottom:0;border-bottom:0}.list-group-item-primary{color:#004085;background-color:#b8daff}.list-group-item-primary.list-group-item-action:hover,.list-group-item-primary.list-group-item-action:focus{color:#004085;background-color:#9fcdff}.list-group-item-primary.list-group-item-action.active{color:#fff;background-color:#004085;border-color:#004085}.list-group-item-secondary{color:#383d41;background-color:#d6d8db}.list-group-item-secondary.list-group-item-action:hover,.list-group-item-secondary.list-group-item-action:focus{color:#383d41;background-color:#c8cbcf}.list-group-item-secondary.list-group-item-action.active{color:#fff;background-color:#383d41;border-color:#383d41}.list-group-item-success{color:#155724;background-color:#c3e6cb}.list-group-item-success.list-group-item-action:hover,.list-group-item-success.list-group-item-action:focus{color:#155724;background-color:#b1dfbb}.list-group-item-success.list-group-item-action.active{color:#fff;background-color:#155724;border-color:#155724}.list-group-item-info{color:#0c5460;background-color:#bee5eb}.list-group-item-info.list-group-item-action:hover,.list-group-item-info.list-group-item-action:focus{color:#0c5460;background-color:#abdde5}.list-group-item-info.list-group-item-action.active{color:#fff;background-color:#0c5460;border-color:#0c5460}.list-group-item-warning{color:#856404;background-color:#ffeeba}.list-group-item-warning.list-group-item-action:hover,.list-group-item-warning.list-group-item-action:focus{color:#856404;background-color:#ffe8a1}.list-group-item-warning.list-group-item-action.active{color:#fff;background-color:#856404;border-color:#856404}.list-group-item-danger{color:#721c24;background-color:#f5c6cb}.list-group-item-danger.list-group-item-action:hover,.list-group-item-danger.list-group-item-action:focus{color:#721c24;background-color:#f1b0b7}.list-group-item-danger.list-group-item-action.active{color:#fff;background-color:#721c24;border-color:#721c24}.list-group-item-light{color:#818182;background-color:#fdfdfe}.list-group-item-light.list-group-item-action:hover,.list-group-item-light.list-group-item-action:focus{color:#818182;background-color:#ececf6}.list-group-item-light.list-group-item-action.active{color:#fff;background-color:#818182;border-color:#818182}.list-group-item-dark{color:#1b1e21;background-color:#c6c8ca}.list-group-item-dark.list-group-item-action:hover,.list-group-item-dark.list-group-item-action:focus{color:#1b1e21;background-color:#b9bbbe}.list-group-item-dark.list-group-item-action.active{color:#fff;background-color:#1b1e21;border-color:#1b1e21}.close{float:right;font-size:1.5rem;font-weight:700;line-height:1;color:#000;text-shadow:0 1px 0 #fff;opacity:.5}.close:hover{color:#000;text-decoration:none}.close:not(:disabled):not(.disabled):hover,.close:not(:disabled):not(.disabled):focus{opacity:.75}button.close{padding:0;background-color:transparent;border:0;-webkit-appearance:none;-moz-appearance:none;appearance:none}a.close.disabled{pointer-events:none}.toast{max-width:350px;overflow:hidden;font-size:.875rem;background-color:rgba(255,255,255,0.85);background-clip:padding-box;border:1px solid rgba(0,0,0,0.1);box-shadow:0 0.25rem 0.75rem rgba(0,0,0,0.1);-webkit-backdrop-filter:blur(10px);backdrop-filter:blur(10px);opacity:0;border-radius:.25rem}.toast:not(:last-child){margin-bottom:.75rem}.toast.showing{opacity:1}.toast.show{display:block;opacity:1}.toast.hide{display:none}.toast-header{display:flex;align-items:center;padding:.25rem 
.75rem;color:#6c757d;background-color:rgba(255,255,255,0.85);background-clip:padding-box;border-bottom:1px solid rgba(0,0,0,0.05)}.toast-body{padding:.75rem}.modal-open{overflow:hidden}.modal-open .modal{overflow-x:hidden;overflow-y:auto}.modal{position:fixed;top:0;left:0;z-index:1050;display:none;width:100%;height:100%;overflow:hidden;outline:0}.modal-dialog{position:relative;width:auto;margin:.5rem;pointer-events:none}.modal.fade .modal-dialog{transition:transform 0.3s ease-out;transform:translate(0, -50px)}@media (prefers-reduced-motion: reduce){.modal.fade .modal-dialog{transition:none}}.modal.show .modal-dialog{transform:none}.modal-dialog-scrollable{display:flex;max-height:calc(100% - 1rem)}.modal-dialog-scrollable .modal-content{max-height:calc(100vh - 1rem);overflow:hidden}.modal-dialog-scrollable .modal-header,.modal-dialog-scrollable .modal-footer{flex-shrink:0}.modal-dialog-scrollable .modal-body{overflow-y:auto}.modal-dialog-centered{display:flex;align-items:center;min-height:calc(100% - 1rem)}.modal-dialog-centered::before{display:block;height:calc(100vh - 1rem);content:""}.modal-dialog-centered.modal-dialog-scrollable{flex-direction:column;justify-content:center;height:100%}.modal-dialog-centered.modal-dialog-scrollable .modal-content{max-height:none}.modal-dialog-centered.modal-dialog-scrollable::before{content:none}.modal-content{position:relative;display:flex;flex-direction:column;width:100%;pointer-events:auto;background-color:#fff;background-clip:padding-box;border:1px solid rgba(0,0,0,0.2);border-radius:.3rem;outline:0}.modal-backdrop{position:fixed;top:0;left:0;z-index:1040;width:100vw;height:100vh;background-color:#000}.modal-backdrop.fade{opacity:0}.modal-backdrop.show{opacity:.5}.modal-header{display:flex;align-items:flex-start;justify-content:space-between;padding:1rem 1rem;border-bottom:1px solid #dee2e6;border-top-left-radius:.3rem;border-top-right-radius:.3rem}.modal-header .close{padding:1rem 1rem;margin:-1rem -1rem -1rem auto}.modal-title{margin-bottom:0;line-height:1.5}.modal-body{position:relative;flex:1 1 auto;padding:1rem}.modal-footer{display:flex;align-items:center;justify-content:flex-end;padding:1rem;border-top:1px solid #dee2e6;border-bottom-right-radius:.3rem;border-bottom-left-radius:.3rem}.modal-footer>:not(:first-child){margin-left:.25rem}.modal-footer>:not(:last-child){margin-right:.25rem}.modal-scrollbar-measure{position:absolute;top:-9999px;width:50px;height:50px;overflow:scroll}@media (min-width: 576px){.modal-dialog{max-width:500px;margin:1.75rem auto}.modal-dialog-scrollable{max-height:calc(100% - 3.5rem)}.modal-dialog-scrollable .modal-content{max-height:calc(100vh - 3.5rem)}.modal-dialog-centered{min-height:calc(100% - 3.5rem)}.modal-dialog-centered::before{height:calc(100vh - 3.5rem)}.modal-sm{max-width:300px}}@media (min-width: 992px){.modal-lg,.modal-xl{max-width:800px}}@media (min-width: 1200px){.modal-xl{max-width:1140px}}.tooltip{position:absolute;z-index:1070;display:block;margin:0;font-family:-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,"Noto Sans",sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol","Noto Color Emoji";font-style:normal;font-weight:400;line-height:1.5;text-align:left;text-align:start;text-decoration:none;text-shadow:none;text-transform:none;letter-spacing:normal;word-break:normal;word-spacing:normal;white-space:normal;line-break:auto;font-size:.875rem;word-wrap:break-word;opacity:0}.tooltip.show{opacity:.9}.tooltip 
.arrow{position:absolute;display:block;width:.8rem;height:.4rem}.tooltip .arrow::before{position:absolute;content:"";border-color:transparent;border-style:solid}.bs-tooltip-top,.bs-tooltip-auto[x-placement^="top"]{padding:.4rem 0}.bs-tooltip-top .arrow,.bs-tooltip-auto[x-placement^="top"] .arrow{bottom:0}.bs-tooltip-top .arrow::before,.bs-tooltip-auto[x-placement^="top"] .arrow::before{top:0;border-width:.4rem .4rem 0;border-top-color:#000}.bs-tooltip-right,.bs-tooltip-auto[x-placement^="right"]{padding:0 .4rem}.bs-tooltip-right .arrow,.bs-tooltip-auto[x-placement^="right"] .arrow{left:0;width:.4rem;height:.8rem}.bs-tooltip-right .arrow::before,.bs-tooltip-auto[x-placement^="right"] .arrow::before{right:0;border-width:.4rem .4rem .4rem 0;border-right-color:#000}.bs-tooltip-bottom,.bs-tooltip-auto[x-placement^="bottom"]{padding:.4rem 0}.bs-tooltip-bottom .arrow,.bs-tooltip-auto[x-placement^="bottom"] .arrow{top:0}.bs-tooltip-bottom .arrow::before,.bs-tooltip-auto[x-placement^="bottom"] .arrow::before{bottom:0;border-width:0 .4rem .4rem;border-bottom-color:#000}.bs-tooltip-left,.bs-tooltip-auto[x-placement^="left"]{padding:0 .4rem}.bs-tooltip-left .arrow,.bs-tooltip-auto[x-placement^="left"] .arrow{right:0;width:.4rem;height:.8rem}.bs-tooltip-left .arrow::before,.bs-tooltip-auto[x-placement^="left"] .arrow::before{left:0;border-width:.4rem 0 .4rem .4rem;border-left-color:#000}.tooltip-inner{max-width:200px;padding:.25rem .5rem;color:#fff;text-align:center;background-color:#000;border-radius:.25rem}.popover{position:absolute;top:0;left:0;z-index:1060;display:block;max-width:276px;font-family:-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,"Noto Sans",sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol","Noto Color Emoji";font-style:normal;font-weight:400;line-height:1.5;text-align:left;text-align:start;text-decoration:none;text-shadow:none;text-transform:none;letter-spacing:normal;word-break:normal;word-spacing:normal;white-space:normal;line-break:auto;font-size:.875rem;word-wrap:break-word;background-color:#fff;background-clip:padding-box;border:1px solid rgba(0,0,0,0.2);border-radius:.3rem}.popover .arrow{position:absolute;display:block;width:1rem;height:.5rem;margin:0 .3rem}.popover .arrow::before,.popover .arrow::after{position:absolute;display:block;content:"";border-color:transparent;border-style:solid}.bs-popover-top,.bs-popover-auto[x-placement^="top"]{margin-bottom:.5rem}.bs-popover-top>.arrow,.bs-popover-auto[x-placement^="top"]>.arrow{bottom:calc((.5rem + 1px) * -1)}.bs-popover-top>.arrow::before,.bs-popover-auto[x-placement^="top"]>.arrow::before{bottom:0;border-width:.5rem .5rem 0;border-top-color:rgba(0,0,0,0.25)}.bs-popover-top>.arrow::after,.bs-popover-auto[x-placement^="top"]>.arrow::after{bottom:1px;border-width:.5rem .5rem 0;border-top-color:#fff}.bs-popover-right,.bs-popover-auto[x-placement^="right"]{margin-left:.5rem}.bs-popover-right>.arrow,.bs-popover-auto[x-placement^="right"]>.arrow{left:calc((.5rem + 1px) * -1);width:.5rem;height:1rem;margin:.3rem 0}.bs-popover-right>.arrow::before,.bs-popover-auto[x-placement^="right"]>.arrow::before{left:0;border-width:.5rem .5rem .5rem 0;border-right-color:rgba(0,0,0,0.25)}.bs-popover-right>.arrow::after,.bs-popover-auto[x-placement^="right"]>.arrow::after{left:1px;border-width:.5rem .5rem .5rem 0;border-right-color:#fff}.bs-popover-bottom,.bs-popover-auto[x-placement^="bottom"]{margin-top:.5rem}.bs-popover-bottom>.arrow,.bs-popover-auto[x-placement^="bottom"]>.arrow{top:calc((.5rem + 1px) * 
-1)}.bs-popover-bottom>.arrow::before,.bs-popover-auto[x-placement^="bottom"]>.arrow::before{top:0;border-width:0 .5rem .5rem .5rem;border-bottom-color:rgba(0,0,0,0.25)}.bs-popover-bottom>.arrow::after,.bs-popover-auto[x-placement^="bottom"]>.arrow::after{top:1px;border-width:0 .5rem .5rem .5rem;border-bottom-color:#fff}.bs-popover-bottom .popover-header::before,.bs-popover-auto[x-placement^="bottom"] .popover-header::before{position:absolute;top:0;left:50%;display:block;width:1rem;margin-left:-.5rem;content:"";border-bottom:1px solid #f7f7f7}.bs-popover-left,.bs-popover-auto[x-placement^="left"]{margin-right:.5rem}.bs-popover-left>.arrow,.bs-popover-auto[x-placement^="left"]>.arrow{right:calc((.5rem + 1px) * -1);width:.5rem;height:1rem;margin:.3rem 0}.bs-popover-left>.arrow::before,.bs-popover-auto[x-placement^="left"]>.arrow::before{right:0;border-width:.5rem 0 .5rem .5rem;border-left-color:rgba(0,0,0,0.25)}.bs-popover-left>.arrow::after,.bs-popover-auto[x-placement^="left"]>.arrow::after{right:1px;border-width:.5rem 0 .5rem .5rem;border-left-color:#fff}.popover-header{padding:.5rem .75rem;margin-bottom:0;font-size:1rem;background-color:#f7f7f7;border-bottom:1px solid #ebebeb;border-top-left-radius:calc(.3rem - 1px);border-top-right-radius:calc(.3rem - 1px)}.popover-header:empty{display:none}.popover-body{padding:.5rem .75rem;color:#212529}.carousel{position:relative}.carousel.pointer-event{touch-action:pan-y}.carousel-inner{position:relative;width:100%;overflow:hidden}.carousel-inner::after{display:block;clear:both;content:""}.carousel-item{position:relative;display:none;float:left;width:100%;margin-right:-100%;-webkit-backface-visibility:hidden;backface-visibility:hidden;transition:transform .6s ease-in-out}@media (prefers-reduced-motion: reduce){.carousel-item{transition:none}}.carousel-item.active,.carousel-item-next,.carousel-item-prev{display:block}.carousel-item-next:not(.carousel-item-left),.active.carousel-item-right{transform:translateX(100%)}.carousel-item-prev:not(.carousel-item-right),.active.carousel-item-left{transform:translateX(-100%)}.carousel-fade .carousel-item{opacity:0;transition-property:opacity;transform:none}.carousel-fade .carousel-item.active,.carousel-fade .carousel-item-next.carousel-item-left,.carousel-fade .carousel-item-prev.carousel-item-right{z-index:1;opacity:1}.carousel-fade .active.carousel-item-left,.carousel-fade .active.carousel-item-right{z-index:0;opacity:0;transition:0s .6s opacity}@media (prefers-reduced-motion: reduce){.carousel-fade .active.carousel-item-left,.carousel-fade .active.carousel-item-right{transition:none}}.carousel-control-prev,.carousel-control-next{position:absolute;top:0;bottom:0;z-index:1;display:flex;align-items:center;justify-content:center;width:15%;color:#fff;text-align:center;opacity:.5;transition:opacity 0.15s ease}@media (prefers-reduced-motion: reduce){.carousel-control-prev,.carousel-control-next{transition:none}}.carousel-control-prev:hover,.carousel-control-prev:focus,.carousel-control-next:hover,.carousel-control-next:focus{color:#fff;text-decoration:none;outline:0;opacity:.9}.carousel-control-prev{left:0}.carousel-control-next{right:0}.carousel-control-prev-icon,.carousel-control-next-icon{display:inline-block;width:20px;height:20px;background:no-repeat 50% / 100% 100%}.carousel-control-prev-icon{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' fill='%23fff' viewBox='0 0 8 8'%3e%3cpath d='M5.25 0l-4 4 4 4 1.5-1.5-2.5-2.5 
2.5-2.5-1.5-1.5z'/%3e%3c/svg%3e")}.carousel-control-next-icon{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' fill='%23fff' viewBox='0 0 8 8'%3e%3cpath d='M2.75 0l-1.5 1.5 2.5 2.5-2.5 2.5 1.5 1.5 4-4-4-4z'/%3e%3c/svg%3e")}.carousel-indicators{position:absolute;right:0;bottom:0;left:0;z-index:15;display:flex;justify-content:center;padding-left:0;margin-right:15%;margin-left:15%;list-style:none}.carousel-indicators li{box-sizing:content-box;flex:0 1 auto;width:30px;height:3px;margin-right:3px;margin-left:3px;text-indent:-999px;cursor:pointer;background-color:#fff;background-clip:padding-box;border-top:10px solid transparent;border-bottom:10px solid transparent;opacity:.5;transition:opacity 0.6s ease}@media (prefers-reduced-motion: reduce){.carousel-indicators li{transition:none}}.carousel-indicators .active{opacity:1}.carousel-caption{position:absolute;right:15%;bottom:20px;left:15%;z-index:10;padding-top:20px;padding-bottom:20px;color:#fff;text-align:center}@-webkit-keyframes spinner-border{to{transform:rotate(360deg)}}@keyframes spinner-border{to{transform:rotate(360deg)}}.spinner-border{display:inline-block;width:2rem;height:2rem;vertical-align:text-bottom;border:.25em solid currentColor;border-right-color:transparent;border-radius:50%;-webkit-animation:spinner-border .75s linear infinite;animation:spinner-border .75s linear infinite}.spinner-border-sm{width:1rem;height:1rem;border-width:.2em}@-webkit-keyframes spinner-grow{0%{transform:scale(0)}50%{opacity:1}}@keyframes spinner-grow{0%{transform:scale(0)}50%{opacity:1}}.spinner-grow{display:inline-block;width:2rem;height:2rem;vertical-align:text-bottom;background-color:currentColor;border-radius:50%;opacity:0;-webkit-animation:spinner-grow .75s linear infinite;animation:spinner-grow .75s linear infinite}.spinner-grow-sm{width:1rem;height:1rem}.align-baseline{vertical-align:baseline !important}.align-top{vertical-align:top !important}.align-middle{vertical-align:middle !important}.align-bottom{vertical-align:bottom !important}.align-text-bottom{vertical-align:text-bottom !important}.align-text-top{vertical-align:text-top !important}.bg-primary{background-color:#007bff !important}a.bg-primary:hover,a.bg-primary:focus,button.bg-primary:hover,button.bg-primary:focus{background-color:#0062cc !important}.bg-secondary{background-color:#6c757d !important}a.bg-secondary:hover,a.bg-secondary:focus,button.bg-secondary:hover,button.bg-secondary:focus{background-color:#545b62 !important}.bg-success{background-color:#28a745 !important}a.bg-success:hover,a.bg-success:focus,button.bg-success:hover,button.bg-success:focus{background-color:#1e7e34 !important}.bg-info{background-color:#17a2b8 !important}a.bg-info:hover,a.bg-info:focus,button.bg-info:hover,button.bg-info:focus{background-color:#117a8b !important}.bg-warning{background-color:#ffc107 !important}a.bg-warning:hover,a.bg-warning:focus,button.bg-warning:hover,button.bg-warning:focus{background-color:#d39e00 !important}.bg-danger{background-color:#dc3545 !important}a.bg-danger:hover,a.bg-danger:focus,button.bg-danger:hover,button.bg-danger:focus{background-color:#bd2130 !important}.bg-light{background-color:#f8f9fa !important}a.bg-light:hover,a.bg-light:focus,button.bg-light:hover,button.bg-light:focus{background-color:#dae0e5 !important}.bg-dark{background-color:#343a40 !important}a.bg-dark:hover,a.bg-dark:focus,button.bg-dark:hover,button.bg-dark:focus{background-color:#1d2124 !important}.bg-white{background-color:#fff 
!important}.bg-transparent{background-color:transparent !important}.border{border:1px solid #dee2e6 !important}.border-top{border-top:1px solid #dee2e6 !important}.border-right{border-right:1px solid #dee2e6 !important}.border-bottom{border-bottom:1px solid #dee2e6 !important}.border-left{border-left:1px solid #dee2e6 !important}.border-0{border:0 !important}.border-top-0{border-top:0 !important}.border-right-0{border-right:0 !important}.border-bottom-0{border-bottom:0 !important}.border-left-0{border-left:0 !important}.border-primary{border-color:#007bff !important}.border-secondary{border-color:#6c757d !important}.border-success{border-color:#28a745 !important}.border-info{border-color:#17a2b8 !important}.border-warning{border-color:#ffc107 !important}.border-danger{border-color:#dc3545 !important}.border-light{border-color:#f8f9fa !important}.border-dark{border-color:#343a40 !important}.border-white{border-color:#fff !important}.rounded-sm{border-radius:.2rem !important}.rounded{border-radius:.25rem !important}.rounded-top{border-top-left-radius:.25rem !important;border-top-right-radius:.25rem !important}.rounded-right{border-top-right-radius:.25rem !important;border-bottom-right-radius:.25rem !important}.rounded-bottom{border-bottom-right-radius:.25rem !important;border-bottom-left-radius:.25rem !important}.rounded-left{border-top-left-radius:.25rem !important;border-bottom-left-radius:.25rem !important}.rounded-lg{border-radius:.3rem !important}.rounded-circle{border-radius:50% !important}.rounded-pill{border-radius:50rem !important}.rounded-0{border-radius:0 !important}.clearfix::after{display:block;clear:both;content:""}.d-none{display:none !important}.d-inline{display:inline !important}.d-inline-block{display:inline-block !important}.d-block{display:block !important}.d-table{display:table !important}.d-table-row{display:table-row !important}.d-table-cell{display:table-cell !important}.d-flex{display:flex !important}.d-inline-flex{display:inline-flex !important}@media (min-width: 576px){.d-sm-none{display:none !important}.d-sm-inline{display:inline !important}.d-sm-inline-block{display:inline-block !important}.d-sm-block{display:block !important}.d-sm-table{display:table !important}.d-sm-table-row{display:table-row !important}.d-sm-table-cell{display:table-cell !important}.d-sm-flex{display:flex !important}.d-sm-inline-flex{display:inline-flex !important}}@media (min-width: 768px){.d-md-none{display:none !important}.d-md-inline{display:inline !important}.d-md-inline-block{display:inline-block !important}.d-md-block{display:block !important}.d-md-table{display:table !important}.d-md-table-row{display:table-row !important}.d-md-table-cell{display:table-cell !important}.d-md-flex{display:flex !important}.d-md-inline-flex{display:inline-flex !important}}@media (min-width: 992px){.d-lg-none{display:none !important}.d-lg-inline{display:inline !important}.d-lg-inline-block{display:inline-block !important}.d-lg-block{display:block !important}.d-lg-table{display:table !important}.d-lg-table-row{display:table-row !important}.d-lg-table-cell{display:table-cell !important}.d-lg-flex{display:flex !important}.d-lg-inline-flex{display:inline-flex !important}}@media (min-width: 1200px){.d-xl-none{display:none !important}.d-xl-inline{display:inline !important}.d-xl-inline-block{display:inline-block !important}.d-xl-block{display:block !important}.d-xl-table{display:table !important}.d-xl-table-row{display:table-row !important}.d-xl-table-cell{display:table-cell !important}.d-xl-flex{display:flex 
!important}.d-xl-inline-flex{display:inline-flex !important}}@media print{.d-print-none{display:none !important}.d-print-inline{display:inline !important}.d-print-inline-block{display:inline-block !important}.d-print-block{display:block !important}.d-print-table{display:table !important}.d-print-table-row{display:table-row !important}.d-print-table-cell{display:table-cell !important}.d-print-flex{display:flex !important}.d-print-inline-flex{display:inline-flex !important}}.embed-responsive{position:relative;display:block;width:100%;padding:0;overflow:hidden}.embed-responsive::before{display:block;content:""}.embed-responsive .embed-responsive-item,.embed-responsive iframe,.embed-responsive embed,.embed-responsive object,.embed-responsive video{position:absolute;top:0;bottom:0;left:0;width:100%;height:100%;border:0}.embed-responsive-21by9::before{padding-top:42.8571428571%}.embed-responsive-16by9::before{padding-top:56.25%}.embed-responsive-4by3::before{padding-top:75%}.embed-responsive-1by1::before{padding-top:100%}.flex-row{flex-direction:row !important}.flex-column{flex-direction:column !important}.flex-row-reverse{flex-direction:row-reverse !important}.flex-column-reverse{flex-direction:column-reverse !important}.flex-wrap{flex-wrap:wrap !important}.flex-nowrap{flex-wrap:nowrap !important}.flex-wrap-reverse{flex-wrap:wrap-reverse !important}.flex-fill{flex:1 1 auto !important}.flex-grow-0{flex-grow:0 !important}.flex-grow-1{flex-grow:1 !important}.flex-shrink-0{flex-shrink:0 !important}.flex-shrink-1{flex-shrink:1 !important}.justify-content-start{justify-content:flex-start !important}.justify-content-end{justify-content:flex-end !important}.justify-content-center{justify-content:center !important}.justify-content-between{justify-content:space-between !important}.justify-content-around{justify-content:space-around !important}.align-items-start{align-items:flex-start !important}.align-items-end{align-items:flex-end !important}.align-items-center{align-items:center !important}.align-items-baseline{align-items:baseline !important}.align-items-stretch{align-items:stretch !important}.align-content-start{align-content:flex-start !important}.align-content-end{align-content:flex-end !important}.align-content-center{align-content:center !important}.align-content-between{align-content:space-between !important}.align-content-around{align-content:space-around !important}.align-content-stretch{align-content:stretch !important}.align-self-auto{align-self:auto !important}.align-self-start{align-self:flex-start !important}.align-self-end{align-self:flex-end !important}.align-self-center{align-self:center !important}.align-self-baseline{align-self:baseline !important}.align-self-stretch{align-self:stretch !important}@media (min-width: 576px){.flex-sm-row{flex-direction:row !important}.flex-sm-column{flex-direction:column !important}.flex-sm-row-reverse{flex-direction:row-reverse !important}.flex-sm-column-reverse{flex-direction:column-reverse !important}.flex-sm-wrap{flex-wrap:wrap !important}.flex-sm-nowrap{flex-wrap:nowrap !important}.flex-sm-wrap-reverse{flex-wrap:wrap-reverse !important}.flex-sm-fill{flex:1 1 auto !important}.flex-sm-grow-0{flex-grow:0 !important}.flex-sm-grow-1{flex-grow:1 !important}.flex-sm-shrink-0{flex-shrink:0 !important}.flex-sm-shrink-1{flex-shrink:1 !important}.justify-content-sm-start{justify-content:flex-start !important}.justify-content-sm-end{justify-content:flex-end !important}.justify-content-sm-center{justify-content:center 
!important}.justify-content-sm-between{justify-content:space-between !important}.justify-content-sm-around{justify-content:space-around !important}.align-items-sm-start{align-items:flex-start !important}.align-items-sm-end{align-items:flex-end !important}.align-items-sm-center{align-items:center !important}.align-items-sm-baseline{align-items:baseline !important}.align-items-sm-stretch{align-items:stretch !important}.align-content-sm-start{align-content:flex-start !important}.align-content-sm-end{align-content:flex-end !important}.align-content-sm-center{align-content:center !important}.align-content-sm-between{align-content:space-between !important}.align-content-sm-around{align-content:space-around !important}.align-content-sm-stretch{align-content:stretch !important}.align-self-sm-auto{align-self:auto !important}.align-self-sm-start{align-self:flex-start !important}.align-self-sm-end{align-self:flex-end !important}.align-self-sm-center{align-self:center !important}.align-self-sm-baseline{align-self:baseline !important}.align-self-sm-stretch{align-self:stretch !important}}@media (min-width: 768px){.flex-md-row{flex-direction:row !important}.flex-md-column{flex-direction:column !important}.flex-md-row-reverse{flex-direction:row-reverse !important}.flex-md-column-reverse{flex-direction:column-reverse !important}.flex-md-wrap{flex-wrap:wrap !important}.flex-md-nowrap{flex-wrap:nowrap !important}.flex-md-wrap-reverse{flex-wrap:wrap-reverse !important}.flex-md-fill{flex:1 1 auto !important}.flex-md-grow-0{flex-grow:0 !important}.flex-md-grow-1{flex-grow:1 !important}.flex-md-shrink-0{flex-shrink:0 !important}.flex-md-shrink-1{flex-shrink:1 !important}.justify-content-md-start{justify-content:flex-start !important}.justify-content-md-end{justify-content:flex-end !important}.justify-content-md-center{justify-content:center !important}.justify-content-md-between{justify-content:space-between !important}.justify-content-md-around{justify-content:space-around !important}.align-items-md-start{align-items:flex-start !important}.align-items-md-end{align-items:flex-end !important}.align-items-md-center{align-items:center !important}.align-items-md-baseline{align-items:baseline !important}.align-items-md-stretch{align-items:stretch !important}.align-content-md-start{align-content:flex-start !important}.align-content-md-end{align-content:flex-end !important}.align-content-md-center{align-content:center !important}.align-content-md-between{align-content:space-between !important}.align-content-md-around{align-content:space-around !important}.align-content-md-stretch{align-content:stretch !important}.align-self-md-auto{align-self:auto !important}.align-self-md-start{align-self:flex-start !important}.align-self-md-end{align-self:flex-end !important}.align-self-md-center{align-self:center !important}.align-self-md-baseline{align-self:baseline !important}.align-self-md-stretch{align-self:stretch !important}}@media (min-width: 992px){.flex-lg-row{flex-direction:row !important}.flex-lg-column{flex-direction:column !important}.flex-lg-row-reverse{flex-direction:row-reverse !important}.flex-lg-column-reverse{flex-direction:column-reverse !important}.flex-lg-wrap{flex-wrap:wrap !important}.flex-lg-nowrap{flex-wrap:nowrap !important}.flex-lg-wrap-reverse{flex-wrap:wrap-reverse !important}.flex-lg-fill{flex:1 1 auto !important}.flex-lg-grow-0{flex-grow:0 !important}.flex-lg-grow-1{flex-grow:1 !important}.flex-lg-shrink-0{flex-shrink:0 !important}.flex-lg-shrink-1{flex-shrink:1 
!important}.justify-content-lg-start{justify-content:flex-start !important}.justify-content-lg-end{justify-content:flex-end !important}.justify-content-lg-center{justify-content:center !important}.justify-content-lg-between{justify-content:space-between !important}.justify-content-lg-around{justify-content:space-around !important}.align-items-lg-start{align-items:flex-start !important}.align-items-lg-end{align-items:flex-end !important}.align-items-lg-center{align-items:center !important}.align-items-lg-baseline{align-items:baseline !important}.align-items-lg-stretch{align-items:stretch !important}.align-content-lg-start{align-content:flex-start !important}.align-content-lg-end{align-content:flex-end !important}.align-content-lg-center{align-content:center !important}.align-content-lg-between{align-content:space-between !important}.align-content-lg-around{align-content:space-around !important}.align-content-lg-stretch{align-content:stretch !important}.align-self-lg-auto{align-self:auto !important}.align-self-lg-start{align-self:flex-start !important}.align-self-lg-end{align-self:flex-end !important}.align-self-lg-center{align-self:center !important}.align-self-lg-baseline{align-self:baseline !important}.align-self-lg-stretch{align-self:stretch !important}}@media (min-width: 1200px){.flex-xl-row{flex-direction:row !important}.flex-xl-column{flex-direction:column !important}.flex-xl-row-reverse{flex-direction:row-reverse !important}.flex-xl-column-reverse{flex-direction:column-reverse !important}.flex-xl-wrap{flex-wrap:wrap !important}.flex-xl-nowrap{flex-wrap:nowrap !important}.flex-xl-wrap-reverse{flex-wrap:wrap-reverse !important}.flex-xl-fill{flex:1 1 auto !important}.flex-xl-grow-0{flex-grow:0 !important}.flex-xl-grow-1{flex-grow:1 !important}.flex-xl-shrink-0{flex-shrink:0 !important}.flex-xl-shrink-1{flex-shrink:1 !important}.justify-content-xl-start{justify-content:flex-start !important}.justify-content-xl-end{justify-content:flex-end !important}.justify-content-xl-center{justify-content:center !important}.justify-content-xl-between{justify-content:space-between !important}.justify-content-xl-around{justify-content:space-around !important}.align-items-xl-start{align-items:flex-start !important}.align-items-xl-end{align-items:flex-end !important}.align-items-xl-center{align-items:center !important}.align-items-xl-baseline{align-items:baseline !important}.align-items-xl-stretch{align-items:stretch !important}.align-content-xl-start{align-content:flex-start !important}.align-content-xl-end{align-content:flex-end !important}.align-content-xl-center{align-content:center !important}.align-content-xl-between{align-content:space-between !important}.align-content-xl-around{align-content:space-around !important}.align-content-xl-stretch{align-content:stretch !important}.align-self-xl-auto{align-self:auto !important}.align-self-xl-start{align-self:flex-start !important}.align-self-xl-end{align-self:flex-end !important}.align-self-xl-center{align-self:center !important}.align-self-xl-baseline{align-self:baseline !important}.align-self-xl-stretch{align-self:stretch !important}}.float-left{float:left !important}.float-right{float:right !important}.float-none{float:none !important}@media (min-width: 576px){.float-sm-left{float:left !important}.float-sm-right{float:right !important}.float-sm-none{float:none !important}}@media (min-width: 768px){.float-md-left{float:left !important}.float-md-right{float:right !important}.float-md-none{float:none !important}}@media (min-width: 
992px){.float-lg-left{float:left !important}.float-lg-right{float:right !important}.float-lg-none{float:none !important}}@media (min-width: 1200px){.float-xl-left{float:left !important}.float-xl-right{float:right !important}.float-xl-none{float:none !important}}.overflow-auto{overflow:auto !important}.overflow-hidden{overflow:hidden !important}.position-static{position:static !important}.position-relative{position:relative !important}.position-absolute{position:absolute !important}.position-fixed{position:fixed !important}.position-sticky{position:-webkit-sticky !important;position:sticky !important}.fixed-top{position:fixed;top:0;right:0;left:0;z-index:1030}.fixed-bottom{position:fixed;right:0;bottom:0;left:0;z-index:1030}@supports ((position: -webkit-sticky) or (position: sticky)){.sticky-top{position:-webkit-sticky;position:sticky;top:0;z-index:1020}}.sr-only{position:absolute;width:1px;height:1px;padding:0;overflow:hidden;clip:rect(0, 0, 0, 0);white-space:nowrap;border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;overflow:visible;clip:auto;white-space:normal}.shadow-sm{box-shadow:0 0.125rem 0.25rem rgba(0,0,0,0.075) !important}.shadow{box-shadow:0 0.5rem 1rem rgba(0,0,0,0.15) !important}.shadow-lg{box-shadow:0 1rem 3rem rgba(0,0,0,0.175) !important}.shadow-none{box-shadow:none !important}.w-25{width:25% !important}.w-50{width:50% !important}.w-75{width:75% !important}.w-100{width:100% !important}.w-auto{width:auto !important}.h-25{height:25% !important}.h-50{height:50% !important}.h-75{height:75% !important}.h-100{height:100% !important}.h-auto{height:auto !important}.mw-100{max-width:100% !important}.mh-100{max-height:100% !important}.min-vw-100{min-width:100vw !important}.min-vh-100{min-height:100vh !important}.vw-100{width:100vw !important}.vh-100{height:100vh !important}.stretched-link::after{position:absolute;top:0;right:0;bottom:0;left:0;z-index:1;pointer-events:auto;content:"";background-color:transparent}.m-0{margin:0 !important}.mt-0,.my-0{margin-top:0 !important}.mr-0,.mx-0{margin-right:0 !important}.mb-0,.my-0{margin-bottom:0 !important}.ml-0,.mx-0{margin-left:0 !important}.m-1{margin:.25rem !important}.mt-1,.my-1{margin-top:.25rem !important}.mr-1,.mx-1{margin-right:.25rem !important}.mb-1,.my-1{margin-bottom:.25rem !important}.ml-1,.mx-1{margin-left:.25rem !important}.m-2{margin:.5rem !important}.mt-2,.my-2{margin-top:.5rem !important}.mr-2,.mx-2{margin-right:.5rem !important}.mb-2,.my-2{margin-bottom:.5rem !important}.ml-2,.mx-2{margin-left:.5rem !important}.m-3{margin:1rem !important}.mt-3,.my-3{margin-top:1rem !important}.mr-3,.mx-3{margin-right:1rem !important}.mb-3,.my-3{margin-bottom:1rem !important}.ml-3,.mx-3{margin-left:1rem !important}.m-4{margin:1.5rem !important}.mt-4,.my-4{margin-top:1.5rem !important}.mr-4,.mx-4{margin-right:1.5rem !important}.mb-4,.my-4{margin-bottom:1.5rem !important}.ml-4,.mx-4{margin-left:1.5rem !important}.m-5{margin:3rem !important}.mt-5,.my-5{margin-top:3rem !important}.mr-5,.mx-5{margin-right:3rem !important}.mb-5,.my-5{margin-bottom:3rem !important}.ml-5,.mx-5{margin-left:3rem !important}.p-0{padding:0 !important}.pt-0,.py-0{padding-top:0 !important}.pr-0,.px-0{padding-right:0 !important}.pb-0,.py-0{padding-bottom:0 !important}.pl-0,.px-0{padding-left:0 !important}.p-1{padding:.25rem !important}.pt-1,.py-1{padding-top:.25rem !important}.pr-1,.px-1{padding-right:.25rem !important}.pb-1,.py-1{padding-bottom:.25rem !important}.pl-1,.px-1{padding-left:.25rem !important}.p-2{padding:.5rem 
!important}.pt-2,.py-2{padding-top:.5rem !important}.pr-2,.px-2{padding-right:.5rem !important}.pb-2,.py-2{padding-bottom:.5rem !important}.pl-2,.px-2{padding-left:.5rem !important}.p-3{padding:1rem !important}.pt-3,.py-3{padding-top:1rem !important}.pr-3,.px-3{padding-right:1rem !important}.pb-3,.py-3{padding-bottom:1rem !important}.pl-3,.px-3{padding-left:1rem !important}.p-4{padding:1.5rem !important}.pt-4,.py-4{padding-top:1.5rem !important}.pr-4,.px-4{padding-right:1.5rem !important}.pb-4,.py-4{padding-bottom:1.5rem !important}.pl-4,.px-4{padding-left:1.5rem !important}.p-5{padding:3rem !important}.pt-5,.py-5{padding-top:3rem !important}.pr-5,.px-5{padding-right:3rem !important}.pb-5,.py-5{padding-bottom:3rem !important}.pl-5,.px-5{padding-left:3rem !important}.m-n1{margin:-.25rem !important}.mt-n1,.my-n1{margin-top:-.25rem !important}.mr-n1,.mx-n1{margin-right:-.25rem !important}.mb-n1,.my-n1{margin-bottom:-.25rem !important}.ml-n1,.mx-n1{margin-left:-.25rem !important}.m-n2{margin:-.5rem !important}.mt-n2,.my-n2{margin-top:-.5rem !important}.mr-n2,.mx-n2{margin-right:-.5rem !important}.mb-n2,.my-n2{margin-bottom:-.5rem !important}.ml-n2,.mx-n2{margin-left:-.5rem !important}.m-n3{margin:-1rem !important}.mt-n3,.my-n3{margin-top:-1rem !important}.mr-n3,.mx-n3{margin-right:-1rem !important}.mb-n3,.my-n3{margin-bottom:-1rem !important}.ml-n3,.mx-n3{margin-left:-1rem !important}.m-n4{margin:-1.5rem !important}.mt-n4,.my-n4{margin-top:-1.5rem !important}.mr-n4,.mx-n4{margin-right:-1.5rem !important}.mb-n4,.my-n4{margin-bottom:-1.5rem !important}.ml-n4,.mx-n4{margin-left:-1.5rem !important}.m-n5{margin:-3rem !important}.mt-n5,.my-n5{margin-top:-3rem !important}.mr-n5,.mx-n5{margin-right:-3rem !important}.mb-n5,.my-n5{margin-bottom:-3rem !important}.ml-n5,.mx-n5{margin-left:-3rem !important}.m-auto{margin:auto !important}.mt-auto,.my-auto{margin-top:auto !important}.mr-auto,.mx-auto{margin-right:auto !important}.mb-auto,.my-auto{margin-bottom:auto !important}.ml-auto,.mx-auto{margin-left:auto !important}@media (min-width: 576px){.m-sm-0{margin:0 !important}.mt-sm-0,.my-sm-0{margin-top:0 !important}.mr-sm-0,.mx-sm-0{margin-right:0 !important}.mb-sm-0,.my-sm-0{margin-bottom:0 !important}.ml-sm-0,.mx-sm-0{margin-left:0 !important}.m-sm-1{margin:.25rem !important}.mt-sm-1,.my-sm-1{margin-top:.25rem !important}.mr-sm-1,.mx-sm-1{margin-right:.25rem !important}.mb-sm-1,.my-sm-1{margin-bottom:.25rem !important}.ml-sm-1,.mx-sm-1{margin-left:.25rem !important}.m-sm-2{margin:.5rem !important}.mt-sm-2,.my-sm-2{margin-top:.5rem !important}.mr-sm-2,.mx-sm-2{margin-right:.5rem !important}.mb-sm-2,.my-sm-2{margin-bottom:.5rem !important}.ml-sm-2,.mx-sm-2{margin-left:.5rem !important}.m-sm-3{margin:1rem !important}.mt-sm-3,.my-sm-3{margin-top:1rem !important}.mr-sm-3,.mx-sm-3{margin-right:1rem !important}.mb-sm-3,.my-sm-3{margin-bottom:1rem !important}.ml-sm-3,.mx-sm-3{margin-left:1rem !important}.m-sm-4{margin:1.5rem !important}.mt-sm-4,.my-sm-4{margin-top:1.5rem !important}.mr-sm-4,.mx-sm-4{margin-right:1.5rem !important}.mb-sm-4,.my-sm-4{margin-bottom:1.5rem !important}.ml-sm-4,.mx-sm-4{margin-left:1.5rem !important}.m-sm-5{margin:3rem !important}.mt-sm-5,.my-sm-5{margin-top:3rem !important}.mr-sm-5,.mx-sm-5{margin-right:3rem !important}.mb-sm-5,.my-sm-5{margin-bottom:3rem !important}.ml-sm-5,.mx-sm-5{margin-left:3rem !important}.p-sm-0{padding:0 !important}.pt-sm-0,.py-sm-0{padding-top:0 !important}.pr-sm-0,.px-sm-0{padding-right:0 !important}.pb-sm-0,.py-sm-0{padding-bottom:0 
!important}.pl-sm-0,.px-sm-0{padding-left:0 !important}.p-sm-1{padding:.25rem !important}.pt-sm-1,.py-sm-1{padding-top:.25rem !important}.pr-sm-1,.px-sm-1{padding-right:.25rem !important}.pb-sm-1,.py-sm-1{padding-bottom:.25rem !important}.pl-sm-1,.px-sm-1{padding-left:.25rem !important}.p-sm-2{padding:.5rem !important}.pt-sm-2,.py-sm-2{padding-top:.5rem !important}.pr-sm-2,.px-sm-2{padding-right:.5rem !important}.pb-sm-2,.py-sm-2{padding-bottom:.5rem !important}.pl-sm-2,.px-sm-2{padding-left:.5rem !important}.p-sm-3{padding:1rem !important}.pt-sm-3,.py-sm-3{padding-top:1rem !important}.pr-sm-3,.px-sm-3{padding-right:1rem !important}.pb-sm-3,.py-sm-3{padding-bottom:1rem !important}.pl-sm-3,.px-sm-3{padding-left:1rem !important}.p-sm-4{padding:1.5rem !important}.pt-sm-4,.py-sm-4{padding-top:1.5rem !important}.pr-sm-4,.px-sm-4{padding-right:1.5rem !important}.pb-sm-4,.py-sm-4{padding-bottom:1.5rem !important}.pl-sm-4,.px-sm-4{padding-left:1.5rem !important}.p-sm-5{padding:3rem !important}.pt-sm-5,.py-sm-5{padding-top:3rem !important}.pr-sm-5,.px-sm-5{padding-right:3rem !important}.pb-sm-5,.py-sm-5{padding-bottom:3rem !important}.pl-sm-5,.px-sm-5{padding-left:3rem !important}.m-sm-n1{margin:-.25rem !important}.mt-sm-n1,.my-sm-n1{margin-top:-.25rem !important}.mr-sm-n1,.mx-sm-n1{margin-right:-.25rem !important}.mb-sm-n1,.my-sm-n1{margin-bottom:-.25rem !important}.ml-sm-n1,.mx-sm-n1{margin-left:-.25rem !important}.m-sm-n2{margin:-.5rem !important}.mt-sm-n2,.my-sm-n2{margin-top:-.5rem !important}.mr-sm-n2,.mx-sm-n2{margin-right:-.5rem !important}.mb-sm-n2,.my-sm-n2{margin-bottom:-.5rem !important}.ml-sm-n2,.mx-sm-n2{margin-left:-.5rem !important}.m-sm-n3{margin:-1rem !important}.mt-sm-n3,.my-sm-n3{margin-top:-1rem !important}.mr-sm-n3,.mx-sm-n3{margin-right:-1rem !important}.mb-sm-n3,.my-sm-n3{margin-bottom:-1rem !important}.ml-sm-n3,.mx-sm-n3{margin-left:-1rem !important}.m-sm-n4{margin:-1.5rem !important}.mt-sm-n4,.my-sm-n4{margin-top:-1.5rem !important}.mr-sm-n4,.mx-sm-n4{margin-right:-1.5rem !important}.mb-sm-n4,.my-sm-n4{margin-bottom:-1.5rem !important}.ml-sm-n4,.mx-sm-n4{margin-left:-1.5rem !important}.m-sm-n5{margin:-3rem !important}.mt-sm-n5,.my-sm-n5{margin-top:-3rem !important}.mr-sm-n5,.mx-sm-n5{margin-right:-3rem !important}.mb-sm-n5,.my-sm-n5{margin-bottom:-3rem !important}.ml-sm-n5,.mx-sm-n5{margin-left:-3rem !important}.m-sm-auto{margin:auto !important}.mt-sm-auto,.my-sm-auto{margin-top:auto !important}.mr-sm-auto,.mx-sm-auto{margin-right:auto !important}.mb-sm-auto,.my-sm-auto{margin-bottom:auto !important}.ml-sm-auto,.mx-sm-auto{margin-left:auto !important}}@media (min-width: 768px){.m-md-0{margin:0 !important}.mt-md-0,.my-md-0{margin-top:0 !important}.mr-md-0,.mx-md-0{margin-right:0 !important}.mb-md-0,.my-md-0{margin-bottom:0 !important}.ml-md-0,.mx-md-0{margin-left:0 !important}.m-md-1{margin:.25rem !important}.mt-md-1,.my-md-1{margin-top:.25rem !important}.mr-md-1,.mx-md-1{margin-right:.25rem !important}.mb-md-1,.my-md-1{margin-bottom:.25rem !important}.ml-md-1,.mx-md-1{margin-left:.25rem !important}.m-md-2{margin:.5rem !important}.mt-md-2,.my-md-2{margin-top:.5rem !important}.mr-md-2,.mx-md-2{margin-right:.5rem !important}.mb-md-2,.my-md-2{margin-bottom:.5rem !important}.ml-md-2,.mx-md-2{margin-left:.5rem !important}.m-md-3{margin:1rem !important}.mt-md-3,.my-md-3{margin-top:1rem !important}.mr-md-3,.mx-md-3{margin-right:1rem !important}.mb-md-3,.my-md-3{margin-bottom:1rem !important}.ml-md-3,.mx-md-3{margin-left:1rem !important}.m-md-4{margin:1.5rem 
!important}.mt-md-4,.my-md-4{margin-top:1.5rem !important}.mr-md-4,.mx-md-4{margin-right:1.5rem !important}.mb-md-4,.my-md-4{margin-bottom:1.5rem !important}.ml-md-4,.mx-md-4{margin-left:1.5rem !important}.m-md-5{margin:3rem !important}.mt-md-5,.my-md-5{margin-top:3rem !important}.mr-md-5,.mx-md-5{margin-right:3rem !important}.mb-md-5,.my-md-5{margin-bottom:3rem !important}.ml-md-5,.mx-md-5{margin-left:3rem !important}.p-md-0{padding:0 !important}.pt-md-0,.py-md-0{padding-top:0 !important}.pr-md-0,.px-md-0{padding-right:0 !important}.pb-md-0,.py-md-0{padding-bottom:0 !important}.pl-md-0,.px-md-0{padding-left:0 !important}.p-md-1{padding:.25rem !important}.pt-md-1,.py-md-1{padding-top:.25rem !important}.pr-md-1,.px-md-1{padding-right:.25rem !important}.pb-md-1,.py-md-1{padding-bottom:.25rem !important}.pl-md-1,.px-md-1{padding-left:.25rem !important}.p-md-2{padding:.5rem !important}.pt-md-2,.py-md-2{padding-top:.5rem !important}.pr-md-2,.px-md-2{padding-right:.5rem !important}.pb-md-2,.py-md-2{padding-bottom:.5rem !important}.pl-md-2,.px-md-2{padding-left:.5rem !important}.p-md-3{padding:1rem !important}.pt-md-3,.py-md-3{padding-top:1rem !important}.pr-md-3,.px-md-3{padding-right:1rem !important}.pb-md-3,.py-md-3{padding-bottom:1rem !important}.pl-md-3,.px-md-3{padding-left:1rem !important}.p-md-4{padding:1.5rem !important}.pt-md-4,.py-md-4{padding-top:1.5rem !important}.pr-md-4,.px-md-4{padding-right:1.5rem !important}.pb-md-4,.py-md-4{padding-bottom:1.5rem !important}.pl-md-4,.px-md-4{padding-left:1.5rem !important}.p-md-5{padding:3rem !important}.pt-md-5,.py-md-5{padding-top:3rem !important}.pr-md-5,.px-md-5{padding-right:3rem !important}.pb-md-5,.py-md-5{padding-bottom:3rem !important}.pl-md-5,.px-md-5{padding-left:3rem !important}.m-md-n1{margin:-.25rem !important}.mt-md-n1,.my-md-n1{margin-top:-.25rem !important}.mr-md-n1,.mx-md-n1{margin-right:-.25rem !important}.mb-md-n1,.my-md-n1{margin-bottom:-.25rem !important}.ml-md-n1,.mx-md-n1{margin-left:-.25rem !important}.m-md-n2{margin:-.5rem !important}.mt-md-n2,.my-md-n2{margin-top:-.5rem !important}.mr-md-n2,.mx-md-n2{margin-right:-.5rem !important}.mb-md-n2,.my-md-n2{margin-bottom:-.5rem !important}.ml-md-n2,.mx-md-n2{margin-left:-.5rem !important}.m-md-n3{margin:-1rem !important}.mt-md-n3,.my-md-n3{margin-top:-1rem !important}.mr-md-n3,.mx-md-n3{margin-right:-1rem !important}.mb-md-n3,.my-md-n3{margin-bottom:-1rem !important}.ml-md-n3,.mx-md-n3{margin-left:-1rem !important}.m-md-n4{margin:-1.5rem !important}.mt-md-n4,.my-md-n4{margin-top:-1.5rem !important}.mr-md-n4,.mx-md-n4{margin-right:-1.5rem !important}.mb-md-n4,.my-md-n4{margin-bottom:-1.5rem !important}.ml-md-n4,.mx-md-n4{margin-left:-1.5rem !important}.m-md-n5{margin:-3rem !important}.mt-md-n5,.my-md-n5{margin-top:-3rem !important}.mr-md-n5,.mx-md-n5{margin-right:-3rem !important}.mb-md-n5,.my-md-n5{margin-bottom:-3rem !important}.ml-md-n5,.mx-md-n5{margin-left:-3rem !important}.m-md-auto{margin:auto !important}.mt-md-auto,.my-md-auto{margin-top:auto !important}.mr-md-auto,.mx-md-auto{margin-right:auto !important}.mb-md-auto,.my-md-auto{margin-bottom:auto !important}.ml-md-auto,.mx-md-auto{margin-left:auto !important}}@media (min-width: 992px){.m-lg-0{margin:0 !important}.mt-lg-0,.my-lg-0{margin-top:0 !important}.mr-lg-0,.mx-lg-0{margin-right:0 !important}.mb-lg-0,.my-lg-0{margin-bottom:0 !important}.ml-lg-0,.mx-lg-0{margin-left:0 !important}.m-lg-1{margin:.25rem !important}.mt-lg-1,.my-lg-1{margin-top:.25rem !important}.mr-lg-1,.mx-lg-1{margin-right:.25rem 
!important}.mb-lg-1,.my-lg-1{margin-bottom:.25rem !important}.ml-lg-1,.mx-lg-1{margin-left:.25rem !important}.m-lg-2{margin:.5rem !important}.mt-lg-2,.my-lg-2{margin-top:.5rem !important}.mr-lg-2,.mx-lg-2{margin-right:.5rem !important}.mb-lg-2,.my-lg-2{margin-bottom:.5rem !important}.ml-lg-2,.mx-lg-2{margin-left:.5rem !important}.m-lg-3{margin:1rem !important}.mt-lg-3,.my-lg-3{margin-top:1rem !important}.mr-lg-3,.mx-lg-3{margin-right:1rem !important}.mb-lg-3,.my-lg-3{margin-bottom:1rem !important}.ml-lg-3,.mx-lg-3{margin-left:1rem !important}.m-lg-4{margin:1.5rem !important}.mt-lg-4,.my-lg-4{margin-top:1.5rem !important}.mr-lg-4,.mx-lg-4{margin-right:1.5rem !important}.mb-lg-4,.my-lg-4{margin-bottom:1.5rem !important}.ml-lg-4,.mx-lg-4{margin-left:1.5rem !important}.m-lg-5{margin:3rem !important}.mt-lg-5,.my-lg-5{margin-top:3rem !important}.mr-lg-5,.mx-lg-5{margin-right:3rem !important}.mb-lg-5,.my-lg-5{margin-bottom:3rem !important}.ml-lg-5,.mx-lg-5{margin-left:3rem !important}.p-lg-0{padding:0 !important}.pt-lg-0,.py-lg-0{padding-top:0 !important}.pr-lg-0,.px-lg-0{padding-right:0 !important}.pb-lg-0,.py-lg-0{padding-bottom:0 !important}.pl-lg-0,.px-lg-0{padding-left:0 !important}.p-lg-1{padding:.25rem !important}.pt-lg-1,.py-lg-1{padding-top:.25rem !important}.pr-lg-1,.px-lg-1{padding-right:.25rem !important}.pb-lg-1,.py-lg-1{padding-bottom:.25rem !important}.pl-lg-1,.px-lg-1{padding-left:.25rem !important}.p-lg-2{padding:.5rem !important}.pt-lg-2,.py-lg-2{padding-top:.5rem !important}.pr-lg-2,.px-lg-2{padding-right:.5rem !important}.pb-lg-2,.py-lg-2{padding-bottom:.5rem !important}.pl-lg-2,.px-lg-2{padding-left:.5rem !important}.p-lg-3{padding:1rem !important}.pt-lg-3,.py-lg-3{padding-top:1rem !important}.pr-lg-3,.px-lg-3{padding-right:1rem !important}.pb-lg-3,.py-lg-3{padding-bottom:1rem !important}.pl-lg-3,.px-lg-3{padding-left:1rem !important}.p-lg-4{padding:1.5rem !important}.pt-lg-4,.py-lg-4{padding-top:1.5rem !important}.pr-lg-4,.px-lg-4{padding-right:1.5rem !important}.pb-lg-4,.py-lg-4{padding-bottom:1.5rem !important}.pl-lg-4,.px-lg-4{padding-left:1.5rem !important}.p-lg-5{padding:3rem !important}.pt-lg-5,.py-lg-5{padding-top:3rem !important}.pr-lg-5,.px-lg-5{padding-right:3rem !important}.pb-lg-5,.py-lg-5{padding-bottom:3rem !important}.pl-lg-5,.px-lg-5{padding-left:3rem !important}.m-lg-n1{margin:-.25rem !important}.mt-lg-n1,.my-lg-n1{margin-top:-.25rem !important}.mr-lg-n1,.mx-lg-n1{margin-right:-.25rem !important}.mb-lg-n1,.my-lg-n1{margin-bottom:-.25rem !important}.ml-lg-n1,.mx-lg-n1{margin-left:-.25rem !important}.m-lg-n2{margin:-.5rem !important}.mt-lg-n2,.my-lg-n2{margin-top:-.5rem !important}.mr-lg-n2,.mx-lg-n2{margin-right:-.5rem !important}.mb-lg-n2,.my-lg-n2{margin-bottom:-.5rem !important}.ml-lg-n2,.mx-lg-n2{margin-left:-.5rem !important}.m-lg-n3{margin:-1rem !important}.mt-lg-n3,.my-lg-n3{margin-top:-1rem !important}.mr-lg-n3,.mx-lg-n3{margin-right:-1rem !important}.mb-lg-n3,.my-lg-n3{margin-bottom:-1rem !important}.ml-lg-n3,.mx-lg-n3{margin-left:-1rem !important}.m-lg-n4{margin:-1.5rem !important}.mt-lg-n4,.my-lg-n4{margin-top:-1.5rem !important}.mr-lg-n4,.mx-lg-n4{margin-right:-1.5rem !important}.mb-lg-n4,.my-lg-n4{margin-bottom:-1.5rem !important}.ml-lg-n4,.mx-lg-n4{margin-left:-1.5rem !important}.m-lg-n5{margin:-3rem !important}.mt-lg-n5,.my-lg-n5{margin-top:-3rem !important}.mr-lg-n5,.mx-lg-n5{margin-right:-3rem !important}.mb-lg-n5,.my-lg-n5{margin-bottom:-3rem !important}.ml-lg-n5,.mx-lg-n5{margin-left:-3rem !important}.m-lg-auto{margin:auto 
!important}.mt-lg-auto,.my-lg-auto{margin-top:auto !important}.mr-lg-auto,.mx-lg-auto{margin-right:auto !important}.mb-lg-auto,.my-lg-auto{margin-bottom:auto !important}.ml-lg-auto,.mx-lg-auto{margin-left:auto !important}}@media (min-width: 1200px){.m-xl-0{margin:0 !important}.mt-xl-0,.my-xl-0{margin-top:0 !important}.mr-xl-0,.mx-xl-0{margin-right:0 !important}.mb-xl-0,.my-xl-0{margin-bottom:0 !important}.ml-xl-0,.mx-xl-0{margin-left:0 !important}.m-xl-1{margin:.25rem !important}.mt-xl-1,.my-xl-1{margin-top:.25rem !important}.mr-xl-1,.mx-xl-1{margin-right:.25rem !important}.mb-xl-1,.my-xl-1{margin-bottom:.25rem !important}.ml-xl-1,.mx-xl-1{margin-left:.25rem !important}.m-xl-2{margin:.5rem !important}.mt-xl-2,.my-xl-2{margin-top:.5rem !important}.mr-xl-2,.mx-xl-2{margin-right:.5rem !important}.mb-xl-2,.my-xl-2{margin-bottom:.5rem !important}.ml-xl-2,.mx-xl-2{margin-left:.5rem !important}.m-xl-3{margin:1rem !important}.mt-xl-3,.my-xl-3{margin-top:1rem !important}.mr-xl-3,.mx-xl-3{margin-right:1rem !important}.mb-xl-3,.my-xl-3{margin-bottom:1rem !important}.ml-xl-3,.mx-xl-3{margin-left:1rem !important}.m-xl-4{margin:1.5rem !important}.mt-xl-4,.my-xl-4{margin-top:1.5rem !important}.mr-xl-4,.mx-xl-4{margin-right:1.5rem !important}.mb-xl-4,.my-xl-4{margin-bottom:1.5rem !important}.ml-xl-4,.mx-xl-4{margin-left:1.5rem !important}.m-xl-5{margin:3rem !important}.mt-xl-5,.my-xl-5{margin-top:3rem !important}.mr-xl-5,.mx-xl-5{margin-right:3rem !important}.mb-xl-5,.my-xl-5{margin-bottom:3rem !important}.ml-xl-5,.mx-xl-5{margin-left:3rem !important}.p-xl-0{padding:0 !important}.pt-xl-0,.py-xl-0{padding-top:0 !important}.pr-xl-0,.px-xl-0{padding-right:0 !important}.pb-xl-0,.py-xl-0{padding-bottom:0 !important}.pl-xl-0,.px-xl-0{padding-left:0 !important}.p-xl-1{padding:.25rem !important}.pt-xl-1,.py-xl-1{padding-top:.25rem !important}.pr-xl-1,.px-xl-1{padding-right:.25rem !important}.pb-xl-1,.py-xl-1{padding-bottom:.25rem !important}.pl-xl-1,.px-xl-1{padding-left:.25rem !important}.p-xl-2{padding:.5rem !important}.pt-xl-2,.py-xl-2{padding-top:.5rem !important}.pr-xl-2,.px-xl-2{padding-right:.5rem !important}.pb-xl-2,.py-xl-2{padding-bottom:.5rem !important}.pl-xl-2,.px-xl-2{padding-left:.5rem !important}.p-xl-3{padding:1rem !important}.pt-xl-3,.py-xl-3{padding-top:1rem !important}.pr-xl-3,.px-xl-3{padding-right:1rem !important}.pb-xl-3,.py-xl-3{padding-bottom:1rem !important}.pl-xl-3,.px-xl-3{padding-left:1rem !important}.p-xl-4{padding:1.5rem !important}.pt-xl-4,.py-xl-4{padding-top:1.5rem !important}.pr-xl-4,.px-xl-4{padding-right:1.5rem !important}.pb-xl-4,.py-xl-4{padding-bottom:1.5rem !important}.pl-xl-4,.px-xl-4{padding-left:1.5rem !important}.p-xl-5{padding:3rem !important}.pt-xl-5,.py-xl-5{padding-top:3rem !important}.pr-xl-5,.px-xl-5{padding-right:3rem !important}.pb-xl-5,.py-xl-5{padding-bottom:3rem !important}.pl-xl-5,.px-xl-5{padding-left:3rem !important}.m-xl-n1{margin:-.25rem !important}.mt-xl-n1,.my-xl-n1{margin-top:-.25rem !important}.mr-xl-n1,.mx-xl-n1{margin-right:-.25rem !important}.mb-xl-n1,.my-xl-n1{margin-bottom:-.25rem !important}.ml-xl-n1,.mx-xl-n1{margin-left:-.25rem !important}.m-xl-n2{margin:-.5rem !important}.mt-xl-n2,.my-xl-n2{margin-top:-.5rem !important}.mr-xl-n2,.mx-xl-n2{margin-right:-.5rem !important}.mb-xl-n2,.my-xl-n2{margin-bottom:-.5rem !important}.ml-xl-n2,.mx-xl-n2{margin-left:-.5rem !important}.m-xl-n3{margin:-1rem !important}.mt-xl-n3,.my-xl-n3{margin-top:-1rem !important}.mr-xl-n3,.mx-xl-n3{margin-right:-1rem !important}.mb-xl-n3,.my-xl-n3{margin-bottom:-1rem 
!important}.ml-xl-n3,.mx-xl-n3{margin-left:-1rem !important}.m-xl-n4{margin:-1.5rem !important}.mt-xl-n4,.my-xl-n4{margin-top:-1.5rem !important}.mr-xl-n4,.mx-xl-n4{margin-right:-1.5rem !important}.mb-xl-n4,.my-xl-n4{margin-bottom:-1.5rem !important}.ml-xl-n4,.mx-xl-n4{margin-left:-1.5rem !important}.m-xl-n5{margin:-3rem !important}.mt-xl-n5,.my-xl-n5{margin-top:-3rem !important}.mr-xl-n5,.mx-xl-n5{margin-right:-3rem !important}.mb-xl-n5,.my-xl-n5{margin-bottom:-3rem !important}.ml-xl-n5,.mx-xl-n5{margin-left:-3rem !important}.m-xl-auto{margin:auto !important}.mt-xl-auto,.my-xl-auto{margin-top:auto !important}.mr-xl-auto,.mx-xl-auto{margin-right:auto !important}.mb-xl-auto,.my-xl-auto{margin-bottom:auto !important}.ml-xl-auto,.mx-xl-auto{margin-left:auto !important}}.text-monospace{font-family:SFMono-Regular,Menlo,Monaco,Consolas,"Liberation Mono","Courier New",monospace !important}.text-justify{text-align:justify !important}.text-wrap{white-space:normal !important}.text-nowrap{white-space:nowrap !important}.text-truncate{overflow:hidden;text-overflow:ellipsis;white-space:nowrap}.text-left{text-align:left !important}.text-right{text-align:right !important}.text-center{text-align:center !important}@media (min-width: 576px){.text-sm-left{text-align:left !important}.text-sm-right{text-align:right !important}.text-sm-center{text-align:center !important}}@media (min-width: 768px){.text-md-left{text-align:left !important}.text-md-right{text-align:right !important}.text-md-center{text-align:center !important}}@media (min-width: 992px){.text-lg-left{text-align:left !important}.text-lg-right{text-align:right !important}.text-lg-center{text-align:center !important}}@media (min-width: 1200px){.text-xl-left{text-align:left !important}.text-xl-right{text-align:right !important}.text-xl-center{text-align:center !important}}.text-lowercase{text-transform:lowercase !important}.text-uppercase{text-transform:uppercase !important}.text-capitalize{text-transform:capitalize !important}.font-weight-light{font-weight:300 !important}.font-weight-lighter{font-weight:lighter !important}.font-weight-normal{font-weight:400 !important}.font-weight-bold{font-weight:700 !important}.font-weight-bolder{font-weight:bolder !important}.font-italic{font-style:italic !important}.text-white{color:#fff !important}.text-primary{color:#007bff !important}a.text-primary:hover,a.text-primary:focus{color:#0056b3 !important}.text-secondary{color:#6c757d !important}a.text-secondary:hover,a.text-secondary:focus{color:#494f54 !important}.text-success{color:#28a745 !important}a.text-success:hover,a.text-success:focus{color:#19692c !important}.text-info{color:#17a2b8 !important}a.text-info:hover,a.text-info:focus{color:#0f6674 !important}.text-warning{color:#ffc107 !important}a.text-warning:hover,a.text-warning:focus{color:#ba8b00 !important}.text-danger{color:#dc3545 !important}a.text-danger:hover,a.text-danger:focus{color:#a71d2a !important}.text-light{color:#f8f9fa !important}a.text-light:hover,a.text-light:focus{color:#cbd3da !important}.text-dark{color:#343a40 !important}a.text-dark:hover,a.text-dark:focus{color:#121416 !important}.text-body{color:#212529 !important}.text-muted{color:#6c757d !important}.text-black-50{color:rgba(0,0,0,0.5) !important}.text-white-50{color:rgba(255,255,255,0.5) !important}.text-hide{font:0/0 a;color:transparent;text-shadow:none;background-color:transparent;border:0}.text-decoration-none{text-decoration:none !important}.text-break{word-break:break-word !important;overflow-wrap:break-word 
!important}.text-reset{color:inherit !important}.visible{visibility:visible !important}.invisible{visibility:hidden !important}@media print{*,*::before,*::after{text-shadow:none !important;box-shadow:none !important}a:not(.btn){text-decoration:underline}abbr[title]::after{content:" (" attr(title) ")"}pre{white-space:pre-wrap !important}pre,blockquote{border:1px solid #adb5bd;page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}p,h2,h3{orphans:3;widows:3}h2,h3{page-break-after:avoid}@page{size:a3}body{min-width:992px !important}.container{min-width:992px !important}.navbar{display:none}.badge{border:1px solid #000}.table{border-collapse:collapse !important}.table td,.table th{background-color:#fff !important}.table-bordered th,.table-bordered td{border:1px solid #dee2e6 !important}.table-dark{color:inherit}.table-dark th,.table-dark td,.table-dark thead th,.table-dark tbody+tbody{border-color:#dee2e6}.table .thead-dark th{color:inherit;border-color:#dee2e6}}.highlight table td{padding:5px}.highlight table pre{margin:0}.highlight .cm{color:#999988;font-style:italic}.highlight .cp{color:#999999;font-weight:bold}.highlight .c1{color:#999988;font-style:italic}.highlight .cs{color:#999999;font-weight:bold;font-style:italic}.highlight .c,.highlight .cd{color:#8c8c8c;font-style:italic}.highlight .err{color:#a61717;background-color:#e3d2d2}.highlight .gd{color:#000000;background-color:#ffdddd}.highlight .ge{color:#000000;font-style:italic}.highlight .gr{color:#aa0000}.highlight .gh{color:#999999}.highlight .gi{color:#000000;background-color:#ddffdd}.highlight .go{color:#888888}.highlight .gp{color:#555555}.highlight .gs{font-weight:bold}.highlight .gu{color:#aaaaaa}.highlight .gt{color:#aa0000}.highlight .kc{color:#000000;font-weight:bold}.highlight .kd{color:#000000;font-weight:bold}.highlight .kn{color:#000000;font-weight:bold}.highlight .kp{color:#000000;font-weight:bold}.highlight .kr{color:#000000;font-weight:bold}.highlight .kt{color:#445588;font-weight:bold}.highlight .k,.highlight .kv{color:#000000;font-weight:bold}.highlight .mf{color:#009999}.highlight .mh{color:#009999}.highlight .il{color:#009999}.highlight .mi{color:#009999}.highlight .mo{color:#009999}.highlight .m,.highlight .mb,.highlight .mx{color:#009999}.highlight .sb{color:#d14}.highlight .sc{color:#d14}.highlight .sd{color:#d14}.highlight .s2{color:#d14}.highlight .se{color:#d14}.highlight .sh{color:#d14}.highlight .si{color:#d14}.highlight .sx{color:#d14}.highlight .sr{color:#009926}.highlight .s1{color:#d14}.highlight .ss{color:#990073}.highlight .s{color:#d14}.highlight .na{color:#008080}.highlight .bp{color:#999999}.highlight .nb{color:#0086B3}.highlight .nc{color:#445588;font-weight:bold}.highlight .no{color:#008080}.highlight .nd{color:#3c5d5d;font-weight:bold}.highlight .ni{color:#800080}.highlight .ne{color:#990000;font-weight:bold}.highlight .nf{color:#990000;font-weight:bold}.highlight .nl{color:#990000;font-weight:bold}.highlight .nn{color:#555555}.highlight .nt{color:#000080}.highlight .vc{color:#008080}.highlight .vg{color:#008080}.highlight .vi{color:#008080}.highlight .nv{color:#008080}.highlight .ow{color:#000000;font-weight:bold}.highlight .o{color:#000000;font-weight:bold}.highlight .w{color:#bbbbbb}.highlight{background-color:#f8f8f8}.container{padding-left:30px;padding-right:30px;max-width:1240px}.container-fluid{padding-left:0;padding-right:0}@font-face{font-family:FreightSans;font-weight:700;font-style:normal;src:url("/assets/fonts/FreightSans/freight-sans-bold.woff2") 
format("woff2"),url("../fonts/FreightSans/freight-sans-bold.woff") format("woff")}@font-face{font-family:FreightSans;font-weight:700;font-style:italic;src:url("/assets/fonts/FreightSans/freight-sans-bold-italic.woff2") format("woff2"),url("/assets/fonts/FreightSans/freight-sans-bold-italic.woff") format("woff")}@font-face{font-family:FreightSans;font-weight:500;font-style:normal;src:url("/assets/fonts/FreightSans/freight-sans-medium.woff2") format("woff2"),url("/assets/fonts/FreightSans/freight-sans-medium.woff") format("woff")}@font-face{font-family:FreightSans;font-weight:500;font-style:italic;src:url("/assets/fonts/FreightSans/freight-sans-medium-italic.woff2") format("woff2"),url("/assets/fonts/FreightSans/freight-sans-medium-italic.woff") format("woff")}@font-face{font-family:FreightSans;font-weight:100;font-style:normal;src:url("/assets/fonts/FreightSans/freight-sans-light.woff2") format("woff2"),url("/assets/fonts/FreightSans/freight-sans-light.woff") format("woff")}@font-face{font-family:FreightSans;font-weight:100;font-style:italic;src:url("/assets/fonts/FreightSans/freight-sans-light-italic.woff2") format("woff2"),url("/assets/fonts/FreightSans/freight-sans-light-italic.woff") format("woff")}@font-face{font-family:FreightSans;font-weight:400;font-style:italic;src:url("/assets/fonts/FreightSans/freight-sans-book-italic.woff2") format("woff2"),url("/assets/fonts/FreightSans/freight-sans-book-italic.woff") format("woff")}@font-face{font-family:FreightSans;font-weight:400;font-style:normal;src:url("/assets/fonts/FreightSans/freight-sans-book.woff2") format("woff2"),url("/assets/FreightSans/freight-sans-book.woff") format("woff")}@font-face{font-family:IBMPlexMono;font-weight:600;font-style:normal;unicode-range:u+0020-007f;src:local("IBMPlexMono-SemiBold"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff2") format("woff2"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-SemiBold.woff") format("woff")}@font-face{font-family:IBMPlexMono;font-weight:500;font-style:normal;unicode-range:u+0020-007f;src:local("IBMPlexMono-Medium"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff2") format("woff2"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-Medium.woff") format("woff")}@font-face{font-family:IBMPlexMono;font-weight:400;font-style:normal;unicode-range:u+0020-007f;src:local("IBMPlexMono-Regular"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff2") format("woff2"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-Regular.woff") format("woff")}@font-face{font-family:IBMPlexMono;font-weight:300;font-style:normal;unicode-range:u+0020-007f;src:local("IBMPlexMono-Light"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff2") format("woff2"),url("/assets/fonts/IBMPlexMono/IBMPlexMono-Light.woff") format("woff")}*{font-family:Nanum Gothic, FreightSans, Helvetica Neue, Helvetica, Arial, sans-serif;font-weight:400}h1,h2,h3,h4,h5,h6{font-family:Nanum Gothic}p{margin-bottom:1.25rem}a,em,i,b,strong,u,span{font-size:inherit}a:link,a:visited,a:hover{text-decoration:none;color:#ee4c2c}p a:link,p a:visited,p a:hover{color:#ee4c2c;text-decoration:none}@media screen and (min-width: 768px){p 
a:hover{text-decoration:underline}}.btn,a.btn{border-radius:0;border:none;background-color:#f3f4f7;color:#6c6c6d;font-weight:400;position:relative;letter-spacing:0.25px}.btn.btn-lg,.btn-group-lg>.btn,a.btn.btn-lg,.btn-group-lg>a.btn{font-size:1.125rem;padding-top:.5rem}.btn.btn-white,a.btn.btn-white{background-color:#fff}.btn.btn-orange,a.btn.btn-orange{background-color:#ee4c2c}.btn.btn-demo,a.btn.btn-demo{color:#fff}@media screen and (min-width: 768px){.btn:after,a.btn:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.btn:hover:after,a.btn:hover:after{width:100%}.btn:hover,a.btn:hover{color:#262626}}.navbar{padding-left:0;padding-right:0}html{position:relative;min-height:100%;font-size:12px}@media screen and (min-width: 768px){html{font-size:14px}}@media screen and (min-width: 768px){body{margin:0 0 350px}}body.no-scroll{height:100%;overflow:hidden}a.with-right-arrow,.btn.with-right-arrow{padding-right:2rem;position:relative;background-image:url("/assets/images/chevron-right-orange.svg");background-size:6px 13px;background-position:top 10px right 11px;background-repeat:no-repeat}@media screen and (min-width: 768px){a.with-right-arrow,.btn.with-right-arrow{background-size:8px 14px;background-position:top 15px right 12px;padding-right:2rem}}a.with-left-arrow,.btn.with-left-arrow{padding-left:2rem;position:relative;background-image:url("/assets/images/chevron-left-grey.svg");background-size:6px 13px;background-position:top 10px left 11px;background-repeat:no-repeat}@media screen and (min-width: 768px){a.with-left-arrow,.btn.with-left-arrow{background-size:8px 14px;background-position:top 16px left 12px;padding-left:2rem}}.main-background{position:absolute;top:0;left:0;width:100%;height:350px;background-size:cover;background-position:80% center;background-repeat:no-repeat;background-color:#888;background-blend-mode:luminosity;z-index:-1}@media screen and (min-width: 768px){.main-background{height:640px}}.main-background.home-page-background{background-image:url("/assets/images/home-background.jpg")}.main-background.hub-background{background-color:#f3f4f7}.main-background.ecosystem-background{background-color:#f3f4f7}.main-background.ecosystem-join-background{background-color:#f3f4f7}.main-background.ecosystem-detail-background{background-image:url("/assets/images/ecosystem-detail-background.jpg")}.main-background.about-background{background-image:url("/assets/images/about-background.jpg")}.main-background.get-started-background{background-image:url("/assets/images/get-started-background.jpg")}.main-background.coc-background{background-image:url("/assets/images/coc-background.jpg")}.main-background.style-guide{background-image:url("https://via.placeholder.com/2560x1280/262626/3a3a3a")}.main-background.features-background{background-image:url("/assets/images/features-background.jpg")}@media screen and (min-width: 768px){.main-background.features-background{height:300px}}@media (max-width: 320px){.main-background.features-background{height:350px}}.main-background.blog-background{background-image:url("/assets/images/blog-background.jpg");background-color:#2F2F2F;background-blend-mode:saturation 
plus-darker}.main-background.mobile-background{background-image:url("/assets/images/get-started-background.jpg")}.main-background.deep-learning-background{background-image:url("/assets/images/deep-learning-thank-you-background.jpg")}.main-background.announcement-background{background-image:url("/assets/images/pytorch_bg_purple.jpg")}.bg-light-grey{background-color:#f3f4f7}.text-dark-grey{color:#6c6c6d}.sidebar-links .top-section{color:#000}.sidebar-links ul{list-style-type:none;padding-left:0}.sidebar-links ul li{color:#6c6c6d;margin-left:20px}.sidebar-links ul li a{color:inherit}.sidebar-links .with-sub-sections.top-section:before{content:"+ ";font-family:"Courier New", Courier, monospace;width:50px}.sidebar-links .with-sub-sections.top-section.open:before{content:"- ";font-family:"Courier New", Courier, monospace;width:50px}.bg-very-light-grey{background-color:#f3f4f7}.email-subscribe-form input.email{color:#ee4c2c;border:none;border-bottom:1px solid #939393;width:100%;background-color:transparent;outline:none;font-size:1.125rem;letter-spacing:0.25px;line-height:2.25rem}.email-subscribe-form ::-webkit-input-placeholder{color:#ee4c2c}.email-subscribe-form ::-moz-placeholder{color:#ee4c2c}.email-subscribe-form :-ms-input-placeholder{color:#ee4c2c}.email-subscribe-form :-moz-placeholder{color:#ee4c2c}.email-subscribe-form input[type="submit"]{position:absolute;right:0;top:10px;height:15px;width:15px;background-image:url("/assets/images/arrow-right-with-tail.svg");background-color:transparent;background-repeat:no-repeat;background-size:15px 15px;background-position:center center;-webkit-appearance:none;-moz-appearance:none;appearance:none;border:0}.email-subscribe-form-fields-wrapper{position:relative}.bg-slate{background-color:#262626}.tweets-wrapper{width:100%}.tweets-wrapper p{font-size:1rem;line-height:1.5rem;letter-spacing:0.22px}.tweets-wrapper ol{padding-left:0}.tweets-wrapper a{color:#ee4c2c}.tweets-wrapper img,.tweets-wrapper .timeline-Tweet-actions,.tweets-wrapper .timeline-Tweet-media,.tweets-wrapper .MediaCard{display:none !important}.tweet{margin-bottom:2.2rem;word-wrap:break-word}.tweet a{color:#ee4c2c;display:inline}.tweet a span{color:inherit}.tweet p,.tweet span{font-size:1rem;line-height:1.5rem;letter-spacing:0.22px;color:#A0A0A1}@media screen and (min-width: 1240px){.tweet p{padding-right:40px}}.tweet span.retweeted,.tweet span.in-reply-to{font-size:.8125rem}.tweet p.tweet-header{margin-bottom:.3125rem;line-height:.75rem}.tweet .tweet-bird:before{content:"";position:relative;left:0;background-image:url("/assets/images/logo-twitter-grey.svg");background-size:20px 16px;display:inline-block;width:20px;height:16px}@media screen and (min-width: 768px){.tweet .tweet-bird:before{margin-bottom:.625rem}}.anchorjs-link{color:#6c6c6d !important}@media screen and (min-width: 768px){.anchorjs-link:hover{color:inherit;text-decoration:none !important}}.article-page-module{background-color:#f3f4f7;padding-top:1.875rem;padding-bottom:1.875rem}@media screen and (min-width: 768px){.article-page-module{padding-top:3.75rem;padding-bottom:3.75rem}}@media screen and (min-width: 1240px){.article-page-module .col-md-3{padding-left:20px;padding-right:20px}}.article-page-module .module-link-col .btn{padding-left:0}@media screen and (min-width: 768px){.article-page-module .module-link-col{text-align:right}.article-page-module .module-link-col .btn{padding-left:inherit}}.article-page-module .module-content-wrapper{margin-top:1.25rem;margin-bottom:1.25rem}@media screen and (min-width: 
768px){.article-page-module .module-content-wrapper{margin-top:0;margin-bottom:0}}.article-page-module img{margin-bottom:1.875rem;width:100%}.article-page-module h3{font-size:1.5rem;letter-spacing:1.33px;line-height:2rem;text-transform:uppercase;margin-bottom:1.25rem}@media screen and (min-width: 768px){.article-page-module h3{margin-bottom:3.75rem}}.article-page-module h5,.article-page-module p{font-size:1rem;line-height:1.5rem}.article-page-module h5{color:#262626}.article-page-module p{color:#CCCDD1;letter-spacing:0.25px}.article-page-module .module-header{position:relative}.article-page-module .module-button{padding-left:0}@media screen and (min-width: 768px){.article-page-module .module-button{position:absolute;right:15px;top:0;padding-top:0;padding-bottom:.125rem;background-position:center right;padding-right:16px}}.ecosystem-card,.resource-card,.hub-card{border-radius:0;border:none;height:110px;margin-bottom:1.25rem;margin-bottom:1.875rem;overflow:scroll}@media screen and (min-width: 1240px){.ecosystem-card,.resource-card,.hub-card{height:150px;overflow:inherit}}@media (min-width: 768px) and (max-width: 1239px){.ecosystem-card,.resource-card,.hub-card{height:170px;overflow:inherit}}.ecosystem-card p.card-summary,.resource-card p.card-summary,.hub-card p.card-summary{font-size:1.125rem;line-height:1.5rem;margin-bottom:0;color:#6c6c6d}.ecosystem-card h4,.resource-card h4,.hub-card h4{color:#262626;margin-bottom:1.125rem;overflow:hidden;white-space:nowrap;text-overflow:ellipsis}.ecosystem-card a,.resource-card a,.hub-card a{height:100%}@media screen and (min-width: 768px){.ecosystem-card a,.resource-card a,.hub-card a{min-height:190px}}@media (min-width: 768px) and (max-width: 1239px){.ecosystem-card a,.resource-card a,.hub-card a{min-height:234px}}@media screen and (min-width: 768px){.ecosystem-card:after,.resource-card:after,.hub-card:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.ecosystem-card:hover:after,.resource-card:hover:after,.hub-card:hover:after{width:100%}.ecosystem-card:hover,.resource-card:hover,.hub-card:hover{color:#262626}}.ecosystem-card:hover p.card-summary,.resource-card:hover p.card-summary,.hub-card:hover p.card-summary{color:#262626}.ecosystem-card .card-body{background-position:top 1.25rem right 1.25rem;background-repeat:no-repeat;padding:1.5625rem 1.875rem}.ecosystem-card .card-body.reasoning{background-image:url("/assets/images/logo-elf.svg");background-size:29px 25px}.ecosystem-card .card-body.tool{background-image:url("/assets/images/logo-wav2letter.svg");background-size:29px 25px}.ecosystem-card .card-body.language{background-image:url("/assets/images/logo-parlai.svg");background-size:29px 25px}.ecosystem-card .card-body.vision{background-image:url("/assets/images/logo-detectron.svg");background-size:29px 25px}.resource-card{border:1px solid #d6d7d8;background-color:transparent;margin-bottom:1.25rem}@media screen and (min-width: 768px){.resource-card{margin-bottom:0}}@media (min-width: 768px) and (max-width: 1239px){.resource-card{height:225px}}.resource-card .pytorch-image{position:relative;height:1.25rem;width:1.25rem;top:3.125rem}.resource-card a{letter-spacing:0.25px;color:#262626}.resource-card .card-body{display:block;padding:0 15px 0 0;position:relative;top:20px;margin-left:60px}@media (min-width: 768px) and (max-width: 1239px){.resource-card .card-body{top:18px}}@media screen and (min-width: 1240px){.resource-card 
.card-body{top:30px;margin-left:80px;padding-right:30px}}.resource-card.slack:before,.resource-card.github:before,.resource-card.pytorch-resource:before{content:"";background-size:32px 32px;background-repeat:no-repeat;display:block;position:absolute;height:32px;width:32px;top:15px;left:15px}@media screen and (min-width: 1240px){.resource-card.slack:before,.resource-card.github:before,.resource-card.pytorch-resource:before{left:30px;top:30px}}.resource-card.slack:before{background-image:url("/assets/images/logo-slack.svg")}.resource-card.github:before{background-image:url("/assets/images/logo-github.svg")}.resource-card.pytorch-resource:before{background-image:url("/assets/images/logo-icon.svg")}.resource-card .pytorch-discuss .discuss{color:#ee4c2c;font-weight:400}@media screen and (min-width: 768px){.resource-card:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.resource-card:hover:after{width:100%}.resource-card:hover{color:#262626}}.article-page-module.similar-projects .ecosystem-card p.card-summary{font-size:1rem;height:36px}@media screen and (min-width: 768px){.article-page-module.similar-projects .ecosystem-card p.card-summary{height:50px}}#twitter-widget iframe{display:none !important}body.general .main-content-wrapper{margin-top:80px}@media screen and (min-width: 768px){body.general .main-content-wrapper{margin-top:100px}}code,kbd,pre,samp{font-family:IBMPlexMono,SFMono-Regular,Menlo,Monaco,Consolas,"Liberation Mono","Courier New",monospace}code span,kbd span,pre span,samp span{font-family:IBMPlexMono,SFMono-Regular,Menlo,Monaco,Consolas,"Liberation Mono","Courier New",monospace}pre{padding:1.125rem;background-color:#f3f4f7}pre code{font-size:.875rem}pre.highlight{background-color:#f3f4f7;line-height:1.3125rem}code.highlighter-rouge{color:#6c6c6d;background-color:#f3f4f7;padding:2px 6px}a:link code.highlighter-rouge,a:visited code.highlighter-rouge,a:hover code.highlighter-rouge{color:#4974D1}a:link.has-code,a:visited.has-code,a:hover.has-code{color:#4974D1}p code,h1 code,h2 code,h3 code,h4 code,h5 code,h6 code{font-size:78.5%}.header-holder{height:68px;align-items:center;display:flex;left:0;margin-left:auto;margin-right:auto;position:fixed;right:0;top:0;width:100%;z-index:9999}@media screen and (min-width: 1100px){.header-holder{height:90px}}.header-holder.blog-header,.header-holder.blog-detail-header,.header-holder.resources-header,.header-holder.get-started-header,.header-holder.features-header,.header-holder.ecosystem-header,.header-holder.hub-header,.header-holder.coc-header,.header-holder.announcement-header,.header-holder.mobile-header{background-color:#fff;border-bottom:1px solid #e2e2e2}.header-container{position:relative;display:flex;align-items:center}.header-container:before,.header-container:after{content:"";display:table}.header-container:after{clear:both}.header-container{*zoom:1}@media screen and (min-width: 1100px){.header-container{display:block}}.header-logo{height:40px;width:2000px;background-image:url("/assets/images/logo-ko-dark.svg");background-repeat:no-repeat;background-size:200px 52px;display:block;float:left}@media screen and (min-width: 1100px){.header-logo{background-size:200px 52px;position:absolute;height:52px;width:200px;top:-10px;float:none}}.main-menu-open-button{background-image:url("/assets/images/icon-menu-dots.svg");background-position:center center;background-size:25px 
7px;background-repeat:no-repeat;width:25px;height:7px;position:absolute;right:0;top:20px}@media screen and (min-width: 1100px){.main-menu-open-button{display:none}}.header-holder .main-menu{display:none}@media screen and (min-width: 1100px){.header-holder .main-menu{display:flex;align-items:center;justify-content:flex-end}}.header-holder .main-menu ul{display:flex;align-items:center;margin:0}.header-holder .main-menu ul li{display:inline-block;margin-right:40px;position:relative}.header-holder .main-menu ul li.active:after{content:"•";bottom:-24px;color:#ee4c2c;font-size:1.375rem;left:0;position:absolute;right:0;text-align:center}.header-holder .main-menu ul li.active a{color:#ee4c2c}.header-holder .main-menu ul li.active .with-down-arrow{background-image:url("/assets/images/chevron-down-orange.svg")}.header-holder .main-menu ul li.resources-active:after{left:-27px}.header-holder .main-menu ul li:last-of-type{margin-right:0}.header-holder .main-menu ul li a{color:#fff;font-size:1.3rem;letter-spacing:0;line-height:2.125rem;text-align:center;text-decoration:none}@media screen and (min-width: 1100px){.header-holder .main-menu ul li a:hover{color:#ee4c2c}}.mobile-main-menu{display:none}.mobile-main-menu.open{background-color:#262626;display:block;height:100%;left:0;margin-left:auto;margin-right:auto;min-height:100%;position:fixed;right:0;top:0;width:100%;z-index:99999}.mobile-main-menu .container-fluid{background-color:inherit;align-items:center;display:flex;height:68px;position:relative;z-index:1}.mobile-main-menu .container-fluid:before,.mobile-main-menu .container-fluid:after{content:"";display:table}.mobile-main-menu .container-fluid:after{clear:both}.mobile-main-menu .container-fluid{*zoom:1}.mobile-main-menu.open ul{list-style-type:none;padding:0}.mobile-main-menu.open ul li a,.mobile-main-menu.open .resources-mobile-menu-title{font-size:2rem;color:#fff;letter-spacing:0;line-height:4rem}.mobile-main-menu.open ul li.active a{color:#ee4c2c}.main-menu-close-button{background-image:url("/assets/images/icon-close.svg");background-position:center center;background-repeat:no-repeat;background-size:24px 24px;height:24px;position:absolute;right:0;width:24px;top:10px}.mobile-main-menu-header-container{position:relative}.mobile-main-menu-links-container{display:flex;align-items:center;padding-left:2.8125rem;height:90vh;margin-top:-25px;padding-top:50%;overflow-y:scroll}.mobile-main-menu-links-container .main-menu{height:100vh;width:100%}@media (min-width: 768px) and (max-width: 1239px){.mobile-main-menu-links-container{padding-top:85%}}@media only screen and (max-width: 320px){.mobile-main-menu-links-container .navSearchWrapper{width:75%}}.blog-header .header-logo,.blog-detail-header .header-logo,.resources-header .header-logo,.get-started-header .header-logo,.features-header .header-logo,.ecosystem-header .header-logo,.hub-header .header-logo,.coc-header .header-logo,.announcement-header .header-logo,.mobile-header .header-logo{background-image:url("/assets/images/logo-ko.svg");background-size:200px 52px}@media screen and (min-width: 768px){.blog-header .header-logo,.blog-detail-header .header-logo,.resources-header .header-logo,.get-started-header .header-logo,.features-header .header-logo,.ecosystem-header .header-logo,.hub-header .header-logo,.coc-header .header-logo,.announcement-header .header-logo,.mobile-header .header-logo{background-size:200px 52px;height:52px;width:200px;margin-bottom:0;top:-10px}}.blog-header .main-menu ul li a,.blog-detail-header .main-menu ul li a,.resources-header 
.main-menu ul li a,.get-started-header .main-menu ul li a,.features-header .main-menu ul li a,.ecosystem-header .main-menu ul li a,.hub-header .main-menu ul li a,.coc-header .main-menu ul li a,.announcement-header .main-menu ul li a,.mobile-header .main-menu ul li a{color:#262626}.blog-header .main-menu-open-button,.blog-detail-header .main-menu-open-button,.resources-header .main-menu-open-button,.get-started-header .main-menu-open-button,.features-header .main-menu-open-button,.ecosystem-header .main-menu-open-button,.hub-header .main-menu-open-button,.coc-header .main-menu-open-button,.announcement-header .main-menu-open-button,.mobile-header .main-menu-open-button{background-image:url("/assets/images/icon-menu-dots-dark.svg")}.main-menu ul li .ecosystem-dropdown,.main-menu ul li .resources-dropdown a{cursor:pointer}.main-menu ul li .ecosystem-dropdown .with-down-orange-arrow,.main-menu ul li .resources-dropdown .with-down-orange-arrow{padding-right:2rem;position:relative;background-image:url("/assets/images/chevron-down-orange.svg");background-size:14px 18px;background-position:top 7px right 10px;background-repeat:no-repeat}.main-menu ul li .ecosystem-dropdown .with-down-white-arrow,.main-menu ul li .resources-dropdown .with-down-white-arrow{padding-right:2rem;position:relative;background-image:url("/assets/images/chevron-down-white.svg");background-size:14px 18px;background-position:top 7px right 10px;background-repeat:no-repeat}.main-menu ul li .ecosystem-dropdown .with-down-white-arrow:hover,.main-menu ul li .resources-dropdown .with-down-white-arrow:hover{background-image:url("/assets/images/chevron-down-orange.svg")}.main-menu ul li .ecosystem-dropdown .with-down-arrow,.main-menu ul li .resources-dropdown .with-down-arrow{padding-right:2rem;position:relative;background-image:url("/assets/images/chevron-down-black.svg");background-size:14px 18px;background-position:top 7px right 10px;background-repeat:no-repeat}.main-menu ul li .ecosystem-dropdown .with-down-arrow:hover,.main-menu ul li .resources-dropdown .with-down-arrow:hover{background-image:url("/assets/images/chevron-down-orange.svg")}.main-menu ul li .dropdown-menu{border-radius:0;padding:0}.main-menu ul li .dropdown-menu .dropdown-item{color:#6c6c6d;border-bottom:1px solid #e2e2e2}.main-menu ul li .dropdown-menu .dropdown-item:last-of-type{border-bottom-color:transparent}.main-menu ul li .dropdown-menu .dropdown-item:hover{background-color:#ee4c2c}.main-menu ul li .dropdown-menu .dropdown-item p{font-size:1rem;color:#979797}.main-menu ul li .dropdown-menu a.dropdown-item:hover{color:#fff}.main-menu ul li .dropdown-menu a.dropdown-item:hover p{color:#fff}.ecosystem-dropdown-menu,.resources-dropdown-menu{left:-75px;width:226px;display:none;position:absolute;z-index:1000;display:none;float:left;min-width:10rem;padding:0.5rem 0;font-size:1rem;color:#212529;text-align:left;list-style:none;background-color:#fff;background-clip:padding-box;border:1px solid rgba(0,0,0,0.15);border-radius:0.25rem}.ecosystem-dropdown:hover .ecosystem-dropdown-menu,.ecosystem-dropdown:hover .resources-dropdown-menu,.resources-dropdown:hover .ecosystem-dropdown-menu,.resources-dropdown:hover .resources-dropdown-menu,.resources-active:hover .ecosystem-dropdown-menu,.resources-active:hover .resources-dropdown-menu{display:block}.main-menu ul li .ecosystem-dropdown-menu,.main-menu ul li .resources-dropdown-menu{border-radius:0;padding:0}.main-menu ul li .ecosystem-dropdown-menu .dropdown-item,.main-menu ul li .resources-dropdown-menu 
.dropdown-item{color:#6c6c6d;border-bottom:1px solid #e2e2e2}.header-holder .main-menu ul li a.nav-dropdown-item{display:block;font-size:1rem;line-height:1.3125rem;width:100%;padding:0.25rem 1.5rem;clear:both;font-weight:400;color:#979797;text-align:center;background-color:transparent;border-bottom:1px solid #e2e2e2}.header-holder .main-menu ul li a.nav-dropdown-item:last-of-type{border-bottom-color:transparent}.header-holder .main-menu ul li a.nav-dropdown-item:hover{background-color:#ee4c2c;color:white}.header-holder .main-menu ul li a.nav-dropdown-item .dropdown-title{font-size:1.125rem;color:#6c6c6d;letter-spacing:0;line-height:34px}.header-holder .main-menu ul li a.nav-dropdown-item .docs-title{display:block;padding-top:1rem}.header-holder .main-menu ul li a.nav-dropdown-item:hover .dropdown-title{background-color:#ee4c2c;color:white}.mobile-main-menu-links-container ul.resources-mobile-menu-items li{padding-left:15px}.jumbotron{background-color:transparent;position:absolute;left:0;right:0;margin-right:auto;margin-left:auto;padding:0;margin-bottom:0;display:flex;align-items:center;top:68px}@media screen and (min-width: 768px){.jumbotron{height:550px;top:90px}}.jumbotron .jumbotron-content{display:flex;align-items:center}.jumbotron .lead{font-weight:400;letter-spacing:0.25px}.jumbotron h1{font-size:2.5rem;text-transform:uppercase;font-weight:lighter;letter-spacing:1.08px;margin-bottom:.625rem;line-height:1.05}@media screen and (min-width: 768px){.jumbotron h1{font-size:4.5rem}}.jumbotron p{font-size:1.125rem;margin-bottom:1.25rem}@media screen and (min-width: 768px){.jumbotron p{width:50%}}.jumbotron.on-dark-background h1,.jumbotron.on-dark-background p{color:#fff}.jumbotron .btn{padding-top:.5625rem}@media screen and (min-width: 768px){.jumbotron .btn{margin-top:.625rem}}@media screen and (min-width: 768px){.homepage .main-content-wrapper{margin-top:539px}}.homepage h2{margin-bottom:1.5625rem;text-transform:uppercase;letter-spacing:1.78px;line-height:2.5rem}@media screen and (min-width: 768px){.homepage h2{margin-bottom:2.0625rem}}.homepage h3{font-size:1.5rem;letter-spacing:1.33px;line-height:2rem;text-transform:uppercase;margin-bottom:1.25rem}.homepage h5{margin-bottom:.5rem}@media screen and (min-width: 768px){.homepage h5{margin-bottom:.9375rem}}.homepage .jumbotron{height:195px}@media screen and (min-width: 768px){.homepage .jumbotron{height:395px}}.homepage .jumbotron .btn{margin-top:.375rem}.homepage .main-background{height:330px}@media screen and (min-width: 768px){.homepage .main-background{height:570px}}.homepage .ecosystem-row .card{background-color:#f3f4f7}.homepage .homepage-header{background-color:rgba(0,0,0,0.165)}.homepage-feature-module{padding-top:2.5rem;padding-bottom:2.5rem}@media screen and (min-width: 768px){.homepage-feature-module{padding-top:3.875rem;padding-bottom:4.5rem}.homepage-feature-module .module-button{position:absolute;right:15px;top:0}}.homepage-feature-module p{color:#6c6c6d;font-size:1.125em}.homepage-feature-module .title{color:#000;font-weight:300;font-size:1.5rem}@media (min-width: 768px) and (max-width: 1239px){.homepage-feature-module .title{font-size:1.25rem}}.homepage-feature-module .pytorch-title{font-size:1.5rem;letter-spacing:0.33px;line-height:2.25rem}.homepage-feature-module .subtext{font-size:1.125rem;color:#8c8c8c;letter-spacing:0;line-height:1.5rem}@media (min-width: 768px) and (max-width: 1239px){.homepage-feature-module .subtext{font-size:.9375rem}}.key-features-module{padding-bottom:0}@media screen and (min-width: 
768px){.key-features-module{padding-bottom:1.55rem}}.key-features-module .key-features-boxes{margin-top:2rem}@media screen and (min-width: 768px){.key-features-module .key-features-boxes{margin-top:0}}.key-features-module .key-feature-box{margin-bottom:2rem}.key-features-module .key-feature-box p{margin-bottom:0;letter-spacing:0.25px}@media screen and (min-width: 768px){.key-features-module .key-feature-box{margin-bottom:2.5rem}}.community-heading{margin-top:2rem}.community-module{background-color:#fff}.community-module .ecosystem-card{height:auto}@media (min-width: 768px) and (max-width: 1239px){.community-module .ecosystem-card{padding:.625rem}}.community-module h2{margin-bottom:0}.community-module h5{text-transform:uppercase;color:#c6000a;margin-bottom:1.25rem}.community-module .h2-subheadline{margin-top:1.25rem;margin-bottom:2.6rem}@media screen and (min-width: 768px){.community-module .h2-subheadline{margin-top:0}}@media (min-width: 768px) and (max-width: 1239px){.community-module .card-body{padding:.625rem}}.community-module .module-button{background-color:#f3f4f7}.community-module p{margin-bottom:2.5rem;letter-spacing:0.25px}.community-module .module-subtext{margin-right:15.625rem}.community-module .email-subscribe-form input.email{border-bottom:1px solid #d6d7d8;font-size:1.25rem;line-height:0;padding-bottom:.75rem}.community-module .email-subscribe-form input[type="submit"]{top:6px}@media screen and (min-width: 768px){.community-module .email-subscribe-form input[type="submit"]{top:10px}}.pytorch-users-module,.homepage-bottom-wrapper{background-color:#f3f4f7}@media screen and (min-width: 768px){.pytorch-users-module{padding-bottom:1.9rem}}.community-avatar{height:60px;width:60px}.community-logo-bottom{height:200px;background-color:#f3f4f7}.university-testimonials h2{margin-bottom:2.2rem}.university-testimonials-content{margin-top:2.5rem;margin-bottom:2rem}@media screen and (min-width: 768px){.university-testimonials-content{margin-top:0}}.university-testimonials-content .col-md-4{margin-bottom:2.5rem}@media screen and (min-width: 768px){.university-testimonials-content .col-md-4{margin-bottom:0}}.university-testimonials-content .case-study-title{font-size:1.5rem;margin-bottom:1.25rem}.university-testimonials-content p{color:#262626;font-size:1.5rem;letter-spacing:0.25px;line-height:2.25rem}.follow-us-on-twitter h2{margin-bottom:1.25rem}@media screen and (min-width: 768px){.follow-us-on-twitter h2{margin-bottom:2.5rem}}.homepage-feature-module .tweets-wrapper p{font-size:1rem}.quick-starts p{font-size:1.125rem;line-height:1.75rem}.quick-start-guides{font-size:1.5rem;letter-spacing:0.25px;line-height:2.25rem;color:#a5a5a5}.quick-start-guides .step-counter{margin-bottom:.1875rem}.quick-start-guides ul{list-style-type:none;padding-left:0}.quick-start-guides ul li{margin-bottom:0;font-size:1.125rem}@media screen and (min-width: 768px){.quick-start-guides ul li{margin-bottom:.75rem}.quick-start-guides ul li:last-of-type{margin-bottom:0}}.quick-start-guides ul li.selected{color:#ee4c2c}.quick-start-guides ul li.selected:before{content:"\2022";position:absolute;left:0}@media screen and (min-width: 768px){.quick-start-guides ul li.selected:before{left:-5px}}.quick-start-guides .select-instructions{color:#262626;border-bottom:2px solid #a5a5a5;margin-bottom:1rem;font-size:1.125rem;display:inline-block}@media screen and (min-width: 768px){.quick-start-guides .select-instructions{margin-bottom:0}}.homepage 
.news-banner-container{background:#000;color:#fff;text-align:center;padding:20px;width:90%}.homepage .news-banner-container .right-arrow,.homepage .news-banner-container .left-arrow{height:15px;bottom:-3px;position:relative}@media screen and (min-width: 768px){.homepage .news-banner-container .right-arrow,.homepage .news-banner-container .left-arrow{bottom:-8px}}.homepage .news-banner-container .right-arrow:hover,.homepage .news-banner-container .left-arrow:hover{cursor:pointer}.homepage .news-banner-container .right-arrow{float:right}.homepage .news-banner-container .left-arrow{float:left}.homepage #news-items .pagination{display:none !important}.banner-info{display:inline-block;overflow:hidden;text-overflow:ellipsis;white-space:nowrap;margin:auto;width:80%;font-size:1.125rem}@media screen and (min-width: 768px){.banner-info{padding-top:3px}}.banner-info:hover{cursor:pointer;color:#ee4c2c}.news-banner-text a{color:white}.news-banner-text a:hover{color:#ee4c2c}.no-banner{padding-bottom:2rem}.site-footer{padding-top:2.5rem;width:100%;background:#000;background-size:100%;margin-left:0;margin-right:0;margin-bottom:0}@media screen and (min-width: 768px){.site-footer{padding-top:5rem;position:absolute;left:0;bottom:0;height:350px}}.site-footer p{color:#fff}.site-footer ul{list-style-type:none;padding-top:1rem;padding-bottom:1rem;padding-left:0;margin-bottom:0}.site-footer ul li{font-size:1.125rem;line-height:2rem;color:#A0A0A1;padding-bottom:.375rem}.site-footer ul li.list-title{padding-bottom:.75rem;color:#fff}.site-footer a:link,.site-footer a:visited{color:inherit}@media screen and (min-width: 768px){.site-footer a:hover{color:#ee4c2c}}.site-footer .trademark-disclaimer{background:#000000;display:flex}.site-footer .trademark-disclaimer li{font-size:.875rem;line-height:.9375rem;color:white}.site-footer .privacy-policy{background:#000000;display:flex}.site-footer .privacy-policy .privacy-policy-links{padding-top:1rem;padding-right:1rem;display:inline-flex;color:white}.docs-tutorials-resources{background-color:#262626;color:#fff;padding-top:2.5rem;padding-bottom:2.5rem}@media screen and (min-width: 768px){.docs-tutorials-resources{padding-top:4.125rem;padding-bottom:4.09rem}}.docs-tutorials-resources h2{font-size:1.5rem;letter-spacing:-0.25px;text-transform:none;margin-bottom:0.25rem}@media screen and (min-width: 768px){.docs-tutorials-resources h2{margin-bottom:1.25rem}}.docs-tutorials-resources .col-md-4{margin-bottom:2rem}@media screen and (min-width: 768px){.docs-tutorials-resources .col-md-4{margin-bottom:0}}.docs-tutorials-resources .with-right-arrow{margin-left:12px;background-position:top 3px right 11px}@media screen and (min-width: 768px){.docs-tutorials-resources .with-right-arrow{background-position:top 6px right 11px}}.docs-tutorials-resources .with-right-arrow:hover{background-image:url("/assets/images/chevron-right-white.svg")}.docs-tutorials-resources p{font-size:1rem;line-height:1.5rem;letter-spacing:0.22px;color:#A0A0A1;margin-bottom:.5rem}@media screen and (min-width: 768px){.docs-tutorials-resources p{margin-bottom:1.25rem}}.docs-tutorials-resources a{font-size:1.125rem;color:#ee4c2c}.docs-tutorials-resources a:hover{color:#fff}.footer-container{position:relative}@media screen and (min-width: 768px){.footer-logo-wrapper{position:absolute;top:0;left:30px}}.footer-logo{background-image:url("/assets/images/logo-ko-square-dark.svg");background-position:center;background-repeat:no-repeat;background-size:100px 
100px;display:block;height:130px;width:100px;margin-bottom:2.8125rem}@media screen and (min-width: 768px){.footer-logo{background-size:100px 100px;height:130px;width:100px;margin-bottom:0}}.footer-links-wrapper{display:flex;flex-wrap:wrap;border-bottom:1px solid white}@media screen and (min-width: 768px){.footer-links-wrapper{flex-wrap:initial;justify-content:flex-end}}.footer-links-col{margin-bottom:3.75rem;width:50%}@media screen and (min-width: 768px){.footer-links-col{margin-bottom:0;width:15%;margin-right:23px}.footer-links-col.follow-us-col{width:18%;margin-right:0}}@media (min-width: 768px) and (max-width: 1239px){.footer-links-col{width:58%;margin-right:30px}}.footer-social-icons{margin:8.5625rem 0 2.5rem 0}.footer-social-icons a{height:32px;width:32px;display:inline-block;background-color:#CCCDD1;border-radius:50%;margin-right:5px}.footer-social-icons a.facebook{background-image:url("/assets/images/logo-facebook-dark.svg");background-position:center center;background-size:9px 18px;background-repeat:no-repeat}.footer-social-icons a.twitter{background-image:url("/assets/images/logo-twitter-dark.svg");background-position:center center;background-size:17px 17px;background-repeat:no-repeat}.footer-social-icons a.youtube{background-image:url("/assets/images/logo-youtube-dark.svg");background-position:center center;background-repeat:no-repeat}.site-footer .mc-field-group{margin-top:-2px}.site-footer .email-subscribe-form input[type="submit"]{top:9px}@media screen and (min-width: 768px){.site-footer .email-subscribe-form input[type="submit"]{top:13px}}.main-content-wrapper{margin-top:300px}@media screen and (min-width: 768px){.main-content-wrapper{margin-top:540px;min-height:400px}}.main-content{padding-top:1.5rem;padding-bottom:1.5rem}@media screen and (min-width: 768px){.main-content{padding-top:2.625rem}}.main-content-menu{margin-bottom:1.25rem}@media screen and (min-width: 768px){.main-content-menu{margin-bottom:5rem}}.main-content-menu .navbar-nav .nav-link{color:#262626;padding-left:1.875rem;padding-right:1.875rem}@media screen and (min-width: 768px){.main-content-menu .navbar-nav .nav-link:first-of-type{padding-left:0}}article.pytorch-article{max-width:920px;margin:0 auto;padding-bottom:90px}article.pytorch-article h2,article.pytorch-article h3,article.pytorch-article h4,article.pytorch-article h5,article.pytorch-article h6{margin-top:1.875rem;margin-bottom:1.5rem;color:#262626}article.pytorch-article h2{font-size:1.5rem;letter-spacing:1.33px;line-height:2rem;margin-top:3.125rem;text-transform:uppercase}article.pytorch-article h3{font-size:1.5rem;letter-spacing:-0.25px;line-height:1.875rem;text-transform:none}article.pytorch-article h4,article.pytorch-article h5,article.pytorch-article h6{font-size:1.125rem;letter-spacing:-0.19px;line-height:1.875rem}article.pytorch-article p{margin-bottom:1.125rem}article.pytorch-article p,article.pytorch-article ul li,article.pytorch-article ol li,article.pytorch-article dl dt,article.pytorch-article dl dd,article.pytorch-article blockquote{font-size:1.125rem;line-height:1.875rem;color:#6c6c6d}article.pytorch-article table{margin-bottom:2.5rem;width:100%}article.pytorch-article table thead{border-bottom:1px solid #cacaca}article.pytorch-article table th,article.pytorch-article table tr,article.pytorch-article table td{color:#6c6c6d;font-size:1rem;line-height:2.25rem;letter-spacing:-0.17px}article.pytorch-article table th{padding:.625rem;color:#262626}article.pytorch-article table td{padding:.3125rem}article.pytorch-article table tr 
th:first-of-type,article.pytorch-article table tr td:first-of-type{padding-left:0}article.pytorch-article ul,article.pytorch-article ol{margin:1.5rem 0 3.125rem 0}@media screen and (min-width: 768px){article.pytorch-article ul,article.pytorch-article ol{padding-left:6.25rem}}article.pytorch-article ul li,article.pytorch-article ol li{margin-bottom:.625rem}article.pytorch-article dl{margin-bottom:2.5rem}article.pytorch-article dl dt{margin-bottom:.75rem;font-weight:400}article.pytorch-article pre{margin-bottom:2.5rem}article.pytorch-article hr{margin-top:4.6875rem;margin-bottom:4.6875rem}article.pytorch-article blockquote{font-size:.75rem;font-style:italic;padding:15px 15px 5px 15px;width:100%;background-color:rgba(211,211,211,0.3);border-left:2px solid #000000}article.pytorch-article h3.no_toc{margin:0px}article.pytorch-article nav{float:right;display:block;overflow-y:auto;background-color:white;margin-left:20px;border-left:1px #717171}article.pytorch-article nav li{font-size:12px;line-height:20px;padding-top:0px;list-style:none}article.pytorch-article nav a{color:#717171;font-weight:bold}article.pytorch-article ul#markdown-toc{padding-left:1em;margin:0px}article.pytorch-article ul#markdown-toc ul{margin:0px;padding-left:1em}article.pytorch-article ul#markdown-toc li{margin:0px}.get-started article{margin-bottom:5rem}.get-started .quick-start-guides ul{margin-bottom:0;padding-left:0}.get-started .main-background{height:275px}@media screen and (min-width: 768px){.get-started .main-background{height:380px}}.get-started .main-content-wrapper{margin-top:275px}@media screen and (min-width: 768px){.get-started .main-content-wrapper{margin-top:350px}}.get-started .jumbotron{height:190px}@media screen and (min-width: 768px){.get-started .jumbotron{height:260px}}.get-started .main-content .navbar{background-color:#f3f4f7;padding-left:0;padding-bottom:0;padding-top:0}@media (min-width: 992px){.get-started .main-content .navbar li:first-of-type{padding-left:3.4375rem}.get-started .main-content .navbar .nav-item{padding:2rem;cursor:pointer}.get-started .main-content .navbar .nav-link{position:relative;top:10%;transform:translateY(-50%)}}.get-started .main-content .navbar .nav-select{background-color:#fff}.get-started .main-content .navbar .nav-select .nav-link{color:#ee4c2c;font-weight:500}.get-started .main-content .navbar .nav-link{font-size:1.125rem;color:#8c8c8c}@media screen and (min-width: 768px){.get-started .main-content .navbar .nav-link{margin-left:1.25rem;margin-right:1.25rem}}.get-started .main-content .navbar .nav-link:hover{color:#ee4c2c}.get-started .main-content .navbar .get-started-nav-link{padding-left:.625rem;padding-right:.625rem}@media screen and (min-width: 768px){.get-started .main-content .navbar .get-started-nav-link{padding-left:.9375rem;padding-right:.9375rem}}.get-started .main-content .navbar .nav-item{padding-top:.9375rem;padding-bottom:.9375rem}@media screen and (min-width: 768px){.get-started .main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (min-width: 768px) and (max-width: 1239px){.get-started .main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (max-width: 990px){.get-started .main-content .navbar .nav-item{padding-bottom:.625rem;padding-top:1rem}}.get-started .main-content .navbar .navbar-toggler{margin-left:2.5rem}.get-started .main-content{padding-top:0}@media screen and (min-width: 768px){.get-started .main-content{padding-top:1.9rem}}.get-started 
.quick-start-module{padding-bottom:0;padding-top:0;background-color:#fff}.get-started .quick-start-module .option,.get-started .quick-start-module #command{border:2px solid #fff;background:#f3f4f7}.get-started .quick-start-module .title-block{border:2px solid #fff}.get-started .quick-start-module .selected{background-color:#ee4c2c}.get-started .quick-start-module h1{font-size:2rem;letter-spacing:1.78px;line-height:2.5rem;text-transform:uppercase;margin-bottom:1.5rem}.get-started .nav-menu-wrapper{background-color:#f3f4f7}.get-started .navbar-nav{flex-direction:row}#installation .os{display:none}#installation .selected{display:block}#cloud .platform{display:none}#cloud .selected{display:block}.screencast{display:none}.screencast iframe{width:100% !important}.get-started .quick-starts .row.ptbuild,.get-started .quick-starts .row.os,.get-started .quick-starts .row.package,.get-started .quick-starts .row.language,.get-started .quick-starts .row.cuda{margin-bottom:1.25rem}@media screen and (min-width: 768px){.get-started .quick-starts .row.ptbuild,.get-started .quick-starts .row.os,.get-started .quick-starts .row.package,.get-started .quick-starts .row.language,.get-started .quick-starts .row.cuda{margin-bottom:0}}@media (min-width: 768px) and (max-width: 1239px){.get-started .quick-starts{flex:0 0 100%;max-width:100%}}@media screen and (min-width: 768px){.get-started .quick-starts{margin-bottom:2.5rem}.get-started .quick-starts .row{margin-bottom:0}}@media screen and (min-width: 1240px){.get-started .quick-starts{margin-bottom:0}}.get-started .get-started-locally-sidebar{padding-top:2.5rem;padding-bottom:2.5rem;top:15%}@media screen and (min-width: 768px){.get-started .get-started-locally-sidebar{padding-top:0}}.get-started .get-started-locally-sidebar ul{padding-left:0}.get-started .get-started-locally-sidebar li{list-style-type:none;line-height:36px}.get-started .get-started-locally-sidebar li a{color:#8c8c8c}.get-started .get-started-locally-sidebar li a.active,.get-started .get-started-locally-sidebar li a:hover{color:#ee4c2c}.get-started .get-started-locally-sidebar li .subitem{padding-left:1.25rem}.get-started .get-started-locally-sidebar li.subitem{padding-left:1.25rem}.cloud-nav{display:none}.get-started .get-started-cloud-sidebar{padding-top:3.125rem;padding-bottom:2.5rem;top:15%}.get-started .get-started-cloud-sidebar ul{padding-left:0}.get-started .get-started-cloud-sidebar li{list-style-type:none;line-height:36px}.get-started .get-started-cloud-sidebar li a{color:#8c8c8c}.get-started .get-started-cloud-sidebar li a.active,.get-started .get-started-cloud-sidebar li a:hover{color:#ee4c2c}.get-started .get-started-cloud-sidebar li .subitem{padding-left:1.25rem}.get-started .get-started-cloud-sidebar li.subitem{padding-left:1.25rem}.ecosystem .jumbotron{height:170px}@media screen and (min-width: 768px){.ecosystem .jumbotron{height:300px}}.ecosystem .jumbotron h1{padding-top:7.8125rem}.ecosystem .jumbotron h1 #ecosystem-header{color:#CC2FAA}.ecosystem .jumbotron h1 #ecosystem-header-tools{color:#262626}.ecosystem .jumbotron p.lead{margin-bottom:1.5625rem;padding-top:3.125rem;color:#6c6c6d}.ecosystem .jumbotron .ecosystem-join{margin-bottom:3rem}.ecosystem .jumbotron svg{margin-bottom:1.25rem}.ecosystem .main-background{height:275px}@media screen and (min-width: 768px){.ecosystem .main-background{height:395px}}@media screen and (min-width: 768px){.ecosystem .main-content{padding-top:3.25rem}}.ecosystem .main-content-wrapper{background-color:#f3f4f7;margin-top:275px}@media screen
and (min-width: 768px){.ecosystem .main-content-wrapper{margin-top:395px}}.ecosystem.ecosystem-detail .main-content-wrapper{background-color:#fff}.ecosystem-cards-wrapper{margin-bottom:1.125rem;padding-top:1.25rem}@media (min-width: 768px){.ecosystem-cards-wrapper .col-md-6{flex:0 0 100%;max-width:100%}}@media screen and (min-width: 1240px){.ecosystem-cards-wrapper .col-md-6{flex:0 0 50%;max-width:50%}}.ecosystem .main-content-menu .navbar-nav .nav-link{font-size:1.125rem;color:#CCCDD1;padding-right:0;margin-right:1.875rem}.ecosystem .main-content-menu .navbar-nav .nav-link.selected{color:#ee4c2c;border-bottom:1px solid #ee4c2c}@media screen and (min-width: 768px){.ecosystem .main-content-menu .nav-item:last-of-type{position:absolute;right:0}.ecosystem .main-content-menu .nav-item:last-of-type a{margin-right:0}}.ecosystem.ecosystem-detail .main-content{padding-bottom:0}.ecosystem article.pytorch-article{counter-reset:article-list}.ecosystem article.pytorch-article>ol{padding-left:0;list-style-type:none}@media screen and (min-width: 1240px){.ecosystem article.pytorch-article>ol>li{position:relative}.ecosystem article.pytorch-article>ol>li:before{counter-increment:article-list;content:counter(article-list,decimal-leading-zero);color:#B932CC;line-height:2.5rem;letter-spacing:-0.34px;font-size:2rem;font-weight:300;position:absolute;left:-60px;top:-16px;padding:.625rem 0;background-color:#fff;z-index:10}.ecosystem article.pytorch-article>ol>li:after{content:"";width:2px;position:absolute;left:-42px;top:0;height:100%;background-color:#f3f3f3;z-index:9}}.ecosystem article.pytorch-article>ol>li>h4{color:#262626}.ecosystem article.pytorch-article>ol>li ul li{list-style-type:disc}.ecosystem .quick-starts{background:#ecedf1}.ecosystem .quick-starts .title-block,.ecosystem .quick-starts #command,.ecosystem .quick-starts .option,.ecosystem .quick-starts .cloud-option{border-color:#ecedf1}.ecosystem .join-link{color:inherit;text-decoration:underline}.ecosystem .join-notice{text-align:center;padding-top:1.25rem;padding-bottom:2.5rem}.ecosystem .join-notice p{color:#6c6c6d;margin-bottom:0;line-height:1.875rem}.ecosystem .join-jumbotron{border-bottom:1px solid #C3C3C3;width:90%}@media screen and (min-width: 768px){.ecosystem .join-jumbotron{height:262px}}.ecosystem .join-jumbotron .container{max-width:920px}.ecosystem .join-jumbotron h1{padding-top:.3125rem;color:#000}.ecosystem .join-jumbotron h1 span{font-weight:300;color:#812CE5}.ecosystem .join-wrapper{background-color:#f3f4f7}@media screen and (min-width: 768px){.ecosystem .join-wrapper .main-content{padding-top:1.5rem}}.ecosystem .join-wrapper .container{max-width:920px}.ecosystem .join-wrapper #success-response{color:#6c6c6d}.ecosystem .join-intro{color:#6c6c6d;line-height:28px}.ecosystem .requirements span{color:#000;font-weight:bold}.ecosystem .requirements .join-number{color:#812CE5;display:flex;align-items:center}@media screen and (min-width: 768px){.ecosystem .requirements .join-number{padding-left:.625rem}}.ecosystem .requirements p{margin-bottom:0;margin-top:-.4375rem}@media screen and (min-width: 768px){.ecosystem .requirements p{padding-left:1.5rem}}@media screen and (min-width: 768px){.ecosystem .requirements .col-md-11{border-left:2px solid #f3f4f7}}.ecosystem .row.requirements{padding-bottom:2.5rem}.ecosystem .ecosystem-form{padding-bottom:3rem}.ecosystem .ecosystem-form .mc-field-group{padding-bottom:1.5625rem}.ecosystem .ecosystem-form .mc-field-group .error-message{color:#ee4c2c}.ecosystem .ecosystem-form 
h3{text-transform:uppercase;padding-top:1.5625rem;padding-bottom:1.875rem}.ecosystem .ecosystem-form label{color:#6c6c6d}.ecosystem .ecosystem-form input,.ecosystem .ecosystem-form textarea{width:100%;border:none;border-bottom:2px solid #812CE5;height:2.75rem;outline:none;padding-left:.9375rem}.ecosystem .ecosystem-form ::-moz-placeholder{color:#6c6c6d;opacity:0.5}.ecosystem .ecosystem-form :-ms-input-placeholder{color:#6c6c6d;opacity:0.5}.ecosystem .ecosystem-form ::-ms-input-placeholder{color:#6c6c6d;opacity:0.5}.ecosystem .ecosystem-form ::placeholder{color:#6c6c6d;opacity:0.5}.ecosystem .ecosystem-form .submit-project{padding-left:.75rem;margin-top:2.5rem;background-color:#ee4c2c;color:#fff;cursor:pointer;border:none;width:30%;height:2.8125rem;text-align:left;background-repeat:no-repeat;background-image:url(/assets/images/arrow-right-with-tail-white.svg);background-size:30px 12px;background-position:right}@media screen and (min-width: 768px){.ecosystem .ecosystem-form .submit-project{padding-left:1.125rem;background-origin:content-box;background-size:30px 15px}}.ecosystem .ecosystem-form .submit-field{margin-top:-6px;text-align:center}.ecosystem .ecosystem-form .large-input{height:5.625rem;padding-top:.625rem}.ecosystem .ecosystem-form #submitting-project{background-color:#6c6c6d}.ecosystem .ecosystem-form button{outline:none}.ecosystem .ecosystem-form #mce-error-response{color:#ee4c2c}.ecosystem .ecosystem-form #mce-error-response a{text-decoration:underline}.ecosystem .experimental .ecosystem-card-title-container{display:inline-flex}.ecosystem .experimental .ecosystem-card-title-container .experimental-badge{text-transform:uppercase;margin-left:15px;background-color:#e4e4e4;color:#262626;opacity:0.75;font-size:.625rem;letter-spacing:1px;line-height:1.375rem;height:1.25rem;width:6rem;text-align:center;margin-top:.25rem}.ecosystem .ecosystem-card-title-container .card-title{padding-left:0;font-size:1.5rem;color:#262626}.ecosystem .star-list{list-style:none;padding-left:0}.ecosystem .star-list li{display:inline}.ecosystem .star-list li.github-stars-count-whole-number{display:none}.ecosystem .icon-count-container{display:inline-block;vertical-align:text-bottom;margin-left:.5rem}.ecosystem .github-logo{height:15px;width:13px;margin-left:10px}.ecosystem .github-stars-count{color:#979797;position:relative;top:.25rem;font-size:14px;margin-left:0.125rem}@media screen and (min-width: 768px){.ecosystem .github-stars-count{top:.1875rem;font-size:initial}}.ecosystem-divider{position:relative;margin-bottom:4rem;margin-top:1.5rem;top:3rem}.ecosystem #dropdownSort,.ecosystem #dropdownSortLeft{margin-left:0}.ecosystem #dropdownSortLeft{top:inherit;right:inherit}.ecosystem-filter-menu ul{list-style-type:none;padding-left:1.25rem}.ecosystem-filter-menu ul li{padding-right:1.25rem;word-break:break-all}.ecosystem-filter-menu ul li a{color:#979797}.ecosystem-filter-menu ul li a:hover{color:#ee4c2c}.ecosystem .ecosystem-filter{cursor:pointer}.ecosystem .ecosystem-filter ul{list-style-type:none}.ecosystem #dropdownFilter,#dropdownSort,#dropdownSortLeft{color:#979797;cursor:pointer;z-index:1;position:absolute}.ecosystem .pagination .page{border:1px solid #dee2e6;padding:0.5rem 0.75rem}.ecosystem .pagination .active .page{background-color:#dee2e6}.features .main-content{padding-bottom:0}.features .navbar-nav .nav-link{color:#000}.features .nav-logo{background-image:url("/assets/images/logo-ko-dark.svg")}@media screen and (min-width: 768px){.features .main-background{height:575px}}@media (max-width: 
320px){.features .main-content-wrapper{margin-top:340px}}.features-row{padding-bottom:3.75rem;align-items:center}.features-row:first-of-type{margin-top:1.25rem}.features-row:last-of-type{padding-bottom:4.5rem}@media screen and (min-width: 768px){.features-row{padding-bottom:6rem}.features-row:first-of-type{margin-top:4.05rem}}.features-row h3{font-size:2rem;letter-spacing:1.78px;line-height:2.25rem;font-weight:400;text-transform:uppercase;margin-bottom:1.25rem;font-weight:300}@media (min-width: 768px) and (max-width: 1239px){.features-row h3{width:80%}}@media screen and (min-width: 1240px){.features-row h3{width:590px}}.features-row p{font-size:1.125rem;letter-spacing:0.25px;line-height:1.75rem;color:#6c6c6d;padding-right:1.875rem}@media (min-width: 768px) and (max-width: 1239px){.features-row p{width:80%}}@media screen and (min-width: 1240px){.features-row p{width:590px}}.features-row .feature-content-holder{width:100%}@media screen and (min-width: 1240px){.features-row .feature-content-holder{width:495px}}.features-row .feature-content-holder pre.highlight{margin-bottom:0}.features-row:nth-child(odd) .col-md-6:nth-child(1n){order:2}.features-row:nth-child(odd) .col-md-6:nth-child(2n){order:1}@media screen and (min-width: 768px){.features-row:nth-child(odd) .col-md-6:nth-child(1n){order:1}.features-row:nth-child(odd) .col-md-6:nth-child(2n){order:2}}.features-row:nth-child(1n) h3{color:#B73BC9}.features-row:nth-child(1n) .feature-content-holder{border-bottom:2px solid #B73BC9}.features-row:nth-child(2n) h3{color:#D92F4C}.features-row:nth-child(2n) .feature-content-holder{border-bottom:2px solid #D92F4C}.features-row:nth-child(3n) h3{color:#8038E0}.features-row:nth-child(3n) .feature-content-holder{border-bottom:2px solid #8038E0}@media screen and (min-width: 1240px){.features-row .col-md-6{padding-left:0;padding-right:0}}@media screen and (min-width: 768px){.features-row .col-md-6:nth-of-type(2) .feature-content{width:100%}.features-row .col-md-6:nth-of-type(2) .feature-content h3,.features-row .col-md-6:nth-of-type(2) .feature-content p,.features-row .col-md-6:nth-of-type(2) .feature-content .feature-content-holder{float:right}}.features .jumbotron{height:200px}@media screen and (min-width: 768px){.features .jumbotron{height:195px}}@media (max-width: 320px){.features .jumbotron{height:250px}}.features .jumbotron h1{padding-top:1.875rem}@media screen and (min-width: 768px){.features .jumbotron{height:468px}.features .jumbotron h1{padding-top:0}}.features .jumbotron h1,.features .jumbotron p{color:#fff}@media screen and (min-width: 768px){.features .jumbotron .btn{margin-top:.375rem}}.contributors{margin-top:50px}@media screen and (min-width: 768px){.contributors{margin-top:20px}}.contributors h2 a{margin-top:30px;font-size:2.5rem;font-weight:600}.contributors ul li{padding-top:.625rem;font-size:1.4375rem}.contributors .contributor-card{border:none;overflow:hidden;text-align:center}.contributors .contributor-card img{width:130px;min-width:100px;border:none;border-radius:50%}.contributors .contributor-card p.card-summary{margin-top:10px;text-align:center;font-size:1.25rem;overflow:hidden;color:#262626}.maintainers-row{padding-bottom:3.75rem;align-items:center}@media screen and (min-width: 768px){.maintainers-row{padding-bottom:6rem}.maintainers-row:first-of-type{margin-top:4.05rem}}.maintainers-row .maintainer-content{width:100%;align-items:center}.maintainers-row .maintainer-content 
h3.features{font-size:2rem;letter-spacing:1.78px;line-height:2.25rem;font-weight:400;text-transform:uppercase;margin-bottom:1.25rem;color:#000}.maintainers-row .maintainer-content p.features{font-size:1.125rem;letter-spacing:0.25px;line-height:1.75rem;color:#6c6c6d}.maintainer{align-items:center;margin-top:10px;margin-bottom:25px;position:relative;transition:all 0.3s;box-shadow:0 0 200px transparent}.maintainer a img{position:relative;z-index:2;width:100%;margin-left:auto;margin-right:auto;-webkit-filter:grayscale(100%);filter:grayscale(100%);transition:all 0.4s}.maintainer a.active:hover img{-webkit-filter:grayscale(0%);filter:grayscale(0%)}.maintainer .member-info{position:relative;text-align:center;z-index:3;padding-bottom:10px}.maintainer .member-info h3{color:#812CE5;font-size:18px;font-weight:600;letter-spacing:3px;margin-top:15px;margin-bottom:10px}.maintainer .member-info p.title{font-size:1rem;margin-bottom:5px;letter-spacing:0.25px;font-weight:800;color:#262626}.maintainer .member-info p.team{font-size:.875rem;margin-bottom:10px;color:#6c6c6d}.maintainer .member-info a{margin-left:5px;margin-right:5px}.maintainer .member-info i{color:#3d5a97}.maintainer:after{position:absolute;content:"";width:100%;height:100%;left:0;top:0;border:2px solid #e3f0fa;opacity:1;transition:all 0.3s}.maintainer:hover{box-shadow:0 26px 49px rgba(0,0,0,0.17)}.maintainer:hover:after{opacity:0}.resources .main-background{display:none}.resources .jumbotron{height:145px;align-items:flex-end}@media screen and (min-width: 768px){.resources .jumbotron{height:194px}}.resources .jumbotron p{margin-bottom:0}.resources .main-content-wrapper{margin-top:230px;margin-bottom:0.75rem}@media screen and (min-width: 768px){.resources .main-content-wrapper{margin-top:307px}}@media screen and (min-width: 768px){.resources .resource-card{margin-bottom:2.25rem}}.quick-starts{background:#f3f4f7}.quick-starts .col-md-2-4{position:relative;width:100%;min-height:1px;padding-right:15px;padding-left:15px}@media (min-width: 768px){.quick-starts .col-md-2-4{flex:0 0 20%;max-width:20%}}.quick-starts .start-locally-col{margin-bottom:1.25rem}.quick-starts .start-locally-col .row.ptbuild,.quick-starts .start-locally-col .row.os,.quick-starts .start-locally-col .row.package,.quick-starts .start-locally-col .row.language,.quick-starts .start-locally-col .row.cuda{min-height:1.25rem;margin-bottom:1.25rem}@media screen and (min-width: 768px){.quick-starts .start-locally-col .row.ptbuild,.quick-starts .start-locally-col .row.os,.quick-starts .start-locally-col .row.package,.quick-starts .start-locally-col .row.language,.quick-starts .start-locally-col .row.cuda{margin-bottom:0}}@media (min-width: 768px) and (max-width: 1239px){.quick-starts .start-locally-col{flex:0 0 100%;max-width:100%}}@media screen and (min-width: 768px){.quick-starts .start-locally-col{margin-bottom:2.5rem}.quick-starts .start-locally-col .row{margin-bottom:0}}@media screen and (min-width: 1240px){.quick-starts .start-locally-col{margin-bottom:0}}.quick-starts .start-locally-col pre{font-size:80% !important;background-color:#ffffff !important}.quick-starts .start-locally-col .prev-versions-btn{margin-top:30px}@media (min-width: 768px) and (max-width: 1239px){.quick-starts .cloud-options-col{flex:0 0 100%;max-width:100%;margin-left:0;margin-top:1.25rem}}.quick-starts p{font-size:1.125rem;line-height:1.75rem}.quick-starts .card-body{flex:1 1 auto}.quick-starts .cloud-option-image{margin-left:.9375rem;margin-right:1.5625rem;margin-bottom:.3125rem}.quick-starts 
.cloud-option-row{margin-left:0;cursor:pointer}.quick-starts .option{border:2px solid #f3f4f7;font-size:1rem;color:#6c6c6d;letter-spacing:-0.22px;line-height:1.25rem;background:#fff;cursor:pointer}.quick-starts .option:hover{background-color:#ee4c2c;color:#fff}.quick-starts .selected{background-color:#ee4c2c;color:#fff}.quick-starts .block{margin-bottom:.0625rem;height:3.75rem;display:flex;align-items:center}.quick-starts .title-block{margin:.0625rem;height:3.75rem;border:2px solid #f3f4f7;font-size:1rem;color:#6c6c6d;line-height:1.25rem;display:flex;align-items:center}.quick-starts .title-block:before{display:block;content:".";color:transparent;border-left:2px solid #CCCDD1;height:100%;position:absolute;left:0}.quick-starts #command{color:#4a4a4a;background-color:#fff;padding:.9375rem;border:2px solid #f3f4f7;word-wrap:break-word;display:table-cell;vertical-align:middle}.quick-starts #command a{font-size:125%}@media screen and (min-width: 768px){.quick-starts #command a:hover{color:#ee4c2c}}.quick-starts #command pre{word-break:break-all;white-space:normal}.quick-starts .command-container{display:table;width:100%}@media screen and (min-width: 768px){.quick-starts .command-container{min-height:5.25rem}}.quick-starts .command-container pre{margin-bottom:0px;padding:0px;font-size:75%;background-color:#f3f4f7}.quick-starts .command-block{height:5.25rem;word-wrap:break-word;color:#6c6c6d}.quick-starts .command-block:before{border-left:2px solid #000}.quick-starts .quick-start-link{color:#6c6c6d}.quick-starts .mobile-heading{display:flex;align-items:center;font-weight:400}@media screen and (min-width: 768px){.quick-starts .mobile-heading{display:none}}.quick-starts .command-mobile-heading{display:flex;align-items:center;font-weight:400;color:#000}@media screen and (min-width: 768px){.quick-starts .command-mobile-heading{display:none}}.quick-starts .headings{display:none}@media screen and (min-width: 768px){.quick-starts .headings{display:block}}.quick-starts .cloud-options-col{margin-top:1.25rem}@media screen and (min-width: 768px){.quick-starts .cloud-options-col{margin-top:0}}@media (max-width: 978px){.quick-starts .os-text{margin-top:0}}.quick-start-guides{font-size:1.125rem;letter-spacing:0.25px;line-height:2.25rem;color:#CCCDD1}.quick-start-guides .select-instructions{color:#262626;border-bottom:2px solid #CCCDD1;margin-bottom:1rem;display:inline-block}@media screen and (min-width: 768px){.quick-start-guides .select-instructions{margin-bottom:0}}.quick-start-module{padding-top:2.5rem;padding-bottom:2.5rem}.quick-start-module .option-module{float:right}@media screen and (min-width: 768px){.quick-start-module{padding-top:4rem;padding-bottom:4.125rem}}.quick-start-module p{color:#6c6c6d;font-size:1.125em;letter-spacing:0.25px;padding-bottom:.9375rem;margin-bottom:1.4rem}.quick-start-module h3{font-size:1.5rem;letter-spacing:1.33px;line-height:2rem;text-transform:uppercase;margin-bottom:2.1rem}.quick-starts .cloud-option-body{display:flex;align-items:center;height:64px;padding:0 0 0 5rem;position:relative;background-image:url("/assets/images/chevron-right-orange.svg");background-size:6px 13px;background-position:center right 15px;background-repeat:no-repeat}@media screen and (min-width: 768px){.quick-starts .cloud-option-body:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.quick-starts .cloud-option-body:hover:after{width:100%}.quick-starts .cloud-option-body:hover{color:#262626}}@media screen and 
(min-width: 768px){.quick-starts .cloud-option-body{padding-right:2rem}}@media (min-width: 768px) and (max-width: 1239px){.quick-starts .cloud-option-body{padding-right:1.25rem}}@media screen and (min-width: 768px){.quick-starts .cloud-option-body{background-size:8px 14px}}.quick-starts .cloud-option-body:before{opacity:0.5;position:absolute;left:1.875rem;top:21px}.quick-starts .cloud-option-body.aws:before{content:url("/assets/images/aws-logo.svg")}.quick-starts .cloud-option-body.microsoft-azure:before{content:url("/assets/images/microsoft-azure-logo.svg")}.quick-starts .cloud-option-body.google-cloud:before{content:url("/assets/images/google-cloud-logo.svg")}.quick-starts .cloud-option-body.colab:before{content:url("/assets/images/colab-logo.svg")}.quick-starts .cloud-option-body.alibaba:before{content:url("/assets/images/alibaba-logo.svg");left:.75rem;top:19px}@media screen and (min-width: 768px){.quick-starts .cloud-option-body.alibaba:before{left:1.0625rem}}@media screen and (min-width: 768px){.quick-starts .cloud-option-body:hover:before{opacity:1}}.quick-starts .cloud-option{background-color:#fff;margin-bottom:.125rem;border:2px solid #f3f4f7;font-size:1.125rem;letter-spacing:-0.25px;line-height:1.875rem;color:#262626}.quick-starts .cloud-option #microsoft-azure p{color:#262626;margin:0;padding:0;font-size:inherit;line-height:1.3rem}.quick-starts .cloud-option #microsoft-azure span{margin-bottom:0;padding-bottom:0;color:#ee4c2c;padding:0px 35px 0px 8px;font-style:italic;line-height:1.3rem}@media (min-width: 768px) and (max-width: 1239px){.quick-starts .cloud-option{font-size:1rem}}.quick-starts .cloud-option ul{display:none;width:100%;margin:0 0 1.25rem 0;padding:0}.quick-starts .cloud-option ul li{margin-top:0;position:relative;padding-left:5rem}@media (min-width: 768px) and (max-width: 1239px){.quick-starts .cloud-option ul li{font-size:1rem}}.quick-starts .cloud-option ul li a{color:#6c6c6d;letter-spacing:-0.25px;line-height:30px}@media screen and (min-width: 768px){.quick-starts .cloud-option ul li a:hover{color:#ee4c2c}}@media screen and (min-width: 768px){.quick-starts .cloud-option ul li:hover:before{content:"\2022";color:#ee4c2c;position:absolute;left:36px}}.quick-starts .cloud-option ul li:first-of-type{margin-top:1.25rem}.quick-starts .cloud-option.open .cloud-option-body{background-image:url("/assets/images/chevron-down-orange.svg");background-size:14px 14px;border-bottom:1px solid #ee4c2c;color:#262626}@media screen and (min-width: 768px){.quick-starts .cloud-option.open .cloud-option-body{border-bottom:none}}.quick-starts .cloud-option.open .cloud-option-body:after{width:100%}.quick-starts .cloud-option.open .cloud-option-body:before{opacity:1}.quick-starts .cloud-option.open ul{display:block}.blog .navbar-nav .nav-link{color:#000}.blog .main-content{padding-bottom:1.5rem}@media screen and (min-width: 768px){.blog .main-content{padding-top:1.70rem;padding-bottom:3.5rem}}.blog .main-background{height:290px}@media screen and (min-width: 768px){.blog .main-background{height:485px}}.blog .blog-detail-background{height:300px}@media screen and (min-width: 768px){.blog .blog-detail-background{height:312px}}.blog .main-content-menu .navbar-nav .nav-link{text-transform:capitalize}.blog .main-content-menu .navbar-nav .nav-link.selected{color:#ee4c2c !important;text-decoration:underline;-webkit-text-decoration-color:#ee4c2c;text-decoration-color:#ee4c2c;opacity:0.75 !important}@media screen and (min-width: 768px){.blog .main-content-menu 
.nav-item:last-of-type{position:absolute;right:0}.blog .main-content-menu .nav-item:last-of-type a{margin-right:0}}.blog .zoom-in{cursor:zoom-in}.blog .zoomed{cursor:zoom-out}.blog .zoomed img{margin:auto !important;position:absolute;top:0;left:0;right:0;bottom:0;max-width:98%}.blog .nav-logo{background-image:url("/assets/images/logo-dark.svg")}.blog .main-content-wrapper{margin-top:275px}.blog .main-content-wrapper .row.blog-index{margin-top:30px}.blog .main-content-wrapper .row.blog-index p{color:#6c6c6d}.blog .main-content-wrapper .row.blog-vertical{display:block;max-width:100%;margin:auto}.blog .main-content-wrapper .row.blog-vertical .col-md-4{display:initial}.blog .main-content-wrapper .row.blog-vertical .btn{float:left}.blog .main-content-wrapper .vertical-blog-container{border-bottom:1px solid #E2E2E2;padding-bottom:3rem}.blog .main-content-wrapper .vertical-blog-container:last-of-type{margin-bottom:2rem}@media screen and (min-width: 768px){.blog .main-content-wrapper{margin-top:470px}.blog .main-content-wrapper .row.blog-index [class*="col-"]:not(:first-child):not(:last-child):not(:nth-child(3n)){padding-right:2.1875rem;padding-left:2.1875rem}.blog .main-content-wrapper .row.blog-index [class*="col-"]:nth-child(3n){padding-left:2.1875rem}.blog .main-content-wrapper .row.blog-index [class*="col-"]:nth-child(3n+1){padding-right:2.1875rem}.blog .main-content-wrapper .col-md-4{margin-bottom:1.4375rem}}.blog .main-content-wrapper h4 a{font-family:FreightSans;font-size:1.5rem;color:#000;letter-spacing:0;line-height:2rem;font-weight:400}.blog .main-content-wrapper .author{color:#ee4c2c;font-size:1rem;letter-spacing:0.25px;line-height:1.875rem}.blog .main-content-wrapper .author-icon{position:relative;top:1.625rem;height:1.0625rem;width:1.1875rem}.blog .blog-detail-content{padding-bottom:2.8rem}@media screen and (min-width: 768px){.blog .blog-detail-wrapper{margin-top:324px}}.blog .jumbotron{top:6.5625rem}@media screen and (min-width: 768px){.blog .jumbotron{height:25.3125rem}}@media screen and (min-width: 768px){.blog .jumbotron .container{padding-bottom:2.8125rem}}.blog .jumbotron .blog-index-title{overflow:hidden;white-space:nowrap;text-overflow:ellipsis;color:white}@media screen and (min-width: 768px){.blog .jumbotron .blog-index-title{overflow:unset;white-space:unset;text-overflow:unset}}.blog .jumbotron h1{letter-spacing:-1.65px;font-size:3rem;line-height:3.75rem;text-transform:none}.blog .jumbotron h1 a{color:#fff;word-wrap:break-word}.blog .jumbotron .blog-title{display:inline-flex}.blog .jumbotron .blog-title:hover{color:#fff}.blog .jumbotron .blog-subtitle{display:inline-flex;margin-top:0;color:#fff;font-size:1.5rem;font-style:italic}.blog .jumbotron .blog-detail-container{padding-top:4rem}@media screen and (min-width: 768px){.blog .jumbotron .blog-detail-container{padding-top:10.875rem}}.blog .jumbotron p{font-size:1.25rem;letter-spacing:0;line-height:1.875rem;color:#fff}.blog .jumbotron .btn{margin-top:.75rem;padding-top:.5625rem}.blog .jumbotron .blog-page-container p.blog-date{padding-top:.625rem}.blog .jumbotron .blog-page-container .btn{margin-bottom:.625rem}.blog .blog-detail-jumbotron{top:45px}@media screen and (min-width: 768px){.blog .blog-detail-jumbotron{height:107px;top:75px}}.blog p.blog-date{font-size:1.125rem;letter-spacing:0;line-height:1.5rem;margin-bottom:.625rem;color:#6c6c6d}.blog p.featured-post{font-size:1.125rem;letter-spacing:0;line-height:1.5rem;margin-bottom:.625rem;color:#fff;font-weight:bolder}.blog 
p.featured-blog-preview{width:65%;min-width:640px;max-width:1280px;margin-bottom:.75rem}.blog #blogPostFilter .nav-link{opacity:0.53;font-size:1.25rem;color:#000;letter-spacing:0;line-height:2.125rem}.blog .page-link{font-size:1.25rem;letter-spacing:0;line-height:2.125rem;color:#ee4c2c;width:7.5rem;text-align:center}.blog .blog-modal{max-width:75%;top:5rem}.blog .blog-modal:hover{cursor:zoom-out}@media (max-width: 575px){.blog .blog-modal{max-width:100%;top:10rem}}.blog .blog-image{cursor:zoom-in}@media (max-width: 1067px){.blog .jumbotron h1{margin-right:0}.blog .jumbotron h1 a{font-size:2.8125rem;line-height:2.5rem}.blog .main-content-wrapper .col-md-4{margin-bottom:4.6875rem}.blog .similar-posts{margin-bottom:3.125rem}}@media (max-width: 1050px){.blog .main-content-wrapper .author-icon{left:-1.875rem}}.blog table tr th{font-weight:600}.blog .pytorch-article{padding-bottom:0}.blog .pytorch-article .enterprise-azure-logo-container{padding-left:0}.blog .pytorch-article .enterprise-azure-logo-container img{margin-bottom:0}.blog .pytorch-article .translation-description{font-style:italic;font-size:1rem}.blog .pytorch-article .ad-discuss{font-size:1.3rem}.blog .pytorch-article ul,.blog .pytorch-article ol{padding-left:1.5rem}.blog .pytorch-article ul li blockquote,.blog .pytorch-article ol li blockquote{margin-left:-1.5rem}.blog .pytorch-article blockquote{padding:5px 10px 5px 10px;border-left:#ee4c2c 2px solid;word-wrap:break-word}.blog .pytorch-article blockquote p{font-size:1rem;margin-bottom:0}.blog .pytorch-article blockquote ul{padding-left:1.125rem;margin:5px 0 5px 0}.blog .pytorch-article table tr th:first-of-type,.blog .pytorch-article table tr td:first-of-type{padding:.3125rem}.blog .pytorch-article img{margin-bottom:1.125rem}twitterwidget{margin:0 auto;margin-top:1.125rem !important;margin-bottom:1.125rem !important}.pytorch-article .outlined-code-block{border:1px solid black;padding:1rem;margin-bottom:1rem}.pytorch-article .outlined-code-block pre{margin:0;padding:0;background-color:white}.pytorch-article .reference-list li{overflow-wrap:anywhere}.similar-posts-module{background:#f3f4f7}.similar-posts-module p.blog-date{font-size:1.125rem;color:#CCCDD1;letter-spacing:0;line-height:1.5rem}.similar-posts-module h4 a{font-family:FreightSans;font-size:1.5rem;color:#000;letter-spacing:0;line-height:2rem;font-weight:400}.similar-posts-module .module-content{margin-bottom:2.1875rem}.similar-posts-module .module-content .navbar-nav{margin-top:3.75rem}.similar-posts-module .module-content .module-heading{text-transform:uppercase;color:#000;font-size:1.5rem;letter-spacing:.083125rem;line-height:2rem;font-weight:400}@media screen and (min-width: 768px){.similar-posts-module .module-content .nav-item:last-of-type{position:absolute;right:0}.similar-posts-module .module-content .nav-item:last-of-type a{margin-right:0}}.similar-posts-module .see-more-posts{color:#000;font-size:1.125rem;letter-spacing:-0.25px;line-height:1.875rem;top:.125rem}input[type='search']{-moz-appearance:none;-webkit-appearance:none}.navSearchWrapper{align-items:center;align-self:center;display:flex;justify-content:center;position:relative;right:10px;top:15px;margin-left:0;padding-bottom:20px}@media screen and (min-width: 768px){.navSearchWrapper{position:absolute;margin-left:30px;display:block;padding-left:3px;padding-bottom:0}}.tabletSearchWrapper{top:0px}@media (min-width: 768px) and (max-width: 1239px){.tabletSearchWrapper{top:-55px}}@media (min-width: 768px) and (max-width: 
1239px){.tabletSearchWrapper{padding-bottom:20px;position:relative;margin-left:0}}.navSearchWrapper .aa-dropdown-menu{background:#f9f9f9;border:3px solid rgba(57,57,57,0.25);color:#393939;font-size:.875rem;left:auto !important;line-height:1.2em;right:0 !important}.navSearchWrapper .aa-dropdown-menu .algolia-docsearch-suggestion--category-header{background:#000;color:white;font-size:.875rem;font-weight:400}.navSearchWrapper .aa-dropdown-menu .algolia-docsearch-suggestion--category-header .algolia-docsearch-suggestion--highlight{background-color:#000;color:#fff}.navSearchWrapper .aa-dropdown-menu .algolia-docsearch-suggestion--title .algolia-docsearch-suggestion--highlight,.navSearchWrapper .aa-dropdown-menu .algolia-docsearch-suggestion--subcategory-column .algolia-docsearch-suggestion--highlight{color:#000}.navSearchWrapper .aa-dropdown-menu .algolia-docsearch-suggestion__secondary,.navSearchWrapper .aa-dropdown-menu .algolia-docsearch-suggestion--subcategory-column{border-color:rgba(57,57,57,0.3)}@media screen and (min-width: 768px){.navSearchWrapper .algolia-autocomplete .algolia-docsearch-suggestion--subcategory-column{word-wrap:normal}}input#search-input{background-color:inherit;border:none;border-radius:20px;color:#000;font-size:1.125rem;font-weight:300;line-height:20px;outline:none;padding-left:25px;position:relative;transition:0.5s width ease;display:none;width:220px;background-image:url("/assets/images/search-icon.svg");background-size:12px 15px;background-repeat:no-repeat;background-position:8px 5px}input#search-input:hover{background-image:url("/assets/images/search-icon-orange.svg")}input#mobile-search-input{font-size:2rem;background-color:transparent;color:#fff;border:none;outline:none;padding-left:25px;position:relative;border-top-left-radius:20px;border-bottom-left-radius:20px;width:300px;display:block}input#search-input:focus,input#search-input:active{color:#000}.navigationSlider .slidingNav .navSearchWrapper .algolia-docsearch-footer a{height:auto}@media only screen and (max-width: 735px){.navSearchWrapper{width:100%}}input::-moz-placeholder{color:#e5e5e5}input:-ms-input-placeholder{color:#e5e5e5}input::-ms-input-placeholder{color:#e5e5e5}input::placeholder{color:#e5e5e5}.hljs{padding:1.25rem 1.5rem}@media only screen and (max-width: 1024px){.reactNavSearchWrapper input#search-input{background-color:rgba(242,196,178,0.25);border:none;border-radius:20px;box-sizing:border-box;color:#393939;font-size:.875rem;line-height:20px;outline:none;padding-left:25px;position:relative;transition:background-color 0.2s cubic-bezier(0.68, -0.55, 0.265, 1.55),width 0.2s cubic-bezier(0.68, -0.55, 0.265, 1.55),color 0.2s ease;width:100%}.reactNavSearchWrapper input#search-input:focus,.reactNavSearchWrapper input#search-input:active{background-color:#000;color:#fff}.reactNavSearchWrapper .algolia-docsearch-suggestion--subcategory-inline{display:none}.reactNavSearchWrapper>span{width:100%}.reactNavSearchWrapper .aa-dropdown-menu{font-size:.75rem;line-height:2em;padding:0;border-width:1px;min-width:500px}.reactNavSearchWrapper .algolia-docsearch-suggestion__secondary{border-top:none}.aa-suggestions{min-height:140px;max-height:60vh;-webkit-overflow-scrolling:touch;overflow-y:scroll}}@media only screen and (min-width: 1024px){.navSearchWrapper{padding-left:10px;position:relative;right:auto;top:auto}}@media only screen and (min-width: 1024px) and (min-width: 768px){.navSearchWrapper{padding-left:3px;right:10px;margin-left:0}}@media only screen and (min-width: 1024px){.navSearchWrapper 
.algolia-autocomplete{display:block}.tabletSearchWrapper{right:10px;top:-55px}}@media only screen and (max-width: 735px){.reactNavSearchWrapper .aa-dropdown-menu{min-width:400px}}@media only screen and (max-width: 475px){.reactNavSearchWrapper .aa-dropdown-menu{min-width:300px}}.search-border{display:none;flex-direction:row;border:none;background-color:transparent;border-radius:20px;width:100%;float:right}@media screen and (min-width: 768px){.search-border{display:flex}}.mobile-search-border{flex-direction:row;border:none;background-color:rgba(255,255,255,0.1);border-radius:20px;width:100%;float:right;display:flex}@media (min-width: 768px) and (max-width: 1239px){.mobile-search-border{border-radius:25px}}#close-search{color:#ee4c2c;padding-right:10px;font-size:.99em;display:none;cursor:pointer}.active-header{margin-top:-1px}.active-search-icon{background-image:url("/assets/images/search-icon-orange.svg") !important;display:inline-block !important}.active-background{background-color:#f3f4f7;width:50%;padding:4px}.homepage-header input#search-input{background-image:url("/assets/images/search-icon-white.svg");color:#fff}.homepage-header input#search-input:focus,.homepage-header input#search-input:active{color:#fff}.homepage-header .active-background{background-color:rgba(0,0,0,0.2)}.homepage-header #close-search{color:#fff;opacity:0.5}.homepage-header #close-search:hover{color:#ee4c2c}.homepage-header #search-icon{background-image:url(/assets/images/search-icon-white.svg)}.homepage-header #search-icon:hover{background-color:rgba(0,0,0,0.2)}#search-icon{background-image:url(/assets/images/search-icon.svg);color:transparent;width:25px;height:25px;background-size:14px 16px;background-repeat:no-repeat;background-position:6px 5px;border-radius:25px;cursor:pointer}#search-icon:hover{background-color:#f3f4f7;background-image:url(/assets/images/search-icon-orange.svg)}#mobile-search-icon{background-image:url(/assets/images/search-icon-white.svg);width:30px;height:38px;background-size:16px 28px;background-repeat:no-repeat;background-position:0px 5px;cursor:pointer;border-top-right-radius:20px;border-bottom-right-radius:20px}@media (min-width: 768px) and (max-width: 1239px){#mobile-search-icon{height:50px;width:35px;background-size:20px 42px}}.navSearchWrapper .algolia-autocomplete .ds-dropdown-menu{min-width:330px;height:500px;overflow-y:scroll}@media screen and (min-width: 768px){.navSearchWrapper .algolia-autocomplete .ds-dropdown-menu{height:auto;min-width:700px;overflow-y:hidden}}@media (min-width: 768px) and (max-width: 1239px){.navSearchWrapper .algolia-autocomplete .ds-dropdown-menu{height:700px;overflow-y:scroll}}@media (min-width: 769px) and (max-width: 1024px){.navSearchWrapper .algolia-autocomplete .ds-dropdown-menu{min-width:950px}}.cookie-banner-wrapper{display:none}.cookie-banner-wrapper.is-visible{display:block;position:fixed;bottom:0;background-color:#f3f4f7;min-height:100px;width:100%;z-index:401;border-top:3px solid #ededee}.cookie-banner-wrapper .gdpr-notice{color:#6c6c6d;margin-top:1.5625rem;text-align:left;max-width:1440px}@media screen and (min-width: 768px){.cookie-banner-wrapper .gdpr-notice{width:77%}}@media (min-width: 768px) and (max-width: 1239px){.cookie-banner-wrapper .gdpr-notice{width:inherit}}.cookie-banner-wrapper .gdpr-notice .cookie-policy-link{color:#343434}.cookie-banner-wrapper .close-button{-webkit-appearance:none;-moz-appearance:none;appearance:none;background:transparent;border:1px solid 
#f3f4f7;height:1.3125rem;position:absolute;bottom:42px;right:0;top:0;cursor:pointer;outline:none}@media screen and (min-width: 768px){.cookie-banner-wrapper .close-button{right:20%;top:inherit}}@media (min-width: 768px) and (max-width: 1239px){.cookie-banner-wrapper .close-button{right:0;top:0}}.hub .jumbotron{height:375px}@media screen and (min-width: 768px){.hub .jumbotron{height:420px}}.hub .jumbotron h1 #hub-header,.hub .jumbotron h1 #hub-sub-header{color:#ee4c2c;font-weight:lighter}.hub .jumbotron h1 #hub-sub-header{color:#262626}.hub .jumbotron p.lead,.hub .jumbotron p.hub-release-message{margin-bottom:.9375rem;padding-top:2.1875rem;color:#6c6c6d}@media screen and (min-width: 768px){.hub .jumbotron p.lead,.hub .jumbotron p.hub-release-message{width:77%}}.hub .jumbotron p.hub-release-message{padding-top:0;font-style:italic}.hub .jumbotron svg{margin-bottom:1.25rem}.hub .jumbotron p.detail-lead{padding-top:3.125rem;color:#979797;width:100%;margin-bottom:0px}.hub .jumbotron p.lead-summary{color:#6c6c6d}.hub.hub-index .jumbotron{height:280px}@media screen and (min-width: 768px){.hub.hub-index .jumbotron{height:325px}}.hub .detail-github-link{background:#ee4c2c;color:#fff}.hub .detail-colab-link{background:#ffc107;color:#000}.hub .detail-web-demo-link{background:#4a9fb5;color:#fff}.hub .detail-colab-link,.hub .detail-github-link,.hub .detail-web-demo-link{margin-top:1rem}.hub .detail-button-container{margin-top:2.8125rem}@media (min-width: 768px) and (max-width: 1239px){.hub .detail-button-container{margin-top:1.25rem}}@media (max-width: 320px){.hub .detail-button-container{margin-top:1.25rem}}@media (max-width: 360px){.hub .detail-button-container{margin-top:1.25rem}}.hub a .detail-colab-link,.hub a .detail-github-link{padding-right:3.125rem}.hub .detail-arrow{color:#ee4c2c;font-size:2.5rem}@media screen and (min-width: 768px){.hub .detail-arrow{font-size:4.5rem}}.hub .with-right-white-arrow{padding-right:2rem;position:relative;background-image:url("/assets/images/chevron-right-white.svg");background-size:6px 13px;background-position:top 10px right 11px;background-repeat:no-repeat}@media screen and (min-width: 768px){.hub .with-right-white-arrow{background-size:8px 14px;background-position:top 15px right 12px;padding-right:2rem}}.hub .main-content{padding-top:8.75rem}@media screen and (min-width: 768px){.hub .main-content{padding-top:8.4375rem}}@media (max-width: 320px){.hub .main-content{padding-top:10rem}}.hub.hub-detail .main-content{padding-top:12.5rem}@media screen and (min-width: 768px){.hub.hub-detail .main-content{padding-top:9.375rem}}.hub.hub-detail .jumbotron{height:350px}@media screen and (min-width: 768px){.hub.hub-detail .jumbotron{height:400px}}.hub .main-content-wrapper{background-color:#f3f4f7;margin-top:300px}@media screen and (min-width: 768px){.hub .main-content-wrapper{margin-top:395px}}.hub-feedback-button{border:2px solid #e2e2e2;color:#A0A0A1;padding-left:0;padding-right:5rem;font-size:1rem;width:13rem}.hub-feedback-button:after{bottom:-1px}.hub-flag{background-image:url("/assets/images/feedback-flag.svg");background-size:15px 20px;background-position:center right 10px;background-repeat:no-repeat}#hub-icons{height:2rem}@media (max-width: 480px){#hub-icons{position:initial;padding-left:0;padding-top:1rem}}.hub.hub-detail .main-content-wrapper{margin-top:305px}@media screen and (min-width: 768px){.hub.hub-detail .main-content-wrapper{margin-top:390px}}@media (min-width: 768px) and (max-width: 1239px){.hub.hub-detail .main-content-wrapper{margin-top:490px}}@media 
(max-width: 320px){.hub.hub-detail .main-content-wrapper{margin-top:330px}}.hub .hub-cards-wrapper,.hub-cards-wrapper-right{margin-bottom:1.125rem;padding-top:1.25rem}.hub .hub-cards-wrapper .card-body .card-summary,.hub-cards-wrapper-right .card-body .card-summary{width:75%}.hub .hub-cards-wrapper .card-body .hub-image,.hub-cards-wrapper-right .card-body .hub-image{position:absolute;top:0px;right:0px;height:100%;width:25%}.hub .hub-cards-wrapper .card-body .hub-image img,.hub-cards-wrapper-right .card-body .hub-image img{height:100%;width:100%}.hub .hub-cards-wrapper .card-body .hub-image:before,.hub-cards-wrapper-right .card-body .hub-image:before{content:'';position:absolute;top:0;left:0;bottom:0;right:0;z-index:1;background:#000000;opacity:.075}.hub .github-stars-count{color:#979797;position:relative;top:.25rem;font-size:14px}@media screen and (min-width: 768px){.hub .github-stars-count{top:.1875rem;font-size:initial}}.hub .github-stars-count-whole-number{display:none}.hub .github-logo{height:15px;width:13px}.hub .icon-count-container{display:inline-block;vertical-align:text-bottom;margin-left:.5rem}.hub .detail-count{font-size:1.25rem}.hub .main-stars-container{display:flex}.hub .detail-stars-container{display:inline-flex}.hub .detail-stars-container .github-stars-image{margin-left:0}.hub .card-body .hub-card-title-container{width:75%;display:inline-flex;max-width:18.75rem}.hub .card-body .hub-card-title-container .experimental-badge{text-transform:uppercase;margin-left:.9375rem;background-color:#e4e4e4;color:#262626;opacity:0.75;font-size:.625rem;letter-spacing:1px;line-height:1.375rem;height:1.25rem;width:6rem;text-align:center;margin-top:.25rem}.hub .card-body .hub-card-title-container .card-title{padding-left:0;font-size:1.5rem;color:#262626}.hub .card-body .hub-card-title-container .star-list{list-style:none;padding-left:0}.hub .card-body .hub-card-title-container .star-list li{display:inline}.hub .card-body .hub-card-title-container .star-list li.github-stars-count-whole-number{display:none}.hub .hub-filter-menu ul{list-style-type:none;padding-left:1.25rem}.hub .hub-filter-menu ul li{padding-right:1.25rem;word-break:break-all}.hub .hub-filter-menu ul li a{color:#979797}.hub .hub-filter-menu ul li a:hover{color:#ee4c2c}.hub .hub-filter{cursor:pointer}.hub-index #dropdownSortLeft{color:#979797;cursor:pointer;z-index:1;position:absolute;top:inherit;left:23%;max-width:4rem}@media (min-width: 480px) and (max-width: 590px){.hub-index #dropdownSortLeft{left:40%}}.hub #dropdownFilter,#dropdownSort,#dropdownSortLeft{color:#979797;cursor:pointer;z-index:1;position:absolute;top:11rem;right:1rem;left:inherit}@media (min-width: 480px) and (max-width: 590px){.hub #dropdownFilter,#dropdownSort,#dropdownSortLeft{top:7rem}}@media (min-width: 590px){.hub #dropdownFilter,#dropdownSort,#dropdownSortLeft{top:5rem}}@media screen and (min-width: 768px){.hub #dropdownFilter,#dropdownSort,#dropdownSortLeft{top:5rem}}.hub .sort-menu{left:inherit;right:1rem;top:12.5rem;max-width:12rem}@media (min-width: 480px) and (max-width: 590px){.hub .sort-menu{top:8.5rem}}@media (min-width: 590px) and (max-width: 900px){.hub .sort-menu{top:6.5rem}}@media (min-width: 900px) and (max-width: 1239px){.hub .sort-menu{top:6.5rem}}@media screen and (min-width: 1240px){.hub .sort-menu{right:0;top:6.5rem}}.hub-index .sort-menu{left:23%;top:inherit;max-width:12rem}.hub .research-hub-title,.research-hub-sub-title{text-transform:uppercase;letter-spacing:1.78px;line-height:2rem}.research-hub-sub-title{padding-bottom:1.25rem}.hub 
.research-hub-title{color:#ee4c2c}.hub .all-models-button,.full-docs-button{font-size:1.125rem;position:relative;cursor:pointer;outline:none;padding:.625rem 1.875rem .625rem 1.25rem;background-color:#fff;margin-bottom:0.125rem;border:2px solid #f3f4f7;letter-spacing:-0.25px;line-height:1.75rem;color:#6c6c6d;background-image:url("/assets/images/chevron-right-orange.svg");background-size:6px 13px;background-position:center right 10px;background-repeat:no-repeat}.hub .all-models-button a,.full-docs-button a{color:#6c6c6d}@media screen and (min-width: 768px){.hub .all-models-button:after,.full-docs-button:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.hub .all-models-button:hover:after,.full-docs-button:hover:after{width:100%}.hub .all-models-button:hover,.full-docs-button:hover{color:#262626}}.hub .hub-column{padding-bottom:4.6875rem}.hub.hub-index .hub-column{padding-bottom:0}.hub .how-it-works{padding-top:3.125rem;padding-bottom:2.8125rem}.hub .how-it-works .how-it-works-text{color:#6c6c6d;font-size:1.25rem;letter-spacing:0;line-height:1.875rem}.hub .how-it-works .how-it-works-title-col{padding-bottom:3.4375rem}.hub .how-it-works .full-docs-button{margin-top:1.875rem}.hub .hub-code-text{font-size:80%;color:#262626;background-color:#e2e2e2;padding:2px}.hub .hub-code-block{display:block;border-left:3px solid #ee4c2c;padding:1.25rem 1.5625rem 1.25rem 1.5625rem;margin-bottom:3.75rem}.hub pre.highlight{background-color:#e2e2e2;border-left:2px solid #ee4c2c}.hub code.highlighter-rouge{background-color:#e2e2e2}.hub article{padding-top:1.25rem}@media screen and (min-width: 768px){.hub article{padding-top:0}}.hub article p{color:#262626}@media screen and (min-width: 768px){.hub .hub-detail-background{height:515px}}.hub .dropdown-menu{border-radius:0;padding-bottom:0}.hub .card:hover .hub-image:before{bottom:100%}.hub.hub.hub-detail .github-stars-image img{height:9px}@media screen and (min-width: 768px){.hub.hub.hub-detail .github-stars-image img{height:10px}}.hub #development-models-hide,#research-models-hide{display:none}@media (min-width: 768px){.hub .col-md-6.hub-column{flex:0 0 100%;max-width:100%}}@media screen and (min-width: 1240px){.hub .col-md-6.hub-column{flex:0 0 50%;max-width:50%}}@media (min-width: 768px){.hub .col-md-12.hub-column .col-md-6{flex:0 0 100%;max-width:100%}}@media screen and (min-width: 1240px){.hub .col-md-12.hub-column .col-md-6{flex:0 0 100%;max-width:50%}}.hub .featured-image{padding-bottom:1.25rem}.hub .coming-soon{font-weight:300;font-style:italic}@media screen and (min-width: 768px){.hub.hub-index .jumbotron{height:325px}}.hub.hub-index .jumbotron h1{padding-top:0}@media screen and (min-width: 768px){.hub.hub-index .jumbotron h1{padding-top:3.4375rem}}.hub.hub-index .jumbotron p.lead{padding-top:3.4375rem}.hub.hub-index .main-content-wrapper{margin-top:210px}@media screen and (min-width: 768px){.hub.hub-index .main-content-wrapper{margin-top:280px}}.hub .page-link{font-size:1.25rem;letter-spacing:0;line-height:2.125rem;color:#ee4c2c;width:7.5rem;text-align:center}.hub .filter-btn{color:#979797;border:1px solid #979797;display:inline-block;text-align:center;white-space:nowrap;vertical-align:middle;padding:0.375rem 0.75rem;font-size:1rem;line-height:1.5;margin-bottom:5px}.hub .filter-btn:hover{border:1px solid #ee4c2c;color:#ee4c2c}.hub .selected{border:1px solid #ee4c2c;background-color:#ee4c2c;color:#fff}.hub .selected:hover{color:#fff}.hub 
.all-tag-selected{background-color:#979797;color:#fff}.hub .all-tag-selected:hover{border-color:#979797;color:#fff}.hub .pagination .page{border:1px solid #dee2e6;padding:0.5rem 0.75rem}.hub .pagination .active .page{background-color:#dee2e6}.hub .hub-tags-container{width:60%}.hub .hub-tags-container.active{width:0}@media screen and (min-width: 768px){.hub .hub-search-wrapper{top:8px}}.hub .hub-search-wrapper .algolia-autocomplete .ds-dropdown-menu{min-width:100%;max-width:100% !important}.hub .hub-search-wrapper .algolia-autocomplete{width:100%}.hub .hub-search-wrapper.active{width:100%}.hub .hub-search-wrapper span{font-size:1.125rem;text-align:center}@media (max-width: 480px){.hub #hub-search-icon{margin-top:1rem}}#hub-search-icon{background-image:url("/assets/images/search-icon.svg");color:transparent;opacity:0.4;width:25px;height:25px;margin-left:3rem;background-size:15px 20px;background-repeat:no-repeat;right:10px;position:absolute;z-index:1;cursor:pointer}#hub-search-icon:hover{background-image:url("/assets/images/search-icon-orange.svg");opacity:1}#hub-search-input{background-color:#CCCDD1;border:none;color:#000;font-size:1.125rem;font-weight:300;line-height:20px;outline:none;position:relative;display:none;width:100%;border-radius:5px;padding:.875rem 0 .875rem .3125rem}#hub-close-search{display:none;margin-left:20px;opacity:0.4;right:10px;position:absolute;z-index:1;cursor:pointer;font-size:1.125rem}@media screen and (min-width: 768px){#hub-close-search{top:1.125rem}}#hub-close-search:hover{color:#ee4c2c;opacity:1}.hub .hub-divider{margin-bottom:2.2rem;margin-top:1.5rem}.hub .active-hub-divider{border-color:#ee4c2c}.hub .hub-search-border{display:flex;align-items:center;flex-direction:row;border:none;background-color:transparent;border-radius:20px;width:100%}.hub .hub-cards-wrapper{z-index:1000}.hub .nav-container{display:flex;width:100%;position:absolute}.compact-cards{width:100%}.compact-cards a{color:#6C6C6D}.compact-cards a:hover{color:#ee4c2c}.compact-hub-card-wrapper{padding:0}.compact-card-container{display:flex;align-items:center}.compact-card-body{padding-top:8px}.compact-card-body:hover{border-bottom:1px solid #ee4c2c;color:#ee4c2c}.compact-card-body:hover .compact-item-title{color:#ee4c2c}.compact-card-body .compact-hub-card-title-container{width:75%;display:flex}.compact-model-card{height:auto;border-bottom:1px solid #E2E2E2}.compact-item-title{padding-left:0;color:#000}.compact-card-summary{white-space:nowrap;overflow:hidden;text-overflow:ellipsis;top:5px}.compact-hub-divider{padding:0;width:100%}.hub-select-container{position:absolute;right:0;height:2rem}.compact-hub-index-cards{padding-bottom:2rem}.full-hub-icon:hover{cursor:pointer;height:3rem}.compact-hub-icon{margin-left:0.5rem;margin-right:3.125rem}.compact-hub-icon:hover{cursor:pointer}.mobile article{margin-bottom:5rem}.mobile .main-background{height:275px}@media screen and (min-width: 768px){.mobile .main-background{height:380px}}.mobile .main-content-wrapper{margin-top:275px}@media screen and (min-width: 768px){.mobile .main-content-wrapper{margin-top:350px}}.mobile .jumbotron{height:190px}@media screen and (min-width: 768px){.mobile .jumbotron{height:260px}}.mobile .main-content .navbar{background-color:#f3f4f7;padding-left:0;padding-bottom:0;padding-top:0}@media (min-width: 992px){.mobile .main-content .navbar li:first-of-type{padding-left:3.4375rem}.mobile .main-content .navbar .nav-item{padding:2rem;cursor:pointer}.mobile .main-content .navbar 
.nav-link{position:relative;top:10%;transform:translateY(-50%)}}.mobile .main-content .navbar .nav-select{background-color:#fff}.mobile .main-content .navbar .nav-select .nav-link{color:#ee4c2c;font-weight:500}.mobile .main-content .navbar .nav-link{font-size:1.125rem;color:#8c8c8c}@media screen and (min-width: 768px){.mobile .main-content .navbar .nav-link{margin-left:1.875rem}}.mobile .main-content .navbar .nav-link:hover{color:#ee4c2c}.mobile .main-content .navbar .nav-item{padding-top:.9375rem;padding-bottom:.9375rem}@media screen and (min-width: 768px){.mobile .main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (min-width: 768px) and (max-width: 1239px){.mobile .main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (max-width: 990px){.mobile .main-content .navbar .nav-item{padding-bottom:.625rem;padding-top:1rem}}.mobile .main-content .navbar .navbar-toggler{margin-left:2.5rem}.mobile .main-content{padding-top:0}@media screen and (min-width: 768px){.mobile .main-content{padding-top:1.9rem}}.mobile .nav-menu-wrapper{background-color:#f3f4f7}.mobile .navbar-nav{flex-direction:row}.mobile .mobile-page-sidebar{padding-top:2.5rem;padding-bottom:2.5rem;top:15%}@media screen and (min-width: 768px){.mobile .mobile-page-sidebar{padding-top:0}}.mobile .mobile-page-sidebar ul{padding-left:0}.mobile .mobile-page-sidebar li{list-style-type:none;line-height:36px}.mobile .mobile-page-sidebar li a{color:#8c8c8c}.mobile .mobile-page-sidebar li a.active,.mobile .mobile-page-sidebar li a:hover{color:#ee4c2c}@media screen and (min-width: 1240px){.deep-learning .header-container{margin-bottom:1rem}}.deep-learning .jumbotron{height:180px}@media screen and (min-width: 768px){.deep-learning .jumbotron{height:250px}}.deep-learning .jumbotron .thank-you-page-container{margin-top:0}@media (min-width: 768px) and (max-width: 1239px){.deep-learning .jumbotron .thank-you-page-container{margin-top:250px}}@media screen and (min-width: 768px){.deep-learning .jumbotron .deep-learning-jumbotron-text{margin-top:55px}.deep-learning .jumbotron .deep-learning-jumbotron-text h1{padding-top:30px}}@media (min-width: 768px) and (max-width: 1239px){.deep-learning .jumbotron .deep-learning-jumbotron-text{max-width:95%;flex-basis:100%}}.deep-learning .jumbotron .deep-learning-thank-you-text{width:80%}.deep-learning .jumbotron .deep-learning-thank-you-text .download-book-link{display:inline-block}.deep-learning .jumbotron .deep-learning-landing-text{width:100%}@media screen and (min-width: 768px){.deep-learning .jumbotron .deep-learning-landing-text{width:85%}}.deep-learning .jumbotron .deep-learning-book-container{display:none}@media screen and (min-width: 768px){.deep-learning .jumbotron .deep-learning-book-container{display:block}}@media (min-width: 768px) and (max-width: 1239px){.deep-learning .jumbotron .deep-learning-book-container{display:none}}.deep-learning .jumbotron .thank-you-book-container{display:none}@media (min-width: 768px) and (max-width: 1239px){.deep-learning .jumbotron .thank-you-book-container{display:block}}@media screen and (min-width: 768px){.deep-learning .jumbotron .thank-you-book-container{display:block}}@media screen and (min-width: 768px){.deep-learning .deep-learning-col{max-width:80%}}@media screen and (min-width: 768px){.deep-learning .deep-learning-background{height:440px}}@media screen and (min-width: 768px){.deep-learning .header-holder{height:90px}}.deep-learning .main-content-wrapper{margin-top:250px}@media screen and (min-width: 768px){.deep-learning 
.main-content-wrapper{margin-top:480px}}@media screen and (min-width: 768px){.deep-learning .deep-learning-content{padding-top:0}}.deep-learning .main-background{height:250px}@media screen and (min-width: 768px){.deep-learning .main-background{height:440px}}.deep-learning .thank-you-wrapper{margin-top:400px}@media screen and (min-width: 768px){.deep-learning .thank-you-wrapper{margin-top:275px}}.deep-learning .thank-you-background{height:438px}@media screen and (min-width: 768px){.deep-learning .thank-you-background{height:680px}}.deep-learning-container{display:flex;align-items:center}.deep-learning-logo{background-image:url("/assets/images/pytorch-kr-logo.png")}.deep-learning-row{display:flex;align-items:center}.deep-learning-row .lead{margin-top:1rem;margin-bottom:2rem}@media (min-width: 768px) and (max-width: 1239px){.deep-learning-row h1{font-size:3rem}}@media screen and (min-width: 768px){.deep-learning-row h1{margin-top:2rem}}.deep-learning-book{max-width:100%;height:400px}.deep-learning-form{margin-left:-1rem}@media screen and (min-width: 768px){.deep-learning-form{margin-left:0;margin-top:1rem}}#deep-learning-button{margin-top:2rem}.deep-learning-form .email-subscribe-form .deep-learning-input{padding-left:.5rem;background-color:#f3f4f7}.deep-learning-form #mce-error-response{color:#ee4c2c}.ecosystem .contributor-jumbotron{width:90%}@media screen and (min-width: 768px){.ecosystem .contributor-jumbotron{height:262px}}.ecosystem .contributor-jumbotron .container{max-width:920px}.ecosystem .contributor-jumbotron h1{padding-top:0}.ecosystem .contributor-jumbotron h1 span{font-weight:300;color:#812CE5}.ecosystem .contributor-jumbotron .contributor-jumbo-text h1{color:white}.ecosystem .contributor-jumbotron .contributor-jumbo-text h2{color:white;padding-top:0}.hidden{display:none}.contributor-container-fluid{height:4rem;width:100%}@media screen and (max-width: 767px){.contributor-container-fluid{margin-top:2rem}}@media screen and (min-width: 1100px){.contributor-container-fluid{margin-left:0}}.ecosystem .contributor.main-content{padding-top:0}.ecosystem .contributor.main-content .navbar{padding-left:0;padding-bottom:0;padding-top:0}.ecosystem .contributor.main-content .navbar .nav-item{cursor:pointer}.ecosystem .contributor.main-content .navbar .nav-item:last-of-type{position:relative}@media (min-width: 992px){.ecosystem .contributor.main-content .navbar .nav-item{padding:2rem;cursor:pointer}.ecosystem .contributor.main-content .navbar .nav-link{position:relative;top:10%;transform:translateY(-50%)}}.ecosystem .contributor.main-content .navbar .nav-select{background-color:#fff}.ecosystem .contributor.main-content .navbar .nav-select .nav-link{color:#ee4c2c;font-weight:500}.ecosystem .contributor.main-content .navbar .nav-link{font-size:1.125rem;color:#8c8c8c}@media screen and (min-width: 768px){.ecosystem .contributor.main-content .navbar .nav-link{margin-left:1.875rem}}.ecosystem .contributor.main-content .navbar .nav-link:hover{color:#ee4c2c}.ecosystem .contributor.main-content .navbar .contributor-nav-link{padding-left:1.25rem;padding-right:1.25rem}@media screen and (min-width: 768px){.ecosystem .contributor.main-content .navbar .contributor-nav-link{padding-left:1.875rem;padding-right:1.875rem}}.ecosystem .contributor.main-content .navbar .contributor-nav{flex-direction:row}.ecosystem .contributor.main-content .navbar .nav-item{padding-top:.9375rem;padding-bottom:.9375rem}@media screen and (min-width: 768px){.ecosystem .contributor.main-content .navbar 
.nav-item{padding-bottom:0;padding-top:2rem}}@media (min-width: 768px) and (max-width: 1239px){.ecosystem .contributor.main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (max-width: 990px){.ecosystem .contributor.main-content .navbar .nav-item{padding-bottom:.625rem;padding-top:1rem}}.ecosystem .contributor.main-content .navbar .navbar-toggler{margin-left:2.5rem}.past-issue-container{display:flex}@media (max-width: 767px){.past-issue-container{display:block}}.past-issue-container .get-started-cloud-sidebar .sticky-top{position:-webkit-sticky;position:sticky;top:15%}@media (max-width: 767px){.past-issue-container .get-started-cloud-sidebar .sticky-top{position:relative;top:0;margin-left:0}}.past-issue-container .get-started-cloud-sidebar .pytorch-article li{list-style:initial}.past-issue-container .get-started-cloud-sidebar li{list-style-type:none;line-height:36px;color:#8c8c8c}.past-issue-container .get-started-cloud-sidebar span{white-space:nowrap}#past-issues{max-width:920px;margin:auto;margin-top:0;margin-bottom:0}.contributor-container{max-width:920px;left:0;right:0;margin-left:auto;margin-right:auto;padding-left:30px;padding-right:30px;width:90%}.past-issue-container.container{padding-left:5px;padding-top:45px}.nav-background{width:100%;background-color:#f3f4f7}#get-started-contributor-sidebar-list{padding-left:0}#get-started-contributor-sidebar-list .active{color:#ee4c2c}#get-started-contributor-sidebar-list li a{color:#8c8c8c}.two-column-row{max-width:920px;margin:0 auto 0 auto;padding:0 30px 43px 30px;width:90%}@media screen and (min-width: 768px){.two-column-row{display:flex}}.two-column-row h2{text-transform:uppercase;font-weight:100;margin-bottom:30px}.two-column-row p{margin-bottom:40px}.two-column-row .content-left{flex:60%;padding-top:76px}@media screen and (min-width: 768px){.two-column-row .content-left{margin-right:62px}}.two-column-row .content-left h2{color:#ee4c2c}.two-column-row .content-left .contributor-consent-check{max-width:400px}.two-column-row .content-left .email-consent{color:#979797;font-size:14px}.two-column-row .content-left .please-accept-terms{display:none;color:#ee4c2c;font-size:14px}.two-column-row .content-right{flex:40%;padding-top:76px}.two-column-row .content-right h2{color:#812CE5}.two-column-row .contributor-form{margin:-8px 0 47px 0}.two-column-row .contributor-form .form-success,.two-column-row .contributor-form .form-fail{color:#ee4c2c;display:none;flex:none;margin:8px 0 12px 0}.two-column-row .contributor-form form{width:100%}.two-column-row .contributor-form form .contributor-form-ui{display:flex;max-width:390px;flex-wrap:wrap}.two-column-row .contributor-form form .contributor-form-ui input[type="text"]{border:1px solid #e6e6e6;border-radius:4px;flex:1 70%;padding:5px 8px 5px 8px;margin-right:10px}.two-column-row .contributor-form form .contributor-form-ui input[type="text"]::-moz-placeholder{color:silver}.two-column-row .contributor-form form .contributor-form-ui input[type="text"]:-ms-input-placeholder{color:silver}.two-column-row .contributor-form form .contributor-form-ui input[type="text"]::-ms-input-placeholder{color:silver}.two-column-row .contributor-form form .contributor-form-ui input[type="text"]::placeholder{color:silver}.two-column-row .contributor-form form .contributor-form-ui input[type="text"]:focus{border:1px solid #ee4c2c}.two-column-row .contributor-form form .contributor-form-ui input[type="submit"]{background:#e6e6e6;border:none;border-radius:4px;color:#6d6d6d}.two-column-row .contributor-form form 
.contributor-form-ui input[type="submit"]:hover{background:silver;color:#3a3a3a}.two-column-row .contributor-form input[type="checkbox"]{margin:1px 6px 0 0}.two-column-row .contributor-form .contributor-consent-check{color:#979797;margin-top:1rem}.two-column-row .contributors-button{background-image:url("/assets/images/chevron-right-orange.svg");background-color:#fff;background-size:6px 13px;background-position:center right 10px;background-repeat:no-repeat;border:2px solid #f3f4f7;color:#6c6c6d;cursor:pointer;font-size:1.125rem;outline:none;letter-spacing:-0.25px;line-height:1.75rem;margin-bottom:0.125rem;padding:.625rem 1.875rem .625rem 1.25rem}.two-column-row .contributors-button a{color:#6c6c6d}@media screen and (min-width: 768px){.two-column-row .contributors-button:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.two-column-row .contributors-button:hover:after{width:100%}.two-column-row .contributors-button:hover{color:#262626}}.mobile .enterprise-jumbotron{height:210px}@media screen and (min-width: 768px){.mobile .enterprise-jumbotron{height:280px}}.enterprise{padding-bottom:0}.enterprise p,.enterprise li{color:#6c6c6d;font-size:18px}.enterprise h2{padding-bottom:1.5rem}.enterprise .container{padding:48px 30px 48px 30px}.enterprise .enterprise-gray-container{background-color:#f3f4f7}.enterprise .pyt-enterprise-logo{background-image:url("/assets/images/PTE_lockup_PRIMARY.svg");background-repeat:no-repeat;height:60px}.enterprise .container{max-width:940px}.enterprise .enterprise-landing-azure-logo-container{float:left;padding:0}.ecosystem .events-wrapper{background-color:white}@media screen and (min-width: 768px){.ecosystem .events-wrapper{margin-top:472px}}.ecosystem .events{padding-top:0}.ecosystem .events .event-info-container{display:flex;flex-flow:column}.ecosystem .events .sticky-top{top:15%}.ecosystem .events .event-label{margin-bottom:2rem}.ecosystem .live-event-container{display:flex}@media (max-width: 767px){.ecosystem .live-event-container{flex-flow:wrap}}.ecosystem .events-section{max-width:920px;margin:0 auto 0 auto;padding:0 30px 43px 30px;width:90%}.ecosystem .events-section .event-item{padding-bottom:3rem;border-bottom:1px solid #D6D7D8}.ecosystem .events-section .event-item h2{padding-bottom:1rem}.ecosystem .event-side-nav-container{padding-left:3rem}.ecosystem .event-side-nav-container ul{list-style:none}.ecosystem .live-events-section p{font-size:18px;margin-top:2rem}@media (min-width: 768px) and (max-width: 1239px){.ecosystem .live-events-section{width:100%;padding-left:5px;padding-right:5px}}@media (max-width: 767px){.ecosystem .live-events-section{width:100%;padding-left:5px;padding-right:5px}}.ecosystem .events.main-content{padding-top:0}.events-container-fluid{height:5rem;width:100%;padding-bottom:7rem}@media screen and (max-width: 767px){.events-container-fluid{margin-top:2rem}}@media screen and (min-width: 1100px){.events-container-fluid{margin-left:0}}.ecosystem .events.main-content .navbar{padding-left:0;padding-bottom:0;padding-top:0}.ecosystem .events.main-content .navbar .nav-item{cursor:pointer}.ecosystem .events.main-content .navbar .nav-item:last-of-type{position:relative}@media (min-width: 992px){.ecosystem .events.main-content .navbar .nav-item{padding:2rem;cursor:pointer}.ecosystem .events.main-content .navbar .nav-link{position:relative;top:10%;transform:translateY(-50%)}}.ecosystem .events.main-content .navbar 
.nav-select{background-color:#fff}.ecosystem .events.main-content .navbar .nav-select .nav-link{color:#ee4c2c;font-weight:500}.ecosystem .events.main-content .navbar .nav-link{font-size:1.125rem;color:#8c8c8c}@media screen and (min-width: 768px){.ecosystem .events.main-content .navbar .nav-link{margin-left:1.875rem}}.ecosystem .events.main-content .navbar .nav-link:hover{color:#ee4c2c}.ecosystem .events.main-content .navbar .events-nav-link{padding-left:1.25rem;padding-right:1.25rem}@media screen and (min-width: 768px){.ecosystem .events.main-content .navbar .events-nav-link{padding-left:1.875rem;padding-right:1.875rem}}.ecosystem .events.main-content .navbar .events-nav{flex-direction:row}.ecosystem .events.main-content .navbar .nav-item{padding-top:.9375rem;padding-bottom:.9375rem}@media screen and (min-width: 768px){.ecosystem .events.main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (min-width: 768px) and (max-width: 1239px){.ecosystem .events.main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (max-width: 990px){.ecosystem .events.main-content .navbar .nav-item{padding-bottom:.625rem;padding-top:1rem}}.ecosystem .events.main-content .navbar .navbar-toggler{margin-left:2.5rem}.events-video-wrapper{width:100%;border:1px solid #979797;background-color:#f3f4f7;height:21rem;margin-top:2.5rem}.events-video-wrapper .video-container{display:flex;top:12%}.events-video-wrapper .video-tabs{display:flex}.events-video-wrapper .events-video-nav{flex-direction:row;padding-right:0;margin-bottom:1rem}.events-video-wrapper .events-video-nav .nav-item{border-right:1px solid #979797;border-bottom:1px solid #979797}.events-video-wrapper .events-video-nav .nav-select{background-color:#fff;border-bottom:none}.events-video-wrapper .events-video-nav .nav-select .nav-link{color:#ee4c2c}.events-video-wrapper .events-nav-link{text-align:center}.events-video-wrapper .video{position:relative;height:0;padding-bottom:30%;place-self:center}.events-video-wrapper .video-info{margin-left:3rem;max-width:45%}.events-video-wrapper iframe{height:100%;width:100%;position:absolute}.video-links-container{border:1px solid #979797}.video-links-container .video-links{display:flex}.video-links-container .video-links .video-link-item{padding-left:1rem;list-style:none}.episode-header-text{font-size:26px;margin-bottom:2rem}.episode-card-row{display:block}@media screen and (min-width: 908px){.episode-card-row{display:flex;flex-wrap:wrap;margin-bottom:2rem}}.episode-card-row .episode-card.resource-card{height:14rem;margin-right:1rem;margin-bottom:1rem;background-color:#f3f4f7;border:none;max-width:31%;flex:auto}.episode-card-row .episode-card.resource-card ul{list-style:none}.episode-card-row .episode-card.resource-card a{color:inherit}.episode-card-row .episode-card.resource-card .episode-body{display:block;position:relative;top:30px;margin-left:20px}.episode-card-row .episode-card.resource-card .episode-title{margin-left:3.2rem;margin-bottom:.5rem;font-size:1.5rem}@media screen and (min-width: 768px){.episode-card-row .episode-card.resource-card .episode-title{margin-left:2.5rem}}.episode-card-row .episode-card.resource-card .guest-name{font-weight:500;font-size:1.25rem;overflow:hidden;white-space:nowrap;text-overflow:ellipsis}.episode-card-row .episode-card.resource-card .episode-info{display:flex;justify-content:space-between}.episode-card-row .episode-card.resource-card .episode-info span{padding-left:5px;padding-right:5px}.episode-card-row .episode-card.resource-card 
.info-divide{display:block;border-bottom:1px solid #D6D7D8;margin-top:.5rem;margin-bottom:.5rem}.episode-card-row .episode-card.resource-card .episode-poster{color:#ee4c2c}.episode-card-row .episode-card.resource-card .episode-date-time{display:flex;padding-left:0}.episode-card-row .episode-card.resource-card .episode-date-time span{padding-left:5px;padding-right:5px}@media screen and (max-width: 907px){.episode-card-row .episode-card.resource-card{max-width:100%;margin-bottom:1.25rem}}.episode-card-row .episode-card.resource-card.pytorch-resource:before{content:"";background-size:32px 32px;background-repeat:no-repeat;display:block;position:absolute;height:32px;width:32px;top:30px;left:15px}@media screen and (min-width: 768px){.episode-card-row .episode-card.resource-card.pytorch-resource:before{left:30px;top:30px}}.podcast-container{padding-left:0}@media screen and (min-width: 768px){.podcast-container{display:flex}.podcast-container .podcast-card:not(:first-of-type){margin-left:1rem}}.podcast-container .podcast-card{display:flex;align-items:center;justify-content:center;margin-top:2rem;border:1px solid #D6D7D8;height:8.75rem}@media screen and (min-width: 768px){.podcast-container .podcast-card:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.podcast-container .podcast-card:hover:after{width:100%}.podcast-container .podcast-card:hover{color:#262626}}.podcast-container .podcast-title{font-size:24px;font-weight:400}.comm-stories .community-stories-wrapper{background-color:white}.comm-stories .community-stories{padding-top:0}.comm-stories .community-stories .production-info-container,.comm-stories .community-stories .research-info-container{display:flex;flex-flow:column}.comm-stories .community-stories .sticky-top{top:15%}.comm-stories .production-container,.comm-stories .research-container{display:flex;padding-left:0}@media (max-width: 767px){.comm-stories .production-container,.comm-stories .research-container{flex-flow:wrap}}.comm-stories .production-section,.comm-stories .research-section{max-width:920px;margin:0 auto 0 auto;padding:0 30px 43px 30px;width:90%}.comm-stories .production-section .production-item,.comm-stories .production-section .research-item,.comm-stories .research-section .production-item,.comm-stories .research-section .research-item{padding-bottom:2rem;padding-top:2rem;border-bottom:1px solid #d6d7d8}.comm-stories .production-section .production-item h2,.comm-stories .production-section .research-item h2,.comm-stories .research-section .production-item h2,.comm-stories .research-section .research-item h2{padding-bottom:1rem}.comm-stories .production-side-nav-container #research-sidebar-list,.comm-stories .production-side-nav-container #production-sidebar-list,.comm-stories .research-side-nav-container #research-sidebar-list,.comm-stories .research-side-nav-container #production-sidebar-list{padding-left:0}.comm-stories .production-side-nav-container #research-sidebar-list .active,.comm-stories .production-side-nav-container #production-sidebar-list .active,.comm-stories .research-side-nav-container #research-sidebar-list .active,.comm-stories .research-side-nav-container #production-sidebar-list .active{color:#ee4c2c}.comm-stories .production-side-nav-container #research-sidebar-list ul,.comm-stories .production-side-nav-container #production-sidebar-list ul,.comm-stories .research-side-nav-container #research-sidebar-list ul,.comm-stories .research-side-nav-container 
#production-sidebar-list ul{padding-left:3rem;list-style:none}.comm-stories .production-side-nav-container #research-sidebar-list ul li,.comm-stories .production-side-nav-container #production-sidebar-list ul li,.comm-stories .research-side-nav-container #research-sidebar-list ul li,.comm-stories .research-side-nav-container #production-sidebar-list ul li{line-height:36px}.comm-stories .production-side-nav-container #research-sidebar-list ul li a,.comm-stories .production-side-nav-container #production-sidebar-list ul li a,.comm-stories .research-side-nav-container #research-sidebar-list ul li a,.comm-stories .research-side-nav-container #production-sidebar-list ul li a{color:#8c8c8c}.comm-stories .production-section p,.comm-stories .research-section p{font-size:18px;margin-top:2rem}@media (min-width: 768px) and (max-width: 1239px){.comm-stories .production-section,.comm-stories .research-section{width:100%;padding-left:5px;padding-right:5px}}@media (max-width: 767px){.comm-stories .production-section,.comm-stories .research-section{width:100%;padding-left:5px;padding-right:5px}}.comm-stories .main-content-wrapper{margin-top:275px}@media screen and (min-width: 768px){.comm-stories .main-content-wrapper{margin-top:380px}}.comm-stories .jumbotron{color:#fff;height:190px}@media screen and (min-width: 768px){.comm-stories .jumbotron{height:260px}}.ecosystem .community-stories.main-content{padding-top:0}.community-stories-container-fluid{height:5rem;width:100%;padding-bottom:7rem}@media screen and (max-width: 767px){.community-stories-container-fluid{margin-top:2rem}}@media screen and (min-width: 1100px){.community-stories-container-fluid{margin-left:0}}.comm-stories .community-stories.main-content .navbar{padding-left:0;padding-bottom:0;padding-top:0}.comm-stories .community-stories.main-content .navbar .nav-item{cursor:pointer}.comm-stories .community-stories.main-content .navbar .nav-item:last-of-type{position:relative}@media (min-width: 992px){.comm-stories .community-stories.main-content .navbar .nav-item{padding:2rem;cursor:pointer}.comm-stories .community-stories.main-content .navbar .nav-link{position:relative;top:10%;transform:translateY(-50%)}}.comm-stories .community-stories.main-content .navbar .nav-select{background-color:#fff}.comm-stories .community-stories.main-content .navbar .nav-select .nav-link{color:#ee4c2c;font-weight:500}.comm-stories .community-stories.main-content .navbar .nav-link{font-size:1.125rem;color:#8c8c8c}@media screen and (min-width: 768px){.comm-stories .community-stories.main-content .navbar .nav-link{margin-left:1.875rem}}.comm-stories .community-stories.main-content .navbar .nav-link:hover{color:#ee4c2c}.comm-stories .community-stories.main-content .navbar .community-stories-nav-link{padding-left:1.25rem;padding-right:1.25rem}@media screen and (min-width: 768px){.comm-stories .community-stories.main-content .navbar .community-stories-nav-link{padding-left:1.875rem;padding-right:1.875rem}}.comm-stories .community-stories.main-content .navbar .community-stories-nav{flex-direction:row}.comm-stories .community-stories.main-content .navbar .nav-item{padding-top:.9375rem;padding-bottom:.9375rem}@media screen and (min-width: 768px){.comm-stories .community-stories.main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (min-width: 768px) and (max-width: 1239px){.comm-stories .community-stories.main-content .navbar .nav-item{padding-bottom:0;padding-top:2rem}}@media (max-width: 990px){.comm-stories .community-stories.main-content .navbar 
.nav-item{padding-bottom:.625rem;padding-top:1rem}}.comm-stories .community-stories.main-content .navbar .navbar-toggler{margin-left:2.5rem}.announcement .hero-content{top:148px;height:250px;position:relative;margin-bottom:120px;justify-content:center}@media screen and (min-width: 768px){.announcement .hero-content{top:178px;height:350px}}.announcement .hero-content h1{font-size:3.75rem;text-transform:uppercase;font-weight:lighter;letter-spacing:1.08px;margin-bottom:.625rem;line-height:1.05;color:#fff}@media screen and (min-width: 768px){.announcement .hero-content h1{font-size:4.5rem}}.announcement .hero-content h1.small{font-size:40px}@media screen and (min-width: 768px){.announcement .hero-content h1.small{font-size:58px}}.announcement .hero-content .lead{margin-bottom:1.5625rem;padding-top:1.875rem;color:#fff;width:100%}.announcement .row{justify-content:center}.announcement .main-content{margin-bottom:5rem;padding-bottom:0}.announcement .main-background{height:370px}@media screen and (min-width: 768px){.announcement .main-background{height:450px}}.announcement .card-container{display:grid;grid-template-columns:repeat(2, 1fr);gap:20px;padding-top:3rem}.announcement .card-container .card{border:none;display:block}.announcement .card-container .card a{color:#000}.announcement .card-container .card .card-body{display:flex;flex-direction:column;height:100%;justify-content:space-between;padding:0}.announcement .card-container .card .card-body img{width:100%;height:207px;-o-object-fit:contain;object-fit:contain;padding:20px}@media screen and (min-width: 1000px){.announcement .card-container .card .card-body img{padding:30px}}@media screen and (min-width: 1000px){.announcement .card-container{grid-template-columns:repeat(3, 1fr);gap:36px}}.announcement .contact-us-section{background-color:#f3f4f7;padding:50px 0}.announcement .contact-us-section .row{justify-content:center}.announcement .contact-us-section .row .lead{padding-top:1.5rem}.announcement .contact-us-section .row .hbspt-form{padding:30px 0}.announcement .contact-us-section .row .hbspt-form .hs-button{background-image:url("/assets/images/chevron-right-orange.svg");background-size:6px 13px;background-position:top 16px right 11px;background-repeat:no-repeat;border-radius:0;border:none;background-color:#fff;color:#6c6c6d;font-weight:400;position:relative;letter-spacing:0.25px;padding:.75rem 2rem .75rem .75rem;margin:10px 0}@media screen and (min-width: 768px){.announcement .contact-us-section .row .hbspt-form .hs-button:after{content:"";display:block;width:0;height:1px;position:absolute;bottom:0;left:0;background-color:#ee4c2c;transition:width .250s ease-in-out}.announcement .contact-us-section .row .hbspt-form .hs-button:hover:after{width:100%}.announcement .contact-us-section .row .hbspt-form .hs-button:hover{color:#262626}}@media screen and (min-width: 768px){.announcement .contact-us-section .row .hbspt-form .hs-button{background-position:top 19px right 11px}}.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-2,.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-1{max-width:100%}.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-2 .hs-form-field,.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-1 .hs-form-field{max-width:100%;padding:10px 0;width:100%}.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-2 .hs-form-field input,.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-1 .hs-form-field 
input{border:none;width:100%}.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-2 .hs-form-field textarea,.announcement .contact-us-section .row .hbspt-form fieldset.form-columns-1 .hs-form-field textarea{border:none;width:100%}.announcement .contact-us-section .row .hbspt-form li.hs-form-radio input[type=radio]{width:auto !important}.announcement .contact-us-section .row .hbspt-form li.hs-form-radio span{margin-left:5px}.announcement .contact-us-section .row .hbspt-form ul{list-style-type:none}.announcement .light-background-section{background-color:#fff}.announcement .light-background-section .content{padding:40px 0}.announcement .grey-background-section{background-color:#f3f4f7;padding:60px 0}.announcement .grey-background-section img{height:100px}.announcement .grey-background-section p{font-size:14px;line-height:170%}.announcement .color-background-section{background-image:url("/assets/images/pytorch_bg_purple.jpg");background-size:100% 100%;background-repeat:no-repeat;padding:60px 0}.announcement .color-background-section h2{color:white}.announcement .body-side-text .lead{margin-bottom:1.5625rem;padding-top:1.5rem}.announcement img{width:100%}.announcement h2.upper{font-size:25px;line-height:130%;text-align:center;letter-spacing:1.75px;text-transform:uppercase;margin-bottom:30px}.announcement h3.upper{font-size:19px;text-transform:uppercase;letter-spacing:1.75px;line-height:130%;margin:25px 0}.announcement table.benefits{background-color:white;font-size:14px;text-align:center}.announcement table.benefits td.benefit{border-left:none;min-width:300px;text-align:left}@media screen and (min-width: 768px){.announcement table.benefits td.benefit{min-width:520px}}.announcement table.benefits tbody td{border-left:1px solid #812CE5;vertical-align:middle}.announcement table.benefits tbody td.benefit{font-weight:600}.announcement table.benefits thead,.announcement table.benefits tfoot{background-color:#812CE5;color:white;font-size:16px;font-weight:700}@media screen and (min-width: 768px){.announcement table.benefits thead,.announcement table.benefits tfoot{font-size:20px}}.announcement table.benefits thead td,.announcement table.benefits tfoot td{border-left:1px solid #000;vertical-align:middle;border-top:none}.announcement table.benefits thead a,.announcement table.benefits tfoot a{text-decoration:underline;color:white}.announcement table.benefits thead td.price,.announcement table.benefits tfoot td.price{font-size:14px;line-height:1.2}@media screen and (min-width: 768px){.announcement table.benefits thead td.price,.announcement table.benefits tfoot td.price{font-size:16px}}.announcement table.benefits img{width:15px}.announcement .modal-header{border-bottom:none;padding-bottom:0}.announcement .consolidated-employees tbody td{font-weight:600}.announcement .consolidated-employees td.no-border{border-left:none}.announcement .member-boxes{gap:20px;margin:0}.announcement .member-boxes div.col-sm{background-color:white}.board-member{margin:35px 0}.board-member img{margin-bottom:15px}.board-member a svg{margin-top:5px;height:25px;max-width:30px;fill:#000;color:#000}.board-member a:hover svg{fill:#ee4c2c;color:#ee4c2c}.community iframe{width:100%;height:500px}.coc .quick-start-guides ul{margin-bottom:0;padding-left:0}.coc .main-background{height:275px}@media screen and (min-width: 768px){.coc .main-background{height:380px}}.coc .main-content-wrapper{margin-top:275px}@media screen and (min-width: 768px){.coc .main-content-wrapper{margin-top:350px}}.coc 
.jumbotron{height:190px}@media screen and (min-width: 768px){.coc .jumbotron{height:260px}}.coc .main-content{padding-top:0}@media screen and (min-width: 768px){.coc .main-content{padding-top:1.9rem}}.coc .main-content .pytorch-article{margin-top:50px;max-width:800px}.coc .main-content .pytorch-article br{display:none}.coc .main-content .pytorch-article p{line-height:2.1rem}.coc .main-content .pytorch-article p br{display:inline}.coc .main-content .pytorch-article h1{font-size:2.0rem;text-align:center}.coc .main-content .pytorch-article h2{font-size:1.3rem;margin-top:2.0rem;margin-bottom:0.5rem}.coc .main-content .pytorch-article ul,.coc .main-content .pytorch-article ol{padding-left:1.25rem} diff --git a/assets/menu-tab-selection.js b/assets/menu-tab-selection.js new file mode 100644 index 000000000..04d3a0f8b --- /dev/null +++ b/assets/menu-tab-selection.js @@ -0,0 +1,7 @@ +var menuTabScript = $("script[src*=menu-tab-selection]"); +var pageId = menuTabScript.attr("page-id"); + +$(".main-content-menu .nav-item").removeClass("nav-select"); +$(".main-content-menu .nav-link[data-id='" + pageId + "']") + .parent(".nav-item") + .addClass("nav-select"); diff --git a/assets/mobile-menu.js b/assets/mobile-menu.js new file mode 100644 index 000000000..fab8b2e7a --- /dev/null +++ b/assets/mobile-menu.js @@ -0,0 +1,30 @@ +var mobileMenu = { + bind: function() { + $("[data-behavior='open-mobile-menu']").on('click', function(e) { + e.preventDefault(); + $(".mobile-main-menu").addClass("open"); + $("body").addClass('no-scroll'); + + mobileMenu.listenForResize(); + }); + + $("[data-behavior='close-mobile-menu']").on('click', function(e) { + e.preventDefault(); + mobileMenu.close(); + }); + }, + + listenForResize: function() { + $(window).on('resize.ForMobileMenu', function() { + if ($(this).width() > 768) { + mobileMenu.close(); + } + }); + }, + + close: function() { + $(".mobile-main-menu").removeClass("open"); + $("body").removeClass('no-scroll'); + $(window).off('resize.ForMobileMenu'); + } +}; diff --git a/assets/mobile-page-sidebar.js b/assets/mobile-page-sidebar.js new file mode 100644 index 000000000..d90f0495d --- /dev/null +++ b/assets/mobile-page-sidebar.js @@ -0,0 +1,26 @@ +$(".pytorch-article h2").each(function() { + $("#mobile-page-sidebar-list").append( + "
  • " + this.textContent + "
  • " + ); +}); + +$(".mobile-page-sidebar li").on("click", function() { + removeActiveClass(); + addActiveClass(this); +}); + +function removeActiveClass() { + $(".mobile-page-sidebar li a").each(function() { + $(this).removeClass("active"); + }); +} + +function addActiveClass(element) { + $(element) + .find("a") + .addClass("active"); +} + +if ($("#mobile-page-sidebar-list").text() == "") { + $("#shortcuts-menu").hide(); +} diff --git a/assets/pytorch2-2.pdf b/assets/pytorch2-2.pdf new file mode 100644 index 000000000..8669ecd43 Binary files /dev/null and b/assets/pytorch2-2.pdf differ diff --git a/assets/quick-start-module.js b/assets/quick-start-module.js new file mode 100644 index 000000000..c965682f4 --- /dev/null +++ b/assets/quick-start-module.js @@ -0,0 +1,452 @@ +// Keys are Substrings as diplayed by navigator.platform +var supportedOperatingSystems = new Map([ + ["linux", "linux"], + ["mac", "macos"], + ["win", "windows"], +]); + +var archInfoMap = new Map([ + ["cuda", { title: "CUDA", platforms: new Set(["linux", "windows"]) }], + ["rocm", { title: "ROCm", platforms: new Set(["linux"]) }], + ["accnone", { title: "CPU", platforms: new Set(["linux", "macos", "windows"]) }], +]); + +let version_map = { + nightly: { + accnone: ["cpu", ""], + "cuda.x": ["cuda", "11.8"], + "cuda.y": ["cuda", "12.1"], + "cuda.z": ["cuda", "12.4"], + "rocm5.x": ["rocm", "6.2"], + }, + release: { + accnone: ["cpu", ""], + "cuda.x": ["cuda", "11.8"], + "cuda.y": ["cuda", "12.1"], + "cuda.z": ["cuda", "12.4"], + "rocm5.x": ["rocm", "6.2"], + }, +}; +let stable_version = "Stable (2.5.1)"; + +var default_selected_os = getAnchorSelectedOS() || getDefaultSelectedOS(); +var opts = { + cuda: getPreferredCuda(default_selected_os), + os: default_selected_os, + pm: "pip", + language: "python", + ptbuild: "stable", +}; + +var supportedCloudPlatforms = ["aws", "google-cloud", "microsoft-azure"]; + +var os = $(".os > .option"); +var package = $(".package > .option"); +var language = $(".language > .option"); +var cuda = $(".cuda > .option"); +var ptbuild = $(".ptbuild > .option"); + +os.on("click", function () { + selectedOption(os, this, "os"); +}); +package.on("click", function () { + selectedOption(package, this, "pm"); +}); +language.on("click", function () { + selectedOption(language, this, "language"); +}); +cuda.on("click", function () { + selectedOption(cuda, this, "cuda"); +}); +ptbuild.on("click", function () { + selectedOption(ptbuild, this, "ptbuild"); +}); + +// Pre-select user's operating system +$(function () { + var userOsOption = document.getElementById(opts.os); + var userCudaOption = document.getElementById(opts.cuda); + if (userOsOption) { + $(userOsOption).trigger("click"); + } + if (userCudaOption) { + $(userCudaOption).trigger("click"); + } +}); + +// determine os (mac, linux, windows) based on user's platform +function getDefaultSelectedOS() { + var platform = navigator.platform.toLowerCase(); + for (var [navPlatformSubstring, os] of supportedOperatingSystems.entries()) { + if (platform.indexOf(navPlatformSubstring) !== -1) { + return os; + } + } + // Just return something if user platform is not in our supported map + return supportedOperatingSystems.values().next().value; +} + +// determine os based on location hash +function getAnchorSelectedOS() { + var anchor = location.hash; + var ANCHOR_REGEX = /^#[^ ]+$/; + // Look for anchor in the href + if (!ANCHOR_REGEX.test(anchor)) { + return false; + } + // Look for anchor with OS in the first portion + var testOS = 
anchor.slice(1).split("-")[0]; + for (var [navPlatformSubstring, os] of supportedOperatingSystems.entries()) { + if (testOS.indexOf(navPlatformSubstring) !== -1) { + return os; + } + } + return false; +} + +// determine CUDA version based on OS +function getPreferredCuda(os) { + // Only CPU builds are currently available for MacOS + if (os == "macos") { + return "accnone"; + } + return "cuda.x"; +} + +// Disable compute platform not supported on OS +function disableUnsupportedPlatforms(os) { + if (opts.ptbuild == "preview") archMap = version_map.nightly; + else archMap = version_map.release; + + for (const [arch_key, info] of archInfoMap) { + var elems = document.querySelectorAll('[id^="' + arch_key + '"]'); + if (elems == null) { + console.log("Failed to find element for architecture " + arch_key); + return; + } + for (var i = 0; i < elems.length; i++) { + var supported = info.platforms.has(os); + elems[i].style.textDecoration = supported ? "" : "line-through"; + + // Officially supported arch but not available + if (!archMap[elems[i].id]) { + elems[i].style.textDecoration = "line-through"; + } + } + } +} + +// Change compute versions depending on build type +function changeVersion(ptbuild) { + if (ptbuild == "preview") archMap = version_map.nightly; + else archMap = version_map.release; + + for (const [arch_key, info] of archInfoMap) { + var elems = document.querySelectorAll('[id^="' + arch_key + '"]'); + for (var i = 0; i < elems.length; i++) { + if (archMap[elems[i].id]) { + elems[i].style.textDecoration = ""; + elems[i].children[0].textContent = info.title + " " + archMap[elems[i].id][1]; + } else { + elems[i].style.textDecoration = "line-through"; + } + } + } + var stable_element = document.getElementById("stable"); + stable_element.children[0].textContent = stable_version; +} + +// Change accnone name depending on OS type +function changeAccNoneName(osname) { + var accnone_element = document.getElementById("accnone"); + if (accnone_element == null) { + console.log("Failed to find accnone element"); + return; + } + if (osname == "macos") { + accnone_element.children[0].textContent = "Default"; + } else { + accnone_element.children[0].textContent = "CPU"; + } +} + +function selectedOption(option, selection, category) { + $(option).removeClass("selected"); + $(selection).addClass("selected"); + opts[category] = selection.id; + if (category === "pm") { + var elements = document.getElementsByClassName("language")[0].children; + if (selection.id !== "libtorch" && elements["cplusplus"].classList.contains("selected")) { + $(elements["cplusplus"]).removeClass("selected"); + $(elements["python"]).addClass("selected"); + opts["language"] = "python"; + } else if (selection.id == "libtorch") { + for (var i = 0; i < elements.length; i++) { + if (elements[i].id === "cplusplus") { + $(elements[i]).addClass("selected"); + opts["language"] = "cplusplus"; + } else { + $(elements[i]).removeClass("selected"); + } + } + } + } else if (category === "language") { + var elements = document.getElementsByClassName("package")[0].children; + if (selection.id !== "cplusplus" && elements["libtorch"].classList.contains("selected")) { + $(elements["libtorch"]).removeClass("selected"); + $(elements["pip"]).addClass("selected"); + opts["pm"] = "pip"; + } else if (selection.id == "cplusplus") { + for (var i = 0; i < elements.length; i++) { + if (elements[i].id === "libtorch") { + $(elements[i]).addClass("selected"); + opts["pm"] = "libtorch"; + } else { + $(elements[i]).removeClass("selected"); + } + } + } + } else if 
(category == "ptbuild") { + changeVersion(opts.ptbuild); + //make sure unsupported platforms are disabled + disableUnsupportedPlatforms(opts.os); + } + commandMessage(buildMatcher()); + if (category === "os") { + disableUnsupportedPlatforms(opts.os); + display(opts.os, "installation", "os"); + } + changeAccNoneName(opts.os); +} + +function display(selection, id, category) { + var container = document.getElementById(id); + // Check if there's a container to display the selection + if (container === null) { + return; + } + var elements = container.getElementsByClassName(category); + for (var i = 0; i < elements.length; i++) { + if (elements[i].classList.contains(selection)) { + $(elements[i]).addClass("selected"); + } else { + $(elements[i]).removeClass("selected"); + } + } +} + +function buildMatcher() { + return ( + opts.ptbuild.toLowerCase() + + "," + + opts.pm.toLowerCase() + + "," + + opts.os.toLowerCase() + + "," + + opts.cuda.toLowerCase() + + "," + + opts.language.toLowerCase() + ); +} + +// Cloud Partners sub-menu toggle listeners +$("[data-toggle='cloud-dropdown']").on("click", function (e) { + if ($(this).hasClass("open")) { + $(this).removeClass("open"); + // If you deselect a current drop-down item, don't display it's info any longer + display(null, "cloud", "platform"); + } else { + $("[data-toggle='cloud-dropdown'].open").removeClass("open"); + $(this).addClass("open"); + var cls = $(this).find(".cloud-option-body")[0].className; + for (var i = 0; i < supportedCloudPlatforms.length; i++) { + if (cls.includes(supportedCloudPlatforms[i])) { + display(supportedCloudPlatforms[i], "cloud", "platform"); + } + } + } +}); + +function commandMessage(key) { + var object = { + "preview,pip,linux,accnone,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu", + "preview,pip,linux,cuda.x,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu118", + "preview,pip,linux,cuda.y,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu121", + "preview,pip,linux,cuda.z,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu124", + "preview,pip,linux,rocm5.x,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.2", + "preview,conda,linux,cuda.x,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch-nightly -c nvidia", + "preview,conda,linux,cuda.y,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch-nightly -c nvidia", + "preview,conda,linux,cuda.z,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.4 -c pytorch-nightly -c nvidia", + "preview,conda,linux,rocm5.x,python": + "NOTE: Conda packages are not currently available for ROCm, please use pip instead
    ", + "preview,conda,linux,accnone,python": "conda install pytorch torchvision torchaudio cpuonly -c pytorch-nightly", + "preview,libtorch,linux,accnone,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-shared-with-deps-latest.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-cxx11-abi-shared-with-deps-latest.zip", + "preview,libtorch,linux,cuda.x,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cu118/libtorch-shared-with-deps-latest.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cu118/libtorch-cxx11-abi-shared-with-deps-latest.zip", + "preview,libtorch,linux,cuda.y,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cu121/libtorch-shared-with-deps-latest.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cu121/libtorch-cxx11-abi-shared-with-deps-latest.zip", + "preview,libtorch,linux,cuda.z,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cu124/libtorch-shared-with-deps-latest.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/cu124/libtorch-cxx11-abi-shared-with-deps-latest.zip", + "preview,libtorch,linux,rocm5.x,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/rocm6.2/libtorch-shared-with-deps-latest.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/nightly/rocm6.2/libtorch-cxx11-abi-shared-with-deps-latest.zip", + "preview,pip,macos,cuda.x,python": + "# CUDA is not available on MacOS, please use default package
    pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu", + "preview,pip,macos,cuda.y,python": + "# CUDA is not available on MacOS, please use default package
    pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu", + "preview,pip,macos,cuda.z,python": + "# CUDA is not available on MacOS, please use default package
    pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu", + "preview,pip,macos,rocm5.x,python": + "# ROCm is not available on MacOS, please use default package
    pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu", + "preview,pip,macos,accnone,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu", + "preview,conda,macos,cuda.x,python": + "# CUDA is not available on MacOS, please use default package
    conda install pytorch-nightly::pytorch torchvision torchaudio -c pytorch-nightly", + "preview,conda,macos,cuda.y,python": + "# CUDA is not available on MacOS, please use default package
    conda install pytorch-nightly::pytorch torchvision torchaudio -c pytorch-nightly", + "preview,conda,macos,cuda.z,python": + "# CUDA is not available on MacOS, please use default package
    conda install pytorch-nightly::pytorch torchvision torchaudio -c pytorch-nightly", + "preview,conda,macos,rocm5.x,python": + "# ROCm is not available on MacOS, please use default package
    conda install pytorch-nightly::pytorch torchvision torchaudio -c pytorch-nightly", + "preview,conda,macos,accnone,python": + "conda install pytorch-nightly::pytorch torchvision torchaudio -c pytorch-nightly", + "preview,libtorch,macos,accnone,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-macos-arm64-latest.zip", + "preview,libtorch,macos,cuda.x,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-macos-arm64-latest.zip", + "preview,libtorch,macos,cuda.y,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-macos-arm64-latest.zip", + "preview,libtorch,macos,cuda.z,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-macos-arm64-latest.zip", + "preview,libtorch,macos,rocm5.x,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-macos-arm64-latest.zip", + "preview,pip,windows,accnone,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu", + "preview,pip,windows,cuda.x,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu118", + "preview,pip,windows,cuda.y,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu121", + "preview,pip,windows,cuda.z,python": + "pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu124", + "preview,pip,windows,rocm5.x,python": "NOTE: ROCm is not available on Windows", + "preview,conda,windows,cuda.x,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch-nightly -c nvidia", + "preview,conda,windows,cuda.y,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch-nightly -c nvidia", + "preview,conda,windows,cuda.z,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.4 -c pytorch-nightly -c nvidia", + "preview,conda,windows,rocm5.x,python": "NOTE: ROCm is not available on Windows", + "preview,conda,windows,accnone,python": "conda install pytorch torchvision torchaudio cpuonly -c pytorch-nightly", + "preview,libtorch,windows,accnone,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-win-shared-with-deps-latest.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/nightly/cpu/libtorch-win-shared-with-deps-debug-latest.zip", + "preview,libtorch,windows,cuda.x,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/nightly/cu118/libtorch-win-shared-with-deps-latest.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/nightly/cu118/libtorch-win-shared-with-deps-debug-latest.zip", + "preview,libtorch,windows,cuda.y,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/nightly/cu121/libtorch-win-shared-with-deps-latest.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/nightly/cu121/libtorch-win-shared-with-deps-debug-latest.zip", + "preview,libtorch,windows,cuda.z,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/nightly/cu124/libtorch-win-shared-with-deps-latest.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/nightly/cu124/libtorch-win-shared-with-deps-debug-latest.zip", + "preview,libtorch,windows,rocm5.x,cplusplus": "NOTE: ROCm is not available on Windows", + "stable,pip,linux,accnone,python": + "pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu", + "stable,pip,linux,cuda.x,python": + "pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118", + "stable,pip,linux,cuda.y,python": + "pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121", + "stable,pip,linux,cuda.z,python": "pip3 install torch torchvision torchaudio", + "stable,pip,linux,rocm5.x,python": + "pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm6.2", + "stable,conda,linux,cuda.x,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia", + "stable,conda,linux,cuda.y,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia", + "stable,conda,linux,cuda.z,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.4 -c pytorch -c nvidia", + "stable,conda,linux,rocm5.x,python": + "NOTE: Conda packages are not currently available for ROCm, please use pip instead
    ", + "stable,conda,linux,accnone,python": "conda install pytorch torchvision torchaudio cpuonly -c pytorch", + "stable,libtorch,linux,accnone,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-2.5.1%2Bcpu.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-2.5.1%2Bcpu.zip", + "stable,libtorch,linux,cuda.x,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/cu118/libtorch-shared-with-deps-2.5.1%2Bcu118.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/cu118/libtorch-cxx11-abi-shared-with-deps-2.5.1%2Bcu118.zip", + "stable,libtorch,linux,cuda.y,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/cu121/libtorch-shared-with-deps-2.5.1%2Bcu121.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/cu121/libtorch-cxx11-abi-shared-with-deps-2.5.1%2Bcu121.zip", + "stable,libtorch,linux,cuda.z,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/cu124/libtorch-shared-with-deps-2.5.1%2Bcu124.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/cu124/libtorch-cxx11-abi-shared-with-deps-2.5.1%2Bcu124.zip", + "stable,libtorch,linux,rocm5.x,cplusplus": + "Download here (Pre-cxx11 ABI):
    https://download.pytorch.org/libtorch/rocm6.2/libtorch-shared-with-deps-2.5.1%2Brocm6.2.zip
    Download here (cxx11 ABI):
    https://download.pytorch.org/libtorch/rocm6.2/libtorch-cxx11-abi-shared-with-deps-2.5.1%2Brocm6.2.zip", + "stable,pip,macos,cuda.x,python": + "# CUDA is not available on MacOS, please use default package
    pip3 install torch torchvision torchaudio", + "stable,pip,macos,cuda.y,python": + "# CUDA is not available on MacOS, please use default package
    pip3 install torch torchvision torchaudio", + "stable,pip,macos,cuda.z,python": + "# CUDA is not available on MacOS, please use default package
    pip3 install torch torchvision torchaudio", + "stable,pip,macos,rocm5.x,python": + "# ROCm is not available on MacOS, please use default package
    pip3 install torch torchvision torchaudio", + "stable,pip,macos,accnone,python": "pip3 install torch torchvision torchaudio", + "stable,conda,macos,cuda.x,python": + "# CUDA is not available on MacOS, please use default package
    conda install pytorch::pytorch torchvision torchaudio -c pytorch", + "stable,conda,macos,cuda.y,python": + "# CUDA is not available on MacOS, please use default package
    conda install pytorch::pytorch torchvision torchaudio -c pytorch", + "stable,conda,macos,cuda.z,python": + "# CUDA is not available on MacOS, please use default package
    conda install pytorch::pytorch torchvision torchaudio -c pytorch", + "stable,conda,macos,rocm5.x,python": + "# ROCm is not available on MacOS, please use default package
    conda install pytorch::pytorch torchvision torchaudio -c pytorch", + "stable,conda,macos,accnone,python": "conda install pytorch::pytorch torchvision torchaudio -c pytorch", + "stable,libtorch,macos,accnone,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/cpu/libtorch-macos-arm64-2.5.1.zip", + "stable,libtorch,macos,cuda.x,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/cpu/libtorch-macos-arm64-2.5.1.zip", + "stable,libtorch,macos,cuda.y,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/cpu/libtorch-macos-arm64-2.5.1.zip", + "stable,libtorch,macos,cuda.z,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/cpu/libtorch-macos-arm64-2.5.1.zip", + "stable,libtorch,macos,rocm5.x,cplusplus": + "Download arm64 libtorch here (ROCm and CUDA are not supported):
    https://download.pytorch.org/libtorch/cpu/libtorch-macos-arm64-2.5.1.zip", + "stable,pip,windows,accnone,python": "pip3 install torch torchvision torchaudio", + "stable,pip,windows,cuda.x,python": + "pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118", + "stable,pip,windows,cuda.y,python": + "pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121", + "stable,pip,windows,cuda.z,python": + "pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu124", + "stable,pip,windows,rocm5.x,python": "NOTE: ROCm is not available on Windows", + "stable,conda,windows,cuda.x,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia", + "stable,conda,windows,cuda.y,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia", + "stable,conda,windows,cuda.z,python": + "conda install pytorch torchvision torchaudio pytorch-cuda=12.4 -c pytorch -c nvidia", + "stable,conda,windows,rocm5.x,python": "NOTE: ROCm is not available on Windows", + "stable,conda,windows,accnone,python": "conda install pytorch torchvision torchaudio cpuonly -c pytorch", + "stable,libtorch,windows,accnone,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/cpu/libtorch-win-shared-with-deps-2.5.1%2Bcpu.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/cpu/libtorch-win-shared-with-deps-debug-2.5.1%2Bcpu.zip", + "stable,libtorch,windows,cuda.x,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/cu118/libtorch-win-shared-with-deps-2.5.1%2Bcu118.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/cu118/libtorch-win-shared-with-deps-debug-2.5.1%2Bcu118.zip", + "stable,libtorch,windows,cuda.y,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/cu121/libtorch-win-shared-with-deps-2.5.1%2Bcu121.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/cu121/libtorch-win-shared-with-deps-debug-2.5.1%2Bcu121.zip", + "stable,libtorch,windows,cuda.z,cplusplus": + "Download here (Release version):
    https://download.pytorch.org/libtorch/cu124/libtorch-win-shared-with-deps-2.5.1%2Bcu124.zip
    Download here (Debug version):
    https://download.pytorch.org/libtorch/cu124/libtorch-win-shared-with-deps-debug-2.5.1%2Bcu124.zip", + "stable,libtorch,windows,rocm5.x,cplusplus": "NOTE: ROCm is not available on Windows", + }; + + if (!object.hasOwnProperty(key)) { + $("#command").html( + "
<pre> # Follow instructions at this URL: https://github.com/pytorch/pytorch#from-source </pre>" + ); + } else if (key.indexOf("lts") == 0 && key.indexOf("rocm") < 0) { + $("#command").html("<pre>" + object[key] + "</pre>"); + } else { + $("#command").html("<pre>" + object[key] + "</pre>
    "); + } +} + +// Set cuda version right away +changeVersion("stable"); diff --git a/assets/scroll-to-anchor.js b/assets/scroll-to-anchor.js new file mode 100644 index 000000000..79fee28bb --- /dev/null +++ b/assets/scroll-to-anchor.js @@ -0,0 +1,86 @@ +// Modified from https://stackoverflow.com/a/13067009 +// Going for a JS solution to scrolling to an anchor so we can benefit from +// less hacky css and smooth scrolling. + +var scrollToAnchor = { + bind: function() { + var document = window.document; + var history = window.history; + var location = window.location + var HISTORY_SUPPORT = !!(history && history.pushState); + + var anchorScrolls = { + ANCHOR_REGEX: /^#[^ ]+$/, + offsetHeightPx: function() { + return $(".header-holder").height() + 20; + }, + + /** + * Establish events, and fix initial scroll position if a hash is provided. + */ + init: function() { + this.scrollToCurrent(); + $(window).on('hashchange', $.proxy(this, 'scrollToCurrent')); + $('body').on('click', 'a', $.proxy(this, 'delegateAnchors')); + }, + + /** + * Return the offset amount to deduct from the normal scroll position. + * Modify as appropriate to allow for dynamic calculations + */ + getFixedOffset: function() { + return this.offsetHeightPx(); + }, + + /** + * If the provided href is an anchor which resolves to an element on the + * page, scroll to it. + * @param {String} href + * @return {Boolean} - Was the href an anchor. + */ + scrollIfAnchor: function(href, pushToHistory) { + var match, anchorOffset; + + if(!this.ANCHOR_REGEX.test(href)) { + return false; + } + + match = document.getElementById(href.slice(1)); + + if(match) { + anchorOffset = $(match).offset().top - this.getFixedOffset(); + $('html, body').scrollTop(anchorOffset); + + // Add the state to history as-per normal anchor links + if(HISTORY_SUPPORT && pushToHistory) { + history.pushState({}, document.title, location.pathname + href); + } + } + + return !!match; + }, + + /** + * Attempt to scroll to the current location's hash. + */ + scrollToCurrent: function(e) { + if(this.scrollIfAnchor(window.location.hash) && e) { + e.preventDefault(); + } + }, + + /** + * If the click event's target was an anchor, fix the scroll position. 
+ */ + delegateAnchors: function(e) { + var elem = e.target; + + if(this.scrollIfAnchor(elem.getAttribute('href'), true)) { + e.preventDefault(); + } + } + }; + + $(document).ready($.proxy(anchorScrolls, 'init')); + } +}; diff --git a/assets/search-bar.js b/assets/search-bar.js new file mode 100644 index 000000000..a9128101e --- /dev/null +++ b/assets/search-bar.js @@ -0,0 +1,42 @@ +docsearch({ + apiKey: "e3b73ac141dff0b0fd27bdae9055bc73", + indexName: "pytorch", + inputSelector: "#search-input", + debug: false // Set debug to true if you want to inspect the dropdown +}); + +docsearch({ + apiKey: 'e3b73ac141dff0b0fd27bdae9055bc73', + indexName: 'pytorch', + inputSelector: '#mobile-search-input', + algoliaOptions: { + hitsPerPage: 5 + }, + debug: false // Set debug to true if you want to inspect the dropdown +}); + +$("#search-icon").on("click", function() { + $(this).hide(); + $("#close-search").show(); + $(".search-border") + .addClass("active-background") + .animate({ width: "100%" }, "slow"); + $("#search-input") + .addClass("active-search-icon") + .focus(); + $(".main-menu-item").hide(); + $(".header-logo").addClass("active-header"); +}); + +$("#close-search").on("click", function() { + $(this).hide(); + $("#search-icon").show(); + $(".search-border") + .attr("style", "") + .removeClass("active-background"); + $("#search-input") + .removeClass("active-search-icon") + .val(""); + $(".main-menu-item").fadeIn("slow"); + $(".header-logo").removeClass("active-header"); +}); diff --git a/assets/show-screencast.js b/assets/show-screencast.js new file mode 100644 index 000000000..295d88edb --- /dev/null +++ b/assets/show-screencast.js @@ -0,0 +1,15 @@ +$('a.show-screencast').one('click', func); + +function func(e) { + e.preventDefault(); + $(this).next('div.screencast').show(); + // Hide the show button + $(this).hide(); +} + +$('div.screencast a:contains(Hide)').click(function (e) { + e.preventDefault(); + // Make the show button visible again + $(this).parent().hide() + .prev().one('click', func).show(); +}); \ No newline at end of file diff --git a/assets/track-events.js b/assets/track-events.js new file mode 100644 index 000000000..0a0a6271a --- /dev/null +++ b/assets/track-events.js @@ -0,0 +1,96 @@ +var trackEvents = { + recordClick: function (eventCategory, eventLabel) { + if (typeof gtag == "function") { + var gaEventObject = { + eventCategory: eventCategory, + eventAction: "click", + eventLabel: eventLabel + }; + + gtag('event', 'click', gaEventObject); + } + }, + + bind: function () { + // Clicks on the main menu + $(".main-menu ul li a").on("click", function () { + trackEvents.recordClick("Global Nav", $(this).text()); + return true; + }); + + // Clicks on GitHub link in main or mobile menu + $("#github-main-menu-link, #github-mobile-menu-link").on( + "click", + function () { + trackEvents.recordClick("Link", $(this).text()); + return true; + } + ); + + // Clicks on Resource cards + $(".resource-card a").on("click", function () { + trackEvents.recordClick("Resource Card", $(this).find("h4").text()); + return true; + }); + + // Clicks on Ecosystem Project cards + $(".ecosystem-card a").on("click", function () { + trackEvents.recordClick("Ecosystem Project Card", $(this).find(".card-title").text()); + return true; + }); + + // Clicks on 'Get Started' call to action buttons + $("[data-cta='get-started']").on("click", function () { + trackEvents.recordClick("Get Started CTA", $(this).text()); + return true; + }); + + // Clicks on Cloud Platforms in Quick Start Module + 
$(".cloud-option").on("click", function () { + var platformName = $.trim($(this).find(".cloud-option-body").text()); + trackEvents.recordClick("Quick Start Module - Cloud Platforms", platformName); + }); + + // Clicks on Cloud Platform Services in Quick Start Module + $(".cloud-option ul li a").on("click", function () { + var platformName = $.trim( + $(this). + closest("[data-toggle='cloud-dropdown']"). + find(".cloud-option-body"). + text() + ); + + var serviceName = $.trim($(this).text()); + + trackEvents.recordClick( + "Quick Start Module - Cloud Platforms", + platformName + " - " + serviceName + ); + return true; + }); + + // Clicks on options in Quick Start - Locally + $(".quick-start-module .row .option").on("click", function () { + var selectedOption = $.trim($(this).text()); + var rowIndex = $(this).closest(".row").index(); + var selectedCategory = $(".quick-start-module .headings .title-block"). + eq(rowIndex). + find(".option-text"). + text(); + + trackEvents.recordClick( + "Quick Start Module - Local Install", + selectedCategory + ": " + selectedOption + ) + }) + + // Clicks on Deep Learning Download button + $("#deep-learning-button").on( + "click", + function () { + trackEvents.recordClick("Link", "Download"); + return true; + } + ); + } +}; diff --git a/assets/vendor/anchor.min.js b/assets/vendor/anchor.min.js new file mode 100644 index 000000000..1c2b86fae --- /dev/null +++ b/assets/vendor/anchor.min.js @@ -0,0 +1,9 @@ +// @license magnet:?xt=urn:btih:d3d9a9a6595521f9666a5e94cc830dab83b65699&dn=expat.txt Expat +// +// AnchorJS - v4.3.1 - 2021-04-17 +// https://www.bryanbraun.com/anchorjs/ +// Copyright (c) 2021 Bryan Braun; Licensed MIT +// +// @license magnet:?xt=urn:btih:d3d9a9a6595521f9666a5e94cc830dab83b65699&dn=expat.txt Expat +!function(A,e){"use strict";"function"==typeof define&&define.amd?define([],e):"object"==typeof module&&module.exports?module.exports=e():(A.AnchorJS=e(),A.anchors=new A.AnchorJS)}(this,function(){"use strict";return function(A){function d(A){A.icon=Object.prototype.hasOwnProperty.call(A,"icon")?A.icon:"",A.visible=Object.prototype.hasOwnProperty.call(A,"visible")?A.visible:"hover",A.placement=Object.prototype.hasOwnProperty.call(A,"placement")?A.placement:"right",A.ariaLabel=Object.prototype.hasOwnProperty.call(A,"ariaLabel")?A.ariaLabel:"Anchor",A.class=Object.prototype.hasOwnProperty.call(A,"class")?A.class:"",A.base=Object.prototype.hasOwnProperty.call(A,"base")?A.base:"",A.truncate=Object.prototype.hasOwnProperty.call(A,"truncate")?Math.floor(A.truncate):64,A.titleText=Object.prototype.hasOwnProperty.call(A,"titleText")?A.titleText:""}function w(A){var e;if("string"==typeof A||A instanceof String)e=[].slice.call(document.querySelectorAll(A));else{if(!(Array.isArray(A)||A instanceof NodeList))throw new TypeError("The selector provided to AnchorJS was invalid.");e=[].slice.call(A)}return e}this.options=A||{},this.elements=[],d(this.options),this.isTouchDevice=function(){return Boolean("ontouchstart"in window||window.TouchEvent||window.DocumentTouch&&document instanceof DocumentTouch)},this.add=function(A){var e,t,o,i,n,s,a,c,r,l,h,u,p=[];if(d(this.options),"touch"===(l=this.options.visible)&&(l=this.isTouchDevice()?"always":"hover"),0===(e=w(A=A||"h2, h3, h4, h5, h6")).length)return this;for(null===document.head.querySelector("style.anchorjs")&&((u=document.createElement("style")).className="anchorjs",u.appendChild(document.createTextNode("")),void 
0===(A=document.head.querySelector('[rel="stylesheet"],style'))?document.head.appendChild(u):document.head.insertBefore(u,A),u.sheet.insertRule(".anchorjs-link{opacity:0;text-decoration:none;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}",u.sheet.cssRules.length),u.sheet.insertRule(":hover>.anchorjs-link,.anchorjs-link:focus{opacity:1}",u.sheet.cssRules.length),u.sheet.insertRule("[data-anchorjs-icon]::after{content:attr(data-anchorjs-icon)}",u.sheet.cssRules.length),u.sheet.insertRule('@font-face{font-family:anchorjs-icons;src:url(data:n/a;base64,AAEAAAALAIAAAwAwT1MvMg8yG2cAAAE4AAAAYGNtYXDp3gC3AAABpAAAAExnYXNwAAAAEAAAA9wAAAAIZ2x5ZlQCcfwAAAH4AAABCGhlYWQHFvHyAAAAvAAAADZoaGVhBnACFwAAAPQAAAAkaG10eASAADEAAAGYAAAADGxvY2EACACEAAAB8AAAAAhtYXhwAAYAVwAAARgAAAAgbmFtZQGOH9cAAAMAAAAAunBvc3QAAwAAAAADvAAAACAAAQAAAAEAAHzE2p9fDzz1AAkEAAAAAADRecUWAAAAANQA6R8AAAAAAoACwAAAAAgAAgAAAAAAAAABAAADwP/AAAACgAAA/9MCrQABAAAAAAAAAAAAAAAAAAAAAwABAAAAAwBVAAIAAAAAAAIAAAAAAAAAAAAAAAAAAAAAAAMCQAGQAAUAAAKZAswAAACPApkCzAAAAesAMwEJAAAAAAAAAAAAAAAAAAAAARAAAAAAAAAAAAAAAAAAAAAAQAAg//0DwP/AAEADwABAAAAAAQAAAAAAAAAAAAAAIAAAAAAAAAIAAAACgAAxAAAAAwAAAAMAAAAcAAEAAwAAABwAAwABAAAAHAAEADAAAAAIAAgAAgAAACDpy//9//8AAAAg6cv//f///+EWNwADAAEAAAAAAAAAAAAAAAAACACEAAEAAAAAAAAAAAAAAAAxAAACAAQARAKAAsAAKwBUAAABIiYnJjQ3NzY2MzIWFxYUBwcGIicmNDc3NjQnJiYjIgYHBwYUFxYUBwYGIwciJicmNDc3NjIXFhQHBwYUFxYWMzI2Nzc2NCcmNDc2MhcWFAcHBgYjARQGDAUtLXoWOR8fORYtLTgKGwoKCjgaGg0gEhIgDXoaGgkJBQwHdR85Fi0tOAobCgoKOBoaDSASEiANehoaCQkKGwotLXoWOR8BMwUFLYEuehYXFxYugC44CQkKGwo4GkoaDQ0NDXoaShoKGwoFBe8XFi6ALjgJCQobCjgaShoNDQ0NehpKGgobCgoKLYEuehYXAAAADACWAAEAAAAAAAEACAAAAAEAAAAAAAIAAwAIAAEAAAAAAAMACAAAAAEAAAAAAAQACAAAAAEAAAAAAAUAAQALAAEAAAAAAAYACAAAAAMAAQQJAAEAEAAMAAMAAQQJAAIABgAcAAMAAQQJAAMAEAAMAAMAAQQJAAQAEAAMAAMAAQQJAAUAAgAiAAMAAQQJAAYAEAAMYW5jaG9yanM0MDBAAGEAbgBjAGgAbwByAGoAcwA0ADAAMABAAAAAAwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAABAAH//wAP) format("truetype")}',u.sheet.cssRules.length)),u=document.querySelectorAll("[id]"),t=[].map.call(u,function(A){return A.id}),i=0;i\]./()*\\\n\t\b\v\u00A0]/g,"-").replace(/-{2,}/g,"-").substring(0,this.options.truncate).replace(/^-+|-+$/gm,"").toLowerCase()},this.hasAnchorJSLink=function(A){var e=A.firstChild&&-1<(" "+A.firstChild.className+" ").indexOf(" anchorjs-link "),A=A.lastChild&&-1<(" "+A.lastChild.className+" ").indexOf(" anchorjs-link ");return e||A||!1}}}); +// @license-end \ No newline at end of file diff --git a/assets/vendor/bootstrap.min.js b/assets/vendor/bootstrap.min.js new file mode 100644 index 000000000..c4c0d1f95 --- /dev/null +++ b/assets/vendor/bootstrap.min.js @@ -0,0 +1,7 @@ +/*! 
+ * Bootstrap v4.3.1 (https://getbootstrap.com/) + * Copyright 2011-2019 The Bootstrap Authors (https://github.com/twbs/bootstrap/graphs/contributors) + * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE) + */ +!function(t,e){"object"==typeof exports&&"undefined"!=typeof module?e(exports,require("jquery"),require("popper.js")):"function"==typeof define&&define.amd?define(["exports","jquery","popper.js"],e):e((t=t||self).bootstrap={},t.jQuery,t.Popper)}(this,function(t,g,u){"use strict";function i(t,e){for(var n=0;nthis._items.length-1||t<0))if(this._isSliding)g(this._element).one(Q.SLID,function(){return e.to(t)});else{if(n===t)return this.pause(),void this.cycle();var i=ndocument.documentElement.clientHeight;!this._isBodyOverflowing&&t&&(this._element.style.paddingLeft=this._scrollbarWidth+"px"),this._isBodyOverflowing&&!t&&(this._element.style.paddingRight=this._scrollbarWidth+"px")},t._resetAdjustments=function(){this._element.style.paddingLeft="",this._element.style.paddingRight=""},t._checkScrollbar=function(){var t=document.body.getBoundingClientRect();this._isBodyOverflowing=t.left+t.right
    ',trigger:"hover focus",title:"",delay:0,html:!1,selector:!1,placement:"top",offset:0,container:!1,fallbackPlacement:"flip",boundary:"scrollParent",sanitize:!0,sanitizeFn:null,whiteList:Ee},je="show",He="out",Re={HIDE:"hide"+De,HIDDEN:"hidden"+De,SHOW:"show"+De,SHOWN:"shown"+De,INSERTED:"inserted"+De,CLICK:"click"+De,FOCUSIN:"focusin"+De,FOCUSOUT:"focusout"+De,MOUSEENTER:"mouseenter"+De,MOUSELEAVE:"mouseleave"+De},xe="fade",Fe="show",Ue=".tooltip-inner",We=".arrow",qe="hover",Me="focus",Ke="click",Qe="manual",Be=function(){function i(t,e){if("undefined"==typeof u)throw new TypeError("Bootstrap's tooltips require Popper.js (https://popper.js.org/)");this._isEnabled=!0,this._timeout=0,this._hoverState="",this._activeTrigger={},this._popper=null,this.element=t,this.config=this._getConfig(e),this.tip=null,this._setListeners()}var t=i.prototype;return t.enable=function(){this._isEnabled=!0},t.disable=function(){this._isEnabled=!1},t.toggleEnabled=function(){this._isEnabled=!this._isEnabled},t.toggle=function(t){if(this._isEnabled)if(t){var e=this.constructor.DATA_KEY,n=g(t.currentTarget).data(e);n||(n=new this.constructor(t.currentTarget,this._getDelegateConfig()),g(t.currentTarget).data(e,n)),n._activeTrigger.click=!n._activeTrigger.click,n._isWithActiveTrigger()?n._enter(null,n):n._leave(null,n)}else{if(g(this.getTipElement()).hasClass(Fe))return void this._leave(null,this);this._enter(null,this)}},t.dispose=function(){clearTimeout(this._timeout),g.removeData(this.element,this.constructor.DATA_KEY),g(this.element).off(this.constructor.EVENT_KEY),g(this.element).closest(".modal").off("hide.bs.modal"),this.tip&&g(this.tip).remove(),this._isEnabled=null,this._timeout=null,this._hoverState=null,(this._activeTrigger=null)!==this._popper&&this._popper.destroy(),this._popper=null,this.element=null,this.config=null,this.tip=null},t.show=function(){var e=this;if("none"===g(this.element).css("display"))throw new Error("Please use show on visible elements");var t=g.Event(this.constructor.Event.SHOW);if(this.isWithContent()&&this._isEnabled){g(this.element).trigger(t);var n=_.findShadowRoot(this.element),i=g.contains(null!==n?n:this.element.ownerDocument.documentElement,this.element);if(t.isDefaultPrevented()||!i)return;var o=this.getTipElement(),r=_.getUID(this.constructor.NAME);o.setAttribute("id",r),this.element.setAttribute("aria-describedby",r),this.setContent(),this.config.animation&&g(o).addClass(xe);var s="function"==typeof this.config.placement?this.config.placement.call(this,o,this.element):this.config.placement,a=this._getAttachment(s);this.addAttachmentClass(a);var l=this._getContainer();g(o).data(this.constructor.DATA_KEY,this),g.contains(this.element.ownerDocument.documentElement,this.tip)||g(o).appendTo(l),g(this.element).trigger(this.constructor.Event.INSERTED),this._popper=new u(this.element,o,{placement:a,modifiers:{offset:this._getOffset(),flip:{behavior:this.config.fallbackPlacement},arrow:{element:We},preventOverflow:{boundariesElement:this.config.boundary}},onCreate:function(t){t.originalPlacement!==t.placement&&e._handlePopperPlacementChange(t)},onUpdate:function(t){return e._handlePopperPlacementChange(t)}}),g(o).addClass(Fe),"ontouchstart"in document.documentElement&&g(document.body).children().on("mouseover",null,g.noop);var c=function(){e.config.animation&&e._fixTransition();var t=e._hoverState;e._hoverState=null,g(e.element).trigger(e.constructor.Event.SHOWN),t===He&&e._leave(null,e)};if(g(this.tip).hasClass(xe)){var 
h=_.getTransitionDurationFromElement(this.tip);g(this.tip).one(_.TRANSITION_END,c).emulateTransitionEnd(h)}else c()}},t.hide=function(t){var e=this,n=this.getTipElement(),i=g.Event(this.constructor.Event.HIDE),o=function(){e._hoverState!==je&&n.parentNode&&n.parentNode.removeChild(n),e._cleanTipClass(),e.element.removeAttribute("aria-describedby"),g(e.element).trigger(e.constructor.Event.HIDDEN),null!==e._popper&&e._popper.destroy(),t&&t()};if(g(this.element).trigger(i),!i.isDefaultPrevented()){if(g(n).removeClass(Fe),"ontouchstart"in document.documentElement&&g(document.body).children().off("mouseover",null,g.noop),this._activeTrigger[Ke]=!1,this._activeTrigger[Me]=!1,this._activeTrigger[qe]=!1,g(this.tip).hasClass(xe)){var r=_.getTransitionDurationFromElement(n);g(n).one(_.TRANSITION_END,o).emulateTransitionEnd(r)}else o();this._hoverState=""}},t.update=function(){null!==this._popper&&this._popper.scheduleUpdate()},t.isWithContent=function(){return Boolean(this.getTitle())},t.addAttachmentClass=function(t){g(this.getTipElement()).addClass(Ae+"-"+t)},t.getTipElement=function(){return this.tip=this.tip||g(this.config.template)[0],this.tip},t.setContent=function(){var t=this.getTipElement();this.setElementContent(g(t.querySelectorAll(Ue)),this.getTitle()),g(t).removeClass(xe+" "+Fe)},t.setElementContent=function(t,e){"object"!=typeof e||!e.nodeType&&!e.jquery?this.config.html?(this.config.sanitize&&(e=Se(e,this.config.whiteList,this.config.sanitizeFn)),t.html(e)):t.text(e):this.config.html?g(e).parent().is(t)||t.empty().append(e):t.text(g(e).text())},t.getTitle=function(){var t=this.element.getAttribute("data-original-title");return t||(t="function"==typeof this.config.title?this.config.title.call(this.element):this.config.title),t},t._getOffset=function(){var e=this,t={};return"function"==typeof this.config.offset?t.fn=function(t){return t.offsets=l({},t.offsets,e.config.offset(t.offsets,e.element)||{}),t}:t.offset=this.config.offset,t},t._getContainer=function(){return!1===this.config.container?document.body:_.isElement(this.config.container)?g(this.config.container):g(document).find(this.config.container)},t._getAttachment=function(t){return Pe[t.toUpperCase()]},t._setListeners=function(){var i=this;this.config.trigger.split(" ").forEach(function(t){if("click"===t)g(i.element).on(i.constructor.Event.CLICK,i.config.selector,function(t){return i.toggle(t)});else if(t!==Qe){var e=t===qe?i.constructor.Event.MOUSEENTER:i.constructor.Event.FOCUSIN,n=t===qe?i.constructor.Event.MOUSELEAVE:i.constructor.Event.FOCUSOUT;g(i.element).on(e,i.config.selector,function(t){return i._enter(t)}).on(n,i.config.selector,function(t){return i._leave(t)})}}),g(this.element).closest(".modal").on("hide.bs.modal",function(){i.element&&i.hide()}),this.config.selector?this.config=l({},this.config,{trigger:"manual",selector:""}):this._fixTitle()},t._fixTitle=function(){var t=typeof this.element.getAttribute("data-original-title");(this.element.getAttribute("title")||"string"!==t)&&(this.element.setAttribute("data-original-title",this.element.getAttribute("title")||""),this.element.setAttribute("title",""))},t._enter=function(t,e){var n=this.constructor.DATA_KEY;(e=e||g(t.currentTarget).data(n))||(e=new 
this.constructor(t.currentTarget,this._getDelegateConfig()),g(t.currentTarget).data(n,e)),t&&(e._activeTrigger["focusin"===t.type?Me:qe]=!0),g(e.getTipElement()).hasClass(Fe)||e._hoverState===je?e._hoverState=je:(clearTimeout(e._timeout),e._hoverState=je,e.config.delay&&e.config.delay.show?e._timeout=setTimeout(function(){e._hoverState===je&&e.show()},e.config.delay.show):e.show())},t._leave=function(t,e){var n=this.constructor.DATA_KEY;(e=e||g(t.currentTarget).data(n))||(e=new this.constructor(t.currentTarget,this._getDelegateConfig()),g(t.currentTarget).data(n,e)),t&&(e._activeTrigger["focusout"===t.type?Me:qe]=!1),e._isWithActiveTrigger()||(clearTimeout(e._timeout),e._hoverState=He,e.config.delay&&e.config.delay.hide?e._timeout=setTimeout(function(){e._hoverState===He&&e.hide()},e.config.delay.hide):e.hide())},t._isWithActiveTrigger=function(){for(var t in this._activeTrigger)if(this._activeTrigger[t])return!0;return!1},t._getConfig=function(t){var e=g(this.element).data();return Object.keys(e).forEach(function(t){-1!==Oe.indexOf(t)&&delete e[t]}),"number"==typeof(t=l({},this.constructor.Default,e,"object"==typeof t&&t?t:{})).delay&&(t.delay={show:t.delay,hide:t.delay}),"number"==typeof t.title&&(t.title=t.title.toString()),"number"==typeof t.content&&(t.content=t.content.toString()),_.typeCheckConfig(be,t,this.constructor.DefaultType),t.sanitize&&(t.template=Se(t.template,t.whiteList,t.sanitizeFn)),t},t._getDelegateConfig=function(){var t={};if(this.config)for(var e in this.config)this.constructor.Default[e]!==this.config[e]&&(t[e]=this.config[e]);return t},t._cleanTipClass=function(){var t=g(this.getTipElement()),e=t.attr("class").match(Ne);null!==e&&e.length&&t.removeClass(e.join(""))},t._handlePopperPlacementChange=function(t){var e=t.instance;this.tip=e.popper,this._cleanTipClass(),this.addAttachmentClass(this._getAttachment(t.placement))},t._fixTransition=function(){var t=this.getTipElement(),e=this.config.animation;null===t.getAttribute("x-placement")&&(g(t).removeClass(xe),this.config.animation=!1,this.hide(),this.show(),this.config.animation=e)},i._jQueryInterface=function(n){return this.each(function(){var t=g(this).data(Ie),e="object"==typeof n&&n;if((t||!/dispose|hide/.test(n))&&(t||(t=new i(this,e),g(this).data(Ie,t)),"string"==typeof n)){if("undefined"==typeof t[n])throw new TypeError('No method named "'+n+'"');t[n]()}})},s(i,null,[{key:"VERSION",get:function(){return"4.3.1"}},{key:"Default",get:function(){return Le}},{key:"NAME",get:function(){return be}},{key:"DATA_KEY",get:function(){return Ie}},{key:"Event",get:function(){return Re}},{key:"EVENT_KEY",get:function(){return De}},{key:"DefaultType",get:function(){return ke}}]),i}();g.fn[be]=Be._jQueryInterface,g.fn[be].Constructor=Be,g.fn[be].noConflict=function(){return g.fn[be]=we,Be._jQueryInterface};var Ve="popover",Ye="bs.popover",ze="."+Ye,Xe=g.fn[Ve],$e="bs-popover",Ge=new RegExp("(^|\\s)"+$e+"\\S+","g"),Je=l({},Be.Default,{placement:"right",trigger:"click",content:"",template:''}),Ze=l({},Be.DefaultType,{content:"(string|element|function)"}),tn="fade",en="show",nn=".popover-header",on=".popover-body",rn={HIDE:"hide"+ze,HIDDEN:"hidden"+ze,SHOW:"show"+ze,SHOWN:"shown"+ze,INSERTED:"inserted"+ze,CLICK:"click"+ze,FOCUSIN:"focusin"+ze,FOCUSOUT:"focusout"+ze,MOUSEENTER:"mouseenter"+ze,MOUSELEAVE:"mouseleave"+ze},sn=function(t){var e,n;function i(){return t.apply(this,arguments)||this}n=t,(e=i).prototype=Object.create(n.prototype),(e.prototype.constructor=e).__proto__=n;var o=i.prototype;return 
o.isWithContent=function(){return this.getTitle()||this._getContent()},o.addAttachmentClass=function(t){g(this.getTipElement()).addClass($e+"-"+t)},o.getTipElement=function(){return this.tip=this.tip||g(this.config.template)[0],this.tip},o.setContent=function(){var t=g(this.getTipElement());this.setElementContent(t.find(nn),this.getTitle());var e=this._getContent();"function"==typeof e&&(e=e.call(this.element)),this.setElementContent(t.find(on),e),t.removeClass(tn+" "+en)},o._getContent=function(){return this.element.getAttribute("data-content")||this.config.content},o._cleanTipClass=function(){var t=g(this.getTipElement()),e=t.attr("class").match(Ge);null!==e&&0=this._offsets[o]&&("undefined"==typeof this._offsets[o+1]||t {\n called = true\n })\n\n setTimeout(() => {\n if (!called) {\n Util.triggerTransitionEnd(this)\n }\n }, duration)\n\n return this\n}\n\nfunction setTransitionEndSupport() {\n $.fn.emulateTransitionEnd = transitionEndEmulator\n $.event.special[Util.TRANSITION_END] = getSpecialTransitionEndEvent()\n}\n\n/**\n * --------------------------------------------------------------------------\n * Public Util Api\n * --------------------------------------------------------------------------\n */\n\nconst Util = {\n\n TRANSITION_END: 'bsTransitionEnd',\n\n getUID(prefix) {\n do {\n // eslint-disable-next-line no-bitwise\n prefix += ~~(Math.random() * MAX_UID) // \"~~\" acts like a faster Math.floor() here\n } while (document.getElementById(prefix))\n return prefix\n },\n\n getSelectorFromElement(element) {\n let selector = element.getAttribute('data-target')\n\n if (!selector || selector === '#') {\n const hrefAttr = element.getAttribute('href')\n selector = hrefAttr && hrefAttr !== '#' ? hrefAttr.trim() : ''\n }\n\n try {\n return document.querySelector(selector) ? selector : null\n } catch (err) {\n return null\n }\n },\n\n getTransitionDurationFromElement(element) {\n if (!element) {\n return 0\n }\n\n // Get transition-duration of the element\n let transitionDuration = $(element).css('transition-duration')\n let transitionDelay = $(element).css('transition-delay')\n\n const floatTransitionDuration = parseFloat(transitionDuration)\n const floatTransitionDelay = parseFloat(transitionDelay)\n\n // Return 0 if element or transition duration is not found\n if (!floatTransitionDuration && !floatTransitionDelay) {\n return 0\n }\n\n // If multiple durations are defined, take the first\n transitionDuration = transitionDuration.split(',')[0]\n transitionDelay = transitionDelay.split(',')[0]\n\n return (parseFloat(transitionDuration) + parseFloat(transitionDelay)) * MILLISECONDS_MULTIPLIER\n },\n\n reflow(element) {\n return element.offsetHeight\n },\n\n triggerTransitionEnd(element) {\n $(element).trigger(TRANSITION_END)\n },\n\n // TODO: Remove in v5\n supportsTransitionEnd() {\n return Boolean(TRANSITION_END)\n },\n\n isElement(obj) {\n return (obj[0] || obj).nodeType\n },\n\n typeCheckConfig(componentName, config, configTypes) {\n for (const property in configTypes) {\n if (Object.prototype.hasOwnProperty.call(configTypes, property)) {\n const expectedTypes = configTypes[property]\n const value = config[property]\n const valueType = value && Util.isElement(value)\n ? 
'element' : toType(value)\n\n if (!new RegExp(expectedTypes).test(valueType)) {\n throw new Error(\n `${componentName.toUpperCase()}: ` +\n `Option \"${property}\" provided type \"${valueType}\" ` +\n `but expected type \"${expectedTypes}\".`)\n }\n }\n }\n },\n\n findShadowRoot(element) {\n if (!document.documentElement.attachShadow) {\n return null\n }\n\n // Can find the shadow root otherwise it'll return the document\n if (typeof element.getRootNode === 'function') {\n const root = element.getRootNode()\n return root instanceof ShadowRoot ? root : null\n }\n\n if (element instanceof ShadowRoot) {\n return element\n }\n\n // when we don't find a shadow root\n if (!element.parentNode) {\n return null\n }\n\n return Util.findShadowRoot(element.parentNode)\n }\n}\n\nsetTransitionEndSupport()\n\nexport default Util\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): alert.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\nimport Util from './util'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'alert'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.alert'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\n\nconst Selector = {\n DISMISS : '[data-dismiss=\"alert\"]'\n}\n\nconst Event = {\n CLOSE : `close${EVENT_KEY}`,\n CLOSED : `closed${EVENT_KEY}`,\n CLICK_DATA_API : `click${EVENT_KEY}${DATA_API_KEY}`\n}\n\nconst ClassName = {\n ALERT : 'alert',\n FADE : 'fade',\n SHOW : 'show'\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Alert {\n constructor(element) {\n this._element = element\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n // Public\n\n close(element) {\n let rootElement = this._element\n if (element) {\n rootElement = this._getRootElement(element)\n }\n\n const customEvent = this._triggerCloseEvent(rootElement)\n\n if (customEvent.isDefaultPrevented()) {\n return\n }\n\n this._removeElement(rootElement)\n }\n\n dispose() {\n $.removeData(this._element, DATA_KEY)\n this._element = null\n }\n\n // Private\n\n _getRootElement(element) {\n const selector = Util.getSelectorFromElement(element)\n let parent = false\n\n if (selector) {\n parent = document.querySelector(selector)\n }\n\n if (!parent) {\n parent = $(element).closest(`.${ClassName.ALERT}`)[0]\n }\n\n return parent\n }\n\n _triggerCloseEvent(element) {\n const closeEvent = $.Event(Event.CLOSE)\n\n $(element).trigger(closeEvent)\n return closeEvent\n }\n\n _removeElement(element) {\n $(element).removeClass(ClassName.SHOW)\n\n if (!$(element).hasClass(ClassName.FADE)) {\n this._destroyElement(element)\n return\n }\n\n const transitionDuration = Util.getTransitionDurationFromElement(element)\n\n $(element)\n .one(Util.TRANSITION_END, (event) => this._destroyElement(element, event))\n .emulateTransitionEnd(transitionDuration)\n }\n\n _destroyElement(element) {\n $(element)\n .detach()\n .trigger(Event.CLOSED)\n .remove()\n }\n\n // Static\n\n static _jQueryInterface(config) {\n return this.each(function () {\n const $element = $(this)\n let data = 
$element.data(DATA_KEY)\n\n if (!data) {\n data = new Alert(this)\n $element.data(DATA_KEY, data)\n }\n\n if (config === 'close') {\n data[config](this)\n }\n })\n }\n\n static _handleDismiss(alertInstance) {\n return function (event) {\n if (event) {\n event.preventDefault()\n }\n\n alertInstance.close(this)\n }\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\n$(document).on(\n Event.CLICK_DATA_API,\n Selector.DISMISS,\n Alert._handleDismiss(new Alert())\n)\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Alert._jQueryInterface\n$.fn[NAME].Constructor = Alert\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Alert._jQueryInterface\n}\n\nexport default Alert\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): button.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'button'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.button'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\n\nconst ClassName = {\n ACTIVE : 'active',\n BUTTON : 'btn',\n FOCUS : 'focus'\n}\n\nconst Selector = {\n DATA_TOGGLE_CARROT : '[data-toggle^=\"button\"]',\n DATA_TOGGLE : '[data-toggle=\"buttons\"]',\n INPUT : 'input:not([type=\"hidden\"])',\n ACTIVE : '.active',\n BUTTON : '.btn'\n}\n\nconst Event = {\n CLICK_DATA_API : `click${EVENT_KEY}${DATA_API_KEY}`,\n FOCUS_BLUR_DATA_API : `focus${EVENT_KEY}${DATA_API_KEY} ` +\n `blur${EVENT_KEY}${DATA_API_KEY}`\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Button {\n constructor(element) {\n this._element = element\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n // Public\n\n toggle() {\n let triggerChangeEvent = true\n let addAriaPressed = true\n const rootElement = $(this._element).closest(\n Selector.DATA_TOGGLE\n )[0]\n\n if (rootElement) {\n const input = this._element.querySelector(Selector.INPUT)\n\n if (input) {\n if (input.type === 'radio') {\n if (input.checked &&\n this._element.classList.contains(ClassName.ACTIVE)) {\n triggerChangeEvent = false\n } else {\n const activeElement = rootElement.querySelector(Selector.ACTIVE)\n\n if (activeElement) {\n $(activeElement).removeClass(ClassName.ACTIVE)\n }\n }\n }\n\n if (triggerChangeEvent) {\n if (input.hasAttribute('disabled') ||\n rootElement.hasAttribute('disabled') ||\n input.classList.contains('disabled') ||\n rootElement.classList.contains('disabled')) {\n return\n }\n input.checked = !this._element.classList.contains(ClassName.ACTIVE)\n $(input).trigger('change')\n }\n\n input.focus()\n addAriaPressed = false\n }\n }\n\n if (addAriaPressed) {\n this._element.setAttribute('aria-pressed',\n !this._element.classList.contains(ClassName.ACTIVE))\n }\n\n if (triggerChangeEvent) 
{\n $(this._element).toggleClass(ClassName.ACTIVE)\n }\n }\n\n dispose() {\n $.removeData(this._element, DATA_KEY)\n this._element = null\n }\n\n // Static\n\n static _jQueryInterface(config) {\n return this.each(function () {\n let data = $(this).data(DATA_KEY)\n\n if (!data) {\n data = new Button(this)\n $(this).data(DATA_KEY, data)\n }\n\n if (config === 'toggle') {\n data[config]()\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\n$(document)\n .on(Event.CLICK_DATA_API, Selector.DATA_TOGGLE_CARROT, (event) => {\n event.preventDefault()\n\n let button = event.target\n\n if (!$(button).hasClass(ClassName.BUTTON)) {\n button = $(button).closest(Selector.BUTTON)\n }\n\n Button._jQueryInterface.call($(button), 'toggle')\n })\n .on(Event.FOCUS_BLUR_DATA_API, Selector.DATA_TOGGLE_CARROT, (event) => {\n const button = $(event.target).closest(Selector.BUTTON)[0]\n $(button).toggleClass(ClassName.FOCUS, /^focus(in)?$/.test(event.type))\n })\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Button._jQueryInterface\n$.fn[NAME].Constructor = Button\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Button._jQueryInterface\n}\n\nexport default Button\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): carousel.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\nimport Util from './util'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'carousel'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.carousel'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\nconst ARROW_LEFT_KEYCODE = 37 // KeyboardEvent.which value for left arrow key\nconst ARROW_RIGHT_KEYCODE = 39 // KeyboardEvent.which value for right arrow key\nconst TOUCHEVENT_COMPAT_WAIT = 500 // Time for mouse compat events to fire after touch\nconst SWIPE_THRESHOLD = 40\n\nconst Default = {\n interval : 5000,\n keyboard : true,\n slide : false,\n pause : 'hover',\n wrap : true,\n touch : true\n}\n\nconst DefaultType = {\n interval : '(number|boolean)',\n keyboard : 'boolean',\n slide : '(boolean|string)',\n pause : '(string|boolean)',\n wrap : 'boolean',\n touch : 'boolean'\n}\n\nconst Direction = {\n NEXT : 'next',\n PREV : 'prev',\n LEFT : 'left',\n RIGHT : 'right'\n}\n\nconst Event = {\n SLIDE : `slide${EVENT_KEY}`,\n SLID : `slid${EVENT_KEY}`,\n KEYDOWN : `keydown${EVENT_KEY}`,\n MOUSEENTER : `mouseenter${EVENT_KEY}`,\n MOUSELEAVE : `mouseleave${EVENT_KEY}`,\n TOUCHSTART : `touchstart${EVENT_KEY}`,\n TOUCHMOVE : `touchmove${EVENT_KEY}`,\n TOUCHEND : `touchend${EVENT_KEY}`,\n POINTERDOWN : `pointerdown${EVENT_KEY}`,\n POINTERUP : `pointerup${EVENT_KEY}`,\n DRAG_START : `dragstart${EVENT_KEY}`,\n LOAD_DATA_API : `load${EVENT_KEY}${DATA_API_KEY}`,\n CLICK_DATA_API : `click${EVENT_KEY}${DATA_API_KEY}`\n}\n\nconst ClassName = {\n CAROUSEL : 'carousel',\n ACTIVE : 'active',\n SLIDE : 'slide',\n RIGHT : 'carousel-item-right',\n 
LEFT : 'carousel-item-left',\n NEXT : 'carousel-item-next',\n PREV : 'carousel-item-prev',\n ITEM : 'carousel-item',\n POINTER_EVENT : 'pointer-event'\n}\n\nconst Selector = {\n ACTIVE : '.active',\n ACTIVE_ITEM : '.active.carousel-item',\n ITEM : '.carousel-item',\n ITEM_IMG : '.carousel-item img',\n NEXT_PREV : '.carousel-item-next, .carousel-item-prev',\n INDICATORS : '.carousel-indicators',\n DATA_SLIDE : '[data-slide], [data-slide-to]',\n DATA_RIDE : '[data-ride=\"carousel\"]'\n}\n\nconst PointerType = {\n TOUCH : 'touch',\n PEN : 'pen'\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\nclass Carousel {\n constructor(element, config) {\n this._items = null\n this._interval = null\n this._activeElement = null\n this._isPaused = false\n this._isSliding = false\n this.touchTimeout = null\n this.touchStartX = 0\n this.touchDeltaX = 0\n\n this._config = this._getConfig(config)\n this._element = element\n this._indicatorsElement = this._element.querySelector(Selector.INDICATORS)\n this._touchSupported = 'ontouchstart' in document.documentElement || navigator.maxTouchPoints > 0\n this._pointerEvent = Boolean(window.PointerEvent || window.MSPointerEvent)\n\n this._addEventListeners()\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n static get Default() {\n return Default\n }\n\n // Public\n\n next() {\n if (!this._isSliding) {\n this._slide(Direction.NEXT)\n }\n }\n\n nextWhenVisible() {\n // Don't call next when the page isn't visible\n // or the carousel or its parent isn't visible\n if (!document.hidden &&\n ($(this._element).is(':visible') && $(this._element).css('visibility') !== 'hidden')) {\n this.next()\n }\n }\n\n prev() {\n if (!this._isSliding) {\n this._slide(Direction.PREV)\n }\n }\n\n pause(event) {\n if (!event) {\n this._isPaused = true\n }\n\n if (this._element.querySelector(Selector.NEXT_PREV)) {\n Util.triggerTransitionEnd(this._element)\n this.cycle(true)\n }\n\n clearInterval(this._interval)\n this._interval = null\n }\n\n cycle(event) {\n if (!event) {\n this._isPaused = false\n }\n\n if (this._interval) {\n clearInterval(this._interval)\n this._interval = null\n }\n\n if (this._config.interval && !this._isPaused) {\n this._interval = setInterval(\n (document.visibilityState ? this.nextWhenVisible : this.next).bind(this),\n this._config.interval\n )\n }\n }\n\n to(index) {\n this._activeElement = this._element.querySelector(Selector.ACTIVE_ITEM)\n\n const activeIndex = this._getItemIndex(this._activeElement)\n\n if (index > this._items.length - 1 || index < 0) {\n return\n }\n\n if (this._isSliding) {\n $(this._element).one(Event.SLID, () => this.to(index))\n return\n }\n\n if (activeIndex === index) {\n this.pause()\n this.cycle()\n return\n }\n\n const direction = index > activeIndex\n ? 
Direction.NEXT\n : Direction.PREV\n\n this._slide(direction, this._items[index])\n }\n\n dispose() {\n $(this._element).off(EVENT_KEY)\n $.removeData(this._element, DATA_KEY)\n\n this._items = null\n this._config = null\n this._element = null\n this._interval = null\n this._isPaused = null\n this._isSliding = null\n this._activeElement = null\n this._indicatorsElement = null\n }\n\n // Private\n\n _getConfig(config) {\n config = {\n ...Default,\n ...config\n }\n Util.typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n\n _handleSwipe() {\n const absDeltax = Math.abs(this.touchDeltaX)\n\n if (absDeltax <= SWIPE_THRESHOLD) {\n return\n }\n\n const direction = absDeltax / this.touchDeltaX\n\n // swipe left\n if (direction > 0) {\n this.prev()\n }\n\n // swipe right\n if (direction < 0) {\n this.next()\n }\n }\n\n _addEventListeners() {\n if (this._config.keyboard) {\n $(this._element)\n .on(Event.KEYDOWN, (event) => this._keydown(event))\n }\n\n if (this._config.pause === 'hover') {\n $(this._element)\n .on(Event.MOUSEENTER, (event) => this.pause(event))\n .on(Event.MOUSELEAVE, (event) => this.cycle(event))\n }\n\n if (this._config.touch) {\n this._addTouchEventListeners()\n }\n }\n\n _addTouchEventListeners() {\n if (!this._touchSupported) {\n return\n }\n\n const start = (event) => {\n if (this._pointerEvent && PointerType[event.originalEvent.pointerType.toUpperCase()]) {\n this.touchStartX = event.originalEvent.clientX\n } else if (!this._pointerEvent) {\n this.touchStartX = event.originalEvent.touches[0].clientX\n }\n }\n\n const move = (event) => {\n // ensure swiping with one touch and not pinching\n if (event.originalEvent.touches && event.originalEvent.touches.length > 1) {\n this.touchDeltaX = 0\n } else {\n this.touchDeltaX = event.originalEvent.touches[0].clientX - this.touchStartX\n }\n }\n\n const end = (event) => {\n if (this._pointerEvent && PointerType[event.originalEvent.pointerType.toUpperCase()]) {\n this.touchDeltaX = event.originalEvent.clientX - this.touchStartX\n }\n\n this._handleSwipe()\n if (this._config.pause === 'hover') {\n // If it's a touch-enabled device, mouseenter/leave are fired as\n // part of the mouse compatibility events on first tap - the carousel\n // would stop cycling until user tapped out of it;\n // here, we listen for touchend, explicitly pause the carousel\n // (as if it's the second time we tap on it, mouseenter compat event\n // is NOT fired) and after a timeout (to allow for mouse compatibility\n // events to fire) we explicitly restart cycling\n\n this.pause()\n if (this.touchTimeout) {\n clearTimeout(this.touchTimeout)\n }\n this.touchTimeout = setTimeout((event) => this.cycle(event), TOUCHEVENT_COMPAT_WAIT + this._config.interval)\n }\n }\n\n $(this._element.querySelectorAll(Selector.ITEM_IMG)).on(Event.DRAG_START, (e) => e.preventDefault())\n if (this._pointerEvent) {\n $(this._element).on(Event.POINTERDOWN, (event) => start(event))\n $(this._element).on(Event.POINTERUP, (event) => end(event))\n\n this._element.classList.add(ClassName.POINTER_EVENT)\n } else {\n $(this._element).on(Event.TOUCHSTART, (event) => start(event))\n $(this._element).on(Event.TOUCHMOVE, (event) => move(event))\n $(this._element).on(Event.TOUCHEND, (event) => end(event))\n }\n }\n\n _keydown(event) {\n if (/input|textarea/i.test(event.target.tagName)) {\n return\n }\n\n switch (event.which) {\n case ARROW_LEFT_KEYCODE:\n event.preventDefault()\n this.prev()\n break\n case ARROW_RIGHT_KEYCODE:\n event.preventDefault()\n this.next()\n break\n default:\n 
}\n }\n\n _getItemIndex(element) {\n this._items = element && element.parentNode\n ? [].slice.call(element.parentNode.querySelectorAll(Selector.ITEM))\n : []\n return this._items.indexOf(element)\n }\n\n _getItemByDirection(direction, activeElement) {\n const isNextDirection = direction === Direction.NEXT\n const isPrevDirection = direction === Direction.PREV\n const activeIndex = this._getItemIndex(activeElement)\n const lastItemIndex = this._items.length - 1\n const isGoingToWrap = isPrevDirection && activeIndex === 0 ||\n isNextDirection && activeIndex === lastItemIndex\n\n if (isGoingToWrap && !this._config.wrap) {\n return activeElement\n }\n\n const delta = direction === Direction.PREV ? -1 : 1\n const itemIndex = (activeIndex + delta) % this._items.length\n\n return itemIndex === -1\n ? this._items[this._items.length - 1] : this._items[itemIndex]\n }\n\n _triggerSlideEvent(relatedTarget, eventDirectionName) {\n const targetIndex = this._getItemIndex(relatedTarget)\n const fromIndex = this._getItemIndex(this._element.querySelector(Selector.ACTIVE_ITEM))\n const slideEvent = $.Event(Event.SLIDE, {\n relatedTarget,\n direction: eventDirectionName,\n from: fromIndex,\n to: targetIndex\n })\n\n $(this._element).trigger(slideEvent)\n\n return slideEvent\n }\n\n _setActiveIndicatorElement(element) {\n if (this._indicatorsElement) {\n const indicators = [].slice.call(this._indicatorsElement.querySelectorAll(Selector.ACTIVE))\n $(indicators)\n .removeClass(ClassName.ACTIVE)\n\n const nextIndicator = this._indicatorsElement.children[\n this._getItemIndex(element)\n ]\n\n if (nextIndicator) {\n $(nextIndicator).addClass(ClassName.ACTIVE)\n }\n }\n }\n\n _slide(direction, element) {\n const activeElement = this._element.querySelector(Selector.ACTIVE_ITEM)\n const activeElementIndex = this._getItemIndex(activeElement)\n const nextElement = element || activeElement &&\n this._getItemByDirection(direction, activeElement)\n const nextElementIndex = this._getItemIndex(nextElement)\n const isCycling = Boolean(this._interval)\n\n let directionalClassName\n let orderClassName\n let eventDirectionName\n\n if (direction === Direction.NEXT) {\n directionalClassName = ClassName.LEFT\n orderClassName = ClassName.NEXT\n eventDirectionName = Direction.LEFT\n } else {\n directionalClassName = ClassName.RIGHT\n orderClassName = ClassName.PREV\n eventDirectionName = Direction.RIGHT\n }\n\n if (nextElement && $(nextElement).hasClass(ClassName.ACTIVE)) {\n this._isSliding = false\n return\n }\n\n const slideEvent = this._triggerSlideEvent(nextElement, eventDirectionName)\n if (slideEvent.isDefaultPrevented()) {\n return\n }\n\n if (!activeElement || !nextElement) {\n // Some weirdness is happening, so we bail\n return\n }\n\n this._isSliding = true\n\n if (isCycling) {\n this.pause()\n }\n\n this._setActiveIndicatorElement(nextElement)\n\n const slidEvent = $.Event(Event.SLID, {\n relatedTarget: nextElement,\n direction: eventDirectionName,\n from: activeElementIndex,\n to: nextElementIndex\n })\n\n if ($(this._element).hasClass(ClassName.SLIDE)) {\n $(nextElement).addClass(orderClassName)\n\n Util.reflow(nextElement)\n\n $(activeElement).addClass(directionalClassName)\n $(nextElement).addClass(directionalClassName)\n\n const nextElementInterval = parseInt(nextElement.getAttribute('data-interval'), 10)\n if (nextElementInterval) {\n this._config.defaultInterval = this._config.defaultInterval || this._config.interval\n this._config.interval = nextElementInterval\n } else {\n this._config.interval = 
this._config.defaultInterval || this._config.interval\n }\n\n const transitionDuration = Util.getTransitionDurationFromElement(activeElement)\n\n $(activeElement)\n .one(Util.TRANSITION_END, () => {\n $(nextElement)\n .removeClass(`${directionalClassName} ${orderClassName}`)\n .addClass(ClassName.ACTIVE)\n\n $(activeElement).removeClass(`${ClassName.ACTIVE} ${orderClassName} ${directionalClassName}`)\n\n this._isSliding = false\n\n setTimeout(() => $(this._element).trigger(slidEvent), 0)\n })\n .emulateTransitionEnd(transitionDuration)\n } else {\n $(activeElement).removeClass(ClassName.ACTIVE)\n $(nextElement).addClass(ClassName.ACTIVE)\n\n this._isSliding = false\n $(this._element).trigger(slidEvent)\n }\n\n if (isCycling) {\n this.cycle()\n }\n }\n\n // Static\n\n static _jQueryInterface(config) {\n return this.each(function () {\n let data = $(this).data(DATA_KEY)\n let _config = {\n ...Default,\n ...$(this).data()\n }\n\n if (typeof config === 'object') {\n _config = {\n ..._config,\n ...config\n }\n }\n\n const action = typeof config === 'string' ? config : _config.slide\n\n if (!data) {\n data = new Carousel(this, _config)\n $(this).data(DATA_KEY, data)\n }\n\n if (typeof config === 'number') {\n data.to(config)\n } else if (typeof action === 'string') {\n if (typeof data[action] === 'undefined') {\n throw new TypeError(`No method named \"${action}\"`)\n }\n data[action]()\n } else if (_config.interval && _config.ride) {\n data.pause()\n data.cycle()\n }\n })\n }\n\n static _dataApiClickHandler(event) {\n const selector = Util.getSelectorFromElement(this)\n\n if (!selector) {\n return\n }\n\n const target = $(selector)[0]\n\n if (!target || !$(target).hasClass(ClassName.CAROUSEL)) {\n return\n }\n\n const config = {\n ...$(target).data(),\n ...$(this).data()\n }\n const slideIndex = this.getAttribute('data-slide-to')\n\n if (slideIndex) {\n config.interval = false\n }\n\n Carousel._jQueryInterface.call($(target), config)\n\n if (slideIndex) {\n $(target).data(DATA_KEY).to(slideIndex)\n }\n\n event.preventDefault()\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\n$(document)\n .on(Event.CLICK_DATA_API, Selector.DATA_SLIDE, Carousel._dataApiClickHandler)\n\n$(window).on(Event.LOAD_DATA_API, () => {\n const carousels = [].slice.call(document.querySelectorAll(Selector.DATA_RIDE))\n for (let i = 0, len = carousels.length; i < len; i++) {\n const $carousel = $(carousels[i])\n Carousel._jQueryInterface.call($carousel, $carousel.data())\n }\n})\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Carousel._jQueryInterface\n$.fn[NAME].Constructor = Carousel\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Carousel._jQueryInterface\n}\n\nexport default Carousel\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): collapse.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\nimport Util from './util'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n 
*/\n\nconst NAME = 'collapse'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.collapse'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\n\nconst Default = {\n toggle : true,\n parent : ''\n}\n\nconst DefaultType = {\n toggle : 'boolean',\n parent : '(string|element)'\n}\n\nconst Event = {\n SHOW : `show${EVENT_KEY}`,\n SHOWN : `shown${EVENT_KEY}`,\n HIDE : `hide${EVENT_KEY}`,\n HIDDEN : `hidden${EVENT_KEY}`,\n CLICK_DATA_API : `click${EVENT_KEY}${DATA_API_KEY}`\n}\n\nconst ClassName = {\n SHOW : 'show',\n COLLAPSE : 'collapse',\n COLLAPSING : 'collapsing',\n COLLAPSED : 'collapsed'\n}\n\nconst Dimension = {\n WIDTH : 'width',\n HEIGHT : 'height'\n}\n\nconst Selector = {\n ACTIVES : '.show, .collapsing',\n DATA_TOGGLE : '[data-toggle=\"collapse\"]'\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Collapse {\n constructor(element, config) {\n this._isTransitioning = false\n this._element = element\n this._config = this._getConfig(config)\n this._triggerArray = [].slice.call(document.querySelectorAll(\n `[data-toggle=\"collapse\"][href=\"#${element.id}\"],` +\n `[data-toggle=\"collapse\"][data-target=\"#${element.id}\"]`\n ))\n\n const toggleList = [].slice.call(document.querySelectorAll(Selector.DATA_TOGGLE))\n for (let i = 0, len = toggleList.length; i < len; i++) {\n const elem = toggleList[i]\n const selector = Util.getSelectorFromElement(elem)\n const filterElement = [].slice.call(document.querySelectorAll(selector))\n .filter((foundElem) => foundElem === element)\n\n if (selector !== null && filterElement.length > 0) {\n this._selector = selector\n this._triggerArray.push(elem)\n }\n }\n\n this._parent = this._config.parent ? 
this._getParent() : null\n\n if (!this._config.parent) {\n this._addAriaAndCollapsedClass(this._element, this._triggerArray)\n }\n\n if (this._config.toggle) {\n this.toggle()\n }\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n static get Default() {\n return Default\n }\n\n // Public\n\n toggle() {\n if ($(this._element).hasClass(ClassName.SHOW)) {\n this.hide()\n } else {\n this.show()\n }\n }\n\n show() {\n if (this._isTransitioning ||\n $(this._element).hasClass(ClassName.SHOW)) {\n return\n }\n\n let actives\n let activesData\n\n if (this._parent) {\n actives = [].slice.call(this._parent.querySelectorAll(Selector.ACTIVES))\n .filter((elem) => {\n if (typeof this._config.parent === 'string') {\n return elem.getAttribute('data-parent') === this._config.parent\n }\n\n return elem.classList.contains(ClassName.COLLAPSE)\n })\n\n if (actives.length === 0) {\n actives = null\n }\n }\n\n if (actives) {\n activesData = $(actives).not(this._selector).data(DATA_KEY)\n if (activesData && activesData._isTransitioning) {\n return\n }\n }\n\n const startEvent = $.Event(Event.SHOW)\n $(this._element).trigger(startEvent)\n if (startEvent.isDefaultPrevented()) {\n return\n }\n\n if (actives) {\n Collapse._jQueryInterface.call($(actives).not(this._selector), 'hide')\n if (!activesData) {\n $(actives).data(DATA_KEY, null)\n }\n }\n\n const dimension = this._getDimension()\n\n $(this._element)\n .removeClass(ClassName.COLLAPSE)\n .addClass(ClassName.COLLAPSING)\n\n this._element.style[dimension] = 0\n\n if (this._triggerArray.length) {\n $(this._triggerArray)\n .removeClass(ClassName.COLLAPSED)\n .attr('aria-expanded', true)\n }\n\n this.setTransitioning(true)\n\n const complete = () => {\n $(this._element)\n .removeClass(ClassName.COLLAPSING)\n .addClass(ClassName.COLLAPSE)\n .addClass(ClassName.SHOW)\n\n this._element.style[dimension] = ''\n\n this.setTransitioning(false)\n\n $(this._element).trigger(Event.SHOWN)\n }\n\n const capitalizedDimension = dimension[0].toUpperCase() + dimension.slice(1)\n const scrollSize = `scroll${capitalizedDimension}`\n const transitionDuration = Util.getTransitionDurationFromElement(this._element)\n\n $(this._element)\n .one(Util.TRANSITION_END, complete)\n .emulateTransitionEnd(transitionDuration)\n\n this._element.style[dimension] = `${this._element[scrollSize]}px`\n }\n\n hide() {\n if (this._isTransitioning ||\n !$(this._element).hasClass(ClassName.SHOW)) {\n return\n }\n\n const startEvent = $.Event(Event.HIDE)\n $(this._element).trigger(startEvent)\n if (startEvent.isDefaultPrevented()) {\n return\n }\n\n const dimension = this._getDimension()\n\n this._element.style[dimension] = `${this._element.getBoundingClientRect()[dimension]}px`\n\n Util.reflow(this._element)\n\n $(this._element)\n .addClass(ClassName.COLLAPSING)\n .removeClass(ClassName.COLLAPSE)\n .removeClass(ClassName.SHOW)\n\n const triggerArrayLength = this._triggerArray.length\n if (triggerArrayLength > 0) {\n for (let i = 0; i < triggerArrayLength; i++) {\n const trigger = this._triggerArray[i]\n const selector = Util.getSelectorFromElement(trigger)\n\n if (selector !== null) {\n const $elem = $([].slice.call(document.querySelectorAll(selector)))\n if (!$elem.hasClass(ClassName.SHOW)) {\n $(trigger).addClass(ClassName.COLLAPSED)\n .attr('aria-expanded', false)\n }\n }\n }\n }\n\n this.setTransitioning(true)\n\n const complete = () => {\n this.setTransitioning(false)\n $(this._element)\n .removeClass(ClassName.COLLAPSING)\n .addClass(ClassName.COLLAPSE)\n .trigger(Event.HIDDEN)\n 
}\n\n this._element.style[dimension] = ''\n const transitionDuration = Util.getTransitionDurationFromElement(this._element)\n\n $(this._element)\n .one(Util.TRANSITION_END, complete)\n .emulateTransitionEnd(transitionDuration)\n }\n\n setTransitioning(isTransitioning) {\n this._isTransitioning = isTransitioning\n }\n\n dispose() {\n $.removeData(this._element, DATA_KEY)\n\n this._config = null\n this._parent = null\n this._element = null\n this._triggerArray = null\n this._isTransitioning = null\n }\n\n // Private\n\n _getConfig(config) {\n config = {\n ...Default,\n ...config\n }\n config.toggle = Boolean(config.toggle) // Coerce string values\n Util.typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n\n _getDimension() {\n const hasWidth = $(this._element).hasClass(Dimension.WIDTH)\n return hasWidth ? Dimension.WIDTH : Dimension.HEIGHT\n }\n\n _getParent() {\n let parent\n\n if (Util.isElement(this._config.parent)) {\n parent = this._config.parent\n\n // It's a jQuery object\n if (typeof this._config.parent.jquery !== 'undefined') {\n parent = this._config.parent[0]\n }\n } else {\n parent = document.querySelector(this._config.parent)\n }\n\n const selector =\n `[data-toggle=\"collapse\"][data-parent=\"${this._config.parent}\"]`\n\n const children = [].slice.call(parent.querySelectorAll(selector))\n $(children).each((i, element) => {\n this._addAriaAndCollapsedClass(\n Collapse._getTargetFromElement(element),\n [element]\n )\n })\n\n return parent\n }\n\n _addAriaAndCollapsedClass(element, triggerArray) {\n const isOpen = $(element).hasClass(ClassName.SHOW)\n\n if (triggerArray.length) {\n $(triggerArray)\n .toggleClass(ClassName.COLLAPSED, !isOpen)\n .attr('aria-expanded', isOpen)\n }\n }\n\n // Static\n\n static _getTargetFromElement(element) {\n const selector = Util.getSelectorFromElement(element)\n return selector ? document.querySelector(selector) : null\n }\n\n static _jQueryInterface(config) {\n return this.each(function () {\n const $this = $(this)\n let data = $this.data(DATA_KEY)\n const _config = {\n ...Default,\n ...$this.data(),\n ...typeof config === 'object' && config ? config : {}\n }\n\n if (!data && _config.toggle && /show|hide/.test(config)) {\n _config.toggle = false\n }\n\n if (!data) {\n data = new Collapse(this, _config)\n $this.data(DATA_KEY, data)\n }\n\n if (typeof config === 'string') {\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n data[config]()\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\n$(document).on(Event.CLICK_DATA_API, Selector.DATA_TOGGLE, function (event) {\n // preventDefault only for elements (which change the URL) not inside the collapsible element\n if (event.currentTarget.tagName === 'A') {\n event.preventDefault()\n }\n\n const $trigger = $(this)\n const selector = Util.getSelectorFromElement(this)\n const selectors = [].slice.call(document.querySelectorAll(selector))\n\n $(selectors).each(function () {\n const $target = $(this)\n const data = $target.data(DATA_KEY)\n const config = data ? 
'toggle' : $trigger.data()\n Collapse._jQueryInterface.call($target, config)\n })\n})\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Collapse._jQueryInterface\n$.fn[NAME].Constructor = Collapse\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Collapse._jQueryInterface\n}\n\nexport default Collapse\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): dropdown.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\nimport Popper from 'popper.js'\nimport Util from './util'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'dropdown'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.dropdown'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\nconst ESCAPE_KEYCODE = 27 // KeyboardEvent.which value for Escape (Esc) key\nconst SPACE_KEYCODE = 32 // KeyboardEvent.which value for space key\nconst TAB_KEYCODE = 9 // KeyboardEvent.which value for tab key\nconst ARROW_UP_KEYCODE = 38 // KeyboardEvent.which value for up arrow key\nconst ARROW_DOWN_KEYCODE = 40 // KeyboardEvent.which value for down arrow key\nconst RIGHT_MOUSE_BUTTON_WHICH = 3 // MouseEvent.which value for the right button (assuming a right-handed mouse)\nconst REGEXP_KEYDOWN = new RegExp(`${ARROW_UP_KEYCODE}|${ARROW_DOWN_KEYCODE}|${ESCAPE_KEYCODE}`)\n\nconst Event = {\n HIDE : `hide${EVENT_KEY}`,\n HIDDEN : `hidden${EVENT_KEY}`,\n SHOW : `show${EVENT_KEY}`,\n SHOWN : `shown${EVENT_KEY}`,\n CLICK : `click${EVENT_KEY}`,\n CLICK_DATA_API : `click${EVENT_KEY}${DATA_API_KEY}`,\n KEYDOWN_DATA_API : `keydown${EVENT_KEY}${DATA_API_KEY}`,\n KEYUP_DATA_API : `keyup${EVENT_KEY}${DATA_API_KEY}`\n}\n\nconst ClassName = {\n DISABLED : 'disabled',\n SHOW : 'show',\n DROPUP : 'dropup',\n DROPRIGHT : 'dropright',\n DROPLEFT : 'dropleft',\n MENURIGHT : 'dropdown-menu-right',\n MENULEFT : 'dropdown-menu-left',\n POSITION_STATIC : 'position-static'\n}\n\nconst Selector = {\n DATA_TOGGLE : '[data-toggle=\"dropdown\"]',\n FORM_CHILD : '.dropdown form',\n MENU : '.dropdown-menu',\n NAVBAR_NAV : '.navbar-nav',\n VISIBLE_ITEMS : '.dropdown-menu .dropdown-item:not(.disabled):not(:disabled)'\n}\n\nconst AttachmentMap = {\n TOP : 'top-start',\n TOPEND : 'top-end',\n BOTTOM : 'bottom-start',\n BOTTOMEND : 'bottom-end',\n RIGHT : 'right-start',\n RIGHTEND : 'right-end',\n LEFT : 'left-start',\n LEFTEND : 'left-end'\n}\n\nconst Default = {\n offset : 0,\n flip : true,\n boundary : 'scrollParent',\n reference : 'toggle',\n display : 'dynamic'\n}\n\nconst DefaultType = {\n offset : '(number|string|function)',\n flip : 'boolean',\n boundary : '(string|element)',\n reference : '(string|element)',\n display : 'string'\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Dropdown {\n constructor(element, config) {\n this._element = element\n this._popper = null\n this._config = this._getConfig(config)\n this._menu = this._getMenuElement()\n this._inNavbar = 
this._detectNavbar()\n\n this._addEventListeners()\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n static get Default() {\n return Default\n }\n\n static get DefaultType() {\n return DefaultType\n }\n\n // Public\n\n toggle() {\n if (this._element.disabled || $(this._element).hasClass(ClassName.DISABLED)) {\n return\n }\n\n const parent = Dropdown._getParentFromElement(this._element)\n const isActive = $(this._menu).hasClass(ClassName.SHOW)\n\n Dropdown._clearMenus()\n\n if (isActive) {\n return\n }\n\n const relatedTarget = {\n relatedTarget: this._element\n }\n const showEvent = $.Event(Event.SHOW, relatedTarget)\n\n $(parent).trigger(showEvent)\n\n if (showEvent.isDefaultPrevented()) {\n return\n }\n\n // Disable totally Popper.js for Dropdown in Navbar\n if (!this._inNavbar) {\n /**\n * Check for Popper dependency\n * Popper - https://popper.js.org\n */\n if (typeof Popper === 'undefined') {\n throw new TypeError('Bootstrap\\'s dropdowns require Popper.js (https://popper.js.org/)')\n }\n\n let referenceElement = this._element\n\n if (this._config.reference === 'parent') {\n referenceElement = parent\n } else if (Util.isElement(this._config.reference)) {\n referenceElement = this._config.reference\n\n // Check if it's jQuery element\n if (typeof this._config.reference.jquery !== 'undefined') {\n referenceElement = this._config.reference[0]\n }\n }\n\n // If boundary is not `scrollParent`, then set position to `static`\n // to allow the menu to \"escape\" the scroll parent's boundaries\n // https://github.com/twbs/bootstrap/issues/24251\n if (this._config.boundary !== 'scrollParent') {\n $(parent).addClass(ClassName.POSITION_STATIC)\n }\n this._popper = new Popper(referenceElement, this._menu, this._getPopperConfig())\n }\n\n // If this is a touch-enabled device we add extra\n // empty mouseover listeners to the body's immediate children;\n // only needed because of broken event delegation on iOS\n // https://www.quirksmode.org/blog/archives/2014/02/mouse_event_bub.html\n if ('ontouchstart' in document.documentElement &&\n $(parent).closest(Selector.NAVBAR_NAV).length === 0) {\n $(document.body).children().on('mouseover', null, $.noop)\n }\n\n this._element.focus()\n this._element.setAttribute('aria-expanded', true)\n\n $(this._menu).toggleClass(ClassName.SHOW)\n $(parent)\n .toggleClass(ClassName.SHOW)\n .trigger($.Event(Event.SHOWN, relatedTarget))\n }\n\n show() {\n if (this._element.disabled || $(this._element).hasClass(ClassName.DISABLED) || $(this._menu).hasClass(ClassName.SHOW)) {\n return\n }\n\n const relatedTarget = {\n relatedTarget: this._element\n }\n const showEvent = $.Event(Event.SHOW, relatedTarget)\n const parent = Dropdown._getParentFromElement(this._element)\n\n $(parent).trigger(showEvent)\n\n if (showEvent.isDefaultPrevented()) {\n return\n }\n\n $(this._menu).toggleClass(ClassName.SHOW)\n $(parent)\n .toggleClass(ClassName.SHOW)\n .trigger($.Event(Event.SHOWN, relatedTarget))\n }\n\n hide() {\n if (this._element.disabled || $(this._element).hasClass(ClassName.DISABLED) || !$(this._menu).hasClass(ClassName.SHOW)) {\n return\n }\n\n const relatedTarget = {\n relatedTarget: this._element\n }\n const hideEvent = $.Event(Event.HIDE, relatedTarget)\n const parent = Dropdown._getParentFromElement(this._element)\n\n $(parent).trigger(hideEvent)\n\n if (hideEvent.isDefaultPrevented()) {\n return\n }\n\n $(this._menu).toggleClass(ClassName.SHOW)\n $(parent)\n .toggleClass(ClassName.SHOW)\n .trigger($.Event(Event.HIDDEN, relatedTarget))\n }\n\n 
dispose() {\n $.removeData(this._element, DATA_KEY)\n $(this._element).off(EVENT_KEY)\n this._element = null\n this._menu = null\n if (this._popper !== null) {\n this._popper.destroy()\n this._popper = null\n }\n }\n\n update() {\n this._inNavbar = this._detectNavbar()\n if (this._popper !== null) {\n this._popper.scheduleUpdate()\n }\n }\n\n // Private\n\n _addEventListeners() {\n $(this._element).on(Event.CLICK, (event) => {\n event.preventDefault()\n event.stopPropagation()\n this.toggle()\n })\n }\n\n _getConfig(config) {\n config = {\n ...this.constructor.Default,\n ...$(this._element).data(),\n ...config\n }\n\n Util.typeCheckConfig(\n NAME,\n config,\n this.constructor.DefaultType\n )\n\n return config\n }\n\n _getMenuElement() {\n if (!this._menu) {\n const parent = Dropdown._getParentFromElement(this._element)\n\n if (parent) {\n this._menu = parent.querySelector(Selector.MENU)\n }\n }\n return this._menu\n }\n\n _getPlacement() {\n const $parentDropdown = $(this._element.parentNode)\n let placement = AttachmentMap.BOTTOM\n\n // Handle dropup\n if ($parentDropdown.hasClass(ClassName.DROPUP)) {\n placement = AttachmentMap.TOP\n if ($(this._menu).hasClass(ClassName.MENURIGHT)) {\n placement = AttachmentMap.TOPEND\n }\n } else if ($parentDropdown.hasClass(ClassName.DROPRIGHT)) {\n placement = AttachmentMap.RIGHT\n } else if ($parentDropdown.hasClass(ClassName.DROPLEFT)) {\n placement = AttachmentMap.LEFT\n } else if ($(this._menu).hasClass(ClassName.MENURIGHT)) {\n placement = AttachmentMap.BOTTOMEND\n }\n return placement\n }\n\n _detectNavbar() {\n return $(this._element).closest('.navbar').length > 0\n }\n\n _getOffset() {\n const offset = {}\n\n if (typeof this._config.offset === 'function') {\n offset.fn = (data) => {\n data.offsets = {\n ...data.offsets,\n ...this._config.offset(data.offsets, this._element) || {}\n }\n\n return data\n }\n } else {\n offset.offset = this._config.offset\n }\n\n return offset\n }\n\n _getPopperConfig() {\n const popperConfig = {\n placement: this._getPlacement(),\n modifiers: {\n offset: this._getOffset(),\n flip: {\n enabled: this._config.flip\n },\n preventOverflow: {\n boundariesElement: this._config.boundary\n }\n }\n }\n\n // Disable Popper.js if we have a static display\n if (this._config.display === 'static') {\n popperConfig.modifiers.applyStyle = {\n enabled: false\n }\n }\n\n return popperConfig\n }\n\n // Static\n\n static _jQueryInterface(config) {\n return this.each(function () {\n let data = $(this).data(DATA_KEY)\n const _config = typeof config === 'object' ? 
config : null\n\n if (!data) {\n data = new Dropdown(this, _config)\n $(this).data(DATA_KEY, data)\n }\n\n if (typeof config === 'string') {\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n data[config]()\n }\n })\n }\n\n static _clearMenus(event) {\n if (event && (event.which === RIGHT_MOUSE_BUTTON_WHICH ||\n event.type === 'keyup' && event.which !== TAB_KEYCODE)) {\n return\n }\n\n const toggles = [].slice.call(document.querySelectorAll(Selector.DATA_TOGGLE))\n\n for (let i = 0, len = toggles.length; i < len; i++) {\n const parent = Dropdown._getParentFromElement(toggles[i])\n const context = $(toggles[i]).data(DATA_KEY)\n const relatedTarget = {\n relatedTarget: toggles[i]\n }\n\n if (event && event.type === 'click') {\n relatedTarget.clickEvent = event\n }\n\n if (!context) {\n continue\n }\n\n const dropdownMenu = context._menu\n if (!$(parent).hasClass(ClassName.SHOW)) {\n continue\n }\n\n if (event && (event.type === 'click' &&\n /input|textarea/i.test(event.target.tagName) || event.type === 'keyup' && event.which === TAB_KEYCODE) &&\n $.contains(parent, event.target)) {\n continue\n }\n\n const hideEvent = $.Event(Event.HIDE, relatedTarget)\n $(parent).trigger(hideEvent)\n if (hideEvent.isDefaultPrevented()) {\n continue\n }\n\n // If this is a touch-enabled device we remove the extra\n // empty mouseover listeners we added for iOS support\n if ('ontouchstart' in document.documentElement) {\n $(document.body).children().off('mouseover', null, $.noop)\n }\n\n toggles[i].setAttribute('aria-expanded', 'false')\n\n $(dropdownMenu).removeClass(ClassName.SHOW)\n $(parent)\n .removeClass(ClassName.SHOW)\n .trigger($.Event(Event.HIDDEN, relatedTarget))\n }\n }\n\n static _getParentFromElement(element) {\n let parent\n const selector = Util.getSelectorFromElement(element)\n\n if (selector) {\n parent = document.querySelector(selector)\n }\n\n return parent || element.parentNode\n }\n\n // eslint-disable-next-line complexity\n static _dataApiKeydownHandler(event) {\n // If not input/textarea:\n // - And not a key in REGEXP_KEYDOWN => not a dropdown command\n // If input/textarea:\n // - If space key => not a dropdown command\n // - If key is other than escape\n // - If key is not up or down => not a dropdown command\n // - If trigger inside the menu => not a dropdown command\n if (/input|textarea/i.test(event.target.tagName)\n ? 
event.which === SPACE_KEYCODE || event.which !== ESCAPE_KEYCODE &&\n (event.which !== ARROW_DOWN_KEYCODE && event.which !== ARROW_UP_KEYCODE ||\n $(event.target).closest(Selector.MENU).length) : !REGEXP_KEYDOWN.test(event.which)) {\n return\n }\n\n event.preventDefault()\n event.stopPropagation()\n\n if (this.disabled || $(this).hasClass(ClassName.DISABLED)) {\n return\n }\n\n const parent = Dropdown._getParentFromElement(this)\n const isActive = $(parent).hasClass(ClassName.SHOW)\n\n if (!isActive || isActive && (event.which === ESCAPE_KEYCODE || event.which === SPACE_KEYCODE)) {\n if (event.which === ESCAPE_KEYCODE) {\n const toggle = parent.querySelector(Selector.DATA_TOGGLE)\n $(toggle).trigger('focus')\n }\n\n $(this).trigger('click')\n return\n }\n\n const items = [].slice.call(parent.querySelectorAll(Selector.VISIBLE_ITEMS))\n\n if (items.length === 0) {\n return\n }\n\n let index = items.indexOf(event.target)\n\n if (event.which === ARROW_UP_KEYCODE && index > 0) { // Up\n index--\n }\n\n if (event.which === ARROW_DOWN_KEYCODE && index < items.length - 1) { // Down\n index++\n }\n\n if (index < 0) {\n index = 0\n }\n\n items[index].focus()\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\n$(document)\n .on(Event.KEYDOWN_DATA_API, Selector.DATA_TOGGLE, Dropdown._dataApiKeydownHandler)\n .on(Event.KEYDOWN_DATA_API, Selector.MENU, Dropdown._dataApiKeydownHandler)\n .on(`${Event.CLICK_DATA_API} ${Event.KEYUP_DATA_API}`, Dropdown._clearMenus)\n .on(Event.CLICK_DATA_API, Selector.DATA_TOGGLE, function (event) {\n event.preventDefault()\n event.stopPropagation()\n Dropdown._jQueryInterface.call($(this), 'toggle')\n })\n .on(Event.CLICK_DATA_API, Selector.FORM_CHILD, (e) => {\n e.stopPropagation()\n })\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Dropdown._jQueryInterface\n$.fn[NAME].Constructor = Dropdown\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Dropdown._jQueryInterface\n}\n\n\nexport default Dropdown\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): modal.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\nimport Util from './util'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'modal'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.modal'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\nconst ESCAPE_KEYCODE = 27 // KeyboardEvent.which value for Escape (Esc) key\n\nconst Default = {\n backdrop : true,\n keyboard : true,\n focus : true,\n show : true\n}\n\nconst DefaultType = {\n backdrop : '(boolean|string)',\n keyboard : 'boolean',\n focus : 'boolean',\n show : 'boolean'\n}\n\nconst Event = {\n HIDE : `hide${EVENT_KEY}`,\n HIDDEN : `hidden${EVENT_KEY}`,\n SHOW : `show${EVENT_KEY}`,\n SHOWN : `shown${EVENT_KEY}`,\n FOCUSIN : `focusin${EVENT_KEY}`,\n RESIZE : `resize${EVENT_KEY}`,\n CLICK_DISMISS : `click.dismiss${EVENT_KEY}`,\n KEYDOWN_DISMISS : 
`keydown.dismiss${EVENT_KEY}`,\n MOUSEUP_DISMISS : `mouseup.dismiss${EVENT_KEY}`,\n MOUSEDOWN_DISMISS : `mousedown.dismiss${EVENT_KEY}`,\n CLICK_DATA_API : `click${EVENT_KEY}${DATA_API_KEY}`\n}\n\nconst ClassName = {\n SCROLLABLE : 'modal-dialog-scrollable',\n SCROLLBAR_MEASURER : 'modal-scrollbar-measure',\n BACKDROP : 'modal-backdrop',\n OPEN : 'modal-open',\n FADE : 'fade',\n SHOW : 'show'\n}\n\nconst Selector = {\n DIALOG : '.modal-dialog',\n MODAL_BODY : '.modal-body',\n DATA_TOGGLE : '[data-toggle=\"modal\"]',\n DATA_DISMISS : '[data-dismiss=\"modal\"]',\n FIXED_CONTENT : '.fixed-top, .fixed-bottom, .is-fixed, .sticky-top',\n STICKY_CONTENT : '.sticky-top'\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Modal {\n constructor(element, config) {\n this._config = this._getConfig(config)\n this._element = element\n this._dialog = element.querySelector(Selector.DIALOG)\n this._backdrop = null\n this._isShown = false\n this._isBodyOverflowing = false\n this._ignoreBackdropClick = false\n this._isTransitioning = false\n this._scrollbarWidth = 0\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n static get Default() {\n return Default\n }\n\n // Public\n\n toggle(relatedTarget) {\n return this._isShown ? this.hide() : this.show(relatedTarget)\n }\n\n show(relatedTarget) {\n if (this._isShown || this._isTransitioning) {\n return\n }\n\n if ($(this._element).hasClass(ClassName.FADE)) {\n this._isTransitioning = true\n }\n\n const showEvent = $.Event(Event.SHOW, {\n relatedTarget\n })\n\n $(this._element).trigger(showEvent)\n\n if (this._isShown || showEvent.isDefaultPrevented()) {\n return\n }\n\n this._isShown = true\n\n this._checkScrollbar()\n this._setScrollbar()\n\n this._adjustDialog()\n\n this._setEscapeEvent()\n this._setResizeEvent()\n\n $(this._element).on(\n Event.CLICK_DISMISS,\n Selector.DATA_DISMISS,\n (event) => this.hide(event)\n )\n\n $(this._dialog).on(Event.MOUSEDOWN_DISMISS, () => {\n $(this._element).one(Event.MOUSEUP_DISMISS, (event) => {\n if ($(event.target).is(this._element)) {\n this._ignoreBackdropClick = true\n }\n })\n })\n\n this._showBackdrop(() => this._showElement(relatedTarget))\n }\n\n hide(event) {\n if (event) {\n event.preventDefault()\n }\n\n if (!this._isShown || this._isTransitioning) {\n return\n }\n\n const hideEvent = $.Event(Event.HIDE)\n\n $(this._element).trigger(hideEvent)\n\n if (!this._isShown || hideEvent.isDefaultPrevented()) {\n return\n }\n\n this._isShown = false\n const transition = $(this._element).hasClass(ClassName.FADE)\n\n if (transition) {\n this._isTransitioning = true\n }\n\n this._setEscapeEvent()\n this._setResizeEvent()\n\n $(document).off(Event.FOCUSIN)\n\n $(this._element).removeClass(ClassName.SHOW)\n\n $(this._element).off(Event.CLICK_DISMISS)\n $(this._dialog).off(Event.MOUSEDOWN_DISMISS)\n\n\n if (transition) {\n const transitionDuration = Util.getTransitionDurationFromElement(this._element)\n\n $(this._element)\n .one(Util.TRANSITION_END, (event) => this._hideModal(event))\n .emulateTransitionEnd(transitionDuration)\n } else {\n this._hideModal()\n }\n }\n\n dispose() {\n [window, this._element, this._dialog]\n .forEach((htmlElement) => $(htmlElement).off(EVENT_KEY))\n\n /**\n * `document` has 2 events `Event.FOCUSIN` and `Event.CLICK_DATA_API`\n * Do not move `document` in `htmlElements` array\n * It will remove `Event.CLICK_DATA_API` event that should 
remain\n */\n $(document).off(Event.FOCUSIN)\n\n $.removeData(this._element, DATA_KEY)\n\n this._config = null\n this._element = null\n this._dialog = null\n this._backdrop = null\n this._isShown = null\n this._isBodyOverflowing = null\n this._ignoreBackdropClick = null\n this._isTransitioning = null\n this._scrollbarWidth = null\n }\n\n handleUpdate() {\n this._adjustDialog()\n }\n\n // Private\n\n _getConfig(config) {\n config = {\n ...Default,\n ...config\n }\n Util.typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n\n _showElement(relatedTarget) {\n const transition = $(this._element).hasClass(ClassName.FADE)\n\n if (!this._element.parentNode ||\n this._element.parentNode.nodeType !== Node.ELEMENT_NODE) {\n // Don't move modal's DOM position\n document.body.appendChild(this._element)\n }\n\n this._element.style.display = 'block'\n this._element.removeAttribute('aria-hidden')\n this._element.setAttribute('aria-modal', true)\n\n if ($(this._dialog).hasClass(ClassName.SCROLLABLE)) {\n this._dialog.querySelector(Selector.MODAL_BODY).scrollTop = 0\n } else {\n this._element.scrollTop = 0\n }\n\n if (transition) {\n Util.reflow(this._element)\n }\n\n $(this._element).addClass(ClassName.SHOW)\n\n if (this._config.focus) {\n this._enforceFocus()\n }\n\n const shownEvent = $.Event(Event.SHOWN, {\n relatedTarget\n })\n\n const transitionComplete = () => {\n if (this._config.focus) {\n this._element.focus()\n }\n this._isTransitioning = false\n $(this._element).trigger(shownEvent)\n }\n\n if (transition) {\n const transitionDuration = Util.getTransitionDurationFromElement(this._dialog)\n\n $(this._dialog)\n .one(Util.TRANSITION_END, transitionComplete)\n .emulateTransitionEnd(transitionDuration)\n } else {\n transitionComplete()\n }\n }\n\n _enforceFocus() {\n $(document)\n .off(Event.FOCUSIN) // Guard against infinite focus loop\n .on(Event.FOCUSIN, (event) => {\n if (document !== event.target &&\n this._element !== event.target &&\n $(this._element).has(event.target).length === 0) {\n this._element.focus()\n }\n })\n }\n\n _setEscapeEvent() {\n if (this._isShown && this._config.keyboard) {\n $(this._element).on(Event.KEYDOWN_DISMISS, (event) => {\n if (event.which === ESCAPE_KEYCODE) {\n event.preventDefault()\n this.hide()\n }\n })\n } else if (!this._isShown) {\n $(this._element).off(Event.KEYDOWN_DISMISS)\n }\n }\n\n _setResizeEvent() {\n if (this._isShown) {\n $(window).on(Event.RESIZE, (event) => this.handleUpdate(event))\n } else {\n $(window).off(Event.RESIZE)\n }\n }\n\n _hideModal() {\n this._element.style.display = 'none'\n this._element.setAttribute('aria-hidden', true)\n this._element.removeAttribute('aria-modal')\n this._isTransitioning = false\n this._showBackdrop(() => {\n $(document.body).removeClass(ClassName.OPEN)\n this._resetAdjustments()\n this._resetScrollbar()\n $(this._element).trigger(Event.HIDDEN)\n })\n }\n\n _removeBackdrop() {\n if (this._backdrop) {\n $(this._backdrop).remove()\n this._backdrop = null\n }\n }\n\n _showBackdrop(callback) {\n const animate = $(this._element).hasClass(ClassName.FADE)\n ? 
ClassName.FADE : ''\n\n if (this._isShown && this._config.backdrop) {\n this._backdrop = document.createElement('div')\n this._backdrop.className = ClassName.BACKDROP\n\n if (animate) {\n this._backdrop.classList.add(animate)\n }\n\n $(this._backdrop).appendTo(document.body)\n\n $(this._element).on(Event.CLICK_DISMISS, (event) => {\n if (this._ignoreBackdropClick) {\n this._ignoreBackdropClick = false\n return\n }\n if (event.target !== event.currentTarget) {\n return\n }\n if (this._config.backdrop === 'static') {\n this._element.focus()\n } else {\n this.hide()\n }\n })\n\n if (animate) {\n Util.reflow(this._backdrop)\n }\n\n $(this._backdrop).addClass(ClassName.SHOW)\n\n if (!callback) {\n return\n }\n\n if (!animate) {\n callback()\n return\n }\n\n const backdropTransitionDuration = Util.getTransitionDurationFromElement(this._backdrop)\n\n $(this._backdrop)\n .one(Util.TRANSITION_END, callback)\n .emulateTransitionEnd(backdropTransitionDuration)\n } else if (!this._isShown && this._backdrop) {\n $(this._backdrop).removeClass(ClassName.SHOW)\n\n const callbackRemove = () => {\n this._removeBackdrop()\n if (callback) {\n callback()\n }\n }\n\n if ($(this._element).hasClass(ClassName.FADE)) {\n const backdropTransitionDuration = Util.getTransitionDurationFromElement(this._backdrop)\n\n $(this._backdrop)\n .one(Util.TRANSITION_END, callbackRemove)\n .emulateTransitionEnd(backdropTransitionDuration)\n } else {\n callbackRemove()\n }\n } else if (callback) {\n callback()\n }\n }\n\n // ----------------------------------------------------------------------\n // the following methods are used to handle overflowing modals\n // todo (fat): these should probably be refactored out of modal.js\n // ----------------------------------------------------------------------\n\n _adjustDialog() {\n const isModalOverflowing =\n this._element.scrollHeight > document.documentElement.clientHeight\n\n if (!this._isBodyOverflowing && isModalOverflowing) {\n this._element.style.paddingLeft = `${this._scrollbarWidth}px`\n }\n\n if (this._isBodyOverflowing && !isModalOverflowing) {\n this._element.style.paddingRight = `${this._scrollbarWidth}px`\n }\n }\n\n _resetAdjustments() {\n this._element.style.paddingLeft = ''\n this._element.style.paddingRight = ''\n }\n\n _checkScrollbar() {\n const rect = document.body.getBoundingClientRect()\n this._isBodyOverflowing = rect.left + rect.right < window.innerWidth\n this._scrollbarWidth = this._getScrollbarWidth()\n }\n\n _setScrollbar() {\n if (this._isBodyOverflowing) {\n // Note: DOMNode.style.paddingRight returns the actual value or '' if not set\n // while $(DOMNode).css('padding-right') returns the calculated value or 0 if not set\n const fixedContent = [].slice.call(document.querySelectorAll(Selector.FIXED_CONTENT))\n const stickyContent = [].slice.call(document.querySelectorAll(Selector.STICKY_CONTENT))\n\n // Adjust fixed content padding\n $(fixedContent).each((index, element) => {\n const actualPadding = element.style.paddingRight\n const calculatedPadding = $(element).css('padding-right')\n $(element)\n .data('padding-right', actualPadding)\n .css('padding-right', `${parseFloat(calculatedPadding) + this._scrollbarWidth}px`)\n })\n\n // Adjust sticky content margin\n $(stickyContent).each((index, element) => {\n const actualMargin = element.style.marginRight\n const calculatedMargin = $(element).css('margin-right')\n $(element)\n .data('margin-right', actualMargin)\n .css('margin-right', `${parseFloat(calculatedMargin) - this._scrollbarWidth}px`)\n })\n\n // 
Adjust body padding\n const actualPadding = document.body.style.paddingRight\n const calculatedPadding = $(document.body).css('padding-right')\n $(document.body)\n .data('padding-right', actualPadding)\n .css('padding-right', `${parseFloat(calculatedPadding) + this._scrollbarWidth}px`)\n }\n\n $(document.body).addClass(ClassName.OPEN)\n }\n\n _resetScrollbar() {\n // Restore fixed content padding\n const fixedContent = [].slice.call(document.querySelectorAll(Selector.FIXED_CONTENT))\n $(fixedContent).each((index, element) => {\n const padding = $(element).data('padding-right')\n $(element).removeData('padding-right')\n element.style.paddingRight = padding ? padding : ''\n })\n\n // Restore sticky content\n const elements = [].slice.call(document.querySelectorAll(`${Selector.STICKY_CONTENT}`))\n $(elements).each((index, element) => {\n const margin = $(element).data('margin-right')\n if (typeof margin !== 'undefined') {\n $(element).css('margin-right', margin).removeData('margin-right')\n }\n })\n\n // Restore body padding\n const padding = $(document.body).data('padding-right')\n $(document.body).removeData('padding-right')\n document.body.style.paddingRight = padding ? padding : ''\n }\n\n _getScrollbarWidth() { // thx d.walsh\n const scrollDiv = document.createElement('div')\n scrollDiv.className = ClassName.SCROLLBAR_MEASURER\n document.body.appendChild(scrollDiv)\n const scrollbarWidth = scrollDiv.getBoundingClientRect().width - scrollDiv.clientWidth\n document.body.removeChild(scrollDiv)\n return scrollbarWidth\n }\n\n // Static\n\n static _jQueryInterface(config, relatedTarget) {\n return this.each(function () {\n let data = $(this).data(DATA_KEY)\n const _config = {\n ...Default,\n ...$(this).data(),\n ...typeof config === 'object' && config ? config : {}\n }\n\n if (!data) {\n data = new Modal(this, _config)\n $(this).data(DATA_KEY, data)\n }\n\n if (typeof config === 'string') {\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n data[config](relatedTarget)\n } else if (_config.show) {\n data.show(relatedTarget)\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\n$(document).on(Event.CLICK_DATA_API, Selector.DATA_TOGGLE, function (event) {\n let target\n const selector = Util.getSelectorFromElement(this)\n\n if (selector) {\n target = document.querySelector(selector)\n }\n\n const config = $(target).data(DATA_KEY)\n ? 
'toggle' : {\n ...$(target).data(),\n ...$(this).data()\n }\n\n if (this.tagName === 'A' || this.tagName === 'AREA') {\n event.preventDefault()\n }\n\n const $target = $(target).one(Event.SHOW, (showEvent) => {\n if (showEvent.isDefaultPrevented()) {\n // Only register focus restorer if modal will actually get shown\n return\n }\n\n $target.one(Event.HIDDEN, () => {\n if ($(this).is(':visible')) {\n this.focus()\n }\n })\n })\n\n Modal._jQueryInterface.call($(target), config, this)\n})\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Modal._jQueryInterface\n$.fn[NAME].Constructor = Modal\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Modal._jQueryInterface\n}\n\nexport default Modal\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): tools/sanitizer.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nconst uriAttrs = [\n 'background',\n 'cite',\n 'href',\n 'itemtype',\n 'longdesc',\n 'poster',\n 'src',\n 'xlink:href'\n]\n\nconst ARIA_ATTRIBUTE_PATTERN = /^aria-[\\w-]*$/i\n\nexport const DefaultWhitelist = {\n // Global attributes allowed on any supplied element below.\n '*': ['class', 'dir', 'id', 'lang', 'role', ARIA_ATTRIBUTE_PATTERN],\n a: ['target', 'href', 'title', 'rel'],\n area: [],\n b: [],\n br: [],\n col: [],\n code: [],\n div: [],\n em: [],\n hr: [],\n h1: [],\n h2: [],\n h3: [],\n h4: [],\n h5: [],\n h6: [],\n i: [],\n img: ['src', 'alt', 'title', 'width', 'height'],\n li: [],\n ol: [],\n p: [],\n pre: [],\n s: [],\n small: [],\n span: [],\n sub: [],\n sup: [],\n strong: [],\n u: [],\n ul: []\n}\n\n/**\n * A pattern that recognizes a commonly useful subset of URLs that are safe.\n *\n * Shoutout to Angular 7 https://github.com/angular/angular/blob/7.2.4/packages/core/src/sanitization/url_sanitizer.ts\n */\nconst SAFE_URL_PATTERN = /^(?:(?:https?|mailto|ftp|tel|file):|[^&:/?#]*(?:[/?#]|$))/gi\n\n/**\n * A pattern that matches safe data URLs. 
Only matches image, video and audio types.\n *\n * Shoutout to Angular 7 https://github.com/angular/angular/blob/7.2.4/packages/core/src/sanitization/url_sanitizer.ts\n */\nconst DATA_URL_PATTERN = /^data:(?:image\\/(?:bmp|gif|jpeg|jpg|png|tiff|webp)|video\\/(?:mpeg|mp4|ogg|webm)|audio\\/(?:mp3|oga|ogg|opus));base64,[a-z0-9+/]+=*$/i\n\nfunction allowedAttribute(attr, allowedAttributeList) {\n const attrName = attr.nodeName.toLowerCase()\n\n if (allowedAttributeList.indexOf(attrName) !== -1) {\n if (uriAttrs.indexOf(attrName) !== -1) {\n return Boolean(attr.nodeValue.match(SAFE_URL_PATTERN) || attr.nodeValue.match(DATA_URL_PATTERN))\n }\n\n return true\n }\n\n const regExp = allowedAttributeList.filter((attrRegex) => attrRegex instanceof RegExp)\n\n // Check if a regular expression validates the attribute.\n for (let i = 0, l = regExp.length; i < l; i++) {\n if (attrName.match(regExp[i])) {\n return true\n }\n }\n\n return false\n}\n\nexport function sanitizeHtml(unsafeHtml, whiteList, sanitizeFn) {\n if (unsafeHtml.length === 0) {\n return unsafeHtml\n }\n\n if (sanitizeFn && typeof sanitizeFn === 'function') {\n return sanitizeFn(unsafeHtml)\n }\n\n const domParser = new window.DOMParser()\n const createdDocument = domParser.parseFromString(unsafeHtml, 'text/html')\n const whitelistKeys = Object.keys(whiteList)\n const elements = [].slice.call(createdDocument.body.querySelectorAll('*'))\n\n for (let i = 0, len = elements.length; i < len; i++) {\n const el = elements[i]\n const elName = el.nodeName.toLowerCase()\n\n if (whitelistKeys.indexOf(el.nodeName.toLowerCase()) === -1) {\n el.parentNode.removeChild(el)\n\n continue\n }\n\n const attributeList = [].slice.call(el.attributes)\n const whitelistedAttributes = [].concat(whiteList['*'] || [], whiteList[elName] || [])\n\n attributeList.forEach((attr) => {\n if (!allowedAttribute(attr, whitelistedAttributes)) {\n el.removeAttribute(attr.nodeName)\n }\n })\n }\n\n return createdDocument.body.innerHTML\n}\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): tooltip.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport {\n DefaultWhitelist,\n sanitizeHtml\n} from './tools/sanitizer'\nimport $ from 'jquery'\nimport Popper from 'popper.js'\nimport Util from './util'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'tooltip'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.tooltip'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\nconst CLASS_PREFIX = 'bs-tooltip'\nconst BSCLS_PREFIX_REGEX = new RegExp(`(^|\\\\s)${CLASS_PREFIX}\\\\S+`, 'g')\nconst DISALLOWED_ATTRIBUTES = ['sanitize', 'whiteList', 'sanitizeFn']\n\nconst DefaultType = {\n animation : 'boolean',\n template : 'string',\n title : '(string|element|function)',\n trigger : 'string',\n delay : '(number|object)',\n html : 'boolean',\n selector : '(string|boolean)',\n placement : '(string|function)',\n offset : '(number|string|function)',\n container : '(string|element|boolean)',\n fallbackPlacement : '(string|array)',\n boundary : '(string|element)',\n sanitize : 'boolean',\n sanitizeFn : '(null|function)',\n whiteList : 'object'\n}\n\nconst AttachmentMap = {\n AUTO : 'auto',\n TOP : 'top',\n RIGHT : 'right',\n BOTTOM : 'bottom',\n LEFT : 
'left'\n}\n\nconst Default = {\n animation : true,\n template : '<div class=\"tooltip\" role=\"tooltip\">' +\n '<div class=\"arrow\"></div>' +\n '<div class=\"tooltip-inner\"></div></div>
    ',\n trigger : 'hover focus',\n title : '',\n delay : 0,\n html : false,\n selector : false,\n placement : 'top',\n offset : 0,\n container : false,\n fallbackPlacement : 'flip',\n boundary : 'scrollParent',\n sanitize : true,\n sanitizeFn : null,\n whiteList : DefaultWhitelist\n}\n\nconst HoverState = {\n SHOW : 'show',\n OUT : 'out'\n}\n\nconst Event = {\n HIDE : `hide${EVENT_KEY}`,\n HIDDEN : `hidden${EVENT_KEY}`,\n SHOW : `show${EVENT_KEY}`,\n SHOWN : `shown${EVENT_KEY}`,\n INSERTED : `inserted${EVENT_KEY}`,\n CLICK : `click${EVENT_KEY}`,\n FOCUSIN : `focusin${EVENT_KEY}`,\n FOCUSOUT : `focusout${EVENT_KEY}`,\n MOUSEENTER : `mouseenter${EVENT_KEY}`,\n MOUSELEAVE : `mouseleave${EVENT_KEY}`\n}\n\nconst ClassName = {\n FADE : 'fade',\n SHOW : 'show'\n}\n\nconst Selector = {\n TOOLTIP : '.tooltip',\n TOOLTIP_INNER : '.tooltip-inner',\n ARROW : '.arrow'\n}\n\nconst Trigger = {\n HOVER : 'hover',\n FOCUS : 'focus',\n CLICK : 'click',\n MANUAL : 'manual'\n}\n\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Tooltip {\n constructor(element, config) {\n /**\n * Check for Popper dependency\n * Popper - https://popper.js.org\n */\n if (typeof Popper === 'undefined') {\n throw new TypeError('Bootstrap\\'s tooltips require Popper.js (https://popper.js.org/)')\n }\n\n // private\n this._isEnabled = true\n this._timeout = 0\n this._hoverState = ''\n this._activeTrigger = {}\n this._popper = null\n\n // Protected\n this.element = element\n this.config = this._getConfig(config)\n this.tip = null\n\n this._setListeners()\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n static get Default() {\n return Default\n }\n\n static get NAME() {\n return NAME\n }\n\n static get DATA_KEY() {\n return DATA_KEY\n }\n\n static get Event() {\n return Event\n }\n\n static get EVENT_KEY() {\n return EVENT_KEY\n }\n\n static get DefaultType() {\n return DefaultType\n }\n\n // Public\n\n enable() {\n this._isEnabled = true\n }\n\n disable() {\n this._isEnabled = false\n }\n\n toggleEnabled() {\n this._isEnabled = !this._isEnabled\n }\n\n toggle(event) {\n if (!this._isEnabled) {\n return\n }\n\n if (event) {\n const dataKey = this.constructor.DATA_KEY\n let context = $(event.currentTarget).data(dataKey)\n\n if (!context) {\n context = new this.constructor(\n event.currentTarget,\n this._getDelegateConfig()\n )\n $(event.currentTarget).data(dataKey, context)\n }\n\n context._activeTrigger.click = !context._activeTrigger.click\n\n if (context._isWithActiveTrigger()) {\n context._enter(null, context)\n } else {\n context._leave(null, context)\n }\n } else {\n if ($(this.getTipElement()).hasClass(ClassName.SHOW)) {\n this._leave(null, this)\n return\n }\n\n this._enter(null, this)\n }\n }\n\n dispose() {\n clearTimeout(this._timeout)\n\n $.removeData(this.element, this.constructor.DATA_KEY)\n\n $(this.element).off(this.constructor.EVENT_KEY)\n $(this.element).closest('.modal').off('hide.bs.modal')\n\n if (this.tip) {\n $(this.tip).remove()\n }\n\n this._isEnabled = null\n this._timeout = null\n this._hoverState = null\n this._activeTrigger = null\n if (this._popper !== null) {\n this._popper.destroy()\n }\n\n this._popper = null\n this.element = null\n this.config = null\n this.tip = null\n }\n\n show() {\n if ($(this.element).css('display') === 'none') {\n throw new Error('Please use show on visible elements')\n }\n\n const showEvent = 
$.Event(this.constructor.Event.SHOW)\n if (this.isWithContent() && this._isEnabled) {\n $(this.element).trigger(showEvent)\n\n const shadowRoot = Util.findShadowRoot(this.element)\n const isInTheDom = $.contains(\n shadowRoot !== null ? shadowRoot : this.element.ownerDocument.documentElement,\n this.element\n )\n\n if (showEvent.isDefaultPrevented() || !isInTheDom) {\n return\n }\n\n const tip = this.getTipElement()\n const tipId = Util.getUID(this.constructor.NAME)\n\n tip.setAttribute('id', tipId)\n this.element.setAttribute('aria-describedby', tipId)\n\n this.setContent()\n\n if (this.config.animation) {\n $(tip).addClass(ClassName.FADE)\n }\n\n const placement = typeof this.config.placement === 'function'\n ? this.config.placement.call(this, tip, this.element)\n : this.config.placement\n\n const attachment = this._getAttachment(placement)\n this.addAttachmentClass(attachment)\n\n const container = this._getContainer()\n $(tip).data(this.constructor.DATA_KEY, this)\n\n if (!$.contains(this.element.ownerDocument.documentElement, this.tip)) {\n $(tip).appendTo(container)\n }\n\n $(this.element).trigger(this.constructor.Event.INSERTED)\n\n this._popper = new Popper(this.element, tip, {\n placement: attachment,\n modifiers: {\n offset: this._getOffset(),\n flip: {\n behavior: this.config.fallbackPlacement\n },\n arrow: {\n element: Selector.ARROW\n },\n preventOverflow: {\n boundariesElement: this.config.boundary\n }\n },\n onCreate: (data) => {\n if (data.originalPlacement !== data.placement) {\n this._handlePopperPlacementChange(data)\n }\n },\n onUpdate: (data) => this._handlePopperPlacementChange(data)\n })\n\n $(tip).addClass(ClassName.SHOW)\n\n // If this is a touch-enabled device we add extra\n // empty mouseover listeners to the body's immediate children;\n // only needed because of broken event delegation on iOS\n // https://www.quirksmode.org/blog/archives/2014/02/mouse_event_bub.html\n if ('ontouchstart' in document.documentElement) {\n $(document.body).children().on('mouseover', null, $.noop)\n }\n\n const complete = () => {\n if (this.config.animation) {\n this._fixTransition()\n }\n const prevHoverState = this._hoverState\n this._hoverState = null\n\n $(this.element).trigger(this.constructor.Event.SHOWN)\n\n if (prevHoverState === HoverState.OUT) {\n this._leave(null, this)\n }\n }\n\n if ($(this.tip).hasClass(ClassName.FADE)) {\n const transitionDuration = Util.getTransitionDurationFromElement(this.tip)\n\n $(this.tip)\n .one(Util.TRANSITION_END, complete)\n .emulateTransitionEnd(transitionDuration)\n } else {\n complete()\n }\n }\n }\n\n hide(callback) {\n const tip = this.getTipElement()\n const hideEvent = $.Event(this.constructor.Event.HIDE)\n const complete = () => {\n if (this._hoverState !== HoverState.SHOW && tip.parentNode) {\n tip.parentNode.removeChild(tip)\n }\n\n this._cleanTipClass()\n this.element.removeAttribute('aria-describedby')\n $(this.element).trigger(this.constructor.Event.HIDDEN)\n if (this._popper !== null) {\n this._popper.destroy()\n }\n\n if (callback) {\n callback()\n }\n }\n\n $(this.element).trigger(hideEvent)\n\n if (hideEvent.isDefaultPrevented()) {\n return\n }\n\n $(tip).removeClass(ClassName.SHOW)\n\n // If this is a touch-enabled device we remove the extra\n // empty mouseover listeners we added for iOS support\n if ('ontouchstart' in document.documentElement) {\n $(document.body).children().off('mouseover', null, $.noop)\n }\n\n this._activeTrigger[Trigger.CLICK] = false\n this._activeTrigger[Trigger.FOCUS] = false\n 
this._activeTrigger[Trigger.HOVER] = false\n\n if ($(this.tip).hasClass(ClassName.FADE)) {\n const transitionDuration = Util.getTransitionDurationFromElement(tip)\n\n $(tip)\n .one(Util.TRANSITION_END, complete)\n .emulateTransitionEnd(transitionDuration)\n } else {\n complete()\n }\n\n this._hoverState = ''\n }\n\n update() {\n if (this._popper !== null) {\n this._popper.scheduleUpdate()\n }\n }\n\n // Protected\n\n isWithContent() {\n return Boolean(this.getTitle())\n }\n\n addAttachmentClass(attachment) {\n $(this.getTipElement()).addClass(`${CLASS_PREFIX}-${attachment}`)\n }\n\n getTipElement() {\n this.tip = this.tip || $(this.config.template)[0]\n return this.tip\n }\n\n setContent() {\n const tip = this.getTipElement()\n this.setElementContent($(tip.querySelectorAll(Selector.TOOLTIP_INNER)), this.getTitle())\n $(tip).removeClass(`${ClassName.FADE} ${ClassName.SHOW}`)\n }\n\n setElementContent($element, content) {\n if (typeof content === 'object' && (content.nodeType || content.jquery)) {\n // Content is a DOM node or a jQuery\n if (this.config.html) {\n if (!$(content).parent().is($element)) {\n $element.empty().append(content)\n }\n } else {\n $element.text($(content).text())\n }\n\n return\n }\n\n if (this.config.html) {\n if (this.config.sanitize) {\n content = sanitizeHtml(content, this.config.whiteList, this.config.sanitizeFn)\n }\n\n $element.html(content)\n } else {\n $element.text(content)\n }\n }\n\n getTitle() {\n let title = this.element.getAttribute('data-original-title')\n\n if (!title) {\n title = typeof this.config.title === 'function'\n ? this.config.title.call(this.element)\n : this.config.title\n }\n\n return title\n }\n\n // Private\n\n _getOffset() {\n const offset = {}\n\n if (typeof this.config.offset === 'function') {\n offset.fn = (data) => {\n data.offsets = {\n ...data.offsets,\n ...this.config.offset(data.offsets, this.element) || {}\n }\n\n return data\n }\n } else {\n offset.offset = this.config.offset\n }\n\n return offset\n }\n\n _getContainer() {\n if (this.config.container === false) {\n return document.body\n }\n\n if (Util.isElement(this.config.container)) {\n return $(this.config.container)\n }\n\n return $(document).find(this.config.container)\n }\n\n _getAttachment(placement) {\n return AttachmentMap[placement.toUpperCase()]\n }\n\n _setListeners() {\n const triggers = this.config.trigger.split(' ')\n\n triggers.forEach((trigger) => {\n if (trigger === 'click') {\n $(this.element).on(\n this.constructor.Event.CLICK,\n this.config.selector,\n (event) => this.toggle(event)\n )\n } else if (trigger !== Trigger.MANUAL) {\n const eventIn = trigger === Trigger.HOVER\n ? this.constructor.Event.MOUSEENTER\n : this.constructor.Event.FOCUSIN\n const eventOut = trigger === Trigger.HOVER\n ? 
this.constructor.Event.MOUSELEAVE\n : this.constructor.Event.FOCUSOUT\n\n $(this.element)\n .on(\n eventIn,\n this.config.selector,\n (event) => this._enter(event)\n )\n .on(\n eventOut,\n this.config.selector,\n (event) => this._leave(event)\n )\n }\n })\n\n $(this.element).closest('.modal').on(\n 'hide.bs.modal',\n () => {\n if (this.element) {\n this.hide()\n }\n }\n )\n\n if (this.config.selector) {\n this.config = {\n ...this.config,\n trigger: 'manual',\n selector: ''\n }\n } else {\n this._fixTitle()\n }\n }\n\n _fixTitle() {\n const titleType = typeof this.element.getAttribute('data-original-title')\n\n if (this.element.getAttribute('title') || titleType !== 'string') {\n this.element.setAttribute(\n 'data-original-title',\n this.element.getAttribute('title') || ''\n )\n\n this.element.setAttribute('title', '')\n }\n }\n\n _enter(event, context) {\n const dataKey = this.constructor.DATA_KEY\n context = context || $(event.currentTarget).data(dataKey)\n\n if (!context) {\n context = new this.constructor(\n event.currentTarget,\n this._getDelegateConfig()\n )\n $(event.currentTarget).data(dataKey, context)\n }\n\n if (event) {\n context._activeTrigger[\n event.type === 'focusin' ? Trigger.FOCUS : Trigger.HOVER\n ] = true\n }\n\n if ($(context.getTipElement()).hasClass(ClassName.SHOW) || context._hoverState === HoverState.SHOW) {\n context._hoverState = HoverState.SHOW\n return\n }\n\n clearTimeout(context._timeout)\n\n context._hoverState = HoverState.SHOW\n\n if (!context.config.delay || !context.config.delay.show) {\n context.show()\n return\n }\n\n context._timeout = setTimeout(() => {\n if (context._hoverState === HoverState.SHOW) {\n context.show()\n }\n }, context.config.delay.show)\n }\n\n _leave(event, context) {\n const dataKey = this.constructor.DATA_KEY\n context = context || $(event.currentTarget).data(dataKey)\n\n if (!context) {\n context = new this.constructor(\n event.currentTarget,\n this._getDelegateConfig()\n )\n $(event.currentTarget).data(dataKey, context)\n }\n\n if (event) {\n context._activeTrigger[\n event.type === 'focusout' ? Trigger.FOCUS : Trigger.HOVER\n ] = false\n }\n\n if (context._isWithActiveTrigger()) {\n return\n }\n\n clearTimeout(context._timeout)\n\n context._hoverState = HoverState.OUT\n\n if (!context.config.delay || !context.config.delay.hide) {\n context.hide()\n return\n }\n\n context._timeout = setTimeout(() => {\n if (context._hoverState === HoverState.OUT) {\n context.hide()\n }\n }, context.config.delay.hide)\n }\n\n _isWithActiveTrigger() {\n for (const trigger in this._activeTrigger) {\n if (this._activeTrigger[trigger]) {\n return true\n }\n }\n\n return false\n }\n\n _getConfig(config) {\n const dataAttributes = $(this.element).data()\n\n Object.keys(dataAttributes)\n .forEach((dataAttr) => {\n if (DISALLOWED_ATTRIBUTES.indexOf(dataAttr) !== -1) {\n delete dataAttributes[dataAttr]\n }\n })\n\n config = {\n ...this.constructor.Default,\n ...dataAttributes,\n ...typeof config === 'object' && config ? 
config : {}\n }\n\n if (typeof config.delay === 'number') {\n config.delay = {\n show: config.delay,\n hide: config.delay\n }\n }\n\n if (typeof config.title === 'number') {\n config.title = config.title.toString()\n }\n\n if (typeof config.content === 'number') {\n config.content = config.content.toString()\n }\n\n Util.typeCheckConfig(\n NAME,\n config,\n this.constructor.DefaultType\n )\n\n if (config.sanitize) {\n config.template = sanitizeHtml(config.template, config.whiteList, config.sanitizeFn)\n }\n\n return config\n }\n\n _getDelegateConfig() {\n const config = {}\n\n if (this.config) {\n for (const key in this.config) {\n if (this.constructor.Default[key] !== this.config[key]) {\n config[key] = this.config[key]\n }\n }\n }\n\n return config\n }\n\n _cleanTipClass() {\n const $tip = $(this.getTipElement())\n const tabClass = $tip.attr('class').match(BSCLS_PREFIX_REGEX)\n if (tabClass !== null && tabClass.length) {\n $tip.removeClass(tabClass.join(''))\n }\n }\n\n _handlePopperPlacementChange(popperData) {\n const popperInstance = popperData.instance\n this.tip = popperInstance.popper\n this._cleanTipClass()\n this.addAttachmentClass(this._getAttachment(popperData.placement))\n }\n\n _fixTransition() {\n const tip = this.getTipElement()\n const initConfigAnimation = this.config.animation\n\n if (tip.getAttribute('x-placement') !== null) {\n return\n }\n\n $(tip).removeClass(ClassName.FADE)\n this.config.animation = false\n this.hide()\n this.show()\n this.config.animation = initConfigAnimation\n }\n\n // Static\n\n static _jQueryInterface(config) {\n return this.each(function () {\n let data = $(this).data(DATA_KEY)\n const _config = typeof config === 'object' && config\n\n if (!data && /dispose|hide/.test(config)) {\n return\n }\n\n if (!data) {\n data = new Tooltip(this, _config)\n $(this).data(DATA_KEY, data)\n }\n\n if (typeof config === 'string') {\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n data[config]()\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Tooltip._jQueryInterface\n$.fn[NAME].Constructor = Tooltip\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Tooltip._jQueryInterface\n}\n\nexport default Tooltip\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): popover.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\nimport Tooltip from './tooltip'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'popover'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.popover'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\nconst CLASS_PREFIX = 'bs-popover'\nconst BSCLS_PREFIX_REGEX = new RegExp(`(^|\\\\s)${CLASS_PREFIX}\\\\S+`, 'g')\n\nconst Default = {\n ...Tooltip.Default,\n placement : 'right',\n trigger : 'click',\n content : '',\n template : '
<div class=\"popover\" role=\"tooltip\">' +\n '<div class=\"arrow\"></div>' +\n '<h3 class=\"popover-header\"></h3>' +\n '<div class=\"popover-body\"></div></div>
    '\n}\n\nconst DefaultType = {\n ...Tooltip.DefaultType,\n content : '(string|element|function)'\n}\n\nconst ClassName = {\n FADE : 'fade',\n SHOW : 'show'\n}\n\nconst Selector = {\n TITLE : '.popover-header',\n CONTENT : '.popover-body'\n}\n\nconst Event = {\n HIDE : `hide${EVENT_KEY}`,\n HIDDEN : `hidden${EVENT_KEY}`,\n SHOW : `show${EVENT_KEY}`,\n SHOWN : `shown${EVENT_KEY}`,\n INSERTED : `inserted${EVENT_KEY}`,\n CLICK : `click${EVENT_KEY}`,\n FOCUSIN : `focusin${EVENT_KEY}`,\n FOCUSOUT : `focusout${EVENT_KEY}`,\n MOUSEENTER : `mouseenter${EVENT_KEY}`,\n MOUSELEAVE : `mouseleave${EVENT_KEY}`\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Popover extends Tooltip {\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n static get Default() {\n return Default\n }\n\n static get NAME() {\n return NAME\n }\n\n static get DATA_KEY() {\n return DATA_KEY\n }\n\n static get Event() {\n return Event\n }\n\n static get EVENT_KEY() {\n return EVENT_KEY\n }\n\n static get DefaultType() {\n return DefaultType\n }\n\n // Overrides\n\n isWithContent() {\n return this.getTitle() || this._getContent()\n }\n\n addAttachmentClass(attachment) {\n $(this.getTipElement()).addClass(`${CLASS_PREFIX}-${attachment}`)\n }\n\n getTipElement() {\n this.tip = this.tip || $(this.config.template)[0]\n return this.tip\n }\n\n setContent() {\n const $tip = $(this.getTipElement())\n\n // We use append for html objects to maintain js events\n this.setElementContent($tip.find(Selector.TITLE), this.getTitle())\n let content = this._getContent()\n if (typeof content === 'function') {\n content = content.call(this.element)\n }\n this.setElementContent($tip.find(Selector.CONTENT), content)\n\n $tip.removeClass(`${ClassName.FADE} ${ClassName.SHOW}`)\n }\n\n // Private\n\n _getContent() {\n return this.element.getAttribute('data-content') ||\n this.config.content\n }\n\n _cleanTipClass() {\n const $tip = $(this.getTipElement())\n const tabClass = $tip.attr('class').match(BSCLS_PREFIX_REGEX)\n if (tabClass !== null && tabClass.length > 0) {\n $tip.removeClass(tabClass.join(''))\n }\n }\n\n // Static\n\n static _jQueryInterface(config) {\n return this.each(function () {\n let data = $(this).data(DATA_KEY)\n const _config = typeof config === 'object' ? 
config : null\n\n if (!data && /dispose|hide/.test(config)) {\n return\n }\n\n if (!data) {\n data = new Popover(this, _config)\n $(this).data(DATA_KEY, data)\n }\n\n if (typeof config === 'string') {\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n data[config]()\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\n$.fn[NAME] = Popover._jQueryInterface\n$.fn[NAME].Constructor = Popover\n$.fn[NAME].noConflict = () => {\n $.fn[NAME] = JQUERY_NO_CONFLICT\n return Popover._jQueryInterface\n}\n\nexport default Popover\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v4.3.1): scrollspy.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport $ from 'jquery'\nimport Util from './util'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'scrollspy'\nconst VERSION = '4.3.1'\nconst DATA_KEY = 'bs.scrollspy'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst JQUERY_NO_CONFLICT = $.fn[NAME]\n\nconst Default = {\n offset : 10,\n method : 'auto',\n target : ''\n}\n\nconst DefaultType = {\n offset : 'number',\n method : 'string',\n target : '(string|element)'\n}\n\nconst Event = {\n ACTIVATE : `activate${EVENT_KEY}`,\n SCROLL : `scroll${EVENT_KEY}`,\n LOAD_DATA_API : `load${EVENT_KEY}${DATA_API_KEY}`\n}\n\nconst ClassName = {\n DROPDOWN_ITEM : 'dropdown-item',\n DROPDOWN_MENU : 'dropdown-menu',\n ACTIVE : 'active'\n}\n\nconst Selector = {\n DATA_SPY : '[data-spy=\"scroll\"]',\n ACTIVE : '.active',\n NAV_LIST_GROUP : '.nav, .list-group',\n NAV_LINKS : '.nav-link',\n NAV_ITEMS : '.nav-item',\n LIST_ITEMS : '.list-group-item',\n DROPDOWN : '.dropdown',\n DROPDOWN_ITEMS : '.dropdown-item',\n DROPDOWN_TOGGLE : '.dropdown-toggle'\n}\n\nconst OffsetMethod = {\n OFFSET : 'offset',\n POSITION : 'position'\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass ScrollSpy {\n constructor(element, config) {\n this._element = element\n this._scrollElement = element.tagName === 'BODY' ? window : element\n this._config = this._getConfig(config)\n this._selector = `${this._config.target} ${Selector.NAV_LINKS},` +\n `${this._config.target} ${Selector.LIST_ITEMS},` +\n `${this._config.target} ${Selector.DROPDOWN_ITEMS}`\n this._offsets = []\n this._targets = []\n this._activeTarget = null\n this._scrollHeight = 0\n\n $(this._scrollElement).on(Event.SCROLL, (event) => this._process(event))\n\n this.refresh()\n this._process()\n }\n\n // Getters\n\n static get VERSION() {\n return VERSION\n }\n\n static get Default() {\n return Default\n }\n\n // Public\n\n refresh() {\n const autoMethod = this._scrollElement === this._scrollElement.window\n ? OffsetMethod.OFFSET : OffsetMethod.POSITION\n\n const offsetMethod = this._config.method === 'auto'\n ? autoMethod : this._config.method\n\n const offsetBase = offsetMethod === OffsetMethod.POSITION\n ? 
this._getScrollTop() : 0\n\n this._offsets = []\n this._targets = []\n\n this._scrollHeight = this._getScrollHeight()\n\n const targets = [].slice.call(document.querySelectorAll(this._selector))\n\n targets\n .map((element) => {\n let target\n const targetSelector = Util.getSelectorFromElement(element)\n\n if (targetSelector) {\n target = document.querySelector(targetSelector)\n }\n\n if (target) {\n const targetBCR = target.getBoundingClientRect()\n if (targetBCR.width || targetBCR.height) {\n // TODO (fat): remove sketch reliance on jQuery position/offset\n return [\n $(target)[offsetMethod]().top + offsetBase,\n targetSelector\n ]\n }\n }\n return null\n })\n .filter((item) => item)\n .sort((a, b) => a[0] - b[0])\n .forEach((item) => {\n this._offsets.push(item[0])\n this._targets.push(item[1])\n })\n }\n\n dispose() {\n $.removeData(this._element, DATA_KEY)\n $(this._scrollElement).off(EVENT_KEY)\n\n this._element = null\n this._scrollElement = null\n this._config = null\n this._selector = null\n this._offsets = null\n this._targets = null\n this._activeTarget = null\n this._scrollHeight = null\n }\n\n // Private\n\n _getConfig(config) {\n config = {\n ...Default,\n ...typeof config === 'object' && config ? config : {}\n }\n\n if (typeof config.target !== 'string') {\n let id = $(config.target).attr('id')\n if (!id) {\n id = Util.getUID(NAME)\n $(config.target).attr('id', id)\n }\n config.target = `#${id}`\n }\n\n Util.typeCheckConfig(NAME, config, DefaultType)\n\n return config\n }\n\n _getScrollTop() {\n return this._scrollElement === window\n ? this._scrollElement.pageYOffset : this._scrollElement.scrollTop\n }\n\n _getScrollHeight() {\n return this._scrollElement.scrollHeight || Math.max(\n document.body.scrollHeight,\n document.documentElement.scrollHeight\n )\n }\n\n _getOffsetHeight() {\n return this._scrollElement === window\n ? window.innerHeight : this._scrollElement.getBoundingClientRect().height\n }\n\n _process() {\n const scrollTop = this._getScrollTop() + this._config.offset\n const scrollHeight = this._getScrollHeight()\n const maxScroll = this._config.offset +\n scrollHeight -\n this._getOffsetHeight()\n\n if (this._scrollHeight !== scrollHeight) {\n this.refresh()\n }\n\n if (scrollTop >= maxScroll) {\n const target = this._targets[this._targets.length - 1]\n\n if (this._activeTarget !== target) {\n this._activate(target)\n }\n return\n }\n\n if (this._activeTarget && scrollTop < this._offsets[0] && this._offsets[0] > 0) {\n this._activeTarget = null\n this._clear()\n return\n }\n\n const offsetLength = this._offsets.length\n for (let i = offsetLength; i--;) {\n const isActiveTarget = this._activeTarget !== this._targets[i] &&\n scrollTop >= this._offsets[i] &&\n (typeof this._offsets[i + 1] === 'undefined' ||\n scrollTop < this._offsets[i + 1])\n\n if (isActiveTarget) {\n this._activate(this._targets[i])\n }\n }\n }\n\n _activate(target) {\n this._activeTarget = target\n\n this._clear()\n\n const queries = this._selector\n .split(',')\n .map((selector) => `${selector}[data-target=\"${target}\"],${selector}[href=\"${target}\"]`)\n\n const $link = $([].slice.call(document.querySelectorAll(queries.join(','))))\n\n if ($link.hasClass(ClassName.DROPDOWN_ITEM)) {\n $link.closest(Selector.DROPDOWN).find(Selector.DROPDOWN_TOGGLE).addClass(ClassName.ACTIVE)\n $link.addClass(ClassName.ACTIVE)\n } else {\n // Set triggered link as active\n $link.addClass(ClassName.ACTIVE)\n // Set triggered links parents as active\n // With both
<ul> and <nav>