Fix unit tests #8553

Merged: 65 commits, Mar 13, 2024
Changes from 57 commits

Commits (65)
d2bcc5d
[tests] fix pytest error - cannot collect syft function
yashgorana Mar 6, 2024
6987e0e
[tests] use fake redis
yashgorana Mar 6, 2024
15f102a
[tests] use in-memory mongo
yashgorana Mar 6, 2024
6ec98cd
[tests] fix pytest scope issue
yashgorana Mar 6, 2024
cc39bcc
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 6, 2024
7541de8
[tests] fix pytest spawning multiple servers
yashgorana Mar 6, 2024
766cdd3
[tests] pytest_unconfigure can happen anytime
yashgorana Mar 6, 2024
a1930a6
[tests] use docker mongo
yashgorana Mar 7, 2024
9daf3a3
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 7, 2024
92ad5c1
[tests] common container prefix
yashgorana Mar 7, 2024
126583d
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 7, 2024
d0880fd
[tests] remove redis
yashgorana Mar 7, 2024
7186b8d
Merge branch 'dev' into yash/fix-tests
kiendang Mar 7, 2024
f16f1db
Remove pytest_mock_resources as dependency
kiendang Mar 7, 2024
c70cc05
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 8, 2024
681d436
[tests] fix pytest-xdist race
yashgorana Mar 8, 2024
8f2ba7c
[tests] comment out all joblib tests
yashgorana Mar 8, 2024
c3286a5
[tests] reduce thread overhead
yashgorana Mar 8, 2024
963298d
[syft] capture migration_state error
yashgorana Mar 8, 2024
8fa5acf
[tests] disable grouping
yashgorana Mar 8, 2024
f40ff10
[tests] fix some tests
yashgorana Mar 8, 2024
d5ce312
temp disable windows panic culprits
yashgorana Mar 8, 2024
52dc637
[tests] move non-unit-tests to integration tests
yashgorana Mar 8, 2024
499d7fc
[tests] re-enable lock tests
yashgorana Mar 8, 2024
f454c31
[tox] ordered exec of unit tests
yashgorana Mar 8, 2024
c866d9d
[tox] fix DOCKER_HOST on macOS
yashgorana Mar 8, 2024
abf3f8f
[tests] re-add xfail to win32 failing test
yashgorana Mar 8, 2024
a07d8bb
[tests] mongodb - best of both worlds
yashgorana Mar 9, 2024
f616bec
[tests] mongodb final changes
yashgorana Mar 9, 2024
870e91b
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 11, 2024
0ed6a1d
[syft] fix SMTP_PORT string parse
yashgorana Mar 11, 2024
385437e
[syft] label credentials volume
yashgorana Mar 11, 2024
7a9a4ed
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 11, 2024
e62a547
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 11, 2024
78bae28
[syft] fix linting
yashgorana Mar 11, 2024
e40e4d0
[tox] fix docker rm command
yashgorana Mar 11, 2024
72f3312
[tests] fix merge conflict changes
yashgorana Mar 11, 2024
667c377
[tests] move request multi-node to integration
yashgorana Mar 11, 2024
e411b57
[tests] revert use of faker
yashgorana Mar 11, 2024
cfefacc
Merge branch 'dev' into yash/fix-tests
yashgorana Mar 11, 2024
d01ccf9
[tests] zmq tweaks
yashgorana Mar 11, 2024
13d0fdd
Merge branch 'dev' into yash/fix-tests
shubham3121 Mar 12, 2024
1e035b3
[tests] xfail numpy tests on 3.12
yashgorana Mar 12, 2024
9697649
fix request multiple nodes test
teo-milea Mar 12, 2024
ef5baff
fix lint for tests
teo-milea Mar 12, 2024
fc10b16
Merge branch 'yash/fix-tests' of github.com:OpenMined/PySyft into yas…
teo-milea Mar 12, 2024
9682c74
add trace result registry
koenvanderveen Mar 12, 2024
1661e5a
Merge branch 'yash/fix-tests' of github.com:OpenMined/PySyft into yas…
koenvanderveen Mar 12, 2024
1dd6462
raise valueerror if client is none
koenvanderveen Mar 12, 2024
ebf0595
changed TraceResult BaseModel to SyftBaseModel
teo-milea Mar 12, 2024
6ce6cad
cleanup when plan building fails
koenvanderveen Mar 12, 2024
1cdbfbe
merge
koenvanderveen Mar 12, 2024
8431140
added __exclude_sync_diff_attrs__ and __repr_attrs__ to passthrough a…
teo-milea Mar 12, 2024
c0ab358
Merge branch 'yash/fix-tests' of github.com:OpenMined/PySyft into yas…
teo-milea Mar 12, 2024
3792051
fix nested jobs function
koenvanderveen Mar 12, 2024
4250ae3
Merge branch 'yash/fix-tests' of github.com:OpenMined/PySyft into yas…
koenvanderveen Mar 12, 2024
83830c2
removed request_multiple_nodes_test unit test
teo-milea Mar 12, 2024
679d4a5
Merge branch 'dev' into yash/fix-tests
shubham3121 Mar 13, 2024
8369a46
added timeout for wait and moves syft function test to integration
teo-milea Mar 13, 2024
8c7de67
Merge branch 'yash/fix-tests' of github.com:OpenMined/PySyft into yas…
teo-milea Mar 13, 2024
12f924c
Merge branch 'dev' into yash/fix-tests
teo-milea Mar 13, 2024
f8659f8
fix lint
teo-milea Mar 13, 2024
14fafe4
Merge branch 'yash/fix-tests' of github.com:OpenMined/PySyft into yas…
teo-milea Mar 13, 2024
e4d331b
Merge branch 'dev' into yash/fix-tests
rasswanth-s Mar 13, 2024
814b4fa
Merge branch 'dev' into yash/fix-tests
madhavajay Mar 13, 2024
2 changes: 1 addition & 1 deletion .github/workflows/pr-tests-stack.yml
@@ -30,7 +30,7 @@ jobs:
 # os: [om-ci-16vcpu-ubuntu2204]
 os: [ubuntu-latest]
 python-version: ["3.12"]
-pytest-modules: ["frontend network container_workload"]
+pytest-modules: ["frontend network container_workload local_node"]
 fail-fast: false

 runs-on: ${{matrix.os}}
8 changes: 5 additions & 3 deletions .github/workflows/pr-tests-syft.yml
@@ -88,9 +88,11 @@ jobs:
 run: |
 pip install --upgrade tox packaging wheel --default-timeout=60

-- name: Docker on MacOS
-if: steps.changes.outputs.syft == 'true' && matrix.os == 'macos-latest'
-uses: crazy-max/ghaction-setup-docker@v3.1.0
+# - name: Docker on MacOS
+# if: steps.changes.outputs.syft == 'true' && matrix.os == 'macos-latest'
+# uses: crazy-max/ghaction-setup-docker@v3.1.0
+# with:
+#   set-host: true

 - name: Run unit tests
 if: steps.changes.outputs.syft == 'true'
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -31,7 +31,7 @@ repos:
 exclude: ^(packages/grid/ansible/)
 - id: name-tests-test
 always_run: true
-exclude: ^(packages/grid/backend/grid/tests/utils/)|^(.*fixtures.py)|^packages/syft/tests/.*/utils.py
+exclude: ^(.*/tests/utils/)|^(.*fixtures.py)
 - id: requirements-txt-fixer
 always_run: true
 - id: mixed-line-ending
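A quick way to sanity-check the broadened `name-tests-test` exclude pattern — a standalone sketch; the paths below are illustrative, not necessarily real files in the repo:

```python
import re

# New exclude pattern for the name-tests-test hook: any tests/utils/ directory
# and any *fixtures.py file, instead of the previous hard-coded path list.
exclude = re.compile(r"^(.*/tests/utils/)|^(.*fixtures.py)")

paths = [
    "packages/grid/backend/grid/tests/utils/helpers.py",  # excluded: matches */tests/utils/
    "packages/syft/tests/conftest_fixtures.py",           # excluded: matches *fixtures.py
    "packages/syft/tests/syft/users/user_test.py",        # still checked by the hook
]
for path in paths:
    print(path, "->", "excluded" if exclude.match(path) else "checked")
```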
26 changes: 13 additions & 13 deletions packages/grid/backend/grid/core/config.py
@@ -103,7 +103,7 @@ def get_emails_enabled(self) -> Self:

 OPEN_REGISTRATION: bool = True

-DOMAIN_ASSOCIATION_REQUESTS_AUTOMATICALLY_ACCEPTED: bool = True
+# DOMAIN_ASSOCIATION_REQUESTS_AUTOMATICALLY_ACCEPTED: bool = True
 USE_BLOB_STORAGE: bool = (
 True if os.getenv("USE_BLOB_STORAGE", "false").lower() == "true" else False
 )
@@ -117,22 +117,22 @@ def get_emails_enabled(self) -> Self:
 ) # 30 minutes in seconds
 SEAWEED_MOUNT_PORT: int = int(os.getenv("SEAWEED_MOUNT_PORT", 4001))

-REDIS_HOST: str = str(os.getenv("REDIS_HOST", "redis"))
-REDIS_PORT: int = int(os.getenv("REDIS_PORT", 6379))
-REDIS_STORE_DB_ID: int = int(os.getenv("REDIS_STORE_DB_ID", 0))
-REDIS_LEDGER_DB_ID: int = int(os.getenv("REDIS_LEDGER_DB_ID", 1))
-STORE_DB_ID: int = int(os.getenv("STORE_DB_ID", 0))
-LEDGER_DB_ID: int = int(os.getenv("LEDGER_DB_ID", 1))
-NETWORK_CHECK_INTERVAL: int = int(os.getenv("NETWORK_CHECK_INTERVAL", 60))
-DOMAIN_CHECK_INTERVAL: int = int(os.getenv("DOMAIN_CHECK_INTERVAL", 60))
+# REDIS_HOST: str = str(os.getenv("REDIS_HOST", "redis"))
+# REDIS_PORT: int = int(os.getenv("REDIS_PORT", 6379))
+# REDIS_STORE_DB_ID: int = int(os.getenv("REDIS_STORE_DB_ID", 0))
+# REDIS_LEDGER_DB_ID: int = int(os.getenv("REDIS_LEDGER_DB_ID", 1))
+# STORE_DB_ID: int = int(os.getenv("STORE_DB_ID", 0))
+# LEDGER_DB_ID: int = int(os.getenv("LEDGER_DB_ID", 1))
+# NETWORK_CHECK_INTERVAL: int = int(os.getenv("NETWORK_CHECK_INTERVAL", 60))
+# DOMAIN_CHECK_INTERVAL: int = int(os.getenv("DOMAIN_CHECK_INTERVAL", 60))
 CONTAINER_HOST: str = str(os.getenv("CONTAINER_HOST", "docker"))
 MONGO_HOST: str = str(os.getenv("MONGO_HOST", ""))
-MONGO_PORT: int = int(os.getenv("MONGO_PORT", 0))
+MONGO_PORT: int = int(os.getenv("MONGO_PORT", 27017))
 MONGO_USERNAME: str = str(os.getenv("MONGO_USERNAME", ""))
 MONGO_PASSWORD: str = str(os.getenv("MONGO_PASSWORD", ""))
 DEV_MODE: bool = True if os.getenv("DEV_MODE", "false").lower() == "true" else False
 # ZMQ stuff
-QUEUE_PORT: int = int(os.getenv("QUEUE_PORT", 0))
+QUEUE_PORT: int = int(os.getenv("QUEUE_PORT", 5556))
 CREATE_PRODUCER: bool = (
 True if os.getenv("CREATE_PRODUCER", "false").lower() == "true" else False
 )
@@ -145,8 +145,8 @@ def get_emails_enabled(self) -> Self:
 EMAIL_SENDER: str = os.getenv("EMAIL_SENDER", "")
 SMTP_PASSWORD: str = os.getenv("SMTP_PASSWORD", "")
 SMTP_TLS: bool = True
-SMTP_PORT: str | None = os.getenv("SMTP_PORT", "")
-SMTP_HOST: str | None = os.getenv("SMTP_HOST", "")
+SMTP_PORT: int = int(os.getenv("SMTP_PORT", 587))
+SMTP_HOST: str = os.getenv("SMTP_HOST", "")

 TEST_MODE: bool = (
 True if os.getenv("TEST_MODE", "false").lower() == "true" else False
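All of these settings follow the same parse-from-environment pattern; here is a standalone sketch of that pattern under the new defaults. The `env_int`/`env_bool` helpers are illustrative only, not part of the actual Settings class:

```python
import os

# Integers come from int(os.getenv(...)) with a real default (27017, 5556, 587, ...)
# so an unset variable no longer yields 0 or an empty string; booleans come from a
# lowercase string comparison.

def env_int(name: str, default: int) -> int:
    return int(os.getenv(name, default))

def env_bool(name: str, default: str = "false") -> bool:
    return os.getenv(name, default).lower() == "true"

MONGO_PORT = env_int("MONGO_PORT", 27017)
QUEUE_PORT = env_int("QUEUE_PORT", 5556)
SMTP_PORT = env_int("SMTP_PORT", 587)  # was `str | None`, now parsed as an int
DEV_MODE = env_bool("DEV_MODE")
```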
3 changes: 2 additions & 1 deletion packages/grid/docker-compose.yml
@@ -299,7 +299,8 @@ services:

 volumes:
 credentials-data:
-# app-redis-data:
+labels:
+orgs.openmined.syft: "this is a syft credentials volume"
 seaweedfs-data:
 labels:
 orgs.openmined.syft: "this is a syft seaweedfs volume"
12 changes: 7 additions & 5 deletions packages/hagrid/hagrid/cli.py
@@ -2260,13 +2260,15 @@ def create_launch_docker_cmd(
 "NODE_SIDE_TYPE": kwargs["node_side_type"],
 "SINGLE_CONTAINER_MODE": single_container_mode,
 "INMEMORY_WORKERS": in_mem_workers,
-"SMTP_USERNAME": smtp_username,
-"SMTP_PASSWORD": smtp_password,
-"EMAIL_SENDER": smtp_sender,
-"SMTP_PORT": smtp_port,
-"SMTP_HOST": smtp_host,
 }

+if smtp_host and smtp_port and smtp_username and smtp_password:
+envs["SMTP_HOST"] = smtp_host
+envs["SMTP_PORT"] = smtp_port
+envs["SMTP_USERNAME"] = smtp_username
+envs["SMTP_PASSWORD"] = smtp_password
+envs["EMAIL_SENDER"] = smtp_sender
+
 if "trace" in kwargs and kwargs["trace"] is True:
 envs["TRACE"] = "True"
 envs["JAEGER_HOST"] = "host.docker.internal"
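A minimal sketch of the behaviour this hunk introduces, assuming the intent is to stop injecting half-empty SMTP settings into the container environment; the `build_smtp_envs` helper below is illustrative, not part of hagrid:

```python
def build_smtp_envs(
    smtp_host: str | None,
    smtp_port: str | None,
    smtp_username: str | None,
    smtp_password: str | None,
    smtp_sender: str | None,
) -> dict[str, str]:
    """Only pass SMTP settings through when a complete configuration is given,
    instead of always emitting the keys (possibly as empty strings)."""
    if smtp_host and smtp_port and smtp_username and smtp_password:
        return {
            "SMTP_HOST": smtp_host,
            "SMTP_PORT": smtp_port,
            "SMTP_USERNAME": smtp_username,
            "SMTP_PASSWORD": smtp_password,
            "EMAIL_SENDER": smtp_sender or "",
        }
    return {}

# No SMTP configured -> no SMTP keys end up in the container environment.
assert build_smtp_envs(None, None, None, None, None) == {}
```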
2 changes: 1 addition & 1 deletion packages/syft/setup.cfg
@@ -120,14 +120,14 @@ test_plugins =
pytest-asyncio
pytest-randomly
pytest-sugar
pytest_mock_resources
python_on_whales
Collaborator: MAYBE: I think we could also remove unused dependencies here, like

  1. python_on_whales
  2. lxml
  3. joblib (not fully sure if joblib is being used?)

Collaborator: and I think pytest-asyncio is also not used

Collaborator: agree

Collaborator: Let's address in a smaller follow-up PR.

pytest-lazy-fixture
pytest-rerunfailures
coverage
joblib
faker
lxml
distro

[options.entry_points]
console_scripts =
13 changes: 8 additions & 5 deletions packages/syft/src/syft/client/api.py
@@ -402,11 +402,13 @@ def generate_remote_lib_function(

 def wrapper(*args: Any, **kwargs: Any) -> SyftError | Any:
 # relative
-from ..service.action.action_object import TraceResult
+from ..service.action.action_object import TraceResultRegistry

-if TraceResult._client is not None:
-wrapper_make_call = TraceResult._client.api.make_call
-wrapper_node_uid = TraceResult._client.api.node_uid
+trace_result = TraceResultRegistry.get_trace_result_for_thread()
+
+if trace_result is not None:
+wrapper_make_call = trace_result.client.api.make_call # type: ignore
+wrapper_node_uid = trace_result.client.api.node_uid # type: ignore
 else:
 # somehow this is necessary to prevent shadowing problems
 wrapper_make_call = make_call
@@ -448,7 +450,8 @@ def wrapper(*args: Any, **kwargs: Any) -> SyftError | Any:
 )
 service_args = [action]
 # TODO: implement properly
-TraceResult.result += [action]
+if trace_result is not None:
+trace_result.result += [action]

 api_call = SyftAPICall(
 node_uid=wrapper_node_uid,
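The `TraceResultRegistry` referenced above replaces the old class-level `TraceResult._client`. A rough sketch of what a per-thread registry of trace results could look like — a sketch only; the real implementation in `syft.service.action.action_object` may differ, and the setter name here is assumed:

```python
import threading
from typing import Any


class TraceResult:
    """Simplified stand-in for syft's TraceResult model."""

    def __init__(self, client: Any = None) -> None:
        self.client = client
        self.result: list[Any] = []


class TraceResultRegistry:
    """One TraceResult per thread instead of a single class-level _client,
    so concurrent threads no longer clobber each other's trace state."""

    _registry: dict[int, TraceResult] = {}
    _lock = threading.Lock()

    @classmethod
    def set_trace_result_for_current_thread(cls, client: Any) -> TraceResult:
        with cls._lock:
            trace = TraceResult(client=client)
            cls._registry[threading.get_ident()] = trace
            return trace

    @classmethod
    def get_trace_result_for_thread(cls) -> TraceResult | None:
        with cls._lock:
            return cls._registry.get(threading.get_ident())
```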
6 changes: 5 additions & 1 deletion packages/syft/src/syft/node/node.py
@@ -315,7 +315,7 @@ def __init__(
 smtp_username: str | None = None,
 smtp_password: str | None = None,
 email_sender: str | None = None,
-smtp_port: str | None = None,
+smtp_port: int | None = None,
 smtp_host: str | None = None,
 ):
 # 🟡 TODO 22: change our ENV variable format and default init args to make this
@@ -716,6 +716,10 @@ def _find_klasses_pending_for_migration(
 object_version = object_type.__version__

 migration_state = migration_state_service.get_state(context, canonical_name)
+if isinstance(migration_state, SyftError):
+raise Exception(
+f"Failed to get migration state for {canonical_name}. Error: {migration_state}"
+)
 if (
 migration_state is not None
 and migration_state.current_version != migration_state.latest_version
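The added guard follows the errors-as-values style visible in this diff: `get_state` can return a `SyftError` instead of a state object, so the result must be checked before any attribute access. A simplified, self-contained illustration — this `SyftError` is a stand-in, not the real class:

```python
class SyftError:
    """Simplified stand-in for syft's SyftError response type."""

    def __init__(self, message: str) -> None:
        self.message = message

    def __repr__(self) -> str:
        return f"SyftError(message={self.message!r})"


def get_state(canonical_name: str):
    # Illustrative: pretend the migration-state lookup failed.
    return SyftError(message=f"no migration state found for {canonical_name}")


try:
    migration_state = get_state("ActionObject")
    if isinstance(migration_state, SyftError):
        # Without this check, the attribute access below would raise an
        # AttributeError and hide the real cause of the failure.
        raise RuntimeError(
            f"Failed to get migration state for ActionObject. Error: {migration_state}"
        )
    print(migration_state.current_version)
except RuntimeError as e:
    print(e)  # clear failure message instead of a confusing AttributeError
```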
32 changes: 16 additions & 16 deletions packages/syft/src/syft/protocol/protocol_version.json
@@ -23,7 +23,7 @@
},
"3": {
"version": 3,
"hash": "18785a4cce6f25f1900b82f30acb2298b4afeab92bd00d0be358cfbf5a93d97e",
"hash": "37bb8f0f87b1da2525da8f6873e6257dff4a732f2dba293b62931ad0b85ef9e2",
"action": "add"
}
},
@@ -40,7 +40,7 @@
},
"3": {
"version": 3,
"hash": "4fd4c5b29e395b7a1af3b820166e69af7f267b6e3234fb8329bd0d74adc6e828",
"hash": "7c55461e3c6ba36ff999c64eb1b97a65b5a1f27193a973b1355ee2675f14c313",
"action": "add"
}
},
@@ -52,7 +52,7 @@
},
"2": {
"version": 2,
"hash": "1b04f527fdabaf329786b6bb38209f6ca82d622fe691d33c47ed1addccaaac02",
"hash": "1ab941c7669572a41067a17e0e3f2d9c7056f7a4df8f899e87ae2358d9113b02",
"action": "add"
}
},
@@ -148,7 +148,7 @@
},
"3": {
"version": 3,
"hash": "5922c1253370861185c53161ad31e488319f46ea5faee2d1802ca94657c428dc",
"hash": "709dc84a946267444a3f9968acf4a5e9807d6aa5143626c3fb635c9282108cc1",
"action": "add"
}
},
@@ -165,7 +165,7 @@
},
"3": {
"version": 3,
"hash": "dbb72f43add3141d13a76e18a2a0903a6937966632f0def452ca264f3f70d81b",
"hash": "5e84c9905a1816d51c0dfb1eedbfb4d831095ca6c89956c6fe200c2a193cbb8f",
"action": "add"
}
},
@@ -182,7 +182,7 @@
},
"3": {
"version": 3,
"hash": "cf831130f66f9addf8f68a8c9df0b67775e53322c8a32e8babc7f21631845608",
"hash": "bf936c1923ceee4def4cded06d41766998ea472322b0738bade7b85298e469da",
"action": "add"
}
},
@@ -199,7 +199,7 @@
},
"3": {
"version": 3,
"hash": "78334b746e5230ac156e47960e91ce449543d1a77a62d9b8be141882e4b549aa",
"hash": "daf3629fb7d26f41f96cd7f9200d7327a4b74d800b3e02afa75454d11bd47d78",
"action": "add"
}
},
@@ -216,7 +216,7 @@
},
"3": {
"version": 3,
"hash": "0007e86c39ede0f5756ba348083f809c5b6e3bb3a0a9ed6b94570d808467041f",
"hash": "4747a220d1587e99e6ac076496a2aa7217e2700205ac80fc24fe4768a313da78",
"action": "add"
}
},
@@ -300,7 +300,7 @@
},
"2": {
"version": 2,
"hash": "9eaed0a784525dea0018d95de74d70ed212f20f6ead2b50c66e59467c42bbe68",
"hash": "b35897295822f061fbc70522ca8967cd2be53a5c01b19e24c587cd7b0c4aa3e8",
"action": "add"
}
},
@@ -574,7 +574,7 @@
},
"4": {
"version": 4,
"hash": "077987cfc94d617f746f27fb468210330c328bad06eee09a89226759e5745a5f",
"hash": "c37bc1c6303c467050ce4f8faa088a2f66ef1781437ffe34f15aadf5477ac25b",
"action": "add"
}
},
@@ -608,7 +608,7 @@
},
"3": {
"version": 3,
"hash": "8a8e721a4ca8aa9107403368851acbe59f8d7bdc1eeff0ff101a44e325a058ff",
"hash": "4159d6ea45bc82577828bc19d668196422ff29bb8cc298b84623e6f4f476aaf3",
"action": "add"
}
},
@@ -630,7 +630,7 @@
},
"4": {
"version": 4,
"hash": "9b0dd1a64d64b1e824746e93aae0ca14863d2430aea2e2a758945edbfcb79bc9",
"hash": "dae431b87cadacfd30613519b5dd25d2e4ff59d2a971e21a31d56901103b9420",
"action": "add"
}
},
@@ -659,7 +659,7 @@
},
"2": {
"version": 2,
"hash": "6cd89ed24027ed94b3e2bb7a07e8932060e07e481ceb35eb7ee4d2d0b6e34f43",
"hash": "bc4bbe67d75d5214e79ff57077dac5762bba98760e152f9613a4f8975488d960",
"action": "add"
}
},
@@ -1237,7 +1237,7 @@
},
"2": {
"version": 2,
"hash": "747c87b947346fb0fc0466a912e2dc743ee082ef6254079176349d6b63748c32",
"hash": "93c75b45b9b74c69243cc2f2ef2d661e11eef5c23ecf71692ffdbd467d11efe6",
"action": "add"
}
},
@@ -1525,7 +1525,7 @@
},
"2": {
"version": 2,
"hash": "ac452023b98534eb13cb99a86fa7e379c08316353fc0837d1b788e0050e13ab9",
"hash": "24b7c302f9821afe073534d4ed02c377bd4f7cb691f66ca92b94c38c92dc78c2",
"action": "add"
}
},
@@ -1537,7 +1537,7 @@
},
"2": {
"version": 2,
"hash": "c9fdefdc622131c3676243aafadc30b7e67ee155793791bf1000bf742c1a251a",
"hash": "6d2e2f64c00dcda74a2545c77abbcf1630c56c26014987038feab174d15bd9d7",
"action": "add"
}
},
4 changes: 3 additions & 1 deletion packages/syft/src/syft/service/action/action_graph.py
@@ -227,7 +227,9 @@ def _thread_safe_cbk(
 # TODO copied method from document_store, have it in one place and reuse?
 locked = self.lock.acquire(blocking=True)
 if not locked:
-return Err("Failed to acquire lock for the operation")
+return Err(
+f"Failed to acquire lock for the operation {self.lock.lock_name} ({self.lock._lock})"
+)
 try:
 result = cbk(*args, **kwargs)
 except BaseException as e:
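For context, a self-contained sketch of the lock-guarded callback pattern this hunk touches, using a plain `threading.Lock` with a timeout instead of syft's lock classes; names here are illustrative:

```python
import threading
from typing import Any, Callable


class ThreadSafeStore:
    """Sketch of the _thread_safe_cbk pattern: take a lock, run the callback,
    always release, and return a descriptive error when the lock can't be taken."""

    def __init__(self, name: str = "action-graph") -> None:
        self.lock_name = name
        self._lock = threading.Lock()

    def _thread_safe_cbk(self, cbk: Callable[..., Any], *args: Any, **kwargs: Any) -> Any:
        locked = self._lock.acquire(timeout=5.0)
        if not locked:
            # Naming the lock (and its underlying object) in the message makes
            # lock-acquisition failures much easier to debug in test runs.
            return f"Failed to acquire lock for the operation {self.lock_name} ({self._lock})"
        try:
            return cbk(*args, **kwargs)
        finally:
            self._lock.release()


store = ThreadSafeStore()
print(store._thread_safe_cbk(lambda x: x + 1, 41))  # -> 42
```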