Added NDBC as a data source #146
Conversation
@abdu558 please go over the failed tests and make sure we pass the basic tests. Make sure the code works for the oldest supported Python (in this case, 3.9).
hi Abdu and welcome :) The CI is failing due to mypy. Chances are that you haven't set up your dev environment properly. Check here: https://github.com/oceanmodeling/searvey?tab=readme-ov-file#development and run the setup steps described there.
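The linked README is the authoritative reference for the development setup. As a generic sketch of a poetry + pre-commit based environment (the commands below are assumptions, not quoted from the searvey docs):

```bash
# Assumed setup for a poetry + pre-commit project; see the searvey README
# "Development" section for the actual steps.
python -m pip install --user pipx
pipx install poetry
pipx install pre-commit

# Install the project and its dependencies into a virtual environment.
poetry install

# Register the git hooks so checks like mypy and the poetry export hooks
# run locally on every commit instead of only on the GitHub CI.
pre-commit install
```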
Please convert to a draft PR.
Okay, thank you. Do you know how to fix these two below?
poetry export dev........................................................Failed
Hi @abdu558
Since you haven't committed the changes in the pyproject/poetry files, you need to either drop the changes in the requirements files or commit the corresponding changes in the pyproject/poetry files, too.
@abdu558 to avoid having to do this every time on the GitHub CI, you should run the same checks locally. We've automated things in the Makefile. Once pre-commit is installed, it will run all the steps of the pre-commit configuration whenever you commit your changes. Some of the hooks automatically modify files (like the ones that regenerate the requirements files).
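As an illustration of that workflow (a sketch: `pre-commit run --all-files` and the git commands are standard tooling; the specific hook behaviour is inferred from the CI output quoted in this thread), the hooks can also be triggered by hand and their automatic edits staged afterwards:

```bash
# Run every configured hook against the whole repository, not just the
# files staged for the current commit.
pre-commit run --all-files

# Hooks like "poetry export" rewrite generated files (e.g. the requirements
# files); stage whatever they modified and commit again.
git add -u
git commit -m "Regenerate requirements via pre-commit hooks"
```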
I was trying to add the new pyproject/poetry files, but the commits kept getting rejected by pre-commit.
Thank you, I did set it up, but it would just reject my commits every time with:

```
poetry export main.......................................................Failed
```

This `poetry export main` failure is still happening after I added the dependency to poetry. I thought about regenerating the lock file completely with `poetry update`, but that results in IOC errors/incompatibilities, so I'm not really sure how to proceed.
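A possible way out, sketched under assumptions (the `<new-package>` placeholder is hypothetical, and the exact export flags/output file names used by the repository's hooks may differ), is to add the dependency through poetry, re-export the requirements files the same way the hooks do, and commit everything together:

```bash
# Declare the new dependency in pyproject.toml and update poetry.lock for
# that package only, avoiding a full `poetry update` (which was pulling in
# incompatible versions).
poetry add <new-package>

# Regenerate the requirements files; the flags/output paths below are an
# assumption and may differ from the repository's pre-commit configuration.
poetry export -f requirements.txt --output requirements.txt
poetry export -f requirements.txt --with dev --output requirements-dev.txt

# Commit the source of truth together with the generated files.
git add pyproject.toml poetry.lock requirements.txt requirements-dev.txt
git commit -m "Add <new-package> dependency"
```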
@abdu558 I was able to update the requirements files and commit on my local machine after pulling your PR commits. We can resolve this in a quick meeting, or tomorrow during the group meeting.
Yea, tomorrow would be good.
(force-pushed from 9357962 to 18b6a92)
@abdu558 the resolved start and end times are causing a problem. It seems that a change along these lines is needed:

```diff
-    start_date = _resolve_start_date(now, start_date)
-    end_date = _resolve_end_date(now, end_date)
     # Prepare arguments for each function call
     func_kwargs = [
         {
             "station_id": station_id,
             "mode": mode,
-            "start_time": start_date[0],
-            "end_time": end_date[0],
+            "start_time": _resolve_start_date(now, start_date)[0],
+            "end_time": _resolve_end_date(now, end_date)[0],
             "columns": columns,
         }
```

For the result kwarg dictionary I'm not sure what to do... we seem to be doing something similar in the case of IOC:

```python
for result in ioc_responses:
    station_id = result.kwargs["station_id"]  # type: ignore[index]
```

@pmav99 what do you think?
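For context, here is a minimal self-contained sketch of the pattern under discussion, i.e. keeping the per-call kwargs attached to each result so that `result.kwargs["station_id"]` can be recovered later. This is not searvey's actual helper: the `Result` container and the `run_concurrently` function are made up for illustration.

```python
from __future__ import annotations

import concurrent.futures
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Result:
    # Keep the original call arguments next to the return value so that
    # downstream code can recover e.g. result.kwargs["station_id"].
    kwargs: dict[str, Any]
    value: Any = None


def run_concurrently(
    func: Callable[..., Any],
    func_kwargs: list[dict[str, Any]],
) -> list[Result]:
    """Call func(**kwargs) for each kwargs dict, keeping the kwargs around."""
    results: list[Result] = []
    with concurrent.futures.ThreadPoolExecutor() as executor:
        future_to_kwargs = {executor.submit(func, **kw): kw for kw in func_kwargs}
        for future in concurrent.futures.as_completed(future_to_kwargs):
            results.append(Result(kwargs=future_to_kwargs[future], value=future.result()))
    return results


if __name__ == "__main__":
    func_kwargs = [{"station_id": "41001"}, {"station_id": "42002"}]
    for result in run_concurrently(lambda station_id: len(station_id), func_kwargs):
        print(result.kwargs["station_id"], result.value)
```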
@pmav99 we rebased everything and the notebooks are also updated. All the tests pass now. Can we go ahead with the merge?
I don't think it is ready yet. For starters, I would expect the API to follow the conventions we used in #125, but that doesn't seem to be the case.
@pmav99 you're right, sorry I missed it!! @abdu558 please review the IOC API and make sure the NDBC API follows the same naming convention, as well as similar arguments (in the cases where that makes sense). The current corresponding functions you implemented are:
I think the rest of the arguments are not as important in this case, but it would be ideal to add them as well. @pmav99 do you see anything else that you'd like to be changed before going for the merge?
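As a rough illustration of the requested symmetry (hypothetical stubs: the authoritative names and argument lists are the IOC functions introduced in #125; only `station_id`, `start_date` and `end_date` are taken from this thread):

```python
from __future__ import annotations

import pandas as pd

# Hypothetical stubs: the point is the shared naming convention and argument
# names between the IOC and NDBC APIs, not the implementations.


def fetch_ioc_station(
    station_id: str,
    start_date: pd.Timestamp | None = None,
    end_date: pd.Timestamp | None = None,
) -> pd.DataFrame:
    """Return the data of a single IOC station as a DataFrame."""
    raise NotImplementedError


def fetch_ndbc_station(
    station_id: str,
    start_date: pd.Timestamp | None = None,
    end_date: pd.Timestamp | None = None,
) -> pd.DataFrame:
    """Mirror the IOC naming and argument conventions for NDBC."""
    raise NotImplementedError
```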
Alright, sounds good.
(force-pushed from a5af5d7 to 46f5506)
Considering your last observations, especially #161, which would benefit EMC's work (ping @AliS-Noaa), I think we still need to implement the Historical data part and, if we can, reuse parts of the Realtime implementation. Also, I think we could leverage these 2 functions:
I think it'd be better to review all the current GSoC PRs/branches and merge them first, then start thinking about adding the missing pieces (unless the missing pieces make the PR useless). While I'd like everything to be complete before merging, realistically I don't think we have the capacity to do so, so we have to do it iteratively. I'm still reviewing the missing pieces of NDBC and creating issues!