
Create v2.1.1 Release #786


Merged: 31 commits, Apr 25, 2025

Commits:
85e6f9f  Fix suppress errors test (wk9874, Apr 4, 2025)
4ef09d9  Improved ecoclient tests, changed token to Secretstr (wk9874, Apr 4, 2025)
2890d43  Refactor to use new electricitymaps API endpoints (wk9874, Apr 4, 2025)
8d04fea  Move sys step to be under update sys metrics if (wk9874, Apr 4, 2025)
520d583  Move sys step to be under update sys metrics if (wk9874, Apr 4, 2025)
5f047de  Fix count limit when retrieving objects (kzscisoft, Apr 7, 2025)
c98d9f0  Merge pull request #774 from simvue-io/hotfix/fix-obj-get-count (wk9874, Apr 7, 2025)
af560ec  Merge pull request #773 from simvue-io/hotfix/fix_resource_metrics_name (wk9874, Apr 7, 2025)
a333e9e  Fix tests (kzscisoft, Apr 7, 2025)
9178212  Merge branch 'v2.1' of github.com:simvue-io/python-api into v2.1 (kzscisoft, Apr 7, 2025)
148d2c6  [pre-commit.ci] pre-commit autoupdate (pre-commit-ci[bot], Apr 7, 2025)
4e67a3f  Merge pull request #776 from simvue-io/pre-commit-ci-update-config (kzscisoft, Apr 8, 2025)
d167935  Fixed units (wk9874, Apr 10, 2025)
1aada8a  Fix tests (wk9874, Apr 10, 2025)
d9295d9  Change default interval to 1hr (wk9874, Apr 15, 2025)
f226f04  Fix sender usage of eco (wk9874, Apr 15, 2025)
a17f422  Fix events get (wk9874, Apr 16, 2025)
0f2b275  Fix get_paginated for metrics and set offline dir to tmp for all unit… (wk9874, Apr 17, 2025)
3c6faa6  Lots of fixes (wk9874, Apr 17, 2025)
d4119b0  more fixes (wk9874, Apr 22, 2025)
2429cdd  Got rid of ecoclient specific local data dir in favour of just using … (wk9874, Apr 22, 2025)
fd5a224  Fixed offline artifact tests (wk9874, Apr 22, 2025)
a835d0f  Make offline cache dir in configuration (wk9874, Apr 22, 2025)
86c35c5  Fix visibility and change show_shared to true by default (wk9874, Apr 23, 2025)
b0663ad  Merge pull request #778 from simvue-io/wk9874/fix_eco_units (james-panayis, Apr 25, 2025)
1257764  change username for visibility tests (wk9874, Apr 25, 2025)
3d33c93  Merge branch 'dev' into wk9874/fix_visibility (wk9874, Apr 25, 2025)
f434098  Fix get_runs filters (wk9874, Apr 25, 2025)
f1bde56  Merge pull request #783 from simvue-io/wk9874/fix_visibility (james-panayis, Apr 25, 2025)
0d387bf  Update files for 2.1.1 release (wk9874, Apr 25, 2025)
5ce8092  Merge pull request #784 from simvue-io/v2.1.1 (wk9874, Apr 25, 2025)
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml

@@ -23,7 +23,7 @@ repos:
         args: [--branch, main, --branch, dev]
       - id: check-added-large-files
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.11.2
+    rev: v0.11.4
     hooks:
       - id: ruff
         args: [ --fix, --exit-non-zero-on-fix, "--ignore=C901" ]
7 changes: 7 additions & 0 deletions CHANGELOG.md

@@ -1,4 +1,11 @@
 # Change log
+## [v2.1.1](https://github.com/simvue-io/client/releases/tag/v2.1.1) - 2025-04-25
+* Changed from CO2 Signal to ElectricityMaps
+* Fixed a number of bugs in how offline mode is handled with emissions
+* Streamlined EmissionsMonitor class and handling
+* Fixed bugs in client getting results from Simvue server arising from pagination
+* Fixed bug in setting visibility in `run.init` method
+* Default setting in `Client.get_runs` is now `show_shared=True`
 ## [v2.1.0](https://github.com/simvue-io/client/releases/tag/v2.1.0) - 2025-03-28
 * Removed CodeCarbon dependence in favour of a slimmer solution using the CO2 Signal API.
 * Added sorting to server queries, users can now specify to sort by columns during data retrieval from the database.
6 changes: 3 additions & 3 deletions CITATION.cff

@@ -42,6 +42,6 @@ keywords:
   - alerting
   - simulation
 license: Apache-2.0
-commit: 8f13a7adb2ad0ec53f0a4949e44e1c5676ae342d
-version: 2.1.0
-date-released: '2025-03-28'
+commit: f1bde5646b33f01ec15ef72a0c5843c1fe181ac1
+version: 2.1.1
+date-released: '2025-04-25'
2 changes: 1 addition & 1 deletion pyproject.toml

@@ -1,6 +1,6 @@
 [project]
 name = "simvue"
-version = "2.1.0"
+version = "2.1.1"
 description = "Simulation tracking and monitoring"
 authors = [
     {name = "Simvue Development Team", email = "[email protected]"}
8 changes: 4 additions & 4 deletions simvue/api/objects/base.py

@@ -365,7 +365,7 @@ def ids(
         """
         _class_instance = cls(_read_only=True, _local=True)
         _count: int = 0
-        for response in cls._get_all_objects(offset):
+        for response in cls._get_all_objects(offset, count=count):
             if (_data := response.get("data")) is None:
                 raise RuntimeError(
                     f"Expected key 'data' for retrieval of {_class_instance.__class__.__name__.lower()}s"
@@ -404,7 +404,7 @@ def get(
         """
         _class_instance = cls(_read_only=True, _local=True)
         _count: int = 0
-        for _response in cls._get_all_objects(offset, **kwargs):
+        for _response in cls._get_all_objects(offset, count=count, **kwargs):
             if count and _count > count:
                 return
             if (_data := _response.get("data")) is None:
@@ -438,7 +438,7 @@ def count(cls, **kwargs) -> int:

     @classmethod
     def _get_all_objects(
-        cls, offset: int | None, **kwargs
+        cls, offset: int | None, count: int | None, **kwargs
     ) -> typing.Generator[dict, None, None]:
         _class_instance = cls(_read_only=True)
         _url = f"{_class_instance._base_url}"
@@ -448,7 +448,7 @@ def _get_all_objects(
             _label = _label[:-1]

         for response in get_paginated(
-            _url, headers=_class_instance._headers, offset=offset, **kwargs
+            _url, headers=_class_instance._headers, offset=offset, count=count, **kwargs
         ):
             yield get_json_from_response(
                 response=response,
2 changes: 1 addition & 1 deletion simvue/api/objects/events.py

@@ -51,7 +51,7 @@ def get(
         _class_instance = cls(_read_only=True, _local=True)
         _count: int = 0

-        for response in cls._get_all_objects(offset, run=run_id, **kwargs):
+        for response in cls._get_all_objects(offset, count=count, run=run_id, **kwargs):
             if (_data := response.get("data")) is None:
                 raise RuntimeError(
                     f"Expected key 'data' for retrieval of {_class_instance.__class__.__name__.lower()}s"
25 changes: 11 additions & 14 deletions simvue/api/request.py

@@ -281,6 +281,7 @@ def get_paginated(
     timeout: int = DEFAULT_API_TIMEOUT,
     json: dict[str, typing.Any] | None = None,
     offset: int | None = None,
+    count: int | None = None,
     **params,
 ) -> typing.Generator[requests.Response, None, None]:
     """Paginate results of a server query.
@@ -302,22 +303,18 @@
         server response
     """
     _offset: int = offset or 0

     while (
-        (
-            _response := get(
-                url=url,
-                headers=headers,
-                params=(params or {})
-                | {"count": MAX_ENTRIES_PER_PAGE, "start": _offset},
-                timeout=timeout,
-                json=json,
-            )
-        )
-        .json()
-        .get("data")
-    ):
+        _response := get(
+            url=url,
+            headers=headers,
+            params=(params or {})
+            | {"count": count or MAX_ENTRIES_PER_PAGE, "start": _offset},
+            timeout=timeout,
+            json=json,
+        )
+    ).json():
         yield _response
         _offset += MAX_ENTRIES_PER_PAGE

-    yield _response
+        if (count and _offset > count) or (_response.json().get("count", 0) < _offset):
+            break
13 changes: 8 additions & 5 deletions simvue/client.py

@@ -181,7 +181,7 @@ def get_runs(
         output_format: typing.Literal["dict", "objects", "dataframe"] = "objects",
         count_limit: pydantic.PositiveInt | None = 100,
         start_index: pydantic.NonNegativeInt = 0,
-        show_shared: bool = False,
+        show_shared: bool = True,
         sort_by_columns: list[tuple[str, bool]] | None = None,
     ) -> DataFrame | typing.Generator[tuple[str, Run], None, None] | None:
         """Retrieve all runs matching filters.
@@ -210,7 +210,7 @@
         start_index : int, optional
             the index from which to count entries. Default is 0.
         show_shared : bool, optional
-            whether to include runs shared with the current user. Default is False.
+            whether to include runs shared with the current user. Default is True.
         sort_by_columns : list[tuple[str, bool]], optional
             sort by columns in the order given,
             list of tuples in the form (column_name: str, sort_descending: bool),
@@ -234,8 +234,9 @@
         RuntimeError
             if there was a failure in data retrieval from the server
         """
+        filters = filters or []
         if not show_shared:
-            filters = (filters or []) + ["user == self"]
+            filters += ["user == self"]

         _runs = Run.get(
             count=count_limit,
@@ -835,7 +836,8 @@ def get_metric_values(

         _args = {"filters": json.dumps(run_filters)} if run_filters else {}

-        _run_data = dict(Run.get(**_args))
+        if not run_ids:
+            _run_data = dict(Run.get(**_args))

         if not (
             _run_metrics := self._get_run_metrics_from_server(
@@ -853,7 +855,8 @@
             )
         if use_run_names:
             _run_metrics = {
-                _run_data[key].name: _run_metrics[key] for key in _run_metrics.keys()
+                Run(identifier=key).name: _run_metrics[key]
+                for key in _run_metrics.keys()
             }
         return parse_run_set_metrics(
             _run_metrics,
1 change: 1 addition & 0 deletions simvue/config/user.py

@@ -200,6 +200,7 @@ def fetch(
             _default_dir = _config_dict["offline"].get(
                 "cache", DEFAULT_OFFLINE_DIRECTORY
             )
+            pathlib.Path(_default_dir).mkdir(parents=True, exist_ok=True)

             _config_dict["offline"]["cache"] = _default_dir
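The added line ensures the offline cache directory exists as soon as the configuration is fetched. Because `mkdir(parents=True, exist_ok=True)` is idempotent, repeating it on every fetch is safe — a quick sketch with a hypothetical cache path:

```python
import pathlib
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # stand-in for the configured offline cache location
    cache = pathlib.Path(tmp) / "simvue" / "offline"
    cache.mkdir(parents=True, exist_ok=True)  # creates any missing parents
    cache.mkdir(parents=True, exist_ok=True)  # repeat call is a harmless no-op
    created = cache.is_dir()
```

Without `exist_ok=True` the second call would raise `FileExistsError`; without `parents=True` the first would fail because `simvue/` does not yet exist.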
32 changes: 11 additions & 21 deletions simvue/eco/api_client.py

@@ -19,38 +19,33 @@
 import geocoder.location
 import typing

-CO2_SIGNAL_API_ENDPOINT: str = "https://api.co2signal.com/v1/latest"
+CO2_SIGNAL_API_ENDPOINT: str = (
+    "https://api.electricitymap.org/v3/carbon-intensity/latest"
+)


 class CO2SignalData(pydantic.BaseModel):
     datetime: datetime.datetime
     carbon_intensity: float
-    fossil_fuel_percentage: float


 class CO2SignalResponse(pydantic.BaseModel):
-    disclaimer: str
     country_code: str
-    status: str
     data: CO2SignalData
     carbon_intensity_units: str

     @classmethod
     def from_json_response(cls, json_response: dict) -> "CO2SignalResponse":
-        _data: dict[str, typing.Any] = json_response["data"]
         _co2_signal_data = CO2SignalData(
             datetime=datetime.datetime.fromisoformat(
-                _data["datetime"].replace("Z", "+00:00")
+                json_response["datetime"].replace("Z", "+00:00")
             ),
-            carbon_intensity=_data["carbonIntensity"],
-            fossil_fuel_percentage=_data["fossilFuelPercentage"],
+            carbon_intensity=json_response["carbonIntensity"],
         )
         return cls(
-            disclaimer=json_response["_disclaimer"],
-            country_code=json_response["countryCode"],
-            status=json_response["status"],
+            country_code=json_response["zone"],
             data=_co2_signal_data,
-            carbon_intensity_units=json_response["units"]["carbonIntensity"],
+            carbon_intensity_units="gCO2e/kWh",
         )


@@ -82,18 +77,15 @@ def __init__(self, *args, **kwargs) -> None:
         co2_api_endpoint : str
             endpoint for CO2 signal API
         co2_api_token: str
-            RECOMMENDED. The API token for the CO2 Signal API, default is None.
+            The API token for the ElectricityMaps API, default is None.
         timeout : int
             timeout for API
         """
         super().__init__(*args, **kwargs)
         self._logger = logging.getLogger(self.__class__.__name__)

         if not self.co2_api_token:
-            self._logger.warning(
-                "⚠️ No API token provided for CO2 Signal, "
-                "use of a token is strongly recommended."
-            )
+            raise ValueError("API token is required for ElectricityMaps API.")

         self._get_user_location_info()

@@ -109,16 +101,14 @@ def _get_user_location_info(self) -> None:
     def get(self) -> CO2SignalResponse:
         """Get the current data"""
         _params: dict[str, float | str] = {
-            "lat": self._latitude,
-            "lon": self._longitude,
-            "countryCode": self._two_letter_country_code,
+            "zone": self._two_letter_country_code,
         }

         if self.co2_api_token:
             _params["auth-token"] = self.co2_api_token.get_secret_value()

         self._logger.debug(f"🍃 Retrieving carbon intensity data for: {_params}")
-        _response = requests.get(f"{self.co2_api_endpoint}", params=_params)
+        _response = requests.get(f"{self.co2_api_endpoint}", headers=_params)

         if _response.status_code != http.HTTPStatus.OK:
             try:
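The parsing change can be seen in miniature below. The payload is illustrative only — its field names (`zone`, `carbonIntensity`, `datetime`) follow those the updated `from_json_response` reads, not a captured server response:

```python
import datetime

# illustrative payload; the values are made up
sample_payload = {
    "zone": "GB",
    "carbonIntensity": 112,
    "datetime": "2025-04-25T12:00:00Z",
}

parsed = {
    # the new API reports the zone at the top level, no nested "data" key
    "country_code": sample_payload["zone"],
    "carbon_intensity": float(sample_payload["carbonIntensity"]),
    # normalise the trailing 'Z' so fromisoformat accepts it on older Pythons
    "datetime": datetime.datetime.fromisoformat(
        sample_payload["datetime"].replace("Z", "+00:00")
    ),
    # units are now hard-coded by the client rather than read from the response
    "carbon_intensity_units": "gCO2e/kWh",
}
```

This mirrors the flattening in the diff: the old CO2 Signal response nested readings under `data` and carried `disclaimer`/`status` fields, none of which exist in the new payload.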
23 changes: 1 addition & 22 deletions simvue/eco/config.py

@@ -8,10 +8,6 @@
 __date__ = "2025-03-06"

 import pydantic
-import pathlib
-import os
-
-from simvue.config.files import DEFAULT_OFFLINE_DIRECTORY


 class EcoConfig(pydantic.BaseModel):
@@ -25,30 +21,13 @@ class EcoConfig(pydantic.BaseModel):
         the TDP for the CPU
     gpu_thermal_design_power: int | None, optional
         the TDP for each GPU
-    local_data_directory: str, optional
-        the directory to store local data, default is Simvue offline directory
     """

     co2_signal_api_token: pydantic.SecretStr | None = None
     cpu_thermal_design_power: pydantic.PositiveInt | None = None
     cpu_n_cores: pydantic.PositiveInt | None = None
     gpu_thermal_design_power: pydantic.PositiveInt | None = None
-    local_data_directory: pydantic.DirectoryPath | None = pydantic.Field(
-        None, validate_default=True
-    )
     intensity_refresh_interval: pydantic.PositiveInt | str | None = pydantic.Field(
-        default="1 day", gt=2 * 60
+        default="1 hour", gt=2 * 60
     )
     co2_intensity: float | None = None
-
-    @pydantic.field_validator("local_data_directory", mode="before", check_fields=True)
-    @classmethod
-    def check_local_data_env(
-        cls, local_data_directory: pathlib.Path | None
-    ) -> pathlib.Path:
-        if _data_directory := os.environ.get("SIMVUE_ECO_DATA_DIRECTORY"):
-            return pathlib.Path(_data_directory)
-        if not local_data_directory:
-            local_data_directory = pathlib.Path(DEFAULT_OFFLINE_DIRECTORY)
-        local_data_directory.mkdir(exist_ok=True, parents=True)
-        return local_data_directory