Add ability to post entries via REST API rather than uploading csv file #51
Comments
Downloading files to Home Assistant would be more of an HA feature than a feature of this integration. Would it help if you could supply a URL instead of a filename, and the integration read the file from that URL? But that would probably raise security issues (password or token needed ...).
Maybe you can use the native 'downloader' integration to retrieve your file, then this integration to ingest the data into the HA statistics: https://www.home-assistant.io/integrations/downloader/ Both actions should be integrable into an HA script or an HA automation.
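The `downloader.download_file` service and its `url` field come from the downloader integration's documentation; the import service name, address, and token below are placeholders. A rough sketch of driving that two-step chain from outside Home Assistant via its REST services API:

```python
import json
import urllib.request

# Assumptions: HA_ADDRESS and HA_TOKEN are placeholders for your instance.
HA_ADDRESS = "homeassistant.local:8123"
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

def service_request(domain: str, service: str, data: dict) -> urllib.request.Request:
    """Build a POST request against the Home Assistant REST services endpoint."""
    return urllib.request.Request(
        f"http://{HA_ADDRESS}/api/services/{domain}/{service}",
        data=json.dumps(data).encode(),
        headers={
            "Authorization": f"Bearer {HA_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Step 1: let the native downloader integration fetch the file.
download = service_request("downloader", "download_file",
                           {"url": "https://example.com/stats.csv"})
# Step 2: ingest the downloaded file with this integration; the service name
# below is a guess, check the integration's documentation for the real one.
# ingest = service_request("import_statistics", "import_from_file",
#                          {"filename": "stats.csv"})
# urllib.request.urlopen(download)
```

The same pair of calls could equally be wrapped in an HA script or automation, as suggested above.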
@bastgau, can the URL also be a local file? So, file://... instead of http://... ?
Only the http protocol, I guess.
I added a (janky) registered method:

```python
def handle_import_from_json(call: ServiceCall) -> None:
    entities = call.data.get('entities', [])
    timezone = zoneinfo.ZoneInfo(call.data.get('timezone', 'America/Los_Angeles'))
    _LOGGER.info(f"Got {len(entities)} entities")
    for entity in entities:
        statistic_id = entity['id']
        values = entity['values']
        _LOGGER.info(f"Processing entity with id: {statistic_id} with {len(values)} values")
        hass_entity = hass.states.get(statistic_id)
        if hass_entity is None:
            _handle_error(f"Entity does not exist: '{statistic_id}'")
        metadata = {
            "has_mean": False,
            "has_sum": True,
            "source": _get_source(statistic_id),
            "statistic_id": statistic_id,
            "name": None,
            "unit_of_measurement": hass_entity.attributes["unit_of_measurement"],
        }
        statistics = [{
            "start": datetime.strptime(value["datetime"], "%Y-%m-%d %H:%M").replace(tzinfo=timezone),
            "sum": value["value"],
            "state": value["value"],
        } for value in values]
        async_import_statistics(hass, metadata, statistics)

hass.services.register(DOMAIN, "import_from_json", handle_import_from_json)
```

You can then call the service with curl like this:

```shell
curl -s -X POST -H "Authorization: Bearer ${HA_TOKEN}" --data @test.json "http://${HA_ADDRESS}/api/services/import_statistics/import_from_json"
```

where `test.json` is:

```json
{
  "timezone": "America/Los_Angeles",
  "entities": [
    {
      "id": "sensor.testing",
      "values": [
        {
          "value": 10.0,
          "datetime": "2024-09-13 00:00"
        }
      ]
    }
  ]
}
```

Note: this doesn't do the max, min, mean stuff and sets the
@GroveJay, great, now I know how I can call a service via REST. I'll combine this with my code in some time ...
Is your feature request related to a problem? Please describe.
I have a separate program that can generate statistics, and I would like to upload its output directly to HA rather than first having to save the output as a CSV file and then upload it to HA using your integration.
Describe the solution you'd like
How hard would it be to extend your code to expose a REST API where the new statistics could, for example, be JSON encoded?
Describe alternatives you've considered
Additional context
This would be very useful in my application where my water supplier posts hourly consumption data in a batch about once a day.