✨ adding docker-api-proxy service ⚠️ #7070

Open · wants to merge 49 commits into base: master

Commits (changes shown from 36 of 49 commits):
3398e50  added docker-api-proxy with integration tests (Jan 23, 2025)
47a1b9b  Merge remote-tracking branch 'upstream/master' into pr-osparc-docker-… (Jan 23, 2025)
8a4c101  refactor tests (Jan 23, 2025)
47eb688  refactor (Jan 23, 2025)
c4f8276  using docker IP (Jan 23, 2025)
7ab73c9  making test mandatory (Jan 23, 2025)
ff6539e  Merge remote-tracking branch 'upstream/master' into pr-osparc-docker-… (Jan 23, 2025)
9eeaee9  giving service more time to start (Jan 23, 2025)
5bb62ea  added secure flag to settings (Jan 23, 2025)
92b9086  removed unused (Jan 23, 2025)
714dbfc  renamed (Jan 23, 2025)
b080606  fixed dependencies (Jan 23, 2025)
6ffef9e  update healthcheck and remove logs (Jan 23, 2025)
333ac43  trigger test on any package change (Jan 23, 2025)
36b54fa  refactor (Jan 23, 2025)
da0d8da  rename (Jan 23, 2025)
10cd8ea  rename (Jan 23, 2025)
0393243  refactor with new pattern (Jan 23, 2025)
2999a03  refactor (Jan 23, 2025)
a763834  rename (Jan 23, 2025)
b4d34b6  fixed integration tests (Jan 23, 2025)
ab35648  running as non-root user (Jan 23, 2025)
6168b5e  dropped dependencies (Jan 23, 2025)
4771022  added required dependencies (Jan 23, 2025)
b0eb0d1  remove unused (Jan 29, 2025)
1e54972  remove unused (Jan 29, 2025)
ce13d5d  Merge remote-tracking branch 'upstream/master' into pr-osparc-docker-… (Jan 29, 2025)
1837ca2  Merge remote-tracking branch 'upstream/master' into pr-osparc-docker-… (Jan 30, 2025)
e71c113  added combine_lifespan_context_managers (Jan 30, 2025)
faaf938  using new docker client (Jan 30, 2025)
096f943  Merge remote-tracking branch 'upstream/master' into pr-osparc-docker-… (Jan 30, 2025)
f39fa91  refactor to use custom setting var name (Jan 30, 2025)
b9da94d  added env vars (Jan 31, 2025)
0d92121  adding check to see that API is responsive (Jan 31, 2025)
e1fe41d  Merge remote-tracking branch 'upstream/master' into pr-osparc-docker-… (Jan 31, 2025)
92a682d  ensure it times out (Jan 31, 2025)
e53c60c  renamed network (Jan 31, 2025)
44d8ce4  added healthcheck error (Jan 31, 2025)
959d1b6  fixed healthcheck (Jan 31, 2025)
bd7e1c7  refactor (Jan 31, 2025)
911bcdf  Merge remote-tracking branch 'upstream/master' into pr-osparc-docker-… (Feb 4, 2025)
cce8765  refactor position (Feb 4, 2025)
b3bd6e6  typos and notes (Feb 4, 2025)
d84efbf  refactor to use settings instead of property name (Feb 4, 2025)
09a4a0b  fixed specs generation (Feb 4, 2025)
72e2bc7  rename (Feb 4, 2025)
f0709d3  fixed tests (Feb 4, 2025)
5d87c53  fixed warning (Feb 4, 2025)
8f062d2  added missing dependencies (Feb 4, 2025)
6 changes: 6 additions & 0 deletions .env-devel
@@ -83,6 +83,12 @@ DIRECTOR_REGISTRY_CACHING=True
DIRECTOR_SERVICES_CUSTOM_CONSTRAINTS=null
DIRECTOR_TRACING=null

DOCKER_API_PROXY_HOST=docker-api-proxy
DOCKER_API_PROXY_PASSWORD=null
DOCKER_API_PROXY_PORT=8888
DOCKER_API_PROXY_SECURE=False
DOCKER_API_PROXY_USER=null

EFS_USER_ID=8006
EFS_USER_NAME=efs
EFS_GROUP_ID=8106
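Note: the five DOCKER_API_PROXY_* variables above are parsed into a single settings object. The following is a rough sketch of what settings_library.docker_api_proxy.DockerApiProxysettings could look like, inferred from these env vars and from the base_url usages later in this diff; it is not the PR's actual definition, which derives from the repo's own settings base class.

# Illustrative sketch only -- plain pydantic-settings used here to stay self-contained
from pydantic import SecretStr
from pydantic_settings import BaseSettings


class DockerApiProxysettings(BaseSettings):
    DOCKER_API_PROXY_HOST: str = "docker-api-proxy"
    DOCKER_API_PROXY_PORT: int = 8888
    DOCKER_API_PROXY_SECURE: bool = False
    DOCKER_API_PROXY_USER: str | None = None
    DOCKER_API_PROXY_PASSWORD: SecretStr | None = None

    @property
    def base_url(self) -> str:
        # e.g. http://docker-api-proxy:8888
        scheme = "https" if self.DOCKER_API_PROXY_SECURE else "http"
        return f"{scheme}://{self.DOCKER_API_PROXY_HOST}:{self.DOCKER_API_PROXY_PORT}"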
70 changes: 70 additions & 0 deletions .github/workflows/ci-testing-deploy.yml
@@ -77,6 +77,7 @@ jobs:
      migration: ${{ steps.filter.outputs.migration }}
      payments: ${{ steps.filter.outputs.payments }}
      dynamic-scheduler: ${{ steps.filter.outputs.dynamic-scheduler }}
      docker-api-proxy: ${{ steps.filter.outputs.docker-api-proxy }}
      resource-usage-tracker: ${{ steps.filter.outputs.resource-usage-tracker }}
      static-webserver: ${{ steps.filter.outputs.static-webserver }}
      storage: ${{ steps.filter.outputs.storage }}
@@ -233,6 +234,9 @@ jobs:
            - 'services/docker-compose*'
            - 'scripts/mypy/*'
            - 'mypy.ini'
          docker-api-proxy:
            - 'packages/**'
            - 'services/docker-api-proxy/**'
          resource-usage-tracker:
            - 'packages/**'
            - 'services/resource-usage-tracker/**'
@@ -2190,6 +2194,71 @@ jobs:
        with:
          flags: integrationtests #optional


  integration-test-docker-api-proxy:
    needs: [changes, build-test-images]
    if: ${{ needs.changes.outputs.anything-py == 'true' || needs.changes.outputs.docker-api-proxy == 'true' || github.event_name == 'push' }}
    timeout-minutes: 30 # if this timeout gets too small, then split the tests
    name: "[int] docker-api-proxy"
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        python: ["3.11"]
        os: [ubuntu-22.04]
      fail-fast: false
    steps:
      - uses: actions/checkout@v4
      - name: setup docker buildx
        id: buildx
        uses: docker/setup-buildx-action@v3
        with:
          driver: docker-container
      - name: setup python environment
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}
      - name: expose github runtime for buildx
        uses: crazy-max/ghaction-github-runtime@v3
      # FIXME: Workaround for https://github.com/actions/download-artifact/issues/249
      - name: download docker images with retry
        uses: Wandalen/wretry.action@master
        with:
          action: actions/download-artifact@v4
          with: |
            name: docker-buildx-images-${{ runner.os }}-${{ github.sha }}-backend
            path: /${{ runner.temp }}/build
          attempt_limit: 5
          attempt_delay: 1000
      - name: load docker images
        run: make load-images local-src=/${{ runner.temp }}/build
      - name: install uv
        uses: astral-sh/setup-uv@v5
        with:
          version: "0.5.x"
          enable-cache: false
          cache-dependency-glob: "**/docker-api-proxy/requirements/ci.txt"
      - name: show system version
        run: ./ci/helpers/show_system_versions.bash
      - name: install
        run: ./ci/github/integration-testing/docker-api-proxy.bash install
      - name: test
        run: ./ci/github/integration-testing/docker-api-proxy.bash test
      - name: upload failed tests logs
        if: ${{ failure() }}
        uses: actions/upload-artifact@v4
        with:
          name: ${{ github.job }}_docker_logs
          path: ./services/docker-api-proxy/test_failures
      - name: cleanup
        if: ${{ !cancelled() }}
        run: ./ci/github/integration-testing/docker-api-proxy.bash clean_up
      - uses: codecov/codecov-action@v5
        if: ${{ !cancelled() }}
        env:
          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
        with:
          flags: integrationtests #optional

  integration-test-simcore-sdk:
    needs: [changes, build-test-images]
    if: ${{ needs.changes.outputs.anything-py == 'true' || needs.changes.outputs.simcore-sdk == 'true' || github.event_name == 'push' }}
@@ -2262,6 +2331,7 @@ jobs:
      integration-test-director-v2-01,
      integration-test-director-v2-02,
      integration-test-dynamic-sidecar,
      integration-test-docker-api-proxy,
      integration-test-simcore-sdk,
      integration-test-webserver-01,
      integration-test-webserver-02,
1 change: 1 addition & 0 deletions Makefile
@@ -47,6 +47,7 @@ SERVICES_NAMES_TO_BUILD := \
	payments \
	resource-usage-tracker \
	dynamic-scheduler \
	docker-api-proxy \
	service-integration \
	static-webserver \
	storage \
40 changes: 40 additions & 0 deletions ci/github/integration-testing/docker-api-proxy.bash
@@ -0,0 +1,40 @@
#!/bin/bash
# http://redsymbol.net/articles/unofficial-bash-strict-mode/
set -o errexit  # abort on nonzero exitstatus
set -o nounset  # abort on unbound variable
set -o pipefail # don't hide errors within pipes
IFS=$'\n\t'

install() {
  make devenv
  # shellcheck source=/dev/null
  source .venv/bin/activate
  pushd services/docker-api-proxy
  make install-ci
  popd
  uv pip list
  make info-images
}

test() {
  # shellcheck source=/dev/null
  source .venv/bin/activate
  pushd services/docker-api-proxy
  make test-ci-integration
  popd
}

clean_up() {
  docker images
  make down
}

# Check if the function exists (bash specific)
if declare -f "$1" >/dev/null; then
  # call arguments verbatim
  "$@"
else
  # Show a helpful error
  echo "'$1' is not a known function name" >&2
  exit 1
fi
52 changes: 52 additions & 0 deletions packages/pytest-simcore/src/pytest_simcore/docker_api_proxy.py
@@ -0,0 +1,52 @@
import logging

import pytest
from aiohttp import ClientSession, ClientTimeout
from pydantic import TypeAdapter
from settings_library.docker_api_proxy import DockerApiProxysettings
from tenacity import before_sleep_log, retry, stop_after_delay, wait_fixed

from .helpers.docker import get_service_published_port
from .helpers.host import get_localhost_ip
from .helpers.typing_env import EnvVarsDict

_logger = logging.getLogger(__name__)


@retry(
    wait=wait_fixed(1),
    stop=stop_after_delay(10),
    before_sleep=before_sleep_log(_logger, logging.INFO),
    reraise=True,
)
async def _wait_till_docker_api_proxy_is_responsive(
    settings: DockerApiProxysettings,
) -> None:
    async with ClientSession(timeout=ClientTimeout(1, 1, 1, 1, 1)) as client:
        response = await client.get(f"{settings.base_url}/version")
        assert response.status == 200, await response.text()


@pytest.fixture
async def docker_api_proxy_settings(
    docker_stack: dict, env_vars_for_docker_compose: EnvVarsDict
) -> DockerApiProxysettings:
    """Returns the settings of a docker-api-proxy service that is up and responsive"""

    prefix = env_vars_for_docker_compose["SWARM_STACK_NAME"]
    assert f"{prefix}_docker-api-proxy" in docker_stack["services"]

    published_port = get_service_published_port(
        "docker-api-proxy", int(env_vars_for_docker_compose["DOCKER_API_PROXY_PORT"])
    )

    settings = TypeAdapter(DockerApiProxysettings).validate_python(
        {
            "DOCKER_API_PROXY_HOST": get_localhost_ip(),
            "DOCKER_API_PROXY_PORT": published_port,
        }
    )

    await _wait_till_docker_api_proxy_is_responsive(settings)

    return settings
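For orientation, a hypothetical test built on this fixture could look as follows. The test name and assertion are illustrative and not part of the PR; aiodocker is assumed here because the PR's service-library code already uses it against the same base_url.

import aiodocker
from settings_library.docker_api_proxy import DockerApiProxysettings


async def test_docker_version_via_proxy(
    docker_api_proxy_settings: DockerApiProxysettings,
) -> None:
    # the proxy exposes the plain Docker Engine API, so any docker client
    # pointed at base_url works unchanged
    async with aiodocker.Docker(url=docker_api_proxy_settings.base_url) as docker:
        version = await docker.version()
        assert "Version" in version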
@@ -52,10 +52,12 @@
    "invitations": "/",
    "payments": "/",
    "resource-usage-tracker": "/",
    "docker-api-proxy": "/version",
}
AIOHTTP_BASED_SERVICE_PORT: int = 8080
FASTAPI_BASED_SERVICE_PORT: int = 8000
DASK_SCHEDULER_SERVICE_PORT: int = 8787
DOCKER_API_PROXY_SERVICE_PORT: int = 8888

_SERVICE_NAME_REPLACEMENTS: dict[str, str] = {
    "dynamic-scheduler": "dynamic-schdlr",
@@ -133,6 +135,7 @@ def services_endpoint(
        AIOHTTP_BASED_SERVICE_PORT,
        FASTAPI_BASED_SERVICE_PORT,
        DASK_SCHEDULER_SERVICE_PORT,
        DOCKER_API_PROXY_SERVICE_PORT,
    ]
    endpoint = URL(
        f"http://{get_localhost_ip()}:{get_service_published_port(full_service_name, target_ports)}"
79 changes: 76 additions & 3 deletions packages/service-library/src/servicelib/docker_utils.py
@@ -1,17 +1,23 @@
import asyncio
import logging
from collections.abc import AsyncIterator, Awaitable, Callable
from contextlib import AsyncExitStack, asynccontextmanager
from dataclasses import dataclass
from datetime import datetime
from functools import cached_property
from typing import Any, AsyncContextManager, Final, Literal

import aiodocker
import aiohttp
import arrow
import tenacity
from aiohttp import ClientSession
from fastapi import FastAPI
from models_library.docker import DockerGenericTag
from models_library.generated_models.docker_rest_api import ProgressDetail
from models_library.utils.change_case import snake_to_camel
from pydantic import BaseModel, ByteSize, ConfigDict, TypeAdapter, ValidationError
from settings_library.docker_api_proxy import DockerApiProxysettings
from settings_library.docker_registry import RegistrySettings
from yarl import URL

@@ -281,3 +287,70 @@ async def pull_image(
            f"pulling {image_short_name}: {pull_progress}...",
            logging.DEBUG,
        )


def get_lifespan_remote_docker_client(
    docker_api_proxy_settings_property_name: str,
) -> Callable[[FastAPI], AsyncContextManager[None]]:
    """Ensures `setup` and `teardown` for the remote docker client.

    Arguments:
        docker_api_proxy_settings_property_name -- if the name is `PROP_NAME`
            then it should be accessible as `app.state.settings.PROP_NAME`

    Returns:
        docker client lifespan manager
    """

    @asynccontextmanager
    async def _(app: FastAPI) -> AsyncIterator[None]:
        settings: DockerApiProxysettings = getattr(
            app.state.settings, docker_api_proxy_settings_property_name
        )

        async with AsyncExitStack() as exit_stack:
            # when credentials are configured, authenticate through a dedicated
            # session whose lifetime is tied to the exit stack
            session: ClientSession | None = None
            if settings.DOCKER_API_PROXY_USER and settings.DOCKER_API_PROXY_PASSWORD:
                session = await exit_stack.enter_async_context(
                    ClientSession(
                        auth=aiohttp.BasicAuth(
                            login=settings.DOCKER_API_PROXY_USER,
                            password=settings.DOCKER_API_PROXY_PASSWORD.get_secret_value(),
                        )
                    )
                )

            client = await exit_stack.enter_async_context(
                aiodocker.Docker(url=settings.base_url, session=session)
            )

            app.state.remote_docker_client = client

            await wait_till_docker_api_proxy_is_responsive(app)

            yield

    return _


@tenacity.retry(
    wait=tenacity.wait_fixed(5),
    stop=tenacity.stop_after_delay(60),
    before_sleep=tenacity.before_sleep_log(_logger, logging.INFO),
    reraise=True,
)
async def wait_till_docker_api_proxy_is_responsive(app: FastAPI) -> None:
    await asyncio.wait_for(get_remote_docker_client(app).version(), timeout=5)


def get_remote_docker_client(app: FastAPI) -> aiodocker.Docker:
    assert isinstance(app.state.remote_docker_client, aiodocker.Docker)  # nosec
    return app.state.remote_docker_client
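For context, a service would wire this lifespan helper into its FastAPI app roughly as sketched below. The settings attribute name MYSERVICE_DOCKER_API_PROXY is hypothetical; each service passes whatever attribute its own settings class exposes.

from fastapi import FastAPI
from servicelib.docker_utils import (
    get_lifespan_remote_docker_client,
    get_remote_docker_client,
)

app = FastAPI(
    lifespan=get_lifespan_remote_docker_client("MYSERVICE_DOCKER_API_PROXY")
)
# app.state.settings must expose MYSERVICE_DOCKER_API_PROXY before startup


@app.get("/docker/version")
async def docker_version() -> dict:
    # proxied call to the Docker Engine /version endpoint
    return await get_remote_docker_client(app).version()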
24 changes: 24 additions & 0 deletions packages/service-library/src/servicelib/fastapi/lifespan_utils.py
@@ -0,0 +1,24 @@
from collections.abc import AsyncGenerator, Callable
from contextlib import AsyncExitStack, asynccontextmanager
from typing import AsyncContextManager, TypeAlias

from fastapi import FastAPI

LifespanContextManager: TypeAlias = Callable[[FastAPI], AsyncContextManager[None]]


def combine_lifespan_context_managers(
    *context_managers: LifespanContextManager,
) -> LifespanContextManager:
    """The first entry has its `setup` called first and its `teardown` called last,
    with `setup` and `teardown` referring to the code before and after the `yield`.
    """

    @asynccontextmanager
    async def _(app: FastAPI) -> AsyncGenerator[None, None]:
        async with AsyncExitStack() as stack:
            for context_manager in context_managers:
                await stack.enter_async_context(context_manager(app))
            yield

    return _
42 changes: 42 additions & 0 deletions packages/service-library/tests/fastapi/test_lifespan_utils.py
@@ -0,0 +1,42 @@
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager

import pytest
from asgi_lifespan import LifespanManager
from fastapi import FastAPI
from servicelib.fastapi.lifespan_utils import combine_lifespan_context_managers


async def test_multiple_lifespan_managers(capsys: pytest.CaptureFixture):
    @asynccontextmanager
    async def database_lifespan(_: FastAPI) -> AsyncIterator[None]:
        print("setup DB")
        yield
        print("shutdown DB")

    @asynccontextmanager
    async def cache_lifespan(_: FastAPI) -> AsyncIterator[None]:
        print("setup CACHE")
        yield
        print("shutdown CACHE")

    app = FastAPI(
        lifespan=combine_lifespan_context_managers(database_lifespan, cache_lifespan)
    )

    capsys.readouterr()

    async with LifespanManager(app):
        messages = capsys.readouterr().out

        assert "setup DB" in messages
        assert "setup CACHE" in messages
        assert "shutdown DB" not in messages
        assert "shutdown CACHE" not in messages

    messages = capsys.readouterr().out

    assert "setup DB" not in messages
    assert "setup CACHE" not in messages
    assert "shutdown DB" in messages
    assert "shutdown CACHE" in messages