Merged
1 change: 1 addition & 0 deletions .env.example
@@ -7,6 +7,7 @@ API_V1_STR="/api/v1"
SECRET_KEY="your-super-secret-key-here"
ACCESS_TOKEN_EXPIRE_MINUTES=15
REFRESH_TOKEN_EXPIRE_DAYS=7
VERIFICATION_TOKEN_EXPIRE_HOURS=24

# Database (PostgreSQL)
POSTGRES_SERVER=localhost
12 changes: 12 additions & 0 deletions README.md
@@ -180,6 +180,18 @@ ruff .
python -m ruff check .
```

---

## Logging Safety

To prevent log injection and control-character pollution from user-provided input,
FluentMeet sanitizes dynamic log arguments with the helpers in `app/core/sanitize.py`.

- Use `sanitize_for_log(value)` for a single value.
- Use `sanitize_log_args(*values)` when logging multiple placeholders.
- Newlines, carriage returns, tabs, and other control characters are escaped or replaced.
- Values longer than 256 characters are truncated and suffixed with `...<truncated>`.

## 🤝 Contributing

We welcome contributions! Please follow these steps:
57 changes: 57 additions & 0 deletions alembic/versions/19dc9714d9ea_add_email_verification_model.py
@@ -0,0 +1,57 @@
"""Add email verification model

Revision ID: 19dc9714d9ea
Revises: 11781e907181
Create Date: 2026-03-19 12:22:16.244801

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision: str = "19dc9714d9ea"

Check notice (Code scanning / CodeQL): Unused global variable. The global variable 'revision' is not used.


down_revision: Union[str, Sequence[str], None] = "11781e907181"

Check notice (Code scanning / CodeQL): Unused global variable. The global variable 'down_revision' is not used.


branch_labels: Union[str, Sequence[str], None] = None

Check notice (Code scanning / CodeQL): Unused global variable. The global variable 'branch_labels' is not used.

Copilot Autofix (AI, about 4 hours ago)

To fix the reported issue without changing runtime behavior, indicate that branch_labels (and the other Alembic metadata symbols) are intentionally exported. The simplest, non-invasive way is to define a module-level __all__ list that includes these names; static analyzers then treat them as intentionally public rather than unused.

Concretely, in alembic/versions/19dc9714d9ea_add_email_verification_model.py, right after the existing Alembic metadata assignments (revision, down_revision, branch_labels, depends_on), add a __all__ definition listing these four names. No imports are needed, and no existing lines are altered or removed; only new lines defining __all__ are inserted. This preserves all current functionality while satisfying the static-analysis rule.

Suggested changeset (alembic/versions/19dc9714d9ea_add_email_verification_model.py):

@@ -17,7 +17,9 @@
 branch_labels: Union[str, Sequence[str], None] = None
 depends_on: Union[str, Sequence[str], None] = None
 
+__all__ = ["revision", "down_revision", "branch_labels", "depends_on"]
 
+
 def upgrade() -> None:
     """Upgrade schema."""
     # ### commands auto generated by Alembic - please adjust! ###
depends_on: Union[str, Sequence[str], None] = None

Check notice (Code scanning / CodeQL): Unused global variable. The global variable 'depends_on' is not used.

Copilot Autofix (AI, about 4 hours ago)

To fix the problem while preserving existing functionality, keep the assignment (so the information and any potential Alembic expectations remain) but rename the variable so that its name clearly indicates it is intentionally unused and passes the static-analysis rule. The right-hand side is a simple literal (None), so there are no side effects to worry about.

The minimal change is to rename depends_on on line 18 to a name that contains "unused", such as _unused_depends_on. Since the variable is not referenced elsewhere in the snippet, no other code changes are required; the other revision identifiers and the upgrade/downgrade functions remain unchanged.

Suggested changeset (alembic/versions/19dc9714d9ea_add_email_verification_model.py):

@@ -15,7 +15,7 @@
 revision: str = "19dc9714d9ea"
 down_revision: Union[str, Sequence[str], None] = "11781e907181"
 branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+_unused_depends_on: Union[str, Sequence[str], None] = None
 
 
 def upgrade() -> None:


def upgrade() -> None:
"""Upgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
"verification_tokens",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("token", sa.String(length=36), nullable=False),
sa.Column("expires_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
sa.ForeignKeyConstraint(
["user_id"],
["users.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_verification_tokens_id"), "verification_tokens", ["id"], unique=False
)
op.create_index(
op.f("ix_verification_tokens_token"),
"verification_tokens",
["token"],
unique=True,
)
Comment on lines +37 to +45

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain: verified against the repository (Brints/FluentMeet) by inspecting the query patterns in app/crud/verification_token.py (user_id/expires_at filters) and alembic/versions/4b4b6b5d1c2a_add_verification_tokens_table.py.

Replace redundant id index with composite index supporting user-based token cleanup.

The index on id at line 32 is redundant—primary keys automatically have index support. More critically, the delete_unexpired_tokens_for_user() method in app/crud/verification_token.py:32-37 filters by both user_id and expires_at, but no index covers this pattern, causing full table scans as data grows.

Replace the id index with a composite index on (user_id, expires_at):

Suggested migration adjustment
 def upgrade() -> None:
     op.create_index(
-        op.f("ix_verification_tokens_id"),
+        op.f("ix_verification_tokens_user_id_expires_at"),
         "verification_tokens",
-        ["id"],
+        ["user_id", "expires_at"],
         unique=False,
     )

 def downgrade() -> None:
     op.drop_index(
         op.f("ix_verification_tokens_token"), table_name="verification_tokens"
     )
     op.drop_index(
-        op.f("ix_verification_tokens_id"), table_name="verification_tokens"
+        op.f("ix_verification_tokens_user_id_expires_at"), table_name="verification_tokens"
     )
     op.drop_table("verification_tokens")

Also applies to: 47-50


# ### end Alembic commands ###


def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(
op.f("ix_verification_tokens_token"), table_name="verification_tokens"
)
op.drop_index(op.f("ix_verification_tokens_id"), table_name="verification_tokens")
op.drop_table("verification_tokens")
# ### end Alembic commands ###
133 changes: 129 additions & 4 deletions app/api/v1/endpoints/auth.py
@@ -1,25 +1,34 @@
import logging
from uuid import uuid4

from fastapi import APIRouter, Depends, status
from fastapi import APIRouter, Depends, Query, Request, status
from sqlalchemy.orm import Session

from app.core.config import settings
from app.core.rate_limiter import limiter
from app.core.sanitize import sanitize_log_args
from app.crud.user.user import create_user, get_user_by_email
from app.db.session import get_db
from app.schemas.auth import (
ActionAcknowledgement,
ForgotPasswordRequest,
ResendVerificationRequest,
SignupResponse,
VerifyEmailResponse,
)
from app.schemas.user import UserCreate
from app.services.auth_verification import (
AuthVerificationService,
get_auth_verification_service,
)
from app.services.email_producer import EmailProducerService, get_email_producer_service

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/auth", tags=["auth"])
DB_SESSION_DEPENDENCY = Depends(get_db)
EMAIL_PRODUCER_DEPENDENCY = Depends(get_email_producer_service)
AUTH_VERIFICATION_SERVICE_DEPENDENCY = Depends(get_auth_verification_service)


@router.post(
@@ -31,11 +40,18 @@ async def signup(
user_in: UserCreate,
db: Session = DB_SESSION_DEPENDENCY,
email_producer: EmailProducerService = EMAIL_PRODUCER_DEPENDENCY,
auth_verification_service: AuthVerificationService = (
AUTH_VERIFICATION_SERVICE_DEPENDENCY
),
) -> SignupResponse:
user = create_user(db=db, user_in=user_in)
verification_token = auth_verification_service.create_verification_token(
db=db,
user_id=user.id,
)

verification_link = (
f"{settings.FRONTEND_BASE_URL}/verify-email?user={user.id}&token={uuid4()}"
f"{settings.FRONTEND_BASE_URL}/verify-email?token={verification_token.token}"
)
try:
await email_producer.send_email(
@@ -47,8 +63,11 @@ async def signup(
)
except Exception as exc:
# Signup should succeed even if email queueing fails.
user_id_safe, exc_safe = sanitize_log_args(user.id, exc)
logger.warning(
"Failed to enqueue verification email for user %s: %s", user.id, exc
"Failed to enqueue verification email for user %s: %s",
user_id_safe,
exc_safe,
)

return SignupResponse.model_validate(user)
@@ -81,8 +100,11 @@ async def forgot_password(
template="password_reset",
)
except Exception as exc:
email_safe, exc_safe = sanitize_log_args(user.email, exc)
logger.warning(
"Failed to enqueue password reset email for %s: %s", user.email, exc
"Failed to enqueue password reset email for %s: %s",
email_safe,
exc_safe,
)

return ActionAcknowledgement(
Expand All @@ -91,3 +113,106 @@ async def forgot_password(
"password reset instructions."
)
)


@router.get(
"/verify-email",
response_model=VerifyEmailResponse,
status_code=status.HTTP_200_OK,
summary="Verify user email address",
description=(
"Validates an email verification token, activates the user account, "
"and invalidates the token."
),
responses={
400: {
"description": "Missing, invalid, or expired token",
"content": {
"application/json": {
"examples": {
"missing": {
"value": {
"status": "error",
"code": "MISSING_TOKEN",
"message": "Verification token is required.",
"details": [],
}
},
"invalid": {
"value": {
"status": "error",
"code": "INVALID_TOKEN",
"message": "Verification token is invalid.",
"details": [],
}
},
"expired": {
"value": {
"status": "error",
"code": "TOKEN_EXPIRED",
"message": (
"Verification token has expired. "
"Please request a new one."
),
"details": [],
}
},
}
}
},
}
},
)
def verify_email(
token: str | None = Query(default=None),
db: Session = DB_SESSION_DEPENDENCY,
auth_verification_service: AuthVerificationService = (
AUTH_VERIFICATION_SERVICE_DEPENDENCY
),
) -> VerifyEmailResponse:
auth_verification_service.verify_email(db=db, token=token)
return VerifyEmailResponse(
message="Email successfully verified. You can now log in.",
)
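The service's `verify_email` body is not part of this diff; as a reviewer's sketch of the contract implied by the documented response examples above (hypothetical helper and record shape, not the actual service code), the validation order would be:

```python
from datetime import datetime, timezone

# Hypothetical sketch of the check order implied by the documented error
# codes: MISSING_TOKEN -> INVALID_TOKEN -> TOKEN_EXPIRED.
def check_token(token, record, now=None):
    """`record` is the stored token row (or None); returns an error code or None."""
    now = now or datetime.now(timezone.utc)
    if not token:
        return "MISSING_TOKEN"
    if record is None:
        return "INVALID_TOKEN"
    if record["expires_at"] <= now:
        return "TOKEN_EXPIRED"
    return None  # valid: activate the user and invalidate the token
```

Ordering matters: an expired-but-real token should report TOKEN_EXPIRED, not INVALID_TOKEN, so the lookup must happen before the expiry comparison.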


@router.post(
"/resend-verification",
response_model=ActionAcknowledgement,
status_code=status.HTTP_200_OK,
summary="Resend email verification link",
description=(
"Queues a new verification email when the account exists and is not "
"verified. Always returns a generic response to prevent user enumeration."
),
)
@limiter.limit("3/minute")
async def resend_verification(
request: Request,
payload: ResendVerificationRequest,
db: Session = DB_SESSION_DEPENDENCY,
email_producer: EmailProducerService = EMAIL_PRODUCER_DEPENDENCY,
auth_verification_service: AuthVerificationService = (
AUTH_VERIFICATION_SERVICE_DEPENDENCY
),
) -> ActionAcknowledgement:
del request
try:
await auth_verification_service.resend_verification_email(
db=db,
email=str(payload.email),
email_producer=email_producer,
)
except Exception as exc:
email_safe, exc_safe = sanitize_log_args(payload.email, exc)
logger.warning(
"Failed to enqueue verification resend for %s: %s",
email_safe,
exc_safe,
)
Comment on lines +200 to +212

⚠️ Potential issue | 🟠 Major

Don't mask token/database failures as a successful resend.

resend_verification_email() does user lookup and token mutation before it touches the email producer. This except Exception converts failures in those DB steps into a 200 "If an account..." response, which hides outages and can leave state partially changed. Catch only the producer failure path, or let the persistence errors propagate.

🧰 Tools
🪛 GitHub Check: CodeQL

[failure] 201-201: Log Injection
This log entry depends on a user-provided value.



return ActionAcknowledgement(
message=(
"If an account with that email exists, we have sent a verification email."
)
)
1 change: 1 addition & 0 deletions app/core/config.py
@@ -22,6 +22,7 @@ class Settings(BaseSettings):
SECRET_KEY: str = "placeholder_secret_key"
ACCESS_TOKEN_EXPIRE_MINUTES: int = 15
REFRESH_TOKEN_EXPIRE_DAYS: int = 7
VERIFICATION_TOKEN_EXPIRE_HOURS: int = 24

# Database
POSTGRES_SERVER: str = "localhost"
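For context on how `VERIFICATION_TOKEN_EXPIRE_HOURS` would feed token creation: the body of `AuthVerificationService.create_verification_token` is not shown in this diff, but a minimal sketch consistent with the new setting and the `verification_tokens` columns in the migration (hypothetical helper name) could look like:

```python
from datetime import datetime, timedelta, timezone
from uuid import uuid4

VERIFICATION_TOKEN_EXPIRE_HOURS = 24  # mirrors the setting's default above

def new_token_fields(hours: int = VERIFICATION_TOKEN_EXPIRE_HOURS) -> dict:
    """Build column values for a verification_tokens row (sketch)."""
    now = datetime.now(timezone.utc)
    return {
        "token": str(uuid4()),  # 36 chars, fits the String(36) column
        "created_at": now,
        "expires_at": now + timedelta(hours=hours),
    }
```

Storing a timezone-aware `expires_at` matches the migration's `sa.DateTime(timezone=True)` columns and keeps the expiry comparison unambiguous.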
3 changes: 2 additions & 1 deletion app/core/exception_handlers.py
@@ -8,6 +8,7 @@

from app.core.error_responses import create_error_response
from app.core.exceptions import FluentMeetException
from app.core.sanitize import sanitize_for_log

logger = logging.getLogger(__name__)

@@ -62,7 +63,7 @@ async def unhandled_exception_handler(
"""
Handler for all other unhandled exceptions (500).
"""
logger.exception("Unhandled exception occurred: %s", str(exc))
logger.exception("Unhandled exception occurred: %s", sanitize_for_log(exc))
return create_error_response(
status_code=500,
code="INTERNAL_SERVER_ERROR",
20 changes: 20 additions & 0 deletions app/core/rate_limiter.py
@@ -0,0 +1,20 @@
from fastapi import Request
from fastapi.responses import JSONResponse
from slowapi import Limiter
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address

from app.core.error_responses import create_error_response

limiter = Limiter(key_func=get_remote_address)


async def rate_limit_exception_handler(
_request: Request,
_exc: RateLimitExceeded,
) -> JSONResponse:
return create_error_response(
status_code=429,
code="RATE_LIMIT_EXCEEDED",
message="Too many requests. Please try again later.",
)
34 changes: 34 additions & 0 deletions app/core/sanitize.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,34 @@
import re
from collections.abc import Iterable


class LogSanitizer:
"""Sanitizes user-controlled values before they are written to logs."""

_CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]")

def __init__(self, max_length: int = 256) -> None:
self._max_length = max_length

def sanitize(self, value: object) -> str:
text = str(value)
text = text.replace("\r", r"\r").replace("\n", r"\n").replace("\t", r"\t")
text = self._CONTROL_CHARS.sub("?", text)
if len(text) <= self._max_length:
return text

return f"{text[: self._max_length]}...<truncated>"

def sanitize_many(self, values: Iterable[object]) -> tuple[str, ...]:
return tuple(self.sanitize(value) for value in values)


log_sanitizer = LogSanitizer()


def sanitize_for_log(value: object) -> str:
return log_sanitizer.sanitize(value)


def sanitize_log_args(*values: object) -> tuple[str, ...]:
return log_sanitizer.sanitize_many(values)
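A quick sanity check of the behavior above (a standalone restatement of `LogSanitizer.sanitize` for illustration, not an import of the module):

```python
import re

_CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]")

def sanitize_for_log_demo(value: object, max_length: int = 256) -> str:
    """Mirrors LogSanitizer.sanitize from the diff above."""
    text = str(value)
    text = text.replace("\r", r"\r").replace("\n", r"\n").replace("\t", r"\t")
    text = _CONTROL_CHARS.sub("?", text)
    if len(text) <= max_length:
        return text
    return f"{text[:max_length]}...<truncated>"

# A forged-entry attempt is flattened onto one line:
# sanitize_for_log_demo("alice\n2026-01-01 INFO fake") == r"alice\n2026-01-01 INFO fake"
```

The key property: a value containing `\r\n` can no longer start a fake log line, because the newline survives only as the two literal characters `\` and `n`.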