
Add cross-assembly plugin system, texture optimizer, and multi-backend local AI pipeline#19

Open
Copilot wants to merge 4 commits into main from copilot/add-unity-texture-optimization-tool

Conversation

Contributor

Copilot AI commented Apr 6, 2026

Implements three major extensions to TextureCocktail: a proper open plugin API, a texture import settings analyzer/fixer, and a multi-backend local AI integration.

Plugin System

The previous LoadShaderWindow used Type.GetType("LuticaLab.TextureCocktail." + name) — only finding types in the hardcoded namespace. Third-party plugins from external assemblies were silently ignored.

New: TextureCocktailPluginRegistry scans all AppDomain assemblies on [InitializeOnLoadMethod], matching TextureCocktailContent subclasses by class name against the shader's last path segment. No registration step required.

// Any assembly, any namespace — auto-discovered
[TextureCocktailPlugin("Vignette", "Adds vignette darkening", "YourName", "1.0.0")]
public class VignetteEffect : TextureCocktailContent
{
    public override bool UseDefaultLayout => false;
    public override void OnGUI() { /* custom IMGUI */ }
    public override void OnShaderValueChanged() { }
}
  • TextureCocktailPluginAttribute — optional metadata (display name, description, author, version)
  • PluginBrowserWindow (LuticaLab → TextureCocktail Plugin Browser) lists all discovered plugins
  • PLUGIN_GUIDE.md — developer reference for plugin authoring
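The cross-assembly discovery described above could look roughly like the following sketch. Only TextureCocktailContent and the [InitializeOnLoadMethod] hook come from this PR; the registry class, field, and method names here are illustrative.

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;
using UnityEditor;

// Illustrative sketch of the registry's discovery pass (not the PR's actual code).
public static class PluginRegistrySketch
{
    static readonly Dictionary<string, Type> PluginsByName = new();

    [InitializeOnLoadMethod]
    static void DiscoverPlugins()
    {
        foreach (var assembly in AppDomain.CurrentDomain.GetAssemblies())
        {
            Type[] types;
            try { types = assembly.GetTypes(); }
            catch (ReflectionTypeLoadException e) { types = e.Types; } // tolerate partially loadable assemblies

            foreach (var type in types)
            {
                if (type == null || type.IsAbstract) continue;
                if (!typeof(TextureCocktailContent).IsAssignableFrom(type)) continue;
                // Class name is later matched against the shader path's last segment.
                PluginsByName[type.Name] = type;
            }
        }
    }
}
```

Catching ReflectionTypeLoadException matters here: a full AppDomain scan will hit assemblies whose dependencies are not present in the Editor, and skipping their unloadable types keeps discovery from aborting.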

Texture Optimizer

New editor window (LuticaLab → Texture Optimizer) that batch-analyzes Texture2D assets via TextureImporter:

  • Detects: non-POT dimensions, missing mipmaps on 3D textures, uncompressed formats, oversized textures, large compressed textures without crunch
  • Platform profiles: Desktop / Mobile / VR (affects recommended compression format)
  • One-click per-texture or batch "apply recommended fixes" (reimports assets)
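A minimal sketch of the kind of check the optimizer performs is below. The helper class and method names are illustrative, not the PR's actual API; only TextureImporter and the listed detections come from the description above.

```csharp
using UnityEditor;
using UnityEngine;

// Illustrative sketch: flag non-POT and uncompressed textures via TextureImporter.
public static class TextureAuditSketch
{
    static bool IsPowerOfTwo(int n) => n > 0 && (n & (n - 1)) == 0;

    public static void Analyze(string assetPath)
    {
        var importer = AssetImporter.GetAtPath(assetPath) as TextureImporter;
        var tex = AssetDatabase.LoadAssetAtPath<Texture2D>(assetPath);
        if (importer == null || tex == null) return;

        if (!IsPowerOfTwo(tex.width) || !IsPowerOfTwo(tex.height))
            Debug.LogWarning($"{assetPath}: non-POT size {tex.width}x{tex.height}");

        if (importer.textureCompression == TextureImporterCompression.Uncompressed)
            Debug.LogWarning($"{assetPath}: uncompressed; consider a platform-appropriate format");

        // An "apply fix" step would adjust importer.textureCompression /
        // importer.crunchedCompression and call importer.SaveAndReimport().
    }
}
```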

Multi-Backend Local AI Connector

New editor window (LuticaLab → AI Connector) providing a text → text+image AI pipeline supporting multiple local AI backends:

Supported Backends

  • Ollama (POST /api/generate, GET /api/tags) — supports vision models like llava
  • OpenAI-Compatible APIs (POST /v1/chat/completions, GET /v1/models) — covers LocalAI, LM Studio, Jan, Kobold.cpp, llama.cpp server, text-generation-webui, and any other OpenAI-compatible endpoint

Architecture (Editor/AI/)

  • AiBackendBase — abstract base class with AiRequest/AiResponse structs; add new backends by subclassing from any assembly
  • AiTextureUtils — shared TextureToBase64 / Base64ToTexture utilities, reusable across all backends
  • OllamaBackend — Ollama-specific implementation using Newtonsoft.Json
  • OpenAiCompatibleBackend — OpenAI-compatible implementation using Newtonsoft.Json
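Under this architecture, a third-party backend could presumably be added along these lines. The exact member signatures of AiBackendBase, AiRequest, and AiResponse are assumptions based on the description above, not the PR's actual API.

```csharp
using System.Threading;
using System.Threading.Tasks;

// Hypothetical backend sketch; property and method names are assumed.
public class MyCustomBackend : AiBackendBase
{
    public override string DisplayName => "My Custom Server";      // assumed property
    public override string DefaultUrl => "http://localhost:8080";  // assumed property

    public override async Task<AiResponse> SendAsync(AiRequest request, CancellationToken ct)
    {
        // Attach the optional image as base64 PNG via the shared utility.
        string image = request.Texture != null
            ? AiTextureUtils.TextureToBase64(request.Texture)
            : null;

        // ... POST to the server, parse the reply with Newtonsoft.Json,
        //     honor ct for cancellation, and fill in an AiResponse ...
        return new AiResponse { Text = "..." };
    }
}
```

Per the PR description, a class like this in any assembly would be picked up by the Backend dropdown automatically, with no registration step.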

Features

  • Backend selector dropdown; server URL pre-filled per backend
  • Text prompt + optional Texture2D attachment (base64 PNG) for vision models
  • Decodes and displays base64 images in response; allows saving to project assets
  • Server URL and model persisted via EditorPrefs
  • CancellationToken support

i18n

Added keys for all three features to English, Korean, and Japanese language files.

Copilot AI changed the title [WIP] Add Unity internal texture optimization tool for plugins Add cross-assembly plugin system, texture optimizer, and Ollama local AI pipeline Apr 6, 2026
Copilot AI requested a review from LuticaCANARD April 6, 2026 15:35
@LuticaCANARD
Copy link
Copy Markdown
Owner

Make AIs other than Ollama work too. [translated from Korean]

@LuticaCANARD LuticaCANARD marked this pull request as ready for review April 7, 2026 11:04
Copy link
Copy Markdown
Owner

@LuticaCANARD LuticaCANARD left a comment


  1. Overall, the effort to conform to the local models' compatibility spec is good, but for reusability's sake I request that the functions' definition locations be moved. [translated from Korean]

return sb.ToString();
}

private static string JsonString(string s)
Owner


Couldn't this just use a JSON parser? [translated from Korean]

Contributor Author


Done in commit 984d4a8. The manual string-walking parsers have been replaced with Newtonsoft.Json (JObject.Parse / JArray) in both OllamaBackend and OpenAiCompatibleBackend — the same library already used by LanguageDisplayer.cs.


// ── Texture helpers ──────────────────────────────────────────────────

private static string TextureToBase64(Texture2D tex)
Owner


Wasn't this encoding process already present in another function? If not, move it into a common utility function so it can be used from other classes as well. [translated from Korean]

Contributor Author


Moved in commit 984d4a8. TextureToBase64 and Base64ToTexture now live in Editor/AI/AiTextureUtils.cs as public static methods, usable from any class. Both OllamaBackend and OpenAiCompatibleBackend call them from there.

Contributor Author

Copilot AI commented Apr 7, 2026

  1. Overall, the effort to conform to the local models' compatibility spec is good, but for reusability's sake I request that the functions' definition locations be moved. [translated from Korean]

Addressed in commit 984d4a8. AI-related logic has been split across purpose-specific files in Editor/AI/:

  • AiTextureUtils.cs — shared texture encode/decode utilities
  • AiBackendBase.cs — abstract base + AiRequest/AiResponse structs
  • OllamaBackend.cs — Ollama-specific HTTP logic
  • OpenAiCompatibleBackend.cs — OpenAI-compatible logic

OllamaConnector.cs is now a thin UI window that delegates all network work to the chosen backend.

Make AIs other than Ollama work too. [translated from Korean]

Added in commit 984d4a8. The window (LuticaLab → AI Connector) now has a Backend dropdown. Currently supported:

  • Ollama: POST /api/generate (default: localhost:11434)
  • OpenAI-Compatible: POST /v1/chat/completions (covers LocalAI, LM Studio, Jan, Kobold.cpp, llama.cpp server, text-generation-webui, etc.)

Additional backends can be added by creating a new class that inherits AiBackendBase in any assembly — the dropdown picks it up automatically.

Copilot AI changed the title Add cross-assembly plugin system, texture optimizer, and Ollama local AI pipeline Add cross-assembly plugin system, texture optimizer, and multi-backend local AI pipeline Apr 7, 2026
Copilot AI requested a review from LuticaCANARD April 7, 2026 11:20
