iOSCalendarAssistantWithLocalInf example missing frameworks + other build errors #161

Closed · 1 of 2 tasks
tleyden opened this issue Jan 29, 2025 · 6 comments

tleyden commented Jan 29, 2025

System Info

PyTorch version: 2.5.0
Is debug build: False
CUDA used to build PyTorch: None
ROCM used to build PyTorch: N/A

OS: macOS 14.7.1 (arm64)
GCC version: Could not collect
Clang version: 16.0.0 (clang-1600.0.26.6)
CMake version: version 3.31.4
Libc version: N/A

Python version: 3.11.5 (main, Sep 11 2023, 08:31:25) [Clang 14.0.6 ] (64-bit runtime)
Python platform: macOS-14.7.1-arm64-arm-64bit
Is CUDA available: False
CUDA runtime version: No CUDA
CUDA_MODULE_LOADING set to: N/A
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Apple M2 Max

Versions of relevant libraries:
[pip3] executorch==0.4.0a0+6a085ff
[pip3] numpy==1.23.2
[pip3] torch==2.5.0
[pip3] torchaudio==2.5.0
[pip3] torchsr==1.0.4
[pip3] torchvision==0.20.0
[conda] executorch                0.4.0a0+6a085ff          pypi_0    pypi
[conda] numpy                     1.23.2                   pypi_0    pypi
[conda] torch                     2.5.0                    pypi_0    pypi
[conda] torchaudio                2.5.0                    pypi_0    pypi
[conda] torchsr                   1.0.4                    pypi_0    pypi
[conda] torchvision               0.20.0                   pypi_0    pypi

Information

  • [x] The official example scripts
  • [ ] My own modified scripts

🐛 Describe the bug

When I initially opened the Xcode project, LocalInferenceImpl was shown in red because Xcode could not find the sub-project.

[Screenshot: Xcode project navigator with LocalInferenceImpl shown in red]

Error logs

Errors about LLamaRunner and LocalInferenceImpl frameworks not being found

Expected behavior

Expected behavior is that the project should build.

I think I see the missing link. In the GitHub repo there is a submodule reference to the llama-stack repo:

https://github.com/meta-llama/llama-stack-apps/tree/main/examples/ios_calendar_assistant/iOSCalendarAssistantWithLocalInf

but the instructions don't mention that a recursive checkout is needed.

After running

git submodule update --init --recursive

Xcode now resolves the sub-project:

[Screenshot: Xcode project navigator with the LocalInferenceImpl sub-project resolved]
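
For a fresh checkout, cloning with submodules would avoid the extra step:

git clone --recurse-submodules https://github.com/meta-llama/llama-stack-apps.git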

tleyden changed the title from "iOSCalendarAssistantWithLocalInf example missing frameworks" to "iOSCalendarAssistantWithLocalInf example missing frameworks + other build errors" on Jan 29, 2025
tleyden (Author) commented Jan 29, 2025

After getting past the above issue, I'm hitting another build issue:

'messagesPayloadPayload' is not a member type of struct 'LlamaStackClient.Components.Schemas.ChatCompletionRequest'

when trying to build the LocalInferenceImpl sub-project.

Were there recent changes that broke compatibility between llama-stack-client-swift and the llama-stack LocalInferenceImpl code? I noticed this change, which seems to remove Components.Schemas.ChatCompletionRequest.messagesPayloadPayload from the OpenAPI spec:

https://github.com/meta-llama/llama-stack-client-swift/pull/16/files#diff-578a55eddee708891d54287f5042f61ea5852df57a16793042b4880ebdf23e40L5
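
For context on why a spec change surfaces as a missing type: the Components.Schemas namespace suggests the SDK's types are generated by swift-openapi-generator, which lowers a oneOf inside an array property into a nested "...PayloadPayload" enum, so deleting the oneOf from the spec deletes the generated type. A minimal sketch of the assumed shape follows; everything except ChatCompletionRequest and messagesPayloadPayload is illustrative, not the real SDK:

// Sketch of the assumed generated shape, not the real SDK code.
struct UserMessage { var content: String }     // illustrative variant type
struct SystemMessage { var content: String }   // illustrative variant type

enum Components {
    enum Schemas {
        struct ChatCompletionRequest {
            // One case per oneOf variant in the spec; this is the nested
            // type the compiler reports as missing after the spec change.
            enum messagesPayloadPayload {
                case user(UserMessage)
                case system(SystemMessage)
            }
            var messages: [messagesPayloadPayload]
        }
    }
}

Under these assumed shapes, any call site like ChatCompletionRequest(messages: [.user(UserMessage(content: "..."))]) stops compiling once the enum is gone, which matches the error above.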

Let me know if I should file a separate issue for this, since it's a bit tangential to the original issue.

@ashwinb @jeffxtang

jeffxtang (Contributor) commented:

@tleyden Yes, there are quite a few API spec changes in Llama Stack 0.1 that required updating the iOS Swift SDK; we have updated the remote inference part but not local inference yet. If you could create a quick PR to fix the local inference issue (perhaps referencing the remote inference changes in the SDK and demos), that would be great; otherwise we'll be working on a fix in the next couple of days.

tleyden (Author) commented Jan 30, 2025

@jeffxtang OK, thanks for the heads up. It's not blocking me right now, so if you're planning to fix it, I'll just wait until you get to it.

jeffxtang (Contributor) commented:

@tleyden Thanks for your patience. I just got iOS local inference working. If you can't wait, you'll need these 3 PRs:

1. iOS Swift SDK PR: meta-llama/llama-stack-client-swift#18
2. The LocalInferenceImpl PR: meta-llama/llama-stack#911
3. The demo PR: #163

Otherwise, I'll merge the PRs later (likely next week) and add READMEs with clear, quick steps to set everything up and run the demo.

tleyden (Author) commented Feb 3, 2025

@jeffxtang Exciting! I'll probably wait until the merge, but thanks for the early preview.

jeffxtang (Contributor) commented:

@tleyden All PRs are merged; please follow the steps here to use the LS Swift SDK 0.1.

Also, we plan to move the LocalInferenceImpl project from the llama-stack repo to llama-stack-client-swift in the near future to further simplify the setup, so the steps above may be updated then.
