Bangle.js 2: Hardware SPI and speculative flash read #2632

Open
gfwilliams opened this issue May 8, 2025 · 5 comments

@gfwilliams
Member

gfwilliams commented May 8, 2025

Bangle.js 2 stores most of the program code in external flash and executes from there, doing relatively small (16 byte) reads. Right now we actually use software-based SPI for flash memory, as we usually need the data immediately, and with the overhead of setting up the nRF52's SPI peripheral via the HAL it's just faster to bit-bash 16 bytes.
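For context, a bit-bashed 16-byte read of this kind might look roughly like the sketch below. This is a hypothetical host-side simulation, not Espruino's actual implementation: `wire_mosi`/`wire_miso` and the simulated slave stand in for GPIO pins and the real flash chip, and it assumes the standard `0x03` READ command in SPI mode 0 (MSB first).

```c
#include <stdint.h>
#include <string.h>
#include <assert.h>

#define FLASH_SIZE 256
static uint8_t flash_mem[FLASH_SIZE];     /* stands in for the external flash array */

/* ---- simulated wires (GPIO pins on real hardware) ---- */
static int wire_mosi, wire_miso;

/* ---- simulated SPI flash slave (assumes command 0x03 = READ) ---- */
static struct {
  uint8_t  in_byte;  int in_bits;   /* bits received of the current byte */
  int      bytes_rx;                /* bytes received since CS went low */
  uint32_t addr;                    /* 24-bit read address */
  uint8_t  out_byte; int out_bits;  /* bits left to shift out */
} sl;

static void slave_select(void) { memset(&sl, 0, sizeof(sl)); }  /* CS low */

static void slave_clock_rising(void) {          /* mode 0: slave samples MOSI */
  sl.in_byte = (uint8_t)((sl.in_byte << 1) | wire_mosi);
  if (++sl.in_bits == 8) {
    sl.in_bits = 0;
    sl.bytes_rx++;
    if (sl.bytes_rx >= 2 && sl.bytes_rx <= 4)   /* bytes 2..4 are the address */
      sl.addr = (sl.addr << 8) | sl.in_byte;
  }
}

static void slave_clock_falling(void) {         /* mode 0: slave drives MISO */
  if (sl.bytes_rx >= 4) {                       /* cmd+address done: stream data */
    if (sl.out_bits == 0) {
      sl.out_byte = flash_mem[sl.addr++ % FLASH_SIZE];
      sl.out_bits = 8;
    }
    wire_miso = (sl.out_byte >> 7) & 1;
    sl.out_byte <<= 1;
    sl.out_bits--;
  }
}

/* ---- bit-bashed master: one full-duplex byte transfer ---- */
static uint8_t spi_xfer(uint8_t out) {
  uint8_t in = 0;
  for (int i = 7; i >= 0; i--) {
    wire_mosi = (out >> i) & 1;                 /* set MOSI            */
    slave_clock_rising();                       /* SCK high            */
    in = (uint8_t)((in << 1) | wire_miso);      /* master samples MISO */
    slave_clock_falling();                      /* SCK low             */
  }
  return in;
}

/* Read `len` bytes starting at `addr`: 0x03 command + 24-bit address */
static void flash_read(uint32_t addr, uint8_t *buf, int len) {
  slave_select();
  spi_xfer(0x03);
  spi_xfer((uint8_t)(addr >> 16));
  spi_xfer((uint8_t)(addr >> 8));
  spi_xfer((uint8_t)addr);
  for (int i = 0; i < len; i++) buf[i] = spi_xfer(0xFF);
}
```

On real hardware the `slave_clock_*` calls disappear and the master simply toggles SCK and reads the MISO pin; it's this tight per-bit loop, with no peripheral setup at all, that the nRF52 HAL's setup overhead is being compared against.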

Using the QSPI peripheral and memory-mapping is an option, but it makes power saving very difficult: we want to put the flash to sleep to save power, but then when we wake up in an interrupt, memory referenced by a pointer may be in external flash and so inaccessible because the flash is powered down.

I just added some instrumentation to jsvStringIteratorLoadFlashString to see if any caching would help, and I guess unsurprisingly most of the time we're reading contiguous areas of memory. In that case it's possible that if we could use the SPI hardware to read the next 16 bytes in the background while we're executing the current 16 bytes, we could get much better performance.
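The double-buffered "read ahead while executing" idea could be sketched as below. Again a hypothetical host-side model, not real firmware: `spi_read_background` stands in for a non-blocking SPIM+EasyDMA transfer (so here the "background" read just completes eagerly), and only blocking reads are counted to show the potential saving on contiguous access.

```c
#include <stdint.h>
#include <string.h>
#include <assert.h>

#define BLOCK 16
static uint8_t ext_flash[1024];           /* stands in for external flash */
static int blocking_reads;                /* reads the CPU had to wait for */

/* Blocking read: the slow path we're trying to avoid */
static void spi_read_blocking(uint32_t addr, uint8_t *buf) {
  memcpy(buf, &ext_flash[addr], BLOCK);
  blocking_reads++;
}

/* On real hardware this would kick off an SPIM+EasyDMA transfer and
   return immediately, overlapping with execution of the current block */
static void spi_read_background(uint32_t addr, uint8_t *buf) {
  if (addr + BLOCK <= sizeof(ext_flash))
    memcpy(buf, &ext_flash[addr], BLOCK);
}

/* Double buffer: `cur` is the block being executed,
   `next_buf` holds the speculatively-read following block */
static uint8_t  cur[BLOCK], next_buf[BLOCK];
static uint32_t next_addr = UINT32_MAX;

static void load_block(uint32_t addr, uint8_t *out) {
  if (addr == next_addr) {
    memcpy(cur, next_buf, BLOCK);         /* hit: data already fetched  */
  } else {
    spi_read_blocking(addr, cur);         /* miss: must wait for flash  */
  }
  spi_read_background(addr + BLOCK, next_buf);  /* speculate: next block */
  next_addr = addr + BLOCK;
  memcpy(out, cur, BLOCK);
}
```

With a contiguous run of blocks, only the first read blocks; every subsequent block is already in the second buffer by the time it's requested, which is exactly the pattern the instrumentation suggests dominates.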

... or maybe this isn't worthwhile and we should really be looking at using internal flash for the storage of JS, which appears to work OK apart from some issue with compaction (issue on it here).

@thyttan
Contributor

thyttan commented May 8, 2025

Both sound cool to me.

If I remember correctly from my testing and forum conversations, internal flash isn't big enough to store a somewhat larger number of apps?

I'm not able to judge the amount of work needed for either option, though, to gauge what's worth it 🤔🙂

@gfwilliams
Member Author

> internal flash is not big enough to store a somewhat larger amount of apps?

Right now we have around 200k free - it feels like that should be more than enough for the JS for pretty much as many apps as you want (just not all the other stuff). The idea was that JS would be written there automatically, and if there wasn't space it'd just go on the external flash as normal.
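That placement policy could be sketched as follows. All names here are hypothetical (this isn't Espruino's storage API), and only the ~200k internal-free figure comes from this thread:

```c
#include <stdint.h>
#include <assert.h>

typedef enum { STORE_INTERNAL, STORE_EXTERNAL } StoreArea;

/* ~200k of internal flash free, per the comment above */
static uint32_t internal_free = 200 * 1024;

/* Prefer fast internal flash for JS; transparently fall back
   to external flash when internal space runs out */
static StoreArea place_js(uint32_t size) {
  if (size <= internal_free) {
    internal_free -= size;
    return STORE_INTERNAL;
  }
  return STORE_EXTERNAL;
}
```

The point of the fallback is that apps never fail to install; they just silently lose the speed benefit once internal flash fills up, which is why the total size of installed JS matters.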

@gfwilliams
Member Author

> it feels like that should be more then enough for the JS for pretty much as many apps as you want

Actually I take that back! I just looked up the pretokenised JS size of the default apps and it's 97kB - so I think 200k might go surprisingly quickly :(

@thyttan
Copy link
Contributor

thyttan commented May 8, 2025

Ooh - I feel vindicated! 😅😆 Sorry to be, though 🙃

@gfwilliams
Member Author

The internal flash build is now fixed (I hope) and it exposed a flash memory corruption issue that was probably causing problems for Puck.js/Jolt.js too. I took Tensorflow out and got us up to 250k, and removing the factory restore would get us up to 350k if we really wanted to do that.

... but it really is nice and fast now - it's a big improvement!
