Library audit #332
So many of the library functions take a PDK Framework instance as the first argument. It seems to me that much of our library is crying out to be melded into the mixins.
I would agree with that.
This sounds true, but is code bloat a potential problem? I still don't understand the mechanics of the bundler.
I also agree.
To chime in on how things work: with a static dependency, the module name appears as a string literal directly in the `require` call, so the bundler can see exactly which file to inline. But with a dynamic dependency like

```lua
local clef_library = "library.clef"
local clef = require(clef_library)
```

the bundler can't tell which file is required. To do that properly, I'd have to keep track of every single reference to the variable. The only reliable way of ensuring the dependency is included is to include all the mixins. One could make some assumptions and look for code that matches known patterns, but that would not be reliable.
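To make the distinction concrete, here is a minimal sketch (in TypeScript, the same language as the bundler) of why static requires are resolvable and variable-based ones are not. The regex and function name are assumptions for illustration, not the bundler's actual implementation.

```typescript
// A static-analysis pass can only collect require() calls whose argument is a
// string literal. Anything else (a variable, an expression) is invisible to it.
const STATIC_REQUIRE = /require\s*\(\s*["']([^"']+)["']\s*\)/g

function findStaticDependencies(source: string): string[] {
  const deps: string[] = []
  let match: RegExpExecArray | null
  while ((match = STATIC_REQUIRE.exec(source)) !== null) {
    deps.push(match[1])
  }
  return deps
}

// Resolvable: the module name is right there in the call.
findStaticDependencies('local clef = require("library.clef")')
// → ["library.clef"]

// Not resolvable: the name lives in a variable the scanner never follows.
findStaticDependencies('local m = "library.clef"\nlocal clef = require(m)')
// → []
```

This is exactly why the fallback has to be "include all the mixins": the bundler can't prove a dynamically required file is unused.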
The comments would be possible to remove in future versions of the bundler. This would likely need to be custom-built, though. I tried using a Lua minifier to solve this and to make the code smaller in general, but all the minifiers I found produced buggy code.
It could, but I'm not too worried about that. Lua is a very fast interpreted language that gets fairly close to the speed of C++. For keeping Lua fast, the number of lines of code is really not an issue; what's far more important is making sure the time complexity of our algorithms stays low. It's also worth noting that parsing code is cheap, especially in comparison to actually running it. As far as I know, no one (aside from Jan Angermüller) has had real-time performance issues in JW Lua or RGP Lua. And Jan's performance issues were solved by creating a custom framework that reduces C++ interop between Lua and the Finale PDK API, not necessarily by running less Lua.
When it comes to bundled code bloat, the more pressing concern I'd have is the actual download size. The largest file is ~300 KB of Lua code. In the grand scheme of things, that's not a lot, but it is starting to get large enough to make me nervous (I've written a lot of code for websites, where every single kilobyte counts, so anything in double-digit KBs causes me concern). In reality, though, no one will care about that; even if it gets to a few MBs, that shouldn't cause issues. Regardless, limiting the file size is still likely a good idea, and this should probably be solved at the bundler level, not necessarily the source-code level. I am working on a side project to this side project that should help with this.
Something that will need to be considered if we convert (most of) the whole library to mixins is: is it possible to wrap an existing instance of a PDK class in a mixin?
I think stripping out docblocks from library includes would go a long way toward reducing file size. Maybe even something simple like a comment-stripping regex would work. @rpatters1
FYI, I tried removing comments in #334, however after merging I noticed some scripts were broken, so I rolled back the change in #336. Going to bed now, so I won't be able to look at this any further tonight. In case anyone wants to explore this more before I can, here's the actual function I wrote to remove all the comments: https://github.com/finale-lua/lua-scripts/blob/master/.github/actions/bundle/src/remove-comments.ts And the test cases:
Your first regex misses any multiline comment that opens with `--[[` on its own line, such as:

```lua
--[[
% get_raw_finale_version

Returns a raw Finale version from major, minor, and (optional) build parameters. For 32-bit Finale
this is the internal major Finale version, not the year.

@ major (number) Major Finale version
@ minor (number) Minor Finale version
@ [build] (number) zero if omitted
: (number)
]]
```

Then one of the subsequent regexes removes the closing `]]`, leaving broken code behind. Edit: and none of your tests include this case.
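For what it's worth, a regex alone struggles here because Lua long comments can use any bracket level (`--[[ ]]`, `--[==[ ]==]`, and so on), and a `--` inside a string literal must not be treated as a comment. A small character scanner handles both. This is a hedged sketch, not the bundler's actual code, and it deliberately ignores Lua long string literals (`[[...]]`), which a production stripper would also need to track.

```typescript
// Scanner-based Lua comment stripper: tracks long-bracket levels and skips
// quoted strings, two things a single regex pass gets wrong.
function stripLuaComments(source: string): string {
  let out = ""
  let i = 0
  // Returns the long-bracket level if source[pos] opens one ("[[", "[=[", ...), else -1.
  const longBracketLevel = (pos: number): number => {
    if (source[pos] !== "[") return -1
    let level = 0
    let j = pos + 1
    while (source[j] === "=") { level++; j++ }
    return source[j] === "[" ? level : -1
  }
  while (i < source.length) {
    const c = source[i]
    if (c === '"' || c === "'") {
      // Copy a quoted string verbatim, honoring backslash escapes.
      out += c; i++
      while (i < source.length && source[i] !== c) {
        if (source[i] === "\\") { out += source[i]; i++ }
        if (i < source.length) { out += source[i]; i++ }
      }
      if (i < source.length) { out += source[i]; i++ }
    } else if (c === "-" && source[i + 1] === "-") {
      const level = longBracketLevel(i + 2)
      if (level >= 0) {
        // Long comment: skip to the matching closer "]==...]" at the same level.
        const closer = "]" + "=".repeat(level) + "]"
        const end = source.indexOf(closer, i + 2)
        i = end === -1 ? source.length : end + closer.length
      } else {
        // Line comment: skip to end of line, keeping the newline itself.
        while (i < source.length && source[i] !== "\n") i++
      }
    } else {
      out += c; i++
    }
  }
  return out
}

stripLuaComments("--[[\n% get_raw_finale_version\n]]local y = 2")
// → "local y = 2"
```

The point is that correct comment removal for Lua is a small lexer problem, not a regex problem, which may explain why the regex approach in #334 broke some scripts.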
Since static analysis isn't a viable option for the time being, what about runtime analysis? I could write something that would output the names of all included mixins after a script has finished running. That could then be used to insert bundler directives at the top of a script file which would instruct the bundler to only include the listed mixins. For example:

```lua
-- #INCLUDE_MIXINS: __FCMBase FCMControl FCMCtrlEdit FCMCustomLuaWindow FCMCustomWindow FCMResourceWindow __FCMUserWindow __FCMCollection __FCMIteratorBase
```

I know there are a couple of class hierarchies that differ between Mac and Windows, but that can be handled by the output function, which would list both possibilities. If a bundler directive is not present, the whole mixin library would be included, as is currently the case.
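To illustrate, the bundler side of this could be as simple as scanning the script for the directive comment. The function name and regex below are hypothetical sketches of the idea, not part of the actual bundler.

```typescript
// Parse an opt-in inclusion directive from a script. Returns null when no
// directive is present, in which case the bundler would fall back to
// including the whole mixin library (the current behavior).
const MIXIN_DIRECTIVE = /^--\s*#INCLUDE_MIXINS:\s*(.+)$/m

function parseMixinDirective(source: string): string[] | null {
  const match = MIXIN_DIRECTIVE.exec(source)
  if (match === null) return null
  return match[1].trim().split(/\s+/)
}

parseMixinDirective('-- #INCLUDE_MIXINS: __FCMBase FCMControl\nlocal x = 1')
// → ["__FCMBase", "FCMControl"]

parseMixinDirective('local x = 1')
// → null
```

Keeping the directive a plain Lua comment means scripts stay runnable unbundled, and the fallback preserves today's include-everything behavior for anyone who opts out.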
I would be concerned that you can't guarantee that every path of the script has been executed. That means you might not be able to guarantee that every dependency was loaded.
I understand that. It would be up to the author to ensure that they had run all of the paths. And if there were too many paths to consider, or the author was not confident, they wouldn't have to use it, and the entire mixin library would be included as before. I guess my thinking was that it might be better than nothing to have the option there.
Oh, I see. This would be a tool for developers. (I was imagining an automated process.) Even then there would still be some risk, but it might be useful. It does seem like a lot of clutter the way it is now.
I can explain in more detail later, but I'm a bit short on time to write a detailed description.
The shared library is getting to be quite large. Certain library files are becoming too unfocused (e.g., the general library), and there are patterns of library functions that probably should be grouped together but aren't. In fact, one of the reasons I added the docs search to the site a few weeks ago was to be better able to find different library functions.
I think it's about time to do an audit of all the library functions and recategorize them. I imagine the majority of functions should stay in their current location, but I think having better organization overall would make things simpler to use.
To do this, I'd suggest we first hash out what the different files should be, then which functions need to be moved to which file. Then once we're in agreement, we should do the entire migration in one PR and quickly review/merge it. Since this migration will touch many if not most of the scripts, doing it over multiple PRs or taking a long time to review/merge the giant PR would cause many merge conflicts and more headaches. By making the actual changes swift, we can minimize the downsides of such a migration. This will require decent coordination on when the actual migration will take place to ensure we have one person to write the PR and another to merge it.
Thoughts?
@rpatters1 @ThistleSifter @cv-on-hub @CJGarciaMusic @jwink75