Avoiding a monolithic JS file #561
There are tradeoffs to both approaches. This has been discussed in the past, but it's not super easy for me to find exactly where. Try going through #136 (comment) and #186. Changing from one to another is a significant architectural change, so the only answer I can give right now is that it's not currently possible and I don't know of anyone working on that direction.
I think #524 is very relevant to this issue.
That's the issue I was looking for, thanks @flimzy.
Thanks for the links, guys. cc @jackkleeman
I'm doing some experimental things on this at the moment. I've been working on a chrome plugin that establishes a "GopherJS standard runtime environment" in the browser, which includes all the big libs. The idea is that the client only needs to load the app-specific code for each GopherJS app, which should save massively on total payload size. The implementation is currently a mess, and there are lots of good reasons why this might never work.

One little tool that might be handy though: I did this "gopher-count" tool to analyse the generated output from the compiler. It's dead simple, but good for showing what's taking up the space each time you compile something:
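The gopher-count tool itself isn't shown here, but a minimal sketch of the same idea is easy to write. The sketch below assumes (and this is only an assumption about the output format, not documented GopherJS behaviour) that each package's generated code starts with a line containing a `$packages["<path>"]` assignment, and tallies bytes per package from there:

```go
package main

import (
	"bufio"
	"fmt"
	"sort"
	"strings"
)

// sizeByPackage scans GopherJS-style compiler output and tallies bytes per
// package. It assumes each package's code begins on a line containing
// `$packages["<import/path>"]`; adjust the marker for the real output.
func sizeByPackage(js string) map[string]int {
	sizes := make(map[string]int)
	current := "<preamble>" // bytes seen before the first package marker
	sc := bufio.NewScanner(strings.NewReader(js))
	for sc.Scan() {
		line := sc.Text()
		if i := strings.Index(line, `$packages["`); i >= 0 {
			rest := line[i+len(`$packages["`):]
			if j := strings.Index(rest, `"`); j >= 0 {
				current = rest[:j]
			}
		}
		sizes[current] += len(line) + 1 // +1 for the stripped newline
	}
	return sizes
}

func main() {
	// A tiny fabricated sample standing in for real compiler output.
	sample := `$packages["runtime"] = (function() {
var code;
})();
$packages["fmt"] = (function() {
var morecode; var evenmore;
})();`
	sizes := sizeByPackage(sample)
	keys := make([]string, 0, len(sizes))
	for k := range sizes {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	for _, k := range keys {
		fmt.Printf("%-10s %d bytes\n", k, sizes[k])
	}
}
```

Pointing something like this at two different builds makes it obvious which packages dominate the payload.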
@steveoc64 Why a chrome extension (assuming you don't really mean plugin, as in C/C++, DLL etc) instead of just separating the runtime into a separate .js bundle which could be served from a CDN etc? If served from a single CDN for all GopherJS applications, a user only needs to download that bundle once, then it's in the browser cache for (basically) ever.
@slimsag - yeah, I think ultimately, having a separated .js bundle served from a CDN would make a huge difference to propagating gopherjs development ... assuming that the biggest current "issue" is the generated filesize. Getting a stable set of CDN bundles would be a cool goal, and relatively do-able in the near term. Def worth a play.

Am playing with a chrome extension mainly as an excuse to learn how the github.com/fabioberger/chrome lib works, and explore possibilities. Nothing serious, but I'm thinking in terms of an Android APK style packaging for some specific web apps. I have a game in development that currently sits at 9+MB minified, almost half of which is net / http / fmt / crypto ... getting that load time down is an itch that I am trying to scratch. Playing with the chrome extension with that in mind.

Another approach (that is working quite well) is to bypass the Go networking layer by relying on calls through jQuery or other custom bits of JS to do simple AJAX or websocket calls. That approach is a relatively simple way to shave off 2+MB without messing up your app.

I have half a dozen commercial web-facing projects that I'm working on at the moment - they are all gopherjs at the front end, and load size is a primary concern for all of them. Even the leanest is sitting at around 2MB minified. Once you cross that 2MB line, there is not a huge difference in load times over the network, thanks to gzip compression. In all cases, I get to quickly load something interesting on the landing page, keep the user entertained, and cache up initial data entry until the main payload loads and executes. It's a hack that I'm not happy about, but it does work pretty well, and my clients are more than happy with it.

Dominic (of honnef.co fame) has rightly pointed out in other discussions that it's not the load time so much as the effort to decompress and parse all of that JS.
With that in mind, having a gopherjs bundle served over CDNs, and locally cached, will definitely be better, but might not be the silver bullet that makes the whole issue go away in the way that we might be thinking. We should try it and measure the difference though.

If I can be excused for uttering an engineering heresy - I think that this time next year, the prospect of loading a 4-6MB JS bundle up front on the cheapest available phone or chromebook, over a public wireless network, might actually be a non-issue, if Moore's law applies to the median internet user. Heresy yes, but we can live in hope.
For replacing Go networking, take a look at https://godoc.org/honnef.co/go/js/xhr. It's saved me a lot of work after I had initial headaches using jQuery.
Quite possibly but...
Exactly, which is why my various comments on the subject of "large files" or "splitting into packages" basically boil down to:
Solving #186 is a necessary pre-requisite in order that:
Agreed that #186 is a better option, and that performance is a factor of how much actual code hits the JS interpreter at the end of the day. Aggressive DCE would mean that "a standard gopherjs bundle" would not be very standard at all (as each app compile would cull different code) ... which in turn dilutes the usefulness of having a CDN bundle.

At the moment, "encoding/json" and "reflect" are two big contributors to compiled JS size, although I have noticed significant differences in the compiled size of each on different code bases when using the current master branch. Can I assume from this that there is already quite a bit of DCE going on?

eg: unicode on project 1 = 144372 bytes

Given these findings on the current master branch, a CDN approach is already looking like not that good an idea. Am I missing something obvious here?
I don't think so. At least I've reached the same conclusion in my thought process. But equally, I don't see an issue with this conclusion for now... many things may change that could make it the wrong conclusion in time!
The size is pretty reasonable if you stay out of stdlib territory. My conclusion so far is you pretty much just can't use encoding/json and net/http; removing those shaved 5MB off my test project haha.
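One way to drop encoding/json (and the reflect machinery it pulls in) is to hand-roll marshaling for the handful of types you actually send over the wire. A minimal sketch, with a hypothetical `User` type, using only strconv and strings (fmt appears only in the demo; in the browser you'd hand the string straight to an XHR call):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// User is a hypothetical payload type. Rather than encoding/json, which
// drags reflect into the compiled output, it builds its JSON by hand.
type User struct {
	Name  string
	Score int
	Admin bool
}

// MarshalJSONString emits the JSON encoding using only strconv/strings.
func (u User) MarshalJSONString() string {
	var b strings.Builder
	b.WriteString(`{"name":`)
	b.WriteString(strconv.Quote(u.Name)) // Quote handles string escaping
	b.WriteString(`,"score":`)
	b.WriteString(strconv.Itoa(u.Score))
	b.WriteString(`,"admin":`)
	b.WriteString(strconv.FormatBool(u.Admin))
	b.WriteString(`}`)
	return b.String()
}

func main() {
	fmt.Println(User{Name: "ann", Score: 42, Admin: true}.MarshalJSONString())
	// prints: {"name":"ann","score":42,"admin":true}
}
```

It's tedious compared to struct tags, but for a few payload types it's a cheap trade for megabytes of output.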
Hey, so I'm going to post this here too (I know this issue is closed, but the title much more closely represents the issue I'm solving):

GopherJS is an amazing tool, but I've always been frustrated by the size of the output JS. I've always thought a better solution would be to split the JS up by package and store it in a centralised CDN. This architecture would then allow aggressive caching: if you import

Today I'm announcing

Visit

After it's finished, you'll be shown a link to a page on jsgo.io: https://jsgo.io/dave/jstest. The compile page will also give you a link to a single JS file on

The compile server should be considered in beta right now... Please add issues on https://github.com/dave/jsgo if it's having trouble compiling your project. The package serving framework (everything in

Let me know what you think!
What strategies are there to avoiding GopherJS producing a monolithic JS file? Ideally I'd like to have a separate file per Go package. This would lead to smaller files that are easy to download (parallel TCP connections), and easy to cache (only the files that change need to be downloaded).
How can I prevent GopherJS from producing a monolithic JS file?