Avoiding a monolithic JS file #561

Closed
JustinDrake opened this issue Dec 22, 2016 · 14 comments

@JustinDrake

What strategies are there to avoid GopherJS producing a monolithic JS file? Ideally I'd like to have a separate file per Go package. This would lead to smaller files that are easier to download (via parallel TCP connections) and easier to cache (only the files that change need to be re-downloaded).

How can I prevent GopherJS from producing a monolithic JS file?

@dmitshur
Member

There are tradeoffs to both approaches. This has been discussed in the past, but it's not super easy for me to find exactly where. Try going through #136 (comment) and #186.

Changing from one to another is a significant architectural change, so the only answer I can give right now is that it's not currently possible, and I don't know of anyone working in that direction.

@flimzy
Member

flimzy commented Dec 23, 2016

I think #524 is very relevant to this issue.

@dmitshur
Member

That's the issue I was looking for, thanks @flimzy.

@JustinDrake
Author

Thanks for the links, guys. cc @jackkleeman

@steveoc64
Contributor

I'm doing some experimental things on this at the moment. I've been working on a Chrome plugin that establishes a "GopherJS standard runtime environment" in the browser, which includes all the big libs. The idea is that the client only needs to load the app-specific code for each GopherJS app, which should save massively on total payload size.

The implementation is currently a mess ... and there are lots of reasons why this might never work. Lots of good reasons.

One little tool that might be handy though: I wrote this "gopher-count" tool to analyse the generated output from the compiler. It's dead simple, but good for showing what's taking up the space each time you compile something:

https://github.com/steveoc64/gopher-count
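To illustrate the idea behind such a tool (this is a sketch, not gopher-count's actual implementation): GopherJS output registers each compiled package under a `$packages["import/path"]` key, so splitting the blob at those markers gives rough per-package byte counts. The marker format is an implementation detail of current compiler output and may change.

```go
package main

import (
	"fmt"
	"regexp"
)

// perPackageSizes splits a GopherJS output blob into rough per-package
// byte counts, assuming each package body begins at a
// $packages["import/path"] registration marker. Each package's span
// runs from its marker to the next marker (or end of file).
func perPackageSizes(js string) map[string]int {
	re := regexp.MustCompile(`\$packages\["([^"]+)"\]`)
	locs := re.FindAllStringSubmatchIndex(js, -1)
	sizes := make(map[string]int)
	for i, m := range locs {
		end := len(js)
		if i+1 < len(locs) {
			end = locs[i+1][0] // stop at the next package's marker
		}
		name := js[m[2]:m[3]] // submatch 1: the import path
		sizes[name] += end - m[0]
	}
	return sizes
}

func main() {
	// Tiny synthetic blob standing in for real compiler output.
	blob := `$packages["fmt"] = (function(){/* fmt code */})();` +
		`$packages["encoding/json"] = (function(){/* json code */})();`
	for name, n := range perPackageSizes(blob) {
		fmt.Printf("%-16s %d bytes\n", name, n)
	}
}
```

Running something like this after each compile makes it obvious which stdlib imports dominate the payload.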

@emidoots

emidoots commented Jan 5, 2017

@steveoc64 Why a chrome extension (assuming you don't really mean plugin, as in C/C++, DLL etc) instead of just separating the runtime into a separate .js bundle which could be served from a CDN etc? If served from a single CDN for all GopherJS applications, a user only needs to download that bundle once then it's in the browser cache for (basically) ever.

@steveoc64
Contributor

@slimsag - yeah, I think ultimately, having a separate .js bundle served from a CDN would make a huge difference to propagating gopherjs development ... assuming that the biggest current "issue" is the generated file size. Getting a stable set of CDN bundles would be a cool goal, and relatively doable in the near term. Definitely worth a play.

I'm playing with a Chrome extension mainly as an excuse to learn how the github.com/fabioberger/chrome lib works, and to explore possibilities. Nothing serious, but I'm thinking in terms of an Android APK style packaging for some specific web apps.

I have a game in development that currently sits at 9+MB minified, almost half of which is net / http / fmt / crypto ... getting that load time down is an itch that I am trying to scratch. I'm playing with the Chrome extension with that in mind.

Another approach (that is working quite well) is to bypass the Go networking layer entirely and make simple AJAX or websocket calls through jQuery or other custom bits of JS. That is a relatively simple way to shave off 2+MB without messing up your app.

I have half a dozen commercial web-facing projects that I'm working on at the moment - they are all gopherjs at the front end, and load size is a primary concern for all of them. Even the leanest is sitting at around 2MB minified. Once you cross that 2MB line, there is not a huge difference in load times over the network, thanks to gzip compression.

In all cases, I get to quickly load something interesting on the landing page, keep the user entertained, and cache up initial data entry until the main payload loads and executes. It's a hack that I'm not happy about, but it does work pretty well, and my clients are more than happy with it.

Dominic (of honnef.co fame) has rightly pointed out in other discussions that it's not the load time so much as the effort to decompress and parse all of that JS. With that in mind, having a gopherjs bundle served over CDNs, and locally cached, will definitely be better, but might not be the silver bullet that makes the whole issue go away in the way that we might be thinking. We should try it and measure the difference though.

If I can be excused for uttering an engineering heresy - I think that this time next year, the prospect of loading a 4-6MB JS bundle up front on the cheapest available phone or chromebook, over a public wireless network, might actually be a non-issue, if Moore's law applies to the median internet user. Heresy yes, but we can live in hope.

@jackkleeman

For replacing Go networking, take a look at https://godoc.org/honnef.co/go/js/xhr. It's saved me a lot of work after I had initial headaches using jquery.

@myitcv
Member

myitcv commented Jan 5, 2017

@steveoc64

If I can be excused for uttering an engineering heresy - I think that this time next year, the prospect of loading a 4-6MB JS bundle up front on the cheapest available phone or chromebook, over a public wireless network, might actually be a non-issue, if Moore's law applies to the median internet user. Heresy yes, but we can live in hope.

Quite possibly but...

Dominic (of honnef.co fame) has rightly pointed out in other discussions that it's not the load time so much as the effort to decompress and parse all of that JS.

Exactly, which is why my various comments on the subject of "large files" or "splitting into packages" basically boil down to:

  • Solve Optional aggressive dead code elimination #186 first
  • Then re-assess whether "large files" are still a problem, in which case examine whether splitting into package files (served via CDN) makes sense (there are of course other approaches).

Solving #186 is a necessary prerequisite in order that:

@steveoc64
Contributor

Agreed that #186 is a better option, and that performance is a factor of how much actual code hits the JS interpreter at the end of the day.

Aggressive DCE would mean that "a standard gopherjs bundle" would not be very standard at all (as each app compile would cull different code) ... which in turn dilutes the usefulness of having a CDN bundle.

At the moment, "encoding/json" and "reflect" are two big contributors to compiled JS size, although I have noticed significant differences in the compiled size of each on different code bases when using the current master branch. Can I assume from this that there is already quite a bit of DCE going on?

e.g.:
encoding/json on project 1 = 76461 bytes
encoding/json on project 2 = 154335 bytes

unicode on project 1 = 144372 bytes
unicode on project 2 = 42742 bytes

Given these findings on the current master branch, a CDN approach is already looking like not such a good idea. Am I missing something obvious here?

@myitcv
Member

myitcv commented Jan 14, 2017

Given these findings on the current master branch, a CDN approach is already looking like not such a good idea. Am I missing something obvious here?

I don't think so. At least I've reached the same conclusion in my thought process.

But equally, I don't see an issue with this conclusion for now... many things may change that would make it the wrong conclusion in time!

@tj

tj commented Feb 18, 2017

The size is pretty reasonable if you stay out of stdlib territory. My conclusion so far is that you pretty much just can't use encoding/json and net/http - removing those shaved 5MB off my test project, haha.
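Avoiding encoding/json in practice usually means hand-writing marshalling for the handful of concrete types an app actually sends, which sidesteps the reflect machinery that package drags into the bundle. A minimal sketch (the `User` type and field names are hypothetical, and `strconv.Quote` uses Go string escaping, which matches JSON for simple ASCII strings but not every edge case):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// User is a hypothetical payload type.
type User struct {
	Name string
	Age  int
}

// marshalUser hand-writes JSON for one concrete type, so no
// reflect-based encoder ends up in the compiled output.
func marshalUser(u User) string {
	var b strings.Builder
	b.WriteString(`{"name":`)
	b.WriteString(strconv.Quote(u.Name)) // Quote handles escaping
	b.WriteString(`,"age":`)
	b.WriteString(strconv.Itoa(u.Age))
	b.WriteString(`}`)
	return b.String()
}

func main() {
	fmt.Println(marshalUser(User{Name: "gopher", Age: 7}))
	// → {"name":"gopher","age":7}
}
```

Tedious compared to struct tags, but for a few types it is a small price for multi-megabyte savings.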

@dmitshur
Member

dmitshur commented Feb 13, 2018

This issue and the discussion here seem very similar to #524. I'll close this in favor of that issue, so there's one unified place for the "having GopherJS produce multiple files, one per Go package, rather than a single JavaScript file" topic.

Please feel free to continue in #524.

@dave
Contributor

dave commented Feb 22, 2018

Hey, so I'm going to post this here too (I know this issue is closed, but the title much more closely represents the issue I'm solving):

GopherJS is an amazing tool, but I've always been frustrated by the size of the output JS. I've always thought a better solution would be to split the JS up by package and store it in a centralised CDN.

This architecture would then allow aggressive caching: If you import fmt, it'll be delivered as a separate file fmt.js, and there's a good chance some of your visitors will already have it in their browser cache. Additionally, incremental updates to your app will only change the package you're updating, so your visitors won't have to download the entire dependency tree again.

Today I'm announcing jsgo.io, which is this system. Here's how it works:

Visit https://compile.jsgo.io/<path> to compile or re-compile your package. Here's a very simple hello world - just click Compile: https://compile.jsgo.io/dave/jstest

After it's finished, you'll be shown a link to a page on jsgo.io: https://jsgo.io/dave/jstest. The compile page will also give you a link to a single JS file on pkg.jsgo.io - this is the loader for your package. Add this URL to a <script> tag on your site and it will download all the dependencies and execute your package.
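Embedding the loader would then look something like this (the URL here is illustrative only; the real pkg.jsgo.io loader URL is the one shown on your package's compile results page):

```html
<!-- Illustrative only: use the loader URL that compile.jsgo.io
     shows for your own package, not this made-up one. -->
<script src="https://pkg.jsgo.io/path/to/your/package.js"></script>
```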

The compile server should be considered in beta right now... Please add issues on https://github.com/dave/jsgo if it's having trouble compiling your project. The package serving framework (everything on pkg.jsgo.io) should be considered relatively production-ready - it's just static JS files in a Google Storage bucket behind a Cloudflare CDN, so there's very little that can go wrong.

Let me know what you think!
