Introduce a binary API with Scala functions in Reflect #5161
Conversation
Force-pushed from 6b00b7f to b223873.
I think this is a worthwhile thing to do. Have you given any thought to whether we should attempt to make the reflect API Scala-agnostic as well? It seems it relies on Scala-only (i.e. non-Scala.js-IR) concepts only superficially. I don't know when we first designed this, but it might have been well before the times of the Scala-independent javalib. With that in place, I wonder if we should at least make the interface we design now Scala-independent.
We can make the new registration API not use any Scala type. But it won't change the fact that the public API exposes […]. We also need a Java type for the pair of […].
I've been thinking about this and I'm wondering if it would make more sense to specify registration in terms of IR opcodes. It's essentially functionality provided by the runtime.
But couldn't we make the core implementation rely on Java only and keep the Scala interface an adapter?
Yeah... feels a bit of a stretch after having read the Javadoc, but maybe :-/
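One way to picture the "Java-only core with a Scala adapter" idea is the following sketch. All names and signatures here are invented for illustration and are not the actual Reflect API: the core registry speaks only Java-friendly types, and a thin Scala layer adds `Option` and Scala functions on top.

```scala
import java.util.function.{Function => JFunction}
import scala.collection.mutable

// Hypothetical Java-only core: signatures use only String, Array and
// java.util.function types, so non-Scala compilers could target it too.
object ReflectCore {
  private val ctors = mutable.Map.empty[String, JFunction[Array[AnyRef], AnyRef]]

  def register(fqcn: String, ctor: JFunction[Array[AnyRef], AnyRef]): Unit =
    ctors(fqcn) = ctor

  // Returns null when absent, as a Java-flavored API would.
  def lookup(fqcn: String): JFunction[Array[AnyRef], AnyRef] =
    ctors.getOrElse(fqcn, null)
}

// Hypothetical Scala adapter: wraps the core with Option and Scala functions.
object ScalaReflect {
  def register(fqcn: String, ctor: Seq[Any] => Any): Unit =
    ReflectCore.register(fqcn, args => ctor(args.toSeq).asInstanceOf[AnyRef])

  def lookup(fqcn: String): Option[Seq[Any] => Any] =
    Option(ReflectCore.lookup(fqcn))
      .map(f => (args: Seq[Any]) => f(args.map(_.asInstanceOf[AnyRef]).toArray))
}

final class Point(val x: Int)

val registeredPoint: Unit =
  ScalaReflect.register("Point", args => new Point(args.head.asInstanceOf[Int]))
```

The open question in the thread is whether such a split pays for itself, given that the public lookup API would still want Scala types on the Scala side.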
#5183 is a variant that relies on a Java-only API in the […].
TY for the alternative in #5183. I'll comment here to keep the discussion in one thread.

The reflect API has been bothering me and I have been trying to pinpoint why. I think the core of it relates to the fact that, unlike most other things, the reflect API is not merely an interface to an existing construct (JS classes, dynamic imports, Java buffers, etc.) but an entirely new, standalone feature. IIUC it would not really be difficult to offer this in a completely separate compiler plugin (of course, since Scala.js core uses the feature, that's not an option).

From that point of view, I think we should explicitly discuss and decide whether we expect usages of […]. If yes, we should push relevant pieces down to at least the javalib, maybe even the IR.

To kick off that decision making, I've assembled an initial list of consequences of either decision:
From this cursory look, it seems that it might not be necessary / a good idea to attempt to generalize this. @sjrd WDYT? Did I miss anything in the list?
The […] From that perspective, the centralization was good because all testing frameworks could rely on a unique feature, without having to introduce their own compiler plugin and registry. Compiler plugins have a significant cost, since they are not transitively resolved. Every user of a testing framework would need an […].

An alternative design would have been to: […]
Another complication is that the feature is not actually implementable in "compiler plugin space": it needs to generate static initializers in classes. No other feature of the language, even seen at the backend, allows generating static initializers (without the side effect of exporting a fake top-level static field). To allow compiler plugins to enable Reflect, we would need to introduce another feature to generate static initializers.

Altogether, I think the centralization of […]
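To make the static-initializer point concrete, here is a rough plain-Scala model (all names invented; note that Scala objects initialize lazily on the JVM, whereas the static initializers Scala.js generates run at class load): each opted-in class gets registration code that runs as a side effect of its initialization.

```scala
import scala.collection.mutable

// Minimal stand-in for the Reflect registry (invented, for illustration only).
object TinyReflect {
  private val registry = mutable.Map.empty[String, () => Any]
  def register(fqcn: String, ctor: () => Any): Unit = registry(fqcn) = ctor
  def lookup(fqcn: String): Option[() => Any] = registry.get(fqcn)
}

final class Greeter {
  def greet: String = "hello"
}

// Moral equivalent of the static initializer the compiler would emit for
// Greeter: it registers the class as a side effect of initialization. This is
// exactly what ordinary compiler plugins cannot generate without new support.
object GreeterInit {
  TinyReflect.register("Greeter", () => new Greeter)
}

// Force the initializer (Scala objects are lazy; a real static initializer
// would run when the class is loaded).
val forcedGreeterInit: AnyRef = GreeterInit
```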
Thanks for the write-up here. I think it surfaces an important distinction: Reflect support is primarily a feature of the compilation subsystem, not the IR subsystem. In that sense, it has a very similar status to […].

However, I feel this discussion actually shows that 3rd-party library interop is crucial here, but not how I originally thought: if a library marks a type as […]

I think even just that makes a very strong case that we should aim to centralize the feature so it is available across compilation ecosystems (lest we are willing to sacrifice interoperability between them). We will probably have to live with the fact that some of the compile-time artifacts (e.g. annotations) are scalajs-namespaced, but probably that's OK. WDYT? (We probably still need to have the discussion of how to centralize, but that's the next step.)
I agree. Indeed, that's a good analysis. The opt-in via subtype across libraries is crucial here. |
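As a JVM analogy for that opt-in-via-subtype pattern (illustrative only: the marker trait and helper below are invented, and on Scala.js the equivalent lookup goes through `Reflect` rather than `Class.forName`): a base type published by one library is extended in another, and a framework instantiates the subclass purely from its fully-qualified name, the way JUnit does with `java.lang.reflect`.

```scala
// Invented marker: stands in for the opt-in base type / annotation a library
// would publish so that frameworks may instantiate its subtypes reflectively.
trait ReflectivelyInstantiable {
  def name: String
}

// "Library B": a concrete class opting in by extending the marker.
final class MySuite extends ReflectivelyInstantiable {
  def name: String = "MySuite"
}

// "Framework": instantiates from a fully-qualified class name alone, without
// a compile-time dependency on MySuite. Scala.js's Reflect exists to make
// this pattern work where java.lang.reflect is unavailable.
def instantiate(fqcn: String): ReflectivelyInstantiable =
  Class.forName(fqcn)
    .getDeclaredConstructor()
    .newInstance()
    .asInstanceOf[ReflectivelyInstantiable]
```

The interop point is that the marker type must be shared: if it lived in a per-compiler plugin, two ecosystems could not see each other's opted-in classes.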
And make its internals independent of JS interop. This way, it will be usable in Wasm without JS host. We use new names for the methods, instead of overloads, because older compilers look for those methods by name only. If an older compiler gets a newer library on the classpath, and the methods with those names became overloaded, the compiler would crash. In this commit, we do not change the compiler yet, to test that the deprecated methods are correct.
Force-pushed from 8741ebd to f12e99f.
Coming back to this to try and move it forward.

Given that (it seems) we agree that centralization is desirable, either this PR or #5183 would be appropriate (in some form or another). This PR centralizes among Scala compilers and frameworks, while #5183 centralizes even across several languages. Of course "several languages" is still hypothetical, but it's been a good design framework for the IR, so it's probably still relevant.

Even in the Java API variant, it is still left up to compilers "according to the rules of the source language" what classes actually get registered. In that sense, it's not clear that there would be true interoperability across languages. For example, if we reimplemented JUnit in a hypothetical Java-to-SJSIR compiler, the mechanism to register classes for reflective instantiation would not really make sense across languages. That's an argument in favor of the Scala API; other languages would have to offer their own reflect mechanism and API.

On the other hand, if we did have a Java-to-SJSIR compiler one day, we would probably want JUnit, of all things, to be usable equally from Java and Scala. The implementation could be either in Scala or Java, but the interface of the reflect system would have to be such that Java-produced SJSIR can use it. So that's an argument in favor of the Java API for […]

AFAICT, Kotlin does not use a […] So, basically, there's not much to learn from their design in our context.

Overall, I don't think there's a winner either way. Given that, and even though I kind of like the Java-API design, I'd suggest we stay with the current PR, with a Scala API. It's a much smaller diff compared to the status quo. And if in the future we really want a Java API, we could still take the next step and go there. WDYT?
And make its internals independent of JS interop.
This way, it will be usable in Wasm without JS host.
We use new names for the methods, instead of overloads, because older compilers look for those methods by name only. If an older compiler gets a newer library on the classpath, and the methods with those names became overloaded, the compiler would crash.
In the first commit, we do not change the compiler yet, to test that the deprecated methods are correct.
In the second commit, we change the compiler to use the new APIs.
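The naming move can be sketched like this (all signatures invented; in the real API the deprecated methods take JS function types, modeled here with `Array`-based functions): the old entry point keeps its exact name and signature, and the Scala-function variant gets a fresh name, because pre-existing compilers resolve these registration methods by name alone and would crash if that name suddenly resolved to two overloads.

```scala
import scala.collection.mutable

object ReflectRegistry {
  private val registry = mutable.Map.empty[String, List[Any] => Any]

  // Old entry point, kept byte-for-byte so already-compiled code still links.
  // (The real deprecated methods take JS function types; Array stands in here.)
  @deprecated("Use registerInstantiatableClassV2", "1.x")
  def registerInstantiatableClass(fqcn: String, ctor: Array[AnyRef] => AnyRef): Unit =
    registerInstantiatableClassV2(fqcn,
      args => ctor(args.map(_.asInstanceOf[AnyRef]).toArray))

  // New entry point under a *different* name ("V2" is an invented suffix):
  // adding an overload of the old name instead would break older compilers,
  // which look these methods up by name only.
  def registerInstantiatableClassV2(fqcn: String, ctor: List[Any] => Any): Unit =
    registry(fqcn) = ctor

  def instantiate(fqcn: String, args: List[Any]): Option[Any] =
    registry.get(fqcn).map(_(args))
}

final case class Box(value: Int)

val registeredBox: Unit =
  ReflectRegistry.registerInstantiatableClassV2(
    "Box", args => Box(args.head.asInstanceOf[Int]))
```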
This does not bring any immediate benefit; it's more for a long-term vision. But it also shouldn't hurt, besides the cost of maintaining a deprecated pair of methods. If we had had `NewLambda` all along, I think we would have designed this API using the Scala types to begin with. The `Reflect` object never pretended to be Scala-agnostic, since its public API is full of Scala types anyway.
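For reference, the Scala-typed surface in question looks roughly like this (a simplified model, not the exact `scala.scalajs.reflect` definitions): `Option` for lookups, `List` for constructors, and Scala functions to invoke them.

```scala
// Simplified model of the lookup side of the Reflect API; the real
// scala.scalajs.reflect classes differ in detail but have the same flavor.
final class InvokableConstructor(
    val parameterTypes: List[Class[_]],
    run: List[Any] => Any) {
  def newInstance(args: Any*): Any = run(args.toList)
}

final class InstantiatableClass(
    val runtimeClass: Class[_],
    val declaredConstructors: List[InvokableConstructor])

object ReflectModel {
  private var registry = Map.empty[String, InstantiatableClass]

  def register(fqcn: String, cls: InstantiatableClass): Unit =
    registry += fqcn -> cls

  // Option and List in the result types: Scala through and through.
  def lookupInstantiatableClass(fqcn: String): Option[InstantiatableClass] =
    registry.get(fqcn)
}

final class Pair(val a: Int, val b: Int)

val registeredPair: Unit = ReflectModel.register(
  "Pair",
  new InstantiatableClass(
    classOf[Pair],
    List(new InvokableConstructor(
      List(classOf[Int], classOf[Int]),
      args => new Pair(args(0).asInstanceOf[Int], args(1).asInstanceOf[Int])))))
```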