do not add all extension libraries to bootclasspath #8472

Closed
scabug opened this issue Apr 3, 2014 · 6 comments


scabug commented Apr 3, 2014

https://github.com/scala/scala/blob/v2.11.0-RC3/src/compiler/scala/tools/ant/templates/tool-unix.tmpl#L88

This causes difficulties for anyone trying to use a different Akka version via the scala script: the akka-actor.jar shipped with the Scala distribution is loaded from the boot classpath, which takes precedence over whatever is specified with the -cp option.
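To illustrate the mechanism, here is a minimal shell sketch of the jar-globbing step (simplified from the template; the temp directory and jar names below are stand-ins for a real installation, not the actual template code):

```shell
# Sketch: the launcher template globs every jar under lib/ into one
# classpath string. Demo layout only; jar files here are empty stand-ins.
SCALA_HOME=$(mktemp -d)
mkdir -p "$SCALA_HOME/lib"
touch "$SCALA_HOME/lib/akka-actor.jar" "$SCALA_HOME/lib/scala-library.jar"

TOOL_CLASSPATH=""
for ext in "$SCALA_HOME"/lib/*.jar; do
  TOOL_CLASSPATH="$TOOL_CLASSPATH:$ext"
done
TOOL_CLASSPATH="${TOOL_CLASSPATH#:}"   # drop the leading ':'

# The launcher passes this whole string to the JVM on the boot classpath,
# so every jar in lib/ (including akka-actor.jar) shadows anything the
# user supplies via -cp.
echo "$TOOL_CLASSPATH"
```

Because the glob picks up *every* jar in lib/, there is no way to exclude akka-actor.jar from the boot classpath short of editing the installation or bypassing the script.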


scabug commented Apr 3, 2014

Imported From: https://issues.scala-lang.org/browse/SI-8472?orig=1
Reporter: @rkuhn
Affected Versions: 2.10.4, 2.11.0-RC4


scabug commented Apr 3, 2014

@retronym said:
scalac -nobootcp might be a workaround in the meantime
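As a concrete (hypothetical) invocation of that workaround, assuming a Scala 2.10/2.11 installation on the PATH and placeholder jar and class names:

```shell
# Hypothetical example: jar names and the main class are placeholders.
# -nobootcp tells the launcher to put the distribution jars on the
# regular classpath instead of the boot classpath, so the user's own
# akka jar, listed first in -cp, can take precedence.
scala -nobootcp -cp akka-actor-2.2.3.jar:myapp.jar com.example.Main
```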


scabug commented Apr 3, 2014

@rkuhn said:
Yes, but the problem is mostly one of discoverability: it usually starts with seeing weird behavior (such as a feature that should be there being missing), then asking on StackOverflow, then, after some detours, being pointed at this option. The user is actively putting Akka on the classpath, which is why they have trouble seeing how that could fail to take effect.


scabug commented May 6, 2014

@retronym said:
Reported again by the Spark team:

Spark users have been running into an annoying issue where they can't launch Spark using the `scala` command because it automatically injects a fixed version of akka on the classpath[1]. Spark uses a different version of akka that is incompatible with the one included with Scala 2.10 (or at least appears to be, based on these binary errors). We are using akka 2.2.3.
I'd guess that other projects have run into this, so I wonder if there is a more straightforward work-around.
My proposal was to have a flag that doesn't load extra libraries:
scala -langonly XXX
scala -noakka XXX
Right now we tell the users to launch with `java` which is definitely more clunky.
[1] http://apache-spark-developers-list.1001551.n3.nabble.com/Akka-problem-when-using-scala-command-to-launch-Spark-applications-in-the-current-0-9-0-SNAPSHOT-td2.html


scabug commented Sep 14, 2015

@SethTisue said:
The Spark/Akka part of this is resolved in Scala 2.12. There is no more Akka jar in the lib directory.

scala-parser-combinators, scala-xml, and scala-swing remain affected. It's not clear if any change should be made there for 2.12. XML is still part of the language, and scala-parser-combinators and scala-swing have chapters in Programming in Scala, so it's not clear that unbundling them altogether is desirable. And as for doing something else... there's no obvious way for a beginner to shoot themselves in the foot like there is with Spark+Akka, so maybe it's not worth touching.


SethTisue commented Oct 24, 2023

scala-parser-combinators, scala-xml, and scala-swing remain affected

not anymore, these were all unbundled

the scala script is pretty legacy these days anyway; scala-cli is the new hotness

it's not clear this is otherwise actionable or that anyone is still concerned. hence, closing

SethTisue closed this as not planned on Oct 24, 2023.