I've had this on the back-burner for a while, but I'm still very
excited about this feature, and I've finally got some functioning code.
== How it works: ==
When --autocompile is given (I believe strongly that this should be
the default, as it is in this patch), the solver generates a synthetic
implementation based on the information embedded in each
source implementation's `compile` command.
These synthetic implementations have a "*" machine type, and depend
on the corresponding source implementation's compile command. The old
selection behaviour can be forced by passing --binary, and --source
should still work as it used to.
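As a rough sketch of the idea (a Python toy model, not the actual OCaml code; every field name here is invented for illustration):

```python
# Illustrative model only: the real patch manipulates 0install's OCaml
# feed types. Field names below are invented for this sketch.

def make_synthetic_impl(source_impl):
    """Derive a synthetic binary implementation from a source one.

    The synthetic impl advertises machine type "*" and carries the
    source implementation's runtime requirements, so the solver can
    treat "could be compiled" like any other installable candidate.
    """
    return {
        "id": source_impl["id"] + "#synthetic",
        "arch": "*-*",                          # any OS, "*" machine type
        "requires": list(source_impl["requires"]),
        "compiles_from": source_impl["id"],     # link back for 0compile
    }

src = {"id": "sha256=abc", "arch": "*-src",
       "requires": ["http://example.com/python.xml"]}
syn = make_synthetic_impl(src)
print(syn["arch"])           # *-*
print(syn["compiles_from"])  # sha256=abc
```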
These synthetic implementations are only used to bring the runtime
dependencies & the corresponding source implementation into the solve -
they don't end up appearing in selections. So the public interface
doesn't really change, except that selections can now end up with
*-src implementations in them.
If the solve results in any "src" selections, the `run` command passes
the selections document to `0compile autocompile --selections`. Assuming
that works, it then re-solves (forcing binary-only selections) and
runs the result. Currently only the `run` command will do any
compilation. `select` may return source impls, but they won't actually
be compiled until you try to `run` that selections document.
`download` should download everything required for both runtime &
compilation, though I haven't actually tested that.
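The flow above, sketched in Python with the solver and executor stubbed out (the real implementation lives in the OCaml `run` command; only the `0compile autocompile --selections` invocation is taken from this mail):

```python
# Sketch of the run flow. solve/compile_selections/execute are injected
# so this stays self-contained; in reality they'd call into the solver,
# shell out to `0compile autocompile --selections`, and exec the result.

def run_with_autocompile(uri, solve, compile_selections, execute):
    sels = solve(uri, allow_source=True)        # may contain *-src impls
    if any(s["arch"].endswith("-src") for s in sels):
        compile_selections(sels)                # hand off to 0compile
        sels = solve(uri, allow_source=False)   # re-solve, binary only
    return execute(sels)

# Tiny demo with fakes: the first solve yields a source impl, and the
# re-solve (after "compilation") yields the freshly built binary.
state = {"compiled": False}

def fake_solve(uri, allow_source):
    if allow_source and not state["compiled"]:
        return [{"arch": "*-src"}]
    return [{"arch": "x86_64-linux"}]

def fake_compile(sels):
    state["compiled"] = True

result = run_with_autocompile("http://example.com/app.xml",
                              fake_solve, fake_compile, lambda s: s)
print(result)  # [{'arch': 'x86_64-linux'}]
```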
Since `run` is already running arbitrary code, it seems perfectly safe
to have autocompilation be the default. The only case that might cause
surprise is if you are using --wrapper to run inside some secure
sandbox, but the compilation step obviously runs outside of that. The
benefit of having source feeds "just work" seems well worth it
to me, and we could potentially require an explicit
--source/--binary/--autocompile when --wrapper and similar advanced
options are used, if we're worried about people being caught out by
this (I'm not, personally).
I'm including both build dependencies & runtime dependencies in the
same solve.
This ensures we get a usable result (e.g. if we used python 3 to build
but some runtime dependency requires python 2, that would end up with
an unworkable result). It does mean we have higher chance of clashes
if build & runtime deps conflict, but that seems somewhat unlikely.
You can always manually run 0compile to break such a conflict, but
it's presumably bad practice to have feeds that have such conflicts.
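A toy illustration of why build & runtime deps belong in one solve (hypothetical constraints, nothing like 0install's actual solver):

```python
# Toy: pick a python version acceptable to every constraint from both
# the build and runtime dependency sets, or report the conflict.

def pick_python(build_needs, runtime_needs, available=("2", "3")):
    for version in available:
        if all(version in ok for ok in build_needs + runtime_needs):
            return version
    return None  # genuine build/runtime conflict; run 0compile by hand

# The builder is happy with either python, but a runtime dep insists
# on 2. Solving the phases separately, the build might pick 3 and
# produce an unworkable result; solving together settles on 2 for both.
print(pick_python(build_needs=[{"2", "3"}], runtime_needs=[{"2"}]))  # 2
print(pick_python(build_needs=[{"3"}], runtime_needs=[{"2"}]))       # None
```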
== Issues: ==
I haven't even tried to compile the tests. I thought I'd get the
go-ahead from Thomas that this looks like a reasonable approach before
putting too much effort into the tests.
There are a couple of TODO/XXX notes, those will have to be worked out
(or accepted) before merging.
I haven't done any testing to see how this affects existing solves.
e.g making sure the new synthetic implementations don't interfere with
--source or --binary solves.
I haven't tested a recursive build yet (source impls that need other
sources to be built first).
This pulls 0compile-namespaced elements into selections documents.
The prefix comes out as "ns:", which is a bit lame. I haven't figured
out how to change this yet.
0compile autocompile doesn't support both --gui and --selections, yet.
When testing, I noticed that existing apps' requirements JSON has
source=false, so they won't be autocompiled. That makes sense for the
old code, but is equivalent to passing --binary with the new code. So
you won't get autocompilation on an existing alias that adds source
impls, you'll have to run the interface URI specifically (or possibly
`0install upgrade --autocompile` will work).
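For context, a minimal sketch of what such a stored requirements document might look like (only the `source` field is taken from this mail; the rest of the layout is a guess):

```python
import json

# Hypothetical requirements document for an existing app/alias; only
# the "source" field comes from the mail, the rest is illustrative.
stored = '{"interface_uri": "http://example.com/app.xml", "source": false}'
reqs = json.loads(stored)

# source=false is equivalent to --binary under the new code, so this
# app won't pick up autocompilation until its requirements change.
print(reqs["source"])  # False
```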
== Feedback ==
This is my first nontrivial patch since the ocaml rewrite, and it's
also something I've made attempts at in the previous python version.
Surprisingly (well, to me at least) I found modifying an unfamiliar
ocaml codebase much easier than a python one. Partially because the
explicit .mli files give a good summary of each module's surface area,
but also because modifying the type of something and then fixing the
errors gives you a good way to start figuring out how to actually
enact a certain change. And in general, the compiler keeps you from
straying too far into code that doesn't even make sense, compared to
python where you often have to keep trying to trigger a certain code
path a number of times before you figure out what you can actually do
with a given object.
So yeah, that's pretty positive, considering I've written a lot of
python code over a number of years, while I only started picking up
ocaml when the ZI port was announced.
Unless anyone objects, I'll be rebasing these branches as I change
things. Let me know if you'd rather I make a new branch or
append-only commits, but I figure rebasing makes for a cleaner history
when it comes to merging.
On Mon, May 19, 2014 at 6:39 PM, Thomas Leonard wrote:
> On 11 May 2014 11:38, Tim Cuthbertson wrote:
>> Another one relating to 0downstream-generated feeds:
>> Basically I am generating a lot of feeds from upstream sources (like
>> pypi), and a lot of these are feeds that require compilation - either
>> because they are actually native code, or because they do stuff like
>> using 2to3 to make python2 code into python3 code during a build step.
>> I don't have any intent to host all of the built archives, so I'm
>> leaving each of these generated feeds without binary implementations.
>> Or, in some cases, multiple implementations - a "*-*" feed direct from
>> pypi which requires python 2, but a *-src implementation (of the same
>> archive) which requires compilation but can then be used with python 3.
>> Issue #1:
>> When I'm building the `sniffer` library for python 3, I know that it
>> depends on setuptools. Currently I have done nothing special for
>> setuptools, so it itself is an 0downstream-generated feed with the
>> above awkward structure (you can use it as-is for python2, but there's
>> only a source implementation for python3).
>> My heuristics currently try to run the feed as portable code ("*-*"),
>> and if that fails it just assumes that it requires compilation (and if
>> that fails, it bails). This is reasonable enough on the first level
>> (i.e for setuptools itself), but when it comes to something
>> _depending_ on setuptools, it will fail not because the feed requires
>> compilation, but because setuptools does. So if I try to run 0compile
>> on sniffer, it'll work (and think that it actually needed
>> compilation), but only because that also compiles setuptools - sniffer
>> itself may not have needed compilation! And this is horribly stateful
>> - the next time I add a feed depending on setuptools, it _will_ have
>> been autocompiled already and so it'll actually work the first time
>> round as portable code (*-*).
>> Issue #2:
>> Some generated feeds have multiple implementations that diverge for
>> python 2 and python 3 (e.g different dependencies), which _both_
>> require compilation. AFAIK it is currently impossible to cause a
>> specific implementation to be built by 0compile, it'll just select the
>> "best" one (python3 I guess?). But this isn't going to be what I need
>> in order to build a python2 feed which depends on it.
>> I'd say both of these issues point towards needing better compile
>> integration with 0install.
>> Either 0compile can have a "prepare" mode, where you say "I want to
>> run `x`". And it does a selection (just like ZI), but including both
>> source and binary impls. If it gets any source impls in the best
>> selection, it compiles them.
>> This would cover the case of something being portable but depending on
>> a not-yet built feed - the dependency would be built, even though the
>> thing you specified on the command-line didn't itself need building.
>> It would, however, mean that if you were happy to autocompile things,
>> the *correct* thing to do is run `0compile --make-sure-i-can-run x &&
>> 0install run x`. Which nobody but a computer will remember to do.
>> So the alternative (which seems much better to me) is that 0install
>> gets an --autocompile option, which causes it to not reject source
>> implementations during selection. After selection, if it picked any
>> source implementations, the whole selections document (or something
>> similarly detailed) is passed off to 0compile, so that it can not only
>> compile the right version, but *also* use the given selections during
>> compilation (for any interfaces that are required both at runtime and
>> compile time, like `python`). This would ensure
>> that the resulting compiled feed would actually _work_ for the purpose
>> you wanted to use it, which is an exceedingly nice property. But it
>> still keeps 0compile out of 0install core, so that it can be updated
>> independently.
>> So, how about it? Do either of these approaches seem practical? I'd be
>> glad to help out with this, given some guidance on how best to go
>> about it (and if it's possible).
> Yes, this is exactly what is needed.
> "Not rejecting" source implementations isn't quite enough. You also
> want to ignore their build dependencies and infer their runtime
> dependencies (as far as possible). Also, you need to rank them so that
> an existing binary is preferred over recompiling the same version. And
> infer the commands that will be created (0compile always assumes a
> "run" command will appear, which covers most cases).
> The selections XML format will need to be updated slightly to be able
> to indicate source implementations, and 0compile modified to take such
> a document as input.
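A minimal sketch of the ranking rule described above (prefer an existing binary over recompiling the same version; the scoring is invented for illustration):

```python
# Toy ranking: among candidates, prefer higher versions, and within a
# version prefer a real binary over a synthetic needs-compiling impl.

def rank_key(impl):
    # Tuples sort lexicographically; False (a real binary) sorts first.
    return (-impl["version"], impl["needs_compile"])

same_version = [
    {"id": "src-1.0", "version": 1, "needs_compile": True},
    {"id": "bin-1.0", "version": 1, "needs_compile": False},
]
best = min(same_version, key=rank_key)
print(best["id"])  # bin-1.0: no point recompiling the same version
```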
> Dr Thomas Leonard http://0install.net/
> Zero-install-devel mailing list
> [email protected]