Post by Chris Withers
Right, but how do you make sure the .yml file matches what you actually
have installed?
That's guaranteed (assuming no bugs :-) )
I beg to differ ;-)
conda create -n myenv package1
source activate myenv
conda env export > environment.yaml
conda install package2
It's guaranteed to match WHEN you do the export, of course.
environment.yaml is now out of date.
Post by Chris Withers
Put differently, how can I get conda (install|upgrade|remove) to update
the .yml file each time it runs?
You can't; you do a new export yourself.
Right, I'm wondering if maybe install/upgrade/remove should maintain one
or both of environment.yaml and environment.lock.yaml?
Does pip or virtualenv really do that? It sure didn't before I gave up on
virtualenv :-)
Post by Chris Withers
Assuming the install/upgrade/remove tooling was maintaining it, why not? I
agree that I wouldn't check the modified files into source control until I
was sure I wanted the changes, but I'd certainly like the file(s) to be
accurate at all times!
I can see the logic, but I still think it's at the wrong point in your
workflow. Saying "this is the official deployment environment" should be a
pretty deliberate step. If it's updating itself constantly as I update the
environment, I'm not sure I see a point in having it at all.
Imagine multiple developers, each manipulating their environments on the
fly differently -- it seems ripe for confusion and error.
And if there are multiple developers, then you have a two-way street:
Developer A makes changes to their environment -- the environment.yaml file
updates itself.
Developer B makes different changes to their environment -- the
environment.yaml file updates itself.
They both merge into master.
Now we have devA's environment, devB's environment, and a third merged
version (which could be broken by merge conflicts...).
Devs A and B do a pull.
Now the environment file is out of date with the developers' environments...
Does the environment somehow magically update itself? Or does it save its
current state back into the environment file, thereby downgrading the
environment again?
Anyway, I'm not saying an automated workflow for this couldn't be devised,
but I'm suggesting that conda probably isn't the place for that automation.
I'm likely going to resurrect picky-conda [2],
hmm, that could be handy, yes.
Post by Chris Withers
conda create -n myenv
source activate myenv
"install packages"
picky lock
*do more*
picky check
So, "install packages" here could mean "conda install foo", but that won't
track which packages you've installed explicitly versus ones that have come
in as dependencies, so I'd maybe suggest adding them to a bare-bones
environment.yaml and installing with something like:
conda env update -f environment.yaml
I'm not sure I've arrived at the "best" way to do it, but I currently use
a conda_requirements.txt for development, which specifies only the
top-level packages and doesn't pin them all down (i.e. uses >=), and then a
separate environment.yaml file to fully lock things down for the production
environment. This means that each developer on the project may not be using
the exact same packages, but we can push a fully tested environment for
deployment. And giving the developers a bit more flexibility helps us keep
dependencies up to date and catch the bugs that that introduces.
but it is kinda ugly to keep all that in sync.
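As a concrete (entirely made-up) illustration of that split -- the names and
versions here are just placeholders -- the development file stays loose while
the deployment file is fully pinned:

```
# conda_requirements.txt -- development: top-level packages only, loosely pinned
numpy>=1.20
requests>=2.25

# environment.yaml -- production: the fully locked, tested environment
# (as produced by `conda env export`)
name: myenv
dependencies:
  - numpy=1.21.0
  - requests=2.26.0
  - certifi=2021.5.30   # pulled in as a dependency, but pinned too
```

The loose file is what developers hand-edit; the locked file is only ever
regenerated from a known-good environment.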
Now that I think about it, the practical issues we've had are when one
developer updates a dep (or adds one) and, even if they update the
requirements files, other devs are pulling from the main repo and running
with their existing environments.
Hmm -- we have actually put in some kludgy run-time check code for versions
of our in-house, under-development deps. Maybe we should do the same for
all deps, pulling from a single requirements file.
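For what it's worth, a minimal sketch of that kind of run-time check, using
only the stdlib's importlib.metadata (the pins dict here is hypothetical --
in practice it would be loaded from that single requirements file):

```python
from importlib import metadata


def check_pins(pins):
    """Compare installed distribution versions against exact pins.

    `pins` maps distribution name -> required version string.
    Returns a list of human-readable problems (empty means all good).
    """
    problems = []
    for name, wanted in pins.items():
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            problems.append(f"{name}: not installed (want {wanted})")
            continue
        if installed != wanted:
            problems.append(f"{name}: have {installed}, want {wanted}")
    return problems


# e.g. call check_pins(...) at application start-up and warn/exit on mismatch
```

This catches the "dev pulled new code but kept an old environment" case at
import time rather than as a mysterious failure later.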
Post by Chris Withers
"picky lock" would take options in an environment.yaml section (assuming
conda ignores top-level keys it doesn't understand), and use them to
massage the output of "conda env export" into an environment.lock.yaml that
could be used with conda env create/update for reproducible builds.
Post by Chris Withers
"picky check" would be the same as lock, but would just whine if the
environment.lock.yaml it generates doesn't match the one on disk.
I'm still confused about what a "lock" is in this context, but I think
you're going in the right direction.
Post by Chris Withers
Interesting opportunities for also pruning dependencies that are no longer
needed.
I've always wanted to do some kind of run-time checking -- run your test
code, and see what's in sys.modules -- those are your deps. Mapping that to
pip or conda packages is a different story, however...
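A rough sketch of the sys.modules idea (the stdlib filtering relies on
sys.stdlib_module_names, which only exists on Python 3.10+, hence the
getattr fallback; mapping the survivors back to pip/conda package names
would still need something like importlib.metadata.packages_distributions()):

```python
import sys


def imported_top_level():
    """Top-level names of every module currently imported."""
    return {name.split(".")[0] for name in sys.modules
            if not name.startswith("_")}


def likely_third_party(names):
    """Drop stdlib modules; what's left are candidate dependencies."""
    stdlib = getattr(sys, "stdlib_module_names", frozenset())
    return {n for n in names if n not in stdlib}


# after running the test suite, snapshot what actually got imported:
deps = likely_third_party(imported_top_level())
```

This over-counts (test-only imports show up too) and under-counts (lazy
imports that the tests never hit are missed), but it's a cheap first pass
at spotting deps that are no longer needed.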
-CHB
--
Christopher Barker, Ph.D.
Oceanographer
Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception
***@noaa.gov