
Workflow for layered requirements (e.g. prod<-test<-dev requirements)? #398

Closed

dan-passaro opened this issue Oct 6, 2016 · 37 comments · Fixed by #905
Labels: docs (Documentation related) · PR wanted (Feature is discussed or bug is confirmed, PR needed)

Comments

@dan-passaro

dan-passaro commented Oct 6, 2016

Say I have

requirements.in:

Django~=1.8.0

And also

requirements-dev.in:

django-debug-toolbar

How can I run pip-compile on requirements-dev.in, where it will also take into account the requirements in requirements.in when figuring out which versions to use?

For now I have an ad-hoc script that compiles requirements.in first, and requirements-dev.in has -r requirements.txt as its first line. Is this an okay workflow? I'm worried that in the future, if I add a dependency, it will try to update a bunch of stuff I don't want it to update, but I haven't used this tool long enough to know whether that's truly a problem. Has anyone else used pip-tools in this fashion, and do they have any advice?
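For reference, the ordering such a script follows looks roughly like this (a hypothetical sketch of what's described above, using the file names from this issue):

    pip-compile --output-file requirements.txt requirements.in
    # requirements-dev.in starts with "-r requirements.txt"
    pip-compile --output-file requirements-dev.txt requirements-dev.in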

@jamescooke (Contributor)

requirements-dev.in has -r requirements.txt as its first line. Is this an okay workflow?

Yes I totally think that's a good strategy.

Wondering if anyone else has used pip-tools in this fashion and has any advice?

I've just published my pip-tools workflow for managing dependent requirements files: http://jamescooke.info/a-successful-pip-tools-workflow-for-managing-python-package-requirements.html

@nvie (Member)

nvie commented Nov 17, 2016

Hi @jamescooke, I just saw your post, it looks great! One suggestion I could make here is that you can include the shared .in file (so not the .txt file!) from within your .in files. That way, pip-compile has just a little bit more information when compiling the per-env output files.

@nvie (Member)

nvie commented Nov 17, 2016

To answer the original question, you can use this for your requirements-dev.in:

-r requirements.in
django-debug-toolbar

And then use this to compile it:

pip-compile requirements-dev.in

And then it all just works™.

@nvie nvie closed this as completed Nov 17, 2016
@jamescooke (Contributor)

Hi @nvie - thanks for the kind words about the blog post 😊.

The reason I recommended including the .in files rather than the .txt files is to account for changes in the package indexes, which might mean that a testing requirements file ends up with different package versions than the base file.

As an example, let's say that a project wants any Django that's version 1.8, so in requirements.in:

django<1.9

When we compile that file in October it picks 1.8.15 and makes requirements.txt:

#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file requirements.txt requirements.in
#
django==1.8.15

Now, in November, a new version of Django is released: 1.8.16. We add or update the testing requirements in requirements-dev.in (without touching the production requirements):

-r requirements.in
django-debug-toolbar

Using pip-compile requirements-dev.in, we compile that to requirements-dev.txt:

#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file requirements-dev.txt requirements-dev.in
#
django-debug-toolbar==1.6
django==1.8.16
sqlparse==0.2.2           # via django-debug-toolbar

As you'll see, Django has been bumped from 1.8.15 to 1.8.16 in the dev requirements only, even though the main requirements.in and requirements.txt have not changed. A good developer would spot this for sure - but I've missed something similar before on previous projects, with much resulting pain.

It's for this reason that I have been including the .txt file instead of the .in file - I've found it keeps the versions exactly the same between requirements layers.

So with requirements-dev.in as:

-r requirements.txt
django-debug-toolbar

When this is compiled we now get requirements-dev.txt:

#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --output-file requirements-dev.txt requirements-dev.in
#
django-debug-toolbar==1.6
django==1.8.15
sqlparse==0.2.2           # via django-debug-toolbar

This maintains the exact Django 1.8.15 version that we're looking for, regardless of the fact that the dev requirements were compiled after the new version of Django was released. When we then update requirements.txt and recompile the dev requirements, that version will be bumped.

I'd be really happy to see the just works™ version of the kind of pinning that I'm talking about - I'm sure I'm getting the wrong end of the stick somewhere.

One alternative I could see is to pin the Django version in an .in file, but isn't that missing the point of them?

Sorry for the long essay 😞

@nvie (Member)

nvie commented Nov 17, 2016

Thanks for the extensive explanation, @jamescooke! You're absolutely correct. This is why I normally also advise to always recompile both (all?) .in files at once, never one and not the other, so the pinned versions remain in sync. But yeah, I agree that this takes discipline, and if you forget, the tooling won't protect you against these diverging pins - your example illustrates that perfectly. Not sure how we can build support for this workflow into pip-compile natively.

Thanks for shining a little light on this subject.
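A sketch of that discipline, using the file names from this thread - every layer recompiled in the same run, base first, so the pins stay in sync:

    # Never recompile one layer without the others.
    pip-compile --upgrade --output-file requirements.txt requirements.in
    pip-compile --upgrade --output-file requirements-dev.txt requirements-dev.in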

@FranklinYu

Sorry if I missed something, but what's the issue with -r-ing the .txt? The workflow looks good, except that I may need to intervene when updating a single package (but I typically just update everything).

@Groxx

Groxx commented Dec 29, 2016

I'll just chime in with 100% voting for -r .txt. It's important that prod == dev == test as much as possible, and that's essentially the only way to ensure everything's a strict superset of prod.
We manage ours with a small script that compiles them in the right order, and (though sometimes it breaks) we do a non-upgrade compile in CI that diffs the results, to make sure nobody modified them by hand. It has stopped several mistakes already.
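Roughly like this (a hypothetical reconstruction, not the actual script):

    #!/usr/bin/env bash
    set -eu
    # Compile in dependency order: base first, then the layers whose .in
    # files -r the compiled requirements.txt.
    pip-compile --output-file requirements.txt requirements.in
    pip-compile --output-file requirements-dev.txt requirements-dev.in
    # In CI, a non-upgrade compile should be a no-op: any diff means the
    # .txt files were edited by hand (or are stale).
    git diff --exit-code -- requirements.txt requirements-dev.txt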

Only real downside is that sometimes the requirements.txt pins to a version that's incompatible with something in requirements-dev.txt - in an ideal world, we'd be able to check the rules from everything at once and (sometimes) avoid that. But it's usually pretty easy to trace down.


Maybe a better end-result would be to be able to compile all .in files at once, so pip-compile can detect the dependencies between them, and have it produce multiple corresponding .txt files instead of one combined one? I'd find that more useful in pretty much every scenario I've encountered, and it seems like the right place to do it.

@FranklinYu

FranklinYu commented Dec 30, 2016

Only real downside is that sometimes the requirements.txt pins to a version that's incompatible with something in requirements-dev.txt

@Groxx Does that happen after you upgrade (recompile) requirements.txt? So a new version in requirements.txt breaks some legacy package in requirements-dev.in?

For (imaginary) example, django-debug-toolbar==1.6 (latest) works with django<1.9, and you bump Django to 2.0 in requirements.txt? In this case django-debug-toolbar gets out of date.

@Groxx

Groxx commented Dec 30, 2016

Yep, exactly. Though maybe more accurately (with fake versions):

  • django in requirements.in
  • django-debug-toolbar in requirements-dev.in
  • run pip-compile --upgrade requirements.in, succeed with django==2.0
  • run pip-compile requirements-dev.in (or with --upgrade)...
  • ... discover django==2.0 is incompatible with all versions of django-debug-toolbar...
  • ... (╯°□°)╯︵ ┻━┻ #ragequit

It's all made worse when the cause is in requirements-do-this-first.in and the conflict is in requirements-dev-test-localdev-really-long-req-chain.in and it takes a while to figure out why django==2.0 is being chosen in the first place. But once people understand the process / know to use --verbose, it doesn't usually take too long.

@FranklinYu

FranklinYu commented Dec 30, 2016

Even if it doesn't take long, it does waste time, and I believe that's one of the reasons we have this project. I come from a Ruby background, where Bundler does the right thing: even if you ask it to upgrade only a single package, it re-constructs the entire dependency graph, including development dependencies. Similarly, when upgrading, I believe pip-compile should

  1. take all the requirements-*.in;
  2. list the union of all the packages;
  3. list the union of all requirements-*.txt pins;
  4. try to construct a dependency graph with all the packages, satisfying all the pins;
  5. come up with a list of all the new pins;
  6. for each requirements-*.in, pick some packages from the final (new) pin list, and generate the respective requirements-*.txt.

I'm not sure whether current Pip works with this workflow, that is, whether this workflow is a feasible solution for pip-compile.
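One way to approximate steps 1-6 with today's commands (a sketch only; the -c constraints mechanism leaned on here comes up again later in this thread):

    # Steps 1-5: resolve all layers together into one consistent pin set.
    pip-compile --output-file union.txt requirements.in requirements-dev.in
    # Step 6: recompile each layer against those shared pins. This assumes
    # each .in file begins with "-c union.txt"; all file names are hypothetical.
    pip-compile --output-file requirements.txt requirements.in
    pip-compile --output-file requirements-dev.txt requirements-dev.in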

@jamescooke (Contributor)

@Groxx thanks for the django==2.0 and django-debug-toolbar example... this is exactly the kind of scenario I've been concerned about, and your example illustrates it well.

@FranklinYu - The Bundler strategy might also work. Thanks for illustrating the unions of package requirements 👍

@Groxx

Groxx commented Dec 30, 2016

@jamescooke yeah, it does happen. The alternative though, if you include -r requirements.in in your requirements-dev.in, is that this is possible:

  1. pip-compile --upgrade requirements.in, get django==2.0
  2. pip-compile --upgrade requirements-dev.in, get django==1.9 and django-debug-toolbar
  3. dev, test, etc against 1.9, but release against 2.0.

The mismatch between dev and prod and the lack of any error or warning are largely what pip-tools helps eliminate, so to me this is an entirely unacceptable result. I much prefer to have the second step fail, which reveals that there is a problem, rather than rely on my eyes to catch the disparity between the two.

@maxnordlund

maxnordlund commented Mar 7, 2017

This bit me today. Also coming from a Ruby/Bundler background, I like that all dependencies are in the same lock file, but I don't want to install dev dependencies in production. However, this seems incompatible with how pip currently operates - that is, one requirements.txt to rule them all, but separate common/dev dependencies.

I had hoped that having dev.in -> dev.txt would solve this, but as others have noted, you get conflicts. And while I could put a -r somewhere, it would still produce two lock files, which sooner or later will diverge.

So my question is if it would be possible to teach pip-compile to just write the dependencies for one input file, while accepting the pinned ones in another. Perhaps an example would clarify this:

# requirements.in
django
# requirements.txt
django==1.9
# dev.in
-r requirements.in
django-debug-toolbar
# dev.txt
-r requirements.txt
django-debug-toolbar==1.6
# note, no direct django dependency here, but still respect the 1.9 bound.

Here I've overloaded -r to point to the other file. Thoughts?

@dogweather

dogweather commented Mar 30, 2017

How about considering eliminating the need for multiple files by supporting sections in requirements.in? This is how the Ruby Gemfile works, and it neatly solves the problem:

# Install in all environments
gem 'rails'
gem 'mysql'

# Install only in test
group 'test' do
  gem 'rspec'
end

# Install only in development
group 'development' do
  gem 'web-console'
end

@dogweather

@maxnordlund This blog post answers your question, I believe: http://jamescooke.info/a-successful-pip-tools-workflow-for-managing-python-package-requirements.html

@maxnordlund

How about considering eliminating the need for multiple files by supporting sections in requirements.in? This is how the Ruby Gemfile works, and it neatly solves the problem:

That would make it incompatible with vanilla pip, which isn't really an option for this set of tools IMO. For the official project exploring that idea, see https://github.com/pypa/pipfile.

This blog post answers your question, I believe

I've read that; it's linked in the first comment.


In the end I wrote a small Python script to generate the two files, with dev.txt having a -r pointing to base.txt. It then strips dev.txt of all common dependencies to ensure they cannot diverge. This also forces you to call pip-sync base.txt dev.txt, but that's no biggie, and in my case the script runs that as well.

The sad part here is that you need another layer to get it right - either a script or make - instead of it working out of the box. The only change I think might be good enough without altering the format too much is the suggestion above: that a -r in an *.in file is translated to mean "use the existing versions in that file (or the compiled version thereof), and write everything else to the output".
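A rough equivalent of that post-processing step (a hypothetical sketch, not the actual script; it matches pins line-for-line, so differing "# via" comments would need extra handling):

    # Compile both layers, then drop pins from dev.txt that base.txt already has.
    pip-compile --output-file base.txt base.in
    pip-compile --output-file dev.txt dev.in
    grep '==' base.txt > /tmp/base-pins.txt
    grep -v -x -F -f /tmp/base-pins.txt dev.txt > dev.tmp && mv dev.tmp dev.txt
    pip-sync base.txt dev.txt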

@davidovich (Contributor)

That a -r in an *.in file is translated to mean "use existing versions in that file (or compiled version thereof), and write everything else to output".

I think this would bring value and keep the existing processing. The only change would be in the file collection phase, before invoking the resolver. I believe this is not too hard to implement, and I am open to a PR for this functionality (with relevant tests).

@davidovich davidovich reopened this Mar 30, 2017
@davidovich davidovich added the PR wanted Feature is discussed or bug is confirmed, PR needed label Mar 30, 2017
@dfee (Contributor)

dfee commented Jun 27, 2017

@jamescooke thanks for posting that article (though it was a while ago). I made one slight modification to it:

RELATIVE_ROOT=..  # relative path to project's root
%.txt: %.in
        pip-compile --output-file $@ $<
        # BSD/macOS sed (-i '' edits in place); on GNU sed, use -i with no ''.
        sed -i '' "s|-e file://$(realpath $(RELATIVE_ROOT))|-e $(RELATIVE_ROOT)|" $@

i.e. this rewrites the annoying -e file:///Users/dfee/code/zebra to -e ., making the file useful for users who don't develop or deploy from your directory.

I know this isn't really the place to discuss your Makefile, but I've grown tired of editing requirements.txt files after pip-compile-ing them. Other folks have too, and there doesn't seem to be a fix on the horizon.

@jamescooke (Contributor)

Hi @dfee , thanks for sharing this suggestion 👍

I've not been able to get this working on my machine, so I won't update my article just yet. The post is on GitHub here https://github.com/jamescooke/blog/blob/master/content/1611-pip-tools-workflow.rst - feel free to open an issue / PR to discuss.

@vladiibine

Hi guys,
Great package.

Wanted to jump in with an observation that for me is really important:

I consider having -r base.txt in a file such as dev.in the best workflow yet. One big drawback is that this way we LOSE the comments that tell us why a dependency was installed.

For instance

# base.txt
package0==1
package1==1.4   # via package0

Then in dev.in

# in dev.in
-r base.txt
package2==3.3

Then in the resulted dev.txt

# in dev.txt
package0==1
package1==1.4      # !!!!!!!!!!!! the via comment will be missing here. I'd totally prefer this to remain here... :(
package2==3.3

Anyway, that's all from me. Whoever fixes this, please take this into consideration if you can.

@anlutro

anlutro commented Aug 29, 2017

Why can't -r base.txt statements (as long as they're .txt, not .in) just get copied as-is to the resulting .txt file?

@samkk-nuna

I followed @jamescooke's flow and recently ended up in a state where I had to add a constraint to my base.in to help the resolver out, because of the following:

  • Add boto3==1.7.14 to base.in
  • Add moto==1.3.3 to test.in, which starts with -r base.txt

Try compiling .txt files from these with pip-compile: base.in compiles fine, but emits a hard equality constraint python-dateutil==2.7.2 into base.txt, which then conflicts with a python-dateutil<2.7.0 constraint emitted from something in moto's dependency tree.

I've hacked around this for now by explicitly stating python-dateutil<2.7.0 in base.in, but that feels gross. Any recommendations on better workarounds, or plans to better support things like this?
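For concreteness, the workaround described above in .in form (a sketch):

    # base.in
    boto3==1.7.14
    python-dateutil<2.7.0   # manual constraint to stay compatible with moto's dependency tree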

@devxpy

devxpy commented Dec 13, 2018

Is there a possibility of having setup.py support for this?

for example, one might do -

    ...
    extras_require={
            'dev': ['ipython', 'pip-tools', 'twine']
    },
    ...

Which would generate the following dev-requirements.txt -

ipython           7.2.0      
<my pkg>          0.0.1      <path>
pip-tools         3.1.0      
twine             1.12.1     
...

(-e . is ofc implied, since it's setup.py we're reading from)

It could even be sensible to have pip-compile generate a separate *-requirements.txt file for every extras_require field.
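For what it's worth, later pip-tools releases grew support along these lines. A sketch, assuming a modern pip-compile with the --extra option (not available when this comment was written):

    # Compile the 'dev' extra declared in setup.py into its own lock file.
    pip-compile --extra dev --output-file dev-requirements.txt setup.py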

@tmzhuang

Does anyone have any existing work related to this?

@atugushev (Member)

@tmzhuang

I presume not. Is the solution from #398 (comment) not suitable for you?

@merwok

merwok commented Jul 10, 2019

This describes a workflow: #532

  • runtime.in/.txt contains common dependencies
  • tests.in/.txt adds test tools (uses -c runtime.txt to avoid conflicts, but not -r; see the sketch after this list)
  • ci.in/.txt is CI tools (tox, awscli for example)
  • deploy.in/.txt contains tools needed on servers but not locally or during tests (waitress/gunicorn, etc)
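A minimal sketch of that tests layer (the tool names here are just examples):

    # tests.in
    -c runtime.txt    # constrain to the pins already chosen for runtime, without installing them
    pytest
    coverage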

@tmzhuang

@atugushev The workflow is fine. My issue is that when requiring another requirements.txt (e.g. -r base.txt) I lose the comments showing where each package originated from.

base.txt:
aiohttp==x.x.x # via slackclient

dev.in:
-r base.txt
django

dev.txt:
aiohttp==x.x.x <-- missing the comment here
django==x.x.x

@atugushev (Member)

atugushev commented Jul 11, 2019

@tmzhuang

The canonical way is to use -r base.in. Yes, it has its own downsides (like having to compile all *.in files at the same time), but I'm afraid there's no obvious way to implement the above suggestions.

By including -r base.txt, all packages from that file become primary requirements - that's why they lose the "via" annotation, and that's correct behaviour.

IMO this workflow with -r base.in should be well explained in the README.

@tmzhuang

@atugushev

That works for me. Is there any downside to requiring .in files other than the extra time to compile all required .in files?

@devxpy

devxpy commented Jul 18, 2019

What I would love to have is a simplified command that does all of this in one go. I end up creating a bash script like this for every project:

#!/usr/bin/env bash

pip install pip-tools
# Compile each layer individually first...
for f in requirements/*.in
do
    pip-compile "$f" -v
done
# ...then compile everything together into one combined lock file and sync.
pip-compile requirements/*.in -v -o requirements/requirements.txt
pip-sync requirements/requirements.txt

@IvanAnishchuk (Member)

I'd be really happy to see the just works™ version of the kind of pinning that I'm talking about - I'm sure I'm getting the wrong end of the stick somewhere.

One alternative I could see is to pin the Django version in an .in file, but isn't that missing the point of them?

The canonical way is to use -r base.in

How about using -c requirements.txt in your dev.in file instead of -r? That way production requirements won't be included automatically in the dev ones, but the pinned versions will still be respected. Constraint files are a powerful tool. I'm currently playing with this idea; not sure yet if I can build a nice workflow around it.

It could be used in addition to -r requirements.txt if you still want everything in a single big requirements file.

But I agree that some standard tooling to support multiple non-contradicting sets of requirements would be ideal here (multiple requirements sections in setup.cfg are one example of a similar structure). The resolver could mimic the constraint-file logic above when generating multiple sets of pinned requirements...
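A minimal sketch of that layout, using the file names from this thread:

    # dev.in
    -c requirements.txt    # respect the prod pins without pulling those packages in
    django-debug-toolbar

Compile base first, then dev against its pins:

    pip-compile --output-file requirements.txt requirements.in
    pip-compile --output-file dev.txt dev.in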

@IvanAnishchuk (Member)

IvanAnishchuk commented Aug 24, 2019

Also, when using -c you can still compile everything into a single requirements file if needed (just compile both .in files together with pip-compile -o requirements.txt requirements.in requirements-dev.in), or you can get two separate files (one for the prod environment, one for dev, still installable together with pip install -r requirements.txt -r requirements-dev.txt).

I think it's the most correct and flexible way available right now. Not sure if it can cause any issues, but so far I haven't stumbled upon any, and I suspect -r would usually cause pretty much the same issues anyway.

@atugushev (Member)

I think it's the most correct and flexible way available right now. Not sure if it can cause any issues, but so far I haven't stumbled upon any, and I suspect -r would usually cause pretty much the same issues anyway.

I've tried this approach and must say that it works like a charm! Would be nice to have it documented somewhere.

@jamescooke (Contributor)

@IvanAnishchuk Thanks for the tip on -c. Just reading the docs and comments, this looks perfect for what cascaded requirements are trying to do... and it perfectly answers the original question:

How can I run pip-compile on requirements-dev.in, where it will also take into account the requirements in requirements.in when figuring out which versions to use?

Answer = "Use -c requirements.txt at the top of requirements-dev.in to constrain the dev requirements to packages already selected for production in requirements.txt."

@atugushev When you say "Would be nice to have it documented somewhere", does that mean cramming this into the README? If so, I can give it a go and open a PR. It would be nice to close off this Issue before it gets to be 3 years old.

@atugushev (Member)

@jamescooke

does that mean cramming this into the README? If so, I can give it a go and open a PR. It would be nice to close off this Issue before it gets to be 3 years old.

Yes, it does! Please go for it, I'd be happy to merge it.

@atugushev (Member)

Hey folks,

Finally, this issue's closed! Huge thanks to @IvanAnishchuk for the idea and @jamescooke for the docs! 🎉

@atugushev (Member)

FYI, pip-tools v4.4.1 fixed a bug related to the -c requirements.txt workflow where dependencies of relevant constraints could be missing from the output file (see #1037 for details). Please upgrade and recompile your requirements if you use this workflow. Thanks!
