fix msvc linker bug for huge command line arguments#406
Conversation
It’s a bit concerning to see that the last commit was 2 months ago - is the project still actively maintained?

@abravalheri @zooba could you get someone to review this, please? I only opened this here instead of in setuptools because it was requested that the issue be reported here in distutils. The problem is not file path length, it is command line length. So if you're linking thousands of files, it doesn't matter how short your paths are - you're going to run out of space.
…d command line argument is accounted for
Co-authored-by: Isuru Fernando <isuruf@gmail.com>
@isuruf Thanks for the review! Could someone with merge rights take a quick look as well?

@isuruf can we merge this PR please?

@astrelsky btw, even after this PR is merged, it'll take a while to percolate. I'm wondering if it makes sense to have another PR within
zooba
left a comment
The approach is fine by me; a few things for you to check. Also, these are typically called "response files", so if you wanted to update variable names to make it more obvious that you know what you're doing, that won't hurt.
I'm not in a position to merge it.
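For context, the response-file technique referred to above can be sketched as follows. This is a minimal illustration, not the actual patch: the helper names, the exact length threshold, and the quoting approach are all assumptions here; the real fix lives in distutils' MSVC compiler class.

```python
import subprocess

# Hypothetical helpers for illustration only; names and the exact
# threshold are assumptions, not the PR's implementation.

def needs_response_file(args, limit=32000):
    """Return True when the fully quoted command line would exceed
    the Windows CreateProcess limit (32767 characters; we stay a
    little under it to leave room for the executable path)."""
    return len(subprocess.list2cmdline(args)) > limit

def response_file_lines(args):
    """Quote each argument as it would appear on the command line,
    one per line, ready to be written to a .rsp file and handed to
    the MSVC linker as @file.rsp instead of inline arguments."""
    return [subprocess.list2cmdline([a]) for a in args]
```

link.exe, lib.exe, and cl.exe all accept `@file` arguments, so the same trick covers both compile and link steps; only the arguments move into the file, while the executable itself is still invoked normally.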
Pushing up the changes shortly. I'm not going to bother with the variable names; I think the links are enough. I think referring to it as a response file would be more confusing, even though that is technically what they are called.

I anticipated a lengthy timeline on this. I actually encountered this same problem and fixed it several years ago, but neglected to submit a fix and just patched around it.
Force-pushed from 17affa1 to 11eb104
ROCm/xformers#86 also needs this change.
Fixes #226 and pypa/setuptools#4177
This is one of two issues blocking flash-attention from using ROCm CK on Windows. The other issue is in CK itself and not a pypa problem.
This is just to make it easier for the flash-attention maintainers to find when looking through their comments.
Dao-AILab/flash-attention#2400