
fix: pin transformers below 5.0.0 to fix generation_config warning#64

Open
Ravi-Poddar26 wants to merge 1 commit into sugarlabs:main from Ravi-Poddar26:fix/pin-transformers-version

Conversation

@Ravi-Poddar26

Problem

Running any endpoint produces this warning on every request:

Both max_new_tokens (=1024) and max_length (=20) seem to
have been set. max_new_tokens will take precedence.

Root Cause

requirements.txt specifies transformers>=4.45.2, which
allows transformers 5.x to be installed. transformers 5.x
introduced stricter generation config handling: it raises
a warning whenever the model's generation_config.json and
the pipeline parameters conflict.

transformers 4.x silently merged these configs;
transformers 5.x warns on every request.
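For illustration, the two conflicting limits can sit side by side on a `GenerationConfig`; a minimal sketch (the values mirror the warning above, and `max_length=20` is the library default):

```python
from transformers import GenerationConfig

# Sketch of the conflict: a generation config carrying both limits.
# max_length=20 is the default baked into most generation_config.json
# files; max_new_tokens=1024 is what the pipeline call passes in.
# At generate() time, newer transformers warns that max_new_tokens
# takes precedence over max_length.
cfg = GenerationConfig(max_length=20, max_new_tokens=1024)
print(cfg.max_length, cfg.max_new_tokens)
```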

Fix

Pin transformers to <5.0.0 until the codebase is updated
for full transformers 5.x compatibility.
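Concretely, the pinned line in requirements.txt would look like this (keeping the existing lower bound):

```
transformers>=4.45.2,<5.0.0
```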

Testing

Tested on:

  • Windows 11
  • Python 3.13
  • torch 2.10.0
  • transformers 5.3.0 → warning appears
  • transformers 4.x → warning gone

Fixes #63



