
whisper transcription compatibility with newer transformers #274

Open

Saganaki22 wants to merge 2 commits into melMass:main from Saganaki22:main

Conversation

@Saganaki22

  • Fix WhisperConfig max_length AttributeError — Newer versions of the transformers library removed the max_length attribute from WhisperConfig, so transcription crashed with AttributeError: 'WhisperConfig' object has no attribute 'max_length'. The fix reads the attribute via getattr() with a fallback, so it works on both old and new transformers versions.
  • Fix float16 dtype mismatch — When running Whisper in half precision (float16), the input_features tensor was passed as float32, causing RuntimeError: Input type (float) and bias type (struct c10::Half) should be the same. The input features are now cast to the model's dtype before inference. Both fixes are illustrated in the sketch after this list.
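
For concreteness, here is a minimal sketch of a Whisper transcription path with both fixes applied. It assumes the model is loaded through transformers' WhisperProcessor and WhisperForConditionalGeneration; the transcribe() wrapper, the model ID, and the 448-token fallback for max_length are illustrative assumptions, not this repository's actual code.

```python
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

def transcribe(audio, sampling_rate=16000,
               model_id="openai/whisper-base", dtype=torch.float16):
    # Hypothetical wrapper; the node's real loading code may differ.
    processor = WhisperProcessor.from_pretrained(model_id)
    model = WhisperForConditionalGeneration.from_pretrained(
        model_id, torch_dtype=dtype
    )

    inputs = processor(audio, sampling_rate=sampling_rate, return_tensors="pt")

    # Fix 1: newer transformers releases removed WhisperConfig.max_length,
    # so read it with getattr() and fall back to a default (448 matches
    # Whisper's decoder position limit; the exact fallback is an assumption).
    max_length = getattr(model.config, "max_length", 448)

    # Fix 2: the processor emits float32 features; cast them to the model's
    # dtype so float16 inference doesn't hit the Input/bias dtype mismatch.
    input_features = inputs.input_features.to(model.dtype)

    predicted_ids = model.generate(input_features, max_length=max_length)
    return processor.batch_decode(predicted_ids, skip_special_tokens=True)[0]
```

The two key lines are the getattr() read and the .to(model.dtype) cast; everything else is scaffolding to make the sketch self-contained.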

fix: whisper transcription compatibility with newer transformers

- Use getattr for max_length to handle removed WhisperConfig attribute
- Cast input_features to model dtype to fix float16 mismatch
@Saganaki22 Saganaki22 requested a review from melMass as a code owner March 20, 2026 04:37