
Conversation

@qiching commented Sep 4, 2025

Bug Description

In `PreTrainingDataModule._create_dataloader()`, the `WrappedDataLoader` was constructed without passing `batch_size`, causing it to default to 1 regardless of the `micro_batch_size` setting in the config.

Fix

`batch_size` is now explicitly passed to `WrappedDataLoader` (set from `micro_batch_size`), ensuring the correct batch size is used during training and evaluation. A sketch of the change is shown below.
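
A minimal sketch of the fix (illustrative, not the exact diff; the `_create_dataloader` signature and attribute names such as `self.micro_batch_size` and `self.num_workers` are assumed from context):

```python
def _create_dataloader(self, dataset, mode, **kwargs):
    # Sketch of the fix: forward the configured micro_batch_size to
    # WrappedDataLoader. Before this change, batch_size was omitted, so
    # the underlying torch DataLoader default of batch_size=1 was used.
    return WrappedDataLoader(
        mode=mode,
        dataset=dataset,
        batch_size=self.micro_batch_size,  # previously missing
        num_workers=self.num_workers,      # surrounding kwargs shown for context
        pin_memory=self.pin_memory,
        persistent_workers=self.persistent_workers,
        **kwargs,
    )
```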

Testing

Verified.
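
A hypothetical way to confirm the behavior (the `datamodule` setup call and the `"tokens"` batch key are placeholders, not taken from this PR):

```python
# Hypothetical check: with the fix applied, batches should contain
# micro_batch_size samples instead of the DataLoader default of 1.
datamodule.setup(stage="fit")
batch = next(iter(datamodule.train_dataloader()))
assert batch["tokens"].shape[0] == datamodule.micro_batch_size
```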

@qiching changed the title from "fix: pass micro_batch_size in WrappedDataLoader in PreTrainingDataMM…" to "fix: pass micro_batch_size in WrappedDataLoader in PreTrainingDataModule" on Sep 4, 2025
@qiching changed the title from "fix: pass micro_batch_size in WrappedDataLoader in PreTrainingDataModule" to "fix: pass micro_batch_size in WrappedDataLoader in PreTrainingDataModule Signed-off-by: Albert Cheng <[email protected]>" on Sep 4, 2025
qiching and others added 3 commits September 4, 2025 05:36
fix: pass micro_batch_size in WrappedDataLoader in PreTrainingDataModule (Signed-off-by: Albert Cheng <[email protected]>)

@qiching force-pushed the fix-batch-size-dataloader branch from 93f9010 to 339f225 on September 4, 2025 at 05:37
@qiching changed the title from "fix: pass micro_batch_size in WrappedDataLoader in PreTrainingDataModule Signed-off-by: Albert Cheng <[email protected]>" to "fix: pass micro_batch_size in WrappedDataLoader in PreTrainingDataModule" on Sep 4, 2025