
Commit 62b79cb

Caner Gocmen authored and facebook-github-bot committed
Add logging for partitioner methods
Summary: Add logging to make it clear which partitioner method was called and with which settings. Right now there is no way to tell that from the logs, which is occasionally confusing.
Differential Revision: D80361377
1 parent 79fbb29 commit 62b79cb
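
Note: the new messages go through the partitioners module's logger at INFO level, so they only show up if the calling process enables INFO logging. A minimal sketch of how one might surface them, assuming the logger name follows the module path (the name is not stated in the commit):

import logging

# Attach a root handler (root level stays at WARNING), then opt this
# module's logger into INFO so the new partitioner messages are printed.
# The logger name below is an assumption based on the file path in the diff.
logging.basicConfig()
logging.getLogger("torchrec.distributed.planner.partitioners").setLevel(logging.INFO)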

File tree

1 file changed (+7, -0 lines)

torchrec/distributed/planner/partitioners.py

Lines changed: 7 additions & 0 deletions
@@ -226,6 +226,9 @@ def partition(
             # The topology updates are done after the end of all the placements (the other
             # in the example is just for clarity).
         """
+        logger.info(
+            f"GreedyPerfPartitioner - sort_by: {self._sort_by}, balance_modules: {self._balance_modules}"
+        )
 
         _topology: Topology = copy.deepcopy(storage_constraint)
         minheap_devices: Optional[List[OrderedDeviceHardware]] = None
@@ -587,6 +590,10 @@ def partition(
             within the tolerance of the original plan that uses the least amount
             of memory.
         """
+        logger.info(
+            f"MemoryBalancedPartitioner - _max_search_count: {self._max_search_count}, _tolerance: {self._tolerance}, _balance_modules: {self._balance_modules}"
+        )
+
         _perf_model: PerfModel = NoopPerfModel(storage_constraint)
         _partitioner = GreedyPerfPartitioner(
             sort_by=SortBy.PERF, balance_modules=self._balance_modules
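
For context, the pattern added in both hunks is a single logger.info call at the top of partition() that prints the instance's settings. A self-contained toy stand-in (not torchrec code; names and default values are illustrative only) shows the shape of the resulting log line:

import logging

logger: logging.Logger = logging.getLogger(__name__)


class PartitionerSketch:
    # Toy stand-in mirroring the logging pattern added by this commit.
    def __init__(self, sort_by: str = "perf", balance_modules: bool = False) -> None:
        self._sort_by = sort_by
        self._balance_modules = balance_modules

    def partition(self) -> None:
        # Mirrors the logger.info call added at the top of GreedyPerfPartitioner.partition.
        logger.info(
            f"GreedyPerfPartitioner - sort_by: {self._sort_by}, "
            f"balance_modules: {self._balance_modules}"
        )


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    PartitionerSketch().partition()
    # Example output:
    # INFO:__main__:GreedyPerfPartitioner - sort_by: perf, balance_modules: False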

0 commit comments
