Support max-batchsize argument for resnet50 non tpu run #508


Open: anandhu-eng wants to merge 7 commits into base: dev

Conversation

@anandhu-eng (Contributor) commented Jul 15, 2025

🧾 PR Checklist

  • Target branch is dev

📌 Note: PRs must be raised against dev. Do not commit directly to main.

@anandhu-eng anandhu-eng requested a review from a team as a code owner July 15, 2025 04:55

github-actions bot commented Jul 15, 2025

MLCommons CLA bot All contributors have signed the MLCommons CLA ✍️ ✅

@anandhu-eng (Contributor, Author) commented:

Potentially helps with issue mlcommons/inference#2249.

@anandhu-eng anandhu-eng changed the title Support max-batchsize for resnet50 non tpu run Support max-batchsize argument for resnet50 non tpu run Jul 15, 2025
@@ -259,7 +260,7 @@ def get_run_cmd_reference(
             scenario_extra_options + mode_extra_options + dataset_options
     else:
         cmd = "./run_local.sh " + env['MLC_MLPERF_BACKEND'] + ' ' + \
-            env['MLC_MODEL'] + ' ' + env['MLC_MLPERF_DEVICE'] + " --scenario " + env['MLC_MLPERF_LOADGEN_SCENARIO'] + " " + env['MLC_MLPERF_LOADGEN_EXTRA_OPTIONS'] + \
+            env['MLC_MODEL'] + ' ' + env['MLC_MLPERF_DEVICE'] + " --scenario " + env['MLC_MLPERF_LOADGEN_SCENARIO'] + " --max-batchsize " + env.get('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128') + " " + env['MLC_MLPERF_LOADGEN_EXTRA_OPTIONS'] + \
@arjunsuresh (Collaborator) commented:

This code portion is common to R50 and retinanet, right?

@anandhu-eng (Contributor, Author) replied:

Hi @arjunsuresh, resnet50 and retinanet use the same main.py file, but I see that for retinanet some default profiles use batch size 1. It's better to keep this only for resnet50 for now, until we do some testing with the added tags. I have updated the code in the recent PR.

@@ -261,6 +262,8 @@ def get_run_cmd_reference(
         cmd = "./run_local.sh " + env['MLC_MLPERF_BACKEND'] + ' ' + \
             env['MLC_MODEL'] + ' ' + env['MLC_MLPERF_DEVICE'] + " --scenario " + env['MLC_MLPERF_LOADGEN_SCENARIO'] + " " + env['MLC_MLPERF_LOADGEN_EXTRA_OPTIONS'] + \
             scenario_extra_options + mode_extra_options + dataset_options
+        if env['MLC_MODEL'] in ["resnet50"]:
+            cmd += f" --max-batchsize {env.get('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128')} "
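The batch-size handling in the hunk above can be sketched in isolation as follows. `build_run_cmd` and the sample `env` dict are hypothetical stand-ins for the surrounding `get_run_cmd_reference` context; the sketch assumes the default of 128 applies only when the model is resnet50, as in the updated diff.

```python
# Minimal sketch of the batch-size logic added in this PR.
# build_run_cmd and the example env dict are hypothetical; the real
# code lives inside get_run_cmd_reference in the MLC scripts.

def build_run_cmd(env):
    cmd = ("./run_local.sh " + env['MLC_MLPERF_BACKEND'] + ' '
           + env['MLC_MODEL'] + ' ' + env['MLC_MLPERF_DEVICE']
           + " --scenario " + env['MLC_MLPERF_LOADGEN_SCENARIO'])
    # Only resnet50 gets a default --max-batchsize; some retinanet
    # profiles default to batch size 1, so it is left untouched for now.
    if env['MLC_MODEL'] in ["resnet50"]:
        cmd += f" --max-batchsize {env.get('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128')}"
    return cmd

env = {
    'MLC_MLPERF_BACKEND': 'onnxruntime',
    'MLC_MODEL': 'resnet50',
    'MLC_MLPERF_DEVICE': 'cpu',
    'MLC_MLPERF_LOADGEN_SCENARIO': 'Offline',
}
print(build_run_cmd(env))
```

With no `MLC_MLPERF_LOADGEN_MAX_BATCHSIZE` in the environment the command ends with `--max-batchsize 128`; setting that variable overrides the default, and non-resnet50 models get no flag at all.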
Collaborator commented:

In line num 66 we are already supporting batchsize via MLC_MLPERF_LOADGEN_EXTRA_OPTIONS, so maybe we should add this as a default before that line.
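The reviewer's suggestion could look roughly like this sketch. `apply_default_batchsize` and the bare `env` dicts are hypothetical; the idea is to seed the default into `MLC_MLPERF_LOADGEN_EXTRA_OPTIONS` before the command is assembled, so an explicitly supplied batch size still wins.

```python
def apply_default_batchsize(env):
    # Hypothetical sketch of the reviewer's suggestion: fold a default
    # --max-batchsize into MLC_MLPERF_LOADGEN_EXTRA_OPTIONS up front,
    # but only if the user has not already provided one.
    extra = env.get('MLC_MLPERF_LOADGEN_EXTRA_OPTIONS', '')
    if env.get('MLC_MODEL') == 'resnet50' and '--max-batchsize' not in extra:
        extra += " --max-batchsize " + env.get('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128')
    env['MLC_MLPERF_LOADGEN_EXTRA_OPTIONS'] = extra
    return env
```

Placing the default here rather than in the command string keeps all batch-size handling in one variable, so the later `cmd` construction needs no model-specific branch.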
