Support max-batchsize argument for resnet50 non tpu run #508
base: dev
Conversation
MLCommons CLA bot: All contributors have signed the MLCommons CLA ✍️ ✅
Potentially helps with issue mlcommons/inference#2249
@@ -259,7 +260,7 @@ def get_run_cmd_reference(
             scenario_extra_options + mode_extra_options + dataset_options
     else:
         cmd = "./run_local.sh " + env['MLC_MLPERF_BACKEND'] + ' ' + \
-            env['MLC_MODEL'] + ' ' + env['MLC_MLPERF_DEVICE'] + " --scenario " + env['MLC_MLPERF_LOADGEN_SCENARIO'] + " " + env['MLC_MLPERF_LOADGEN_EXTRA_OPTIONS'] + \
+            env['MLC_MODEL'] + ' ' + env['MLC_MLPERF_DEVICE'] + " --scenario " + env['MLC_MLPERF_LOADGEN_SCENARIO'] + " --max-batchsize " + env.get('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128') + " " + env['MLC_MLPERF_LOADGEN_EXTRA_OPTIONS'] + \
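For context, a minimal standalone sketch of the batch-size lookup used in this hunk (the example environment values are hypothetical, not taken from the PR): `env.get` falls back to `'128'` when `MLC_MLPERF_LOADGEN_MAX_BATCHSIZE` is unset, so the `--max-batchsize` flag is always populated in this version of the change.

```python
# Minimal sketch with a hypothetical environment (not code from the PR):
# env.get falls back to '128' when MLC_MLPERF_LOADGEN_MAX_BATCHSIZE is not set.
env = {'MLC_MODEL': 'resnet50'}  # MLC_MLPERF_LOADGEN_MAX_BATCHSIZE intentionally absent
batch_size = env.get('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128')
print(" --max-batchsize " + batch_size)  # -> " --max-batchsize 128"
```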
This code portion is common to R50 and retinanet, right?
Hi @arjunsuresh, resnet50 and retinanet use the same main.py file, but I see that for retinanet some default profiles use batch size 1. It's better to keep this only for resnet50 for now, until we do some testing with the added tags. I have updated the code in the latest revision of this PR.
@@ -261,6 +262,8 @@ def get_run_cmd_reference(
         cmd = "./run_local.sh " + env['MLC_MLPERF_BACKEND'] + ' ' + \
             env['MLC_MODEL'] + ' ' + env['MLC_MLPERF_DEVICE'] + " --scenario " + env['MLC_MLPERF_LOADGEN_SCENARIO'] + " " + env['MLC_MLPERF_LOADGEN_EXTRA_OPTIONS'] + \
             scenario_extra_options + mode_extra_options + dataset_options
+    if env['MLC_MODEL'] in ["resnet50"]:
+        cmd += f" --max-batchsize {env.get('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128')} "
In line 66 we already support batchsize via MLC_MLPERF_LOADGEN_EXTRA_OPTIONS. So maybe we should add this as a default before that line.
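One possible reading of this suggestion, as a sketch only (the placement relative to line 66 and the exact handling are assumptions, not code from the PR): set a model-gated default on the environment variable before the extra-options string is assembled, so any user-provided value still wins.

```python
# Sketch of the suggested approach with a hypothetical environment (assumed placement,
# i.e. before MLC_MLPERF_LOADGEN_EXTRA_OPTIONS is built; not code from this PR).
env = {'MLC_MODEL': 'resnet50'}
if env.get('MLC_MODEL') in ["resnet50"]:
    env.setdefault('MLC_MLPERF_LOADGEN_MAX_BATCHSIZE', '128')  # keep any user-set value
print(env['MLC_MLPERF_LOADGEN_MAX_BATCHSIZE'])  # -> "128" unless already set
```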
🧾 PR Checklist
📌 Note: PRs must be raised against dev. Do not commit directly to main.