Merged
85 commits
862a776
first round of refactoring runners.py, Runner base class for normal i…
jlnav Jan 8, 2024
e6874a6
refactoring classes so class attributes aren't passed around internal…
jlnav Jan 9, 2024
e17eabe
ThreadRunner uses comms.QCommThread, slightly modified, to launch its…
jlnav Jan 9, 2024
83493d0
handful of small changes from experimental/gen_on_manager_inplace
jlnav Jan 10, 2024
6ad870c
first incredibly long and ugly concatenation of "pipeline" and "state…
jlnav Jan 10, 2024
d14b0aa
progress
jlnav Jan 11, 2024
ab32e3f
bugfixes, first "working" refactor of manager can run 1d_sampling usi…
jlnav Jan 11, 2024
68d8855
removing now-redundant content from manager, trying to see if we can …
jlnav Jan 12, 2024
33ea282
restore version of manager from develop. specify iterations for worker.
jlnav Jan 17, 2024
843df39
remove pipelines.py. will start simpler
jlnav Jan 17, 2024
3aeab06
undoing "iterations" change in worker, seeing if we can simply submit…
jlnav Jan 17, 2024
b083a21
add attempted update_state_on_local_gen_msg and handle_msg_from_local…
jlnav Jan 17, 2024
231e2b7
use _Worker class to correctly index into W and wcomms. add initial o…
jlnav Jan 17, 2024
d251363
add "threaded" tentative option to sim/gen_specs
jlnav Jan 17, 2024
368bf93
fix ThreadRunner shutdown when that worker didn't launch a thread
jlnav Jan 17, 2024
744620d
adds test-case to functionality tests, fixes alloc_f libE_info usable…
jlnav Jan 18, 2024
ca14b7c
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Jan 18, 2024
cd6f0db
make resources reflect develop?
jlnav Jan 18, 2024
0952067
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Jan 22, 2024
884d61b
remove old symlink
jlnav Jan 22, 2024
dfb0fbb
print evaluated lines in check_libe_stats for now
jlnav Jan 22, 2024
ec236ed
only want to perform this specific datetime check on indexes 5 and 6 …
jlnav Jan 22, 2024
7b94467
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Jan 24, 2024
f06148a
a much simpler indexing solution from shuds
jlnav Jan 24, 2024
d584152
add comment for why using self.W.iterable in "for wrk in self.W.itera…
jlnav Jan 24, 2024
592c8c4
add __len__ and __iter__ to indexer
jlnav Jan 24, 2024
59ca40a
add __setitem__
jlnav Jan 24, 2024
d8a3a42
adjust alloc_support to not use w - 1 indexing
jlnav Jan 24, 2024
1839ff2
just pass in the iterable for now. resource changes coming in another…
jlnav Jan 24, 2024
43e98a9
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Jan 25, 2024
65fc121
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Feb 7, 2024
95badb1
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Feb 20, 2024
1fcf91f
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Feb 23, 2024
ad525bb
add tentative gen_on_manager option, separate additional_worker_launc…
jlnav Feb 23, 2024
fe64869
various refactors based on PR suggestions, then manager-refactors bas…
jlnav Feb 26, 2024
dcf6db7
fix persistent filter, update avail/running gens counters
jlnav Feb 26, 2024
ba05900
update unit test, bugfix
jlnav Feb 26, 2024
482ec15
update persistent allocs, but also add backwards-compatibility check …
jlnav Feb 26, 2024
3d06b1c
fix persistent sim test
jlnav Feb 26, 2024
9165d7d
move _WorkerIndexer into libensemble.utils, also use within Persisten…
jlnav Feb 26, 2024
f7ba205
manager also needs to send workflow_dir location to worker 0
jlnav Feb 26, 2024
376e450
missed an alloc
jlnav Feb 27, 2024
ac52a9f
Merge branch 'develop' into refactor/user_function_handling_modules
jlnav Feb 27, 2024
6375058
make alloc_f's libE_info additional worker option match libE_specs
jlnav Feb 27, 2024
c07a565
removes manager_runs_additional_worker in favor of gen_on_manager. pa…
jlnav Feb 28, 2024
c46802e
turning W["active"] back to an int
jlnav Feb 28, 2024
2ee9466
experimenting with gen_on_manager with give_pregenerated_work - worke…
jlnav Feb 28, 2024
9ebe767
I think for sim workers, the only requirement is that they're not gen…
jlnav Feb 28, 2024
09d030c
fixing alloc unit test based on passing wrapped W into alloc
jlnav Feb 28, 2024
2f631e0
refactoring Worker array fields to more closely match develop. worker…
jlnav Feb 29, 2024
ab39de6
fix tests
jlnav Mar 1, 2024
550ca1f
missed a revert in alloc
jlnav Mar 1, 2024
e7591b6
undo inconsequential tiny changes to allocs
jlnav Mar 1, 2024
68b991a
run each of the test_GPU_gen_resources tests also with the gen runnin…
jlnav Mar 1, 2024
c433ecb
simply gen_workers parameter description for avail_worker_ids
jlnav Mar 6, 2024
e78056b
debugging consecutive libE calls with gen_on_manager
jlnav Mar 8, 2024
f30233c
debugging......
jlnav Mar 8, 2024
6d0f9d2
cleaning up debugging, removing comm from Executor upon worker exiting
jlnav Mar 8, 2024
97c2c53
clarification comment
jlnav Mar 8, 2024
73d4b4c
bugfix
jlnav Mar 11, 2024
13fecde
filter for gen_workers within avail_worker_ids, if set and there are …
jlnav Mar 13, 2024
0bcfc79
refactor give_sim_work_first for running on gen_workers if no points_…
jlnav Mar 13, 2024
45cbd16
it turns out that values set by validators are still considered "unse…
jlnav Mar 13, 2024
2bc504c
starting to create unit test
jlnav Mar 13, 2024
aa4db8a
finish up unit test
jlnav Mar 14, 2024
429adb4
it turns out that values set by validators are still considered "unse…
jlnav Mar 13, 2024
b1f9108
starting to create unit test
jlnav Mar 13, 2024
6fa18ef
finish up unit test
jlnav Mar 14, 2024
dbdf88f
platform_specs sometimes seems to be at risk of disappearing when we …
jlnav Mar 14, 2024
eacf46f
Merge branch 'bugfix/ensemble_libE_specs_attrs_passthrough' into refa…
jlnav Mar 15, 2024
e4d4b08
refactor fast_alloc for gen workers
jlnav Mar 15, 2024
ffbe6c9
better test comment
jlnav Mar 15, 2024
14c8b1f
refactor inverse_bayes_allocf
jlnav Mar 15, 2024
45e99b2
trying to refcator only_one_gen_alloc, but currently doesnt pass test…
jlnav Mar 15, 2024
6f713dc
refactor aposmm alloc, move skip_cancled_points line
jlnav Mar 15, 2024
4aa386f
Update fast_alloc
shuds13 Mar 15, 2024
0b12af2
refactor start_fd_persistent
jlnav Mar 15, 2024
97cdfdb
refactor start_persistent_local_opt_gens
jlnav Mar 15, 2024
77f880e
typo
jmlarson1 Mar 16, 2024
2780f10
fast_alloc alloc_f: don't overwrite sim_worker with gen_work for a gi…
jlnav Mar 18, 2024
655a1ba
return Work after packing up gen work
jlnav Mar 18, 2024
15719c7
do next_to_give check within avail_worker_ids loop
jlnav Mar 18, 2024
3138a39
add libE_specs["gen_workers"] option, adjust ensure_one_active_gen so…
jlnav Mar 18, 2024
2093629
update give_pregenerated_work and start_only_persistent to only give …
jlnav Mar 19, 2024
8a50e60
refactor fast_alloc_and_pausing
jlnav Mar 19, 2024
10 changes: 9 additions & 1 deletion docs/data_structures/libE_specs.rst
@@ -28,7 +28,11 @@ libEnsemble is primarily customized by setting options within a ``LibeSpecs`` cl
Manager/Worker communications mode: ``'mpi'``, ``'local'``, or ``'tcp'``.

**nworkers** [int]:
Number of worker processes in ``"local"`` or ``"tcp"``.
Number of worker processes in ``"local"``, ``"threads"``, or ``"tcp"``.

**gen_on_manager** Optional[bool] = False
Instructs Manager process to run generator functions.
This generator function can access/modify user objects by reference.

**mpi_comm** [MPI communicator] = ``MPI.COMM_WORLD``:
libEnsemble MPI communicator.
@@ -51,6 +55,10 @@ libEnsemble is primarily customized by setting options within a ``LibeSpecs`` cl
**disable_log_files** [bool] = ``False``:
Disable ``ensemble.log`` and ``libE_stats.txt`` log files.

**gen_workers** [list of ints]:
List of workers that should only run generators. All other workers will only
run simulator functions.

.. tab-item:: Directories

.. tab-set::
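The two options documented above can be combined with the existing fields. Below is a minimal sketch (not part of this PR's diff) of how they might be set; the field names `comms`, `nworkers`, `gen_on_manager`, and `gen_workers` come from the libE_specs entries in this file, while the `LibeSpecs` model import, the worker counts, and the choice of worker ID are assumptions for illustration.

```python
# Minimal sketch, assuming the LibeSpecs model accepts the fields documented above.
from libensemble.specs import LibeSpecs

# Run the generator function on the manager process; workers handle simulations.
specs_manager_gen = LibeSpecs(comms="local", nworkers=4, gen_on_manager=True)

# Alternatively, reserve worker 1 for generator work; all other workers
# only run simulator functions.
specs_gen_workers = LibeSpecs(comms="local", nworkers=4, gen_workers=[1])
```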
25 changes: 13 additions & 12 deletions libensemble/alloc_funcs/fast_alloc.py
@@ -32,25 +32,26 @@ def give_sim_work_first(W, H, sim_specs, gen_specs, alloc_specs, persis_info, li
Work = {}
gen_in = gen_specs.get("in", [])

for wid in support.avail_worker_ids():
# Give sim work if possible
for wid in support.avail_worker_ids(gen_workers=False):
persis_info = support.skip_canceled_points(H, persis_info)

# Give sim work if possible
if persis_info["next_to_give"] < len(H):
try:
Work[wid] = support.sim_work(wid, H, sim_specs["in"], [persis_info["next_to_give"]], [])
except InsufficientFreeResources:
break
persis_info["next_to_give"] += 1

elif gen_count < user.get("num_active_gens", gen_count + 1):
# Give gen work
return_rows = range(len(H)) if gen_in else []
try:
Work[wid] = support.gen_work(wid, gen_in, return_rows, persis_info.get(wid))
except InsufficientFreeResources:
break
gen_count += 1
persis_info["total_gen_calls"] += 1
# Give gen work if possible
if persis_info["next_to_give"] >= len(H):
for wid in support.avail_worker_ids(gen_workers=True):
if wid not in Work and gen_count < user.get("num_active_gens", gen_count + 1):
return_rows = range(len(H)) if gen_in else []
try:
Work[wid] = support.gen_work(wid, gen_in, return_rows, persis_info.get(wid))
except InsufficientFreeResources:
break
gen_count += 1
persis_info["total_gen_calls"] += 1

return Work, persis_info
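To summarize the pattern these allocation-function changes follow outside the hunk view, here is a condensed, illustrative sketch: the AllocSupport helpers (`avail_worker_ids`, `sim_work`, `gen_work`) and the `gen_workers` keyword are taken from the diff above, while the wrapper function and its simplified control flow (no `InsufficientFreeResources` handling, no `num_active_gens` limit) are assumptions, not code from this PR.

```python
# Illustrative only: condensed from the give_sim_work_first changes above.
def assign_work(support, H, sim_specs, gen_specs, persis_info):
    Work = {}

    # Simulation work goes only to workers that are not reserved for generators.
    for wid in support.avail_worker_ids(gen_workers=False):
        if persis_info["next_to_give"] >= len(H):
            break
        Work[wid] = support.sim_work(wid, H, sim_specs["in"], [persis_info["next_to_give"]], [])
        persis_info["next_to_give"] += 1

    # Once the queue is exhausted, generator work goes only to designated gen workers.
    if persis_info["next_to_give"] >= len(H):
        for wid in support.avail_worker_ids(gen_workers=True):
            Work[wid] = support.gen_work(wid, gen_specs.get("in", []), range(len(H)), persis_info.get(wid))
            break  # one new generator instance per call in this sketch

    return Work, persis_info
```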
21 changes: 13 additions & 8 deletions libensemble/alloc_funcs/fast_alloc_and_pausing.py
@@ -43,9 +43,10 @@ def give_sim_work_first(W, H, sim_specs, gen_specs, alloc_specs, persis_info, li
for pt_id in persis_info["pt_ids"]:
persis_info["inds_of_pt_ids"][pt_id] = H["pt_id"] == pt_id

idle_workers = support.avail_worker_ids()
idle_sim_workers = support.avail_worker_ids(gen_workers=False)
idle_gen_workers = support.avail_worker_ids(gen_workers=True)

while len(idle_workers):
while len(idle_sim_workers):
pt_ids_to_pause = set()

# Find indices of H that are not yet given out to be evaluated
@@ -106,15 +107,19 @@ def give_sim_work_first(W, H, sim_specs, gen_specs, alloc_specs, persis_info, li

if len(persis_info["need_to_give"]) != 0:
next_row = persis_info["need_to_give"].pop()
i = idle_workers[0]
i = idle_sim_workers[0]
try:
Work[i] = support.sim_work(i, H, sim_specs["in"], [next_row], [])
except InsufficientFreeResources:
persis_info["need_to_give"].add(next_row)
break
idle_workers = idle_workers[1:]
idle_sim_workers = idle_sim_workers[1:]

elif gen_count < alloc_specs["user"].get("num_active_gens", gen_count + 1):
else:
break

while len(idle_gen_workers):
if gen_count < alloc_specs["user"].get("num_active_gens", gen_count + 1):
lw = persis_info["last_worker"]

last_size = persis_info.get("last_size")
@@ -126,18 +131,18 @@ def give_sim_work_first(W, H, sim_specs, gen_specs, alloc_specs, persis_info, li
break

# Give gen work
i = idle_workers[0]
i = idle_gen_workers[0]
try:
Work[i] = support.gen_work(i, gen_specs["in"], range(len(H)), persis_info[lw])
except InsufficientFreeResources:
break
idle_workers = idle_workers[1:]
idle_gen_workers = idle_gen_workers[1:]
gen_count += 1
persis_info["total_gen_calls"] += 1
persis_info["last_worker"] = i
persis_info["last_size"] = len(H)

elif gen_count >= alloc_specs["user"].get("num_active_gens", gen_count + 1):
idle_workers = []
idle_gen_workers = []

return Work, persis_info
2 changes: 1 addition & 1 deletion libensemble/alloc_funcs/give_pregenerated_work.py
@@ -23,7 +23,7 @@ def give_pregenerated_sim_work(W, H, sim_specs, gen_specs, alloc_specs, persis_i
if persis_info["next_to_give"] >= len(H):
return Work, persis_info, 1

for i in support.avail_worker_ids():
for i in support.avail_worker_ids(gen_workers=False):
persis_info = support.skip_canceled_points(H, persis_info)

# Give sim work
10 changes: 7 additions & 3 deletions libensemble/alloc_funcs/give_sim_work_first.py
@@ -64,15 +64,19 @@ def give_sim_work_first(
Work = {}

points_to_evaluate = ~H["sim_started"] & ~H["cancel_requested"]
for wid in support.avail_worker_ids():
if np.any(points_to_evaluate):

if np.any(points_to_evaluate):
for wid in support.avail_worker_ids(gen_workers=False):
sim_ids_to_send = support.points_by_priority(H, points_avail=points_to_evaluate, batch=batch_give)
try:
Work[wid] = support.sim_work(wid, H, sim_specs["in"], sim_ids_to_send, persis_info.get(wid))
except InsufficientFreeResources:
break
points_to_evaluate[sim_ids_to_send] = False
else:
if not np.any(points_to_evaluate):
break
else:
for wid in support.avail_worker_ids(gen_workers=True):
# Allow at most num_active_gens active generator instances
if gen_count >= user.get("num_active_gens", gen_count + 1):
break
17 changes: 8 additions & 9 deletions libensemble/alloc_funcs/inverse_bayes_allocf.py
@@ -42,8 +42,9 @@ def only_persistent_gens_for_inverse_bayes(W, H, sim_specs, gen_specs, alloc_spe
Work[wid] = support.gen_work(wid, ["like"], inds_to_send_back, persis_info.get(wid), persistent=True)

points_to_evaluate = ~H["sim_started"] & ~H["cancel_requested"]
for wid in support.avail_worker_ids(persistent=False):
if np.any(points_to_evaluate):
if np.any(points_to_evaluate):
for wid in support.avail_worker_ids(persistent=False, gen_workers=False):

# perform sim evaluations (if any point hasn't been given).
sim_subbatches = H["subbatch"][points_to_evaluate]
sim_inds = sim_subbatches == np.min(sim_subbatches)
@@ -54,13 +55,11 @@ def only_persistent_gens_for_inverse_bayes(W, H, sim_specs, gen_specs, alloc_spe
except InsufficientFreeResources:
break
points_to_evaluate[sim_ids_to_send] = False

elif gen_count == 0:
# Finally, generate points since there is nothing else to do.
try:
Work[wid] = support.gen_work(wid, gen_specs["in"], [], persis_info.get(wid), persistent=True)
except InsufficientFreeResources:
if not np.any(points_to_evaluate):
break
gen_count += 1

elif gen_count == 0:
wid = support.avail_worker_ids(persistent=False, gen_workers=True)[0]
Work[wid] = support.gen_work(wid, gen_specs["in"], [], persis_info.get(wid), persistent=True)

return Work, persis_info
33 changes: 18 additions & 15 deletions libensemble/alloc_funcs/only_one_gen_alloc.py
@@ -21,27 +21,30 @@ def ensure_one_active_gen(W, H, sim_specs, gen_specs, alloc_specs, persis_info,
gen_flag = True
gen_in = gen_specs.get("in", [])

for wid in support.avail_worker_ids():
persis_info = support.skip_canceled_points(H, persis_info)

if persis_info["next_to_give"] < len(H):
if persis_info["next_to_give"] < len(H):
for wid in support.avail_worker_ids(gen_workers=False):
persis_info = support.skip_canceled_points(H, persis_info)
try:
Work[wid] = support.sim_work(wid, H, sim_specs["in"], [persis_info["next_to_give"]], [])
except InsufficientFreeResources:
break
persis_info["next_to_give"] += 1

elif not support.test_any_gen() and gen_flag:
if not support.all_sim_ended(H):
if persis_info["next_to_give"] >= len(H):
break

# Give gen work
return_rows = range(len(H)) if gen_in else []
try:
Work[wid] = support.gen_work(wid, gen_in, return_rows, persis_info.get(wid))
except InsufficientFreeResources:
break
gen_flag = False
persis_info["total_gen_calls"] += 1
elif not support.test_any_gen() and gen_flag:
# Give gen work
return_rows = range(len(H)) if gen_in else []
wid = support.avail_worker_ids(gen_workers=True)[0]

if not support.all_sim_ended(H):
return Work, persis_info

try:
Work[wid] = support.gen_work(wid, gen_in, return_rows, persis_info.get(wid))
except InsufficientFreeResources:
return Work, persis_info
gen_flag = False
persis_info["total_gen_calls"] += 1

return Work, persis_info
8 changes: 6 additions & 2 deletions libensemble/alloc_funcs/persistent_aposmm_alloc.py
@@ -53,7 +53,7 @@ def persistent_aposmm_alloc(W, H, sim_specs, gen_specs, alloc_specs, persis_info
)
returned_but_not_given[point_ids] = False

for wid in support.avail_worker_ids(persistent=False):
for wid in support.avail_worker_ids(persistent=False, gen_workers=False):
persis_info = support.skip_canceled_points(H, persis_info)

if persis_info["next_to_give"] < len(H):
@@ -63,8 +63,11 @@ def persistent_aposmm_alloc(W, H, sim_specs, gen_specs, alloc_specs, persis_info
except InsufficientFreeResources:
break
persis_info["next_to_give"] += 1
if persis_info["next_to_give"] >= len(H):
break

elif persis_info.get("gen_started") is None:
if persis_info.get("gen_started") is None:
for wid in support.avail_worker_ids(persistent=False, gen_workers=True):
# Finally, call a persistent generator as there is nothing else to do.
persis_info.get(wid)["nworkers"] = len(W)
try:
Expand All @@ -74,5 +77,6 @@ def persistent_aposmm_alloc(W, H, sim_specs, gen_specs, alloc_specs, persis_info
except InsufficientFreeResources:
break
persis_info["gen_started"] = True # Must set after - in case break on resources
break

return Work, persis_info
17 changes: 8 additions & 9 deletions libensemble/alloc_funcs/start_fd_persistent.py
@@ -49,22 +49,21 @@ def finite_diff_alloc(W, H, sim_specs, gen_specs, alloc_specs, persis_info, libE
)

points_to_evaluate = ~H["sim_started"] & ~H["cancel_requested"]
for wid in support.avail_worker_ids(persistent=False):
if np.any(points_to_evaluate):
if np.any(points_to_evaluate):
for wid in support.avail_worker_ids(persistent=False, gen_workers=False):
# perform sim evaluations (if they exist in History).
sim_ids_to_send = np.nonzero(points_to_evaluate)[0][0] # oldest point
try:
Work[wid] = support.sim_work(wid, H, sim_specs["in"], sim_ids_to_send, persis_info.get(wid))
except InsufficientFreeResources:
break
points_to_evaluate[sim_ids_to_send] = False

elif gen_count == 0:
# Finally, call a persistent generator as there is nothing else to do.
try:
Work[wid] = support.gen_work(wid, gen_specs.get("in", []), [], persis_info.get(wid), persistent=True)
except InsufficientFreeResources:
if not np.any(points_to_evaluate):
break
gen_count += 1

if gen_count == 0:
wid = support.avail_worker_ids(persistent=False, gen_workers=True)[0]
Work[wid] = support.gen_work(wid, gen_specs.get("in", []), [], persis_info.get(wid), persistent=True)
gen_count += 1

return Work, persis_info, 0
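The persistent-generator allocation functions (start_fd_persistent above, start_only_persistent below) use the same filter to pick a dedicated gen worker before starting a persistent generator. A small illustrative sketch of that step, with the helper calls taken from the diffs and the guard against an empty candidate list added as an assumption:

```python
# Illustrative sketch of starting a persistent generator on a gen worker.
def start_persistent_gen_if_idle(support, gen_specs, persis_info, Work, gen_count):
    if gen_count == 0:
        candidates = support.avail_worker_ids(persistent=False, gen_workers=True)
        if candidates:  # guard added here; the diffs index [0] directly
            wid = candidates[0]
            Work[wid] = support.gen_work(
                wid, gen_specs.get("in", []), [], persis_info.get(wid), persistent=True
            )
            gen_count += 1
    return Work, gen_count
```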
4 changes: 2 additions & 2 deletions libensemble/alloc_funcs/start_only_persistent.py
@@ -89,7 +89,7 @@ def only_persistent_gens(W, H, sim_specs, gen_specs, alloc_specs, persis_info, l

# Now the give_sim_work_first part
points_to_evaluate = ~H["sim_started"] & ~H["cancel_requested"]
avail_workers = support.avail_worker_ids(persistent=False, zero_resource_workers=False)
avail_workers = support.avail_worker_ids(persistent=False, zero_resource_workers=False, gen_workers=False)
if user.get("alt_type"):
avail_workers = list(
set(support.avail_worker_ids(persistent=False, zero_resource_workers=False))
@@ -115,7 +115,7 @@ def only_persistent_gens(W, H, sim_specs, gen_specs, alloc_specs, persis_info, l

# Start persistent gens if no worker to give out. Uses zero_resource_workers if defined.
if not np.any(points_to_evaluate):
avail_workers = support.avail_worker_ids(persistent=False, zero_resource_workers=True)
avail_workers = support.avail_worker_ids(persistent=False, zero_resource_workers=True, gen_workers=True)

for wid in avail_workers:
if gen_count < user.get("num_active_gens", 1):
16 changes: 10 additions & 6 deletions libensemble/alloc_funcs/start_persistent_local_opt_gens.py
@@ -54,7 +54,7 @@ def start_persistent_local_opt_gens(W, H, sim_specs, gen_specs, alloc_specs, per
Work[wid] = support.gen_work(wid, gen_specs["persis_in"], last_ind, persis_info[wid], persistent=True)
persis_info[wid]["run_order"].append(last_ind)

for wid in support.avail_worker_ids(persistent=False):
for wid in support.avail_worker_ids(persistent=False, gen_workers=True):
# Find candidates to start local opt runs if a sample has been evaluated
if np.any(np.logical_and(~H["local_pt"], H["sim_ended"], ~H["cancel_requested"])):
n = len(H["x"][0])
@@ -78,7 +78,8 @@ def start_persistent_local_opt_gens(W, H, sim_specs, gen_specs, alloc_specs, per
persis_info[wid]["run_order"] = [ind]
gen_count += 1

elif np.any(points_to_evaluate):
if np.any(points_to_evaluate):
for wid in support.avail_worker_ids(persistent=False, gen_workers=False):
# Perform sim evaluations from existing runs
q_inds_logical = np.logical_and(points_to_evaluate, H["local_pt"])
if not np.any(q_inds_logical):
Expand All @@ -89,10 +90,13 @@ def start_persistent_local_opt_gens(W, H, sim_specs, gen_specs, alloc_specs, per
except InsufficientFreeResources:
break
points_to_evaluate[sim_ids_to_send] = False
if not np.any(points_to_evaluate):
break

elif gen_count == 0 and not np.any(np.logical_and(W["active"] == EVAL_GEN_TAG, W["persis_state"] == 0)):
# Finally, generate points since there is nothing else to do (no resource sets req.)
Work[wid] = support.gen_work(wid, gen_specs.get("in", []), [], persis_info[wid], rset_team=[])
gen_count += 1
if gen_count == 0 and not np.any(np.logical_and(W["active"] == EVAL_GEN_TAG, W["persis_state"] == 0)):
# Finally, generate points since there is nothing else to do (no resource sets req.)
wid = support.avail_worker_ids(persistent=False, gen_workers=True)[0]
Work[wid] = support.gen_work(wid, gen_specs.get("in", []), [], persis_info[wid], rset_team=[])
gen_count += 1

return Work, persis_info