Hi everybody, I'm working on a CP model and I was wondering what the best way is to model a variable number of variables. Let me clarify with a minimal example. Suppose I have to model a set of *n* jobs, where each job *j* has a variable number of executions *x(j)*, and each execution *k_j* in *[1, x(j)]* has some properties, for instance a variable amount of resources used *r(k_j)*, a variable duration *d(k_j)*, and so on. These jobs are subject to a set of constraints: for instance, the sum of all durations for each job should be less than a threshold *T1*, and the sum of all resources used at each execution index should be above another threshold *T2*.
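To make the intent precise, these are the two families of constraints I have in mind (just restating the above in symbols):

```latex
% c1: for every job, the total duration of its executions stays below T1
\forall j \in \{1,\dots,n\}:\quad \sum_{k=1}^{x(j)} d(k_j) < T_1

% c2: for every execution index, the total resources used across jobs exceed T2
\forall k:\quad \sum_{j \,:\, k \le x(j)} r(k_j) > T_2
```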

In short, I would like to write something like this:

```
range J = 1..n;

int min_executions[j in J] = ...;
int max_executions[j in J] = ...;
int min_resources[j in J] = ...;
int max_resources[j in J] = ...;
int min_duration[j in J] = ...;
int max_duration[j in J] = ...;

dvar int X[j in J] in min_executions[j]..max_executions[j];

dexpr int max_x = max(j in J) X[j]; // needed to specify a variable range for c2

dvar int r[j in J][k in 1..X[j]] in min_resources[j]..max_resources[j];
dvar int d[j in J][k in 1..X[j]] in min_duration[j]..max_duration[j];

subject to {
  forall(j in J)
    c1: (sum(k in 1..X[j]) d[j][k]) < T1;
  forall(k in 1..max_x)
    c2: (sum(j in J : k <= X[j]) r[j][k]) > T2;
}
```

Well, this is all broken for several reasons. First, I get a "Variable indexer size not allowed for a generic array" error when declaring *r* and *d*: it looks like in OPL we can't declare arrays whose size is a decision variable. Second, I get a "Decision variable (or expression) "x" not allowed" error on constraint *c2*: in OPL we can't have decision variables inside the range expressions of aggregators. The best I can come up with is the following solution, which features a couple of workarounds:

```
tuple JobExecution {
  int job;
  int execution;
}

int max_exec = max(j in J) max_executions[j];

dvar int X[j in J] in min_executions[j]..max_executions[j];

{JobExecution} K = {<j, x> | j in J, x in 1..max_executions[j]};

dvar int r[k in K] in min_resources[k.job]..max_resources[k.job];
dvar int d[k in K] in min_duration[k.job]..max_duration[k.job];

subject to {
  forall(j in J)
    c1: (sum(k in K : k.job == j) (k.execution <= X[j]) * d[k]) < T1;
  forall(e in 1..max_exec)
    c2: (sum(k in K : k.execution == e) (k.execution <= X[k.job]) * r[k]) > T2;
}
```

Semantically, this second version does what it's supposed to, but (to me) at a great computational cost.

The tuple set *K* indeed includes, for each job *j*, as many tuples as the maximum number of executions *j* can have. This means that if the domain of *X[j]* is very large, *K* will likely include a lot of tuples that actually represent illegal executions, i.e., executions whose index is greater than the number of executions the solver assigns to that job. For example, if job *j1* can have 3..20 executions and the solver assigns the value 5 to *X[j1]*, then all the tuples of the form *<j1, 6> ... <j1, 20>* represent illegal executions.

This carries over to the arrays *d* and *r*, which will contain a lot of "useless" variables whose values should not be taken into account when enforcing constraints, since their semantics are undefined (if a job has 5 executions, its 6th, 7th... 20th executions don't exist).

Furthermore, since it is impossible to state a conditional expression over a dexpr/dvar, *c1* and *c2* have to be formulated by encoding the condition as a multiplication by a boolean value, which in my experience is quite inefficient. It seems to me that every constraint should somehow include the condition *k.execution <= X[k.job]*, either as an implication (to filter a forall block) or as a boolean factor multiplying some quantity (to filter a sum/min/max aggregator):

```
forall(k in K)
  (k.execution <= X[k.job]) => ??; // the right-hand side would be a constraint, enforced only if k is a good tuple

sum(k in K) (k.execution <= X[k.job]) * ??; // here ?? would be some value, whose contribution to the sum is nonzero only if k is a good tuple
```
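For what it's worth, the cleanest variant of the multiplication trick I can write factors the condition into a 0/1 dexpr (this is just a restatement of *c1* above, not a real improvement):

```
// 0/1 expression telling whether tuple k denotes a "real" execution
dexpr int active[k in K] = (k.execution <= X[k.job]);

forall(j in J)
  c1: (sum(k in K : k.job == j) active[k] * d[k]) < T1;
```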

Or maybe I need to somehow "invalidate" all of the bad tuples:

```
forall(k in K)
  (k.execution > X[k.job]) => ??; // not sure what to do here
```
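For instance, one form of invalidation I can imagine is pinning the variables of dead tuples to their domain minimum (an arbitrary choice), just so the solver has nothing left to decide for them; I don't know whether this actually helps propagation:

```
forall(k in K) {
  // a dead execution gets its variables fixed to an arbitrary constant
  (k.execution > X[k.job]) => (d[k] == min_duration[k.job]);
  (k.execution > X[k.job]) => (r[k] == min_resources[k.job]);
}
```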

Is there any better/recommended way to solve this problem? I tried looking through the guide at built-in functions like *inverse*, *allowedAssignments*, *forbiddenAssignments*, and so on, but I'm not sure they can be used for what I need.

Regards,

Stefano