Yes, think of it as a property of each job. At the start, all probabilities are equal and sum to 1; later, the probability of each executed job increases or decreases.

See my example!

OK, let us summarize:

A job can be defined by the following structure:

```c
typedef struct {
    int jobId;   // job identifier
    double p;    // floating-point number in the [0, 1] interval
} defjob;
```

and we have a whole bunch of jobs:

`static defjob jobs[50000];`

uniformly distributed at the start. One can describe this with a function:

`void initializejobs(void); // sets the initial jobId and p for each job`

The example (with 5 jobs):

{{1,0.2},{2,0.2},{3,0.2},{4,0.2},{5,0.2}}

Let us go on...

to follow your plan, one has to pick and execute a job from jobs, where each job is chosen with its probability p...

One can describe it with function:

`defjob pickupjob(double p); // returns a random job from the subset of jobs whose probability equals p`

let us look at the example...

since we have a uniform distribution at the start, we can do this by calling

`pickupjob(0.2);`

The next step depends on the result of executing the chosen job,

but we MUST RECALCULATE the probabilities for EVERY job...

One can achieve this with a function:

`void normalizejobs(defjob ajob, double step); // normalizes the probability`

`// distribution so that the sum of p over all jobs = 1`

`// for simplicity's sake, let us regard step as the percentage by which we must increase (or decrease) ajob's probability, depending on the result of the execution`

Let us proceed with the example

suppose calling `pickupjob(0.2);` returns the job with id = 3,

and executing the job gives a positive result:

in our example, a call to

`normalizejobs(ajob, 1);`

gives us the following:

{{1,0.1975},{2,0.1975},{3,0.21},{4,0.1975},{5,0.1975}}

if the result of executing ajob is negative,

we instead call

`normalizejobs(ajob, -1);`

resulting in

{{1,0.2025},{2,0.2025},{3,0.19},{4,0.2025},{5,0.2025}}

**is it CORRECT?**