Experiment 20220108-MTOA

Experiment design

ID: 20220108-MTOA

Date: 2022-01-08 (Andreas Kalaitzakis)

Hypotheses

(1) Agents carrying out several tasks will reach a state of global consensus, i.e., a state in which all subsequent interactions are successful.

(2) An agent will improve its accuracy on one task by carrying out another task.

Experimental setting

20 runs; 80000 games

Agents learn multi-task ontologies. Agents coordinate on a set of decision tasks, each requiring a decision about an object.
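As a rough illustration of this setting, here is a toy sketch of the game loop (not the actual MultitaskExperiment implementation: the population size, the pairwise interaction, the binary decisions and the adaptation rule are all placeholder assumptions):

import random

NUM_TASKS = 4       # the independent variable 'numberOfTasks'
NUM_GAMES = 80000   # as in the experimental setting
NUM_AGENTS = 10     # placeholder; the real population size comes from params.sh

# Each agent is reduced to one binary decision rule per task.
agents = [{t: random.choice(["A", "B"]) for t in range(NUM_TASKS)}
          for _ in range(NUM_AGENTS)]

successes = 0
for game in range(NUM_GAMES):
    task = random.randrange(NUM_TASKS)   # the task is selected randomly each game
    a, b = random.sample(agents, 2)      # two agents interact
    if a[task] == b[task]:               # they agree on the decision: success
        successes += 1
    else:                                # placeholder adaptation: adopt the peer's decision
        b[task] = a[task]

print("overall success rate:", successes / NUM_GAMES)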

Variables

independent variables: ['numberOfTasks']

dependent variables: ['accuracy', 'success_rate']

Experiment

Date: 2022-01-08 (Andreas Kalaitzakis)

Lazy Lavender hash: ceb1c5d1ca8109373d293b687fc55953fce5241d

Parameter file: params.sh

Executed command (script.sh):

#!/bin/bash

. params.sh
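# params.sh is assumed to define DIRPREF, LLPATH, JPATH and OPT, which are used below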

CURRDIR=$(pwd)
OUTPUT=${CURRDIR}/${DIRPREF}
# cd ${LLPATH}
cd lazylav
# This sample runs ExperimentalPlan; it can be replaced with Monitor if parameters are not varied.
bash scripts/runexp.sh -p ${CURRDIR} -d ${DIRPREF} \
  java -Dlog.level=INFO -cp ${JPATH} \
  fr.inria.exmo.lazylavender.engine.ExperimentalPlan \
  -Dexperiment=fr.inria.exmo.lazylavender.decisiontaking.multitask.MultitaskExperiment \
  ${OPT} -DresultDir=${OUTPUT}

Raw results

Full results can be found at:

Zenodo DOI

Data files

Analysis

Raw data (averaged)

game avg-Accuracy_t=2 avg-Success_rate_t=2 avg-Accuracy_t=1 avg-Success_rate_t=1 avg-Accuracy_t=4 avg-Success_rate_t=4 avg-Accuracy_t=8 avg-Success_rate_t=8
0 0.246094 0.250000 0.281563 0.250000 0.218945 0.350000 0.202559 0.150000
1 0.248125 0.225000 0.279844 0.275000 0.220234 0.225000 0.203008 0.225000
2 0.248906 0.250000 0.281094 0.233333 0.222070 0.200000 0.204277 0.233333
3 0.249297 0.262500 0.280000 0.237500 0.223438 0.162500 0.205430 0.250000
4 0.250547 0.290000 0.278906 0.230000 0.224648 0.170000 0.206621 0.260000
5 0.251641 0.300000 0.279063 0.266667 0.226094 0.191667 0.207578 0.258333
6 0.254297 0.300000 0.278906 0.257143 0.226875 0.214286 0.208418 0.242857
7 0.255938 0.287500 0.279219 0.250000 0.227656 0.212500 0.209473 0.250000
8 0.257188 0.277778 0.278594 0.261111 0.227930 0.211111 0.210293 0.244444
9 0.258281 0.265000 0.277344 0.265000 0.228477 0.210000 0.211016 0.245000
10 0.259766 0.268182 0.278906 0.259091 0.229531 0.213636 0.212012 0.245455
11 0.261875 0.287500 0.280781 0.266667 0.230156 0.212500 0.212598 0.245833
12 0.262891 0.292308 0.280000 0.273077 0.232070 0.215385 0.213438 0.238462
13 0.262891 0.289286 0.281563 0.278571 0.232773 0.225000 0.214648 0.253571
14 0.264219 0.280000 0.284531 0.283333 0.233672 0.233333 0.216191 0.240000
15 0.265781 0.275000 0.284844 0.275000 0.235039 0.228125 0.216934 0.240625
16 0.266953 0.276471 0.285313 0.267647 0.236328 0.229412 0.218027 0.235294
17 0.267656 0.275000 0.285625 0.280556 0.236719 0.241667 0.218457 0.236111
18 0.269297 0.278947 0.284687 0.286842 0.237578 0.242105 0.219492 0.234211
19 0.270000 0.275000 0.286562 0.290000 0.238984 0.242500 0.220488 0.232500
20 0.271328 0.278571 0.288438 0.292857 0.240586 0.242857 0.220918 0.235714
21 0.272266 0.288636 0.287187 0.290909 0.241641 0.247727 0.221582 0.236364
22 0.273750 0.284783 0.285781 0.289130 0.242578 0.252174 0.222422 0.239130
23 0.275000 0.277083 0.285938 0.289583 0.243164 0.247917 0.223242 0.237500
24 0.275313 0.272000 0.284219 0.282000 0.243828 0.244000 0.224180 0.238000
25 0.277031 0.273077 0.284687 0.286538 0.244648 0.242308 0.225254 0.236538
26 0.276719 0.272222 0.283281 0.288889 0.245195 0.242593 0.225977 0.237037
27 0.278125 0.278571 0.284219 0.289286 0.245703 0.239286 0.226895 0.233929
28 0.279375 0.279310 0.285313 0.286207 0.246484 0.232759 0.227832 0.232759
29 0.280156 0.286667 0.286875 0.283333 0.247109 0.236667 0.228613 0.231667
... ... ... ... ... ... ... ... ...
79970 0.770312 0.977072 0.646875 0.989132 0.792188 0.948853 0.795312 0.894760
79971 0.770312 0.977072 0.646875 0.989132 0.792188 0.948853 0.795312 0.894761
79972 0.770312 0.977072 0.646875 0.989132 0.792188 0.948854 0.795312 0.894763
79973 0.770312 0.977073 0.646875 0.989132 0.792188 0.948855 0.795312 0.894763
79974 0.770312 0.977073 0.646875 0.989132 0.792188 0.948855 0.795312 0.894765
79975 0.770312 0.977073 0.646875 0.989132 0.792188 0.948856 0.795312 0.894766
79976 0.770312 0.977073 0.646875 0.989133 0.792188 0.948857 0.795312 0.894767
79977 0.770312 0.977074 0.646875 0.989133 0.792188 0.948857 0.795312 0.894769
79978 0.770312 0.977074 0.646875 0.989133 0.792188 0.948858 0.795312 0.894770
79979 0.770312 0.977074 0.646875 0.989133 0.792188 0.948858 0.795312 0.894771
79980 0.770312 0.977075 0.646875 0.989133 0.792188 0.948859 0.795312 0.894771
79981 0.770312 0.977075 0.646875 0.989133 0.792188 0.948860 0.795312 0.894773
79982 0.770312 0.977075 0.646875 0.989133 0.792188 0.948860 0.795312 0.894774
79983 0.770312 0.977075 0.646875 0.989133 0.792188 0.948861 0.795312 0.894775
79984 0.770312 0.977076 0.646875 0.989134 0.792188 0.948862 0.795312 0.894777
79985 0.770312 0.977076 0.646875 0.989134 0.792188 0.948862 0.795312 0.894778
79986 0.770312 0.977076 0.646875 0.989134 0.792188 0.948863 0.795312 0.894779
79987 0.770312 0.977077 0.646875 0.989134 0.792188 0.948864 0.795312 0.894780
79988 0.770312 0.977077 0.646875 0.989134 0.792188 0.948864 0.795312 0.894782
79989 0.770312 0.977077 0.646875 0.989134 0.792188 0.948865 0.795312 0.894783
79990 0.770312 0.977077 0.646875 0.989134 0.792188 0.948865 0.795312 0.894784
79991 0.770312 0.977078 0.646875 0.989135 0.792188 0.948866 0.795312 0.894786
79992 0.770312 0.977078 0.646875 0.989135 0.792188 0.948867 0.795312 0.894787
79993 0.770312 0.977078 0.646875 0.989135 0.792188 0.948867 0.795312 0.894788
79994 0.770312 0.977079 0.646875 0.989135 0.792188 0.948868 0.795312 0.894790
79995 0.770312 0.977079 0.646875 0.989135 0.792188 0.948869 0.795312 0.894791
79996 0.770312 0.977079 0.646875 0.989135 0.792188 0.948869 0.795312 0.894792
79997 0.770312 0.977079 0.646875 0.989135 0.792188 0.948870 0.795312 0.894793
79998 0.770312 0.977080 0.646875 0.989135 0.792188 0.948871 0.795312 0.894794
79999 0.770312 0.977080 0.646875 0.989136 0.792188 0.948871 0.795312 0.894796

80000 rows × 8 columns
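The averaged table above can be recomputed from the per-run result files with pandas. A minimal sketch, assuming one CSV per run and per task count, with Accuracy and Success_rate columns (the file layout and column names are assumptions, not taken from the raw results):

import pandas as pd

NUM_RUNS = 20
columns = {}
for t in (1, 2, 4, 8):
    # Assumed layout: one file per run, e.g. results/t4/run07.csv
    runs = [pd.read_csv(f"results/t{t}/run{r:02d}.csv") for r in range(NUM_RUNS)]
    mean = sum(runs) / NUM_RUNS   # element-wise mean over the 20 runs, one row per game
    columns[f"avg-Accuracy_t={t}"] = mean["Accuracy"]
    columns[f"avg-Success_rate_t={t}"] = mean["Success_rate"]

avg = pd.DataFrame(columns)
print(avg)   # 80000 rows x 8 columns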

Figures

[Figure 1: average success rate (y-axis) per game (x-axis), one curve per number of tasks |T|.]

[Figure 2: average accuracy (y-axis) per game (x-axis), one curve per number of tasks |T|.]

One-way ANOVA for average accuracy, averages files only: F = 82591.27866536228, p = 0.0
One-way ANOVA for average accuracy, all .csv files: F = 32365.712820735433, p = 0.0
One-way ANOVA for success rate, averages files only: F = 18844.927210921087, p = 0.0
One-way ANOVA for success rate, all .csv files: F = 91420.40682916986, p = 0.0
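These statistics can be obtained with scipy's f_oneway. A minimal sketch, assuming the test compares the four task conditions of the averaged table built above (how the groups were actually formed is not recorded here):

from scipy.stats import f_oneway

# 'avg' is the averaged DataFrame from the previous sketch
f_acc, p_acc = f_oneway(*(avg[f"avg-Accuracy_t={t}"] for t in (1, 2, 4, 8)))
print("avg accuracy: F =", f_acc, ", p =", p_acc)

f_sr, p_sr = f_oneway(*(avg[f"avg-Success_rate_t={t}"] for t in (1, 2, 4, 8)))
print("success rate: F =", f_sr, ", p =", p_sr)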

Discussion

Figure 1 displays the evolution of the average success rate (y-axis) as the number of iterations increases (x-axis), depending on the number of tasks (|T|). It shows that a population of interacting agents reaches a state of global consensus independently of the number of tasks, supporting hypothesis (1). Furthermore, the results indicate that the number of tasks affects the achieved success rate: the more tasks agents carry out, the more interactions are required to achieve global consensus, and thus the lower the average success rate at convergence.

Figure 2 portrays the evolution of the average accuracy (y-axis), depending on the number of tasks carried out. Each point (x, y) corresponds to the average accuracy over all tackled tasks at the n-th interaction of each run. Since the task is selected randomly at each interaction, the number of interactions devoted to any given task by the n-th interaction differs across the four displayed distributions. An agent carrying out 2 tasks is on average 19% more accurate than its single-task counterpart, while an agent carrying out 4 tasks further improves its average accuracy by another 3%.

These results not only support hypothesis (2), but also show that carrying out additional tasks is subject to the law of diminishing returns with respect to the acquired accuracy. For average accuracy in particular, Figure 2 shows that a two-task ontology is 19% more accurate than a one-task ontology, a four-task ontology is 3% more accurate than a two-task ontology, and an eight-task ontology is only 0.4% more accurate than a four-task ontology.
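These relative improvements follow directly from the average accuracies at convergence in the table above (last row, game 79999):

# Average accuracy at convergence, read from the last row of the averaged table
acc = {1: 0.646875, 2: 0.770312, 4: 0.792188, 8: 0.795312}

for prev, curr in [(1, 2), (2, 4), (4, 8)]:
    gain = (acc[curr] - acc[prev]) / acc[prev]
    print(f"{prev} -> {curr} tasks: +{gain:.1%}")
# 1 -> 2 tasks: +19.1%
# 2 -> 4 tasks: +2.8%
# 4 -> 8 tasks: +0.4%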