Figure 2: F1-Confidence curve. Figure 3: Precision-Recall curve.
5.2 Training the model using the ADAM optimizer
Case 1(b). All parameters are kept at their default values and only the optimizer is changed to ADAM (see the configuration sketch after the list below):
- lr0: 0.1
- momentum: 0.9
- lrf: 0.2
- weight_decay: 0.0005
- warmup_epochs: 3.0
- optimizer: ADAM
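To make the configuration concrete, the sketch below is an illustration rather than the exact training script used in the experiment: it wires the listed hyperparameters into a standard torch.optim.Adam set-up with a linear warmup over the first warmup_epochs and a cosine decay of the learning rate toward lr0 * lrf. The tiny model and random tensors are placeholders for the actual detector and the xView tiles.

```python
import math
import torch
import torch.nn as nn

# Hyperparameters from the list above
lr0, lrf = 0.1, 0.2
momentum = 0.9
weight_decay = 0.0005
warmup_epochs, epochs = 3.0, 30

# Placeholder model and data standing in for the detector and the xView tiles
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
imgs = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 10, (64,))
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(imgs, labels), batch_size=16)
criterion = nn.CrossEntropyLoss()

# ADAM optimizer; the listed momentum is used as beta1
optimizer = torch.optim.Adam(model.parameters(), lr=lr0,
                             betas=(momentum, 0.999),
                             weight_decay=weight_decay)

# Cosine decay of the base learning rate from lr0 down to lr0 * lrf
lf = lambda e: ((1 - math.cos(e * math.pi / epochs)) / 2) * (lrf - 1) + 1
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lf)

nb = len(loader)  # batches per epoch
for epoch in range(epochs):
    for i, (x, y) in enumerate(loader):
        # Linear warmup of the learning rate over the first warmup_epochs
        ni, nw = i + epoch * nb, warmup_epochs * nb
        if ni < nw:
            for g in optimizer.param_groups:
                g["lr"] = lr0 * (ni + 1) / nw
        loss = criterion(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    scheduler.step()
```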
In the second experiment the model is trained using the ADAM optimizer. The learning rate is kept at 0.01 to evaluate whether the model performs better than with the SGD optimizer. The experiment is conducted for 30 epochs on the sub-sampled tiles of the xView dataset. The run achieves a PR AUC of 0.043 at mAP@0.5 and a maximum F1 score of 0.03 at a confidence threshold of 0.071, from which it is concluded that the model did not show better performance. Figure 2 and Figure 3 show the F1-Confidence curve and the Precision-Recall curve, respectively. Figure 4 shows the predicted objects in the image together with their confidence percentages.
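For reference, the sketch below illustrates how summary numbers of this form can be read off the two curves, assuming a list of per-detection confidence scores and a binary flag marking whether each detection matched a ground-truth box at IoU 0.5. The scores here are synthetic placeholders, and the calculation is simplified: it ignores ground-truth boxes that were never detected, which the detector's actual PR curve would also count.

```python
import numpy as np
from sklearn.metrics import precision_recall_curve, auc

rng = np.random.default_rng(0)
confidences = rng.random(500)                    # per-detection confidence scores
matched = (rng.random(500) < 0.05).astype(int)   # 1 = matched a box at IoU 0.5

# Precision-Recall curve over confidence thresholds and its area
precision, recall, thresholds = precision_recall_curve(matched, confidences)
pr_auc = auc(recall, precision)

# F1 at each threshold; the peak gives "F1 max at confidence c"
f1 = 2 * precision * recall / np.clip(precision + recall, 1e-12, None)
best = np.argmax(f1[:-1])                        # last PR point has no threshold
print(f"PR AUC {pr_auc:.3f} | F1 max {f1[best]:.3f} at {thresholds[best]:.3f} confidence")
```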