Human Generated Data

Title

Reclining Woman

Date

1st century BCE-2nd century CE

People
Classification

Sculpture

Machine Generated Data

Tags

Amazon

Figurine 99.4
Art 77.6
Sculpture 77.6
Statue 60.9

Clarifai

sculpture 99.7
art 98
no person 97.8
one 97.3
still life 95.5
museum 95.2
people 94.3
figurine 93.5
statue 93
nude 91.2
ancient 90.5
man 89.9
grow 88.3
mammal 88.3
cutout 87.6
competition 87.4
fish 87.3
two 87.3
food 87.3
side view 86.5

Imagga

tamarind 99.8
fruit 56.1
food 46.7
produce 41.5
bread 27
brown 24.3
healthy 21.4
fresh 20.3
loaf 19.4
meal 18.8
tasty 17.5
wheat 17.1
breakfast 17
nutrition 15.1
baked 15
delicious 14.9
dinner 14.3
organic 14.3
closeup 14.1
fungus 14
bakery 13.6
vegetable 13.1
cooking 13.1
diet 12.9
bun 12.1
natural 12
close 12
stone 11.8
mushroom 11.6
crust 11.6
pastry 11.4
lunch 11.1
snack 11.1
eat 10.9
spice 10.5
whole 10.5
health 10.4
grain 10.1
kitchen 9.8
cereal 9.7
animal 9.5
gourmet 9.3
eating 9.3
cuisine 8.9
baker 8.8
bake 8.7
edible 8.6
stones 8.5
garden 8.4
traditional 8.3
cook 8.2
slice 8.2
group 8.1
rye 7.8
flour 7.8
restaurant 7.8
sliced 7.7
cap 7.6
art 7.4
toy 7.4
peace 7.3
raw 7.1

Google

Microsoft

sculpture 55.4
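The tag lists above pair each label with a confidence score, but the services disagree widely (Imagga reads the sculpture as food). A minimal sketch, assuming the scores transcribed above, of finding labels every service agrees on at a given confidence threshold:

```python
# Hypothetical helper; label/score pairs transcribed from the lists above.
amazon = {"figurine": 99.4, "art": 77.6, "sculpture": 77.6, "statue": 60.9}
clarifai = {"sculpture": 99.7, "art": 98.0, "figurine": 93.5, "statue": 93.0}
microsoft = {"sculpture": 55.4}

def consensus(threshold, *services):
    """Labels that every service reports at or above the threshold."""
    kept = [{label for label, score in s.items() if score >= threshold}
            for s in services]
    return set.intersection(*kept)

print(sorted(consensus(50, amazon, clarifai, microsoft)))
# Only "sculpture" clears 50% in all three services.
```

With these scores, "sculpture" is the only label all three services agree on, which matches the human classification.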

Face analysis
AWS Rekognition

Age 29-45
Gender Female, 89.1%
Angry 5.1%
Disgusted 52.5%
Sad 11.8%
Calm 16.9%
Happy 7.5%
Confused 4%
Surprised 2.2%
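Rekognition returns a score for every emotion rather than a single verdict; the usual reading is to take the highest-scoring one. A small sketch using the values transcribed above:

```python
# Emotion scores transcribed from the AWS Rekognition output above.
emotions = {
    "Angry": 5.1, "Disgusted": 52.5, "Sad": 11.8, "Calm": 16.9,
    "Happy": 7.5, "Confused": 4.0, "Surprised": 2.2,
}

# The dominant emotion is simply the key with the highest score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Disgusted
```

Here "Disgusted" dominates at 52.5%, though no single emotion carries a strong majority.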

Microsoft Cognitive Services

Age 76
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Captions

Microsoft

a person sitting on a table 17%
a person holding a banana 5.1%
a person sitting on the ground 5%
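Caption services return several candidate sentences with confidences; a consumer would normally keep only the top-scoring one. A sketch over the captions transcribed above:

```python
# Candidate captions and confidences transcribed from the Microsoft output above.
captions = [
    ("a person sitting on a table", 17.0),
    ("a person holding a banana", 5.1),
    ("a person sitting on the ground", 5.0),
]

# Keep the highest-confidence candidate.
best_caption, best_score = max(captions, key=lambda c: c[1])
print(best_caption)  # a person sitting on a table
```

Even the best candidate here scores only 17%, a reminder that these captions are low-confidence guesses for an object like this.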