Human Generated Data

Title

Baby

Date

19th century

Classification

Sculpture

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Richard Currier, 1964.64.37

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2022-02-19

Plant 99.4
Food 93.8
Potato 93.8
Vegetable 93.8
Produce 79.4
Human 72
Person 72
Soil 64.1
Sweet Potato 56.4
Yam 56.4
Skin 55.4
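
These labels match the output shape of the AWS Rekognition DetectLabels API. A minimal sketch of how such tags could be generated with boto3; the filename, region, and MinConfidence threshold are assumptions, not part of this record:

import boto3

# Rekognition client; the region is an assumption for this sketch
client = boto3.client("rekognition", region_name="us-east-1")

# Load the artwork image as raw bytes (hypothetical filename)
with open("baby_sculpture.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# the same shape as the "Plant 99.4" rows above
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")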

Imagga
created on 2022-02-19

yam 100
sweet potato 100
root vegetable 100
vegetable 81.2
produce 47.9
food 43.5
fresh 26.1
diet 21.8
meal 21.1
tasty 20.1
bread 19.4
brown 17.7
healthy 17.6
delicious 17.3
nutrition 15.9
baked 15.9
snack 15.4
wheat 15.2
breakfast 15
close 14.8
dinner 14.3
bakery 14.3
raw 14.3
hand 13.7
gourmet 13.6
eating 13.5
organic 13.4
closeup 12.8
loaf 12.6
eat 12.6
pastry 12.3
cooking 12.2
lunch 12
meat 11.7
cook 11
nobody 10.9
natural 10.7
crust 10.6
whole 10.5
sweet 10.3
fruit 10
tamarind 9.8
cuisine 9.8
wooden 9.7
health 9
cut 9
ingredient 8.8
hands 8.7
edible fruit 8.6
freshness 8.3
traditional 8.3
bun 7.8
potato 7.8
bake 7.7
plate 7.6
studio 7.6
grain 7.4
skin 7.2
body 7.2
kitchen 7.2
love 7.1
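
Imagga exposes its auto-tagging as a REST endpoint rather than an SDK. A minimal sketch against its v2 tags endpoint using requests; the credentials and filename are placeholders, and the response shape is assumed from Imagga's documented v2 API:

import requests

# Placeholder credentials from an Imagga account
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

# Upload the image to the v2 tagging endpoint with HTTP basic auth
with open("baby_sculpture.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each tag carries an English label and a 0-100 confidence,
# matching the "yam 100" rows above
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")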

Google
created on 2022-02-19

Food 90.5
Gesture 85.3
Artifact 75.2
Art 75
Vegetable 74.7
Root vegetable 68.5
Foot 67.5
Produce 61.2
Sculpture 60.2
Flesh 59.4
Clay 53.9
Human leg 53.4
Comfort 53
Terrestrial animal 52.7
Tuber 52.3
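
These tags correspond to Cloud Vision's label-detection feature, which returns scores on a 0-1 scale; the values above are that score times 100. A minimal sketch with the google-cloud-vision client (filename assumed):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Read the image content (hypothetical filename)
with open("baby_sculpture.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# label_annotations hold a description and a 0-1 score,
# so "Food 90.5" above corresponds to score 0.905
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")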

Microsoft
created on 2022-02-19

indoor 88.9
vegetable 61.3
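
These tags look like output from the Azure Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key from an Azure Computer Vision resource
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

# tag_image_in_stream returns tags with 0-1 confidences,
# so "indoor 88.9" above corresponds to confidence 0.889
with open("baby_sculpture.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")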

Face analysis

Amazon

AWS Rekognition

Age 0-6
Gender Male, 96.6%
Calm 65.1%
Fear 30.1%
Surprised 2.2%
Disgusted 0.7%
Sad 0.6%
Confused 0.6%
Happy 0.5%
Angry 0.2%
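
The age range, gender, and ranked emotions above have the shape of Rekognition's DetectFaces response with all facial attributes requested. A minimal sketch (filename and region assumed):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("baby_sculpture.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotions
# to each detected face, as in the rows above
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion types come back uppercase (e.g. CALM) with confidences
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")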

Feature analysis

Amazon

Person 72%
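
Feature analysis differs from plain tagging in that the label is localized: Rekognition's DetectLabels returns bounding-box Instances for object labels such as Person. A minimal, self-contained sketch of reading them (filename and region assumed):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("baby_sculpture.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Object labels such as "Person" carry Instances, each with its own
# bounding box and confidence (the 72% shown above)
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(left={box['Left']:.2f}, top={box['Top']:.2f}, "
              f"w={box['Width']:.2f}, h={box['Height']:.2f})")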