Human Generated Data

Title

Angel

Date

19th century

Classification

Sculpture

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Estate of Richard Currier, 1964.64.4

Machine Generated Data

Tags (confidence %)

Amazon
created on 2022-02-19

Bag 89.6
Sack 89.6
Plant 88.5
Food 77.7
Pottery 66.7
Produce 58.1
Bread 57.6
Bun 57.6
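
Note: these labels match the output format of Amazon Rekognition's DetectLabels operation. As an illustration only, not the museum's actual pipeline, here is a minimal boto3 sketch that would print label/confidence pairs in this form; the bucket and file names are placeholders:

```python
import boto3

# Placeholder bucket/key; any S3-hosted image of the object would do.
rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-images", "Name": "angel.jpg"}},
    MinConfidence=50,  # drop labels scored below 50%
)

# Print "Label Confidence" pairs, e.g. "Bag 89.6", as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```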

Imagga
created on 2022-02-19

vegetable 56
artichoke 54.7
food 44.4
produce 28.2
oyster 22.6
fresh 20.3
mollusk 19.9
meal 19.5
delicious 18.2
invertebrate 16.7
cuisine 16
garlic 15.8
tasty 15
shell 14.9
herb 14.8
plate 14.7
restaurant 14.7
fish 14.6
healthy 14.5
eat 14.3
seafood 14.2
closeup 13.5
bivalve 13.3
dish 13.3
lunch 12.9
gourmet 12.7
eating 12.6
organic 12.6
gold 12.3
cooking 12.2
plant 12.1
animal 11.9
raw 11.6
market 11.5
natural 11.4
diet 11.3
ingredient 10.6
close 10.3
vascular plant 10.2
golden 9.5
snack 9.4
dinner 9.3
nutrition 9.2
artichoke heart 8.9
celebration 8.8
life 8.6
dining 8.6
appetizer 8.5
texture 8.3
fruit 8.3
meat 8.1
group 8.1
kitchen 8
culinary 7.6
arthropod 7.6
marine 7.6
decoration 7.5
brown 7.4
object 7.3
detail 7.2
vegetarian 7.2
breakfast 7.1
knife 7
sea 7
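
Note: Imagga serves tags like these through its REST API. A minimal sketch of such a request with the requests library; the key, secret, and image URL are placeholders, and the result -> tags -> tag.en response shape is assumed from Imagga's public v2 documentation:

```python
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/angel.jpg"},  # placeholder URL
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with the API credentials
)
resp.raise_for_status()

# Assumed shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```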

Google
created on 2022-02-19

(no tags recorded)

Microsoft
created on 2022-02-19

indoor 89.9

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 100%
Calm 34.3%
Confused 28.7%
Sad 14.8%
Surprised 11.9%
Disgusted 3.4%
Fear 2.8%
Angry 2.4%
Happy 1.7%
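
Note: the age range, gender, and emotion scores above correspond to Rekognition's DetectFaces operation with full attributes requested. A minimal boto3 sketch, again with placeholder bucket and file names:

```python
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-images", "Name": "angel.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    # Emotions arrive unsorted; sort by confidence to match the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```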

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
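
Note: Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why this block reads "Very unlikely". A minimal sketch with the google-cloud-vision client; the file path is a placeholder and credentials are assumed to be configured via GOOGLE_APPLICATION_CREDENTIALS:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("angel.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum, matching the buckets listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```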

Captions

Microsoft

a stuffed animal sitting on a table 40%
a close up of a stuffed animal 39.9%
a stuffed animal 39.8%
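
Note: ranked candidate captions like these are what Azure Computer Vision's "describe" feature returns. A minimal sketch with the azure-cognitiveservices-vision-computervision client; endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

description = client.describe_image(
    "https://example.org/angel.jpg",  # placeholder image URL
    max_candidates=3,  # return several candidate captions, as listed above
)

# Confidence is reported in [0, 1]; scale to percent to match the list.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```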