Human Generated Data

Title

Untitled (two unidentified girls wearing shawls or saris, seated on rug, both reaching across round objects on rug)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.9

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Adult 98.6
Bride 98.6
Female 98.6
Person 98.6
Wedding 98.6
Woman 98.6
Adult 98.1
Female 98.1
Person 98.1
Woman 98.1
Art 95.7
Painting 95.7
Face 86.8
Head 86.8
Cutlery 84.3
Clothing 82.5
Hat 82.5
Food 76.4
Meal 76.4
People 65.7
Spoon 57.7
Shop 56
Bonnet 55.9
Fashion 55.7
Washing 55.2

Clarifai
created on 2019-02-26

people 99.9
adult 99.1
veil 98.3
portrait 97
man 96.8
wear 96.1
illustration 95.7
woman 95.4
art 94.9
one 93.7
group 93.2
leader 90.7
two 90.6
gown (clothing) 87.4
outfit 86.1
old 85.2
retro 84.9
print 83.6
vintage 83.4
religion 81.4

Imagga
created on 2019-02-26

money 29.8
container 29.6
currency 29.6
cash 28.4
old 27.2
stamp 26.1
bank 26
paper 25.3
vintage 24.8
grunge 23.8
bill 23.8
finance 22.8
dollar 21.3
envelope 20.9
ancient 19
banking 18.4
exchange 17.2
texture 16.7
antique 16.6
die 16.6
business 16.4
aged 16.3
financial 16
structure 15.7
wealth 15.3
retro 14.8
note 14.7
art 14.4
sculpture 13.2
savings 13.1
frame 12.7
bills 12.6
design 12.6
shaping tool 12.5
pay 12.5
wall 12
dollars 11.6
rich 11.2
economy 11.1
investment 11
history 10.7
hundred 10.6
payment 10.6
historic 10.1
carving 9.9
banknote 9.7
value 9.7
states 9.7
us 9.6
decoration 9.2
flower 9.2
border 9
one 9
style 8.9
brown 8.8
grime 8.8
economic 8.7
memorial 8.6
blank 8.6
travel 8.4
wallpaper 8.4
success 8
detail 8
close 8
card 7.9
banknotes 7.8
fracture 7.8
tool 7.8
decay 7.7
profit 7.7
floral 7.7
rusty 7.6
united 7.6
old fashioned 7.6
sign 7.5
wallet 7.5
gold 7.4
grain 7.4
letter 7.3
artwork 7.3
graphic 7.3
brass 7.2
religion 7.2
material 7.1
case 7
ingot 7

Google
created on 2019-02-26

Microsoft
created on 2019-02-26

text 97.7
book 93.6
old 91.2
vintage 25.4
art 25.4
engraving 23.7
monochrome 20.5
illustration 17.6
print 16.1

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 99.6%
Calm 62.7%
Angry 22.9%
Fear 10.4%
Surprised 6.6%
Sad 2.4%
Disgusted 2%
Confused 1.7%
Happy 0.2%

AWS Rekognition

Age 13-21
Gender Female, 100%
Sad 98.5%
Calm 41.8%
Surprised 6.3%
Fear 6%
Confused 0.5%
Disgusted 0.3%
Angry 0.3%
Happy 0.2%

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 10
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.6%
Bride 98.6%
Female 98.6%
Person 98.6%
Woman 98.6%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2019-02-26

a vintage photo of a book 64.3%
an old photo of a book 64.2%
a vintage photo of a person 64.1%