Human Generated Data

Title

Untitled (unidentified man, standing, wearing turban, right hand resting on back of chair, unidentified woman in sari, seated, unidentified girl wearing sari standing to right of woman)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.4

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Human 99.6
Person 99.6
Person 99.5
Person 99.1
Poster 87.2
Advertisement 87.2
People 78.3
Tribe 65.7
Art 59.9
Painting 57.8
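The machine-generated tags above are "label confidence" pairs, where the confidence is a percentage. A minimal sketch (the parser and threshold are illustrative assumptions, not part of this record or any provider's API) of reading such lines and keeping only high-confidence tags:

```python
def parse_tags(lines, threshold=80.0):
    """Parse 'label confidence' lines; keep labels at or above threshold."""
    kept = {}
    for line in lines:
        # Labels may contain spaces (e.g. "book jacket 64"),
        # so split the confidence off the right-hand end.
        label, _, score = line.rpartition(" ")
        try:
            conf = float(score)
        except ValueError:
            continue  # skip malformed lines
        if conf >= threshold:
            kept[label] = conf
    return kept

amazon_tags = [
    "Human 99.6", "Person 99.6", "Poster 87.2",
    "Advertisement 87.2", "People 78.3", "Tribe 65.7",
]
print(parse_tags(amazon_tags))  # only tags at or above 80% survive
```

With the default 80% threshold, low-confidence guesses such as "Tribe 65.7" are dropped; the cutoff is a tunable assumption, not something the providers prescribe.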

Imagga
created on 2022-02-25

book jacket 64
jacket 49.8
wrapping 37.8
vintage 36.4
covering 29.1
old 27.9
paper 26.7
art 25.1
grunge 21.3
binding 21.2
retro 20.5
stamp 19.7
mail 19.2
postage 18.7
antique 18.2
post 18.1
history 17.9
letter 17.4
ancient 17.3
culture 16.2
postmark 15.8
symbol 15.5
texture 15.3
postal 14.7
museum 13.9
black 13.8
aged 13.6
envelope 13.1
global 12.8
currency 12.6
one 12
money 11.9
design 11.6
lighter 11.5
painted 11.5
banking 11
cash 11
bill 10.5
book 10.3
decoration 10.3
wall 10.3
finance 10.1
device 9.9
masterpiece 9.9
religion 9.9
stamps 9.9
bank 9.9
renaissance 9.8
sculpture 9.8
communications 9.6
church 9.3
painter 9.1
dirty 9
wealth 9
zigzag 8.9
printed 8.9
paintings 8.8
delivery 8.8
cutting 8.7
fine 8.6
unique 8.5
communication 8.4
dollar 8.4
page 8.4
note 8.3
stone 8.1
office 8
financial 8
close 8
icon 7.9
post mail 7.9
fame 7.9
business 7.9
known 7.9
shows 7.9
empty 7.7
us 7.7
artist 7.7
wallpaper 7.7
united 7.6
historical 7.5
pattern 7.5
religious 7.5
graffito 7.5
ornate 7.3
backgrounds 7.3
collection 7.2

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.6
person 98.6
clothing 98.4
old 97.6
vintage 96.4
posing 95.6
black 92.3
white 82.3
human face 70.9
woman 68
photograph 64.3
fashioned 53.6
vintage clothing 50.3

Face analysis

AWS Rekognition

Age 14-22
Gender Female, 99.9%
Calm 95%
Angry 2.5%
Surprised 0.8%
Fear 0.6%
Sad 0.4%
Disgusted 0.3%
Confused 0.2%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Calm 97.9%
Confused 0.7%
Angry 0.4%
Happy 0.3%
Sad 0.3%
Fear 0.2%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 16-22
Gender Male, 99.9%
Calm 92.2%
Surprised 3.2%
Fear 2.9%
Angry 0.5%
Confused 0.4%
Sad 0.4%
Disgusted 0.2%
Happy 0.2%
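Each AWS Rekognition face block above reports one confidence score per emotion. A minimal sketch (the dict representation is an assumption for illustration, not Rekognition's actual response format) of reducing such a block to its dominant emotion:

```python
def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

# Scores from the third face block above (Age 16-22).
face = {
    "Calm": 92.2, "Surprised": 3.2, "Fear": 2.9, "Angry": 0.5,
    "Confused": 0.4, "Sad": 0.4, "Disgusted": 0.2, "Happy": 0.2,
}
print(dominant_emotion(face))  # -> ('Calm', 92.2)
```

For all three faces in this record, "Calm" dominates by a wide margin, so a single-label summary loses little information here.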

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Poster 87.2%
Painting 57.8%

Captions

Microsoft

a vintage photo of Sayajirao Gaekwad III et al. posing for the camera 87%
a vintage photo of Sayajirao Gaekwad III et al. posing for a picture 86.9%
a vintage photo of some people posing for the camera 85.8%