Human Generated Data

Title

Untitled (portrait of man and woman sitting in chairs)

Date

1943

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21907

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Clothing 99.9
Apparel 99.9
Human 97.6
Person 95.4
Person 95.4
Female 93.1
Robe 85.8
Fashion 85.8
Evening Dress 85.2
Gown 85.2
Suit 85
Overcoat 85
Coat 85
Woman 78.3
Flower 72.5
Blossom 72.5
Plant 72.5
Face 69.5
Wedding 69.5
Shirt 64.2
Wedding Gown 63.5
Flower Arrangement 63.2
Sunglasses 60.9
Accessories 60.9
Accessory 60.9
Dress 58.6
Girl 57.4

Clarifai
created on 2023-10-22

people 99.6
retro 95.4
wear 93.9
group 91.6
portrait 91.1
music 90.1
nostalgia 90
adult 89.7
sepia 89.7
monochrome 89.2
man 88.9
woman 88.1
two 87.9
interaction 87.6
child 87.2
vintage 86.7
three 86.6
veil 86.4
collage 86.2
actress 84.7

Imagga
created on 2022-03-11

groom 41
grunge 23.8
old 20.2
person 18.5
retro 18
art 17.7
vintage 17.4
texture 16
antique 15.6
people 15.1
man 14.9
dirty 14.5
wall 13.7
frame 13.3
grungy 13.3
paint 11.8
border 11.8
black 11.4
paper 10.6
space 10.1
dark 10
aged 9.9
human 9.7
pattern 9.6
weathered 9.5
women 9.5
men 9.4
water 9.3
style 8.9
cool 8.9
body 8.8
textured 8.8
text 8.7
negative 8.7
model 8.6
design 8.4
outdoor 8.4
adult 8.1
dress 8.1
graphic 8
light 8
businessman 7.9
male 7.9
color 7.8
ancient 7.8
blank 7.7
bride 7.7
power 7.6
fashion 7.5
sign 7.5
outdoors 7.5
silhouette 7.4
child 7.4
rough 7.3
splash 7.2
material 7.1
creative 7.1
happiness 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

wall 99.9
text 96.4
person 95.8
human face 94
clothing 92.1
smile 64.5
drawing 61.3
woman 60.8
man 54.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 48-56
Gender Female, 74.3%
Calm 96.6%
Happy 1%
Surprised 0.8%
Disgusted 0.4%
Confused 0.3%
Sad 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 40-48
Gender Female, 96%
Confused 44.4%
Calm 41.2%
Happy 10.8%
Surprised 1.5%
Sad 0.6%
Disgusted 0.6%
Angry 0.5%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Sunglasses
Person 95.4%
Person 95.4%
Sunglasses 60.9%

Captions

Microsoft
created on 2022-03-11

an old photo of a man 70.5%
an old photo of a boy 46.5%
old photo of a man 46.4%