Human Generated Data

Title

Untitled (portrait of man and woman)

Date

copy negative made c. 1970, original image made earlier

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21606

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Apparel 99.7
Clothing 99.7
Dress 98.8
Person 98.2
Human 98.2
Person 98
Female 95.9
Woman 85.3
Face 83.9
Fashion 81.8
Robe 81.8
Evening Dress 81.8
Gown 81.8
Drawing 78.7
Art 78.7
Shoe 73.2
Footwear 73.2
People 69.5
Girl 69.3
Photography 64.5
Photo 64.5
Portrait 64.5
Sketch 60.6
Wedding 59
Furniture 56.6
Chair 56.6
Wedding Gown 56.4

Imagga
created on 2022-03-05

negative 71.9
film 58.8
photographic paper 44
statue 42.7
sculpture 35.1
photographic equipment 29.3
art 27
monument 23.4
ancient 20.8
architecture 20.3
marble 19.8
old 18.8
history 18.8
tourism 17.3
stone 16.2
religion 16.1
dress 15.4
historical 15.1
antique 14.7
travel 14.1
famous 14
historic 13.8
culture 12.8
figure 11.9
landmark 11.7
god 11.5
face 11.4
detail 11.3
church 11.1
portrait 11
carving 11
tourist 10.9
city 10.8
bride 10.7
people 10.6
building 9.5
religious 9.4
person 9.1
vintage 9.1
clothing 8.9
decoration 8.9
roman 8.8
catholic 8.8
holy 8.7
structure 8.5
head 8.4
book jacket 8.3
traditional 8.3
column 8.3
fountain 8.1
symbol 8.1
window 8
love 7.9
theater 7.8
temple 7.8
model 7.8
faith 7.7
memorial 7.6
elegance 7.6
fashion 7.5
wedding 7.4
makeup 7.3
lady 7.3
body 7.2
celebration 7.2
adult 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96
window 94.2
person 93.5
dress 90
posing 89.2
wedding dress 87.8
sketch 86.3
clothing 85.5
bride 76.4
woman 74.1
drawing 68.9
black 67.5
white 64.3
human face 64.1
old 63.8

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Female, 99.6%
Calm 82.7%
Surprised 14.8%
Sad 1.2%
Angry 0.6%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 28-38
Gender Male, 69.2%
Calm 43.5%
Surprised 39.5%
Happy 7.2%
Sad 4.2%
Fear 2.8%
Angry 1.2%
Disgusted 0.8%
Confused 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%
Shoe 73.2%

Captions

Microsoft

a vintage photo of a man and woman posing for a picture 86.6%
a vintage photo of a man and a woman posing for a picture 86.5%
a vintage photo of a man and woman posing for the camera 83.1%

Text analysis

Amazon

138
ODVR-OVEETA

Google

138
138