Human Generated Data

Title

Untitled (outdoor photograph of couple)

Date

c. 1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1483

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.9
Human 98.9
Person 98.9
Apparel 89.5
Clothing 89.5
Face 76.1
Photo 66.7
Photography 66.7
Portrait 64.9
Electronics 64
Phone 64

Clarifai
created on 2019-06-01

people 99
man 96.1
adult 93.4
woman 93.2
wear 92.8
indoors 91.3
vertical 89.1
two 84.7
uniform 82.3
veil 79.8
door 73.2
outfit 71.4
business 70.8
three 68.6
doorway 68.6
military 68.4
dress 68.4
wedding 67.6
dinner jacket 67
group 66.5

Imagga
created on 2019-06-01

negative 30.5
film 27
people 24
person 22.1
man 19.4
photographic paper 18.6
professional 17.8
male 17
business 15.8
human 15
adult 14.9
bride 14.6
happy 13.8
cartoon 13.4
medicine 13.2
doctor 13.2
portrait 12.9
work 12.6
photographic equipment 12.4
medical 12.4
attractive 11.9
holding 11.6
black 11.4
design 11.3
looking 11.2
women 11.1
drawing 10.7
office 10.6
art 10.5
sign 10.5
lady 9.7
job 9.7
group 9.7
smiling 9.4
model 9.3
smile 9.3
silhouette 9.1
care 9.1
dress 9
boutique 8.9
success 8.9
businessman 8.8
object 8.8
hospital 8.6
blank 8.6
modern 8.4
nurse 8.4
clip art 8.3
health 8.3
occupation 8.2
confident 8.2
handsome 8
posing 8
lifestyle 8
happiness 7.8
3d 7.7
corporate 7.7
men 7.7
pretty 7.7
clinic 7.7
exam 7.7
career 7.6
communication 7.6
sketch 7.4
photograph 7.3
worker 7.3
newlywed 7.3
color 7.2
copy space 7.2
paper 7.2
copy 7.1
working 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 94
person 90.9
black and white 88.5
man 57.6
human face 53.5
room 47.1
old 43.7
posing 35.1
picture frame 16.6

Face analysis

Amazon

AWS Rekognition

Age 38-59
Gender Male, 57.4%
Confused 0.6%
Surprised 1.3%
Calm 4.6%
Sad 1.9%
Happy 89.8%
Disgusted 0.9%
Angry 0.9%

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a man standing in front of a window 71.3%
an old photo of a man 71.2%
a man standing next to a window 63.5%