Human Generated Data

Title

Untitled (rephotographed vintage portrait of woman with two children)

Date

c. 1950-1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11080

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Electronics 99.9%
Display 99.9%
Screen 99.9%
Human 98.4%
Person 98.4%
Face 95.5%
Person 94.8%
TV 94.6%
Television 94.6%
Clothing 94.6%
Apparel 94.6%
Head 90.8%
Person 87.3%
Monitor 87.2%
LCD Screen 86.6%
Indoors 73.9%
Interior Design 73.9%
Photo 68.6%
Portrait 68.6%
Photography 68.6%
Female 61.8%
Overcoat 56.8%
Suit 56.8%
Coat 56.8%

Clarifai
created on 2019-03-25

people 99.6%
adult 96.2%
portrait 92.1%
education 92%
man 91.5%
child 91.3%
classroom 88.5%
wear 87.9%
two 87.9%
chalkboard 86.3%
school 86.1%
woman 84.6%
one 83.8%
indoors 83%
group 77.9%
facial expression 77.5%
family 76.2%
room 75%
display 69.8%
chalk 68.7%

Imagga
created on 2019-03-25

television 100%
telecommunication system 94%
monitor 40.1%
broadcasting 40%
telecommunication 30.5%
screen 28.7%
technology 27.4%
computer 24.3%
equipment 24.1%
display 21.7%
laptop 20.1%
medium 18.4%
electronic 17.7%
business 17.6%
object 17.6%
office 16.1%
frame 14.1%
keyboard 14.1%
electronic equipment 13.9%
black 13.2%
modern 12.6%
flat 12.5%
information 12.4%
communication 11.8%
digital 11.3%
one 11.2%
global 10.9%
space 10.8%
tech 10.4%
notebook 10.4%
blank 10.3%
work 10.2%
design 10.1%
liquid crystal 9.9%
desktop 9.6%
communications 9.6%
media 9.5%
data 9.1%
vintage 9.1%
working 8.8%
electronics 8.5%
hand 8.3%
network 8.3%
person 8.3%
happy 8.1%
symbol 8.1%
transportation 8.1%
chalkboard 7.8%
face 7.8%
art 7.8%
portable 7.8%
visual 7.7%
wide 7.7%
wireless 7.6%
web 7.6%
contemporary 7.5%
blackboard 7.4%
close 7.4%
car 7.4%
smiling 7.2%
color 7.2%
idea 7.1%

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

window 99.5%
monitor 95.9%
television 91.3%
old 91%
black 76%
white 64.2%
display 46.4%
image 35.6%
black and white 27.6%
picture frame 27.6%
monochrome 21%
person 3.8%
museum 3.7%

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 87.5%
Happy 3.8%
Calm 31.1%
Confused 1.4%
Sad 55%
Disgusted 1.8%
Surprised 1.7%
Angry 5.1%

AWS Rekognition

Age 48-68
Gender Female, 92.2%
Sad 89.8%
Calm 2.2%
Happy 0.8%
Surprised 1.5%
Angry 2.9%
Disgusted 0.7%
Confused 2.2%

AWS Rekognition

Age 30-47
Gender Female, 82.5%
Sad 8.7%
Calm 32.8%
Happy 32.3%
Surprised 14.5%
Angry 5.2%
Disgusted 1.7%
Confused 4.8%
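Each AWS Rekognition face record above reports a distribution of emotion scores; the label the service considers most likely is simply the highest-scoring entry. A minimal sketch of reading off that dominant label, using the score values copied from the three records above (the `faces` list and `dominant_emotion` helper are illustrative, not part of the original dataset):

```python
# Emotion scores (in %) copied from the three AWS Rekognition
# face records above; keys are the emotion labels Rekognition emits.
faces = [
    {"Happy": 3.8, "Calm": 31.1, "Confused": 1.4, "Sad": 55.0,
     "Disgusted": 1.8, "Surprised": 1.7, "Angry": 5.1},
    {"Sad": 89.8, "Calm": 2.2, "Happy": 0.8, "Surprised": 1.5,
     "Angry": 2.9, "Disgusted": 0.7, "Confused": 2.2},
    {"Sad": 8.7, "Calm": 32.8, "Happy": 32.3, "Surprised": 14.5,
     "Angry": 5.2, "Disgusted": 1.7, "Confused": 4.8},
]

def dominant_emotion(scores: dict) -> str:
    """Return the highest-scoring emotion label for one face."""
    return max(scores, key=scores.get)

for i, face in enumerate(faces, 1):
    print(f"Face {i}: {dominant_emotion(face)}")
# Faces 1 and 2 come out Sad; face 3 comes out Calm (32.8% edging out Happy at 32.3%).
```

Note that the scores for face 3 are nearly uniform across Calm, Happy, and Sad plus Surprised, so its "dominant" label carries far less weight than face 2's 89.8% Sad.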

Feature analysis

Amazon

Person 98.4%
Monitor 87.2%

Captions

Microsoft

a vintage photo of a person 85.4%
a black and white photo of a person 81.2%
a person standing in front of a window 78.2%