Human Generated Data

Title

Untitled (woman fixing bride's hair)

Date

1940

People

Artist: Samuel Cooper, American active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19489

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Furniture 96.1
Human 96.1
Person 96.1
Person 93.5
Room 92.6
Indoors 92.6
Cabinet 87.3
Leisure Activities 78.9
Dressing Room 78.8
Person 67.0
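The scores above follow the Amazon Rekognition convention of per-label confidences on a 0–100 scale. As a minimal sketch, assuming a simple list-of-tuples representation (the values are copied from the tags above; the threshold and helper name are illustrative, not part of any API), such a list is typically filtered by a confidence cutoff and deduplicated:

```python
# Rekognition-style labels: (name, confidence 0-100), values copied from the tags above.
labels = [
    ("Furniture", 96.1), ("Human", 96.1), ("Person", 96.1), ("Person", 93.5),
    ("Room", 92.6), ("Indoors", 92.6), ("Cabinet", 87.3),
    ("Leisure Activities", 78.9), ("Dressing Room", 78.8), ("Person", 67.0),
]

def confident_labels(labels, min_confidence=90.0):
    """Keep labels at or above the confidence cutoff, deduplicated, order preserved."""
    seen, kept = set(), []
    for name, conf in labels:
        if conf >= min_confidence and name not in seen:
            seen.add(name)
            kept.append(name)
    return kept

print(confident_labels(labels))  # → ['Furniture', 'Human', 'Person', 'Room', 'Indoors']
```

Note that "Person" appears three times at different confidences (one per detected instance), which is why deduplication matters when turning instance detections into a tag list.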

Clarifai
created on 2019-10-29

people 99.9
adult 98.9
furniture 98.3
indoors 97.2
woman 96.4
group 96.2
room 96.2
wear 95.9
monochrome 95.5
man 94.9
two 93.3
outfit 90.6
print 89.7
administration 89.2
actress 89
illustration 88.4
one 88
veil 87.6
seat 85.4
home 84

Imagga
created on 2019-10-29

shop 32
barbershop 31.8
interior 27.4
boutique 25.4
mercantile establishment 21.7
home 21.5
people 21.2
modern 21
house 19.2
room 18.5
kitchen 18.2
luxury 18
design 17.4
table 17.3
style 16.3
man 15.4
furniture 15.4
person 15.1
place of business 14.7
indoor 14.6
decoration 13.7
male 13.5
case 12.8
elegance 12.6
indoors 12.3
business 11.5
decor 11.5
window 11.2
celebration 11.2
inside 11
occupation 11
chair 10.9
dress 10.8
light 10.7
fashion 10.5
cabinet 10.5
apartment 10.5
salon 10.4
men 10.3
glass 10.1
adult 9.8
job 9.7
black 9.7
sexy 9.6
couple 9.6
work 9.5
equipment 9.5
love 9.5
happiness 9.4
happy 9.4
television 9.3
new 8.9
businessman 8.8
women 8.7
smile 8.5
marriage 8.5
contemporary 8.5
floor 8.4
wood 8.3
hospital 8.3
architecture 7.8
color 7.8
elegant 7.7
wall 7.7
bride 7.7
husband 7.6
restaurant 7.6
brass 7.6
lights 7.4
establishment 7.3
cheerful 7.3
counter 7.3
food 7.2
smiling 7.2
stylish 7.2
lifestyle 7.2
suit 7.2
life 7.1
hair 7.1
family 7.1

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 95.9
sink 87.9
black and white 84.8
clothing 80
dress 65.8
person 64
home appliance 54.4
posing 37.7

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Female, 53.1%
Happy 45%
Calm 45.3%
Angry 45.3%
Surprised 45.2%
Sad 53.4%
Confused 45.4%
Disgusted 45%
Fear 45.4%

AWS Rekognition

Age 26-42
Gender Female, 50.5%
Confused 45.3%
Fear 45.4%
Surprised 45.3%
Disgusted 45.1%
Calm 46.8%
Happy 45.1%
Sad 48.8%
Angry 48.2%

AWS Rekognition

Age 32-48
Gender Male, 54.4%
Angry 45.2%
Fear 45.5%
Happy 46.1%
Disgusted 45.4%
Calm 51.3%
Confused 45.2%
Sad 45.5%
Surprised 45.9%

AWS Rekognition

Age 22-34
Gender Female, 54.9%
Calm 48%
Angry 45.2%
Confused 45.1%
Surprised 45.1%
Disgusted 45.1%
Happy 48.3%
Fear 45.2%
Sad 48%
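Each block above is one face detected by AWS Rekognition, with an age range, a gender estimate, and a confidence score per emotion. A minimal sketch of reading off the dominant emotion, assuming the scores are held in a plain dict (values copied from the first face above; this dict layout is an illustration, not Rekognition's actual response format):

```python
# Emotion confidences (0-100) for the first detected face listed above.
face_emotions = {
    "Happy": 45.0, "Calm": 45.3, "Angry": 45.3, "Surprised": 45.2,
    "Sad": 53.4, "Confused": 45.4, "Disgusted": 45.0, "Fear": 45.4,
}

def dominant_emotion(emotions):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # → ('Sad', 53.4)
```

With most scores clustered near 45%, only the top score is meaningfully above the rest, so "Sad" is the label for this face in the data above.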

Feature analysis

Amazon

Person 96.1%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 78.1%
a person standing in front of a mirror posing for the camera 78%
a person standing in front of a mirror 77.9%

Text analysis

Amazon

E
223190A93902A1A
elaakole