Human Generated Data

Title

Untitled (man preparing bride for studio portrait)

Date

1946

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19490

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Person 97.1
Human 97.1
Studio 94.4
Person 81.2
Apparel 79.1
Clothing 79.1
Flooring 71.3
Floor 67.8
Indoors 62.7
Room 62.7
Art 58.5
Photography 58.1
Photo 58.1
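
The label/confidence pairs above match the output format of AWS Rekognition's label detection. A minimal sketch, assuming a local copy of the photograph and configured AWS credentials (the file name and confidence floor are assumptions, not part of the original record):

import boto3

IMAGE_PATH = "photograph.jpg"  # hypothetical local file; not part of the record

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Request labels with a confidence floor comparable to the scores listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

# Print "Name Confidence" pairs, mirroring the tag list format used in this record.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")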

Clarifai
created on 2019-10-29

people 98.7
room 97
chair 96.8
furniture 95.8
adult 94.4
man 93.6
woman 92
seat 89.1
monochrome 88.3
indoors 87.8
one 84.4
model 83.7
hospital 83.5
wear 81.1
medicine 81
inside 80.5
fashion 80.3
exhibition 79.9
mirror 79.5
art 77.6

Imagga
created on 2019-10-29

crutch 27
musical instrument 22
staff 20.9
stick 19.8
man 19.5
brass 18.4
people 15.6
interior 15
device 14.6
lifestyle 13.7
male 13.5
trombone 13.2
wind instrument 12.9
adult 12.3
percussion instrument 11.4
portrait 11
indoor 10.9
business 10.9
equipment 10.9
black 10.8
sport 10.7
person 10.6
room 10.3
building 10.3
active 10
modern 9.8
fashion 9.8
furniture 9.8
chime 9.5
hair 9.5
glass 9.5
chair 9.1
women 8.7
club 8.5
professional 8.4
support 8.4
old 8.4
house 8.3
leisure 8.3
style 8.2
home 8
job 8
scale 7.9
indoors 7.9
life 7.8
sitting 7.7
wall 7.7
lamp 7.6
exercise device 7.6
window 7.5
floor 7.4
exercise 7.3
dress 7.2
working 7.1
exercise bike 7

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 33-49
Gender Male, 51.1%
Surprised 45.7%
Disgusted 45%
Happy 45.1%
Confused 45.3%
Angry 45.3%
Sad 48.5%
Fear 45.4%
Calm 49.7%
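
The age range, gender, and emotion scores above correspond to AWS Rekognition face detection output. A minimal sketch of retrieving them, under the same assumptions as the earlier sketch:

import boto3

IMAGE_PATH = "photograph.jpg"  # hypothetical local file; not part of the record

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Request the full attribute set so age range, gender, and emotions are returned.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")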

Feature analysis

Amazon

Person 97.1%

Captions

Text analysis

Amazon

2
37
032MA
2 U
W
U
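
The strings above are raw text detections from the image. A minimal sketch of reproducing them with AWS Rekognition's text detection, again assuming a local image file and configured credentials:

import boto3

IMAGE_PATH = "photograph.jpg"  # hypothetical local file; not part of the record

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE-level detections roughly correspond to the short strings listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])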

Google

MJHYTRA2
37 MJHYTRA2 032MA
37
032MA