Human Generated Data

Title

Untitled (Dr. Herman M. Juergens, with nurses and other doctors; driving a car)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.483

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Art 100
Collage 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 97.9
Male 97.9
Man 97.9
Person 97.9
Person 97.7
Chart 97.5
Plot 97.5
Person 97.2
Person 97.2
Person 96.9
Adult 96.7
Male 96.7
Man 96.7
Person 96.7
Person 96.5
Person 96.4
Person 94
Person 91
Person 87
Clothing 76.1
Hat 76.1
Head 67.3
Face 63.6
Helmet 57
Text 55.6

Clarifai
created on 2018-10-06

people 99.4
adult 98.3
room 97.5
man 97.2
one 93.5
furniture 92.7
woman 92
chair 91.8
television 90.2
two 89.4
indoors 87.3
wear 86.6
technology 85.8
business 85.2
desk 84.9
desktop 83.7
monochrome 83
sit 82.7
paper 82.7
office 80.6

Imagga
created on 2018-10-06

computer 28.4
man 24.3
work 22.8
business 22.5
office 21.9
technology 21.5
people 20.6
equipment 20.3
male 19.9
professional 18.2
working 17.7
medical 17.6
laptop 17.5
person 17.3
device 16.9
men 14.6
microscope 14.4
worker 14.2
laboratory 13.5
adult 13.2
job 12.4
modern 11.9
businessman 11.5
medicine 11.4
happy 11.3
monitor 10.8
room 10.7
hand 10.6
interior 10.6
success 10.5
biology 10.4
doctor 10.3
finance 10.1
occupation 10.1
businesswoman 10
smile 10
scientist 9.8
home 9.6
service 9.6
research 9.5
hospital 9.5
corporate 9.4
network 9.3
table 9.3
indoors 8.8
lab 8.7
smiling 8.7
paper 8.6
keyboard 8.6
sitting 8.6
businesspeople 8.5
industry 8.5
portrait 8.4
desk 8.4
human 8.2
student 8.1
suit 8.1
science 8
women 7.9
researcher 7.9
education 7.8
chemistry 7.7
attractive 7.7
floor 7.4
support 7.3
lifestyle 7.2
team 7.2
information 7.1
copy 7.1
instrument 7.1

Google
created on 2018-10-06

Microsoft
created on 2018-10-06

indoor 92.4

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Female, 75.6%
Sad 100%
Calm 7.4%
Surprised 6.4%
Fear 6%
Happy 1.3%
Disgusted 1%
Angry 0.8%
Confused 0.7%

AWS Rekognition

Age 25-35
Gender Male, 82.3%
Calm 86.3%
Surprised 7%
Fear 6.1%
Sad 4%
Happy 2.5%
Disgusted 2.2%
Angry 2.1%
Confused 0.6%

AWS Rekognition

Age 21-29
Gender Male, 95.4%
Happy 44.6%
Fear 14.6%
Calm 13.4%
Surprised 10.1%
Angry 8.4%
Sad 8.1%
Disgusted 2%
Confused 0.7%

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Hat 76.1%
Helmet 57%

Captions

Microsoft
created on 2018-10-06

a black and white photo of a man 51%
a man standing in a room 50.9%
a photo of a man 50.8%

Text analysis

Amazon

12
KODAK
10
11A
FILM
12A
11
PAN
SAFETY
10A
>7A
RODAK
>9A
8
>8A
PAN RAILS
->9
TA
27
RAILS

Google

→ 11A → 12 →12A
11A
12
12A