Human Generated Data

Title

Untitled (Dr. Herman M. Juergens taking patient's blood pressure)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.493.4

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99.7
Person 99.7
Person 99.7
Face 79.7
Wood 76.3
Plywood 71.6
Clothing 71
Apparel 71
Finger 68
Sitting 63.3
Photography 61.4
Photo 61.4
Portrait 61.4
Flooring 56.6
Worker 55.8
Brick 55.7
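The label/score pairs above can be post-processed programmatically, for example by keeping only labels above a confidence threshold. A minimal sketch, assuming the tags are loaded as (label, confidence-percent) tuples; the 70% threshold is an arbitrary illustration, not part of the record:

```python
# Hypothetical subset of the Amazon Rekognition labels listed above,
# as (label, confidence-percent) pairs.
labels = [
    ("Human", 99.7), ("Person", 99.7), ("Face", 79.7),
    ("Wood", 76.3), ("Plywood", 71.6), ("Clothing", 71.0),
    ("Sitting", 63.3), ("Brick", 55.7),
]

def confident_labels(pairs, threshold=70.0):
    """Keep labels at or above the given confidence threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_labels(labels))
# → ['Human', 'Person', 'Face', 'Wood', 'Plywood', 'Clothing']
```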

Clarifai
created on 2019-08-09

people 99.8
adult 98.8
two 95
man 94.8
woman 92.7
group 92.6
one 92.6
wear 91.4
room 90.4
group together 89.2
furniture 85.3
leader 84.5
administration 84.3
concentration 82.6
indoors 81.3
three 80.1
actor 79.4
outfit 76.7
war 75.3
education 73.4

Imagga
created on 2019-08-09

barbershop 32.4
man 30.9
shop 27.7
people 25.6
person 25.1
male 23.4
adult 19.8
mercantile establishment 19.7
room 17
table 16.7
black 14.4
men 13.7
indoors 13.2
place of business 13.1
lifestyle 13
sax 12.7
glass 12.7
work 12.5
job 12.4
worker 11.8
hairdresser 11.8
business 11.5
professional 11.4
happy 11.3
casual 11
portrait 11
life 10.9
wine 10.8
musical instrument 10.7
businessman 10.6
looking 10.4
home 10.4
sitting 10.3
mature 10.2
chair 10.1
smile 10
music 9.9
brass 9.7
interior 9.7
restaurant 9.4
wind instrument 9.4
equipment 9.2
indoor 9.1
modern 9.1
fashion 9
musician 9
working 8.8
urban 8.7
couple 8.7
love 8.7
tool 8.5
face 8.5
hand 8.3
office 8.3
holding 8.2
human 8.2
group 8.1
sexy 8
romantic 8
teacher 8
alcohol 7.9
concert 7.8
party 7.7
dark 7.5
device 7.4
inside 7.4
vintage 7.3
back 7.3
occupation 7.3
dress 7.2
celebration 7.2
happiness 7
look 7

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

wall 97.3
person 96.9
black and white 95.5
man 94
indoor 88.2
text 87.5
clothing 78.6
table 60.8
furniture 59.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Female, 60.9%
Fear 0.1%
Surprised 0.5%
Disgusted 0.1%
Happy 0.4%
Sad 11.5%
Calm 86.8%
Angry 0.3%
Confused 0.4%

AWS Rekognition

Age 48-66
Gender Male, 95.4%
Sad 5.9%
Calm 90.8%
Disgusted 0.2%
Confused 0.6%
Fear 0.3%
Angry 0.3%
Surprised 1.6%
Happy 0.4%
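Each face record above reports a confidence score per emotion; the dominant emotion is simply the highest-scoring entry. A minimal sketch, assuming the scores for the second face are loaded into a dictionary:

```python
# Emotion scores (percent) from the second AWS Rekognition face above.
emotions = {
    "Sad": 5.9, "Calm": 90.8, "Disgusted": 0.2, "Confused": 0.6,
    "Fear": 0.3, "Angry": 0.3, "Surprised": 1.6, "Happy": 0.4,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # → Calm
```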

Feature analysis

Amazon

Person 99.7%

Categories