Human Generated Data

Title

Untitled (Dr. Herman M. Juergens and patient talking in exam room)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.509.6

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99.8
Person 99.8
Person 99.6
Chair 86.5
Furniture 86.5
Apparel 80.2
Clothing 80.2
Finger 73.4
Sitting 58.3

Clarifai
created on 2019-08-09

people 99.9
adult 99
administration 98.6
man 98.5
leader 97
group 96.6
two 95.1
war 94.5
group together 94.5
portrait 89.7
three 88.6
furniture 88.2
room 88
elderly 87.9
medical practitioner 87.4
wear 86.4
actor 85.8
military 85.8
concentration 83.9
chair 80.5

Imagga
created on 2019-08-09

hairdresser 48.6
patient 47.9
man 42.3
person 41.8
barbershop 33.1
male 32.7
people 30.7
shop 29.8
hospital 25.7
surgeon 25.7
medical 25.6
work 22.8
nurse 22.7
doctor 22.6
indoors 21.9
professional 21.5
men 21.5
sick person 21.1
case 20.7
mercantile establishment 19.9
medicine 19.4
health 18.8
worker 17.8
senior 17.8
working 17.7
adult 17.4
room 16.8
occupation 16.5
job 15.9
home 15.1
illness 14.3
equipment 14.3
happy 13.8
surgery 13.7
office 13.6
place of business 13.5
profession 13.4
portrait 12.9
chair 12.9
clinic 12.8
smiling 12.3
sitting 12
inside 12
care 11.5
team 10.7
uniform 10.7
computer 10.4
business 10.3
industry 10.2
operation 9.9
exam 9.6
specialist 9.5
women 9.5
smile 9.3
phone 9.2
treatment 9.2
surgical 8.9
mask 8.7
retired 8.7
retirement 8.6
elderly 8.6
instrument 8.6
coat 8.6
mature 8.4
old 8.4
hand 8.4
industrial 8.2
technology 8.2
lifestyle 7.9
40s 7.8
assistant 7.8
sick 7.7
emergency 7.7
pain 7.7
horizontal 7.5
holding 7.4
looking 7.2
family 7.1
interior 7.1
together 7

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

person 99.8
man 98.6
black and white 90.2
indoor 89
text 84.8
clothing 84.5
human face 83.2
older 36.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-52
Gender Male, 96.3%
Surprised 1.1%
Confused 0.3%
Disgusted 0.4%
Calm 90.2%
Fear 0.1%
Angry 0.2%
Sad 5.8%
Happy 2%

Feature analysis

Amazon

Person 99.8%
Chair 86.5%

Captions

Text analysis

Google

RLAX