Human Generated Data

Title

Untitled (Dr. Herman M. Juergens talking with patient in exam room)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.485.6

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Person 99.1
Human 99.1
Person 98.2
Clothing 95.3
Apparel 95.3
Accessory 67.8
Glasses 67.8
Accessories 67.8
Face 67
Electronics 63.7
Screen 63.7
Coat 59.2
Suit 58.8
Overcoat 58.8
Leisure Activities 58.6

Clarifai
created on 2019-08-09

people 99.5
one 98.7
adult 98.1
scientist 97.3
man 97.2
monochrome 96.2
two 90.9
portrait 90.8
science 89.3
wear 89.1
indoors 88.9
woman 88.6
concentration 87.7
furniture 83.5
gloves 83.2
vehicle 82.8
medical practitioner 81.5
research 80.8
medicine 78.9
chair 77.7

Imagga
created on 2019-08-09

astronaut 84.7
man 34.9
people 26.8
person 25.5
professional 25
male 24.1
medical 23
mask 22.3
worker 22.2
equipment 19
medicine 18.5
work 18.1
doctor 17.9
negative 17.1
health 16.7
technology 16.3
surgeon 14.9
patient 14.9
film 13.5
research 13.3
men 12.9
helmet 12.8
hospital 12.7
team 12.5
science 12.5
instrument 12.1
uniform 12.1
scientist 11.8
nurse 11.6
laboratory 11.6
job 11.5
working 11.5
development 11.4
biology 11.4
adult 11.3
room 11.1
safety 11
occupation 11
surgery 10.7
chemistry 10.6
chemical 10.6
human 10.5
photographic paper 10.4
industry 10.2
industrial 10
lab 9.7
assistant 9.7
scientific 9.7
profession 9.6
weapon 9.4
hand 9.1
protection 9.1
danger 9.1
care 9.1
technician 8.8
illness 8.6
black 8.4
clinic 8.1
suit 8.1
surgical 7.9
operation 7.9
microscope 7.9
coat 7.9
emergency 7.7
modern 7.7
skill 7.7
test 7.7
war 7.7
sport 7.5
clothing 7.4
body 7.2
businessman 7.1

Microsoft
created on 2019-08-09

person 97.4
black and white 93.5
man 91.5
text 91.3
clothing 87.9
monochrome 59.9

Face analysis

Amazon

AWS Rekognition

Age 16-28
Gender Female, 60.9%
Surprised 6.8%
Confused 1.4%
Disgusted 0.5%
Calm 81.4%
Happy 0.1%
Fear 0.7%
Sad 8.2%
Angry 1%

AWS Rekognition

Age 29-45
Gender Female, 62.9%
Confused 3.2%
Calm 18.9%
Surprised 7.9%
Angry 18.9%
Sad 13.7%
Disgusted 1.5%
Happy 9.3%
Fear 26.6%

Feature analysis

Amazon

Person 99.1%
Glasses 67.8%

Captions

Microsoft
created on 2019-08-09

a man playing a guitar 31.8%
a man holding a gun 31.7%
a group of people around each other 31.6%

Text analysis

Google

TaCCINE