Human Generated Data

Title

Untitled (soldier outside house with Vietnamese man and two children, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American (1945-1984)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.185.5

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.3
Human 99.3
Person 96.6
Person 95.5
Clinic 88.8
Clothing 83
Apparel 83
Person 75.3
Face 61.7
Sleeve 55
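
The labels above pair a tag name with a 0-100 confidence score. A minimal sketch of how such pairs could be produced with the Amazon Rekognition DetectLabels API follows; the image file name, region, and thresholds are assumptions for illustration, not details recorded for this object.

```python
# Hypothetical sketch: retrieving label/confidence pairs like those listed above
# via Amazon Rekognition. File name, region, and thresholds are assumed values.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2007.184.2.185.5.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # assumed cap on returned labels
    MinConfidence=50.0,  # assumed floor, roughly matching the lowest score shown
)

# Each label carries a name and a 0-100 confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```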

Clarifai
created on 2023-10-22

people 99.8
two 98.9
vehicle 98.9
adult 97.6
man 97.5
monochrome 97.3
three 96
transportation system 95.4
group together 94.4
group 91.7
one 90.4
wear 90.4
child 90.3
four 90.1
aircraft 88.5
watercraft 88.4
military 88
war 85.4
medical practitioner 80.7
car 80.2

Imagga
created on 2021-12-14

surgeon 69.9
brass 43.7
trombone 41.9
wind instrument 32.7
man 29.6
patient 25.8
person 25.4
musical instrument 22.9
male 22
work 22
medical 20.3
men 18
people 17.3
hospital 17.1
doctor 16.9
mask 15.8
professional 15.3
laboratory 14.5
medicine 14.1
nurse 13.6
science 13.3
research 13.3
case 13.2
health 13.2
adult 13
scientist 12.7
sick person 12.5
equipment 12.3
technology 11.9
surgery 11.7
lab 11.7
chemistry 11.6
worker 11.6
biology 11.4
face 11.4
room 10.9
black 10.8
hand 10.6
chemical 10.6
uniform 10.6
operation 9.8
device 9.8
cornet 9.7
test 9.6
holding 9.1
portrait 9.1
surgical 8.9
experiment 8.8
smiling 8.7
emergency 8.7
profession 8.6
car 8.6
illness 8.6
industry 8.5
occupation 8.2
care 8.2
protection 8.2
instrument 8.1
looking 8
job 8
business 7.9
sterile 7.9
military 7.7
microscope 7.7
exam 7.7
human 7.5
gun 7.5
study 7.5
safety 7.4
industrial 7.3
student 7.2
lifestyle 7.2
home 7.2
transportation 7.2

Microsoft
created on 2021-12-14

person 91.4
text 88.7
black and white 62.3
vehicle 59.2
clothing 58.5

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 46-64
Gender Female, 81.6%
Calm 92.3%
Happy 3%
Sad 2.7%
Confused 1%
Angry 0.5%
Disgusted 0.3%
Surprised 0.2%
Fear 0.1%
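
The age range, gender estimate, and emotion percentages above follow the shape of an Amazon Rekognition DetectFaces response with all facial attributes requested. A minimal sketch, again assuming a local copy of the image and an arbitrary region:

```python
# Hypothetical sketch: age range, gender, and emotion estimates as returned by
# Amazon Rekognition DetectFaces with Attributes=["ALL"]. Inputs are assumed.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2007.184.2.185.5.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are returned unordered; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```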

Feature Analysis

Amazon

Person 99.3%

Captions