Human Generated Data

Title

Untitled (three men and woman at party)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20026

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Clothing 98.6
Apparel 98.6
Person 97.9
Person 97.9
Person 97.4
Accessories 96.8
Tie 96.8
Accessory 96.8
Sunglasses 84.6
Door 79.7
Coat 71.4
Elevator 70.8
Face 64.9
Sleeve 64.3
Overcoat 63.4
Suit 56.1
Hair 55.7
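
These label/confidence pairs are consistent with the output of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such a list could be produced with boto3 (the image file name and the MinConfidence threshold are assumptions, not part of the record):

import boto3

# Credentials come from the standard AWS configuration chain;
# the file name is a placeholder for a local copy of the photograph.
rekognition = boto3.client("rekognition")

with open("4.2002.20026.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only report labels at 50% confidence or above
    )

# Print name/confidence pairs in the same "Label 99.2" style used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")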

Imagga
created on 2022-03-05

nurse 64.7
lab coat 60.3
coat 53.4
medical 44.1
doctor 42.3
surgeon 37.7
person 36.8
man 36.3
patient 36.1
hospital 34.3
people 31.8
professional 28.9
health 27.8
male 27.7
medicine 27.3
adult 25.7
clinic 24.3
specialist 23
garment 22.3
men 19.7
worker 19.6
occupation 19.2
work 18.8
care 18.1
working 17.7
scientist 17.6
laboratory 17.4
profession 17.2
uniform 17.2
clothing 17.1
surgery 16.6
lab 16.5
illness 16.2
happy 15.7
mask 15.6
office 15.3
team 15.2
research 15.2
biology 15.2
science 15.1
room 14.6
women 14.2
senior 14.1
looking 13.6
portrait 13.6
chemistry 13.5
indoors 13.2
equipment 13.1
case 13
business 12.8
assistant 12.6
exam 12.4
job 12.4
smiling 12.3
sick person 12.2
face 12.1
microscope 11.8
test 11.5
corporate 11.2
stethoscope 11
surgical 10.8
physician 10.7
scientific 10.7
chemical 10.6
businessman 10.6
human 10.5
instrument 10.4
holding 9.9
sick 9.7
emergency 9.6
couple 9.6
education 9.5
development 9.5
happiness 9.4
smile 9.3
student 9.2
confident 9.1
attractive 9.1
researcher 8.9
operation 8.9
doctors 8.8
technician 8.8
40s 8.8
disease 8.7
lifestyle 8.7
elderly 8.6
executive 8.4
study 8.4
teamwork 8.3
treatment 8.3
technology 8.2
scrubs 7.9
sterile 7.9
bright 7.9
experiment 7.8
optical 7.8
sitting 7.7
casual 7.6
to 7.1
modern 7
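
Imagga serves its tagging model through a REST endpoint. A hedged sketch of retrieving tags like those above with the requests library (the API key/secret placeholders and the image path are assumptions):

import requests

IMAGGA_KEY = "your_api_key"        # placeholder credentials
IMAGGA_SECRET = "your_api_secret"

# Upload the image to the /v2/tags endpoint with HTTP basic auth.
with open("4.2002.20026.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )

# Each entry carries an English tag and a confidence score, e.g. "nurse 64.7".
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")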

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.6
clothing 94.6
text 88.3
indoor 86.4
man 83.8
human face 81.6
smile 65
black and white 53.6

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 52.1%
Happy 90.3%
Sad 6.9%
Calm 0.9%
Confused 0.7%
Fear 0.5%
Disgusted 0.3%
Surprised 0.2%
Angry 0.2%

AWS Rekognition

Age 48-56
Gender Male, 98.3%
Calm 99.1%
Sad 0.5%
Surprised 0.1%
Confused 0.1%
Happy 0.1%
Disgusted 0%
Angry 0%
Fear 0%
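
The age range, gender, and emotion breakdowns above match the shape of Rekognition's DetectFaces response when the full attribute set is requested. A minimal sketch under the same assumptions as the DetectLabels example:

import boto3

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] asks for age, gender, and emotion estimates.
with open("4.2002.20026.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; order them to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")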

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
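
The per-face likelihood ratings (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) correspond to Google Cloud Vision face detection annotations; one block is reported per detected face. A sketch using the google-cloud-vision client library (the image path is an assumption):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.20026.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY or LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)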

Feature analysis

Amazon

Person 99.2%
Tie 96.8%
Sunglasses 84.6%

Captions

Microsoft

a group of people standing in front of a mirror posing for the camera 76.1%
a group of people standing in front of a mirror 76%
a person standing in front of a mirror posing for the camera 70.4%
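
Both the Microsoft tag list above and these captions resemble the output of the Azure Computer Vision Analyze Image operation with the Tags and Description features requested. A hedged REST sketch (the endpoint, key, API version, and image path are assumptions):

import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "your_subscription_key"                                     # placeholder

with open("4.2002.20026.jpg", "rb") as f:
    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={
            "Ocp-Apim-Subscription-Key": AZURE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

analysis = response.json()
# Confidences are returned in the 0-1 range; scale to match the listing above.
for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
for caption in analysis["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")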

Text analysis

Amazon

DO
KODVK-VEELA
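
Text fragments like these correspond to the DetectedText field of Rekognition's DetectText response. A minimal sketch, under the same assumptions as the earlier Rekognition calls:

import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.20026.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Print each detected line of text, e.g. "KODVK-VEELA".
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])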

Google

260
VEELA
260 KODVK VEELA
KODVK