Human Generated Data

Title

Untitled (men gathered around drafting table)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20154

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99
Human 99
Person 98.3
Person 97.9
Clothing 97.7
Apparel 97.7
Person 97.1
Person 93.9
Accessory 91.3
Tie 91.3
Accessories 91.3
Person 89.6
Clinic 85.5
Coat 83.4
Musical Instrument 82
Piano 82
Leisure Activities 82
Person 76.5
Face 68.8
Lab Coat 66.4
Scientist 65.9
Photo 62.2
Photography 62.2
Portrait 61.3
Overcoat 59.2
Hospital 58.9
Lab 57.8
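
The Amazon tag list above pairs each label with a confidence score on a 0–100 scale (the "Feature analysis" section later keeps only the highest-confidence labels). A minimal sketch of how such `label score` lines can be parsed and threshold-filtered in Python — the parsing format and the 90-point cutoff are illustrative assumptions, not part of the source data:

```python
# Parse "label score" lines into (label, confidence) pairs and
# keep only labels at or above a confidence threshold.

def parse_tags(text):
    """Parse lines like 'Person 99' or 'Lab Coat 66.4' into pairs."""
    tags = []
    for line in text.strip().splitlines():
        # The score is the last whitespace-separated token; the label
        # may itself contain spaces (e.g. "Lab Coat").
        label, score = line.rsplit(None, 1)
        tags.append((label, float(score)))
    return tags

def filter_tags(tags, threshold=90.0):
    """Keep only (label, confidence) pairs at or above the threshold.

    The threshold of 90 is an arbitrary illustrative choice.
    """
    return [(label, c) for label, c in tags if c >= threshold]

sample = """Person 99
Human 99
Clothing 97.7
Lab Coat 66.4
Scientist 65.9"""

high_confidence = filter_tags(parse_tags(sample))
# high_confidence → [('Person', 99.0), ('Human', 99.0), ('Clothing', 97.7)]
```
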

Imagga
created on 2022-03-05

surgeon 71.3
man 47.7
medical 38
doctor 34.8
male 34.8
coat 34.8
people 34
lab coat 32.8
specialist 32.7
professional 31
person 30.4
work 29.8
hospital 29.1
nurse 27.9
patient 27.7
clinic 24.1
working 23.9
health 22.9
job 22.1
medicine 21.2
indoors 21.1
laboratory 20.3
sitting 19.8
worker 19.7
office 19.5
adult 19.5
lab 19.4
happy 19.4
men 18.9
team 18.8
scientist 18.6
occupation 18.3
smiling 18.1
barbershop 17.6
shop 17.4
business 17
computer 16.8
care 16.5
research 15.2
biology 14.2
businessman 14.1
senior 14.1
teamwork 13.9
table 13.9
looking 13.6
chemistry 13.5
science 13.4
assistant 12.6
test 12.5
desk 12.3
room 12.1
mature 12.1
technology 11.9
chemical 11.8
student 11.8
colleagues 11.7
practitioner 11.6
profession 11.5
illness 11.4
casual 11
portrait 11
laptop 10.9
mercantile establishment 10.9
30s 10.6
color 10.6
together 10.5
uniform 10.5
home 10.4
women 10.3
treatment 10.1
garment 10
holding 9.9
sterile 9.8
modern 9.8
surgery 9.8
physician 9.8
experiment 9.8
exam 9.6
couple 9.6
education 9.5
businesspeople 9.5
lifestyle 9.4
study 9.3
clothing 9
microscope 8.9
case 8.9
doctors 8.9
50s 8.8
cure 8.8
40s 8.8
scientific 8.7
mid adult 8.7
bright 8.6
talking 8.6
meeting 8.5
hand 8.4
inside 8.3
20s 8.3
indoor 8.2
confident 8.2
businesswoman 8.2
suit 8.1
group 8.1
operation 7.9
smile 7.8
face 7.8
middle aged 7.8
sick person 7.7
instrument 7.7
notes 7.7
serious 7.6
two 7.6
human 7.5
cheerful 7.3
place of business 7.2
kitchen 7.2
to 7.1
day 7.1
look 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.6
text 97.3
clothing 95.5
black and white 76
man 72.8

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 83.8%
Calm 95.5%
Sad 1.9%
Happy 0.7%
Surprised 0.7%
Disgusted 0.5%
Confused 0.4%
Angry 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Tie 91.3%
Piano 82%

Captions

Microsoft

a group of people standing next to a window 79.1%
a group of people standing in front of a window 77.7%
a group of men standing next to a window 71.5%

Text analysis

Amazon

KODVR-EVEEIA
the
or
ALL