Human Generated Data

Title

Untitled (guests at cocktail party seated around small table)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20119

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.5
Human 98.5
Chair 98.3
Furniture 98.3
Person 98.2
Person 97.8
Person 97.5
Person 97.5
Person 96.8
Person 96.5
Shoe 94.7
Footwear 94.7
Clothing 94.7
Apparel 94.7
Person 92.9
Person 92.2
Chair 92
Sitting 91.9
Person 88.6
Person 86.6
Accessories 75.9
Accessory 75.9
Sunglasses 75.9
Clinic 72.1
Crowd 70.6
People 66.1
Chair 64.9
Indoors 61.8
Sunglasses 61.8
Room 59.8
Audience 59.3
Table 58.2
Female 57.5
Chair 51.7
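
Labels such as "Person" and "Chair" repeat in the list above because Amazon Rekognition reports one entry per detected instance, each with its own confidence. A minimal sketch of collapsing those per-instance entries into unique labels (the pairs below are hand-transcribed from a subset of the list, not the actual Rekognition API response format):

```python
from collections import defaultdict

# (label, confidence) pairs transcribed from the Amazon tag list above;
# repeated labels correspond to separate detected instances.
detections = [
    ("Person", 98.5), ("Chair", 98.3), ("Person", 98.2), ("Person", 97.8),
    ("Person", 97.5), ("Person", 97.5), ("Person", 96.8), ("Person", 96.5),
    ("Shoe", 94.7), ("Person", 92.9), ("Person", 92.2), ("Chair", 92.0),
    ("Person", 88.6), ("Person", 86.6), ("Sunglasses", 75.9), ("Chair", 64.9),
    ("Chair", 51.7),
]

def summarize(dets):
    """Collapse per-instance detections into {label: (count, top confidence)}."""
    summary = defaultdict(lambda: (0, 0.0))
    for label, conf in dets:
        count, best = summary[label]
        summary[label] = (count + 1, max(best, conf))
    return dict(summary)

summary = summarize(detections)
print(summary["Person"])  # (11, 98.5)
print(summary["Chair"])   # (4, 98.3)
```

This matches the "Feature analysis" section further below, which lists each label once at its highest confidence.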

Imagga
created on 2022-03-05

person 55.4
patient 51.5
man 46.4
male 36.9
people 35.2
case 34.8
sick person 33.6
office 30.1
businessman 30
business 29.8
room 29.1
colleagues 26.2
meeting 25.4
professional 24.9
adult 24.8
teacher 23.5
working 23
indoors 22.8
businesswoman 22.7
hospital 22.4
smiling 21.7
nurse 21.4
group 20.2
team 19.7
work 19.6
happy 19.4
classroom 19.1
table 19.1
desk 18.9
education 18.2
businesspeople 18
communication 17.6
teamwork 17.6
women 17.4
corporate 17.2
sitting 17.2
men 17.2
talking 17.1
medical 16.8
job 15.9
discussion 15.6
clinic 15.4
doctor 15
mature 14.9
executive 14.3
educator 14.2
worker 14
life 13.7
40s 13.6
computer 13.6
suit 13.5
30s 13.5
color 13.4
together 13.1
lifestyle 13
modern 12.6
cheerful 12.2
occupation 11.9
indoor 11.9
casual 11.9
coworkers 11.8
board 11.8
four 11.5
interior 11.5
restaurant 11.3
student 11.2
cafeteria 11.1
chair 11.1
health 11.1
associates 10.8
smile 10.7
mid adult 10.6
couple 10.5
building 10.4
discussing 9.8
conference 9.8
portrait 9.7
diversity 9.6
exam 9.6
home 9.6
adults 9.5
happiness 9.4
senior 9.4
manager 9.3
presentation 9.3
care 9.1
success 8.9
50s 8.8
two people 8.8
class 8.7
college 8.5
learning 8.5
camera 8.3
phone 8.3
holding 8.3
laptop 8.2
to 8
medicine 7.9
boardroom 7.9
day 7.8
standing 7.8
teaching 7.8
middle aged 7.8
thirties 7.8
angle 7.7
surgeon 7.7
clothes 7.5
training 7.4
looking 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 96.9
clothing 94.7
furniture 90.8
chair 89.9
table 89.7
text 88.4
indoor 85.1
footwear 81
woman 74.1
man 71.4
black and white 66.6

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Calm 99.9%
Surprised 0.1%
Happy 0%
Sad 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 99.8%
Calm 99.4%
Surprised 0.2%
Sad 0.1%
Happy 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 78.1%
Sad 40.7%
Confused 21.4%
Happy 9.8%
Fear 7.9%
Calm 6.1%
Angry 5.9%
Surprised 4.1%
Disgusted 4.1%

AWS Rekognition

Age 35-43
Gender Male, 95.6%
Sad 56.3%
Happy 31.6%
Fear 7.5%
Confused 1.7%
Disgusted 1.5%
Angry 0.7%
Surprised 0.4%
Calm 0.2%

AWS Rekognition

Age 52-60
Gender Male, 98.8%
Calm 97.9%
Sad 0.7%
Confused 0.6%
Disgusted 0.3%
Happy 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0%
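
Each AWS Rekognition face block above ranks eight emotions by confidence. Note that the top emotion is not always decisive: the third face's leading score ("Sad" at 40.7%) is far weaker than the first face's ("Calm" at 99.9%). A small sketch of picking the dominant emotion and flagging low-agreement faces (scores hand-transcribed from the blocks above; the 50% cutoff is an illustrative choice, not an API parameter):

```python
# Leading emotion scores (percent) for the five faces reported above.
faces = [
    {"Calm": 99.9, "Surprised": 0.1},
    {"Calm": 99.4, "Surprised": 0.2, "Sad": 0.1},
    {"Sad": 40.7, "Confused": 21.4, "Happy": 9.8, "Fear": 7.9, "Calm": 6.1},
    {"Sad": 56.3, "Happy": 31.6, "Fear": 7.5},
    {"Calm": 97.9, "Sad": 0.7, "Confused": 0.6},
]

def dominant_emotion(scores, confident_above=50.0):
    """Return the top-scoring emotion and whether it clears the cutoff."""
    emotion, score = max(scores.items(), key=lambda kv: kv[1])
    return emotion, score, score > confident_above

print(dominant_emotion(faces[0]))  # ('Calm', 99.9, True)
print(dominant_emotion(faces[2]))  # ('Sad', 40.7, False)
```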

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
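
Unlike Rekognition's percentages, Google Vision reports face attributes on an ordinal likelihood scale. A sketch of comparing those verdicts by mapping the scale to integers, using the "Blurred" results from the ten blocks above (the list below is hand-transcribed in order; the scale follows Vision's documented likelihood levels):

```python
# Google Vision's ordinal likelihood scale, mapped to ranks for comparison.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]
RANK = {name: i for i, name in enumerate(LIKELIHOOD)}

# "Blurred" verdicts for the ten faces above, in order of appearance.
blurred = ["Very unlikely", "Very unlikely", "Likely", "Very unlikely", "Possible",
           "Very unlikely", "Likely", "Very unlikely", "Very unlikely", "Very unlikely"]

def at_least(verdict, threshold):
    """True when an ordinal verdict meets or exceeds a threshold on the scale."""
    return RANK[verdict] >= RANK[threshold]

# Indices of faces rated at least "Possible" for blur.
flagged = [i for i, v in enumerate(blurred) if at_least(v, "Possible")]
print(flagged)  # [2, 4, 6]
```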

Feature analysis

Amazon

Person 98.5%
Chair 98.3%
Shoe 94.7%
Sunglasses 75.9%

Captions

Microsoft

a group of people sitting at a desk 91%
a group of people sitting at a table 83%
a group of people sitting in chairs 82.9%

Text analysis

Amazon

EAD
KODAK-E.VEELA

Google

YT37A°2-XAGO
YT37A°2-XAGO