Human Generated Data

Title

Untitled (group at party)

Date

c. 1950

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1645

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99
Person 99
Person 98.4
Person 97.6
Person 97.4
Person 96.3
Clinic 95.9
Person 84.1
Person 83.5
Hospital 81.7
Operating Theatre 77
Clothing 77
Apparel 77
Doctor 71.5
Person 70.9
People 62
Coat 60.5
Face 58.4
Indoors 56.6
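The Amazon tag list above pairs each label with a confidence score, which matches the shape of an AWS Rekognition DetectLabels response. A minimal sketch of how such a list could be produced and flattened follows; the sample response is a hypothetical stub mirroring the API's documented shape, not the museum's actual output.

```python
def format_labels(response):
    """Flatten a DetectLabels-style response into 'Name Confidence' lines."""
    return [
        f"{label['Name']} {round(label['Confidence'], 1)}"
        for label in response.get("Labels", [])
    ]

# The real call needs AWS credentials (shape per the boto3 docs):
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(Image={"Bytes": image_bytes})

# Hypothetical stub standing in for a live response:
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.0},
        {"Name": "Clinic", "Confidence": 95.9},
    ]
}

print(format_labels(sample_response))
```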

Imagga
created on 2021-12-14

man 39.7
person 36.3
people 36.3
adult 34.2
male 33.3
professional 33.2
senior 30
indoors 29
office 25.2
teacher 25
smiling 23.9
mature 23.2
computer 23.2
business 23.1
nurse 22.8
happy 22.6
sitting 22.3
patient 22.3
businessman 22.1
men 20.6
table 19.9
meeting 18.8
work 18.8
team 18.8
home 18.3
laptop 18.2
teamwork 17.6
couple 17.4
educator 17.4
worker 17
group 16.9
working 16.8
medical 16.8
elderly 16.3
executive 16.3
room 16
looking 16
women 15.8
portrait 15.5
casual 15.3
businesspeople 15.2
lifestyle 15.2
desk 15.1
doctor 14.1
together 14
pensioner 14
indoor 13.7
hospital 13.7
businesswoman 13.6
smile 13.5
retirement 13.4
technology 13.4
modern 13.3
job 13.3
specialist 13
camera 12.9
colleagues 12.6
happiness 12.5
talking 12.4
old 11.8
day 11.8
horizontal 11.7
60s 11.7
color 11.7
mid adult 11.6
face 11.4
education 11.3
corporate 11.2
40s 10.7
handsome 10.7
two people 10.7
retired 10.7
cheerful 10.6
illness 10.5
health 10.4
manager 10.2
clothing 10.2
occupation 10.1
surgeon 9.9
case 9.8
coat 9.8
older 9.7
classroom 9.7
husband 9.5
clinic 9.2
suit 9
70s 8.9
medicine 8.8
sick person 8.8
30s 8.7
looking camera 8.7
life 8.6
communication 8.4
hand 8.4
to 8
student 7.9
lab coat 7.9
gray hair 7.9
coworkers 7.9
casual clothing 7.8
architect 7.7
workplace 7.6
two 7.6
age 7.6
adults 7.6
keyboard 7.5
human 7.5
holding 7.4
glasses 7.4
successful 7.3
alone 7.3
grandma 7.3
confident 7.3
success 7.2
aged 7.2
bright 7.1
monitor 7

Google
created on 2021-12-14

Black 89.6
Coat 88.9
Black-and-white 86.3
Style 84
Smile 82.2
Monochrome 78.4
Monochrome photography 78.4
Event 72
Room 70.7
Font 70.5
Vintage clothing 70.2
Window 65
Stock photography 64.9
Service 64.2
History 62.1
Team 60.8
Hat 60.5
Crew 60.3
Uniform 56.9
Advertising 52.5

Microsoft
created on 2021-12-14

text 97.8
clothing 94.7
person 93.9
man 86.6
smile 58.1
wedding dress 50.4
clothes 18.4

Face analysis


AWS Rekognition

Age 23-37
Gender Female, 85%
Happy 47.4%
Calm 21.4%
Sad 16%
Angry 6.4%
Surprised 3.4%
Fear 2.1%
Confused 1.8%
Disgusted 1.5%

AWS Rekognition

Age 30-46
Gender Male, 58.7%
Surprised 69.8%
Fear 25.9%
Happy 3.3%
Confused 0.3%
Angry 0.3%
Calm 0.2%
Sad 0.1%
Disgusted 0.1%

AWS Rekognition

Age 32-48
Gender Male, 88.7%
Calm 97.1%
Sad 1.3%
Happy 1%
Confused 0.2%
Angry 0.2%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-42
Gender Female, 82.3%
Happy 71.4%
Sad 18.4%
Calm 7.7%
Angry 0.7%
Confused 0.6%
Surprised 0.6%
Fear 0.4%
Disgusted 0.1%

AWS Rekognition

Age 28-44
Gender Male, 74.7%
Calm 44.1%
Sad 33.3%
Happy 19.3%
Confused 1.6%
Surprised 0.8%
Angry 0.4%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 47-65
Gender Male, 55.8%
Sad 68.7%
Calm 26%
Happy 3.7%
Angry 0.6%
Confused 0.5%
Fear 0.2%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Male, 89.3%
Calm 71.4%
Happy 13%
Sad 12.8%
Confused 1.2%
Angry 0.6%
Surprised 0.6%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-35
Gender Female, 56.8%
Happy 72.7%
Sad 20.6%
Calm 2%
Surprised 1.7%
Angry 1.3%
Fear 1%
Confused 0.6%
Disgusted 0.2%

AWS Rekognition

Age 33-49
Gender Female, 74.4%
Calm 92.8%
Happy 5.1%
Sad 1.6%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%
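Each AWS Rekognition face block above lists an age range, a gendered confidence, and emotion scores sorted from most to least likely, which corresponds to the `FaceDetails` entries returned by the DetectFaces API when called with `Attributes=['ALL']`. A sketch of formatting one such entry, using a hypothetical stub in place of a live response:

```python
def summarize_face(detail):
    """Format one DetectFaces FaceDetails entry into the lines shown above."""
    age = detail["AgeRange"]
    gender = detail["Gender"]
    # Sort emotions by descending confidence, as in the listings above.
    emotions = sorted(detail["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    lines = [
        f"Age {age['Low']}-{age['High']}",
        f"Gender {gender['Value']}, {round(gender['Confidence'], 1)}%",
    ]
    lines += [f"{e['Type'].capitalize()} {round(e['Confidence'], 1)}%"
              for e in emotions]
    return lines

# Hypothetical stub mirroring the documented FaceDetails shape:
sample_detail = {
    "AgeRange": {"Low": 23, "High": 37},
    "Gender": {"Value": "Female", "Confidence": 85.0},
    "Emotions": [
        {"Type": "CALM", "Confidence": 21.4},
        {"Type": "HAPPY", "Confidence": 47.4},
    ],
}

print(summarize_face(sample_detail))
```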

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a group of people posing for a photo 89.3%
a group of people posing for the camera 89.2%
a group of people posing for a picture 89.1%

Text analysis

Amazon

LIFE
HIGH LIFE
HIGH
Miller
KOOO
MUEYTE3A°2 KOOO
MUEYTE3A°2
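The Amazon text results above, including the partially garbled strings, are typical OCR output of the kind the AWS Rekognition DetectText API returns, where each detection is tagged as a `LINE` or a `WORD`. A sketch of pulling out the detected lines, again using a hypothetical stub rather than a live response:

```python
def extract_lines(response):
    """Collect the LINE-level detections from a DetectText-style response."""
    return [
        d["DetectedText"]
        for d in response.get("TextDetections", [])
        if d["Type"] == "LINE"
    ]

# Hypothetical stub mirroring the documented TextDetections shape;
# a real call would be client.detect_text(Image={"Bytes": image_bytes}).
sample_response = {
    "TextDetections": [
        {"DetectedText": "HIGH LIFE", "Type": "LINE", "Confidence": 95.0},
        {"DetectedText": "HIGH", "Type": "WORD", "Confidence": 96.0},
        {"DetectedText": "Miller", "Type": "LINE", "Confidence": 99.0},
    ]
}

print(extract_lines(sample_response))
```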

Google

Miller
HIGH
Miller HIGH LIFE
LIFE