Human Generated Data

Title

Untitled (guests at party, sitting)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19318

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.9
Human 99.9
Person 99.6
Apparel 98.9
Clothing 98.9
Person 98.4
Person 98.3
Person 98
Chair 88
Furniture 88
Dress 87.6
Female 87
Person 83.6
Woman 70.9
People 70.1
Accessories 64.1
Tie 64.1
Accessory 64.1
Crowd 64
Couch 58.9
Shorts 58.7
Wheel 58.6
Machine 58.6
Tie 57.8
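The Amazon entries above are confidence-scored (0-100) object and scene labels, presumably produced by the AWS Rekognition DetectLabels API, which is the service named elsewhere on this record. A minimal sketch of such a call, assuming Python with the boto3 SDK and a locally stored copy of the photograph (the file name is a placeholder):

import boto3

# Load the photograph as raw bytes; "burian_party.jpg" is a hypothetical file name.
with open("burian_party.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Ask for up to 25 labels and drop anything below 55% confidence,
# roughly matching the score range shown in the list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score, e.g. "Person 99.9".
    print(label["Name"], round(label["Confidence"], 1))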

Imagga
created on 2022-03-05

person 40.5
patient 36.5
people 34.6
man 26.9
nurse 25.8
adult 24.3
couple 20.9
male 20.6
men 19.7
case 19.5
professional 18.9
sick person 18.9
medical 18.5
women 17.4
teacher 17.4
hospital 16.5
groom 15.1
love 15
room 14.8
doctor 14.1
health 13.2
indoors 13.2
happy 13.2
senior 13.1
smiling 13
team 12.5
dress 11.7
bride 11.6
medicine 11.4
human 11.2
sitting 11.2
educator 11.1
portrait 11
work 11
worker 10.8
clinic 10.7
wedding 10.1
mask 10
business 9.7
group 9.7
married 9.6
home 9.6
illness 9.5
happiness 9.4
occupation 9.2
holding 9.1
black 9
salon 8.9
office 8.8
smile 8.5
face 8.5
two 8.5
equipment 8.4
negative 8.2
cheerful 8.1
suit 8.1
family 8
celebration 8
photographer 8
job 8
working 7.9
life 7.9
together 7.9
specialist 7.8
husband 7.8
sick 7.7
modern 7.7
meeting 7.5
instrument 7.4
film 7.4
teamwork 7.4
care 7.4
inside 7.4
looking 7.2
spectator 7.2
businessman 7.1
surgeon 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 96.6
text 90.1
clothing 83.7
dress 53.5

Face analysis

AWS Rekognition

Age 48-54
Gender Female, 81.3%
Happy 98.5%
Calm 1.1%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Sad 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 48-56
Gender Male, 100%
Calm 100%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 98%
Calm 61.8%
Confused 18.7%
Sad 14%
Disgusted 2.4%
Happy 2%
Surprised 0.6%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 54-64
Gender Male, 100%
Happy 39.4%
Confused 33.5%
Sad 11.5%
Surprised 5.7%
Disgusted 4.1%
Fear 3.1%
Angry 2.2%
Calm 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
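The per-face age ranges, gender estimates, and emotion percentages in the AWS Rekognition blocks above are the kind of output returned when all facial attributes are requested from the DetectFaces API. A minimal sketch under the same assumptions as the earlier example (Python, boto3, hypothetical local file):

import boto3

with open("burian_party.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotion estimates per face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 48, "High": 54}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 81.3}
    # Emotions come back unsorted; sort by confidence to find the dominant one.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']},",
          f"Gender {gender['Value']} ({gender['Confidence']:.1f}%),",
          f"dominant emotion {emotions[0]['Type']} ({emotions[0]['Confidence']:.1f}%)")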

Feature analysis

Amazon

Person 99.9%
Tie 64.1%
Wheel 58.6%

Captions

Microsoft

a group of people posing for a photo 88%
a group of people posing for the camera 87.9%
a group of people posing for a picture 87.8%

Text analysis

Amazon

7E
8
MAGI
KAOOX
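The short strings above are the raw text detections reported for this photograph. The record only names the provider as Amazon, but such output is what the AWS Rekognition DetectText API returns; a minimal sketch of that call, under the same assumptions as the examples above:

import boto3

with open("burian_party.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections come back both as whole lines and as individual words; print the lines.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))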

Google

7E
7E