Human Generated Data

Title

Untitled (four guests at ball)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19279

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Overcoat 99.9
Coat 99.9
Apparel 99.9
Clothing 99.9
Human 99.4
Person 99.4
Tuxedo 99.4
Person 98.2
Person 98.1
Person 97.9
Suit 95.2
Poster 94.3
Advertisement 94.3
Suit 91.6
Text 84.6
People 74.9
Tie 72.5
Accessories 72.5
Accessory 72.5
Shirt 72.3
Tie 71.7
Man 63.4
Face 62.7
Collage 62.1

Imagga
created on 2022-02-25

man 43.7
person 37.8
groom 36.5
businessman 36.2
bow tie 33.4
people 32.4
adult 31.3
male 29.9
business 29.8
happy 27.6
couple 26.1
necktie 25.1
professional 23.8
office 23.3
businesswoman 22.7
smiling 21.7
golfer 21.4
portrait 21.4
suit 21.1
team 20.6
executive 20.4
meeting 19.8
corporate 19.8
attractive 18.2
businesspeople 18
happiness 18
group 17.7
player 17.5
men 17.2
teamwork 16.7
tie 16.1
job 15.9
together 15.8
successful 15.6
cheerful 15.4
partnership 15.4
success 15.3
love 15
work 14.9
dress 14.5
handsome 14.3
black 13.7
smile 13.5
partner 13.5
pretty 13.3
holding 13.2
clothing 13.2
lifestyle 13
contestant 13
fashion 12.8
garment 12.8
communication 12.6
formal 12.4
two 11.9
confident 11.8
colleagues 11.7
family 11.6
wedding 11
worker 10.5
standing 10.4
career 10.4
occupation 10.1
businessmen 9.8
lady 9.7
bride 9.7
collar 9.6
boy 9.6
boss 9.6
women 9.5
face 9.2
20s 9.2
modern 9.1
guy 8.9
working 8.8
diverse 8.8
looking 8.8
teacher 8.8
staff 8.6
talking 8.6
relationship 8.4
studio 8.4
style 8.2
sexy 8
kin 7.9
well dressed 7.8
table 7.8
30s 7.7
mother 7.7
husband 7.6
marriage 7.6
smart 7.5
human 7.5
manager 7.5
friendly 7.3
child 7.3
color 7.2
building 7.1
romantic 7.1

Google
created on 2022-02-25

(no tags returned)

Microsoft
created on 2022-02-25

text 99.3
wall 97.9
smile 97.5
human face 94.5
person 93.2
clothing 92.5
suit 88.9
man 88
woman 86
posing 82.3
poster 82
dress 68.9
cartoon 59.8
wedding dress 52.2
picture frame 50.7

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 100%
Happy 99.6%
Surprised 0.2%
Angry 0.1%
Fear 0%
Disgusted 0%
Confused 0%
Sad 0%
Calm 0%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Happy 57.3%
Calm 35.1%
Confused 3.7%
Disgusted 1.4%
Angry 0.8%
Sad 0.8%
Surprised 0.5%
Fear 0.5%

AWS Rekognition

Age 41-49
Gender Male, 100%
Calm 88%
Happy 5.3%
Surprised 1.7%
Disgusted 1.4%
Confused 1.3%
Angry 1.2%
Sad 0.7%
Fear 0.4%

AWS Rekognition

Age 23-31
Gender Female, 99.8%
Happy 100%
Surprised 0%
Calm 0%
Angry 0%
Fear 0%
Sad 0%
Confused 0%
Disgusted 0%

Microsoft Cognitive Services

Age 43
Gender Female

Microsoft Cognitive Services

Age 34
Gender Female

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Suit 95.2%
Poster 94.3%
Tie 72.5%

Captions

Microsoft

Catherine Marshall et al. posing for a photo 90.2%
Catherine Marshall et al. posing for the camera 90.1%
Catherine Marshall and woman posing for a photo 73%

Text analysis

Amazon

JAN
126
65
132

Google

JAN 126 132
JAN
126
132