Human Generated Data

Title

Untitled (young man and woman dancing)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19307

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.7
Human 99.7
Clothing 99.6
Apparel 99.6
Person 97.2
Person 96.4
Robe 87.1
Fashion 87.1
Gown 83.8
Suit 78.6
Coat 78.6
Overcoat 78.6
Wedding 75.8
Text 74.3
Female 74
Shoe 71.3
Footwear 71.3
Wedding Gown 66.8
Evening Dress 65.9
Flower 63.8
Blossom 63.8
Plant 63.8
Door 63.4
Woman 59.4
Flower Arrangement 59.2
Bridegroom 58
Floor 56.4
Flower Bouquet 55.1

Clarifai
created on 2023-10-22

people 99.8
woman 98.6
wear 97.7
adult 97.6
wedding 96.3
man 95.5
two 94.4
dress 93.4
portrait 92.9
family 92.4
group 91.3
groom 90
three 85.4
bride 85.4
indoors 82.2
child 80.4
actress 79.7
veil 78.6
facial expression 77.2
offspring 76.9

Imagga
created on 2022-02-25

groom 57.1
man 32.9
people 30.1
adult 27.5
dress 27.1
person 26.7
male 25.6
couple 25.3
bride 22.4
business 20.7
businessman 20.3
professional 20.1
happy 19.4
corporate 18.9
wedding 18.4
portrait 18.1
attractive 17.5
fashion 16.6
office 16.3
black 16.1
women 15.8
happiness 15.7
teacher 15.6
pretty 15.4
suit 14.6
standing 13
two 12.7
clothing 12.7
bouquet 12.6
marriage 12.3
smiling 12.3
lady 12.2
group 12.1
love 11.8
work 11.8
family 11.6
smile 11.4
businesswoman 10.9
team 10.8
new 10.5
building 10.4
meeting 10.4
manager 10.2
communication 10.1
elegance 10.1
executive 9.9
holding 9.9
educator 9.9
room 9.8
full length 9.7
flowers 9.6
ethnic 9.5
indoor 9.1
cheerful 8.9
success 8.9
interior 8.8
looking 8.8
lifestyle 8.7
diversity 8.6
model 8.6
adults 8.5
teamwork 8.3
garment 8.2
sexy 8
together 7.9
brunette 7.8
men 7.7
modern 7.7
bow tie 7.7
youth 7.7
talking 7.6
businesspeople 7.6
style 7.4
20s 7.3
cute 7.2
holiday 7.2
romantic 7.1
job 7.1
working 7.1
indoors 7
window 7

Google
created on 2022-02-25

Outerwear 95.6
Coat 89.9
Bride 89.7
Sleeve 87
Gesture 85.3
Yellow 84.6
Suit 80.4
Smile 79.9
Wedding dress 79.7
Formal wear 78.1
Vintage clothing 76.7
Picture frame 74.5
Event 71.2
Classic 69.2
Pattern 69.1
Monochrome photography 65.8
Happy 64.4
Stock photography 64.3
Rectangle 63.1
Gown 63.1

Microsoft
created on 2022-02-25

text 99.2
person 96.5
wall 95.8
dress 95.5
wedding dress 94.8
bride 91.7
indoor 89.2
standing 87.8
clothing 86.5
woman 83.9
posing 77.6
smile 70.4
wedding 68
picture frame 29.4

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 99.8%
Happy 98.1%
Surprised 0.5%
Angry 0.4%
Disgusted 0.3%
Fear 0.2%
Confused 0.2%
Calm 0.2%
Sad 0.1%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Calm 48.2%
Surprised 21.2%
Confused 13.7%
Disgusted 7.9%
Fear 3.4%
Happy 2.5%
Angry 2%
Sad 1.2%

AWS Rekognition

Age 47-53
Gender Female, 83.7%
Sad 91.9%
Calm 4.7%
Happy 1.6%
Angry 0.9%
Fear 0.4%
Disgusted 0.3%
Surprised 0.2%
Confused 0.1%

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.7%
Person 97.2%
Person 96.4%
Shoe 71.3%

Text analysis

Amazon

65
JAN
115
MN

Google

115 3. 2. JAN • 65
115
3.
2.
JAN
65