Human Generated Data

Title

Untitled (people dancing at ball)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19270

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 100
Apparel 100
Person 99.3
Human 99.3
Person 99.3
Person 99
Person 98.9
Robe 97.7
Fashion 97.7
Gown 97.2
Person 95.7
Person 95.3
Wedding 95.3
Bride 92.2
Wedding Gown 92.2
Female 92
Person 91.7
Evening Dress 87.7
Person 80.9
Bridegroom 80.3
Person 79.8
Woman 78.4
Suit 73.1
Coat 73.1
Overcoat 73.1
Person 67
Dress 66.5
Leisure Activities 60.2
Photography 58.7
Photo 58.7
Portrait 55.6
Face 55.6
Indoors 55.5

Clarifai
created on 2023-10-22

people 99.9
woman 97.5
group 97.5
man 97
adult 96.1
wedding 94.6
dancing 94.1
music 93
actress 91.6
dancer 90.8
monochrome 90.6
actor 90.3
bride 89.7
dress 89.2
group together 86.7
wear 84.5
groom 82.5
singer 81.6
three 81.5
rehearsal 81.2

Imagga
created on 2022-03-05

groom 53.5
people 35.7
person 34.4
male 34
man 32.9
businessman 30.9
team 27.8
business 27.3
teamwork 26.9
crowd 26.9
silhouette 26.5
boss 24.9
bride 24.4
work 23.5
businesswoman 22.7
job 22.1
sexy 21.7
dress 21.7
adult 20.8
teacher 19.8
professional 19.7
president 19.6
nighttime 19.6
wedding 19.3
leader 19.3
couple 19.2
love 18.9
supporters 18.8
cheering 18.6
presentation 18.6
speech 18.6
audience 18.5
stadium 18.5
patriotic 18.2
men 18
nation 18
lights 17.6
design 17.4
flag 17.4
occupation 17.4
marriage 17.1
vibrant 16.6
educator 16.3
vivid 15.8
symbol 15.5
icon 15
women 15
bright 15
group 14.5
happy 14.4
happiness 14.1
performer 13.9
singer 12.8
married 12.5
bouquet 12.4
black 11.6
corporate 11.2
musician 11.1
two 11
entertainer 10.9
suit 10.5
dinner dress 10.4
document 10.3
flowers 9.6
party 9.5
meeting 9.4
executive 9.4
pretty 9.1
dancer 8.8
life 8.8
bridal 8.8
attractive 8.4
outfit 8.1
success 8
together 7.9
smile 7.8
veil 7.8
youth 7.7
walking 7.6
fashion 7.5
cross 7.5
joy 7.5
lady 7.3
clothing 7.2
gown 7.2
star 7.2
celebration 7.2
family 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wedding dress 98.8
bride 97.2
dress 97
person 93.2
text 86.3
wedding 85.2
woman 83.5
standing 81.3
clothing 77.8
dance 72.9
gallery 62.1
group 56.3
posing 54.6

Color Analysis

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 98.6%
Sad 52.2%
Happy 33.5%
Calm 6.5%
Confused 4%
Surprised 1.3%
Disgusted 1.1%
Fear 0.9%
Angry 0.6%

AWS Rekognition

Age 45-51
Gender Male, 77.2%
Calm 99%
Surprised 0.9%
Angry 0.1%
Disgusted 0%
Sad 0%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.1%
Calm 97.3%
Confused 0.8%
Surprised 0.7%
Happy 0.5%
Sad 0.2%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 22-30
Gender Female, 52.3%
Confused 53.2%
Calm 33.9%
Sad 9.5%
Angry 1.6%
Disgusted 0.6%
Happy 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 94.6%
Sad 90%
Fear 4.3%
Calm 1.5%
Happy 1.4%
Confused 1.2%
Angry 0.9%
Disgusted 0.5%
Surprised 0.2%

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Calm 52.8%
Sad 22.6%
Angry 8.8%
Confused 5%
Disgusted 4.5%
Happy 2.6%
Surprised 2.4%
Fear 1.5%

AWS Rekognition

Age 24-34
Gender Female, 50.5%
Calm 73.5%
Happy 13%
Sad 10.7%
Confused 0.8%
Disgusted 0.7%
Angry 0.7%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Male, 93.9%
Fear 26.9%
Sad 26.3%
Calm 21.2%
Disgusted 11.8%
Angry 6.3%
Happy 3.6%
Confused 2.3%
Surprised 1.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.3%
Person 99.3%
Person 99%
Person 98.9%
Person 95.7%
Person 95.3%
Person 91.7%
Person 80.9%
Person 79.8%
Person 67%

Categories

Imagga

paintings art 99.5%

Text analysis

Amazon

10
G 10
G
11
MACOX
112
ХАООХ
112 Y T 3 3 1 2 ХАООХ
Y T 3 3 1 2

Google

11 G 10 M 3 Y T 2 AGOM
11
G
10
M
3
Y
T
2
AGOM