Human Generated Data

Title

Untitled (couples dancing at fancy party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17965

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Clothing 98.5
Apparel 98.5
Person 97.1
Human 97.1
Person 96.7
Leisure Activities 94.8
Dance Pose 94.8
Dress 94.2
Tennis Racket 93.1
Racket 93.1
Female 84.7
Dance 78
Face 76.4
Indoors 71.5
Woman 71
Fashion 66.6
Robe 66.6
Gown 65.9
Evening Dress 63.2
People 61.6
Photography 60.3
Photo 60.3
Wedding 58
Bridegroom 58
Room 57.3
Hug 57.2
Furniture 56.4
Table 56.4
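The Amazon tag list above pairs each label with a machine-assigned confidence score. As a minimal sketch of how such metadata might be reused, the snippet below (a hypothetical helper, not part of this record) filters label/score pairs at a confidence threshold; the pairs are copied from the list above:

```python
# Hypothetical sketch: filter machine-generated tags by confidence score.
# The (label, score) pairs are copied from the Amazon tag list above.
def confident_tags(tags, threshold=90.0):
    """Return labels whose confidence meets or exceeds the threshold."""
    return [label for label, score in tags if score >= threshold]

amazon_tags = [
    ("Clothing", 98.5), ("Apparel", 98.5), ("Person", 97.1),
    ("Human", 97.1), ("Leisure Activities", 94.8), ("Dance Pose", 94.8),
    ("Dress", 94.2), ("Tennis Racket", 93.1), ("Racket", 93.1),
    ("Female", 84.7), ("Dance", 78.0),
]

print(confident_tags(amazon_tags))
# → ['Clothing', 'Apparel', 'Person', 'Human', 'Leisure Activities',
#    'Dance Pose', 'Dress', 'Tennis Racket', 'Racket']
```

A higher threshold would retain only the strongest labels (e.g. at 98.0, just "Clothing" and "Apparel").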

Imagga
created on 2022-03-04

sword 43
weapon 36.6
bride 32.1
man 31.6
people 30.7
adult 30.4
wedding 30.4
groom 29.8
couple 27.9
dress 26.2
male 24.9
person 24.6
professional 23.6
love 22.9
married 22.1
happy 20.7
happiness 20.4
teacher 20.1
men 18
marriage 17.1
business 16.4
corporate 16.3
suit 16.2
bouquet 16
smiling 15.9
businessman 15.9
educator 15.7
women 15
day 14.1
two 13.6
attractive 13.3
standing 13
building 12.8
ceremony 12.6
pretty 12.6
lifestyle 12.3
together 12.3
attendant 12.2
hands 12.2
smile 12.1
fashion 12.1
life 11.3
outdoors 11.2
celebration 11.2
family 10.7
wife 10.4
flowers 10.4
black 9.8
veil 9.8
adults 9.5
elegance 9.2
room 9.2
businesswoman 9.1
holding 9.1
nurse 9
gown 8.9
husband 8.9
looking 8.8
kiss 8.8
clothing 8.7
work 8.6
formal 8.6
businesspeople 8.5
pair 8.5
modern 8.4
portrait 8.4
summer 8.4
hand 8.4
emotion 8.3
style 8.2
cheerful 8.1
group 8.1
success 8
sexy 8
job 8
bridal 7.8
outside 7.7
outdoor 7.6
worker 7.6
executive 7.5
clothes 7.5
strength 7.5
office 7.2
fitness 7.2
hairdresser 7.2
team 7.2
handsome 7.1
romantic 7.1
to 7.1

Microsoft
created on 2022-03-04

person 94.2
dance 92.7
dress 85
text 77.7
wedding 77.3
standing 77.1
woman 73.5
black and white 64.8
posing 44.6

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 99.8%
Happy 42.7%
Surprised 20.6%
Calm 11.7%
Sad 10.6%
Confused 8.4%
Angry 2.4%
Disgusted 2.1%
Fear 1.3%

AWS Rekognition

Age 26-36
Gender Female, 84.5%
Calm 71.2%
Happy 14.8%
Sad 7.6%
Confused 2.6%
Surprised 1.6%
Disgusted 0.9%
Fear 0.8%
Angry 0.5%

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Confused 35.7%
Calm 18.6%
Happy 16.9%
Surprised 14%
Sad 8.2%
Fear 2.6%
Disgusted 2.5%
Angry 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
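Each AWS Rekognition face record above reports a full emotion distribution rather than a single label. A hypothetical sketch of reducing one such distribution to its dominant emotion, with values copied from the first face record above:

```python
# Hypothetical sketch: pick the dominant emotion from a Rekognition-style
# emotion distribution. Values copied from the first face record above.
face_emotions = {
    "Happy": 42.7, "Surprised": 20.6, "Calm": 11.7, "Sad": 10.6,
    "Confused": 8.4, "Angry": 2.4, "Disgusted": 2.1, "Fear": 1.3,
}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # → ('Happy', 42.7)
```

Note that the top score here is well under 50%, so the "dominant" emotion is far from certain, which is worth keeping in mind when treating such machine output as descriptive metadata.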

Feature analysis

Amazon

Person 97.1%

Captions

Microsoft

a group of people posing for a photo 86.3%
a group of people posing for the camera 86.2%
a group of people posing for a picture 86.1%

Text analysis

Amazon

MAG
VI23A2

Google

VT3A 2
2
VT3A