Human Generated Data

Title

Untitled (bride feeding cake to groom in front of cake table in living room surrounded by bridesmaids)

Date

1950-1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9407

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.8
Human 98.8
Person 98.5
Clothing 97.2
Apparel 97.2
Person 93.5
Meal 92.7
Food 92.7
Person 91.1
Person 90.5
Dish 86.3
Person 84.8
Advertisement 82.6
Wedding Cake 82.6
Dessert 82.6
Cake 82.6
Icing 80.5
Cream 80.5
Creme 80.5
Collage 79.1
Poster 79.1
Person 75.9
Person 75
Gown 69
Fashion 69
People 67.8
Person 66.2
Robe 65.6
Photography 65.1
Photo 65.1
Art 61.4
Wedding 59.9
Wedding Gown 56.8
Female 56.3

Clarifai
created on 2023-10-26

people 99.9
group 98.9
man 97.8
woman 97.5
adult 96.6
many 94.1
music 92.6
group together 92.4
child 92.1
wedding 89.5
administration 87
wear 86.5
monochrome 86.2
ceremony 86.1
veil 84.6
musician 83.7
leader 83.4
dancing 82.8
actress 80.6
facial expression 80

Imagga
created on 2022-01-23

person 19.8
man 19.5
adult 19.5
people 18.4
negative 17.4
musical instrument 16.3
dress 15.4
hair 13.5
film 13.4
performer 13.1
fashion 12.8
male 12.8
portrait 12.3
black 12
groom 11.9
bride 11.5
smoke 11.2
musician 11.1
style 11.1
smile 10.7
clothing 10.6
human 10.5
fun 10.5
sexy 10.4
photographic paper 10.4
model 10.1
light 10
art 9.7
wind instrument 9.6
wedding 9.2
dark 9.2
music 9.1
one 9
happy 8.8
couple 8.7
mask 8.7
love 8.7
dance 8.5
face 8.5
modern 8.4
studio 8.4
city 8.3
celebration 8
world 7.8
rock 7.8
color 7.8
ceremony 7.8
play 7.8
party 7.7
motion 7.7
stringed instrument 7.6
singer 7.4
bass 7.4
life 7.4
danger 7.3
suit 7.2
holiday 7.2
romantic 7.1
dancer 7.1
indoors 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.3
window 89.3
person 87.5
clothing 85.9
woman 62.6
wedding dress 59.3
dance 57.6

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 91.4%
Sad 79.4%
Happy 6%
Calm 4.6%
Surprised 4.6%
Fear 2.7%
Confused 1%
Angry 1%
Disgusted 0.6%

AWS Rekognition

Age 24-34
Gender Female, 90.2%
Happy 68.6%
Calm 19.7%
Angry 3.8%
Sad 3.2%
Fear 2.6%
Surprised 0.9%
Disgusted 0.7%
Confused 0.5%

AWS Rekognition

Age 18-26
Gender Female, 95.7%
Calm 73.7%
Happy 10.8%
Sad 6.9%
Angry 5.3%
Disgusted 1%
Surprised 1%
Fear 0.8%
Confused 0.6%

AWS Rekognition

Age 18-24
Gender Male, 89.6%
Calm 70%
Sad 20.5%
Happy 2.4%
Fear 2.2%
Angry 2%
Disgusted 1.5%
Surprised 1%
Confused 0.4%

AWS Rekognition

Age 30-40
Gender Male, 93.9%
Calm 92.8%
Happy 1.7%
Sad 1.4%
Confused 1.3%
Disgusted 1%
Fear 0.7%
Surprised 0.6%
Angry 0.5%

Feature analysis

Amazon

Person 98.8%
Wedding Cake 82.6%

Text analysis

Amazon

2
n 2
C
KODVK-&VEELA
n

Google

YT37A8 XAGON
YT37A8
XAGON