Human Generated Data

Title

Untitled (woman feeding man cake at wedding reception)

Date

1945-1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9166

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.7
Human 98.7
Person 98.3
Person 98.2
Clothing 98.2
Apparel 98.2
Person 98
Person 96.8
Dress 94.2
Interior Design 88.2
Indoors 88.2
Female 84.8
Room 84.6
Evening Dress 78.9
Fashion 78.9
Gown 78.9
Robe 78.9
Dance Pose 70.8
Leisure Activities 70.8
Suit 70.3
Coat 70.3
Overcoat 70.3
Woman 69.7
Person 65.9
People 60
Shop 59.5
Face 59
Advertisement 56.9
Dance 56.4
Dressing Room 56.3
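
The labels above are the sort of output Amazon Rekognition's label-detection endpoint returns. Below is a minimal sketch of such a request with boto3, assuming AWS credentials are configured and the photograph is available as a local file; the filename and thresholds are illustrative, not the museum's actual pipeline.

import boto3

rekognition = boto3.client("rekognition")

# Read the photograph and request labels; MaxLabels and MinConfidence are
# illustrative choices (the list above bottoms out near 56% confidence).
with open("wedding_reception.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,
    )

# Print "Name Confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")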

Clarifai
created on 2023-10-27

people 99.9
group 99.4
woman 98.3
adult 97.4
music 97.4
dancing 96.7
actress 96.1
man 95.6
veil 93.6
actor 93.4
wear 92.1
dancer 91.6
many 91.6
group together 91.3
several 90.7
musician 90.3
three 90.1
singer 89.8
administration 88.5
theater 88.1

Imagga
created on 2022-01-23

groom 41.1
negative 28.8
bride 26
dress 22.6
people 22.3
wedding 22
kin 22
film 21.2
love 18.1
man 18.1
couple 16.5
photographic paper 16.3
person 15.7
art 15.6
happiness 14.9
adult 14.7
married 14.4
male 14.2
bouquet 14.1
bridal 13.6
portrait 12.9
marriage 12.3
family 11.5
face 11.3
celebration 11.2
photographic equipment 10.9
traditional 10.8
gown 10.7
happy 10.6
fashion 10.5
wife 10.4
window 10.4
two 10.2
wed 9.8
black 9.7
old 9.7
ceremony 9.7
park 9
flowers 8.7
husband 8.7
statue 8.7
outdoor 8.4
future 8.4
human 8.2
gorgeous 8.1
religion 8.1
sexy 8
women 7.9
matrimony 7.9
brunette 7.8
veil 7.8
costume 7.8
men 7.7
pretty 7.7
attractive 7.7
culture 7.7
life 7.6
dark 7.5
decoration 7.5
silhouette 7.4
girls 7.3
business 7.3
detail 7.2
looking 7.2
barbershop 7.2
smile 7.1
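
Imagga tags like these are typically retrieved from its v2 tagging endpoint over REST. Below is a minimal sketch using the requests library, assuming an Imagga key/secret pair and a publicly reachable image URL (all placeholders; endpoint path and response layout are assumed from Imagga's public API documentation).

import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/wedding_reception.jpg"  # placeholder

# Imagga's v2 tagging endpoint uses HTTP Basic auth with the key/secret pair.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each entry carries an English tag name and a confidence score.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")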

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.9
dress 96
person 95.3
dance 91.2
window 86.1
clothing 84.4
standing 84.2
woman 79
old 61.1

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 96.3%
Sad 46.3%
Calm 41.9%
Happy 4.8%
Surprised 2%
Confused 1.6%
Angry 1.3%
Fear 1.2%
Disgusted 0.9%

AWS Rekognition

Age 34-42
Gender Male, 55.1%
Calm 57%
Sad 37.9%
Disgusted 2.1%
Surprised 0.9%
Confused 0.9%
Happy 0.5%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Male, 84.7%
Surprised 42%
Angry 40.8%
Sad 11.1%
Fear 1.9%
Happy 1.6%
Calm 1.3%
Disgusted 0.8%
Confused 0.5%
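
Age ranges, gender estimates, and emotion percentages like the three blocks above come from Amazon Rekognition's face-detection endpoint when all facial attributes are requested. Below is a minimal sketch with boto3; the filename is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("wedding_reception.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned with confidences; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")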

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
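
Unlike Rekognition, Google Cloud Vision's face detection reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the blocks above read "Very unlikely" and "Unlikely". Below is a minimal sketch with the google-cloud-vision Python client; the filename is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("wedding_reception.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)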

Feature analysis

Amazon

Person 98.7%

Text analysis

Amazon

сигсо
EIRW
сигсо SALE LA EIRW
SALE LA

Google

MJI7 YT3RA2 0
MJI7
YT3RA2
0
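
The strings above are raw OCR output; both services picked up fragments of signage or lettering in the photograph rather than coherent words. Below is a minimal sketch of how such text detections are typically requested, assuming boto3 for the Amazon column and google-cloud-vision for the Google column; the filename is a placeholder.

import boto3
from google.cloud import vision

with open("wedding_reception.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: line-level detections correspond to the grouped strings above.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])

# Google Cloud Vision: the first annotation is the full text block, the rest are tokens.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations[1:]:
    print(annotation.description)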