Human Generated Data

Title

Untitled (two grooms feeding cake to their wives at reception)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9180

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Home Decor 99.8
Cake 99.5
Dessert 99.5
Food 99.5
Icing 99.3
Cream 99.3
Creme 99.3
Human 99.1
Person 99.1
Person 97.8
Interior Design 97
Indoors 97
Person 96.3
Apparel 92.6
Clothing 92.6
Room 84.7
Person 83.4
Torte 72.8
Fashion 63.1
Meal 60.9
Dish 59.6
Gown 58.7
Window 58.6
People 58.2
Curtain 57.6
Robe 57.4
Wedding Gown 56.9
Wedding 56.9
Photography 55.3
Photo 55.3
Person 46.1

Clarifai
created on 2023-10-26

people 99.9
group 98.1
adult 98
man 96.2
group together 94.6
two 92.9
woman 92.5
three 91.9
scientist 90.3
music 89.6
room 88.8
actor 87.8
musician 87.2
monochrome 87
four 86
several 83.3
family 82.7
administration 82.2
child 81.9
leader 81.7

Imagga
created on 2022-01-23

musical instrument 36.1
brass 29.3
wind instrument 27.6
person 26.3
adult 25.9
man 25.5
people 24
black 19.4
male 19.1
device 17
portrait 16.2
business 15.2
indoors 14.9
electronic instrument 14.4
happy 14.4
men 13.7
face 13.5
hair 13.5
leisure 13.3
looking 12.8
casual 12.7
women 12.6
style 12.6
pretty 12.6
old 12.5
lifestyle 12.3
couple 12.2
office 12
active 11.9
professional 11.9
room 11.4
fashion 11.3
sexy 11.2
fun 11.2
businessman 10.6
music 10.5
home 10.4
senior 10.3
mature 10.2
dark 10
smile 10
hand 9.9
musician 9.8
clothing 9.8
attractive 9.8
human 9.7
one 9.7
group 9.7
businesspeople 9.5
expression 9.4
two 9.3
alone 9.1
holding 9.1
dress 9
lady 8.9
interior 8.8
love 8.7
chair 8.7
happiness 8.6
stage 8.6
modern 8.4
elegance 8.4
performer 8.3
indoor 8.2
businesswoman 8.2
handsome 8
light 8
urban 7.9
teacher 7.8
model 7.8
concert 7.8
sitting 7.7
cornet 7.6
guitar 7.6
desk 7.6
window 7.4
confident 7.3
smiling 7.2
suit 7.2
holiday 7.2
night 7.1
to 7.1
day 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

clothing 94.8
text 93.6
person 91.3
black and white 86.6
woman 80.3
dress 51.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 54-64
Gender Male, 96.4%
Calm 97.4%
Surprised 1.7%
Confused 0.3%
Disgusted 0.2%
Happy 0.1%
Sad 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 54-62
Gender Female, 98.8%
Happy 91.5%
Sad 3.2%
Surprised 1.8%
Angry 1.5%
Disgusted 0.8%
Fear 0.5%
Calm 0.4%
Confused 0.3%

AWS Rekognition

Age 49-57
Gender Male, 88.2%
Surprised 69%
Confused 24.9%
Fear 1.5%
Angry 1.2%
Calm 1.1%
Sad 1.1%
Happy 0.7%
Disgusted 0.6%

AWS Rekognition

Age 40-48
Gender Female, 65.1%
Calm 65.4%
Happy 24.8%
Fear 4.9%
Angry 1.4%
Sad 1.1%
Surprised 1%
Disgusted 0.8%
Confused 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Text analysis

Amazon

SA
ESI-A

Google

tト S
S
t
ト