Human Generated Data

Title

Untitled (elevated view of fancy dining room filled with women)

Date

1959

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9655

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.7
Apparel 99.7
Person 97.6
Human 97.6
Person 96.9
Person 96.1
Person 95.2
Person 95.1
Person 94.7
Person 93
Person 92.6
Bonnet 92.4
Hat 92.4
Chicken 89.6
Animal 89.6
Bird 89.6
Fowl 89.6
Poultry 89.6
People 83.8
Indoors 79.5
Room 75.7
Costume 75.5
Person 73.7
Baby 72.1
Face 66.4
Photography 63
Photo 63
Furniture 61.8
Portrait 61.1
Girl 60.9
Female 60.9
Kid 60.1
Child 60.1
Toy 58
Living Room 57.1
Dress 56.2

Clarifai
created on 2023-10-26

people 99.9
group 99.3
woman 97.9
many 97.6
man 95.4
adult 94.7
child 92.1
music 91.4
group together 89.5
monochrome 88.8
several 88.2
dancing 86.2
actress 84.1
wear 83.6
education 83.6
musician 82.7
indoors 82.6
wedding 79.4
audience 78.9
sit 76.8

Imagga
created on 2022-01-23

person 28.5
people 27.3
shower cap 26.4
man 23.5
happy 23.2
cap 22.6
salon 22.1
couple 20
adult 19.5
male 19.1
love 18.1
home 17.5
holiday 17.2
smiling 16.6
happiness 16.4
teacher 16.4
headdress 16.2
celebration 15.9
portrait 15.5
clothing 15.4
cheerful 15.4
bride 15.3
hair 15
smile 14.9
pretty 14.7
together 14
women 13.4
senior 13.1
wedding 12.9
groom 12.8
dress 12.6
indoors 12.3
fashion 12
indoor 11.9
old 11.8
room 11.3
sexy 11.2
two 11
lifestyle 10.8
bouquet 10.8
family 10.7
lady 10.5
attractive 10.5
blond 10.3
educator 10.3
elegance 10.1
holding 9.9
musical instrument 9.9
romance 9.8
human 9.7
fun 9.7
interior 9.7
table 9.6
wife 9.5
men 9.4
mature 9.3
flower 9.2
drink 9.2
looking 8.8
married 8.6
party 8.6
husband 8.6
marriage 8.5
face 8.5
black 8.4
wine 8.3
group 8.1
romantic 8
cute 7.9
glass 7.9
day 7.8
beard 7.8
flowers 7.8
education 7.8
modern 7.7
elderly 7.7
house 7.5
traditional 7.5
one 7.5
professional 7.5
brass 7.4
suit 7.2
night 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 91.7
birthday cake 77.9
wedding cake 77.6
group 67.7
wedding dress 64.3
wedding 60.9
flower 58.1
candle 56.9
vase 56.8

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Surprised 92.7%
Fear 3.1%
Happy 2.1%
Angry 0.7%
Calm 0.6%
Sad 0.3%
Disgusted 0.3%
Confused 0.1%

AWS Rekognition

Age 52-60
Gender Male, 99.7%
Sad 72.9%
Happy 14.8%
Confused 6.5%
Calm 2.3%
Fear 1.5%
Disgusted 1.1%
Angry 0.6%
Surprised 0.3%

AWS Rekognition

Age 42-50
Gender Female, 99.2%
Calm 99.5%
Sad 0.2%
Happy 0.2%
Confused 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chicken
Person 97.6%
Person 96.9%
Person 96.1%
Person 95.2%
Person 95.1%
Person 94.7%
Person 93%
Person 92.6%
Person 73.7%
Chicken 89.6%

Text analysis

Amazon

23150

Google

MJI7- -YT RA°2- ->NAGON
MJI7-
-YT
RA°2-
->NAGON