Human Generated Data

Title

Untitled (bride and groom posed with wedding party in room with curtained windows)

Date

1935

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9078

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Apparel 100
Clothing 100
Person 99.1
Human 99.1
Person 98.1
Person 97.6
Robe 97.6
Fashion 97.6
Gown 97.3
Female 96.5
Person 96.1
Person 95
Dress 94.6
Wedding 94.5
Person 92.3
Person 91.2
Person 88.3
Woman 88.2
Bride 87.4
Wedding Gown 87.4
Person 85.1
Bridegroom 74.7
Evening Dress 72.1
Suit 69.8
Overcoat 69.8
Coat 69.8
Face 68.1
Footwear 68
Shoe 68
Electronics 63.2
Screen 63.2
Shoe 62.5
People 61.1
Photography 60.9
Photo 60.9
Portrait 60.9
Stage 60.6
Shoe 59.3
Girl 57.7
Bridesmaid 55.4
Monitor 55.2
Display 55.2
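
The Amazon tags above pair label names with confidence scores of the kind returned by Amazon Rekognition's label detection. Below is a minimal sketch of how such tags could be retrieved with boto3; the S3 bucket and object name are hypothetical placeholders, and the MinConfidence cutoff is only an assumption based on the lowest scores listed.

```python
# Minimal sketch: image labels with Amazon Rekognition via boto3.
# The S3 bucket/key below are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "wedding-party.jpg"}},
    MinConfidence=55,  # assumed cutoff, roughly matching the lowest scores above
)

for label in response["Labels"]:
    # Each label has a name and a 0-100 confidence score,
    # matching the "tag  score" pairs in the listing above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```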

Clarifai
created on 2023-10-27

people 99.9
group 99.9
child 97.9
adult 96.7
woman 96.6
many 96.3
family 95.6
several 95.4
leader 95.4
group together 95.1
dress 94
offspring 93.8
wear 93.7
three 93.1
movie 93
sibling 92.8
outfit 92.7
actress 92.7
interaction 92
son 91.8
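
The Clarifai tags above pair concept names with confidence scores. A minimal sketch using Clarifai's v2 predict REST endpoint follows; the API key, model ID, and image URL are hypothetical placeholders, and the exact endpoint path may differ depending on the Clarifai API version in use.

```python
# Minimal sketch: image concepts from Clarifai's v2 predict REST endpoint.
# API key, model ID, and image URL are hypothetical placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed general tagging model
IMAGE_URL = "https://example.org/wedding-party.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in 0-1; the listing above shows them scaled to 0-100.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```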

Imagga
created on 2022-01-23

blackboard 66.8
musical instrument 24.7
man 21.5
wind instrument 18.7
male 17.7
people 17.3
silhouette 16.5
black 16.2
person 14.7
harmonica 12.5
keyboard instrument 12.3
business 12.1
accordion 11.8
men 11.2
adult 10.8
couple 10.4
boy 10.4
window 10.3
free-reed instrument 10.1
businessman 9.7
grunge 9.4
dark 9.2
art 9.1
design 9.1
old 9
dress 9
sunset 9
child 8.8
happy 8.8
style 8.2
home 8
women 7.9
sky 7.6
dance 7.6
water 7.3
room 7.3
indoor 7.3
stringed instrument 7.3
office 7.2
dirty 7.2
celebration 7.2
love 7.1
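
The Imagga tags above pair tag names with 0-100 confidence scores. A minimal sketch using Imagga's /v2/tags endpoint follows; the API credentials and image URL are hypothetical placeholders.

```python
# Minimal sketch: image tags from Imagga's /v2/tags endpoint.
# API key/secret and image URL are hypothetical placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/wedding-party.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a 0-100 confidence,
    # matching the "tag  score" pairs in the listing above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```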

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

dress 97.8
wedding dress 97.6
bride 95.4
text 94.8
clothing 90.1
woman 90.1
person 88.8
wedding 69.1
flower 63.1
posing 56.7
old 50.8
image 37.7
vintage 26.5
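
The Microsoft tags above are image-tagging scores of the kind returned by Azure Computer Vision. A minimal sketch using the v3.2 analyze REST endpoint follows; the endpoint host, subscription key, and image URL are hypothetical placeholders.

```python
# Minimal sketch: image tags from the Azure Computer Vision v3.2 analyze endpoint.
# Endpoint host, subscription key, and image URL are hypothetical placeholders.
import requests

ENDPOINT = "https://example-resource.cognitiveservices.azure.com"
SUBSCRIPTION_KEY = "YOUR_AZURE_CV_KEY"
IMAGE_URL = "https://example.org/wedding-party.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is returned in 0-1; the listing above scales it to 0-100.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```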

Color analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-36
Gender Female, 84%
Happy 92.9%
Calm 4.2%
Surprised 1.1%
Sad 0.5%
Fear 0.5%
Angry 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 24-34
Gender Male, 99.1%
Happy 45.7%
Sad 31.5%
Angry 8.8%
Calm 4.3%
Disgusted 3.3%
Fear 3.2%
Surprised 2.3%
Confused 1.2%

AWS Rekognition

Age 31-41
Gender Male, 74.1%
Happy 96.1%
Calm 2.3%
Sad 1.1%
Fear 0.2%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.7%
Happy 93.3%
Calm 6.2%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 87.7%
Happy 54.1%
Surprised 15.3%
Sad 13.2%
Calm 9%
Disgusted 2.7%
Fear 2.4%
Angry 1.8%
Confused 1.4%

AWS Rekognition

Age 21-29
Gender Male, 86.6%
Calm 98.6%
Fear 0.7%
Happy 0.3%
Sad 0.1%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Sad 86.4%
Calm 7.9%
Happy 3.9%
Angry 0.6%
Fear 0.5%
Surprised 0.3%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 30-40
Gender Male, 96.9%
Calm 99.8%
Sad 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%
Fear 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 10-18
Gender Female, 79.3%
Fear 86.3%
Calm 4.4%
Sad 4.2%
Surprised 1.8%
Happy 1.5%
Disgusted 0.7%
Angry 0.7%
Confused 0.4%
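
The per-face breakdowns above (age range, gender, emotion scores) are the kind of output Amazon Rekognition's face detection returns. A minimal sketch with boto3 follows; the S3 bucket and object name are hypothetical placeholders.

```python
# Minimal sketch: per-face age/gender/emotion estimates with Amazon Rekognition.
# The S3 bucket/key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "wedding-party.jpg"}},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as (type, confidence) pairs, highest first here,
    # as in the per-face breakdowns listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```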

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
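
The ratings above use Google Cloud Vision's face-detection likelihood scale (Very unlikely through Very likely). A minimal sketch with the official Python client follows; the image URI is a hypothetical placeholder and application credentials are assumed to be configured.

```python
# Minimal sketch: face likelihood ratings with the Google Cloud Vision client library.
# Requires google-cloud-vision and configured credentials; the image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/wedding-party.jpg")
)

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # matching the per-face ratings listed above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```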

Feature analysis

Amazon

Person 99.1%
Shoe 68%

Categories