Human Generated Data

Title

Untitled (choir posed on stair case)

Date

1939

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22245

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Architecture 98.7%
Building 98.7%
Person 98.3%
Human 98.3%
Clothing 98%
Apparel 98%
Person 95.9%
Person 93.1%
Person 91.4%
Person 90.9%
Pillar 87.8%
Column 87.8%
Person 80.2%
Fashion 73%
Person 66.4%
Crypt 59.9%
Robe 59.2%
Evening Dress 57%
Gown 57%

Clarifai
created on 2023-10-22

people 96.7%
woman 95.8%
wedding 95.2%
veil 93%
man 92.5%
adult 87.9%
bride 87.9%
indoors 87.1%
monochrome 85.4%
group 85.3%
groom 82.9%
church 82.4%
glass items 82.4%
religion 82%
spirituality 81.9%
illustration 81.6%
ceremony 80.9%
love 72.4%
elegant 72%
art 71.7%

Imagga
created on 2022-03-11

shower 39%
plumbing fixture 30.5%
architecture 28.2%
church 23.1%
religion 21.5%
old 20.2%
shower curtain 19.8%
monument 19.6%
curtain 18.3%
building 17.6%
historic 15.6%
art 15.1%
city 14.1%
sculpture 13.6%
furnishing 13.6%
stone 13.5%
history 13.4%
travel 13.4%
tourism 13.2%
column 13.2%
ancient 13%
holy 12.5%
blind 12.3%
statue 11.6%
historical 11.3%
tourist 10.9%
landmark 10.8%
room 10.8%
tower 10.7%
catholic 10.7%
negative 10.5%
faith 10.5%
god 10.5%
cathedral 10.4%
famous 10.2%
marble 10%
dress 9.9%
sconce 9.9%
protective covering 9.8%
gold 9.8%
interior 9.7%
design 9.6%
bell 9.5%
attraction 9.5%
culture 9.4%
religious 9.4%
window 8.7%
antique 8.6%
cross 8.5%
style 8.1%
celebration 8%
bracket 7.9%
love 7.9%
film 7.9%
structure 7.8%
facade 7.7%
saint 7.7%
bride 7.7%
support 7.5%
close 7.4%
bottle 7.4%
wedding 7.3%
water 7.3%
detail 7.2%
black 7.2%
groom 7.1%
sky 7%

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 93%
black and white 92.2%
candle 68.1%
statue 57.4%
monochrome 53.8%

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 100%
Calm 84.5%
Sad 9.2%
Confused 1.9%
Angry 1.5%
Happy 1.1%
Disgusted 0.8%
Surprised 0.7%
Fear 0.4%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Calm 98.4%
Surprised 0.9%
Happy 0.2%
Sad 0.2%
Angry 0.1%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 25-35
Gender Male, 96.6%
Calm 89.2%
Sad 5.5%
Surprised 2%
Happy 1.3%
Confused 1%
Fear 0.4%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 48-54
Gender Male, 87.4%
Calm 71.5%
Sad 13.5%
Confused 6.1%
Surprised 3.2%
Happy 2.7%
Angry 1.5%
Disgusted 1.1%
Fear 0.5%

AWS Rekognition

Age 33-41
Gender Male, 100%
Surprised 31.9%
Sad 26.4%
Calm 22.4%
Happy 13.3%
Fear 1.7%
Angry 1.5%
Confused 1.4%
Disgusted 1.3%

AWS Rekognition

Age 35-43
Gender Male, 91.2%
Calm 60.1%
Sad 29.1%
Confused 5.4%
Happy 2.8%
Angry 1.3%
Disgusted 0.5%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Male, 99%
Sad 51.8%
Calm 19.4%
Happy 15.3%
Surprised 4.6%
Fear 3.4%
Angry 2.3%
Disgusted 1.7%
Confused 1.6%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Surprised 99.7%
Fear 0.1%
Calm 0.1%
Disgusted 0%
Angry 0%
Happy 0%
Confused 0%
Sad 0%

AWS Rekognition

Age 23-33
Gender Female, 99%
Calm 41.4%
Sad 40%
Happy 7.8%
Disgusted 4%
Angry 2.4%
Confused 2%
Surprised 1.2%
Fear 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.3%
Person 95.9%
Person 93.1%
Person 91.4%
Person 90.9%
Person 80.2%
Person 66.4%

Categories