Human Generated Data

Title

Untitled (formally dressed girls, outside)

Date

1937

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22326

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Clothing 100
Apparel 100
Human 98.5
Person 98
Person 97.1
Person 96.3
Gown 95.6
Fashion 95.6
Person 94.9
Person 94.6
Person 94.6
Person 94.6
Robe 94.2
Person 94.1
Person 93.9
Person 93.5
Person 92.4
Person 90.9
Female 90
Wedding 85.9
Person 82
Person 81
Evening Dress 80.4
Art 79.7
Woman 78.3
Wedding Gown 71.6
Drawing 67.2
Dress 62.2
Bride 58.3
Bridesmaid 56.9
Sketch 55.9
Person 44.6

Clarifai
created on 2023-10-22

people 100
group 99.3
adult 98.8
many 98.1
man 97.7
woman 96.9
child 95.5
group together 94.6
education 93.6
art 89.6
school 89.3
wear 86.6
leader 85.7
portrait 82.5
boy 82.3
illustration 79.1
crowd 78.8
indoors 77
position 77
military 75.4

Imagga
created on 2022-03-11

structure 54.8
fountain 35.7
altar 27.5
stone 19.5
architecture 18.5
old 16.7
light 15.4
memorial 14.8
art 13.6
gravestone 13.4
landscape 13.4
building 12.6
religion 12.5
night 12.4
church 11.1
vintage 10.7
tourism 10.7
travel 10.6
monument 10.3
winter 10.2
sky 10.2
picket fence 10
water 10
city 10
landmark 9.9
park 9.9
fence 9.8
black 9.6
rock 9.5
ancient 9.5
dark 9.2
national 9.1
natural 8.7
statue 8.7
antique 8.6
wall 8.6
sculpture 8.5
summer 8.4
history 8
mountain 8
temple 8
culture 7.7
god 7.6
texture 7.6
snow 7.5
religious 7.5
historic 7.3
dirty 7.2
scenery 7.2
holiday 7.2
decoration 7

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

wedding dress 91.4
building 80.9
text 78.8
bride 78.5
person 78.5
white 70.5
clothing 69.1
dress 67.1
old 62.4
woman 53.2

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 75.7%
Calm 95.8%
Sad 1.5%
Happy 1.3%
Surprised 0.6%
Fear 0.3%
Confused 0.2%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 36-44
Gender Male, 95.4%
Sad 90.5%
Calm 8.4%
Happy 0.3%
Fear 0.3%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 41-49
Gender Female, 97.7%
Calm 97%
Sad 1.2%
Fear 0.5%
Happy 0.5%
Confused 0.3%
Angry 0.2%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 50-58
Gender Male, 97.1%
Happy 73%
Sad 12.4%
Calm 9.9%
Fear 1.9%
Disgusted 0.9%
Surprised 0.7%
Angry 0.7%
Confused 0.6%

AWS Rekognition

Age 38-46
Gender Female, 88.8%
Calm 97.5%
Happy 0.7%
Disgusted 0.4%
Surprised 0.4%
Sad 0.4%
Angry 0.3%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 40-48
Gender Male, 89.6%
Happy 96%
Calm 3.6%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0%
Angry 0%

AWS Rekognition

Age 41-49
Gender Female, 58.6%
Happy 46.8%
Calm 44.8%
Fear 2.3%
Surprised 2.1%
Sad 1.7%
Confused 1%
Disgusted 0.7%
Angry 0.4%

AWS Rekognition

Age 34-42
Gender Male, 96.5%
Happy 61.2%
Calm 12.9%
Disgusted 6.6%
Sad 6.3%
Surprised 5%
Fear 3.1%
Confused 3%
Angry 1.9%

AWS Rekognition

Age 37-45
Gender Female, 95.8%
Calm 95%
Surprised 3.5%
Sad 0.9%
Happy 0.2%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Female, 96.9%
Calm 87.9%
Happy 5.2%
Sad 3.9%
Surprised 1.1%
Fear 0.8%
Disgusted 0.5%
Angry 0.3%
Confused 0.3%

AWS Rekognition

Age 42-50
Gender Female, 62.4%
Happy 68.8%
Sad 14%
Calm 8.9%
Surprised 2.2%
Fear 2.2%
Angry 1.9%
Disgusted 1.4%
Confused 0.6%

AWS Rekognition

Age 38-46
Gender Male, 96.8%
Calm 94.4%
Happy 3.5%
Fear 1.1%
Disgusted 0.4%
Sad 0.2%
Confused 0.1%
Surprised 0.1%
Angry 0.1%

AWS Rekognition

Age 19-27
Gender Male, 59.3%
Happy 47.5%
Calm 22.7%
Sad 9%
Angry 7.2%
Fear 5.5%
Confused 3.7%
Disgusted 2.5%
Surprised 2.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98%
Person 97.1%
Person 96.3%
Person 94.9%
Person 94.6%
Person 94.6%
Person 94.6%
Person 94.1%
Person 93.9%
Person 93.5%
Person 92.4%
Person 90.9%
Person 82%
Person 81%
Person 44.6%

Text analysis

Amazon

TATISIV
٢٤
NO
٢٤ NO a TATISIV NWO TECRO
NWO
STAR
a
TECRO