Human Generated Data

Title

Untitled (family of fourteen standing outside church under sign, "Bless The Lord")

Date

1933

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12985

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Apparel 100
Clothing 100
Person 99.8
Human 99.8
Person 99.2
Person 99.1
Person 99
Person 99
Person 98.6
Female 98.5
Dress 98.3
Person 98.2
Person 98.1
Person 97.7
Woman 94
Fashion 93.8
Robe 93.8
Person 93.8
Gown 93.7
Person 88
Wedding 87.5
Person 84.9
Wedding Gown 78.4
Suit 76.1
Overcoat 76.1
Coat 76.1
Bridegroom 73.8
Bride 73.7
Skirt 72.3
Person 68.9
People 60.5
Evening Dress 58.5
Photography 56.6
Photo 56.6
Portrait 56.6
Face 56.6
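
The list above pairs each label with a confidence score (0-100), which is the shape of output Amazon Rekognition's DetectLabels operation produces. A minimal sketch of how such tags might be generated with boto3; the file name and thresholds are illustrative placeholders, not part of this record:

# A minimal sketch, assuming boto3 is configured with AWS credentials and the
# photograph is available as a local JPEG; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("curtis_studio_untitled.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# the kind of list shown above (Apparel 100, Person 99.8, ...).
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')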

Imagga
created on 2022-02-05

man 22.2
people 21.2
male 17.7
spectator 17.7
person 16.7
musical instrument 16
old 14.6
silhouette 13.2
adult 12.8
stage 12.3
couple 12.2
group 12.1
black 12
history 11.6
marimba 11.3
architecture 10.9
business 10.9
clothing 10.5
art 10.5
percussion instrument 10.2
women 9.5
love 9.5
dark 9.2
catholic 9.2
life 9.1
groom 9.1
building 8.8
wind instrument 8.7
men 8.6
youth 8.5
city 8.3
light 8.2
symbol 8.1
businessman 7.9
world 7.9
scene 7.8
travel 7.7
platform 7.6
two 7.6
fashion 7.5
monument 7.5
tourism 7.4
church 7.4
brass 7.3
water 7.3
window 7.3
color 7.2
room 7.2
religion 7.2
night 7.1
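
Imagga's tagging service returns a similar tag-plus-confidence list. A minimal sketch against the public /v2/tags REST endpoint, assuming placeholder API credentials and a placeholder image URL:

# A minimal sketch, assuming an Imagga API key/secret and a publicly
# reachable image URL; all three values below are placeholders.
import requests

IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
image_url = "https://example.org/curtis_studio_untitled.jpg"

# The /v2/tags endpoint returns tags with confidence scores (0-100),
# matching the list above (man 22.2, people 21.2, ...).
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')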

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

person 97.6
clothing 97.4
text 96.6
dress 95.2
woman 92.7
outdoor 90.7
wedding dress 86.1
bride 74.3
wedding 64.5
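
The Microsoft tags above correspond to the Azure Computer Vision tagging operation, which reports confidences in the range 0-1 (shown here as percentages). A minimal sketch with the Python SDK; the endpoint, key, and image URL are placeholders:

# A minimal sketch, assuming the azure-cognitiveservices-vision-computervision
# package and an Azure Computer Vision resource; endpoint, key, and URL are
# placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

# tag_image returns tag names with confidences in 0-1; multiplying by 100
# gives percentages like those listed above (person 97.6, clothing 97.4, ...).
result = client.tag_image("https://example.org/curtis_studio_untitled.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")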

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 69.6%
Happy 49.5%
Calm 48.2%
Sad 1%
Surprised 0.5%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 76.9%
Calm 99.7%
Happy 0.2%
Disgusted 0%
Surprised 0%
Sad 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Calm 98.8%
Happy 0.7%
Sad 0.2%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 60.8%
Calm 57.9%
Happy 41.1%
Surprised 0.3%
Sad 0.2%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 41-49
Gender Female, 65.2%
Happy 97.2%
Calm 1.8%
Sad 0.6%
Fear 0.2%
Angry 0.1%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 38-46
Gender Male, 99.8%
Calm 97.2%
Fear 0.8%
Surprised 0.7%
Sad 0.5%
Happy 0.3%
Disgusted 0.3%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 35-43
Gender Male, 97.8%
Sad 52.1%
Happy 12.5%
Calm 11.9%
Fear 9.5%
Angry 4.5%
Surprised 4.2%
Confused 3.4%
Disgusted 1.9%
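
The age-range, gender, and emotion blocks above match the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch with boto3, assuming configured credentials and a placeholder file name:

# A minimal sketch; the file name is hypothetical. DetectFaces with
# Attributes=["ALL"] returns one entry per detected face, with an age range,
# a gender estimate, and per-emotion confidences as listed above.
import boto3

rekognition = boto3.client("rekognition")

with open("curtis_studio_untitled.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')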

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
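
Unlike Rekognition, the Google Vision blocks report per-face likelihood ratings (Very unlikely through Very likely) rather than numeric confidences. A minimal sketch of face detection with the google-cloud-vision client, assuming a service-account credential and a placeholder file name:

# A minimal sketch, assuming google-cloud-vision is installed and
# GOOGLE_APPLICATION_CREDENTIALS points at a service-account key;
# the file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("curtis_studio_untitled.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# face_detection returns one annotation per detected face, with likelihood
# enums (VERY_UNLIKELY ... VERY_LIKELY) for joy, sorrow, anger, surprise,
# headwear, and blur, as listed in the blocks above.
response = client.face_detection(image=image)
for face in response.face_annotations:
    print(
        "Joy", vision.Likelihood(face.joy_likelihood).name,
        "Sorrow", vision.Likelihood(face.sorrow_likelihood).name,
        "Anger", vision.Likelihood(face.anger_likelihood).name,
        "Surprise", vision.Likelihood(face.surprise_likelihood).name,
        "Headwear", vision.Likelihood(face.headwear_likelihood).name,
        "Blurred", vision.Likelihood(face.blurred_likelihood).name,
    )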

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people standing in front of a brick wall 84.7%
a group of people standing in front of a brick building 84.6%
a group of people standing in front of a building 84.5%
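
Ranked captions of this form come from the Azure Computer Vision describe operation, which returns several caption candidates with confidences. A minimal, self-contained sketch; the endpoint, key, and image URL are placeholders:

# A minimal sketch, assuming an Azure Computer Vision resource; endpoint,
# key, and URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

# describe_image returns ranked caption candidates with 0-1 confidences,
# like the three captions listed above.
result = client.describe_image(
    "https://example.org/curtis_studio_untitled.jpg",
    max_candidates=3,
)
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")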

Text analysis

Amazon

116
116 the
the
e
there
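
Short, partly repeated fragments like these are typical of Amazon Rekognition's DetectText output, which reports both line-level and word-level detections for any writing it finds in the image. A minimal sketch with boto3, assuming configured credentials and a placeholder file name:

# A minimal sketch; the file name is hypothetical. DetectText returns both
# LINE and WORD detections, which is why fragments such as "116", "116 the",
# and "the" can appear more than once above.
import boto3

rekognition = boto3.client("rekognition")

with open("curtis_studio_untitled.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))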