Human Generated Data

Title

Untitled (crowd gathered at Maryland Hunt Cup race, Maryland)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11664

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Clothing 99.4
Apparel 99.4
Person 99.1
Person 98.3
Person 98
Person 93.5
Crowd 87.8
Person 86.1
Person 78.6
Person 77
Shorts 76.5
Person 75.7
People 75.5
Tent 74.4
Coat 73.6
Festival 69.9
Female 65
Person 64.1
Costume 61.7
Face 59.9
Overcoat 56.2
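
Scores like "Person 99.7" in the list above are label-detection confidences. As a rough sketch (not the museum's actual pipeline), such tags could be produced with AWS Rekognition's DetectLabels API; the image filename below is hypothetical, and AWS credentials are assumed to be configured:

```python
def top_labels(response, min_confidence=50.0):
    """Flatten a DetectLabels response into (name, confidence) pairs,
    sorted by descending confidence, dropping low-confidence labels."""
    pairs = [(label["Name"], round(label["Confidence"], 1))
             for label in response["Labels"]
             if label["Confidence"] >= min_confidence]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

if __name__ == "__main__":
    import boto3  # AWS SDK; requires configured credentials
    client = boto3.client("rekognition")
    # Hypothetical local filename for the Steinmetz photograph
    with open("steinmetz_hunt_cup.jpg", "rb") as f:
        resp = client.detect_labels(Image={"Bytes": f.read()},
                                    MinConfidence=50)
    for name, conf in top_labels(resp):
        print(name, conf)
```

Note that the same object can appear under several labels (e.g. both "Person" and "Human"), since each label is scored independently.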

Imagga
created on 2022-01-15

cemetery 46.6
statue 28.9
religion 25.1
stone 24.5
old 23.7
sculpture 23.2
architecture 21.9
memorial 20.7
gravestone 19.8
history 19.7
monument 16.8
god 16.3
church 15.7
art 15.1
ancient 14.7
religious 14
tourism 14
structure 13.7
travel 13.4
building 12.4
historical 12.2
man 12.1
antique 11.2
scene 11.2
famous 11.2
culture 11.1
historic 11
cathedral 10.8
people 10.6
temple 10.4
catholic 10.2
vintage 9.9
clothing 9.8
person 9.7
holy 9.6
saint 9.6
faith 9.6
traditional 9.1
city 9.1
marimba 9
world 8.8
marble 8.7
spirituality 8.6
spiritual 8.6
shelter 8.5
tourist 8.4
protection 8.2
landmark 8.1
love 7.9
pray 7.8
cross 7.5
symbol 7.4
percussion instrument 7.4
decoration 7.4
peace 7.3
new 7.3
detail 7.2
dress 7.2

Google
created on 2022-01-15

Photograph 94.2
Coat 90.5
Hat 89.2
Adaptation 79.3
Suit 77.3
Vintage clothing 76.2
Monochrome 75.9
Snapshot 74.3
Monochrome photography 73.8
Event 72.7
Font 72.4
Classic 71.9
Uniform 70
Art 69.5
Plant 68.3
History 67
Stock photography 64.7
Tree 64
Pole 62.9
Photo caption 61.8

Microsoft
created on 2022-01-15

text 93.9
clothing 92
person 88.4
man 75.1
grave 71.9
funeral 68.2
black and white 63.6
cemetery 52.3

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 71.5%
Calm 99.7%
Sad 0.2%
Happy 0%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 95.3%
Sad 88.3%
Fear 6.6%
Calm 2.8%
Confused 1%
Disgusted 0.4%
Angry 0.3%
Happy 0.3%
Surprised 0.2%

AWS Rekognition

Age 27-37
Gender Female, 64.9%
Calm 71.4%
Sad 10.5%
Happy 4.7%
Angry 4.6%
Surprised 3.9%
Disgusted 2.4%
Fear 2.3%
Confused 0.2%

AWS Rekognition

Age 48-56
Gender Male, 98.8%
Calm 94.2%
Fear 2.2%
Confused 2.1%
Sad 0.5%
Disgusted 0.3%
Happy 0.3%
Angry 0.2%
Surprised 0.1%

AWS Rekognition

Age 23-31
Gender Male, 81.1%
Calm 51.3%
Sad 16%
Happy 11.9%
Angry 9.9%
Disgusted 3.8%
Confused 3.6%
Fear 2.2%
Surprised 1.2%
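
Each AWS Rekognition face record above combines an estimated age range, a gender guess with its confidence, and a full emotion distribution. A minimal sketch of summarizing one `FaceDetail` entry from a DetectFaces response (called with `Attributes=["ALL"]`; the parsing helper is an illustration, not the museum's code):

```python
def summarize_face(detail):
    """Condense one Rekognition FaceDetail dict into a single line:
    age range, gender with confidence, and the top-scoring emotion."""
    age = detail["AgeRange"]
    gender = detail["Gender"]
    top = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return "Age {}-{}, Gender {} ({:.1f}%), {} {:.1f}%".format(
        age["Low"], age["High"],
        gender["Value"], gender["Confidence"],
        top["Type"].capitalize(), top["Confidence"])

if __name__ == "__main__":
    import boto3  # AWS SDK; requires configured credentials
    client = boto3.client("rekognition")
    # Hypothetical local filename for the photograph
    with open("steinmetz_hunt_cup.jpg", "rb") as f:
        resp = client.detect_faces(Image={"Bytes": f.read()},
                                   Attributes=["ALL"])
    for face in resp["FaceDetails"]:
        print(summarize_face(face))
```

The emotion percentages within one face sum to roughly 100, so reporting only the top emotion (as the listing above effectively does with "Calm 99.7%") discards the rest of the distribution.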

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Tent 74.4%

Captions

Microsoft

text 13.5%

Text analysis

Amazon

9703
9903.
9703.
A70A

Google

9703.
9703.
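
The OCR tokens above (likely a negative or frame number visible in the print) are the kind of output returned by a text-detection endpoint. As a hedged sketch using AWS Rekognition's DetectText API (hypothetical filename; credentials assumed), word-level detections could be extracted like this:

```python
def detected_words(response):
    """Return WORD-level text detections from a DetectText response
    as (text, confidence) pairs; LINE-level entries are skipped."""
    return [(d["DetectedText"], round(d["Confidence"], 1))
            for d in response["TextDetections"]
            if d["Type"] == "WORD"]

if __name__ == "__main__":
    import boto3  # AWS SDK; requires configured credentials
    client = boto3.client("rekognition")
    # Hypothetical local filename for the photograph
    with open("steinmetz_hunt_cup.jpg", "rb") as f:
        resp = client.detect_text(Image={"Bytes": f.read()})
    for text, conf in detected_words(resp):
        print(text, conf)
```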