Human Generated Data

Title

Untitled (crowd on steps outside of church)

Date

1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.432.11

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Clothing 99.9
Apparel 99.9
Person 99.2
Human 99.2
Person 98.5
Person 98.3
Footwear 97.6
Shoe 97.6
Person 97.6
Person 97.6
Person 96.8
Person 95.6
Person 95.5
Person 95.4
Overcoat 93.7
Tie 93.3
Accessories 93.3
Accessory 93.3
Tie 93.3
Shoe 92.4
Hat 89.7
Shoe 84.9
Shoe 84.5
Shoe 84.3
Suit 81.6
Shoe 81.6
People 79.5
Shoe 73
Military 71.7
Military Uniform 71.7
Coat 71.1
Officer 65.9
Crowd 62.5
Shorts 58.6
Tuxedo 58.4
Shoe 55.5
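
The label/confidence pairs above have the shape of Amazon Rekognition label detection output (a label name plus a 0-100 confidence score). A minimal sketch of how comparable tags could be generated with boto3 is below; the file name and the MinConfidence cutoff are assumptions for illustration, not values recorded with this object.

import boto3

IMAGE_PATH = "steinmetz_crowd_on_steps.jpg"  # hypothetical local copy of the photograph

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # return at most 50 labels
        MinConfidence=55.0,  # assumed threshold; the lowest score listed above is 55.5
    )

# Print "Name Confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")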

Clarifai
created on 2019-03-25

people 100
group 99.4
many 99.2
group together 99
adult 98.6
administration 97.8
leader 97.2
woman 95
man 94.3
wear 94.3
several 93.2
outfit 88.4
military 87.9
veil 85.9
war 83.7
vehicle 82.9
five 81.2
child 81
chair 76.4
uniform 76

Imagga
created on 2019-03-25

people 27.3
city 21.6
man 20.2
clothing 20.2
business 17.6
person 17.3
adult 16.4
male 14.9
men 14.6
group 14.5
statue 13.4
women 13.4
brass 13.4
urban 13.1
travel 12
wind instrument 11.5
musical instrument 11.4
building 11.4
tourism 10.7
crowd 10.5
old 10.4
walking 10.4
monument 10.3
history 9.8
fashion 9.8
window 9.4
clothes 9.4
academic gown 9.3
street 9.2
silhouette 9.1
human 9
black 9
cornet 8.9
stage 8.9
businessman 8.8
soldier 8.8
military 8.7
scene 8.6
sculpture 8.6
architecture 8.6
walk 8.6
suit 8.6
life 8.3
historic 8.2
covering 8.1
transportation 8.1
gown 8
corporate 7.7
uniform 7.6
tourist 7.5
attendant 7.4
vacation 7.4
new 7.3
dress 7.2
landmark 7.2
interior 7.1

Google
created on 2019-03-25

Photograph 96.4
People 94.3
Standing 89.2
Gentleman 85.6
Snapshot 84.9
Vintage clothing 79.2
History 64.5
Family 58
Classic 56.7
Uniform 52.8
Retro style 51.3
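
The Google tags follow the same description-plus-score pattern and were plausibly produced by Cloud Vision label detection. A hedged sketch with the google-cloud-vision client is below; the client returns scores on a 0-1 scale, so they are multiplied by 100 to match the listing, and the file name is assumed.

from google.cloud import vision

IMAGE_PATH = "steinmetz_crowd_on_steps.jpg"  # hypothetical local copy of the photograph

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Each annotation carries a description and a 0-1 score.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")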

Microsoft
created on 2019-03-25

person 100
outdoor 99
people 96.4
group 88.2
standing 85.3
posing 55.1
street 29.4
black and white 10.2
retro 8.2

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 54.8%
Happy 54.8%
Confused 45%
Disgusted 45%
Angry 45.1%
Calm 45%
Surprised 45%
Sad 45%

AWS Rekognition

Age 26-43
Gender Female, 54.8%
Angry 45.6%
Disgusted 45.2%
Confused 45.3%
Surprised 45.3%
Sad 47.7%
Happy 50.4%
Calm 45.5%

AWS Rekognition

Age 20-38
Gender Male, 51.7%
Happy 45.2%
Confused 45.4%
Sad 49.8%
Calm 46%
Surprised 45.5%
Angry 46.9%
Disgusted 46.2%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Confused 45.1%
Sad 45.2%
Calm 45%
Happy 53.2%
Surprised 45.1%
Angry 45.1%
Disgusted 46.4%

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Happy 50.6%
Calm 45.1%
Sad 46.1%
Angry 45.9%
Confused 45.1%
Surprised 45.4%
Disgusted 46.7%

AWS Rekognition

Age 26-43
Gender Female, 50.6%
Happy 45.9%
Disgusted 45.7%
Calm 48.1%
Sad 48.1%
Surprised 45.6%
Confused 45.5%
Angry 46.1%

AWS Rekognition

Age 26-43
Gender Female, 54.3%
Sad 46.8%
Confused 45.6%
Angry 45.5%
Surprised 45.2%
Calm 45.4%
Happy 47.8%
Disgusted 48.6%

AWS Rekognition

Age 26-43
Gender Female, 51.1%
Angry 45.6%
Happy 45.7%
Sad 51.9%
Confused 45.2%
Disgusted 45.2%
Calm 46.1%
Surprised 45.2%

AWS Rekognition

Age 20-38
Gender Male, 54.6%
Disgusted 46.2%
Confused 45.7%
Surprised 46.3%
Angry 46.7%
Happy 45.1%
Sad 46.1%
Calm 48.9%
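
The age range, gender, and per-emotion percentages in the blocks above match the structure of Amazon Rekognition face detection when all facial attributes are requested. A minimal, hedged sketch follows; the file name is an assumption and the printed ordering of emotions will vary per face.

import boto3

IMAGE_PATH = "steinmetz_crowd_on_steps.jpg"  # hypothetical local copy of the photograph

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

# Print one block per detected face, mirroring the layout above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")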

Microsoft Cognitive Services

Age 42
Gender Female

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
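
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the two blocks above read "Very unlikely" and "Very likely". A hedged sketch with the google-cloud-vision client, again assuming a local copy of the image:

from google.cloud import vision

IMAGE_PATH = "steinmetz_crowd_on_steps.jpg"  # hypothetical local copy of the photograph

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes likelihood enums for several attributes.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)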

Feature analysis

Amazon

Person 99.2%
Shoe 97.6%
Tie 93.3%
Hat 89.7%
Coat 71.1%

Categories