Human Generated Data

Title

Untitled (Red House, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1188

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Formal Wear 100
Suit 100
People 100
Architecture 99.2
Building 99.2
Outdoors 99.2
Shelter 99.2
Person 99
Person 99
Adult 99
Male 99
Man 99
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Person 98.7
Person 98.4
Person 98.3
Person 98.1
Male 98.1
Boy 98.1
Child 98.1
Person 97.5
Person 96.4
Adult 96.4
Male 96.4
Man 96.4
Person 96.2
Person 96
Person 94.9
Person 93.8
Person 93.5
Male 93.5
Boy 93.5
Child 93.5
Person 91.4
Ball 89.4
Rugby 89.4
Rugby Ball 89.4
Sport 89.4
Footwear 89.3
Shoe 89.3
Shoe 85.9
Face 84.4
Head 84.4
Shoe 84.1
Photography 82.9
Portrait 82.7
Shoe 82.7
Shorts 80.5
Shoe 79.9
Person 78.6
Shoe 78.5
Coat 77.9
Jacket 77.3
Person 74.6
Shoe 73.3
Person 73.2
Nature 73
Shoe 70.3
Dress 68.6
Machine 65.6
Wheel 65.6
Person 60.7
City 60.7
Shoe 59.6
Housing 57.9
Pants 57.7
Hat 57.7
Urban 57.7
Shirt 57.5
Overcoat 57
Road 56.7
Street 56.7
Tuxedo 56.4
Crowd 56.4
Skirt 56.3
Bus Stop 55.7
Accessories 55.5
Bag 55.5
Handbag 55.5
House 55.5
Countryside 55.4
Hut 55.4
Rural 55.4
Shoe 55.3
Tie 55.1
Neighborhood 55.1
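
The label/confidence pairs above are typical of the AWS Rekognition label-detection API. A minimal sketch of such a call, assuming boto3 is installed, AWS credentials are configured, and the image is saved locally under the hypothetical name "photo.jpg" (the MinConfidence cutoff of 55 is inferred from the list above, not confirmed):

    import boto3

    # Assumes AWS credentials and a default region are already configured.
    client = boto3.client("rekognition")

    # "photo.jpg" is a placeholder filename for a local copy of the image.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the tag list above bottoms out around 55%
        )

    # Each returned label carries a name and a confidence score,
    # which is the format of the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))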

Clarifai
created on 2018-05-11

people 100
group 99.6
many 99.4
group together 98.9
child 97.7
adult 96.4
woman 96.1
administration 93.7
man 91.9
war 88.5
several 88.2
boy 86.4
military 83.8
leader 81.3
crowd 80.5
wear 80
street 78.1
recreation 77.3
vehicle 74.1
five 73.3

Imagga
created on 2023-10-06

kin 44.3
man 24.2
people 21.7
old 18.1
person 15.7
male 15.7
ancient 14.7
uniform 14
city 13.3
soldier 12.7
world 12.2
men 12
black 12
travel 12
street 12
army 11.7
group 11.3
room 10.9
history 10.7
building 10.7
military 10.6
business 10.3
vintage 9.9
family 9.8
scene 9.5
statue 9.5
monument 9.3
child 9.2
tourism 9.1
war 8.7
antique 8.6
sculpture 8.6
architecture 8.6
culture 8.5
military uniform 8.3
clothing 8.2
danger 8.2
landmark 8.1
art 7.9
urban 7.9
boy 7.8
walking 7.6
happy 7.5
traditional 7.5
adult 7.5
mother 7.5
symbol 7.4
historic 7.3
girls 7.3
protection 7.3
aged 7.2
dirty 7.2
home 7.2
portrait 7.1
women 7.1
interior 7.1
weapon 7.1
businessman 7.1
spectator 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
outdoor 98.8
group 91.9
people 89
standing 82.4
posing 70.9
crowd 1.1

Color Analysis

Face analysis

AWS Rekognition

Age 10-18
Gender Female, 100%
Calm 78.3%
Happy 15%
Surprised 6.8%
Fear 6%
Sad 2.4%
Confused 2.3%
Angry 1.2%
Disgusted 1%

AWS Rekognition

Age 7-17
Gender Male, 60.5%
Happy 98.2%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Calm 0.8%
Angry 0.3%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 6-14
Gender Female, 99%
Calm 95.6%
Surprised 6.4%
Fear 5.9%
Sad 2.9%
Disgusted 0.9%
Angry 0.4%
Happy 0.3%
Confused 0.3%

AWS Rekognition

Age 26-36
Gender Female, 84%
Calm 76%
Surprised 6.6%
Fear 6.1%
Angry 5.9%
Confused 4.7%
Sad 4.6%
Happy 3.5%
Disgusted 3.1%

AWS Rekognition

Age 48-56
Gender Male, 100%
Calm 77.5%
Sad 27.8%
Surprised 6.5%
Fear 5.9%
Confused 1%
Angry 0.4%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 6-14
Gender Female, 100%
Angry 80%
Sad 10.2%
Calm 6.6%
Surprised 6.5%
Fear 6%
Confused 0.5%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 19-27
Gender Female, 89.4%
Happy 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 23-31
Gender Female, 99.7%
Sad 100%
Calm 8.4%
Surprised 6.3%
Fear 6%
Angry 0.8%
Disgusted 0.2%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 6-14
Gender Female, 90.2%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 18-24
Gender Male, 58.4%
Calm 97%
Surprised 6.3%
Fear 6%
Sad 2.3%
Happy 1.1%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 26-36
Gender Male, 99.4%
Calm 84.7%
Sad 9.6%
Surprised 6.5%
Fear 6.1%
Confused 1.4%
Disgusted 1.1%
Happy 0.4%
Angry 0.4%

AWS Rekognition

Age 6-14
Gender Female, 99.7%
Sad 99.3%
Calm 34.9%
Surprised 6.5%
Fear 6%
Confused 2.4%
Disgusted 0.5%
Angry 0.3%
Happy 0.3%

AWS Rekognition

Age 21-29
Gender Female, 98.8%
Sad 97%
Fear 54%
Surprised 6.5%
Disgusted 1.9%
Calm 1.2%
Angry 0.7%
Confused 0.6%
Happy 0.2%

AWS Rekognition

Age 21-29
Gender Female, 59.1%
Calm 67.3%
Sad 18.9%
Happy 12.2%
Surprised 7.3%
Fear 6.2%
Confused 0.6%
Disgusted 0.4%
Angry 0.3%
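
The age-range, gender, and emotion estimates in the blocks above follow the shape of AWS Rekognition's face-detection response. A hedged sketch of how such per-face attributes can be read out, again assuming boto3, configured credentials, and a local placeholder file "photo.jpg":

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]    # e.g. {"Low": 10, "High": 18}
        gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 100.0}
        # Emotion scores are reported as independent confidences, so they
        # need not sum to 100% (as the figures above show).
        emotions = sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True)
        print(age, gender["Value"],
              emotions[0]["Type"], round(emotions[0]["Confidence"], 1))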

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 10
Gender Female

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 20
Gender Male

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 35
Gender Male

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
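
The per-face likelihood buckets above (Very unlikely through Very likely) match the Google Cloud Vision face-detection response, which reports attributes as enum likelihoods rather than percentages. A minimal sketch, assuming the google-cloud-vision client library (v2+), application default credentials, and the same placeholder file "photo.jpg":

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
        print(
            "joy:", vision.Likelihood(face.joy_likelihood).name,
            "sorrow:", vision.Likelihood(face.sorrow_likelihood).name,
            "headwear:", vision.Likelihood(face.headwear_likelihood).name,
            "blurred:", vision.Likelihood(face.blurred_likelihood).name,
        )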

Feature analysis

Amazon

Person 99%
Adult 99%
Male 99%
Man 99%
Boy 98.1%
Child 98.1%
Rugby Ball 89.4%
Shoe 89.3%
Coat 77.9%
Wheel 65.6%
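
The object-level percentages under "Feature analysis" appear to come from the same Rekognition label response sketched earlier: categories such as Person, Shoe, or Wheel can also carry per-instance detections with relative bounding boxes. A short sketch under the same assumptions (boto3, configured credentials, placeholder filename):

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder filename, as above
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

    # Labels may include per-instance detections with bounding boxes.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(label["Name"],
                  round(instance["Confidence"], 1),
                  instance["BoundingBox"])  # relative Left/Top/Width/Height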

Categories