Human Generated Data

Title

Untitled (Forest Hills, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1268

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99
Male 99
Man 99
Person 99
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Clothing 96
People 87.1
Footwear 84.4
Shoe 84.4
Face 79.7
Head 79.7
Outdoors 66.9
Shoe 63.2
Hat 62.8
Garden 57.5
Nature 57.5
Firearm 57
Gun 57
Rifle 57
Weapon 57
Fence 56.2
Photography 55.4
Gardener 55.1
Gardening 55.1
Officer 55.1
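
The Amazon labels above, with their confidence scores, are the kind of output returned by Amazon Rekognition's DetectLabels operation; the repeated Adult/Male/Man/Person entries correspond to separate detected figures. A minimal sketch of such a call, assuming boto3 is installed, AWS credentials are configured, and the photograph is saved locally as image.jpg (a hypothetical filename):

import boto3

client = boto3.client("rekognition")

# Read the image from disk and send the raw bytes to DetectLabels.
with open("image.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the floor of the scores listed above
    )

# Print one "Label Confidence" line per detected label, as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")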

Clarifai
created on 2018-05-11

people 100
group 99.2
group together 98.9
adult 98.8
many 98.1
administration 95.9
man 95.7
military 94.8
wear 92.5
leader 91
war 89.7
several 88
soldier 86.2
woman 84.2
uniform 82
outfit 79.9
street 75.8
military uniform 73.8
vehicle 72.9
home 71.5
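
The Clarifai concepts above resemble predictions from Clarifai's general image-recognition model. A rough sketch against the v2 REST endpoint, assuming a valid API key; the key, model name, and filename below are placeholders:

import base64
import requests

with open("image.jpg", "rb") as f:
    image_b64 = base64.b64encode("utf-8")

# "general-image-recognition" is Clarifai's public general model;
# treat the exact model id and URL as assumptions.
response =
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores are 0-1; scale to match the 0-100 values above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")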

Imagga
created on 2023-10-05

sax 75.6
wind instrument 46.9
brass 36.3
musical instrument 26.2
trombone 25.7
man 22.8
people 16.7
male 15.6
sport 14.9
person 14.6
grunge 13.6
black 13.2
outdoor 13
drawing 12.8
old 12.5
bass 11.7
winter 11.1
dark 10.8
symbol 10.1
leisure 10
dirty 9.9
adult 9.8
design 9.6
men 9.4
vintage 9.1
danger 9.1
park 9
sunset 9
sky 8.9
play 8.6
travel 8.4
portrait 8.4
summer 8.4
silhouette 8.3
landscape 8.2
snow 8.1
water 8
boy 7.8
art 7.8
player 7.8
cold 7.7
run 7.7
outside 7.7
beach 7.6
sign 7.5
human 7.5
active 7.3
business 7.3
sun 7.2
history 7.2
athlete 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
person 98.7
old 93.8
group 72.8
people 55.4
crowd 0.6

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 80.6%
Surprised 7.8%
Sad 7.8%
Fear 6.2%
Angry 2.7%
Confused 1.6%
Disgusted 1.5%
Happy 0.6%

AWS Rekognition

Age 48-54
Gender Male, 100%
Sad 100%
Surprised 6.4%
Fear 6.2%
Confused 1.7%
Angry 1.2%
Disgusted 1.2%
Calm 0.6%
Happy 0.2%

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 72.9%
Confused 19.4%
Surprised 6.6%
Fear 6%
Sad 3.3%
Disgusted 1.6%
Angry 1.5%
Happy 0.4%

AWS Rekognition

Age 29-39
Gender Male, 100%
Surprised 99.6%
Fear 5.9%
Calm 3.3%
Sad 2.2%
Angry 0.6%
Confused 0.3%
Disgusted 0.3%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Angry 78.4%
Calm 10.1%
Surprised 6.8%
Fear 6.2%
Disgusted 5.3%
Sad 2.7%
Confused 1.6%
Happy 1.2%

AWS Rekognition

Age 30-40
Gender Female, 94.9%
Calm 66.9%
Disgusted 22.7%
Surprised 6.6%
Fear 6%
Angry 4.3%
Sad 3.6%
Confused 1%
Happy 0.3%
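
Each AWS Rekognition block above describes one face returned by the DetectFaces operation when full attributes are requested. A minimal sketch under the same assumptions as the earlier Rekognition example (boto3, configured credentials, a hypothetical local image.jpg):

import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as e.g. {"Type": "CALM", "Confidence": 80.6}.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")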

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 53
Gender Male

Microsoft Cognitive Services

Age 60
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
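
The Google Vision blocks above report likelihood buckets (Very unlikely through Very likely) per detected face rather than numeric scores. A minimal sketch of the corresponding face-detection call, assuming the google-cloud-vision client library and application default credentials are set up, again with a hypothetical local image.jpg:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face carries enum likelihoods such as VERY_UNLIKELY or VERY_LIKELY,
# matching the buckets shown above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)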

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Shoe 84.4%
Hat 62.8%