Human Generated Data

Title

Untitled (London, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2554

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Reading 100
Person 99.3
Boy 99.3
Child 99.3
Male 99.3
Person 99.2
Male 99.2
Adult 99.2
Man 99.2
Sitting 97.8
Clothing 92.7
Footwear 92.7
Shoe 92.7
Shoe 89.6
Shoe 79.9
Face 74.6
Head 74.6
Outdoors 63.2
Coat 61.5
Shoe 58.1
Photography 57.6
City 57.2
Portrait 56.2
Text 55.8
Bus Stop 55.6
Road 55.3
Street 55.3
Urban 55.3

Clarifai
created on 2018-05-10

people 100
child 99.2
group 98.6
two 97.8
three 97
adult 96.8
group together 96.3
boy 94.3
five 93.2
four 92.8
administration 92.8
man 92.7
wear 91.4
one 90.7
war 89.9
woman 88
several 87.9
education 87.4
home 87.4
school 86

Imagga
created on 2023-10-06

shop 24.8
barbershop 24.3
old 19.5
city 18.3
mercantile establishment 17.5
door 17
building 16.7
wall 16.2
man 16.1
child 16
tricycle 15
adult 14.9
person 14.7
people 13.9
world 13.2
wheeled vehicle 13.1
architecture 12.5
male 12.2
house 11.7
portrait 11.6
place of business 11.6
urban 11.4
vehicle 11
school 10.1
smile 10
black 9.6
happy 9.4
youth 9.4
outdoor 9.2
vintage 9.1
dirty 9
lifestyle 8.7
ancient 8.6
happiness 8.6
face 8.5
travel 8.4
stone 8.4
tourism 8.2
conveyance 8.2
one 8.2
sill 8.1
stairs 8
mother 7.9
parent 7.9
day 7.8
boy 7.8
sitting 7.7
entrance 7.7
statue 7.6
window 7.5
monument 7.5
style 7.4
street 7.4
teen 7.3
sliding door 7.2
family 7.1
kid 7.1
wooden 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 98.1
building 81.5

Face analysis

AWS Rekognition

Age 53-61
Gender Male, 99.8%
Sad 100%
Surprised 6.5%
Calm 6.3%
Fear 6%
Confused 1.8%
Angry 0.8%
Happy 0.5%
Disgusted 0.4%

AWS Rekognition

Age 50-58
Gender Male, 100%
Sad 99.9%
Confused 13.1%
Surprised 6.7%
Fear 6.4%
Calm 4.6%
Disgusted 4.6%
Angry 2.5%
Happy 0.6%

Microsoft Cognitive Services

Age 55
Gender Male

Microsoft Cognitive Services

Age 64
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Boy 99.3%
Child 99.3%
Male 99.3%
Adult 99.2%
Man 99.2%
Shoe 92.7%

Categories

Imagga

interior objects 97%
pets animals 2.5%

Text analysis

Amazon

CO