Human Generated Data

Title

Untitled (London, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.201

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label and confidence score, %)

Amazon
created on 2023-10-06

Reading 100
Photography 100
Sitting 99.5
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99.1
Male 99.1
Boy 99.1
Child 99.1
Clothing 93
Footwear 93
Shoe 93
Shoe 90.4
Shoe 89.6
Coat 85.1
Photographer 83
Face 81.1
Head 81.1
Portrait 81.1
City 61.5
Shoe 59.3
Hat 59
Bus Stop 57.1
Outdoors 57.1
Electronics 57
Phone 57
Bench 56.7
Furniture 56.7
Road 56.2
Street 56.2
Urban 56.2
Hairdresser 56
Formal Wear 55.8
Suit 55.8
Transportation 55.2
Vehicle 55.2
Pants 55.1
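
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. The following is a minimal sketch of how such output could be produced with the boto3 client; the image file name and the thresholds are illustrative assumptions, not part of the museum record.

# Minimal sketch: label detection with AWS Rekognition via boto3.
# The file name and thresholds below are assumptions for illustration only.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_london_ohio.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=55.0,  # drop labels below ~55% confidence, roughly matching the list above
)

# Print "Label confidence" pairs, mirroring the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')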

Clarifai
created on 2018-05-11

people 100
child 99.1
group 98.5
two 98.3
three 97.3
adult 96.5
group together 94.8
administration 93.8
boy 93.5
four 93.4
man 93.2
five 92.7
wear 90.8
war 89.9
one 89.6
woman 89.6
education 88.1
school 87.4
several 86.1
home 85.5

Imagga
created on 2023-10-06

wall 19.7
child 18.9
man 18.8
building 18.7
city 18.3
old 18.1
people 17.3
adult 16.2
person 16.1
street 15.6
world 14.1
door 14
sill 12.9
male 12.3
tricycle 11.8
architecture 11.7
portrait 11.6
support 11.4
urban 11.4
lifestyle 10.8
wheeled vehicle 10.6
black 10.2
structural member 10
outdoor 9.9
boy 9.6
school 9.5
brick 9.5
youth 9.4
human 9
couple 8.7
prison 8.6
travel 8.4
house 8.4
vehicle 8.2
outdoors 8.2
teenager 8.2
stairs 7.9
parent 7.9
structure 7.9
ancient 7.8
device 7.7
statue 7.6
stone 7.6
mother 7.6
one 7.5
teen 7.3
shop 7.3
hair 7.1
face 7.1
love 7.1
newspaper 7.1
day 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 95.2
building 83.3

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 100%
Sad 100%
Surprised 6.3%
Fear 5.9%
Confused 1.3%
Calm 0.9%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 48-56
Gender Male, 100%
Sad 100%
Surprised 6.3%
Fear 6%
Confused 1.4%
Calm 0.4%
Disgusted 0.3%
Angry 0.2%
Happy 0.1%
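
The two AWS Rekognition blocks above report an estimated age range, a gender estimate, and per-emotion confidence scores for each detected face. Below is a minimal sketch of how such per-face attributes could be retrieved with the boto3 DetectFaces call; the image file name is an illustrative assumption.

# Minimal sketch: face attribute detection with AWS Rekognition via boto3.
# The file name is an assumption for illustration only.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_london_ohio.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    # Emotions come back with confidence scores, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')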

Microsoft Cognitive Services

Age 63
Gender Male

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%
Boy 99.1%
Child 99.1%
Shoe 93%
Hat 59%

Text analysis

Amazon

COIL
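
The "COIL" entry above is the kind of result returned by AWS Rekognition's DetectText operation. A minimal sketch follows; the image file name is an illustrative assumption.

# Minimal sketch: text detection with AWS Rekognition via boto3.
# The file name is an assumption for illustration only.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_london_ohio.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Print each detected line of text, e.g. the "COIL" tag listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])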