Human Generated Data

Title

Untitled (Mr. and Mrs. Kollar, Calumet, Pennsylvania)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2599

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Architecture 100
Building 100
House 100
Housing 100
Porch 100
Clothing 99.9
Coat 99.9
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Wood 96.1
Footwear 86.9
Shoe 86.9
Jeans 82.4
Pants 82.4
Face 81.7
Head 81.7
Shoe 79.6
Standing 76.8
Deck 60
Furniture 56.9
Home Decor 56.8
Hat 56.7
Shirt 56.6
Shorts 55.5
Plywood 55.5
Overcoat 55.2

Clarifai
created on 2018-05-10

people 100
group 99.2
adult 99
group together 98.8
administration 97.5
wear 96.6
leader 96.4
military 96
man 95.7
war 92.4
several 91.5
woman 90.1
soldier 89.5
vehicle 89.5
many 88.9
outfit 88.7
four 88.3
uniform 87.8
two 86.7
home 86.6

Imagga
created on 2023-10-06

old 26.5
building 20.4
structure 18.6
city 16.6
billboard 16.3
architecture 15
world 14.3
travel 14.1
people 13.9
man 13.4
house 13.4
signboard 13.2
ancient 13
room 12.8
door 12.2
person 11.5
street 11
statue 10.8
religion 10.8
male 10.7
window 10.4
home 10.4
black 10.2
stone 10.1
adult 9.8
outdoors 9.7
urban 9.6
sculpture 9.6
antique 9.5
wall 9.5
step 9.3
historic 9.2
vintage 9.1
tourism 9.1
portrait 9.1
history 8.9
brick 8.6
monument 8.4
wood 8.3
kin 8.2
classroom 8.2
landmark 8.1
family 8
wooden 7.9
couple 7.8
scene 7.8
men 7.7
grunge 7.7
weathered 7.6
fashion 7.5
life 7.4
aged 7.2
dirty 7.2
stairs 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

man 94.8
standing 91.7
person 88.2
posing 61.7
old 55.4

Color Analysis

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 99.2%
Angry 77.2%
Calm 13.7%
Surprised 6.4%
Fear 6%
Sad 4.2%
Disgusted 3.3%
Confused 0.3%
Happy 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.8%
Calm 58.1%
Angry 37.2%
Surprised 6.8%
Fear 6.1%
Sad 2.6%
Confused 1.2%
Happy 0.4%
Disgusted 0.2%

Microsoft Cognitive Services

Age 64
Gender Male

Microsoft Cognitive Services

Age 56
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Shoe 86.9%
Jeans 82.4%
