Human Generated Data

Title

Untitled (Jersey Homesteads, New Jersey)

Date

1939

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3589

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Architecture 99.7
Building 99.7
Outdoors 99.7
Shelter 99.7
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Person 96.3
Adult 96.1
Male 96.1
Man 96.1
Person 96.1
Face 93
Head 93
Photography 85.3
Portrait 85.3
Window 76.4
People 74.3
Home Damage 67
Window - Broken 67
Nature 56.1
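
The label/confidence pairs above come from Amazon Rekognition's image-labeling service. As a rough illustration only (not the museum's actual pipeline), a minimal sketch of how comparable tags could be requested with the AWS SDK for Python (boto3); the file name, region, and thresholds are hypothetical placeholders:

    # Sketch only: retrieve object labels for a local image with Amazon Rekognition.
    # "shahn_jersey_homesteads.jpg" and the thresholds are illustrative assumptions.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("shahn_jersey_homesteads.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,        # cap the number of returned labels
        MinConfidence=50.0,  # drop low-confidence guesses
    )

    # Print "Label confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")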

Clarifai
created on 2018-05-10

people 99.8
adult 99.1
man 97.8
one 97.1
two 95.5
portrait 92.3
wear 92.1
group 92
home 90.8
woman 87.1
leader 86.5
administration 86.2
facial expression 85.4
window 83.8
vehicle 83.6
offense 80
war 78.5
group together 78
actor 76.8
furniture 76.7

Imagga
created on 2023-10-05

wall 36
old 35.5
building 35.3
architecture 31.2
window 30.4
door 27.1
house 25.9
window screen 19.7
stone 18.6
ancient 18.2
texture 17.4
aged 17.2
screen 16.8
structure 16.2
vintage 15.7
city 15
brick 14.8
wood 14.2
device 13.9
protective covering 13.8
exterior 13.8
grunge 13.6
weathered 13.3
construction 12.8
shop 12.5
barbershop 12.3
antique 12.1
detail 12.1
home 12
windowsill 11.9
sill 11.9
historic 11.9
dirty 11.8
air conditioner 11.6
entrance 11.6
wooden 11.4
glass 10.9
rustic 10.7
retro 10.6
travel 10.6
urban 10.5
structural member 10.2
street 10.1
rough 10
history 9.8
abandoned 9.8
outdoors 9.7
cooling system 9.4
mercantile establishment 9.1
classroom 9.1
paint 9.1
covering 9
brown 8.8
balcony 8.7
buildings 8.5
room 8.3
support 7.9
textured 7.9
doorway 7.9
prison 7.8
facade 7.8
ruins 7.8
cement 7.8
decay 7.7
architectural 7.7
damaged 7.6
grungy 7.6
tourism 7.4
town 7.4
man 7.4
person 7.4
mechanism 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 98.2
person 92.3

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Sad 83.3%
Confused 36.1%
Calm 16.7%
Surprised 6.6%
Angry 6.2%
Fear 6.2%
Disgusted 1.4%
Happy 0.4%

AWS Rekognition

Age 31-41
Gender Male, 92%
Sad 99.8%
Calm 26%
Surprised 6.5%
Fear 6.3%
Confused 1.1%
Angry 0.8%
Disgusted 0.5%
Happy 0.5%

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
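
The age, gender, and emotion estimates above are per-face outputs from the listed services. As a hedged sketch of how the AWS Rekognition rows in particular could be reproduced (again with a hypothetical file name, not the museum's workflow):

    # Sketch only: per-face age range, gender, and emotion estimates from Amazon Rekognition.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("shahn_jersey_homesteads.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, emotions, etc., not just bounding boxes
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")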

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
