Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3688

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (each label is followed by its confidence score in %)

Amazon
created on 2023-10-05

Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Furniture 81.7
Sleeping 78.1
Outdoors 73.8
Home Decor 72.3
Back 57.8
Body Part 57.8
Cushion 57.8
Finger 57.7
Hand 57.7
Plant 56.7
Vegetation 56.7
Pillow 56.4
Nature 56.4
Fashion 56.3
Crib 56.1
Infant Bed 56.1
Bed 56.1
Architecture 55.7
Building 55.7
Shelter 55.7
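
The Amazon labels above follow the output shape of the AWS Rekognition DetectLabels API: a label name paired with a confidence score in percent. As a minimal sketch only, assuming the photograph is available as a local JPEG and AWS credentials are configured (the file name and the MinConfidence cutoff of 55 are illustrative assumptions, not part of this record), comparable labels could be requested with boto3:

import boto3

# Sketch: request labels comparable to the list above.
# The file name below is a placeholder, not the actual museum asset.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_new_york_city.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly matches the lowest scores shown above (~55-56)
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')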

Clarifai
created on 2018-05-10

people 99.5
street 98.3
adult 95.5
one 94.6
monochrome 94.2
man 93.8
wear 91.5
sleep 90.7
boy 87
black and white 86.3
recreation 86.1
group 86
portrait 85.7
old 85.6
two 84.6
city 83.7
child 82.4
woman 82.2
urban 82.1
music 80.5

Imagga
created on 2023-10-05

freight car 100
car 100
wheeled vehicle 100
vehicle 76.9
conveyance 38.4
garden 14.2
color 12.8
light 12
water 12
leaf 11.7
plant 11.4
outdoors 11.2
leaves 10.5
office 10.4
summer 9.6
black 9.6
plants 9.3
house 9.2
business 9.1
digital 8.9
tree 8.6
travel 8.4
design 8.4
modern 8.4
style 8.1
computer 8
motion 7.7
outside 7.7
sky 7.6
dirt 7.6
technology 7.4
environment 7.4
food 7.2
home 7.2
interior 7.1
decor 7.1
spring 7.1
architecture 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

window 81

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Male, 99.6%
Sad 99.8%
Calm 27.6%
Surprised 6.5%
Fear 6%
Angry 1.4%
Confused 0.6%
Disgusted 0.5%
Happy 0.2%
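
The age range, gender, and emotion scores above correspond to the AgeRange, Gender, and Emotions fields returned by the AWS Rekognition DetectFaces API when all facial attributes are requested. A hedged sketch under the same assumptions as before (placeholder local file name, configured AWS credentials):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Attributes=["ALL"] includes AgeRange, Gender, and Emotions in the response.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')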

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%

Categories

Captions