Human Generated Data

Title

Untitled (North Carolina)

Date

May 1946-August 1946

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4311.2

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Floor 100
Flooring 100
Person 99.3
Person 99.2
Boy 99.2
Child 99.2
Male 99.2
Furniture 97.3
Person 95.1
Chair 88.8
Face 82.4
Head 82.4
Architecture 72.3
Building 72.3
Housing 72.3
Accessories 71.2
Bag 71.2
Handbag 71.2
House 70.4
Porch 70.4
Clothing 69.5
Footwear 69.5
Shoe 69.5
Outdoors 69.3
Sitting 68.3
Wood 64.7
Shoe 61.7
Dining Table 57.6
Table 57.6
Armchair 56.8
Indoors 56.5
Hospital 56.5
Photography 56.3
Interior Design 56.3
Dining Room 55.9
Room 55.9
Hat 55.8
Door 55.8
Living Room 55.5
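
The Amazon tags above are label-detection output, each paired with a confidence score. As a rough illustration only (not part of the museum record), here is a minimal sketch of how comparable labels can be requested from AWS Rekognition with boto3; the image file name and the threshold values are assumptions, not taken from the record.

import boto3

rekognition = boto3.client("rekognition")
# Hypothetical local copy of the photograph (placeholder file name).
with open("P1970_4311_2.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the list above bottoms out around 55%
    )
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))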

Clarifai
created on 2018-05-09

people 99.9
adult 97.5
woman 97
child 96.9
one 96.3
man 95.8
group 94.9
two 94.7
step 94.2
group together 92.6
indoors 90.9
street 90.6
monochrome 90.2
wear 88
furniture 87.7
administration 86.9
room 86.2
boy 86.1
sit 85.3
war 83.6

Imagga
created on 2023-10-06

shopping cart 62
handcart 49
wheeled vehicle 36.8
container 23
chair 21
window 19.6
dishwasher 18.7
light 17.4
white goods 17
device 15.7
architecture 15.6
home appliance 15.1
old 14.6
metal 14.5
house 14.2
wall 13.8
building 13.6
cell 13.6
home 13.6
interior 13.3
black 13.2
dark 12.5
conveyance 12.3
man 12.1
technology 11.9
room 11.8
appliance 11.3
equipment 10.4
inside 10.1
wood 10
seat 9.7
style 8.9
steel 8.8
glass 8.6
male 8.5
computer 8.3
person 8.3
one 8.2
history 8
working 8
television 7.8
travel 7.7
empty 7.7
broken 7.7
windows 7.7
fashion 7.5
city 7.5
door 7.5
structure 7.4
security 7.3
design 7.3
business 7.3
protection 7.3
industrial 7.3
people 7.2

Google
created on 2018-05-09

Microsoft
created on 2018-05-09

white 60

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 99.6%
Calm 87.2%
Sad 9.6%
Surprised 6.4%
Fear 6%
Angry 0.5%
Confused 0.3%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 23-31
Gender Female, 56.6%
Sad 90.9%
Calm 44.6%
Fear 6.9%
Happy 6.6%
Surprised 6.5%
Confused 1.2%
Disgusted 0.8%
Angry 0.7%
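
The two face-analysis blocks above (age range, gender, and per-emotion confidences) are AWS Rekognition face-detection output. A hedged sketch of how similar results can be obtained with boto3 follows; the file name is a placeholder and the request parameters are assumptions.

import boto3

rekognition = boto3.client("rekognition")
# Hypothetical local copy of the photograph (placeholder file name).
with open("P1970_4311_2.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}, Gender {face['Gender']['Value']}")
    for emotion in face["Emotions"]:
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")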

Feature analysis

Amazon

Person 99.3%
Boy 99.2%
Child 99.2%
Male 99.2%
Handbag 71.2%
Shoe 69.5%

Categories

Text analysis

Amazon

and
University
of
College
Art
Fellows
(Harvard
Museums)
© President and Fellows of Harvard College (Harvard University Art Museums)
Harvard
President
P1970.4311.0002
©
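
The Amazon text-analysis entries above (individual word tokens plus the full copyright stamp) are typical of OCR-style text detection, which reports both LINE and WORD items. A sketch, under the assumption that output of this kind comes from AWS Rekognition's detect_text call; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")
# Hypothetical local copy of the photograph (placeholder file name).
with open("P1970_4311_2.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})
for detection in response["TextDetections"]:
    # LINE entries give the full stamp; WORD entries give individual tokens like those listed above.
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))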

Google

Harvard
College
(Harvard
Art
O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4311.0002
O
President
and
Fellows
of
University
Museums)
P1970.4311.0002