Human Generated Data

Title

Untitled (New York City)

Date

1932–1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3662

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 97.2
Head 97.2
Person 97.1
Adult 97.1
Bride 97.1
Female 97.1
Wedding 97.1
Woman 97.1
Art 96.1
Collage 96.1
Person 94
Baby 94
Clothing 93.3
Hat 93.3
Person 90
Baby 90
People 87.5
Person 86.2
Person 85.1
Person 84.5
Baby 84.5
PEZ Dispenser 57.7
Outdoors 56.1
Cap 56.1
Photography 55.7
Portrait 55.7

Clarifai
created on 2018-05-10

people 99.9
group 99.3
adult 98.3
facial expression 98
child 97.2
woman 96.4
man 96.3
group together 96.3
wear 96.3
portrait 94.5
retro 93.7
several 89.9
administration 89.5
leader 89.2
music 88.1
monochrome 88.1
boy 87.4
many 86.2
outfit 86.1
four 85.5

Imagga
created on 2023-10-06

negative 54.3
cornet 53.4
brass 52.7
film 43.4
wind instrument 42.6
photographic paper 31.3
musical instrument 27
money 24.7
currency 24.2
cash 22
photographic equipment 20.9
old 18.8
dollar 18.6
ancient 17.3
bank 17
finance 16.9
financial 16
vintage 15.7
paper 15.7
banking 15.6
dollars 15.4
bill 15.2
business 15.2
one 14.2
antique 14
art 13.7
hundred 13.6
grunge 12.8
wealth 12.6
sculpture 12.4
retro 12.3
people 12.3
man 12.1
sax 11.9
portrait 11.6
us 11.6
statue 11.4
face 11.4
human 11.2
savings 11.2
rich 11.2
economy 11.1
close 10.8
newspaper 10.8
banknotes 10.8
payment 10.6
person 10.5
exchange 10.5
male 9.9
franklin 9.8
marble 9.7
finances 9.6
pay 9.6
profit 9.6
decoration 9.4
architecture 9.4
investment 9.2
trombone 9.1
history 8.9
closeup 8.8
states 8.7
loan 8.6
united 8.6
stone 8.4
product 8.3
pattern 8.2
aged 8.1
dirty 8.1
religion 8.1
kin 8
carving 7.8
creation 7.8
bills 7.8
culture 7.7
notes 7.7
head 7.6
historical 7.5
frame 7.5
monument 7.5
world 7.4
object 7.3
design 7.3
figure 7.2
black 7.2
hair 7.1
adult 7.1
market 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

old 53.3
posing 51.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Female, 100%
Calm 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Happy 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 20-28
Gender Female, 99.7%
Fear 94.1%
Calm 10.9%
Surprised 9.3%
Sad 2.2%
Confused 0.7%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 14-22
Gender Female, 100%
Calm 99.6%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Happy 0%
Disgusted 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 19-27
Gender Female, 99.9%
Calm 87.6%
Surprised 8.3%
Fear 6.1%
Happy 5.8%
Sad 2.3%
Angry 1.1%
Disgusted 0.8%
Confused 0.2%

Feature analysis

Amazon

Person 97.1%
Adult 97.1%
Bride 97.1%
Female 97.1%
Woman 97.1%
Baby 94%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

UNITS
AIRPAINTING
Laasera

Google

AIRPAINTING UN
AIRPAINTING
UN