Human Generated Data

Title

Photograph -- New York

Date

1917

People

Artist: Paul Strand, American, 1890-1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Fund for the Acquisition of Photographs, 2006.121

Copyright

© Aperture Foundation Inc., Paul Strand Archive

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 96
Human 96
Art 83.7
Text 82.1
Face 71.8
Finger 64
Wood 62
Painting 57.3

Clarifai
created on 2023-10-27

portrait 99.8
people 99.8
one 98.6
art 98.5
painting 97.7
old 97.2
vintage 97
man 95.3
adult 95
leader 93.4
wear 92.9
antique 92.3
print 92.3
retro 91.7
woman 88.2
face 87.1
sepia 86.4
museum 85.1
writer 81.5
family 80.1

Imagga
created on 2022-01-29

grandma 36.8
money 24.7
portrait 24.6
man 22.9
old 20.2
face 19.9
currency 19.8
cash 19.2
male 19.2
one 18.7
close 18.3
dollar 17.6
person 17.6
statue 17
sculpture 16.1
mask 16
banking 15.6
bill 15.2
people 14.5
bank 14.3
business 14
grandfather 13.9
adult 13.6
head 13.4
ancient 13
paper 12.7
dollars 12.6
covering 12.4
art 12.3
expression 11.9
finance 11.8
wealth 11.7
financial 11.6
black 11.5
senior 11.2
eyes 11.2
culture 11.1
us 10.6
pay 10.6
human 10.5
stone 10.1
franklin 9.8
banknotes 9.8
hundred 9.7
finances 9.6
looking 9.6
exchange 9.6
savings 9.3
disguise 9.3
vintage 9.1
closeup 8.8
hair 8.7
loan 8.6
elderly 8.6
commerce 8.4
rich 8.4
mature 8.4
dark 8.4
look 7.9
antique 7.8
banknote 7.8
retired 7.8
economic 7.8
marble 7.8
payment 7.7
price 7.7
grunge 7.7
serious 7.6
happy 7.5
world 7.4
alone 7.3
home 7.2
history 7.2
eye 7.2
attire 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

human face 98.3
monitor 97.9
television 96.9
text 96.6
drawing 95.9
person 94.2
clothing 90.6
sketch 88.7
screen 82.3
man 75
old 74.9
portrait 55
gallery 52.9
room 51.6
painting 40.4
picture frame 9.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-53
Gender Male, 99.6%
Angry 93.9%
Calm 2.8%
Surprised 1.2%
Disgusted 1.1%
Confused 0.6%
Happy 0.2%
Fear 0.1%
Sad 0.1%

Feature analysis

Amazon

Person
Person 96%

Categories

Imagga

pets animals 83.7%
paintings art 13.9%
nature landscape 1.4%

Captions

Microsoft
created on 2022-01-29

a painting of a person 84.4%
an old photo of a person 79.2%
an old photo of a painting 74.2%