Human Generated Data

Title

Woman Reading Under the El, Brownsville, Brooklyn, New York City

Date

1950

People

Artist: N. Jay Jaffee, American, 1921-1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Artist, P1998.114

Copyright

© The N. Jay Jaffee Trust. All rights reserved. Used by permission. www.njayjaffee.com

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Shorts 99.8
Apparel 99.8
Clothing 99.8
Human 99.4
Person 99.4
Shoe 95.6
Footwear 95.6
Home Decor 84
Car 78.4
Transportation 78.4
Automobile 78.4
Vehicle 78.4
Pants 61.4
Architecture 55.4
Building 55.4
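
The Amazon tags above are confidence-scored labels of the kind AWS Rekognition's DetectLabels API returns. Below is a minimal sketch of how comparable tags could be produced with boto3; the file path is a placeholder, not the museum's actual image asset.

# Minimal sketch: label detection with AWS Rekognition via boto3.
# "photo.jpg" is a placeholder path, not the actual museum image.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Each label carries a name and a 0-100 confidence score,
# matching the "Shorts 99.8"-style rows above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")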

Clarifai
created on 2023-10-25

people 99.9
street 98.4
adult 98.1
one 97.9
monochrome 97.7
man 97.4
portrait 97.1
woman 89
art 88.9
shadow 87.8
city 85.5
two 85.1
analogue 84.8
black and white 84.5
square 82.8
group 82.4
music 82.2
model 82.1
furniture 81.8
light 79.4
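
The Clarifai concepts could be fetched in a similar way. The sketch below assumes Clarifai's public v2 REST endpoint and a general-purpose model; the API key, model ID, and image URL are all placeholders.

# Sketch of a Clarifai v2 model prediction over REST.
# Endpoint shape, model ID, key, and URL are assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
MODEL_ID = "general-image-recognition"  # assumed general model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 value,
# matching the "people 99.9"-style rows above (value * 100).
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")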

Imagga
created on 2021-12-14

musical instrument 35.4
man 28.2
accordion 27.4
business 23.7
keyboard instrument 23.1
male 21.3
wind instrument 21.1
adult 20.8
newspaper 20.8
person 19.4
people 17.8
product 17.7
men 17.2
equipment 16.5
computer 16.1
work 15.7
office 14.9
industry 14.5
sax 14.3
sitting 13.7
technology 13.4
pay-phone 12.9
bass 12.8
working 12.4
businessman 12.4
telephone 12.3
creation 12.3
room 12.1
black 12
laptop 11.9
center 11.8
suit 11.7
indoors 11.4
one 11.2
city 10.8
rack 10.8
holding 10.7
worker 10.7
modern 10.5
attractive 10.5
corporate 10.3
old 9.7
job 9.7
portrait 9.7
urban 9.6
call 9.5
power 9.2
photographic equipment 9.2
industrial 9.1
server 9
information 8.9
electronic equipment 8.7
travel 8.4
building 8.4
connection 8.2
guitar 8
lifestyle 7.9
database 7.9
smile 7.8
happiness 7.8
support 7.8
factory 7.7
cable 7.6
storage 7.6
hand 7.6
warehouse 7.5
network 7.5
alone 7.3
data 7.3
digital 7.3
dirty 7.2
music 7.2
life 7
architecture 7
professional 7
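
Imagga exposes its tagger as a REST endpoint behind HTTP basic auth. The sketch below assumes the v2 response shape; the key, secret, and image URL are placeholders.

# Sketch: Imagga v2 tagging endpoint with HTTP basic auth.
# Key/secret are placeholders; field names assume the v2 response shape.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
)
resp.raise_for_status()

# Tags carry an English label and a 0-100 confidence,
# matching the "musical instrument 35.4"-style rows above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")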

Microsoft
created on 2021-12-14

text 98.1
clothing 97.9
person 96.6
footwear 92.5
man 90
street 86.8
black and white 86.1
waste container 56.6
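
The Microsoft tags match the output of Azure's Computer Vision service. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders, and the SDK reports confidences on a 0-1 scale.

# Sketch: image tagging with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),  # placeholder
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")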

Face analysis

AWS Rekognition

Age 36-52
Gender Male, 94.9%
Calm 50.4%
Fear 13.4%
Angry 13.3%
Sad 12%
Confused 6.7%
Surprised 1.7%
Disgusted 1.5%
Happy 0.8%
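
The age range, gender, and emotion rows above match the shape of AWS Rekognition's DetectFaces output with all attributes requested. A minimal sketch; the file path is a placeholder.

# Sketch: face attributes with AWS Rekognition DetectFaces.
# Attributes=["ALL"] returns age range, gender, and emotion scores.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")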

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
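
The Google Vision rows are likelihood buckets rather than percentages. A sketch with the google-cloud-vision client, which reports each attribute as an enum such as VERY_UNLIKELY; the file path is a placeholder.

# Sketch: face likelihoods with the Google Cloud Vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)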

Feature analysis

Amazon

Person 99.4%
Shoe 95.6%
Car 78.4%

Text analysis

Amazon

MJUN
MJUN Juitua
Juitua
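
The fragments above read like low-confidence OCR of signage in the photograph. A sketch of the corresponding AWS Rekognition DetectText call; the file path is a placeholder.

# Sketch: OCR with AWS Rekognition DetectText.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is a LINE or WORD with the recognized string,
# matching the "MJUN"/"Juitua"-style rows above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])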