Human Generated Data

Title

Man in Frame, Queens, New York City

Date

1950

People

Artist: N. Jay Jaffee, American, 1921–1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Artist, P1998.97

Copyright

© The N. Jay Jaffee Trust. All rights reserved. Used by permission. www.njayjaffee.com

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 93.2
Human 93.2
Tie 81.3
Accessories 81.3
Accessory 81.3
Photography 73.4
Photo 73.4
Portrait 68.5
Face 68.5
Text 67.1
Art 63.8
Attorney 55.9
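
These label/confidence pairs have the shape of Amazon Rekognition DetectLabels output. As a minimal sketch only (the record does not document the actual pipeline), assuming boto3 with configured AWS credentials and a hypothetical local filename:

    import boto3

    client = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("man_in_frame.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the weakest tag shown above is 55.9
        )

    # Print "Name Confidence" pairs, e.g. "Person 93.2"
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')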

Clarifai
created on 2023-10-25

one 98.5
woman 98.2
art 97.7
portrait 97.6
people 97.4
retro 95.8
man 94.4
no person 93.7
vintage 92.1
glass items 91.8
mirror 89.8
old 89.5
exploration 88.8
fashion 88.7
adult 85.3
paper 84.9
antique 84.7
detective 84.6
glass 84.4
painting 83.0

Imagga
created on 2021-12-14

telephone 57.9
dial telephone 56.7
equipment 45.3
electronic equipment 40.3
device 37.1
disk 31.2
data 30.1
technology 28.2
computer 28
storage 25.7
drive 22.3
ventilator 21.8
backup 21.6
disc 21.4
information 21.2
digital 21.1
hard 19
record 17.4
phonograph record 17
read 16.3
music 16.2
black 15.1
object 14.7
close 14.3
electronics 14.2
software 13.6
business 13.4
media 13.3
write 13.2
memory 12.7
file 12.5
hardware 12.5
dial 12.2
call 11.9
byte 11.8
head 11.8
compact 10.7
pay-phone 10.5
part 10.2
security 10.1
retro 9.8
old 9.7
store 9.4
closeup 9.4
compact disk 9.4
industry 9.4
copy 8.8
vintage 8.3
open 8.1
box 8.1
detail 8
metal 8
silver 8
cylinder 7.9
fan 7.8
video 7.7
blank 7.7
audio 7.6
electric fan 7.6
electronic 7.5
symbol 7.4
office 7.2
punching bag 7.2
surface 7.1
film 7.1
modern 7
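
Imagga serves its tags over REST rather than through an SDK. A hedged sketch, assuming the v2 /tags endpoint and a hosted copy of the image; the credentials and URL below are placeholders:

    import requests

    AUTH = ("api_key", "api_secret")  # placeholder credentials
    IMAGE_URL = "https://example.org/man_in_frame.jpg"  # placeholder URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        auth=AUTH,
        params={"image_url": IMAGE_URL},
    )
    response.raise_for_status()

    # Each entry pairs a confidence score with a language-keyed tag name.
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')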

Google
created on 2021-12-14

(no tags recorded)

Microsoft
created on 2021-12-14

human face 95.3
text 94.6
black and white 87.6
person 86
clothing 78.9
old 55.9
painting 26.7
set 26.1
picture frame 7.8

Color Analysis

(color swatch data not reproduced in this record)

Face analysis

AWS Rekognition

Age 36-52
Gender Male, 97.7%
Calm 77.1%
Happy 10.2%
Surprised 10.1%
Angry 0.8%
Confused 0.5%
Fear 0.4%
Disgusted 0.4%
Sad 0.4%
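
The age range, gender, and ranked emotions above match the FaceDetails structure that Rekognition's DetectFaces API returns. A minimal sketch under the same assumptions as the earlier label example:

    import boto3

    client = boto3.client("rekognition")

    with open("man_in_frame.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # age, gender, and emotions need the ALL set
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        gender = face["Gender"]
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions arrive unsorted; list the dominant one first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')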

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely
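
Google Vision reports face attributes as qualitative likelihoods ("Very unlikely" through "Very likely") rather than percentages. A sketch of reading those values with the google-cloud-vision client (filename again hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes Google credentials are configured

    with open("man_in_frame.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood is an enum ranging VERY_UNLIKELY .. VERY_LIKELY.
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)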

Feature analysis

Amazon

Person 93.2%
Tie 81.3%

Categories

Imagga

interior objects 63.2%
text visuals 13.8%
paintings art 13.3%
food drinks 8.6%

Captions

Microsoft
created on 2021-12-14

a painting hanging on a wall 65.7%
an old photo of a painting 59.8%
a painting on the wall 59.7%
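
Captions with confidence scores like these are what Azure Computer Vision's describe operation returns. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",    # placeholder endpoint
        CognitiveServicesCredentials("subscription_key"),  # placeholder key
    )

    analysis = client.describe_image(
        "https://example.org/man_in_frame.jpg",  # placeholder URL
        max_candidates=3,  # ask for several candidate captions
    )

    # Confidence comes back on a 0-1 scale; scale it to match the figures above.
    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")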

Text analysis

Amazon

M
M Joy JuaFee
Joy
JuaFee
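
These fragments look like an OCR misreading of the photographer's signature, "N. Jay Jaffee". Rekognition's DetectText API reports detections at both LINE and WORD granularity, which is why the same text appears more than once. A minimal sketch:

    import boto3

    client = boto3.client("rekognition")

    with open("man_in_frame.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE entries and their constituent WORD entries are listed separately,
    # so fragments of the same text can repeat.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])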