Human Generated Data

Title

TV – Topeka, KS – 1974

Date

1974

People

Artist: Dennis Feldman, American, born 1946

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, 2020.68

Machine Generated Data

Tags

Amazon
created on 2023-01-11

Screen 100
Computer Hardware 100
Hardware 100
Electronics 100
TV 100
Person 99.6
Man 99.6
Adult 99.6
Male 99.6
Glasses 98
Accessories 98
Monitor 94.5
Chair 94
Furniture 94
Living Room 92.3
Room 92.3
Indoors 92.3
Building 92.3
Architecture 92.3
Chair 90.3
Person 87.9
Face 85.2
Head 85.2
Dog 84
Pet 84
Mammal 84
Canine 84
Animal 84
Couch 56.5
Entertainment Center 55.3
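
The Amazon tags above pair a label with a confidence score on a 0-100 scale, which is the shape of output returned by AWS Rekognition's label-detection API. A minimal sketch of such a call, assuming boto3 is configured with valid AWS credentials and that "photo.jpg" is a hypothetical local copy of the image:

    import boto3

    client = boto3.client("rekognition")

    # "photo.jpg" is a hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=50,
        )

    # Each label carries a name and a confidence score (0-100),
    # matching the tag/score pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')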

Imagga
created on 2023-01-11

television 100
telecommunication system 100
laptop 56.6
computer 55.4
technology 39.3
screen 36.5
business 35.8
office 32.9
keyboard 31.9
notebook 29.1
work 28.2
monitor 26.1
electronic 25.2
broadcasting 24.9
working 21.2
telecommunication 21
display 20.5
modern 20.3
wireless 20
people 17.8
communication 16.8
equipment 15.8
businesswoman 15.4
home 15.1
web 14.4
person 14.2
information 14.2
desk 14.2
network 13.9
tech 13.3
black 13.2
mobile 13.2
blank 12.9
room 12.8
happy 12.5
object 12.5
job 12.4
digital 12.1
sitting 12
women 11.9
student 11.8
adult 11.6
electronics 11.4
medium 11.4
professional 11
space 10.9
man 10.7
portable 10.7
hand 10.6
flat 10.6
businessman 10.6
corporate 10.3
executive 10.1
smiling 10.1
smile 10
pretty 9.8
attractive 9.8
portrait 9.7
desktop 9.6
looking 9.6
career 9.5
key 9.3
study 9.3
friendly 9.1
data 9.1
design 9
success 8.8
typing 8.8
education 8.7
communications 8.6
center 8.5
finance 8.4
manager 8.4
studio 8.4
copy 8
interior 8
indoors 7.9
receptionist 7.9
gadget 7.8
secretary 7.7
help 7.4
lady 7.3
color 7.2
lifestyle 7.2
gray 7.2
open 7.2
face 7.1
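
The Imagga tags above follow the same tag-plus-confidence pattern. A rough sketch of a request against Imagga's v2 tagging endpoint, assuming the requests library and a hypothetical API key, secret, and image URL:

    import requests

    # Hypothetical credentials and image URL.
    API_KEY = "YOUR_API_KEY"
    API_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.org/photo.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence score (0-100).
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')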

Microsoft
created on 2023-01-11

indoor 95.1
black and white 93.7
living 88.1
room 86.6
person 70.7
white 61.9
old 57.9
furniture 39.7

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Confused 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0.1%
Calm 0%
Angry 0%
Happy 0%
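
The age range, gender, and emotion percentages above are the attributes Rekognition's face-detection API reports for each detected face. A minimal sketch, again assuming boto3 with configured credentials and a hypothetical local image file:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 45, "High": 51}
        gender = face["Gender"]     # {"Value": "Male", "Confidence": ...}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back as {"Type", "Confidence"} pairs, matching
        # the Confused/Surprised/Fear/... percentages above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')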

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
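
Google Vision reports facial attributes as likelihood categories (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the rows above read "Very unlikely". A sketch of the corresponding call with the google-cloud-vision client library, assuming configured credentials and the same hypothetical local file as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood fields are enum values such as VERY_UNLIKELY or POSSIBLE,
    # corresponding to the "Very unlikely" rows above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)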

Feature analysis

Amazon

Person 99.6%
Man 99.6%
Adult 99.6%
Male 99.6%
Glasses 98%
Monitor 94.5%
Chair 94%
Dog 84%

Captions

Microsoft

an old photo of a living room 85.9%
old photo of a living room 83.6%
an old photo of a living room filled with furniture and a fireplace 72%
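
The three captions and their scores resemble the candidate descriptions returned by Azure's Computer Vision image-description API. A sketch using the azure-cognitiveservices-vision-computervision client, with a hypothetical endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Hypothetical endpoint, key, and image URL.
    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    description = client.describe_image(
        "https://example.org/photo.jpg", max_candidates=3
    )

    # Each candidate caption carries a confidence in the 0-1 range,
    # shown as a percentage in the list above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")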

Text analysis

Google

Full text: OH 10:0
Detected words: OH, 10, :, 0
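
This layout mirrors Google Vision's OCR response: the first text annotation holds the full detected string and the remaining annotations are the individual words or symbols. A sketch of the call, using the same google-cloud-vision client and hypothetical local file as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected text ("OH 10:0" above);
    # the rest are the individual words/symbols ("OH", "10", ":", "0").
    annotations = response.text_annotations
    if annotations:
        print("Full text:", annotations[0].description)
        for word in annotations[1:]:
            print("Token:", word.description)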