Human Generated Data

Title

TV’s for sale – New York City – 1974

Date

1974

People

Artist: Dennis Feldman, American, born 1946

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, 2020.65

Machine Generated Data

Tags

Amazon
created on 2023-01-12

Screen 100
Computer Hardware 100
Hardware 100
Electronics 100
TV 100
Person 98.9
Baby 98.9
Person 98.8
Person 98.5
Person 96.9
Person 96.6
Car 95.7
Vehicle 95.7
Transportation 95.7
Handbag 95.3
Bag 95.3
Accessories 95.3
Plant 94.4
Person 91.4
Person 89.5
Car 88.7
Shoe 84.7
Footwear 84.7
Clothing 84.7
Face 83
Head 83
Person 82.5
Plant 79.7
Person 78.6
Person 76.6
Wheel 75.5
Machine 75.5
Monitor 75.5
Bicycle 74.2
Person 73.6
Wheel 68.9
Handbag 67.6
Handbag 64.3
Monitor 63.3
Person 61.6
Entertainment Center 56.5
Indoors 55.9
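
The Amazon labels above are the kind of result returned by AWS Rekognition's DetectLabels operation. A minimal sketch using boto3; the region, file name, and confidence threshold are illustrative placeholders, not values taken from this record:

    import boto3

    # Assumes AWS credentials are already configured; region and file path are placeholders.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("feldman_tvs_for_sale_1974.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,  # the lowest label shown above sits near 55.9
    )

    # Print "Label confidence" pairs in the same form as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')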

Clarifai
created on 2023-10-13

television 99.2
people 98.6
room 97.5
furniture 97
monochrome 96.1
indoors 93
music 92
no person 91
one 90.3
analogue 89.8
group together 89.2
group 88.9
home 86.9
family 86.3
chair 85.6
analog 85.1
two 84.6
seat 84.3
many 80.8
adult 79.1
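
The Clarifai concepts above come from a general image-recognition model. A hedged sketch against Clarifai's v2 REST predict endpoint; the API key, model ID, and image URL below are assumptions, not values from this record:

    import requests

    # Placeholders; substitute a real Clarifai API key and a reachable image URL.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"  # assumed general-concepts model
    IMAGE_URL = "https://example.org/feldman_tvs_for_sale_1974.jpg"

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Each concept carries a name and a 0-1 confidence value.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')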

Imagga
created on 2023-01-12

television 100
telecommunication system 91.1
broadcasting 56.2
telecommunication 39.5
interior 32.7
home 31.9
screen 28.1
modern 27.3
monitor 26.6
technology 25.2
medium 25
room 24.6
house 23.4
display 23.2
equipment 22.2
electronic 19.6
luxury 18.8
kitchen 18.8
wood 18.3
old 18.1
furniture 17.1
design 16.9
style 16.3
computer 16.1
video 15.5
stove 14.7
media 14.3
blank 13.7
flat 13.5
communication 13.4
tile 13.3
digital 12.9
entertainment 12.9
cabinet 12.8
indoor 12.8
new 12.1
antique 12.1
object 11.7
vintage 11.6
inside 11
architecture 10.9
domestic 10.8
electronic equipment 10.8
stainless 10.6
retro 10.6
apartment 10.5
electrical 10.5
electronics 10.4
estate 10.4
table 10.4
wall 10.3
floor 10.2
3d 10.1
oven 9.8
decor 9.7
business 9.7
decoration 9.4
broadcast 8.8
metal 8.8
steel 8.8
tube 8.8
luxurious 8.8
expensive 8.6
space 8.5
living 8.5
black 8.4
classic 8.3
clean 8.3
appliance 8.3
window 8.2
laptop 8.2
wooden 7.9
liquid crystal 7.9
program 7.8
work 7.8
antenna 7.8
granite 7.8
movie 7.7
panel 7.7
watch 7.7
residential 7.7
old fashioned 7.6
show 7.6
chair 7.6
elegance 7.5
electric 7.5
brown 7.4
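
The Imagga tags above are the sort of output produced by Imagga's auto-tagging service. A sketch assuming the v2 /tags REST endpoint with HTTP Basic authentication; the key, secret, and image URL are placeholders:

    import requests

    # Placeholders; Imagga issues an API key/secret pair per account.
    API_KEY = "YOUR_IMAGGA_API_KEY"
    API_SECRET = "YOUR_IMAGGA_API_SECRET"
    IMAGE_URL = "https://example.org/feldman_tvs_for_sale_1974.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Tags arrive with an English label and a 0-100 confidence score.
    for tag in response.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')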

Google
created on 2023-01-12

Microsoft
created on 2023-01-12

black and white 93
old 83
text 81.5
window 81
white 78.9
christmas tree 70.8
television 68.2
black 68
vintage 31.3
furniture 25.4
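
The Microsoft tags above resemble the output of Azure's Computer Vision image-analysis service. A sketch assuming the v3.2 analyze REST endpoint; the endpoint host, key, and image URL are placeholders:

    import requests

    # Placeholders; the endpoint host is specific to your Azure resource.
    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_KEY"
    IMAGE_URL = "https://example.org/feldman_tvs_for_sale_1974.jpg"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    # Tags carry a name and a 0-1 confidence value.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')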

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 99.9%
Calm 63.9%
Confused 12.1%
Happy 11.4%
Surprised 7.6%
Fear 6.4%
Sad 4.3%
Angry 2.2%
Disgusted 1.6%

AWS Rekognition

Age 23-33
Gender Male, 97.6%
Calm 66%
Angry 22.2%
Surprised 7.3%
Fear 7%
Disgusted 3.2%
Sad 2.5%
Confused 2.3%
Happy 0.5%

AWS Rekognition

Age 18-26
Gender Female, 100%
Sad 89.1%
Calm 38.9%
Surprised 9.7%
Fear 6.6%
Confused 6.4%
Angry 3.9%
Disgusted 1.3%
Happy 0.6%
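
The age ranges, gender estimates, and emotion scores above are the standard output of AWS Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch with boto3; the region and file name are illustrative:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("feldman_tvs_for_sale_1974.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')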

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 68
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
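
The Surprise, Anger, Sorrow, Joy, Headwear, and Blurred likelihood ratings above match the face annotations returned by the Google Cloud Vision API. A sketch assuming the google-cloud-vision 2.x client library with credentials already configured; the file name is illustrative:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("feldman_tvs_for_sale_1974.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods are enum values such as VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY.
    fields = ["surprise", "anger", "sorrow", "joy", "headwear", "blurred"]
    for face in response.face_annotations:
        for field in fields:
            likelihood = getattr(face, f"{field}_likelihood")
            print(field.capitalize(), vision.Likelihood(likelihood).name)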

Feature analysis

Amazon

Person 98.9%
Baby 98.9%
Car 95.7%
Handbag 95.3%
Plant 94.4%
Shoe 84.7%
Wheel 75.5%
Monitor 75.5%
Bicycle 74.2%

Categories

Imagga

cars vehicles 99.3%

Text analysis

Amazon

4
RUTH

Google

FIN
W
RUTH
22
70177 FIN W RUTH www 22
70177
www
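
The detected strings above ("RUTH", "70177", "www", and so on) are OCR results. A minimal sketch of text detection with AWS Rekognition's DetectText operation, which is one way such output could be produced (file name illustrative); Google's counterpart is the Vision API text_detection feature:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("feldman_tvs_for_sale_1974.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE detections group words; WORD detections are individual tokens.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')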