Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4615

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Interior Design 99.9
Indoors 99.9
Display 99.6
Screen 99.6
Electronics 99.6
Person 99.5
Human 99.5
Monitor 98.7
LCD Screen 97.3
Person 97.3
Person 96.5
Person 93.7
TV 87.2
Television 87.2
Person 84.3
Person 72.9
Computer 70.9
Room 59.2
Theater 57.5

Imagga
created on 2022-01-08

television 100
monitor 78.1
broadcasting 59.1
telecommunication system 56.8
telecommunication 44.5
computer 38.5
screen 37.8
equipment 37.1
electronic equipment 36.3
technology 33.4
medium 29.3
business 29.1
office 27.3
display 26
laptop 23.7
flat 19.3
electronic 18.7
communication 18.5
modern 16.1
hand 15.9
keyboard 15
work 14.9
people 14.5
man 14.1
person 13.6
information 13.3
working 13.3
businessman 13.2
desk 13.2
notebook 13.1
digital 13
video 12.6
desktop 12.5
design 11.8
black 11.4
home 11.2
object 11
adult 11
finance 11
room 10.9
plasma 10.7
media 10.5
one 10.4
tech 10.4
sitting 10.3
close 10.3
back 10.1
global 10
frame 10
male 9.9
liquid crystal 9.9
financial 9.8
looking 9.6
corporate 9.4
professional 9.3
data 9.1
interior 8.8
indoors 8.8
panel 8.7
blank 8.6
space 8.5
web 8.5
horizontal 8.4
studio 8.4
network 8.3
entertainment 8.3
single 8.2
film 7.9
smile 7.8
portable 7.8
movie 7.8
3d 7.7
chart 7.6
wireless 7.6
workplace 7.6
electronics 7.6
contemporary 7.5
happy 7.5
success 7.2

Microsoft
created on 2022-01-08

monitor 99.7
screenshot 98
text 96.7
television 94.7
indoor 88.3
screen 77.2
computer 70.5
microwave 36.2
flat 33.3
set 31.7
display 28.5
kitchen appliance 13.3

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 92.4%
Calm 64.2%
Sad 19%
Confused 6.7%
Surprised 4.6%
Disgusted 2.3%
Angry 1.7%
Happy 0.9%
Fear 0.5%

AWS Rekognition

Age 23-33
Gender Male, 58.2%
Calm 92.9%
Sad 2.9%
Happy 2%
Fear 1%
Angry 0.4%
Confused 0.4%
Disgusted 0.3%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Monitor 98.7%

Captions

Microsoft

a man standing in front of a flat screen television 76.1%
a man sitting in front of a flat screen television 66.5%
a man standing in front of a flat screen tv 66.4%

Text analysis

Amazon

SUPER
23
SUPER XX
XX

Google

23 SUPER XX
23
SUPER
XX