Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4604

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.5
Human 98.5
Person 97.8
Person 95.9
Interior Design 73.7
Indoors 73.7
Person 71.9
Monitor 66.1
Screen 66.1
Display 66.1
Electronics 66.1
Person 64.9
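
The Amazon tags above are the kind of label/confidence pairs returned by AWS Rekognition's DetectLabels operation. As a minimal sketch of how such tags might be reproduced (assuming boto3 with configured AWS credentials; the filename contact_sheet.jpg is hypothetical):

```python
import boto3

# Sketch only: assumes AWS credentials are configured and that the
# contact sheet exists locally as "contact_sheet.jpg" (hypothetical name).
client = boto3.client("rekognition")

with open("contact_sheet.jpg", "rb") as f:
    image_bytes = f.read()

# Ask for labels above a confidence floor comparable to the scores above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=60)

for label in response["Labels"]:
    # Prints e.g. "Person 98.5", matching the tag/score pairs listed above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```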

Clarifai
created on 2023-10-25

movie 99.9
negative 99.8
filmstrip 99.4
slide 98.7
exposed 98.5
photograph 98.3
cinematography 97.5
analogue 96.4
people 96.3
old 96.1
collage 95.1
art 94.6
noisy 94.1
dirty 93.4
retro 93.4
screen 92.9
rust 92.6
monochrome 92
vintage 91.2
two 85.5

Imagga
created on 2022-01-08

apparatus 100
sequencer 100
equipment 89.6
film 30.3
negative 27.8
grunge 19.6
old 19.5
retro 18.8
cinema 18.5
electronic equipment 18.1
movie 17.4
vintage 16.5
digital 16.2
frame 15.8
border 15.4
computer 15.2
business 15.2
digital clock 14.9
technology 14.8
camera 14.8
slide 14.6
equalizer 14.6
texture 14.6
strip 14.5
black 14.4
entertainment 13.8
art 13.6
screen 12.7
clock 12.3
industry 11.9
finance 11.8
filmstrip 11.8
noise 10.7
photography 10.4
office 10.4
antique 10.4
graphic 10.2
network 10.2
paper 10.2
music 9.9
timepiece 9.6
roll 9.5
word 9.4
data 9.1
rough 9.1
financial 8.9
information 8.8
scratch 8.8
tape 8.7
close 8.6
media 8.6
space 8.5
card 8.5
design 8.4
dark 8.3
economy 8.3
city 8.3
connection 8.2
aged 8.1
dirty 8.1
button 7.9
register 7.9
text 7.8
folder 7.8
panel 7.7
rust 7.7
audio 7.6
damaged 7.6
horizontal 7.5
sound 7.5
banner 7.3
template 7.3
investment 7.3
success 7.2
building 7.1
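
Imagga exposes its tagging through a REST endpoint rather than a first-party Python SDK. A minimal sketch using requests against the v2 tagging endpoint (the key/secret pair and filename are placeholders):

```python
import requests

# Sketch only: credentials and filename are placeholders.
AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

with open("contact_sheet.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=AUTH,           # Imagga uses HTTP Basic auth
        files={"image": f},  # upload the image directly
    )

for item in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a 0-100 confidence score.
    print(item["tag"]["en"], item["confidence"])
```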

Microsoft
created on 2022-01-08

text 95.1
indoor 85.2
screenshot 71.5
photographic film 66.2
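
The Microsoft tags resemble the output of Azure Computer Vision's tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK (the endpoint, key, and filename are placeholders):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: endpoint, key, and filename are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("AZURE_CV_KEY"),
)

with open("contact_sheet.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # Confidence comes back in [0, 1]; scale to match the percentages above.
    print(tag.name, round(tag.confidence * 100, 1))
```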

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 92.4%
Calm 54%
Surprised 13.2%
Confused 12.7%
Sad 11.7%
Fear 2.9%
Angry 2.3%
Disgusted 2.2%
Happy 0.9%

AWS Rekognition

Age 27-37
Gender Male, 99.1%
Calm 98%
Confused 1.2%
Sad 0.4%
Surprised 0.2%
Angry 0.1%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 23-31
Gender Male, 99.4%
Calm 87%
Sad 5.3%
Confused 5%
Surprised 0.9%
Disgusted 0.8%
Angry 0.4%
Fear 0.2%
Happy 0.2%

AWS Rekognition

Age 33-41
Gender Male, 94.4%
Happy 69.1%
Calm 17.4%
Sad 4.9%
Angry 2.8%
Surprised 1.8%
Confused 1.6%
Disgusted 1.5%
Fear 0.9%

AWS Rekognition

Age 47-53
Gender Male, 52.2%
Angry 74.4%
Sad 17%
Calm 6.1%
Disgusted 1%
Happy 0.7%
Fear 0.6%
Surprised 0.3%
Confused 0.1%

AWS Rekognition

Age 16-22
Gender Male, 81.5%
Calm 66.3%
Sad 26.8%
Angry 2.9%
Happy 1.1%
Confused 0.8%
Disgusted 0.8%
Surprised 0.8%
Fear 0.6%
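
Each AWS Rekognition face record above (age range, gender, and ranked emotion scores) corresponds to one entry in the FaceDetails array returned by the DetectFaces operation. A minimal sketch under the same assumptions as before:

```python
import boto3

client = boto3.client("rekognition")

with open("contact_sheet.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Attributes=["ALL"] is required to get age, gender, and emotion estimates.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort descending to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```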

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
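
The Google Vision rows report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch using the google-cloud-vision client (assuming configured credentials and the same hypothetical local file):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("contact_sheet.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # Prints e.g. "Surprise Very unlikely", matching the rows above.
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())
```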

Feature analysis

Amazon

Person 98.5%
Monitor 66.1%

Text analysis

Amazon

SUPER
CAFE
3
DIA
DIA ONO
ONO
AMON
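
The Amazon text results mix whole lines and single words, which is how Rekognition's DetectText operation reports detections. A minimal sketch under the same assumptions as above:

```python
import boto3

client = boto3.client("rekognition")

with open("contact_sheet.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; the results above mix both granularities.
    print(detection["Type"], detection["DetectedText"])
```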

Google

3.
0IAT
CAFE
ANDS
3. 0IAT OND. CAFE ANDS CAFE SUPER
OND.
SUPER
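
The Google results follow the Vision API's text_annotations convention: the first annotation is the full detected string and the remaining entries are individual words, which explains the mix of single tokens and one long line above. A minimal sketch:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("contact_sheet.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the whole detected string; later entries are the
# individual words, matching the mixed results listed above.
for annotation in response.text_annotations:
    print(annotation.description)
```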