Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4600.1-4

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.1
Human 99.1
Person 97.3
Person 93.7
Person 93
Person 89.5
Person 88
Person 81.5
Person 78.6
Text 74.4
Monitor 72.9
Electronics 72.9
Screen 72.9
Display 72.9

Clarifai
created on 2023-10-25

negative 99.7
movie 99.2
filmstrip 98.3
people 96.2
exposed 96.1
noisy 95
retro 94.1
cinematography 93.9
screen 92.5
slide 92.4
dirty 89.1
old 87.1
collage 85.4
no person 85.1
photograph 84.7
window 84.5
vintage 84.5
emulsion 83.9
art 83.8
man 82.9

Imagga
created on 2022-01-08

sequencer 83.9
equipment 74.4
apparatus 67.9
film 26
electronic equipment 24.9
technology 23
negative 19.3
synthesizer 17.9
business 16.4
computer 16
digital 14.6
electronic instrument 14.5
device 12.9
cinema 12.7
equalizer 12.6
old 12.5
retro 12.3
network 12
black 12
music 11.7
strip 11.6
vintage 11.6
mixer 11.6
industry 11.1
grunge 11.1
frame 10.8
keyboard instrument 10.8
movie 10.7
sound 10.3
screen 10.2
border 9.9
switch 9.8
art 9.8
close 9.7
tech 9.5
finance 9.3
camera 9.2
communication 9.2
filmstrip 8.9
object 8.8
closeup 8.7
audio 8.6
media 8.6
electronics 8.5
board 8.4
horizontal 8.4
texture 8.3
entertainment 8.3
data 8.2
light 8
information 8
design 7.9
photographic 7.8
slide 7.8
panel 7.7
tape 7.7
hand 7.6
musical instrument 7.5
city 7.5
electronic 7.5
element 7.4
amplifier 7.4
office 7.4
speed 7.3
connection 7.3
aged 7.2
dirty 7.2
financial 7.1
broadcasting 7.1
monitor 7
server 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.7
screenshot 87.3
vending machine 30.8
subway 10.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 50.8%
Calm 79.8%
Confused 9.3%
Happy 4%
Surprised 2.8%
Sad 2.7%
Fear 0.6%
Disgusted 0.4%
Angry 0.4%

AWS Rekognition

Age 22-30
Gender Male, 95%
Calm 49.2%
Sad 21.8%
Happy 21.1%
Fear 2.8%
Angry 2%
Surprised 1.1%
Disgusted 1.1%
Confused 0.8%

AWS Rekognition

Age 2-8
Gender Male, 93.4%
Calm 98.2%
Happy 0.4%
Sad 0.3%
Confused 0.3%
Fear 0.2%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 21-29
Gender Male, 98.1%
Sad 98.7%
Calm 0.4%
Disgusted 0.4%
Confused 0.3%
Fear 0.1%
Angry 0.1%
Happy 0%
Surprised 0%

AWS Rekognition

Age 19-27
Gender Male, 96.7%
Calm 88.5%
Angry 4.2%
Sad 3.2%
Fear 2.4%
Surprised 0.5%
Disgusted 0.5%
Happy 0.4%
Confused 0.3%

Feature analysis

Amazon

Person 99.1%
Monitor 72.9%

Categories

Captions

Microsoft
created on 2022-01-08

graphical user interface, website 96.6%

Text analysis

Amazon

36
37
38
35
EASTMAN 37
SUPER
EASTMAN
SUPER XX
XX
١٧+

Google

37 EASTMAN 38 SUPER XX 36
37
EASTMAN
38
SUPER
XX
36