Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4611.1-3

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98
Human 98
Person 96.2
Person 93.8
Person 93.4
Electronics 88.9
Screen 88.1
Display 87.3
Person 84.8
Person 83.1
Person 81.2
Monitor 79.9
Person 78.4
Person 73.2
Interior Design 72.3
Indoors 72.3
Person 55.6
Person 41.8

Clarifai
created on 2023-10-25

negative 99.9
movie 99.8
filmstrip 99.8
slide 99.4
cinematography 99.3
exposed 99.3
photograph 99
noisy 96.7
screen 96.1
bobbin 95.8
video 94.2
art 94.1
retro 94.1
old 93.5
dirty 93.4
picture frame 92.4
collage 91.6
emulsion 91.3
desktop 91
vintage 89.8

Imagga
created on 2022-01-08

equipment 78.4
electronic equipment 71.6
equalizer 55.2
sequencer 30.4
technology 28.9
apparatus 24.3
digital 23.5
computer 22.4
digital clock 20.8
black 19.8
amplifier 19
clock 18.7
close 15.4
industry 15.4
business 15.2
panel 14.5
monitor 14.3
electronic 14
network 13.9
film 13.5
media 13.3
timepiece 13.1
liquid crystal display 13
board 12.9
data 12.8
information 12.4
button 12.3
display 12.2
grunge 11.9
communication 11.8
radio 11.6
retro 11.5
closeup 11.4
design 11.2
old 11.1
speed 11
music 10.8
video 10.6
electronics 10.4
device 10.2
negative 10.2
connection 10.1
vintage 9.9
hand 9.9
cinema 9.8
texture 9.7
silver 9.7
audio 9.6
system 9.5
receiver 9.4
radio receiver 9.4
sound 9.4
electric 9.4
number 9.3
instrument 8.9
office 8.8
light 8.7
electrical 8.6
play 8.6
tech 8.5
card 8.5
camera 8.3
sign 8.3
global 8.2
control 8.1
metal 8
filmstrip 7.9
screen 7.8
art 7.8
movie 7.7
line 7.7
studio 7.6
buttons 7.5
frame 7.5
service 7.4
entertainment 7.4
plastic 7.3
object 7.3
success 7.2
border 7.2
science 7.1
modern 7

Microsoft
created on 2022-01-08

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 98.2%
Sad 80.6%
Calm 10%
Disgusted 4.3%
Fear 1.7%
Angry 1.6%
Confused 1%
Happy 0.5%
Surprised 0.5%

AWS Rekognition

Age 30-40
Gender Female, 51.5%
Calm 57.2%
Sad 37.3%
Happy 1.7%
Disgusted 1.3%
Angry 1.1%
Fear 0.7%
Confused 0.6%
Surprised 0.2%

AWS Rekognition

Age 24-34
Gender Female, 86.9%
Calm 95.7%
Happy 3.5%
Disgusted 0.2%
Sad 0.1%
Angry 0.1%
Confused 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 24-34
Gender Male, 51.1%
Calm 75.7%
Sad 13.8%
Confused 3.3%
Fear 1.8%
Surprised 1.5%
Happy 1.4%
Angry 1.4%
Disgusted 1.3%

AWS Rekognition

Age 19-27
Gender Male, 97.2%
Calm 98.2%
Sad 0.7%
Happy 0.3%
Angry 0.2%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

Feature analysis

Amazon

Person 98%
Monitor 79.9%

Captions

Microsoft
created on 2022-01-08

graphical user interface 96.3%

Text analysis

Amazon

18
19
17
10
= in UNIVERSIDADE 7/0 10
in
7/0
=
UNIVERSIDADE

Google

18
61
18 61 18 19
19