Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4613

Machine Generated Data

Tags (label confidence scores, 0-100)

Amazon
created on 2022-01-08

Indoors 99.9
Interior Design 99.9
Person 96.5
Human 96.5
Person 96.1
Person 95.7
Screen 93.2
Electronics 93.2
Display 92.6
Person 92.4
Person 79.7
Person 69
Room 67.3
Theater 62.3
Person 62.1
Monitor 61
Person 60.6
LCD Screen 60.3
Television 59.6
TV 59.6
Clothing 59.4
Apparel 59.4
Suit 59.4
Coat 59.4
Overcoat 59.4
Cinema 57
Person 53.2
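
The labels above read like Amazon Rekognition output. A minimal sketch of how such tags could be regenerated with boto3, assuming a local copy of the scanned sheet (the file name is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the contact-sheet scan.
    with open("contact_sheet.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # the list above bottoms out near 53
        )

    # Rekognition reports confidence as a 0-100 float, matching the tags above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")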

Clarifai
created on 2023-10-25

movie 99.7
negative 99.6
filmstrip 98.8
slide 98.5
exposed 98
analogue 97.9
photograph 97.5
art 97.4
people 96.3
cinematography 95.6
collage 95.6
portrait 94.1
rust 93.6
window 93.3
analog 93.1
retro 93
screen 93
dirty 92.9
old 92.7
vintage 90.5
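
These concept tags are consistent with Clarifai's general image-recognition model. A hedged sketch against Clarifai's v2 REST predict endpoint; the model identifier, API key placeholder, and image URL are all assumptions:

    import requests

    MODEL_ID = "general-image-recognition"  # assumed model identifier
    url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

    payload = {"inputs": [{"data": {"image": {"url": "https://example.com/contact_sheet.jpg"}}}]}
    resp = requests.post(url, json=payload, headers={"Authorization": "Key YOUR_API_KEY"})
    resp.raise_for_status()

    # Clarifai returns concept values on a 0-1 scale; multiply by 100
    # to match the tag scores listed above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")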

Imagga
created on 2022-01-08

equipment 92.3
sequencer 76.1
apparatus 60.6
electronic equipment 51.5
equalizer 33.1
technology 26
amplifier 22.5
film 19.6
sound 18.7
computer 18.4
music 18
negative 17.6
old 17.4
black 16.2
digital 16.2
control 15.6
grunge 15.3
audio 15.3
retro 14.7
cinema 14.6
vintage 14.1
data 13.7
industry 13.7
panel 13.5
business 12.7
tape 12.7
close 12.5
button 12.3
network 12
plastic 11.9
device 11.8
texture 11.8
clock 11.7
digital clock 11.7
strip 11.6
movie 11.6
play 11.2
entertainment 11
filmstrip 10.8
studio 10.6
connection 10.1
dirty 9.9
switch 9.8
stereo 9.8
record 9.7
office 9.6
electronics 9.5
frame 9.2
border 9
instrument 9
radio 9
knob 8.9
information 8.9
slide 8.8
electrical 8.6
system 8.6
media 8.6
mixer 8.5
design 8.4
power 8.4
electronic 8.4
speed 8.2
metal 8
recording 8
silver 8
cassette 7.9
analog 7.9
noise 7.8
screen 7.8
blank 7.7
musical 7.7
timepiece 7.7
hand 7.6
roll 7.6
word 7.5
electric 7.5
closeup 7.4
camera 7.4
object 7.3
receiver 7.3
graphic 7.3
reel 7.3
art 7.2
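
Imagga's tags appear to come from its auto-tagging service. A sketch using Imagga's v2 tags endpoint with HTTP basic auth; the credentials and image URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/contact_sheet.jpg"},
        auth=("API_KEY", "API_SECRET"),  # placeholder Imagga credentials
    )
    resp.raise_for_status()

    # Imagga reports confidence on a 0-100 scale, like the list above.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")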

Microsoft
created on 2022-01-08

text 96.6
screenshot 94.9
indoor 87.4
photographic film 81.4
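
The Microsoft tags match the Azure Computer Vision analyze endpoint with the Tags visual feature. A sketch against the v3.2 REST API; the resource endpoint, subscription key, and image URL are placeholders:

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/contact_sheet.jpg"},
    )
    resp.raise_for_status()

    # The API returns confidence in 0-1; scale to match the 0-100 values above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")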

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 54.1%
Calm 96.1%
Sad 1.6%
Angry 1%
Confused 0.7%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 39.2%
Sad 35.5%
Angry 9.5%
Disgusted 7.6%
Confused 4.1%
Surprised 2.8%
Fear 0.7%
Happy 0.6%

AWS Rekognition

Age 0-3
Gender Female, 98%
Calm 48.6%
Fear 32.4%
Surprised 8.2%
Sad 4.1%
Happy 3.6%
Disgusted 2.2%
Angry 0.7%
Confused 0.2%

AWS Rekognition

Age 4-12
Gender Female, 99.9%
Surprised 94.2%
Fear 1.6%
Calm 1.6%
Confused 1.3%
Angry 0.5%
Sad 0.4%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 6-14
Gender Male, 85.4%
Fear 81.4%
Calm 7.4%
Surprised 4.3%
Sad 2.5%
Confused 2.3%
Happy 0.8%
Angry 0.7%
Disgusted 0.7%
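
A minimal sketch of how the age-range, gender, and emotion estimates above could be reproduced with Rekognition's face detection (the file name is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("contact_sheet.jpg", "rb") as f:  # hypothetical local scan
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required for age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unordered; sort by confidence as listed above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")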

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
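
The likelihood buckets above (Very unlikely through Very likely) match the google-cloud-vision face-detection enums. A sketch, again assuming a hypothetical local copy of the image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("contact_sheet.jpg", "rb") as f:  # hypothetical local scan
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum: VERY_UNLIKELY through VERY_LIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)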

Feature analysis

Amazon

Person 96.5%
Monitor 61%
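
Feature rows like these correspond to Rekognition labels that carry per-object instances with bounding boxes. A standalone sketch (file name hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("contact_sheet.jpg", "rb") as f:  # hypothetical local scan
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    # Only labels with Instances have localized detections; BoundingBox
    # coordinates are ratios of the image dimensions.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(f"{label['Name']} {instance['Confidence']:.1f}% at "
                  f"({box['Left']:.2f}, {box['Top']:.2f}, "
                  f"{box['Width']:.2f}x{box['Height']:.2f})")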

Text analysis

Amazon

21

Google

21 21
21
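
A sketch of how the detected text could be reproduced with Rekognition's text detection (file name hypothetical); the "21" is likely a frame number printed on the film edge:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("contact_sheet.jpg", "rb") as f:  # hypothetical local scan
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE entries aggregate the individual WORD detections.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")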