Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4602.1-4

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 98.6
Person 96.8
Person 93.2
Person 78.4
Person 67.7
Monitor 63.5
Screen 63.5
Display 63.5
Electronics 63.5
Text 63
Chair 57.5
Furniture 57.5
Person 50.7
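
The Amazon tags above pair a label with a percentage confidence, which is the shape of output Amazon Rekognition's label detection returns. The sketch below is a minimal, assumed reproduction using boto3; the image path, region, and the MaxLabels/MinConfidence values are placeholders, not the settings of the pipeline that generated this record.

# Minimal sketch: Rekognition-style label tags with confidence scores.
# Assumes AWS credentials are configured for boto3; the file path is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:        # placeholder image file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,                             # the list above bottoms out near 50%
)

for label in response["Labels"]:
    # prints lines like "Person 99.5" or "Monitor 63.5"
    print(f"{label['Name']} {label['Confidence']:.1f}")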

Clarifai
created on 2023-10-25

negative 99.4
movie 99.2
filmstrip 97.7
exposed 96.1
slide 95.8
old 95
photograph 94.4
retro 94.1
dirty 93.3
art 93.2
rust 92.5
no person 91.9
people 91.3
cinematography 89.9
vintage 88.7
collage 88.3
analogue 86.9
desktop 86.3
wall 86.1
margin 85.6

Imagga
created on 2022-01-08

sequencer 100
apparatus 98.7
equipment 97.8
electronic equipment 38.2
equalizer 37
film 25.5
negative 20.9
old 18.1
grunge 17.9
technology 17.8
retro 17.2
business 17
cinema 16.6
vintage 16.5
movie 15.5
border 15.4
computer 14.4
frame 14.1
texture 13.9
digital 13.8
slide 13.7
black 13.2
art 13
camera 12.9
industry 12.8
screen 12.2
noise 11.7
strip 11.6
paper 11
filmstrip 10.8
scratch 10.7
office 10.4
graphic 10.2
finance 10.1
entertainment 10.1
rough 10
music 9.9
photographic 9.8
tape 9.6
word 9.4
data 9.1
dirty 9
design 9
information 8.9
text 8.7
rust 8.7
antique 8.6
damaged 8.6
close 8.6
photography 8.5
sound 8.4
pattern 8.2
35mm 7.9
space 7.8
panel 7.7
collage 7.7
edge 7.7
audio 7.6
roll 7.6
grungy 7.6
city 7.5
element 7.4
device 7.4
connection 7.3
metal 7.2
aged 7.2
financial 7.1
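
The Imagga list above has the same tag-plus-confidence structure, delivered over Imagga's v2 REST tagging endpoint (the Clarifai list comes from an analogous predict call against Clarifai's API). The sketch below is one assumed way to fetch such a list with the requests library; the API key, secret, and image URL are placeholders, and the request parameters actually used for this record are not documented here.

# Minimal sketch: tag/confidence pairs from Imagga's v2 tagging endpoint.
# IMAGGA_KEY, IMAGGA_SECRET, and IMAGE_URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/contact_sheet.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),             # HTTP Basic auth
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # prints lines like "sequencer 100.0" or "film 25.5"
    print(f"{item['tag']['en']} {item['confidence']:.1f}")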

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 99.7%
Calm 69%
Sad 18.1%
Surprised 3.6%
Angry 2.8%
Confused 2.4%
Happy 1.6%
Fear 1.3%
Disgusted 1.1%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 84.1%
Sad 9%
Confused 3.2%
Surprised 1.2%
Fear 0.7%
Angry 0.7%
Disgusted 0.6%
Happy 0.4%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 99.3%
Sad 0.4%
Confused 0.1%
Disgusted 0%
Surprised 0%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Male, 99.3%
Happy 44.5%
Calm 41.5%
Sad 4.9%
Angry 4.4%
Fear 1.7%
Confused 1.4%
Disgusted 0.9%
Surprised 0.7%

AWS Rekognition

Age 21-29
Gender Male, 58.2%
Calm 77.4%
Sad 21.7%
Confused 0.4%
Surprised 0.2%
Happy 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Calm 71.8%
Sad 14.8%
Angry 7.4%
Happy 2.9%
Fear 1%
Confused 0.9%
Disgusted 0.6%
Surprised 0.5%

AWS Rekognition

Age 22-30
Gender Male, 93.9%
Calm 95.1%
Angry 2.4%
Sad 1.6%
Confused 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Male, 99.9%
Calm 98%
Angry 0.8%
Happy 0.4%
Sad 0.4%
Fear 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 6-14
Gender Female, 61.8%
Calm 97.9%
Sad 1.6%
Fear 0.1%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 12-20
Gender Male, 97.2%
Calm 93.4%
Angry 3%
Sad 2.8%
Confused 0.3%
Fear 0.2%
Happy 0.2%
Disgusted 0.1%
Surprised 0.1%
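
Each AWS Rekognition block above (an age range, a gender with confidence, and a confidence per emotion) matches the face attributes Rekognition's DetectFaces operation returns when all attributes are requested. A minimal boto3 sketch follows; the image path and region are placeholders rather than the settings used to generate these results.

# Minimal sketch: per-face age range, gender, and emotion confidences.
# Assumes AWS credentials are configured for boto3; the file path is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:        # placeholder image file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],                           # required for age, gender, emotions
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # prints lines like "Calm 69.0%" or "Sad 18.1%"
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")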

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
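
Unlike Rekognition, the Google Vision results above report fixed likelihood buckets (from "Very unlikely" to "Very likely") for each face attribute rather than percentages. The sketch below shows one assumed way to read those per-face likelihoods with the google-cloud-vision client; the file path is a placeholder and Google application credentials are assumed to be configured.

# Minimal sketch: face-attribute likelihood buckets from the Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()            # uses GOOGLE_APPLICATION_CREDENTIALS

with open("contact_sheet.jpg", "rb") as f:        # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values 0-5 map onto the buckets shown in this record.
buckets = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", buckets[face.surprise_likelihood])
    print("Anger", buckets[face.anger_likelihood])
    print("Sorrow", buckets[face.sorrow_likelihood])
    print("Joy", buckets[face.joy_likelihood])
    print("Headwear", buckets[face.headwear_likelihood])
    print("Blurred", buckets[face.blurred_likelihood])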

Feature analysis

Amazon

Person 99.5%
Monitor 63.5%

Categories

Captions

Text analysis

Amazon

44
42
43
SUPER
5
FILM
SUPER XX
RINGLING
TY FILM
XX
TY

Google

SUPER XX
SUPER
XX
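
The fragments above are the sort of strings optical character recognition picks up from a contact sheet's film edge markings ("SUPER XX", "FILM") and frame numbers. A minimal sketch of how similar text could be extracted with Rekognition's DetectText operation follows (Google's Vision API text detection is analogous); the file path, region, and the choice to keep only line-level detections are illustrative assumptions.

# Minimal sketch: line-level text detections from Rekognition.
# Assumes AWS credentials are configured for boto3; the file path is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:        # placeholder image file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":               # skip word-level duplicates
        # prints strings like "SUPER XX" or "44"
        print(detection["DetectedText"])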