Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4634

Machine Generated Data

Tags

The value after each tag is the service's confidence score on a 0-100 scale.

Amazon
created on 2022-01-08

Person 98.2
Human 98.2
Person 94.8
Person 94.1
Person 89.4
Person 88.9
Person 83.6
Person 79.2
Person 78.4
Person 76.2
Person 62.7
Nature 61.9
Plant 59.1
Snow 57.7
Outdoors 57.7
Screen 57.4
Electronics 57.4
Person 49.9
Person 45.1
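
The Amazon tags above are the kind of labels returned by AWS Rekognition's DetectLabels operation; repeated "Person" rows correspond to individual detected instances of the same label. A minimal boto3 sketch, assuming an illustrative region, local image file, and confidence threshold (none of which are taken from this record):

import boto3

# Illustrative sketch: the region, file path, and thresholds are assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40.0,
)

for label in response["Labels"]:
    # Each label has a name and a 0-100 confidence score.
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated "Person" rows typically come from per-instance detections,
    # each with its own confidence.
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}")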

Clarifai
created on 2023-10-25

negative 100
filmstrip 99.9
movie 99.9
slide 99.8
exposed 99.6
photograph 99.6
cinematography 99.3
noisy 98
art 97.2
old 97.2
video 96.8
dirty 96.6
collage 96.2
bobbin 96
screen 95.9
analogue 95.8
retro 95.2
emulsion 94.4
rust 94
window 93.3
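
The Clarifai concepts could be reproduced with a predict call against Clarifai's general image-recognition model. A rough sketch using the REST API via requests; the endpoint path, model id, API key, and image URL below are assumptions, not details taken from this record:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general visual classifier
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/contact_sheet.jpg"}}}]}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 probabilities; the listing above shows them scaled to 0-100.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")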

Imagga
created on 2022-01-08

equipment 66.1
electronic equipment 50.9
sequencer 37
memory 33.7
film 33.7
apparatus 29.5
negative 29
equalizer 26.3
modem 25
technology 23
retro 22.9
digital 21.9
movie 19.4
computer 19.2
central processing unit 18.9
data 18.3
camera 17.6
strip 17.5
old 17.4
information 16.8
frame 16.6
slide 16.6
vintage 16.5
filmstrip 15.8
device 15.7
cinema 15.6
blank 15.4
border 15.4
grunge 15.3
texture 15.3
network 14.8
black 14.4
close 13.7
business 13.4
entertainment 12.9
industry 12.8
tape 12.5
art 12.4
connection 11.9
photographic 11.8
dirty 11.8
photography 11.4
design 11.3
office 11.2
electronic 11.2
amplifier 11.1
object 11
noise 10.8
router 10.4
chip 10.3
microprocessor 9.8
silver 9.7
photograph 9.6
roll 9.5
graphic 9.5
paper 9.4
sound 9.4
rough 9.1
music 9
35mm 8.9
antique 8.7
hardware 8.6
damaged 8.6
electronics 8.5
card 8.5
element 8.3
speed 8.2
analog 7.9
scratch 7.8
video 7.7
rust 7.7
media 7.6
studio 7.6
tech 7.6
communication 7.6
screen 7.5
electric 7.5
number 7.5
metal 7.2
aged 7.2
board 7.2
reel 7.1
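
Imagga's tags correspond to its /v2/tags endpoint, which returns a ranked list of tag and confidence pairs. A minimal sketch using requests; the credentials and image URL are placeholders:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/contact_sheet.jpg"},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key and secret
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Each entry pairs a 0-100 confidence with a tag name keyed by language.
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")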

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

screenshot 88.6
text 84.9
open 40.8
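
The Microsoft tags ("screenshot", "text", "open") are the sort of output returned by Azure's Computer Vision tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # placeholder key
)

result = client.tag_image("https://example.org/contact_sheet.jpg")
for tag in result.tags:
    # The SDK returns confidence as 0-1; the listing above shows a percentage.
    print(f"{tag.name} {tag.confidence * 100:.1f}")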

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 51.7%
Confused 17%
Sad 15.3%
Angry 11.2%
Disgusted 2.3%
Surprised 1.1%
Fear 0.8%
Happy 0.6%

AWS Rekognition

Age 20-28
Gender Male, 99.5%
Calm 90.4%
Sad 8.5%
Happy 0.3%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 19-27
Gender Male, 95.3%
Calm 99.5%
Disgusted 0.1%
Sad 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Male, 97.7%
Calm 98%
Happy 0.5%
Disgusted 0.4%
Angry 0.4%
Sad 0.2%
Confused 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Male, 98%
Calm 91.8%
Angry 5.1%
Sad 1.3%
Disgusted 0.9%
Surprised 0.4%
Fear 0.3%
Happy 0.1%
Confused 0.1%

AWS Rekognition

Age 16-22
Gender Female, 53.1%
Calm 65.9%
Sad 22.1%
Confused 3.8%
Angry 2.5%
Happy 2.4%
Fear 1.1%
Disgusted 1.1%
Surprised 1%

AWS Rekognition

Age 29-39
Gender Male, 78.9%
Calm 98.3%
Sad 0.5%
Angry 0.4%
Happy 0.3%
Fear 0.2%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 13-21
Gender Female, 77.4%
Sad 84.8%
Angry 5.3%
Calm 4.9%
Fear 1.9%
Confused 1.5%
Disgusted 0.7%
Happy 0.5%
Surprised 0.4%

AWS Rekognition

Age 30-40
Gender Male, 92.7%
Calm 87.3%
Sad 9.2%
Happy 1%
Fear 0.7%
Disgusted 0.6%
Surprised 0.4%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 26-36
Gender Female, 63%
Calm 97.9%
Happy 0.8%
Sad 0.6%
Confused 0.3%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%
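
Each AWS Rekognition face block above matches one entry in a DetectFaces response requested with all attributes. A minimal boto3 sketch, assuming an illustrative region and local image file:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("contact_sheet.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored independently; sort from most to least likely.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")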

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
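
Google Vision reports face attributes as bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads differently from the Rekognition ones. A sketch with the google-cloud-vision client; the image URI is a placeholder and credentials are assumed to come from the environment:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/contact_sheet.jpg"))

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)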

Feature analysis

Amazon

Person 98.2%

Captions

Text analysis

Amazon

34
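
The "34" under Amazon is OCR output of the kind returned by Rekognition's DetectText operation. A minimal boto3 sketch with the same illustrative assumptions as above:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("contact_sheet.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    # LINE entries are whole lines; WORD entries are single tokens such as "34".
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")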

Google

34 34
34
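
Google's text hits would come from Vision's OCR endpoint, whose first annotation is the full detected text and whose remaining annotations are individual tokens. A sketch with the same google-cloud-vision client; the image URI is again a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/contact_sheet.jpg"))

response = client.text_detection(image=image)
annotations = response.text_annotations
if annotations:
    print(annotations[0].description)  # full detected text, e.g. "34 34"
    for token in annotations[1:]:
        print(token.description)       # individual tokens, e.g. "34"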