Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4598

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.9
Human 99.9
Person 99.7
Person 99.7
Person 98.6
Person 93.7
Person 88.2
Person 85.5
Monitor 69.1
Electronics 69.1
Display 69.1
Screen 69.1
Person 65.2
People 64.2
Outdoors 62.1

Clarifai
created on 2023-10-25

movie 99
art 98.2
people 96.9
collage 96.8
analogue 96.7
negative 96.6
slide 96.4
filmstrip 94
picture frame 93.1
window 92.8
dirty 92.1
wall 91.5
no person 91.3
screen 90.9
vintage 90.9
margin 90.9
exposed 90.9
old 90.9
retro 90.9
wear 90.4

Imagga
created on 2022-01-08

sequencer 100
apparatus 100
equipment 100
electronic equipment 31
technology 24.5
amplifier 20
computer 20
digital 19.4
business 17
data 16.4
industry 16.2
black 16.2
film 16
connection 15.5
network 13.9
cinema 13.7
old 13.2
office 12.8
negative 12.5
sound 12.2
grunge 11.9
equalizer 11.9
music 11.7
movie 11.6
information 11.5
vintage 10.7
retro 10.6
audio 10.5
media 10.5
electronic 10.3
device 10.2
texture 9.7
button 9.7
card 9.3
camera 9.2
entertainment 9.2
frame 9.1
close 9.1
border 9
filmstrip 8.9
object 8.8
slide 8.8
strip 8.7
video 8.7
electrical 8.6
cable 8.6
electronics 8.5
screen 8.5
word 8.5
art 8.5
finance 8.4
communication 8.4
plastic 8.3
paper 7.8
plug 7.8
folder 7.8
panel 7.7
blank 7.7
tape 7.7
studio 7.6
security 7.3
design 7.3
industrial 7.3
success 7.2
financial 7.1
server 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 75.7%
Happy 86.9%
Calm 9.5%
Surprised 0.9%
Sad 0.8%
Fear 0.8%
Disgusted 0.4%
Angry 0.4%
Confused 0.3%

AWS Rekognition

Age 20-28
Gender Male, 82.9%
Calm 97%
Sad 2.2%
Happy 0.2%
Confused 0.2%
Angry 0.1%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 20-28
Gender Male, 57.7%
Calm 91.7%
Happy 4.2%
Sad 1.4%
Angry 1.1%
Confused 0.6%
Surprised 0.5%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 12-20
Gender Female, 64.1%
Calm 92.7%
Happy 5.3%
Sad 1.4%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 7-17
Gender Male, 51.7%
Calm 94%
Confused 2.1%
Happy 1.6%
Sad 1.2%
Angry 0.5%
Disgusted 0.3%
Fear 0.2%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.9%
Monitor 69.1%

Categories

Imagga

cars vehicles 91.8%
interior objects 5.9%

Text analysis

Amazon

33
M

Google

33
33