Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4632

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98
Human 98
Person 92.7
Nature 65.4
Person 63.8
Outdoors 60.8
Window 60
Person 55.9

Clarifai
created on 2023-10-25

negative 100
filmstrip 99.9
movie 99.8
photograph 99.8
exposed 99.6
slide 99.6
cinematography 99.5
noisy 97.6
art 97.3
old 97.3
collage 96.1
bobbin 95.5
dirty 95.4
emulsion 95.4
analogue 94
retro 94
vintage 93.8
screen 93.3
margin 92.3
rust 92

Imagga
created on 2022-01-08

negative 52.6
film 49.9
equipment 31.9
memory 27.5
photographic paper 24.4
old 23
electronic equipment 21.7
city 20.8
device 17.2
vintage 16.5
retro 16.4
photographic equipment 16.3
digital 16.2
grunge 16.2
sequencer 15
architecture 14.9
movie 14.6
border 14.5
building 14.4
computer 14.4
modem 14
entertainment 13.8
slide 13.7
sky 13.4
frame 13.3
router 13.2
chip 12.8
rough 12.7
noise 12.7
dirty 12.6
technology 12.6
art 12.4
apparatus 12.1
texture 11.8
strip 11.6
cityscape 11.3
water 11.3
travel 11.3
town 11.1
industry 11.1
cinema 10.9
black 10.8
scratch 10.7
design 10.7
rust 10.6
collage 10.6
damaged 10.5
grungy 10.4
antique 10.4
graphic 10.2
camera 10.2
board 10
screen 9.9
central processing unit 9.8
photographic 9.8
urban 9.6
skyline 9.5
network 9.3
data 9.1
ocean 9.1
bridge 9
landscape 8.9
filmstrip 8.9
sea 8.6
connection 8.2
road 8.1
landmark 8.1
tower 8
history 8
decoration 8
business 7.9
noisy 7.9
port 7.7
construction 7.7
edge 7.7
weathered 7.6
buildings 7.6
pattern 7.5
close 7.4
speed 7.3
aged 7.2
coast 7.2
material 7.1
night 7.1
semiconductor device 7.1

Microsoft
created on 2022-01-08

text 94.9
screenshot 93.9

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 99.7%
Calm 98%
Sad 0.8%
Angry 0.3%
Confused 0.3%
Surprised 0.2%
Happy 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 97.3%
Calm 93.6%
Sad 2.3%
Angry 1.1%
Happy 0.9%
Disgusted 0.8%
Confused 0.7%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 22-30
Gender Male, 97.1%
Calm 98.3%
Sad 0.4%
Confused 0.4%
Angry 0.3%
Happy 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-33
Gender Female, 79.4%
Calm 95.9%
Sad 1.6%
Confused 0.7%
Surprised 0.5%
Happy 0.4%
Disgusted 0.3%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 11-19
Gender Male, 98.9%
Calm 97.2%
Angry 0.8%
Sad 0.7%
Confused 0.3%
Disgusted 0.3%
Happy 0.3%
Surprised 0.1%
Fear 0.1%

Feature analysis

Amazon

Person 98%

Text analysis

Amazon

CANGRO
CANGRO TRA
POWER
TRA
31
POWER TR
TR
CONVEYOR
VEE DRIVES
+VC
BREED

Google

CANGRO TRA PEEO POWER TRI VEE BRIVS CONVEYOR 31
CANGRO
POWER
VEE
31
TRA
PEEO
TRI
BRIVS
CONVEYOR