Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4601.1-3

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.4
Person 99.4
Person 99.4
Person 99.3
Person 98.3
Person 97
Person 93.3
Person 91.9
Car 91.5
Transportation 91.5
Vehicle 91.5
Automobile 91.5
Monitor 87.4
Electronics 87.4
Display 87.4
Screen 87.4
Person 84.3
Person 82.8
Military 79.4
Military Uniform 75.3
Person 73.3
Person 70.2
Person 69.4
People 67.7
Nature 61.6
Outdoors 56.5
Text 55.9

Clarifai
created on 2023-10-25

negative 99.8
movie 99.7
filmstrip 99.1
slide 97.9
photograph 97.8
exposed 97.5
noisy 97.1
old 95.8
people 95.6
cinematography 95.5
retro 94.7
art 94.3
analogue 94.1
vintage 93.2
collage 91.8
screen 89.1
dirty 86.9
analog 86.2
emulsion 85.3
margin 84.7

Imagga
created on 2022-01-08

equipment 65.3
electronic equipment 54.7
sequencer 28.6
television 25.2
apparatus 23.8
equalizer 23.4
monitor 22.7
amplifier 21.8
technology 20.8
digital 17.8
computer 16.1
business 15.8
black 15
film 15
broadcasting 14.7
night 13.3
data 11.9
music 11.7
dark 11.7
information 11.5
telecommunication 11.4
close 11.4
light 11.4
screen 11.2
network 11.1
receiver 10.8
strip 10.7
retro 10.6
display 10.6
media 10.5
object 10.3
board 10.2
city 10
movie 9.7
button 9.7
grunge 9.4
destination 9.3
electronic 9.3
finance 9.3
entertainment 9.2
modern 9.1
design 9
financial 8.9
switch 8.8
server 8.8
radio 8.7
panel 8.7
radio receiver 8.6
audio 8.6
architecture 8.6
cable 8.6
system 8.6
money 8.5
sound 8.4
negative 8.4
old 8.4
sign 8.3
speed 8.2
connection 8.2
bank 8.2
closeup 8.1
cinema 7.9
digital clock 7.7
industry 7.7
hand 7.6
vintage 7.4
lights 7.4
camera 7.4
cash 7.3
medium 7.3
web site 7.2
art 7.2

Microsoft
created on 2022-01-08

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 4-10
Gender Female, 53.1%
Calm 53.7%
Happy 17.1%
Disgusted 15.5%
Sad 8%
Fear 2.1%
Angry 1.5%
Surprised 1.2%
Confused 0.8%

AWS Rekognition

Age 19-27
Gender Male, 81.3%
Calm 95.1%
Angry 2.3%
Sad 1.3%
Confused 0.4%
Happy 0.4%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

Feature analysis

Amazon

Person 99.4%
Car 91.5%
Monitor 87.4%

Categories

Imagga

cars vehicles 78.9%
interior objects 16.6%
text visuals 2.9%

Captions

Microsoft
created on 2022-01-08

graphical user interface, website 99.8%

Text analysis

Amazon

39
40
SUPER
41
SCHY
SUPER XX
XX
larco
larco Don
Don

Google

lora
Don
Fad
XX
lora Don Fad 39 SUPER XX
39
SUPER