Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4592.1-2

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.4
Person 99.4
Person 99.2
Person 99
Person 98.4
Person 95.5
Person 93.3
Person 86.3
Vehicle 82.4
Transportation 82.4
Interior Design 79.3
Indoors 79.3
Person 74.3
Train 67.6
Bus 62.7
Person 60.7
Overcoat 56
Clothing 56
Coat 56
Apparel 56
Person 53.1
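
The Amazon list above is name/confidence output of the kind AWS Rekognition's DetectLabels API returns. A minimal sketch of how such a list could be reproduced with boto3, assuming a local scan of the sheet; the filename, region, and thresholds are illustrative, not the museum's actual pipeline:

```python
import boto3

# Region and credentials come from your AWS configuration; both are placeholders here.
client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local scan of the contact sheet.
with open("contact_sheet.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=50,  # the lowest score kept above is Person 53.1
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated rows such as the many Person entries above correspond to
    # per-instance detections nested under a single label.
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}")
```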

Clarifai
created on 2023-10-25

negative 99.6
movie 99.3
people 98.6
filmstrip 97.8
slide 96.9
collage 96.4
analogue 96.4
art 94.6
vintage 94
old 93.4
photograph 93.4
screen 93.3
window 92
noisy 91.8
exposed 90.7
cinematography 90.4
retro 89.9
analog 88.7
rust 87.5
man 87.4
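
Clarifai scores concepts on a 0-1 scale; the list above shows them as percentages. A hedged sketch against Clarifai's v2 REST endpoint, assuming key-based authorization and the public general image-recognition model; the API key, model ID, and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
MODEL_ID = "general-image-recognition"   # assumed public model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/contact_sheet.jpg"}}}]},
)
response.raise_for_status()

# Concept values are 0-1; scale by 100 to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```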

Imagga
created on 2022-01-08

equipment 37.5
electronic equipment 31.4
amplifier 24
city 20.8
business 18.2
architecture 17.2
building 16.8
old 16.7
folder 16.5
film 15.3
showing 15
bank 14.9
register 14.8
sequencer 14.6
word 14.1
negative 13.9
apparatus 13.6
finance 13.5
office 13.2
device 13.2
urban 13.1
card 12.7
paper 12.5
financial 12.5
banking 11.9
grunge 11.9
retro 11.5
center 11.1
investment 11
black 10.8
electronic instrument 10.5
art 10.4
money 10.2
economy 10.2
invest 9.7
military 9.6
music 9.1
vintage 9.1
musical instrument 9
success 8.8
synthesizer 8.8
paperwork 8.8
frame 8.3
entertainment 8.3
structure 8.3
historic 8.2
radio 8.2
technology 8.2
landmark 8.1
tab 7.9
design 7.9
movie 7.8
travel 7.7
file 7.7
construction 7.7
famous 7.4
town 7.4
lights 7.4
light 7.3
cash 7.3
data 7.3
new 7.3
dirty 7.2
computer 7.2
broadcasting 7.2
tower 7.2
history 7.1
night 7.1
information 7.1
work 7.1
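
Imagga exposes tagging over an authenticated REST endpoint. A minimal sketch, assuming an API key/secret pair and a publicly reachable image URL (all placeholders); the separate /v2/categories endpoint behind the "Categories" section further down is called the same way:

```python
import requests

# Imagga uses HTTP Basic auth with a key/secret pair; placeholders here.
auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/contact_sheet.jpg"},
    auth=auth,
)
response.raise_for_status()

# Each tag carries an English name and a 0-100 confidence, as listed above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```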

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

indoor 92
text 83.4
ship 54.5
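
The Microsoft tags correspond to Azure Computer Vision's image-analysis Tags feature, which scores 0-1. A sketch assuming the v3.2 REST analyze endpoint; the resource endpoint, key, and image URL are placeholders:

```python
import requests

# Placeholders for an Azure Computer Vision resource.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/contact_sheet.jpg"},
)
response.raise_for_status()

# Confidence is 0-1; scale by 100 to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```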

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Male, 57%
Happy 57.9%
Calm 39.4%
Confused 0.9%
Surprised 0.6%
Sad 0.4%
Disgusted 0.3%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Male, 97.1%
Calm 55.4%
Confused 15.8%
Sad 12.2%
Surprised 7.9%
Disgusted 3.8%
Angry 2.7%
Fear 1.2%
Happy 1%

AWS Rekognition

Age 21-29
Gender Male, 98.9%
Calm 73.6%
Confused 11.7%
Sad 6.5%
Surprised 3.4%
Happy 2.4%
Fear 1%
Disgusted 0.7%
Angry 0.7%

AWS Rekognition

Age 13-21
Gender Male, 68.8%
Happy 30.3%
Calm 25.3%
Sad 16.6%
Fear 11.2%
Disgusted 8.5%
Angry 4%
Surprised 2.9%
Confused 1.3%

AWS Rekognition

Age 25-35
Gender Male, 96.8%
Disgusted 70%
Calm 15%
Angry 5.9%
Sad 3.4%
Happy 2.5%
Surprised 2.3%
Fear 0.5%
Confused 0.3%

AWS Rekognition

Age 6-14
Gender Female, 99.9%
Confused 84.6%
Fear 7.7%
Sad 4.5%
Calm 1.4%
Surprised 0.7%
Disgusted 0.5%
Angry 0.3%
Happy 0.2%

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Calm 76.1%
Sad 13.2%
Angry 5.5%
Confused 2.4%
Disgusted 1.3%
Surprised 0.6%
Fear 0.5%
Happy 0.4%
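
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and eight independently scored emotions. A minimal sketch producing the same shape of output with Rekognition's DetectFaces and full attributes; the filename and region are illustrative:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local scan of the contact sheet.
with open("contact_sheet.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored independently; sort descending to mirror the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```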

Feature analysis

Amazon

Person 99.4%
Train 67.6%
Bus 62.7%

Categories

Imagga

interior objects 99.8%

Text analysis

Amazon

24
23
VOTE
+
DRIN
HERE
/11
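
The detected strings above (including fragments such as "DRIN", left exactly as recognized) are OCR output over the sheet's margins and signage. A minimal sketch using Rekognition's DetectText; the filename and region are illustrative:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("contact_sheet.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to entries like "VOTE" and "HERE" above;
# WORD detections break each line into individual tokens.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])
```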

Google

23 23 24
23
24
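
Google's text results group the recognized tokens into one full block ("23 23 24") followed by the individual entries. A sketch using the google-cloud-vision Python client, assuming credentials are supplied via the GOOGLE_APPLICATION_CREDENTIALS environment variable; the filename is illustrative:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local scan of the contact sheet.
with open("contact_sheet.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected block; the rest are its tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```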