Human-Generated Data

Title

Untitled (San Gennaro Festival, Mulberry Street, New York City)

Date

September 1950

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5601

Machine-Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2019-11-14

Person 99.4
Human 99.4
Person 99.3
Person 98.7
Screen 93.2
Electronics 93.2
Display 93.2
Monitor 93.2
Person 91.4
Person 82.9
Person 67.4
People 60.7
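
The label/confidence pairs above are the raw output of automated image-tagging services. As a minimal sketch, the Amazon list could be reproduced with AWS Rekognition's DetectLabels API via boto3; the filename and thresholds here are assumptions for illustration, not the museum's actual pipeline:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the digitized photograph.
    with open("shahn_san_gennaro.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # matching entries like "Person 99.4" and "Monitor 93.2" above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=60,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")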

Clarifai
created on 2019-11-14

movie 98.1
people 94
screen 90.1
picture frame 85.5
no person 85
adult 81.4
desktop 81.2
window 80.2
old 78.2
business 76.7
art 75.8
man 75.6
illustration 75.5
group 74.4
technology 73.9
city 71.9
retro 70.4
industry 68.9
image 68
modern 67.8

Imagga
created on 2019-11-14

monitor 100
television 83.9
electronic equipment 73.4
equipment 65
broadcasting 40.2
screen 31.9
telecommunication 29.3
technology 27.5
computer 24.5
display 24
telecommunication system 21.6
business 21.2
medium 19.5
electronic 17.7
communication 16.8
flat 16.4
object 16.1
hand 15.9
liquid crystal display 15.9
black 15.6
digital 15.4
media 15.2
laptop 14.7
global 14.6
video 14.5
finance 14.4
modern 14
design 13.5
frame 13.3
close 13.1
money 12.8
information 12.4
dollar 12.1
office 12
blank 12
wealth 11.7
financial 11.6
keyboard 11.3
entertainment 11
banking 11
liquid crystal 10.9
bank 10.7
web 10.2
closeup 10.1
symbol 10.1
data 10
currency 9.9
film 9.8
panel 9.6
us 9.6
home 9.6
electronics 9.5
savings 9.3
cash 9.1
vintage 9.1
sign 9
collage 8.7
desktop 8.7
paper 8.6
space 8.5
tech 8.5
showing 8.4
notebook 8.4
map 8.4
old 8.4
network 8.3
one 8.2
movie 7.8
news 7.7
hundred 7.7
3d 7.7
navigation 7.7
wide 7.7
communications 7.7
exchange 7.6
system 7.6
bill 7.6
studio 7.6
rich 7.4
car 7.4
note 7.3
investment 7.3
speed 7.3
art 7.2
silver 7.1
work 7.1

Google
created on 2019-11-14

Picture frame 84.8
Room 71.4
Photography 67.8
Interior design 53.6
Art 50.2

Microsoft
created on 2019-11-14

screenshot 95.1
text 93.1
indoor 91

Face analysis (one block per detected face)

Amazon

AWS Rekognition

Age 25-39
Gender Female, 51%
Calm 45.9%
Happy 46.2%
Sad 46.1%
Disgusted 45.1%
Surprised 45.2%
Fear 45.5%
Confused 45.2%
Angry 50.8%

AWS Rekognition

Age 21-33
Gender Female, 51.5%
Surprised 45%
Confused 45%
Calm 53.7%
Happy 46.1%
Angry 45%
Disgusted 45%
Sad 45.1%
Fear 45%

AWS Rekognition

Age 17-29
Gender Female, 53.1%
Surprised 46.3%
Sad 45.3%
Angry 47.7%
Happy 45.6%
Confused 45.3%
Calm 49.2%
Disgusted 45.4%
Fear 45.2%

AWS Rekognition

Age 21-33
Gender Female, 54.4%
Happy 45%
Surprised 45%
Calm 45.3%
Confused 45%
Angry 45%
Fear 45.1%
Disgusted 45%
Sad 54.6%

AWS Rekognition

Age 20-32
Gender Female, 50.4%
Sad 49.5%
Confused 49.5%
Disgusted 49.5%
Calm 49.6%
Happy 49.8%
Angry 49.9%
Surprised 49.5%
Fear 49.5%
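
Each "AWS Rekognition" block above corresponds to one detected face, with an estimated age range, a gender guess with confidence, and a score for each emotion. A minimal sketch of the call that yields this shape, again with a hypothetical filename:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_san_gennaro.jpg", "rb") as f:  # hypothetical path
        image_bytes = f.read()

    # Attributes=["ALL"] requests the full face profile, including
    # AgeRange, Gender, and Emotions, one entry per detected face.
    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in faces["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")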

Feature analysis

Amazon

Person 99.4%
Monitor 93.2%

Text analysis

Amazon

29
SUPER
67
29 SUPER Xx
Xx
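
The mix of full lines ("29 SUPER Xx") and individual tokens ("29", "SUPER", "Xx") matches Rekognition's DetectText output, which reports both LINE and WORD detections. A minimal sketch under the same assumptions as above:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_san_gennaro.jpg", "rb") as f:  # hypothetical path
        image_bytes = f.read()

    # Each detection is typed LINE or WORD and carries the decoded text.
    text = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in text["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])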

Google

SUPER XX 29
SUPER
XX
29