Human Generated Data

Title

Housing, Improved: United States. New York. Brooklyn. "Riverside Buildings": Riverside Buildings - Brooklyn, N.Y. Children on sand pile.

Date

c. 1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.1887

Machine Generated Data

Tags

Amazon
created on 2019-06-04

Human 98.3
Person 98.3
Person 98
Advertisement 96.8
Poster 96.8
Person 94
Person 91.4
Person 88.4
Person 86.8
Person 86.4
Person 84.1
Person 80.3
Person 79.9
Person 75.1
Art 72.3
Person 70.4
People 69.4
Person 61.9
Military Uniform 56.4
Military 56.4
Face 55.6
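
The tags above pair a label name with a confidence score from 0 to 100, the shape of an Amazon Rekognition DetectLabels response. A minimal sketch of how such labels might be retrieved with boto3 follows; the image file name and region are placeholders, not part of the source record.

```python
import boto3

# Hypothetical local file for the digitized photograph; not part of the catalog record.
IMAGE_PATH = "riverside_buildings.jpg"

# Region is an arbitrary placeholder.
client = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# comparable to the tag/score pairs listed above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MaxLabels=25)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```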

Clarifai
created on 2019-06-04

people 99.9
group 99.5
adult 99
many 98.3
wear 98.1
print 98
art 96.6
painting 95.6
man 95.3
illustration 94.2
album 93.3
room 92.3
furniture 91.7
one 91.2
woman 90.6
leader 89.2
exhibition 89
group together 88
portrait 87.9
picture frame 86.3
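
The Clarifai concepts above (0-100 scores) resemble output from Clarifai's general prediction model. The sketch below uses the older clarifai Python REST client that was current around 2019; the API key and file name are placeholders, and the exact client version and response layout are assumptions.

```python
from clarifai.rest import ClarifaiApp

# API key and file name are placeholders, not part of the source record.
app = ClarifaiApp(api_key="YOUR_API_KEY")
model = app.public_models.general_model

# The general model returns concepts with 0-1 values; scale to percentages
# to match the figures listed above.
response = model.predict_by_filename("riverside_buildings.jpg")
for concept in response["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```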

Imagga
created on 2019-06-04

web site 51.4
grunge 40
monitor 39.5
old 37.6
vintage 32.2
frame 31.6
antique 28.5
texture 27.8
television 27
retro 27
border 23.5
damaged 22.9
design 22.5
material 22.3
space 20.9
aged 20.8
wall 20.5
pattern 19.8
dirty 19
structure 18.7
paper 18
art 17.6
blank 17.1
electronic equipment 16.8
your 16.4
rough 16.4
empty 16.3
blackboard 16.2
weathered 16.1
grungy 16.1
graphic 16
ancient 15.6
messy 15.5
paint 14.5
screen 14.5
computer 13.9
rusty 13.3
grain 12.9
equipment 12.8
black 12.6
edge 12.5
textured 12.3
broadcasting 12.2
window 12.2
liquid crystal display 12.2
rust 11.6
detailed 11.5
bill 11.4
text 11.3
decoration 11
business 10.9
highly 10.8
frames 10.8
noise 10.8
film 10.7
backdrop 10.7
crumpled 10.7
memorial 10.7
decay 10.6
collage 10.6
surface 10.6
parchment 10.5
old fashioned 10.5
wallpaper 9.9
designed 9.8
layered 9.8
mess 9.8
grime 9.8
digital 9.7
fracture 9.7
layer 9.7
mask 9.6
dirt 9.5
brass 9.5
money 9.4
telecommunication system 9.4
finance 9.3
house 9.2
dark 9.2
telecommunication 9.1
currency 9
bank 9
noisy 8.9
scratches 8.9
building 8.8
copy 8.8
photographic 8.8
ragged 8.8
stains 8.7
obsolete 8.6
architecture 8.6
nobody 8.5
horizontal 8.4
dollar 8.4
color 8.3
sign 8.3
banking 8.3
element 8.3
historic 8.2
cash 8.2
brown 8.1
financial 8
close 8
projects 7.9
scratch 7.8
mottled 7.8
crack 7.7
worn 7.6
exchange 7.6
poster 7.5
stone 7.4
entertainment 7.4
note 7.3
message 7.3
detail 7.2
gray 7.2
wealth 7.2
negative 7.1
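
The Imagga tags above carry 0-100 confidence scores like those returned by Imagga's REST tagging endpoint. A sketch of a /v2/tags request follows; the image URL and credentials are placeholders, and the endpoint and response shape are taken from Imagga's public documentation rather than from this record.

```python
import requests

# Placeholders: substitute a real Imagga key/secret and a reachable image URL.
API_KEY, API_SECRET = "key", "secret"
IMAGE_URL = "https://example.org/riverside_buildings.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a tag (per language) with a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```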

Google
created on 2019-06-04

Microsoft
created on 2019-06-04

person 97.9
clothing 91.1
screenshot 80.4
old 70.9
vintage 56.6
newspaper 54.5
poster 53.7
man 52.7
picture frame 14.1
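
The Microsoft tags above look like output from the Azure Computer Vision Tag Image operation, which returns tag names with 0-1 confidences. The sketch below calls the REST endpoint directly; the endpoint, key, and API version are placeholders and assumptions.

```python
import requests

# Placeholders: substitute a real Computer Vision endpoint and key.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

with open("riverside_buildings.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Tag Image returns {"tags": [{"name": ..., "confidence": ...}, ...]} with 0-1 scores.
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```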

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Male, 51.8%
Confused 45.4%
Disgusted 45.3%
Sad 48.9%
Surprised 45.4%
Calm 47.9%
Happy 46.7%
Angry 45.3%

AWS Rekognition

Age 4-7
Gender Female, 54.9%
Surprised 45.1%
Angry 45.4%
Happy 45.2%
Sad 49.3%
Calm 49.6%
Confused 45.2%
Disgusted 45.2%

AWS Rekognition

Age 26-43
Gender Female, 54.7%
Angry 45.3%
Sad 46.1%
Disgusted 46%
Happy 51%
Calm 46.2%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 26-44
Gender Female, 55%
Surprised 45.2%
Angry 46%
Calm 46.5%
Happy 45.5%
Sad 51.1%
Disgusted 45.3%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Female, 53.6%
Calm 45.9%
Sad 49.7%
Confused 45.2%
Happy 47.5%
Angry 45.2%
Surprised 45.2%
Disgusted 46.3%

AWS Rekognition

Age 4-7
Gender Female, 52.3%
Happy 45%
Disgusted 45%
Surprised 45%
Angry 45.1%
Calm 45%
Sad 54.8%
Confused 45%

AWS Rekognition

Age 26-44
Gender Female, 52%
Calm 46.2%
Confused 45.1%
Happy 45.1%
Sad 53.3%
Surprised 45%
Disgusted 45.1%
Angry 45.2%

AWS Rekognition

Age 14-23
Gender Female, 51%
Surprised 45.2%
Calm 47.1%
Happy 47%
Sad 50%
Angry 45.4%
Confused 45.1%
Disgusted 45.2%

AWS Rekognition

Age 35-52
Gender Female, 50.2%
Angry 49.5%
Calm 49.6%
Sad 49.7%
Disgusted 49.5%
Happy 50.2%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 19-36
Gender Female, 50%
Happy 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 50.1%
Confused 49.5%
Angry 49.5%
Calm 49.8%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Confused 49.5%
Sad 49.6%
Surprised 49.5%
Happy 50.1%
Angry 49.5%
Disgusted 49.5%
Calm 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Calm 49.5%
Happy 49.5%
Sad 50.3%
Surprised 49.5%
Disgusted 49.5%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Female, 52.2%
Calm 47.2%
Sad 51.6%
Confused 45.1%
Happy 45.1%
Angry 45.4%
Surprised 45.1%
Disgusted 45.4%

AWS Rekognition

Age 4-9
Gender Female, 51%
Calm 45.9%
Sad 48.4%
Surprised 45.6%
Confused 48.3%
Angry 45.7%
Disgusted 45.7%
Happy 45.4%

AWS Rekognition

Age 1-5
Gender Female, 54.6%
Confused 46%
Disgusted 45.8%
Sad 47.9%
Surprised 45.7%
Calm 46.5%
Happy 46.5%
Angry 46.6%

AWS Rekognition

Age 48-68
Gender Female, 53.3%
Disgusted 45.1%
Sad 53.2%
Calm 45.9%
Surprised 45.1%
Confused 45.1%
Angry 45.2%
Happy 45.2%

AWS Rekognition

Age 35-52
Gender Male, 50.2%
Confused 49.5%
Surprised 49.5%
Calm 49.5%
Angry 49.5%
Disgusted 49.6%
Happy 49.5%
Sad 50.4%

AWS Rekognition

Age 26-43
Gender Female, 51.6%
Angry 45.5%
Calm 47.1%
Sad 46%
Disgusted 46.1%
Happy 48.5%
Confused 45.6%
Surprised 46.2%

AWS Rekognition

Age 48-68
Gender Female, 50.5%
Calm 49.7%
Disgusted 49.5%
Angry 49.5%
Sad 50%
Happy 49.6%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Surprised 49.5%
Calm 49.9%
Happy 49.5%
Sad 49.7%
Angry 49.6%
Confused 49.6%
Disgusted 49.6%

AWS Rekognition

Age 17-27
Gender Female, 50.5%
Disgusted 49.7%
Happy 49.5%
Sad 49.8%
Calm 49.7%
Angry 49.7%
Confused 49.5%
Surprised 49.6%

AWS Rekognition

Age 23-38
Gender Female, 54.2%
Disgusted 45.3%
Sad 53.4%
Calm 45.1%
Surprised 45.4%
Confused 45.2%
Angry 45.4%
Happy 45.2%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Confused 49.5%
Surprised 49.5%
Calm 49.5%
Angry 49.5%
Disgusted 49.5%
Happy 50.2%
Sad 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Happy 49.5%
Angry 49.6%
Calm 49.5%
Disgusted 49.5%
Sad 50.4%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Calm 49.5%
Happy 49.6%
Sad 50.2%
Surprised 49.5%
Disgusted 49.6%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 45-66
Gender Male, 50.2%
Calm 50.3%
Sad 49.5%
Angry 49.5%
Disgusted 49.5%
Confused 49.5%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Confused 49.5%
Calm 49.6%
Disgusted 49.5%
Happy 50%
Angry 49.6%
Sad 49.7%
Surprised 49.5%

AWS Rekognition

Age 30-47
Gender Female, 52.8%
Confused 45.8%
Sad 45.7%
Disgusted 48.1%
Surprised 45.8%
Angry 45.6%
Happy 46.4%
Calm 47.7%

AWS Rekognition

Age 20-38
Gender Female, 53.9%
Sad 46.3%
Confused 45.1%
Angry 45.2%
Happy 52.4%
Calm 45.4%
Surprised 45.3%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Female, 52.3%
Surprised 45.6%
Confused 47%
Disgusted 45.2%
Sad 48.2%
Happy 45.3%
Calm 48.2%
Angry 45.5%
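
Each face entry above combines an age range, a gender estimate with its confidence, and per-emotion confidences, which matches the shape of the Rekognition DetectFaces response when all attributes are requested. A minimal sketch, assuming the same hypothetical image file as above:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

with open("riverside_buildings.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]            # e.g. {"Low": 26, "High": 43}
    gender = face["Gender"]           # e.g. {"Value": "Female", "Confidence": 54.7}
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'{gender["Value"]} {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # one confidence per emotion type
        print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```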

Feature analysis

Amazon

Person 98.3%
Poster 96.8%

Categories

Imagga

paintings art 100%

Captions

Text analysis

Amazon

AND
PILE.
BUILDINGS
RIVERSITE
CHILRES
RIVERSITE BUILDINGS HROOKLY, .Y.
HROOKLY,
.Y.
CHILRES oN AND PILE.
oN
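
The Amazon text results above, including OCR misreads such as "RIVERSITE" and "HROOKLY", are consistent with Rekognition's DetectText output, which returns both whole lines and individual words. A minimal sketch, again using the hypothetical image file:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

with open("riverside_buildings.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# DetectText returns LINE and WORD detections; the list above mixes both.
response = client.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')
```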

Google

BUILDINGS
BROORLY,
.Y.
CHILRE
Q1
BAND
PILE
He 73 15.L RIVERSIDE BUILDINGS BROORLY, .Y. CHILRE Q1 BAND PILE
He
73
15.L
RIVERSIDE