Human Generated Data

Title

What Do I Want, John Henry? Warrenton, Virginia

Date

November 1862

People

Artist: Alexander Gardner, American, 1821–1882

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.4
Human 99.4
Person 99.2
Person 98.9
Person 98.4
Camping 92.9
Person 91
Chair 70.4
Furniture 70.4
Leisure Activities 68.3
Art 67
People 63.3
Drawing 57.2
Tent 56.3
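Each line in the tag lists above pairs a label with a confidence score on a 0–100 scale. A minimal Python sketch (the helper name and the threshold are illustrative, not part of any vendor API) for parsing such "label score" lines and filtering by confidence:

```python
def parse_tags(lines):
    """Parse 'label score' lines like the Amazon tag list above.

    Labels may contain spaces ('Leisure Activities'), so split on the
    final whitespace-separated token, which is the numeric score.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, score = line.rsplit(None, 1)
        tags.append((label, float(score)))
    return tags

sample = [
    "Person 99.4",
    "Camping 92.9",
    "Leisure Activities 68.3",
    "Tent 56.3",
]

tags = parse_tags(sample)
# Keep only tags at or above a chosen confidence threshold.
confident = [label for label, score in tags if score >= 90]
print(confident)  # ['Person', 'Camping']
```

Thresholding this way is why a record can list both high-confidence labels like Person (99.4) and speculative ones like Tent (56.3).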

Clarifai

people 99.2
painting 98.8
illustration 98
picture frame 97.7
art 97.5
album 96.8
adult 96.5
man 96.2
group 95.7
print 94.6
two 93.1
woman 92.3
family 90.6
museum 90.1
wear 89.4
landscape 88.3
exhibition 85.9
old 85.8
nostalgia 85.7
furniture 85.3

Imagga

envelope 59.3
vintage 44.7
grunge 44.3
old 43.9
container 41.8
retro 36.9
texture 34.7
aged 29.9
frame 28.7
antique 27.7
paper 22.7
wall 22.2
structure 20.1
brass 19.2
border 19
dirty 19
grungy 19
chalkboard 18.6
blackboard 18.6
ancient 18.2
design 18.1
rusty 18.1
board 16.7
art 16.3
memorial 15.7
text 14.8
decoration 14.6
card 14.6
rough 14.6
blank 14.6
book 14.5
material 14.3
black 13.8
wallpaper 13.8
pattern 13.7
album 13.6
damaged 13.4
tray 13.2
textured 13.2
floral 12.8
product 12.7
worn 12.4
flower 12.3
page 12.1
graphic 11.7
backdrop 11.5
symbol 11.5
brown 11
letter 11
message 11
paint 10.9
receptacle 10.7
chalk 10.7
states 10.6
international 10.5
drawing 10.3
stamp 10.1
space 10.1
blade 9.8
world 9.7
scrapbook 9.7
binding 9.7
country 9.7
rust 9.6
concrete 9.6
education 9.5
empty 9.4
money 9.4
finance 9.3
greeting 9.3
school 9
idea 8.9
postage 8.8
stained 8.7
stain 8.6
mail 8.6
creation 8.6
age 8.6
united 8.6
leaf 8.6
communication 8.4
color 8.3
note 8.3
element 8.3
artwork 8.2
covering 8.2
global 8.2
business 7.9
postmark 7.9
classroom 7.8
geography 7.7
spot 7.7
obsolete 7.7
weathered 7.6
nation 7.6
film 7.6
poster 7.6
china 7.5
cutting implement 7.5
study 7.5
map 7.5
grain 7.4
backgrounds 7.3

Google

picture frame 86.3
art 58.2

Microsoft

gallery 92.3
room 90.7
picture frame 85
scene 78
old 54.5

Face analysis

Amazon

AWS Rekognition

Age 35-53
Gender Male, 50.2%
Confused 49.6%
Sad 49.8%
Calm 49.9%
Happy 49.6%
Surprised 49.6%
Angry 49.6%
Disgusted 49.5%

AWS Rekognition

Age 20-38
Gender Male, 50.4%
Calm 49.8%
Sad 50.1%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.1%
Calm 49.5%
Disgusted 49.5%
Sad 50.4%
Angry 49.5%
Confused 49.5%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Disgusted 49.5%
Surprised 49.5%
Angry 49.5%
Sad 50.3%
Confused 49.6%
Happy 49.5%
Calm 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Sad 49.6%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 50.3%
Happy 49.5%
Confused 49.5%
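Each AWS Rekognition face record above reports an age range, a gender estimate, and one confidence score per emotion. A minimal sketch, using the scores from the last record above as a plain dictionary (an illustrative data structure, not the Rekognition response format), for picking the dominant emotion:

```python
# Emotion scores from the last AWS Rekognition face record above.
emotions = {
    "Sad": 49.6,
    "Angry": 49.5,
    "Disgusted": 49.5,
    "Surprised": 49.5,
    "Calm": 50.3,
    "Happy": 49.5,
    "Confused": 49.5,
}

# The dominant emotion is the highest-scoring entry. Note how tightly
# the scores cluster around 50%, so the winning label carries little
# real signal for a 19th-century photograph like this one.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # Calm 50.3
```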

Feature analysis

Amazon

Person 99.4%
Tent 56.3%

Captions

Microsoft

an old photo of a person 50.6%
an old photo of a cake 27.5%
old photo of a person 27.4%

Text analysis

Amazon

WHAT
DO
HENRY
Warrentn
WHAT DO WANt JoHN HENRY
JoHN
WANt

Google

AT
DO
ANT
HENRY
AT DO ANT JOHN HENRY
JOHN