Human Generated Data

Title

Figures in a Landscape

Date

19th century

People

Artist: Charles Fairfax Murray, British, 1849–1919

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes, 1947.61

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2020-05-02

Person 99.1
Human 99.1
Person 98.6
Person 97.9
Painting 97
Art 97
Person 96.9
Person 96.6
Person 96.5
Person 96.1
Person 95.3
Person 93.7
Person 84.8
Drawing 74
People 66.7
Archaeology 60.7
Apparel 55.7
Clothing 55.7

Clarifai
created on 2020-05-02

people 99.8
group 99.5
art 99.4
wear 98.1
print 97.4
adult 96.1
leader 95.7
administration 95.3
man 95
painting 93.2
veil 90.5
many 90.2
engraving 89.6
illustration 88.6
outfit 86.8
furniture 84.5
several 81.2
military 81
vintage 79.3
monarch 78.6

Imagga
created on 2020-05-02

brass 53.5
memorial 48.6
structure 41.3
old 29.2
vintage 28.1
grunge 28.1
art 23.7
decoration 23
blackboard 22.2
graffito 21.9
retro 21.3
frame 20.8
antique 20.7
texture 19.4
black 18
border 17.2
grungy 17.1
design 16.9
pattern 16.4
rough 16.4
dirty 16.3
paint 15.4
aged 15.4
ancient 14.7
damaged 14.3
graphic 13.8
paper 12.5
history 12.5
material 11.6
messy 11.6
edge 11.5
dirt 11.4
landscape 11.1
space 10.8
scratch 10.7
detailed 10.6
text 10.5
weathered 10.4
film 9.9
drawing 9.9
designed 9.8
frames 9.8
noise 9.8
textured 9.6
rust 9.6
collage 9.6
water 9.3
outdoors 8.9
backgrounds 8.9
noisy 8.9
postmark 8.9
scratches 8.9
layered 8.8
mess 8.8
photographic 8.8
stamp 8.7
layer 8.7
mask 8.6
mail 8.6
empty 8.6
historical 8.5
screen 8.4
snow 8.4
letter 8.2
wall 8.2
sculpture 8.2
computer 8
overlay 7.9
highly 7.9
postage 7.9
strip 7.8
architecture 7.8
movie 7.7
travel 7.7
your 7.7
blank 7.7
culture 7.7
tree 7.7
old fashioned 7.6
decorative 7.5
grain 7.4
digital 7.3
color 7.2
stone 7.1
negative 7.1
surface 7
sketch 7
country 7
scenic 7

Google
created on 2020-05-02

Microsoft
created on 2020-05-02

text 98.7
drawing 98
sketch 95.6
person 93.1
painting 89.1
clothing 87.6
old 85.7
man 78.9
white 67.4
black 67
cartoon 55.8
vintage 30.7
stone 18.9

Face analysis

Amazon

AWS Rekognition

Age 0-4
Gender Male, 54.3%
Sad 52.5%
Happy 45%
Angry 45.3%
Confused 45.1%
Fear 46.9%
Disgusted 45%
Calm 45.1%
Surprised 45%

AWS Rekognition

Age 15-27
Gender Male, 53.5%
Angry 45.7%
Sad 45.5%
Confused 45.6%
Fear 45.1%
Disgusted 45.3%
Calm 52.4%
Happy 45.2%
Surprised 45.3%

AWS Rekognition

Age 12-22
Gender Male, 53.5%
Surprised 45%
Sad 45.1%
Fear 45%
Disgusted 45%
Calm 54.9%
Angry 45%
Confused 45%
Happy 45%

AWS Rekognition

Age 15-27
Gender Male, 54.2%
Fear 45%
Angry 45.1%
Sad 49.9%
Calm 49.7%
Surprised 45%
Disgusted 45%
Happy 45%
Confused 45.1%

AWS Rekognition

Age 36-54
Gender Male, 54.7%
Calm 45.1%
Sad 54.9%
Confused 45%
Fear 45%
Disgusted 45%
Angry 45%
Happy 45%
Surprised 45%

AWS Rekognition

Age 44-62
Gender Male, 54.3%
Angry 45%
Sad 54.8%
Calm 45%
Happy 45%
Surprised 45%
Disgusted 45%
Fear 45.1%
Confused 45%

AWS Rekognition

Age 28-44
Gender Male, 50.4%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Sad 50.5%
Surprised 49.5%
Disgusted 49.5%
Angry 49.5%
Fear 49.5%

AWS Rekognition

Age 39-57
Gender Male, 50.4%
Disgusted 49.5%
Sad 49.6%
Angry 49.6%
Calm 50.2%
Confused 49.5%
Fear 49.5%
Surprised 49.5%
Happy 49.5%

Feature analysis

Amazon

Person 99.1%
Painting 97%

Captions

Microsoft

a vintage photo of a person 82%
a vintage photo of some people 77.8%
a black and white photo of a person 74%

Text analysis

Amazon

1947.61

Google

1947.61
1947.61