Human Generated Data

Title

Seated Farmer with Goat

Date

19th century

People

Artist: Charles Samuel Keene, British, 1823-1891

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Grenville L. Winthrop, 1943.476.E

Machine Generated Data

Tags

Amazon
created on 2020-05-02

Transportation 96.6
Vehicle 96.6
Boat 96.6
Human 92.8
Person 92.8
Person 92.8
Art 88.9
Drawing 87.2
Sketch 77.4
Rowboat 77.3
Person 76.9
Vessel 56.5
Watercraft 56.5
Person 53.4
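
The Amazon labels above are the kind of name/confidence pairs returned by Amazon Rekognition's DetectLabels operation. The following is a minimal sketch using boto3; the bucket, object key, and region are placeholders, not the museum's actual image storage.

```python
import boto3

# Hypothetical image location; substitute the actual bucket and key.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1943.476.E.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Each label carries a name and a confidence percentage, e.g. "Boat 96.6".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```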

Clarifai
created on 2020-05-02

people 99.7
painting 98
group 97.7
art 97.4
adult 97.3
print 97.3
one 96.6
two 96.4
man 95.5
no person 94.2
vintage 94.2
illustration 93.8
woman 93.6
picture frame 93
furniture 92.6
retro 92.5
nostalgia 90.9
wear 90.7
portrait 90.2
room 88.9

Imagga
created on 2020-05-02

refrigerator 39.2
white goods 31.6
old 26.5
wall 25.6
home appliance 25.4
business 18.2
vintage 18.2
paper 17.2
appliance 17.1
envelope 14.9
money 14.5
building 14.3
window 14
cash 13.7
sign 13.5
finance 13.5
architecture 13.3
container 13.2
aged 12.7
texture 12.5
retro 12.3
ancient 12.1
banking 11.9
grunge 11.9
bank 11.9
device 11.7
house 11.7
currency 11.7
door 11.5
empty 11.2
dirty 10.8
antique 10.5
card 10.3
blank 10.3
page 9.3
frame 9.2
city 9.1
shop 9.1
durables 8.8
stamp 8.7
office 8.6
bill 8.6
brown 8.1
financial 8
home 8
urban 7.9
travel 7.7
worn 7.6
exchange 7.6
stone 7.6
structure 7.6
commerce 7.5
dollar 7.4
town 7.4
security 7.3

Google
created on 2020-05-02

Photograph 95.1
Photographic paper 66.3
Room 65.7
Photography 62.4
Art 50.2
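
The Google labels above can be reproduced with the Cloud Vision API's label detection. A sketch using the google-cloud-vision client follows; the local file path is an assumption, and note the API reports scores in the 0-1 range, so they are scaled here to match the percentages listed above.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the image; adjust to wherever the file actually lives.
with open("1943.476.E.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores come back in the 0-1 range; scale to percentages.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```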

Microsoft
created on 2020-05-02

gallery 99.1
scene 98.1
drawing 98.1
sketch 97.7
room 95.9
wall 95.6
text 91.2
old 71.1
white 64.1
cartoon 63
person 61.8
different 54.3
painting 40.3
several 14.8

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Male, 54.3%
Fear 45%
Sad 45.1%
Confused 45%
Happy 45%
Surprised 45%
Calm 54.7%
Angry 45.1%
Disgusted 45%

AWS Rekognition

Age 21-33
Gender Female, 52.5%
Surprised 45%
Disgusted 45%
Happy 45%
Angry 45%
Sad 45%
Calm 54.8%
Fear 45%
Confused 45%

AWS Rekognition

Age 13-25
Gender Female, 50.7%
Surprised 45%
Angry 45.1%
Happy 45%
Calm 54.7%
Fear 45%
Disgusted 45%
Sad 45.1%
Confused 45%

AWS Rekognition

Age 31-47
Gender Male, 50%
Confused 49.5%
Happy 49.7%
Sad 49.6%
Disgusted 49.5%
Surprised 49.5%
Angry 49.8%
Fear 49.8%
Calm 49.6%
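
Each block above corresponds to one detected face, with an estimated age range, a gender guess, and per-emotion confidences. A minimal sketch of the underlying Rekognition DetectFaces call with boto3 is shown below; the bucket and key are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical image location, as in the label-detection sketch above.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1943.476.E.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```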

Feature analysis

Amazon

Boat 96.6%
Person 92.8%

Captions

Microsoft
created on 2020-05-02

an old photo of a painting 74%
a painting on the wall 73.9%
old photo of a painting 67.6%
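
The Microsoft captions above are natural-language descriptions with confidences, of the sort produced by the Azure Computer Vision describe operation. The sketch below calls the REST endpoint directly; the resource endpoint, subscription key, image URL, and API version are all assumptions.

```python
import requests

# Hypothetical Azure resource, key, and image URL.
endpoint = "https://example-resource.cognitiveservices.azure.com"
key = "YOUR_SUBSCRIPTION_KEY"
image_url = "https://example.org/images/1943.476.E.jpg"

response = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    headers={"Ocp-Apim-Subscription-Key": key},
    params={"maxCandidates": 3},
    json={"url": image_url},
)
response.raise_for_status()

# Caption confidences are in the 0-1 range; scale to match the percentages above.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```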

Text analysis

Amazon

KEENE

Google

KEENE
KEENE
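
The detected text ("KEENE", the artist's signature) is the kind of result returned by Rekognition's DetectText operation; Google's OCR reports the same string at both line and word level, which is why it appears twice. A minimal boto3 sketch, again with a placeholder image location:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical image location.
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1943.476.E.jpg"}}
)

# LINE-level detections correspond to strings such as the "KEENE" signature.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```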