Human Generated Data

Title

Ploughing

Date

c. 1880

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Janet and Daniel Tassel, 2007.219.52.1

Machine Generated Data

Tags

Amazon
created on 2019-11-07

Mammal 99.6
Animal 99.6
Bull 99.6
Horse 96.9
Person 95.4
Human 95.4
Cow 94.8
Cattle 94.8
Person 94.6
Wheel 92.4
Machine 92.4
Person 91.1
Person 86.9
Person 85.4
Ox 82.1
Person 72.6
Person 67.3
Horse 66.3
Person 64
Advertisement 64
Person 63.5
Horse 61.9
Poster 60.1
Collage 56.2
Spoke 55.1
Horse 51.4
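
The Amazon tags above are object and scene labels with confidence scores (0-100) as returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such labels can be requested with boto3 follows; the image file name and the minimum-confidence threshold are illustrative assumptions, not values from this record.

```python
# Minimal sketch, assuming AWS credentials are configured for boto3 and the
# image is available locally; "ploughing.jpg" is a hypothetical file name.
import boto3

def rekognition_labels(image_path: str, min_confidence: float = 50.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a Name and a Confidence score (0-100), matching the
    # "tag  confidence" pairs listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in rekognition_labels("ploughing.jpg"):
        print(f"{name} {confidence:.1f}")
```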

Clarifai
created on 2019-11-07

people 99.8
group 99.6
cavalry 99.4
print 99.2
illustration 98.6
mammal 98.1
many 98
man 97.3
art 96.4
adult 95.4
vehicle 93.4
cattle 91.9
canine 91.2
war 87.9
dog 87.5
wagon 86.7
woodcut 85.6
two 85.3
camel 85
transportation system 83.8

Imagga
created on 2019-11-07

money 30.6
currency 30.5
paper 28.3
old 26.5
cash 25.6
finance 25.3
bill 24.7
sketch 24.5
vintage 23.1
stamp 21.7
drawing 21.6
bank 21.5
dollar 21.3
business 21.2
grunge 20.4
envelope 19.7
retro 19.7
note 19.3
newspaper 19.1
banking 18.4
exchange 18.1
insulating material 18
financial 16
product 15.5
wealth 15.3
representation 14.6
blank 14.6
bills 14.6
banknote 14.5
aged 14.5
savings 14
container 13.9
ancient 13.8
letter 13.7
building material 13.5
dollars 12.6
texture 12.5
comic book 12.1
antique 12.1
empty 12
investment 11.9
postage 11.8
blackboard 11.7
hundred 11.6
states 11.6
us 11.6
mail 11.5
creation 11.5
office 11.2
postmark 10.8
collection 10.8
pay 10.5
international 10.5
united 10.5
close 10.3
pattern 10.3
page 10.2
economy 10.2
dirty 9.9
sign 9.8
notes 9.6
symbol 9.4
design 8.9
market 8.9
card 8.6
worn 8.6
wall 8.5
communication 8.4
frame 8.3
die 8.2
message 8.2
global 8.2
brown 8.1
philately 7.9
menu 7.9
postal 7.8
rate 7.8
space 7.8
payment 7.7
flower 7.7
post 7.6
china 7.5
stock 7.5
backgrounds 7.3
textured 7
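
The Imagga tags above are general content tags with confidence scores. The sketch below calls Imagga's public v2 tagging endpoint over REST; the credentials and image URL are placeholders, and the response layout is an assumption based on Imagga's published documentation rather than on this record.

```python
# Rough sketch of Imagga's v2 /tags endpoint with HTTP basic auth. The API
# key/secret and image URL are placeholders; the assumed response shape is
# {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}, ...]}}.
import requests

def imagga_tags(image_url: str, api_key: str, api_secret: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(api_key, api_secret),
        timeout=30,
    )
    response.raise_for_status()
    data = response.json()
    return [(t["tag"]["en"], t["confidence"]) for t in data["result"]["tags"]]
```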

Google
created on 2019-11-07

Photograph 95
Bovine 87.6
oxcart 75.3
Picture frame 68.4
Adaptation 67
Room 65.7
Working animal 61.1
Livestock 58.6
Ox 58.3
Cart 54.8
Cow-goat family 50.6
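
The Google tags above match the output of label detection in the Cloud Vision API, which reports scores in the 0-1 range (shown here as percentages). A minimal sketch with the google-cloud-vision client library follows; the file name is a placeholder.

```python
# Minimal sketch using the google-cloud-vision client library (v2+).
from google.cloud import vision

def google_labels(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Scores are returned in 0-1; scale to match the percentages listed above.
    return [(label.description, label.score * 100)
            for label in response.label_annotations]
```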

Microsoft
created on 2019-11-07

text 99.1
horse 97.3
gallery 91.5
animal 90.2
room 75.1
mammal 74.7
cart 74.5
person 73.4
cattle 64.9
old 61.9
vintage 49.1
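
The Microsoft tags above are of the kind returned by Azure Computer Vision's image-tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders, and the exact client surface may vary between SDK versions.

```python
# Sketch using the Azure Computer Vision SDK; endpoint and key are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

def azure_tags(image_path: str, endpoint: str, key: str):
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as f:
        result = client.tag_image_in_stream(f)
    # Each tag exposes a name and a confidence in 0-1.
    return [(tag.name, tag.confidence * 100) for tag in result.tags]
```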

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 50.3%
Sad 49.8%
Confused 49.5%
Fear 49.8%
Angry 49.6%
Surprised 49.5%
Disgusted 49.5%
Calm 49.8%
Happy 49.5%

AWS Rekognition

Age 26-40
Gender Male, 50.5%
Confused 49.5%
Calm 50.5%
Happy 49.5%
Fear 49.5%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Sad 49.5%

AWS Rekognition

Age 31-47
Gender Female, 50.2%
Calm 49.6%
Sad 50.3%
Fear 49.6%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Confused 49.6%
Surprised 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.1%
Happy 49.6%
Fear 49.5%
Disgusted 49.5%
Calm 49.6%
Sad 50.2%
Confused 49.5%
Angry 49.5%
Surprised 49.5%

AWS Rekognition

Age 15-27
Gender Female, 50.4%
Angry 49.6%
Confused 49.5%
Fear 50.1%
Happy 49.5%
Surprised 49.7%
Calm 49.6%
Disgusted 49.5%
Sad 49.6%

AWS Rekognition

Age 26-42
Gender Female, 50.3%
Calm 49.8%
Fear 49.5%
Sad 49.9%
Angry 49.5%
Surprised 49.5%
Happy 49.6%
Confused 49.5%
Disgusted 49.5%
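
Each face block above (estimated age range, gender, and per-emotion confidences) corresponds to one entry in the FaceDetails list returned by AWS Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch follows; the file name is a placeholder.

```python
# Minimal sketch: one dict per detected face, mirroring the fields listed above.
import boto3

def rekognition_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
        )
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": {e["Type"]: e["Confidence"] for e in face["Emotions"]},
        })
    return faces
```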

Feature analysis

Amazon

Horse 96.9%
Person 95.4%
Cow 94.8%
Wheel 92.4%

Categories

Imagga

paintings art 100%

Captions

Text analysis

Amazon

ark.
metivation
enetivation ark. od
enetivation
Soonghing
metivation arti
od
arti
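
The Amazon text results above are raw OCR detections of the sort produced by AWS Rekognition's DetectText operation; on a photograph from around 1880 the output is understandably noisy. A minimal sketch follows; the file name is a placeholder.

```python
# Minimal sketch: DetectText returns whole LINE detections as well as
# individual WORD tokens, which is why the list above mixes phrases and
# single words.
import boto3

def rekognition_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    return [(d["Type"], d["DetectedText"], d["Confidence"])
            for d in response["TextDetections"]]
```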

Google

glanghing IS enttivatiow Cart enltivation eant
glanghing
IS
enttivatiow
Cart
enltivation
eant
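
The Google text results follow the Cloud Vision text-detection convention: the first annotation is the full detected string and the remaining annotations are its individual tokens, which is why the first line above repeats the words that follow it. A minimal sketch with the same client library as in the label example; the file name is again a placeholder.

```python
# Minimal sketch using google-cloud-vision text detection. The first element
# of text_annotations is the full text; the rest are individual words.
from google.cloud import vision

def google_text(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    return [annotation.description for annotation in response.text_annotations]
```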