Human Generated Data

Title

Old Woman Spinning

Date

c. 1876-1878

People

Artist: Charles Herbert Moore, American, 1840 - 1930

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Fine Arts Department, Harvard University, 1926.33.84

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Wheel 99.6
Machine 99.6
Person 99.3
Human 99.3
Painting 95.8
Art 95.8
Drawing 59.2
Vehicle 56.6
Transportation 56.6
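
The Amazon tags above are label/confidence pairs. A minimal sketch (an assumption about the pipeline, not the museum's documented code) of how such pairs could be produced with the AWS Rekognition detect_labels API via boto3; the filename and thresholds are placeholders:

import boto3

# Hypothetical example: send the image bytes to Rekognition and print
# each detected label with its confidence score, as in the list above.
client = boto3.client("rekognition")

with open("old_woman_spinning.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # assumption: low-confidence tags are not shown
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")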

Clarifai
created on 2020-04-25

people 99.7
one 99.3
adult 98.5
wear 98.1
elderly 96.9
woman 95.7
man 95.3
art 94.6
seat 92.6
transportation system 91.3
old 90.5
two 88.9
vehicle 88.1
portrait 87.2
painting 85.5
street 83.8
wheel 83.8
seated 83.6
furniture 82.4
recreation 79.5

Imagga
created on 2020-04-25

spinning wheel 46.9
spinning machine 38.2
person 31
textile machine 29.8
man 29.5
male 24.2
machine 22.8
adult 21.5
people 19.5
sitting 18
wheelchair 16.9
senior 15.9
chair 15.4
happy 15
device 14.6
portrait 14.2
work 13.7
outdoors 13.4
leisure 13.3
retired 12.6
retirement 12.5
elderly 12.4
outdoor 12.2
holiday 11.5
patient 11.4
home 11.2
engineer 10.9
seat 10.8
vacation 10.6
old 10.4
mature 10.2
house 10
human 9.7
job 9.7
indoors 9.7
sit 9.4
outside 9.4
smiling 9.4
lifestyle 9.4
grandfather 9.4
relaxation 9.2
mother 9.2
health 9
disabled 8.9
worker 8.9
working 8.8
sick person 8.6
reading 8.6
smile 8.5
industry 8.5
case 8.4
help 8.4
summer 8.4
carpenter 8.4
care 8.2
room 8.2
lady 8.1
religion 8.1
child 7.9
medical 7.9
building 7.9
together 7.9
couple 7.8
bench 7.8
face 7.8
travel 7.7
men 7.7
husband 7.6
site 7.5
fun 7.5
laptop 7.5
park 7.4
glasses 7.4
protection 7.3
relaxing 7.3
computer 7.2
looking 7.2
activity 7.2
family 7.1
love 7.1

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

drawing 97.6
man 95.1
person 95
sketch 88.4
clothing 82
text 77.1
painting 75.3
old 55.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 50-68
Gender Male, 77%
Surprised 0%
Sad 0.3%
Happy 0.2%
Calm 99.5%
Angry 0%
Fear 0%
Confused 0%
Disgusted 0%
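
The face analysis above is attributed to AWS Rekognition. A minimal sketch of how these fields (age range, gender, emotion scores) map onto the detect_faces response when called with Attributes=["ALL"] via boto3; the filename is a placeholder and the call options are an assumption:

import boto3

# Hypothetical example: request full face attributes and print the values
# shown in the Face analysis section above.
client = boto3.client("rekognition")

with open("old_woman_spinning.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")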

Feature analysis

Amazon

Wheel 99.6%
Person 99.3%
Painting 95.8%

Captions