Human Generated Data

Title

Miscellaneous: Training for general manufactures ?: (?) out ?

Date

c. 1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.3353.4

Machine Generated Data

Tags

Amazon
created on 2019-06-07

Human 99.6
Person 99.6
Person 99.5
Person 98.4
Person 97.5
Clothing 96.8
Apparel 96.8
Wood 94.5
Person 91.1
Tent 90.1
Worker 87.1
Outdoors 81.2
Garden 81.2
Hat 74.5
Brick 65.3
Carpenter 64.5
Gardening 62.3
Sun Hat 59.9
Soil 57.5
Gardener 57.4

Clarifai
created on 2019-06-07

people 100
group together 99.7
adult 99.5
group 99.2
man 98.6
many 98.1
soldier 97.3
military 97
war 96
several 95.8
child 94.9
vehicle 94.1
wear 90.4
administration 90.4
home 90.2
weapon 89.5
four 89.1
campsite 88
two 87.7
skirmish 87.4

Imagga
created on 2019-06-07

kin 36.2
shovel 26
crosspiece 25
brace 20.2
farmer 19.1
fence 17.8
man 17.5
sky 17.2
grass 16.6
hand tool 16.6
landscape 16.4
person 16.3
tool 15.2
beach 15.2
strengthener 15.2
rural 15
outdoors 14.9
outdoor 14.5
structural member 14.2
old 13.9
male 13.5
people 13.4
wood 13.3
countryside 12.8
farm 12.5
country 12.3
field 11.7
adult 11.6
summer 11.6
trees 11.6
tree 11.5
autumn 11.4
outside 11.1
park 10.7
maze 10.4
walking 10.4
season 10.1
house 10
sun 9.7
forest 9.6
day 9.4
sand 8.8
stile 8.8
boy 8.7
men 8.6
travel 8.4
pedestrian 8.3
brown 8.1
sunset 8.1
lifestyle 7.9
agriculture 7.9
sea 7.8
device 7.8
winter 7.7
rustic 7.4
vacation 7.4
light 7.3
water 7.3
home 7.2
recreation 7.2
building 7.1
women 7.1
to 7.1
scenic 7

Google
created on 2019-06-07

Microsoft
created on 2019-06-07

outdoor 100
tree 99.9
person 98.6
sky 98.4
clothing 98.1
old 95.5
water 94.2
man 92.5
lake 62.1
fishing 57.6
vintage 40.4

Face analysis

Amazon

AWS Rekognition

Age 30-47
Gender Male, 52.2%
Calm 46.2%
Sad 49.5%
Happy 45.7%
Disgusted 45.4%
Surprised 45.3%
Confused 45.3%
Angry 47.6%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Surprised 49.6%
Calm 49.6%
Happy 49.6%
Confused 49.5%
Sad 49.9%
Disgusted 49.5%
Angry 49.7%

AWS Rekognition

Age 48-68
Gender Female, 54.4%
Sad 51.4%
Disgusted 45.2%
Calm 46.2%
Confused 46.1%
Surprised 45.6%
Angry 45.5%
Happy 45.1%

Feature analysis

Amazon

Person 99.6%
Tent 90.1%
Hat 74.5%