Human Generated Data

Title

Untitled (two men seated in two-wheeled horse drawn cart on dirt road, two men standing in background, buildings behind)

Date

1866-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.4098

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Animal 99.7
Horse 99.7
Mammal 99.7
Person 97.6
Human 97.6
Transportation 96.3
Horse Cart 96.3
Wagon 96.3
Vehicle 96.3
Person 93.6
Wheel 91.8
Machine 91.8
Carriage 89.2
Person 74.2
Bike 69.8
Bicycle 69.8

Clarifai
created on 2019-11-10

people 99.4
man 95.1
cavalry 92.3
transportation system 91.4
woman 91.2
adult 89.7
one 88.8
two 88.4
vehicle 86.6
group 86.1
no person 85.9
nostalgia 82.8
mammal 81.9
wood 81.8
cart 81.2
wagon 79.9
child 72.7
vintage 72.7
recreation 71.6
retro 69.9

Imagga
created on 2019-11-10

carriage 100
cart 29.3
horse cart 26.7
old 23.7
wagon 22.4
horse 21.8
wheel 18.9
travel 18.3
transportation 17
antique 16.4
rural 15.9
beach 15.3
sea 14.3
vacation 13.9
chair 13.9
wheeled vehicle 13.3
bicycle 12.9
summer 12.9
transport 12.8
outdoors 12.7
landscape 12.6
vehicle 12.4
sky 12.1
sun 12.1
city 11.6
tourism 11.5
wooden 11.4
street 11
seat 10.9
vintage 10.8
bike 10.7
farm 10.7
outdoor 10.7
wheels 9.8
sand 9.7
country 9.7
sunny 9.5
architecture 9.4
tree 9.2
house 9.2
building 9.1
resort 8.8
urban 8.7
grass 8.7
relaxation 8.4
historic 8.3
retro 8.2
aged 8.1
sunset 8.1
history 8
wall 8
holiday 7.9
scene 7.8
empty 7.7
tropical 7.7
window 7.6
wood 7.5
park 7.4
animal 7.4
harness 7.1

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

horse 99.3
outdoor 92.6
text 92.2
animal 77.2
transport 74.2
horse-drawn vehicle 66.3
old 61.6
horse and buggy 58.2
carriage 58.2
cart 39.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Male, 53.2%
Confused 45.3%
Sad 49.9%
Calm 49.3%
Fear 45%
Disgusted 45.1%
Happy 45%
Surprised 45%
Angry 45.4%

AWS Rekognition

Age 7-17
Gender Female, 52.4%
Calm 49.7%
Sad 45%
Fear 45%
Angry 50.2%
Happy 45%
Disgusted 45%
Confused 45%
Surprised 45%

AWS Rekognition

Age 15-27
Gender Male, 50.3%
Calm 49.7%
Fear 49.7%
Confused 49.5%
Disgusted 49.5%
Sad 49.6%
Surprised 49.6%
Angry 49.6%
Happy 49.8%

AWS Rekognition

Age 43-61
Gender Male, 50.2%
Sad 49.7%
Angry 50.1%
Fear 49.5%
Happy 49.5%
Calm 49.6%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%

Feature analysis

Amazon

Horse 99.7%
Person 97.6%
Wheel 91.8%
Bicycle 69.8%

Categories

Imagga

paintings art 60.6%
interior objects 35.6%
food drinks 2.3%