Human Generated Data

Title

Untitled (bull and cart)

Date

1941

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.753

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Cow 99.9
Animal 99.9
Cattle 99.9
Mammal 99.9
Wheel 99.7
Machine 99.7
Person 99.6
Human 99.6
Wheel 99.1
Bull 98.9
Wheel 97.1
Wheel 94.6
Ox 86.7
Person 86.6
Transportation 76.1
Bike 76.1
Bicycle 76.1
Vehicle 76.1
Person 70.8
Person 65.7
Spoke 56.5
Person 43.6

Imagga
created on 2022-01-09

cattle 67.1
cart 62.6
carriage 54
oxcart 51.2
wagon 49.3
farm 48.2
ox 42.9
horse 42.6
cow 37.6
bovine 33.4
rural 31.8
ranch 31.4
grass 30.9
field 29.3
wheeled vehicle 28.4
pasture 27.8
bull 27.2
brown 25.1
agriculture 23.7
livestock 23.3
meadow 22.4
horse cart 21.7
animals 20.4
countryside 20.1
beef 19.1
farming 19
horses 18.5
graze 17.7
grazing 17.7
milk 17.2
vehicle 16.3
dairy 16.1
cows 15.8
herd 13.8
calf 13.7
horn 12.7
country 12.3
outdoor 10.7
ruminant 10.3
sky 10.2
hay 10.2
mare 9.8
mane 9.8
trees 9.8
fence 9.8
outdoors 9.7
landscape 9.7
summer 9.7
mammal 9.3
head 9.2
equine 9.1
village 8.7
cute 7.9
stallion 7.8
black 7.8
mammals 7.8
scene 7.8
farmland 7.8
breed 7.7
ear 7.7
barn 7.5
stock 7.5

Google
created on 2022-01-09

Wheel 94.8
Photograph 94.1
Vehicle 94.1
Motor vehicle 92.3
Working animal 88.8
Tree 85.5
Adaptation 79.3
Carriage 77.4
Wagon 76.9
Cart 75.6
Landscape 75
Classic 72.3
Pack animal 69.3
Horse and buggy 68.4
Livestock 67
Ox 65.4
Event 65.4
Stock photography 64.2
Sky 62.5
History 61.3

Microsoft
created on 2022-01-09

outdoor 100
tree 99.9
grass 99.4
horse 99.3
drawn 94.1
carriage 93.2
old 84.2
animal 84
land vehicle 75.5
pulling 67
vehicle 66.4
horse and buggy 57.5
cart 40.8
family 19.5
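The four tag lists above are flat "label score" pairs, where the score is a confidence percentage. As a minimal sketch (a hypothetical helper, not part of any vendor SDK), duplicates such as the repeated Wheel entries can be merged by keeping the highest score, then filtered by a threshold:

```python
# Hypothetical helper: parses "label score" lines like the tag lists above,
# merges duplicate labels (keeping the highest score), and drops anything
# below a confidence threshold.

def filter_tags(raw: str, threshold: float = 90.0) -> dict[str, float]:
    best: dict[str, float] = {}
    for line in raw.strip().splitlines():
        label, score_text = line.rsplit(None, 1)  # split on last whitespace
        score = float(score_text)
        if score > best.get(label, 0.0):
            best[label] = score
    return {k: v for k, v in best.items() if v >= threshold}

# A few Amazon tags transcribed from the list above.
amazon = """Cow 99.9
Wheel 99.7
Wheel 99.1
Bull 98.9
Ox 86.7
Person 43.6"""

print(filter_tags(amazon))  # Wheel keeps its highest score, 99.7
```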

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Calm 99.4%
Angry 0.3%
Sad 0.1%
Surprised 0%
Confused 0%
Happy 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 6-14
Gender Female, 99.8%
Sad 38.2%
Angry 28.5%
Calm 16.8%
Disgusted 9.9%
Confused 2.3%
Fear 1.9%
Surprised 1.4%
Happy 0.9%

AWS Rekognition

Age 36-44
Gender Female, 57.7%
Calm 51.7%
Disgusted 13.6%
Happy 9.3%
Sad 9.1%
Fear 5.8%
Angry 4.5%
Surprised 3.8%
Confused 2.3%

AWS Rekognition

Age 21-29
Gender Female, 77.8%
Calm 90.1%
Sad 3.6%
Angry 2.9%
Happy 1.5%
Confused 0.8%
Disgusted 0.5%
Fear 0.4%
Surprised 0.3%

AWS Rekognition

Age 16-22
Gender Female, 70.6%
Fear 33.6%
Sad 26.2%
Calm 14.3%
Happy 9.3%
Surprised 5.8%
Confused 4.5%
Angry 3.4%
Disgusted 2.9%

AWS Rekognition

Age 20-28
Gender Female, 96.2%
Calm 96.2%
Sad 0.9%
Fear 0.9%
Angry 0.6%
Confused 0.6%
Happy 0.4%
Disgusted 0.2%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
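Each AWS Rekognition face above reports a full emotion distribution summing to roughly 100%. A small sketch (variable names are illustrative) of reducing one such distribution to its dominant emotion:

```python
# Illustrative only: emotion scores transcribed from the second
# AWS Rekognition face above (age 6-14).
face2 = {
    "Sad": 38.2, "Angry": 28.5, "Calm": 16.8, "Disgusted": 9.9,
    "Confused": 2.3, "Fear": 1.9, "Surprised": 1.4, "Happy": 0.9,
}

def dominant_emotion(scores: dict[str, float]) -> tuple[str, float]:
    """Return the highest-scoring emotion and its confidence."""
    emotion = max(scores, key=scores.get)
    return emotion, scores[emotion]

print(dominant_emotion(face2))  # ('Sad', 38.2)
```

Note that for this face the top two emotions are close (Sad 38.2% vs. Angry 28.5%), so the dominant label alone understates the model's uncertainty.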

Feature analysis

Amazon

Cow 99.9%
Wheel 99.7%
Person 99.6%
Bicycle 76.1%

Captions

Microsoft

an old photo of a horse drawn carriage 98.6%
old photo of a horse drawn carriage 98.1%
a close up of a horse drawn carriage 98%

Text analysis

Amazon

Annas

Google

Annes
Annes