Human Generated Data

Title

Untitled (two men, dog, and pair of oxen with covered wagon)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6766

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.8
Human 99.8
Person 97.9
Person 97.5
Person 97.5
Person 97.5
Wagon 97
Vehicle 97
Transportation 97
Horse Cart 97
Person 96.8
Carriage 93.3
Horse 90.2
Mammal 90.2
Animal 90.2
Person 86.7
Cattle 86.5
Cow 86.5
Person 84.1
Person 82.5
Cow 78.9
Machine 77.9
Wheel 77.9
Wheel 77.4
Horse 75.3
Person 74.6
Person 74.3
Horse 70.4
Person 69.2
Horse 67.8
Horse 66.5
Person 66.3
Horse 62.7
Person 61.4
Cow 56.9
Horse 55.3
Person 50.7
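Tag lists like the one above pair each label with a confidence score, and a common way to consume them is to filter at a threshold. A minimal sketch, using a handful of (label, confidence) pairs copied from the Amazon list above; the 90-point threshold is an arbitrary illustration, not a value used by the service:

```python
# Sketch: filtering machine-generated labels by confidence.
# The (label, confidence) pairs are copied from the Amazon tag
# list above; the threshold is an illustrative choice.
labels = [
    ("Person", 99.8), ("Wagon", 97.0), ("Horse Cart", 97.0),
    ("Horse", 90.2), ("Cattle", 86.5), ("Wheel", 77.9), ("Cow", 56.9),
]

def confident(pairs, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident(labels))  # → ['Person', 'Wagon', 'Horse Cart', 'Horse']
```

Note that duplicate labels at different scores (the many "Person" entries above) are separate detections, so a real consumer may also want to deduplicate by keeping each label's highest score.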

Clarifai
created on 2019-11-16

cavalry 100
people 100
many 99.8
group 99.7
seated 99.5
group together 99.3
vehicle 98.9
carriage 98.8
transportation system 98.8
man 97.2
crowd 97
wagon 96.8
adult 96.7
street 96.2
mammal 94.6
road 93.8
driver 92.7
soldier 91.7
recreation 91.1
child 90.9

Imagga
created on 2019-11-16

snow 100
weather 100
winter 44.2
cold 31
landscape 29
trees 24.9
ice 21.2
season 19.5
outdoors 19.4
frost 19.2
carriage 18.5
sky 17.9
tree 17.7
travel 17.6
forest 17.4
snowy 16.5
rural 15.9
scene 14.7
frozen 14.3
city 13.3
old 13.2
park 13.2
outdoor 13
horse 12.3
road 11.7
scenery 10.8
man 10.8
water 10.7
people 10.6
woods 10.5
sun 10.5
black 10.2
field 10
morning 9.9
mountain 9.8
country 9.7
sea 9.4
cart 9.2
peaceful 9.2
horse cart 9
transportation 9
recreation 9
scenic 8.8
seasonal 8.8
freeze 8.7
urban 8.7
storm 8.7
holiday 8.6
walk 8.6
farm 8
frosty 7.8
pine 7.6
clouds 7.6
animal 7.6
path 7.6
town 7.4
vacation 7.4
street 7.4
light 7.4
tourist 7.2
sunset 7.2
fence 7.1
male 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

horse 99.8
outdoor 94.9
cart 93.1
text 89.1
animal 87.3
group 73.5
land vehicle 71.9
horse and buggy 64.7
carriage 61.9
vehicle 61.4
person 60.5
old 56.8
cattle 46.9
several 20.5

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 50.5%
Surprised 49.6%
Disgusted 49.5%
Happy 49.5%
Sad 49.6%
Calm 49.7%
Confused 49.6%
Fear 49.6%
Angry 49.9%

AWS Rekognition

Age 32-48
Gender Male, 50.5%
Fear 49.5%
Angry 49.5%
Calm 50.4%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Sad 49.5%
Disgusted 49.5%

AWS Rekognition

Age 29-45
Gender Male, 50.5%
Surprised 49.6%
Fear 49.6%
Angry 49.9%
Disgusted 49.6%
Sad 49.6%
Calm 49.6%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 51-69
Gender Male, 50.4%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Sad 49.5%
Calm 50.1%
Surprised 49.8%
Confused 49.5%
Fear 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Happy 49.5%
Calm 49.8%
Fear 49.7%
Confused 49.6%
Angry 49.5%
Surprised 49.7%
Disgusted 49.5%
Sad 49.6%

AWS Rekognition

Age 14-26
Gender Female, 50.1%
Happy 49.5%
Sad 49.6%
Angry 49.5%
Calm 49.8%
Fear 50%
Surprised 49.6%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 26-40
Gender Female, 50%
Sad 49.9%
Surprised 49.5%
Angry 49.5%
Confused 49.5%
Fear 49.6%
Happy 49.5%
Calm 49.9%
Disgusted 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.3%
Disgusted 49.5%
Happy 49.5%
Fear 49.5%
Confused 49.5%
Surprised 49.5%
Calm 49.6%
Angry 49.6%
Sad 50.3%
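Each face entry above carries eight emotion scores, and a simple way to summarize one is to take the top-scoring emotion. A minimal sketch, with the scores copied from the second entry above (Age 32-48); note that near-uniform scores like these mean the model has little confidence in any single emotion:

```python
# Sketch: summarizing one face entry by its top-scoring emotion.
# Scores are copied from the Age 32-48 entry above; when they are
# this close to uniform, the "dominant" emotion is a weak signal.
emotions = {
    "Fear": 49.5, "Angry": 49.5, "Calm": 50.4, "Surprised": 49.5,
    "Happy": 49.5, "Confused": 49.5, "Sad": 49.5, "Disgusted": 49.5,
}

top = max(emotions, key=emotions.get)
print(top, emotions[top])  # → Calm 50.4
```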

Feature analysis

Amazon

Person 99.8%
Horse 90.2%
Cow 86.5%
Wheel 77.9%

Captions

Microsoft

a group of people riding on the back of a horse 86.3%
a group of people standing next to a horse 86.2%
a group of people walking down a street next to a horse 86.1%