Human Generated Data

Title

Untitled (three photographs: young men with cows in town and on roads)

Date

c. 1940, printed later

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6799

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Animal 99.9
Mammal 99.9
Cattle 99.9
Cow 99.9
Cow 99.7
Person 98.2
Human 98.2
Dairy Cow 98
Bull 97.9
Horse 96.6
Person 96.1
Person 96
Machine 95.9
Wheel 95.9
Person 89.2
Cow 87.7
Person 87.5
Person 85.9
Horse 84.7
Horse 77.9
Horse 72.1
Person 71.5
Collage 67.6
Poster 67.6
Advertisement 67.6
Person 62
Person 61.2
Person 49
Person 44.8
Person 42.9
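
The Amazon tags above pair each detected object or scene label with a confidence score from AWS Rekognition. Below is a minimal sketch, using boto3, of how such labels can be requested for a scanned print; the file name and thresholds are illustrative assumptions, not values taken from this record.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical local scan of the print; not a real file referenced by this record.
with open("annas_untitled_scan.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=40,  # the list above includes labels down to roughly 43
    )

# Each label carries a name and a confidence score, matching the pairs listed above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))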

Clarifai
created on 2019-11-16

cow 99.7
cattle 99.7
agriculture 99.5
people 99.5
mammal 99.2
bull 98.7
animal 98.2
group 98.1
livestock 97.2
cavalry 97.1
dog 95.4
nature 94.6
canine 93.7
farm 93.3
milk 92.9
beef cattle 92.5
medicine 92.5
calf 91.3
man 90.9
abstraction 90.7

Imagga
created on 2019-11-16

dairy 66.2
farm 43.7
cow 39.6
ranch 35.2
pasture 34.5
rural 32.6
horse 29.6
cattle 29
livestock 27.3
horses 26.3
grass 22.9
fence 22
equine 21
field 20.9
snow 19
cows 18.7
grazing 18.6
bull 18
agriculture 17.5
landscape 17.1
black 16.8
brown 16.2
meadow 16.1
outdoors 15.7
bovine 15.5
animals 14.8
herd 14.7
stallion 14.7
winter 14.5
countryside 12.8
mare 12.8
sky 12.8
hay 12.7
milk 12.4
plow 12.2
standing 12.2
corral 11.8
mane 11.7
tree 11.5
stable 10.8
graze 10.8
trees 10.7
country 10.5
farming 10.4
beef 10.3
mammal 10.2
domestic 9.9
outdoor 9.9
tool 9.9
sunset 9.9
barn 9.8
calf 9.6
pen 9.2
horn 8.8
dog 8.7
cold 8.6
eating 8.4
horizontal 8.4
silhouette 8.3
animal 8.1
shed 7.9
two 7.6
environment 7.4
meat 7.2
summer 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

animal 93.7
cattle 92.6
horse 81.9
text 80.4
black and white 76.3
white 70.2
land vehicle 69.1
vehicle 68
bull 61.5
mammal 53.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Male, 50.5%
Surprised 49.5%
Fear 49.5%
Happy 49.5%
Sad 49.5%
Calm 50.4%
Disgusted 49.5%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 20-32
Gender Male, 50.4%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 49.6%
Fear 49.5%
Calm 50.2%
Confused 49.6%

AWS Rekognition

Age 21-33
Gender Male, 50.5%
Fear 49.5%
Confused 49.6%
Calm 49.6%
Sad 49.5%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%
Angry 50.3%

AWS Rekognition

Age 21-33
Gender Male, 50.5%
Surprised 49.5%
Fear 49.5%
Happy 49.5%
Sad 49.7%
Calm 50.3%
Disgusted 49.5%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50.4%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Calm 49.5%
Sad 49.8%
Surprised 49.5%
Fear 49.5%
Confused 50.1%

AWS Rekognition

Age 30-46
Gender Male, 50.4%
Angry 49.5%
Sad 49.6%
Happy 49.8%
Disgusted 49.5%
Calm 50%
Confused 49.5%
Surprised 49.5%
Fear 49.5%

AWS Rekognition

Age 26-40
Gender Male, 50.5%
Fear 49.5%
Calm 50.3%
Happy 49.7%
Surprised 49.5%
Angry 49.5%
Sad 49.5%
Disgusted 49.5%
Confused 49.5%
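
Each face analysis block above reports an estimated age range, a gender estimate with confidence, and a confidence score for each emotion, as returned by AWS Rekognition's DetectFaces operation. The following is a minimal sketch, using boto3, of how those attributes can be requested; the file name is an illustrative assumption.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical local scan of the print; not a real file referenced by this record.
with open("annas_untitled_scan.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

# Print one block per detected face, in the same shape as the entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')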

Feature analysis

Amazon

Cow 99.9%
Person 98.2%
Horse 96.6%
Wheel 95.9%

Categories

Imagga

paintings art 99%

Captions

Text analysis

Google

nosenuASSEER'S
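
The text analysis entry above is the raw string that Google's OCR returned for this image (likely a garbled reading of a stamp or caption). A minimal sketch of how such text can be requested with the Google Cloud Vision client library follows; the file name is an illustrative assumption.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Google Cloud credentials are configured

# Hypothetical local scan of the print; not a real file referenced by this record.
with open("annas_untitled_scan.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; later ones are individual words.
for annotation in response.text_annotations:
    print(annotation.description)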