Human Generated Data

Title

Untitled (three men standing with cow)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2866

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Mammal 100
Animal 100
Cow 100
Cattle 100
Bull 99.6
Human 99.4
Person 99.4
Person 98.9
Person 98
Person 96.9
Person 96
Person 93.2
Transportation 89.5
Vehicle 89.5
Automobile 89.5
Car 89.5
Ox 86.9
Person 81.7
Person 61.6
Buffalo 57.8
Wildlife 57.8

Imagga
created on 2022-01-16

dairy 74.4
farm 46.3
cattle 33.2
horse 27.1
rural 26.4
pen 25.4
livestock 25.4
field 25.1
animal 25
cow 24.3
bull 21.6
grass 21.3
animals 21.3
ranch 20.1
pasture 20.1
enclosure 18.9
mammal 17.7
agriculture 15.8
meadow 15.2
horses 14.6
brown 14
herd 13.7
travel 13.4
graze 12.8
fence 12.6
bovine 12.5
farming 12.3
countryside 11.9
mare 11.8
hog 11.6
head 10.9
cows 10.8
sky 10.8
grazing 10.8
ox 10.7
country 10.5
house 10
swine 10
outdoor 9.9
beef 9.8
structure 9.6
water 9.3
black 9
hay 9
summer 9
outdoors 9
landscape 8.9
horn 8.8
wild 8.7
tourism 8.2
vacation 8.2
wildlife 8
cute 7.9
mane 7.8
stallion 7.8
scene 7.8
cart 7.7
fields 7.7
milk 7.6
island 7.3
domestic 7.2
calf 7.2
equine 7.1

Google
created on 2022-01-16

Working animal 88.8
Bull 85.9
Line 81.9
Dairy cow 81.1
Adaptation 79.4
Snout 74.7
Font 73.6
Livestock 72
Ox 70.7
Pack animal 69.8
Hat 68.6
Bovine 67.2
Event 65.7
History 62.9
Stock photography 62.8
Monochrome photography 61
Landscape 61
Motor vehicle 60.7
Art 60.6
Monochrome 60.6

Microsoft
created on 2022-01-16

text 97.3
outdoor 89.3
horse 83.8
black and white 83.5
person 83.5
mammal 73
man 62.3
animal 60.8
clothing 59.1
cattle 54.1
bovine 45.7
old 42.3

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 99.5%
Happy 81.7%
Calm 15.4%
Confused 0.8%
Surprised 0.6%
Sad 0.6%
Disgusted 0.4%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Male, 92.3%
Calm 89.9%
Surprised 3.9%
Confused 2%
Fear 1.3%
Angry 0.9%
Happy 0.8%
Disgusted 0.7%
Sad 0.5%

AWS Rekognition

Age 48-56
Gender Female, 94.8%
Happy 93.7%
Calm 2.6%
Sad 1.3%
Surprised 1%
Angry 0.6%
Confused 0.3%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 33-41
Gender Male, 96.7%
Calm 70.9%
Sad 16.9%
Confused 3.3%
Fear 2.5%
Happy 1.9%
Disgusted 1.9%
Surprised 1.3%
Angry 1.2%

AWS Rekognition

Age 41-49
Gender Female, 98.5%
Happy 98.1%
Fear 0.6%
Sad 0.4%
Calm 0.4%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 29-39
Gender Male, 94.7%
Calm 94.5%
Happy 1.3%
Surprised 1.2%
Fear 1.1%
Disgusted 1%
Sad 0.4%
Confused 0.3%
Angry 0.2%

AWS Rekognition

Age 28-38
Gender Male, 80.2%
Happy 76.9%
Surprised 10.5%
Fear 4.7%
Calm 3.1%
Sad 2%
Disgusted 1.7%
Angry 0.6%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Cow 100%
Person 99.4%
Car 89.5%

Captions

Microsoft

a group of people standing next to a cow 87%
a group of people standing next to a horse 84.2%
a group of people riding on the back of a horse 67.8%

Text analysis

Amazon

KODAKSELA

Google

YTヨヨA2- AGON
YT
AGON
A2-