Human Generated Data

Title

Untitled (man standing behind calf with ribbon at 4-H Club calf show)

Date

1946

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2993

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Human 99.1
Person 99.1
Person 98.9
Person 98.5
Person 98
Animal 97
Cow 97
Mammal 97
Cattle 97
Person 96.4
Bull 96.4
Apparel 94.7
Clothing 94.7
Person 77.1
Person 71.4
Person 70.3
Person 69.8
Hat 65.2
Person 60.7
Horse 56.3
Bullfighter 55.8
Shorts 55

Imagga
created on 2022-01-21

cattle 64.3
horse 61.9
cow 56
farm 54.4
pasture 35.4
ranch 34
bovine 31.6
grass 30
rural 30
field 29.3
livestock 29.1
bull 28.8
animals 28.7
horses 28.2
grazing 23.5
meadow 23.3
brown 22.1
beef 22
calf 20.5
vaulting horse 20
farming 19.9
saddle 19.9
agriculture 19.3
sidesaddle 18.2
dairy 18
dog 17.6
ox 17.5
countryside 17.4
equine 16.8
graze 16.7
herd 16.7
pen 16.3
cows 14.8
mare 14.7
milk 13.4
animal 13.2
fence 13.2
enclosure 12.9
stallion 12.7
canine 12.7
standing 12.2
sky 12.1
gymnastic apparatus 12
mane 11.7
agricultural 11.7
stock 11.2
seat 11
domestic animal 11
outdoor 10.7
breed 10.6
riding 9.8
mammals 9.8
summer 9
trees 8.9
equestrian 8.8
horn 8.8
hay 8.8
country 8.8
young mammal 8.7
cute 8.6
outside 8.6
mammal 8.5
head 8.4
pet 8.3
outdoors 8.2
domestic 8.1
meat 8.1
sports equipment 8
pony 7.9
feeding 7.8
curious 7.8
farmland 7.7
tree 7.7
color 7.2
scenery 7.2
black 7.2
support 7.2
looking 7.2

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

standing 95.1
black 91.2
white 85.7
text 85.5
horse 81.8
animal 80.8
mammal 73.5
cattle 63.6
posing 46.2
old 41.3

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.6%
Calm 98.9%
Happy 0.7%
Sad 0.2%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 94%
Happy 91.3%
Surprised 3.1%
Fear 2.1%
Confused 0.9%
Angry 0.8%
Sad 0.7%
Calm 0.6%
Disgusted 0.4%

AWS Rekognition

Age 22-30
Gender Male, 99.3%
Happy 42.9%
Calm 16%
Fear 15.9%
Surprised 10.3%
Sad 8.5%
Angry 3%
Disgusted 2.5%
Confused 0.7%

AWS Rekognition

Age 16-24
Gender Female, 53.9%
Calm 79.8%
Fear 9.5%
Sad 4.8%
Disgusted 2.9%
Surprised 1.3%
Happy 0.7%
Confused 0.5%
Angry 0.5%

AWS Rekognition

Age 29-39
Gender Male, 67.4%
Calm 94.3%
Sad 2.2%
Happy 1.2%
Disgusted 1%
Confused 0.5%
Angry 0.3%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 18-24
Gender Female, 95.4%
Calm 76.9%
Happy 9.7%
Angry 3.5%
Sad 3.2%
Surprised 2.9%
Fear 2.1%
Disgusted 1.1%
Confused 0.7%

AWS Rekognition

Age 9-17
Gender Female, 93.9%
Sad 52.6%
Calm 21.9%
Happy 16.2%
Angry 3%
Fear 1.9%
Surprised 1.7%
Confused 1.5%
Disgusted 1.3%

AWS Rekognition

Age 23-31
Gender Male, 98.6%
Calm 99.9%
Sad 0%
Surprised 0%
Happy 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 53.8%
Happy 47.2%
Calm 25.6%
Sad 13.2%
Surprised 5.6%
Fear 3.3%
Confused 2%
Angry 1.5%
Disgusted 1.4%

AWS Rekognition

Age 24-34
Gender Female, 76.4%
Calm 69.7%
Surprised 19.5%
Happy 2.9%
Angry 2.7%
Disgusted 1.7%
Fear 1.7%
Sad 1.3%
Confused 0.6%

AWS Rekognition

Age 23-31
Gender Male, 81.1%
Calm 97.4%
Happy 1.1%
Sad 0.5%
Surprised 0.4%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.1%
Cow 97%

Captions

Microsoft

a person standing in front of a horse 90%
a person standing next to a horse 89.3%
a person standing next to a dog 61.5%

Text analysis

Amazon

KODAK-SVEFIA

Google

KODVK
KODVK