Human Generated Data

Title

Untitled (three men and a boy standing with hogs in a corral)

Date

c. 1950

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2422

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Human 99.7
Person 99.7
Person 99.3
Person 99.2
Person 99
Animal 91.3
Mammal 90.3
Cow 84.1
Cattle 84.1
Sheep 69.6
Sheep 68.4
Deer 62.6
Wildlife 62.6
Dog 61.4
Canine 61.4
Pet 61.4
Hat 56
Clothing 56
Apparel 56
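
The label/confidence pairs above are the kind of output an image-labeling service returns. As an illustration only (the pipeline actually used for this record is not documented here), a minimal sketch using AWS Rekognition's DetectLabels call might look like the following; the bucket and object names are placeholders.

import boto3

# Hypothetical sketch: label/confidence pairs like those listed above.
# Bucket and key names are placeholders, not real resources.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas_hogs.jpg"}},
    MinConfidence=55,  # roughly matches the lowest score shown above (Hat 56)
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")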

Imagga
created on 2022-01-30

hog 100
swine 100
ungulate 60.2
farm 54.4
mammal 36.9
pig 31.8
pen 31.8
rural 31.7
piglet 26.8
animals 25.9
livestock 25.9
enclosure 24.3
pork 23.9
piggy 23.1
agriculture 21.9
snout 21.6
field 20.9
grass 19
pink 18.4
cute 17.9
horse 17.1
farming 17.1
country 15.8
ears 15.5
nose 15.3
wildlife 15.2
boar 14.8
horses 14.6
domestic 14.5
wild 13.9
mammals 13.6
pasture 13.4
fur 13
funny 12.9
pigs 12.9
dirty 12.7
meat 12.6
fat 12.1
countryside 11.9
brown 11.8
hay 11.7
meadow 11.7
structure 11.6
close 11.4
cattle 11
sheep 10.9
fence 10.6
smell 10.6
head 10.1
ranch 9.8
outdoors 9.7
tail 9.6
hair 9.5
cold 9.5
cow 9.1
sow 8.9
mud 8.8
equine 8.8
zoo 8.7
food 8.5
adorable 8.3
landscape 8.2
look 7.9
barn 7.9
feed 7.8
curious 7.8
fields 7.7
breed 7.7
winter 7.7
baby 7.5
snow 7.2
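
Imagga exposes a REST tagging endpoint that returns tag/confidence pairs of this kind. The sketch below is illustrative only: the credentials and image URL are placeholders, and the exact response fields should be checked against Imagga's current documentation.

import requests

API_KEY, API_SECRET = "your_key", "your_secret"  # placeholders
image_url = "https://example.org/annas_hogs.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(API_KEY, API_SECRET),
)
# Assumed response shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for item in resp.json().get("result", {}).get("tags", []):
    print(item["tag"]["en"], round(item["confidence"], 1))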

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

outdoor 97.7
animal 93.5
standing 92.7
text 89.6
black and white 81
mammal 77.5
white 65.9
pig 64.2
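
Tags with confidences such as these can be produced by Azure's Computer Vision "tag" operation. A sketch under assumptions follows; the endpoint, subscription key, and image URL are placeholders, and the service's confidences (0-1) are scaled to percentages to match the listing above.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-region.api.cognitive.microsoft.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)
result = client.tag_image("https://example.org/annas_hogs.jpg")  # placeholder URL
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))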

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 98.7%
Calm 59.8%
Sad 18.7%
Angry 7.9%
Surprised 4.6%
Disgusted 4%
Happy 2.3%
Confused 1.4%
Fear 1.4%

AWS Rekognition

Age 54-62
Gender Male, 99.7%
Calm 60.1%
Confused 21%
Sad 13.8%
Happy 1.3%
Fear 1.3%
Angry 1.2%
Disgusted 1%
Surprised 0.3%

AWS Rekognition

Age 26-36
Gender Male, 93.5%
Calm 98%
Angry 0.5%
Disgusted 0.4%
Fear 0.3%
Sad 0.2%
Surprised 0.2%
Confused 0.2%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Confused 70.7%
Calm 24.7%
Disgusted 2.1%
Surprised 1.1%
Angry 0.4%
Sad 0.4%
Happy 0.4%
Fear 0.2%
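
Per-face age ranges, gender estimates, and emotion scores like the four blocks above are what Rekognition's DetectFaces API returns when all attributes are requested. A minimal sketch, with placeholder bucket and key names, might look like this:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
faces = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "annas_hogs.jpg"}},
    Attributes=["ALL"],
)
for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back as uppercase types (CALM, SAD, ...) with confidences
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")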

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
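
The likelihood ratings above ("Very unlikely", etc.) correspond to the likelihood enums returned by Google Cloud Vision face detection. An illustrative sketch follows; the image URI is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/annas_hogs.jpg")
)
response = client.face_detection(image=image)

def pretty(likelihood):
    # VERY_UNLIKELY -> "Very unlikely", to match the listing above
    return likelihood.name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))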

Feature analysis

Amazon

Person 99.7%
Cow 84.1%
Sheep 69.6%
Dog 61.4%
Hat 56%

Captions

Microsoft

a group of sheep standing on top of a building 70.1%
a group of sheep that are standing in front of a building 69%
a group of people standing in front of a building 68.9%
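
Ranked captions with confidences, as above, are what Azure Computer Vision's "describe" operation returns. A short sketch under the same assumptions as the earlier tag example (placeholder endpoint, key, and image URL):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-region.api.cognitive.microsoft.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)
description = client.describe_image(
    "https://example.org/annas_hogs.jpg", max_candidates=3  # placeholder URL
)
for caption in description.captions:
    print(caption.text, round(caption.confidence * 100, 1))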

Text analysis

Google

NACON
-WAMTBA
NACON YT33A2 -WAMTBA
YT33A2
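
OCR strings like the ones above can be read with Google Cloud Vision text detection. The sketch below is illustrative only; the image URI is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/annas_hogs.jpg")
)
response = client.text_detection(image=image)
# The first annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)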