Human-Generated Data

Title

Untitled (man and woman with turkeys at Terry Farm Supply)

Date

c. 1945

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6961

Machine-Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Person 99.2
Bird 97.7
Animal 97.7
Bird 94
Bird 93.6
Bird 93.1
Bird 93
Bird 91.4
Bird 90.8
Bird 89.4
Fowl 87.2
Chicken 87.2
Poultry 87.2
Bird 86.3
Military 85.1
Military Uniform 85.1
Bird 82.1
Bird 80.9
Bird 78.6
Bird 74.2
People 74.2
Mammal 73.5
Bird 73.3
Army 72.3
Armored 72.3
Bird 71.5
Bird 71
Bird 70.3
Bird 69.9
Bird 69.7
Bird 69.5
Bird 68.9
Chicken 68.1
Bird 68
Bird 63.9
Bird 60.9
Crowd 60.4
Waterfowl 60.2
Bull 60
Herd 57
Funeral 56.9
Sheep 56.6
Cormorant 56.2
Bird 55.9
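The Amazon labels above are "name confidence" pairs (confidence as a percentage). A minimal sketch of how such a list could be parsed and filtered by a confidence threshold — the threshold of 90 and the sample subset are illustrative choices, not part of the museum's pipeline:

```python
# Illustrative subset of the label list above, one "<label> <confidence>" per line.
raw = """Person 99.3
Bird 97.7
Chicken 87.2
Military 85.1
Sheep 56.6"""

def parse_labels(text):
    """Split each line into a (label, confidence) pair."""
    pairs = []
    for line in text.strip().splitlines():
        name, score = line.rsplit(" ", 1)
        pairs.append((name, float(score)))
    return pairs

def confident(pairs, threshold=90.0):
    """Keep only labels at or above the (assumed) confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident(parse_labels(raw)))  # ['Person', 'Bird']
```

With a 90% cutoff, only "Person" and "Bird" survive; lowering the threshold admits weaker guesses such as "Sheep".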

Imagga
created on 2022-01-23

cemetery 38
barrow 26.6
handcart 21.3
wheeled vehicle 19
tree 18.7
bench 17.8
building 17.5
old 17.4
vehicle 17.2
architecture 17.2
tie 17.1
park bench 15.1
cannon 15.1
park 14.8
brace 14.2
city 14.1
landscape 14.1
travel 13.4
gun 13
grass 12.6
conveyance 12.6
house 12.5
trees 12.4
sky 12.1
snow 11.9
history 11.6
outdoor 11.5
winter 11.1
tourism 10.7
strengthener 10.7
outdoors 10.6
forest 10.4
weapon 10.4
garden 10.1
landmark 9.9
stretcher 9.7
stone 9.4
water 9.3
town 9.3
historic 9.2
countryside 9.1
seat 8.9
rural 8.8
man 8.7
cold 8.6
industry 8.5
people 8.4
wood 8.3
device 8.1
farm 8
mountain 8
day 7.8
ancient 7.8
litter 7.8
military 7.7
statue 7.6
church 7.4
support 7.4
street 7.4
peace 7.3
industrial 7.3
fall 7.2
road 7.2
religion 7.2
river 7.1
season 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

outdoor 98.9
tree 98.5
text 89.7
person 85.8
black 75.7
grave 71.3
cemetery 70.7
white 70.2
funeral 65.6
old 61.6
man 50.2
several 11.8

Face analysis

AWS Rekognition

Age 52-60
Gender Female, 99.9%
Sad 48.7%
Calm 21.7%
Disgusted 13.8%
Happy 7.8%
Angry 3.1%
Confused 2%
Surprised 1.6%
Fear 1.3%

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 99.9%
Surprised 0%
Angry 0%
Confused 0%
Sad 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 99.5%
Calm 55.8%
Sad 23.3%
Fear 11.4%
Surprised 3.2%
Happy 1.9%
Angry 1.8%
Confused 1.4%
Disgusted 1.2%
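Each AWS Rekognition face record above distributes confidence across eight emotions. A hedged sketch of how the dominant emotion could be picked from such scores — `face1` hard-codes the first face's values from the list above, and the function name is hypothetical:

```python
# Emotion scores (percentages) for the first detected face, as listed above.
face1 = {"Sad": 48.7, "Calm": 21.7, "Disgusted": 13.8, "Happy": 7.8,
         "Angry": 3.1, "Confused": 2.0, "Surprised": 1.6, "Fear": 1.3}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(face1))  # Sad
```

Note that "Sad" wins here with only 48.7% — under half — so a dominant-label readout can overstate how decisive the model's guess actually is.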

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Bird 97.7%
Chicken 87.2%
Sheep 56.6%

Captions

Microsoft

a group of people riding on the back of a sheep 63.9%
a group of people in an old photo of a dog 63.8%
a group of people in an old photo of a man 63.7%

Text analysis

Amazon

400
NAGOY