Human Generated Data

Title

Untitled (group of children feeding a lamb a bottle)

Date

1964

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2297

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.1%
Human 99.1%
Person 98.9%
Person 98.6%
Person 98.6%
Person 98.4%
Hat 94.1%
Clothing 94.1%
Apparel 94.1%
Canine 93%
Animal 93%
Mammal 93%
Dog 92.5%
Pet 92.5%
Person 91.3%
Person 89.7%
Puppy 84.8%
Outdoors 73%
Person 72.2%
Female 67.8%
Girl 66.7%
Hound 66.1%
Face 64.2%
Kid 60.3%
Child 60.3%
People 60.1%
Terrier 59.7%
Dress 58%
Photography 57%
Photo 57%
Collie 55.6%

Clarifai
created on 2023-10-29

people 100%
child 97.6%
group 97.6%
group together 97.4%
wear 96.1%
adult 94.3%
man 93.9%
lid 92.6%
boy 92.5%
outfit 92%
uniform 91.4%
monochrome 91.1%
recreation 90.5%
cavalry 89.2%
several 88.6%
many 87.3%
military 86.3%
veil 86.1%
woman 82.6%
administration 80.8%

Imagga
created on 2022-01-30

musical instrument 47.3%
accordion 43.5%
keyboard instrument 34.8%
wind instrument 29.1%
man 24.9%
sport 19.9%
people 18.4%
person 17.5%
male 17.1%
adult 15.7%
outdoor 13.8%
kin 13.1%
newspaper 12.8%
sunset 12.6%
men 12%
danger 11.8%
silhouette 11.6%
beach 11.3%
summer 10.9%
two 10.2%
product 10%
outdoors 9.9%
grass 9.5%
activity 9%
mask 8.6%
field 8.4%
color 8.3%
leisure 8.3%
sky 8.3%
active 8.3%
protection 8.2%
vacation 8.2%
dirty 8.1%
sun 8%
sand 8%
creation 7.9%
destruction 7.8%
accident 7.8%
gas 7.7%
relax 7.6%
sign 7.5%
dark 7.5%
free 7.5%
work 7.3%
success 7.2%
child 7.2%
recreation 7.2%
travel 7%

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

grass 95.3%
drawing 95%
outdoor 92.5%
text 91%
person 87.9%
clothing 87.1%
sketch 73.3%
dog 71.2%
man 67.5%
mammal 52.3%

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 36-44
Gender Male, 89%
Sad 55.9%
Calm 41%
Confused 0.8%
Disgusted 0.7%
Angry 0.7%
Happy 0.4%
Surprised 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Hat
Dog
Person 99.1%
Person 98.9%
Person 98.6%
Person 98.6%
Person 98.4%
Person 91.3%
Person 89.7%
Person 72.2%
Hat 94.1%
Dog 92.5%

Categories

Imagga

paintings art 97.8%

Text analysis

Amazon

MJI7--YT37A--X

Google

MJI3--YT33A°2-