Human Generated Data

Title

Untitled (group of soldiers playing cards, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.200.4

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 96.1
Human 96.1
Head 88.8
Clothing 84.5
Apparel 84.5
Meal 80.3
Food 80.3
Person 73.8
Face 71.6
Figurine 69.2
Plant 68.8
Finger 68.6
Dish 68
Art 64.6
Creme 59.5
Cake 59.5
Icing 59.5
Dessert 59.5
Cream 59.5
Sculpture 56.2
Skin 55.6

Imagga
created on 2021-12-14

child 32.1
baby 30.7
neonate 20.8
sculpture 20.3
statue 17.9
negative 17.7
body 15.2
love 15
person 14.6
film 14.6
face 13.5
diaper 13.4
man 13.2
ancient 13
old 12.5
family 12.5
detail 12.1
art 11.8
earthenware 11.6
kid 11.5
photographic paper 11.3
clothing 11.1
culture 11.1
black 10.8
religion 10.8
garment 10.7
mother 10.7
care 10.7
antique 10.4
stone 10.4
head 10.1
people 10
pretty 9.8
bride 9.6
male 9.5
bed 9.5
architecture 9.4
dress 9
home 8.8
newborn 8.7
life 8.7
married 8.6
cute 8.6
portrait 8.4
adult 8.4
monument 8.4
father 8.4
parent 8.3
ceramic ware 8
hair 7.9
marble 7.9
look 7.9
happiness 7.8
sepia 7.8
lying 7.5
photographic equipment 7.5
human 7.5
closeup 7.4
wedding 7.4
lady 7.3
girls 7.3
covering 7.2
carving 7

Google
created on 2021-12-14

White 92.2
Black 89.6
Black-and-white 86.7
Organism 85.7
Gesture 85.3
Style 84.1
Petal 81.5
Font 81.2
Adaptation 79.3
Monochrome photography 76.4
Monochrome 76.4
Art 75.3
Snapshot 74.3
Happy 71.4
Baby 70.7
Event 66
Photo caption 65.7
Hat 64.6
Room 62.8
Child 62.6

Microsoft
created on 2021-12-14

text 82.3
black and white 75.3
crowd 1.6

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 53%
Sad 82.5%
Calm 7.8%
Confused 3.7%
Surprised 2.1%
Angry 1.7%
Fear 1.7%
Happy 0.2%
Disgusted 0.2%

Feature analysis

Amazon

Person 96.1%

Captions

Microsoft

a group of people sleeping 39.7%
a group of people sleeping on the bed 29.2%
a group of stuffed animals 29.1%