Human Generated Data

Title

Untitled (two soldiers seated on ground talking, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.140.2

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2021-12-14

Person 98.2
Human 98.2
Person 97.6
Clothing 86.7
Apparel 86.7
Sculpture 84.7
Art 84.7
Sunglasses 74.6
Accessories 74.6
Accessory 74.6
Statue 73
Head 66.7
Face 65.2
People 59.3
Figurine 56.8
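
A minimal sketch of how label tags like the list above can be produced, assuming the image is available as a local file and that AWS Rekognition's DetectLabels API (via boto3) was the source; the file name, region, and confidence cutoff are assumptions, and the listed numbers are Rekognition confidence scores in percent.

import boto3

# Assumed region and a hypothetical local file name; neither comes from the record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2007.184.2.140.2.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence percentages.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the lowest score listed above is 56.8
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")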

Clarifai
created on 2023-10-22

people 99.8
portrait 98.3
monochrome 97.8
adult 97.5
art 97.1
woman 95.2
one 94.2
two 94
wear 93.6
child 91.6
man 90.6
war 87.8
vintage 87
baby 86.1
music 85.8
military 85.5
furniture 84.9
outfit 83.5
girl 82.6
son 80.5
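
A hedged sketch of how concept tags like the Clarifai list above could be generated; it assumes Clarifai's public v2 REST "model outputs" endpoint and its general image-recognition model, and the API key, model ID, and image URL are placeholders rather than values taken from this record.

import requests

CLARIFAI_API_KEY = "<api-key>"              # placeholder credential
MODEL_ID = "general-image-recognition"      # assumed public model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")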

Imagga
created on 2021-12-14

person 33
man 30.2
adult 28.6
people 28.4
portrait 24.6
male 22.8
face 17
hair 16.6
sitting 15.5
happy 13.8
human 13.5
old 13.2
smile 12.8
outdoors 12
men 12
one 11.9
clothing 11.9
casual 11.9
statue 11.8
lifestyle 11.6
outdoor 11.5
smiling 10.8
sadness 10.7
blond 10.6
fashion 10.5
attractive 10.5
mother 10.4
street 10.1
model 10.1
pretty 9.8
black 9.8
jacket 9.7
outside 9.4
mature 9.3
head 9.2
city 9.1
alone 9.1
sculpture 9.1
park 9.1
soldier 8.8
gun 8.7
sad 8.7
art 8.7
cold 8.6
spectator 8.5
20s 8.2
camouflage 8.1
sexy 8
looking 8
world 7.9
life 7.9
work 7.8
happiness 7.8
depression 7.8
military 7.7
joy 7.5
senior 7.5
fun 7.5
leisure 7.5
weapon 7.4
glasses 7.4
lady 7.3
dirty 7.2
dress 7.2
gray 7.2
religion 7.2
eye 7.1
family 7.1
love 7.1
together 7
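
A hedged sketch of how tags like the Imagga list above could be requested; it assumes Imagga's /v2/tags endpoint with HTTP Basic auth, and the credentials and image URL are placeholders, not values from this record.

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder image URL
    auth=("<api-key>", "<api-secret>"),                     # placeholder credentials
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score (percent).
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")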

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

human face 95.6
person 95.4
text 93.1
clothing 85.8
woman 84.2
black and white 78.7
portrait 53.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 85.2%
Calm 98.5%
Surprised 0.5%
Disgusted 0.4%
Sad 0.2%
Happy 0.2%
Angry 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 30-46
Gender Female, 96.9%
Fear 31.5%
Surprised 30.4%
Calm 19.7%
Angry 8.5%
Sad 5.2%
Happy 3.1%
Confused 1.1%
Disgusted 0.5%
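
A minimal sketch of how age, gender, and emotion estimates like those above can be obtained, assuming AWS Rekognition's DetectFaces API with all facial attributes requested; the region and file name are assumptions, and the emotion figures are model confidence values, not measured emotional states.

import boto3

# Assumed region and a hypothetical local file name; neither comes from the record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2007.184.2.140.2.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks for AgeRange, Gender, Emotions, and other face details.
faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")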

Feature analysis

Amazon

Person 98.2%
Sunglasses 74.6%

Categories

Captions

Microsoft
created on 2021-12-14

a person sitting on a bed 34.7%
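
A hedged sketch of how a caption like "a person sitting on a bed 34.7%" could be produced, assuming Microsoft's Computer Vision image-analysis REST API with the Description feature; the endpoint host, API version, key, and file name are placeholders/assumptions.

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder host
KEY = "<subscription-key>"                                        # placeholder key

with open("2007.184.2.140.2.jpg", "rb") as f:   # hypothetical local file name
    image_bytes = f.read()

# Request image description (captions) plus tags; the version path is an assumption.
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Description,Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")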