Human Generated Data

Title

Untitled (view through tubes used to hold ammunition, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American (1945-1984)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.25.3

Machine Generated Data

Tags

Amazon
created on 2019-05-28

Footprint 93
Person 74
Human 74
Animal 70.3
Bird 70.3
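
These labels follow the output format of AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be retrieved with boto3; the file name, region, and confidence threshold are assumptions, not details taken from this record.

    import boto3

    # Hypothetical file name and region; the museum's actual pipeline is not documented here.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("2007.184.2.25.3.jpg", "rb") as f:
        image_bytes = f.read()

    # Request labels above an assumed confidence floor (the listed tags all score 70 or higher).
    response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=70)

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')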

Clarifai
created on 2019-05-28

monochrome 99.5
people 99.1
one 98.5
snow 98
portrait 97.5
girl 97.4
still life 96.9
adult 96.2
no person 96.1
winter 96.1
egg 92.8
food 91
blur 90.9
beach 90.8
woman 90.3
cold 89.2
child 89.1
model 89
art 89
H2O 88.4
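
Tags like these could be reproduced with a call to Clarifai's v2 prediction REST API. The sketch below is an assumption-laden illustration: the model ID, API key placeholder, and file name are not part of this record, and Clarifai reports confidences on a 0-1 scale, rescaled here to percentages.

    import base64
    import requests

    # Assumptions: Clarifai's v2 "outputs" endpoint and a general image-recognition model ID;
    # neither the key nor the exact model used for this record is documented here.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"

    with open("2007.184.2.25.3.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    )

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))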

Imagga
created on 2019-05-28

device 39.6
pick 37.8
hole 20.2
sexy 20.1
skin 19.6
face 18.5
body 18.4
bathtub 17
portrait 16.8
pretty 16.1
close 16
person 15.8
child 15.7
care 15.6
people 15.1
vessel 15
health 14.6
attractive 14
sensual 13.6
cute 13.6
fun 13.5
clean 13.4
water 13.3
ball 13.1
adult 12.9
human 12.7
black 12.6
hand 12.6
happy 12.5
male 12.2
healthy 12
one 11.9
spa 11.7
game equipment 11.6
lifestyle 11.6
man 11.4
bath 11.4
lady 11.4
makeup 11.2
expression 11.1
equipment 10.8
smile 10.7
tub 10.2
table 10.2
happiness 10.2
closeup 10.1
relaxation 10
fresh 9.8
bathroom 9.6
pool table 9.6
hair 9.5
play 9.5
natural 9.4
lips 9.2
treatment 9.2
fashion 9
brunette 8.7
love 8.7
eyes 8.6
nose 8.6
sport 8.6
erotic 8.5
therapy 8.5
bubble 8.5
earth 8.4
relax 8.4
hot 8.4
cheerful 8.1
wet 8
lovely 8
looking 8
luxury 7.7
sexual 7.7
massage 7.7
bubbles 7.6
joy 7.5
cosmetics 7.5
hearing aid 7.4
globe 7.4
toy 7.4
doll 7.3
furniture 7.2
childhood 7.2
kid 7.1
little 7.1
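
A hedged sketch of how Imagga-style tags might be requested, assuming Imagga's v2 /tags endpoint with a basic-auth key and secret and a publicly reachable image URL; none of these specifics appear in the record.

    import requests

    # Hypothetical credentials and image URL, used only to illustrate the request shape.
    IMAGGA_KEY = "YOUR_API_KEY"
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.org/2007.184.2.25.3.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )

    # Imagga reports confidences on a 0-100 scale, matching the listing above.
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))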

Google
created on 2019-05-28

Microsoft
created on 2019-05-28

wall 96
dirty 28.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 98.3%
Disgusted 3.4%
Happy 32.2%
Sad 8.6%
Calm 37.5%
Angry 4.4%
Surprised 6.4%
Confused 7.5%
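
The age, gender, and emotion estimates above correspond to AWS Rekognition's DetectFaces operation with all facial attributes requested. A minimal boto3 sketch follows; the file name and region are assumptions.

    import boto3

    # Hypothetical file name and region; credentials come from the usual AWS configuration.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("2007.184.2.25.3.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')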

Feature analysis

Amazon

Person 74%
Bird 70.3%

Categories

Captions

Microsoft
created on 2019-05-28

a close up of a sink 27%
a close up of a white wall 26.9%
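
The Microsoft tags ("wall", "dirty") and the captions above resemble output from Azure Computer Vision's analyze operation with Description and Tags requested. The sketch below assumes the v2.0 REST endpoint and a subscription key; neither detail is documented in this record, and confidences are rescaled from 0-1 to percentages.

    import requests

    # Hypothetical endpoint region, key, and file name, shown only to illustrate the call.
    ENDPOINT = "https://eastus.api.cognitive.microsoft.com/vision/v2.0/analyze"
    KEY = "YOUR_SUBSCRIPTION_KEY"

    with open("2007.184.2.25.3.jpg", "rb") as f:
        resp = requests.post(
            ENDPOINT,
            params={"visualFeatures": "Description,Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    analysis = resp.json()
    for tag in analysis["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))
    for caption in analysis["description"]["captions"]:
        print(caption["text"], round(caption["confidence"] * 100, 1))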