Human Generated Data

Title

Edith and Rennie Booher, Danville, Virginia

Date

1969-1970

People

Artist: Emmet Gowin, American, born 1941

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, National Endowment for the Arts Grant, P1972.213

Copyright

© Emmet Gowin

Machine Generated Data

Tags

Amazon
created on 2023-08-30

Electrical Device 100
Microphone 100
Head 100
Photography 100
Portrait 100
Face 100
Back 99.8
Body Part 99.8
Couch 99.4
Furniture 99.4
Person 99.2
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Clothing 91.5
Pants 91.5
Finger 90.5
Hand 90.5
Bed 90.3
Bedroom 90.3
Indoors 90.3
Room 90.3
Person 73.7
Person 66.4
Guitar 56.8
Musical Instrument 56.8
Architecture 56.3
Building 56.3
Living Room 56.3
Neck 56.3
Home Decor 56.3
Linen 56.1
Undershirt 56
Lady 55.3

Clarifai
created on 2023-11-01

people 99.9
portrait 98.9
adult 98.6
woman 98.4
monochrome 97.7
one 97.4
music 96.6
room 95.8
two 95.6
furniture 94.4
actress 91
wear 89.9
guitar 89.5
instrument 88.4
musician 87.2
girl 86.8
singer 86.7
child 83.2
seat 83
man 81.4

Imagga
created on 2018-12-21

washboard 40
device 34.8
adult 34.7
sexy 32.9
person 29.4
attractive 25.9
portrait 25.2
fashion 24.1
model 24.1
black 23.6
people 23.4
wind instrument 21.8
sensual 21.8
style 21.5
musical instrument 20.9
hair 19.8
lady 19.5
musician 19.1
studio 18.2
pretty 17.5
posing 16.9
body 16.8
women 16.6
singer 16
dark 15.9
performer 15.8
music 15.5
man 15.5
sitting 15.5
one 14.9
sensuality 14.5
male 13.5
standing 13
elegant 12.9
guitar 12.6
happy 12.5
outfit 12.3
passion 12.2
brunette 12.2
face 12.1
professional 11.9
casual 11.9
skin 11.9
musical 11.5
harmonica 11.1
expression 11.1
business 10.9
stylish 10.9
dress 10.8
holding 10.7
oboe 10.7
performance 10.5
erotic 10.4
instrument 10.3
sax 10.3
lifestyle 10.1
elegance 10.1
blond 10
room 9.8
modern 9.8
human 9.8
interior 9.7
office 9.6
clothing 9.6
couple 9.6
seductive 9.6
rock 9.6
hairstyle 9.5
chair 9.5
free-reed instrument 9.4
make 9.1
gorgeous 9.1
indoors 8.8
smile 8.6
youth 8.5
clothes 8.4
phone 8.3
microphone 8.2
woodwind 8.2
wet 8.1
water 8
cute 7.9
love 7.9
concert 7.8
nude 7.8
stage 7.8
vogue 7.7
wall 7.7
jeans 7.6
two 7.6
rain 7.5
leisure 7.5
action 7.4
light 7.4
stringed instrument 7.3
makeup 7.3
group 7.3
computer 7.2
looking 7.2
romance 7.1
handsome 7.1
romantic 7.1
together 7

Google
created on 2018-12-21

Microsoft
created on 2018-12-21

person 99.6
wall 95.5
indoor 90.5
black and white 51.3
music 27.7
monochrome 14.3
live music 10.3

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 98.7%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 14-22
Gender Female, 55.6%
Calm 52.6%
Happy 17.4%
Surprised 10.8%
Angry 7.8%
Fear 6.9%
Disgusted 6.6%
Sad 3.7%
Confused 2.1%

AWS Rekognition

Age 20-28
Gender Female, 77.7%
Calm 27.6%
Happy 27.1%
Sad 15%
Surprised 14%
Disgusted 8.5%
Fear 7.2%
Angry 4.4%
Confused 4.3%

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Adult 98.9%
Male 98.9%
Man 98.9%
