Human Generated Data

Title

Greer and Robert on the bed, NYC

Date

1982

People

Artist: Nan Goldin, American, born 1953

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Pace/MacGill Gallery, New York City, P1994.5

Copyright

© Nan Goldin

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Fireplace 99.9
Indoors 99.9
Computer Hardware 99.4
Electronics 99.4
Hardware 99.4
Screen 99.4
Art 99.2
Painting 99.2
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Person 97
Face 94
Head 94
TV 93.9
Monitor 90.4
Hearth 56.1

Clarifai
created on 2018-06-05

people 99.7
adult 99.1
painting 98.1
room 97.2
one 96.8
window 95.8
family 95.8
mammal 95.1
indoors 95.1
baby 94.3
woman 94.2
portrait 94.1
furniture 94.1
man 94.1
child 92.8
offense 91.7
religion 91.4
reclining 91.2
dog 91
music 89.1

Imagga
created on 2018-06-05

rotisserie 40.1
oven 36.7
kitchen appliance 30.4
food 27.4
home appliance 22.5
incubator 19.6
meal 19.6
dinner 17.8
lunch 17.4
meat 17.1
restaurant 16.8
apparatus 15.6
gourmet 14.4
adult 14.2
cooking 14
bakery 13.8
appliance 13.7
cook 13.7
cuisine 13.3
case 12.9
people 12.8
man 12.8
bread 12.5
shop 12.4
equipment 11.6
television 11.6
grilled 11.5
person 11.4
fresh 11.1
barbecue 10.4
home 10.4
tasty 10
delicious 9.9
kitchen 9.8
grill 9.6
cadaver 9.6
seafood 9.5
cooked 9.4
fish 9.1
traditional 9.1
indoors 8.8
chef 8.7
roast 8.6
preparation 8.6
roasted 8.6
face 8.5
fire 8.4
portrait 8.4
nutrition 8.4
eat 8.4
hot 8.4
healthy 8.2
diet 8.1
interior 8
cold 7.7
pork 7.7
beef 7.6
eating 7.6
telecommunication system 7.5
brown 7.4
domestic 7.2
male 7.1

Google
created on 2018-06-05

art 86.5
painting 67.6
modern art 65.9
art gallery 60.5
television 55.1

Microsoft
created on 2018-06-05

indoor 97.4
oven 63.9
picture frame 6.2

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Calm 80%
Sad 20.3%
Surprised 6.4%
Fear 6.1%
Angry 1%
Confused 0.6%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 24-34
Gender Female, 100%
Disgusted 44%
Calm 42.9%
Surprised 9%
Fear 6.4%
Sad 4.2%
Angry 1.3%
Confused 0.7%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%
Monitor 90.4%

Captions

Microsoft
created on 2018-06-05

a screen shot of an oven 38.3%