Human Generated Data

Title

New York City, Regine’s

Date

1977

People

Artist: Larry Fink, American 1941 - 2023

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of George and Alexandra Stephanopoulos, 2014.467

Copyright

© Larry Fink

Machine Generated Data

Tags (label, confidence score)

Amazon
created on 2019-04-08

Human 99.7
Person 99.7
Person 99.3
Pub 93.9
Restaurant 93.4
Bar Counter 91.2
Tie 85.5
Accessories 85.5
Accessory 85.5
Beverage 81.8
Drink 81.8
Alcohol 79.9
Food Court 76.6
Food 76.6
Glass 73.7
Cafe 68.5
Cafeteria 59.8
Liquor 55.8
Night Life 55.8
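
The label/score pairs above have the shape of output returned by Amazon Rekognition's DetectLabels operation, where each score is a confidence percentage. A minimal sketch using boto3; the region, file name, and thresholds are placeholder assumptions, not part of the original record:

    import boto3

    # Region, file name, and thresholds are placeholders; AWS credentials are assumed.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("new_york_city_regines_1977.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=55,  # the lowest score listed above is 55.8
    )

    # Each label carries a confidence percentage, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")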

Clarifai
created on 2018-02-09

people 99.8
group 99.4
four 99.1
group together 99
woman 98
adult 97.9
two 97.6
three 97.4
several 97.1
man 97
five 96.9
recreation 92.2
sit 92
facial expression 89.6
actress 89.4
music 88.9
portrait 88.9
nightclub 88.6
bar 86.9
wear 86.1
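
Clarifai concept tags like these come from its v2 predict endpoint. The sketch below uses the requests library; the API key, model ID, and image URL are placeholder assumptions (the public general model is assumed to stand in for whatever model produced the 2018 tags):

    import requests

    # API key, model ID, and image URL are placeholders.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"  # assumed public "general" model
    IMAGE_URL = "https://example.org/new_york_city_regines_1977.jpg"

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concepts come back with a 0-1 "value"; the page above shows 0-100 scores.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")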

Imagga
created on 2018-02-09

bartender 47.4
black 30.5
sexy 28.9
person 28.1
adult 27.2
portrait 26.5
fashion 24.1
attractive 23.1
people 22.9
man 20.8
couple 19.2
model 18.7
male 18.5
dark 17.5
pretty 17.5
dress 17.2
style 17.1
face 17.1
hair 16.7
brunette 16.6
sensual 16.4
love 15.8
lady 13.8
elegance 12.6
happy 12.5
posing 12.4
retro 12.3
passion 12.2
body 12
sensuality 11.8
gorgeous 11.8
romantic 11.6
smile 11.4
studio 11.4
human 11.3
fun 11.2
party 11.2
sitting 11.2
glass 11.2
women 11.1
two 11
holding 10.7
erotic 10.5
elegant 10.3
barroom 10.3
vintage 10.2
smiling 10.1
music 9.9
night 9.8
interior 9.7
together 9.6
lifestyle 9.4
relationship 9.4
room 9.4
hand 9.1
suit 9
one 9
groom 8.9
girlfriend 8.7
dance 8.7
happiness 8.6
seductive 8.6
men 8.6
expression 8.5
wine 8.5
alcohol 8.5
entertainment 8.3
makeup 8.2
make 8.2
stylish 8.1
passionate 7.9
hugging 7.8
boyfriend 7.7
youth 7.7
old 7.7
drink 7.5
hat 7.4
emotion 7.4
playing 7.3
looking 7.2
celebration 7.2
romance 7.1
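
Imagga's tagger is exposed as a REST endpoint (/v2/tags) authenticated with an API key and secret. A sketch with requests, where the credentials and image URL are placeholders:

    import requests

    # Credentials and image URL are placeholders.
    API_KEY = "YOUR_IMAGGA_API_KEY"
    API_SECRET = "YOUR_IMAGGA_API_SECRET"
    IMAGE_URL = "https://example.org/new_york_city_regines_1977.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each tag carries a 0-100 confidence, as listed above.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")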

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 98.8
woman 91.5
indoor 90.1
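
The Microsoft tags match the output of the Computer Vision "tag" operation in Azure Cognitive Services. A sketch against the v3.2 REST endpoint, with the resource endpoint, key, and image URL as placeholders:

    import requests

    # Endpoint, key, and image URL are placeholders.
    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_CV_KEY"
    IMAGE_URL = "https://example.org/new_york_city_regines_1977.jpg"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    # Confidence is returned on a 0-1 scale; the page above shows 0-100 scores.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")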

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 99.9%
Sad 3.7%
Surprised 10.9%
Disgusted 4.5%
Angry 9%
Calm 49.3%
Happy 6.5%
Confused 16.1%

AWS Rekognition

Age 38-59
Gender Male, 99.8%
Angry 4.4%
Calm 33.6%
Happy 0.6%
Disgusted 1.7%
Surprised 1.8%
Sad 53.2%
Confused 4.7%

AWS Rekognition

Age 20-38
Gender Female, 54.2%
Calm 12.2%
Happy 0.4%
Disgusted 0.7%
Surprised 0.8%
Angry 3.5%
Sad 81%
Confused 1.3%
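
The age ranges, gender calls, and emotion percentages above are what Rekognition's DetectFaces returns when all facial attributes are requested. A minimal boto3 sketch, with the region and file name as placeholders:

    import boto3

    # Region and file name are placeholders; AWS credentials are assumed.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("new_york_city_regines_1977.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")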

Microsoft Cognitive Services

Age 42
Gender Female

Microsoft Cognitive Services

Age 39
Gender Male
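
These age and gender estimates are the kind of face attributes the Azure Face API's detect operation returned at the time (Microsoft has since retired age and gender attributes). A sketch against the v1.0 REST endpoint, with the endpoint, key, and image URL as placeholders:

    import requests

    # Endpoint, key, and image URL are placeholders.
    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_FACE_KEY"
    IMAGE_URL = "https://example.org/new_york_city_regines_1977.jpg"

    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    for face in response.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].title()}")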

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
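
The "Very unlikely" ratings are the likelihood buckets reported by Google Cloud Vision face detection. A sketch using the google-cloud-vision client, assuming credentials are configured and with the file name as a placeholder:

    from google.cloud import vision

    # Application Default Credentials are assumed; the file name is a placeholder.
    client = vision.ImageAnnotatorClient()

    with open("new_york_city_regines_1977.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood bucket (VERY_UNLIKELY .. VERY_LIKELY).
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)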

Feature analysis

Amazon

Person 99.7%
Tie 85.5%
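
The feature analysis entries are the subset of Rekognition labels that also carry bounding-box instances (here Person and Tie). A sketch that filters the DetectLabels response down to localized objects, with the same placeholder region and file name as above:

    import boto3

    # Same placeholder region and file name as in the earlier sketches.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("new_york_city_regines_1977.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    # Keep only labels that come with bounding-box instances, e.g. Person and Tie.
    for label in response["Labels"]:
        if label["Instances"]:
            count = len(label["Instances"])
            print(f"{label['Name']} {label['Confidence']:.1f}% ({count} instance(s))")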

Categories