Human Generated Data

Title

Untitled

Date

20th century

People

Artist: Lucien Clergue, French, 1934–2014

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, The Willy and Charlotte Reber Collection, Gift of Charlotte Reber, P1995.217

Copyright

© Lucien Clergue Estate / Artists Rights Society (ARS), New York, NY / SAIF, Paris

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Indoors 99.5
Interior Design 99.5
Person 99.4
Person 99.1
Person 99
Person 98.8
Pub 98.7
Bar Counter 98.1
Restaurant 96.6
Musician 95
Musical Instrument 95
Food Court 79.7
Food 79.7
Leisure Activities 78.3
Meal 72.2
Sitting 59.7
Music Band 58.8
Room 58.5

Clarifai
created on 2023-10-15

people 100
group 99.6
adult 99.2
man 98.7
group together 97.1
child 96.4
woman 96
three 95.2
four 93.3
portrait 92.7
music 91.9
sit 91.4
monochrome 91.4
room 91
two 90.6
several 90.3
furniture 89.9
administration 89.6
boy 89
recreation 89

Imagga
created on 2021-12-14

man 33.6
people 25.7
male 25.6
person 23
adult 22.9
black 22.5
portrait 19.4
couple 17.4
men 15.5
kin 15.5
love 13.4
business 13.4
sexy 12.9
attractive 12.6
world 12.4
face 12.1
room 12
barbershop 11.8
suit 11.7
human 11.2
dark 10.9
musical instrument 10.7
sitting 10.3
women 10.3
happiness 10.2
happy 10
hand 9.9
fashion 9.8
pretty 9.8
businessman 9.7
home 9.6
elegant 9.4
shop 9.3
indoor 9.1
silhouette 9.1
sensual 9.1
handsome 8.9
style 8.9
hair 8.7
smile 8.6
two 8.5
passion 8.5
old 8.4
leisure 8.3
vintage 8.3
holding 8.3
office 8.2
romantic 8
interior 8
lifestyle 8
together 7.9
head 7.6
mercantile establishment 7.4
inside 7.4
window 7.3
20s 7.3
stringed instrument 7.3
smiling 7.2
dress 7.2
music 7.2
family 7.1
model 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 98.8%
Happy 97.3%
Calm 1.1%
Surprised 0.9%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Male, 99.3%
Calm 97.6%
Angry 0.7%
Surprised 0.7%
Sad 0.4%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 36-54
Gender Male, 98.2%
Happy 66.6%
Calm 17.2%
Angry 11.1%
Disgusted 2.9%
Surprised 0.8%
Sad 0.7%
Fear 0.4%
Confused 0.3%

AWS Rekognition

Age 22-34
Gender Male, 98.7%
Calm 98.5%
Sad 0.7%
Angry 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 23-35
Gender Male, 58%
Sad 40.8%
Calm 31.1%
Happy 17.3%
Fear 3.6%
Angry 3.3%
Confused 1.9%
Surprised 1%
Disgusted 1%

AWS Rekognition

Age 22-34
Gender Female, 73.9%
Sad 24.2%
Calm 22.5%
Fear 19.1%
Confused 12.4%
Happy 11.8%
Angry 4.1%
Surprised 3%
Disgusted 2.9%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 53
Gender Male

Microsoft Cognitive Services

Age 55
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

people portraits 97.7%
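The machine-generated tags above are flat (label, confidence) pairs, with some labels repeated at different scores. As a minimal sketch of how such a list might be post-processed (the function name, the 95.0 threshold, and the abbreviated tag list are illustrative choices, not part of the original record), one could deduplicate labels and keep only high-confidence tags:

```python
# Minimal sketch: filter machine-generated tags by confidence.
# Tag data is transcribed from the Amazon list above (abbreviated);
# the threshold and function name are illustrative assumptions.

def high_confidence_tags(tags, threshold=95.0):
    """Return (label, score) pairs at or above threshold, deduplicated
    by label (first occurrence wins), sorted by descending score."""
    seen = {}
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen[label] = score
    return sorted(seen.items(), key=lambda kv: -kv[1])

amazon_tags = [
    ("Person", 99.7), ("Human", 99.7), ("Indoors", 99.5),
    ("Interior Design", 99.5), ("Person", 99.4), ("Person", 99.1),
    ("Pub", 98.7), ("Bar Counter", 98.1), ("Restaurant", 96.6),
    ("Musician", 95.0), ("Musical Instrument", 95.0), ("Food", 79.7),
]

print(high_confidence_tags(amazon_tags))
```

Deduplicating first-occurrence keeps each label's highest score here, since the services list repeated labels in descending confidence order.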