Human Generated Data

Title

Untitled (Gypsies at the Church, Cannes)

Date

1958

People

Artist: Lucien Clergue, French, 1934–2014

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, The Willy and Charlotte Reber Collection, Gift of Charlotte Reber, P1995.84

Copyright

© Lucien Clergue Estate / Artists Rights Society (ARS), New York, NY / SAIF, Paris

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 97.7
Human 97.7
Sitting 92.7
Person 92.6
Furniture 90.4
Restaurant 87.2
Cafe 83
Meal 80.5
Food 80.5
Couch 75.9
Person 75.6
Cafeteria 74.6
Person 72.1
Dish 71.2
Chair 69.4
Clothing 67.5
Apparel 67.5
Person 65
Sleeve 64.8
Pub 64
Finger 61.4
Leisure Activities 59
Bar Counter 58.9
Table 56.2
Hair 55.7
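
The Amazon tags above are the kind of label-and-confidence pairs returned by AWS Rekognition's DetectLabels operation. A minimal Python sketch with boto3, assuming a local copy of the image; the file name and confidence cutoff are illustrative, not part of the record:

```python
# Minimal sketch: object/scene labels from AWS Rekognition DetectLabels.
# "clergue_untitled_1958.jpg" and MinConfidence=55 are illustrative assumptions.
import boto3

client = boto3.client("rekognition")

with open("clergue_untitled_1958.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the lowest score shown in the list above
    )

for label in response["Labels"]:
    # Each label carries a name and a confidence percentage.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```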

Clarifai
created on 2023-10-25

people 99.9
drum 99.7
drummer 99.7
music 99.6
percussion instrument 99.6
musician 99.1
instrument 98.5
group 97.1
band 96.8
adult 96.5
jazz 96
woman 95.2
child 94
monochrome 93.9
man 93.7
street 92.5
two 91.6
guitar 89.5
three 89.4
wear 88.2
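
Concept scores like the Clarifai list above can be requested from Clarifai's general image recognition model over its REST API. A minimal sketch with requests; the access token, model identifier, and image URL are placeholders and may need the full community model path:

```python
# Minimal sketch: concepts from Clarifai's v2 REST API.
# PAT, MODEL_ID, and IMAGE_URL are placeholders/assumptions.
import requests

PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"  # assumed public model identifier
IMAGE_URL = "https://example.org/clergue_untitled_1958.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 probabilities; scale to match the list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```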

Imagga
created on 2022-01-14

person 32.1
man 30.9
adult 27
people 26.2
male 23.1
hairdresser 21.2
lifestyle 21
musical instrument 19.9
guitar 18.6
indoors 18.5
stringed instrument 18.3
sitting 15.5
attractive 15.4
couple 14.8
music 14.4
happy 14.4
home 14.4
women 14.2
together 14
smiling 13.7
smile 13.5
happiness 13.3
black 13.2
musician 12.9
sexy 12.9
chair 12.3
indoor 11.9
patient 11.9
playing 11.9
handsome 11.6
room 11.5
salon 11.5
pretty 11.2
men 11.2
drinking 10.5
teacher 10.3
coffee 10.2
dress 9.9
family 9.8
interior 9.7
group 9.7
table 9.5
relationship 9.4
device 9.3
business 9.1
portrait 9.1
suit 9
clothing 8.9
computer 8.8
hair 8.7
love 8.7
face 8.5
adults 8.5
togetherness 8.5
casual 8.5
professional 8.5
acoustic guitar 8.3
leisure 8.3
occupation 8.2
fun 8.2
sick person 8.1
case 8
guitarist 7.9
performer 7.8
education 7.8
microphone 7.7
youth 7.7
child 7.7
oboe 7.7
husband 7.6
two 7.6
student 7.6
studio 7.6
females 7.6
sit 7.6
fashion 7.5
instrument 7.5
style 7.4
focus 7.4
entertainment 7.4
lady 7.3
romantic 7.1
job 7.1
work 7.1
businessman 7.1
model 7
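
The Imagga tags above match the shape of the response from Imagga's /v2/tags endpoint. A minimal sketch with requests; the API credentials and image URL are placeholders:

```python
# Minimal sketch: tags from Imagga's /v2/tags endpoint with HTTP basic auth.
# API_KEY, API_SECRET, and IMAGE_URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/clergue_untitled_1958.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
for tag in response.json()["result"]["tags"]:
    # Each entry has a confidence score and a language-keyed tag name.
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```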

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

person 99.9
wall 97.4
indoor 94.8
clothing 94
woman 90.3
human face 70.8
black and white 58.2
cooking 20.3
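
Tag-and-confidence output like the Microsoft list above is what Azure Computer Vision (the Microsoft Cognitive Services image service) returns from its tagging operation. A minimal sketch with the Azure Python SDK; the endpoint, key, and image URL are placeholders:

```python
# Minimal sketch: image tagging with Azure Computer Vision.
# ENDPOINT, KEY, and IMAGE_URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/clergue_untitled_1958.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)

for tag in result.tags:
    # Confidence is 0-1; scale to match the percentages above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```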

Face analysis

AWS Rekognition

Age 2-10
Gender Female, 98.9%
Happy 98.6%
Calm 0.5%
Surprised 0.3%
Angry 0.2%
Fear 0.2%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%

AWS Rekognition

Age 43-51
Gender Female, 83.8%
Calm 87.6%
Angry 5.5%
Confused 3.7%
Surprised 1.6%
Fear 0.7%
Happy 0.4%
Disgusted 0.4%
Sad 0.1%
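
The two age/gender/emotion blocks above resemble the per-face output of Rekognition's DetectFaces operation when all attributes are requested. A minimal boto3 sketch; the image file name is an assumption:

```python
# Minimal sketch: per-face attributes from AWS Rekognition DetectFaces.
# The local file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("clergue_untitled_1958.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back as a list of type/confidence pairs; sort high to low.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```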

Microsoft Cognitive Services

Age 19
Gender Female
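
Single age and gender estimates like this one historically came from the Azure Face API. A minimal sketch with the Azure Face Python SDK; the endpoint, key, and image URL are placeholders, and Microsoft has since restricted access to these attributes, so this is illustrative only:

```python
# Minimal sketch: face detection with age/gender attributes via the Azure Face API.
# ENDPOINT, KEY, and IMAGE_URL are placeholders; attribute availability has since changed.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/clergue_untitled_1958.jpg"

client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))
faces = client.face.detect_with_url(IMAGE_URL, return_face_attributes=["age", "gender"])

for face in faces:
    # Age is a float estimate; gender is an enum/string value.
    print("Age", face.face_attributes.age, "Gender", face.face_attributes.gender)
```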

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
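
The three likelihood blocks above (one per detected face) follow the format of Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision client; the image URI is a placeholder:

```python
# Minimal sketch: face likelihood ratings from Google Cloud Vision.
# The image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/clergue_untitled_1958.jpg")
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each rating is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```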

Feature analysis

Amazon

Person 97.7%
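
The feature-analysis person score corresponds to per-instance detections in Rekognition's DetectLabels response, where labels such as Person include bounding boxes. A minimal boto3 sketch reusing the earlier call; the file name and threshold are assumptions:

```python
# Minimal sketch: per-instance person detections (bounding boxes) from
# AWS Rekognition DetectLabels. File name and MinConfidence are placeholders.
import boto3

client = boto3.client("rekognition")

with open("clergue_untitled_1958.jpg", "rb") as f:  # hypothetical local file
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            # Bounding box values are ratios of image width/height.
            box = instance["BoundingBox"]
            print(f"Person {instance['Confidence']:.1f}% at "
                  f"left={box['Left']:.2f}, top={box['Top']:.2f}, "
                  f"width={box['Width']:.2f}, height={box['Height']:.2f}")
```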