Human Generated Data

Title

Untitled

Date

c. 1954-c. 1955

People

Artist: Lucien Clergue, French, 1934–2014

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, The Willy and Charlotte Reber Collection, Gift of Charlotte Reber, P1995.246.8

Copyright

© Lucien Clergue Estate / Artists Rights Society (ARS), New York, NY / SAIF, Paris

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.7
Human 99.7
Person 99.7
Person 99.5
Vehicle 99.2
Wagon 99.2
Transportation 99.2
Person 99.2
Person 98.2
Carriage 97.2
Machine 95.1
Wheel 95.1
Bicycle 93.4
Bike 93.4
Horse Cart 91.1
Wheel 86.7
Buggy 55.3

Clarifai
created on 2019-11-16

people 99.9
vehicle 97.8
adult 97.7
two 97.1
man 97
group 95.4
three 94.6
transportation system 93.3
group together 93.2
one 91.5
four 90.8
child 90.2
military 88.4
monochrome 87.5
administration 87.1
war 84.7
home 83.3
chair 82.7
print 82
soldier 81.9

Imagga
created on 2019-11-16

vehicle 100
tricycle 100
wheeled vehicle 100
conveyance 66.9
old 31.4
bicycle 26.3
street 23
bike 21.5
city 20.8
wheel 20.8
wheelchair 19
transportation 16.1
travel 15.5
man 15.5
building 15.1
transport 14.6
road 14.5
chair 14.2
wall 13.7
urban 13.1
male 12.8
cycle 12.7
ride 12.6
architecture 12.5
people 11.7
antique 11.3
ancient 11.2
tourist 10.9
vintage 10.8
seat 10.5
outside 10.3
snow 10.1
window 10.1
outdoor 9.9
tourism 9.9
care 9.9
disabled 9.9
retro 9.8
riding 9.8
carriage 9.7
horse 9.5
cold 9.5
house 9.2
historic 9.2
sport 9.1
park 9.1
sick 8.7
health 8.3
outdoors 8.2
aged 8.1
active 8.1
activity 8.1
history 8.1
pedal 7.9
color 7.8
winter 7.7
stone 7.6
brick 7.5
historical 7.5
help 7.5
town 7.4
rural 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

outdoor 98
text 97.8
wheel 93.5
land vehicle 85
person 81.8
vehicle 78.7
tire 54.9
old 54
cart 39.3

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 13-23
Gender Female, 51.6%
Surprised 45%
Sad 54.4%
Calm 45.3%
Happy 45%
Confused 45.1%
Disgusted 45%
Fear 45.1%
Angry 45.2%

AWS Rekognition

Age 20-32
Gender Female, 53.2%
Fear 45.1%
Calm 46.1%
Sad 46.7%
Angry 51.8%
Disgusted 45.1%
Happy 45%
Surprised 45%
Confused 45.1%

AWS Rekognition

Age 9-19
Gender Male, 53.5%
Sad 47.6%
Surprised 45%
Angry 51.6%
Calm 45.7%
Happy 45%
Fear 45.1%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 12-22
Gender Male, 54.4%
Happy 45.3%
Sad 46.5%
Angry 47.7%
Calm 46.2%
Fear 47.7%
Surprised 46.2%
Disgusted 45.1%
Confused 45.4%

AWS Rekognition

Age 12-22
Gender Female, 54.7%
Calm 45.5%
Fear 45%
Angry 45.2%
Disgusted 45%
Sad 54.2%
Happy 45%
Surprised 45%
Confused 45.1%

Feature analysis

Amazon

Person 99.7%
Wheel 95.1%
Bicycle 93.4%

Categories

Text analysis

Amazon

ALE