Human Generated Data

Title

Untitled (The Great Parade, Arles)

Date

1955

People

Artist: Lucien Clergue, French, 1934–2014

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, The Willy and Charlotte Reber Collection, Gift of Charlotte Reber, P1995.246.3

Copyright

© Lucien Clergue Estate / Artists Rights Society (ARS), New York, NY / SAIF, Paris

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.7
Person 99.7
Person 99.7
Person 99.7
Wheel 98.9
Machine 98.9
Person 98.7
Person 93.5
Bike 93.5
Bicycle 93.5
Transportation 93.5
Vehicle 93.5
Bicycle 92.3
Apparel 60.2
Clothing 60.2
Carriage 59.3

Clarifai
created on 2019-11-16

people 99.9
group 98.9
adult 98.2
vehicle 97.8
man 96.8
group together 96.7
two 94.7
one 94.7
transportation system 93.8
carriage 93.3
cavalry 92.6
wagon 91.6
many 91.2
print 89.1
street 88.9
engraving 88.2
home 87.1
wear 85.6
three 84.2
woman 83.8

Imagga
created on 2019-11-16

wheeled vehicle 100
tricycle 100
vehicle 82.8
wheelchair 78.8
chair 51.4
conveyance 51.4
seat 38.7
wheel 31.1
bicycle 27.5
old 26.5
transportation 26
street 24.8
bike 23.4
transport 21
furniture 19.7
carriage 19
outdoors 17.9
disabled 17.8
outside 16.3
man 16.1
care 15.6
ride 15.5
senior 14.1
cycle 13.7
people 13.4
city 13.3
help 13
handicapped 12.8
disability 12.8
road 12.6
outdoor 12.2
travel 12
park 11.5
illness 11.4
male 11.4
health 11.1
support 10.3
furnishing 10.2
active 9.9
invalid 9.9
mobility 9.8
wheels 9.8
medical 9.7
retired 9.7
sick 9.7
urban 9.6
horse 9.5
hospital 9.4
adult 9.1
aged 9.1
equipment 9
activity 9
person 8.9
impairment 8.9
pedal 8.9
antique 8.7
elderly 8.6
historic 8.3
tourism 8.2
sport 8.2
cart 8.2
tourist 8.2
handicap 7.9
recovery 7.8
summer 7.7
wall 7.7
retirement 7.7
husband 7.6
wife 7.6
historical 7.5
mature 7.4
town 7.4
exterior 7.4
vacation 7.4
lifestyle 7.2
sea 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.9
bicycle 95.6
wheel 92.3
outdoor 90.5
person 89.5
land vehicle 86.4
clothing 86.1
bicycle wheel 84.3
vehicle 82.8
man 75.9
transport 67.7
white 60
old 55.4
cart 54.3
horse-drawn vehicle 53.3
family 15.2

Face analysis

Amazon

AWS Rekognition

Age 13-23
Gender Female, 54.6%
Fear 45%
Happy 45%
Confused 45%
Calm 45.4%
Disgusted 45%
Sad 46.5%
Angry 53%
Surprised 45%

AWS Rekognition

Age 5-15
Gender Male, 54.8%
Happy 45.1%
Confused 45.2%
Calm 50.9%
Angry 47.5%
Disgusted 45.6%
Surprised 45.1%
Fear 45.1%
Sad 45.5%

AWS Rekognition

Age 10-20
Gender Female, 54%
Angry 45.3%
Sad 54.4%
Calm 45.1%
Disgusted 45%
Confused 45.1%
Surprised 45%
Fear 45.1%
Happy 45%

AWS Rekognition

Age 15-27
Gender Female, 53.4%
Confused 45.1%
Angry 45.7%
Surprised 45.1%
Calm 53.6%
Disgusted 45%
Fear 45%
Sad 45.5%
Happy 45%

AWS Rekognition

Age 16-28
Gender Female, 54.5%
Fear 45.1%
Angry 45.3%
Happy 45%
Disgusted 45.1%
Confused 45.2%
Calm 49.2%
Sad 50%
Surprised 45.1%

Feature analysis

Amazon

Person 99.7%
Wheel 98.9%
Bicycle 93.5%

Captions

Microsoft

a person riding on the back of a bicycle 63.1%
a person riding a horse in front of a building 63%
a black and white photo of a person 62.9%

Text analysis

Amazon

Ake