Human Generated Data

Title

[Crowd waiting for "Tour de France" cyclists to pass, Quimper, Brittany]

Date

1931

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.183.28

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Machine 99.4
Wheel 99.4
Person 99.3
Human 99.3
Person 96.9
Person 94.5
Person 93.2
Person 90.3
Person 90.1
Person 88.6
Person 87.1
Person 86.2
Wheel 83.9
Vehicle 78.2
Transportation 78.2
Person 70.8
Person 61.3
Apparel 56.7
Clothing 56.7
Person 56.4
Spoke 55.5
Car 55.2
Automobile 55.2
Person 41.6

Clarifai
created on 2019-11-18

people 99.9
group together 99.4
many 99.2
group 98.9
adult 98.2
vehicle 97.7
woman 95.6
man 95.2
administration 93.2
transportation system 91.8
leader 91.5
several 90.7
war 90.5
military 87.1
one 85
two 84.1
crowd 83.5
wear 82
monochrome 80.9
recreation 80.7

Imagga
created on 2019-11-18

man 18.1
people 17.8
pillory 17.3
black 17.2
instrument of punishment 14.5
person 14.5
device 14.4
adult 14.2
city 14.1
urban 12.2
business 12.1
instrument 11.9
car 11.6
old 11.1
youth 10.2
world 10
male 10
transportation 9.9
portrait 9.7
architecture 9.5
boat 9
technology 8.9
style 8.9
working 8.8
sitting 8.6
model 8.5
travel 8.4
outdoor 8.4
building 8.2
vehicle 8.2
music 8.1
musical instrument 8
job 8
newspaper 8
lifestyle 7.9
counter 7.9
women 7.9
work 7.8
modern 7.7
fashion 7.5
vintage 7.5
leisure 7.5
stall 7.4
retro 7.4
street 7.4
light 7.3
transport 7.3
laptop 7.3
computer 7.2
suit 7.2
looking 7.2
worker 7.1

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

black and white 96.3
text 94.9
person 83.3
monochrome 71.2
vehicle 64.7
street 60.8
white 60.5
land vehicle 57.2

Face analysis

Amazon

AWS Rekognition

Age 14-26
Gender Female, 50.2%
Disgusted 49.9%
Calm 49.5%
Confused 49.5%
Sad 49.9%
Happy 49.5%
Angry 49.6%
Surprised 49.5%
Fear 49.6%

AWS Rekognition

Age 47-65
Gender Male, 50.4%
Fear 49.5%
Angry 49.5%
Surprised 49.5%
Happy 49.5%
Confused 49.6%
Disgusted 50%
Calm 49.6%
Sad 49.7%

AWS Rekognition

Age 43-61
Gender Male, 50.2%
Angry 49.5%
Disgusted 49.6%
Surprised 49.5%
Fear 49.5%
Confused 49.8%
Happy 49.5%
Calm 49.5%
Sad 50%

AWS Rekognition

Age 12-22
Gender Male, 50.4%
Calm 49.5%
Disgusted 49.5%
Confused 49.5%
Fear 49.5%
Surprised 49.5%
Angry 49.5%
Happy 49.5%
Sad 50.5%

AWS Rekognition

Age 20-32
Gender Female, 50.2%
Fear 50.3%
Disgusted 49.5%
Calm 49.5%
Confused 49.5%
Happy 49.6%
Angry 49.5%
Sad 49.5%
Surprised 49.6%

AWS Rekognition

Age 19-31
Gender Female, 50.2%
Angry 49.5%
Happy 49.5%
Calm 49.5%
Surprised 49.5%
Disgusted 49.5%
Fear 49.5%
Confused 49.5%
Sad 50.4%

AWS Rekognition

Age 24-38
Gender Female, 50.1%
Angry 49.6%
Surprised 49.5%
Sad 49.8%
Happy 49.6%
Disgusted 49.5%
Fear 49.5%
Calm 49.9%
Confused 49.5%

AWS Rekognition

Age 32-48
Gender Male, 50%
Happy 49.5%
Angry 49.5%
Confused 49.5%
Fear 49.6%
Disgusted 49.5%
Surprised 49.5%
Sad 50.3%
Calm 49.5%

AWS Rekognition

Age 20-32
Gender Female, 50.2%
Sad 49.6%
Angry 49.5%
Confused 49.5%
Disgusted 49.5%
Happy 49.8%
Fear 49.5%
Calm 49.9%
Surprised 49.5%

Feature analysis

Amazon

Wheel 99.4%
Person 99.3%

Text analysis

Amazon

LLLT