Human Generated Data

Title

[Andreas and Julia Feininger in two-seater]

Date

1931?

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.289.7

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 99.9
Human 99.9
Person 99.7
Automobile 95.4
Transportation 95.4
Vehicle 95.4
Car 95.4
Shoe 94.6
Apparel 94.6
Footwear 94.6
Clothing 94.6
Wheel 91.6
Machine 91.6
Person 90.7
Wheel 89.3
Person 87.5
Person 86.7
Person 85.3
Truck 62.7
Bumper 59.6
Fire Truck 59.3
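
These labels match the shape of output from the AWS Rekognition DetectLabels API: a label name paired with a confidence score. A minimal sketch of how comparable tags could be generated with boto3; the image path and region are hypothetical placeholders, and this is an illustration rather than the museum's documented pipeline:

```python
import boto3

# Assumes AWS credentials are configured; the region is a placeholder.
client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("feininger_two_seater.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence scores,
# comparable to the list above (Person, Car, Wheel, ...).
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```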

Clarifai
created on 2019-11-19

people 99.9
group together 99.3
adult 99.1
vehicle 98.9
group 98.8
transportation system 97.1
man 96.1
two 93.3
woman 92.9
four 92.3
child 92.2
one 91.9
administration 91.7
street 90.9
three 90.8
several 89.9
boy 88.4
wear 87
many 86.5
five 83.6

Imagga
created on 2019-11-19

musical instrument 42.5
sidecar 27.5
conveyance 26.7
percussion instrument 25.4
street 22.1
drum 21.9
city 20.8
man 20.2
people 19
vehicle 18.4
accordion 17.9
building 17.6
urban 15.7
road 15.4
male 14.9
old 14.6
keyboard instrument 14.4
wheelchair 14
chair 14
work 13.4
transportation 12.5
adult 12.4
outdoors 12
snow 11.9
person 11.2
bench 11.1
wind instrument 11
architecture 10.9
danger 10.9
seat 10.8
machine 10.4
cold 10.3
men 10.3
working 9.7
women 9.5
steel drum 9.4
lifestyle 9.4
safety 9.2
outdoor 9.2
winter 8.5
wheel 8.5
travel 8.4
park 8.2
industrial 8.2
together 7.9
portrait 7.8
sitting 7.7
two 7.6
dangerous 7.6
drive 7.6
equipment 7.5
help 7.4
car 7.2
activity 7.2
day 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

outdoor 98.2
person 94.5
black and white 92.8
clothing 90.9
street 90
man 88.3
vehicle 78.6
land vehicle 74.8
parked 69
footwear 66.8
waste container 59.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 4-12
Gender Male, 54.2%
Happy 45%
Sad 47.1%
Angry 46.1%
Confused 48.2%
Surprised 45.2%
Fear 45.1%
Disgusted 45.2%
Calm 48.1%

AWS Rekognition

Age 33-49
Gender Male, 52.4%
Angry 46%
Happy 46.9%
Surprised 45.3%
Sad 48.5%
Disgusted 45.2%
Calm 47.1%
Fear 45.6%
Confused 45.4%

AWS Rekognition

Age 33-49
Gender Female, 54.7%
Angry 45.1%
Confused 45.2%
Fear 45%
Surprised 45%
Disgusted 45%
Calm 45.4%
Sad 54%
Happy 45.2%

AWS Rekognition

Age 25-39
Gender Female, 51.5%
Calm 49.2%
Fear 45.2%
Happy 45.2%
Angry 45.3%
Disgusted 45.1%
Surprised 45.1%
Sad 49.8%
Confused 45.2%

AWS Rekognition

Age 6-16
Gender Male, 50.2%
Disgusted 49.5%
Angry 49.7%
Sad 49.9%
Calm 49.8%
Fear 49.5%
Happy 49.5%
Confused 49.5%
Surprised 49.5%
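
The age range, gender, and per-emotion confidence fields above correspond to AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, again assuming boto3 and a hypothetical local image file:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("feininger_two_seater.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates
# for every detected face, matching the fields listed above.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```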

Feature analysis

Amazon

Person 99.9%
Car 95.4%
Shoe 94.6%
Wheel 91.6%