Human Generated Data

Title

[Andreas Feininger]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.290.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.1
Human 99.1
Vehicle 93.3
Transportation 93.3
Machine 92
Wheel 92
Outdoors 81.4
Car 80.8
Automobile 80.8
Nature 78.4
Spoke 76.4
Countryside 67.2
Driving 66.9
Apparel 64.8
Clothing 64.8
Tire 63.6
Steering Wheel 58.8
Sitting 55.9
Convertible 55.8
Military Uniform 55.6
Officer 55.6
Military 55.6
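
The label/confidence pairs above have the shape of Amazon Rekognition object-label output. A minimal sketch of how such tags can be requested with boto3; the filename, region, and thresholds are illustrative assumptions, not details of the museum's actual tagging pipeline.

```python
import boto3

# Region, filename, and thresholds are assumptions, not documented settings.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("feininger_photo.jpg", "rb") as f:  # placeholder local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,
)

# Each label carries a name and a confidence score on a 0-100 scale,
# matching the values listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```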

Clarifai
created on 2019-05-29

people 99.9
adult 99.7
vehicle 97.9
one 97.8
transportation system 97.8
portrait 97.1
man 96.1
monochrome 95.1
woman 94.2
facial expression 93.6
wear 92.6
two 91.9
car 88.1
administration 87.5
veil 86.8
street 84.2
lid 84
child 83.4
group together 80
leader 79.2
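
Clarifai's concept scores are returned in the range 0-1 by its predict endpoint and appear here scaled to percent. A hedged sketch against the v2 REST API; the API key, model ID, and image URL are placeholders.

```python
import requests

API_KEY = "<clarifai-api-key>"    # placeholder
MODEL_ID = "<general-model-id>"   # placeholder for Clarifai's general model
IMAGE_URL = "https://example.org/feininger_photo.jpg"  # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a value in [0, 1]; multiply by 100 to match the listing.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```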

Imagga
created on 2019-05-29

brass 40.6
trombone 38.7
car 36.6
wind instrument 31.7
vehicle 27
automobile 23
musical instrument 22.6
man 21.5
auto 18.2
drive 18
people 17.8
person 16.9
male 16.6
road 16.2
adult 16.2
driving 15.5
passenger 15.4
outdoors 14.5
driver 13.6
transportation 13.4
happy 11.9
motor 11.6
smiling 11.6
device 10.9
smile 10.7
wheel 10.4
sitting 10.3
happiness 10.2
outdoor 9.9
engine 9.6
couple 9.6
joy 9.2
travel 9.1
cornet 9.1
water 8.7
outside 8.6
park 8.3
sky 8.3
transport 8.2
hood 7.9
model 7.8
pretty 7.7
old 7.7
winter 7.7
senior 7.5
motor vehicle 7.5
emotion 7.4
color 7.2
lifestyle 7.2
black 7.2
looking 7.2
posing 7.1
love 7.1
day 7.1
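
Imagga scores tags on a 0-100 scale via its /v2/tags endpoint. A sketch using HTTP Basic authentication; the key, secret, and image URL are placeholders.

```python
import requests

API_KEY = "<imagga-api-key>"        # placeholder
API_SECRET = "<imagga-api-secret>"  # placeholder
IMAGE_URL = "https://example.org/feininger_photo.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} entries.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```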

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 93.6
outdoor 91.5
boat 84.1
water 79.3
vehicle 79
watercraft 75.7
old 74.5
clothing 63.5
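
Microsoft's tags match the output of the Azure Computer Vision Analyze Image operation. A sketch against the current REST path (the 2019 run would have used an earlier API version); the endpoint, key, and filename are placeholders.

```python
import requests

ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder

with open("feininger_photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Confidence is reported in [0, 1]; the listing above shows it scaled to percent.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```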

Face analysis

AWS Rekognition

Age 35-55
Gender Male, 99.3%
Sad 0%
Disgusted 0%
Surprised 0.1%
Calm 0%
Angry 0%
Confused 0%
Happy 99.8%
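
The age range, gender, and emotion percentages above correspond to Amazon Rekognition's face attributes. A minimal boto3 sketch, again with an assumed region and a placeholder local filename.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("feininger_photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```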

Microsoft Cognitive Services

Age 48
Gender Male
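
The single age/gender estimate is the kind of result the Azure Face API's detect operation returned through its age and gender attributes (Microsoft has since restricted access to these attributes). A hedged sketch of that call; the endpoint, key, and filename are placeholders.

```python
import requests

FACE_ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
FACE_KEY = "<subscription-key>"                                        # placeholder

with open("feininger_photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

resp = requests.post(
    f"{FACE_ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": FACE_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')
```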

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely
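
Google reports face attributes as likelihood buckets rather than percentages, which is why these read as "Very unlikely" / "Very likely". A sketch with the google-cloud-vision client library, assuming application-default credentials and a placeholder local copy of the image.

```python
from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is configured; the filename is a placeholder.
client = vision.ImageAnnotatorClient()

with open("feininger_photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum buckets (VERY_UNLIKELY ... VERY_LIKELY), not scores.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```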

Feature analysis

Amazon

Person 99.1%
Wheel 92%
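
The feature-analysis entries (Person, Wheel) correspond to Rekognition labels that also carry located instances with bounding boxes. A sketch of reading those instances from the same detect_labels call shown earlier; coordinates are ratios of image width and height, and the filename and region remain placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("feininger_photo.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

# Only some labels (e.g. Person, Wheel) include per-instance bounding boxes.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f}, '
              f'width={box["Width"]:.2f}, height={box["Height"]:.2f})')
```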
