Human Generated Data

Title

[California]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.134.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Human 98.8
Person 98.8
Face 85.8
Musical Instrument 77.5
Musician 77.5
Finger 72.4
Clothing 70.2
Apparel 70.2
Performer 69.5
Leisure Activities 69.5
Guitar 69.5
Guitarist 69.5
Meal 67.9
Food 67.9
Coat 65.5
Overcoat 65.5
Suit 65.5
Photography 63.6
Portrait 63.6
Photo 63.6
Train 62.3
Transportation 62.3
Vehicle 62.3
Pianist 56.4
Piano 56.4

Clarifai
created on 2021-04-03

people 99.5
street 99.3
portrait 99.1
man 98.2
adult 97.6
one 97.3
monochrome 97.2
vehicle window 96.8
window 96.3
vehicle 90.4
light 90.2
car 89.7
woman 88.4
model 86
boy 86
wedding 85.2
transportation system 83.3
rain 83.3
city 83.2
old 82.1

Imagga
created on 2021-04-03

call 36.8
man 23.5
person 20.3
people 18.4
happy 18.2
adult 18.1
passenger 18
male 17.7
smile 17.1
car 16.8
portrait 15.5
old 14.6
vehicle 13.3
smiling 13
business 12.7
love 11.8
automobile 11.5
door 11.2
one 11.2
looking 11.2
work 11
happiness 11
building 10.9
lifestyle 10.8
driver 10.7
face 10.6
working 10.6
outdoors 10.4
home 10.4
black 10.2
inside 10.1
window 10.1
house 10
pretty 9.8
job 9.7
couple 9.6
office 9.4
worker 9.4
cute 9.3
travel 9.1
attractive 9.1
hair 8.7
architecture 8.6
sitting 8.6
men 8.6
senior 8.4
vintage 8.3
cheerful 8.1
blond 8.1
transportation 8.1
light 8
interior 8
equipment 7.9
telephone 7.8
auto 7.7
child 7.6
drive 7.6
wood 7.5
human 7.5
mature 7.4
transport 7.3
dress 7.2
glass 7.2
history 7.2
businessman 7.1

Microsoft
created on 2021-04-03

black and white 95.2
man 92.6
person 90
clothing 80.9
monochrome 74.1
text 73.5
human face 70.2

Face analysis

Amazon

Google

AWS Rekognition

Age 23-37
Gender Female, 85.5%
Calm 98.4%
Sad 1.4%
Angry 0.1%
Happy 0%
Surprised 0%
Fear 0%
Disgusted 0%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Train 62.3%

Captions

Microsoft

a man standing in front of a mirror 65.5%
a man standing in front of a mirror posing for the camera 54.9%
a man standing in front of a door 54.8%