Human Generated Data

Title

[Charles Ross and Julia Feininger, Mills College, Oakland, California]

Date

1936 or 1937

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.168.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 99.2
Human 99.2
Person 98.6
Clothing 93.6
Apparel 93.6
Sitting 90.3
Banister 67.6
Handrail 67.6
Finger 65.5
Shorts 63.8
Silhouette 55.1

Clarifai
created on 2021-04-04

people 99.8
child 96.5
art 95.6
boy 94.8
man 94.6
street 94.1
wear 94
two 93.5
monochrome 90.8
dancer 87.6
adult 85.6
group 84
portrait 83.2
step 82.2
group together 81.9
theater 79.3
one 79.2
recreation 78.2
bench 77.6
music 77.2

Imagga
created on 2021-04-04

man 34.3
musical instrument 31.7
person 27.5
adult 26.2
male 24.9
people 24.5
wind instrument 24.2
accordion 23.8
business 23.1
world 20.6
keyboard instrument 20
businessman 19.4
couple 16.6
suit 16.5
scholar 16.4
sitting 16.3
professional 16.2
corporate 15.5
women 14.2
executive 14.1
office 13.8
working 13.3
intellectual 13.1
attractive 12.6
laptop 12.1
men 12
looking 12
work 11.8
handsome 11.6
black 11.6
career 11.4
building 11.2
two 11
chair 10.8
silhouette 10.8
smile 10.7
job 10.6
together 10.5
computer 10.5
technology 10.4
portrait 10.4
happy 10
fashion 9.8
love 9.5
room 9.4
relax 9.3
dark 9.2
leisure 9.1
indoor 9.1
alone 9.1
home 8.8
free-reed instrument 8.7
standing 8.7
lifestyle 8.7
happiness 8.6
businesspeople 8.5
friends 8.5
notebook 8.4
harmonica 8.3
one 8.2
businesswoman 8.2
worker 8.1
group 8.1
success 8
boy 7.8
pretty 7.7
boss 7.7
casual 7.6
passion 7.5
passenger 7.5
manager 7.4
holding 7.4
dress 7.2
romance 7.1
interior 7.1

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

clothing 95.7
black and white 94.4
person 92.8
man 81.1
text 75.4
monochrome 69.4

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 87.6%
Calm 70.3%
Sad 27.4%
Fear 0.6%
Confused 0.5%
Happy 0.4%
Angry 0.4%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 38-56
Gender Male, 92.2%
Sad 69.4%
Calm 30.1%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man and a woman sitting on a bench in front of a building 55.7%
a man and a woman sitting on a bench 55.6%
a man and woman sitting on a bench in front of a building 50.4%