Human Generated Data

Title

[close-up of man]

Date

1931?

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.289.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.1
Person 99.1
Face 98.1
Accessories 95.4
Accessory 95.4
Glasses 95.4
Portrait 78.4
Photography 78.4
Photo 78.4
Apparel 75.2
Clothing 75.2
Man 63.5
People 62.8
Female 56.4
Studio 55.4

Clarifai
created on 2019-11-19

people 99.9
adult 98.7
one 97.8
two 97
group 96.9
man 94.9
wear 94.8
portrait 93.4
music 93.4
woman 93
group together 91.1
administration 90.2
vehicle 88.5
musician 88
three 87.8
child 86.9
war 85.1
outfit 84.6
singer 84.5
leader 82.8

Imagga
created on 2019-11-19

man 34.3
male 27.8
people 26.2
adult 22.8
person 22.6
black 22.3
portrait 19.4
men 18
face 14.9
couple 14.8
looking 13.6
casual 13.6
musical instrument 13.5
professional 13
call 12.5
lifestyle 12.3
smile 12.1
hair 11.9
love 11.8
head 11.8
business 11.5
attractive 11.2
work 11
music 10.9
groom 10.6
office 10.6
brunette 10.5
day 10.2
smiling 10.1
wind instrument 10.1
happy 10
dark 10
hand 9.9
fashion 9.8
computer 9.7
sexy 9.6
guy 9.6
women 9.5
worker 9.3
human 9
device 9
concert 8.7
brass 8.7
musician 8.7
boy 8.7
bride 8.6
happiness 8.6
life 8.6
youth 8.5
pretty 8.4
one 8.2
style 8.2
look 7.9
art 7.8
clothing 7.7
laptop 7.5
fun 7.5
telephone 7.4
wedding 7.4
dress 7.2
working 7.1
businessman 7.1
indoors 7
glass 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

clothing 98.1
person 96.9
human face 94.5
man 91.3
black and white 87.6
concert 80.1
text 64.6

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-37
Gender Male, 80.7%
Confused 0.6%
Happy 58.4%
Calm 38.5%
Disgusted 0.3%
Sad 1%
Fear 0.3%
Angry 0.4%
Surprised 0.6%

AWS Rekognition

Age 22-34
Gender Male, 90.9%
Calm 7.1%
Surprised 0.1%
Happy 38%
Confused 0.3%
Disgusted 0.2%
Fear 2.8%
Angry 0.5%
Sad 51%

AWS Rekognition

Age 44-62
Gender Male, 95.8%
Fear 9.3%
Angry 1.3%
Calm 2.9%
Surprised 9.7%
Confused 1.1%
Happy 73.2%
Disgusted 1%
Sad 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions