Human Generated Data

Title

[Julia Feininger and Hermann Klumpp]

Date

early 1930s

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.291.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Human 99.5
Person 99.5
Person 99.1
Food 74
Meal 74
Dish 74
Face 70
Finger 68.1
Photography 60.8
Photo 60.8
Portrait 60.8
Bar Counter 56.1
Pub 56.1
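
These labels are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal Python sketch with boto3, assuming local AWS credentials; the filename, region, and thresholds are placeholders, not taken from the museum record:

```python
# Hedged sketch: reproduce Rekognition-style label tags for a local image file.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("scan.jpg", "rb") as f:  # placeholder filename
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,  # the list above bottoms out around 56%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```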

Clarifai
created on 2019-05-29

people 99.9
adult 99.5
two 99
man 98.8
one 97.2
portrait 96.4
group 95.3
wear 95.3
facial expression 95
monochrome 93.8
three 93.3
administration 92.8
furniture 92.7
music 89.2
group together 88.1
woman 87.5
sit 87.4
several 87.2
four 84.2
leader 83.2
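
Clarifai's general prediction model returns concept/score pairs like those above. A minimal sketch against its v2 REST API, assuming a placeholder API key and image URL; the model identifier shown is an assumption and may differ by account and API version:

```python
# Hedged sketch: query Clarifai's general model over its v2 REST API.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"           # placeholder
MODEL_ID = "general-image-recognition"      # assumed public general-model id
IMAGE_URL = "https://example.com/scan.jpg"  # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 score; the record above prints them as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```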

Imagga
created on 2019-05-29

sitting 33.5
man 32.4
people 32.4
adult 29.2
person 27.5
male 27.1
table 24.2
happy 23.8
smiling 22.4
drumstick 22.4
attractive 19.6
indoors 19.3
smile 19.3
couple 19.2
mature 18.6
work 18.1
business 17.6
stick 17.4
professional 15.7
lifestyle 15.2
job 15.1
restaurant 15
happiness 14.9
men 14.6
office 14.6
looking 14.4
child 14.4
home 14.4
stringed instrument 14.3
desk 14.2
working 14.2
together 14
musical instrument 13.8
senior 13.1
businesswoman 12.7
paper 12.6
businessman 12.4
adults 12.3
meeting 12.3
student 12.1
coffee 12
corporate 12
casual 11.9
suit 11.7
drink 11.7
handsome 11.6
mother 11.4
talking 11.4
boy 11.3
one 11.2
pretty 11.2
love 11.1
team 10.8
holding 10.7
color 10.6
computer 10.5
brunette 10.5
portrait 10.4
women 10.3
black 10.2
two 10.2
cup 10.2
cheerful 9.8
friends 9.4
wine 9.4
laptop 9.3
face 9.2
indoor 9.1
worker 9.1
dinner 9
family 8.9
kid 8.9
pen 8.9
look 8.8
hair 8.7
education 8.7
studying 8.6
elderly 8.6
cute 8.6
executive 8.6
notebook 8.5
writing 8.5
alcohol 8.4
husband 8.3
focus 8.3
leisure 8.3
successful 8.2
friendly 8.2
fun 8.2
confident 8.2
blond 8.1
discussion 7.8
device 7.7
workplace 7.6
hand 7.6
college 7.6
wife 7.6
enjoyment 7.5
glasses 7.4
teacher 7.4
inside 7.4
parent 7.4
tool 7.3
school 7.2
romantic 7.1
dad 7.1
interior 7.1
day 7.1
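
Imagga exposes tagging through a REST endpoint that returns confidence-scored English tags like the list above. A minimal sketch, assuming placeholder API credentials and image URL:

```python
# Hedged sketch: fetch Imagga tags via its v2 /tags endpoint.
import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # placeholders
params = {"image_url": "https://example.com/scan.jpg"}    # placeholder

resp = requests.get("https://api.imagga.com/v2/tags", auth=auth, params=params)
resp.raise_for_status()

# Each entry pairs an English label with a 0-100 confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```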

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 99.8
indoor 98.4
human face 97.9
black and white 86.7
clothing 79.6
man 69.9
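
Tags like these come from the Computer Vision "analyze" endpoint of Microsoft Cognitive Services. A minimal REST sketch, assuming a placeholder subscription key, endpoint region, and image URL; the v2.0 API version matches the 2019 timeframe of this record but is an assumption:

```python
# Hedged sketch: request image tags from the Azure Computer Vision API.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
KEY = "YOUR_AZURE_SUBSCRIPTION_KEY"                      # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/scan.jpg"},        # placeholder
)
resp.raise_for_status()

# Confidence comes back on a 0-1 scale; the record shows percentages.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```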

Face analysis

AWS Rekognition

Age 30-47
Gender Male, 66.3%
Confused 1.1%
Surprised 0.3%
Disgusted 0.5%
Calm 3%
Angry 1.6%
Happy 0.1%
Sad 93.3%

AWS Rekognition

Age 35-52
Gender Female, 58.9%
Calm 52.2%
Confused 3.5%
Happy 4.5%
Disgusted 12.9%
Surprised 3.9%
Angry 5%
Sad 18%
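
The two blocks above match the shape of Amazon Rekognition's DetectFaces output with full attributes: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal boto3 sketch, with a placeholder filename and region:

```python
# Hedged sketch: per-face age/gender/emotion estimates via Rekognition.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("scan.jpg", "rb") as f:  # placeholder filename
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```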

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
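
Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the two blocks above read differently from the Rekognition ones. A minimal sketch with the google-cloud-vision client (version 2+ assumed; credentials taken from the environment, filename a placeholder):

```python
# Hedged sketch: face-attribute likelihoods via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("scan.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```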

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

people portraits 99.5%