Human Generated Data

Title

[Julia and Wysse Feininger]

Date

1944?

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.356.11

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Furniture 100
Human 99.4
Person 99.4
Chair 98.8
Person 90.3
Indoors 87.2
Interior Design 87.2
Bookcase 86.1
Shelf 81.5
Armchair 79.7
Living Room 69.6
Room 69.6
Electronics 63.6
Screen 63.6
Display 59.6
Monitor 59.6
LCD Screen 59.6
Couch 52.6

Clarifai
created on 2019-05-29

people 99.8
adult 98.2
group 96.8
woman 96.1
furniture 96.1
seat 94.4
facial expression 94.2
child 93.5
room 93.4
two 93.2
sit 90.9
group together 90.7
music 90.3
man 90.2
administration 89.9
indoors 88.6
movie 88.6
chair 87.7
portrait 87.5
wear 87.2

Imagga
created on 2019-05-29

man 41
people 36.8
home 36.7
male 36.6
laptop 34.6
indoors 33.4
couple 31.4
person 30.3
computer 30.2
adult 29.7
happy 29.5
sitting 29.2
smiling 26.1
lifestyle 25.3
together 24.5
business 22.5
working 22.1
office 20.8
room 20.6
work 19.6
happiness 19.6
family 19.6
senior 18.8
looking 18.4
smile 17.8
professional 17.7
mature 17.7
sofa 17.4
businesswoman 17.3
women 16.6
husband 16.2
group 16.1
meeting 16
technology 15.6
mother 15.2
blond 14.9
cheerful 14.6
men 14.6
couch 14.5
casual 14.4
notebook 14.4
communication 14.3
talking 14.3
parent 14.3
portrait 14.2
wife 14.2
businessman 14.1
table 14.1
indoor 13.7
executive 13.6
child 13.2
grandfather 13.2
boy 13
worker 12.6
mid adult 12.5
teamwork 12.1
attractive 11.9
two 11.9
love 11.8
horizontal 11.7
discussion 11.7
retired 11.6
interior 11.5
desk 11.5
face 11.4
teacher 11.4
togetherness 11.3
corporate 11.2
team 10.8
holding 10.7
30s 10.6
success 10.5
businesspeople 10.4
education 10.4
grandma 10.3
coffee 10.2
house 10
children 10
color 10
relaxing 10
clothing 9.8
handsome 9.8
living room 9.8
old 9.8
job 9.7
two people 9.7
life 9.7
retirement 9.6
elderly 9.6
living 9.5
relationship 9.4
20s 9.2
fun 9
childhood 9
expression 8.5
relaxed 8.5
modern 8.4
suit 8.4
manager 8.4
leisure 8.3
one 8.2
aged 8.1
television 8
70s 7.9
40s 7.8
classroom 7.8
partners 7.8
older 7.8
partner 7.7
loving 7.6
friends 7.5
human 7.5
friendship 7.5
phone 7.4
occupation 7.3
lady 7.3
playing 7.3
confident 7.3
kid 7.1
to 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 99.6
human face 89.8
clothing 83
furniture 81.6
black and white 79.8
book 71.4
old 41.8

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 45-63
Gender Female, 99.7%
Angry 1.3%
Confused 0.8%
Disgusted 1%
Surprised 0.7%
Happy 4.6%
Calm 17.7%
Sad 74%

AWS Rekognition

Age 20-38
Gender Female, 86%
Angry 1.5%
Confused 1.5%
Happy 61.1%
Calm 6.3%
Surprised 1.4%
Sad 25%
Disgusted 3.3%

Microsoft Cognitive Services

Age 35
Gender Female

Microsoft Cognitive Services

Age 65
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Couch 52.6%

Captions

Microsoft

a man and a woman sitting in front of a window 82.8%
a man and woman sitting next to a window 80.2%
a man and woman sitting in front of a window 77.6%