Human Generated Data

Title

[Lux and Julia Feininger with the painting "Gothen" in background]

Date

mid-1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.307.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.2
Human 99.2
Person 98.6
Furniture 97.5
Person 97.3
Chair 83.6
Accessory 81
Tie 81
Accessories 81
Sitting 81
Tie 79.5
Clinic 77.9
Table 77.6
Bed 73.6
Cushion 71.6
Tie 66.7
Text 65.6
Hospital 57.8
Chair 51.3

Clarifai
created on 2019-05-29

people 99.7
adult 99.3
group 98.4
furniture 98.3
group together 97.7
man 97.3
administration 97
woman 96.9
war 94.1
several 92.7
leader 91.7
room 91
education 91
many 90.4
four 90.2
two 90.1
wear 89.6
five 89.1
three 88.2
child 85.5

Imagga
created on 2019-05-29

man 47.8
person 45
male 43.4
people 38.6
hospital 36.7
patient 36.4
office 35.4
adult 33.8
room 30.5
professional 29
meeting 26.4
indoors 26.4
sitting 25.8
home 24.8
computer 24.2
working 23.9
businessman 23.9
talking 23.8
senior 23.5
smiling 23.2
business 23.1
businesswoman 21.9
men 21.5
nurse 21.3
together 21.1
group 21
team 20.6
laptop 20.5
table 20.4
businesspeople 20
classroom 19.5
happy 19.5
couple 19.2
work 18.9
desk 18.4
case 17.3
job 16.8
teamwork 16.7
lifestyle 15.9
women 15.8
sick person 15.6
corporate 15.5
elderly 15.3
medical 15
smile 15
portrait 14.9
mature 14.9
occupation 14.7
discussion 14.6
executive 14.5
specialist 14
health 13.9
cheerful 13.8
doctor 13.2
practitioner 13
indoor 12.8
teacher 12.8
worker 12.6
technology 12.6
workplace 12.4
clinic 12.1
colleagues 11.7
holding 11.6
grandfather 11.6
illness 11.5
education 11.3
suit 10.9
communication 10.9
care 10.7
face 10.7
color 10.6
document 10.4
happiness 10.2
casual 10.2
horizontal 10.1
discussing 9.8
attractive 9.8
conference 9.8
40s 9.8
retired 9.7
retirement 9.6
scholar 9.3
two 9.3
manager 9.3
presentation 9.3
20s 9.2
modern 9.1
boardroom 8.9
coworkers 8.9
businessmen 8.8
two people 8.8
student 8.7
educator 8.5
pen 8.5
child 8.5
learning 8.5
old 8.4
successful 8.3
looking 8
medicine 7.9
collaboration 7.9
explaining 7.9
day 7.9
partners 7.8
30s 7.7
hand 7.6
college 7.6
friends 7.5
intellectual 7.5
camera 7.4
inside 7.4
success 7.3
aged 7.3
to 7.1
paper 7.1
pretty 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

clothing 96.2
person 95.6
sitting 95.4
table 95.3
man 93.5
indoor 88.4
furniture 83.7
old 51.1

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 26-43
Gender Male, 93.8%
Happy 10.4%
Surprised 5.4%
Sad 30.6%
Calm 35.3%
Angry 5.8%
Disgusted 4.1%
Confused 8.4%

AWS Rekognition

Age 35-52
Gender Male, 90.5%
Sad 20.3%
Angry 4.5%
Disgusted 3.6%
Calm 48.6%
Confused 4.8%
Surprised 2.7%
Happy 15.6%

Microsoft Cognitive Services

Age 39
Gender Male

Feature analysis

Amazon

Person 99.2%
Chair 83.6%
Tie 81%

Captions

Microsoft

Desmond Llewelyn et al. sitting on a bed 76.9%
Desmond Llewelyn et al. sitting on a bench reading a book 55.4%
Desmond Llewelyn et al. sitting at a table 55.3%