Human Generated Data

Title

[Julia and Lyonel Feininger with Michael van Beuren]

Date

mid-1930s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.307.3

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Human 99.8
Person 99.8
Person 99.3
Sitting 99.2
Furniture 92.7
Person 91.7
Table 91.3
Restaurant 81.6
Face 79.9
Desk 70.8
Clothing 70.2
Apparel 70.2
Photography 64.6
Photo 64.6
Portrait 64.6
Food 64.1
Meal 64.1
Indoors 61.1
Cafeteria 58.2
Home Decor 57
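
Tag lists in this shape (a label name plus a 0-100 confidence score) are what Amazon Rekognition's DetectLabels operation returns. A minimal sketch of such a call using boto3, assuming AWS credentials are configured; the file name, MaxLabels, and MinConfidence values are illustrative assumptions, not details from the museum's pipeline:

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("brlf_307_3.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# the same shape as the "Person 99.8 ... Home Decor 57" list above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # illustrative cap
    MinConfidence=50.0,  # illustrative threshold
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")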

Clarifai
created on 2019-05-29

people 99.8
adult 99.3
group 98.4
furniture 98.4
administration 98.3
man 97.5
woman 97.2
group together 97.1
two 96.6
facial expression 95.3
leader 94.7
war 93.3
four 92.9
room 92.8
sit 91.9
three 91.8
several 91.6
wear 90.9
five 89.7
actress 82.9

Imagga
created on 2019-05-29

office 43.5
man 41.1
male 39.1
meeting 36.8
people 35.8
person 35.7
businesswoman 32.8
business 32.3
sitting 30.1
working 30.1
professional 29
team 28.7
adult 28.6
businessman 28.3
happy 27
group 26.7
laptop 26
work 26
men 25.8
together 25.5
smiling 23.9
talking 23.8
businesspeople 23.8
indoors 23.8
teamwork 23.2
table 22
computer 21.8
colleagues 21.4
women 21.4
desk 21.2
corporate 20.7
home 20
job 18.6
discussion 18.5
executive 17.8
classroom 17.6
40s 17.6
worker 17.2
portrait 16.9
newspaper 16.1
casual 16.1
discussing 15.8
couple 15.7
smile 15.7
student 15.7
education 15.6
communication 15.1
senior 15
mature 14.9
room 14.7
patient 14.5
success 14.5
successful 13.8
teacher 13.6
technology 13.4
workplace 13.4
color 12.8
30s 12.5
product 12.5
adults 12.3
learning 12.2
colleague 12
20s 11.9
suit 11.9
happiness 11.8
horizontal 11.7
lifestyle 11.6
presentation 11.2
camera 11.1
indoor 11
conference 10.8
consultant 10.7
class 10.6
cheerful 10.6
modern 10.5
document 10.2
coworkers 9.8
creation 9.7
planning 9.6
retirement 9.6
looking 9.6
manager 9.3
coat 9.2
attractive 9.1
clothing 9
lab coat 9
scholar 9
forties 8.8
husband 8.8
two people 8.8
day 8.6
nurse 8.6
ethnic 8.6
college 8.6
two 8.5
company 8.4
occupation 8.3
child 8.2
confident 8.2
director 8.1
employee 8.1
handsome 8
paper 8
boardroom 7.9
explaining 7.9
associates 7.9
client 7.9
brunette 7.9
advice 7.9
casual clothing 7.8
businessmen 7.8
middle aged 7.8
partners 7.8
partner 7.7
concentration 7.7
daytime 7.7
busy 7.7
mixed 7.7
elderly 7.7
boss 7.7
wireless 7.6
finance 7.6
females 7.6
multi 7.6
writing 7.5
holding 7.4
entrepreneur 7.3
restaurant 7.3
aged 7.3
case 7.2
hospital 7.2
intellectual 7.2
face 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 99.5
clothing 97.7
indoor 94.6
table 93.6
man 92.4
smile 86.8
human face 82.9
window 82.3
food 67.3
woman 59.9
old 54.4

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 98.4%
Happy 45.3%
Calm 41.9%
Confused 4.8%
Surprised 2.5%
Sad 2.3%
Angry 1.7%
Disgusted 1.6%

AWS Rekognition

Age 35-52
Gender Female, 99%
Happy 69.2%
Disgusted 8.6%
Sad 8.2%
Calm 6.7%
Surprised 2.7%
Angry 2.5%
Confused 2.1%
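
Blocks like the two above (an age range, a gender with confidence, and per-emotion percentages) match the FaceDetails that Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal sketch under the same assumptions as the DetectLabels example:

import boto3

client = boto3.client("rekognition")

with open("brlf_307_3.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion scores
# to the default bounding-box output.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase, e.g. "HAPPY" -> "Happy".
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")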

Microsoft Cognitive Services

Age 40
Gender Male

Microsoft Cognitive Services

Age 50
Gender Female
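
The two Microsoft estimates above are the kind of output the classic Face API detect endpoint produced when asked for the age and gender attributes (attributes Microsoft has since retired for most customers). A hedged sketch using requests; the endpoint host, key, and file name are placeholders:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_FACE_API_KEY"                                       # placeholder

with open("brlf_307_3.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")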

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely
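
Unlike the percentage scores above, Google Vision reports each facial attribute as a bucketed likelihood (VERY_UNLIKELY through VERY_LIKELY), which is what the three per-face blocks show. A minimal sketch with the google-cloud-vision client library, assuming application credentials are configured; the file name is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("brlf_307_3.jpg", "rb") as f:  # hypothetical path
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, not a percentage.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)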

Feature analysis

Amazon

Person 99.8%