Human Generated Data

Title

[Lyonel and Julia Feininger]

Date

1942-1943

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.418.20

Machine Generated Data

Tags (each value is the service's confidence score, 0-100)

Amazon
created on 2019-05-30

Person 99.6
Human 99.6
Person 99.4
Sitting 98
Face 68
Crowd 61.4
Hair 60.3
Indoors 57.9
Room 57.9
Chair 56.1
Furniture 56.1
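
The Amazon tags above appear to be label-detection output from Amazon Rekognition. A minimal sketch of how similar (label, confidence) pairs could be generated with boto3 is shown below; the image file name, label limit, and confidence threshold are assumed placeholders, not values taken from this record.

# Sketch only: reproduce Rekognition-style labels for a local image file.
# "brlf_418_20.jpg", MaxLabels, and MinConfidence are assumptions.
import boto3

def detect_labels(image_path, max_labels=20, min_confidence=50.0):
    """Return (label name, confidence) pairs from Amazon Rekognition."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=max_labels,
        MinConfidence=min_confidence,
    )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("brlf_418_20.jpg"):
        print(f"{name} {confidence:.1f}")

Each printed line follows the same "Label confidence" format used in the list above (for example, "Person 99.6").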

Clarifai
created on 2019-05-30

people 99.9
adult 99.5
portrait 98.9
one 98.8
man 97.6
room 94.9
indoors 92.1
wear 91.5
side view 91.2
sit 90.7
two 90.5
furniture 88.7
administration 85.1
window 84.3
music 83
monochrome 82.9
profile 82.8
woman 82.4
chair 81.9
concentration 81.8

Imagga
created on 2019-05-30

hairdresser 49.6
man 39
male 37.7
home 35.9
people 35.7
person 34.2
senior 29
grandfather 24.9
adult 23.8
indoors 22.8
happy 22.6
retired 22.3
elderly 21.1
room 20.5
smiling 20.3
retirement 20.2
lifestyle 19.5
couple 18.3
sitting 18
chair 17.9
looking 17.6
family 16.9
old 16.7
computer 16.2
casual 15.2
office 14.9
portrait 14.9
men 14.6
barbershop 14.1
mature 13.9
child 13.6
happiness 13.3
laptop 13.3
business 12.8
interior 12.4
businessman 12.4
pensioner 12.2
education 12.1
love 11.8
worker 11.6
couch 11.6
patient 11.5
smile 11.4
adults 11.4
shop 10.8
face 10.7
working 10.6
together 10.5
husband 10.5
human 10.5
living 10.4
two 10.2
indoor 10
leisure 10
technology 9.6
work 9.4
day 9.4
occupation 9.2
alone 9.1
care 9
cheerful 8.9
kid 8.9
70s 8.9
to 8.8
older 8.7
table 8.7
television 8.6
talking 8.6
mercantile establishment 8.5
expression 8.5
desk 8.5
horizontal 8.4
health 8.3
student 8.3
inside 8.3
holding 8.3
relaxing 8.2
aged 8.1
clothing 8
job 8
grandmother 7.8
boy 7.8
monitor 7.8
married 7.7
age 7.6
hand 7.6
wife 7.6
teacher 7.5
meeting 7.5
one person 7.5
house 7.5
relaxed 7.5
help 7.4
board 7.2
black 7.2
suit 7.2
case 7.1
women 7.1
modern 7

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

person 97.9
human face 94.8
black and white 92.2
clothing 90.2
man 79.6
monochrome 56.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 81%
Angry 4.5%
Calm 6.3%
Surprised 3.4%
Sad 6.2%
Confused 2.5%
Happy 73.4%
Disgusted 3.8%

AWS Rekognition

Age 26-43
Gender Female, 53.6%
Disgusted 3.4%
Sad 71.9%
Confused 2.1%
Happy 2.2%
Surprised 1.4%
Angry 9.5%
Calm 9.6%
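
The age ranges, gender estimates, and emotion percentages above look like per-face output from Amazon Rekognition's face analysis. A minimal sketch of producing comparable values with boto3 follows; the image path is a hypothetical placeholder, not part of this record.

# Sketch only: print age range, gender, and emotions for each detected face.
import boto3

def analyze_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    analyze_faces("brlf_418_20.jpg")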

Feature analysis

Amazon

Person 99.6%

Categories