Human Generated Data

Title

[Wysse, Tomas and Andreas Feininger]

Date

1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.445.6

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Human 99.4
Person 99.4
Person 99.4
Person 98.7
Clinic 93.1
Person 92
Person 88.8
Face 81.2
Room 75.4
Indoors 75.4
People 68.9
Portrait 65.4
Photography 65.4
Photo 65.4
Hospital 61.2
Operating Theatre 56.8
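
The Rekognition tags above pair each label with a confidence score (in percent), and a downstream consumer would typically keep only labels above some cutoff. A minimal Python sketch of that filtering step, using the scores transcribed from the list above; the 90-point threshold is an arbitrary assumption, not part of the dataset:

```python
# Labels emitted by AWS Rekognition for this photograph, as (tag, confidence) pairs,
# transcribed from the list above. Repeated "Person" entries reflect multiple detections.
labels = [
    ("Human", 99.4), ("Person", 99.4), ("Person", 99.4), ("Person", 98.7),
    ("Clinic", 93.1), ("Person", 92.0), ("Person", 88.8), ("Face", 81.2),
    ("Room", 75.4), ("Indoors", 75.4), ("People", 68.9), ("Portrait", 65.4),
    ("Photography", 65.4), ("Photo", 65.4), ("Hospital", 61.2),
    ("Operating Theatre", 56.8),
]

def confident_tags(labels, threshold=90.0):
    """Return distinct tag names whose confidence meets the threshold, in order."""
    seen = []
    for tag, score in labels:
        if score >= threshold and tag not in seen:
            seen.append(tag)
    return seen

print(confident_tags(labels))  # -> ['Human', 'Person', 'Clinic']
```

At a 90-point cutoff only "Human", "Person", and "Clinic" survive, which is why lower-scored tags such as "Operating Theatre" (56.8) are best treated as speculative.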

Clarifai
created on 2021-04-05

people 99.9
group 98.5
adult 98.1
man 96.2
group together 95.1
wear 93.4
woman 91.1
leader 89.6
furniture 88.5
child 87.5
several 87
many 87
music 86.6
portrait 86.1
administration 85.7
two 85.4
room 85
monochrome 84.8
three 84.5
art 81.2

Imagga
created on 2021-04-05

hospital 62.3
patient 36.6
man 30.9
person 29.6
surgeon 29.5
people 27.9
medical 25.6
doctor 22.6
male 22
health 20.8
nurse 20.3
coat 17.9
medicine 16.7
adult 16.4
men 16.3
room 16.1
illness 15.2
professional 14.8
care 14.8
lab coat 14.1
home 13.6
dress 13.6
sick person 13.4
case 13.1
work 12.6
profession 12.4
indoors 12.3
specialist 12
wedding 12
scientist 11.8
surgery 11.7
lab 11.7
worker 11.6
laboratory 11.6
bride 11.5
couple 11.3
happy 11.3
old 10.5
portrait 10.4
women 10.3
operation 9.9
clinic 9.8
clothing 9.7
disease 9.7
chemistry 9.7
research 9.5
uniform 9.5
bed 9.5
biology 9.5
garment 9.5
love 9.5
color 9.5
face 9.2
equipment 9.2
team 9
religion 9
family 8.9
smiling 8.7
emergency 8.7
mask 8.6
smile 8.6
senior 8.4
life 8.3
inside 8.3
groom 8.2
gown 8.2
sick 7.7
hand 7.6
religious 7.5
vintage 7.4
church 7.4
tradition 7.4
treatment 7.4
occupation 7.3
lifestyle 7.2
looking 7.2
science 7.1
working 7.1
happiness 7.1

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

text 96.3
person 93.3
clothing 75.5
human face 72.9

Face analysis

Amazon

AWS Rekognition

Age 27-43
Gender Male, 97.3%
Calm 98.6%
Sad 0.7%
Happy 0.5%
Angry 0%
Disgusted 0%
Fear 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 22-34
Gender Male, 72.5%
Sad 71.3%
Calm 18.8%
Angry 4.6%
Confused 1.9%
Happy 1.2%
Surprised 0.9%
Fear 0.8%
Disgusted 0.5%
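
Rekognition reports a probability for each of eight emotions per detected face, and the face is conventionally labeled with the highest-scoring one. A small sketch of that selection, using the second face's scores transcribed from above (pure Python, no AWS call):

```python
# Emotion scores for the second detected face, transcribed from the
# Rekognition output above (percentages summing to roughly 100).
emotions = {
    "Sad": 71.3, "Calm": 18.8, "Angry": 4.6, "Confused": 1.9,
    "Happy": 1.2, "Surprised": 0.9, "Fear": 0.8, "Disgusted": 0.5,
}

# The dominant emotion is the key with the maximum score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # -> Sad
```

Note that the two faces disagree in character: the first is almost certainly "Calm" (98.6%), while this one is only moderately "Sad" (71.3%), so the margin over the runner-up is worth checking before trusting the label.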

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

an old photo of a person 80%
old photo of a person 77.9%
a group of people on a bed 53.3%