Human Generated Data

Title

[Lux and Andreas Feininger, Dessau]

Date

1930

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.47.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-06-10

Imagga
created on 2022-06-10

man 36.9
person 35.9
male 29.8
people 29.6
newspaper 28.8
professional 22.9
worker 22.3
negative 22.2
work 22.1
product 21.1
job 17.7
medical 17.6
film 17.5
adult 16.9
creation 16.6
mask 15.3
business 15.2
looking 14.4
doctor 14.1
office 13.8
occupation 13.7
men 13.7
equipment 13.6
photographic paper 13.5
patient 13.5
instrument 13.3
businessman 13.2
health 13.2
home 12.8
assistant 12.6
medicine 12.3
clinic 12.1
portrait 11.6
research 11.4
development 11.4
happy 11.3
laboratory 10.6
profession 10.5
human 10.5
computer 10.5
sitting 10.3
lifestyle 10.1
team 9.9
coat 9.8
scientist 9.8
working 9.7
lab 9.7
chemistry 9.7
chemical 9.6
boy 9.6
biology 9.5
play 9.5
room 9.3
photographic equipment 9
suit 9
technology 8.9
chair 8.9
microscope 8.9
desk 8.8
scientific 8.7
smiling 8.7
education 8.7
industry 8.5
teacher 8.5
face 8.5
casual 8.5
senior 8.4
student 8.3
surgeon 8.3
laptop 8.2
handsome 8
musical instrument 7.9
indoors 7.9
technician 7.8
clothing 7.8
hands 7.8
uniform 7.8
optical 7.8
architect 7.7
test 7.7
hospital 7.6
device 7.6
mature 7.4
engineer 7.4
lady 7.3
indoor 7.3
table 7.1
building 7.1
science 7.1
nurse 7

Microsoft
created on 2022-06-10

drawing 97.6
text 96
person 94.4
sketch 93.2
indoor 91.7
cartoon 89.5
clothing 81.5
illustration 71.1
man 65.3
black and white 52.4
newspaper 51.8

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 94%
Calm 91.6%
Surprised 6.3%
Fear 5.9%
Sad 5.5%
Happy 0.9%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Guitar 59.8%

Captions

Microsoft

a person sitting on a table 56.7%
a person sitting at a table 56.6%