Human Generated Data

Title

[Elderly woman at a piano]

Date

c. 1930

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.671.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-07-01

Person 98.3
Human 98.3
Food 65
Dish 65
Meal 65
Finger 60.9
Sink 59.5
Spoke 56.2
Machine 56.2

Imagga
created on 2022-07-01

person 36.3
office 35.5
people 35.1
laptop 34.6
work 34.5
man 32.9
computer 31.6
business 29.8
adult 29
male 22.9
businessman 22.1
sitting 21.5
happy 21.3
nurse 21
businesswoman 20
desk 20
worker 19.6
indoors 19.3
businesspeople 19
professional 18.7
working 18.6
looking 18.4
attractive 18.2
corporate 18
home 17.5
room 16.9
education 16.4
carpenter 16.1
lifestyle 15.9
phone 15.7
casual 15.2
job 15
indoor 14.6
portrait 14.2
face 14.2
technology 14.1
student 14
notebook 13.7
smile 13.5
meeting 13.2
paper 12.7
one 12.7
women 12.6
horizontal 12.6
handsome 12.5
table 12.4
smiling 12.3
executive 12
20s 11.9
pretty 11.9
hand 11.4
happiness 11
alone 11
suit 10.8
team 10.7
engineer 10.7
busy 10.6
reading 10.5
college 10.4
career 10.4
men 10.3
black 10.2
model 10.1
successful 10.1
house 10
color 10
bright 10
cheerful 9.8
jacket 9.6
employee 9.6
workplace 9.5
talking 9.5
expression 9.4
communication 9.2
modern 9.1
school 9
look 8.8
brunette 8.7
architect 8.7
1 8.7
studying 8.6
telephone 8.6
formal 8.6
only 8.6
bed 8.5
one person 8.5
call 8.4
shirt 8.4
manager 8.4
teamwork 8.3
focus 8.3
child 8.3
confident 8.2
success 8
day 7.8
hands 7.8
consultant 7.8
concentration 7.7
pencil 7.7
project 7.7
university 7.6
serious 7.6
two 7.6
smart 7.5
writing 7.5
learning 7.5
human 7.5
study 7.5
coffee 7.4
world 7.4
patient 7.3
relaxing 7.3
cute 7.2
building 7.1
hair 7.1
interior 7.1

Google
created on 2022-07-01

Microsoft
created on 2022-07-01

person 94.9
human face 94.8
indoor 91.1
black and white 85.8
clothing 82.4
woman 50.4

Face analysis

Amazon

AWS Rekognition

Age 49-57
Gender Male, 90.3%
Surprised 99.6%
Fear 5.9%
Calm 2.6%
Sad 2.2%
Angry 0.3%
Confused 0.3%
Disgusted 0%
Happy 0%

Feature analysis

Amazon

Person 98.3%

Captions

Microsoft

a woman standing in a room 90.8%
a woman standing in a kitchen 77%
a woman sitting on a table 67.4%