Human Generated Data

Title

[Andreas and Lyonel Feininger working at a table]

Date

1909-1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.675.341

Machine Generated Data

Tags

Amazon
created on 2022-07-01

Human 99.1
Person 99.1
Person 99.1
Crowd 73.3
Sitting 65.5
Furniture 58.9
Indoors 57
Room 56.1

Imagga
created on 2022-07-01

marimba 85.7
percussion instrument 79.1
musical instrument 65.1
people 32.9
laptop 29.6
man 28.9
adult 26.3
couple 26.1
male 26.1
person 25
happy 24.4
computer 24.2
sitting 23.2
smiling 23.2
lifestyle 23.1
child 20.1
indoors 19.3
business 18.8
technology 18.6
women 18.2
casual 17.8
together 17.5
family 16.9
senior 16.9
portrait 16.2
attractive 16.1
home 16
working 15.9
office 15.3
work 14.9
professional 14.7
mother 14.5
two 14.4
looking 14.4
outdoors 14.2
happiness 14.1
mature 14
smile 13.5
men 12.9
businesswoman 12.7
stringed instrument 12.7
bench 12.6
elderly 12.4
grand piano 12.4
businessman 12.4
park 12.4
piano 12.3
desk 12.3
cheerful 12.2
education 12.1
worker 11.9
love 11.8
horizontal 11.7
clothing 11.4
talking 11.4
table 11.4
adults 11.4
togetherness 11.3
friends 11.3
human 11.2
pretty 11.2
corporate 11.2
relaxing 10.9
retired 10.7
husband 10.5
old 10.5
face 9.9
outdoor 9.9
handsome 9.8
notebook 9.8
kid 9.8
success 9.7
30s 9.6
wife 9.5
color 9.5
executive 9.3
communication 9.2
relaxation 9.2
leisure 9.1
children 9.1
suit 9
parent 8.8
boy 8.7
retirement 8.6
sit 8.5
keyboard 8.4
relationship 8.4
friendship 8.4
glasses 8.3
father 8.3
fun 8.2
alone 8.2
indoor 8.2
blond 8.2
playing 8.2
group 8.1
job 8
little 7.9
day 7.8
keyboard instrument 7.8
teacher 7.7
modern 7.7
outside 7.7
sofa 7.7
school 7.6
fashion 7.5
relaxed 7.5
one 7.5
manager 7.4
teamwork 7.4
aged 7.2
cute 7.2
childhood 7.2
hair 7.1
summer 7.1

Google
created on 2022-07-01

Microsoft
created on 2022-07-01

person 96.8
human face 88.2
clothing 79.6
black and white 51.2
posing 35.8

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 71.7%
Sad 100%
Surprised 6.5%
Fear 5.9%
Angry 1.5%
Confused 1.4%
Happy 1.3%
Disgusted 0.7%
Calm 0.1%

AWS Rekognition

Age 37-45
Gender Female, 94%
Fear 50.6%
Sad 35.6%
Surprised 16.4%
Happy 8.5%
Confused 4.6%
Disgusted 4%
Calm 4%
Angry 2.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people posing for the camera 91%
a group of people posing for a photo 87.5%
a group of people posing for a picture 87.4%

Text analysis

Google

ABILI