Human Generated Data

Title

[Lyonel Feininger reading on porch]

Date

1930-1935

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.304.1

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Furniture 99.9
Chair 99.9
Person 96.7
Human 96.7
Apparel 90
Clothing 90
Coat 81.6
Suit 81.6
Overcoat 81.6
Text 75
Photography 63.8
Photo 63.8
Face 63.8
Plant 63.6
Portrait 61.7
Sitting 57.9
Building 55.7
Housing 55.7

Clarifai
created on 2019-05-29

people 99.9
one 99.7
adult 99.4
man 97.7
wear 97.6
two 97.6
woman 95.1
portrait 93.4
administration 93
furniture 92.1
vehicle 90.2
leader 87.9
group 85.3
war 84.9
sit 83
position 81.6
three 80.9
military 79.8
music 79.8
recreation 79.3

Imagga
created on 2019-05-29

man 37
person 33.2
people 32.4
male 32
adult 31.1
computer 28.1
sitting 25.8
office 25.2
laptop 25.1
business 24.9
working 24.7
work 24.3
call 19.7
job 19.5
happy 18.8
businessman 18.5
suit 18.3
looking 17.6
corporate 17.2
smiling 16.6
seat 16
portrait 15.5
worker 15.2
men 14.6
professional 14.4
casual 14.4
smile 14.3
face 13.5
handsome 13.4
career 12.3
indoors 12.3
desk 12.3
executive 12.2
attractive 11.9
scholar 11.9
employee 11.6
newspaper 11.5
businesspeople 11.4
notebook 11.3
one 11.2
mature 11.2
women 11.1
lifestyle 10.8
cute 10.8
technology 10.4
love 10.3
communication 10.1
confident 10
businesswoman 10
holding 9.9
together 9.6
support 9.6
couple 9.6
guy 9.6
reading 9.5
intellectual 9.5
day 9.4
device 9.3
student 9.2
alone 9.1
success 8.9
passenger 8.6
stress 8.6
serious 8.6
thinking 8.5
expression 8.5
keyboard 8.4
senior 8.4
pretty 8.4
color 8.3
human 8.2
outdoors 8.2
hair 7.9
boy 7.8
hands 7.8
education 7.8
depression 7.8
employment 7.7
car 7.6
manager 7.4
teamwork 7.4
glasses 7.4
successful 7.3
seat belt 7.3
vehicle 7.1
modern 7
look 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 99.6
clothing 98
man 96.6
human face 93.6
outdoor 89.8
black and white 89.2
old 40.6

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 53.1%
Happy 3%
Surprised 3.8%
Confused 4.5%
Angry 7.9%
Sad 18.2%
Calm 59.9%
Disgusted 2.7%

Feature analysis

Amazon

Person 96.7%

Captions

Microsoft

a man sitting on a bench 89.2%
a man is sitting on a bench 84.8%
a man that is sitting on a bench 84.7%